References
This is the list of papers that inspired OpenNMT:
- Luong, M. T., Pham, H., & Manning, C. D. (2015). Effective approaches to attention-based neural machine translation. arXiv preprint arXiv:1508.04025.
- Sennrich, R., & Haddow, B. (2016). Linguistic input features improve neural machine translation. arXiv preprint arXiv:1606.02892.
- Sennrich, R., Haddow, B., & Birch, A. (2015). Neural machine translation of rare words with subword units. arXiv preprint arXiv:1508.07909.
- Wu, Y., Schuster, M., Chen, Z., Le, Q. V., Norouzi, M., Macherey, W., ... & Klingner, J. (2016). Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation. arXiv preprint arXiv:1609.08144.
- Jean, S., Cho, K., Memisevic, R., & Bengio, Y. (2015). On Using Very Large Target Vocabulary for Neural Machine Translation. In Proceedings of ACL 2015.
- Gehring, J., Auli, M., Grangier, D., & Dauphin, Y. N. (2016). A Convolutional Encoder Model for Neural Machine Translation. arXiv preprint arXiv:1611.02344.
- Bengio, S., Vinyals, O., Jaitly, N., & Shazeer, N. (2015). Scheduled sampling for sequence prediction with recurrent neural networks. In Advances in Neural Information Processing Systems (pp. 1171-1179).
- Gulcehre, C., Firat, O., Xu, K., Cho, K., Barrault, L., Lin, H. C., ... & Bengio, Y. (2015). On using monolingual corpora in neural machine translation. arXiv preprint arXiv:1503.03535.
- Hokamp, C., & Liu, Q. (2017). Lexically Constrained Decoding for Sequence Generation Using Grid Beam Search. arXiv preprint arXiv:1704.07138.