TensorFlow Neural Machine Translation Tutorial
Sequence-to-sequence (seq2seq) models (Sutskever et al., 2014; Cho et al., 2014) have enjoyed great success in a variety of tasks such as machine translation, speech recognition, and text summarization. This tutorial gives readers a full understanding of seq2seq models and shows how to build a competitive seq2seq model from scratch. We focus on the task of Neural Machine Translation (NMT), which was the very first testbed for seq2seq models, with wild success. The included code is lightweight, high-quality, production-ready, and incorporates the latest research ideas. We achieve this goal by:
- Using the recent decoder/attention wrapper API and the TensorFlow 1.2 data iterator (see the sketch after this list)
- Incorporating our strong expertise in building recurrent and seq2seq models
- Providing tips and tricks for building the very best NMT models and replicating Google’s NMT (GNMT) system.
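To make the first bullet concrete, here is a minimal, self-contained sketch, not the tutorial's actual model code, that wires the seq2seq attention wrapper API to a data iterator. All sizes, the random toy corpus, and the teacher-forcing-on-source shortcut are illustrative assumptions, and it assumes a late TF 1.x release (e.g. 1.4+) where `tf.data` and `tf.nn.rnn_cell` are exposed under those names (in TF 1.2 the dataset API lived under `tf.contrib.data`).

```python
import tensorflow as tf

vocab_size, embed_dim, num_units, src_len = 10000, 128, 128, 20

# Data iterator: batch a toy corpus of random token ids with tf.data.
toy_source = tf.random_uniform([1000, src_len], maxval=vocab_size, dtype=tf.int32)
dataset = tf.data.Dataset.from_tensor_slices(toy_source).batch(32)
iterator = dataset.make_initializable_iterator()
source_ids = iterator.get_next()                       # [batch, time]
batch = tf.shape(source_ids)[0]

# Encoder: embed the source ids and run a single-layer LSTM.
embedding = tf.get_variable("embedding", [vocab_size, embed_dim])
enc_inputs = tf.nn.embedding_lookup(embedding, source_ids)
enc_cell = tf.nn.rnn_cell.BasicLSTMCell(num_units)
enc_outputs, enc_state = tf.nn.dynamic_rnn(enc_cell, enc_inputs, dtype=tf.float32)

# Decoder cell: a plain LSTM wrapped with Luong attention over encoder outputs.
attention = tf.contrib.seq2seq.LuongAttention(num_units, enc_outputs)
dec_cell = tf.contrib.seq2seq.AttentionWrapper(
    tf.nn.rnn_cell.BasicLSTMCell(num_units), attention,
    attention_layer_size=num_units)
init_state = dec_cell.zero_state(batch, tf.float32).clone(cell_state=enc_state)

# Teacher-forced decoder; re-feeding the source embeddings is purely a toy choice.
helper = tf.contrib.seq2seq.TrainingHelper(enc_inputs, tf.fill([batch], src_len))
decoder = tf.contrib.seq2seq.BasicDecoder(
    dec_cell, helper, init_state, output_layer=tf.layers.Dense(vocab_size))
outputs = tf.contrib.seq2seq.dynamic_decode(decoder)[0]
logits = outputs.rnn_output                            # [batch, time, vocab]

with tf.Session() as sess:
    sess.run([tf.global_variables_initializer(), iterator.initializer])
    print(sess.run(logits).shape)                      # (32, 20, 10000)
```

In a real model the decoder would consume target-side embeddings and the logits would feed a cross-entropy loss; the point here is only the plumbing: `AttentionWrapper` turns any RNN cell into an attention-equipped cell, and the dataset iterator replaces hand-written feed dicts.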
Original article by fendouai. If reprinting, please cite the source: https://panchuang.net/2017/07/14/tensorflow-neural-machine-translation-tutorial/