Build a neural machine translator using seq2seq and an attention mechanism.

mrvgao/neural-machine-translation

Neural Machine Translation

A machine translation framework.

Besides machine translation, tasks such as image captioning, text summarization, music generation, and chatbots can all be trained with this model.

Environment:

TensorFlow >= 1.2

References:

  1. Sutskever et al., 2014
  2. Cho et al., 2014
  3. Neural Machine Translation and Sequence-to-sequence Models: A Tutorial

Framework Points:

  • Seq2seq in TensorFlow
    • Encoder
    • Decoder
    • Optimizer
    • Weighted cross-entropy loss
  • Attention model in TensorFlow
  • Stack-based model
  • Data pipeline in TensorFlow
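The attention mechanism listed above can be sketched independently of TensorFlow: the decoder state is scored against each encoder state, the scores are normalized with a softmax, and the weights form a context vector. This is a minimal illustrative sketch, not the repository's implementation; the function names and toy values are hypothetical.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def dot_product_attention(decoder_state, encoder_states):
    """Return (attention weights, context vector) for one decoding step.

    Scores each encoder state by its dot product with the decoder state,
    then averages the encoder states using the softmaxed scores.
    """
    scores = [sum(d * e for d, e in zip(decoder_state, h)) for h in encoder_states]
    weights = softmax(scores)
    dim = len(encoder_states[0])
    context = [sum(w * h[i] for w, h in zip(weights, encoder_states)) for i in range(dim)]
    return weights, context

# Toy example: three 2-dimensional encoder states (values are made up).
encoder_states = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
decoder_state = [1.0, 0.0]
weights, context = dot_product_attention(decoder_state, encoder_states)
print(weights)   # the weights sum to 1
print(context)
```

In TensorFlow 1.2+, the same idea is provided by the `tf.contrib.seq2seq` attention wrappers, so a real model would use those rather than hand-rolled loops.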

TO DO:

  • A convenient GUI
  • A CNN-based encoder

Example:

'知识就是力量' ==> 'Knowledge is power'
