seq2seq_NMT_keras

This is a character-level sequence-to-sequence model for neural machine translation (English to French).
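Since the model works at the character level, each sentence is encoded as a sequence of one-hot character vectors before training. A minimal sketch of that preprocessing, using a tiny hypothetical pair of sentences (the repository trains on a full English-French sentence-pair dataset):

```python
import numpy as np

# Toy data for illustration; "\t" marks the start of a target
# sequence and "\n" marks its end.
input_texts = ["Hi.", "Run!"]
target_texts = ["\tSalut.\n", "\tCours !\n"]

# Build character vocabularies and index lookups.
input_chars = sorted(set("".join(input_texts)))
target_chars = sorted(set("".join(target_texts)))
input_index = {c: i for i, c in enumerate(input_chars)}
target_index = {c: i for i, c in enumerate(target_chars)}

max_enc_len = max(len(t) for t in input_texts)
max_dec_len = max(len(t) for t in target_texts)

# One-hot tensors: (num_samples, max_length, vocab_size).
encoder_input = np.zeros((len(input_texts), max_enc_len, len(input_chars)))
decoder_input = np.zeros((len(input_texts), max_dec_len, len(target_chars)))
decoder_target = np.zeros((len(input_texts), max_dec_len, len(target_chars)))

for i, (src, tgt) in enumerate(zip(input_texts, target_texts)):
    for t, ch in enumerate(src):
        encoder_input[i, t, input_index[ch]] = 1.0
    for t, ch in enumerate(tgt):
        decoder_input[i, t, target_index[ch]] = 1.0
        if t > 0:
            # The training target is the decoder input shifted one
            # step ahead (teacher forcing).
            decoder_target[i, t - 1, target_index[ch]] = 1.0
```

The decoder input and decoder target cover the same sentences offset by one timestep, which is what makes teacher forcing possible during training.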

Introduction

Sequence-to-sequence models are widely used in research papers, but it is hard to find code showing how to build one with Keras. I recently found a good resource (see References below), and tried to understand and rewrite the code.

Methodology

  1. Prepare the data for the encoder and decoder.
  2. Build an encoder-decoder model for training (with teacher forcing).
  3. Build a second model, reusing the trained layers, for inference.
  4. Test the model. Note: to reuse layers across models, we need to define a Functional model.
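Steps 2 and 3 above can be sketched as follows, after the recipe in the Keras blog post linked below. The vocabulary sizes and latent dimension here are illustrative assumptions, not the repository's exact values:

```python
from tensorflow.keras.layers import Input, LSTM, Dense
from tensorflow.keras.models import Model

num_enc_tokens, num_dec_tokens, latent_dim = 70, 90, 256

# --- Training model (teacher forcing): the decoder receives the
# ground-truth target sequence shifted by one timestep.
encoder_inputs = Input(shape=(None, num_enc_tokens))
_, state_h, state_c = LSTM(latent_dim, return_state=True)(encoder_inputs)
encoder_states = [state_h, state_c]

decoder_inputs = Input(shape=(None, num_dec_tokens))
decoder_lstm = LSTM(latent_dim, return_sequences=True, return_state=True)
decoder_outputs, _, _ = decoder_lstm(decoder_inputs,
                                     initial_state=encoder_states)
decoder_dense = Dense(num_dec_tokens, activation="softmax")
decoder_outputs = decoder_dense(decoder_outputs)

model = Model([encoder_inputs, decoder_inputs], decoder_outputs)

# --- Inference models: reuse the *same* trained layer objects. This is
# why the Functional API is needed; a Sequential model cannot share
# layers across two models.
encoder_model = Model(encoder_inputs, encoder_states)

dec_state_h = Input(shape=(latent_dim,))
dec_state_c = Input(shape=(latent_dim,))
dec_out, h, c = decoder_lstm(decoder_inputs,
                             initial_state=[dec_state_h, dec_state_c])
dec_out = decoder_dense(dec_out)
decoder_model = Model([decoder_inputs, dec_state_h, dec_state_c],
                      [dec_out, h, c])
```

At inference time the encoder model produces the initial states once, and the decoder model is then called one character at a time, feeding back its own prediction and states until the end-of-sequence character is produced.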

Result

Input sentence: Many fish died.
Decoded sentence: L'aidenu en laume.

Input sentence: This is silly.
Decoded sentence: C'est simple.

Input sentence: We can meet.
Decoded sentence: Nous pouvons le saiver.

Input sentence: Who cares?
Decoded sentence: Qui s'en soucie ?

Input sentence: Did I say that?
Decoded sentence: Ai-je dit cela ?

Input sentence: They're early.
Decoded sentence: Elles sont en avance.

Input sentence: He also saw it.
Decoded sentence: Il l’a aussi vu.

Input sentence: Bring him to me.
Decoded sentence: Apportez-le-moi.

Input sentence: Bottoms up!
Decoded sentence: Santé !

Input sentence: I'm not sad.
Decoded sentence: Je ne suis pas sourde.

References:

https://blog.keras.io/a-ten-minute-introduction-to-sequence-to-sequence-learning-in-keras.html
https://jovianlin.io/keras-models-sequential-vs-functional/
