Seq2Seq

This is a seq2seq model that translates German into English, trained on the Multi30k German-English dataset. The implementation focuses on the following features:

  • Using Bahdanau Attention
  • Comparison between models with and without attention

Model

Without Attention

  • Encoder: RNN
  • Decoder: RNN
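
In this setup the decoder sees only the encoder's final hidden state, so the whole source sentence must be compressed into one fixed-size vector. The README does not include the model code, so the following is only a minimal sketch of such a plain RNN encoder-decoder, assuming a PyTorch implementation; the class and parameter names (Encoder, Decoder, hidden_dim, and so on) are illustrative, not taken from the repository.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, vocab_size, emb_dim, hidden_dim):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.RNN(emb_dim, hidden_dim, batch_first=True)

    def forward(self, src):
        # src: (batch, src_len); the final hidden state is the only
        # information handed to the decoder
        _, hidden = self.rnn(self.embedding(src))
        return hidden

class Decoder(nn.Module):
    def __init__(self, vocab_size, emb_dim, hidden_dim):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.RNN(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, trg_token, hidden):
        # trg_token: (batch, 1), one target token per decoding step
        output, hidden = self.rnn(self.embedding(trg_token), hidden)
        return self.out(output.squeeze(1)), hidden
```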

With Attention

  • Encoder: GRU
  • Decoder: GRU
  • Attention: Bahdanau Attention
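
Bahdanau (additive) attention scores every encoder hidden state against the current decoder state and builds a context vector as their weighted sum, so the decoder no longer depends on a single fixed-size summary. Below is a minimal sketch of the attention module, again assuming PyTorch and a shared hidden_dim for encoder and decoder; both assumptions, since the repository's code is not shown here.

```python
import torch
import torch.nn as nn

class BahdanauAttention(nn.Module):
    # score(s, h_j) = v^T tanh(W_s s + W_h h_j)
    def __init__(self, hidden_dim):
        super().__init__()
        self.W_s = nn.Linear(hidden_dim, hidden_dim)   # projects the decoder state
        self.W_h = nn.Linear(hidden_dim, hidden_dim)   # projects the encoder outputs
        self.v = nn.Linear(hidden_dim, 1, bias=False)  # reduces each score to a scalar

    def forward(self, dec_hidden, enc_outputs):
        # dec_hidden:  (batch, hidden_dim)            current decoder state s
        # enc_outputs: (batch, src_len, hidden_dim)   all encoder states h_1..h_T
        scores = self.v(torch.tanh(
            self.W_s(dec_hidden).unsqueeze(1) + self.W_h(enc_outputs)
        ))                                             # (batch, src_len, 1)
        weights = torch.softmax(scores, dim=1)         # distribution over source positions
        context = (weights * enc_outputs).sum(dim=1)   # (batch, hidden_dim)
        return context, weights.squeeze(-1)
```

At each decoding step the GRU decoder would typically concatenate this context vector with the embedded input token before the recurrent update, which is what lets the attention weights be compared against the no-attention baseline above.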