Sequence to Sequence with attention implemented with PyTorch
This is a fork from OpenNMT-py.
The master branch now requires PyTorch 0.4.x. There is also a branch called 0.3.0 which supports PyTorch 0.3.x.
- Must use a bidirectional GRU encoder by setting the `-brnn` flag (see the example command below).
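
For example, a training invocation might look like the following minimal sketch. Only `-brnn` is taken from this README; the other flags (`-data`, `-save_model`) follow upstream OpenNMT-py conventions and may differ in this fork.

```bash
# Minimal sketch: train with a bidirectional GRU encoder.
# -brnn is the flag mentioned above; -data and -save_model are assumed
# to match upstream OpenNMT-py and may need adjusting for this fork.
python train.py -data data/demo -save_model demo-model -brnn
```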