Readme: Fix link to mBART documentation (#1789)
Summary:
This PR updates the link to the mBART documentation in the main README.
Pull Request resolved: #1789

Differential Revision: D20322673

Pulled By: myleott

fbshipit-source-id: b59c94f49176ba5bbd664791818b5b8ce7402698
stefan-it authored and facebook-github-bot committed Mar 7, 2020
1 parent 1f04f81 commit 3dd221c
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion README.md
@@ -51,7 +51,7 @@ Fairseq provides reference implementations of various sequence-to-sequence models
 - [RoBERTa: A Robustly Optimized BERT Pretraining Approach (Liu et al., 2019)](examples/roberta/README.md)
 - [Facebook FAIR's WMT19 News Translation Task Submission (Ng et al., 2019)](examples/wmt19/README.md)
 - [Jointly Learning to Align and Translate with Transformer Models (Garg et al., 2019)](examples/joint_alignment_translation/README.md )
-- [Multilingual Denoising Pre-training for Neural Machine Translation (Liu et at., 2020)] (examples/mbart/README.md)
+- [Multilingual Denoising Pre-training for Neural Machine Translation (Liu et at., 2020)](examples/mbart/README.md)
 - **Non-autoregressive Transformers**
 - Non-Autoregressive Neural Machine Translation (Gu et al., 2017)
 - Deterministic Non-Autoregressive Neural Sequence Modeling by Iterative Refinement (Lee et al. 2018)
