
Fix building of docs #1340

Closed · wants to merge 1 commit
2 changes: 1 addition & 1 deletion docs/conf.py
@@ -50,7 +50,7 @@

# General information about the project.
project = 'fairseq'
copyright = '2018, Facebook AI Research (FAIR)'
copyright = '2019, Facebook AI Research (FAIR)'
author = 'Facebook AI Research (FAIR)'

github_doc_root = 'https://github.com/pytorch/fairseq/tree/master/docs/'
9 changes: 7 additions & 2 deletions examples/backtranslation/README.md
@@ -8,9 +8,14 @@ Model | Description | Dataset | Download
---|---|---|---
`transformer.wmt18.en-de` | Transformer <br> ([Edunov et al., 2018](https://arxiv.org/abs/1808.09381)) <br> WMT'18 winner | [WMT'18 English-German](http://www.statmt.org/wmt18/translation-task.html) | [download (.tar.gz)](https://dl.fbaipublicfiles.com/fairseq/models/wmt18.en-de.ensemble.tar.gz) <br> See NOTE in the archive

## Example usage
## Example usage (torch.hub)

Interactive generation from the full ensemble via PyTorch Hub:
We require a few additional Python dependencies for preprocessing:
```bash
pip install subword_nmt sacremoses
```

Then to generate translations from the full model ensemble:
```python
import torch
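# What follows is a hedged sketch of the torch.hub workflow this section
# introduces; the checkpoint_file names are illustrative (see the NOTE in
# the downloaded archive for the actual ensemble file names).
en2de_ensemble = torch.hub.load(
    'pytorch/fairseq', 'transformer.wmt18.en-de',
    checkpoint_file='model1.pt:model2.pt',  # ':'-separated checkpoints are loaded as an ensemble
    tokenizer='moses', bpe='subword_nmt')

# Translate with the ensemble
en2de_ensemble.translate('Hello world!')  # e.g. 'Hallo Welt!'
```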

5 changes: 5 additions & 0 deletions examples/language_model/README.md
@@ -12,6 +12,11 @@ Model | Description | Dataset | Download

## Example usage

We require a few additional Python dependencies for preprocessing:
```bash
pip install fastBPE sacremoses
```

To sample from a language model using PyTorch Hub:
```python
import torch
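# A hedged sketch of the sampling workflow described above; the model name
# is assumed to match an entry in this README's table, and the sampling
# arguments are illustrative.
en_lm = torch.hub.load('pytorch/fairseq', 'transformer_lm.wmt19.en',
                       tokenizer='moses', bpe='fastbpe')

# Sample a continuation of a prompt
en_lm.sample('Machine learning is', beam=1, sampling=True, sampling_topk=10, temperature=0.8)
```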
5 changes: 5 additions & 0 deletions examples/translation/README.md
@@ -20,6 +20,11 @@ Model | Description | Dataset | Download

## Example usage (torch.hub)

We require a few additional Python dependencies for preprocessing:
```bash
pip install sacremoses subword_nmt
```

Interactive translation via PyTorch Hub:
```python
import torch
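# A minimal sketch, assuming one of the models from the table above; the
# model name and BPE settings here are illustrative.
en2de = torch.hub.load('pytorch/fairseq', 'transformer.wmt16.en-de',
                       tokenizer='moses', bpe='subword_nmt')
en2de.translate('Hello world!')  # e.g. 'Hallo Welt!'
```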
13 changes: 13 additions & 0 deletions examples/wmt19/README.md
@@ -16,6 +16,15 @@ Model | Description | Download

## Example usage (torch.hub)

#### Requirements

We require a few additional Python dependencies for preprocessing:
```bash
pip install fastBPE sacremoses
```

#### Translation

```python
import torch
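# A hedged sketch of loading one of the WMT'19 translation ensembles listed
# above; the checkpoint file names mirror the ru-en example below and may
# differ from the actual archive contents.
en2de = torch.hub.load('pytorch/fairseq', 'transformer.wmt19.en-de',
                       checkpoint_file='model1.pt:model2.pt:model3.pt:model4.pt',
                       tokenizer='moses', bpe='fastbpe')
en2de.translate("Machine learning is great!")  # e.g. 'Maschinelles Lernen ist großartig!'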

@@ -38,7 +47,11 @@ en2ru.translate("Machine learning is great!") # 'Машинное обучен
ru2en = torch.hub.load('pytorch/fairseq', 'transformer.wmt19.ru-en', checkpoint_file='model1.pt:model2.pt:model3.pt:model4.pt',
                       tokenizer='moses', bpe='fastbpe')
ru2en.translate("Машинное обучение - это здорово!") # 'Machine learning is great!'
```

#### Language Modeling

```python
# Sample from the English LM
en_lm = torch.hub.load('pytorch/fairseq', 'transformer_lm.wmt19.en', tokenizer='moses', bpe='fastbpe')
en_lm.sample("Machine learning is") # 'Machine learning is the future of computing, says Microsoft boss Satya Nadella ...'
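# The same pattern should apply to the other WMT'19 language models; the
# Russian LM name below is assumed from the model table in this README.
ru_lm = torch.hub.load('pytorch/fairseq', 'transformer_lm.wmt19.ru',
                       tokenizer='moses', bpe='fastbpe')
ru_lm.sample("машинное обучение - это")  # continues the Russian prompt
```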
49 changes: 38 additions & 11 deletions setup.py
@@ -4,13 +4,13 @@
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.

import os
from setuptools import setup, find_packages, Extension
from torch.utils import cpp_extension
import sys


if sys.version_info < (3,):
    sys.exit('Sorry, Python3 is required for fairseq.')
if sys.version_info < (3, 5):
    sys.exit('Sorry, Python >=3.5 is required for fairseq.')


with open('README.md') as f:
@@ -61,15 +61,42 @@ def include_dirs(self, dirs):
        language='c++',
        extra_compile_args=extra_compile_args,
    ),
    cpp_extension.CppExtension(
        'fairseq.libnat',
        sources=[
            'fairseq/clib/libnat/edit_dist.cpp',
        ],
    )
]


cmdclass = {}


try:
    # torch is not available when generating docs
    from torch.utils import cpp_extension
    extensions.extend([
        cpp_extension.CppExtension(
            'fairseq.libnat',
            sources=[
                'fairseq/clib/libnat/edit_dist.cpp',
            ],
        ),
    ])
    cmdclass['build_ext'] = cpp_extension.BuildExtension
except ImportError:
    pass


if 'READTHEDOCS' in os.environ:
    # don't build extensions when generating docs
    extensions = []
    if 'build_ext' in cmdclass:
        del cmdclass['build_ext']

    # use CPU build of PyTorch
    dependency_links = [
        'https://download.pytorch.org/whl/cpu/torch-1.3.0%2Bcpu-cp36-cp36m-linux_x86_64.whl'
    ]
else:
    dependency_links = []


setup(
    name='fairseq',
    version='0.8.0',
@@ -92,13 +119,13 @@ def include_dirs(self, dirs):
    install_requires=[
        'cffi',
        'cython',
        'fastBPE',
        'numpy',
        'regex',
        'sacrebleu',
        'torch',
        'tqdm',
    ],
    dependency_links=dependency_links,
    packages=find_packages(exclude=['scripts', 'tests']),
    ext_modules=extensions,
    test_suite='tests',
@@ -113,6 +140,6 @@ def include_dirs(self, dirs):
            'fairseq-validate = fairseq_cli.validate:cli_main',
        ],
    },
    cmdclass={'build_ext': cpp_extension.BuildExtension},
    cmdclass=cmdclass,
    zip_safe=False,
)