Fixed failing linkcheck CI job (#3010)
* Fixed failing linkcheck CI job

* http -> https and add urls to ignore

* fixed trailing whitespace

* Removed accidental addition
vfdev-5 committed Jul 29, 2023
1 parent a46903e commit 95287aa
Showing 6 changed files with 16 additions and 9 deletions.
2 changes: 1 addition & 1 deletion docs/Makefile
@@ -13,7 +13,7 @@ help:
@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

docset: html
-doc2dash --name $(SPHINXPROJ) --icon $(SOURCEDIR)/_static/img/pytorch-logo-flame.png --enable-js --online-redirect-url http://pytorch.org/ignite/ --force $(BUILDDIR)/html/
+doc2dash --name $(SPHINXPROJ) --icon $(SOURCEDIR)/_static/img/pytorch-logo-flame.png --enable-js --online-redirect-url https://pytorch.org/ignite/ --force $(BUILDDIR)/html/

# Manually fix because Zeal doesn't deal well with `icon.png`-only at 2x resolution.
cp $(SPHINXPROJ).docset/icon.png $(SPHINXPROJ).docset/icon@2x.png
6 changes: 6 additions & 0 deletions docs/source/conf.py
@@ -346,6 +346,12 @@ def run(self):
("py:class", "torch.utils.data.dataloader.DataLoader"),
]

+linkcheck_ignore = [
+    "https://github.com/fossasia/visdom#visdom-arguments-python-only",
+    "https://github.com/pytorch/ignite/tree/master/examples/contrib/cifar10#check-resume-training",
+    "https://github.com/pytorch/ignite/tree/master/examples/mnist#training-save--resume",
+]


def setup(app):
app.add_directive("autosummary", AutolistAutosummary, override=True)
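The `linkcheck_ignore` entries added above are treated by Sphinx's linkcheck builder as regular expressions matched against the start of each checked URI, so exact URLs work as literal patterns. A minimal sketch of that matching, using a hypothetical `is_ignored` helper (not part of Sphinx's API) and the patterns from this commit:

```python
import re

# The same patterns added to docs/source/conf.py above.
linkcheck_ignore = [
    "https://github.com/fossasia/visdom#visdom-arguments-python-only",
    "https://github.com/pytorch/ignite/tree/master/examples/contrib/cifar10#check-resume-training",
    "https://github.com/pytorch/ignite/tree/master/examples/mnist#training-save--resume",
]

def is_ignored(uri: str) -> bool:
    """Hypothetical helper mirroring how linkcheck applies the patterns:
    each entry is a regex matched (anchored at the start) against the URI."""
    return any(re.match(pattern, uri) for pattern in linkcheck_ignore)

print(is_ignored("https://github.com/fossasia/visdom#visdom-arguments-python-only"))  # True
print(is_ignored("https://example.com/some-page"))  # False
```

Because matching is anchored only at the start, a pattern like `"https://github.com/"` would skip every GitHub link; listing full URLs keeps the ignore list narrow.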
8 changes: 4 additions & 4 deletions docs/source/engine.rst
@@ -117,8 +117,8 @@ from iteration.

Complete examples that resume the training from a checkpoint can be found here:

-- `save/resume MNIST <https://github.com/pytorch/ignite/tree/master/examples/mnist#user-content-training-save--resume>`_
-- `save/resume Distributed CIFAR10 <https://github.com/pytorch/ignite/tree/master/examples/contrib/cifar10#user-content-check-resume-training>`_
+- `save/resume MNIST <https://github.com/pytorch/ignite/tree/master/examples/mnist#training-save--resume>`_
+- `save/resume Distributed CIFAR10 <https://github.com/pytorch/ignite/tree/master/examples/contrib/cifar10#check-resume-training>`_

Deterministic training
----------------------
@@ -213,8 +213,8 @@ We can see that the data samples are exactly the same between original and resum
Complete examples that simulate a crash on a defined iteration and resume the training from a checkpoint can be found
here:

-- `save/resume MNIST <https://github.com/pytorch/ignite/tree/master/examples/mnist#user-content-training-save--resume>`_
-- `save/resume Distributed CIFAR10 <https://github.com/pytorch/ignite/tree/master/examples/contrib/cifar10#user-content-check-resume-training>`_
+- `save/resume MNIST <https://github.com/pytorch/ignite/tree/master/examples/mnist#training-save--resume>`_
+- `save/resume Distributed CIFAR10 <https://github.com/pytorch/ignite/tree/master/examples/contrib/cifar10#check-resume-training>`_


.. Note ::
2 changes: 1 addition & 1 deletion ignite/contrib/handlers/visdom_logger.py
@@ -43,7 +43,7 @@ class VisdomLogger(BaseLogger):
visdom server. Default, `num_workers=1`. If `num_workers=0`, the logger uses the main thread. If using
Python 2.7 and `num_workers>0` the package `futures` should be installed: `pip install futures`
kwargs: kwargs to pass into
-`visdom.Visdom <https://github.com/fossasia/visdom#user-content-visdom-arguments-python-only>`_.
+`visdom.Visdom <https://github.com/fossasia/visdom#visdom-arguments-python-only>`_.
Note:
We can also specify username/password using environment variables: VISDOM_USERNAME, VISDOM_PASSWORD
5 changes: 3 additions & 2 deletions ignite/distributed/auto.py
@@ -57,7 +57,8 @@ def auto_dataloader(dataset: Dataset, **kwargs: Any) -> Union[DataLoader, "_MpDe
)
.. _torch DataLoader: https://pytorch.org/docs/stable/data.html#torch.utils.data.DataLoader
-.. _XLA MpDeviceLoader: https://github.com/pytorch/xla/blob/master/torch_xla/distributed/parallel_loader.py#L178
+.. _XLA MpDeviceLoader:
+   https://pytorch.org/xla/release/2.0/index.html#running-on-multiple-xla-devices-with-multi-processing
.. _torch DistributedSampler:
https://pytorch.org/docs/stable/data.html#torch.utils.data.distributed.DistributedSampler
.. _torch IterableDataset: https://pytorch.org/docs/stable/data.html#torch.utils.data.IterableDataset
@@ -255,7 +256,7 @@ def auto_optim(optimizer: Optimizer, **kwargs: Any) -> Optimizer:
optimizer = idist.auto_optim(optimizer)
-.. _xm.optimizer_step: http://pytorch.org/xla/release/1.5/index.html#torch_xla.core.xla_model.optimizer_step
+.. _xm.optimizer_step: https://pytorch.org/xla/release/1.5/index.html#torch_xla.core.xla_model.optimizer_step
.. versionchanged:: 0.4.2
Added Horovod distributed optimizer.
2 changes: 1 addition & 1 deletion ignite/distributed/utils.py
@@ -305,7 +305,7 @@ def train_fn(local_rank, a, b, c, d=12):
.. _dist.init_process_group: https://pytorch.org/docs/stable/distributed.html#torch.distributed.init_process_group
.. _mp.start_processes: https://pytorch.org/docs/stable/multiprocessing.html#torch.multiprocessing.spawn
-.. _xmp.spawn: http://pytorch.org/xla/release/1.6/index.html#torch_xla.distributed.xla_multiprocessing.spawn
+.. _xmp.spawn: https://pytorch.org/xla/release/1.6/index.html#torch_xla.distributed.xla_multiprocessing.spawn
.. _hvd_run: https://horovod.readthedocs.io/en/latest/api.html#module-horovod.run
.. versionchanged:: 0.4.2
