convert readme to markdown, add note about min python versions
Markdown is easier to edit for many folks, as it has broader usage outside the Python docs ecosystem.
NeoLegends committed Dec 13, 2024
1 parent 1b79a6a commit c00d1f5
Showing 3 changed files with 74 additions and 76 deletions.
72 changes: 72 additions & 0 deletions README.md
@@ -0,0 +1,72 @@
# Welcome to RETURNN

[GitHub repository](https://github.com/rwth-i6/returnn),
[RETURNN paper 2016](https://arxiv.org/abs/1608.00895),
[RETURNN paper 2018](https://arxiv.org/abs/1805.05225).

RETURNN (RWTH extensible training framework for universal recurrent neural networks)
is a PyTorch/TensorFlow-based implementation of modern neural network architectures.
It is optimized for fast and reliable training of neural networks in a multi-GPU environment.

The high-level features and goals of RETURNN are:

- **Simplicity**
  - Writing config / code is simple & straightforward (setting up an experiment, defining a model)
  - Debugging in case of problems is simple
  - Reading config / code is simple (the defined model, training and decoding all become clear)

- **Flexibility**
- Allow for many different kinds of experiments / models

- **Efficiency**
- Training speed
- Decoding speed

All items are important for research; decoding speed is especially important for production.
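
To make the "writing a config is simple" point concrete, here is a minimal, illustrative sketch of what a RETURNN config (a plain Python file) can look like. It is loosely modeled on the demo configs; the dataset, layer names and hyperparameter values are placeholders, not a recommended setup.

```python
# Minimal illustrative RETURNN config (a plain Python file).
# Dataset, layer names and hyperparameters are placeholders;
# see the demos/ directory for configs that run as-is.

use_tensorflow = True
task = "train"

# Artificial task, generated on the fly (as used by several demos).
train = {"class": "Task12AXDataset", "num_seqs": 1000}
dev = {"class": "Task12AXDataset", "num_seqs": 100}

num_inputs = 9
num_outputs = 2

# The network is a dict of layers; each layer declares its class and inputs.
network = {
    "lstm0": {"class": "rec", "unit": "lstm", "n_out": 100, "from": "data"},
    "output": {"class": "softmax", "loss": "ce", "from": "lstm0"},
}

batch_size = 5000
max_seqs = 40
learning_rate = 0.01
num_epochs = 20
```

A config like this is typically passed to the `rnn.py` entry point; the demo configs linked further below are complete, runnable variants.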

See our [Interspeech 2020 tutorial "Efficient and Flexible Implementation of Machine Learning for ASR and MT" video](https://www.youtube.com/watch?v=wPKdYqSOlAY)
([slides](https://www-i6.informatik.rwth-aachen.de/publications/download/1154/Zeyer--2020.pdf))
with an introduction of the core concepts.

More specific features include:

- Mini-batch training of feed-forward neural networks
- Sequence-chunking based batch training for recurrent neural networks
  (see the config sketch after this list)
- Long short-term memory recurrent neural networks,
  including our own fast CUDA kernel
- Multidimensional LSTM (GPU only, there is no CPU version)
- Memory management for large data sets
- Work distribution across multiple devices
- Flexible and fast architecture which allows all kinds of encoder-attention-decoder models
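
As a rough illustration of the sequence-chunking and batching features above, these are the kinds of options involved in a config; the values here are arbitrary examples, not tuned settings.

```python
# Illustrative batching / chunking options for a RETURNN config.
# The concrete values are placeholders, not recommendations.
batch_size = 5000     # upper bound on the number of frames per mini-batch
max_seqs = 40         # upper bound on the number of sequences per mini-batch
chunking = "200:100"  # cut sequences into chunks of 200 frames, advancing by 100 frames
```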

See [documentation](https://returnn.readthedocs.io/).
See [basic usage](https://returnn.readthedocs.io/en/latest/basic_usage.html) and [technological overview](https://returnn.readthedocs.io/en/latest/tech_overview.html).

[Here is the video recording of a RETURNN overview talk](https://www-i6.informatik.rwth-aachen.de/web/Software/returnn/downloads/workshop-2019-01-29/01.recording.cut.mp4)
([slides](https://www-i6.informatik.rwth-aachen.de/web/Software/returnn/downloads/workshop-2019-01-29/01.returnn-overview.session1.handout.v1.pdf),
[exercise sheet](https://www-i6.informatik.rwth-aachen.de/web/Software/returnn/downloads/workshop-2019-01-29/01.exercise_sheet.pdf); hosted by eBay).

There are [many example demos](https://github.com/rwth-i6/returnn/blob/master/demos/)
which work on artificially generated data,
so they should work as-is.

There are [some real-world examples](https://github.com/rwth-i6/returnn-experiments)
such as setups for speech recognition on the Switchboard or LibriSpeech corpus.

Some benchmark setups against other frameworks
can be found [here](https://github.com/rwth-i6/returnn-benchmarks).
The results are in the [RETURNN paper 2016](https://arxiv.org/abs/1608.00895).
Performance benchmarks of our LSTM kernel vs. cuDNN and other TensorFlow kernels
are in the [TensorFlow LSTM benchmark](https://returnn.readthedocs.io/en/latest/tf_lstm_benchmark.html).

There is also [a wiki](https://github.com/rwth-i6/returnn/wiki).
Questions can also be asked on
[StackOverflow using the RETURNN tag](https://stackoverflow.com/questions/tagged/returnn).

[![CI](https://github.com/rwth-i6/returnn/workflows/CI/badge.svg)](https://github.com/rwth-i6/returnn/actions)

## Dependencies

The pip dependencies are listed in `requirements.txt` and `requirements-dev`, although some parts of the code may require additional dependencies (e.g. `librosa`, `resampy`) on demand.

RETURNN supports Python >= 3.8. Bumps to the minimum Python version are listed in [`CHANGELOG.md`](https://github.com/rwth-i6/returnn/blob/master/CHANGELOG.md).
74 changes: 0 additions & 74 deletions README.rst

This file was deleted.

4 changes: 2 additions & 2 deletions setup.py
@@ -87,8 +87,8 @@ def main():
author_email="albzey@gmail.com",
url="https://github.com/rwth-i6/returnn/",
license="RETURNN license",
long_description=open("README.rst").read(),
long_description_content_type="text/x-rst",
long_description=open("README.md").read(),
long_description_content_type="text/markdown",
# https://pypi.python.org/pypi?%3Aaction=list_classifiers
classifiers=[
"Development Status :: 5 - Production/Stable",
