
The next ONNX Community Workshop will be held on November 18 in Shanghai! This is a great opportunity to meet with and hear from people working with ONNX at many companies. You'll also have opportunities to participate in technical breakout sessions. Due to limited space, please submit a proposal for a short talk if you would like to attend.


Open Neural Network Exchange (ONNX) is an open ecosystem that empowers AI developers to choose the right tools as their project evolves. ONNX provides an open source format for AI models, both deep learning and traditional ML. It defines an extensible computation graph model, as well as definitions of built-in operators and standard data types. Currently we focus on the capabilities needed for inferencing (scoring).
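
To make the graph model concrete, here is a minimal sketch (not taken from the ONNX documentation) that uses the onnx.helper utilities to assemble a one-node graph with the built-in Relu operator and standard FLOAT tensor type, then validates it with the checker; the graph and producer names are illustrative:

# Minimal sketch: build and check a one-node graph with onnx.helper.
import onnx
from onnx import helper, TensorProto

# Typed graph inputs/outputs use the standard ONNX data types.
X = helper.make_tensor_value_info("X", TensorProto.FLOAT, [1, 3])
Y = helper.make_tensor_value_info("Y", TensorProto.FLOAT, [1, 3])

# A single node using the built-in Relu operator.
node = helper.make_node("Relu", inputs=["X"], outputs=["Y"])

# Assemble the computation graph, wrap it in a model, and validate it.
graph = helper.make_graph([node], "tiny_graph", [X], [Y])
model = helper.make_model(graph, producer_name="onnx-example")
onnx.checker.check_model(model)
print(helper.printable_graph(model.graph))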

ONNX is widely supported and can be found in many frameworks, tools, and hardware. Enabling interoperability between different frameworks and streamlining the path from research to production helps increase the speed of innovation in the AI community. We invite the community to join us and further evolve ONNX.

Use ONNX

Learn about the ONNX spec

Programming utilities for working with ONNX Graphs
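
As a rough illustration of those utilities, the sketch below loads a model from a hypothetical path (model.onnx), runs shape inference over it, re-checks it, and saves the result:

# Sketch of the graph utilities; "model.onnx" is a hypothetical path.
import onnx
from onnx import shape_inference

model = onnx.load("model.onnx")                  # deserialize a ModelProto
inferred = shape_inference.infer_shapes(model)   # annotate intermediate value shapes
onnx.checker.check_model(inferred)               # re-validate the annotated model
onnx.save(inferred, "model_inferred.onnx")       # serialize back to protobuf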

Contribute

ONNX is a community project. We encourage you to join the effort and contribute feedback, ideas, and code. You can participate in the SIGs and Working Groups to shape the future of ONNX.

Check out our contribution guide to get started.

If you think an operator should be added to the ONNX specification, please read this document.

Discuss

We encourage you to open GitHub issues, or use Gitter for real-time discussion: https://gitter.im/onnx/Lobby

Follow Us

Stay up to date with the latest ONNX news. [Facebook] [Twitter]

Installation

Binaries

A binary build of ONNX is available from Conda via the conda-forge channel:

conda install -c conda-forge onnx

Source

You will need protobuf and numpy installed to build ONNX. One easy way to get these dependencies is via Anaconda:

# Use conda-forge protobuf, as the default package doesn't include protoc
conda install -c conda-forge protobuf numpy

You can then install ONNX from PyPI (note: set the environment variable ONNX_ML=1 if you want the onnx-ml variant):

pip install onnx

You can also build and install ONNX locally from source code:

git clone https://github.com/onnx/onnx.git
cd onnx
git submodule update --init --recursive
python setup.py install

Note: When installing in a non-Anaconda environment, make sure to install the Protobuf compiler before running the pip installation of onnx. For example, on Ubuntu:

sudo apt-get install protobuf-compiler libprotoc-dev
pip install onnx

After installation, run

python -c "import onnx"

to verify it works. Note that this command does not work from a source checkout directory; in this case you'll see:

ModuleNotFoundError: No module named 'onnx.onnx_cpp2py_export'

Change into another directory to fix this error.
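
For a slightly richer sanity check (a sketch, not an official verification step), you can also print the installed package version and the default operator-set version it supports:

# Optional sanity check (sketch): report the installed version and opset.
import onnx
from onnx import defs

print("onnx version:", onnx.__version__)
print("default opset:", defs.onnx_opset_version())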

Testing

ONNX uses pytest as its test driver. To run the tests, first install pytest and nbval:

pip install pytest nbval

After installing pytest, run

pytest

to execute the test suite.
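
For orientation, the hypothetical test file below (not part of the ONNX test suite) shows the kind of function pytest discovers and runs:

# test_shape_inference.py -- hypothetical pytest-style test, shown only to
# illustrate what pytest picks up; it is not part of the ONNX suite.
import onnx
from onnx import helper, shape_inference, TensorProto

def test_shape_inference_adds_intermediate_value_info():
    X = helper.make_tensor_value_info("X", TensorProto.FLOAT, [2, 4])
    Y = helper.make_tensor_value_info("Y", TensorProto.FLOAT, [2, 4])
    relu = helper.make_node("Relu", ["X"], ["T"])
    copy = helper.make_node("Identity", ["T"], ["Y"])
    graph = helper.make_graph([relu, copy], "relu_identity", [X], [Y])
    model = helper.make_model(graph)
    onnx.checker.check_model(model)
    inferred = shape_inference.infer_shapes(model)
    assert "T" in {vi.name for vi in inferred.graph.value_info}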

Development

Check out the contributor guide for instructions.

License

MIT License

Code of Conduct

ONNX Open Source Code of Conduct
