
The main goal of Toree is to provide the foundation for interactive applications to connect to and use Apache Spark.

Overview

Toree provides an interface that allows clients to interact with a Spark cluster. Clients can send libraries and snippets of code that are interpreted and run against a preconfigured Spark context. These snippets can do a variety of things:

  1. Define and run Spark jobs of all kinds (see the sketch after this list)
  2. Collect results from Spark and push them to the client
  3. Load necessary dependencies for the running code
  4. Start and monitor a stream
  5. ...
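
For example (a minimal sketch, assuming the default Scala interpreter), a notebook cell sent to Toree can run a small Spark job against the preconfigured Spark context, which Toree exposes to cells as sc, and print the result back to the client:

// Runs inside a Toree notebook cell; `sc` is the preconfigured SparkContext.
val numbers = sc.parallelize(1 to 1000)                          // distribute a local range as an RDD
val sumOfSquares = numbers.map(n => n.toLong * n).reduce(_ + _)  // execute the job on the cluster
println(s"Sum of squares: $sumOfSquares")                        // output is sent back to the client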

The main supported language is Scala, but Toree is also capable of processing both Python and R. It implements the latest Jupyter message protocol (5.0), so it can easily plug into the latest releases of Jupyter/IPython (3.2.x+ and 4.x+) for quick, interactive data exploration.

Try It

A version of Toree is deployed as part of the Try Jupyter! site. Select Scala 2.10.4 (Spark 1.4.1) under the New dropdown. Note that this version only supports Scala.

Develop

This project uses make as the entry point for build, test, and packaging. It supports two modes: local and vagrant. The default is local, and all commands (e.g. sbt) are run locally on your machine. This means that you need to install sbt, jupyter/ipython, and other development requirements locally on your machine. The second mode uses Vagrant to simplify the development experience. In vagrant mode, all commands are sent to a vagrant box that has all the necessary dependencies pre-installed. To run in vagrant mode, run export USE_VAGRANT=true.

To build and interact with Toree using Jupyter, run

make dev

This will start a Jupyter notebook server. Depending on your mode, it will be accessible at http://localhost:8888 (local) or http://192.168.44.44:8888 (vagrant). From here you can create notebooks that use Toree configured for Spark local mode.
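
Inside such a notebook, a cell can also pull extra libraries into the running session before using them. As a hedged sketch (the magic name, coordinates, and flags shown here may differ across Toree versions), Toree's %AddDeps magic fetches a dependency by its Maven coordinates and adds it to the interpreter's classpath:

%AddDeps org.joda joda-time 2.9.4 --transitive
// After the magic above has fetched the (illustrative) joda-time coordinates,
// a following cell can import and use the library:
import org.joda.time.DateTime
println(DateTime.now())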

Tests can be run with make test.

NOTE: Do not use sbt directly.

Build & Package

To build and package Toree, run

make release

This results in two packages:

  • ./dist/toree-<VERSION>-binary-release.tar.gz is a simple package that contains the JAR and executable
  • ./dist/toree-<VERSION>.tar.gz is a pip-installable package that adds Toree as a Jupyter kernel.

NOTE: make release uses Docker. Please refer to the Docker installation instructions for your system. USE_VAGRANT is not supported by this make target.

Run Examples

To play with the example notebooks, run

make jupyter

A notebook server will be launched in a Docker container with Toree and some other dependencies installed. Refer to your Docker setup for the IP address. The notebook server will be at http://<ip>:8888/.

Install

Dev snapshots of Toree are located at https://dist.apache.org/repos/dist/dev/incubator/toree. To install using one of those packages, you can use the following:

pip install <PIP_RELEASE_URL>
jupyter toree install

where PIP_RELEASE_URL is the URL of one of the pip packages. For example:

pip install https://dist.apache.org/repos/dist/dev/incubator/toree/0.2.0/snapshots/dev1/toree-pip/toree-0.2.0.dev1.tar.gz
jupyter toree install
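
After installation, you can confirm that the kernel was registered by running jupyter kernelspec list; the Toree kernel should then show up as an option when creating a new notebook in Jupyter.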

Reporting Issues

Refer to and open issues in the project's issue tracker.

Communication

You can reach us through Gitter or our mailing list.

Version

We are working on publishing binary releases of Toree soon. As part of our move into the Apache Incubator, Toree will begin a new version sequence starting at 0.1.

Our goal is to keep master up to date with the latest version of Spark. When new versions of Spark require specific code changes to Toree, we will branch off support for older Spark versions.

As it stands, we maintain several branches for legacy versions of Spark. The table below shows what is available now.

Branch   Apache Spark Version
master   2.0
0.1.x    1.6+

Please note that new features will mainly be added to the master branch.

Resources

We are currently enhancing our documentation, which is available on our website.