dre-hh/spark-docker


Spark Docker

Launch a Spark cluster in Docker with one master node and n worker nodes. A separate Jupyter notebook node runs PySpark and can submit jobs to the cluster.
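The repository ships its own `docker-compose.yml`; the sketch below only illustrates the topology described above (one master, scalable workers, a Jupyter node). Service names, commands, and ports here are assumptions, not the repo's actual file:

```yaml
# Illustrative sketch only -- see the repository's docker-compose.yml
# for the real configuration. Names, commands, and ports are assumptions.
services:
  master:
    image: dre_hh/spark
    command: spark-class org.apache.spark.deploy.master.Master
    ports:
      - "8080:8080"          # Spark master web UI
  worker:
    image: dre_hh/spark
    command: spark-class org.apache.spark.deploy.worker.Worker spark://master:7077
    depends_on:
      - master
  notebook:
    image: dre_hh/spark
    command: jupyter notebook --ip=0.0.0.0 --port=9000
    ports:
      - "9000:9000"          # Jupyter; the access token is printed in the logs
```

With a layout like this, the "n worker nodes" from the description would come from scaling the worker service, e.g. `docker-compose up --scale worker=3`.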

```sh
# build the Spark image
docker build -t dre_hh/spark -f Dockerfile.spark .

# start the cluster (master, workers, Jupyter notebook node)
docker-compose up
```

Visit http://127.0.0.1:9000/?token=<PRINTED_TOKEN> (the token is printed in the `docker-compose up` output), then open the demo notebook at `py/Spark.ipynb`.
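The demo notebook drives the cluster through a `SparkSession`. A minimal sketch of such a job is below; it assumes `pyspark` is available on the notebook node and that the master is reachable at `spark://master:7077` (the actual hostname depends on the compose service name), so it only runs against a live cluster:

```python
# Minimal PySpark job sketch -- requires a running Spark cluster.
# The master URL below is an assumption based on typical compose setups.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("spark://master:7077")  # assumed service name and default port
    .appName("demo")
    .getOrCreate()
)

# Distribute a small dataset across the workers and aggregate it.
rdd = spark.sparkContext.parallelize(range(100))
print(rdd.sum())  # sum of 0..99 = 4950

spark.stop()
```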
