docker-compose up for a sandbox Spark cluster and hello-pyspark application

hughesadam87/pyspark-sandbox-cluster

PySpark Sandbox Cluster

A minimal sandbox Spark cluster (1 master, 1 worker) run via docker-compose, as described in this Medium article.
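The repo's `docker-compose.yml` is not reproduced on this page. A typical Bitnami-based layout for a 1-master, 1-worker sandbox looks roughly like the sketch below; the service names and port mappings are assumptions (Bitnami's defaults), not necessarily what the repo uses.

```yaml
# Sketch of a minimal docker-compose.yml using bitnami/spark
# (service names and ports are assumed defaults, not copied from the repo)
services:
  spark-master:
    image: bitnami/spark:3.5.1
    environment:
      - SPARK_MODE=master
    ports:
      - "8080:8080"   # master web UI
      - "7077:7077"   # master RPC port the driver connects to

  spark-worker:
    image: bitnami/spark:3.5.1
    environment:
      - SPARK_MODE=worker
      - SPARK_MASTER_URL=spark://spark-master:7077
    depends_on:
      - spark-master
```

`SPARK_MODE` and `SPARK_MASTER_URL` are the standard Bitnami Spark container variables; exposing 7077 on the host is what lets a driver running outside Docker (the `hello-pyspark.py` step below) reach the master.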

TL;DR: I'm not reading that article

  1. export JAVA_HOME=/path/to/jre
  2. docker pull bitnami/spark:3.5.1
  3. conda create -n pyspark-311 python=3.11 pyspark=3.5.1
  4. export HOST_IP=<YOUR IPV4 IP>
  5. docker-compose up --build
  6. conda activate pyspark-311 && python hello-pyspark.py
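The `hello-pyspark.py` script from step 6 is not shown on this page. A minimal sketch of such an app, assuming the master is reachable at `spark://$HOST_IP:7077` (7077 being Spark's default master port) and that the job is a simple row count, might look like this; the app name and the row-count job are illustrative, not taken from the repo.

```python
import os


def master_url() -> str:
    """Build the Spark master URL from HOST_IP (exported in step 4)."""
    host_ip = os.environ.get("HOST_IP", "127.0.0.1")
    return f"spark://{host_ip}:7077"  # 7077 is the default master RPC port


def main() -> None:
    # Requires the conda env from step 3 (pyspark 3.5.1, matching the image)
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("hello-pyspark")   # illustrative app name
        .master(master_url())
        .getOrCreate()
    )
    df = spark.range(1000)          # tiny DataFrame, just to exercise the cluster
    print("row count:", df.count())
    spark.stop()


if __name__ == "__main__":
    main()
```

Matching the client-side `pyspark` version (3.5.1) to the `bitnami/spark:3.5.1` image matters: driver/executor version skew is a common source of opaque serialization errors in sandbox setups like this.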
