A spark sbt blueprint to build your own spark apps off of.

mieitza/SparkStreamingApps

 
 


This repo contains two distinct resources:

  • Testing of Spark Docker containers, orchestrated via Vagrant.
  • Spark Streaming blueprint applications.

If you are interested only in the Docker containers, check out the deploy/ directory.

The two are mostly unrelated; they just happen to live in the same repo.

Spark Streaming Blueprint apps.

To use it, import it into IntelliJ or your favorite IDE as an SBT project.

You should then be able to run the standard SBT compile and test tasks inside your IDE, as well as in standalone mode.

  1. git clone jayunit100/SparkBluePrint

  2. Remove the folders that don't apply to your project.

  3. Open IntelliJ and import the project.

  4. Pick "SBT" as the project template.

  5. Run the Tester class via IntelliJ.

  6. Change the .git/config to point to your own repository.
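The command-line portion of the steps above can be sketched as follows. The GitHub URLs are assumptions: substitute the actual location of the blueprint repo and of your own repository.

```shell
# Step 1: clone the blueprint (assuming it is hosted on GitHub under
# jayunit100; adjust the URL if it lives elsewhere).
git clone https://github.com/jayunit100/SparkBluePrint.git
cd SparkBluePrint

# Step 6: repoint origin at your own repository, instead of editing
# .git/config by hand (the URL below is a placeholder).
git remote set-url origin https://github.com/your-user/your-repo.git
```

`git remote set-url` rewrites the same `[remote "origin"]` entry in .git/config that step 6 refers to.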

Running in a real cluster

  • Set up a Spark cluster with Cassandra slaves. There is a WIP project under deploy/ that sets up scaffolding for this, using Dockerfiles and Vagrant to create an n-node Spark cluster with a Cassandra sink.

  • Run sbt package to create the jar.

  • Copy the jar into the shared directory defined in the Vagrantfile under deploy/ (in general, just copy the jar onto the machine that submits your Spark jobs).

  • spark-submit the application jar with the desired class name (details coming soon).
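Sketched as commands, the build-and-submit flow might look like this. The main class, jar path, and master URL are all placeholders, since the repo's actual class names are not documented here.

```shell
# Build the application jar with sbt; the jar lands under target/.
sbt package

# Submit to the cluster. --class, --master, and the jar path are
# placeholders; substitute your application's main class and your
# cluster's Spark master URL.
spark-submit \
  --class com.example.StreamingApp \
  --master spark://spark-master:7077 \
  target/scala-2.10/sparkstreamingapps_2.10-0.1.jar
```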

Feedback welcome!

Languages

  • Scala 74.1%
  • Shell 22.6%
  • Ruby 3.3%