Upgrading to Spark 1.0 #256
Conversation
Fixes #253

All automated tests passed.
@@ -102,6 +102,10 @@
       <groupId>org.apache.spark</groupId>
       <artifactId>spark-core_${scala.artifact.suffix}</artifactId>
     </dependency>
+    <dependency>
Can you (or IntelliJ) format this for better readability?
Will do. I blame vim for this.
Also adding the fastutil dependency back in. Spark 0.9 pulled in fastutil transitively, and we depended on it as well; Spark 1.0 apparently removes that dependency, so this commit declares it explicitly for our own use.
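For reference, a minimal sketch of what the re-added pom.xml entry looks like. The it.unimi.dsi coordinates are fastutil's published Maven coordinates, but the version shown is an assumption (roughly the 6.x line Spark 0.9 shipped against), not taken from this diff:

```xml
<!-- fastutil: previously pulled in transitively via Spark 0.9, dropped by
     Spark 1.0, so declared explicitly here for our own use. The version is
     illustrative; pin whichever release the code actually builds against. -->
<dependency>
  <groupId>it.unimi.dsi</groupId>
  <artifactId>fastutil</artifactId>
  <version>6.4.4</version>
</dependency>
```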
Apparently, the signature of the RDD.groupBy method has changed in 1.0 to return an Iterable rather than a Seq. This commit includes all the changes that are needed to account for this downstream in our code, mostly updating types to Iterable and inserting a few calls to toSeq in cases where that's not sufficient.
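A minimal sketch of the kind of downstream change this implies; the function and variable names here are illustrative, not taken from this PR (note the pair-RDD implicits import, which Spark 1.0 still requires):

```scala
import org.apache.spark.SparkContext._  // pair-RDD implicits (needed before Spark 1.3)
import org.apache.spark.rdd.RDD

// Spark 0.9: rdd.groupBy(f) returned RDD[(K, Seq[T])]
// Spark 1.0: rdd.groupBy(f) returns  RDD[(K, Iterable[T])]
def groupByContig(reads: RDD[(String, Int)]): RDD[(String, Iterable[(String, Int)])] =
  reads.groupBy(_._1)  // declared result type updated from Seq to Iterable

// Where downstream code genuinely needs a Seq (indexing, sorting, length),
// an explicit toSeq is inserted instead:
def sortedByContig(reads: RDD[(String, Int)]): RDD[(String, Seq[(String, Int)])] =
  reads.groupBy(_._1).mapValues(_.toSeq.sortBy(_._2))
```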
As suggested by Matt and Frank, updated two things: 1. set the spark.kryo.referenceTracking value to 'true', which fixes a StackOverflowError, and 2. updated the target (test) values for the IndelRealignmentTargetSuite tests, which Frank says are going to change soon anyway.
See the thread here: https://issues.apache.org/jira/browse/SPARK-1851
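For concreteness, a sketch of setting that flag programmatically; the app name and the Kryo serializer setup are illustrative assumptions, not taken from this PR, and the setting could equally live wherever the project builds its SparkConf:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Kryo's reference tracking records already-written objects so that repeated
// or cyclic references serialize as back-references; with it disabled, deeply
// nested object graphs can recurse until they hit a StackOverflowError.
val conf = new SparkConf()
  .setAppName("adam")  // illustrative app name
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.kryo.referenceTracking", "true")
val sc = new SparkContext(conf)
```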
Matt, I think this rebase should address your comments. Let me know if you see any other details to be fixed!

All automated tests passed.

Thanks, Timothy!
Upgrading the dependency on Spark to version 1.0.0.
The major changes here are: