
[Attn] Support for Spark 2.4.2 #60

Closed
imback82 opened this issue Apr 26, 2019 · 4 comments
Assignees
Labels
wontfix This will not be worked on

Comments

@imback82
Contributor

imback82 commented Apr 26, 2019

Summary: You cannot use .NET for Apache Spark with Apache Spark 2.4.2

Details: Spark 2.4.2 was released on 4/23/19, and using it with microsoft.spark.2.4.x results in unexpected behavior (reported in #48, #49). The expected behavior is an exception message such as "Unsupported spark version used: 2.4.2. Supported versions: 2.4.0, 2.4.1", which would cause less confusion. This is likely due to the Scala version upgrade to 2.12 in Spark 2.4.2; note that microsoft.spark.2.4.x is built with Scala 2.11.
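The expected version guard described above can be sketched roughly as follows. This is a minimal illustration only, with hypothetical names (SUPPORTED_VERSIONS, check_spark_version); the actual check lives in the Scala-side DotnetRunner, not in Python:

```python
# Hypothetical sketch of the version guard; names are illustrative,
# not the actual DotnetRunner implementation.
SUPPORTED_VERSIONS = ("2.4.0", "2.4.1")

def check_spark_version(version: str) -> None:
    """Raise a clear error for unsupported Spark versions instead of
    failing later with a confusing NoClassDefFoundError."""
    if version not in SUPPORTED_VERSIONS:
        raise ValueError(
            "Unsupported spark version used: {}. Supported versions: {}"
            .format(version, ", ".join(SUPPORTED_VERSIONS))
        )
```

A guard like this fails fast with an actionable message, which is exactly what users of Spark 2.4.2 did not get.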

There is an ongoing discussion about this (http://apache-spark-developers-list.1001551.n3.nabble.com/VOTE-Release-Apache-Spark-2-4-2-tc27075.html#a27139), so depending on the outcome of the discussion, 2.4.2 may or may not be supported.

Do you want to help? While we are closely monitoring and working with the Apache Spark community to address this issue, feel free to reply to this thread with any problems this issue has caused you, so we can avoid such mishaps in the future.

@imback82 imback82 added the enhancement New feature or request label Apr 26, 2019
@imback82 imback82 self-assigned this Apr 26, 2019
@rapoth rapoth pinned this issue Apr 26, 2019
@imback82 imback82 changed the title Support for Spark 2.4.2 [Announcement] Support for Spark 2.4.2 Apr 26, 2019
@imback82 imback82 changed the title [Announcement] Support for Spark 2.4.2 [Attn] Support for Spark 2.4.2 Apr 26, 2019
@tang2087

Strongly support this one. I spent more than an hour suspecting something was wrong with my Spark configuration, because it kept raising an error about the Spark Logging class not being found while I was using 2.4.2.

@rapoth rapoth added the bug Something isn't working label Apr 26, 2019
@borgdylan

I get this:

dylan@ubuntu-server:/mnt/ssd/vnext/sparktest/bin/Debug/netcoreapp2.0/ubuntu.16.04-x64$ spark-submit --class org.apache.spark.deploy.DotnetRunner --master local microsoft-spark-2.4.x-0.2.0.jar dotnet sparktest.dll
19/05/08 23:05:31 WARN Utils: Your hostname, ubuntu-server resolves to a loopback address: 127.0.1.1; using 10.11.1.24 instead (on interface team0)
19/05/08 23:05:31 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
19/05/08 23:05:31 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging$class
	at org.apache.spark.deploy.DotnetRunner$.<init>(DotnetRunner.scala:34)
	at org.apache.spark.deploy.DotnetRunner$.<clinit>(DotnetRunner.scala)
	at org.apache.spark.deploy.DotnetRunner.main(DotnetRunner.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging$class
	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 15 more
log4j:WARN No appenders could be found for logger (org.apache.spark.util.ShutdownHookManager).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

when using v2.4.2. Please provide directions on which version of Spark to install.
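The NoClassDefFoundError above is a symptom of a Scala binary mismatch: the jar was compiled against Scala 2.11, but Spark 2.4.2 ships with Scala 2.12. One way to tell which Scala version a Spark distribution was built with is to look at the scala-library jar under $SPARK_HOME/jars. A hedged sketch (the helper names are mine, not part of any project API):

```python
import glob
import os
import re
from typing import Optional

def scala_version_from_jar_name(jar_name: str) -> Optional[str]:
    """Extract the Scala binary version (e.g. '2.11') from a
    scala-library jar filename such as 'scala-library-2.11.12.jar'."""
    m = re.match(r"scala-library-(\d+\.\d+)\.\d+\.jar$", jar_name)
    return m.group(1) if m else None

def spark_scala_version(spark_home: str) -> Optional[str]:
    """Scan $SPARK_HOME/jars for the bundled scala-library jar and
    report its Scala binary version, or None if not found."""
    pattern = os.path.join(spark_home, "jars", "scala-library-*.jar")
    for path in glob.glob(pattern):
        version = scala_version_from_jar_name(os.path.basename(path))
        if version:
            return version
    return None
```

If this reports 2.12 while your application jar is a 2.11 build (like microsoft-spark-2.4.x-0.2.0.jar), errors like the Logging$class one above are expected.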

@imback82
Contributor Author

imback82 commented May 8, 2019

Spark 2.4.3 is now released: http://spark.apache.org/releases/spark-release-2-4-3.html, and it reverts the default Scala version to 2.11. If you use microsoft-spark-2.4.x-0.2.0.jar against Spark 2.4.3, you now get the correct error message:

Exception in thread "main" java.lang.IllegalArgumentException: Unsupported spark version used: 2.4.3. Normalized spark version used: 2.4.3. Supported versions: 2.4.0, 2.4.1
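The "Normalized spark version" part of that message suggests the jar reduces vendor-specific version strings (for example, a distribution-suffixed 2.4.0.2.6.99.201-90) to plain major.minor.patch before comparing against the supported list. A rough sketch of that idea, with a hypothetical function name (this is an assumption about the behavior, not the actual implementation):

```python
import re

def normalize_spark_version(version: str) -> str:
    """Reduce a possibly vendor-suffixed Spark version string
    (e.g. '2.4.0.2.6.99.201-90') to plain major.minor.patch.
    Returns the input unchanged if it does not match the pattern."""
    m = re.match(r"(\d+\.\d+\.\d+)", version)
    return m.group(1) if m else version
```

Under this scheme, both the raw and normalized versions of plain "2.4.3" are identical, which is why the message above prints the same string twice.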

We are not going to support Spark 2.4.2, since doing so would require a new microsoft.spark-2.4.2-<version>.jar that targets only Spark 2.4.2.

We will release 0.3.0 that supports Spark 2.4.3 soon.

@imback82 imback82 unpinned this issue May 10, 2019
@imback82 imback82 added wontfix This will not be worked on and removed bug Something isn't working enhancement New feature or request labels May 10, 2019
@imback82 imback82 pinned this issue May 10, 2019
@rapoth rapoth unpinned this issue Aug 19, 2020
4 participants