
[SPARK-44259][CONNECT][TESTS][FOLLOWUP] No longer initializing Ammonite for Java 21 #41814

Closed
Wants to merge 6 commits

Conversation

LuciferYang
Contributor

@LuciferYang commented Jul 2, 2023

What changes were proposed in this pull request?

This PR adds a check condition to the `beforeAll` function of `ReplE2ESuite`, so that `beforeAll` no longer initializes Ammonite when the tests run on a Java version newer than 17.
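The shape of the guard can be sketched as follows. This is a minimal, self-contained illustration of the approach, not the suite's actual code: the real patch uses commons-lang3's `SystemUtils.isJavaVersionAtMost(JavaVersion.JAVA_17)`, while this sketch (the `VersionGate` class name is hypothetical) derives the same decision from `Runtime.version().feature()` in the JDK standard library.

```java
// Minimal sketch of the version gate: perform the Ammonite-based REPL setup
// only when the running JVM is Java 17 or older, and skip it otherwise.
// Class and method names here are illustrative, not taken from the patch.
public class VersionGate {
    // True when the running JVM's feature release is at most maxFeature
    // (e.g. 17 for Java 17). Mirrors SystemUtils.isJavaVersionAtMost.
    static boolean isJavaVersionAtMost(int maxFeature) {
        return Runtime.version().feature() <= maxFeature;
    }

    public static void main(String[] args) {
        if (isJavaVersionAtMost(17)) {
            System.out.println("Java <= 17: initializing Ammonite REPL");
        } else {
            System.out.println("Java > 17: skipping Ammonite initialization");
        }
    }
}
```

With a guard of this shape, `beforeAll` performs no Ammonite work on a Java 21 runner, so the REPL tests are canceled instead of aborting the whole suite, which matches the `canceled` counts in the GA output below.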

Why are the changes needed?

Make the `connect-client-jvm` module tests pass with Java 21 on GA.

Does this PR introduce any user-facing change?

No

How was this patch tested?

  • Pass GitHub Actions
  • Checked with GA

Before

- https://github.com/apache/spark/actions/runs/5434602425/jobs/9883143909

at java.base/java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:200)
	at java.base/sun.nio.ch.FileChannelImpl.endBlocking(FileChannelImpl.java:172)
	at java.base/sun.nio.ch.FileChannelImpl.size(FileChannelImpl.java:430)
	at jdk.zipfs/jdk.nio.zipfs.ZipFileSystem.findEND(ZipFileSystem.java:1255)
	at jdk.zipfs/jdk.nio.zipfs.ZipFileSystem.initCEN(ZipFileSystem.java:1541)
	at jdk.zipfs/jdk.nio.zipfs.ZipFileSystem.<init>(ZipFileSystem.java:179)
	at jdk.zipfs/jdk.nio.zipfs.ZipFileSystemProvider.getZipFileSystem(ZipFileSystemProvider.java:125)
	at jdk.zipfs/jdk.nio.zipfs.ZipFileSystemProvider.newFileSystem(ZipFileSystemProvider.java:106)
	at java.base/java.nio.file.FileSystems.newFileSystem(FileSystems.java:339)
	at java.base/java.nio.file.FileSystems.newFileSystem(FileSystems.java:288)
	at io.github.retronym.java9rtexport.Export.rt(Export.java:60)
	at io.github.retronym.java9rtexport.Export.rtTo(Export.java:88)
	at io.github.retronym.java9rtexport.Export.rtAt(Export.java:100)
	at io.github.retronym.java9rtexport.Export.rtAt(Export.java:105)
	at ammonite.util.Classpath$.classpath(Classpath.scala:76)
	at ammonite.compiler.CompilerLifecycleManager.init(CompilerLifecycleManager.scala:92)
	at ammonite.compiler.CompilerLifecycleManager.preprocess(CompilerLifecycleManager.scala:64)
	at ammonite.interp.Interpreter.compileRunBlock$1(Interpreter.scala:526)
	at ammonite.interp.Interpreter.$anonfun$processAllScriptBlocks$15(Interpreter.scala:587)
	at ammonite.util.Res$Success.flatMap(Res.scala:62)
	at ammonite.interp.Interpreter.$anonfun$processAllScriptBlocks$14(Interpreter.scala:584)
	at ammonite.util.Res$Success.flatMap(Res.scala:62)
	at ammonite.interp.Interpreter.$anonfun$processAllScriptBlocks$12(Interpreter.scala:581)
	at scala.Option.getOrElse(Option.scala:189)
	at ammonite.interp.Interpreter.loop$1(Interpreter.scala:581)
	at ammonite.interp.Interpreter.processAllScriptBlocks(Interpreter.scala:619)
	at ammonite.interp.Interpreter.$anonfun$processModule$6(Interpreter.scala:414)
	at ammonite.util.Catching.flatMap(Res.scala:115)
	at ammonite.interp.Interpreter.$anonfun$processModule$5(Interpreter.scala:405)
	at ammonite.util.Res$Success.flatMap(Res.scala:62)
	at ammonite.interp.Interpreter.processModule(Interpreter.scala:395)
	at ammonite.interp.Interpreter.$anonfun$initializePredef$3(Interpreter.scala:148)
	at ammonite.interp.Interpreter.$anonfun$initializePredef$3$adapted(Interpreter.scala:148)
	at ammonite.interp.PredefInitialization$.$anonfun$apply$2(PredefInitialization.scala:79)
	at ammonite.util.Res$.$anonfun$fold$1(Res.scala:32)
	at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:126)
	at scala.collection.LinearSeqOptimized.foldLeft$(LinearSeqOptimized.scala:122)
	at scala.collection.immutable.List.foldLeft(List.scala:91)
	at ammonite.util.Res$.fold(Res.scala:30)
	at ammonite.interp.PredefInitialization$.apply(PredefInitialization.scala:67)
	at ammonite.interp.Interpreter.initializePredef(Interpreter.scala:150)
	at ammonite.repl.Repl.initializePredef(Repl.scala:144)
	at ammonite.Main.run(Main.scala:224)
	at org.apache.spark.sql.application.ConnectRepl$.doMain(ConnectRepl.scala:104)
	at org.apache.spark.sql.application.ReplE2ESuite$$anon$1.run(ReplE2ESuite.scala:60)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:572)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
	at java.base/java.lang.Thread.run(Thread.java:1583)
	...
[error] Error during tests:
[error] 	Running java with options -classpath /home/runner/work/spark/spark/connector/connect/client/jvm/target/scala-2.12/test-classes:/home/runner/work/spark/spark/connector/connect/client/jvm/target/scala-2.12/spark-connect-client-jvm_2.12-3.5.0-SNAPSHOT.jar:/home/runner/work/spark/spark/connector/connect/common/target/scala-2.12/spark-connect-common_2.12-3.5.0-
...
[error] (connect-client-jvm / Test / test) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 50 s, completed Jul 2, 2023, 4:55:33 AM
[error] running /home/runner/work/spark/spark/build/sbt -Phadoop-3 -Pyarn -Pmesos -Pconnect -Phadoop-cloud -Pkubernetes -Pspark-ganglia-lgpl -Pvolcano sql-kafka-0-10/test connect/test connect-client-jvm/test protobuf/test streaming/test streaming-kafka-0-10/test token-provider-kafka-0-10/test mllib-local/test mllib/test yarn/test network-yarn/test mesos/test kubernetes/test hadoop-cloud/test ; received return code 1
Error: Process completed with exit code 18.

The test result was judged as failed on GA.

After

- https://github.com/LuciferYang/spark/actions/runs/5439928518/jobs/9892364759

[info] Run completed in 10 seconds, 973 milliseconds.
[info] Total number of tests run: 858
[info] Suites: completed 22, aborted 0
[info] Tests: succeeded 858, failed 0, canceled 167, ignored 1, pending 0
[info] All tests passed.
<img width="1274" alt="image" src="https://github.com/apache/spark/assets/1475305/8f21a8dc-18b1-4663-9698-27513adbc38d">

@LuciferYang
Contributor Author

The `connect-client-jvm` tests still fail with Java 21. I'm not sure whether the Ammonite connect timeout is a critical issue; let's give it a try.

@LuciferYang
Contributor Author

Another possible issue is:

Reporter completed abruptly with an exception after receiving event: TestSucceeded(Ordinal(0, 1103),PlanGenerationTestSuite,org.apache.spark.sql.PlanGenerationTestSuite,Some(org.apache.spark.sql.PlanGenerationTestSuite),function power,function power,Vector(),Some(1),Some(IndentedText(- function power,function power,0)),Some(LineInFile(145,PlanGenerationTestSuite.scala,Some(Please set the environment variable SCALACTIC_FILL_FILE_PATHNAMES to yes at compile time to enable this feature.))),Some(org.apache.spark.sql.PlanGenerationTestSuite),None,pool-1-thread-1-ScalaTest-running-PlanGenerationTestSuite,1688273732864).
java.net.SocketException: Broken pipe
	at java.base/sun.nio.ch.SocketDispatcher.write0(Native Method)
	at java.base/sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:62)
	at java.base/sun.nio.ch.NioSocketImpl.tryWrite(NioSocketImpl.java:394)
	at java.base/sun.nio.ch.NioSocketImpl.implWrite(NioSocketImpl.java:410)
	at java.base/sun.nio.ch.NioSocketImpl.write(NioSocketImpl.java:440)
	at java.base/sun.nio.ch.NioSocketImpl$2.write(NioSocketImpl.java:819)
	at java.base/java.net.Socket$SocketOutputStream.write(Socket.java:1195)
	at java.base/java.io.ObjectOutputStream$BlockDataOutputStream.drain(ObjectOutputStream.java:1899)
	at java.base/java.io.ObjectOutputStream$BlockDataOutputStream.writeByte(ObjectOutputStream.java:1937)
	at java.base/java.io.ObjectOutputStream.writeFatalException(ObjectOutputStream.java:1614)

But I cannot reproduce the failure locally.

cc @dongjoon-hyun

@LuciferYang
Contributor Author

LuciferYang commented Jul 2, 2023

@LuciferYang
Contributor Author

Let's wait for the test result of https://github.com/LuciferYang/spark/actions/runs/5434811680/jobs/9883503801 first

[info] UserDefinedFunctionSuite:
[info] - udf and encoder serialization (41 milliseconds)
[info] SparkConnectClientBuilderParseTestSuite:
[info] - Argument - host (1 millisecond)
[info] - Argument - port (1 millisecond)
[info] - Argument - token (0 milliseconds)
[info] - Argument - user_id (0 milliseconds)
[info] - Argument - user_name (0 milliseconds)
[info] - Argument - user_agent (0 milliseconds)
[info] - Argument - remote (0 milliseconds)
[info] - Argument - use_ssl (0 milliseconds)
[info] - Argument - option (9 milliseconds)
[info] - Argument - unsupported (0 milliseconds)
[info] - SparkSession - create (1 millisecond)
[info] Run completed in 11 seconds, 332 milliseconds.
[info] Total number of tests run: 858
[info] Suites: completed 22, aborted 0
[info] Tests: succeeded 858, failed 0, canceled 167, ignored 1, pending 0
[info] All tests passed.

Should be OK.

@github-actions github-actions bot removed the DSTREAM label Jul 3, 2023
@LuciferYang
Contributor Author

Reverted; waiting for the test result for Java 21.

@github-actions github-actions bot removed the INFRA label Jul 3, 2023
@LuciferYang LuciferYang marked this pull request as ready for review July 3, 2023 06:12
@LuciferYang
Contributor Author

cc @HyukjinKwon @dongjoon-hyun FYI

All Scala tests should pass with Java 21 after this PR.

@LuciferYang
Contributor Author

Friendly ping @HyukjinKwon, please take a look when you have time, thanks ~

outputStream = ammoniteOut,
errorStream = errorStream)
// TODO(SPARK-44121) Remove this check condition
if (SystemUtils.isJavaVersionAtMost(JavaVersion.JAVA_17)) {
Member

Hm, how does this test work with 21 though? we're skipping the Spark Connect shell here.

@LuciferYang Jul 5, 2023

All RemoteSparkSession-based tests are already skipped with Java 21 as of #41805, so we shouldn't initialize Ammonite or connect to the remote Spark Connect server here.

This did not make the tests fail locally, but the test on GA failed (the daily Java 21 test run).

@HyukjinKwon
Member

Merged to master.

@LuciferYang
Contributor Author

Thanks @HyukjinKwon ~ I think the Java 21 daily test will be green tomorrow :)

@LuciferYang
Contributor Author

https://github.com/apache/spark/actions/runs/5471402855


All passed. We need to wait for Arrow to release a new version before continuing with future work :)

ragnarok56 pushed a commit to ragnarok56/spark that referenced this pull request Mar 2, 2024
Closes apache#41814 from LuciferYang/SPARK-44259-FOLLOWUP.

Lead-authored-by: yangjie01 <yangjie01@baidu.com>
Co-authored-by: YangJie <yangjie01@baidu.com>
Signed-off-by: Hyukjin Kwon <gurwls223@apache.org>