[ERROR] [Error] /home/user/gits/NVIDIA/spark-rapids/sql-plugin/src/main/spark320/scala/com/nvidia/spark/rapids/v1FallbackWriters.scala:104: type mismatch;
found : Unit
required: Seq[org.apache.spark.sql.catalyst.InternalRow]
[INFO] [Info] : Unit <: Seq[org.apache.spark.sql.catalyst.InternalRow]?
[INFO] [Info] : false
[ERROR] [Error] /home/user/gits/NVIDIA/spark-rapids/sql-plugin/src/main/spark320/scala/com/nvidia/spark/rapids/v1FallbackWriters.scala:102: local val writtenRows in method run is never used
Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-locals, site=com.nvidia.spark.rapids.GpuV1FallbackWriters.run.writtenRows
[ERROR] two errors found
Describe the bug
The spark400 build fails because of apache/spark@f1b68d8: the V1 fallback write path now returns `Unit` where the plugin expects `Seq[InternalRow]` (see the compile errors above).
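A hedged sketch of the breakage and one way the plugin side could adapt. The names `writeWithV1` and `run` follow the error messages above; the bodies are illustrative stand-ins, not the actual spark-rapids or Spark code:

```scala
object V1FallbackSketch {
  // Stand-in for org.apache.spark.sql.catalyst.InternalRow.
  type InternalRow = AnyRef

  // After the upstream commit, the V1 write helper appears to return Unit
  // instead of Seq[InternalRow].
  def writeWithV1(): Unit = ()

  // The caller previously bound the result (`val writtenRows = writeWithV1()`)
  // and returned it, which now fails with the type mismatch above and the
  // unused-local fatal warning. One possible adaptation: drop the unused
  // val and return an empty Seq explicitly.
  def run(): Seq[InternalRow] = {
    writeWithV1()
    Nil
  }
}
```

Whether returning `Nil` is the right fix depends on how the Spark 4.0.0 API expects written rows to be reported; this only shows a shape that compiles against a `Unit`-returning write path.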
Steps/Code to reproduce bug
Build the plugin against Spark 4.0.0 (the spark400 shim); compilation fails with the errors above.
Expected behavior
The spark400 build should succeed.
Environment details (please complete the following information)
local
Additional context