  <td><code>spark.eventLog.compress</code></td>
-  <td>false</td>
+  <td>true</td>
  <td>
    Whether to compress logged events, if <code>spark.eventLog.enabled</code> is true.
  </td>
diff --git a/docs/core-migration-guide.md b/docs/core-migration-guide.md
index 3f97a484e1a68..765c3494f669b 100644
--- a/docs/core-migration-guide.md
+++ b/docs/core-migration-guide.md
@@ -22,6 +22,10 @@ license: |
* Table of contents
{:toc}
+## Upgrading from Core 3.4 to 4.0
+
+- Since Spark 4.0, Spark compresses event logs by default. To restore the behavior before Spark 4.0, you can set `spark.eventLog.compress` to `false`.
+
## Upgrading from Core 3.3 to 3.4
- Since Spark 3.4, Spark driver will own `PersistentVolumnClaim`s and try to reuse if they are not assigned to live executors. To restore the behavior before Spark 3.4, you can set `spark.kubernetes.driver.ownPersistentVolumeClaim` to `false` and `spark.kubernetes.driver.reusePersistentVolumeClaim` to `false`.
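The event-log migration note above is easy to exercise locally. A minimal sketch of both ways to restore the pre-4.0 behavior (the property names are real Spark configuration keys; the directory created here is purely illustrative):

```shell
# Option 1: pass the property on the command line, e.g.
#   spark-submit --conf spark.eventLog.compress=false ...
#
# Option 2: persist it in spark-defaults.conf so every job picks it up.
CONF_DIR=$(mktemp -d)
cat > "$CONF_DIR/spark-defaults.conf" <<'EOF'
spark.eventLog.enabled   true
spark.eventLog.compress  false
EOF

# Show the resulting configuration.
cat "$CONF_DIR/spark-defaults.conf"
```

Point `SPARK_CONF_DIR` at such a directory (or merge the lines into your existing `conf/spark-defaults.conf`) before starting the application.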
diff --git a/docs/index.md b/docs/index.md
index bd77fd75a0b3b..4620c4f072b42 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -34,7 +34,7 @@ source, visit [Building Spark](building-spark.html).
Spark runs on both Windows and UNIX-like systems (e.g. Linux, Mac OS), and it should run on any platform that runs a supported version of Java. This should include JVMs on x86_64 and ARM64. It's easy to run locally on one machine --- all you need is to have `java` installed on your system `PATH`, or the `JAVA_HOME` environment variable pointing to a Java installation.
-Spark runs on Java 8/11/17, Scala 2.12/2.13, Python 3.8+, and R 3.5+.
+Spark runs on Java 8/11/17, Scala 2.13, Python 3.8+, and R 3.5+.
Support for Java 8 versions prior to 8u371 is deprecated as of Spark 3.5.0.
When using the Scala API, it is necessary for applications to use the same version of Scala that Spark was compiled for.
For example, when using Scala 2.13, use Spark compiled for 2.13, and compile code/applications for Scala 2.13 as well.
diff --git a/docs/spark-connect-overview.md b/docs/spark-connect-overview.md
index 0673763f03bcc..82d84f39ca1da 100644
--- a/docs/spark-connect-overview.md
+++ b/docs/spark-connect-overview.md
@@ -101,13 +101,13 @@ Spark before and run the `start-connect-server.sh` script to start Spark server
Spark Connect, like in this example:
{% highlight bash %}
-./sbin/start-connect-server.sh --packages org.apache.spark:spark-connect_2.12:{{site.SPARK_VERSION_SHORT}}
+./sbin/start-connect-server.sh --packages org.apache.spark:spark-connect_2.13:{{site.SPARK_VERSION_SHORT}}
{% endhighlight %}
-Note that we include a Spark Connect package (`spark-connect_2.12:{{site.SPARK_VERSION_SHORT}}`), when starting
+Note that we include a Spark Connect package (`spark-connect_2.13:{{site.SPARK_VERSION_SHORT}}`) when starting
Spark server. This is required to use Spark Connect. Make sure to use the same version
of the package as the Spark version you downloaded previously. In this example,
-Spark {{site.SPARK_VERSION_SHORT}} with Scala 2.12.
+Spark {{site.SPARK_VERSION_SHORT}} with Scala 2.13.
Now Spark server is running and ready to accept Spark Connect sessions from client
applications. In the next section we will walk through how to use Spark Connect
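The point of this doc change is that the Connect package's Scala suffix must match the Scala version Spark was built with. A small sketch of deriving the coordinate from the two version strings (the version values here are illustrative, not taken from the diff):

```shell
# Build the --packages coordinate for start-connect-server.sh.
# Both values are assumptions for illustration; substitute your own.
SPARK_VERSION="4.0.0"
SCALA_BINARY="2.13"

PACKAGE="org.apache.spark:spark-connect_${SCALA_BINARY}:${SPARK_VERSION}"
echo "$PACKAGE"

# With a Spark distribution on hand you would then run:
#   ./sbin/start-connect-server.sh --packages "$PACKAGE"
```

Mixing suffixes (for example a `_2.12` package against a `_2.13` build) fails at class-loading time, which is why the docs pin both to 2.13 after this change.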
diff --git a/docs/sql-error-conditions-unsupported-view-operation-error-class.md b/docs/sql-error-conditions-expect-table-not-view-error-class.md
similarity index 82%
rename from docs/sql-error-conditions-unsupported-view-operation-error-class.md
rename to docs/sql-error-conditions-expect-table-not-view-error-class.md
index 88f3adc3fdbae..9be7fad6cc777 100644
--- a/docs/sql-error-conditions-unsupported-view-operation-error-class.md
+++ b/docs/sql-error-conditions-expect-table-not-view-error-class.md
@@ -1,7 +1,7 @@
---
layout: global
-title: UNSUPPORTED_VIEW_OPERATION error class
-displayTitle: UNSUPPORTED_VIEW_OPERATION error class
+title: EXPECT_TABLE_NOT_VIEW error class
+displayTitle: EXPECT_TABLE_NOT_VIEW error class
license: |
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
@@ -21,15 +21,15 @@ license: |
SQLSTATE: none assigned
-The view `` does not support ``.
+'``' expects a table but `` is a view.
This error class has the following derived error classes:
-## WITHOUT_SUGGESTION
+## NO_ALTERNATIVE
-## WITH_SUGGESTION
+## USE_ALTER_VIEW
Please use ALTER VIEW instead.
diff --git a/docs/sql-error-conditions-unsupported-table-operation-error-class.md b/docs/sql-error-conditions-expect-view-not-table-error-class.md
similarity index 87%
rename from docs/sql-error-conditions-unsupported-table-operation-error-class.md
rename to docs/sql-error-conditions-expect-view-not-table-error-class.md
index 5e4c07ccffd23..d4ed0d9457cc8 100644
--- a/docs/sql-error-conditions-unsupported-table-operation-error-class.md
+++ b/docs/sql-error-conditions-expect-view-not-table-error-class.md
@@ -1,7 +1,7 @@
---
layout: global
-title: UNSUPPORTED_TABLE_OPERATION error class
-displayTitle: UNSUPPORTED_TABLE_OPERATION error class
+title: EXPECT_VIEW_NOT_TABLE error class
+displayTitle: EXPECT_VIEW_NOT_TABLE error class
license: |
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
@@ -25,11 +25,11 @@ The table `` does not support ``.
This error class has the following derived error classes:
-## WITHOUT_SUGGESTION
+## NO_ALTERNATIVE
-## WITH_SUGGESTION
+## USE_ALTER_TABLE
Please use ALTER TABLE instead.
diff --git a/docs/sql-error-conditions.md b/docs/sql-error-conditions.md
index 1df00f72bc970..38cfc28ba099a 100644
--- a/docs/sql-error-conditions.md
+++ b/docs/sql-error-conditions.md
@@ -297,6 +297,12 @@ The value `` of the type `` cannot be cast to ``
Fail to assign a value of `` type to the `` type column or variable `` due to an overflow. Use `try_cast` on the input value to tolerate overflow and return NULL instead.
+### CLASS_UNSUPPORTED_BY_MAP_OBJECTS
+
+SQLSTATE: none assigned
+
+`MapObjects` does not support the class `` as resulting collection.
+
### CODEC_NOT_AVAILABLE
SQLSTATE: none assigned
@@ -517,6 +523,28 @@ SQLSTATE: none assigned
Exceeds char/varchar type length limitation: ``.
+### EXPECT_PERMANENT_VIEW_NOT_TEMP
+
+SQLSTATE: none assigned
+
+'``' expects a permanent view but `` is a temp view.
+
+### [EXPECT_TABLE_NOT_VIEW](sql-error-conditions-expect-table-not-view-error-class.html)
+
+SQLSTATE: none assigned
+
+'``' expects a table but `` is a view.
+
+For more details see [EXPECT_TABLE_NOT_VIEW](sql-error-conditions-expect-table-not-view-error-class.html)
+
+### [EXPECT_VIEW_NOT_TABLE](sql-error-conditions-expect-view-not-table-error-class.html)
+
+SQLSTATE: none assigned
+
+The table `` does not support ``.
+
+For more details see [EXPECT_VIEW_NOT_TABLE](sql-error-conditions-expect-view-not-table-error-class.html)
+
### EXPRESSION_TYPE_IS_NOT_ORDERABLE
SQLSTATE: none assigned
@@ -2100,28 +2128,12 @@ Unsupported subquery expression:
For more details see [UNSUPPORTED_SUBQUERY_EXPRESSION_CATEGORY](sql-error-conditions-unsupported-subquery-expression-category-error-class.html)
-### [UNSUPPORTED_TABLE_OPERATION](sql-error-conditions-unsupported-table-operation-error-class.html)
-
-SQLSTATE: none assigned
-
-The table `` does not support ``.
-
-For more details see [UNSUPPORTED_TABLE_OPERATION](sql-error-conditions-unsupported-table-operation-error-class.html)
-
### UNSUPPORTED_TYPED_LITERAL
[SQLSTATE: 0A000](sql-error-conditions-sqlstates.html#class-0A-feature-not-supported)
Literals of the type `` are not supported. Supported types are ``.
-### [UNSUPPORTED_VIEW_OPERATION](sql-error-conditions-unsupported-view-operation-error-class.html)
-
-SQLSTATE: none assigned
-
-The view `` does not support ``.
-
-For more details see [UNSUPPORTED_VIEW_OPERATION](sql-error-conditions-unsupported-view-operation-error-class.html)
-
### UNTYPED_SCALA_UDF
SQLSTATE: none assigned
diff --git a/docs/storage-openstack-swift.md b/docs/storage-openstack-swift.md
index 73b21a1f7c27b..52e2d11b99126 100644
--- a/docs/storage-openstack-swift.md
+++ b/docs/storage-openstack-swift.md
@@ -44,7 +44,7 @@ For example, for Maven support, add the following to the pom.xml
fi
...
org.apache.spark
-      <artifactId>hadoop-cloud_2.12</artifactId>
+      <artifactId>hadoop-cloud_2.13</artifactId>
${spark.version}
...
diff --git a/examples/pom.xml b/examples/pom.xml
index c5644f6a08950..9470f13ecfc2c 100644
--- a/examples/pom.xml
+++ b/examples/pom.xml
@@ -20,12 +20,12 @@
4.0.0
org.apache.spark
-    <artifactId>spark-parent_2.12</artifactId>
+    <artifactId>spark-parent_2.13</artifactId>
4.0.0-SNAPSHOT
../pom.xml
-  <artifactId>spark-examples_2.12</artifactId>
+  <artifactId>spark-examples_2.13</artifactId>
jar
Spark Project Examples
https://spark.apache.org/
diff --git a/graphx/pom.xml b/graphx/pom.xml
index 3771a274082cd..ce29c1845422a 100644
--- a/graphx/pom.xml
+++ b/graphx/pom.xml
@@ -20,12 +20,12 @@
4.0.0
org.apache.spark
-    <artifactId>spark-parent_2.12</artifactId>
+    <artifactId>spark-parent_2.13</artifactId>
4.0.0-SNAPSHOT
../pom.xml
-  <artifactId>spark-graphx_2.12</artifactId>
+  <artifactId>spark-graphx_2.13</artifactId>
graphx
diff --git a/hadoop-cloud/pom.xml b/hadoop-cloud/pom.xml
index b27df3597eb13..934ff29be4060 100644
--- a/hadoop-cloud/pom.xml
+++ b/hadoop-cloud/pom.xml
@@ -21,12 +21,12 @@
4.0.0
org.apache.spark
-    <artifactId>spark-parent_2.12</artifactId>
+    <artifactId>spark-parent_2.13</artifactId>
4.0.0-SNAPSHOT
../pom.xml
-  <artifactId>spark-hadoop-cloud_2.12</artifactId>
+  <artifactId>spark-hadoop-cloud_2.13</artifactId>
jar
Spark Project Hadoop Cloud Integration
diff --git a/launcher/pom.xml b/launcher/pom.xml
index d87f7bd8fef5c..c47244ff887a6 100644
--- a/launcher/pom.xml
+++ b/launcher/pom.xml
@@ -21,12 +21,12 @@
4.0.0
org.apache.spark
-    <artifactId>spark-parent_2.12</artifactId>
+    <artifactId>spark-parent_2.13</artifactId>
4.0.0-SNAPSHOT
../pom.xml
-  <artifactId>spark-launcher_2.12</artifactId>
+  <artifactId>spark-launcher_2.13</artifactId>
jar
Spark Project Launcher
https://spark.apache.org/
diff --git a/mllib-local/pom.xml b/mllib-local/pom.xml
index fe8ec5721ea7e..408aec1ff276b 100644
--- a/mllib-local/pom.xml
+++ b/mllib-local/pom.xml
@@ -20,12 +20,12 @@
4.0.0
org.apache.spark
-    <artifactId>spark-parent_2.12</artifactId>
+    <artifactId>spark-parent_2.13</artifactId>
4.0.0-SNAPSHOT
../pom.xml
-  <artifactId>spark-mllib-local_2.12</artifactId>
+  <artifactId>spark-mllib-local_2.13</artifactId>
mllib-local
diff --git a/mllib/pom.xml b/mllib/pom.xml
index 0b5292d7c63b5..88400e7ba6ac4 100644
--- a/mllib/pom.xml
+++ b/mllib/pom.xml
@@ -20,12 +20,12 @@
4.0.0
org.apache.spark
-    <artifactId>spark-parent_2.12</artifactId>
+    <artifactId>spark-parent_2.13</artifactId>
4.0.0-SNAPSHOT
../pom.xml
-  <artifactId>spark-mllib_2.12</artifactId>
+  <artifactId>spark-mllib_2.13</artifactId>
mllib
@@ -91,12 +91,10 @@
test-jar
test
-
org.scalanlp
breeze_${scala.binary.version}
diff --git a/pom.xml b/pom.xml
index 971cb07ea40ea..e1c88327a9402 100644
--- a/pom.xml
+++ b/pom.xml
@@ -25,7 +25,7 @@
18
org.apache.spark
-  <artifactId>spark-parent_2.12</artifactId>
+  <artifactId>spark-parent_2.13</artifactId>
4.0.0-SNAPSHOT
pom
Spark Project Parent POM
@@ -172,8 +172,8 @@
3.2.2
4.4
-    <scala.version>2.12.18</scala.version>
-    <scala.binary.version>2.12</scala.binary.version>
+    <scala.version>2.13.11</scala.version>
+    <scala.binary.version>2.13</scala.binary.version>
2.2.0
4.7.1
@@ -440,13 +440,11 @@
${project.version}
test-jar
-
com.twitter
chill_${scala.binary.version}
@@ -1110,7 +1108,7 @@
org.scala-lang.modules
-        <artifactId>scala-xml_2.12</artifactId>
+        <artifactId>scala-xml_2.13</artifactId>
@@ -2819,6 +2817,7 @@
org.jboss.netty
org.codehaus.groovy
+ *:*_2.12
*:*_2.11
*:*_2.10
@@ -2934,9 +2933,53 @@
-feature
-explaintypes
-target:jvm-1.8
- -Xfatal-warnings
- -Ywarn-unused:imports
- -P:silencer:globalFilters=.*deprecated.*
+ -Wconf:cat=deprecation:wv,any:e
+ -Wunused:imports
+
+ -Wconf:cat=scaladoc:wv
+ -Wconf:cat=lint-multiarg-infix:wv
+ -Wconf:cat=other-nullary-override:wv
+ -Wconf:cat=other-match-analysis&site=org.apache.spark.sql.catalyst.catalog.SessionCatalog.lookupFunction.catalogFunction:wv
+ -Wconf:cat=other-pure-statement&site=org.apache.spark.streaming.util.FileBasedWriteAheadLog.readAll.readFile:wv
+ -Wconf:cat=other-pure-statement&site=org.apache.spark.scheduler.OutputCommitCoordinatorSuite.<local OutputCommitCoordinatorSuite>.futureAction:wv
+
+ -Wconf:msg=^(?=.*?method|value|type|object|trait|inheritance)(?=.*?deprecated)(?=.*?since 2.13).+$:s
+ -Wconf:msg=^(?=.*?Widening conversion from)(?=.*?is deprecated because it loses precision).+$:s
+ -Wconf:msg=Auto-application to \`\(\)\` is deprecated:s
+ -Wconf:msg=method with a single empty parameter list overrides method without any parameter list:s
+ -Wconf:msg=method without a parameter list overrides a method with a single empty one:s
+
+ -Wconf:cat=deprecation&msg=procedure syntax is deprecated:e
+
+ -Wconf:cat=unchecked&msg=outer reference:s
+ -Wconf:cat=unchecked&msg=eliminated by erasure:s
+ -Wconf:msg=^(?=.*?a value of type)(?=.*?cannot also be).+$:s
+
+ -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBase.scala:s
+ -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBaseOps.scala:s
+
+ -Wconf:msg=Implicit definition should have explicit type:s
-Xss128m
@@ -2952,13 +2995,6 @@
${java.version}
-Xlint:all,-serial,-path,-try
-          <compilerPlugins>
-            <compilerPlugin>
-              <groupId>com.github.ghik</groupId>
-              <artifactId>silencer-plugin_${scala.version}</artifactId>
-              <version>1.7.13</version>
-            </compilerPlugin>
-          </compilerPlugins>
@@ -3610,99 +3646,24 @@
-
- scala-2.12
-
-
- 2.12.18
-
-
-
-
-
-
-
-
-
+
+
- -Wconf:cat=scaladoc:wv
- -Wconf:cat=lint-multiarg-infix:wv
- -Wconf:cat=other-nullary-override:wv
- -Wconf:cat=other-match-analysis&site=org.apache.spark.sql.catalyst.catalog.SessionCatalog.lookupFunction.catalogFunction:wv
- -Wconf:cat=other-pure-statement&site=org.apache.spark.streaming.util.FileBasedWriteAheadLog.readAll.readFile:wv
- -Wconf:cat=other-pure-statement&site=org.apache.spark.scheduler.OutputCommitCoordinatorSuite.<local OutputCommitCoordinatorSuite>.futureAction:wv
-
- -Wconf:msg=^(?=.*?method|value|type|object|trait|inheritance)(?=.*?deprecated)(?=.*?since 2.13).+$:s
- -Wconf:msg=^(?=.*?Widening conversion from)(?=.*?is deprecated because it loses precision).+$:s
- -Wconf:msg=Auto-application to \`\(\)\` is deprecated:s
- -Wconf:msg=method with a single empty parameter list overrides method without any parameter list:s
- -Wconf:msg=method without a parameter list overrides a method with a single empty one:s
-
- -Wconf:cat=deprecation&msg=procedure syntax is deprecated:e
-
- -Wconf:cat=unchecked&msg=outer reference:s
- -Wconf:cat=unchecked&msg=eliminated by erasure:s
- -Wconf:msg=^(?=.*?a value of type)(?=.*?cannot also be).+$:s
-
- -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBase.scala:s
- -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBaseOps.scala:s
-
- -Wconf:msg=Implicit definition should have explicit type:s
-
-
-
-
-
-
+ -->
-
- org.codehaus.mojo
- build-helper-maven-plugin
-
-
- add-scala-sources
- generate-sources
-
- add-source
-
-
-
-
-
-
-
-
- add-scala-test-sources
- generate-test-sources
-
- add-test-source
-
-
-
-
-
-
-
-
-
-
diff --git a/repl/src/main/scala-2.12/org/apache/spark/repl/Main.scala b/repl/src/main/scala-2.12/org/apache/spark/repl/Main.scala
deleted file mode 100644
index eaca4ad6ee29a..0000000000000
--- a/repl/src/main/scala-2.12/org/apache/spark/repl/Main.scala
+++ /dev/null
@@ -1,135 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.spark.repl
-
-import java.io.File
-import java.net.URI
-import java.util.Locale
-
-import scala.tools.nsc.GenericRunnerSettings
-
-import org.apache.spark._
-import org.apache.spark.internal.Logging
-import org.apache.spark.sql.SparkSession
-import org.apache.spark.sql.internal.StaticSQLConf.CATALOG_IMPLEMENTATION
-import org.apache.spark.util.Utils
-
-object Main extends Logging {
-
- initializeLogIfNecessary(true)
- Signaling.cancelOnInterrupt()
-
- val conf = new SparkConf()
- val rootDir = conf.getOption("spark.repl.classdir").getOrElse(Utils.getLocalDir(conf))
- val outputDir = Utils.createTempDir(root = rootDir, namePrefix = "repl")
-
- var sparkContext: SparkContext = _
- var sparkSession: SparkSession = _
- // this is a public var because tests reset it.
- var interp: SparkILoop = _
-
- private var hasErrors = false
- private var isShellSession = false
-
- private def scalaOptionError(msg: String): Unit = {
- hasErrors = true
- // scalastyle:off println
- Console.err.println(msg)
- // scalastyle:on println
- }
-
- def main(args: Array[String]): Unit = {
- isShellSession = true
- doMain(args, new SparkILoop)
- }
-
- // Visible for testing
- private[repl] def doMain(args: Array[String], _interp: SparkILoop): Unit = {
- interp = _interp
- val jars = Utils.getLocalUserJarsForShell(conf)
- // Remove file:///, file:// or file:/ scheme if exists for each jar
- .map { x => if (x.startsWith("file:")) new File(new URI(x)).getPath else x }
- .mkString(File.pathSeparator)
- val interpArguments = List(
- "-Yrepl-class-based",
- "-Yrepl-outdir", s"${outputDir.getAbsolutePath}",
- "-classpath", jars
- ) ++ args.toList
-
- val settings = new GenericRunnerSettings(scalaOptionError)
- settings.processArguments(interpArguments, true)
-
- if (!hasErrors) {
- interp.process(settings) // Repl starts and goes in loop of R.E.P.L
- Option(sparkContext).foreach(_.stop)
- }
- }
-
- def createSparkSession(): SparkSession = {
- try {
- val execUri = System.getenv("SPARK_EXECUTOR_URI")
- conf.setIfMissing("spark.app.name", "Spark shell")
- // SparkContext will detect this configuration and register it with the RpcEnv's
- // file server, setting spark.repl.class.uri to the actual URI for executors to
- // use. This is sort of ugly but since executors are started as part of SparkContext
- // initialization in certain cases, there's an initialization order issue that prevents
- // this from being set after SparkContext is instantiated.
- conf.set("spark.repl.class.outputDir", outputDir.getAbsolutePath())
- if (execUri != null) {
- conf.set("spark.executor.uri", execUri)
- }
- if (System.getenv("SPARK_HOME") != null) {
- conf.setSparkHome(System.getenv("SPARK_HOME"))
- }
-
- val builder = SparkSession.builder.config(conf)
- if (conf.get(CATALOG_IMPLEMENTATION.key, "hive").toLowerCase(Locale.ROOT) == "hive") {
- if (SparkSession.hiveClassesArePresent) {
- // In the case that the property is not set at all, builder's config
- // does not have this value set to 'hive' yet. The original default
- // behavior is that when there are hive classes, we use hive catalog.
- sparkSession = builder.enableHiveSupport().getOrCreate()
- logInfo("Created Spark session with Hive support")
- } else {
- // Need to change it back to 'in-memory' if no hive classes are found
- // in the case that the property is set to hive in spark-defaults.conf
- builder.config(CATALOG_IMPLEMENTATION.key, "in-memory")
- sparkSession = builder.getOrCreate()
- logInfo("Created Spark session")
- }
- } else {
- // In the case that the property is set but not to 'hive', the internal
- // default is 'in-memory'. So the sparkSession will use in-memory catalog.
- sparkSession = builder.getOrCreate()
- logInfo("Created Spark session")
- }
- sparkContext = sparkSession.sparkContext
- sparkSession
- } catch {
- case e: ClassNotFoundException if isShellSession && e.getMessage.contains(
- "org.apache.spark.sql.connect.SparkConnectPlugin") =>
- logError("Failed to load spark connect plugin.")
- logError("You need to build Spark with -Pconnect.")
- sys.exit(1)
- case e: Exception if isShellSession =>
- logError("Failed to initialize Spark session.", e)
- sys.exit(1)
- }
- }
-
-}
diff --git a/repl/src/main/scala-2.12/org/apache/spark/repl/SparkILoop.scala b/repl/src/main/scala-2.12/org/apache/spark/repl/SparkILoop.scala
deleted file mode 100644
index 92984ed45f828..0000000000000
--- a/repl/src/main/scala-2.12/org/apache/spark/repl/SparkILoop.scala
+++ /dev/null
@@ -1,273 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.spark.repl
-
-import java.io.BufferedReader
-
-// scalastyle:off println
-import scala.Predef.{println => _, _}
-// scalastyle:on println
-import scala.concurrent.Future
-import scala.reflect.classTag
-import scala.reflect.io.File
-import scala.tools.nsc.{GenericRunnerSettings, Properties}
-import scala.tools.nsc.Settings
-import scala.tools.nsc.interpreter.{isReplDebug, isReplPower, replProps}
-import scala.tools.nsc.interpreter.{AbstractOrMissingHandler, ILoop, IMain, JPrintWriter}
-import scala.tools.nsc.interpreter.{NamedParam, SimpleReader, SplashLoop, SplashReader}
-import scala.tools.nsc.interpreter.StdReplTags.tagOfIMain
-import scala.tools.nsc.util.stringFromStream
-import scala.util.Properties.{javaVersion, javaVmName, versionString}
-
-/**
- * A Spark-specific interactive shell.
- */
-class SparkILoop(in0: Option[BufferedReader], out: JPrintWriter)
- extends ILoop(in0, out) {
- def this(in0: BufferedReader, out: JPrintWriter) = this(Some(in0), out)
- def this() = this(None, new JPrintWriter(Console.out, true))
-
- val initializationCommands: Seq[String] = Seq(
- """
- @transient val spark = if (org.apache.spark.repl.Main.sparkSession != null) {
- org.apache.spark.repl.Main.sparkSession
- } else {
- org.apache.spark.repl.Main.createSparkSession()
- }
- @transient val sc = {
- val _sc = spark.sparkContext
- if (_sc.getConf.getBoolean("spark.ui.reverseProxy", false)) {
- val proxyUrl = _sc.getConf.get("spark.ui.reverseProxyUrl", null)
- if (proxyUrl != null) {
- println(
- s"Spark Context Web UI is available at ${proxyUrl}/proxy/${_sc.applicationId}")
- } else {
- println(s"Spark Context Web UI is available at Spark Master Public URL")
- }
- } else {
- _sc.uiWebUrl.foreach {
- webUrl => println(s"Spark context Web UI available at ${webUrl}")
- }
- }
- println("Spark context available as 'sc' " +
- s"(master = ${_sc.master}, app id = ${_sc.applicationId}).")
- println("Spark session available as 'spark'.")
- _sc
- }
- """,
- "import org.apache.spark.SparkContext._",
- "import spark.implicits._",
- "import spark.sql",
- "import org.apache.spark.sql.functions._"
- )
-
- def initializeSpark(): Unit = {
- if (!intp.reporter.hasErrors) {
- // `savingReplayStack` removes the commands from session history.
- savingReplayStack {
- initializationCommands.foreach(intp quietRun _)
- }
- } else {
- throw new RuntimeException(s"Scala $versionString interpreter encountered " +
- "errors during initialization")
- }
- }
-
- /** Print a welcome message */
- override def printWelcome(): Unit = {
- import org.apache.spark.SPARK_VERSION
- echo("""Welcome to
- ____ __
- / __/__ ___ _____/ /__
- _\ \/ _ \/ _ `/ __/ '_/
- /___/ .__/\_,_/_/ /_/\_\ version %s
- /_/
- """.format(SPARK_VERSION))
- val welcomeMsg = "Using Scala %s (%s, Java %s)".format(
- versionString, javaVmName, javaVersion)
- echo(welcomeMsg)
- echo("Type in expressions to have them evaluated.")
- echo("Type :help for more information.")
- }
-
- /** Available commands */
- override def commands: List[LoopCommand] = standardCommands
-
- override def resetCommand(line: String): Unit = {
- super.resetCommand(line)
- initializeSpark()
- echo("Note that after :reset, state of SparkSession and SparkContext is unchanged.")
- }
-
- override def replay(): Unit = {
- initializeSpark()
- super.replay()
- }
-
- /**
- * The following code is mostly a copy of `process` implementation in `ILoop.scala` in Scala
- *
- * In newer version of Scala, `printWelcome` is the first thing to be called. As a result,
- * SparkUI URL information would be always shown after the welcome message.
- *
- * However, this is inconsistent compared with the existing version of Spark which will always
- * show SparkUI URL first.
- *
- * The only way we can make it consistent will be duplicating the Scala code.
- *
- * We should remove this duplication once Scala provides a way to load our custom initialization
- * code, and also customize the ordering of printing welcome message.
- */
- override def process(settings: Settings): Boolean = {
-
- def newReader = in0.fold(chooseReader(settings))(r => SimpleReader(r, out, interactive = true))
-
- /** Reader to use before interpreter is online. */
- def preLoop = {
- val sr = SplashReader(newReader) { r =>
- in = r
- in.postInit()
- }
- in = sr
- SplashLoop(sr, prompt)
- }
-
- /* Actions to cram in parallel while collecting first user input at prompt.
- * Run with output muted both from ILoop and from the intp reporter.
- */
- def loopPostInit(): Unit = mumly {
- // Bind intp somewhere out of the regular namespace where
- // we can get at it in generated code.
- intp.quietBind(NamedParam[IMain]("$intp", intp)(tagOfIMain, classTag[IMain]))
-
- // Auto-run code via some setting.
- ( replProps.replAutorunCode.option
- flatMap (f => File(f).safeSlurp())
- foreach (intp quietRun _)
- )
- // power mode setup
- if (isReplPower) enablePowerMode(true)
- initializeSpark()
- loadInitFiles()
- // SI-7418 Now, and only now, can we enable TAB completion.
- in.postInit()
- }
- def loadInitFiles(): Unit = settings match {
- case settings: GenericRunnerSettings =>
- for (f <- settings.loadfiles.value) {
- loadCommand(f)
- addReplay(s":load $f")
- }
- for (f <- settings.pastefiles.value) {
- pasteCommand(f)
- addReplay(s":paste $f")
- }
- case _ =>
- }
- // wait until after startup to enable noisy settings
- def withSuppressedSettings[A](body: => A): A = {
- val ss = this.settings
- import ss._
- val noisy = List(Xprint, Ytyperdebug)
- val noisesome = noisy.exists(!_.isDefault)
- val current = (Xprint.value, Ytyperdebug.value)
- if (isReplDebug || !noisesome) body
- else {
- this.settings.Xprint.value = List.empty
- this.settings.Ytyperdebug.value = false
- try body
- finally {
- Xprint.value = current._1
- Ytyperdebug.value = current._2
- intp.global.printTypings = current._2
- }
- }
- }
- def startup(): String = withSuppressedSettings {
- // let them start typing
- val splash = preLoop
-
- // while we go fire up the REPL
- try {
- // don't allow ancient sbt to hijack the reader
- savingReader {
- createInterpreter()
- }
- intp.initializeSynchronous()
-
- val field = classOf[ILoop].getDeclaredFields.filter(_.getName.contains("globalFuture")).head
- field.setAccessible(true)
- field.set(this, Future successful true)
-
- if (intp.reporter.hasErrors) {
- echo("Interpreter encountered errors during initialization!")
- null
- } else {
- loopPostInit()
- printWelcome()
- splash.start()
-
- val line = splash.line // what they typed in while they were waiting
- if (line == null) { // they ^D
- try out print Properties.shellInterruptedString
- finally closeInterpreter()
- }
- line
- }
- } finally splash.stop()
- }
-
- this.settings = settings
- startup() match {
- case null => false
- case line =>
- try loop(line) match {
- case LineResults.EOF => out print Properties.shellInterruptedString
- case _ =>
- }
- catch AbstractOrMissingHandler()
- finally closeInterpreter()
- true
- }
- }
-}
-
-object SparkILoop {
-
- /**
- * Creates an interpreter loop with default settings and feeds
- * the given code to it as input.
- */
- def run(code: String, sets: Settings = new Settings): String = {
- import java.io.{ BufferedReader, StringReader, OutputStreamWriter }
-
- stringFromStream { ostream =>
- Console.withOut(ostream) {
- val input = new BufferedReader(new StringReader(code))
- val output = new JPrintWriter(new OutputStreamWriter(ostream), true)
- val repl = new SparkILoop(input, output)
-
- if (sets.classpath.isDefault) {
- sets.classpath.value = sys.props("java.class.path")
- }
- repl process sets
- }
- }
- }
- def run(lines: List[String]): String = run(lines.map(_ + "\n").mkString)
-}
diff --git a/repl/src/main/scala-2.13/org/apache/spark/repl/Main.scala b/repl/src/main/scala/org/apache/spark/repl/Main.scala
similarity index 100%
rename from repl/src/main/scala-2.13/org/apache/spark/repl/Main.scala
rename to repl/src/main/scala/org/apache/spark/repl/Main.scala
diff --git a/repl/src/main/scala-2.13/org/apache/spark/repl/SparkILoop.scala b/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala
similarity index 100%
rename from repl/src/main/scala-2.13/org/apache/spark/repl/SparkILoop.scala
rename to repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala
diff --git a/repl/src/test/scala-2.12/org/apache/spark/repl/Repl2Suite.scala b/repl/src/test/scala-2.12/org/apache/spark/repl/Repl2Suite.scala
deleted file mode 100644
index 15b45ad797ef2..0000000000000
--- a/repl/src/test/scala-2.12/org/apache/spark/repl/Repl2Suite.scala
+++ /dev/null
@@ -1,51 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.spark.repl
-
-import java.io._
-
-import scala.tools.nsc.interpreter.SimpleReader
-
-import org.apache.spark.{SparkContext, SparkFunSuite}
-
-class Repl2Suite extends SparkFunSuite {
- test("propagation of local properties") {
- // A mock ILoop that doesn't install the SIGINT handler.
- class ILoop(out: PrintWriter) extends SparkILoop(None, out) {
- settings = new scala.tools.nsc.Settings
- settings.usejavacp.value = true
- org.apache.spark.repl.Main.interp = this
- in = SimpleReader()
- }
-
- val out = new StringWriter()
- Main.interp = new ILoop(new PrintWriter(out))
- Main.sparkContext = new SparkContext("local", "repl-test")
- Main.interp.createInterpreter()
-
- Main.sparkContext.setLocalProperty("someKey", "someValue")
-
- // Make sure the value we set in the caller to interpret is propagated in the thread that
- // interprets the command.
- Main.interp.interpret("org.apache.spark.repl.Main.sparkContext.getLocalProperty(\"someKey\")")
- assert(out.toString.contains("someValue"))
-
- Main.sparkContext.stop()
- System.clearProperty("spark.driver.port")
- }
-}
diff --git a/repl/src/test/scala-2.12/org/apache/spark/repl/SingletonRepl2Suite.scala b/repl/src/test/scala-2.12/org/apache/spark/repl/SingletonRepl2Suite.scala
deleted file mode 100644
index a4eff392a2c99..0000000000000
--- a/repl/src/test/scala-2.12/org/apache/spark/repl/SingletonRepl2Suite.scala
+++ /dev/null
@@ -1,171 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.spark.repl
-
-import java.io._
-
-import org.apache.spark.SparkFunSuite
-
-/**
- * A special test suite for REPL that all test cases share one REPL instance.
- */
-class SingletonRepl2Suite extends SparkFunSuite {
- private val out = new StringWriter()
- private val in = new PipedOutputStream()
- private var thread: Thread = _
-
- private val CONF_EXECUTOR_CLASSPATH = "spark.executor.extraClassPath"
- private val oldExecutorClasspath = System.getProperty(CONF_EXECUTOR_CLASSPATH)
-
- override def beforeAll(): Unit = {
- super.beforeAll()
-
- val classpath = System.getProperty("java.class.path")
- System.setProperty(CONF_EXECUTOR_CLASSPATH, classpath)
-
- Main.conf.set("spark.master", "local-cluster[2,1,1024]")
- val interp = new SparkILoop(
- new BufferedReader(new InputStreamReader(new PipedInputStream(in))),
- new PrintWriter(out))
-
- // Forces to create new SparkContext
- Main.sparkContext = null
- Main.sparkSession = null
-
- // Starts a new thread to run the REPL interpreter, so that we won't block.
- thread = new Thread(() => Main.doMain(Array("-classpath", classpath), interp))
- thread.setDaemon(true)
- thread.start()
-
- waitUntil(() => out.toString.contains("Type :help for more information"))
- }
-
- override def afterAll(): Unit = {
- in.close()
- thread.join()
- if (oldExecutorClasspath != null) {
- System.setProperty(CONF_EXECUTOR_CLASSPATH, oldExecutorClasspath)
- } else {
- System.clearProperty(CONF_EXECUTOR_CLASSPATH)
- }
- super.afterAll()
- }
-
- private def waitUntil(cond: () => Boolean): Unit = {
- import scala.concurrent.duration._
- import org.scalatest.concurrent.Eventually._
-
- eventually(timeout(50.seconds), interval(500.millis)) {
- assert(cond(), "current output: " + out.toString)
- }
- }
-
- /**
- * Run the given commands string in a globally shared interpreter instance. Note that the given
- * commands should not crash the interpreter, to not affect other test cases.
- */
- def runInterpreter(input: String): String = {
- val currentOffset = out.getBuffer.length()
- // append a special statement to the end of the given code, so that we can know what's
- // the final output of this code snippet and rely on it to wait until the output is ready.
- val timestamp = System.currentTimeMillis()
- in.write((input + s"\nval _result_$timestamp = 1\n").getBytes)
- in.flush()
- val stopMessage = s"_result_$timestamp: Int = 1"
- waitUntil(() => out.getBuffer.substring(currentOffset).contains(stopMessage))
- out.getBuffer.substring(currentOffset)
- }
-
- def assertContains(message: String, output: String): Unit = {
- val isContain = output.contains(message)
- assert(isContain,
- "Interpreter output did not contain '" + message + "':\n" + output)
- }
-
- def assertDoesNotContain(message: String, output: String): Unit = {
- val isContain = output.contains(message)
- assert(!isContain,
- "Interpreter output contained '" + message + "':\n" + output)
- }
-
- test("SPARK-31399: should clone+clean line object w/ non-serializable state in ClosureCleaner") {
- // Test ClosureCleaner when a closure captures the enclosing `this` REPL line object, and that
- // object contains an unused non-serializable field.
- // Specifically, the closure in this test case contains a directly nested closure, and the
- // capture is triggered by the inner closure.
- // `ns` should be nulled out, but `topLevelValue` should stay intact.
-
- // Can't use :paste mode because PipedOutputStream/PipedInputStream doesn't work well with the
- // EOT control character (i.e. Ctrl+D).
- // Just write things on a single line to emulate :paste mode.
-
- // NOTE: in order for this test case to trigger the intended scenario, the following three
- // variables need to be in the same "input", which will make the REPL pack them into the
- // same REPL line object:
- // - ns: a non-serializable state, not accessed by the closure;
- // - topLevelValue: a serializable state, accessed by the closure;
- // - closure: the starting closure, captures the enclosing REPL line object.
- val output = runInterpreter(
- """
- |class NotSerializableClass(val x: Int)
- |val ns = new NotSerializableClass(42); val topLevelValue = "someValue"; val closure =
- |(j: Int) => {
- | (1 to j).flatMap { x =>
- | (1 to x).map { y => y + topLevelValue }
- | }
- |}
- |val r = sc.parallelize(0 to 2).map(closure).collect
- """.stripMargin)
- assertContains("r: Array[scala.collection.immutable.IndexedSeq[String]] = " +
- "Array(Vector(), Vector(1someValue), Vector(1someValue, 1someValue, 2someValue))", output)
-// assertContains("r: Array[IndexedSeq[String]] = " +
-// "Array(Vector(), Vector(1someValue), Vector(1someValue, 1someValue, 2someValue))", output)
- assertDoesNotContain("Exception", output)
- }
-
- test("SPARK-31399: ClosureCleaner should discover indirectly nested closure in inner class") {
- // Similar to the previous test case, but with indirect closure nesting instead.
- // There's still nested closures involved, but the inner closure is indirectly nested in the
- // outer closure, with a level of inner class in between them.
- // This changes how the inner closure references/captures the outer closure/enclosing `this`
- // REPL line object, and covers a different code path in inner closure discovery.
-
- // `ns` should be nulled out, but `topLevelValue` should stay intact.
-
- val output = runInterpreter(
- """
- |class NotSerializableClass(val x: Int)
- |val ns = new NotSerializableClass(42); val topLevelValue = "someValue"; val closure =
- |(j: Int) => {
- | class InnerFoo {
- | val innerClosure = (x: Int) => (1 to x).map { y => y + topLevelValue }
- | }
- | val innerFoo = new InnerFoo
- | (1 to j).flatMap(innerFoo.innerClosure)
- |}
- |val r = sc.parallelize(0 to 2).map(closure).collect
- """.stripMargin)
- assertContains("r: Array[scala.collection.immutable.IndexedSeq[String]] = " +
- "Array(Vector(), Vector(1someValue), Vector(1someValue, 1someValue, 2someValue))", output)
-// assertContains("r: Array[IndexedSeq[String]] = " +
-// "Array(Vector(), Vector(1someValue), Vector(1someValue, 1someValue, 2someValue))", output)
- assertDoesNotContain("Array(Vector(), Vector(1null), Vector(1null, 1null, 2null)", output)
- assertDoesNotContain("Exception", output)
- }
-
- }
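
The deleted suite above drives a shared REPL through piped streams, appending a timestamped sentinel statement after each snippet and polling the captured output until the sentinel appears. A minimal self-contained sketch of that sentinel pattern (no Spark or REPL involved; the "interpreter" is a stand-in echo thread, and all names are illustrative):

```scala
import java.io.{PipedInputStream, PipedOutputStream}
import java.nio.charset.StandardCharsets

// The "interpreter" here is just an echo thread; the point is the sentinel
// handshake: write a uniquely named statement after the snippet, then poll
// the captured output until that name shows up.
object SentinelDemo {
  def main(args: Array[String]): Unit = {
    val in = new PipedOutputStream()
    val out = new StringBuilder
    val reader = new PipedInputStream(in)
    val echo = new Thread(() =>
      scala.io.Source.fromInputStream(reader, "UTF-8").getLines()
        .foreach(line => out.synchronized { out.append(line).append('\n') }))
    echo.setDaemon(true)
    echo.start()

    val sentinel = s"_result_${System.currentTimeMillis()}"
    in.write(s"println(1 + 1)\nval $sentinel = 1\n".getBytes(StandardCharsets.UTF_8))
    in.flush()
    in.close()

    // In the real suite this poll is scalatest's eventually(...) with a timeout.
    while (!out.synchronized(out.toString).contains(sentinel)) Thread.sleep(10)
    println("snippet done; captured:\n" + out.synchronized(out.toString))
  }
}
```

The unique sentinel is what lets `runInterpreter` slice the shared output buffer per test without ever parsing REPL prompts.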
diff --git a/repl/src/test/scala-2.13/org/apache/spark/repl/Repl2Suite.scala b/repl/src/test/scala/org/apache/spark/repl/Repl2Suite.scala
similarity index 100%
rename from repl/src/test/scala-2.13/org/apache/spark/repl/Repl2Suite.scala
rename to repl/src/test/scala/org/apache/spark/repl/Repl2Suite.scala
diff --git a/repl/src/test/scala-2.13/org/apache/spark/repl/SingletonRepl2Suite.scala b/repl/src/test/scala/org/apache/spark/repl/SingletonRepl2Suite.scala
similarity index 100%
rename from repl/src/test/scala-2.13/org/apache/spark/repl/SingletonRepl2Suite.scala
rename to repl/src/test/scala/org/apache/spark/repl/SingletonRepl2Suite.scala
diff --git a/resource-managers/kubernetes/core/pom.xml b/resource-managers/kubernetes/core/pom.xml
index e95909288c37b..f260b8f07ce5a 100644
--- a/resource-managers/kubernetes/core/pom.xml
+++ b/resource-managers/kubernetes/core/pom.xml
@@ -19,12 +19,12 @@
   <modelVersion>4.0.0</modelVersion>
   <parent>
     <groupId>org.apache.spark</groupId>
-    <artifactId>spark-parent_2.12</artifactId>
+    <artifactId>spark-parent_2.13</artifactId>
     <version>4.0.0-SNAPSHOT</version>
     <relativePath>../../../pom.xml</relativePath>
   </parent>
 
-  <artifactId>spark-kubernetes_2.12</artifactId>
+  <artifactId>spark-kubernetes_2.13</artifactId>
   <packaging>jar</packaging>
   <name>Spark Project Kubernetes</name>
diff --git a/resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile b/resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile
index 88304c87a79c3..02559b8de2d0a 100644
--- a/resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile
+++ b/resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile
@@ -14,9 +14,9 @@
# See the License for the specific language governing permissions and
# limitations under the License.
#
-ARG java_image_tag=17-jre
+ARG java_image_tag=21-jre
 
-FROM eclipse-temurin:${java_image_tag}
+FROM azul/zulu-openjdk:${java_image_tag}
 
 ARG spark_uid=185
diff --git a/resource-managers/kubernetes/integration-tests/README.md b/resource-managers/kubernetes/integration-tests/README.md
index 909e5b652d448..d39fdfbfd3966 100644
--- a/resource-managers/kubernetes/integration-tests/README.md
+++ b/resource-managers/kubernetes/integration-tests/README.md
@@ -127,7 +127,7 @@ configuration is provided in `dev/spark-rbac.yaml`.
If you prefer to run just the integration tests directly, then you can customise the behaviour via passing system
properties to Maven. For example:
- mvn integration-test -am -pl :spark-kubernetes-integration-tests_2.12 \
+ mvn integration-test -am -pl :spark-kubernetes-integration-tests_2.13 \
-Pkubernetes -Pkubernetes-integration-tests \
-Phadoop-3 -Dhadoop.version=3.3.6 \
-Dspark.kubernetes.test.sparkTgz=spark-3.0.0-SNAPSHOT-bin-example.tgz \
@@ -329,11 +329,11 @@ You can also specify your specific dockerfile to build JVM/Python/R based image
## Requirements
- A minimum of 6 CPUs and 9G of memory is required to complete all Volcano test cases.
-- Volcano v1.7.0.
+- Volcano v1.8.0.
## Installation
- kubectl apply -f https://raw.githubusercontent.com/volcano-sh/volcano/v1.7.0/installer/volcano-development.yaml
+ kubectl apply -f https://raw.githubusercontent.com/volcano-sh/volcano/v1.8.0/installer/volcano-development.yaml
## Run tests
@@ -354,5 +354,5 @@ You can also specify `volcano` tag to only run Volcano test:
## Cleanup Volcano
- kubectl delete -f https://raw.githubusercontent.com/volcano-sh/volcano/v1.7.0/installer/volcano-development.yaml
+ kubectl delete -f https://raw.githubusercontent.com/volcano-sh/volcano/v1.8.0/installer/volcano-development.yaml
diff --git a/resource-managers/kubernetes/integration-tests/pom.xml b/resource-managers/kubernetes/integration-tests/pom.xml
index d1be2dae066f5..c5f55c52d0b64 100644
--- a/resource-managers/kubernetes/integration-tests/pom.xml
+++ b/resource-managers/kubernetes/integration-tests/pom.xml
@@ -19,12 +19,12 @@
   <modelVersion>4.0.0</modelVersion>
   <parent>
     <groupId>org.apache.spark</groupId>
-    <artifactId>spark-parent_2.12</artifactId>
+    <artifactId>spark-parent_2.13</artifactId>
     <version>4.0.0-SNAPSHOT</version>
     <relativePath>../../../pom.xml</relativePath>
   </parent>
 
-  <artifactId>spark-kubernetes-integration-tests_2.12</artifactId>
+  <artifactId>spark-kubernetes-integration-tests_2.13</artifactId>
   <properties>
     <sbt.project.name>kubernetes-integration-tests</sbt.project.name>
diff --git a/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/VolcanoTestsSuite.scala b/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/VolcanoTestsSuite.scala
index 06d6f7dc100f3..35da48f61b366 100644
--- a/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/VolcanoTestsSuite.scala
+++ b/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/VolcanoTestsSuite.scala
@@ -123,7 +123,7 @@ private[spark] trait VolcanoTestsSuite extends BeforeAndAfterEach { k8sSuite: Ku
assert(pod.getSpec.getSchedulerName === "volcano")
}
- protected def checkAnnotaion(pod: Pod): Unit = {
+ protected def checkAnnotation(pod: Pod): Unit = {
val appId = pod.getMetadata.getLabels.get("spark-app-selector")
val annotations = pod.getMetadata.getAnnotations
assert(annotations.get("scheduling.k8s.io/group-name") === s"$appId-podgroup")
@@ -218,7 +218,7 @@ private[spark] trait VolcanoTestsSuite extends BeforeAndAfterEach { k8sSuite: Ku
runSparkDriverSubmissionAndVerifyCompletion(
driverPodChecker = (driverPod: Pod) => {
checkScheduler(driverPod)
- checkAnnotaion(driverPod)
+ checkAnnotation(driverPod)
checkPodGroup(driverPod, queue)
},
customSparkConf = Option(conf),
@@ -228,12 +228,12 @@ private[spark] trait VolcanoTestsSuite extends BeforeAndAfterEach { k8sSuite: Ku
runSparkPiAndVerifyCompletion(
driverPodChecker = (driverPod: Pod) => {
checkScheduler(driverPod)
- checkAnnotaion(driverPod)
+ checkAnnotation(driverPod)
checkPodGroup(driverPod, queue)
},
executorPodChecker = (executorPod: Pod) => {
checkScheduler(executorPod)
- checkAnnotaion(executorPod)
+ checkAnnotation(executorPod)
},
customSparkConf = Option(conf),
customAppLocator = Option(appLoc)
@@ -314,13 +314,13 @@ private[spark] trait VolcanoTestsSuite extends BeforeAndAfterEach { k8sSuite: Ku
driverPodChecker = (driverPod: Pod) => {
doBasicDriverPodCheck(driverPod)
checkScheduler(driverPod)
- checkAnnotaion(driverPod)
+ checkAnnotation(driverPod)
checkPodGroup(driverPod)
},
executorPodChecker = (executorPod: Pod) => {
doBasicExecutorPodCheck(executorPod)
checkScheduler(executorPod)
- checkAnnotaion(executorPod)
+ checkAnnotation(executorPod)
}
)
}
diff --git a/resource-managers/mesos/pom.xml b/resource-managers/mesos/pom.xml
index 29c341f8c3525..da0f31996350b 100644
--- a/resource-managers/mesos/pom.xml
+++ b/resource-managers/mesos/pom.xml
@@ -19,12 +19,12 @@
   <modelVersion>4.0.0</modelVersion>
   <parent>
     <groupId>org.apache.spark</groupId>
-    <artifactId>spark-parent_2.12</artifactId>
+    <artifactId>spark-parent_2.13</artifactId>
     <version>4.0.0-SNAPSHOT</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>
 
-  <artifactId>spark-mesos_2.12</artifactId>
+  <artifactId>spark-mesos_2.13</artifactId>
   <packaging>jar</packaging>
   <name>Spark Project Mesos</name>
diff --git a/resource-managers/yarn/pom.xml b/resource-managers/yarn/pom.xml
index e58ab1ea25050..073661e9ac63d 100644
--- a/resource-managers/yarn/pom.xml
+++ b/resource-managers/yarn/pom.xml
@@ -19,12 +19,12 @@
   <modelVersion>4.0.0</modelVersion>
   <parent>
     <groupId>org.apache.spark</groupId>
-    <artifactId>spark-parent_2.12</artifactId>
+    <artifactId>spark-parent_2.13</artifactId>
     <version>4.0.0-SNAPSHOT</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>
 
-  <artifactId>spark-yarn_2.12</artifactId>
+  <artifactId>spark-yarn_2.13</artifactId>
   <packaging>jar</packaging>
   <name>Spark Project YARN</name>
diff --git a/sql/api/pom.xml b/sql/api/pom.xml
index 93c47c968e743..bcf01bbe0cd9a 100644
--- a/sql/api/pom.xml
+++ b/sql/api/pom.xml
@@ -21,12 +21,12 @@
   <modelVersion>4.0.0</modelVersion>
   <parent>
     <groupId>org.apache.spark</groupId>
-    <artifactId>spark-parent_2.12</artifactId>
+    <artifactId>spark-parent_2.13</artifactId>
     <version>4.0.0-SNAPSHOT</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>
 
-  <artifactId>spark-sql-api_2.12</artifactId>
+  <artifactId>spark-sql-api_2.13</artifactId>
   <packaging>jar</packaging>
   <name>Spark Project SQL API</name>
   <url>https://spark.apache.org/</url>
@@ -81,24 +81,6 @@
     <outputDirectory>target/scala-${scala.binary.version}/classes</outputDirectory>
     <testOutputDirectory>target/scala-${scala.binary.version}/test-classes</testOutputDirectory>
     <plugins>
-      <plugin>
-        <groupId>org.codehaus.mojo</groupId>
-        <artifactId>build-helper-maven-plugin</artifactId>
-        <executions>
-          <execution>
-            <id>add-sources</id>
-            <phase>generate-sources</phase>
-            <goals>
-              <goal>add-source</goal>
-            </goals>
-            <configuration>
-              <sources>
-                <source>src/main/scala-${scala.binary.version}</source>
-              </sources>
-            </configuration>
-          </execution>
-        </executions>
-      </plugin>
       <plugin>
         <groupId>org.antlr</groupId>
         <artifactId>antlr4-maven-plugin</artifactId>
diff --git a/sql/api/src/main/scala-2.12/org/apache/spark/sql/catalyst/util/CaseInsensitiveMap.scala b/sql/api/src/main/scala-2.12/org/apache/spark/sql/catalyst/util/CaseInsensitiveMap.scala
deleted file mode 100644
index 14b8f620017f6..0000000000000
--- a/sql/api/src/main/scala-2.12/org/apache/spark/sql/catalyst/util/CaseInsensitiveMap.scala
+++ /dev/null
@@ -1,65 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.spark.sql.catalyst.util
-
-import java.util.Locale
-
-/**
- * Builds a map in which keys are case insensitive. Input map can be accessed for cases where
- * case-sensitive information is required. The primary constructor is marked private to avoid
- * nested case-insensitive map creation, otherwise the keys in the original map will become
- * case-insensitive in this scenario.
- * Note: CaseInsensitiveMap is serializable. However, after transformation, e.g. `filterKeys()`,
- * it may become not serializable.
- */
-class CaseInsensitiveMap[T] private (val originalMap: Map[String, T]) extends Map[String, T]
- with Serializable {
-
- // Note: this class supports Scala 2.12. A parallel source tree has a 2.13 implementation.
-
- val keyLowerCasedMap = originalMap.map(kv => kv.copy(_1 = kv._1.toLowerCase(Locale.ROOT)))
-
- override def get(k: String): Option[T] = keyLowerCasedMap.get(k.toLowerCase(Locale.ROOT))
-
- override def contains(k: String): Boolean =
- keyLowerCasedMap.contains(k.toLowerCase(Locale.ROOT))
-
- override def +[B1 >: T](kv: (String, B1)): CaseInsensitiveMap[B1] = {
- new CaseInsensitiveMap(originalMap.filter(!_._1.equalsIgnoreCase(kv._1)) + kv)
- }
-
- def ++(xs: TraversableOnce[(String, T)]): CaseInsensitiveMap[T] = {
- xs.foldLeft(this)(_ + _)
- }
-
- override def iterator: Iterator[(String, T)] = keyLowerCasedMap.iterator
-
- override def -(key: String): Map[String, T] = {
- new CaseInsensitiveMap(originalMap.filter(!_._1.equalsIgnoreCase(key)))
- }
-
- def toMap: Map[String, T] = originalMap
-}
-
-object CaseInsensitiveMap {
- def apply[T](params: Map[String, T]): CaseInsensitiveMap[T] = params match {
- case caseSensitiveMap: CaseInsensitiveMap[T] => caseSensitiveMap
- case _ => new CaseInsensitiveMap(params)
- }
-}
-
diff --git a/sql/api/src/main/scala-2.13/org/apache/spark/sql/catalyst/util/CaseInsensitiveMap.scala b/sql/api/src/main/scala/org/apache/spark/sql/catalyst/util/CaseInsensitiveMap.scala
similarity index 100%
rename from sql/api/src/main/scala-2.13/org/apache/spark/sql/catalyst/util/CaseInsensitiveMap.scala
rename to sql/api/src/main/scala/org/apache/spark/sql/catalyst/util/CaseInsensitiveMap.scala
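
With the 2.12 tree gone, the surviving `CaseInsensitiveMap` is the Scala 2.13 implementation, now living in the plain `scala/` source directory. The core idea is small enough to sketch standalone (names here are illustrative, not Spark's API; the real class extends `Map[String, T]` and also retains `originalMap` so callers can recover case-sensitive keys):

```scala
import java.util.Locale

// Minimal case-insensitive map sketch: every key is normalized with
// Locale.ROOT before storage and lookup, so "Path" and "PATH" collide.
final class CIMap[T] private (lowered: Map[String, T]) {
  def get(k: String): Option[T] = lowered.get(k.toLowerCase(Locale.ROOT))
  def contains(k: String): Boolean = lowered.contains(k.toLowerCase(Locale.ROOT))
  def +(kv: (String, T)): CIMap[T] =
    new CIMap(lowered + (kv._1.toLowerCase(Locale.ROOT) -> kv._2))
}

object CIMap {
  def apply[T](m: Map[String, T]): CIMap[T] =
    new CIMap(m.map { case (k, v) => k.toLowerCase(Locale.ROOT) -> v })
}
```

For example, `CIMap(Map("Path" -> "/tmp")).contains("PATH")` holds, since both sides of the lookup are lower-cased with `Locale.ROOT` to avoid locale-dependent surprises like the Turkish dotless i.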
diff --git a/sql/catalyst/pom.xml b/sql/catalyst/pom.xml
index 7feeb1581435b..8f2b9ccffeb8c 100644
--- a/sql/catalyst/pom.xml
+++ b/sql/catalyst/pom.xml
@@ -21,12 +21,12 @@
   <modelVersion>4.0.0</modelVersion>
   <parent>
     <groupId>org.apache.spark</groupId>
-    <artifactId>spark-parent_2.12</artifactId>
+    <artifactId>spark-parent_2.13</artifactId>
     <version>4.0.0-SNAPSHOT</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>
 
-  <artifactId>spark-catalyst_2.12</artifactId>
+  <artifactId>spark-catalyst_2.13</artifactId>
   <packaging>jar</packaging>
   <name>Spark Project Catalyst</name>
   <url>https://spark.apache.org/</url>
@@ -83,12 +83,10 @@
       <artifactId>spark-sketch_${scala.binary.version}</artifactId>
       <version>${project.version}</version>
     </dependency>
-
     <dependency>
       <groupId>org.scalacheck</groupId>
       <artifactId>scalacheck_${scala.binary.version}</artifactId>
@@ -154,24 +152,6 @@
           <argLine>-ea -Xmx4g -Xss4m -XX:ReservedCodeCacheSize=${CodeCacheSize} ${extraJavaTestArgs} -Dio.netty.tryReflectionSetAccessible=true</argLine>
         </configuration>
       </plugin>
-      <plugin>
-        <groupId>org.codehaus.mojo</groupId>
-        <artifactId>build-helper-maven-plugin</artifactId>
-        <executions>
-          <execution>
-            <id>add-sources</id>
-            <phase>generate-sources</phase>
-            <goals>
-              <goal>add-source</goal>
-            </goals>
-            <configuration>
-              <sources>
-                <source>src/main/scala-${scala.binary.version}</source>
-              </sources>
-            </configuration>
-          </execution>
-        </executions>
-      </plugin>
diff --git a/sql/catalyst/src/main/scala-2.12/org/apache/spark/sql/catalyst/expressions/AttributeMap.scala b/sql/catalyst/src/main/scala-2.12/org/apache/spark/sql/catalyst/expressions/AttributeMap.scala
deleted file mode 100644
index 504b65e3db693..0000000000000
--- a/sql/catalyst/src/main/scala-2.12/org/apache/spark/sql/catalyst/expressions/AttributeMap.scala
+++ /dev/null
@@ -1,60 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.spark.sql.catalyst.expressions
-
-/**
- * Builds a map that is keyed by an Attribute's expression id. Using the expression id allows values
- * to be looked up even when the attributes used differ cosmetically (i.e., the capitalization
- * of the name, or the expected nullability).
- */
-object AttributeMap {
- def apply[A](kvs: Map[Attribute, A]): AttributeMap[A] = {
- new AttributeMap(kvs.map(kv => (kv._1.exprId, kv)))
- }
-
- def apply[A](kvs: Seq[(Attribute, A)]): AttributeMap[A] = {
- new AttributeMap(kvs.map(kv => (kv._1.exprId, kv)).toMap)
- }
-
- def apply[A](kvs: Iterable[(Attribute, A)]): AttributeMap[A] = {
- new AttributeMap(kvs.map(kv => (kv._1.exprId, kv)).toMap)
- }
-
- def empty[A]: AttributeMap[A] = new AttributeMap(Map.empty)
-}
-
-class AttributeMap[A](val baseMap: Map[ExprId, (Attribute, A)])
- extends Map[Attribute, A] with Serializable {
-
- // Note: this class supports Scala 2.12. A parallel source tree has a 2.13 implementation.
-
- override def get(k: Attribute): Option[A] = baseMap.get(k.exprId).map(_._2)
-
- override def getOrElse[B1 >: A](k: Attribute, default: => B1): B1 = get(k).getOrElse(default)
-
- override def contains(k: Attribute): Boolean = get(k).isDefined
-
- override def + [B1 >: A](kv: (Attribute, B1)): AttributeMap[B1] =
- AttributeMap(baseMap.values.toMap + kv)
-
- override def iterator: Iterator[(Attribute, A)] = baseMap.valuesIterator
-
- override def -(key: Attribute): Map[Attribute, A] = baseMap.values.toMap - key
-
- def ++(other: AttributeMap[A]): AttributeMap[A] = new AttributeMap(baseMap ++ other.baseMap)
-}
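
The deleted `AttributeMap` above is the 2.12 copy of a map keyed by expression id rather than by the attribute value itself. A toy sketch with stand-in `Attr`/`ExprId` types (not Spark's classes) shows why lookups ignore cosmetic differences such as capitalization or nullability:

```scala
// Entries are stored under a stable id; the attribute itself rides along as
// part of the value, so two attributes that differ only cosmetically but
// share an id resolve to the same entry.
final case class ExprId(id: Long)
final case class Attr(name: String, exprId: ExprId)

final class AttrMap[A](val baseMap: Map[ExprId, (Attr, A)]) {
  def get(k: Attr): Option[A] = baseMap.get(k.exprId).map(_._2)
  def contains(k: Attr): Boolean = get(k).isDefined
}

object AttrMap {
  def apply[A](kvs: Seq[(Attr, A)]): AttrMap[A] =
    new AttrMap(kvs.map(kv => kv._1.exprId -> kv).toMap)
}
```

Here `AttrMap(Seq(Attr("col", ExprId(1)) -> 42)).get(Attr("COL", ExprId(1)))` still finds the value, because only the `ExprId` is consulted.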
diff --git a/sql/catalyst/src/main/scala-2.12/org/apache/spark/sql/catalyst/expressions/ExpressionSet.scala b/sql/catalyst/src/main/scala-2.12/org/apache/spark/sql/catalyst/expressions/ExpressionSet.scala
deleted file mode 100644
index 3e545f745baee..0000000000000
--- a/sql/catalyst/src/main/scala-2.12/org/apache/spark/sql/catalyst/expressions/ExpressionSet.scala
+++ /dev/null
@@ -1,151 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.spark.sql.catalyst.expressions
-
-import scala.collection.mutable
-import scala.collection.mutable.ArrayBuffer
-
-object ExpressionSet {
- /** Constructs a new [[ExpressionSet]] by applying [[Canonicalize]] to `expressions`. */
- def apply(expressions: TraversableOnce[Expression]): ExpressionSet = {
- val set = new ExpressionSet()
- expressions.foreach(set.add)
- set
- }
-
- def apply(): ExpressionSet = {
- new ExpressionSet()
- }
-}
-
-/**
- * A [[Set]] where membership is determined based on determinacy and a canonical representation of
- * an [[Expression]] (i.e. one that attempts to ignore cosmetic differences).
- * See [[Canonicalize]] for more details.
- *
- * Internally this set uses the canonical representation, but keeps also track of the original
- * expressions to ease debugging. Since different expressions can share the same canonical
- * representation, this means that operations that extract expressions from this set are only
- * guaranteed to see at least one such expression. For example:
- *
- * {{{
- * val set = ExpressionSet(a + 1, 1 + a)
- *
- * set.iterator => Iterator(a + 1)
- * set.contains(a + 1) => true
- * set.contains(1 + a) => true
- * set.contains(a + 2) => false
- * }}}
- *
- * For non-deterministic expressions, they are always considered as not contained in the [[Set]].
- * On adding a non-deterministic expression, simply append it to the original expressions.
- * This is consistent with how we define `semanticEquals` between two expressions.
- *
- * The constructor of this class is protected so caller can only initialize an Expression from
- * empty, then build it using `add` and `remove` methods. So every instance of this class holds the
- * invariant that:
- * 1. Every expr `e` in `baseSet` satisfies `e.deterministic && e.canonicalized == e`
- * 2. Every deterministic expr `e` in `originals` satisfies that `e.canonicalized` is already
- * accessed.
- */
-class ExpressionSet protected(
- private val baseSet: mutable.Set[Expression] = new mutable.HashSet,
- private var originals: mutable.Buffer[Expression] = new ArrayBuffer)
- extends scala.collection.Set[Expression]
- with scala.collection.SetLike[Expression, ExpressionSet] {
-
- // Note: this class supports Scala 2.12. A parallel source tree has a 2.13 implementation.
- override def empty: ExpressionSet = new ExpressionSet()
-
- protected def add(e: Expression): Unit = {
- if (!e.deterministic) {
- originals += e
- } else if (!baseSet.contains(e.canonicalized)) {
- baseSet.add(e.canonicalized)
- originals += e
- }
- }
-
- protected def remove(e: Expression): Unit = {
- if (e.deterministic) {
- baseSet.remove(e.canonicalized)
- originals = originals.filter(!_.semanticEquals(e))
- }
- }
-
- override def contains(elem: Expression): Boolean = baseSet.contains(elem.canonicalized)
-
- override def filter(p: Expression => Boolean): ExpressionSet = {
- val newBaseSet = baseSet.filter(e => p(e))
- val newOriginals = originals.filter(e => p(e.canonicalized))
- new ExpressionSet(newBaseSet, newOriginals)
- }
-
- override def filterNot(p: Expression => Boolean): ExpressionSet = {
- val newBaseSet = baseSet.filterNot(e => p(e))
- val newOriginals = originals.filterNot(e => p(e.canonicalized))
- new ExpressionSet(newBaseSet, newOriginals)
- }
-
- override def +(elem: Expression): ExpressionSet = {
- val newSet = clone()
- newSet.add(elem)
- newSet
- }
-
- override def -(elem: Expression): ExpressionSet = {
- val newSet = clone()
- newSet.remove(elem)
- newSet
- }
-
- def map(f: Expression => Expression): ExpressionSet = {
- val newSet = new ExpressionSet()
- this.iterator.foreach(elem => newSet.add(f(elem)))
- newSet
- }
-
- def flatMap(f: Expression => Iterable[Expression]): ExpressionSet = {
- val newSet = new ExpressionSet()
- this.iterator.foreach(f(_).foreach(newSet.add))
- newSet
- }
-
- override def iterator: Iterator[Expression] = originals.iterator
-
- override def apply(elem: Expression): Boolean = this.contains(elem)
-
- override def equals(obj: Any): Boolean = obj match {
- case other: ExpressionSet => this.baseSet == other.baseSet
- case _ => false
- }
-
- override def hashCode(): Int = baseSet.hashCode()
-
- override def clone(): ExpressionSet = new ExpressionSet(baseSet.clone(), originals.clone())
-
- /**
- * Returns a string containing both the post [[Canonicalize]] expressions and the original
- * expressions in this set.
- */
- def toDebugString: String =
- s"""
- |baseSet: ${baseSet.mkString(", ")}
- |originals: ${originals.mkString(", ")}
- """.stripMargin
-}
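
`ExpressionSet` decides membership on a canonical form while iterating over the originally inserted expressions, which is why its scaladoc warns that extraction operations only see one representative per equivalence class. A self-contained sketch of that two-store design, using strings and character sorting as a stand-in for `Canonicalize`:

```scala
import scala.collection.mutable

// Two stores: baseSet holds canonical forms and answers contains();
// originals preserves insertion-order representatives for iteration.
final class CanonSet(canonicalize: String => String) {
  private val baseSet = mutable.Set.empty[String]
  private val originals = mutable.Buffer.empty[String]

  def add(e: String): Unit = {
    val c = canonicalize(e)
    if (!baseSet.contains(c)) { baseSet += c; originals += e }
  }
  def contains(e: String): Boolean = baseSet.contains(canonicalize(e))
  def iterator: Iterator[String] = originals.iterator
}
```

With `val s = new CanonSet(_.sorted)`, after `s.add("a+1")` both `s.contains("a+1")` and `s.contains("1+a")` hold (same sorted characters), yet `s.iterator` yields only the representative `"a+1"` -- mirroring how `a + 1` and `1 + a` share one canonicalized entry.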
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala
index 6491a4eea955b..aac85e1972102 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala
@@ -1128,10 +1128,10 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
lookupTableOrView(identifier).map {
case v: ResolvedPersistentView =>
val nameParts = v.catalog.name() +: v.identifier.asMultipartIdentifier
- throw QueryCompilationErrors.unsupportedViewOperationError(
+ throw QueryCompilationErrors.expectTableNotViewError(
nameParts, cmd, suggestAlternative, u)
case _: ResolvedTempView =>
- throw QueryCompilationErrors.unsupportedViewOperationError(
+ throw QueryCompilationErrors.expectTableNotViewError(
identifier, cmd, suggestAlternative, u)
case table => table
}.getOrElse(u)
@@ -1139,7 +1139,8 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
case u @ UnresolvedView(identifier, cmd, allowTemp, suggestAlternative) =>
lookupTableOrView(identifier, viewOnly = true).map {
case _: ResolvedTempView if !allowTemp =>
- throw QueryCompilationErrors.unsupportedViewOperationError(identifier, cmd, false, u)
+ throw QueryCompilationErrors.expectPermanentViewNotTempViewError(
+ identifier, cmd, u)
case t: ResolvedTable =>
val nameParts = t.catalog.name() +: t.identifier.asMultipartIdentifier
throw QueryCompilationErrors.expectViewNotTableError(
@@ -1150,7 +1151,8 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
case u @ UnresolvedTableOrView(identifier, cmd, allowTempView) =>
lookupTableOrView(identifier).map {
case _: ResolvedTempView if !allowTempView =>
- throw QueryCompilationErrors.unsupportedViewOperationError(identifier, cmd, false, u)
+ throw QueryCompilationErrors.expectPermanentViewNotTempViewError(
+ identifier, cmd, u)
case other => other
}.getOrElse(u)
}
@@ -3891,9 +3893,9 @@ object CleanupAliases extends Rule[LogicalPlan] with AliasHelper {
Window(cleanedWindowExprs, partitionSpec.map(trimAliases),
orderSpec.map(trimAliases(_).asInstanceOf[SortOrder]), child)
- case CollectMetrics(name, metrics, child) =>
+ case CollectMetrics(name, metrics, child, dataframeId) =>
val cleanedMetrics = metrics.map(trimNonTopLevelAliases)
- CollectMetrics(name, cleanedMetrics, child)
+ CollectMetrics(name, cleanedMetrics, child, dataframeId)
case Unpivot(ids, values, aliases, variableColumnName, valueColumnNames, child) =>
val cleanedIds = ids.map(_.map(trimNonTopLevelAliases))
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/CheckAnalysis.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/CheckAnalysis.scala
index 3c9a816df26f2..83b682bc9179e 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/CheckAnalysis.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/CheckAnalysis.scala
@@ -497,7 +497,7 @@ trait CheckAnalysis extends PredicateHelper with LookupCatalog with QueryErrorsB
groupingExprs.foreach(checkValidGroupingExprs)
aggregateExprs.foreach(checkValidAggregateExpression)
- case CollectMetrics(name, metrics, _) =>
+ case CollectMetrics(name, metrics, _, _) =>
if (name == null || name.isEmpty) {
operator.failAnalysis(
errorClass = "INVALID_OBSERVED_METRICS.MISSING_NAME",
@@ -1097,17 +1097,15 @@ trait CheckAnalysis extends PredicateHelper with LookupCatalog with QueryErrorsB
* are allowed (e.g. self-joins).
*/
private def checkCollectedMetrics(plan: LogicalPlan): Unit = {
- val metricsMap = mutable.Map.empty[String, LogicalPlan]
+ val metricsMap = mutable.Map.empty[String, CollectMetrics]
def check(plan: LogicalPlan): Unit = plan.foreach { node =>
node match {
- case metrics @ CollectMetrics(name, _, _) =>
- val simplifiedMetrics = simplifyPlanForCollectedMetrics(metrics.canonicalized)
+ case metrics @ CollectMetrics(name, _, _, dataframeId) =>
metricsMap.get(name) match {
case Some(other) =>
- val simplifiedOther = simplifyPlanForCollectedMetrics(other.canonicalized)
// Exact duplicates are allowed. They can be the result
// of a CTE that is used multiple times or a self join.
- if (simplifiedMetrics != simplifiedOther) {
+ if (dataframeId != other.dataframeId) {
failAnalysis(
errorClass = "DUPLICATED_METRICS_NAME",
messageParameters = Map("metricName" -> name))
@@ -1126,32 +1124,6 @@ trait CheckAnalysis extends PredicateHelper with LookupCatalog with QueryErrorsB
check(plan)
}
- /**
- * This method is only used for checking collected metrics. This method tries to
- * remove extra project which only re-assign expr ids from the plan so that we can identify exact
- * duplicates metric definition.
- */
- def simplifyPlanForCollectedMetrics(plan: LogicalPlan): LogicalPlan = {
- plan.resolveOperators {
- case p: Project if p.projectList.size == p.child.output.size =>
- val assignExprIdOnly = p.projectList.zipWithIndex.forall {
- case (Alias(attr: AttributeReference, _), index) =>
- // The input plan of this method is already canonicalized. The attribute id becomes the
- // ordinal of this attribute in the child outputs. So an alias-only Project means the
- // the id of the aliased attribute is the same as its index in the project list.
- attr.exprId.id == index
- case (left: AttributeReference, index) =>
- left.exprId.id == index
- case _ => false
- }
- if (assignExprIdOnly) {
- p.child
- } else {
- p
- }
- }
- }
-
/**
* Validates to make sure the outer references appearing inside the subquery
* are allowed.
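The rewritten duplicate-metrics check above replaces structural plan comparison (`simplifyPlanForCollectedMetrics` over canonicalized plans) with a comparison of the `dataframeId` that each `CollectMetrics` node now carries. A minimal, self-contained sketch of the new bookkeeping, using a simplified stand-in for the Catalyst node (names and types here are illustrative, not Spark's):

```scala
import scala.collection.mutable

// Simplified stand-in: the real node also carries the metric
// expressions and a child logical plan.
case class CollectMetrics(name: String, dataframeId: Long)

// Duplicate names are allowed only when both nodes originate from the
// same Dataset (same dataframeId), e.g. a reused CTE or a self-join.
// Returns the offending metric name, if any.
def checkDuplicateMetrics(nodes: Seq[CollectMetrics]): Option[String] = {
  val seen = mutable.Map.empty[String, CollectMetrics]
  val dup = nodes.find { m =>
    val conflict = seen.get(m.name).exists(_.dataframeId != m.dataframeId)
    if (!conflict) seen.update(m.name, m)
    conflict
  }
  dup.map(_.name)
}
```

Under this scheme `Seq(CollectMetrics("evt1", 0), CollectMetrics("evt1", 0))` passes while ids 0 and 1 conflict, mirroring the updated `AnalysisSuite` expectations further down in this patch.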
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/v2ResolutionPlans.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/v2ResolutionPlans.scala
index 0306385cc465a..6634ce72d7bd3 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/v2ResolutionPlans.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/v2ResolutionPlans.scala
@@ -54,7 +54,7 @@ case class UnresolvedView(
multipartIdentifier: Seq[String],
commandName: String,
allowTemp: Boolean,
- suggestAlternative: Boolean) extends UnresolvedLeafNode
+ suggestAlternative: Boolean = false) extends UnresolvedLeafNode
/**
* Holds the name of a table or view that has yet to be looked up in a catalog. It will
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/encoders/ExpressionEncoder.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/encoders/ExpressionEncoder.scala
index ff72b5a0d9653..74d7a5e7a6757 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/encoders/ExpressionEncoder.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/encoders/ExpressionEncoder.scala
@@ -170,7 +170,7 @@ object ExpressionEncoder {
* Function that deserializes an [[InternalRow]] into an object of type `T`. This class is not
* thread-safe.
*/
- class Deserializer[T](private val expressions: Seq[Expression])
+ class Deserializer[T](val expressions: Seq[Expression])
extends (InternalRow => T) with Serializable {
@transient
private[this] var constructProjection: Projection = _
diff --git a/sql/catalyst/src/main/scala-2.13/org/apache/spark/sql/catalyst/expressions/AttributeMap.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/AttributeMap.scala
similarity index 100%
rename from sql/catalyst/src/main/scala-2.13/org/apache/spark/sql/catalyst/expressions/AttributeMap.scala
rename to sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/AttributeMap.scala
diff --git a/sql/catalyst/src/main/scala-2.13/org/apache/spark/sql/catalyst/expressions/ExpressionSet.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/ExpressionSet.scala
similarity index 100%
rename from sql/catalyst/src/main/scala-2.13/org/apache/spark/sql/catalyst/expressions/ExpressionSet.scala
rename to sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/ExpressionSet.scala
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala
index 9c9127efb172a..4a3c7bbc2beb6 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala
@@ -159,7 +159,7 @@ object Size {
4
""",
since = "3.3.0",
- group = "collection_funcs")
+ group = "array_funcs")
case class ArraySize(child: Expression)
extends RuntimeReplaceable with ImplicitCastInputTypes with UnaryLike[Expression] {
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/mathExpressions.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/mathExpressions.scala
index d1f45d20ae60b..e87d0bc414124 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/mathExpressions.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/mathExpressions.scala
@@ -1271,7 +1271,7 @@ case class Pow(left: Expression, right: Expression)
4
""",
since = "1.5.0",
- group = "math_funcs")
+ group = "bitwise_funcs")
case class ShiftLeft(left: Expression, right: Expression)
extends BinaryExpression with ImplicitCastInputTypes with NullIntolerant {
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/InjectRuntimeFilter.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/InjectRuntimeFilter.scala
index 44c5586037514..13554908379a7 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/InjectRuntimeFilter.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/InjectRuntimeFilter.scala
@@ -229,19 +229,15 @@ object InjectRuntimeFilter extends Rule[LogicalPlan] with PredicateHelper with J
* - The filterApplicationSideJoinExp can be pushed down through joins, aggregates and windows
* (ie the expression references originate from a single leaf node)
* - The filter creation side has a selective predicate
- * - The current join is a shuffle join or a broadcast join that has a shuffle below it
* - The max filterApplicationSide scan size is greater than a configurable threshold
*/
private def extractBeneficialFilterCreatePlan(
filterApplicationSide: LogicalPlan,
filterCreationSide: LogicalPlan,
filterApplicationSideExp: Expression,
- filterCreationSideExp: Expression,
- hint: JoinHint): Option[LogicalPlan] = {
+ filterCreationSideExp: Expression): Option[LogicalPlan] = {
if (findExpressionAndTrackLineageDown(
filterApplicationSideExp, filterApplicationSide).isDefined &&
- (isProbablyShuffleJoin(filterApplicationSide, filterCreationSide, hint) ||
- probablyHasShuffle(filterApplicationSide)) &&
satisfyByteSizeRequirement(filterApplicationSide)) {
extractSelectiveFilterOverScan(filterCreationSide, filterCreationSideExp)
} else {
@@ -326,15 +322,21 @@ object InjectRuntimeFilter extends Rule[LogicalPlan] with PredicateHelper with J
isSimpleExpression(l) && isSimpleExpression(r)) {
val oldLeft = newLeft
val oldRight = newRight
- if (canPruneLeft(joinType)) {
- extractBeneficialFilterCreatePlan(left, right, l, r, hint).foreach {
+ // Check if the current join is a shuffle join or a broadcast join that
+ // has a shuffle below it
+ val hasShuffle = isProbablyShuffleJoin(left, right, hint)
+ if (canPruneLeft(joinType) && (hasShuffle || probablyHasShuffle(left))) {
+ extractBeneficialFilterCreatePlan(left, right, l, r).foreach {
filterCreationSidePlan =>
newLeft = injectFilter(l, newLeft, r, filterCreationSidePlan)
}
}
// Did we actually inject on the left? If not, try on the right
- if (newLeft.fastEquals(oldLeft) && canPruneRight(joinType)) {
- extractBeneficialFilterCreatePlan(right, left, r, l, hint).foreach {
+ // Check if the current join is a shuffle join or a broadcast join that
+ // has a shuffle below it
+ if (newLeft.fastEquals(oldLeft) && canPruneRight(joinType) &&
+ (hasShuffle || probablyHasShuffle(right))) {
+ extractBeneficialFilterCreatePlan(right, left, r, l).foreach {
filterCreationSidePlan =>
newRight = injectFilter(r, newRight, l, filterCreationSidePlan)
}
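The refactor above hoists the shuffle-related eligibility test out of `extractBeneficialFilterCreatePlan`, so `isProbablyShuffleJoin` is evaluated once per join and reused for both sides. The resulting control flow, sketched with boolean placeholders standing in for Spark's `isProbablyShuffleJoin` / `probablyHasShuffle` checks (a simplification: real injection also requires `extractBeneficialFilterCreatePlan` to find a selective creation-side plan):

```scala
// Returns (injectLeft, injectRight); the right side is only attempted
// when the left side was not pruned, as in the rewritten rule above.
def planInjection(
    canPruneLeft: Boolean,
    canPruneRight: Boolean,
    joinHasShuffle: Boolean,  // stands in for isProbablyShuffleJoin(left, right, hint)
    leftHasShuffle: Boolean,  // stands in for probablyHasShuffle(left)
    rightHasShuffle: Boolean  // stands in for probablyHasShuffle(right)
  ): (Boolean, Boolean) = {
  val injectLeft = canPruneLeft && (joinHasShuffle || leftHasShuffle)
  val injectRight =
    !injectLeft && canPruneRight && (joinHasShuffle || rightHasShuffle)
  (injectLeft, injectRight)
}
```

Moving the check to the caller also means the per-side `probablyHasShuffle` test is applied to the correct side of the join, rather than always to the filter application side passed into the helper.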
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/AstBuilder.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/AstBuilder.scala
index f8a9ea268d2fa..10628bb306a3b 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/AstBuilder.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/AstBuilder.scala
@@ -2812,8 +2812,8 @@ class AstBuilder extends DataTypeAstBuilder with SQLConfHelper with Logging {
private def createUnresolvedTable(
ctx: IdentifierReferenceContext,
commandName: String,
- hint: Boolean = false): LogicalPlan = withOrigin(ctx) {
- withIdentClause(ctx, UnresolvedTable(_, commandName, hint))
+ suggestAlternative: Boolean = false): LogicalPlan = withOrigin(ctx) {
+ withIdentClause(ctx, UnresolvedTable(_, commandName, suggestAlternative))
}
/**
@@ -2823,8 +2823,8 @@ class AstBuilder extends DataTypeAstBuilder with SQLConfHelper with Logging {
ctx: IdentifierReferenceContext,
commandName: String,
allowTemp: Boolean = true,
- hint: Boolean = false): LogicalPlan = withOrigin(ctx) {
- withIdentClause(ctx, UnresolvedView(_, commandName, allowTemp, hint))
+ suggestAlternative: Boolean = false): LogicalPlan = withOrigin(ctx) {
+ withIdentClause(ctx, UnresolvedView(_, commandName, allowTemp, suggestAlternative))
}
/**
@@ -4359,7 +4359,7 @@ class AstBuilder extends DataTypeAstBuilder with SQLConfHelper with Logging {
ctx.identifierReference,
commandName = "ALTER VIEW ... SET TBLPROPERTIES",
allowTemp = false,
- hint = true),
+ suggestAlternative = true),
cleanedTableProperties)
} else {
SetTableProperties(
@@ -4392,7 +4392,7 @@ class AstBuilder extends DataTypeAstBuilder with SQLConfHelper with Logging {
ctx.identifierReference,
commandName = "ALTER VIEW ... UNSET TBLPROPERTIES",
allowTemp = false,
- hint = true),
+ suggestAlternative = true),
cleanedProperties,
ifExists)
} else {
@@ -4418,8 +4418,7 @@ class AstBuilder extends DataTypeAstBuilder with SQLConfHelper with Logging {
SetTableLocation(
createUnresolvedTable(
ctx.identifierReference,
- "ALTER TABLE ... SET LOCATION ...",
- true),
+ "ALTER TABLE ... SET LOCATION ..."),
Option(ctx.partitionSpec).map(visitNonOptionalPartitionSpec),
visitLocationSpec(ctx.locationSpec))
}
@@ -4715,8 +4714,7 @@ class AstBuilder extends DataTypeAstBuilder with SQLConfHelper with Logging {
RecoverPartitions(
createUnresolvedTable(
ctx.identifierReference,
- "ALTER TABLE ... RECOVER PARTITIONS",
- true))
+ "ALTER TABLE ... RECOVER PARTITIONS"))
}
/**
@@ -4745,8 +4743,7 @@ class AstBuilder extends DataTypeAstBuilder with SQLConfHelper with Logging {
AddPartitions(
createUnresolvedTable(
ctx.identifierReference,
- "ALTER TABLE ... ADD PARTITION ...",
- true),
+ "ALTER TABLE ... ADD PARTITION ..."),
specsAndLocs.toSeq,
ctx.EXISTS != null)
}
@@ -4764,8 +4761,7 @@ class AstBuilder extends DataTypeAstBuilder with SQLConfHelper with Logging {
RenamePartitions(
createUnresolvedTable(
ctx.identifierReference,
- "ALTER TABLE ... RENAME TO PARTITION",
- true),
+ "ALTER TABLE ... RENAME TO PARTITION"),
UnresolvedPartitionSpec(visitNonOptionalPartitionSpec(ctx.from)),
UnresolvedPartitionSpec(visitNonOptionalPartitionSpec(ctx.to)))
}
@@ -4793,8 +4789,7 @@ class AstBuilder extends DataTypeAstBuilder with SQLConfHelper with Logging {
DropPartitions(
createUnresolvedTable(
ctx.identifierReference,
- "ALTER TABLE ... DROP PARTITION ...",
- true),
+ "ALTER TABLE ... DROP PARTITION ..."),
partSpecs.toSeq,
ifExists = ctx.EXISTS != null,
purge = ctx.PURGE != null)
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/basicLogicalOperators.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/basicLogicalOperators.scala
index efb7dbb44efef..8f976a49a2b10 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/basicLogicalOperators.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/basicLogicalOperators.scala
@@ -1969,7 +1969,8 @@ trait SupportsSubquery extends LogicalPlan
case class CollectMetrics(
name: String,
metrics: Seq[NamedExpression],
- child: LogicalPlan)
+ child: LogicalPlan,
+ dataframeId: Long)
extends UnaryNode {
override lazy val resolved: Boolean = {
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryCompilationErrors.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryCompilationErrors.scala
index 7b83844ff336c..3536626d23920 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryCompilationErrors.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryCompilationErrors.scala
@@ -460,7 +460,7 @@ private[sql] object QueryCompilationErrors extends QueryErrorsBase with Compilat
def insertIntoViewNotAllowedError(identifier: TableIdentifier, t: TreeNode[_]): Throwable = {
new AnalysisException(
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
messageParameters = Map(
"viewName" -> toSQLId(identifier.nameParts),
"operation" -> "INSERT"),
@@ -481,16 +481,16 @@ private[sql] object QueryCompilationErrors extends QueryErrorsBase with Compilat
origin = t.origin)
}
- def unsupportedViewOperationError(
+ def expectTableNotViewError(
nameParts: Seq[String],
cmd: String,
suggestAlternative: Boolean,
t: TreeNode[_]): Throwable = {
new AnalysisException(
errorClass = if (suggestAlternative) {
- "UNSUPPORTED_VIEW_OPERATION.WITH_SUGGESTION"
+ "EXPECT_TABLE_NOT_VIEW.USE_ALTER_VIEW"
} else {
- "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION"
+ "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE"
},
messageParameters = Map(
"viewName" -> toSQLId(nameParts),
@@ -501,13 +501,13 @@ private[sql] object QueryCompilationErrors extends QueryErrorsBase with Compilat
def expectViewNotTableError(
nameParts: Seq[String],
cmd: String,
- hint: Boolean,
+ suggestAlternative: Boolean,
t: TreeNode[_]): Throwable = {
new AnalysisException(
- errorClass = if (hint) {
- "UNSUPPORTED_TABLE_OPERATION.WITH_SUGGESTION"
+ errorClass = if (suggestAlternative) {
+ "EXPECT_VIEW_NOT_TABLE.USE_ALTER_TABLE"
} else {
- "UNSUPPORTED_TABLE_OPERATION.WITHOUT_SUGGESTION"
+ "EXPECT_VIEW_NOT_TABLE.NO_ALTERNATIVE"
},
messageParameters = Map(
"tableName" -> toSQLId(nameParts),
@@ -515,6 +515,18 @@ private[sql] object QueryCompilationErrors extends QueryErrorsBase with Compilat
origin = t.origin)
}
+ def expectPermanentViewNotTempViewError(
+ nameParts: Seq[String],
+ cmd: String,
+ t: TreeNode[_]): Throwable = {
+ new AnalysisException(
+ errorClass = "EXPECT_PERMANENT_VIEW_NOT_TEMP",
+ messageParameters = Map(
+ "viewName" -> toSQLId(nameParts),
+ "operation" -> cmd),
+ origin = t.origin)
+ }
+
def expectPersistentFuncError(
name: String, cmd: String, mismatchHint: Option[String], t: TreeNode[_]): Throwable = {
val hintStr = mismatchHint.map(" " + _).getOrElse("")
@@ -2854,15 +2866,24 @@ private[sql] object QueryCompilationErrors extends QueryErrorsBase with Compilat
"dataColumns" -> query.output.map(c => toSQLId(c.name)).mkString(", ")))
}
- def tableIsNotViewError(name: TableIdentifier, replace: Boolean): Throwable = {
- val operation = if (replace) "CREATE OR REPLACE VIEW" else "CREATE VIEW"
- new AnalysisException(
- errorClass = "UNSUPPORTED_TABLE_OPERATION.WITHOUT_SUGGESTION",
- messageParameters = Map(
- "tableName" -> toSQLId(name.nameParts),
- "operation" -> operation
+ def unsupportedCreateOrReplaceViewOnTableError(
+ name: TableIdentifier, replace: Boolean): Throwable = {
+ if (replace) {
+ new AnalysisException(
+ errorClass = "EXPECT_VIEW_NOT_TABLE.NO_ALTERNATIVE",
+ messageParameters = Map(
+ "tableName" -> toSQLId(name.nameParts),
+ "operation" -> "CREATE OR REPLACE VIEW"
+ )
)
- )
+ } else {
+ new AnalysisException(
+ errorClass = "TABLE_OR_VIEW_ALREADY_EXISTS",
+ messageParameters = Map(
+ "relationName" -> toSQLId(name.nameParts)
+ )
+ )
+ }
}
def viewAlreadyExistsError(name: TableIdentifier): Throwable = {
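The renames above move these errors out of the `UNSUPPORTED_VIEW_OPERATION` / `UNSUPPORTED_TABLE_OPERATION` families into `EXPECT_TABLE_NOT_VIEW` / `EXPECT_VIEW_NOT_TABLE`. The selection logic reduces to two small pure functions (error-class ids copied from the diff; the helper names are illustrative, not Spark API):

```scala
// Error class when a command hits a view where a table was expected:
// suggest ALTER VIEW only when an alternative command actually exists.
def expectTableNotViewClass(suggestAlternative: Boolean): String =
  if (suggestAlternative) "EXPECT_TABLE_NOT_VIEW.USE_ALTER_VIEW"
  else "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE"

// Symmetric case: a table where a view was expected.
def expectViewNotTableClass(suggestAlternative: Boolean): String =
  if (suggestAlternative) "EXPECT_VIEW_NOT_TABLE.USE_ALTER_TABLE"
  else "EXPECT_VIEW_NOT_TABLE.NO_ALTERNATIVE"
```

Renaming the parameter from `hint` to `suggestAlternative` throughout (here and in `AstBuilder`) makes the call sites self-describing: the flag controls which error class is raised, not a planner hint.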
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryExecutionErrors.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryExecutionErrors.scala
index e14fef1fad728..84472490128b8 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryExecutionErrors.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryExecutionErrors.scala
@@ -422,7 +422,7 @@ private[sql] object QueryExecutionErrors extends QueryErrorsBase with ExecutionE
def classUnsupportedByMapObjectsError(cls: Class[_]): SparkRuntimeException = {
new SparkRuntimeException(
- errorClass = "_LEGACY_ERROR_TEMP_2018",
+ errorClass = "CLASS_UNSUPPORTED_BY_MAP_OBJECTS",
messageParameters = Map("cls" -> cls.getName))
}
diff --git a/sql/catalyst/src/test/scala-2.12/org/apache/spark/sql/catalyst/analysis/ExtractGeneratorSuite.scala b/sql/catalyst/src/test/scala-2.12/org/apache/spark/sql/catalyst/analysis/ExtractGeneratorSuite.scala
deleted file mode 100644
index 7df96a8c64e8d..0000000000000
--- a/sql/catalyst/src/test/scala-2.12/org/apache/spark/sql/catalyst/analysis/ExtractGeneratorSuite.scala
+++ /dev/null
@@ -1,41 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.spark.sql.catalyst.analysis
-
-import org.apache.spark.sql.catalyst.expressions._
-import org.apache.spark.sql.catalyst.plans.logical._
-import org.apache.spark.sql.types._
-
-/**
- * Note: this test supports Scala 2.12. A parallel source tree has a 2.13 implementation.
- */
-class ExtractGeneratorSuite extends AnalysisTest {
-
- test("SPARK-34141: ExtractGenerator with lazy project list") {
- val b = AttributeReference("b", ArrayType(StringType))()
-
- val columns = AttributeReference("a", StringType)() :: b :: Nil
- val explode = Alias(Explode(b), "c")()
-
- // view is a lazy seq
- val rel = LocalRelation(output = columns.view)
- val plan = Project(rel.output ++ (explode :: Nil), rel)
-
- assertAnalysisSuccess(plan)
- }
-}
diff --git a/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/AnalysisSuite.scala b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/AnalysisSuite.scala
index ed3137430df6a..ffc12a2b9810c 100644
--- a/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/AnalysisSuite.scala
+++ b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/AnalysisSuite.scala
@@ -779,34 +779,35 @@ class AnalysisSuite extends AnalysisTest with Matchers {
val literal = Literal(1).as("lit")
// Ok
- assert(CollectMetrics("event", literal :: sum :: random_sum :: Nil, testRelation).resolved)
+ assert(CollectMetrics("event", literal :: sum :: random_sum :: Nil, testRelation, 0).resolved)
// Bad name
- assert(!CollectMetrics("", sum :: Nil, testRelation).resolved)
+ assert(!CollectMetrics("", sum :: Nil, testRelation, 0).resolved)
assertAnalysisErrorClass(
- CollectMetrics("", sum :: Nil, testRelation),
+ CollectMetrics("", sum :: Nil, testRelation, 0),
expectedErrorClass = "INVALID_OBSERVED_METRICS.MISSING_NAME",
expectedMessageParameters = Map(
- "operator" -> "'CollectMetrics , [sum(a#x) AS sum#xL]\n+- LocalRelation , [a#x]\n")
+ "operator" ->
+ "'CollectMetrics , [sum(a#x) AS sum#xL], 0\n+- LocalRelation , [a#x]\n")
)
// No columns
- assert(!CollectMetrics("evt", Nil, testRelation).resolved)
+ assert(!CollectMetrics("evt", Nil, testRelation, 0).resolved)
def checkAnalysisError(exprs: Seq[NamedExpression], errors: String*): Unit = {
- assertAnalysisError(CollectMetrics("event", exprs, testRelation), errors)
+ assertAnalysisError(CollectMetrics("event", exprs, testRelation, 0), errors)
}
// Unwrapped attribute
assertAnalysisErrorClass(
- CollectMetrics("event", a :: Nil, testRelation),
+ CollectMetrics("event", a :: Nil, testRelation, 0),
expectedErrorClass = "INVALID_OBSERVED_METRICS.NON_AGGREGATE_FUNC_ARG_IS_ATTRIBUTE",
expectedMessageParameters = Map("expr" -> "\"a\"")
)
// Unwrapped non-deterministic expression
assertAnalysisErrorClass(
- CollectMetrics("event", Rand(10).as("rnd") :: Nil, testRelation),
+ CollectMetrics("event", Rand(10).as("rnd") :: Nil, testRelation, 0),
expectedErrorClass = "INVALID_OBSERVED_METRICS.NON_AGGREGATE_FUNC_ARG_IS_NON_DETERMINISTIC",
expectedMessageParameters = Map("expr" -> "\"rand(10) AS rnd\"")
)
@@ -816,7 +817,7 @@ class AnalysisSuite extends AnalysisTest with Matchers {
CollectMetrics(
"event",
Sum(a).toAggregateExpression(isDistinct = true).as("sum") :: Nil,
- testRelation),
+ testRelation, 0),
expectedErrorClass =
"INVALID_OBSERVED_METRICS.AGGREGATE_EXPRESSION_WITH_DISTINCT_UNSUPPORTED",
expectedMessageParameters = Map("expr" -> "\"sum(DISTINCT a) AS sum\"")
@@ -827,7 +828,7 @@ class AnalysisSuite extends AnalysisTest with Matchers {
CollectMetrics(
"event",
Sum(Sum(a).toAggregateExpression()).toAggregateExpression().as("sum") :: Nil,
- testRelation),
+ testRelation, 0),
expectedErrorClass = "INVALID_OBSERVED_METRICS.NESTED_AGGREGATES_UNSUPPORTED",
expectedMessageParameters = Map("expr" -> "\"sum(sum(a)) AS sum\"")
)
@@ -838,7 +839,7 @@ class AnalysisSuite extends AnalysisTest with Matchers {
WindowSpecDefinition(Nil, a.asc :: Nil,
SpecifiedWindowFrame(RowFrame, UnboundedPreceding, CurrentRow)))
assertAnalysisErrorClass(
- CollectMetrics("event", windowExpr.as("rn") :: Nil, testRelation),
+ CollectMetrics("event", windowExpr.as("rn") :: Nil, testRelation, 0),
expectedErrorClass = "INVALID_OBSERVED_METRICS.WINDOW_EXPRESSIONS_UNSUPPORTED",
expectedMessageParameters = Map(
"expr" ->
@@ -856,14 +857,14 @@ class AnalysisSuite extends AnalysisTest with Matchers {
// Same result - duplicate names are allowed
assertAnalysisSuccess(Union(
- CollectMetrics("evt1", count :: Nil, testRelation) ::
- CollectMetrics("evt1", count :: Nil, testRelation) :: Nil))
+ CollectMetrics("evt1", count :: Nil, testRelation, 0) ::
+ CollectMetrics("evt1", count :: Nil, testRelation, 0) :: Nil))
// Same children, structurally different metrics - fail
assertAnalysisErrorClass(
Union(
- CollectMetrics("evt1", count :: Nil, testRelation) ::
- CollectMetrics("evt1", sum :: Nil, testRelation) :: Nil),
+ CollectMetrics("evt1", count :: Nil, testRelation, 0) ::
+ CollectMetrics("evt1", sum :: Nil, testRelation, 1) :: Nil),
expectedErrorClass = "DUPLICATED_METRICS_NAME",
expectedMessageParameters = Map("metricName" -> "evt1")
)
@@ -873,17 +874,17 @@ class AnalysisSuite extends AnalysisTest with Matchers {
val tblB = LocalRelation(b)
assertAnalysisErrorClass(
Union(
- CollectMetrics("evt1", count :: Nil, testRelation) ::
- CollectMetrics("evt1", count :: Nil, tblB) :: Nil),
+ CollectMetrics("evt1", count :: Nil, testRelation, 0) ::
+ CollectMetrics("evt1", count :: Nil, tblB, 1) :: Nil),
expectedErrorClass = "DUPLICATED_METRICS_NAME",
expectedMessageParameters = Map("metricName" -> "evt1")
)
// Subquery different tree - fail
- val subquery = Aggregate(Nil, sum :: Nil, CollectMetrics("evt1", count :: Nil, testRelation))
+ val subquery = Aggregate(Nil, sum :: Nil, CollectMetrics("evt1", count :: Nil, testRelation, 0))
val query = Project(
b :: ScalarSubquery(subquery, Nil).as("sum") :: Nil,
- CollectMetrics("evt1", count :: Nil, tblB))
+ CollectMetrics("evt1", count :: Nil, tblB, 1))
assertAnalysisErrorClass(
query,
expectedErrorClass = "DUPLICATED_METRICS_NAME",
@@ -895,7 +896,7 @@ class AnalysisSuite extends AnalysisTest with Matchers {
case a: AggregateExpression => a.copy(filter = Some(true))
}.asInstanceOf[NamedExpression]
assertAnalysisErrorClass(
- CollectMetrics("evt1", sumWithFilter :: Nil, testRelation),
+ CollectMetrics("evt1", sumWithFilter :: Nil, testRelation, 0),
expectedErrorClass =
"INVALID_OBSERVED_METRICS.AGGREGATE_EXPRESSION_WITH_FILTER_UNSUPPORTED",
expectedMessageParameters = Map("expr" -> "\"sum(a) FILTER (WHERE true) AS sum\"")
@@ -1675,18 +1676,4 @@ class AnalysisSuite extends AnalysisTest with Matchers {
checkAnalysis(ident2.select($"a"), testRelation.select($"a").analyze)
}
}
-
- test("simplifyPlanForCollectedMetrics should handle non alias-only project case") {
- val inner = Project(
- Seq(
- Alias(testRelation2.output(0), "a")(),
- testRelation2.output(1),
- Alias(testRelation2.output(2), "c")(),
- testRelation2.output(3),
- testRelation2.output(4)
- ),
- testRelation2)
- val actualPlan = getAnalyzer.simplifyPlanForCollectedMetrics(inner.canonicalized)
- assert(actualPlan == testRelation2.canonicalized)
- }
}
diff --git a/sql/catalyst/src/test/scala-2.13/org/apache/spark/sql/catalyst/analysis/ExtractGeneratorSuite.scala b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/ExtractGeneratorSuite.scala
similarity index 100%
rename from sql/catalyst/src/test/scala-2.13/org/apache/spark/sql/catalyst/analysis/ExtractGeneratorSuite.scala
rename to sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/ExtractGeneratorSuite.scala
diff --git a/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/ObjectExpressionsSuite.scala b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/ObjectExpressionsSuite.scala
index 3a662e68d58c2..de85d6fe0b748 100644
--- a/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/ObjectExpressionsSuite.scala
+++ b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/ObjectExpressionsSuite.scala
@@ -404,11 +404,12 @@ class ObjectExpressionsSuite extends SparkFunSuite with ExpressionEvalHelper {
customCollectionClasses.foreach(testMapObjects(collection, _, inputType))
// Unsupported custom collection class
- val errMsg = intercept[RuntimeException] {
- testMapObjects(collection, classOf[scala.collection.Map[Int, Int]], inputType)
- }.getMessage()
- assert(errMsg.contains("`scala.collection.Map` is not supported by `MapObjects` " +
- "as resulting collection."))
+ checkError(
+ exception = intercept[SparkRuntimeException] {
+ testMapObjects(collection, classOf[scala.collection.Map[Int, Int]], inputType)
+ },
+ errorClass = "CLASS_UNSUPPORTED_BY_MAP_OBJECTS",
+ parameters = Map("cls" -> "scala.collection.Map"))
}
}
diff --git a/sql/core/pom.xml b/sql/core/pom.xml
index bf3caf58fe276..b2b8398c9d5de 100644
--- a/sql/core/pom.xml
+++ b/sql/core/pom.xml
@@ -21,12 +21,12 @@
   <modelVersion>4.0.0</modelVersion>
   <parent>
     <groupId>org.apache.spark</groupId>
-    <artifactId>spark-parent_2.12</artifactId>
+    <artifactId>spark-parent_2.13</artifactId>
     <version>4.0.0-SNAPSHOT</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>

-  <artifactId>spark-sql_2.12</artifactId>
+  <artifactId>spark-sql_2.13</artifactId>
   <packaging>jar</packaging>
   <name>Spark Project SQL</name>
   <url>https://spark.apache.org/</url>
@@ -89,12 +89,10 @@
test
-
org.apache.orc
orc-core
@@ -263,18 +261,6 @@
org.codehaus.mojo
build-helper-maven-plugin
-
- add-sources
- generate-sources
-
- add-source
-
-
-
-
-
-
-
add-scala-test-sources
generate-test-sources
diff --git a/sql/core/src/main/scala-2.12/org/apache/spark/sql/execution/streaming/StreamProgress.scala b/sql/core/src/main/scala-2.12/org/apache/spark/sql/execution/streaming/StreamProgress.scala
deleted file mode 100644
index 9e5bb8e061ccb..0000000000000
--- a/sql/core/src/main/scala-2.12/org/apache/spark/sql/execution/streaming/StreamProgress.scala
+++ /dev/null
@@ -1,54 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.spark.sql.execution.streaming
-
-import scala.collection.{immutable, GenTraversableOnce}
-
-import org.apache.spark.sql.connector.read.streaming.{Offset => OffsetV2, SparkDataStream}
-
-/**
- * A helper class that looks like a Map[Source, Offset].
- */
-class StreamProgress(
- val baseMap: immutable.Map[SparkDataStream, OffsetV2] =
- new immutable.HashMap[SparkDataStream, OffsetV2])
- extends scala.collection.immutable.Map[SparkDataStream, OffsetV2] {
-
- // Note: this class supports Scala 2.12. A parallel source tree has a 2.13 implementation.
-
- def toOffsetSeq(source: Seq[SparkDataStream], metadata: OffsetSeqMetadata): OffsetSeq = {
- OffsetSeq(source.map(get), Some(metadata))
- }
-
- override def toString: String =
- baseMap.map { case (k, v) => s"$k: $v"}.mkString("{", ",", "}")
-
- override def +[B1 >: OffsetV2](kv: (SparkDataStream, B1)): Map[SparkDataStream, B1] = {
- baseMap + kv
- }
-
- override def get(key: SparkDataStream): Option[OffsetV2] = baseMap.get(key)
-
- override def iterator: Iterator[(SparkDataStream, OffsetV2)] = baseMap.iterator
-
- override def -(key: SparkDataStream): Map[SparkDataStream, OffsetV2] = baseMap - key
-
- def ++(updates: GenTraversableOnce[(SparkDataStream, OffsetV2)]): StreamProgress = {
- new StreamProgress(baseMap ++ updates)
- }
-}
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/Dataset.scala b/sql/core/src/main/scala/org/apache/spark/sql/Dataset.scala
index 528904bb29a18..f07496e643048 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/Dataset.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/Dataset.scala
@@ -2218,7 +2218,7 @@ class Dataset[T] private[sql](
*/
@varargs
def observe(name: String, expr: Column, exprs: Column*): Dataset[T] = withTypedPlan {
- CollectMetrics(name, (expr +: exprs).map(_.named), logicalPlan)
+ CollectMetrics(name, (expr +: exprs).map(_.named), logicalPlan, id)
}
/**
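`Dataset#id` is already unique per `Dataset` instance, so stamping it into `CollectMetrics` costs nothing at the call site and the public `observe` API is unchanged. A typical use from user code (assumes a live `SparkSession` named `spark` and the standard `functions` import, neither of which is part of this diff):

```scala
import org.apache.spark.sql.functions._

// Two observations with the same name on the *same* Dataset remain legal
// (same dataframeId); reusing the name across different Datasets now
// fails analysis with DUPLICATED_METRICS_NAME, with no need for the old
// structural plan comparison.
val df = spark.range(100).observe("stats", count(lit(1)).as("rows"))
df.collect()  // the "stats" metric is delivered to QueryExecutionListeners
```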
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala b/sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala
index 23a149fc03ab3..033fe5b0d1fdc 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala
@@ -76,7 +76,7 @@ class ResolveSessionCatalog(val catalogManager: CatalogManager)
}
if (a.position.isDefined) {
throw QueryCompilationErrors.unsupportedTableOperationError(
- catalog, ident, "ALTER COLUMN ... FIRST | ALTER")
+ catalog, ident, "ALTER COLUMN ... FIRST | AFTER")
}
val builder = new MetadataBuilder
// Add comment to metadata
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/execution/SparkStrategies.scala b/sql/core/src/main/scala/org/apache/spark/sql/execution/SparkStrategies.scala
index 903565a6d591b..d851eacd5ab92 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/execution/SparkStrategies.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/execution/SparkStrategies.scala
@@ -935,7 +935,7 @@ abstract class SparkStrategies extends QueryPlanner[SparkPlan] {
throw QueryExecutionErrors.ddlUnsupportedTemporarilyError("UPDATE TABLE")
case _: MergeIntoTable =>
throw QueryExecutionErrors.ddlUnsupportedTemporarilyError("MERGE INTO TABLE")
- case logical.CollectMetrics(name, metrics, child) =>
+ case logical.CollectMetrics(name, metrics, child, _) =>
execution.CollectMetricsExec(name, metrics, planLater(child)) :: Nil
case WriteFiles(child, fileFormat, partitionColumns, bucket, options, staticPartitions) =>
WriteFilesExec(planLater(child), fileFormat, partitionColumns, bucket, options,
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/execution/columnar/InMemoryRelation.scala b/sql/core/src/main/scala/org/apache/spark/sql/execution/columnar/InMemoryRelation.scala
index 45d006b58e879..27860f23d9b54 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/execution/columnar/InMemoryRelation.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/execution/columnar/InMemoryRelation.scala
@@ -217,6 +217,11 @@ case class CachedRDDBuilder(
val cachedName = tableName.map(n => s"In-memory table $n")
.getOrElse(StringUtils.abbreviate(cachedPlan.toString, 1024))
+ val supportsColumnarInput: Boolean = {
+ cachedPlan.supportsColumnar &&
+ serializer.supportsColumnarInput(cachedPlan.output)
+ }
+
def cachedColumnBuffers: RDD[CachedBatch] = {
if (_cachedColumnBuffers == null) {
synchronized {
@@ -264,8 +269,7 @@ case class CachedRDDBuilder(
}
private def buildBuffers(): RDD[CachedBatch] = {
- val cb = if (cachedPlan.supportsColumnar &&
- serializer.supportsColumnarInput(cachedPlan.output)) {
+ val cb = if (supportsColumnarInput) {
serializer.convertColumnarBatchToCachedBatch(
cachedPlan.executeColumnar(),
cachedPlan.output,
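The `InMemoryRelation` change above hoists a condition that was previously re-evaluated inline into a single `supportsColumnarInput` member, so `buildBuffers()` and the new explain output share one definition. A minimal sketch of that refactoring pattern, using hypothetical `Plan`/`Serializer` stand-ins rather than Spark's real `SparkPlan`/`CachedBatchSerializer` API:

```scala
// Hypothetical stand-ins for SparkPlan and CachedBatchSerializer; this only
// illustrates the hoisted-condition refactor, not Spark's actual classes.
case class Plan(supportsColumnar: Boolean, output: Seq[String])

case class Serializer(columnarTypes: Set[String]) {
  def supportsColumnarInput(output: Seq[String]): Boolean =
    output.forall(columnarTypes.contains)
}

case class Builder(plan: Plan, serializer: Serializer) {
  // Evaluated once and shared by every call site, instead of repeating the
  // two-part condition wherever it is needed.
  val supportsColumnarInput: Boolean =
    plan.supportsColumnar && serializer.supportsColumnarInput(plan.output)
}

val b = Builder(Plan(supportsColumnar = true, Seq("int")), Serializer(Set("int")))
println(b.supportsColumnarInput) // prints "true"
```

Besides removing duplication, giving the condition a name lets `InMemoryTableScanExec.simpleStringWithNodeId()` report it in plan strings without recomputing it.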
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/execution/columnar/InMemoryTableScanExec.scala b/sql/core/src/main/scala/org/apache/spark/sql/execution/columnar/InMemoryTableScanExec.scala
index 08244a4f84fea..064a46369055f 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/execution/columnar/InMemoryTableScanExec.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/execution/columnar/InMemoryTableScanExec.scala
@@ -46,6 +46,15 @@ case class InMemoryTableScanExec(
}
}
+ override def simpleStringWithNodeId(): String = {
+ val columnarInfo = if (relation.cacheBuilder.supportsColumnarInput || supportsColumnar) {
+ s" (columnarIn=${relation.cacheBuilder.supportsColumnarInput}, columnarOut=$supportsColumnar)"
+ } else {
+ ""
+ }
+ super.simpleStringWithNodeId() + columnarInfo
+ }
+
override def innerChildren: Seq[QueryPlan[_]] = Seq(relation) ++ super.innerChildren
override def doCanonicalize(): SparkPlan =
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/execution/command/views.scala b/sql/core/src/main/scala/org/apache/spark/sql/execution/command/views.scala
index b27bd5bf90967..88b7826bd91b3 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/execution/command/views.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/execution/command/views.scala
@@ -146,7 +146,7 @@ case class CreateViewCommand(
// Handles `CREATE VIEW IF NOT EXISTS v0 AS SELECT ...`. Does nothing when the target view
// already exists.
} else if (tableMetadata.tableType != CatalogTableType.VIEW) {
- throw QueryCompilationErrors.tableIsNotViewError(name, replace)
+ throw QueryCompilationErrors.unsupportedCreateOrReplaceViewOnTableError(name, replace)
} else if (replace) {
// Detect cyclic view reference on CREATE OR REPLACE VIEW.
val viewIdent = tableMetadata.identifier
diff --git a/sql/core/src/main/scala-2.13/org/apache/spark/sql/execution/streaming/StreamProgress.scala b/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/StreamProgress.scala
similarity index 100%
rename from sql/core/src/main/scala-2.13/org/apache/spark/sql/execution/streaming/StreamProgress.scala
rename to sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/StreamProgress.scala
diff --git a/sql/core/src/test/resources/sql-tests/analyzer-results/change-column.sql.out b/sql/core/src/test/resources/sql-tests/analyzer-results/change-column.sql.out
index b13c603e71c6f..9fdf09c4c651a 100644
--- a/sql/core/src/test/resources/sql-tests/analyzer-results/change-column.sql.out
+++ b/sql/core/src/test/resources/sql-tests/analyzer-results/change-column.sql.out
@@ -86,7 +86,7 @@ org.apache.spark.sql.AnalysisException
"errorClass" : "UNSUPPORTED_FEATURE.TABLE_OPERATION",
"sqlState" : "0A000",
"messageParameters" : {
- "operation" : "ALTER COLUMN ... FIRST | ALTER",
+ "operation" : "ALTER COLUMN ... FIRST | AFTER",
"tableName" : "`spark_catalog`.`default`.`test_change`"
}
}
@@ -100,7 +100,7 @@ org.apache.spark.sql.AnalysisException
"errorClass" : "UNSUPPORTED_FEATURE.TABLE_OPERATION",
"sqlState" : "0A000",
"messageParameters" : {
- "operation" : "ALTER COLUMN ... FIRST | ALTER",
+ "operation" : "ALTER COLUMN ... FIRST | AFTER",
"tableName" : "`spark_catalog`.`default`.`test_change`"
}
}
@@ -206,7 +206,7 @@ ALTER TABLE temp_view CHANGE a TYPE INT
-- !query analysis
org.apache.spark.sql.AnalysisException
{
- "errorClass" : "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
+ "errorClass" : "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
"messageParameters" : {
"operation" : "ALTER TABLE ... CHANGE COLUMN",
"viewName" : "`temp_view`"
@@ -234,7 +234,7 @@ ALTER TABLE global_temp.global_temp_view CHANGE a TYPE INT
-- !query analysis
org.apache.spark.sql.AnalysisException
{
- "errorClass" : "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
+ "errorClass" : "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
"messageParameters" : {
"operation" : "ALTER TABLE ... CHANGE COLUMN",
"viewName" : "`global_temp`.`global_temp_view`"
diff --git a/sql/core/src/test/resources/sql-tests/results/change-column.sql.out b/sql/core/src/test/resources/sql-tests/results/change-column.sql.out
index 83d0ef57001c2..0b7a6e88f51c0 100644
--- a/sql/core/src/test/resources/sql-tests/results/change-column.sql.out
+++ b/sql/core/src/test/resources/sql-tests/results/change-column.sql.out
@@ -112,7 +112,7 @@ org.apache.spark.sql.AnalysisException
"errorClass" : "UNSUPPORTED_FEATURE.TABLE_OPERATION",
"sqlState" : "0A000",
"messageParameters" : {
- "operation" : "ALTER COLUMN ... FIRST | ALTER",
+ "operation" : "ALTER COLUMN ... FIRST | AFTER",
"tableName" : "`spark_catalog`.`default`.`test_change`"
}
}
@@ -128,7 +128,7 @@ org.apache.spark.sql.AnalysisException
"errorClass" : "UNSUPPORTED_FEATURE.TABLE_OPERATION",
"sqlState" : "0A000",
"messageParameters" : {
- "operation" : "ALTER COLUMN ... FIRST | ALTER",
+ "operation" : "ALTER COLUMN ... FIRST | AFTER",
"tableName" : "`spark_catalog`.`default`.`test_change`"
}
}
@@ -270,7 +270,7 @@ struct<>
-- !query output
org.apache.spark.sql.AnalysisException
{
- "errorClass" : "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
+ "errorClass" : "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
"messageParameters" : {
"operation" : "ALTER TABLE ... CHANGE COLUMN",
"viewName" : "`temp_view`"
@@ -300,7 +300,7 @@ struct<>
-- !query output
org.apache.spark.sql.AnalysisException
{
- "errorClass" : "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
+ "errorClass" : "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
"messageParameters" : {
"operation" : "ALTER TABLE ... CHANGE COLUMN",
"viewName" : "`global_temp`.`global_temp_view`"
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/DatasetSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/DatasetSuite.scala
index e05b545f235ba..3246953497876 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/DatasetSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/DatasetSuite.scala
@@ -20,6 +20,7 @@ package org.apache.spark.sql
import java.io.{Externalizable, ObjectInput, ObjectOutput}
import java.sql.{Date, Timestamp}
+import scala.reflect.ClassTag
import scala.util.Random
import org.apache.hadoop.fs.{Path, PathFilter}
@@ -32,8 +33,9 @@ import org.apache.spark.TestUtils.withListener
import org.apache.spark.internal.config.MAX_RESULT_SIZE
import org.apache.spark.scheduler.{SparkListener, SparkListenerJobStart}
import org.apache.spark.sql.catalyst.{FooClassWithEnum, FooEnum, ScroogeLikeExample}
-import org.apache.spark.sql.catalyst.encoders.{ExpressionEncoder, OuterScopes}
-import org.apache.spark.sql.catalyst.expressions.GenericRowWithSchema
+import org.apache.spark.sql.catalyst.encoders.{AgnosticEncoders, ExpressionEncoder, OuterScopes}
+import org.apache.spark.sql.catalyst.encoders.AgnosticEncoders.BoxedIntEncoder
+import org.apache.spark.sql.catalyst.expressions.{CodegenObjectFactoryMode, GenericRowWithSchema}
import org.apache.spark.sql.catalyst.plans.{LeftAnti, LeftSemi}
import org.apache.spark.sql.catalyst.util.sideBySide
import org.apache.spark.sql.execution.{LogicalRDD, RDDScanExec, SQLExecution}
@@ -2561,6 +2563,40 @@ class DatasetSuite extends QueryTest
checkDataset(ds.filter(f(col("_1"))), Tuple1(ValueClass(2)))
}
+
+ test("CLASS_UNSUPPORTED_BY_MAP_OBJECTS when creating dataset") {
+ withSQLConf(
+      // Set CODEGEN_FACTORY_MODE to NO_CODEGEN to reproduce CLASS_UNSUPPORTED_BY_MAP_OBJECTS
+ SQLConf.CODEGEN_FACTORY_MODE.key -> CodegenObjectFactoryMode.NO_CODEGEN.toString) {
+      // Create our own encoder to override the default encoder from spark.implicits._
+ implicit val im: ExpressionEncoder[Array[Int]] = ExpressionEncoder(
+ AgnosticEncoders.IterableEncoder(
+ ClassTag(classOf[Array[Int]]), BoxedIntEncoder, false, false))
+
+ val df = spark.createDataset(Seq(Array(1)))
+ val exception = intercept[org.apache.spark.SparkRuntimeException] {
+ df.collect()
+ }
+ val expressions = im.resolveAndBind(df.queryExecution.logical.output,
+ spark.sessionState.analyzer)
+ .createDeserializer().expressions
+
+ // Expression decoding error
+ checkError(
+ exception = exception,
+ errorClass = "_LEGACY_ERROR_TEMP_2151",
+ parameters = Map(
+ "e" -> exception.getCause.toString(),
+ "expressions" -> expressions.map(
+ _.simpleString(SQLConf.get.maxToStringFields)).mkString("\n"))
+ )
+ // class unsupported by map objects
+ checkError(
+ exception = exception.getCause.asInstanceOf[org.apache.spark.SparkRuntimeException],
+ errorClass = "CLASS_UNSUPPORTED_BY_MAP_OBJECTS",
+ parameters = Map("cls" -> classOf[Array[Int]].getName))
+ }
+ }
}
class DatasetLargeResultCollectingSuite extends QueryTest
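The new test constructs its encoder with `ClassTag(classOf[Array[Int]])` because `IterableEncoder` needs the runtime class of the target collection at deserialization time. A small sketch of what a `ClassTag` carries (`runtimeClassOf` is a local helper, not a Spark API):

```scala
import scala.reflect.ClassTag

// A ClassTag captures the erased runtime class of its type parameter, which
// is what deserializers use to instantiate the right container class.
def runtimeClassOf[T](implicit ct: ClassTag[T]): Class[_] = ct.runtimeClass

println(runtimeClassOf[Array[Int]].getName) // JVM name of int[] is "[I"
```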
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/connector/DataSourceV2SQLSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/connector/DataSourceV2SQLSuite.scala
index c8ad7ec02a991..8e0580bf644fe 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/connector/DataSourceV2SQLSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/connector/DataSourceV2SQLSuite.scala
@@ -2601,7 +2601,7 @@ class DataSourceV2SQLSuiteV1Filter
sql("create global temp view v as select 1")
checkError(
exception = intercept[AnalysisException](sql("COMMENT ON TABLE global_temp.v IS NULL")),
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> "`global_temp`.`v`",
"operation" -> "COMMENT ON TABLE"),
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/execution/SQLViewSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/execution/SQLViewSuite.scala
index ff483aa4da6a5..75e5d4d452e15 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/execution/SQLViewSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/execution/SQLViewSuite.scala
@@ -115,7 +115,7 @@ abstract class SQLViewSuite extends QueryTest with SQLTestUtils {
exception = intercept[AnalysisException] {
sql("CREATE OR REPLACE VIEW tab1 AS SELECT * FROM jt")
},
- errorClass = "UNSUPPORTED_TABLE_OPERATION.WITHOUT_SUGGESTION",
+ errorClass = "EXPECT_VIEW_NOT_TABLE.NO_ALTERNATIVE",
parameters = Map(
"tableName" -> s"`$SESSION_CATALOG_NAME`.`default`.`tab1`",
"operation" -> "CREATE OR REPLACE VIEW")
@@ -124,16 +124,15 @@ abstract class SQLViewSuite extends QueryTest with SQLTestUtils {
exception = intercept[AnalysisException] {
sql("CREATE VIEW tab1 AS SELECT * FROM jt")
},
- errorClass = "UNSUPPORTED_TABLE_OPERATION.WITHOUT_SUGGESTION",
+ errorClass = "TABLE_OR_VIEW_ALREADY_EXISTS",
parameters = Map(
- "tableName" -> s"`$SESSION_CATALOG_NAME`.`default`.`tab1`",
- "operation" -> "CREATE VIEW")
+ "relationName" -> s"`$SESSION_CATALOG_NAME`.`default`.`tab1`")
)
checkError(
exception = intercept[AnalysisException] {
sql("ALTER VIEW tab1 AS SELECT * FROM jt")
},
- errorClass = "UNSUPPORTED_TABLE_OPERATION.WITHOUT_SUGGESTION",
+ errorClass = "EXPECT_VIEW_NOT_TABLE.NO_ALTERNATIVE",
parameters = Map(
"tableName" -> s"`$SESSION_CATALOG_NAME`.`default`.`tab1`",
"operation" -> "ALTER VIEW ... AS"
@@ -162,7 +161,7 @@ abstract class SQLViewSuite extends QueryTest with SQLTestUtils {
exception = intercept[AnalysisException] {
sql(s"ALTER VIEW $viewName SET TBLPROPERTIES ('p' = 'an')")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
+ errorClass = "EXPECT_PERMANENT_VIEW_NOT_TEMP",
parameters = Map(
"viewName" -> s"`$viewName`",
"operation" -> "ALTER VIEW ... SET TBLPROPERTIES"
@@ -177,7 +176,7 @@ abstract class SQLViewSuite extends QueryTest with SQLTestUtils {
exception = intercept[AnalysisException] {
sql(s"ALTER VIEW $viewName UNSET TBLPROPERTIES ('p')")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
+ errorClass = "EXPECT_PERMANENT_VIEW_NOT_TEMP",
parameters = Map(
"viewName" -> s"`$viewName`",
"operation" -> "ALTER VIEW ... UNSET TBLPROPERTIES"
@@ -199,7 +198,7 @@ abstract class SQLViewSuite extends QueryTest with SQLTestUtils {
exception = intercept[AnalysisException] {
sql(s"ALTER TABLE $viewName SET SERDE 'whatever'")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITH_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.USE_ALTER_VIEW",
parameters = Map(
"viewName" -> s"`$viewName`",
"operation" -> "ALTER TABLE ... SET [SERDE|SERDEPROPERTIES]"
@@ -210,7 +209,7 @@ abstract class SQLViewSuite extends QueryTest with SQLTestUtils {
exception = intercept[AnalysisException] {
sql(s"ALTER TABLE $viewName PARTITION (a=1, b=2) SET SERDE 'whatever'")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITH_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.USE_ALTER_VIEW",
parameters = Map(
"viewName" -> s"`$viewName`",
"operation" -> "ALTER TABLE ... SET [SERDE|SERDEPROPERTIES]"
@@ -221,7 +220,7 @@ abstract class SQLViewSuite extends QueryTest with SQLTestUtils {
exception = intercept[AnalysisException] {
sql(s"ALTER TABLE $viewName SET SERDEPROPERTIES ('p' = 'an')")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITH_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.USE_ALTER_VIEW",
parameters = Map(
"viewName" -> s"`$viewName`",
"operation" -> "ALTER TABLE ... SET [SERDE|SERDEPROPERTIES]"
@@ -232,7 +231,7 @@ abstract class SQLViewSuite extends QueryTest with SQLTestUtils {
exception = intercept[AnalysisException] {
sql(s"ALTER TABLE $viewName PARTITION (a='4') RENAME TO PARTITION (a='5')")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITH_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> s"`$viewName`",
"operation" -> "ALTER TABLE ... RENAME TO PARTITION"
@@ -243,7 +242,7 @@ abstract class SQLViewSuite extends QueryTest with SQLTestUtils {
exception = intercept[AnalysisException] {
sql(s"ALTER TABLE $viewName RECOVER PARTITIONS")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITH_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> s"`$viewName`",
"operation" -> "ALTER TABLE ... RECOVER PARTITIONS"
@@ -254,7 +253,7 @@ abstract class SQLViewSuite extends QueryTest with SQLTestUtils {
exception = intercept[AnalysisException] {
sql(s"ALTER TABLE $viewName SET LOCATION '/path/to/your/lovely/heart'")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITH_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> s"`$viewName`",
"operation" -> "ALTER TABLE ... SET LOCATION ..."
@@ -265,7 +264,7 @@ abstract class SQLViewSuite extends QueryTest with SQLTestUtils {
exception = intercept[AnalysisException] {
sql(s"ALTER TABLE $viewName PARTITION (a='4') SET LOCATION '/path/to/home'")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITH_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> s"`$viewName`",
"operation" -> "ALTER TABLE ... SET LOCATION ..."
@@ -276,7 +275,7 @@ abstract class SQLViewSuite extends QueryTest with SQLTestUtils {
exception = intercept[AnalysisException] {
sql(s"ALTER TABLE $viewName ADD IF NOT EXISTS PARTITION (a='4', b='8')")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITH_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> s"`$viewName`",
"operation" -> "ALTER TABLE ... ADD PARTITION ..."
@@ -287,7 +286,7 @@ abstract class SQLViewSuite extends QueryTest with SQLTestUtils {
exception = intercept[AnalysisException] {
sql(s"ALTER TABLE $viewName DROP PARTITION (a='4', b='8')")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITH_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> s"`$viewName`",
"operation" -> "ALTER TABLE ... DROP PARTITION ..."
@@ -298,7 +297,7 @@ abstract class SQLViewSuite extends QueryTest with SQLTestUtils {
exception = intercept[AnalysisException] {
sql(s"ALTER TABLE $viewName SET TBLPROPERTIES ('p' = 'an')")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITH_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.USE_ALTER_VIEW",
parameters = Map(
"viewName" -> s"`$viewName`",
"operation" -> "ALTER TABLE ... SET TBLPROPERTIES"
@@ -309,7 +308,7 @@ abstract class SQLViewSuite extends QueryTest with SQLTestUtils {
exception = intercept[AnalysisException] {
sql(s"ALTER TABLE $viewName UNSET TBLPROPERTIES ('p')")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITH_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.USE_ALTER_VIEW",
parameters = Map(
"viewName" -> s"`$viewName`",
"operation" -> "ALTER TABLE ... UNSET TBLPROPERTIES"
@@ -339,7 +338,7 @@ abstract class SQLViewSuite extends QueryTest with SQLTestUtils {
exception = intercept[AnalysisException] {
sql(sqlText)
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> s"`$viewName`",
"operation" -> "LOAD DATA"
@@ -354,7 +353,7 @@ abstract class SQLViewSuite extends QueryTest with SQLTestUtils {
exception = intercept[AnalysisException] {
sql(s"SHOW CREATE TABLE $viewName")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
+ errorClass = "EXPECT_PERMANENT_VIEW_NOT_TEMP",
parameters = Map(
"viewName" -> s"`$viewName`",
"operation" -> "SHOW CREATE TABLE"
@@ -369,7 +368,7 @@ abstract class SQLViewSuite extends QueryTest with SQLTestUtils {
exception = intercept[AnalysisException] {
sql(s"ANALYZE TABLE $viewName COMPUTE STATISTICS")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
+ errorClass = "EXPECT_PERMANENT_VIEW_NOT_TEMP",
parameters = Map(
"viewName" -> s"`$viewName`",
"operation" -> "ANALYZE TABLE"
@@ -406,7 +405,7 @@ abstract class SQLViewSuite extends QueryTest with SQLTestUtils {
exception = intercept[AnalysisException] {
sql(s"INSERT INTO TABLE $viewName SELECT 1")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> s"`$SESSION_CATALOG_NAME`.`default`.`testview`",
"operation" -> "INSERT"
@@ -421,7 +420,7 @@ abstract class SQLViewSuite extends QueryTest with SQLTestUtils {
exception = intercept[AnalysisException] {
sql(sqlText)
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> s"`$SESSION_CATALOG_NAME`.`default`.`testview`",
"operation" -> "LOAD DATA"),
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/execution/SQLViewTestSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/execution/SQLViewTestSuite.scala
index c62227626b148..73d0bd19bf671 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/execution/SQLViewTestSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/execution/SQLViewTestSuite.scala
@@ -488,7 +488,7 @@ abstract class TempViewTestSuite extends SQLViewTestSuite {
exception = intercept[AnalysisException] {
sql(s"SHOW CREATE TABLE ${formattedViewName(viewName)}")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
+ errorClass = "EXPECT_PERMANENT_VIEW_NOT_TEMP",
parameters = Map(
"viewName" -> toSQLId(tableIdentifier(viewName).nameParts),
"operation" -> "SHOW CREATE TABLE"),
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlterTableAddPartitionParserSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlterTableAddPartitionParserSuite.scala
index 90b1587f39721..73430e8fe5e35 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlterTableAddPartitionParserSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlterTableAddPartitionParserSuite.scala
@@ -32,8 +32,7 @@ class AlterTableAddPartitionParserSuite extends AnalysisTest with SharedSparkSes
val expected = AddPartitions(
UnresolvedTable(
Seq("a", "b", "c"),
- "ALTER TABLE ... ADD PARTITION ...",
- true),
+ "ALTER TABLE ... ADD PARTITION ..."),
Seq(
UnresolvedPartitionSpec(Map("dt" -> "2008-08-08", "country" -> "us"), Some("location1")),
UnresolvedPartitionSpec(Map("dt" -> "2009-09-09", "country" -> "uk"), None)),
@@ -47,8 +46,7 @@ class AlterTableAddPartitionParserSuite extends AnalysisTest with SharedSparkSes
val expected = AddPartitions(
UnresolvedTable(
Seq("a", "b", "c"),
- "ALTER TABLE ... ADD PARTITION ...",
- true),
+ "ALTER TABLE ... ADD PARTITION ..."),
Seq(UnresolvedPartitionSpec(Map("dt" -> "2008-08-08"), Some("loc"))),
ifNotExists = false)
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlterTableDropPartitionParserSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlterTableDropPartitionParserSuite.scala
index 7d6b5730c9ca3..605d736673c1a 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlterTableDropPartitionParserSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlterTableDropPartitionParserSuite.scala
@@ -32,8 +32,7 @@ class AlterTableDropPartitionParserSuite extends AnalysisTest with SharedSparkSe
val expected = DropPartitions(
UnresolvedTable(
Seq("table_name"),
- "ALTER TABLE ... DROP PARTITION ...",
- true),
+ "ALTER TABLE ... DROP PARTITION ..."),
Seq(
UnresolvedPartitionSpec(Map("dt" -> "2008-08-08", "country" -> "us")),
UnresolvedPartitionSpec(Map("dt" -> "2009-09-09", "country" -> "uk"))),
@@ -52,8 +51,7 @@ class AlterTableDropPartitionParserSuite extends AnalysisTest with SharedSparkSe
val expected = DropPartitions(
UnresolvedTable(
Seq("table_name"),
- "ALTER TABLE ... DROP PARTITION ...",
- true),
+ "ALTER TABLE ... DROP PARTITION ..."),
Seq(
UnresolvedPartitionSpec(Map("dt" -> "2008-08-08", "country" -> "us")),
UnresolvedPartitionSpec(Map("dt" -> "2009-09-09", "country" -> "uk"))),
@@ -67,8 +65,7 @@ class AlterTableDropPartitionParserSuite extends AnalysisTest with SharedSparkSe
val expected = DropPartitions(
UnresolvedTable(
Seq("a", "b", "c"),
- "ALTER TABLE ... DROP PARTITION ...",
- true),
+ "ALTER TABLE ... DROP PARTITION ..."),
Seq(UnresolvedPartitionSpec(Map("ds" -> "2017-06-10"))),
ifExists = true,
purge = false)
@@ -81,8 +78,7 @@ class AlterTableDropPartitionParserSuite extends AnalysisTest with SharedSparkSe
val expected = DropPartitions(
UnresolvedTable(
Seq("table_name"),
- "ALTER TABLE ... DROP PARTITION ...",
- true),
+ "ALTER TABLE ... DROP PARTITION ..."),
Seq(UnresolvedPartitionSpec(Map("p" -> "1"))),
ifExists = false,
purge = true)
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlterTableRecoverPartitionsParserSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlterTableRecoverPartitionsParserSuite.scala
index 7c82b1f81ab97..936b1a3dfdb20 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlterTableRecoverPartitionsParserSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlterTableRecoverPartitionsParserSuite.scala
@@ -38,8 +38,7 @@ class AlterTableRecoverPartitionsParserSuite extends AnalysisTest with SharedSpa
RecoverPartitions(
UnresolvedTable(
Seq("tbl"),
- "ALTER TABLE ... RECOVER PARTITIONS",
- true)))
+ "ALTER TABLE ... RECOVER PARTITIONS")))
}
test("recover partitions of a table in a database") {
@@ -48,8 +47,7 @@ class AlterTableRecoverPartitionsParserSuite extends AnalysisTest with SharedSpa
RecoverPartitions(
UnresolvedTable(
Seq("db", "tbl"),
- "ALTER TABLE ... RECOVER PARTITIONS",
- true)))
+ "ALTER TABLE ... RECOVER PARTITIONS")))
}
test("recover partitions of a table spark_catalog") {
@@ -58,8 +56,7 @@ class AlterTableRecoverPartitionsParserSuite extends AnalysisTest with SharedSpa
RecoverPartitions(
UnresolvedTable(
Seq("spark_catalog", "db", "TBL"),
- "ALTER TABLE ... RECOVER PARTITIONS",
- true)))
+ "ALTER TABLE ... RECOVER PARTITIONS")))
}
test("recover partitions of a table in nested namespaces") {
@@ -68,7 +65,6 @@ class AlterTableRecoverPartitionsParserSuite extends AnalysisTest with SharedSpa
RecoverPartitions(
UnresolvedTable(
Seq("ns1", "ns2", "ns3", "ns4", "ns5", "ns6", "ns7", "ns8", "t"),
- "ALTER TABLE ... RECOVER PARTITIONS",
- true)))
+ "ALTER TABLE ... RECOVER PARTITIONS")))
}
}
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlterTableRenamePartitionParserSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlterTableRenamePartitionParserSuite.scala
index 16bf0c65217ea..848d17bf15f83 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlterTableRenamePartitionParserSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlterTableRenamePartitionParserSuite.scala
@@ -32,8 +32,7 @@ class AlterTableRenamePartitionParserSuite extends AnalysisTest with SharedSpark
val expected = RenamePartitions(
UnresolvedTable(
Seq("a", "b", "c"),
- "ALTER TABLE ... RENAME TO PARTITION",
- true),
+ "ALTER TABLE ... RENAME TO PARTITION"),
UnresolvedPartitionSpec(Map("ds" -> "2017-06-10")),
UnresolvedPartitionSpec(Map("ds" -> "2018-06-10")))
comparePlans(parsed, expected)
@@ -48,8 +47,7 @@ class AlterTableRenamePartitionParserSuite extends AnalysisTest with SharedSpark
val expected = RenamePartitions(
UnresolvedTable(
Seq("table_name"),
- "ALTER TABLE ... RENAME TO PARTITION",
- true),
+ "ALTER TABLE ... RENAME TO PARTITION"),
UnresolvedPartitionSpec(Map("dt" -> "2008-08-08", "country" -> "us")),
UnresolvedPartitionSpec(Map("dt" -> "2008-09-09", "country" -> "uk")))
comparePlans(parsed, expected)
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlterTableSetLocationParserSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlterTableSetLocationParserSuite.scala
index a44bb8a3729c0..3f50d6316d4cd 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlterTableSetLocationParserSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlterTableSetLocationParserSuite.scala
@@ -28,7 +28,7 @@ class AlterTableSetLocationParserSuite extends AnalysisTest with SharedSparkSess
val sql1 = "ALTER TABLE a.b.c SET LOCATION 'new location'"
val parsed1 = parsePlan(sql1)
val expected1 = SetTableLocation(
- UnresolvedTable(Seq("a", "b", "c"), "ALTER TABLE ... SET LOCATION ...", true),
+ UnresolvedTable(Seq("a", "b", "c"), "ALTER TABLE ... SET LOCATION ..."),
None,
"new location")
comparePlans(parsed1, expected1)
@@ -36,7 +36,7 @@ class AlterTableSetLocationParserSuite extends AnalysisTest with SharedSparkSess
val sql2 = "ALTER TABLE a.b.c PARTITION(ds='2017-06-10') SET LOCATION 'new location'"
val parsed2 = parsePlan(sql2)
val expected2 = SetTableLocation(
- UnresolvedTable(Seq("a", "b", "c"), "ALTER TABLE ... SET LOCATION ...", true),
+ UnresolvedTable(Seq("a", "b", "c"), "ALTER TABLE ... SET LOCATION ..."),
Some(Map("ds" -> "2017-06-10")),
"new location")
comparePlans(parsed2, expected2)
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLSuite.scala
index f550ac6f08e98..f0a5f14bc4099 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLSuite.scala
@@ -211,7 +211,7 @@ class InMemoryCatalogedDDLSuite extends DDLSuite with SharedSparkSession {
errorClass = "UNSUPPORTED_FEATURE.TABLE_OPERATION",
sqlState = "0A000",
parameters = Map("tableName" -> "`spark_catalog`.`default`.`t`",
- "operation" -> "ALTER COLUMN ... FIRST | ALTER"))
+ "operation" -> "ALTER COLUMN ... FIRST | AFTER"))
}
}
@@ -2068,7 +2068,7 @@ abstract class DDLSuite extends QueryTest with DDLSuiteBase {
exception = intercept[AnalysisException] {
sql("ALTER TABLE tmp_v ADD COLUMNS (c3 INT)")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> "`tmp_v`",
"operation" -> "ALTER TABLE ... ADD COLUMNS"),
@@ -2087,7 +2087,7 @@ abstract class DDLSuite extends QueryTest with DDLSuiteBase {
exception = intercept[AnalysisException] {
sql("ALTER TABLE v1 ADD COLUMNS (c3 INT)")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> s"`$SESSION_CATALOG_NAME`.`default`.`v1`",
"operation" -> "ALTER TABLE ... ADD COLUMNS"),
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/execution/command/TruncateTableSuiteBase.scala b/sql/core/src/test/scala/org/apache/spark/sql/execution/command/TruncateTableSuiteBase.scala
index 431751c9c2910..facbfa3dedf8c 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/execution/command/TruncateTableSuiteBase.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/execution/command/TruncateTableSuiteBase.scala
@@ -181,7 +181,7 @@ trait TruncateTableSuiteBase extends QueryTest with DDLCommandTestUtils {
exception = intercept[AnalysisException] {
sql("TRUNCATE TABLE v0")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> "`spark_catalog`.`default`.`v0`",
"operation" -> "TRUNCATE TABLE"),
@@ -198,7 +198,7 @@ trait TruncateTableSuiteBase extends QueryTest with DDLCommandTestUtils {
exception = intercept[AnalysisException] {
sql("TRUNCATE TABLE v1")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> "`v1`",
"operation" -> "TRUNCATE TABLE"),
@@ -213,7 +213,7 @@ trait TruncateTableSuiteBase extends QueryTest with DDLCommandTestUtils {
exception = intercept[AnalysisException] {
sql(s"TRUNCATE TABLE $v2")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> "`global_temp`.`v2`",
"operation" -> "TRUNCATE TABLE"),
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/execution/command/v1/ShowPartitionsSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/execution/command/v1/ShowPartitionsSuite.scala
index e282e20bdeab9..9863942c6ea19 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/execution/command/v1/ShowPartitionsSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/execution/command/v1/ShowPartitionsSuite.scala
@@ -57,7 +57,7 @@ trait ShowPartitionsSuiteBase extends command.ShowPartitionsSuiteBase {
exception = intercept[AnalysisException] {
sql(s"SHOW PARTITIONS $view")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> s"`spark_catalog`.`default`.`view1`",
"operation" -> "SHOW PARTITIONS"
@@ -80,7 +80,7 @@ trait ShowPartitionsSuiteBase extends command.ShowPartitionsSuiteBase {
exception = intercept[AnalysisException] {
sql(s"SHOW PARTITIONS $viewName")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> "`test_view`",
"operation" -> "SHOW PARTITIONS"
@@ -124,7 +124,7 @@ class ShowPartitionsSuite extends ShowPartitionsSuiteBase with CommandSuiteBase
exception = intercept[AnalysisException] {
sql(s"SHOW PARTITIONS $viewName")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> "`test_view`",
"operation" -> "SHOW PARTITIONS"
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/internal/CatalogSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/internal/CatalogSuite.scala
index 518302f811ef1..01e23d5681977 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/internal/CatalogSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/internal/CatalogSuite.scala
@@ -713,7 +713,7 @@ class CatalogSuite extends SharedSparkSession with AnalysisTest with BeforeAndAf
exception = intercept[AnalysisException] {
spark.catalog.recoverPartitions("my_temp_table")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITHOUT_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> "`my_temp_table`",
"operation" -> "recoverPartitions()")
diff --git a/sql/hive-thriftserver/pom.xml b/sql/hive-thriftserver/pom.xml
index 76a1037f1cbac..920a9b67376f0 100644
--- a/sql/hive-thriftserver/pom.xml
+++ b/sql/hive-thriftserver/pom.xml
@@ -21,12 +21,12 @@
   <modelVersion>4.0.0</modelVersion>
   <parent>
     <groupId>org.apache.spark</groupId>
-    <artifactId>spark-parent_2.12</artifactId>
+    <artifactId>spark-parent_2.13</artifactId>
     <version>4.0.0-SNAPSHOT</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>
 
-  <artifactId>spark-hive-thriftserver_2.12</artifactId>
+  <artifactId>spark-hive-thriftserver_2.13</artifactId>
   <packaging>jar</packaging>
   <name>Spark Project Hive Thrift Server</name>
   <url>https://spark.apache.org/</url>
@@ -61,12 +61,10 @@
       <type>test-jar</type>
       <scope>test</scope>
     </dependency>
-
     <dependency>
       <groupId>com.google.guava</groupId>
       <artifactId>guava</artifactId>
diff --git a/sql/hive/pom.xml b/sql/hive/pom.xml
index 7a12c6222eb4d..8b8842333793f 100644
--- a/sql/hive/pom.xml
+++ b/sql/hive/pom.xml
@@ -21,12 +21,12 @@
   <modelVersion>4.0.0</modelVersion>
   <parent>
     <groupId>org.apache.spark</groupId>
-    <artifactId>spark-parent_2.12</artifactId>
+    <artifactId>spark-parent_2.13</artifactId>
    <version>4.0.0-SNAPSHOT</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>
 
-  <artifactId>spark-hive_2.12</artifactId>
+  <artifactId>spark-hive_2.13</artifactId>
   <packaging>jar</packaging>
   <name>Spark Project Hive</name>
   <url>https://spark.apache.org/</url>
@@ -79,12 +79,10 @@
       <type>test-jar</type>
       <scope>test</scope>
     </dependency>
-
     <dependency>
       <groupId>${hive.group}</groupId>
       <artifactId>hive-common</artifactId>
diff --git a/sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveDDLSuite.scala b/sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveDDLSuite.scala
index 99250ed6a91b7..b6971204b1cc1 100644
--- a/sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveDDLSuite.scala
+++ b/sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveDDLSuite.scala
@@ -876,7 +876,7 @@ class HiveDDLSuite
exception = intercept[AnalysisException] {
sql(s"ALTER VIEW $tabName SET TBLPROPERTIES ('p' = 'an')")
},
- errorClass = "UNSUPPORTED_TABLE_OPERATION.WITH_SUGGESTION",
+ errorClass = "EXPECT_VIEW_NOT_TABLE.USE_ALTER_TABLE",
parameters = Map(
"tableName" -> s"`$SESSION_CATALOG_NAME`.`default`.`$tabName`",
"operation" -> "ALTER VIEW ... SET TBLPROPERTIES"),
@@ -887,7 +887,7 @@ class HiveDDLSuite
exception = intercept[AnalysisException] {
sql(s"ALTER TABLE $oldViewName SET TBLPROPERTIES ('p' = 'an')")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITH_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.USE_ALTER_VIEW",
parameters = Map(
"viewName" -> s"`$SESSION_CATALOG_NAME`.`default`.`$oldViewName`",
"operation" -> "ALTER TABLE ... SET TBLPROPERTIES"),
@@ -898,7 +898,7 @@ class HiveDDLSuite
exception = intercept[AnalysisException] {
sql(s"ALTER VIEW $tabName UNSET TBLPROPERTIES ('p')")
},
- errorClass = "UNSUPPORTED_TABLE_OPERATION.WITH_SUGGESTION",
+ errorClass = "EXPECT_VIEW_NOT_TABLE.USE_ALTER_TABLE",
parameters = Map(
"tableName" -> s"`$SESSION_CATALOG_NAME`.`default`.`$tabName`",
"operation" -> "ALTER VIEW ... UNSET TBLPROPERTIES"),
@@ -909,7 +909,7 @@ class HiveDDLSuite
exception = intercept[AnalysisException] {
sql(s"ALTER TABLE $oldViewName UNSET TBLPROPERTIES ('p')")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITH_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.USE_ALTER_VIEW",
parameters = Map(
"viewName" -> s"`$SESSION_CATALOG_NAME`.`default`.`$oldViewName`",
"operation" -> "ALTER TABLE ... UNSET TBLPROPERTIES"),
@@ -920,7 +920,7 @@ class HiveDDLSuite
exception = intercept[AnalysisException] {
sql(s"ALTER TABLE $oldViewName SET LOCATION '/path/to/home'")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITH_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> s"`$SESSION_CATALOG_NAME`.`default`.`$oldViewName`",
"operation" -> "ALTER TABLE ... SET LOCATION ..."),
@@ -931,7 +931,7 @@ class HiveDDLSuite
exception = intercept[AnalysisException] {
sql(s"ALTER TABLE $oldViewName SET SERDE 'whatever'")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITH_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.USE_ALTER_VIEW",
parameters = Map(
"viewName" -> s"`$SESSION_CATALOG_NAME`.`default`.`$oldViewName`",
"operation" -> "ALTER TABLE ... SET [SERDE|SERDEPROPERTIES]"),
@@ -942,7 +942,7 @@ class HiveDDLSuite
exception = intercept[AnalysisException] {
sql(s"ALTER TABLE $oldViewName SET SERDEPROPERTIES ('x' = 'y')")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITH_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.USE_ALTER_VIEW",
parameters = Map(
"viewName" -> s"`$SESSION_CATALOG_NAME`.`default`.`$oldViewName`",
"operation" -> "ALTER TABLE ... SET [SERDE|SERDEPROPERTIES]"),
@@ -953,7 +953,7 @@ class HiveDDLSuite
exception = intercept[AnalysisException] {
sql(s"ALTER TABLE $oldViewName PARTITION (a=1, b=2) SET SERDEPROPERTIES ('x' = 'y')")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITH_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.USE_ALTER_VIEW",
parameters = Map(
"viewName" -> s"`$SESSION_CATALOG_NAME`.`default`.`$oldViewName`",
"operation" -> "ALTER TABLE ... SET [SERDE|SERDEPROPERTIES]"),
@@ -964,7 +964,7 @@ class HiveDDLSuite
exception = intercept[AnalysisException] {
sql(s"ALTER TABLE $oldViewName RECOVER PARTITIONS")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITH_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> s"`$SESSION_CATALOG_NAME`.`default`.`$oldViewName`",
"operation" -> "ALTER TABLE ... RECOVER PARTITIONS"),
@@ -975,7 +975,7 @@ class HiveDDLSuite
exception = intercept[AnalysisException] {
sql(s"ALTER TABLE $oldViewName PARTITION (a='1') RENAME TO PARTITION (a='100')")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITH_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> s"`$SESSION_CATALOG_NAME`.`default`.`$oldViewName`",
"operation" -> "ALTER TABLE ... RENAME TO PARTITION"),
@@ -986,7 +986,7 @@ class HiveDDLSuite
exception = intercept[AnalysisException] {
sql(s"ALTER TABLE $oldViewName ADD IF NOT EXISTS PARTITION (a='4', b='8')")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITH_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> s"`$SESSION_CATALOG_NAME`.`default`.`$oldViewName`",
"operation" -> "ALTER TABLE ... ADD PARTITION ..."),
@@ -997,7 +997,7 @@ class HiveDDLSuite
exception = intercept[AnalysisException] {
sql(s"ALTER TABLE $oldViewName DROP IF EXISTS PARTITION (a='2')")
},
- errorClass = "UNSUPPORTED_VIEW_OPERATION.WITH_SUGGESTION",
+ errorClass = "EXPECT_TABLE_NOT_VIEW.NO_ALTERNATIVE",
parameters = Map(
"viewName" -> s"`$SESSION_CATALOG_NAME`.`default`.`$oldViewName`",
"operation" -> "ALTER TABLE ... DROP PARTITION ..."),
diff --git a/streaming/pom.xml b/streaming/pom.xml
index b36289e4e4945..f409791f46704 100644
--- a/streaming/pom.xml
+++ b/streaming/pom.xml
@@ -20,12 +20,12 @@
   <modelVersion>4.0.0</modelVersion>
   <parent>
     <groupId>org.apache.spark</groupId>
-    <artifactId>spark-parent_2.12</artifactId>
+    <artifactId>spark-parent_2.13</artifactId>
     <version>4.0.0-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 
-  <artifactId>spark-streaming_2.12</artifactId>
+  <artifactId>spark-streaming_2.13</artifactId>
   <properties>
     <sbt.project.name>streaming</sbt.project.name>
   </properties>
@@ -50,12 +50,10 @@
       <groupId>org.apache.spark</groupId>
       <artifactId>spark-tags_${scala.binary.version}</artifactId>
-
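The `_2.12` → `_2.13` artifact-suffix changes across these `pom.xml` files are purely mechanical. As an illustration (not part of this patch, which Spark generates with its own `dev/change-scala-version.sh` tooling), such a bulk rename could be sketched as a one-liner, assuming GNU sed and the repository root as the working directory:

```shell
# Sketch: rewrite the Scala binary-version suffix in every pom.xml
# under the current directory. Assumes GNU sed (in-place -i) and
# GNU xargs (-r skips the sed invocation when no file matches).
grep -rl --include=pom.xml '_2\.12' . | xargs -r sed -i 's/_2\.12/_2.13/g'
```

This only touches hard-coded suffixes; dependencies already written as `spark-tags_${scala.binary.version}` pick up the new Scala version from the parent POM and need no edit.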