Integration-test tests jar for hive UDF tests #4890
Conversation
1. Add commit version info for the integration-test tests jar.
2. Upload Databricks shims of the integration-test tests jar to the internal maven repo.
3. Download the integration-test tests jar for the hive UDF integration tests.

Signed-off-by: Tim Liu <timl@nvidia.com>
```diff
@@ -38,3 +38,7 @@ mvn -B deploy:deploy-file $MVN_URM_MIRROR -Durl=$SERVER_URL -DrepositoryId=$SERVER_ID \
 DBINTTESTJARFPATH=./integration_tests/target/rapids-4-spark-integration-tests_$SCALA_VERSION-$SPARK_PLUGIN_JAR_VERSION-${DB_SHIM_NAME}.jar
 mvn -B deploy:deploy-file $MVN_URM_MIRROR -Durl=$SERVER_URL -DrepositoryId=$SERVER_ID \
   -Dfile=$DBINTTESTJARFPATH -DpomFile=integration_tests/pom.xml -Dclassifier=$DB_SHIM_NAME
+# Deploy integration-tests tests jar for hive UDF tests
+HIVEUDFTESTSJAR=./integration_tests/target/rapids-4-spark-integration-tests_$SCALA_VERSION-$SPARK_PLUGIN_JAR_VERSION-${DB_SHIM_NAME}'tests'.jar
+mvn -B deploy:deploy-file $MVN_URM_MIRROR -Durl=$SERVER_URL -DrepositoryId=$SERVER_ID \
```
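The deploy step above distinguishes the tests jar from the main jar by appending a `tests` suffix to the Databricks shim classifier. A minimal, hypothetical sketch of how that classifier string is assembled (the shim name value below is an assumption; in the real script `DB_SHIM_NAME` comes from the build environment):

```shell
# Hypothetical value; in the nightly job DB_SHIM_NAME is set by the build env.
DB_SHIM_NAME="spark301db"

# The tests-jar classifier is simply the shim name plus a 'tests' suffix,
# matching the HIVEUDFTESTSJAR naming in the diff above.
TESTS_CLASSIFIER="${DB_SHIM_NAME}tests"
echo "$TESTS_CLASSIFIER"

# The deploy command would then pass something like:
#   -Dclassifier=$TESTS_CLASSIFIER
```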
Deploy the Databricks shims of the integration-test tests jar (e.g. integration-test-spark301dbtests.jar) in the nightly build job. We will need this jar to run the hive UDF tests in the nightly IT job on Databricks.
```shell
echo "-------------------- rapids-4-spark-integration-tests BUILD INFO --------------------" >> "$tmp_info"
it_ver=$(getRevision $JARS_PATH/$RAPIDS_TEST_JAR rapids4spark-version-info.properties)
```
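The snippet above relies on a `getRevision` helper to read the commit revision from rapids4spark-version-info.properties packaged inside the jar. A hedged sketch of the extraction step, with the properties content inlined for illustration (the property values here are assumptions; the real helper would first pull the file out of the jar, e.g. with `unzip -p "$jar" rapids4spark-version-info.properties`):

```shell
# Illustrative only: inlined stand-in for the properties file that would be
# extracted from the jar. Values are hypothetical.
props='version=22.04.0-SNAPSHOT
revision=abc1234'

# Pull the value of the revision= key, as a getRevision-style helper might.
rev=$(printf '%s\n' "$props" | sed -n 's/^revision=//p')
echo "$rev"
```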
`JARS_PATH` is unset/empty in the nightly IT jobs, and the `CUDF_JAR`/`RAPIDS_PLUGIN_JAR` vars are already absolute paths, e.g. https://github.com/NVIDIA/spark-rapids/blob/branch-22.04/jenkins/spark-tests.sh#L43. Remove the unused var `JARS_PATH`.
LGTM
This looks OK, but it is highly confusing that we have both a main jar and a tests jar for the integration_tests artifact, especially when the code in the tests jar has nothing to do with testing the integration_tests main code. I'll file a followup to make it consistent.
Going to force merge this one to unblock nightly CI first
This reverts commit 1d0004c.
Fixes #4883
We moved the HiveUDF classes to the dir integration_tests/src/test (built into integration-tests-sparkxxxtests.jar) in #4619, so we need to add this jar to the jars_path of the integration tests to pass the hive UDF tests.

1. Add commit version info for the integration-test tests jar.
2. Upload Databricks shims of the integration-test tests jar to the internal maven repo.
3. Download the integration-test tests jar for the hive UDF integration tests.

Signed-off-by: Tim Liu <timl@nvidia.com>
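The download step (item 3 above) amounts to fetching the jar by its Maven coordinates, classifier included. A sketch of the coordinate shape, where the group id and version values are assumptions used only for illustration (in the real job they come from the build environment):

```shell
# Hypothetical values; the real ones are set by the nightly build env.
SCALA_VERSION="2.12"
SPARK_PLUGIN_JAR_VERSION="22.04.0"
DB_SHIM_NAME="spark301db"

# Maven coordinate: groupId:artifactId:version:packaging:classifier
ARTIFACT="com.nvidia:rapids-4-spark-integration-tests_${SCALA_VERSION}:${SPARK_PLUGIN_JAR_VERSION}:jar:${DB_SHIM_NAME}tests"
echo "$ARTIFACT"

# The download itself might then look like (not run here):
#   mvn -B dependency:get -Dartifact="$ARTIFACT" -Ddest=./target/deps/
```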