In Spark 3.1 and 3.2 nightly tests on 22.08, `test_cast_neg_to_decimal_err` failed, e.g.:
```
_________________________ test_cast_neg_to_decimal_err _________________________

    def test_cast_neg_to_decimal_err():
        # -12 cannot be represented as decimal(7,7)
        data_gen = _decimal_gen_7_7
        exception_content = "Decimal(compact,-120000000,20,0}) cannot be represented as Decimal(7, 7)"
        exception_str = "java.lang.ArithmeticException: " + exception_content if is_before_spark_330() \
            and not is_databricks104_or_later() else "org.apache.spark.SparkArithmeticException: " \
            + exception_content

>       assert_gpu_and_cpu_error(
            lambda spark : unary_op_df(spark, data_gen).selectExpr(
                'cast(-12 as {})'.format(to_cast_string(data_gen.data_type))).collect(),
            ansi_enabled_conf,
            exception_str)

../../src/main/python/arithmetic_ops_test.py:305:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../src/main/python/asserts.py:572: in assert_gpu_and_cpu_error
    assert_py4j_exception(lambda: with_cpu_session(df_fun, conf), error_message)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

func = <function assert_gpu_and_cpu_error.<locals>.<lambda> at 0x7f5b01d83d30>
error_message = 'java.lang.ArithmeticException: Decimal(compact,-120000000,20,0}) cannot be represented as Decimal(7, 7)'

    def assert_py4j_exception(func, error_message):
        """
        Assert that a specific Java exception is thrown
        :param func: a function to be verified
        :param error_message: a string such as the one produce by java.lang.Exception.toString
        :return: Assertion failure if no exception matching error_message has occurred.
        """
        with pytest.raises(Py4JJavaError) as py4jError:
            func()
>       assert error_message in str(py4jError.value.java_exception)
E       AssertionError

../../src/main/python/asserts.py:561: AssertionError
```
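To make the failure concrete: `assert_py4j_exception` does a plain substring check on the Java exception string (as shown in the traceback), so the old, space-free expected message no longer matches the message newer Spark versions produce. A minimal standalone illustration, using the two message strings from the traceback and comments above:

```python
# The expected string hard-coded in test_cast_neg_to_decimal_err (no spaces
# after the commas inside Decimal(...)):
expected = "Decimal(compact,-120000000,20,0}) cannot be represented as Decimal(7, 7)"

# The message actually produced by Spark 3.1.4+ (spaces after the commas):
actual = ("java.lang.ArithmeticException: "
          "Decimal(compact, -120000000, 20, 0}) cannot be represented as Decimal(7, 7)")

# assert_py4j_exception effectively runs this substring check, which now fails:
print(expected in actual)  # False
```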
Regarding `test_cast_neg_to_decimal_err`: in Spark 3.1.4, the error message has been updated to `Decimal(compact, -120000000, 20, 0}) cannot be represented as Decimal(7, 7)` (note the spaces after the commas, which the expected string in the test does not have).
Please link to the Spark issue or pull request that made the change.
https://issues.apache.org/jira/browse/SPARK-39060. Affected versions: 3.1.4, 3.2.2, 3.3.x, 3.4.
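Since the message format differs across the affected versions, one possible test-side fix is to build the expected message from the Spark version rather than hard-coding one form. A sketch only, not the actual patch; the version-gating helper that would select between the two forms (something like the existing `is_before_spark_330`) is assumed, not shown:

```python
def decimal_err_content(spaces_after_comma):
    """Build the Decimal overflow message for test_cast_neg_to_decimal_err.

    spaces_after_comma=True matches Spark 3.1.4+/3.2.2+ (SPARK-39060 format);
    False matches the older, space-free format.
    """
    sep = ", " if spaces_after_comma else ","
    return ("Decimal(compact" + sep + "-120000000" + sep + "20" + sep + "0})"
            " cannot be represented as Decimal(7, 7)")

# Old format, as the test currently expects:
old_msg = decimal_err_content(False)
# New format, as produced since SPARK-39060:
new_msg = decimal_err_content(True)
```

The test would then pick the right variant based on the runtime Spark version, the same way it already switches between `java.lang.ArithmeticException` and `org.apache.spark.SparkArithmeticException`.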