Update the error checking of test_cast_neg_to_decimal_err
#5690
Conversation
build
@@ -294,6 +294,7 @@ def test_mod_pmod_by_zero(data_gen, overflow_exp):
            ansi_enabled_conf,
            exception_str)

@pytest.mark.skipif(is_spark_314() or is_spark_322(), reason="Spark 3.1.4 and 3.2.2 will return null instead of throwing exception")
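The condition in a `skipif` marker like the one above is evaluated once, at collection time. A minimal self-contained sketch of how such a version gate behaves (`spark_version` and the `is_spark_*` helpers here are hypothetical stand-ins, not the plugin's real utilities):

```python
# Hypothetical stand-ins for the plugin's version helpers, for illustration only.
spark_version = "3.2.2"  # pretend this was queried from the running Spark session

def is_spark_314():
    return spark_version == "3.1.4"

def is_spark_322():
    return spark_version == "3.2.2"

# pytest.mark.skipif evaluates this expression at collection time; with the
# value above the test would be skipped with the stated reason.
should_skip = is_spark_314() or is_spark_322()
print(should_skip)
```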
What about Spark 3.3.x and 3.4.x? It would be nice to have better context for this condition.
Updated
Signed-off-by: remzi <13716567376yh@gmail.com>
build
What Spark change does this map back to? I would like to make sure this is intentional, as it seems odd to change it to not throw.
Update test_cast_neg_to_decimal_err for Spark 314 and 322
Signed-off-by: remzi <13716567376yh@gmail.com>
build
exception_str = "java.lang.ArithmeticException: " + exception_content if is_before_spark_330() \
    and not is_databricks104_or_later() else "org.apache.spark.SparkArithmeticException: " \
    + exception_content
exception_str = "java.lang.ArithmeticException: " if is_before_spark_330() \
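The lines being replaced select the expected exception string per Spark version. A self-contained sketch of that selection logic (the version parsing below is a simplification for illustration, not the plugin's `is_before_spark_330` / `is_databricks104_or_later` helpers):

```python
def expected_exception_str(spark_version, on_databricks_10_4, exception_content):
    """Sketch of the version-dependent expected-error prefix.

    Before Spark 3.3.0 (and off Databricks 10.4) the bare Java exception class
    appears in the message; later versions raise
    org.apache.spark.SparkArithmeticException instead.
    """
    # Simplified stand-in for is_before_spark_330().
    before_330 = tuple(int(p) for p in spark_version.split(".")[:3]) < (3, 3, 0)
    if before_330 and not on_databricks_10_4:
        return "java.lang.ArithmeticException: " + exception_content
    return "org.apache.spark.SparkArithmeticException: " + exception_content
```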
This doesn't really match what we have done for other tests; in the other tests we check the message but differentiate versions. We were trying to match the exceptions as closely as possible.
We also have #5196, which is to make the exceptions match.
So I would rather see us go in that direction.
Updated!
Signed-off-by: remzi <13716567376yh@gmail.com>
build
Signed-off-by: remzi <13716567376yh@gmail.com>
Closes #5683.
Spark has updated the error message of decimal overflow: https://issues.apache.org/jira/browse/SPARK-39060
We should also match the update.
Affected versions: 3.1.x (x >= 4), 3.2.x (x >= 2), 3.3.x (x >= 0), 3.4
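The affected-version list above can be written as a predicate, which may be clearer than enumerating versions in each test. This is a hypothetical helper for illustration, not code from the plugin:

```python
def has_spark_39060_message(version):
    """True if this Spark version carries the SPARK-39060 decimal-overflow
    message change: 3.1.x (x >= 4), 3.2.x (x >= 2), all 3.3.x, and 3.4+.

    Hypothetical helper, for illustration only.
    """
    parts = [int(p) for p in version.split(".")]
    major, minor = parts[0], parts[1]
    patch = parts[2] if len(parts) > 2 else 0
    if (major, minor) == (3, 1):
        return patch >= 4
    if (major, minor) == (3, 2):
        return patch >= 2
    return (major, minor) >= (3, 3)
```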