
Update the error checking of test_cast_neg_to_decimal_err #5690

Merged: 7 commits, Jun 8, 2022

Conversation

HaoYang670
Collaborator

@HaoYang670 HaoYang670 commented May 30, 2022

Signed-off-by: remzi 13716567376yh@gmail.com
Closes #5683.

Spark has updated the error message of decimal overflow: https://issues.apache.org/jira/browse/SPARK-39060
We should also match the update.
Affected versions: 3.1.x (x >= 4), 3.2.x (x >= 2), 3.3.x (x >= 0), 3.4
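The affected-version list above can be expressed as a small predicate. This is a sketch under assumptions: `has_updated_overflow_message` and its version parsing are illustrative, not helpers from the plugin or Spark.

```python
# Hypothetical helper: True when the given Spark release carries the
# updated decimal-overflow error message from SPARK-39060.
# Affected: 3.1.x (x >= 4), 3.2.x (x >= 2), 3.3.x (all), 3.4+.
def has_updated_overflow_message(version: str) -> bool:
    parts = tuple(int(p) for p in version.split("."))
    parts = parts + (0,) * (3 - len(parts))  # pad "3.4" -> (3, 4, 0)
    major, minor, patch = parts[:3]
    if (major, minor) == (3, 1):
        return patch >= 4
    if (major, minor) == (3, 2):
        return patch >= 2
    return (major, minor) >= (3, 3)
```

A gate like this is one way a test suite can decide which error text to expect on a given Spark release.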

Signed-off-by: remzi <13716567376yh@gmail.com>
@HaoYang670
Collaborator Author

build

@@ -294,6 +294,7 @@ def test_mod_pmod_by_zero(data_gen, overflow_exp):
ansi_enabled_conf,
exception_str)

@pytest.mark.skipif(is_spark_314() or is_spark_322(), reason="Spark 3.1.4 and 3.2.2 will return null instead of throwing exception")
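The skip condition above boils down to matching two exact patch releases. A stdlib-only sketch of that gate, with `is_spark_314()`/`is_spark_322()` collapsed into one illustrative predicate (the name and parsing below are assumptions, not the plugin's real helpers):

```python
# Illustrative predicate behind the skipif marker: on Spark 3.1.4 and
# 3.2.2 this cast returns null instead of throwing, so the test is
# skipped on exactly those two releases.
def should_skip_cast_neg_to_decimal(spark_version: str) -> bool:
    return tuple(int(p) for p in spark_version.split(".")) in {(3, 1, 4), (3, 2, 2)}
```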
Collaborator

What about Spark 3.3.x and 3.4.x? It would be nice to have better context for this condition.

Collaborator Author

Updated

HaoYang670 and others added 2 commits May 31, 2022 16:59
@HaoYang670
Collaborator Author

build

@tgravescs
Collaborator

What Spark change does this map back to? I would like to make sure this is intentional, as it seems odd to change the test to not expect a throw.

@sameerz sameerz added the bug Something isn't working label May 31, 2022
@sameerz sameerz added this to the May 23 - Jun 3 milestone May 31, 2022
@HaoYang670 HaoYang670 changed the title Skip test_cast_neg_to_decimal_err for Spark 314 and 322 Update the error message of test_cast_neg_to_decimal_err Jun 5, 2022
Signed-off-by: remzi <13716567376yh@gmail.com>
Signed-off-by: remzi <13716567376yh@gmail.com>
@HaoYang670
Collaborator Author

build

@HaoYang670 HaoYang670 changed the title Update the error message of test_cast_neg_to_decimal_err Update the error checking of test_cast_neg_to_decimal_err Jun 5, 2022
Before:
exception_str = "java.lang.ArithmeticException: " + exception_content if is_before_spark_330() \
    and not is_databricks104_or_later() else "org.apache.spark.SparkArithmeticException: " \
    + exception_content

After (this revision):
exception_str = "java.lang.ArithmeticException: " if is_before_spark_330() \
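The version gate in the snippet above can be sketched in isolation. Here `is_before_spark_330()` and the Databricks 10.4 check are stubbed as plain parameters, so the names and signature below are assumptions rather than the plugin's real helpers:

```python
# Sketch of choosing the expected exception class by Spark version:
# Spark 3.3.0+ (and Databricks 10.4+) raise SparkArithmeticException for
# decimal overflow, while earlier versions raise ArithmeticException.
def expected_exception_prefix(spark_version: str,
                              databricks104_or_later: bool = False) -> str:
    before_330 = tuple(int(p) for p in spark_version.split(".")) < (3, 3, 0)
    if before_330 and not databricks104_or_later:
        return "java.lang.ArithmeticException: "
    return "org.apache.spark.SparkArithmeticException: "
```

The expected message content would then be appended to this prefix before being matched against the raised error, which is the direction the reviewer asks for below: check the full message, but differentiate by version.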
Collaborator

This doesn't really match what we have done for other tests: elsewhere we still check the message but differentiate by version, trying to match the exceptions as closely as possible.

We also have #5196, which is to make exceptions match.

So I would rather see us go in that direction.

Collaborator Author

Updated!

@sameerz sameerz modified the milestones: May 23 - Jun 3, Jun 6 - Jun 17 Jun 6, 2022
@HaoYang670
Collaborator Author

build

@tgravescs tgravescs merged commit f546edd into NVIDIA:branch-22.08 Jun 8, 2022
@HaoYang670 HaoYang670 deleted the spark_314_322_error branch June 13, 2022 05:47
Successfully merging this pull request may close these issues.

[BUG] test_cast_neg_to_decimal_err failed in recent 22.08 tests
4 participants