
[BUG] Integration tests run with default configs that don't match RapidsConf defaults #11134

Open
jlowe opened this issue Jul 3, 2024 · 5 comments · Fixed by #11283
Labels: bug (Something isn't working), test (Only impacts tests)

Comments

@jlowe (Member) commented Jul 3, 2024

@pmattione-nvidia discovered that integration_tests/src/main/python/spark_session.py has default configurations that set spark.rapids.sql.castFloatToDecimal.enabled to false even though the config defaults to true in RapidsConf. That means that for most tests involving operations that would require casting a float to a decimal, we're not really testing what we're shipping.

Similarly, there are many other cast-related configs being set up as integration test defaults that don't match the defaults in RapidsConf. If we need to disable these operations because too many tests fail when they're enabled, that implies we should reconsider whether these configs should be enabled by default.
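For illustration, this is roughly what the drift looks like. This is a paraphrased sketch, not a verbatim excerpt of spark_session.py; only castFloatToDecimal.enabled is quoted from the issue, the rest is placeholder:

```python
# Sketch in the style of integration_tests/src/main/python/spark_session.py.
# The plugin (RapidsConf) defaults spark.rapids.sql.castFloatToDecimal.enabled to true,
# but the test harness pins it to false, so most tests never exercise the shipped default.
_default_conf = {
    'spark.rapids.sql.castFloatToDecimal.enabled': 'false',  # RapidsConf default: true
    # ... other cast-related configs pinned to values that differ from RapidsConf ...
}
```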

@jlowe added the bug, ? - Needs Triage, and test labels on Jul 3, 2024
@mattahrens (Collaborator) commented

Related code in spark_session.py: https://github.com/NVIDIA/spark-rapids/blob/branch-24.08/integration_tests/src/main/python/spark_session.py#L44-L58. Scope is to update the conf settings in spark_session.py to match the RapidsConf defaults in the plugin, then run the integration tests and investigate any failures (filing issues as needed).

@mattahrens removed the ? - Needs Triage label on Jul 9, 2024
@jihoonson (Collaborator) commented

So, enforcing the default configs for integration tests was added in #4917 to address #4893. I see how irritating it would be if some cluster setting snuck into the integration tests.

However, the problem I see with the current approach is that we keep default configs in two places, one in Scala and another in Python. As long as they are separate, this issue of inconsistent defaults will keep coming back whenever we change the default of some config in one place but not the other. I'd like to fix this issue once and for all. I have two approaches in mind; both would use the Scala code as the source of truth.

  • We could erase all existing spark-rapids configs when the Spark session is initialized.
  • We could expose the default values of all spark-rapids configs, for example by adding something like RapidsConf.getAllDefaults(), so that they override any existing configs when the Spark session is initialized.

Both look feasible and not especially hard to me, though I wonder if I'm missing something that makes them impossible or harder than they appear. A rough sketch of the first approach is below.
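This is only an illustrative sketch, not actual spark_session.py code, and the function name is made up:

```python
from pyspark.sql import SparkSession

def _reset_rapids_confs(spark: SparkSession):
    """Unset every spark.rapids.* runtime conf so the session falls back to the
    defaults baked into the plugin's RapidsConf (illustrative sketch only)."""
    # getConf().getAll() returns the (key, value) pairs the session was started with
    for key, _ in spark.sparkContext.getConf().getAll():
        if key.startswith('spark.rapids.'):
            try:
                spark.conf.unset(key)
            except Exception:
                # Some entries cannot be changed at runtime (e.g. startup-only confs);
                # those would have to be handled when the session is launched instead.
                pass
```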

@gerashegalov (Collaborator) commented

+1 for a single source of truth, with RapidsConf generating the default conf dictionary for the pytests. It could be a generated file, just like we do for the docs, that gets loaded to initialize _default_conf.
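For example, something along these lines; the file name and format here are purely hypothetical:

```python
import os

# Hypothetical: a build step dumps the RapidsConf defaults (much like the generated
# config documentation) into a key=value file shipped alongside the tests.
_GENERATED_DEFAULTS = os.path.join(os.path.dirname(__file__), 'rapids_conf_defaults.properties')

def _load_default_conf(path=_GENERATED_DEFAULTS):
    """Build _default_conf from the generated file so RapidsConf stays the
    single source of truth (illustrative sketch)."""
    conf = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith('#'):
                continue
            key, _, value = line.partition('=')
            conf[key.strip()] = value.strip()
    return conf

_default_conf = _load_default_conf()
```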

@jihoonson (Collaborator) commented

That would work too. Alright, I will try a couple of things and see what's best.

@jihoonson (Collaborator) commented

Reopening as we have reverted the fix in #11347.

@jihoonson reopened this on Aug 19, 2024