
fix: need default value for getSizeAsMb(EXECUTOR_MEMORY.key) #1046

Merged 1 commit into apache:main on Oct 31, 2024

Conversation

@neyama (Contributor) commented on Oct 31, 2024

Which issue does this PR close?

Closes #1045.

Rationale for this change

Provide a default value of 1g to getSizeAsMb(EXECUTOR_MEMORY.key); 1g is the default value of spark.executor.memory in Apache Spark 3.5.3.

What changes are included in this PR?

  • Define private val EXECUTOR_MEMORY_DEFAULT in the class
  • Pass EXECUTOR_MEMORY_DEFAULT as the default value to getSizeAsMb(EXECUTOR_MEMORY.key) (see the sketch below)
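
A minimal sketch of the change, assuming the surrounding object and method names (the PR's actual file and enclosing class are not shown in this conversation; the literal key "spark.executor.memory" stands in for EXECUTOR_MEMORY.key only to keep the sketch self-contained):

```scala
import org.apache.spark.SparkConf

object ExecutorMemoryDefaultSketch {
  // "1g" mirrors the default of spark.executor.memory in Apache Spark 3.5.3.
  private val EXECUTOR_MEMORY_DEFAULT = "1g"

  def executorMemoryMb(conf: SparkConf): Long =
    // The PR references the key via EXECUTOR_MEMORY.key; the string literal is
    // used here only so the sketch compiles standalone. Without a default,
    // getSizeAsMb throws when spark.executor.memory is not set explicitly;
    // the fallback makes the call safe in that case.
    conf.getSizeAsMb("spark.executor.memory", EXECUTOR_MEMORY_DEFAULT)
}
```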

How are these changes tested?

Tested with the datafusion-benchmarks suite as described in issue #1045.

@andygrove (Member) left a comment


Thanks @neyama

@viirya (Member) commented on Oct 31, 2024

Thanks @neyama @andygrove

@viirya merged commit b38c361 into apache:main on Oct 31, 2024
75 checks passed
Development

Successfully merging this pull request may close these issues.

Default value is required for sc.getConf.getSizeAsMb(EXECUTOR_MEMORY.key)
3 participants