
Support DataFrame.spark.explain(extended: str) case. #1563

Merged · 2 commits into databricks:master · Jun 4, 2020

Conversation

@ueshin (Collaborator) commented Jun 3, 2020

Since Spark 3.0 will support the DataFrame.explain(extended: str) case (apache/spark#28711), we can follow it in Koalas.

>>> df.spark.explain("extended")  # doctest: +ELLIPSIS
== Parsed Logical Plan ==
...
== Analyzed Logical Plan ==
...
== Optimized Logical Plan ==
...
== Physical Plan ==
...
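The argument handling this PR adds can be sketched as follows. This is a hypothetical helper, not the actual code in databricks/koalas/spark/accessors.py; the function name resolve_explain_mode is illustrative. It shows how an explain() accessor can accept either the legacy bool flag or, per the Spark 3.0 change, a mode string passed through the extended parameter, and normalize both into a single mode:

```python
from typing import Optional, Union


def resolve_explain_mode(
    extended: Optional[Union[bool, str]] = None, mode: Optional[str] = None
) -> str:
    """Normalize explain() arguments to a single plan-output mode.

    Illustrative sketch only: mirrors the Spark 3.0 convenience where
    `extended` may also be a mode string such as "extended" or "formatted".
    """
    if extended is not None and mode is not None:
        # Matches Spark's behavior of rejecting both arguments at once.
        raise ValueError("extended and mode should not be set together.")
    if isinstance(extended, str):
        # The df.spark.explain("extended") case: the string is the mode.
        return extended
    if extended:
        # The legacy df.spark.explain(True) case.
        return "extended"
    return mode if mode is not None else "simple"
```

With this normalization, explain("extended"), explain(True), and explain(mode="extended") all resolve to the same plan output, which is the symmetry the doctest above exercises.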

@ueshin ueshin requested a review from HyukjinKwon June 3, 2020 19:38
@codecov-commenter commented Jun 3, 2020

Codecov Report

Merging #1563 into master will decrease coverage by 0.02%.
The diff coverage is 60.00%.


@@            Coverage Diff             @@
##           master    #1563      +/-   ##
==========================================
- Coverage   94.51%   94.50%   -0.02%     
==========================================
  Files          38       38              
  Lines        8740     8742       +2     
==========================================
+ Hits         8261     8262       +1     
- Misses        479      480       +1     
Impacted Files Coverage Δ
databricks/koalas/spark/accessors.py 91.93% <60.00%> (-0.69%) ⬇️

Last update ae57c2a...3e14c0a.

@@ -729,6 +729,16 @@ def explain(self, extended: Optional[bool] = None, mode: Optional[str] = None):
== Physical Plan ==
...

>>> df.spark.explain("extended") # doctest: +SKIP
@ueshin (Collaborator, Author) commented Jun 3, 2020
This is not supported in Spark 3.0.0-rc2 yet. I'd skip this for now.

@HyukjinKwon HyukjinKwon merged commit 85e08da into databricks:master Jun 4, 2020
@ueshin ueshin deleted the explain branch June 4, 2020 01:47
3 participants