feat: Use CometPlugin as main entrypoint #853
Conversation
Codecov Report

Attention: Patch coverage is

Additional details and impacted files

```
@@             Coverage Diff              @@
##               main     #853      +/-   ##
============================================
+ Coverage     34.05%   34.25%    +0.20%
- Complexity      879      888        +9
============================================
  Files           112      112
  Lines         42959    43016       +57
  Branches       9488     9492        +4
============================================
+ Hits          14629    14736      +107
+ Misses        25330    25301       -29
+ Partials       3000     2979       -21
============================================
```

View full report in Codecov by Sentry.
```scala
  }
} else {
  sc.conf.set(extensionKey, extensionClass)
}
```
It would probably be good to log the overridden value for `spark.sql.extensions`?
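A minimal sketch of what that logging could look like (not the PR's actual code; the key and class names below are hypothetical placeholders, and `println` stands in for a real logger):

```scala
object LogOverrideSketch {
  // Build the message that would be logged before overriding the conf key,
  // so the previous value is not silently lost.
  def overrideMessage(key: String, previous: String, next: String): String =
    s"Overriding $key: '$previous' -> '$next'"

  def main(args: Array[String]): Unit = {
    // Hypothetical stand-ins for the values read from sc.conf.
    val extensionKey = "spark.sql.extensions"
    val previous = "org.example.ExistingExtensions"
    val extensionClass = "org.example.CometExtensions"
    println(overrideMessage(extensionKey, previous, extensionClass))
  }
}
```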
Great idea to have CometPlugin as the main entry point.
```scala
if (sc.conf.contains(extensionKey)) {
  val extensions = sc.conf.get(extensionKey)
  if (!extensions.split(",").map(_.trim).contains(extensionClass)) {
    sc.conf.set(extensionKey, s"$extensions,$extensionClass")
  } else {
    sc.conf.set(extensionKey, extensionClass)
  }
} else {
  sc.conf.set(extensionKey, extensionClass)
}
```
I don't have an IDE open right now, but I think this code can be simplified using this overloaded `get` method: https://spark.apache.org/docs/3.2.1/api/java/org/apache/spark/SparkConf.html#get-java.lang.String-java.lang.String-
Suggested change:

```diff
-if (sc.conf.contains(extensionKey)) {
-  val extensions = sc.conf.get(extensionKey)
-  if (!extensions.split(",").map(_.trim).contains(extensionClass)) {
-    sc.conf.set(extensionKey, s"$extensions,$extensionClass")
-  } else {
-    sc.conf.set(extensionKey, extensionClass)
-  }
-} else {
-  sc.conf.set(extensionKey, extensionClass)
-}
+val extensions = sc.conf.get(extensionKey, "")
+if (!extensions.split(",").map(_.trim).contains(extensionClass)) {
+  sc.conf.set(extensionKey, s"$extensions,$extensionClass")
+} else {
+  sc.conf.set(extensionKey, extensionClass)
+}
```
The else clause is also suspicious: if there are multiple extensions under the key, are we sure we want to override everything? I suspect the code doesn't actually need an else branch.
Thanks for the review @edmondop. You are right that the original code was incorrect. I have added a unit test and fixed the issue; the fix is close to your suggestion.
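For readers following along, the fixed behavior described above (append the extension class only when it is not already present, and never clobber other registered extensions) can be sketched as a small pure function. This is a hedged sketch, not the PR's actual code; `withExtension` is a hypothetical helper whose result would be passed to `sc.conf.set`:

```scala
object ExtensionConfSketch {
  // Merge extensionClass into an existing comma-separated extensions list.
  // Returns the list unchanged if the class is already registered, and
  // starts a new list when no value is set yet.
  def withExtension(existing: Option[String], extensionClass: String): String =
    existing match {
      case Some(current) if current.trim.nonEmpty =>
        val entries = current.split(",").map(_.trim)
        if (entries.contains(extensionClass)) current
        else s"$current,$extensionClass"
      case _ => extensionClass
    }
}
```

Note that, unlike the earlier snippets, there is no destructive else branch: an already-registered class leaves the conf value untouched.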
LGTM, thanks @andygrove.
The formatting needs to be fixed.
Which issue does this PR close?
Closes #.
Rationale for this change
What changes are included in this PR?
How are these changes tested?