Support array_repeat #5293
Conversation
sql-plugin/src/main/scala/org/apache/spark/sql/rapids/collectionOperations.scala
  }
}
// Step 3. generate list offsets from refined counts
val offsets = withResource(refinedCount) { cnt =>
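Step 3 here is essentially an exclusive prefix sum over the per-row repeat counts. A minimal standalone sketch of that idea (offsetsFromCounts is a hypothetical helper for illustration, not the plugin's API):

```scala
object OffsetsSketch {
  // List offsets for a list column: offsets(i) is where row i starts and
  // offsets(i + 1) is where it ends, so the result has counts.length + 1 entries.
  def offsetsFromCounts(counts: Array[Int]): Array[Int] =
    counts.scanLeft(0)(_ + _)

  def main(args: Array[String]): Unit = {
    // Rows repeated 2, 0, and 3 times produce offsets [0, 2, 2, 5].
    println(offsetsFromCounts(Array(2, 0, 3)).mkString(","))
  }
}
```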
Speaking of "what if something throws?", repeated is unprotected here -- if computing the offsets causes an exception we don't close it here.
Yes, I added the missing closeOnEx wrapper for repeated.
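For context, the close-on-exception pattern being discussed can be sketched as follows. This is a simplified stand-in for the spark-rapids Arm helpers, not the plugin's actual implementation: the resource is closed only when the body throws, so on success the caller still owns it (unlike withResource, which always closes).

```scala
object CloseOnExceptSketch {
  // Close the resource only if the body throws, then rethrow the original
  // exception with any close() failure attached as a suppressed exception.
  def closeOnExcept[R <: AutoCloseable, T](resource: R)(body: R => T): T = {
    try {
      body(resource)
    } catch {
      case t: Throwable =>
        try resource.close() catch { case s: Throwable => t.addSuppressed(s) }
        throw t
    }
  }

  def main(args: Array[String]): Unit = {
    var closed = false
    val res = new AutoCloseable { def close(): Unit = closed = true }
    try {
      // Simulates the offsets computation throwing while repeated is live.
      closeOnExcept(res) { _ => throw new RuntimeException("offsets failed") }
    } catch { case _: RuntimeException => }
    println(closed) // prints true: the resource was closed because the body threw
  }
}
```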
Signed-off-by: sperlingxx <lovedreamf@gmail.com>
Closes #5226
This PR is to support array_repeat, which relies on an unmerged cuDF PR (rapidsai/cudf#10683).
The primary issue with array_repeat is working around null and negative count values, which Spark supports but which are invalid for cuDF's repeat.
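The count workaround described above can be sketched in plain Scala. Here refine is a hypothetical helper illustrating the semantics only (Spark's array_repeat returns null for a null count and an empty array for a negative count, while a cuDF-style repeat accepts only non-null, non-negative counts); it is not the plugin's GPU code:

```scala
object RefineCountsSketch {
  // Map null and negative counts to 0 so a cuDF-style repeat can accept them,
  // and keep a validity mask so null result rows can be restored afterwards.
  def refine(counts: Seq[Option[Int]]): (Seq[Int], Seq[Boolean]) = {
    val refined = counts.map {
      case Some(c) if c > 0 => c
      case _                => 0 // null or negative count repeats zero times
    }
    (refined, counts.map(_.isDefined))
  }

  def main(args: Array[String]): Unit = {
    val (refined, validity) = refine(Seq(Some(2), None, Some(-1)))
    println(refined.mkString(","))  // prints 2,0,0
    println(validity.mkString(",")) // prints true,false,false
  }
}
```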