Allow for more elemwise torch functions using broadcast_tensor and vmap #1032
base: main
Conversation
I need to add a test, but I want to get feedback on #1031 before continuing.
I'll fix the tests.
Codecov Report
Attention: Patch coverage is
Additional details and impacted files

@@            Coverage Diff             @@
##             main    #1032      +/-   ##
==========================================
- Coverage   81.90%   81.89%   -0.01%
==========================================
  Files         182      182
  Lines       47879    47887       +8
  Branches     8620     8619       -1
==========================================
+ Hits        39214    39216       +2
- Misses       6492     6498       +6
  Partials     2173     2173
Only some nits left, PR looks great!
def elemwise_fn(*inputs):
    Elemwise._check_runtime_broadcast(node, inputs)
    shaped_inputs = torch.broadcast_tensors(*inputs)
Nit: more precise name:
Suggested change:
-    shaped_inputs = torch.broadcast_tensors(*inputs)
+    broadcasted_inputs = torch.broadcast_tensors(*inputs)
Also needs to be changed below
# @todo: This will fail for anything that calls
# `.item()`
Remove the todo; it's not something that should be addressed in this implementation but rather for specific Ops, so if anything it should exist as a GitHub issue?
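For context (not part of the PR code), here is a minimal, self-contained sketch of why `.item()` is problematic inside a vmapped callable; `scalar_like` is a hypothetical example function, and the exact error message depends on the torch version:

import torch

def scalar_like(x):
    # Calling .item() inside a vmapped function is expected to fail:
    # vmap passes batched tensors that have no single scalar value.
    return torch.tensor(x.item() + 1.0)

try:
    torch.vmap(scalar_like)(torch.arange(4.0))
except RuntimeError as exc:
    print(f"vmap rejected .item(): {exc}")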
Description
In the event the operator that Elemwise is broadcasting over doesn't have a direct torch function, we can leverage vmap and broadcast_tensors to replicate the ufunc machinery (a standalone sketch of the idea is appended at the end of this page).
Related Issue
Checklist
Type of change
📚 Documentation preview 📚: https://pytensor--1032.org.readthedocs.build/en/1032/
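Appendix (not part of the PR diff): a minimal sketch of the broadcast_tensors + vmap approach described above, assuming a generic scalar callable; the names torch_elemwise and scalar_fn are hypothetical illustrations, not the PR's actual API.

import torch

def torch_elemwise(scalar_fn):
    # Hypothetical helper: apply a scalar callable elementwise with
    # numpy-ufunc-style broadcasting, for ops without a direct torch
    # equivalent.
    def elemwise_fn(*inputs):
        broadcasted_inputs = torch.broadcast_tensors(*inputs)
        out_shape = broadcasted_inputs[0].shape
        # Flatten so vmap can map the scalar callable over one batch dim.
        flat_inputs = [t.reshape(-1) for t in broadcasted_inputs]
        out = torch.vmap(scalar_fn)(*flat_inputs)
        return out.reshape(out_shape)
    return elemwise_fn

# Usage: broadcasting a (3,) tensor against a (2, 3) tensor.
fn = torch_elemwise(lambda a, b: torch.maximum(a, b) * 2)
print(fn(torch.arange(3.0), torch.ones(2, 3)))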