Update virtual_node.py #7751
Conversation
Instead of creating new tensors in each iteration, can't we modify the existing tensors in-place? Also, instead of creating new tensors by concatenating existing ones, wouldn't it be better to use tensor views to represent the augmented tensors? It seems that after `old_data = copy.copy(data)`, the subsequent operations only modify the `data` object and don't change the underlying tensor data. Are these valid points?
for more information, see https://pre-commit.ci
Codecov Report
@@ Coverage Diff @@
## master #7751 +/- ##
==========================================
- Coverage 91.91% 91.56% -0.36%
==========================================
Files 452 452
Lines 25550 25543 -7
==========================================
- Hits 23485 23389 -96
- Misses 2065 2154 +89
... and 19 files with indirect coverage changes
I guess the idea behind making a copy is that we want to avoid silently contaminating the original data. I believe that's the motivation for #7429.
However, for your change in this PR specifically, the data is already `copy.copy()`'ed in `BaseTransform.__call__`, so it shouldn't change any behaviour.
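The distinction being discussed can be sketched with plain Python. The snippet below uses a `SimpleNamespace` as a hypothetical stand-in for a PyG `Data` object (the real class lives in `torch_geometric.data`, but the shallow-copy semantics of `copy.copy` are ordinary Python behaviour): rebinding an attribute on the copy leaves the original untouched, whereas an in-place mutation is visible through both references.

```python
import copy
from types import SimpleNamespace

# Hypothetical stand-in for a PyG Data object; plain lists take the
# place of tensors, since the copy semantics are the same either way.
data = SimpleNamespace(x=[1, 2, 3])
old_data = copy.copy(data)  # shallow copy: attributes are shared

# Rebinding an attribute on the copy does NOT touch the original:
# a new list is created, and data.x still points at the old one.
old_data.x = old_data.x + [4]
print(data.x)   # [1, 2, 3]

# In-place mutation, by contrast, is visible through both references,
# because the shallow copy shares the same underlying object.
data2 = SimpleNamespace(x=[1, 2, 3])
old2 = copy.copy(data2)
old2.x.append(4)
print(data2.x)  # [1, 2, 3, 4]
```

This is why transforms that build new tensors (e.g. via concatenation) and rebind them cannot silently contaminate the caller's data, while true in-place tensor ops could.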
Thank you. If that's the case, then yes, maybe having
We need the separation here between
Feel free to re-open if you have strong concerns.