You can check the git blame to find the PRs which added certain lines; those PRs should normally give you the reasoning behind the code logic.
In this case, the line was added in #28538. According to the PR, it prepares for an upcoming torch release in which the `use_reentrant` kwarg must be passed explicitly.
The value is set to `True` to match the current default PyTorch behaviour; cf. this comment.
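
For illustration, here is a minimal sketch of the default-filling pattern the PR introduces. The function name mirrors transformers, but the body is simplified and is not the actual source:

```python
import functools
import torch

def gradient_checkpointing_enable(gradient_checkpointing_kwargs=None):
    # If the caller provides no kwargs, spell out the current PyTorch
    # default explicitly, since an upcoming torch release requires
    # use_reentrant to be passed rather than inferred.
    if gradient_checkpointing_kwargs is None:
        gradient_checkpointing_kwargs = {"use_reentrant": True}
    # Bind the kwargs into the checkpoint function used by the model's layers.
    return functools.partial(
        torch.utils.checkpoint.checkpoint, **gradient_checkpointing_kwargs
    )
```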
Ah, I'm sorry: I only searched the issues and found several discussions about this, but I didn't search through the PRs.
Thank you very much for the link, as well as the hint about the PRs (I was not aware of that) 👍
In transformers/src/transformers/modeling_utils.py (lines 2110 to 2111 at 350c5d1), `use_reentrant=True` is set when no kwargs are provided. From the PyTorch docs this seems to be the legacy variant. Does this have any performance or other advantages that I am not aware of?
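
For context, a minimal example of the two variants, assuming a torch version recent enough that `torch.utils.checkpoint.checkpoint` accepts the `use_reentrant` kwarg:

```python
import torch
from torch.utils.checkpoint import checkpoint

layer = torch.nn.Linear(16, 16)
x = torch.randn(4, 16, requires_grad=True)

# Legacy (reentrant) variant: re-runs the forward pass during backward and
# has known restrictions per the PyTorch docs (e.g. it does not support
# torch.autograd.grad, and at least one input must require grad).
out = checkpoint(layer, x, use_reentrant=True)
out.sum().backward()

# Non-reentrant variant, which the PyTorch docs recommend going forward:
# implemented with saved-tensor hooks, lifting several of those restrictions.
out = checkpoint(layer, x, use_reentrant=False)
out.sum().backward()
```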