
set weights_only=False in torch.load #2683

Merged 1 commit on Feb 4, 2025

Conversation

wasserth
Contributor

Fixes this critical issue: #2681

@FabianIsensee FabianIsensee self-assigned this Jan 30, 2025
@snorthman

snorthman commented Feb 3, 2025

Rather than the suggested fix here,

checkpoint = torch.load(join(model_training_output_dir, f'fold_{f}', checkpoint_name),
                        map_location=torch.device('cpu'), weights_only=False)

may I instead suggest

import _codecs
import numpy as np

torch.serialization.add_safe_globals([np.core.multiarray.scalar, np.dtype,
                                      np.dtypes.Float32DType, np.dtypes.Float64DType,
                                      _codecs.encode])
checkpoint = torch.load(join(model_training_output_dir, f'fold_{f}', checkpoint_name),
                        map_location=torch.device('cpu'))

@FabianIsensee
Member

Hey @snorthman can you please elaborate on why your suggestion is to be preferred? I would like to understand the differences

@FabianIsensee FabianIsensee merged commit dadc00d into MIC-DKFZ:master Feb 4, 2025
1 check failed
@FabianIsensee
Member

I am merging @wasserth's PR for now since it seems to be the most straightforward fix (and it's urgent), but I am open to your variant as well.

@snorthman

My fix respects the warning raised by torch:
"Re-running torch.load with weights_only set to False will likely succeed, but it can result in arbitrary code execution. Do it only if you got the file from a trusted source."
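For context, the risk torch warns about can be sketched with the stdlib `pickle` module alone (no torch needed): a pickle stream may name any importable callable via `__reduce__`, so loading an untrusted checkpoint with `weights_only=False` can execute attacker-chosen code. The `Payload` class below is purely illustrative.

```python
import pickle

class Payload:
    def __reduce__(self):
        # On unpickling, pickle looks up builtins.eval and calls it
        # with this argument -- an attacker could just as easily name
        # os.system or any other importable callable.
        return (eval, ("21 * 2",))

blob = pickle.dumps(Payload())
obj = pickle.loads(blob)  # the eval call runs during load
print(obj)
```

This is why `weights_only=True` refuses to resolve arbitrary globals in the first place.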

When torch loads a file, it only loads types on the "safe globals" list, which initially does not include some types used in nnUNet, most of them numpy types. The end result is of course the same whether we use weights_only=False or my solution, but my solution respects the warning raised by torch.

I should note that the list of globals I put forward is what worked for me; with different settings, more types may be needed for this to be a catch-all solution.
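The allowlist mechanism being discussed can be sketched with stdlib `pickle` alone: torch's `weights_only=True` mode behaves like a restricted unpickler that only resolves globals on an allowlist, and `add_safe_globals()` extends that list. The names `ALLOWED`, `RestrictedUnpickler`, and `restricted_loads` below are hypothetical illustrations of the idea, not torch's actual implementation.

```python
import io
import pickle
from collections import OrderedDict

# Our "safe globals" allowlist: (module, name) pairs that may be resolved.
ALLOWED = {("collections", "OrderedDict")}

class RestrictedUnpickler(pickle.Unpickler):
    def find_class(self, module, name):
        # Only resolve globals that are explicitly allowlisted.
        if (module, name) in ALLOWED:
            return super().find_class(module, name)
        raise pickle.UnpicklingError(
            f"global '{module}.{name}' is not on the allowlist")

def restricted_loads(data: bytes):
    return RestrictedUnpickler(io.BytesIO(data)).load()

# An allowlisted type loads normally.
state = restricted_loads(pickle.dumps(OrderedDict(lr=0.01)))
print(state)

# Anything off the list is rejected instead of being resolved.
try:
    restricted_loads(pickle.dumps(pickle.Unpickler))
except pickle.UnpicklingError as exc:
    print("blocked:", exc)
```

Extending `ALLOWED` with the numpy types from the comment above is the stdlib analogue of calling `torch.serialization.add_safe_globals`.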

@jamesobutler

@snorthman Have you issued a PR to this repo with your proposal?
