Hi!

First, I wanted to thank you for the project!

I have 16 GB of VRAM available, but while running Image2Video the VRAM usage never exceeds ~8 GB.

Settings I used:

nvidia-smi VRAM usage:

Code extract:

The low-VRAM calculation results in 15.99, which is of course smaller than 16. Is this expected behaviour? Can this be altered, or would it overflow the VRAM and cause a crash?
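For context, a 15.99 figure like this typically comes from converting the card's total memory from bytes to GiB: a "16 GB" GPU reports slightly less than 16 × 1024³ bytes, so the comparison against a threshold of 16 fails. A minimal sketch (the function name and byte count are illustrative, not taken from the actual script):

```python
# Illustrative sketch of the kind of low-VRAM check discussed above.
# The byte count below is hypothetical; a real script would query the
# driver (e.g. via torch.cuda.get_device_properties(0).total_memory).

def total_vram_gb(total_bytes: int) -> float:
    """Convert a raw byte count to GiB."""
    return total_bytes / (1024 ** 3)

# A "16 GB" card usually reports a little under 16 GiB usable,
# e.g. 8 MiB less than 16 GiB exactly:
vram = total_vram_gb(16 * 1024**3 - 8 * 1024**2)
low_vram = vram < 16  # 15.99... < 16, so the low-VRAM path is taken
```

This is why the check is "expected behaviour" on a 16 GB card: the reported total is rounded down by the driver's own reservations, so the value lands just under the threshold.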
You could lower the limit value on that line (save the script, then F3 > Reload Scripts) and see how long it takes. (If you're not familiar with development in Blender, here are some hints for quickly tweaking a script: #105 (comment))

The challenging part is staying below the VRAM limit without overshooting it, because offloading into static memory slows things down. As far as I remember, the "hi" VRAM code path demands around 18 GB of VRAM, and going directly to CUDA takes around 35-36 GB. For me, with 24 GB of VRAM, it's faster to render while staying below 18 GB than at 35 GB.
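The tiers described above could be sketched like this (the function and mode names are assumptions for illustration; the actual script's thresholds and logic may differ):

```python
# Hypothetical sketch of choosing a memory mode from the thresholds
# mentioned above: low-VRAM offloading below ~18 GiB, the "hi" VRAM
# path up to ~35 GiB, and everything directly on CUDA beyond that.

def pick_vram_mode(vram_gb: float) -> str:
    if vram_gb < 18:
        return "lowvram"   # offload to system RAM; slower, but fits
    elif vram_gb < 35:
        return "highvram"  # "hi" path, demands ~18 GB of VRAM
    return "cuda"          # fully GPU-resident, ~35-36 GB of VRAM
```

Under this sketch, a 16 GB card lands in the low-VRAM tier and a 24 GB card in the "hi" tier, which matches the trade-off described above: on 24 GB it is faster to stay under the ~18 GB demand than to spill past it.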