VRAM Requirement? #5

Open
jryebread opened this issue Jun 23, 2024 · 1 comment

Comments

@jryebread

Hi, how does this compare to Video2X? And what are the VRAM/GPU requirements for running the models? Thanks!

@DachunKai
Owner

Hi, we haven't compared our model directly with Video2X yet, but thank you for the suggestion; we will consider a comparison in the future. As for memory: we tested upscaling 100 frames from 180x320 to 720x1280 (4x upsampling) on two NVIDIA RTX 3090 GPUs, and the GPU memory usage was 10617 MiB / 24576 MiB per card. Thank you.
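If you want to estimate peak VRAM for your own clip before committing to a run, here is a minimal sketch (not this repository's actual API; the stand-in `model` and frame shapes are placeholders for illustration) that reports PyTorch's peak allocation for a 100-frame 180x320 → 720x1280 pass:

```python
import torch

device = torch.device("cuda:0")

# Stand-in network: a plain 4x bicubic upsampler so the snippet runs anywhere.
# Swap in the actual super-resolution model to measure its real footprint.
model = torch.nn.Upsample(scale_factor=4, mode="bicubic").to(device)

# 100 low-resolution RGB frames at 180x320, matching the test above.
frames = torch.rand(100, 3, 180, 320, device=device)

torch.cuda.reset_peak_memory_stats(device)
with torch.no_grad():
    sr = model(frames)  # output: 100 x 3 x 720 x 1280

peak_mib = torch.cuda.max_memory_allocated(device) / 1024 ** 2
print(f"Peak allocated VRAM: {peak_mib:.0f} MiB")
# Note: nvidia-smi usually reports a higher number than this, since it also
# counts the CUDA context and PyTorch's caching-allocator reserve.
```

The ~10617 MiB figure quoted above is the nvidia-smi reading per card, so expect the allocator-level number from a snippet like this to be somewhat lower.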
