
Inference Time #2

Open

Platorius opened this issue Mar 10, 2024 · 0 comments

@Platorius
I love this list, and I like that you go with LPIPS.

But I would suggest one more metric: the inference time it takes to generate one frame (with a model the researchers trained).

Because it doesn't matter how good something is: if you get 0.5 FPS output on an RTX 4090, it is basically useless. Even in ten years that would still be too slow. So what matters is improving quality while still being fast. Good quality that is super slow just means "no one uses that"; it has no practical value.

So that's my suggestion: try to include these values in your tables. I know that, sadly, most researchers do not report how fast their method is.

And I know it would also require stating the resolution and graphics card used, but those could be mentioned too.
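
If it helps, here is a minimal sketch of how per-frame inference time could be measured in PyTorch. The model and input shape are just placeholders, and the synchronize calls matter because CUDA kernels launch asynchronously:

```python
import time
import torch

def measure_inference_time(model, frame, warmup=10, runs=100):
    """Return the average seconds per frame for a single forward pass."""
    model.eval()
    with torch.no_grad():
        # Warm-up passes so lazy initialization and caching don't skew the numbers.
        for _ in range(warmup):
            model(frame)
        # CUDA kernels run asynchronously, so synchronize around the timed region.
        if frame.is_cuda:
            torch.cuda.synchronize()
        start = time.perf_counter()
        for _ in range(runs):
            model(frame)
        if frame.is_cuda:
            torch.cuda.synchronize()
    return (time.perf_counter() - start) / runs

# Hypothetical usage: a 1080p frame on whatever GPU is being reported.
# frame = torch.randn(1, 3, 1080, 1920, device="cuda")
# t = measure_inference_time(model, frame)
# print(f"{t * 1000:.1f} ms/frame at 1920x1080 -> {1 / t:.1f} FPS")
```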

But it is only a suggestion.

Good list. And I am also in favor of using LPIPS only and moving away from PSNR and SSIM.
