
Tokens per sample upper limit for GPTJ #1728

Open
arjunsuresh opened this issue Jun 11, 2024 · 1 comment

Comments

@arjunsuresh (Contributor) commented Jun 11, 2024

Is there any reason why we have an accuracy upper limit on tokens per sample for LLAMA2 but not for GPT-J? It would be good to document the reason for users.

@Oseltamivir commented Oct 30, 2024

Apparently Mixtral too: https://github.com/mlcommons/inference/blob/master/tools/submission/submission_checker.py#L381

Just my two cents, but it might be because these models tend to keep repeating tokens (see the TensorRT issue for Llama, and the one for Mixtral), or maybe I just have confirmation bias.
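
For context, here is a minimal sketch of how a per-model tokens-per-sample bound could be enforced. This is a hypothetical illustration, not the actual submission_checker.py logic; the reference values, tolerances, and the `check_tokens_per_sample` helper are made-up placeholders:

```python
# Hypothetical sketch of a tokens-per-sample bound check.
# Reference values below are placeholders, not real MLPerf numbers.
REF_TOKENS_PER_SAMPLE = {
    "llama2-70b": 294.45,    # placeholder reference value
    "mixtral-8x7b": 145.9,   # placeholder reference value
    # note: no entry for "gptj", so it would pass unchecked
}

def check_tokens_per_sample(model: str, measured: float,
                            lower: float = 0.9, upper: float = 1.1) -> bool:
    """Return True if the measured mean tokens/sample falls within
    [lower, upper] times the model's reference value. Models without
    a configured reference (e.g. GPT-J) are not checked."""
    ref = REF_TOKENS_PER_SAMPLE.get(model)
    if ref is None:
        return True  # no bound configured for this model
    return lower * ref <= measured <= upper * ref
```

Under a scheme like this, a run that keeps repeating tokens would blow past the upper bound and fail the check, which is consistent with the repetition issues linked above being the motivation for the limit.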
