Phi3 awq #1984
Conversation
I tested InternVL2-4B; AWQ with the PyTorch backend worked fine.
lmdeploy/lite/apis/auto_awq.py
Outdated
```python
    'InternLMXComposer2ForCausalLM': 'InternLM2RMSNorm',
    'ChatGLMForConditionalGeneration': 'RMSNorm',
}
NORM_TYPE_MAP = NORM_TYPE_MAP  # legency
```
legency -> legacy

Also, I don't understand why we need the `NORM_TYPE_MAP = NORM_TYPE_MAP` statement.
`NORM_TYPE_MAP` is not used in this file, and an unused import would trigger a lint error. I don't know whether other files reference it.
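A minimal sketch of the pattern being discussed (module layout here is hypothetical, not lmdeploy's actual structure): when a name is imported only so other modules can re-import it, linters such as flake8 flag it as an unused import (F401), and a self-assignment counts as a "use" while keeping the name importable.

```python
# Hypothetical sketch of the re-export pattern; the map entries are copied
# from the reviewed snippet, but the surrounding module layout is invented.

NORM_TYPE_MAP = {
    'InternLMXComposer2ForCausalLM': 'InternLM2RMSNorm',
    'ChatGLMForConditionalGeneration': 'RMSNorm',
}

# If auto_awq.py imported the map only for backward-compatible re-export:
#   from .calibrate import NORM_TYPE_MAP
# a linter would flag the unused import (flake8 F401). Self-assignment
# marks the name as "used" without changing behavior:
NORM_TYPE_MAP = NORM_TYPE_MAP  # legacy re-export, silences the lint warning

# Cleaner alternatives with the same effect:
#   from .calibrate import NORM_TYPE_MAP  # noqa: F401
#   __all__ = ['NORM_TYPE_MAP']
```

The `# noqa: F401` form is usually preferred because it documents the intent directly on the import line.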
Got it. Please fix the typo: legency -> legacy.
Merged into v0.5.3.
Error

Because it prompted me to update autoawq to 0.1.8 or above, I ran `pip install autoawq==0.1.8`, and then got this error.
It was resolved after upgrading autoawq and transformers to the latest versions 😊
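Since the fix was upgrading both packages rather than pinning `autoawq==0.1.8`, a quick way to confirm what is actually installed before re-running quantization is to query the package metadata. This is a generic sketch, not lmdeploy code; `importlib.metadata` is part of the Python standard library.

```python
from importlib import metadata

def installed_versions(pkgs=('autoawq', 'transformers')):
    """Return {distribution: version string, or None if not installed}."""
    versions = {}
    for pkg in pkgs:
        try:
            versions[pkg] = metadata.version(pkg)
        except metadata.PackageNotFoundError:
            versions[pkg] = None
    return versions

print(installed_versions())
```

If either entry is `None` or older than expected, `pip install -U autoawq transformers` matches the fix the reporter described.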