
Add phi and mixtral model type to normalizedconfig #1625

Merged
merged 1 commit into from
Jan 8, 2024

Conversation

changwangss
Contributor

What does this PR do?

Adds the `phi` and `mixtral` model types to the normalized config, based on the following model configurations:
https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1/blob/main/config.json
https://huggingface.co/susnato/phi-1_5_dev/blob/main/config.json (referenced by https://huggingface.co/docs/transformers/main/model_doc/phi#example-)
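The idea behind a normalized config can be sketched as follows. This is an illustrative, self-contained example, not optimum's actual implementation: a registry maps a `model_type` string to the attribute names used in that architecture's `config.json`, so downstream code can read properties such as the number of layers uniformly across architectures. The class and dictionary names here are hypothetical.

```python
# Hypothetical sketch of a normalized-config registry: each model_type
# maps a canonical attribute name to the name used in that model's config.
NORMALIZED_ATTRIBUTES = {
    "phi": {
        "num_layers": "num_hidden_layers",
        "num_attention_heads": "num_attention_heads",
        "hidden_size": "hidden_size",
    },
    "mixtral": {
        "num_layers": "num_hidden_layers",
        "num_attention_heads": "num_attention_heads",
        "hidden_size": "hidden_size",
    },
}

class NormalizedConfig:
    """Exposes canonical attribute names over a raw model config dict."""

    def __init__(self, config: dict, model_type: str):
        self._config = config
        self._mapping = NORMALIZED_ATTRIBUTES[model_type]

    def __getattr__(self, name):
        # Translate the canonical name to the model-specific key.
        return self._config[self._mapping[name]]

# Values taken from the Mixtral-8x7B-Instruct-v0.1 config linked above.
mixtral_cfg = {"num_hidden_layers": 32, "num_attention_heads": 32, "hidden_size": 4096}
norm = NormalizedConfig(mixtral_cfg, "mixtral")
print(norm.num_layers)  # reads mixtral_cfg["num_hidden_layers"]
```

Registering a new model type then amounts to adding one entry to the mapping, which is essentially what this PR does for `phi` and `mixtral` in optimum's registry.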

Fixes # (issue)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?

Signed-off-by: Wang, Chang1 <chang1.wang@intel.com>
@changwangss
Contributor Author

changwangss commented Jan 8, 2024

@fxmarty @echarlaix the CI failure does not seem to be caused by this PR. Could you help me get it merged?
The CI error is as follows:

/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/transformers/pipelines/question_answering.py:563: KeyError

@echarlaix echarlaix merged commit ec14b3f into huggingface:main Jan 8, 2024
40 of 46 checks passed
echarlaix pushed a commit that referenced this pull request Jan 19, 2024
add phi and mixtral config

Signed-off-by: Wang, Chang1 <chang1.wang@intel.com>
2 participants