fix: Update grammar generator and Fixes "None" model in chat_completion_proxy #1207
base: main
Conversation
```python
    processed_models,
    created_rules,
    field_info=None,
) -> tuple[str, list[str]]:
```
Is there a reason to go from `Tuple` to `tuple`? Why not `Tuple[str, List[str]]`?
No, I think some kind of formatter did that. I just copied the updated grammar generator from my llama-cpp-agent framework.
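For context, `tuple[str, list[str]]` is the built-in generic form introduced by PEP 585: it is valid as a runtime annotation on Python 3.9+, and on earlier versions when `from __future__ import annotations` is in effect, which is why automated formatters rewrite `Tuple[...]`/`List[...]` to it. A minimal sketch (the function name and rule syntax are illustrative, not taken from the PR):

```python
from __future__ import annotations  # lets tuple[...] annotations parse on Python 3.7+


def build_rule(name: str, body: str) -> tuple[str, list[str]]:
    """Return a grammar rule string and the list of rule names it defines."""
    rule = f"{name} ::= {body}"
    return rule, [name]


rule, names = build_rule("root", '"yes" | "no"')
```

Both spellings are interchangeable for type checkers; the built-in form simply drops the `typing` import.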
Please describe the purpose of this pull request.
Updates the grammar generator with fixes and additional functionality, and fixes a possible `None` value of the model in chat_completion_proxy.
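The `None`-model fix can be sketched as a fallback applied before the model name is used downstream; the function and default value below are hypothetical illustrations, not the PR's actual code:

```python
from __future__ import annotations

DEFAULT_MODEL = "local-model"  # hypothetical fallback name


def resolve_model(requested: str | None) -> str:
    # If the request carried no model (missing or null), substitute a
    # default instead of propagating None into the completion call.
    return requested if requested else DEFAULT_MODEL
```

This guards both `None` and the empty string, which covers the common ways a proxy request can arrive without a model set.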
How to test
Use a local LLM with the grammar wrapper.
Is your PR over 500 lines of code?
No