Support for GROQ API #65
Tagging @johannsky on this Feature Request.
Thanks for your detailed description and proposed changes. I will look into this!
Got it to work! A beta is out.
Will take a look and let you know. Currently Home Assistant is busy, weekend = everyone home, therefore no touchy the automations. LOL!
Finally had a chance to sit down and spend some time on this. Been able to use my Ollama build, OpenAI, and switched from Custom OpenAI to Groq with no issues. Runs just as it should in my testing. Appreciate the fast turnaround. PS... I am currently using v1.1.3-beta3
Closing this request as it has been resolved with the latest version. |
Thanks for testing! Will merge this into main then. |
Re-opening this feature request from #57.
Since I'm not overly experienced with pull requests and programming in Home Assistant, I'll put what I've learnt and done here. This issue/feature has been solved and I now have Groq Vision LLM working in Home Assistant.
Groq endpoint used in Home Assistant: 'https://api.groq.com/openai/v1/chat/completions'
LLM Vision version: 1.1.1
Home Assistant Core version: 2024.8.3
Home Assistant Supervisor: 2024.09.1
Home Assistant OS: 13.1
Two issues to resolve:
Issue 1
<config_flow.py>
The validation fails for Groq because of the need to use the endpoint 'https://api.groq.com/openai/v1/chat/completions', specifically the fact that Groq uses 'openai' in the URL path, which the validation code lobs off. Since I am only using Groq in my Custom OpenAI setup, I simply fixed this for myself by adding '/openai' to the endpoint variable in custom_openai on line 144:
I will note that it is absolutely imperative to put the '/' at the beginning, as the split logic removes it.
The resolution would be to redo the split operations or simply create a dedicated Groq function. This is really why I didn't do a pull request: my change would obviously mess up the validation for any other custom API endpoints, and whenever I played around with the variables, Home Assistant spat out a "Could not parse endpoint: cannot access local variable 'variable_name' where it is not associated with a value" error.
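To illustrate the "redo the split operations" route, a URL parse that keeps the full path would let Groq's '/openai' prefix survive validation. This is only a sketch with hypothetical names (`parse_endpoint` is not the actual function in config_flow.py):

```python
from urllib.parse import urlparse

def parse_endpoint(url: str) -> tuple[str, str]:
    """Split an endpoint URL into base and path, keeping any
    path prefix such as Groq's '/openai' intact."""
    parsed = urlparse(url)
    base = f"{parsed.scheme}://{parsed.netloc}"
    # Keep the whole path instead of lobbing segments off, so
    # 'https://api.groq.com/openai/v1/...' round-trips correctly.
    return base, parsed.path

base, path = parse_endpoint("https://api.groq.com/openai/v1/chat/completions")
```

With this approach a plain OpenAI-compatible endpoint still parses the same way, and nothing special-cases Groq.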
Issue 2
<request_handlers.py>
The order in which the text is added to the JSON payload matters. The prompt is required to be placed first, then any other text values such as the tags, and then the image data. Below is the updated function that I have loaded into my Home Assistant.
I haven't tested whether this works with OpenAI directly, 'cause, well, I'm cheap and I don't have an API key to use. So I'll defer to others to test. Assuming this is fine with OpenAI, the resolution is to just move that block of code ahead of the images.
FYI... Groq does not support sending multiple images, so that may require the code to be placed in its own function to limit the images to 1. Presumably this will change at some point; it's using an older version of LLaVA right now.
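Putting both points together, here is a minimal sketch of the content ordering described above: prompt first, other text next, images last, capped at one image for Groq. The function and parameter names are hypothetical, not the integration's actual request_handlers.py code:

```python
def build_content(prompt: str, tags: list[str], images: list[str],
                  max_images: int = 1) -> list[dict]:
    """Assemble OpenAI-style message content in the order Groq
    requires: prompt text first, then tags, then image data."""
    content = [{"type": "text", "text": prompt}]
    for tag in tags:
        content.append({"type": "text", "text": tag})
    # Groq currently accepts only one image per request,
    # so truncate the list rather than sending them all.
    for b64 in images[:max_images]:
        content.append({
            "type": "image_url",
            "image_url": {"url": f"data:image/jpeg;base64,{b64}"},
        })
    return content

content = build_content("Describe the scene", ["camera:front_door"],
                        ["img1_base64", "img2_base64"])
```

The resulting list drops straight into the `content` field of an OpenAI-compatible chat message, and raising `max_images` later would restore multi-image support if Groq adds it.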
For anyone trying to mod the files directly in Home Assistant, don't forget to restart Home Assistant for the changes to take effect.
Hope this helps.