Add OpenAI-compatible Groq #596
Conversation
Add `gem "ruby-openai", "~> 7.0"` to your Gemfile.

```ruby
llm = Langchain::LLM::GroqOpenAI.new(api_key: ENV["GROQ_API_KEY"]) # Defaults to Meta Llama 3
```
I would prefer to call it `Langchain::LLM::Groq`. I find `GroqOpenAI` rather confusing, as it sounds like the OpenAI models are available on Groq, which they're not.
```ruby
#   llm_options: {},
#   default_options: {}
# )
class GroqOpenAi < OpenAI
```
I would build a standalone integration to Groq directly instead of subclassing OpenAI. I understand that it works but it only works because Groq is trying to mimic the OpenAI interface.
Sure. By 'standalone integration' do you mean not use the ruby-openai gem? Or use the gem but don't subclass?
I liked the subclass since without it you have to define the chat method and all of its parameters and pass them through.
@mattlindsey Not use the `ruby-openai` gem and not subclass for Groq.
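A standalone integration along those lines might look like the sketch below: it talks to Groq's chat completions endpoint directly with stdlib `Net::HTTP` instead of subclassing the `OpenAI` class or depending on the `ruby-openai` gem. The class name `GroqStandalone`, the `build_payload` helper, and the default model are illustrative assumptions, not part of this PR; the endpoint URL is assumed from Groq's OpenAI-compatible API.

```ruby
require "net/http"
require "json"
require "uri"

# Hypothetical sketch of a standalone Groq integration (names assumed).
# No ruby-openai gem and no subclassing of the OpenAI class.
class GroqStandalone
  # Groq exposes an OpenAI-compatible chat completions endpoint (assumed URL).
  ENDPOINT = URI("https://api.groq.com/openai/v1/chat/completions")
  DEFAULTS = {model: "llama3-8b-8192", temperature: 0.0}.freeze

  def initialize(api_key:, default_options: {})
    @api_key = api_key
    @defaults = DEFAULTS.merge(default_options)
  end

  # Build the request payload; per-call params override the defaults.
  # Kept separate from #chat so it can be exercised without network access.
  def build_payload(messages:, **params)
    @defaults.merge(params).merge(messages: messages)
  end

  # This is where the cost of not subclassing shows up: the chat method
  # and its parameter plumbing have to be defined by hand.
  def chat(messages:, **params)
    request = Net::HTTP::Post.new(ENDPOINT)
    request["Authorization"] = "Bearer #{@api_key}"
    request["Content-Type"] = "application/json"
    request.body = JSON.generate(build_payload(messages: messages, **params))

    response = Net::HTTP.start(ENDPOINT.host, ENDPOINT.port, use_ssl: true) do |http|
      http.request(request)
    end
    JSON.parse(response.body)
  end
end
```

The trade-off raised above is visible here: the subclass approach inherits `chat` and its parameter handling for free, while the standalone version re-implements that plumbing but carries no dependency on `ruby-openai` or on Groq continuing to mirror the OpenAI interface.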
TODO: