top_k and top_p transposed in vertexai #5673
Conversation
@khallbobo - you might be interested in this since you just fixed something related. :)
good catch, thanks
Added a test to cover that args are sent to the library as expected.
Wow. This will certainly change results! Without my change, these are ignored anyway. With my change, we will start seeing very different results.
Google is now returning an error if you are using an older version of langchain. For anyone else wondering why their project has suddenly broken when nothing has changed their end (and they may not even be setting top_p), you may be seeing this message:
@bstrawson Can you give a little more detail about the issue you're running into? Is it a case where you had saved an instance of an llm model in a previous version and then when you ran it in this version, you hit that issue? To give a little more context, there are 2 changes that would have caused this.
@mheguy-stingray to be clear, this change fixes the issue I was seeing. I just wanted to highlight the error message so others can discover this issue and find a resolution quickly. I was using 0.188, which was working for me up until yesterday. I then started seeing the error message I've referenced above and everything stopped working. I assumed it was my code, but even rolling back changes and deleting indexes didn't fix the issue. I spent a lot of time scratching my head and trying to work out why it suddenly broke for me. My assumption is that VertexAI was previously ignoring an invalid top_p setting, hence everything was previously working. It is an assumption, but I see no other explanation. Either way, I thought it might be useful to highlight the error that the API returns in case others come across it.
If you were using vertex in langchain, your llm settings (top_p, top_k, max_tokens, temperature) were never leaving langchain's code. They were never given to the vertex lib, nor the google servers. #5566 fixed that issue. The good news is that your settings are now actually being used.
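To illustrate what "transposed" means here, the following is a minimal sketch (hypothetical function and field names, not the actual langchain source): the wrapper collected the sampling settings into a params dict, but assigned the top_k value to top_p and vice versa. Since Vertex AI expects top_p to be a probability in [0, 1] and top_k to be a small integer, an integer top_k landing in top_p is rejected by the API once the parameters actually reach it.

```python
# Hypothetical sketch of the transposed-parameter bug; names are
# illustrative, not the real langchain implementation.

def build_params_buggy(temperature, max_output_tokens, top_p, top_k):
    # Buggy version: top_p and top_k receive each other's values.
    return {
        "temperature": temperature,
        "max_output_tokens": max_output_tokens,
        "top_k": top_p,  # wrong: gets the nucleus-sampling probability
        "top_p": top_k,  # wrong: gets the integer k
    }

def build_params_fixed(temperature, max_output_tokens, top_p, top_k):
    # Fixed version: each setting maps to its own key.
    return {
        "temperature": temperature,
        "max_output_tokens": max_output_tokens,
        "top_k": top_k,
        "top_p": top_p,
    }

buggy = build_params_buggy(0.7, 128, top_p=0.95, top_k=40)
fixed = build_params_fixed(0.7, 128, top_p=0.95, top_k=40)
print(buggy["top_p"], fixed["top_p"])  # 40 vs 0.95
```

This also explains why the breakage only surfaced later: while the settings never left langchain, the swap was invisible; once they were actually forwarded, an out-of-range top_p produced a server-side error.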
@mheguy-stingray I've looked at #5566 but that appears to only apply to chat. Your fix applies to all VertexAI calls as well.
Fix transposed properties in vertexai model Co-authored-by: Dev 2049 <dev.dev2049@gmail.com>