Replies: 4 comments
-
Any updates on this?
-
Hi! We are a group of 4 students from the University of Toronto, and we are looking into implementing this shortly!
-
Here is our proposal! We will be implementing this soon. Any feedback is appreciated :)

Outline of Changes

Existing Architecture

Files to Modify
This issue will likely not require the addition of any new files, unless we decide to create a separate test file.

Pseudocode
Outline of changes made to chat_models.py:
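The pseudocode itself is not reproduced here, so the following is only an illustrative sketch (not the authors' actual proposal or the real langchain_google_genai code) of what a generation_config passthrough in chat_models.py could look like; the class body and the _prepare_params helper are hypothetical simplifications.

```python
# Illustrative sketch only -- not the authors' pseudocode or the real
# langchain_google_genai implementation. Names like _prepare_params are
# hypothetical simplifications.
from typing import Any, Dict, Optional


class ChatGoogleGenerativeAI:  # simplified stand-in for the real class
    def __init__(
        self,
        model: str,
        generation_config: Optional[Dict[str, Any]] = None,
        **kwargs: Any,
    ) -> None:
        self.model = model
        # User-supplied generation settings, e.g.
        # {"response_mime_type": "application/json"} for JSON mode.
        self.generation_config = dict(generation_config or {})

    def _prepare_params(self, **overrides: Any) -> Dict[str, Any]:
        # Merge per-call overrides on top of the instance-level config and
        # hand the result to the Gemini SDK as its generation_config argument.
        params = dict(self.generation_config)
        params.update(overrides.get("generation_config") or {})
        return {"generation_config": params}
```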
-
PR for this is up!
-
Feature request
Please integrate Google's JSON Mode into langchain_google_genai.ChatGoogleGenerativeAI, similar to ChatOpenAI(..., model_kwargs={"response_format": {"type": "json_object"}}).
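For reference, a runnable version of the OpenAI-side comparison might look like the sketch below; the model name and prompt are placeholders, and the import path assumes the langchain-openai partner package is installed.

```python
# Minimal sketch of the existing OpenAI-side JSON mode, for comparison.
# Model name and prompt are placeholders; import path assumes the
# langchain-openai partner package is installed.
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-4-turbo-preview",
    model_kwargs={"response_format": {"type": "json_object"}},
)
# OpenAI's JSON mode requires the word "JSON" to appear in the prompt.
msg = llm.invoke("Return a JSON object with keys 'city' and 'country' for Paris.")
print(msg.content)  # e.g. '{"city": "Paris", "country": "France"}'
```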
Motivation
I want to use Gemini for a generation task and the output must be in JSON format.
OpenAI cannot be used for my task because the token limit is too small.
Right now I need to switch to Google's API, but I want to continue using LangChain!
Proposal (If applicable)
I don't have a proposal on how to implement the feature, but it would be great to be able to use a keyword argument like this:
model = ChatGoogleGenerativeAI(..., model_kwargs={"generation_config": {"response_mime_type": "application/json"}})
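For context, the underlying Gemini API already exposes this switch. Below is a minimal sketch using the google-generativeai SDK directly, i.e. what such a kwarg would pass through to; it assumes an SDK version and model that support response_mime_type (e.g. Gemini 1.5), and the API key and prompt are placeholders.

```python
# Minimal sketch of Gemini's native JSON mode via the google-generativeai SDK.
# Assumes an SDK version and model that support response_mime_type
# (e.g. Gemini 1.5); API key and prompt are placeholders.
import os

import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])
model = genai.GenerativeModel(
    "gemini-1.5-pro",
    generation_config={"response_mime_type": "application/json"},
)
response = model.generate_content(
    "List two EU capitals as a JSON array of objects with 'city' and 'country'."
)
print(response.text)  # raw JSON string returned by the model
```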