This was, of course, after installing that library in the venv. However, I'm having trouble getting the same change to work in LangGraph Studio. Here's the observed behavior:
ConnectError: [Errno 111] Connection refused
Here's what I've done to try to get it to work:
1. Modify requirements.txt in the studio/ folder, adding the line:

   langchain_ollama

2. Modify router.py to use it:
from langgraph.graph import MessagesState
...
from langchain_ollama import ChatOllama

# Tool
def multiply(a: int, b: int) -> int:
    """Multiplies a and b. Args: a: first int, b: second int."""
    return a * b

# LLM with bound tool
# llm = ChatOpenAI(model="gpt-4o")
llm = ChatOllama(
    model="llama3.1:8b-instruct-q8_0",
    temperature=0.8,
    num_ctx=4096,
    num_predict=4096,
)
llm_with_tools = llm.bind_tools([multiply])
...
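As a quick sanity check, independent of the LLM binding, the tool function itself can be exercised directly (no Ollama or LangChain needed for this):

```python
# Standalone check of the tool function that gets bound to the LLM.
def multiply(a: int, b: int) -> int:
    """Multiplies a and b. Args: a: first int, b: second int."""
    return a * b

print(multiply(6, 7))  # → 42
```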
I also had to restart LangGraph Studio at this point, but from there it worked to load and submit messages in the router. But I get that error when running the graph.
By comparison, in the router.ipynb, from basically the cell representing the output, I get:
It just dawned on me what the real problem is: in my code, I am not specifying the hostname. localhost inside the container refers to the container itself (the virtual machine it represents). Docker provides a special hostname, host.docker.internal, that can be used to refer to the host machine from within the container.
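To make the same code work both in the notebook venv and inside Studio's container, one option is a small helper that picks the hostname. This is only a sketch under two assumptions I'm adding: the /.dockerenv check is a common but not guaranteed marker of running in a container, and 11434 is Ollama's default port.

```python
import os

def ollama_base_url() -> str:
    """Return a base URL for Ollama that works from the host or a container."""
    # Inside a Docker container, "localhost" resolves to the container itself,
    # so use Docker's special hostname to reach the host machine instead.
    # NOTE: the /.dockerenv file is a common, but not guaranteed, container marker.
    in_docker = os.path.exists("/.dockerenv")
    host = "host.docker.internal" if in_docker else "localhost"
    return f"http://{host}:11434"  # 11434 is Ollama's default port

# Then pass it through, e.g.:
# llm = ChatOllama(model="llama3.1:8b-instruct-q8_0", base_url=ollama_base_url())
```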
For context, I've been following along with router.py in module 1, and using the code above in the notebooks (in place of ChatOpenAI) I've found a lot of success.
Studio doesn't leave the container instance running, so I can't search for what is wrong, and I'm unsure how to change it :)
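For inspecting a container after it stops, something like the following may help (a sketch, assuming docker is on the PATH; a stopped container's logs persist until the container is removed):

```shell
# List all containers, including stopped ones, to find the Studio container.
docker ps -a
# Fetch the logs of the most recently created container (works on stopped ones).
docker logs "$(docker ps -aq --latest)"
```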