
how to get the studio to work with local model? #13

Closed
robbiemu opened this issue Sep 12, 2024 · 2 comments

Comments

robbiemu (Contributor) commented Sep 12, 2024

I've been following along to router.py in module 1.

Using this code in the notebooks, in place of the ChatOpenAI model, I've been able to follow along successfully:

from langchain_ollama import ChatOllama

llm = ChatOllama(
    model="llama3.1:8b-instruct-q8_0",
    temperature=0.8,
    num_ctx=4096,
    num_predict=4096,
)

This was, of course, after installing that library in the venv. However, I'm having trouble getting the same change to work in LangGraph Studio. Here's the observed behavior:

ConnectError: [Errno 111] Connection refused

Here's what I've done to try to get it to work:

  1. Modify requirements.txt in the studio/ folder, adding the line:
langchain_ollama
  2. Modify router.py to use it:
from langgraph.graph import MessagesState
...
from langchain_ollama import ChatOllama

# Tool
def multiply(a: int, b: int) -> int:
    """Multiplies a and b.

    Args:
        a: first int
        b: second int
    """
    return a * b

# LLM with bound tool
# llm = ChatOpenAI(model="gpt-4o")
llm = ChatOllama(
    model="llama3.1:8b-instruct-q8_0",
    temperature=0.8,
    num_ctx=4096,
    num_predict=4096,
)
llm_with_tools = llm.bind_tools([multiply])
...

At this point I also had to restart LangGraph Studio, but after that it loaded and let me submit messages to the router. However, I get that error when running the graph.

By comparison, in router.ipynb, the cell that prints the conversation output gives:

================================ Human Message =================================

Multiply 3 and 2
================================== Ai Message ==================================
Tool Calls:
  multiply (8b040f6e-39d4-4ae1-a46f-73bfb3ccd9da)
 Call ID: 8b040f6e-39d4-4ae1-a46f-73bfb3ccd9da
  Args:
    a: 3
    b: 2
================================= Tool Message =================================
Name: multiply

6
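The Tool Message above is just the return value of multiply, called with the args the model produced. A minimal sketch of that dispatch, using plain dicts to stand in for LangChain's tool-call objects (the dict shape here is illustrative, not LangChain's exact API):

```python
def multiply(a: int, b: int) -> int:
    """Multiplies a and b."""
    return a * b

# Hypothetical dict mirroring the tool call in the trace above
tool_call = {"name": "multiply", "args": {"a": 3, "b": 2}}
tools_by_name = {"multiply": multiply}

# Look up the tool by name and apply the model-generated arguments
result = tools_by_name[tool_call["name"]](**tool_call["args"])
print(result)  # 6
```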

Studio doesn't leave the container instance running, so I can't inspect it to find out what's wrong, and I'm unsure how to change that :)

robbiemu (Contributor, Author) commented:

By the way, it was very cool to see that I can change multiply to accept floats and tell it to multiply pi and e :)

================================ Human Message =================================

Multiply pi and e
================================== Ai Message ==================================
Tool Calls:
  multiply (fc7144da-1851-46b1-addf-aa90cad66fcc)
 Call ID: fc7144da-1851-46b1-addf-aa90cad66fcc
  Args:
    a: 3.14159
    b: 2.71828
================================= Tool Message =================================
Name: multiply

8.539721265199999
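As a sanity check, the Tool Message value matches plain Python float multiplication of the truncated constants the model passed:

```python
a, b = 3.14159, 2.71828
product = a * b
print(product)  # matches the Tool Message value above
```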

robbiemu (Contributor, Author) commented:

It just dawned on me what the real problem is: in my code, I am not specifying the hostname. localhost inside the container refers to the container itself, not the host machine. Docker provides a special hostname, host.docker.internal, that can be used to reach the host machine from within the container.

llm = ChatOllama(
    model="mistral-nemo:12b-instruct-2407-q8_0",
    base_url="http://host.docker.internal:11434",
    temperature=0.8,
    num_ctx=8192,
    num_predict=4096,
)

That works! This was a non-issue, I guess :) If anyone else runs into the same trouble, hopefully they can at least search here and find the answer.
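If you want the same router.py to work both in the notebook (on the host) and inside Studio's container, one option is to pick the base_url at runtime. This is a sketch under an assumption: checking for /.dockerenv is a common heuristic for "am I in a Docker container", not a guarantee, and the helper name is made up here:

```python
import os

def ollama_base_url(port: int = 11434) -> str:
    """Return an Ollama endpoint: host.docker.internal when running
    inside a container (e.g. LangGraph Studio), localhost otherwise.
    The /.dockerenv check is a heuristic, not a guarantee."""
    host = "host.docker.internal" if os.path.exists("/.dockerenv") else "localhost"
    return f"http://{host}:{port}"

# llm = ChatOllama(model="...", base_url=ollama_base_url(), ...)
```

Note that on Linux, host.docker.internal is not defined by default; the container may need to be started with `--add-host=host.docker.internal:host-gateway` (supported in Docker 20.10+) for it to resolve.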
