
No one has noticed that on Windows there will be a problem in step 8 of your instructions: socat listens on port 11434 and Ollama also listens on port 11434, which causes a process conflict. #253

Open
jiajiahard opened this issue Sep 25, 2024 · 6 comments

Comments

@jiajiahard

When I skipped step 8, my Convex backend was completely unable to access Ollama and kept telling me:
9/25/2024, 9:26:31 PM [CONVEX A(aiTown/agentOperations:agentGenerateMessage)] Uncaught Error: Request to http://localhost:11434/api/embeddings forbidden

9/25/2024, 9:26:31 PM [CONVEX A(aiTown/agentOperations:agentGenerateMessage)] [LOG] 'Texts to be sent for embedding: ' [ 'Bob is talking to Stella' ]
9/25/2024, 9:26:31 PM [CONVEX A(aiTown/agentOperations:agentGenerateMessage)] [LOG] 'Sending data for embedding: {"model":"mxbai-embed-large","prompt":"Bob is talking to Stella"}'
9/25/2024, 9:26:31 PM [CONVEX A(aiTown/agentOperations:agentGenerateMessage)] Uncaught Error: Request to http://localhost:11434/api/embeddings forbidden
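As a quick sanity check, the same request can be replayed by hand with curl (the URL and payload are taken straight from the log above), to see whether Ollama answers at all outside of Convex:

```sh
# Replay the embedding request Convex is trying to make
curl http://localhost:11434/api/embeddings \
  -d '{"model":"mxbai-embed-large","prompt":"Bob is talking to Stella"}'
```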

@jiajiahard
Author

I can't even see any log entries in Ollama showing the requests Convex has made.

@Extassey

Yes, I have been having this problem as well on multiple machines... I basically brute force it every time. Will post in a minute when I get it working again.

@jiajiahard
Author

> Yes, I have been having this problem as well on multiple machines... I basically brute force it every time. Will post in a minute when I get it working again.

So, what should I do?

@Extassey

Extassey commented Sep 28, 2024

Honestly, for now I am using the no-clerk branch; it makes my machine run less hot anyhow. Maybe one of the devs of the main fork will chime in eventually when they see this; I can't figure out how I got past this last time.
I am also having trouble with the precompiled local Convex backend: npm run dev keeps prompting me to set up a remote Convex project, which was not the case before. Something must have updated somewhere and broken it.

@Extassey

Extassey commented Sep 29, 2024

Okay, so I found out that if you are using WSL, you have to make sure you have Ollama installed in WSL and not on Windows itself. I'm not sure whether there is a port problem with having it on Windows, but I wouldn't have it there, just to be safe.

Secondly, install Ollama with curl -fsSL https://ollama.com/install.sh | sh; the snap package doesn't seem to work for me.
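The WSL-side install then boils down to this (the ollama --version check is just an extra confirmation, not part of the original steps):

```sh
# Inside WSL: install Ollama via the official script (the snap package did not work here)
curl -fsSL https://ollama.com/install.sh | sh
ollama --version   # confirm the WSL install is the one on your PATH
```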

Thirdly, once you install Ollama you are good to go; don't worry about running ollama serve. Once it's running you can do a /bye and it should still be running. Open cmd and type netstat -ano | findstr :11434; you should get nothing in response. If you do get something, kill that process (taskkill /PID <PID> /F) and it should exit.
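On the Windows side, that check-and-kill step looks roughly like this (taskkill is the cmd equivalent of the kill mentioned above; <PID> is whatever PID netstat reports):

```bat
netstat -ano | findstr :11434
REM if a PID shows up in the last column, stop that process:
taskkill /PID <PID> /F
```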

Now, back in WSL (remember, every one of these steps I've mentioned is done in WSL, not cmd, other than the netstat -ano | findstr :11434 step and the kill step), do a curl http://localhost:11434/ and you should see Ollama is running in response.
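In other words:

```sh
# Inside WSL: confirm the Ollama API answers locally
curl http://localhost:11434/
# expected response: "Ollama is running"
```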

I am going to check the other socat config and Convex port-assignment steps as well to make sure those are still working.

[EDIT]:

So, with socat: first set your host IP, like this: HOST_IP=<YOUR HOST IP INSIDE HERE>. To find out what it is, run hostname -I | awk '{print $1}'. Then run this command to tell socat to listen: socat TCP-LISTEN:11434,fork TCP:$HOST_IP:11434 & (you might get an error saying it is already listening, but ignore that), and then check to make sure it is listening with ps aux | grep socat.
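Put together, that socat step is (run inside WSL; this assumes hostname -I really does print the IP you want to forward to, so adjust HOST_IP if your setup differs):

```sh
# Run inside WSL
HOST_IP=$(hostname -I | awk '{print $1}')          # the host IP described above
socat TCP-LISTEN:11434,fork TCP:$HOST_IP:11434 &   # forward local 11434 to the host
ps aux | grep socat                                # confirm socat is listening
```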

@jiajiahard
Author

I don't understand your idea. If I run ollama serve in WSL and also run socat TCP-LISTEN:11434,fork TCP:$HOST_IP:11434 & (checking it with ps aux | grep socat), I get a process conflict, because they both use port 11434. Maybe your ollama serve started on a different port?
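To see which process actually holds 11434 inside WSL, a standard check would be:

```sh
# List whatever is listening on port 11434 inside WSL
ss -ltnp | grep 11434
```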
