Fundamental issue in response with IPEX-LLM with Ollama #12636
Comments
Hi, which iGPU device are you using? It looks fine on an Ultra 7 258V.
12th Gen Intel(R) Core(TM) i5-1240P
@anandnandagiri Can you update your ipex-llm[cpp] to the latest nightly version? I can't reproduce the wrong output on a similar device (i7-1270P).
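For reference, the nightly (pre-release) build of ipex-llm[cpp] is normally pulled in by upgrading with pip's pre-release flag. Treat the snippet below as a sketch of the typical upgrade step, not the maintainer's exact instructions; your environment (conda env name, extra index settings) may differ.

```bash
# Sketch: upgrade ipex-llm[cpp] to the latest nightly build.
# --pre lets pip select pre-release (nightly) versions; quotes keep the
# [cpp] extra from being interpreted by shells such as zsh.
pip install --pre --upgrade "ipex-llm[cpp]"
```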
This got resolved. Thank you @qiuxin2012.
Original issue description:
I am getting a weird response from Ollama. Hoping the screenshot below helps show the problem in detail.
Below is the command I used to get it working.
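The reporter's actual command was only shared as a screenshot and is not reproduced here. As a rough illustration only, launching Ollama through IPEX-LLM on an Intel iGPU usually follows the pattern sketched below; the conda environment name and environment-variable values are assumptions drawn from the IPEX-LLM quickstart, not the reporter's setup.

```bash
# Rough sketch of a typical IPEX-LLM Ollama launch on an Intel iGPU
# (illustrative only; not the reporter's exact command).
conda activate llm-cpp        # assumed name of the ipex-llm[cpp] environment
init-ollama                   # links the ipex-llm ollama binary into the current directory
export OLLAMA_NUM_GPU=999     # offload all model layers to the GPU
export ZES_ENABLE_SYSMAN=1
./ollama serve                # then run a model from a second terminal, e.g. ./ollama run <model>
```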