Issues: ollama/ollama
- SSE streams possibly not in spec for /api/generate [bug] (#4788, opened Jun 2, 2024 by Vali-98)
- Error: invalid file magic for IQ2_M.gguf based models [model request] (#4786, opened Jun 2, 2024 by HakaishinShwet)
- Weird output with ordinary setting [bug] (#4784, opened Jun 2, 2024 by JoonSumisu)
- How do I customize the number of layers to be loaded on GPU? (#4783, opened Jun 2, 2024 by lingyezhixing)
- Support for jina-embeddings-v2-base-zh [model request] (#4778, opened Jun 2, 2024 by wwjCMP)
- Error: llama runner process has terminated: exit status 1 [bug] (#4775, opened Jun 2, 2024 by BAK-HOME)
- Ignoring env, being weird with env [bug] (#4771, opened Jun 1, 2024 by RealMrCactus)
- server.log grows indefinitely on Windows [bug] (#4770, opened Jun 1, 2024 by dhiltgen)
- Infinitely generating irrelevant responses when running phi3-mini in the Linux terminal [bug] (#4769, opened Jun 1, 2024 by MomenAbdelwadoud)
- Model response corruption and leaking data between sessions [bug] (#4767, opened Jun 1, 2024 by MarkWard0110)
- ollama stop [id of running model] [feature request] (#4764, opened Jun 1, 2024 by mrdev023)
- I created Ollama - Open WebUI Script - Give it a try! [feature request] (#4763, opened Jun 1, 2024 by Special-Niewbie)
- Add this web app to the list of apps in the README [feature request] (#4758, opened May 31, 2024 by greenido)
- (Windows) ollama model download does not resume when Ollama is reopened [feature request] (#4755, opened May 31, 2024 by waldolin)
- FROM is not recognized [bug] (#4753, opened May 31, 2024 by EugeoSynthesisThirtyTwo)
- Multi-GPU and batch management [feature request] (#4752, opened May 31, 2024 by LaetLanf)
- Garbage output running llama3 GGUF model [bug] (#4750, opened May 31, 2024 by DiptenduIDEAS)
- OLLAMA_MODELS not applied on initial start or on restart after upgrade on macOS [feature request] (#4749, opened May 31, 2024 by vernonstinebaker)
- CMake Error at CMakeLists.txt:2 (project): Generator Ninja does not support platform specification, but platform [bug] (#4745, opened May 31, 2024 by chaoqunxie)