Issues: ollama/ollama

Issues list

command-r:35b uses too much memory [bug]
#4790 opened Jun 2, 2024 by Zig1375
SSE streams possibly not in spec for /api/generate [bug]
#4788 opened Jun 2, 2024 by Vali-98
MiniCPM-Llama3-V-2_5 [model request]
#4787 opened Jun 2, 2024 by ddpasa
ollama save feature [feature request]
#4785 opened Jun 2, 2024 by CorollaD
Weird output with ordinary setting [bug]
#4784 opened Jun 2, 2024 by JoonSumisu
ollama not show my model. [bug]
#4781 opened Jun 2, 2024 by tuantupharma
Pls add Radeon VII [feature request]
#4780 opened Jun 2, 2024 by MrSteelRat
Support for jina-embeddings-v2-base-zh [model request]
#4778 opened Jun 2, 2024 by wwjCMP
Error: llama runner process has terminated: exit status 1 [bug]
#4775 opened Jun 2, 2024 by BAK-HOME
Ignoring env, being weird with env [bug]
#4771 opened Jun 1, 2024 by RealMrCactus
server.log grows indefinitely on windows [bug]
#4770 opened Jun 1, 2024 by dhiltgen
Model response corruption and leaking data between session. [bug]
#4767 opened Jun 1, 2024 by MarkWard0110
ollama stop [id of running model] [feature request]
#4764 opened Jun 1, 2024 by mrdev023
Add this web app to the list of apps in the README [feature request]
#4758 opened May 31, 2024 by greenido
FROM is not recognized [bug]
#4753 opened May 31, 2024 by EugeoSynthesisThirtyTwo
Multi-GPU and batch management [feature request]
#4752 opened May 31, 2024 by LaetLanf
Garbage output running llama3 GGUF model [bug]
#4750 opened May 31, 2024 by DiptenduIDEAS
Custom-llama issue [bug]
#4748 opened May 31, 2024 by Ascariota