Issues: abetlen/llama-cpp-python
Model path does not exist: path/to/llama-2/llama-model.gguf
#1672
opened Aug 9, 2024 by
vinodhanvino
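The first entry (#1672) is the common "Model path does not exist" error, which llama-cpp-python raises when the `.gguf` file cannot be found. A minimal sketch of guarding against it before constructing `Llama` (the `check_model_path` helper and the example path are hypothetical; the actual `Llama(...)` call is omitted):

```python
import os

def check_model_path(path: str) -> str:
    # llama-cpp-python raises "Model path does not exist: ..." when the
    # .gguf file is missing; checking up front gives a clearer failure.
    if not os.path.exists(path):
        raise FileNotFoundError(f"Model path does not exist: {path}")
    return path

# Usage with a placeholder path that does not exist:
try:
    check_model_path("path/to/llama-2/llama-model.gguf")
except FileNotFoundError as e:
    print(e)
```

In practice the fix is usually an absolute path to the downloaded model file rather than a relative placeholder.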
uv pip install llama-cpp-python fails "File exists when symlinking..."
bug
#1670
opened Aug 8, 2024 by
abetlen
error: no matching function for call to 'ggml_group_norm'
#1665
opened Aug 7, 2024 by
yurivict
Wheel build showing error of cmake suddenly - building version 0.2.76 on windows
#1664
opened Aug 7, 2024 by
himanshu034
High CPU Usage, very slow performance, with flash_attn=true on ROCM 6.1.2
bug
Something isn't working
#1661
opened Aug 6, 2024 by
curvedinf
Error When Loading Model with llama_cpp: [WinError -1073741795] Windows Error 0xc000001d
bug
Something isn't working
#1660
opened Aug 6, 2024 by
ltkien2003
pip install llama-cpp-python on anaconda
bug
Something isn't working
#1654
opened Aug 5, 2024 by
werruww
CUDA 12.1 Llama-cpp-python version 0.2.84 pre-built request.
#1652
opened Aug 2, 2024 by
gformcreation
Installing the pre-built v0.2.85 – v0.2.87 Metal wheels for Python {3.10, 3.11} yields "Bad magic number for file header"
bug
Something isn't working
#1650
opened Aug 2, 2024 by
lsorber
create_chat_completion is stuck in versions 0.2.84 and 0.2.85 for Mac Silicon
bug
Something isn't working
#1648
opened Aug 1, 2024 by
mobeetle
All requests end with 'finish_reason': 'length' when the max_tokens=-1 parameter is set.
bug
Something isn't working
#1645
opened Jul 31, 2024 by
tur0kmagalp
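Issue #1645 concerns the `max_tokens=-1` convention: in llama-cpp-python, `max_tokens <= 0` (or `None`) means "generate until the context window is full", so every response can plausibly stop with `finish_reason: 'length'`. A minimal sketch of that resolution logic (the `resolve_max_tokens` helper is hypothetical, written to illustrate the documented convention):

```python
def resolve_max_tokens(max_tokens, n_ctx: int, prompt_len: int) -> int:
    # Convention: max_tokens <= 0 or None means "unlimited", i.e. bounded
    # only by the remaining room in the n_ctx-token context window.
    remaining = n_ctx - prompt_len
    if max_tokens is None or max_tokens <= 0:
        return remaining
    return min(max_tokens, remaining)

print(resolve_max_tokens(-1, 4096, 100))  # remaining context: 3996
```

Under this convention, a generation capped by the context window is reported as `finish_reason: 'length'` even when the model never emitted a stop token, which matches the behavior described in the issue title.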
[Bug:Server] Lack of usage information on streaming response
enhancement
New feature or request
#1640
opened Jul 30, 2024 by
zm0n3
Pre-built cpu wheel does not work on Ubuntu due to libc.musl dependency
bug
Something isn't working
#1628
opened Jul 27, 2024 by
OKUA1
CUDA error: unspecified launch failure on inference on Nvidia V100 GPUs
bug
Something isn't working
#1624
opened Jul 26, 2024 by
rplescia
Not able to Install with cuda support in Bento
bug
Something isn't working
#1622
opened Jul 25, 2024 by
RakshitAralimatti
can not install llama-cpp-python
bug
Something isn't working
#1621
opened Jul 24, 2024 by
XingchenMengxiang
llama-cpp-python CMake,Failed building wheel for llama-cpp-python error on windows 11 pro
bug
Something isn't working
#1619
opened Jul 24, 2024 by
Shrutibajpeyi
Feat: Add support for Llama 3.1 function calling
enhancement
New feature or request
#1618
opened Jul 24, 2024 by
qnixsynapse
ERROR: Could not build wheels for llama-cpp-python
bug
Something isn't working
#1617
opened Jul 23, 2024 by
inst32i
Build fail for version from 0.2.80 to 0.2.83
bug
Something isn't working
#1616
opened Jul 23, 2024 by
congson1293
ERROR: Failed building wheel for llama-cpp-python for SYCL installation on Windows
bug
Something isn't working
#1614
opened Jul 22, 2024 by
sunilmathew-mcw
Add support for cross-encoders
enhancement
New feature or request
#1611
opened Jul 20, 2024 by
perpendicularai
destructor llama error: TypeError: 'NoneType' object is not callable
#1610
opened Jul 20, 2024 by
yanjunting1983
Pull from Ollama repo functionality
enhancement
New feature or request
#1607
opened Jul 18, 2024 by
ericcurtin