@wallartup Hi, looking at your error, I can see you're using an environment with CUDA 12.3.
llama_cpp_python_cuda currently only provides prebuilt wheels up to CUDA 12.2, which is why pip hit the 404 Not Found error.
Please wait for the next version update; we will fix this issue.
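In the meantime, a possible workaround (my assumption, not the official DB-GPT install path) is to skip the prebuilt llama_cpp_python_cuda wheel and build llama-cpp-python with cuBLAS support locally; this requires the CUDA toolkit and the Visual Studio C++ build tools to be installed:

```bat
:: Hypothetical workaround: compile llama-cpp-python with cuBLAS instead of
:: relying on the prebuilt cu123 wheel (needs CUDA toolkit + MSVC build tools).
set CMAKE_ARGS=-DLLAMA_CUBLAS=on
set FORCE_CMAKE=1
pip install llama-cpp-python --no-cache-dir
```

Whether DB-GPT picks up the locally built package for GPU inference may depend on your configuration, so treat this as a sketch rather than a verified fix.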
Search before asking
Operating system information
Windows
Python version information
3.10 (inferred from the cp310 wheel in the log below)
DB-GPT version
main
Related scenes
Installation Information
Installation From Source
Device information
NVIDIA GPU 24GB VRAM
Models information
LLAMA CPP
What happened
(dbgpt_env) D:\DB-GPT-main\DB-GPT-main>pip install -e ".[llama_cpp]"
Obtaining file:///D:/DB-GPT-main/DB-GPT-main
Preparing metadata (setup.py) ... done
WARNING: dbgpt 0.5.3 does not provide the extra 'llama-cpp'
Collecting llama_cpp_python_cuda@ https://github.com/jllllll/llama-cpp-python-cuBLAS-wheels/releases/download/textgen-webui/llama_cpp_python_cuda-0.2.10%2Bcu123avx-cp310-cp310-win_amd64.whl
ERROR: HTTP error 404 while getting https://github.com/jllllll/llama-cpp-python-cuBLAS-wheels/releases/download/textgen-webui/llama_cpp_python_cuda-0.2.10%2Bcu123avx-cp310-cp310-win_amd64.whl
ERROR: Could not install requirement llama_cpp_python_cuda@ https://github.com/jllllll/llama-cpp-python-cuBLAS-wheels/releases/download/textgen-webui/llama_cpp_python_cuda-0.2.10%2Bcu123avx-cp310-cp310-win_amd64.whl from https://github.com/jllllll/llama-cpp-python-cuBLAS-wheels/releases/download/textgen-webui/llama_cpp_python_cuda-0.2.10%2Bcu123avx-cp310-cp310-win_amd64.whl because of HTTP error 404 Client Error: Not Found for url: https://github.com/jllllll/llama-cpp-python-cuBLAS-wheels/releases/download/textgen-webui/llama_cpp_python_cuda-0.2.10%2Bcu123avx-cp310-cp310-win_amd64.whl for URL https://github.com/jllllll/llama-cpp-python-cuBLAS-wheels/releases/download/textgen-webui/llama_cpp_python_cuda-0.2.10%2Bcu123avx-cp310-cp310-win_amd64.whl
As a result, LLAMA CPP does not install.
What you expected to happen
LLAMA CPP package should install
How to reproduce
Install from source with the default settings.
Try to install llama cpp: pip install -e ".[llama_cpp]" (see the commands below).
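For reference, a minimal reproduction on Windows might look like the following (the D:\DB-GPT-main path and the dbgpt_env conda environment are taken from the log above; the source tree can equally be obtained via git clone):

```bat
:: Reproduction sketch based on the environment shown in the log.
cd D:\DB-GPT-main\DB-GPT-main
conda activate dbgpt_env
pip install -e ".[llama_cpp]"
```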
Additional context
No response
Are you willing to submit PR?