[Bug] [LLAMA CPP] LLAMA CPP not working in 0.5.3 #1376

Closed
3 of 15 tasks
wallartup opened this issue Apr 4, 2024 · 1 comment · Fixed by #1374
Labels
bug (Something isn't working), Waiting for reply

Comments

@wallartup

Search before asking

  • I searched the issues and found no similar issues.

Operating system information

Windows

Python version information

=3.11

DB-GPT version

main

Related scenes

  • Chat Data
  • Chat Excel
  • Chat DB
  • Chat Knowledge
  • Model Management
  • Dashboard
  • Plugins

Installation Information

Device information

NVIDIA GPU 24GB VRAM

Models information

LLAMA CPP

What happened

(dbgpt_env) D:\DB-GPT-main\DB-GPT-main>pip install -e ".[llama_cpp]"
Obtaining file:///D:/DB-GPT-main/DB-GPT-main
Preparing metadata (setup.py) ... done
WARNING: dbgpt 0.5.3 does not provide the extra 'llama-cpp'
Collecting llama_cpp_python_cuda@ https://github.com/jllllll/llama-cpp-python-cuBLAS-wheels/releases/download/textgen-webui/llama_cpp_python_cuda-0.2.10%2Bcu123avx-cp310-cp310-win_amd64.whl
ERROR: HTTP error 404 while getting https://github.com/jllllll/llama-cpp-python-cuBLAS-wheels/releases/download/textgen-webui/llama_cpp_python_cuda-0.2.10%2Bcu123avx-cp310-cp310-win_amd64.whl
ERROR: Could not install requirement llama_cpp_python_cuda@ https://github.com/jllllll/llama-cpp-python-cuBLAS-wheels/releases/download/textgen-webui/llama_cpp_python_cuda-0.2.10%2Bcu123avx-cp310-cp310-win_amd64.whl from https://github.com/jllllll/llama-cpp-python-cuBLAS-wheels/releases/download/textgen-webui/llama_cpp_python_cuda-0.2.10%2Bcu123avx-cp310-cp310-win_amd64.whl because of HTTP error 404 Client Error: Not Found for url: https://github.com/jllllll/llama-cpp-python-cuBLAS-wheels/releases/download/textgen-webui/llama_cpp_python_cuda-0.2.10%2Bcu123avx-cp310-cp310-win_amd64.whl for URL https://github.com/jllllll/llama-cpp-python-cuBLAS-wheels/releases/download/textgen-webui/llama_cpp_python_cuda-0.2.10%2Bcu123avx-cp310-cp310-win_amd64.whl

The llama.cpp extra fails to install.
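
For context on where the bad URL comes from: the `llama_cpp` extra in setup.py points pip at a prebuilt `llama_cpp_python_cuda` wheel whose filename encodes the local CUDA and Python versions, and no `cu123` wheel was ever published on the jllllll/llama-cpp-python-cuBLAS-wheels release page, so GitHub answers with the 404 above. The snippet below is only an illustrative sketch of how such a URL can be assembled; `llama_cpp_cuda_wheel_url` and its constants are hypothetical names, not DB-GPT's actual setup.py code.

```python
# Illustrative sketch (not the actual DB-GPT setup.py): how a setup script
# might assemble a prebuilt llama_cpp_python_cuda wheel URL from the local
# environment. If the resulting tag (e.g. cu123) has no published wheel,
# pip fails with the HTTP 404 shown in the log above.
import platform

BASE = (
    "https://github.com/jllllll/llama-cpp-python-cuBLAS-wheels/"
    "releases/download/textgen-webui"
)

def llama_cpp_cuda_wheel_url(cuda_version: str, pkg_version: str = "0.2.10") -> str:
    """Build a direct-download URL for a llama_cpp_python_cuda wheel."""
    cuda_tag = "cu" + cuda_version.replace(".", "")                    # "12.3" -> "cu123"
    py_tag = "cp{0}{1}".format(*platform.python_version_tuple()[:2])   # e.g. "cp310"
    plat = "win_amd64"  # platform tag also varies by OS; Windows shown to match the log
    return (
        f"{BASE}/llama_cpp_python_cuda-{pkg_version}+{cuda_tag}avx"
        f"-{py_tag}-{py_tag}-{plat}.whl"
    )

if __name__ == "__main__":
    # With CUDA 12.3 this prints a cu123 URL; no such wheel was published,
    # which is exactly the 404 reported above.
    print(llama_cpp_cuda_wheel_url("12.3"))
```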

What you expected to happen

LLAMA CPP package should install

How to reproduce

Install from source code with the default setup.

Try to install the llama.cpp extra: pip install -e ".[llama_cpp]"

Additional context

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!
wallartup added the bug (Something isn't working) and Waiting for reply labels on Apr 4, 2024
@yyhhyyyyyy
Contributor

@wallartup Hi, looking at your error, I see you're using an environment with CUDA 12.3.
llama_cpp_python_cuda currently only supports up to CUDA 12.2, which is why the 404 Not Found error occurred.
Please wait for the next version update; we will fix this issue.
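
The actual change is in the pull request referenced below and is not reproduced here. As a hedged sketch based only on the statement above that prebuilt wheels exist for CUDA 12.2 and below (`llama_cpp_requirement` and `MAX_WHEEL_CUDA` are hypothetical names, not DB-GPT code), the general shape of such a guard is to fall back to a plain source build of `llama-cpp-python` whenever no prebuilt wheel matches, so pip never requests a URL that does not exist:

```python
# Illustrative sketch only; the real fix lives in the referenced PR.
# Idea: use the prebuilt CUDA wheel only when the detected CUDA version
# has a published wheel (<= 12.2 per the comment above); otherwise let pip
# build llama-cpp-python from source.

MAX_WHEEL_CUDA = (12, 2)  # assumption: newest CUDA with published prebuilt wheels

def llama_cpp_requirement(cuda_version: str | None) -> str:
    """Return the pip requirement string for the llama_cpp extra."""
    if cuda_version is not None:
        major, minor = (int(x) for x in cuda_version.split(".")[:2])
        if (major, minor) <= MAX_WHEEL_CUDA:
            cuda_tag = f"cu{major}{minor}"
            return (
                "llama_cpp_python_cuda @ "
                "https://github.com/jllllll/llama-cpp-python-cuBLAS-wheels/"
                "releases/download/textgen-webui/"
                f"llama_cpp_python_cuda-0.2.10+{cuda_tag}avx-cp310-cp310-win_amd64.whl"
            )
    # CUDA 12.3+ (or no CUDA detected): fall back to a source build.
    return "llama-cpp-python"

# Example: CUDA 12.3 falls back to the source build instead of a dead URL.
assert llama_cpp_requirement("12.3") == "llama-cpp-python"
assert "cu122" in llama_cpp_requirement("12.2")
```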

yyhhyyyyyy added a commit to yyhhyyyyyy/DB-GPT that referenced this issue on Apr 8, 2024