
Error When Loading Model with llama_cpp: [WinError -1073741795] Windows Error 0xc000001d #1660

Open
ltkien2003 opened this issue Aug 6, 2024 · 0 comments
Labels
bug Something isn't working

Comments

@ltkien2003

I am experiencing an error when trying to load a model using the llama_cpp library in a Python application packaged with PyInstaller. The error message is as follows:

Traceback (most recent call last):
  File "app.py", line 25, in <module>
  File "server\processor\text_translator.py", line 10, in __init__
  File "llama_cpp\llama.py", line 372, in __init__
  File "llama_cpp\_internals.py", line 50, in __init__
OSError: [WinError -1073741795] Windows Error 0xc000001d [17724]
Failed to execute script 'app'

This issue occurs specifically when loading the model with llama_cpp.
Could you provide guidance on how to resolve this error? Is there a known issue with llama_cpp on Windows, or could this be related to compatibility problems with PyInstaller?
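For context, Windows error 0xc000001d is STATUS_ILLEGAL_INSTRUCTION, which usually means the compiled llama.cpp binary uses CPU instructions (commonly AVX/AVX2/FMA) that the machine running the packaged app does not support. One possible workaround, sketched below under the assumption that the app's CPU lacks those extensions, is to rebuild llama-cpp-python from source with them disabled before repackaging with PyInstaller. The exact CMake flag names vary by version (newer releases use the GGML_* prefix, older ones LLAMA_*):

```shell
REM Rebuild llama-cpp-python with AVX/AVX2/FMA disabled so the compiled
REM binary also runs on CPUs that lack those instruction sets.
REM Flag prefix is version-dependent: newer builds use GGML_*, older LLAMA_*.
set CMAKE_ARGS=-DGGML_AVX=OFF -DGGML_AVX2=OFF -DGGML_FMA=OFF
pip install --force-reinstall --no-cache-dir llama-cpp-python
```

This trades throughput for portability, so it is mainly useful when the target machine is older than the build machine; if the model loads after this rebuild, an instruction-set mismatch (rather than PyInstaller itself) was the likely cause.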

@abetlen abetlen added the bug Something isn't working label Aug 7, 2024