
Pre-built cpu wheel does not work on Ubuntu due to libc.musl dependency #1628

Open
4 tasks done
OKUA1 opened this issue Jul 27, 2024 · 6 comments
Labels
bug Something isn't working

Comments


OKUA1 commented Jul 27, 2024

Prerequisites

Please answer the following questions for yourself before submitting an issue.

  • I am running the latest code. Development is very rapid so there are no tagged versions as of now.
  • I carefully followed the README.md.
  • I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
  • I reviewed the Discussions, and have a new bug or useful enhancement to share.

Expected Behavior

The pre-built CPU wheels should work out of the box on Ubuntu.

Current Behavior

The pre-built CPU wheels depend on libc.musl, which is not available by default on most popular Linux distributions.

The following external shared libraries are required by the wheel:
{
    "libc.musl-x86_64.so.1": null,
    "libgcc_s.so.1": null,
    "libggml.so": null,
    "libgomp.so.1": null,
    "libllama.so": null,
    "libstdc++.so.6": null
}

Attempting to import llama_cpp results in the following error:

RuntimeError: Failed to load shared library '/usr/local/lib/python3.10/dist-packages/llama_cpp/lib/libllama.so': libc.musl-x86_64.so.1: cannot open shared object file: No such file or directory
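A quick way to confirm which C library the host Python environment is actually linked against is the standard library's `platform.libc_ver()` (a diagnostic sketch, not part of the original report):

```python
import platform

# Report the C library the running interpreter is linked against.
# On a glibc-based distro such as Ubuntu this typically returns
# something like ("glibc", "2.35"); on musl-based systems the result
# may be empty, since libc_ver() primarily detects glibc.
lib, version = platform.libc_ver()
print(lib or "(unknown libc)", version)
```

A glibc result here, combined with the wheel's musl dependency, is exactly the mismatch that triggers the error above.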

At the same time, the cuda wheel does not depend on musl and works out of the box on the same system.

The following external shared libraries are required by the wheel:
{
    "libc.so.6": null,
    "libcublas.so.12": null,
    "libcuda.so.1": null,
    "libcudart.so.12": null,
    "libgcc_s.so.1": null,
    "libggml.so": null,
    "libgomp.so.1": null,
    "libllama.so": null,
    "libm.so.6": null,
    "libstdc++.so.6": null
}
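Comparing the two dependency sets makes the difference explicit; a small sketch using the listings above verbatim:

```python
# Dependency lists copied from the two wheel inspections above.
cpu_deps = {
    "libc.musl-x86_64.so.1", "libgcc_s.so.1", "libggml.so",
    "libgomp.so.1", "libllama.so", "libstdc++.so.6",
}
cuda_deps = {
    "libc.so.6", "libcublas.so.12", "libcuda.so.1", "libcudart.so.12",
    "libgcc_s.so.1", "libggml.so", "libgomp.so.1", "libllama.so",
    "libm.so.6", "libstdc++.so.6",
}

# The only library the CPU wheel needs that the CUDA wheel does not:
print(sorted(cpu_deps - cuda_deps))  # ['libc.musl-x86_64.so.1']
```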

Environment and Context

Ubuntu 22.04 / Ubuntu 24.04

llama-cpp-python 0.2.82 / 0.2.83

Steps to Reproduce

Please provide detailed steps for reproducing the issue. We are not sitting in front of your screen, so the more detail the better.

  1. Install a pre-built CPU wheel
  2. Try to import llama_cpp

Failure Logs

OSError                                   Traceback (most recent call last)

/usr/local/lib/python3.10/dist-packages/llama_cpp/llama_cpp.py in _load_shared_library(lib_base_name)
     74             try:
---> 75                 return ctypes.CDLL(str(_lib_path), **cdll_args)  # type: ignore
     76             except Exception as e:

/usr/lib/python3.10/ctypes/__init__.py in __init__(self, name, mode, handle, use_errno, use_last_error, winmode)
    373         if handle is None:
--> 374             self._handle = _dlopen(self._name, mode)
    375         else:

OSError: libc.musl-x86_64.so.1: cannot open shared object file: No such file or directory


During handling of the above exception, another exception occurred:

RuntimeError                              Traceback (most recent call last)

<ipython-input-17-c8c7f50702fd> in <cell line: 1>()
----> 1 import llama_cpp

/usr/local/lib/python3.10/dist-packages/llama_cpp/__init__.py in <module>
----> 1 from .llama_cpp import *
      2 from .llama import *
      3 
      4 __version__ = "0.2.83"

/usr/local/lib/python3.10/dist-packages/llama_cpp/llama_cpp.py in <module>
     86 
     87 # Load the library
---> 88 _lib = _load_shared_library(_lib_base_name)
     89 
     90 

/usr/local/lib/python3.10/dist-packages/llama_cpp/llama_cpp.py in _load_shared_library(lib_base_name)
     75                 return ctypes.CDLL(str(_lib_path), **cdll_args)  # type: ignore
     76             except Exception as e:
---> 77                 raise RuntimeError(f"Failed to load shared library '{_lib_path}': {e}")
     78 
     79     raise FileNotFoundError(

RuntimeError: Failed to load shared library '/usr/local/lib/python3.10/dist-packages/llama_cpp/lib/libllama.so': libc.musl-x86_64.so.1: cannot open shared object file: No such file or directory

gaby commented Jul 28, 2024

Yeah, for some reason the wheels are being built with musl instead of glibc. My fix was to do this:

apt install musl-dev
ln -s /usr/lib/x86_64-linux-musl/libc.so /lib/libc.musl-x86_64.so.1
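A minimal way to check whether the missing soname now resolves (a diagnostic sketch; on a system without the symlink or musl, the dlopen fails with exactly the error reported above):

```python
import ctypes

# Try to dlopen the soname the CPU wheel requires. If the workaround
# symlink (or a real musl libc) is present, this succeeds; otherwise
# the loader raises OSError, mirroring the failure in the report.
try:
    ctypes.CDLL("libc.musl-x86_64.so.1")
    print("libc.musl resolved")
except OSError as exc:
    print(f"not resolved: {exc}")
```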


OKUA1 commented Jul 28, 2024

@gaby yes, I did the same and it works fine. But I would consider that more of a workaround than a long-term solution.


jcuenod commented Aug 5, 2024

I would definitely consider this a workaround. I am using this library to reduce a Docker image size, so I don't want to install musl.

Probably related: #1507

@abetlen abetlen added the bug Something isn't working label Aug 7, 2024
@arpesenti

It looks like the issue has been fixed in the latest version (0.2.88). Thanks!


jcuenod commented Aug 14, 2024

Note that the prebuilt wheel is not there for 0.2.88...


OKUA1 commented Aug 15, 2024

@arpesenti I don't think it was fixed.

The following external shared libraries are required by the wheel:
{
    "libc.musl-x86_64.so.1": null,
    "libgcc_s.so.1": null,
    "libggml.so": null,
    "libgomp.so.1": null,
    "libllama.so": null,
    "libstdc++.so.6": null
}
