Pre-built cpu wheel does not work on Ubuntu due to libc.musl dependency #1628
Closed
Labels: bug (Something isn't working)
Description
Prerequisites
Please answer the following questions for yourself before submitting an issue.
- I am running the latest code. Development is very rapid so there are no tagged versions as of now.
- I carefully followed the README.md.
- I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
- I reviewed the Discussions, and have a new bug or useful enhancement to share.
Expected Behavior
The pre-built cpu wheels should work out of the box on Ubuntu.
Current Behavior
The pre-built cpu wheels depend on libc.musl, which is generally not available on most popular Linux distributions.
The following external shared libraries are required by the wheel:
{
"libc.musl-x86_64.so.1": null,
"libgcc_s.so.1": null,
"libggml.so": null,
"libgomp.so.1": null,
"libllama.so": null,
"libstdc++.so.6": null
}
Attempting to import llama_cpp results in the following error:
RuntimeError: Failed to load shared library '/usr/local/lib/python3.10/dist-packages/llama_cpp/lib/libllama.so': libc.musl-x86_64.so.1: cannot open shared object file: No such file or directory
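The missing-library claim is easy to verify directly with ctypes, independent of llama_cpp itself (a minimal sketch; the library name is taken verbatim from the error above):

```python
import ctypes

# Try to dlopen the musl C library that the pre-built cpu wheel links against.
# On glibc-based distros such as Ubuntu this raises OSError, which is the
# same failure that surfaces when importing llama_cpp.
try:
    ctypes.CDLL("libc.musl-x86_64.so.1")
    musl_available = True
except OSError:
    musl_available = False

print("musl present:", musl_available)
```

On a stock Ubuntu 22.04/24.04 install this prints `musl present: False`, confirming that any wheel linked against musl cannot load there.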
At the same time, the cuda wheel does not depend on musl and works out of the box on the same system.
The following external shared libraries are required by the wheel:
{
"libc.so.6": null,
"libcublas.so.12": null,
"libcuda.so.1": null,
"libcudart.so.12": null,
"libgcc_s.so.1": null,
"libggml.so": null,
"libgomp.so.1": null,
"libllama.so": null,
"libm.so.6": null,
"libstdc++.so.6": null
}
Environment and Context
Ubuntu 22.04 / Ubuntu 24.04
llama-cpp-python 0.2.82 / 0.2.83
Steps to Reproduce
Please provide detailed steps for reproducing the issue. We are not sitting in front of your screen, so the more detail the better.
- Install a pre-built cpu wheel
- try to import llama_cpp
Failure Logs
OSError Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/llama_cpp/llama_cpp.py in _load_shared_library(lib_base_name)
74 try:
---> 75 return ctypes.CDLL(str(_lib_path), **cdll_args) # type: ignore
76 except Exception as e:
/usr/lib/python3.10/ctypes/__init__.py in __init__(self, name, mode, handle, use_errno, use_last_error, winmode)
373 if handle is None:
--> 374 self._handle = _dlopen(self._name, mode)
375 else:
OSError: libc.musl-x86_64.so.1: cannot open shared object file: No such file or directory
During handling of the above exception, another exception occurred:
RuntimeError Traceback (most recent call last)
<ipython-input-17-c8c7f50702fd> in <cell line: 1>()
----> 1 import llama_cpp
/usr/local/lib/python3.10/dist-packages/llama_cpp/__init__.py in <module>
----> 1 from .llama_cpp import *
2 from .llama import *
3
4 __version__ = "0.2.83"
/usr/local/lib/python3.10/dist-packages/llama_cpp/llama_cpp.py in <module>
86
87 # Load the library
---> 88 _lib = _load_shared_library(_lib_base_name)
89
90
/usr/local/lib/python3.10/dist-packages/llama_cpp/llama_cpp.py in _load_shared_library(lib_base_name)
75 return ctypes.CDLL(str(_lib_path), **cdll_args) # type: ignore
76 except Exception as e:
---> 77 raise RuntimeError(f"Failed to load shared library '{_lib_path}': {e}")
78
79 raise FileNotFoundError(
RuntimeError: Failed to load shared library '/usr/local/lib/python3.10/dist-packages/llama_cpp/lib/libllama.so': libc.musl-x86_64.so.1: cannot open shared object file: No such file or directory