Runtime error
Exit code: 1. Reason:
1-3b-GGUF/resolve/main/granite-4.1-3b-UD-IQ2_M.gguf "HTTP/1.1 302 Found"
INFO: HTTP Request: GET https://huggingface.co/api/models/unsloth/granite-4.1-3b-GGUF/xet-read-token/5b88826e4b80789548180f8faab39c5cf68772c9 "HTTP/1.1 200 OK"
INFO: Model ready: /root/.cache/huggingface/hub/models--unsloth--granite-4.1-3b-GGUF/snapshots/5b88826e4b80789548180f8faab39c5cf68772c9/granite-4.1-3b-UD-IQ2_M.gguf
ERROR: Critical error: Failed to load shared library '/usr/local/lib/python3.11/site-packages/llama_cpp/lib/libllama.so': libc.musl-x86_64.so.1: cannot open shared object file: No such file or directory
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/llama_cpp/_ctypes_extensions.py", line 67, in load_shared_library
    return ctypes.CDLL(str(lib_path), **cdll_args)  # type: ignore
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/ctypes/__init__.py", line 376, in __init__
    self._handle = _dlopen(self._name, mode)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^
OSError: libc.musl-x86_64.so.1: cannot open shared object file: No such file or directory

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/app.py", line 44, in <module>
    from llama_cpp.server.app import create_app
  File "/usr/local/lib/python3.11/site-packages/llama_cpp/__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "/usr/local/lib/python3.11/site-packages/llama_cpp/llama_cpp.py", line 38, in <module>
    _lib = load_shared_library(_lib_base_name, _base_path)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/llama_cpp/_ctypes_extensions.py", line 69, in load_shared_library
    raise RuntimeError(f"Failed to load shared library '{lib_path}': {e}")
RuntimeError: Failed to load shared library '/usr/local/lib/python3.11/site-packages/llama_cpp/lib/libllama.so': libc.musl-x86_64.so.1: cannot open shared object file: No such file or directory

(Log messages translated from Polish; original: "Model gotowy" = "Model ready", "Krytyczny błąd" = "Critical error".)
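The failing library is libc.musl-x86_64.so.1, i.e. the bundled libllama.so was linked against musl (Alpine's libc), while the container is presumably a glibc-based image where that loader library does not exist. A minimal diagnostic sketch (not from the original report) to confirm which libc the Python runtime actually uses:

```python
import ctypes.util
import platform

# Report the libc the interpreter was built/run against.
# On a glibc image (Debian/Ubuntu base) this returns something like
# ('glibc', '2.36'); on Alpine/musl the version string is typically empty.
libc_name, libc_version = platform.libc_ver()
print("libc:", libc_name, libc_version)

# Resolve the system C library; on glibc this is usually 'libc.so.6',
# which is incompatible with a wheel expecting 'libc.musl-x86_64.so.1'.
print("resolved C library:", ctypes.util.find_library("c"))
```

If the interpreter reports glibc, one common remedy is to reinstall llama-cpp-python with `pip install --no-cache-dir --no-binary llama-cpp-python llama-cpp-python`, forcing a source build that links libllama.so against the container's own libc (build tools such as cmake and a C++ compiler must be present in the image).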