Ollama Error Loading

#3
by stardomains - opened

ollama run hf.co/unsloth/Ministral-3-14B-Instruct-2512-GGUF:Q4_K_XL

Error: 500 Internal Server Error: unable to load model: /Users/whatever/.ollama/models/blobs/sha256-70b28fa330d8b4231f772ae5....

I've never had this issue before with Unsloth UD GGUFs on Ollama. This is Ollama 0.13.2 on Mac.

The ministral-3:14b model from the Ollama model page runs fine.

I believe it's related to this issue: https://github.com/ollama/ollama/issues/13321


Thanks. Yup. FIX: I copied the blobs and gave them a .gguf extension, then served them with the latest llama.cpp, and all is good. This and the new Unsloth Devstral 24B are both working fine now.
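For anyone hitting the same error, the workaround above looks roughly like this. Ollama stores downloaded model weights as extension-less blob files under `~/.ollama/models/blobs`, and those blobs are already plain GGUF files, so a copy under a `.gguf` name is all llama.cpp needs. All paths and names here are illustrative; use the `sha256-...` hash from your own error message (the demo below uses a stand-in blobs directory so the commands are runnable anywhere):

```shell
# Real-world version (hash illustrative -- use the one from your error):
#   cp ~/.ollama/models/blobs/sha256-<your-hash> model.gguf
#   llama-server -m model.gguf --port 8080

# Runnable demo with a stand-in blobs directory:
blobs=$(mktemp -d)
printf 'GGUF demo bytes' > "$blobs/sha256-demo"   # stand-in for the real blob

# The actual workaround step: copy the blob under a .gguf name.
cp "$blobs/sha256-demo" "$blobs/model.gguf"

ls "$blobs"
```

A hard link (`ln` instead of `cp`) also works and avoids duplicating a multi-GB file on disk.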
