Ollama Error

#16
by edm-research - opened

When I tried to run the model with Ollama, I got this error: Error: 500 Internal Server Error: unable to load model

I'm getting the same error with Ollama.

Unsloth AI org

> When I tried to run the model with Ollama, I got this error: Error: 500 Internal Server Error: unable to load model
>
> I'm getting the same error with Ollama.

GGUFs with separate mmproj files are not supported in Ollama. Use a llama.cpp-supported backend instead.
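As a sketch of the llama.cpp route (assuming a recent llama.cpp build with multimodal support; the `-hf` download shorthand is a llama.cpp feature, and whether the mmproj file is picked up automatically for this particular repo is an assumption):

```shell
# Sketch, not verified against this exact repo: serve the GGUF with llama.cpp,
# which can load a separate mmproj (vision projector) file alongside the model.
# -hf pulls the model from Hugging Face; adjust the tag to the quant you want.
llama-server -hf unsloth/gemma-4-26B-A4B-it-GGUF:UD-Q4_K_M --port 8080
```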

Meanwhile, you can try to "remove" the vision model part:

Run this command to create a Modelfile:
ollama show --modelfile hf.co/unsloth/gemma-4-26B-A4B-it-GGUF:UD-Q4_K_M > gemma_4_unsloth_modelfile

Then open the gemma_4_unsloth_modelfile and add a # character in front of the second FROM line:

# Modelfile generated by "ollama show"
# To build a new Modelfile based on this, replace FROM with:
# FROM hf.co/unsloth/gemma-4-26B-A4B-it-GGUF:UD-Q4_K_M

FROM /root/.ollama/models/blobs/sha256-2f8672b0c2cca8dedfb8782815c2769ccdaa6512788f3ee87b32cf117f0dffc1
#FROM /root/.ollama/models/blobs/sha256-fc2ebf4c44528daa2cea7b39891712847ca5e4f87dcf578054a06c46bfe6da27
TEMPLATE "{{ if .System }}<bos><|turn>system
{{ .System }}<turn|>
{{ end }}{{ if .Prompt }}<|turn>user
{{ .Prompt }}<turn|>
{{ end }}<|turn>model
{{ .Response }}<turn|>
"
PARAMETER stop <bos>
PARAMETER stop <|turn>
PARAMETER stop <turn|>
PARAMETER stop <|turn>user
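If you'd rather not edit the file by hand, here is a small sketch of the same edit done with sed (assumes GNU sed, and that the vision projector is always the second FROM entry, as in the generated Modelfile above; the two-line file below is just a stand-in for the real one):

```shell
# Demo on a stand-in Modelfile: comment out every FROM line after the first,
# leaving only the text-model blob active. GNU sed's 0,/re/ range covers the
# start of the file through the first FROM match; the ! applies the
# substitution to the remaining FROM lines only.
printf 'FROM text-model-blob\nFROM mmproj-blob\n' > gemma_4_unsloth_modelfile
sed -i '0,/^FROM /!s/^FROM /#FROM /' gemma_4_unsloth_modelfile
cat gemma_4_unsloth_modelfile
```

The second line is the reusable part: run it against the Modelfile generated by `ollama show` and it comments out the mmproj FROM line while leaving the first one intact.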

Then run:

ollama create gemma-4-unsloth -f gemma_4_unsloth_modelfile

Now you can run the Unsloth version without vision:

ollama run gemma-4-unsloth

From this [GitHub issue](https://github.com/ollama/ollama/issues/15235#issuecomment-4187108500).

Just had the same 500 problem after downloading it with Ollama.
What I don't understand: Ollama offers the gemma4:26b model in its internal selection, and that one works with images. So is the one they're offering using something different to make vision work?

GGUF won't work in Ollama, I believe.
And oddly enough, some of the custom gemma4 26B builds don't have vision? So I'm pulling gemma4:26b-a4b-it-q4_K_M via Ollama and I'll ping back with findings.
https://ollama.com/library/gemma4:26b-a4b-it-q4_K_M

Hopefully it has vision and works fast.
