Works from the ollama command line, doesn't work through OpenWebUI. Why?

#1
by ntsarb - opened

Hello,
I tested the model successfully from a local ollama instance using the CLI client, but when I try to use it from OpenWebUI (through the same ollama server), it doesn't work: the model gets loaded, but no response is returned.
[Screenshot: OpenWebUI.png, showing the model loaded in OpenWebUI with no response returned]
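
To narrow this down, the same Ollama server can be queried over HTTP directly, bypassing OpenWebUI entirely. A minimal sketch (the model tag below is a placeholder; substitute whatever `ollama list` reports):

```bash
# Query the Ollama HTTP API directly, bypassing OpenWebUI entirely.
# "medgemma-4b-it-q8" is a placeholder; use the tag shown by `ollama list`.
curl http://localhost:11434/api/generate \
  -d '{"model": "medgemma-4b-it-q8", "prompt": "Hello", "stream": false}'
```

If this returns a completion, the Ollama API layer is fine and the problem sits between OpenWebUI and Ollama.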

Let me know if there is additional information I can provide to resolve this.
Kind regards

Unsloth AI org

Are you using Ollama or llama.cpp for OpenWebUI?

Ollama in WSL2.
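
For readers hitting the same wall: when OpenWebUI runs as a Docker container while Ollama lives inside WSL2, the container often cannot reach Ollama's default loopback-only socket. A sketch of one configuration that works around this, assuming a Docker-based OpenWebUI install and a manually started Ollama (the flags follow the standard OpenWebUI run command):

```bash
# Make Ollama listen on all interfaces instead of loopback only
# (assumption: Ollama is started manually here, not managed by systemd).
export OLLAMA_HOST=0.0.0.0
ollama serve

# Point a Docker-based OpenWebUI at the host's Ollama endpoint.
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data --name open-webui \
  ghcr.io/open-webui/open-webui:main
```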

try this: https://ollama.com/amsaravi/medgemma-4b-it-q8

Thanks for your response. The link doesn't work (it returns "404. That's an error. The page was not found."), but the model itself now works with the latest versions of ollama and OpenWebUI.

ntsarb changed discussion status to closed
