UD models not compatible with ollama and openwebui?

#11
by alejandroparbas - opened

Sorry, I'm a bit new to quantized models, but when I try to run models like Qwen3-32B-GGUF:Q3_K_XL in Ollama and Open WebUI I get this error: '400: hf.co/unsloth/Qwen3-32B-GGUF:Q3_K_XL does not support thinking. Pull the model again to get the latest version with full thinking support'.

Is this intentional? Other models like Qwen3-8B-GGUF:Q4_K_XL seem to work correctly.
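For anyone hitting the same error: the message itself suggests the locally cached copy of the model predates Ollama's thinking support (e.g. an older chat template). A fix that often works, sketched here assuming the standard Ollama CLI and the model tag from the post above, is to remove the cached model and pull it again:

```shell
# Remove the stale local copy (model tag taken from the error message above)
ollama rm hf.co/unsloth/Qwen3-32B-GGUF:Q3_K_XL

# Pull again to fetch the latest GGUF with the updated chat template
ollama pull hf.co/unsloth/Qwen3-32B-GGUF:Q3_K_XL
```

If the error persists after re-pulling, the uploaded GGUF itself may still carry a chat template without thinking support, in which case it is worth checking whether the repo has been updated since you first downloaded it.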

alejandroparbas changed discussion status to closed