Qwen3ForCausalLM
#13
by cr-gkn - opened
Anyone getting this error?
converting model
Error: unsupported architecture "Qwen3ForCausalLM"
I checked in many places, but the only available architecture is "Qwen2ForCausalLM".
The latest architecture gives an error when installing the model with Ollama!
I managed to install Qwen 2.5 successfully, but Qwen 3 is a problem because of the new architecture.
Anyone facing the same issue?
I'm new to this area, but from what I've found, Ollama does not support Qwen3ForCausalLM right now, and it might be fixed soon.
A usable workaround is to use llama.cpp to convert the model to GGUF and then import that. (ref)
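If it helps, the workaround above would look roughly like this. This is only a sketch: the model path, the output filename, and the model name `qwen3-local` are placeholders I made up, and the exact convert script name can vary between llama.cpp versions.

```shell
# Sketch of the llama.cpp -> GGUF -> Ollama route (paths are placeholders).

# 1. Get llama.cpp; recent versions of its converter understand the
#    Qwen3ForCausalLM architecture.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
pip install -r requirements.txt

# 2. Convert the downloaded Hugging Face checkpoint to GGUF.
#    (Script name in current llama.cpp; older releases used a dashed name.)
python convert_hf_to_gguf.py /path/to/Qwen3-model --outfile qwen3.gguf

# 3. Point an Ollama Modelfile at the GGUF and create/run the model.
echo 'FROM ./qwen3.gguf' > Modelfile
ollama create qwen3-local -f Modelfile
ollama run qwen3-local
```

This sidesteps Ollama's own converter, which is what rejects the new architecture string.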
I'm just trying to build an environment for LLM coding tools, so for now I'm using other models to avoid the mess.