Does this model support tool calls?
I tried using it in an Ollama harness, but no success with tool calls (the kind that openclaw requires).
Here is what my curl call to Ollama looks like:
curl -X POST http://localhost:11434/api/chat -d '{
"model": "hf.co/TeichAI/GLM-4.7-Flash-Claude-Opus-4.5-High-Reasoning-Distill-GGUF:Q3_K_M",
"messages": [{"role": "user", "content": "What is 2+2?"}],
"tools": [
{
"type": "function",
"function": {
"name": "add",
"description": "Add two numbers",
"parameters": {
"type": "object",
"properties": {
"a": {"type": "number"},
"b": {"type": "number"}
},
"required": ["a", "b"]
}
}
}
],
"stream": false
}' -H "Content-Type: application/json"
{"error":"hf.co/TeichAI/GLM-4.7-Flash-Claude-Opus-4.5-High-Reasoning-Distill-GGUF:Q3_K_M does not support tools"}
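In case it's useful for reproducing, here's the same request as a Python sketch (stdlib only, hitting the same local Ollama endpoint; the payload matches the curl above exactly):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"

def build_payload(model: str) -> dict:
    # Same tool schema as the curl call above.
    return {
        "model": model,
        "messages": [{"role": "user", "content": "What is 2+2?"}],
        "tools": [{
            "type": "function",
            "function": {
                "name": "add",
                "description": "Add two numbers",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "a": {"type": "number"},
                        "b": {"type": "number"},
                    },
                    "required": ["a", "b"],
                },
            },
        }],
        "stream": False,
    }

def chat(model: str) -> dict:
    # POST the JSON payload to the local Ollama server and parse the reply.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(chat("hf.co/TeichAI/GLM-4.7-Flash-Claude-Opus-4.5-High-Reasoning-Distill-GGUF:Q3_K_M"))
```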
Hmm, yea, could be something with the Ollama Modelfile. I will look into a fix.
Yea, so the issue here is that we need to create the Ollama model using the provided Modelfile. You will want to download the GGUF file first.
Then, from the directory where the downloaded file is located, create the Ollama model using this Modelfile:
ollama create TeichAI-GLM-4.7-Flash-Claude-Opus-4.5-Distill -f ./Modelfile
Load the model using TeichAI-GLM-4.7-Flash-Claude-Opus-4.5-Distill instead of hf.co/TeichAI/GLM-4.7-Flash-Claude-Opus-4.5-High-Reasoning-Distill-GGUF:Q3_K_M.
If you want to use any other quants, you will need to edit the top line of the Modelfile accordingly. Hopefully this helps and resolves your issue :)
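For anyone else setting this up: a minimal Modelfile for a local GGUF usually looks something like the sketch below. The filename here is an assumption; match it to the quant you actually downloaded, and prefer the repo's provided Modelfile if it includes more than this.

```
# Top line: points at the local GGUF weights.
# Swap the filename here to use a different quant (e.g. a Q4_K_M file).
FROM ./GLM-4.7-Flash-Claude-Opus-4.5-High-Reasoning-Distill-Q3_K_M.gguf
```

Note that Ollama decides tool-call support from the model's chat template, so if the provided Modelfile includes a TEMPLATE block, keep it rather than using a bare FROM line.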
I am suggesting this purely based on this discussion I found, as I don't use Ollama: https://huggingface.co/unsloth/GLM-4.7-Flash-GGUF/discussions/23
Please let me know if this works as I'm not sure if that fixed the issue
Glad to hear that worked!
Also, I'm not really sure... I could update the model README with further Ollama instructions. I think HF controls those things.
Thanks! Works!

