Does this model support tool calls?

#10
by vivganes - opened

I tried using it in an Ollama harness, but had no success with tool calls (the kind that openclaw requires).

Does this model support tool calls?

TeichAI org


Works for me in LM Studio and llama.cpp, so it could be a quirk of the Ollama setup. When I get a chance I'll run some tests.

Here is what my curl call to Ollama looks like:

curl -X POST http://localhost:11434/api/chat \
  -H "Content-Type: application/json" \
  -d '{
    "model": "hf.co/TeichAI/GLM-4.7-Flash-Claude-Opus-4.5-High-Reasoning-Distill-GGUF:Q3_K_M",
    "messages": [{"role": "user", "content": "What is 2+2?"}],
    "tools": [
      {
        "type": "function",
        "function": {
          "name": "add",
          "description": "Add two numbers",
          "parameters": {
            "type": "object",
            "properties": {
              "a": {"type": "number"},
              "b": {"type": "number"}
            },
            "required": ["a", "b"]
          }
        }
      }
    ],
    "stream": false
  }'
  
{"error":"hf.co/TeichAI/GLM-4.7-Flash-Claude-Opus-4.5-High-Reasoning-Distill-GGUF:Q3_K_M does not support tools"}
TeichAI org

Hmm, yeah, it could be something with the Ollama Modelfile. I'll look into a fix.

TeichAI org
•
edited Feb 22

Yeah, so the issue here is that the Ollama model needs to be created using the provided Modelfile. You will want to:

  1. Download the Modelfile from the repository files

  2. Then, from the directory where the downloaded file is located, create the Ollama model using the Modelfile:

    ollama create TeichAI-GLM-4.7-Flash-Claude-Opus-4.5-Distill -f ./Modelfile

  3. Load the model using TeichAI-GLM-4.7-Flash-Claude-Opus-4.5-Distill instead of hf.co/TeichAI/GLM-4.7-Flash-Claude-Opus-4.5-High-Reasoning-Distill-GGUF:Q3_K_M

If you want to use any other quant, you will need to edit the top line of the Modelfile accordingly. Hopefully this helps and resolves your issue :)
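As a rough sketch of that quant edit (the GGUF filename in the FROM line below is a placeholder for illustration, not the repo's actual Modelfile contents; use whatever your downloaded Modelfile really says):

```shell
# Placeholder Modelfile with an assumed FROM line, standing in for the downloaded one
printf 'FROM ./GLM-4.7-Flash-Claude-Opus-4.5-High-Reasoning-Distill-Q3_K_M.gguf\n' > Modelfile

# Point the top line at a different quant, e.g. Q4_K_M, by editing it in place
sed -i 's/Q3_K_M/Q4_K_M/' Modelfile

# Then recreate the Ollama model so it picks up the new quant:
# ollama create TeichAI-GLM-4.7-Flash-Claude-Opus-4.5-Distill -f ./Modelfile
head -n 1 Modelfile
```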

I'm suggesting this purely based on this discussion I found, as I don't use Ollama myself: https://huggingface.co/unsloth/GLM-4.7-Flash-GGUF/discussions/23

Please let me know if this works, as I'm not sure whether it fixes the issue.

Worked perfectly! Thanks a lot πŸ™πŸ½

Is it possible to update these instructions in the following location?

(screenshot)

I don't know how to do it; otherwise, I'm happy to raise a pull request.

TeichAI org
•
edited Feb 23

Glad to hear that worked!

Also, I'm not really sure... I could update the model README with further Ollama instructions, but I think HF controls those things.


Thanks! Works!
