Use this model in opencode

#28
by Jumpty - opened

hi all, I would like to ask: can this model be started as a service through Ollama and called from OpenCode? After configuring it, I hit a timeout when typing in OpenCode. The Ollama log looked something like this:

```
time=2026-03-12T23:40:45.629+08:00 level=INFO source=server.go:1568 msg="aborting completion request due to client closing the connection"
[GIN] 2026/03/12 - 23:40:45 | 500 | 2m17s | 127.0.0.1 | POST "/v1/chat/completions"
[GIN] 2026/03/12 - 23:40:45 | 200 | 2m17s | 127.0.0.1 | POST "/v1/chat/completions"
```

How did you config your opencode.jsonc?

Use the llama.cpp server to serve the GGUF of this model and use its v1 API.
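A minimal sketch of what that could look like with llama.cpp's built-in `llama-server` (the model path, host, and port here are placeholders for your own setup):

```shell
# Serve the GGUF over an OpenAI-compatible v1 API (path/port are placeholders)
llama-server -m ./model.gguf --host 0.0.0.0 --port 8080

# From another terminal, smoke-test the v1 endpoint before wiring up OpenCode
curl http://127.0.0.1:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages":[{"role":"user","content":"hello"}]}'
```

If the curl call returns a completion, the timeout is more likely on the OpenCode configuration side than on the server side.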

> How did you config your opencode.jsonc?

I had only configured Claude Opus in OpenCode before, with just the baseURL and API key.

Use `@ai-sdk/anthropic` and set a dummy API key.

Example in the "provider" block:

```jsonc
"llama.cpp": {
  "npm": "@ai-sdk/anthropic",
  "name": "llama.cpp",
  "options": {
    "baseURL": "http://192.168.0.101:8080/v1",
    "apiKey": "doesnotmatter",
    "structuredOutputs": false
  }
}
```
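For context, a complete `opencode.jsonc` built around a snippet like the one above might look roughly like this. This is a sketch, not a verified config: the `models` entry and its `local-gguf` key are assumptions standing in for whatever model ID your server exposes, and the IP/port should match your own llama.cpp instance.

```jsonc
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "llama.cpp": {
      "npm": "@ai-sdk/anthropic",
      "name": "llama.cpp",
      "options": {
        // point at your llama.cpp server's OpenAI-compatible endpoint
        "baseURL": "http://192.168.0.101:8080/v1",
        // llama.cpp does not check the key, but the SDK requires one
        "apiKey": "doesnotmatter",
        "structuredOutputs": false
      },
      "models": {
        // hypothetical model ID; use whatever your server reports
        "local-gguf": {
          "name": "local GGUF model"
        }
      }
    }
  }
}
```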
