Use this model in opencode
Hi all, can this model be started as a service through Ollama and called from OpenCode?
After configuring it, I hit a timeout when typing in OpenCode. The log looked something like this:

```
time=2026-03-12T23:40:45.629+08:00 level=INFO source=server.go:1568 msg="aborting completion request due to client closing the connection"
[GIN] 2026/03/12 - 23:40:45 | 500 | 2m17s | 127.0.0.1 | POST "/v1/chat/completions"
[GIN] 2026/03/12 - 23:40:45 | 200 | 2m17s | 127.0.0.1 | POST "/v1/chat/completions"
```
How did you configure your opencode.jsonc?
Use the llama.cpp server to serve the GGUF of this model, and use its v1 API.
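For what it's worth, a minimal sketch of that setup with llama.cpp's `llama-server` (the model path and port here are placeholders; adjust for your machine):

```shell
# Serve the GGUF over an OpenAI-compatible HTTP API.
# llama-server exposes /v1/chat/completions by default.
llama-server -m ./model.gguf --host 127.0.0.1 --port 8080
```

You would then point OpenCode's baseURL at `http://127.0.0.1:8080/v1`.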
> How did you configure your opencode.jsonc?
I had only configured Claude Opus in OpenCode before, with just the baseURL and API key.
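In case it helps, here's a minimal sketch of an `opencode.jsonc` that registers a local Ollama endpoint as a custom OpenAI-compatible provider. The provider ID, model ID, and display names are placeholders, and I'm assuming Ollama's default port 11434; adjust to match your setup:

```jsonc
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    // Custom provider backed by any OpenAI-compatible server (Ollama, llama.cpp, ...)
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (local)",
      "options": {
        // Ollama's OpenAI-compatible endpoint
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        // Key must match the model name you pulled in Ollama
        "my-model": {
          "name": "My local model"
        }
      }
    }
  }
}
```

With this in place, the model should show up in OpenCode's model picker under the custom provider.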