OpenCode
Open-source AI coding agent — add AntSeed as a custom OpenAI-compatible provider.
OpenCode is an MIT-licensed terminal coding agent built on the Vercel AI SDK. It supports 75+ providers out of the box and lets you register custom ones via opencode.json.
AntSeed plugs in as a custom provider using the @ai-sdk/openai-compatible adapter — the same one OpenCode recommends for any OpenAI-compatible endpoint (LM Studio, llama.cpp, Atomic Chat, etc.). No environment variables, no ANTHROPIC_BASE_URL: the config lives in JSON.
Each model you want to use must be listed under models. The id has to match what the buyer proxy returns from GET /v1/models — i.e. a service id advertised by your currently-pinned peer.
Run AntSeed first
Every integration assumes a buyer proxy at http://localhost:8377. One-time setup, ~2 minutes.
Step 1
Install OpenCode
- Install OpenCode: `npm install -g opencode-ai`
- Verify it runs: `opencode --version`
Step 2
Point OpenCode at AntSeed
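Declare AntSeed as a custom provider in `opencode.json` (project root, or `~/.config/opencode/opencode.json`). A minimal sketch — the model ids below are examples borrowed from the sample `/v1/models` output later on this page; substitute whatever your pinned peer actually advertises:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "antseed": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "AntSeed",
      "options": {
        "baseURL": "http://localhost:8377/v1",
        "apiKey": "antseed"
      },
      "models": {
        "claude-sonnet-4-6": {},
        "gpt-oss-120b": {}
      }
    }
  }
}
```

The proxy ignores auth, so `apiKey` can be any placeholder — it just keeps the AI SDK from prompting for a key.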
Step 3
Pick a model
The keys under `models` must exactly match service ids returned by `curl http://localhost:8377/v1/models`. If your pinned peer doesn't advertise an id, OpenCode will list it but every call to it returns `404 model_not_found`.
The exact list of models depends on which peer you pin. Run `antseed network browse` or open the live network page to see what's available right now.
Verify
Test it
- Confirm the proxy lists the same ids your config references: `curl -s http://localhost:8377/v1/models | jq '.data[].id'`. Example response: `"claude-opus-4-7"`, `"claude-sonnet-4-6"`, `"deepseek-v4-flash"`, `"gpt-oss-120b"`.
Add or remove entries under `models` in `opencode.json` so they match this list.
- Launch OpenCode in your project: `opencode`
Inside the TUI, run `/models` and pick one of the AntSeed entries. OpenCode remembers your last selection per project.
How OpenCode talks to AntSeed
- Wire format sent by OpenCode: OpenAI Chat Completions (hits `/v1/chat/completions` on the buyer proxy).
- Best-fit services: any service whose `protocols` array contains `openai-chat-completions`. That's what the peer advertises as natively supported: zero translation overhead, no transform edge cases.
- How to check a peer: run `antseed network peer <peerId> --json` and look at `matchingServices[].protocols` for each model. The browse command shows the same data per peer in `providerServiceApiProtocols`.
- What happens when protocols don't match: AntSeed's `@antseed/api-adapter` translates between OpenAI Chat Completions and the service's native protocol on the fly, so a request from OpenCode can still reach a service that only advertises `anthropic-messages`, just with a small transform step.
- One known caveat: services whose only advertised protocol is `openai-responses` require streaming. If OpenCode sends a non-streaming request and the proxy routes it to one of those services, the call fails with `HTTP 400: Stream must be set to true`. Pick a service whose `protocols` includes `openai-chat-completions` (or another non-responses protocol) to avoid this.
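The selection rules above can be sketched as a small filter over a peer's service list. The JSON shape here is an assumption pieced together from the field names this page mentions (`matchingServices[].protocols`); the ids are the same examples used earlier:

```python
# Sketch: classify a pinned peer's services by how OpenCode can reach them.
# `peer` stands in for parsed output of `antseed network peer <peerId> --json`
# (shape assumed from the fields named on this page).
peer = {
    "matchingServices": [
        {"id": "claude-sonnet-4-6", "protocols": ["openai-chat-completions", "anthropic-messages"]},
        {"id": "claude-opus-4-7", "protocols": ["anthropic-messages"]},
        {"id": "gpt-oss-120b", "protocols": ["openai-responses"]},
    ]
}

native, translated, streaming_only = [], [], []
for svc in peer["matchingServices"]:
    protos = set(svc["protocols"])
    if "openai-chat-completions" in protos:
        native.append(svc["id"])          # served as-is, zero translation overhead
    elif protos == {"openai-responses"}:
        streaming_only.append(svc["id"])  # non-streaming calls fail with HTTP 400
    else:
        translated.append(svc["id"])      # reachable via the api-adapter transform

print("native:", native)                  # → ['claude-sonnet-4-6']
print("translated:", translated)          # → ['claude-opus-4-7']
print("streaming-only:", streaming_only)  # → ['gpt-oss-120b']
```

For OpenCode, prefer ids that land in the first bucket; the second works with a transform step, and the third only works for streaming requests.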
If it goes wrong
Troubleshooting
- AntSeed doesn't appear in `/connect` or `/models`: OpenCode only loads providers declared in `opencode.json`. Make sure the file is in your project root (or `~/.config/opencode/opencode.json`) and that the JSON is valid — a stray comma silently disables the whole provider.
- Model is listed but every call returns `model_not_found`: The pinned peer doesn't advertise that service id. Run `antseed network peer <peerId>` to see what it actually offers, or pin a different peer.
- OpenCode prompts for an API key: The proxy ignores auth, but the AI SDK sometimes asks anyway. Either skip the prompt (press enter on empty input) or set `"apiKey": "antseed"` inside `options` in `opencode.json`.