
OpenClaw

Open-source autonomous agent runtime — register AntSeed as a custom provider in `openclaw.json`.

Autonomous agents · Anthropic Messages · ~3 min

What OpenClaw is. OpenClaw is an open-source agent runtime for autonomous, long-running tasks (research, coding, web automation). It loads its provider catalog from `~/.openclaw/openclaw.json` — each entry is an HTTP endpoint plus a wire protocol (`anthropic-messages`, `openai-chat`, etc.) and a list of models.

How AntSeed plugs in. Add a provider entry called `antseed` that points at `http://127.0.0.1:8377` with `api: "anthropic-messages"`. Each model id you list under that provider must be a service id your pinned peer advertises — OpenClaw will surface them in its model picker as `antseed/<service-id>`.

Why a config entry instead of env vars. OpenClaw runs many providers in parallel (one per task, sometimes one per agent). A single base-URL override would force every agent through AntSeed; a named provider lets you mix AntSeed with hosted Anthropic, OpenAI, or local models on a per-agent basis.
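For example, a catalog can hold AntSeed next to a hosted provider, and each agent picks one at launch. The `anthropic` entry below is illustrative — it mirrors the key schema of the `antseed` entry, and its `apiKey` value is a placeholder, not a real credential:

```json
{
  "models": {
    "providers": {
      "antseed": {
        "baseUrl": "http://127.0.0.1:8377",
        "api": "anthropic-messages",
        "apiKey": "antseed-p2p",
        "models": [{ "id": "claude-sonnet-4-6", "name": "Claude Sonnet 4.6 (via AntSeed)" }]
      },
      "anthropic": {
        "baseUrl": "https://api.anthropic.com",
        "api": "anthropic-messages",
        "apiKey": "sk-ant-placeholder",
        "models": [{ "id": "claude-opus-4-7", "name": "Claude Opus 4.7 (hosted)" }]
      }
    }
  }
}
```

An agent launched with `--model antseed/claude-sonnet-4-6` routes through the buyer proxy, while `--model anthropic/claude-opus-4-7` goes straight to the hosted API — no env-var juggling between them.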

Run AntSeed first

Every integration assumes a buyer proxy at `http://127.0.0.1:8377`. One-time setup, ~2 minutes.

Step 1

Install OpenClaw

  • Install OpenClaw
    npm install -g openclaw

    Verify with `openclaw --version`. The config file lives at `~/.openclaw/openclaw.json` and is created on first launch.

Step 2

Point OpenClaw at AntSeed

~/.openclaw/openclaw.json (merge into the existing `models.providers` object)
{
  "models": {
    "providers": {
      "antseed": {
        "baseUrl": "http://127.0.0.1:8377",
        "apiKey": "antseed-p2p",
        "api": "anthropic-messages",
        "models": [
          {
            "id": "claude-sonnet-4-6",
            "name": "Claude Sonnet 4.6 (via AntSeed)",
            "reasoning": false,
            "input": ["text"],
            "contextWindow": 200000,
            "maxTokens": 8192
          },
          {
            "id": "deepseek-v4-flash",
            "name": "DeepSeek v4 Flash (via AntSeed)",
            "reasoning": false,
            "input": ["text"],
            "contextWindow": 128000,
            "maxTokens": 8192
          }
        ]
      }
    }
  }
}
# Set AntSeed as the default model for new agents:
openclaw config set agents.defaults.model.primary "antseed/claude-sonnet-4-6"

Step 3

Pick a model

`claude-sonnet-4-6` · `claude-opus-4-7` · `deepseek-v4-flash` · `gpt-oss-120b`

Each `id` under `models[]` must match a service id from `curl http://127.0.0.1:8377/v1/models`. `apiKey` is required by OpenClaw's validator but ignored by the proxy — any non-empty string works. The `"antseed-p2p"` value is just convention.
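One way to catch a mismatch before launching an agent is to diff the ids in your config against what the proxy advertises. A minimal sketch — the function names are ours, not part of either tool, and the paths/URL assume the defaults above:

```python
import json
import urllib.request
from pathlib import Path

def missing_ids(configured, advertised):
    """Return configured model ids the proxy does not advertise."""
    return sorted(set(configured) - set(advertised))

def check(config_path=Path.home() / ".openclaw" / "openclaw.json",
          models_url="http://127.0.0.1:8377/v1/models"):
    """Compare openclaw.json's antseed models against GET /v1/models."""
    cfg = json.loads(Path(config_path).read_text())
    configured = [m["id"] for m in cfg["models"]["providers"]["antseed"]["models"]]
    with urllib.request.urlopen(models_url) as resp:
        advertised = [m["id"] for m in json.load(resp)["data"]]
    return missing_ids(configured, advertised)
```

`check()` returns the ids you should either remove from `openclaw.json` or resolve by pinning a different peer.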

The exact list of models depends on which peer you pin. Run `antseed network browse` or open the live network page to see what's available right now.

Verify

Test it

  • Confirm the proxy advertises the service ids you put in config
    curl -s http://127.0.0.1:8377/v1/models | jq '.data[].id'
    Example response
    "claude-opus-4-7"
    "claude-sonnet-4-6"
    "deepseek-v4-flash"
    "gpt-oss-120b"

    If a model id you listed in `openclaw.json` doesn't appear here, your pinned peer doesn't serve it. Pin a different peer or remove the entry.

  • Reload OpenClaw and check the provider list
    openclaw config reload && openclaw providers list

    Or restart OpenClaw. You should see `antseed` with the model count you configured.

  • Run an agent against AntSeed
    openclaw run "Summarize the README in this repo" --model antseed/claude-sonnet-4-6

How OpenClaw talks to AntSeed

  • Wire format sent by OpenClaw: Anthropic Messages (hits `/v1/messages` on the buyer proxy)
  • Best-fit services: any service whose `protocols` array contains `anthropic-messages`. That's what the peer advertises as natively supported — zero translation overhead, no transform edge cases.
  • How to check a peer: run `antseed network peer <peerId> --json` and look at `matchingServices[].protocols` for each model. The `browse` command shows the same data per peer in `providerServiceApiProtocols`.
  • What happens when protocols don't match: AntSeed's `@antseed/api-adapter` translates between Anthropic Messages and the service's native protocol on the fly. So a request from OpenClaw can still reach a service that only advertises `openai-chat-completions` — just with a small transform step.
  • One known caveat: services whose only advertised protocol is `openai-responses` require streaming. If OpenClaw sends a non-streaming request and the proxy routes it to one of those services, the call fails with `HTTP 400: Stream must be set to true`. Pick a service whose `protocols` includes `anthropic-messages` (or another non-responses protocol) to avoid this.
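The matching rules above can be sketched as a small filter over a peer's advertised services. This assumes the `--json` output exposes a `matchingServices` array with `id` and `protocols` fields, as named above; the function and bucket names are ours:

```python
def sort_services(peer, want="anthropic-messages"):
    """Split a peer's services into native matches (no translation),
    translatable ones, and streaming-only openai-responses services."""
    native, translated, stream_only = [], [], []
    for svc in peer["matchingServices"]:
        protos = svc["protocols"]
        if want in protos:
            native.append(svc["id"])        # zero translation overhead
        elif protos == ["openai-responses"]:
            stream_only.append(svc["id"])   # non-streaming calls fail with HTTP 400
        else:
            translated.append(svc["id"])    # api-adapter transforms on the fly
    return native, translated, stream_only
```

Prefer ids from the first bucket when filling `openclaw.json`; the second works with a transform step; the third only works for streaming requests.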

If it goes wrong

Troubleshooting

  • `provider "antseed" not found` when launching an agent
    JSON parse error in `openclaw.json`, or the entry is at the wrong nesting level. The provider must live under `models.providers.antseed`. Run `openclaw config validate` to surface parse errors.

  • OpenClaw lists `antseed/<id>` but every call returns `404 model_not_found`
    The pinned peer doesn't advertise that service id. Run `antseed network peer <peerId>` to see what it actually offers, or pin a different peer with `antseed buyer connection set --peer <peerId>`.

  • Streaming errors on long-running agents
    AntSeed supports SSE streaming. If you see truncated responses, check that no proxy in front of OpenClaw is buffering (Cloudflare, nginx). The buyer proxy itself does not buffer.

  • Agent stalls on first request after a deploy
    AntSeed opens a payment channel on the first request to a new peer (one on-chain transaction, ~5–15s on Base). Subsequent requests reuse the channel. Pre-warm by running a quick `curl` before launching the agent.
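The pre-warm mentioned above only needs a one-token request. A sketch against the proxy's Anthropic-style `/v1/messages` endpoint — the function names are ours, and the model id is a placeholder for whatever your peer serves:

```python
import json
import urllib.request

def prewarm_payload(model="claude-sonnet-4-6"):
    """Smallest reasonable Anthropic Messages request body."""
    return {
        "model": model,
        "max_tokens": 1,  # smallest possible spend
        "messages": [{"role": "user", "content": "ping"}],
    }

def prewarm(base_url="http://127.0.0.1:8377", model="claude-sonnet-4-6"):
    """Fire a tiny request so the payment channel opens before the agent starts."""
    req = urllib.request.Request(
        f"{base_url}/v1/messages",
        data=json.dumps(prewarm_payload(model)).encode(),
        headers={"content-type": "application/json"},
    )
    # First call to a new peer may take ~5-15s while the channel opens on Base.
    with urllib.request.urlopen(req, timeout=60) as resp:
        return resp.status
```

Run `prewarm()` right after a deploy; once it returns, the channel is open and the agent's first real request won't stall.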
