
Vercel AI SDK

Use `@ai-sdk/openai-compatible` to call AntSeed from `generateText` / `streamText` / `generateObject`.

Frameworks · OpenAI Chat Completions · ~5 min

What the AI SDK is. Vercel's `ai` package is a provider-agnostic TypeScript toolkit for building LLM apps and agents. You pick a provider (a small adapter package), instantiate a model from it, and pass that model into one of the framework's primitives: `generateText`, `streamText`, `generateObject`, or `streamObject`. The AI SDK handles tool-calling, structured output, message history, and streaming for you.

How AntSeed plugs in. AntSeed is OpenAI-Chat-compatible at `http://localhost:8377/v1`, so the right adapter is `@ai-sdk/openai-compatible` (not `@ai-sdk/openai`). The official OpenAI provider is locked to OpenAI's API surface and quietly drops third-party fields; the openai-compatible provider is the one Vercel's own docs recommend for proxies, gateways, and any non-OpenAI server that speaks Chat Completions. You point it at the AntSeed proxy with `baseURL` and pass any non-empty `apiKey` placeholder — the proxy authenticates with your local identity key, not with this header.

Which model ids work. The first argument to the provider call is the AntSeed service id (e.g. `claude-sonnet-4-6`, `deepseek-v4-flash`). It must match a service your pinned peer advertises — confirm with `curl http://localhost:8377/v1/models`.

Run AntSeed first

Every integration assumes a buyer proxy at http://localhost:8377. One-time setup, ~2 minutes.

Before you start

Prerequisites

  • Node.js 18 or newer

Step 1

Install Vercel AI SDK

  • Install the SDK and the openai-compatible provider
    npm install ai @ai-sdk/openai-compatible zod

    `zod` is only needed if you call `generateObject` / `streamObject`. Skip it for plain text generation.

Step 2

Point Vercel AI SDK at AntSeed

```ts
// antseed.ts — a single provider instance you can import everywhere
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';

export const antseed = createOpenAICompatible({
  name: 'antseed',
  baseURL: 'http://localhost:8377/v1',
  apiKey: 'antseed', // any non-empty string — proxy ignores this header
  includeUsage: true, // surface token counts in streaming responses too
});
```

```ts
// stream.ts
import { streamText } from 'ai';
import { antseed } from './antseed';

const result = streamText({
  model: antseed('claude-sonnet-4-6'), // an AntSeed service id
  prompt: 'Why is the sky blue?',
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
console.log('\nusage:', await result.usage);
```

```ts
// structured.ts — generateObject works the same way
import { generateObject } from 'ai';
import { z } from 'zod';
import { antseed } from './antseed';

const { object } = await generateObject({
  model: antseed('claude-sonnet-4-6'),
  schema: z.object({
    title: z.string(),
    bullets: z.array(z.string()).min(3).max(5),
  }),
  prompt: 'Summarize the AntSeed buyer-proxy README as a slide.',
});
console.log(object);
```

Step 3

Pick a model

`claude-sonnet-4-6` · `deepseek-v4-flash` · `gpt-oss-120b` · `qwen3-coder-480b`

The string you pass to `antseed('<id>')` is forwarded verbatim as `model` in the OpenAI Chat request. Run `curl -s http://localhost:8377/v1/models | jq '.data[].id'` to see exactly what your pinned peer offers.

The exact list of models depends on which peer you pin. Run `antseed network browse` or open the live network page to see what's available right now.

Verify

Test it

  • Run a smoke test with `tsx`
    npx tsx stream.ts
    Example output
    The sky is blue because shorter (blue) wavelengths of sunlight scatter much more than longer (red) wavelengths in Earth's atmosphere…
    usage: { promptTokens: 14, completionTokens: 78, totalTokens: 92 }

    If you see `404 model_not_found`, the pinned peer does not advertise the id you passed. If you see `no_peer_pinned`, run `antseed buyer connection set --peer <peerId>` first — or send the per-request header (next step).

  • Per-request peer override (no session pin needed)

    ```ts
    // Use `headers` to fan out to different peers per call.
    import { streamText } from 'ai';
    import { antseed } from './antseed';

    const result = streamText({
      model: antseed('claude-sonnet-4-6'),
      prompt: 'hi',
      headers: {
        'x-antseed-pin-peer': 'cccccccccccccccccccccccccccccccccccccccc',
      },
    });
    ```

    Useful when one Node process serves many tenants and you want each request routed to a different peer. The header overrides the session pin for that single call.
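In a multi-tenant setup, the tenant-to-peer lookup is easy to factor out. A minimal sketch (the tenant names and peer ids below are made-up placeholders; substitute real peer ids from `antseed network browse`):

```typescript
// Map each tenant to the peer that should serve its requests.
// Both tenant names and peer ids here are hypothetical placeholders.
const PEER_FOR_TENANT: Record<string, string> = {
  acme: 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa',
  globex: 'bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb',
};

// Returns the per-request headers for a tenant,
// or {} to fall back to the session pin.
export function peerHeaders(tenant: string): Record<string, string> {
  const peer = PEER_FOR_TENANT[tenant];
  return peer ? { 'x-antseed-pin-peer': peer } : {};
}

// Usage with the AI SDK (not executed here):
//   streamText({
//     model: antseed('claude-sonnet-4-6'),
//     prompt,
//     headers: peerHeaders(tenant),
//   });
```

Returning an empty object for unknown tenants keeps the session pin in effect, so a missing mapping degrades gracefully instead of failing the call.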

How Vercel AI SDK talks to AntSeed

  • Wire format sent by Vercel AI SDK: OpenAI Chat Completions (hits `/v1/chat/completions` on the buyer proxy)
  • Best-fit services: any service whose `protocols` array contains `openai-chat-completions`. That's what the peer advertises as natively supported — zero translation overhead, no transform edge cases.
  • How to check a peer: run `antseed network peer <peerId> --json` and look at `matchingServices[].protocols` for each model. The `browse` command shows the same data per peer in `providerServiceApiProtocols`.
  • What happens when protocols don't match: AntSeed's `@antseed/api-adapter` translates between OpenAI Chat Completions and the service's native protocol on the fly. So a request from Vercel AI SDK can still reach a service that only advertises `anthropic-messages` — just with a small transform step.
  • One known caveat: services whose only advertised protocol is `openai-responses` require streaming. If Vercel AI SDK sends a non-streaming request and the proxy routes it to one of those services, the call fails with HTTP 400: `Stream must be set to true`. Pick a service whose `protocols` includes `openai-chat-completions` (or another non-responses protocol) to avoid this.
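To check protocols without eyeballing JSON, a jq filter can list only the services that natively speak Chat Completions. The JSON below is fabricated sample data matching the `matchingServices[].protocols` shape described above; in practice, pipe the real `antseed network peer <peerId> --json` output into the same filter:

```shell
# Fabricated sample of a peer's service list (shape per the bullets above).
cat > peer.json <<'EOF'
{
  "matchingServices": [
    { "id": "claude-sonnet-4-6", "protocols": ["anthropic-messages", "openai-chat-completions"] },
    { "id": "gpt-oss-120b", "protocols": ["openai-responses"] }
  ]
}
EOF

# Keep only services that advertise openai-chat-completions natively.
jq -r '.matchingServices[]
       | select(.protocols | index("openai-chat-completions"))
       | .id' peer.json
# -> claude-sonnet-4-6
```

Services that only speak `openai-responses` drop out of the list, which also sidesteps the streaming caveat above.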

If it goes wrong

Troubleshooting

  • TypeScript complains that `antseed` has no call signature
    You imported from `@ai-sdk/openai` instead of `@ai-sdk/openai-compatible`. Switch the package — the SDK's official OpenAI provider is locked to OpenAI's service ids and rejects unknown ones.
  • `generateObject` returns malformed JSON
    The AI SDK is strict about JSON Schema support. Pass `supportsStructuredOutputs: true` to `createOpenAICompatible` only if your pinned peer's service supports OpenAI-style structured outputs natively. If unsure, leave it off — the SDK falls back to tool-call-based JSON, which works everywhere.
  • `includeUsage` is set but `result.usage` is undefined
    Some upstream providers behind AntSeed do not emit usage on streamed responses. Try `generateText` instead of `streamText` for definitive token counts; otherwise run `antseed buyer metering` for the authoritative per-channel token + USDC totals AntSeed itself measured.
  • Browser/edge runtime fails with `fetch` errors
    The AntSeed proxy listens on `127.0.0.1:8377`, which is not reachable from a browser tab on a deployed site. The AI SDK is designed to run on the server (Route Handlers, Server Actions, edge functions on your own machine, or a Node process); don't call it from a client component when the model is AntSeed.
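Several of the failures above arrive as HTTP errors from the proxy. A small classifier can turn the status and error code into the fix described in this guide; the mapping below is a sketch based only on the error strings quoted in this page, not an exhaustive list of proxy errors:

```typescript
// Map proxy error responses to the fixes described in this guide.
// Codes are the ones quoted in the Verify step; anything else
// falls through to a generic hint.
export function antseedHint(status: number, code: string): string {
  if (status === 404 && code === 'model_not_found') {
    return 'Pinned peer does not advertise this id; check /v1/models.';
  }
  if (code === 'no_peer_pinned') {
    return 'Run `antseed buyer connection set --peer <peerId>` or send x-antseed-pin-peer.';
  }
  if (status === 400) {
    return 'Possibly an openai-responses-only service; retry with streaming enabled.';
  }
  return `Unhandled proxy error ${status} (${code}).`;
}

// With the AI SDK, proxy failures typically surface as a thrown
// APICallError; inspect its statusCode and responseBody fields and
// feed them to antseedHint (exact error shape may vary by SDK version).
```

Logging the hint next to the raw error keeps retries and support tickets short.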
