Talk to your agents

How to call your AI agent graph over an HTTP API that speaks the Vercel AI SDK data stream format

Use a raw HTTP request to stream responses from your agent graph over the Vercel AI SDK data stream protocol.

Overview

This guide shows how to call your agent graph directly over HTTP and stream responses using the Vercel AI SDK data stream format. It covers the exact endpoint, headers, request body, and the event stream response you should expect.

Tip

If you are building a React UI, consider our prebuilt components under React UI Components. This page is for low-level HTTP usage.

Endpoint

  • Path (mounted by the Run API): /api/chat
  • Method: POST
  • Protocol: Server-Sent Events (SSE) encoded JSON, using Vercel AI SDK data-stream v2
  • Content-Type (response): text/event-stream
  • Response Header: x-vercel-ai-data-stream: v2

Authentication

Authenticate requests with a Run API key, sent as a Bearer token in the Authorization header (as in the cURL example below).

See Authentication → Run API for more details.

Request Body Schema

{
  "messages": [
    {
      "role": "user",
      "content": "Hello!"
    }
  ],
  "conversationId": "optional-conversation-id"
}

Field Notes:

  • messages — Must include at least one user message
  • content — Can be a string or an object with parts for multi-part content
  • conversationId — Optional; server generates one if omitted
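The request body shape above can be captured in a small TypeScript sketch. The type and helper names (`ChatMessage`, `buildChatRequest`) are illustrative, not part of the Run API or the Vercel AI SDK; the helper just enforces the "at least one user message" rule from the field notes.

```typescript
// Illustrative types for the /api/chat request body.
type ChatMessage = {
  role: "user" | "assistant";
  content: string; // may also be an object with parts for multi-part content
};

type ChatRequest = {
  messages: ChatMessage[]; // must include at least one user message
  conversationId?: string; // optional; the server generates one if omitted
};

// Hypothetical helper that validates the body before sending it.
function buildChatRequest(
  messages: ChatMessage[],
  conversationId?: string
): ChatRequest {
  if (!messages.some((m) => m.role === "user")) {
    throw new Error("messages must include at least one user message");
  }
  return conversationId ? { messages, conversationId } : { messages };
}
```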
Note

Provide request context via HTTP headers only (validated against your Context Config). Do not include it in the JSON body. See Request context.

Example cURL

When using an API key for auth:

curl -N \
  -X POST "http://localhost:3003/api/chat" \
  -H "Authorization: Bearer $INKEEP_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      { "role": "user", "content": "What can you do?" }
    ],
    "conversationId": "chat-1234"
  }'
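The same call can be made from TypeScript with `fetch`. This is a minimal sketch, not a supported client: it assumes a runtime with `fetch` and Web Streams (Node 18+ or a browser), and the `baseUrl` and message payload simply mirror the cURL example above.

```typescript
// Build the request options for POST /api/chat.
function chatRequestInit(apiKey: string, body: unknown) {
  return {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(body),
  };
}

// Open the stream and print raw SSE chunks as they arrive.
async function streamChat(baseUrl: string, apiKey: string): Promise<void> {
  const res = await fetch(`${baseUrl}/api/chat`, chatRequestInit(apiKey, {
    messages: [{ role: "user", content: "What can you do?" }],
    conversationId: "chat-1234",
  }));
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    console.log(decoder.decode(value, { stream: true }));
  }
}
```

In a real client you would parse the SSE frames instead of printing them; see the stream-parsing sketch later on this page.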

Response: Vercel AI SDK Data Stream (v2)

The response is an SSE stream of JSON events compatible with the Vercel AI SDK UI message stream. The server sets x-vercel-ai-data-stream: v2.

Event Types

  • text-start — Indicates a new text segment is starting
  • text-delta — Carries the text content delta for the current segment
  • text-end — Marks the end of the current text segment
  • data-component — Structured UI data emitted by the agent (for rich UIs)
  • data-artifact — Artifact data emitted by tools/agents
  • data-operation — Operational events (status updates, completion, errors)
  • error — Error message emitted by the server/agent

Example Stream (abbreviated)

: keep-alive

data: {"type":"text-start","id":"1726247200-abc123"}

data: {"type":"text-delta","id":"1726247200-abc123","delta":"Hello! I can help with..."}

data: {"type":"text-end","id":"1726247200-abc123"}

data: {"type":"data-operation","data":{"type":"status_update","ctx":{"summary":"Searched docs"}}}

data: {"type":"data-component","id":"1726247200-abc123-0","data":{"type":"customer-info","name":"Ada","email":"ada@example.com"}}

data: {"type":"data-artifact","data":{"artifact_id":"art_abc","task_id":"task_xyz","summary":{}}}
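A stream like the one above can be decoded with a few lines of TypeScript. This is a deliberately minimal, line-oriented sketch (our own helper names, not an SDK API): a production client should use a proper SSE parser that handles multi-line `data:` fields and chunk boundaries.

```typescript
// Shape of the events we care about; fields beyond `type` are optional.
type StreamEvent = { type: string; id?: string; delta?: string; data?: unknown };

// Extract JSON events from SSE lines, skipping comments like ": keep-alive".
function parseSseLines(lines: string[]): StreamEvent[] {
  return lines
    .filter((l) => l.startsWith("data: "))
    .map((l) => JSON.parse(l.slice("data: ".length)) as StreamEvent);
}

// Reassemble the assistant's reply from the text-delta events.
function collectText(events: StreamEvent[]): string {
  return events
    .filter((e) => e.type === "text-delta")
    .map((e) => e.delta ?? "")
    .join("");
}
```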

Operation Events

data-operation events conform to an internal schema. Common types include:

  • agent_initializing — The agent runtime is starting
  • agent_ready — Agent is ready
  • agent_thinking — Agent is processing
  • status_update — Progress update; may include label and structured ctx data
  • completion — The agent completed the task
  • error — Error information (also emitted as a top-level error event)
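A UI typically maps these operation types to status text. The sketch below handles the inner `data` object of a `data-operation` event (the `{"type":"status_update","ctx":{...}}` payload shown in the example stream); only the event type names come from this page, while the label strings and fallback behavior are our own choices.

```typescript
// Inner payload of a data-operation event.
type OperationEvent = {
  type: string;
  label?: string;
  ctx?: { summary?: string };
};

// Map an operation event to a short status line for the UI.
function statusLine(op: OperationEvent): string {
  switch (op.type) {
    case "agent_initializing": return "Starting agent…";
    case "agent_ready": return "Agent ready";
    case "agent_thinking": return "Thinking…";
    case "status_update": return op.ctx?.summary ?? op.label ?? "Working…";
    case "completion": return "Done";
    case "error": return "Something went wrong";
    default: return op.type; // forward-compatible fallback for new types
  }
}
```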

Text Streaming Behavior

  • For each text segment, the server emits text-start → text-delta → text-end
  • The server avoids splitting content word-by-word; a segment is usually a coherent chunk
  • Operational events are queued during active text emission and flushed shortly after to preserve ordering and readability

Error Responses

Streamed Errors

{ "type": "error", "errorText": "<message>" }

Non-Streaming Errors

Validation failures and other errors return JSON with an appropriate HTTP status code.

HTTP Status Codes

  • 200 — Stream opened successfully
  • 401 — Missing/invalid authentication
  • 404 — Graph or agent not found
  • 400 — Invalid request body/context
  • 500 — Internal server error

Development Notes

  • Default local base URL: http://localhost:3003
  • Endpoint mounting in the server:
    • /api/chat → Vercel data stream (this page)
    • /v1/mcp → MCP JSON-RPC endpoint
Note

To test quickly without a UI, use curl -N or a tool that supports Server-Sent Events.