
Use the Context Mode API with the Vercel AI SDK


With Inkeep's context mode, you can leverage all of the capabilities of a standard OpenAI-compatible API endpoint, but "grounded" in context about your product.

This gives you the full flexibility of creating any LLM application: custom copilots, AI workflows, etc., all backed by your own data. Structured data can be particularly powerful in rendering dynamic UI components as part of a conversational experience.

As a basic example, instead of having a support bot that answers with a single content payload, we can define the response as a series of structured steps. The example below shows how to accomplish this with `streamObject` and Inkeep's context model.

Tip

Log your AI API conversations to the Analytics API to get analytics and reporting, track user feedback, and power features like shared or saved chats — all viewable in the Inkeep Portal.
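As a rough illustration of what such logging might look like, here is a minimal sketch of assembling a conversation payload before sending it to the Analytics API. The endpoint URL, payload fields, and the `"openai"` type discriminator shown here are assumptions for illustration; consult the Inkeep Analytics API reference for the actual request shape.

```typescript
// Hypothetical sketch: building a conversation-log payload for the
// Inkeep Analytics API. Field names and the endpoint are assumptions,
// not the documented request shape.
interface LoggedMessage {
  role: "user" | "assistant";
  content: string;
}

function buildConversationLog(messages: LoggedMessage[]) {
  return {
    type: "openai", // assumed discriminator for AI API conversations
    messages,
  };
}

const log = buildConversationLog([
  { role: "user", content: "How do I configure my bot?" },
  { role: "assistant", content: "Step 1: ..." },
]);

// Hypothetical endpoint -- check the Analytics API reference:
// await fetch("https://api.analytics.inkeep.com/conversations", {
//   method: "POST",
//   headers: {
//     Authorization: `Bearer ${process.env.INKEEP_API_KEY}`,
//     "Content-Type": "application/json",
//   },
//   body: JSON.stringify(log),
// });
```

Once conversations are logged, they become visible in the Inkeep Portal alongside analytics and user feedback.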

index.ts
import { z } from "zod";

import { streamObject } from "ai";
import { createOpenAI } from "@ai-sdk/openai";
import dotenv from "dotenv";
dotenv.config();

if (!process.env.INKEEP_API_KEY) {
  throw new Error("INKEEP_API_KEY is required");
}

const openai = createOpenAI({
  apiKey: process.env.INKEEP_API_KEY,
  baseURL: "https://api.inkeep.com/v1",
});

const StepSchema = z.object({
  steps: z.array(
    z.object({
      step: z.string(),
      description: z.string(),
    })
  ),
});

async function getResponseFromAI() {
  const result = await streamObject({
    model: openai("inkeep-context-expert"),
    schema: StepSchema,
    messages: [
      {
        role: "system",
        content:
          "Generate step-by-step instructions to answer the user question about Inkeep, based only on the information sources. Break it down to be as granular as possible. Always generate more than one step.",
      },
      {
        // Example user question -- replace with your own.
        role: "user",
        content: "How do I get started with Inkeep?",
      },
    ],
  });

  // Log each partial object as the structured steps stream in.
  const { partialObjectStream } = result;
  for await (const partialObject of partialObjectStream) {
    console.clear();
    console.log(partialObject.steps);
  }
}

getResponseFromAI().catch(console.error);