Context Mode
Vercel AI SDK
With Inkeep's context mode, you can leverage all of the capabilities of a standard OpenAI-compatible API endpoint, "grounded" with context about your product. This gives you the full flexibility to create any LLM application: custom copilots, AI workflows, and more, all backed by your own data. Structured data can be particularly powerful for rendering dynamic UI components as part of a conversational experience.
As a basic example, instead of having a support bot answer with a single content payload, we can define a response that is returned as a series of structured steps. The example below shows how to accomplish this with `streamObject` and Inkeep's context model.
index.ts
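The original code listing was not included here, so the following is a minimal sketch of the pattern described above, using the Vercel AI SDK's `streamObject` with a Zod schema. The base URL, API key environment variable, and model id are illustrative assumptions; check Inkeep's documentation for the actual values.

```typescript
import { createOpenAI } from "@ai-sdk/openai";
import { streamObject } from "ai";
import { z } from "zod";

// Point the OpenAI-compatible client at Inkeep's endpoint.
// The base URL and env var name below are assumptions, not official values.
const inkeep = createOpenAI({
  baseURL: "https://api.inkeep.com/v1",
  apiKey: process.env.INKEEP_API_KEY,
});

// Define the structured shape we want back: an answer as a series of steps,
// rather than a single free-form content payload.
const stepsSchema = z.object({
  steps: z.array(
    z.object({
      title: z.string().describe("Short title for this step"),
      description: z.string().describe("What the user should do in this step"),
    }),
  ),
});

async function main() {
  const result = await streamObject({
    model: inkeep("inkeep-context-model"), // hypothetical model id
    schema: stepsSchema,
    messages: [
      { role: "user", content: "How do I reset my password?" },
    ],
  });

  // Partial objects stream in as they are generated, which lets a UI
  // render each step as soon as it arrives.
  for await (const partialObject of result.partialObjectStream) {
    console.log(partialObject.steps);
  }
}

main();
```

On the client side, each partial object can be fed into a component that renders the `steps` array incrementally, so the user sees the structured answer build up in real time.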