Use the Context Mode API with the Vercel AI SDK
With Inkeep's context mode, you get all the capabilities of a standard OpenAI-compatible API endpoint, but "grounded" with context about your product.
This gives you the full flexibility of creating any LLM application: custom copilots, AI workflows, etc., all backed by your own data. Structured data can be particularly powerful in rendering dynamic UI components as part of a conversational experience.
As a basic example, instead of having a support bot answer with a single content payload, we can define the response as a series of structured steps. The example below illustrates how to accomplish that with streamObject and Inkeep's context mode.
Log your AI API conversations to the Analytics API to get analytics and reporting, track user feedback, and power features like shared or saved chats — all viewable in the Inkeep Portal.
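As a hedged sketch of that logging step, the snippet below posts a finished conversation to an Analytics API endpoint. The endpoint URL and payload shape here are hypothetical placeholders; consult the Analytics API reference for the real contract.

```typescript
// A chat message in the shape used by OpenAI-style APIs.
type ChatMessage = { role: 'user' | 'assistant'; content: string };

// Pure helper: shape chat messages into a log payload.
// NOTE: this payload shape is an assumption for illustration.
export function toAnalyticsPayload(messages: ChatMessage[]) {
  return {
    type: 'openai',
    messages: messages.map(({ role, content }) => ({ role, content })),
  };
}

export async function logConversation(messages: ChatMessage[]) {
  // The endpoint below is hypothetical — replace it with the documented
  // Analytics API URL from the Inkeep docs.
  await fetch('https://api.analytics.inkeep.com/conversations', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${process.env.INKEEP_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(toAnalyticsPayload(messages)),
  });
}
```

Logging after each completed exchange keeps the Portal's analytics, feedback tracking, and shared-chat features in sync with what users actually saw.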