Resolve and Render
Understand how context fragments become AI SDK-ready prompts through the resolution pipeline
The resolution pipeline transforms your context fragments into output ready for the AI SDK. This page explains what happens when you call resolve().
The Resolution Flow
┌─────────────────────────────────────────────────────────────────┐
│ resolve() called │
└─────────────────────────────────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────┐
│ 1. Initialize Chat And Active Branch │
│ │
│ Upsert chat, merge metadata, load active branch │
└─────────────────────────────────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────┐
│ 2. Split by Fragment Type │
│ │
│ ┌────────────────────┐ ┌────────────────────┐ │
│ │ Regular Fragments │ │ Message Fragments │ │
│ │ (type !== msg) │ │ (type === message)│ │
│ └────────────────────┘ └────────────────────┘ │
└─────────────────────────────────────────────────────────────────┘
│ │
▼ ▼
┌──────────────────────────┐ ┌───────────────────────────────┐
│ 3a. Render to String │ │ 3b. Materialize Messages │
│ │ │ │
│ renderer.render( │ │ persisted messages + │
│ regularFragments │ │ fragment.codec.encode() │
│ ) │ │ → validateUIMessages(...) │
└──────────────────────────┘ └───────────────────────────────┘
│ │
▼ ▼
┌─────────────────────────────────────────────────────────────────┐
│ 4. Return Result │
│ │
│ { │
│ systemPrompt: string, // Rendered regular fragments │
│ messages: UIMessage[], // Validated AI SDK messages │
│ } │
└─────────────────────────────────────────────────────────────────┘

Step 1: Initialize Chat And Branch
On the first resolve() call, the engine initializes chat state and loads the active branch:
```ts
await this.#ensureInitialized();
```

This creates the chat if needed, merges any initial metadata, and loads the active branch head so persisted conversation history can be replayed from the graph.
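Once-only guards like this are commonly implemented as a cached promise, so concurrent callers share a single initialization. A minimal sketch of that pattern (the class and member names are illustrative, not the library's internals):

```ts
// Illustrative once-only initializer; the real #ensureInitialized also
// upserts the chat, merges metadata, and loads the active branch head.
class LazyInitEngine {
  #initPromise: Promise<void> | undefined;
  initCount = 0; // exposed only so the sketch is easy to observe

  async ensureInitialized(): Promise<void> {
    // Cache the promise: concurrent callers await the same initialization.
    this.#initPromise ??= this.#initialize();
    return this.#initPromise;
  }

  async #initialize(): Promise<void> {
    this.initCount += 1; // stand-in for the real chat/branch setup
  }
}
```

Because the promise itself is cached (not a boolean flag), two overlapping calls cannot both start initialization.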
Step 2: Split by Fragment Type
Fragments are categorized using isMessageFragment():
```ts
for (const fragment of fragments) {
  if (isMessageFragment(fragment)) {
    pendingMessages.push(fragment);
  } else {
    regularFragments.push(fragment);
  }
}
```

A fragment is a message if fragment.type === 'message'. The built-in user(), assistant(), assistantText(), and message() helpers set this automatically.
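A type guard like isMessageFragment() can be sketched as follows (the Fragment shapes here are assumptions for illustration; the real types come from the library):

```ts
// Assumed minimal fragment shapes for illustration.
interface Fragment { type: string }
interface MessageFragment extends Fragment { type: 'message' }

// Narrows Fragment to MessageFragment based on the type discriminant.
function isMessageFragment(fragment: Fragment): fragment is MessageFragment {
  return fragment.type === 'message';
}

const fragments: Fragment[] = [
  { type: 'role' },
  { type: 'message' },
  { type: 'hint' },
];

const pendingMessages = fragments.filter(isMessageFragment);
const regularFragments = fragments.filter((f) => !isMessageFragment(f));
// pendingMessages has 1 entry; regularFragments has 2
```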
Regular Fragments
These become part of the system prompt:
- role() for system instructions
- hint() for guidelines
- fragment() for custom structure
- builder-backed domain and user fragments like term(), guardrail(), identity(), and preference()
Message Fragments
These become the conversation history:
- user() for user messages
- assistant() and assistantText() for assistant responses
- message() for wrapping an existing UIMessage
- any fragment with type: 'message'
Step 3a: Render Regular Fragments
Regular fragments are rendered with the selected renderer (XmlRenderer by default):
```ts
const renderer = options.renderer ?? new XmlRenderer();
const systemPrompt = renderer.render(regularFragments);
```

Example

```ts
const context = new ContextEngine({ store }).set(
  role('You are a SQL expert.'),
  hint('Use CTEs for complex queries.'),
);

const { systemPrompt } = await context.resolve({
  renderer: new XmlRenderer(),
});
```

Step 3b: Materialize Message Fragments
Persisted messages come from the graph store as stored UIMessage payloads. Pending message fragments are codec-first and contribute their message payload through codec.encode().
```ts
const messages: unknown[] = [];

for (const storedMessage of chain) {
  messages.push(storedMessage);
}

for (const fragment of pendingMessages) {
  messages.push(fragment.codec!.encode());
}

const validatedMessages =
  messages.length === 0 ? [] : await validateUIMessages({ messages });
```

validateUIMessages(...) comes from the ai package and ensures the final result matches the AI SDK UIMessage[] contract.
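What codec.encode() contributes can be pictured with a toy codec for a user message. The shapes below are hypothetical stand-ins for illustration; the real codec interface belongs to the library:

```ts
import { randomUUID } from 'node:crypto';

// Hypothetical UIMessage-like shape for illustration only.
interface TextPart { type: 'text'; text: string }
interface UIMessageLike { id: string; role: 'user' | 'assistant'; parts: TextPart[] }

// A toy codec: encode() returns the message payload that will later be
// passed through validateUIMessages(...).
function userCodec(text: string) {
  return {
    encode(): UIMessageLike {
      return {
        id: randomUUID(),
        role: 'user',
        parts: [{ type: 'text', text }],
      };
    },
  };
}

const encoded = userCodec('What is TypeScript?').encode();
// encoded.role === 'user'; encoded.parts[0].text === 'What is TypeScript?'
```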
Example
```ts
const context = new ContextEngine({ store }).set(
  user('What is TypeScript?'),
  assistantText('TypeScript is a typed superset of JavaScript.'),
  user('Show me an example.'),
);

const { messages } = await context.resolve({
  renderer: new XmlRenderer(),
});

// [
//   { id: '...', role: 'user', parts: [{ type: 'text', text: 'What is TypeScript?' }] },
//   { id: '...', role: 'assistant', parts: [{ type: 'text', text: 'TypeScript is a typed superset of JavaScript.' }] },
//   { id: '...', role: 'user', parts: [{ type: 'text', text: 'Show me an example.' }] },
// ]
```

Step 4: Return Result
The final result is ready for direct use with the AI SDK:
```ts
interface ResolveResult {
  systemPrompt: string;
  messages: UIMessage[];
}
```

Using with AI SDK
```ts
import { generateText } from 'ai';
import { groq } from '@ai-sdk/groq';

const { systemPrompt, messages } = await context.resolve({
  renderer: new XmlRenderer(),
});

const response = await generateText({
  model: groq('gpt-oss-20b'),
  system: systemPrompt,
  messages,
});
```

Order Matters
Fragments are processed in the order they were added:
```ts
context
  .set(role('You are helpful.'))
  .set(user('Hello'))
  .set(hint('Be concise.'))
  .set(assistantText('Hi!'));
```

This results in:
- systemPrompt: rendered from role() and hint()
- messages: returned in chronological order as UIMessage[]
Regular fragments are always rendered into the system prompt regardless of when they were added. Message fragments maintain conversation order.
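The split-and-order behavior described above can be sketched end to end. This is a minimal stand-in, not the library's implementation; the real XmlRenderer and resolve() do considerably more:

```ts
interface SimpleFragment { type: string; text: string }

// Minimal stand-in: regular fragments render into an XML-ish system prompt
// regardless of position, while message fragments keep chronological order.
function resolveSketch(fragments: SimpleFragment[]) {
  const regular = fragments.filter((f) => f.type !== 'message');
  const messages = fragments.filter((f) => f.type === 'message');
  const systemPrompt = regular
    .map((f) => `<${f.type}>${f.text}</${f.type}>`)
    .join('\n');
  return { systemPrompt, messages };
}

const { systemPrompt, messages } = resolveSketch([
  { type: 'role', text: 'You are helpful.' },
  { type: 'message', text: 'Hello' },
  { type: 'hint', text: 'Be concise.' },
  { type: 'message', text: 'Hi!' },
]);
// systemPrompt contains both <role> and <hint>, even though user('Hello')
// was set between them; messages stay in insertion order.
```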
Performance Notes
- Branch and chat initialization happen once per ContextEngine instance.
- Regular fragment rendering is pure and stateless.
- Message validation only runs when resolve() has at least one message to return.
Related Pages
- Fragments - Fragment types and lifecycle
- Storage - Conversation graph persistence
- Renderers Overview - Output formats and renderer selection