
Context Engine

Complete API reference for ContextEngine - the graph-based context manager

The ContextEngine manages AI conversation context using a graph-based storage model. Messages form a DAG (directed acyclic graph) where branches and checkpoints are pointers to message nodes.
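The storage model can be pictured with a small, self-contained sketch. The types and names below are hypothetical illustrations, not the real store API:

```typescript
// Hypothetical sketch of the storage model (not the real store API).
// Messages are nodes linked by parentId; branches and checkpoints are
// simply named pointers into the graph.
type MessageNode = { id: string; parentId?: string; text: string };

const nodes = new Map<string, MessageNode>([
  ['m1', { id: 'm1', text: 'Hello' }],
  ['m2', { id: 'm2', parentId: 'm1', text: 'Hi there!' }],
]);

// A branch is just a name pointing at a head node...
const branches = new Map<string, string>([['main', 'm2']]);
// ...and a checkpoint is the same idea with a stable, user-chosen name.
const checkpoints = new Map<string, string>([['greeting', 'm1']]);

// "Rewinding" never copies messages: it only adds a new pointer.
branches.set('main-v2', checkpoints.get('greeting')!);
console.log(branches.get('main-v2')); // 'm1'
```

Because branches and checkpoints are pointers rather than copies, branching is cheap and history is never duplicated or destroyed.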

For a deep-dive into internals, see Architecture.

Creating an Engine

Every engine requires a store, chatId, and userId:

import { ContextEngine, SqliteContextStore } from '@deepagents/context';

const store = new SqliteContextStore('./context.db');
const context = new ContextEngine({
  store,
  chatId: 'chat-001',
  userId: 'user-001',
  metadata: { source: 'web' }, // optional initial metadata
});

The engine lazily initializes on first resolve() or save() call. A main branch is created automatically.

API Reference

Properties

chatId

Returns the current chat ID.

console.log(context.chatId); // 'chat-001'

branch

Returns the current branch name.

console.log(context.branch); // 'main' or 'main-v2', etc.

headMessageId

Returns the current branch head message ID, or undefined if no messages have been saved yet.

console.log(context.headMessageId); // 'msg-abc123' or undefined

chat

Returns chat metadata, or null if not yet initialized.

const meta = context.chat;
// {
//   id: 'chat-001',
//   userId: 'user-001',
//   createdAt: 1703123456789,
//   updatedAt: 1703123456789,
//   title: 'My Chat',
//   metadata: { source: 'web' }
// }

Core Methods

set(...fragments)

Add fragments to context. Message fragments are queued for persistence; other fragments stay in memory and are rendered into the system prompt.

import { role, hint, user, assistantText } from '@deepagents/context';

context.set(
  role('You are helpful.'),  // → system prompt
  hint('Be concise.'),       // → system prompt
  user('Hello!'),            // → queued for save()
);

// Chainable
context
  .set(role('You are helpful.'))
  .set(user('Hello!'));

resolve(options)

Convert context to AI SDK format. Requires a renderer.

import { XmlRenderer } from '@deepagents/context';

const { systemPrompt, messages } = await context.resolve({
  renderer: new XmlRenderer(),
});

// systemPrompt: rendered non-message fragments
// messages: array of AI SDK UIMessage objects

Behavior:

  1. Initializes chat/branch if needed
  2. Loads persisted messages from graph (walks parent chain)
  3. Renders context fragments to systemPrompt
  4. Appends pending messages to messages[]
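Step 2 can be sketched as a walk up the parent chain from the branch head. This is a hypothetical model of the traversal, not the engine's actual implementation:

```typescript
// Hypothetical sketch of loading a branch's history: start at the head
// node, follow parentId links to the root, then reverse into chat order.
type Node = { id: string; parentId?: string; text: string };

function walkFromHead(nodes: Map<string, Node>, headId: string): Node[] {
  const chain: Node[] = [];
  let current: Node | undefined = nodes.get(headId);
  while (current) {
    chain.push(current);
    current = current.parentId ? nodes.get(current.parentId) : undefined;
  }
  return chain.reverse(); // oldest first
}

const nodes = new Map<string, Node>([
  ['q1', { id: 'q1', text: 'What is 2+2?' }],
  ['a1', { id: 'a1', parentId: 'q1', text: 'The answer is 4.' }],
]);

console.log(walkFromHead(nodes, 'a1').map((n) => n.id)); // ['q1', 'a1']
```

Because only the parent chain is followed, messages on other branches are never loaded.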

save()

Persist pending messages to the graph. Returns a SaveResult containing the headMessageId of the branch after saving.

context.set(user('Hello'));
context.set(assistantText('Hi there!'));
const result = await context.save();
// result.headMessageId = 'msg-abc123'
// Messages now in graph, pending cleared

Each message becomes a node with parentId pointing to the previous message. The branch head updates to the last message.

If there are no pending messages, save() returns the current head without modifying the graph.
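These semantics can be sketched as a hypothetical model: each pending message becomes a node whose parentId is the previous head, and saving with nothing pending is a no-op:

```typescript
// Hypothetical model of save(): append pending messages as linked nodes,
// advance the branch head, and return it; no pending → head unchanged.
type Node = { id: string; parentId?: string; text: string };

const nodes = new Map<string, Node>();
let head: string | undefined;
let pending: { id: string; text: string }[] = [
  { id: 'm1', text: 'Hello' },
  { id: 'm2', text: 'Hi there!' },
];

function save(): string | undefined {
  for (const msg of pending) {
    nodes.set(msg.id, { ...msg, parentId: head });
    head = msg.id; // branch head follows the last message
  }
  pending = [];
  return head;
}

console.log(save()); // 'm2'
console.log(nodes.get('m2')?.parentId); // 'm1'
console.log(save()); // still 'm2' — nothing pending, graph untouched
```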

render(renderer)

Low-level render of the non-message context fragments to a string. Prefer resolve(), which also loads and returns messages.

const xml = context.render(new XmlRenderer());

Graph Operations

rewind(messageId)

Create a new branch from a specific message. The original branch is preserved.

// Conversation: q1 → a1 (wrong answer)
context.set(
  user({ id: 'q1', role: 'user', parts: [{ type: 'text', text: 'What is 2+2?' }] }),
);
context.set(assistantText('The answer is 5.'));
await context.save();

// Rewind to q1, creates new branch 'main-v2'
const newBranch = await context.rewind('q1');
// newBranch = { name: 'main-v2', headMessageId: 'q1', ... }

// Now on main-v2, add correct answer
context.set(assistantText('The answer is 4.'));
await context.save();

// Original 'main' branch still has the wrong answer

Returns BranchInfo with the new branch details.

checkpoint(name)

Create a named pointer to the current branch head.

context.set(user('Should I learn Python or JavaScript?'));
context.set(assistantText('Both are great! What interests you more?'));
await context.save();

// Save this decision point
const cp = await context.checkpoint('before-choice');
// cp = { name: 'before-choice', messageId: 'msg-xxx', ... }

Checkpoints are stored in the database and survive across sessions.

restore(name)

Restore to a checkpoint by creating a new branch from that point.

// User chose Python, but wants to explore JavaScript path
await context.restore('before-choice');
// Now on new branch 'main-v2', at the checkpoint message

context.set(user('I want to learn JavaScript.'));
await context.save();

Internally calls rewind() on the checkpoint's message.
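Because checkpoints are just named pointers, that delegation can be sketched as a lookup followed by a rewind. This is a hypothetical model (the branch-naming formula is assumed), not the engine's code:

```typescript
// Hypothetical model: restore(name) looks up the checkpoint's messageId,
// then rewinds to it — creating a fresh branch while keeping the old one.
const checkpoints = new Map<string, string>([['before-choice', 'm7']]);
const branches = new Map<string, string>([['main', 'm9']]);

function restore(name: string): string {
  const messageId = checkpoints.get(name);
  if (!messageId) throw new Error(`unknown checkpoint: ${name}`);
  const branchName = `main-v${branches.size + 1}`; // naming scheme assumed
  branches.set(branchName, messageId); // rewind: new pointer, old branch kept
  return branchName;
}

console.log(restore('before-choice')); // 'main-v2'
console.log(branches.get('main-v2')); // 'm7'
console.log(branches.get('main')); // 'm9' — original branch untouched
```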

switchBranch(name)

Switch to an existing branch by name.

// See all branches
const branches = await store.listBranches(context.chatId);
// [{ name: 'main', ... }, { name: 'main-v2', ... }]

// Switch back to original
await context.switchBranch('main');

Clears pending messages when switching.

btw()

Create a parallel branch without switching ("by the way").

// User asks question, model is generating...
context.set(user('What is the weather?'));
await context.save();

// User wants to ask something else without waiting
const newBranch = await context.btw();
// newBranch = { name: 'main-v2', ... }
// Still on 'main', pending messages intact

// Later, switch to ask the other question
await context.switchBranch(newBranch.name);
context.set(user('Also, what time is it?'));
await context.save();

Unlike rewind():

  • Uses current HEAD (no messageId needed)
  • Does NOT switch to new branch
  • Keeps pending messages intact

Chat Metadata

updateChat(updates)

Update chat title and metadata.

await context.updateChat({
  title: 'Python Learning Session',
  metadata: { tags: ['python', 'beginner'] },
});

Metadata is merged with existing values.
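Assuming a shallow merge (the exact merge depth is not specified here), the behavior can be sketched as a spread of updates over existing values:

```typescript
// Hypothetical sketch of metadata merging, assuming a shallow merge:
// existing keys survive, updated keys win.
const existing = { source: 'web', locale: 'en' };
const updates = { tags: ['python', 'beginner'], locale: 'en-US' };

const merged = { ...existing, ...updates };
console.log(merged);
// { source: 'web', locale: 'en-US', tags: ['python', 'beginner'] }
```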

Debugging

inspect(options)

Get a complete snapshot for debugging.

import fs from 'node:fs/promises';
import { XmlRenderer } from '@deepagents/context';

const snapshot = await context.inspect({
  modelId: 'openai:gpt-4o',
  renderer: new XmlRenderer(),
});

console.log(snapshot);
// {
//   estimate: { tokens: 156, cost: 0.00078, ... },
//   rendered: '<role>You are helpful.</role>...',
//   fragments: {
//     context: [...],   // non-message fragments
//     pending: [...],   // unsaved messages
//     persisted: [...], // messages from store
//   },
//   graph: { nodes: [...], branches: [...], checkpoints: [...] },
//   meta: { chatId: 'chat-001', branch: 'main', timestamp: ... }
// }

// Write to file for analysis
await fs.writeFile('debug.json', JSON.stringify(snapshot, null, 2));

Complete Example

import { generateText } from 'ai';
import { groq } from '@ai-sdk/groq';
import {
  ContextEngine,
  SqliteContextStore,
  XmlRenderer,
  role,
  hint,
  user,
  assistant,
} from '@deepagents/context';

async function chat(userMessage: string) {
  const store = new SqliteContextStore('./chat.db');
  const context = new ContextEngine({
    store,
    chatId: 'chat-001',
    userId: 'user-001',
  });

  // Set system context
  context.set(
    role('You are a friendly assistant.'),
    hint('Keep responses brief.'),
  );

  // Add user message
  context.set(user(userMessage));

  // Resolve and call API
  const { systemPrompt, messages } = await context.resolve({
    renderer: new XmlRenderer(),
  });

  const response = await generateText({
    model: groq('gpt-oss-20b'),
    system: systemPrompt,
    messages,
  });

  // Save the exchange
  context.set(assistant(response.text));
  await context.save();

  return response.text;
}

Branching Workflow

// Setup
const context = new ContextEngine({ store, chatId: 'chat-001', userId: 'user-001' })
  .set(role('You are helpful.'));

// Conversation
context.set(user('Explain recursion', { id: 'q1' }));
const { systemPrompt, messages } = await context.resolve({ renderer: new XmlRenderer() });

const response = await generateText({
  model: groq('gpt-oss-20b'),
  system: systemPrompt,
  messages,
});

context.set(assistant(response.text));
await context.save();

// Response was too technical - try again
await context.rewind('q1');
// Now on 'main-v2' branch

context.set(hint('Explain like I am 5 years old.'));
const { systemPrompt: sp2, messages: m2 } = await context.resolve({ renderer: new XmlRenderer() });

const simpler = await generateText({
  model: groq('gpt-oss-20b'),
  system: sp2,
  messages: m2,
});

context.set(assistant(simpler.text));
await context.save();

// Both branches preserved:
// main: original technical explanation
// main-v2: simpler explanation

Next Steps