Memory
Persistent memory system for AI agents with built-in providers for development and production. Working memory, conversation history, and chat persistence with a simple 4-method interface. Required dependency for @ai-sdk-tools/agents.
```bash
npm install @ai-sdk-tools/memory
```
Features
- Simple API - Just 4 methods to implement
- Built-in Providers - InMemory, Drizzle ORM, and Upstash included
- TypeScript-first - Full type safety
- Flexible Scopes - Chat-level or user-level memory
- Conversation History - Optional message tracking
- Database Agnostic - Works with PostgreSQL, MySQL, and SQLite via Drizzle
Quick Start
InMemory Provider (Development)
Perfect for local development - works immediately, no setup needed.
```typescript
import { InMemoryProvider } from '@ai-sdk-tools/memory'
import { Agent } from '@ai-sdk-tools/agents'
import { openai } from '@ai-sdk/openai'

const memory = new InMemoryProvider()

// Use with agents
const agent = new Agent({
  name: 'Assistant',
  model: openai('gpt-4'),
  instructions: 'You are a helpful assistant.',
  memory: {
    provider: memory,
    workingMemory: {
      enabled: true,
      scope: 'chat',
    },
    history: {
      enabled: true,
      limit: 10,
    },
    chats: {
      enabled: true,
      generateTitle: true,
    },
  },
})
```
Production Setup
Drizzle Provider (SQL Databases)
Works with PostgreSQL, MySQL, and SQLite via Drizzle ORM. Perfect if you already use Drizzle in your project.
```typescript
import { drizzle } from 'drizzle-orm/vercel-postgres'
import { sql } from '@vercel/postgres'
import { pgTable, serial, text, timestamp } from 'drizzle-orm/pg-core'
import { DrizzleProvider } from '@ai-sdk-tools/memory'

// Define your schema
const workingMemory = pgTable('working_memory', {
  id: text('id').primaryKey(),
  scope: text('scope').notNull(),
  chatId: text('chat_id'),
  userId: text('user_id'),
  content: text('content').notNull(),
  updatedAt: timestamp('updated_at').notNull(),
})

const messages = pgTable('conversation_messages', {
  id: serial('id').primaryKey(),
  chatId: text('chat_id').notNull(),
  userId: text('user_id'),
  role: text('role').notNull(),
  content: text('content').notNull(),
  timestamp: timestamp('timestamp').notNull(),
})

// Initialize
const db = drizzle(sql)
const memory = new DrizzleProvider(db, {
  workingMemoryTable: workingMemory,
  messagesTable: messages,
})
```
Full Drizzle documentation - Includes PostgreSQL, MySQL, SQLite/Turso examples
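For SQLite, the same setup can be sketched with Drizzle's better-sqlite3 adapter. This is a sketch, not the canonical schema: the table and column names simply mirror the PostgreSQL example above, and the file name `memory.db` is arbitrary. See the full Drizzle documentation for Turso/libSQL variants.

```typescript
import Database from 'better-sqlite3'
import { drizzle } from 'drizzle-orm/better-sqlite3'
import { sqliteTable, text, integer } from 'drizzle-orm/sqlite-core'
import { DrizzleProvider } from '@ai-sdk-tools/memory'

// SQLite schema mirroring the PostgreSQL example (timestamps stored as integers)
const workingMemory = sqliteTable('working_memory', {
  id: text('id').primaryKey(),
  scope: text('scope').notNull(),
  chatId: text('chat_id'),
  userId: text('user_id'),
  content: text('content').notNull(),
  updatedAt: integer('updated_at', { mode: 'timestamp' }).notNull(),
})

const messages = sqliteTable('conversation_messages', {
  id: integer('id').primaryKey({ autoIncrement: true }),
  chatId: text('chat_id').notNull(),
  userId: text('user_id'),
  role: text('role').notNull(),
  content: text('content').notNull(),
  timestamp: integer('timestamp', { mode: 'timestamp' }).notNull(),
})

const db = drizzle(new Database('memory.db'))
const memory = new DrizzleProvider(db, {
  workingMemoryTable: workingMemory,
  messagesTable: messages,
})
```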
Upstash Provider (Serverless)
Perfect for edge and serverless environments.
```typescript
import { Redis } from '@upstash/redis'
import { UpstashProvider } from '@ai-sdk-tools/memory'

// Reads UPSTASH_REDIS_REST_URL and UPSTASH_REDIS_REST_TOKEN from the environment
const redis = Redis.fromEnv()
const memory = new UpstashProvider(redis)
```
Usage with Agents
The memory package is a required dependency for @ai-sdk-tools/agents. The agent automatically:
- Loads working memory into the system prompt
- Injects an updateWorkingMemory tool
- Captures and persists conversation messages
- Generates chat titles from the first message
```typescript
import { Agent } from '@ai-sdk-tools/agents'
import { DrizzleProvider } from '@ai-sdk-tools/memory'
import { openai } from '@ai-sdk/openai'

const agent = new Agent({
  name: 'Financial Assistant',
  model: openai('gpt-4'),
  instructions: 'You help users manage their finances.',
  memory: {
    provider: new DrizzleProvider(db), // db: your Drizzle instance
    workingMemory: {
      enabled: true,
      scope: 'user', // or 'chat'
      template: `# Working Memory

## User Preferences
- [Preferred currency, date format, etc.]

## Important Context
- [Key facts about the user's finances]
`,
    },
    history: {
      enabled: true,
      limit: 10, // Last 10 messages
    },
    chats: {
      enabled: true,
      generateTitle: true, // Auto-generate from first message
    },
  },
})

// In your route handler
export async function POST(req: Request) {
  const { message, chatId } = await req.json()

  return agent.toUIMessageStream({
    message, // Single message - agent loads history
    context: {
      chatId,
      userId: 'user-123',
      // ... other context
    },
  })
}
```
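The `history.limit` option bounds how many recent messages are loaded into context. Conceptually it is a sliding window over the conversation; the sketch below illustrates that with a hypothetical `lastN` helper (the real trimming happens inside the agent, not in your code):

```typescript
interface Msg {
  role: 'user' | 'assistant' | 'system'
  content: string
}

// Hypothetical helper: keep only the last `limit` messages, the way
// `history: { limit: 10 }` bounds what the agent loads into context.
function lastN(messages: Msg[], limit: number): Msg[] {
  return limit > 0 ? messages.slice(-limit) : []
}

// A 15-message conversation...
const history: Msg[] = Array.from({ length: 15 }, (_, i) => ({
  role: i % 2 === 0 ? 'user' : 'assistant',
  content: `message ${i + 1}`,
}))

// ...windowed to the 10 most recent
const windowed = lastN(history, 10)
console.log(windowed.length) // 10
console.log(windowed[0].content) // message 6
```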
Memory Scopes
Chat Scope (Recommended)
Memory is tied to a specific conversation. Each chat has its own working memory.
```typescript
workingMemory: {
  enabled: true,
  scope: 'chat',
}
```
User Scope
Memory persists across all conversations for a user. Useful for learning long-term preferences.
```typescript
workingMemory: {
  enabled: true,
  scope: 'user',
}
```
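One way to picture the difference: under chat scope a provider keys storage by `chatId`, so each conversation gets its own record; under user scope it keys by `userId`, so every conversation shares one record. The key format below is hypothetical (not what the built-in providers actually use), purely to illustrate the isolation:

```typescript
type MemoryScope = 'chat' | 'user'

// Hypothetical key derivation: chat scope isolates memory per conversation,
// user scope shares one record across all of a user's conversations.
function memoryKey(params: { chatId?: string; userId?: string; scope: MemoryScope }): string {
  const id = params.scope === 'chat' ? params.chatId : params.userId
  if (!id) {
    throw new Error(`scope "${params.scope}" requires ${params.scope === 'chat' ? 'chatId' : 'userId'}`)
  }
  return `wm:${params.scope}:${id}`
}

console.log(memoryKey({ chatId: 'chat-42', userId: 'user-7', scope: 'chat' })) // wm:chat:chat-42
console.log(memoryKey({ chatId: 'chat-42', userId: 'user-7', scope: 'user' })) // wm:user:user-7
```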
API Reference
MemoryProvider Interface
All providers implement this simple 4-method interface:
```typescript
interface MemoryProvider {
  // Get working memory for a chat or user
  getWorkingMemory(params: {
    chatId?: string
    userId?: string
    scope: MemoryScope
  }): Promise<WorkingMemory | null>

  // Update working memory
  updateWorkingMemory(params: {
    chatId?: string
    userId?: string
    scope: MemoryScope
    content: string
  }): Promise<void>

  // Save a conversation message (optional)
  saveMessage?(message: ConversationMessage): Promise<void>

  // Get conversation messages (optional)
  getMessages?(params: {
    chatId: string
    limit?: number
  }): Promise<ConversationMessage[]>
}
```
Types
```typescript
interface WorkingMemory {
  content: string
  updatedAt: Date
}

type MemoryScope = 'chat' | 'user'

interface ConversationMessage {
  chatId: string
  userId?: string
  role: 'user' | 'assistant' | 'system'
  content: string
  timestamp: Date
}
```
Custom Provider
Implement your own memory backend by following the MemoryProvider interface:
```typescript
import type {
  MemoryProvider,
  WorkingMemory,
  ConversationMessage,
  MemoryScope,
} from '@ai-sdk-tools/memory'

class MyCustomProvider implements MemoryProvider {
  async getWorkingMemory(params: {
    chatId?: string
    userId?: string
    scope: MemoryScope
  }): Promise<WorkingMemory | null> {
    // Fetch from your database
    const key = params.scope === 'chat' ? params.chatId : params.userId
    const data = await myDb.get(key)
    if (!data) return null

    return {
      content: data.content,
      updatedAt: new Date(data.updatedAt),
    }
  }

  async updateWorkingMemory(params: {
    chatId?: string
    userId?: string
    scope: MemoryScope
    content: string
  }): Promise<void> {
    // Save to your database
    const key = params.scope === 'chat' ? params.chatId : params.userId
    await myDb.set(key, {
      content: params.content,
      updatedAt: new Date(),
    })
  }

  // Optional: Implement message storage
  async saveMessage(message: ConversationMessage): Promise<void> {
    await myDb.insertMessage(message)
  }

  // Optional: Implement message retrieval
  async getMessages(params: {
    chatId: string
    limit?: number
  }): Promise<ConversationMessage[]> {
    return await myDb.getMessages(params.chatId, params.limit)
  }
}
```
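To see the interface in action without a real database, here is a self-contained variant backed by a `Map`. The types are inlined so it runs standalone; it is an illustration of the contract, not one of the built-in providers:

```typescript
type MemoryScope = 'chat' | 'user'
interface WorkingMemory { content: string; updatedAt: Date }

class MapProvider {
  private store = new Map<string, WorkingMemory>()

  // Key by chatId under chat scope, by userId under user scope
  private key(p: { chatId?: string; userId?: string; scope: MemoryScope }): string {
    return `${p.scope}:${p.scope === 'chat' ? p.chatId : p.userId}`
  }

  async getWorkingMemory(p: {
    chatId?: string
    userId?: string
    scope: MemoryScope
  }): Promise<WorkingMemory | null> {
    return this.store.get(this.key(p)) ?? null
  }

  async updateWorkingMemory(p: {
    chatId?: string
    userId?: string
    scope: MemoryScope
    content: string
  }): Promise<void> {
    this.store.set(this.key(p), { content: p.content, updatedAt: new Date() })
  }
}

// Exercise it end-to-end
const provider = new MapProvider()
await provider.updateWorkingMemory({ chatId: 'c1', scope: 'chat', content: 'Prefers EUR' })
const wm = await provider.getWorkingMemory({ chatId: 'c1', scope: 'chat' })
console.log(wm?.content) // Prefers EUR
```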
Complete Example
Full example showing memory integration with agents:
```typescript
// app/api/chat/route.ts
import { Agent } from '@ai-sdk-tools/agents'
import { DrizzleProvider } from '@ai-sdk-tools/memory'
import { openai } from '@ai-sdk/openai'

const memory = new DrizzleProvider(db) // db: your Drizzle instance

const agent = new Agent({
  name: 'Assistant',
  model: openai('gpt-4'),
  instructions: 'You are a helpful assistant.',
  memory: {
    provider: memory,
    workingMemory: {
      enabled: true,
      scope: 'user',
    },
    history: {
      enabled: true,
      limit: 10,
    },
    chats: {
      enabled: true,
      generateTitle: true,
    },
  },
})

export async function POST(req: Request) {
  const { message, chatId } = await req.json()

  return agent.toUIMessageStream({
    message,
    context: {
      chatId,
      userId: 'user-123',
    },
  })
}

// Client usage
import { DefaultChatTransport } from 'ai'
import { useChat } from '@ai-sdk-tools/store'

function ChatComponent() {
  const { messages, sendMessage } = useChat({
    transport: new DefaultChatTransport({
      api: '/api/chat',
      prepareSendMessagesRequest({ messages, id }) {
        return {
          body: {
            message: messages[messages.length - 1],
            chatId: id,
          },
        }
      },
    }),
  })

  // Render messages and call sendMessage(...) from your UI
}
```