Adding Persistent Memory to Pydantic AI Agents
Free · Open source (MIT) · Works with LangChain, CrewAI, AutoGen · No signup
Pydantic AI agents lose their memory between Agent.run() calls. Each execution starts fresh, with no knowledge of previous conversations or state. This breaks chatbots, multi-step workflows, and any agent that needs to remember context across interactions.
The Problem: Stateless Agents in a Stateful World
By design, Pydantic AI agents are stateless. When you call Agent.run(), the agent processes the input, returns a response, and forgets everything. This works fine for one-shot tasks, but breaks down when you need:
- Conversational memory: "What did we discuss earlier?"
- Workflow state: Multi-step processes that span multiple runs
- User preferences: Settings that persist across sessions
- Cross-process persistence: Memory that survives restarts
from pydantic_ai import Agent
agent = Agent("openai:gpt-4o")
agent.run_sync("My name is Sarah")  # "Nice to meet you, Sarah!"
agent.run_sync("What's my name?")  # "I don't have that information"
The second call fails because Pydantic AI agent state doesn't persist. Each run is isolated, making it impossible to build stateful applications without external storage.
The Fix: Add BotWire Memory
BotWire provides persistent key-value memory for Pydantic AI agents. Install it and add memory to any agent in three lines:
pip install botwire
from pydantic_ai import Agent
from botwire import Memory
# Create an agent and a memory namespace
agent = Agent("openai:gpt-4o")
memory = Memory("user-session")
# Remember across runs
result1 = agent.run_sync("My name is Sarah")
memory.set("user_name", "Sarah")
result2 = agent.run_sync("What's my name?")
user_name = memory.get("user_name")  # "Sarah"
How It Works
BotWire Memory stores data remotely at https://botwire.dev with no signup required. Data persists across processes, machines, and restarts.
Basic Operations
from botwire import Memory
memory = Memory("my-agent")
# Store any JSON-serializable data
memory.set("conversation_count", 5)
memory.set("user_preferences", {"theme": "dark", "language": "en"})
# Retrieve with optional defaults
count = memory.get("conversation_count", 0)
prefs = memory.get("user_preferences", {})
# List all keys
keys = memory.list_keys() # ["conversation_count", "user_preferences"]
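For unit tests that shouldn't hit the network, the same set/get/list_keys interface can be stubbed locally. This FakeMemory class is a hypothetical stand-in, not part of BotWire; it round-trips values through JSON to enforce the same serializability constraint:

```python
import json

class FakeMemory:
    """In-memory stand-in mirroring BotWire's set/get/list_keys for offline tests."""

    def __init__(self, namespace: str):
        self.namespace = namespace
        self._store: dict[str, str] = {}

    def set(self, key: str, value) -> None:
        # Serialize on write so non-JSON values fail fast, as they would remotely
        self._store[key] = json.dumps(value)

    def get(self, key: str, default=None):
        raw = self._store.get(key)
        return default if raw is None else json.loads(raw)

    def list_keys(self) -> list[str]:
        return list(self._store)

mem = FakeMemory("test")
mem.set("conversation_count", 5)
mem.set("user_preferences", {"theme": "dark"})
```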
Real-World Pattern: Conversation Memory
from pydantic_ai import Agent
from botwire import Memory
class ConversationalAgent:
    def __init__(self, user_id: str):
        self.agent = Agent("openai:gpt-4o")
        self.memory = Memory(f"chat-{user_id}")

    def run(self, message: str) -> str:
        # Load conversation history (empty list on first run)
        history = self.memory.get("history", [])
        # Build a context-aware prompt from the last five turns
        context = "\n".join(f"User: {h['user']}\nBot: {h['bot']}" for h in history[-5:])
        prompt = f"Previous conversation:\n{context}\n\nUser: {message}\nBot:"
        # Get the model's response
        response = self.agent.run_sync(prompt).output
        # Append this turn and persist the history
        history.append({"user": message, "bot": response})
        self.memory.set("history", history)
        return response
# Usage persists across instances
bot = ConversationalAgent("user-123")
bot.run("I like pizza")
# Later, different process
bot2 = ConversationalAgent("user-123") # Remembers pizza preference
Memory operations handle TTL, cleanup, and cross-process synchronization automatically. Data is stored as JSON, supporting strings, numbers, lists, and dictionaries.
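One caveat with the pattern above: the stored history grows without bound, and each turn costs a write against the quota and storage cap. A simple sketch that keeps only the most recent turns before saving (the cap of 50 is an arbitrary choice, not a BotWire requirement):

```python
MAX_TURNS = 50  # arbitrary cap; tune for your storage budget

def trim_history(history: list[dict], max_turns: int = MAX_TURNS) -> list[dict]:
    """Keep only the most recent turns so the stored value stays small."""
    return history[-max_turns:]

# Example: a 120-turn history shrinks to the last 50 turns
history = [{"user": f"msg {i}", "bot": f"reply {i}"} for i in range(120)]
trimmed = trim_history(history)
```

Call trim_history right before memory.set("history", ...) in the run method above.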
Integration with Pydantic AI Dependencies
For more complex memory needs, inject the Memory instance directly into the agent's dependencies and expose it through tools:
from pydantic_ai import Agent, RunContext
from botwire import Memory
from dataclasses import dataclass
@dataclass
class AgentDeps:
    memory: Memory

def get_user_context(ctx: RunContext[AgentDeps]) -> str:
    """Tool that provides persistent context"""
    return ctx.deps.memory.get("user_context", "No previous context")

def save_user_context(ctx: RunContext[AgentDeps], context: str) -> str:
    """Tool that saves context for later"""
    ctx.deps.memory.set("user_context", context)
    return "Context saved"

agent = Agent(
    "openai:gpt-4o",
    deps_type=AgentDeps,
    tools=[get_user_context, save_user_context],
)

# Run with persistent memory
memory = Memory("user-sessions")
deps = AgentDeps(memory=memory)
result = agent.run_sync("Remember that I'm working on a Python project", deps=deps)
When NOT to Use BotWire
- Vector/semantic search: BotWire is key-value storage, not a vector database
- High-throughput applications: 1000 writes/day limit on free tier
- Sub-millisecond latency: HTTP calls add ~50-200ms overhead
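If the HTTP overhead matters for hot keys, repeated reads can be served from a local read-through cache. This CachedMemory wrapper is a hypothetical sketch, not part of BotWire; it works with any backend exposing get(key, default) and set(key, value), demonstrated here with a dict-backed stub:

```python
class CachedMemory:
    """Read-through cache: repeated reads are served locally,
    writes go through to the backend and keep the cache coherent."""

    def __init__(self, backend):
        self.backend = backend
        self._cache = {}

    def get(self, key, default=None):
        if key not in self._cache:
            self._cache[key] = self.backend.get(key, default)
        return self._cache[key]

    def set(self, key, value):
        self.backend.set(key, value)  # write-through to the backend
        self._cache[key] = value      # update the local copy too

# Demo with a dict-backed stub instead of a live BotWire namespace
class DictStore:
    def __init__(self):
        self.data = {}
        self.reads = 0

    def get(self, key, default=None):
        self.reads += 1
        return self.data.get(key, default)

    def set(self, key, value):
        self.data[key] = value

store = DictStore()
mem = CachedMemory(store)
mem.set("user_name", "Sarah")
first = mem.get("user_name")
second = mem.get("user_name")  # served from the cache; no backend read
```

The trade-off: a cached value can go stale if another process writes to the same key, so this only suits keys owned by a single writer.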
FAQ
Why not Redis or a database? BotWire requires zero setup - no servers, no configuration, no connection strings. Just pip install and start storing data.
Is this actually free? Yes, 1000 writes per day per namespace, 50MB storage, unlimited reads. Forever free tier with no credit card required.
What about data privacy? Data is stored remotely but you can self-host the open-source version (MIT license) for complete control over your data.
Get Started
Add persistent memory to your Pydantic AI agents in 30 seconds:
pip install botwire
Full documentation and self-hosting instructions at botwire.dev.