How to Add Persistent Memory to LangChain Agents
Free · Open source (MIT) · Works with LangChain, CrewAI, AutoGen · No signup
Your LangChain agent works perfectly during a conversation, but the moment your script exits, it forgets everything. Every restart means starting from scratch — no chat history, no learned context, no memory of previous interactions. This is the persistent memory problem that breaks production agents.
Why LangChain Agents Lose Memory
LangChain's default memory implementations like ConversationBufferMemory store chat history in RAM. When your Python process ends, that memory vanishes. Even BaseChatMessageHistory, the abstraction LangChain uses for chat persistence, is backed by an in-memory implementation by default unless you explicitly wire in a persistent store.
This creates real problems:
- Development friction: Restart your script, lose your conversation context
- Production failures: Server restarts wipe all agent state
- User frustration: Customers have to re-explain context after any deployment
- Scaling issues: Multiple processes can't share conversation history
Here's what typically happens:
from langchain.memory import ConversationBufferMemory
# This memory lives only in the current Python process
# and dies when the script exits
memory = ConversationBufferMemory()
# Agent conversations don't persist anywhere
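To make the failure concrete, here's a stdlib-only sketch that simulates two "process runs." The `run_once` helper and the JSON file path are illustrative, not part of LangChain: in-memory state is rebuilt empty on every run, while file-backed state accumulates.

```python
import json
from pathlib import Path

HISTORY_FILE = Path("chat_history.json")  # illustrative path, not a LangChain API
HISTORY_FILE.unlink(missing_ok=True)      # start the demo clean

def run_once(user_message: str) -> tuple[list[str], list[str]]:
    """Simulate one process lifetime: in-memory state is rebuilt,
    file-backed state is loaded, appended to, and saved."""
    in_memory: list[str] = []  # recreated empty on every "restart"
    in_memory.append(user_message)

    # File-backed history survives because it lives outside the process
    on_disk = json.loads(HISTORY_FILE.read_text()) if HISTORY_FILE.exists() else []
    on_disk.append(user_message)
    HISTORY_FILE.write_text(json.dumps(on_disk))
    return in_memory, on_disk

mem1, disk1 = run_once("Hi, I'm Sam")
mem2, disk2 = run_once("What's my name?")

print(mem2)   # only the latest message; earlier context is gone
print(disk2)  # both messages; the file outlived the "process"
```

Any memory layer that stays inside the process behaves like `in_memory` here; persistence requires the second pattern.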
The Fix: Persistent Memory with BotWire
BotWire provides drop-in persistent memory that survives restarts, deployments, and process crashes. Install and replace your memory layer:
pip install botwire
from langchain.agents import AgentExecutor, create_react_agent
from langchain_community.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory
from botwire import BotWireChatHistory

# Persistent chat history that survives restarts
persistent_history = BotWireChatHistory(session_id="user-42")
memory = ConversationBufferMemory(
    chat_memory=persistent_history,
    return_messages=True,
)

llm = ChatOpenAI(temperature=0)
# `tools` and `prompt` come from your existing agent setup
agent_executor = AgentExecutor.from_agent_and_tools(
    agent=create_react_agent(llm, tools, prompt),
    tools=tools,
    memory=memory,
)
Your agent now remembers conversations across restarts.
How It Works
BotWireChatHistory implements LangChain's BaseChatMessageHistory interface, storing messages in BotWire's persistent backend. Each message is saved immediately over an HTTP API to https://botwire.dev.
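To see what implementing that interface means, here's a toy file-backed history with the same method shape (`messages`, `add_message`, `clear`, plus the usual convenience helpers). This is a stdlib-only stand-in for illustration; BotWire's real implementation speaks HTTP to a remote backend instead of writing local JSON.

```python
import json
from dataclasses import dataclass
from pathlib import Path

@dataclass
class Message:
    role: str     # "human" or "ai"
    content: str

class FileChatHistory:
    """Toy stand-in for a BaseChatMessageHistory implementation:
    every message is flushed to storage immediately, so a new
    process constructing the same session sees old turns."""

    def __init__(self, session_id: str):
        self.path = Path(f"history-{session_id}.json")

    @property
    def messages(self) -> list[Message]:
        if not self.path.exists():
            return []
        return [Message(**m) for m in json.loads(self.path.read_text())]

    def add_message(self, message: Message) -> None:
        msgs = json.loads(self.path.read_text()) if self.path.exists() else []
        msgs.append({"role": message.role, "content": message.content})
        self.path.write_text(json.dumps(msgs))

    def add_user_message(self, text: str) -> None:
        self.add_message(Message("human", text))

    def add_ai_message(self, text: str) -> None:
        self.add_message(Message("ai", text))

    def clear(self) -> None:
        self.path.unlink(missing_ok=True)

# A "second process" constructing a fresh object sees the same messages
h1 = FileChatHistory("user-42")
h1.clear()
h1.add_user_message("Remember: my order ID is 7781")

h2 = FileChatHistory("user-42")   # fresh instance, same session_id
print([m.content for m in h2.messages])
```

The key design point: state lives behind the `session_id`, not inside the object, so any process that constructs a history with the same id resumes the same thread.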
The session_id parameter creates isolated conversation threads. Use user IDs, conversation UUIDs, or any identifier that makes sense for your use case:
# Per-user memory
user_history = BotWireChatHistory(session_id=f"user-{user_id}")
# Per-conversation memory
conv_history = BotWireChatHistory(session_id=f"conv-{conversation_uuid}")
# Per-agent-type memory
support_history = BotWireChatHistory(session_id="support-agent-v2")
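The isolation guarantee reduces to keying storage by `session_id`. A dict-of-lists sketch (hypothetical, stdlib only) shows the contract: writes under one key never leak into another thread.

```python
from collections import defaultdict

# Backend keyed by session_id: each id gets its own message list
store: dict[str, list[str]] = defaultdict(list)

def add(session_id: str, message: str) -> None:
    store[session_id].append(message)

def history(session_id: str) -> list[str]:
    return store[session_id]

add("user-1", "I prefer email")
add("user-2", "Call me instead")

print(history("user-1"))  # user-2's thread is untouched
```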
For key-value memory beyond chat history, use the core Memory API:
from botwire import Memory
# Persistent state across agent runs
agent_memory = Memory("my-agent")
agent_memory.set("user_preferences", {"theme": "dark", "language": "en"})
agent_memory.set("last_action", "processed_invoice_#1234")
# Later (different process, different machine, after restart):
prefs = agent_memory.get("user_preferences") # {"theme": "dark", "language": "en"}
last_action = agent_memory.get("last_action") # "processed_invoice_#1234"
# List all stored keys
all_keys = agent_memory.list_keys()
# Clean up when done
agent_memory.delete("temporary_data")
Memory persists across processes and machines. Your development laptop and production server share the same memory namespace.
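The semantics above can be sketched with a file-backed namespaced store. `FileMemory` here is a hypothetical stand-in used only to illustrate the set/get/list_keys/delete contract; the real Memory class talks to a remote backend, which is what makes the namespace shared across machines.

```python
import json
from pathlib import Path

class FileMemory:
    """Toy namespaced key-value store illustrating the API contract;
    a fresh instance stands in for 'after a restart'."""

    def __init__(self, namespace: str):
        self.path = Path(f"{namespace}.memory.json")

    def _load(self) -> dict:
        return json.loads(self.path.read_text()) if self.path.exists() else {}

    def set(self, key: str, value) -> None:
        data = self._load()
        data[key] = value
        self.path.write_text(json.dumps(data))

    def get(self, key: str):
        return self._load().get(key)

    def list_keys(self) -> list[str]:
        return list(self._load())

    def delete(self, key: str) -> None:
        data = self._load()
        data.pop(key, None)
        self.path.write_text(json.dumps(data))

m1 = FileMemory("my-agent")
m1.set("user_preferences", {"theme": "dark"})

m2 = FileMemory("my-agent")        # fresh instance = "after restart"
print(m2.get("user_preferences"))  # the value set by the first instance
```

Because every operation goes through shared storage rather than instance state, any process that names the same namespace reads the same data.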
LangChain Integration Patterns
For complex agents that need both chat history and structured memory:
from typing import Any

from langchain.memory import ConversationBufferMemory
from botwire import BotWireChatHistory, Memory

class PersistentAgent:
    def __init__(self, session_id: str):
        # Chat history for conversation context
        self.chat_history = BotWireChatHistory(session_id=session_id)
        self.memory = ConversationBufferMemory(
            chat_memory=self.chat_history,
            return_messages=True,
        )
        # Structured memory for agent state
        self.kv_memory = Memory(f"agent-{session_id}")

    def remember_fact(self, key: str, value: Any) -> None:
        """Store structured data that survives restarts."""
        self.kv_memory.set(key, value)

    def recall_fact(self, key: str) -> Any:
        """Retrieve persistent structured data."""
        return self.kv_memory.get(key)

    def get_conversation_memory(self) -> ConversationBufferMemory:
        """Get LangChain-compatible memory for the agent."""
        return self.memory
This pattern gives you both conversational context and structured state that persists across agent lifecycles.
When NOT to Use BotWire
BotWire isn't the right choice for:
- Vector search or embeddings: It's a key-value store, not a vector database. Use Pinecone, Weaviate, or Chroma for semantic search.
- High-frequency writes: The free tier caps at 1000 writes/day per namespace. For heavy logging or analytics, use a dedicated database.
- Sub-millisecond latency: The HTTP API adds network overhead. For real-time gaming or HFT, use in-memory solutions.
FAQ
Why not just use Redis or a database? You could, but then you're managing infrastructure, connection pools, serialization, and error handling. BotWire gives you a working solution in 2 lines of code with zero setup.
Is this actually free? Yes. 1000 writes per day per namespace, 50MB storage per namespace, unlimited reads. No signup, no API key, no credit card. Paid plans add higher limits.
What about data privacy? Data is transmitted over HTTPS to botwire.dev. For sensitive data, self-host the open source version (MIT license, a single FastAPI service) or use namespacing to isolate data.
Your agents can finally remember. Install BotWire and add persistent memory in minutes, not hours.
pip install botwire
Get started at https://botwire.dev