Adding Persistent Memory to MCP Servers
Free · Open source (MIT) · Works with LangChain, CrewAI, AutoGen · No signup
When building MCP servers, you need state that survives restarts. Session data, user preferences, conversation history — all gone when your server crashes or redeploys. Most developers reach for Redis or local files, but there's a simpler path. This guide shows you how to add persistent memory to your MCP servers in under 5 minutes.
The Problem with MCP Server State
MCP servers are stateless by design. Every restart wipes your in-memory dictionaries, user sessions, and accumulated context. Your chatbot forgets conversations mid-stream. Your agent loses track of multi-step workflows. Configuration changes disappear.
Here's what breaks:
# This dies on restart
user_sessions = {}
conversation_history = []
agent_knowledge = {"user_preferences": {}}

def handle_request(user_id, message):
    if user_id not in user_sessions:
        user_sessions[user_id] = {"context": []}
    # Process message... but this state vanishes on restart
The Model Context Protocol emphasizes stateless request/response patterns, but real applications need persistent memory: MCP memory that survives process restarts, server crashes, and deployments.
The Fix: BotWire Memory
Install BotWire and get persistent storage in three lines:
pip install botwire
from botwire import Memory
# Persistent key-value storage
memory = Memory("my-mcp-server")
memory.set("user-42-preferences", {"theme": "dark", "notifications": True})
preferences = memory.get("user-42-preferences") # Survives restarts
That's it. Your MCP server now has persistent storage that survives restarts, deployments, and even machine changes.
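If you want to exercise this code path offline or in tests without hitting the remote service, a file-backed stand-in with the same surface is easy to sketch. `LocalMemory` below is a hypothetical test double, not part of BotWire; the real `Memory` class persists to botwire.dev instead of a local JSON file.

```python
import json
from pathlib import Path

class LocalMemory:
    """File-backed test double mirroring the get/set/delete surface
    of botwire.Memory. One JSON file per namespace."""

    def __init__(self, namespace: str, root: str = "."):
        self._path = Path(root) / f"{namespace}.json"
        self._data = (
            json.loads(self._path.read_text()) if self._path.exists() else {}
        )

    def _flush(self):
        self._path.write_text(json.dumps(self._data))

    def set(self, key, value):
        self._data[key] = value
        self._flush()

    def get(self, key):
        # Mirrors Memory.get: returns None for missing keys
        return self._data.get(key)

    def delete(self, key):
        self._data.pop(key, None)
        self._flush()
```

Because the on-disk file outlives the process, a second `LocalMemory("my-mcp-server")` instance sees earlier writes, which is enough to test restart behavior locally.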
How It Works
BotWire provides a simple key-value API backed by a remote service. No Redis setup, no database configuration, no API keys. Every write goes to https://botwire.dev and stays there until you delete it.
from botwire import Memory

class MCPServerWithMemory:
    def __init__(self):
        self.memory = Memory("mcp-server-prod")  # Namespace isolates your data

    def store_conversation(self, user_id, messages):
        key = f"conversation-{user_id}"
        self.memory.set(key, messages)

    def get_conversation(self, user_id):
        key = f"conversation-{user_id}"
        return self.memory.get(key) or []  # get() returns None if key doesn't exist

    def clear_user_data(self, user_id):
        self.memory.delete(f"conversation-{user_id}")
        self.memory.delete(f"preferences-{user_id}")
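One small design point: string keys like `f"conversation-{user_id}"` can collide if a user id itself contains the separator. A tiny helper (hypothetical, not part of BotWire) makes the convention explicit and guards against that:

```python
def make_key(kind: str, user_id: str) -> str:
    """Build a namespaced key like 'conversation:user-42'.

    Using a separator that is banned from user ids avoids collisions
    such as kind='conversation', id='a-b' vs. kind='conversation-a', id='b'.
    """
    if ":" in user_id:
        raise ValueError("user_id must not contain ':'")
    return f"{kind}:{user_id}"
```

Centralizing key construction in one function also means a later change to the scheme touches a single place instead of every call site.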
Cross-Process Memory
Multiple MCP server instances share the same memory namespace automatically:
# Server instance A
memory_a = Memory("shared-namespace")
memory_a.set("active-users", ["alice", "bob"])
# Server instance B (different process/machine)
memory_b = Memory("shared-namespace")
users = memory_b.get("active-users") # ["alice", "bob"]
This enables horizontal scaling and failover scenarios. When one server crashes, another picks up exactly where it left off.
Listing and Cleanup
# List all keys in the namespace
all_keys = memory.list_keys()

# Clean up old data
for key in all_keys:
    if key.startswith("temp-"):
        memory.delete(key)
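If you embed a timestamp in temporary keys when you write them, the same loop can enforce a TTL. The `temp-<unix-ts>-<name>` naming convention and the helper below are assumptions for illustration, not BotWire features:

```python
import time

def expired_temp_keys(keys, now=None, max_age_s=3600):
    """Return keys shaped like 'temp-<unix-ts>-<name>' whose
    timestamp is older than max_age_s seconds."""
    now = time.time() if now is None else now
    stale = []
    for key in keys:
        parts = key.split("-", 2)
        if parts[0] != "temp" or len(parts) < 3:
            continue  # not a temp key
        try:
            ts = int(parts[1])
        except ValueError:
            continue  # malformed timestamp, leave it alone
        if now - ts > max_age_s:
            stale.append(key)
    return stale
```

Run it over `memory.list_keys()` and delete whatever it returns; reads are unlimited, so the scan itself costs nothing against the write quota.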
Perfect for implementing Model Context Protocol state management across server restarts.
Integration Patterns for MCP Servers
For conversation-heavy MCP servers, use the chat history adapter:
from botwire import BotWireChatHistory
from mcp import McpServer

class ConversationalMCPServer(McpServer):
    def __init__(self):
        super().__init__()
        self.memory = BotWireChatHistory(session_id="default")

    def handle_chat_message(self, message):
        # Add to persistent history
        self.memory.add_user_message(message)

        # Process with full context
        response = self.generate_response(self.memory.messages)
        self.memory.add_ai_message(response)
        return response

    def generate_response(self, message_history):
        # Your LLM call here, with full persistent context
        pass
This pattern maintains conversation continuity across server restarts — essential for production MCP deployments.
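One caveat: a persistent history grows without bound, so you'll usually want to trim it before each LLM call. A minimal sketch, assuming messages are dicts with a `role` field in chronological order (adapt to whatever message type `BotWireChatHistory.messages` actually holds):

```python
def trim_history(messages, max_messages=20):
    """Keep a leading system message (if any) plus the most recent
    messages, so the prompt stays within the model's context window."""
    if len(messages) <= max_messages:
        return list(messages)
    head = [messages[0]] if messages[0].get("role") == "system" else []
    tail_n = max_messages - len(head)
    return head + (messages[-tail_n:] if tail_n > 0 else [])
```

Call it as `self.generate_response(trim_history(self.memory.messages))`; the full history still lives in persistent storage, only the prompt is shortened.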
When NOT to Use BotWire
BotWire isn't right for every use case:
• Vector search/embeddings — It's key-value storage, not a vector database
• High-frequency writes — The free tier limits you to 1000 writes/day per namespace
• Sub-millisecond latency — HTTP calls add ~50-200ms of overhead vs. local storage
For semantic search or real-time gaming, look elsewhere. For typical MCP server memory needs, it's perfect.
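If the 1000 writes/day budget is your main constraint, buffering writes and flushing them together stretches it considerably. `BatchedWrites` below is a sketch of that idea, not part of the BotWire API; it wraps anything exposing a `set(key, value)` method. Note the trade-off: buffered values are lost if the process dies before `flush()`.

```python
class BatchedWrites:
    """Buffer set() calls and flush them together, trading freshness
    for a smaller write count against a daily write quota."""

    def __init__(self, backend, max_pending=10):
        self.backend = backend          # e.g. botwire.Memory, or a test double
        self.max_pending = max_pending
        self._pending = {}

    def set(self, key, value):
        # Later writes to the same key overwrite earlier ones,
        # so only the final value costs a real backend write.
        self._pending[key] = value
        if len(self._pending) >= self.max_pending:
            self.flush()

    def flush(self):
        for key, value in self._pending.items():
            self.backend.set(key, value)
        self._pending.clear()
```

Pair it with a periodic `flush()` (or a flush on shutdown) so hot keys like per-request counters collapse into one write per interval instead of one per request.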
FAQ
Why not Redis? Redis requires setup, hosting, and maintenance. BotWire works instantly with zero configuration.
Is this actually free? Yes, 1000 writes/day forever. No credit card, no trial limits. Unlimited reads.
What about data privacy? Data lives at botwire.dev. For sensitive data, self-host the open source version (MIT license).
Get Started
Add persistent memory to your MCP server today. No configuration, no API keys, no database setup required.
pip install botwire
Full documentation and self-hosting instructions at botwire.dev.