Give Claude Persistent Memory in 2 Minutes

Free · Open source (MIT) · Works with LangChain, CrewAI, AutoGen · No signup

Claude's API doesn't include built-in persistent memory — every conversation starts fresh unless you manually save and restore context. This means your Claude-powered apps lose all conversation history, user preferences, and learned context between sessions. Here's how to add persistent memory to Claude in under 2 minutes using BotWire.

The Memory Problem with Claude API

Claude's stateless design means each API call is independent. Your chatbot might have a great conversation with a user, learn their preferences, and build context — but the moment your process restarts or the user returns tomorrow, Claude forgets everything.

This breaks user experience in real applications:

import anthropic

client = anthropic.Anthropic(api_key="your-key")

# First conversation
response1 = client.messages.create(
    model="claude-3-sonnet-20240229",
    max_tokens=1000,
    messages=[{"role": "user", "content": "My name is Sarah and I prefer Python over JavaScript"}]
)

# Later conversation - Claude has no memory of Sarah or her preferences
response2 = client.messages.create(
    model="claude-3-sonnet-20240229", 
    max_tokens=1000,
    messages=[{"role": "user", "content": "What programming language should I use for my project?"}]
)

Claude won't remember Sarah's name or Python preference. You need persistent memory that survives restarts, works across processes, and requires zero setup.
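The do-it-yourself workaround is to serialize context to disk between runs. A minimal sketch (the file name is illustrative) shows both why it works and why it doesn't scale — it ties memory to one machine and one filesystem:

```python
import json
from pathlib import Path

CONTEXT_FILE = Path("claude_context.json")  # illustrative path

def save_context(messages):
    # Persist the running message list so a later process can reload it
    CONTEXT_FILE.write_text(json.dumps(messages))

def load_context():
    # Start fresh if nothing has been saved yet
    if CONTEXT_FILE.exists():
        return json.loads(CONTEXT_FILE.read_text())
    return []

history = load_context()
history.append({"role": "user", "content": "My name is Sarah"})
save_context(history)

# A separate process (or tomorrow's run) picks up where this one left off
restored = load_context()
```

This survives restarts on a single box, but not deploys to fresh machines or multiple processes writing at once — which is the gap a hosted memory layer fills.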

The Fix: BotWire Persistent Memory

Install BotWire and add memory to Claude in a few lines:

pip install botwire anthropic

import anthropic
from botwire import Memory

# Initialize Claude and persistent memory
client = anthropic.Anthropic(api_key="your-key")
memory = Memory("claude-chat-user-sarah")

# Store user context
memory.set("user_name", "Sarah")
memory.set("preferred_language", "Python")
memory.set("conversation_context", "Building a web scraping project")

# Retrieve context for Claude
user_name = memory.get("user_name")
context = f"User: {user_name}, Prefers: {memory.get('preferred_language')}"

response = client.messages.create(
    model="claude-3-sonnet-20240229",
    max_tokens=1000,
    # Anthropic's Messages API takes the system prompt as a top-level
    # parameter, not as a message with role "system"
    system=f"Remember: {context}",
    messages=[{
        "role": "user",
        "content": "What should I use for error handling?"
    }]
)

Now Claude remembers Sarah and her Python preference across sessions.

How BotWire Memory Works

BotWire provides a simple key-value store that persists across processes and machines. Each Memory("namespace") creates an isolated storage space — perfect for per-user or per-conversation memory.

The memory operations are straightforward: set(key, value) stores a value, and get(key) retrieves it, returning None when the key is missing.

Here's a more complete Claude memory pattern:

from botwire import Memory
import anthropic
import json

class ClaudeWithMemory:
    def __init__(self, user_id, anthropic_api_key):
        self.client = anthropic.Anthropic(api_key=anthropic_api_key)
        self.memory = Memory(f"claude-user-{user_id}")
        
    def chat(self, user_message):
        # Get conversation history
        history = self.memory.get("conversation_history") or []
        
        # Build messages with memory context
        messages = []
        messages.extend(history)
        messages.append({"role": "user", "content": user_message})

        # Pull user context from memory; Anthropic's API takes the system
        # prompt as a top-level parameter, not as a message role
        user_prefs = self.memory.get("user_preferences") or {}
        kwargs = {}
        if user_prefs:
            kwargs["system"] = f"User preferences: {json.dumps(user_prefs)}"

        # Call Claude
        response = self.client.messages.create(
            model="claude-3-sonnet-20240229",
            max_tokens=1000,
            messages=messages,
            **kwargs
        )
        
        # Save conversation to memory
        history.append({"role": "user", "content": user_message})
        history.append({"role": "assistant", "content": response.content[0].text})
        
        # Keep only the last 10 exchanges (20 messages) to avoid token limits
        if len(history) > 20:
            history = history[-20:]
            
        self.memory.set("conversation_history", history)
        return response.content[0].text

# Usage
claude = ClaudeWithMemory("user-123", "your-anthropic-key")
response = claude.chat("I'm working on a Python web scraper")

Memory persists automatically. Your Claude conversations survive server restarts, deploy cycles, and process crashes.
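The sliding-window trim inside chat() generalizes to any history size. Factored out as a standalone helper (the name and default are illustrative, not part of BotWire):

```python
def trim_history(history, max_messages=20):
    """Keep only the most recent messages (10 user/assistant exchanges
    by default) so the prompt stays under Claude's token limits."""
    if len(history) > max_messages:
        return history[-max_messages:]
    return history

# 30 messages in, only the newest 20 survive
msgs = [{"role": "user", "content": str(i)} for i in range(30)]
trimmed = trim_history(msgs)
print(len(trimmed))           # 20
print(trimmed[0]["content"])  # 10
```

A fixed window is the simplest policy; summarizing older turns into a single system note is a common refinement once conversations get long.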

LangChain Integration

If you're using LangChain with Claude, BotWire provides a drop-in chat history adapter:

from langchain_anthropic import ChatAnthropic
from botwire import BotWireChatHistory

# Initialize Claude with persistent chat history
llm = ChatAnthropic(model="claude-3-sonnet-20240229")
history = BotWireChatHistory(session_id="user-sarah-session")

# Chat with automatic memory
history.add_user_message("I prefer Python for data processing")
history.add_ai_message("Got it! I'll recommend Python solutions for your data tasks.")

# Later session - history is automatically restored
messages = history.messages  # Contains previous conversation
response = llm.invoke(messages + [("user", "What's the best pandas alternative?")])

The BotWireChatHistory handles conversation persistence transparently. Each session_id gets its own memory namespace.
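To see what that interface involves, here is a minimal in-memory sketch of the same three members used above (add_user_message, add_ai_message, messages). The real BotWireChatHistory persists each session_id to its own BotWire namespace, so history survives restarts; this toy version only survives within one process:

```python
class InMemoryChatHistory:
    """Illustrative sketch of the chat-history interface used above.
    A shared class-level dict stands in for BotWire's hosted storage."""

    _store = {}  # session_id -> list of (role, content) tuples

    def __init__(self, session_id):
        self.session_id = session_id
        self._store.setdefault(session_id, [])

    @property
    def messages(self):
        # Return a copy so callers can't mutate stored history
        return list(self._store[self.session_id])

    def add_user_message(self, text):
        self._store[self.session_id].append(("user", text))

    def add_ai_message(self, text):
        self._store[self.session_id].append(("ai", text))

# The same session_id resolves to the same history,
# even from a freshly constructed object
h1 = InMemoryChatHistory("user-sarah-session")
h1.add_user_message("I prefer Python for data processing")
h2 = InMemoryChatHistory("user-sarah-session")
print(len(h2.messages))  # 1
```

The (role, content) tuple form matches what LangChain chat models accept as message input, which is why the adapter can hand its messages straight to llm.invoke().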

When NOT to Use BotWire

BotWire isn't right for every use case:

• Vector search or embeddings: BotWire is key-value storage, not a vector database. Use Pinecone or Weaviate for semantic search.
• High-throughput applications: With a 1000 writes/day limit, BotWire fits prototypes and small apps, not enterprise-scale systems.
• Sub-millisecond latency: HTTP-based storage adds ~50ms per request. Use Redis or in-memory caching for ultra-low latency.

FAQ

Why not just use Redis? Redis requires setup, hosting, and configuration. BotWire works immediately with zero config — just pip install botwire. Perfect for prototypes and small applications.

Is this actually free? Yes. 1000 writes/day per namespace, 50MB storage, unlimited reads, forever. No credit card or signup required.

What about data privacy? BotWire is open-source (MIT license) and self-hostable. For production, run your own instance with git clone https://github.com/pmestre-Forge/signal-api.

---

Stop fighting stateless APIs. Give Claude persistent memory that just works: pip install botwire. Full docs and self-hosting guide at botwire.dev.

Install in one command:

pip install botwire

Start free at botwire.dev