BotWire vs LangMem: When Simplicity Wins
Free · Open source (MIT) · Works with LangChain, CrewAI, AutoGen · No signup
LangChain Memory got complex fast. You're looking at ConversationBufferMemory, ConversationSummaryMemory, vectorstore-backed solutions, and wondering if there's something simpler. LangMem promises to solve this, but sometimes the simplest solution is just a persistent key-value store that works across processes and machines.
The LangChain Memory Problem
LangChain's memory system works great for single sessions, but breaks down when you need persistence across restarts or multiple processes. Your agent forgets everything when the process dies. ConversationBufferMemory lives in RAM. VectorStoreRetrieverMemory requires setting up Pinecone or Chroma for what should be simple key-value storage.
# This dies when your process restarts
from langchain.memory import ConversationBufferMemory
memory = ConversationBufferMemory()
memory.chat_memory.add_user_message("Remember my name is Alice")
# Process restarts... Alice is gone
LangMem tries to fix this with cloud persistence, but you're stuck with their API, pricing, and data handling. Sometimes you just need a simple alternative that persists data without the complexity.
The Simple Fix
Install BotWire and get persistent memory in three lines:
pip install botwire
from botwire import Memory
# Persistent across restarts, processes, machines
m = Memory("my-agent")
m.set("user_name", "Alice")
m.set("user_preferences", {"theme": "dark", "notifications": True})
# Later, in a different process:
name = m.get("user_name") # "Alice"
prefs = m.get("user_preferences") # {"theme": "dark", "notifications": True}
That's it. No configuration, no API keys, no database setup. Your data persists at https://botwire.dev automatically.
How It Works
BotWire Memory is a simple HTTP-backed key-value store. Each Memory("namespace") creates an isolated storage space. Data persists across Python processes, server restarts, and different machines.
from botwire import Memory
# Different namespaces = isolated data
user_memory = Memory("user-42")
system_memory = Memory("system-config")
# Store any JSON-serializable data
user_memory.set("conversation_history", [
    {"role": "user", "message": "What's the weather?"},
    {"role": "assistant", "message": "I'll check that for you."}
])
# List all keys in a namespace
all_keys = user_memory.list() # ["conversation_history"]
# Check for a stored value (missing keys return None)
if user_memory.get("last_login"):
    print("Welcome back!")
# Delete when done
user_memory.delete("temporary_data")
Cross-process persistence works immediately. Start a Python script, set some data, kill it, start another script - your data is still there. No Redis, no database, no configuration files.
# Process A
m = Memory("shared-state")
m.set("current_task", "processing_documents")
# Process B (different machine, same namespace)
task = Memory("shared-state").get("current_task") # "processing_documents"
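If you want to see what this contract looks like under the hood, or work offline, the same `set`/`get`/`list`/`delete` interface can be mimicked locally. A minimal sketch, assuming only the API shown above and swapping botwire.dev for one JSON file per namespace (this `LocalMemory` class is illustrative, not part of botwire):

```python
import json
from pathlib import Path

class LocalMemory:
    """File-backed stand-in for the Memory API (hypothetical, not botwire itself)."""

    def __init__(self, namespace, root="."):
        # One JSON file per namespace keeps namespaces isolated
        self.path = Path(root) / f"{namespace}.json"

    def _load(self):
        if self.path.exists():
            return json.loads(self.path.read_text())
        return {}

    def set(self, key, value):
        data = self._load()
        data[key] = value
        self.path.write_text(json.dumps(data))

    def get(self, key):
        # Missing keys return None, matching the truthy-check idiom above
        return self._load().get(key)

    def list(self):
        return sorted(self._load().keys())

    def delete(self, key):
        data = self._load()
        data.pop(key, None)
        self.path.write_text(json.dumps(data))

# Survives "restarts": a fresh instance re-reads the same file
LocalMemory("shared-state").set("current_task", "processing_documents")
print(LocalMemory("shared-state").get("current_task"))  # processing_documents
```

Two processes on the same machine share state through the file exactly the way two machines share it through botwire.dev; the hosted service just moves the file behind an HTTP API.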
LangChain Integration
BotWire provides a drop-in replacement for LangChain's chat memory:
from botwire import BotWireChatHistory
# Persistent chat history
history = BotWireChatHistory(session_id="user-42")
# Add messages - they persist automatically
history.add_user_message("Remember I'm working on a Python project")
history.add_ai_message("Got it, I'll keep that context for our conversation")
# Use with LangChain chains
from langchain.memory import ConversationBufferMemory
memory = ConversationBufferMemory(
    chat_memory=history,
    return_messages=True
)
# Your conversation survives restarts
messages = history.messages # Loads from persistent storage
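A chat-history adapter like this only needs the key-value primitives. Here is a rough sketch of the pattern, with plain dicts standing in for LangChain message objects and an in-memory dict standing in for the store; both are assumptions for illustration, not BotWire's actual implementation:

```python
class KVChatHistory:
    """Chat history persisted through any set/get key-value store (illustrative)."""

    def __init__(self, store, session_id):
        self.store = store                  # anything with .get(key) / .set(key, value)
        self.key = f"chat:{session_id}"     # one key holds the whole message list

    @property
    def messages(self):
        return self.store.get(self.key) or []

    def _append(self, role, content):
        msgs = self.messages
        msgs.append({"role": role, "content": content})
        self.store.set(self.key, msgs)      # write the full list back each time

    def add_user_message(self, content):
        self._append("user", content)

    def add_ai_message(self, content):
        self._append("ai", content)

class DictStore:
    """In-memory stand-in for Memory("namespace")."""
    def __init__(self):
        self.data = {}
    def get(self, k):
        return self.data.get(k)
    def set(self, k, v):
        self.data[k] = v

store = DictStore()
history = KVChatHistory(store, "user-42")
history.add_user_message("Remember I'm working on a Python project")
history.add_ai_message("Got it, I'll keep that context")
print(history.messages)
```

Writing the whole list back on every append is simple but O(n) per message; it is a reasonable trade for short sessions, less so for very long conversations.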
When NOT to Use BotWire
- Vector/semantic search: BotWire is key-value only. Use Pinecone, Weaviate, or Chroma for embeddings and similarity search
- High throughput: Free tier caps at 1000 writes/day per namespace. Redis or local databases handle millions of operations
- Sub-millisecond latency: HTTP calls add ~50-200ms. Use in-memory caches for microsecond response times
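If the HTTP round trip is your only blocker, a common middle ground is a read-through cache in front of the remote store: serve repeated reads from process memory and only hit the network on misses and writes. A minimal sketch; the `CountingBackend` here is a fake stand-in for any set/get backend, used only to demonstrate the saved calls:

```python
class ReadThroughCache:
    """Serve repeated reads locally; forward writes and cache misses to the backend."""

    def __init__(self, backend):
        self.backend = backend
        self.cache = {}

    def get(self, key):
        if key not in self.cache:            # miss: one backend call, then cached
            self.cache[key] = self.backend.get(key)
        return self.cache[key]

    def set(self, key, value):
        self.backend.set(key, value)         # write through to the backend
        self.cache[key] = value              # keep the local copy fresh

class CountingBackend:
    """Fake remote store that counts reads, for demonstration only."""
    def __init__(self):
        self.data, self.reads = {}, 0
    def get(self, key):
        self.reads += 1
        return self.data.get(key)
    def set(self, key, value):
        self.data[key] = value

backend = CountingBackend()
m = ReadThroughCache(backend)
m.set("user_name", "Alice")
for _ in range(100):
    m.get("user_name")                       # all served from the local cache
print(backend.reads)  # 0 -- the write primed the cache, so no read hit the backend
```

The trade-off: the cache goes stale if another process writes the same key, so this only fits keys each process effectively owns.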
FAQ
Why not just use Redis or a database? You can, but then you need hosting, connection strings, error handling, and serialization. BotWire works immediately with zero setup. For prototypes and small apps, the convenience wins.
Is this actually free? Yes. 1000 writes/day per namespace, 50MB storage per namespace, unlimited reads. No credit card, no trial expiration. The service needs to stay free to be useful.
What about data privacy? Data flows through botwire.dev servers. For sensitive data, self-host the open source version (MIT license, single FastAPI service) or use a traditional database with proper encryption.
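The self-host option above is a FastAPI service, and the core idea fits in a few dozen lines. As a stdlib-only illustration of the same shape (not the actual botwire server), a namespaced HTTP key-value endpoint could look like:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

STORE = {}  # {namespace: {key: value}}; a real service would persist to disk

class KVHandler(BaseHTTPRequestHandler):
    # Path shape: /<namespace>/<key>
    def do_PUT(self):
        _, ns, key = self.path.split("/", 2)
        length = int(self.headers["Content-Length"])
        STORE.setdefault(ns, {})[key] = json.loads(self.rfile.read(length))
        self.send_response(204)
        self.end_headers()

    def do_GET(self):
        _, ns, key = self.path.split("/", 2)
        if key in STORE.get(ns, {}):
            self.send_response(200)
            self.end_headers()
            self.wfile.write(json.dumps(STORE[ns][key]).encode())
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging for the demo

server = HTTPServer(("127.0.0.1", 0), KVHandler)  # port 0 picks a free port
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"

# PUT a value, then GET it back
req = urllib.request.Request(f"{base}/my-agent/user_name",
                             data=json.dumps("Alice").encode(), method="PUT")
urllib.request.urlopen(req)
with urllib.request.urlopen(f"{base}/my-agent/user_name") as resp:
    print(json.loads(resp.read()))  # Alice
```

Point a client's base URL at your own box and the data never leaves your network; the namespace-in-the-path convention is an assumption here, chosen to mirror the `Memory("namespace")` isolation shown earlier.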
---
BotWire Memory gives you LangChain persistence without the complexity. Install it, set some keys, watch your data survive restarts.
pip install botwire
Try it at https://botwire.dev.