AI Agent Memory: BotWire vs Redis vs Vector DBs
Free · Open source (MIT) · Works with LangChain, CrewAI, AutoGen · No signup
You're building an AI agent that needs to remember things between conversations, sessions, and restarts. Redis feels like overkill for simple key-value persistence, and vector databases like Pinecone are expensive for basic memory storage. You need something between an in-memory dict and a full database infrastructure.
The Problem with Agent Memory Storage
Most developers start with in-memory dictionaries or session variables for agent memory, but this breaks the moment your process restarts or you scale beyond a single instance. You lose conversation context, user preferences, and any learned behaviors.
Redis works but requires infrastructure setup, connection pooling, and authentication. Vector databases like Pinecone are designed for semantic search with embeddings, not simple key-value agent memory — you're paying for features you don't need.
Here's what breaks with naive approaches:

```python
# This dies when your process restarts
agent_memory = {}
agent_memory["user_123_preferences"] = {"language": "spanish"}
# Gone forever after restart
```
Your agent forgets everything it learned about users, can't maintain conversation context across sessions, and feels broken to end users.
The Fix: Persistent Key-Value Memory
BotWire Memory gives you Redis-like persistence without the infrastructure overhead. Install and start persisting agent state immediately:
```bash
pip install botwire
```

```python
from botwire import Memory

# Create namespaced memory (survives restarts)
memory = Memory("my-agent")

# Store user preferences
memory.set("user_123_preferences", {
    "language": "spanish",
    "tone": "casual"
})

# Retrieve anytime, anywhere
prefs = memory.get("user_123_preferences")
# Returns: {"language": "spanish", "tone": "casual"}
```
This state persists across process restarts, is shared across machines, and scales automatically.
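If you want to see what "survives restarts" means without touching the hosted API, a toy file-backed stand-in with the same `set`/`get` interface makes it concrete. The `FileMemory` class below is hypothetical and illustrative — it is not BotWire's implementation, just a minimal sketch of restart-safe key-value memory:

```python
import json
import tempfile
from pathlib import Path

class FileMemory:
    """Toy stand-in for persistent key-value agent memory.

    Every update is written to a JSON file on disk, so a new
    process (or a restart) reloads the same state.
    """

    def __init__(self, namespace: str, root: str = "."):
        self._path = Path(root) / f"{namespace}.json"
        self._data = (
            json.loads(self._path.read_text()) if self._path.exists() else {}
        )

    def set(self, key, value):
        self._data[key] = value
        self._path.write_text(json.dumps(self._data))

    def get(self, key, default=None):
        return self._data.get(key, default)

root = tempfile.mkdtemp()

# First "process" writes a preference...
m1 = FileMemory("my-agent", root=root)
m1.set("user_123_preferences", {"language": "spanish", "tone": "casual"})

# ...a fresh instance (simulating a restart) still sees it.
m2 = FileMemory("my-agent", root=root)
print(m2.get("user_123_preferences"))
# {'language': 'spanish', 'tone': 'casual'}
```

A hosted backend replaces the JSON file with an HTTP API, which is what buys you the cross-machine sharing the local file can't provide.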
How Agent Memory Actually Works
The Memory API treats each namespace as an isolated key-value store. Multiple agents can share the same namespace or use separate ones for isolation:
```python
from botwire import Memory

# Different agents, different namespaces
customer_agent = Memory("customer-service")
sales_agent = Memory("sales-bot")

# Store conversation context
customer_agent.set("user_456_context", {
    "issue": "billing_question",
    "attempts": 2,
    "escalated": False
})

# List all keys in namespace
all_contexts = customer_agent.list()
# Returns: ["user_456_context", "user_789_context", ...]

# Check if key exists
if customer_agent.exists("user_456_context"):
    context = customer_agent.get("user_456_context")

# Delete when resolved
customer_agent.delete("user_456_context")
```
Keys persist until explicitly deleted. No TTL expiration means your agent won't suddenly forget mid-conversation. The HTTP backend handles concurrency, so multiple processes can safely read/write the same memory namespace without conflicts.
Cross-process access works immediately — deploy your agent on different servers and they all share the same memory state through the BotWire API.
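Since the self-hostable backend is described as a single FastAPI + SQLite service, the storage layer's namespace isolation is easy to sketch: one table keyed on `(namespace, key)`, with SQLite's own locking serializing concurrent writers. The `SqliteMemory` class below is an illustrative sketch of that idea, not BotWire's actual schema:

```python
import json
import os
import sqlite3
import tempfile

class SqliteMemory:
    """Sketch of a namespaced key-value store on SQLite.

    One table holds every namespace; the (namespace, key) primary
    key gives each agent an isolated view, and each write runs in
    its own transaction so separate connections don't conflict.
    """

    def __init__(self, namespace: str, db_path: str):
        self.namespace = namespace
        self.conn = sqlite3.connect(db_path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS memory ("
            " namespace TEXT, key TEXT, value TEXT,"
            " PRIMARY KEY (namespace, key))"
        )

    def set(self, key, value):
        with self.conn:  # commits on exit -> visible to other connections
            self.conn.execute(
                "INSERT OR REPLACE INTO memory VALUES (?, ?, ?)",
                (self.namespace, key, json.dumps(value)),
            )

    def get(self, key, default=None):
        row = self.conn.execute(
            "SELECT value FROM memory WHERE namespace = ? AND key = ?",
            (self.namespace, key),
        ).fetchone()
        return json.loads(row[0]) if row else default

    def list(self):
        rows = self.conn.execute(
            "SELECT key FROM memory WHERE namespace = ?", (self.namespace,)
        ).fetchall()
        return [r[0] for r in rows]

db = os.path.join(tempfile.mkdtemp(), "memory.db")

# Two "agents" on separate connections, same database file.
customer = SqliteMemory("customer-service", db)
sales = SqliteMemory("sales-bot", db)

# Same key, different namespaces -> no collision.
customer.set("user_456_context", {"issue": "billing_question"})
sales.set("user_456_context", {"issue": "pricing"})
print(customer.get("user_456_context"))
# {'issue': 'billing_question'}
```

An HTTP layer in front of a store like this is what turns it from a single-machine file into shared memory for agents on different servers.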
Integration with Agent Frameworks
For LangChain agents, use the built-in chat history adapter:
```python
from botwire import BotWireChatHistory
from langchain.memory import ConversationBufferMemory

# Persistent conversation memory
chat_history = BotWireChatHistory(session_id="user-789")
memory = ConversationBufferMemory(
    chat_memory=chat_history,
    return_messages=True
)

# Your agent's conversations persist across restarts
# No setup, no Redis connection strings, no auth
```
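Under the hood, a persistent chat-history adapter only has to do two things: append messages under a `session_id` and replay them in order. This stdlib-only sketch (the `ChatHistorySketch` class is hypothetical, not the BotWire adapter) shows the shape of that contract:

```python
import json
import tempfile
from pathlib import Path

class ChatHistorySketch:
    """Minimal model of a persistent chat-history adapter.

    Each message is appended to a per-session JSONL file, so a
    restarted process can replay the full transcript in order.
    """

    def __init__(self, session_id: str, root: str):
        self._path = Path(root) / f"{session_id}.jsonl"

    def add_message(self, role: str, content: str):
        with self._path.open("a") as f:
            f.write(json.dumps({"role": role, "content": content}) + "\n")

    @property
    def messages(self):
        if not self._path.exists():
            return []
        return [json.loads(line) for line in self._path.read_text().splitlines()]

root = tempfile.mkdtemp()
h = ChatHistorySketch("user-789", root)
h.add_message("human", "What's my order status?")
h.add_message("ai", "Shipped yesterday.")

# A "restarted" process sees the full transcript.
h2 = ChatHistorySketch("user-789", root)
print([m["role"] for m in h2.messages])
# ['human', 'ai']
```

LangChain's `ConversationBufferMemory` only needs an object with this append/replay behavior behind its `chat_memory` parameter, which is why swapping the storage backend doesn't touch your agent code.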
For CrewAI, get pre-built memory tools:
```python
from botwire.memory import memory_tools

# Get remember/recall/list_memory tools for your crew
tools = memory_tools("crew-namespace")

# Agents can now remember and recall information
# Tools handle the BotWire API calls automatically
```
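The tool pattern here is simple to sketch without any framework: each tool is just a callable closing over a shared store. The `memory_tools_sketch` factory below is illustrative (it mirrors the remember/recall/list_memory names above, but is not BotWire's code) and uses a plain dict where the real tools would call the API:

```python
def memory_tools_sketch(store: dict):
    """Build remember/recall/list_memory callables over a plain dict.

    A framework like CrewAI would register these as agent tools;
    here they are ordinary functions closing over `store`.
    """

    def remember(key: str, value: str) -> str:
        store[key] = value
        return f"Stored '{key}'."

    def recall(key: str) -> str:
        return store.get(key, f"Nothing stored under '{key}'.")

    def list_memory() -> list:
        return sorted(store)

    return remember, recall, list_memory

store = {}
remember, recall, list_memory = memory_tools_sketch(store)
remember("customer_mood", "frustrated, wants refund")
print(recall("customer_mood"))   # frustrated, wants refund
print(list_memory())             # ['customer_mood']
```

Because the tools return plain strings, the agent's LLM can read the results directly and decide what to store or look up next.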
When NOT to Use BotWire
- Vector search: BotWire isn't a vector database. Use Pinecone/Qdrant for embedding similarity search
- High throughput: Limited to 1000 writes/day on free tier. Use Redis for high-frequency updates
- Sub-millisecond latency: HTTP API adds network overhead. Use in-memory stores for microsecond response times
FAQ
Why not just use Redis? Redis requires infrastructure setup, authentication, and connection management. BotWire works immediately with no configuration — better for prototyping and small-scale production.
Is this actually free? Yes, 1000 writes/day per namespace forever. Unlimited reads. No credit card, no signup required. Open source MIT license if you want to self-host.
What about data privacy? Data flows through botwire.dev servers unless you self-host. It's a single FastAPI + SQLite service you can run locally. Check the MIT-licensed code at github.com/pmestre-Forge/signal-api.
---
Skip the Redis setup and vector database costs for simple agent memory. `pip install botwire` and start building persistent agents in minutes. Full docs and self-hosting guide at botwire.dev.