
mem0 Memory System: Plug-and-Play, Cross-Interaction Recall

Boost your AI’s recall with mem0’s MCP server – the plug-and-play memory hub for OpenAI/Ollama agents. Keep conversations, tasks, and context alive across interactions. 🌟


About mem0 Memory System

What is mem0 Memory System: Plug-and-Play, Cross-Interaction Recall?

mem0 is an AI-driven memory framework enabling applications to retain and utilize user interaction data across sessions. It automates contextual memory capture through natural language processing, eliminating manual memory management. The system integrates with major LLM providers (e.g., OpenAI, Ollama) to deliver persistent user-state tracking for seamless multi-turn interactions.

How to use mem0 Memory System: Plug-and-Play, Cross-Interaction Recall?

Deployment involves three core steps:

  1. Install via package manager: pip install mem0
  2. Initialize memory context with provider credentials
  3. Integrate memory tracking in application workflows using API hooks or library functions

Environment configuration allows customization of retention policies and provider settings through YAML manifests.
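As an illustrative sketch only, such a manifest might look like the following. The field names are assumptions modeled on the `MEM0_*` environment variables shown later in this README; the authoritative schema is in the Environment Configuration Guide:

```yaml
# Illustrative mem0 configuration manifest (field names are assumptions)
provider: openai            # LLM provider, e.g. openai or ollama
embedding_provider: openai  # embedding backend
data_dir: ~/mem0_memories   # local storage location
retention:
  default_ttl: 30d          # hypothetical TTL policy for stored memories
```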

mem0 Memory System Features

Key Features of mem0 Memory System: Plug-and-Play, Cross-Interaction Recall

  • Automatic entity extraction for user preferences, locations, and transactional data
  • Multi-provider support with adaptive API routing (REST/gRPC/WebSockets)
  • Temporal context management for session-based and long-term memory retention
  • Compliance-ready data scrubbing and retention controls
  • Real-time memory visualization dashboard

Use Cases of mem0 Memory System: Plug-and-Play, Cross-Interaction Recall

Common applications include:

  • Enterprise chatbots maintaining customer journey continuity
  • Healthcare platforms securely tracking patient treatment histories
  • Educational tools adapting content based on learner progress
  • IoT systems managing device interaction state across user sessions
  • Game NPCs demonstrating evolving behavior through contextual memory

mem0 Memory System FAQ

FAQ for mem0 Memory System: Plug-and-Play, Cross-Interaction Recall

Does mem0 require cloud infrastructure?
No. It supports local deployment with Ollama and other on-premise providers.
How is data secured?
Through end-to-end encryption, with role-based access control (RBAC) for memory segments.
What languages are supported?
APIs are available in Python, JavaScript, and Go; SDKs for .NET and Java are in development.
Can I customize memory retention duration?
Yes, via TTL policies and retention tiering configurations.
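To illustrate how TTL-based retention can work conceptually, here is a minimal sketch. This is not mem0's actual implementation; the real behavior is governed by the TTL and tiering options in the configuration guide:

```python
from datetime import datetime, timedelta, timezone

def is_expired(stored_at, ttl, now=None):
    """Return True if a memory written at `stored_at` has outlived its TTL."""
    now = now or datetime.now(timezone.utc)
    return now - stored_at > ttl

# A memory stored 40 days ago with a 30-day TTL is expired:
stored = datetime.now(timezone.utc) - timedelta(days=40)
print(is_expired(stored, timedelta(days=30)))  # True
```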

Full documentation available at mem0.ai/docs


✨ mem0 Memory System ✨


A flexible memory system for AI applications that can be used in two ways:

  1. As an MCP (Model Context Protocol) server for integration with MCP-compatible applications
  2. As a direct library integration for embedding memory capabilities directly in your applications

Made with ❤️ by Pink Pixel

Features

  • Multi-Provider Support : Use OpenAI, Anthropic, Google, DeepSeek, OpenRouter, or Ollama (local)
  • Flexible Embedding Options : Choose from OpenAI, HuggingFace, or Ollama for embeddings
  • Local Storage : Store memories locally with ChromaDB and SQLite
  • Configurable : Customize data directories, models, and parameters
  • Autonomous Memory : Automatically extracts, stores, and retrieves user information without explicit commands
  • User Isolation : Support for multiple users with isolated memory spaces
  • Two Integration Methods : Server-based (MCP) or direct library integration

Installation

There are multiple ways to install and set up the mem0 MCP server:

Method 1: Manual Installation

  1. Clone the repository and navigate to the directory:

    git clone https://github.com/pinkpixel-dev/mem0-mcp.git
    cd mem0-mcp

  2. Install dependencies:

    pip install -r requirements.txt

  3. Create a .env file from the example:

    cp .env.example .env

  4. Edit the .env file with your API keys and settings (see Environment Configuration Guide for details).
Method 2: Using the Installer Script

The project includes a convenient installer script that automates the setup process:

./install.sh

The installer script provides the following features:

  • Creates a Python virtual environment
  • Installs all dependencies
  • Sets up environment configuration
  • Provides a guided setup experience with visual feedback

You can also customize the installation with options:

# For a quick, non-interactive installation
./install.sh --quick

# To specify a custom environment directory
./install.sh --env-dir ./custom_env

# To use a specific installation method (pip, uv, or conda)
./install.sh --method pip

Running the Server

There are multiple ways to run the mem0 MCP server:

Method 1: Using the Python Script Directly

Start the server with default settings:

python server.py

The server will automatically find the next available port if port 8000 is already in use.
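A linear probe over candidate ports is one common way to implement this fallback. The sketch below shows the general idea and is not necessarily how server.py does it:

```python
import socket

def find_available_port(host="0.0.0.0", start=8000, attempts=50):
    """Return the first port >= start that can be bound on host."""
    for port in range(start, start + attempts):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            try:
                s.bind((host, port))
                return port  # bind succeeded, so the port is free
            except OSError:
                continue  # port in use; try the next one
    raise RuntimeError(f"No free port in range {start}-{start + attempts - 1}")

print(find_available_port())  # e.g. 8000 if it is free
```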

Or customize with command-line arguments:

python server.py --host 127.0.0.1 --port 8080 --provider ollama --embedding-provider ollama --data-dir ./custom_memory_data

Additional options:

# Disable automatic port finding
python server.py --no-auto-port

# Enable auto-reload for development
python server.py --reload

Method 2: Using the Run Server Script

For a more convenient experience, use the included shell script:

./run_server.sh

This script supports the same options as the Python script:

# Customize host and port
./run_server.sh --host 127.0.0.1 --port 8080

# Specify providers
./run_server.sh --provider openai --embedding ollama

# Set a custom data directory
./run_server.sh --data-dir ~/mem0_data

Method 3: Using the Launcher for MCP-Compatible Applications

For integrating with command-line applications like Cursor that use MCP, use the launcher script:

python start_mem0_server.py

This script is specifically designed for MCP integration and outputs the server information in the format expected by MCP clients. It supports the same options as the other methods:

# Customize settings
python start_mem0_server.py --host 127.0.0.1 --port 8080 --provider ollama

# Run in quiet mode (reduced output)
python start_mem0_server.py --quiet

Integration Methods

Method 1: MCP Server (for MCP-Compatible Applications)

The MCP server provides a RESTful API that can be used by any application that supports the Model Context Protocol (MCP).

API Endpoints

Endpoint              Method  Description
/configure            POST    Configure the memory provider
/memory/add           POST    Add a memory
/memory/search        POST    Search for memories
/memory/chat          POST    Chat with the AI using memories
/memory/{memory_id}   GET     Get a memory by ID
/memory/{memory_id}   DELETE  Delete a memory by ID
/memories             DELETE  Clear all memories
/health               GET     Health check
/providers            GET     List available providers
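Because the API is plain HTTP, any language can call these endpoints. The sketch below builds a request for POST /memory/add using only the standard library; the payload field names mirror the Python client shown next and are assumptions about the exact wire format:

```python
import json
from urllib import request

BASE_URL = "http://localhost:8000"

def build_add_memory_request(content, user_id):
    """Build the URL and JSON body for POST /memory/add."""
    return f"{BASE_URL}/memory/add", {"content": content, "user_id": user_id}

def post_json(url, payload):
    """POST a JSON payload and decode the JSON response."""
    req = request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())

url, body = build_add_memory_request("This is an important fact to remember", "user123")
print(url)  # http://localhost:8000/memory/add
# post_json(url, body)  # requires a running mem0 server
```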

Using the Client

from client import Mem0Client

# Initialize the client
client = Mem0Client(base_url="http://localhost:8000")

# Configure the memory provider
client.configure(
    provider="openai",
    embedding_provider="openai",
    data_dir="./memory_data"
)

# Add a memory
memory_id = client.add_memory(
    content="This is an important fact to remember",
    user_id="user123"
)

# Search for memories
results = client.search_memories(
    query="important fact",
    user_id="user123"
)

# Chat with context from memories
response = client.chat(
    message="What important information do you have?",
    user_id="user123"
)

print(response)

Integration with MCP-Compatible Applications

To integrate with applications that support MCP servers:

  1. Start the mem0 MCP server using the launcher script:

    python start_mem0_server.py

  2. The script will output MCP server information in the format expected by MCP clients:

    {"name": "mem0", "capabilities": ["memory"], "url": "http://0.0.0.0:8000", "version": "1.0.0"}

  3. Configure your application to use the MCP server URL

  4. Use the memory capabilities through your application's interface

For detailed instructions, see the MCP Integration Guide.

Method 2: Direct Library Integration (for Any Application)

You can directly integrate the memory system into your own applications without running a separate server:

from mem0.memory import MemoryManager
from mem0.providers import OpenAIProvider, OllamaEmbeddingProvider

# Initialize the memory manager
memory_manager = MemoryManager(
    provider=OpenAIProvider(api_key="your_openai_api_key"),
    embedding_provider=OllamaEmbeddingProvider(),
    data_dir="./memory_data"
)

# Add a memory
memory_id = memory_manager.add_memory(
    content="This is an important fact to remember",
    user_id="user123"
)

# Search for memories
results = memory_manager.search_memories(
    query="important fact",
    user_id="user123"
)

# Get memory by ID
memory = memory_manager.get_memory(memory_id)

# Delete a memory
memory_manager.delete_memory(memory_id)

# Clear all memories for a user
memory_manager.clear_memories(user_id="user123")

For detailed instructions, see the Direct Integration Guide.

Autonomous Memory System

The mem0 memory system includes an autonomous memory feature that can:

  1. Automatically extract important information from user interactions
  2. Store memories without explicit commands
  3. Retrieve relevant memories when needed for context
  4. Enhance the AI's responses by injecting memories into the context

This creates a seamless experience where the AI naturally remembers details about the user without requiring explicit memory commands.
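Conceptually, the loop looks something like this simplified sketch. The real system uses an LLM for extraction and vector search for retrieval; the keyword matching here is only a stand-in:

```python
class ToyAutonomousMemory:
    """Simplified stand-in for the extract -> store -> retrieve -> inject loop."""

    def __init__(self):
        self.memories = []

    def extract_and_store(self, user_message):
        # Stand-in for LLM-based extraction: keep statements about the user.
        if user_message.lower().startswith("my "):
            self.memories.append(user_message)

    def retrieve(self, query):
        # Stand-in for vector search: naive word overlap.
        words = set(query.lower().split())
        return [m for m in self.memories if words & set(m.lower().split())]

    def build_prompt(self, query):
        # Inject retrieved memories into the model's context.
        context = "\n".join(self.retrieve(query))
        return f"Known about the user:\n{context}\n\nUser: {query}"

mem = ToyAutonomousMemory()
mem.extract_and_store("My favorite color is blue")
print(mem.retrieve("favorite color"))  # ['My favorite color is blue']
```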

Try the Autonomous Memory Example

The repository includes an example of the autonomous memory system:

python examples/autonomous_memory.py

This example demonstrates:

  • Automatic extraction of personal information
  • Contextual retrieval of memories
  • Natural incorporation of memories into responses

Environment Configuration

The mem0 memory system can be configured using environment variables or a .env file:

# LLM Provider Configuration
MEM0_PROVIDER=openai
OPENAI_API_KEY=your_openai_api_key

# Embedding Provider Configuration
MEM0_EMBEDDING_PROVIDER=openai

# Storage Configuration
MEM0_DATA_DIR=~/mem0_memories

# Server Configuration
MEM0_HOST=0.0.0.0
MEM0_PORT=8000

For a complete list of configuration options, see the Environment Configuration Guide.
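For illustration, configuration resolution with defaults might look like the sketch below. The defaults are taken from the values above; the real loading logic lives in the server code:

```python
def load_config(env):
    """Resolve mem0 settings from an environment mapping, with defaults."""
    return {
        "provider": env.get("MEM0_PROVIDER", "openai"),
        "embedding_provider": env.get("MEM0_EMBEDDING_PROVIDER", "openai"),
        "data_dir": env.get("MEM0_DATA_DIR", "~/mem0_memories"),
        "host": env.get("MEM0_HOST", "0.0.0.0"),
        "port": int(env.get("MEM0_PORT", "8000")),
    }

cfg = load_config({"MEM0_PROVIDER": "ollama"})
print(cfg["provider"], cfg["port"])  # ollama 8000
```

In practice you would pass `os.environ` (or a parsed .env file) as the mapping.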

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Made with ❤️ by Pink Pixel | GitHub
