
Mem0 - The Memory Layer for Personalized AI


Learn more · Join Discord


Introduction

Mem0 (pronounced "mem-zero") enhances AI assistants and agents with an intelligent memory layer, enabling personalized AI interactions. Mem0 remembers user preferences, adapts to individual needs, and continuously improves over time, making it ideal for customer support chatbots, AI assistants, and autonomous systems.

Features & Use Cases

Core Capabilities:

  • Multi-Level Memory: User, Session, and AI Agent memory retention with adaptive personalization (see the scoping sketch at the end of this section)
  • Developer-Friendly: Simple API integration, cross-platform consistency, and hassle-free managed service

Applications:

  • AI Assistants: Seamless conversations with context and personalization
  • Learning & Support: Tailored content recommendations and context-aware customer assistance
  • Healthcare & Companions: Patient history tracking and deeper relationship building
  • Productivity & Gaming: Streamlined workflows and adaptive environments based on user behavior
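
As a rough sketch of how these memory levels map to the open-source Python API (the user_id, agent_id, and run_id parameters are assumed from the current package and may differ by version), memories can be scoped to a user, an agent, or a single session:

from mem0 import Memory

m = Memory()
messages = [{"role": "user", "content": "I prefer vegetarian recipes."}]

# User-level memory: persists across all of this user's conversations.
m.add(messages, user_id="alice")

# Agent-level memory: tied to a specific AI agent rather than a user.
m.add(messages, agent_id="cooking-assistant")

# Session-level memory: scoped to a single run or conversation.
m.add(messages, user_id="alice", run_id="session-001")

# Searches can be scoped the same way.
results = m.search("What food does the user like?", user_id="alice")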

Get Started

Get started quickly with the Mem0 Platform, our fully managed solution that provides automatic updates, advanced analytics, enterprise security, and dedicated support. Create a free account to begin.
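
With an API key from the dashboard, platform usage looks roughly like the sketch below (the MemoryClient interface is assumed from the current Python SDK and may differ slightly by version):

import os
from mem0 import MemoryClient

# Assumes a MEM0_API_KEY obtained from the Mem0 Platform dashboard.
client = MemoryClient(api_key=os.environ["MEM0_API_KEY"])

# Store a conversation turn, then retrieve related memories later.
client.add([{"role": "user", "content": "I'm allergic to peanuts."}], user_id="alice")
print(client.search("What should I avoid cooking?", user_id="alice"))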

For complete control, you can self-host Mem0 using our open-source package. See the Quickstart guide below to set up your own instance.

Quickstart Guide

Install the Mem0 package via pip:

pip install mem0ai

Basic Usage

Mem0 requires an LLM to function and uses OpenAI's gpt-4o by default. It also supports a variety of other LLMs; for details, refer to our Supported LLMs documentation. A configuration sketch follows the basic example below.

The first step is to instantiate the memory:

from openai import OpenAI
from mem0 import Memory

# Both the OpenAI client and Mem0's default configuration use the
# OPENAI_API_KEY environment variable, so set it before running.
openai_client = OpenAI()
mem0 = Memory()

def chat_with_memories(message: str, user_id: str = "default_user") -> str:
    # Retrieve relevant memories (depending on the mem0 version, search() may
    # return a list directly or wrap it as {"results": [...]}; adjust the
    # iteration below accordingly)
    relevant_memories = mem0.search(query=message, user_id=user_id, limit=3)
    memories_str = "\n".join(f"- {entry['memory']}" for entry in relevant_memories)
    
    # Generate Assistant response
    system_prompt = f"You are a helpful AI. Answer the question based on query and memories.\nUser Memories:\n{memories_str}"
    messages = [{"role": "system", "content": system_prompt}, {"role": "user", "content": message}]
    response = openai_client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    assistant_response = response.choices[0].message.content

    # Create new memories from the conversation
    messages.append({"role": "assistant", "content": assistant_response})
    mem0.add(messages, user_id=user_id)

    return assistant_response

def main():
    print("Chat with AI (type 'exit' to quit)")
    while True:
        user_input = input("You: ").strip()
        if user_input.lower() == 'exit':
            print("Goodbye!")
            break
        print(f"AI: {chat_with_memories(user_input)}")

if __name__ == "__main__":
    main()
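
The example above uses the default OpenAI-backed configuration. To switch models or providers, Memory can instead be built from a config dict; the sketch below assumes the config layout used by the Python package (exact provider names and options may vary by version, so check the Supported LLMs docs):

from mem0 import Memory

config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o-mini",  # illustrative choice; any supported model works
            "temperature": 0.1,
        },
    }
}

mem0 = Memory.from_config(config)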

For more advanced usage and API documentation, visit our documentation.

Tip

For a hassle-free experience, try our hosted platform with automatic updates and enterprise features.

Demos

  • AI Companion: Experience personalized conversations with an AI that remembers your preferences and past interactions (video: AI.companion.mp4)

  • Browser Extension: Store memories across ChatGPT, Perplexity, and Claude to enhance your AI interactions. Get the Chrome extension. (video: Chrome.extension.video.mp4)

  • Customer Support Bot: A customer support bot built with LangGraph and Mem0. Get the complete code from here. (video: Customer.support.mp4)

  • CrewAI Integration: Use Mem0 with CrewAI to get personalized results. Full example here. (video: crewai_demo.mp4)

Documentation

For detailed usage instructions and API reference, visit our documentation. You'll find:

  • Complete API reference
  • Integration guides
  • Advanced configuration options
  • Best practices and examples

Support

Join our community for support and discussions. If you have any questions, feel free to reach out to us on Discord.

License

This project is licensed under the Apache 2.0 License - see the LICENSE file for details.