# MemoryOS

**Repository Path**: lz98/MemoryOS

## Basic Information

- **Project Name**: MemoryOS
- **Description**: No description available
- **Primary Language**: Unknown
- **License**: Apache-2.0
- **Default Branch**: main
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2025-07-26
- **Last Updated**: 2025-07-26

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# MemoryOS


🎉 If you like our project, please give us a star ⭐ on GitHub to stay up to date with the latest releases.
**MemoryOS** is designed to provide a memory operating system for personalized AI agents, enabling more coherent, personalized, and context-aware interactions. Drawing inspiration from memory management principles in operating systems, it adopts a hierarchical storage architecture with four core modules (Storage, Updating, Retrieval, and Generation) to achieve comprehensive and efficient memory management. On the LoCoMo benchmark, MemoryOS achieves average improvements of **49.11%** in F1 and **46.18%** in BLEU-1 scores.

- **Paper**: https://arxiv.org/abs/2506.06326
- **Website**: https://baijia.online/memoryos/
- **Documentation**: https://bai-lab.github.io/MemoryOS/docs
- **YouTube Video**: *MemoryOS MCP + RAG Agent That Can Remember Anything* - https://www.youtube.com/watch?v=WHQu8fpEOaU

## ✨ Key Features

* 🏆 **Top Performance in Memory Management**
  State-of-the-art results on long-term memory benchmarks, boosting F1 by 49.11% and BLEU-1 by 46.18% on LoCoMo.
* 🧠 **Plug-and-Play Memory Management Architecture**
  Seamless integration of pluggable memory modules, including storage engines, update strategies, and retrieval algorithms.
* ✨ **Easy Agent Workflow Creation (MemoryOS-MCP)**
  Inject long-term memory capabilities into AI applications by calling the modular tools provided by the MCP server.
* 🌐 **Universal LLM Support**
  MemoryOS integrates seamlessly with a wide range of LLMs (e.g., OpenAI, DeepSeek, Qwen).

## 📣 Latest News

* *[new]* 🔥🔥🔥 **[2025-07-15]**: 🔌 Support for the vector database [ChromaDB](#memoryos_chromadb-getting-started).
* *[new]* 🔥🔥🔥 **[2025-07-15]**: 🔌 [Docker](#docker-getting-started) integrated into deployment.
* *[new]* 🔥🔥 **[2025-07-14]**: ⚡ Accelerated MCP parallelization.
* *[new]* 🔥🔥 **[2025-07-14]**: 🔌 Support for BGE-M3 & Qwen3 embeddings on PyPI and MCP.
* *[new]* 🔥 **[2025-07-09]**: 📊 Evaluation of MemoryOS on the LoCoMo dataset is publicly available: [👉 Reproduce](#reproduce).
* *[new]* 🔥 **[2025-07-08]**: 🏆 New config parameter **similarity_threshold**; for the configuration file, see the 📖 [Documentation](https://bai-lab.github.io/MemoryOS/docs) page.
* *[new]* **[2025-07-07]**: 🚀 The MemoryOS (PyPI) implementation is now **5x faster** (reduced latency) through parallelization optimizations.
* *[new]* **[2025-07-07]**: ✨ MemoryOS now supports configuring and using reasoning models such as **DeepSeek-R1 and Qwen3**.
* *[new]* **[2025-07-07]**: ✨ The Playground of the **MemoryOS Platform** has launched! [👉 MemoryOS Platform](https://baijia.online/memoryos/). If you need an invitation code, feel free to [contact us](#community).
* *[new]* **[2025-06-15]**: 🛠️ Open-sourced **MemoryOS-MCP** released! Now configurable on agent clients for seamless integration and customization. [👉 MemoryOS-MCP](#memoryos-mcp-getting-started)
* **[2025-05-30]**: 📄 The paper **Memory OS of AI Agent** is available on arXiv: https://arxiv.org/abs/2506.06326.
* **[2025-05-30]**: Initial version of **MemoryOS** launched! Featuring short-term, mid-term, and long-term persona memory with automated user profile and knowledge updating.
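To make the short-term, mid-term, and long-term hierarchy above concrete, here is a toy sketch of capacity-driven promotion between layers. The class and method names are hypothetical illustrations only, not the MemoryOS API; in MemoryOS itself, promotion is handled by the `updater` module and governed by parameters such as `mid_term_heat_threshold` rather than a simple capacity rule.

```python
from collections import deque


class TinyMemoryHierarchy:
    """Toy illustration of short-term -> mid-term promotion (NOT the MemoryOS API)."""

    def __init__(self, short_capacity: int = 3):
        self.short_term = deque(maxlen=short_capacity)  # most recent dialogue turns
        self.mid_term = []  # turns consolidated out of short-term memory

    def add_turn(self, user: str, assistant: str) -> None:
        if len(self.short_term) == self.short_term.maxlen:
            # The oldest turn is about to be evicted from short-term memory,
            # so consolidate it into the mid-term layer first.
            self.mid_term.append(self.short_term[0])
        self.short_term.append((user, assistant))
```

With `short_capacity=3`, adding a fourth turn pushes the first turn into `mid_term` while the short-term window keeps the three most recent turns.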
## 🔥 MemoryOS Support List

| Type | Name | Open Source | Support | Configuration | Description |
|------|------|-------------|---------|---------------|-------------|
| Agent Client | Claude Desktop | ❌ | ✅ | `claude_desktop_config.json` | Anthropic official client |
| Agent Client | Cline | ✅ | ✅ | VS Code settings | VS Code extension |
| Agent Client | Cursor | ❌ | ✅ | Settings panel | AI code editor |
| Model Provider | OpenAI | ❌ | ✅ | `OPENAI_API_KEY` | GPT-4, GPT-3.5, etc. |
| Model Provider | Anthropic | ❌ | ✅ | `ANTHROPIC_API_KEY` | Claude series |
| Model Provider | DeepSeek-R1 | ✅ | ✅ | `DEEPSEEK_API_KEY` | Chinese large model |
| Model Provider | Qwen/Qwen3 | ✅ | ✅ | `QWEN_API_KEY` | Alibaba Qwen |
| Model Provider | vLLM | ✅ | ✅ | Local deployment | Local model inference |
| Model Provider | LLaMA-Factory | ✅ | ✅ | Local deployment | Local fine-tuning deployment |
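Because every provider above is reached through the OpenAI-compatible API, switching providers mostly comes down to supplying a different API key and base URL. A minimal sketch of that idea (the `PROVIDERS` table and its endpoint URLs are illustrative assumptions, not part of MemoryOS; check each provider's documentation for its current OpenAI-compatible endpoint):

```python
import os

# Illustrative provider table. The env-var names follow the support list;
# the base URLs are examples and may differ from your deployment.
PROVIDERS = {
    "openai":   {"env_key": "OPENAI_API_KEY",   "base_url": "https://api.openai.com/v1"},
    "deepseek": {"env_key": "DEEPSEEK_API_KEY", "base_url": "https://api.deepseek.com/v1"},
    "qwen":     {"env_key": "QWEN_API_KEY",     "base_url": "https://dashscope.aliyuncs.com/compatible-mode/v1"},
    "vllm":     {"env_key": "VLLM_API_KEY",     "base_url": "http://localhost:8000/v1"},
}


def client_settings(provider: str) -> dict:
    """Return the api_key / base_url pair you would hand to an OpenAI-style client."""
    cfg = PROVIDERS[provider]
    return {
        "api_key": os.environ.get(cfg["env_key"], ""),
        "base_url": cfg["base_url"],
    }
```

These two values map directly onto the `openai_api_key` and `openai_base_url` settings used in the examples below.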
All model calls use the OpenAI API interface; you need to supply the API key and base URL.

## 📑 Table of Contents

* ✨ Features
* 🔥 News
* 🔍 Support Lists
* 📁 Project Structure
* 🎯 Quick Start
  * PyPI Install MemoryOS
  * MemoryOS-MCP
  * MemoryOS-ChromaDB
  * Docker
* ☑️ Todo List
* 🔬 How to Reproduce the Results in the Paper
* 📖 Documentation
* 🌟 Cite
* 🤝 Join the Community

## 🏗️ System Architecture

*(system architecture diagram)*

## 🏗️ Project Structure

```
memoryos/
├── __init__.py    # Initializes the MemoryOS package
├── __pycache__/   # Python cache directory (auto-generated)
├── long_term.py   # Manages long-term persona memory (user profile, knowledge)
├── memoryos.py    # Main class for MemoryOS, orchestrating all components
├── mid_term.py    # Manages mid-term memory, consolidating short-term interactions
├── prompts.py     # Prompts used for LLM interactions (e.g., summarization, analysis)
├── retriever.py   # Retrieves relevant information from all memory layers
├── short_term.py  # Manages short-term memory for recent interactions
├── updater.py     # Processes memory updates, including promoting information between layers
└── utils.py       # Utility functions used across the library
```

## 📖 MemoryOS_PyPi Getting Started

### Prerequisites

* Python >= 3.10

```bash
conda create -n MemoryOS python=3.10
conda activate MemoryOS
```

### Installation

#### Download from PyPI

```bash
pip install memoryos-pro -i https://pypi.org/simple
```

#### Download from GitHub (latest version)

```bash
git clone https://github.com/BAI-LAB/MemoryOS.git
cd MemoryOS/memoryos-pypi
pip install -r requirements.txt
```

### Basic Usage

```python
import os
from memoryos import Memoryos

# --- Basic Configuration ---
USER_ID = "demo_user"
ASSISTANT_ID = "demo_assistant"
API_KEY = "YOUR_OPENAI_API_KEY"  # Replace with your key
BASE_URL = ""  # Optional: if using a custom OpenAI endpoint
DATA_STORAGE_PATH = "./simple_demo_data"
LLM_MODEL = "gpt-4o-mini"


def simple_demo():
    print("MemoryOS Simple Demo")

    # 1. Initialize MemoryOS
    print("Initializing MemoryOS...")
    try:
        memo = Memoryos(
            user_id=USER_ID,
            openai_api_key=API_KEY,
            openai_base_url=BASE_URL,
            data_storage_path=DATA_STORAGE_PATH,
            llm_model=LLM_MODEL,
            assistant_id=ASSISTANT_ID,
            short_term_capacity=7,
            mid_term_heat_threshold=5,
            retrieval_queue_capacity=7,
            long_term_knowledge_capacity=100,
            # Supports Qwen/Qwen3-Embedding-0.6B, BAAI/bge-m3, all-MiniLM-L6-v2
            embedding_model_name="BAAI/bge-m3"
        )
        print("MemoryOS initialized successfully!\n")
    except Exception as e:
        print(f"Error: {e}")
        return

    # 2. Add some basic memories
    print("Adding some memories...")
    memo.add_memory(
        user_input="Hi! I'm Tom, I work as a data scientist in San Francisco.",
        agent_response="Hello Tom! Nice to meet you. Data science is such an exciting field. What kind of data do you work with?"
    )

    test_query = "What do you remember about my job?"
    print(f"User: {test_query}")
    response = memo.get_response(query=test_query)
    print(f"Assistant: {response}")


if __name__ == "__main__":
    simple_demo()
```

## 📖 MemoryOS-MCP Getting Started

### 🔧 Core Tools

#### 1. `add_memory`

Saves the content of a conversation between the user and the AI assistant into the memory system, building a persistent dialogue history and contextual record.

#### 2. `retrieve_memory`

Retrieves related historical dialogues, user preferences, and knowledge from the memory system based on a query, helping the AI assistant understand the user's needs and background.

#### 3. `get_user_profile`

Obtains a user profile generated from the analysis of historical dialogues, including the user's personality traits, interest preferences, and relevant knowledge background.

### 1. Install dependencies

```bash
cd memoryos-mcp
pip install -r requirements.txt
```

### 2. Configuration

Edit `config.json`:

```json
{
  "user_id": "user ID",
  "openai_api_key": "OpenAI API key",
  "openai_base_url": "https://api.openai.com/v1",
  "data_storage_path": "./memoryos_data",
  "assistant_id": "assistant_id",
  "llm_model": "gpt-4o-mini",
  "embedding_model_name": "BAAI/bge-m3"
}
```

### 3. Start the server

```bash
python server_new.py --config config.json
```

### 4. Test

```bash
python test_comprehensive.py
```

### 5. Configure it on Cline and other clients

Copy the `mcp.json` file over, and make sure the file path in it is correct:

```json
"command": "/root/miniconda3/envs/memos/bin/python"
```

Change this path to the Python interpreter of your own virtual environment.

## 📖 MemoryOS_Chromadb Getting Started

### 1. Install dependencies

```bash
cd memoryos-chromadb
pip install -r requirements.txt
```

### 2. Test

Edit the configuration in `comprehensive_test.py`:

```python
memoryos = Memoryos(
    user_id='travel_user_test',
    openai_api_key='',
    openai_base_url='',
    data_storage_path='./comprehensive_test_data',
    assistant_id='travel_assistant',
    embedding_model_name='BAAI/bge-m3',
    mid_term_capacity=1000,
    mid_term_heat_threshold=13.0,
    mid_term_similarity_threshold=0.7,
    short_term_capacity=2
)
```

Then run the test (make sure to use a different data storage path when switching embedding models):

```bash
python3 comprehensive_test.py
```

## 📖 Docker Getting Started

You can run MemoryOS with Docker in two ways: by pulling the official image or by building your own image from the Dockerfile. Both methods are suitable for quick setup, testing, and production deployment.

### Option 1: Pull the Official Image

```bash
# Pull the latest official image
docker pull ghcr.io/bai-lab/memoryos:latest
docker run -it --gpus=all ghcr.io/bai-lab/memoryos /bin/bash
```

### Option 2: Build from Dockerfile

```bash
# Clone the repository
git clone https://github.com/BAI-LAB/MemoryOS.git
cd MemoryOS

# Build the Docker image (make sure the Dockerfile is present)
docker build -t memoryos .
docker run -it --gpus=all memoryos /bin/bash
```

## 🎯 Reproduce

Configure API keys and other settings in the code, then run:

```bash
cd eval
python3 main_loco_parse.py
python3 evalution_loco.py
```

## ☑️ Todo List

MemoryOS is continuously evolving! Here's what's coming:

- **Ongoing 🚀**: **Integrated Benchmarks**: a standardized benchmark suite with cross-model comparison against Mem0, Zep, and OpenAI
- 🏗️ Enabling seamless memory exchange and integration across diverse systems

Have ideas or suggestions? Contributions are welcome! Please feel free to submit issues or pull requests! 🚀

## 📖 Documentation

More detailed documentation is coming soon 🚀; we will publish it on the [Documentation](https://bai-lab.github.io/MemoryOS/docs) page.

## 📣 Citation

**If you find this project useful, please consider citing our paper:**

```bibtex
@misc{kang2025memoryosaiagent,
      title={Memory OS of AI Agent},
      author={Jiazheng Kang and Mingming Ji and Zhe Zhao and Ting Bai},
      year={2025},
      eprint={2506.06326},
      archivePrefix={arXiv},
      primaryClass={cs.AI},
      url={https://arxiv.org/abs/2506.06326},
}
```

## 🎯 Contact us

BaiJia AI is a research team guided by Associate Professor Bai Ting from Beijing University of Posts and Telecommunications, dedicated to creating emotionally rich and super-memory brains for AI agents.

🤝 Cooperation and suggestions: baiting@bupt.edu.cn

📣 Follow our **WeChat official account**, join the **WeChat group**, or join our Discord (https://discord.gg/SqVj7QvZ) to get the latest updates.
*(QR codes: BaiJia Agent WeChat official account and WeChat group)*
## 🌟 Star History

[![Star History Chart](https://api.star-history.com/svg?repos=BAI-LAB/MemoryOS&type=Timeline)](https://www.star-history.com/#BAI-LAB/MemoryOS&Timeline)