# MemoryOS
**Repository Path**: lz98/MemoryOS
## Basic Information
- **Project Name**: MemoryOS
- **Description**: No description available
- **Primary Language**: Unknown
- **License**: Apache-2.0
- **Default Branch**: main
- **Homepage**: None
- **GVP Project**: No
## Statistics
- **Stars**: 0
- **Forks**: 0
- **Created**: 2025-07-26
- **Last Updated**: 2025-07-26
## Categories & Tags
**Categories**: Uncategorized
**Tags**: None
## README
# MemoryOS
If you like our project, please give us a star ⭐ on GitHub for the latest updates.
**MemoryOS** is designed to provide a memory operating system for personalized AI agents, enabling more coherent, personalized, and context-aware interactions. Drawing inspiration from memory management principles in operating systems, it adopts a hierarchical storage architecture with four core modules: Storage, Updating, Retrieval, and Generation, to achieve comprehensive and efficient memory management. On the LoCoMo benchmark, MemoryOS achieved average improvements of **49.11%** in F1 and **46.18%** in BLEU-1 scores.
- **Paper**: https://arxiv.org/abs/2506.06326
- **Website**: https://baijia.online/memoryos/
- **Documentation**: https://bai-lab.github.io/MemoryOS/docs
- **YouTube Video**: [MemoryOS MCP + RAG Agent That Can Remember Anything](https://www.youtube.com/watch?v=WHQu8fpEOaU)
## Key Features
* **TOP Performance** in Memory Management
  SOTA results on long-term memory benchmarks, boosting F1 by 49.11% and BLEU-1 by 46.18% on LoCoMo.
* **Plug-and-Play** Memory Management Architecture
  Enables seamless integration of pluggable memory modules, including storage engines, update strategies, and retrieval algorithms.
* **Agent Workflows Created with Ease** (**MemoryOS-MCP**)
  Inject long-term memory capabilities into various AI applications by calling the modular tools provided by the MCP server.
* **Universal LLM Support**
  MemoryOS integrates seamlessly with a wide range of LLMs (e.g., OpenAI, Deepseek, Qwen, ...)
## Latest News
* *[new]* **[2025-07-15]**: **Support** for the vector database [ChromaDB](#memoryos_chromadb-getting-started)
* *[new]* **[2025-07-15]**: **Integrated** [Docker](#docker-getting-started) into deployment
* *[new]* **[2025-07-14]**: **Accelerated** MCP parallelization
* *[new]* **[2025-07-14]**: **Support** for BGE-M3 & Qwen3 embeddings on PyPI and MCP
* *[new]* **[2025-07-09]**: **Evaluation** of MemoryOS on the LoCoMo dataset is publicly available: [Reproduce](#reproduce)
* *[new]* **[2025-07-08]**: **New config parameter**
  * New parameter: **similarity_threshold**. For the configuration file, see the [Documentation](https://bai-lab.github.io/MemoryOS/docs) page.
* *[new]* **[2025-07-07]**: **5 times faster**
  * The MemoryOS (PyPI) implementation has been upgraded: **5 times faster** (reduced latency) through parallelization optimizations.
* *[new]* **[2025-07-07]**: **R1 models supported**
  * MemoryOS now supports configuring and using reasoning models such as **Deepseek-R1 and Qwen3**.
* *[new]* **[2025-07-07]**: **MemoryOS Playground launched**
  * The Playground of the **MemoryOS Platform** is live: [MemoryOS Platform](https://baijia.online/memoryos/). If you need an **invitation code**, please feel free to [contact us](#community).
* *[new]* **[2025-06-15]**: Open-sourced **MemoryOS-MCP** released! Now configurable on agent clients for seamless integration and customization: [MemoryOS-MCP](#memoryos-mcp-getting-started)
* **[2025-05-30]**: Paper **Memory OS of AI Agent** is available on arXiv: https://arxiv.org/abs/2506.06326
* **[2025-05-30]**: Initial version of **MemoryOS** launched! Featuring short-term, mid-term, and long-term persona memory with automated user-profile and knowledge updating.
## MemoryOS Support List

| Type | Name | Open Source | Support | Configuration | Description |
|------|------|-------------|---------|---------------|-------------|
| Agent Client | Claude Desktop | ❌ | ✅ | claude_desktop_config.json | Anthropic official client |
| Agent Client | Cline | ✅ | ✅ | VS Code settings | VS Code extension |
| Agent Client | Cursor | ❌ | ✅ | Settings panel | AI code editor |
| Model Provider | OpenAI | ❌ | ✅ | OPENAI_API_KEY | GPT-4, GPT-3.5, etc. |
| Model Provider | Anthropic | ❌ | ✅ | ANTHROPIC_API_KEY | Claude series |
| Model Provider | Deepseek-R1 | ✅ | ✅ | DEEPSEEK_API_KEY | Chinese large model |
| Model Provider | Qwen/Qwen3 | ✅ | ✅ | QWEN_API_KEY | Alibaba Qwen |
| Model Provider | vLLM | ✅ | ✅ | Local deployment | Local model inference |
| Model Provider | Llama_factory | ✅ | ✅ | Local deployment | Local fine-tuning deployment |
All model calls use the OpenAI API interface; you need to supply the API key and base URL.
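Since every provider is called through the OpenAI-compatible interface, switching providers amounts to swapping the API key and base URL. The sketch below illustrates that pattern; the base URLs and the `client_kwargs` helper are assumptions for illustration, not part of the MemoryOS API, so verify each endpoint against the provider's documentation.

```python
import os

# Assumed OpenAI-compatible endpoints; verify against each provider's docs.
PROVIDERS = {
    "openai":   {"env_key": "OPENAI_API_KEY",   "base_url": "https://api.openai.com/v1"},
    "deepseek": {"env_key": "DEEPSEEK_API_KEY", "base_url": "https://api.deepseek.com/v1"},
    "qwen":     {"env_key": "QWEN_API_KEY",     "base_url": "https://dashscope.aliyuncs.com/compatible-mode/v1"},
    "vllm":     {"env_key": "VLLM_API_KEY",     "base_url": "http://localhost:8000/v1"},
}

def client_kwargs(provider: str) -> dict:
    """Build the key/base-URL pair to pass into Memoryos for a given provider."""
    cfg = PROVIDERS[provider]
    return {
        "openai_api_key": os.environ.get(cfg["env_key"], ""),
        "openai_base_url": cfg["base_url"],
    }

print(client_kwargs("deepseek")["openai_base_url"])
```

The same `Memoryos(...)` initialization then works unchanged across providers; only the two values above differ.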
## Table of Contents
* Features
* News
* Support List
* Project Structure
* Quick Start
  * PyPI: Install MemoryOS
  * MemoryOS-MCP
  * MemoryOS-ChromaDB
  * Docker
* Todo List
* How to Reproduce the Results in the Paper
* Documentation
* Citation
* Join the Community
## System Architecture
## Project Structure
```
memoryos/
├── __init__.py    # Initializes the MemoryOS package
├── __pycache__/   # Python cache directory (auto-generated)
├── long_term.py   # Manages long-term persona memory (user profile, knowledge)
├── memoryos.py    # Main class for MemoryOS, orchestrating all components
├── mid_term.py    # Manages mid-term memory, consolidating short-term interactions
├── prompts.py     # Prompts used for LLM interactions (e.g., summarization, analysis)
├── retriever.py   # Retrieves relevant information from all memory layers
├── short_term.py  # Manages short-term memory for recent interactions
├── updater.py     # Processes memory updates, including promoting information between layers
└── utils.py       # Utility functions used across the library
```
## MemoryOS_PyPi Getting Started
### Prerequisites
* Python >= 3.10
* `conda create -n MemoryOS python=3.10`
* `conda activate MemoryOS`
### Installation
#### Download from PyPi
```bash
pip install memoryos-pro -i https://pypi.org/simple
```
#### Download from GitHub (latest version)
```bash
git clone https://github.com/BAI-LAB/MemoryOS.git
cd MemoryOS/memoryos-pypi
pip install -r requirements.txt
```
### Basic Usage
```python
from memoryos import Memoryos

# --- Basic Configuration ---
USER_ID = "demo_user"
ASSISTANT_ID = "demo_assistant"
API_KEY = "YOUR_OPENAI_API_KEY"  # Replace with your key
BASE_URL = ""  # Optional: if using a custom OpenAI endpoint
DATA_STORAGE_PATH = "./simple_demo_data"
LLM_MODEL = "gpt-4o-mini"

def simple_demo():
    print("MemoryOS Simple Demo")

    # 1. Initialize MemoryOS
    print("Initializing MemoryOS...")
    try:
        memo = Memoryos(
            user_id=USER_ID,
            openai_api_key=API_KEY,
            openai_base_url=BASE_URL,
            data_storage_path=DATA_STORAGE_PATH,
            llm_model=LLM_MODEL,
            assistant_id=ASSISTANT_ID,
            short_term_capacity=7,
            mid_term_heat_threshold=5,
            retrieval_queue_capacity=7,
            long_term_knowledge_capacity=100,
            # Supported embedding models: Qwen/Qwen3-Embedding-0.6B, BAAI/bge-m3, all-MiniLM-L6-v2
            embedding_model_name="BAAI/bge-m3"
        )
        print("MemoryOS initialized successfully!\n")
    except Exception as e:
        print(f"Error: {e}")
        return

    # 2. Add some basic memories
    print("Adding some memories...")
    memo.add_memory(
        user_input="Hi! I'm Tom, I work as a data scientist in San Francisco.",
        agent_response="Hello Tom! Nice to meet you. Data science is such an exciting field. What kind of data do you work with?"
    )

    # 3. Query the memory
    test_query = "What do you remember about my job?"
    print(f"User: {test_query}")
    response = memo.get_response(query=test_query)
    print(f"Assistant: {response}")

if __name__ == "__main__":
    simple_demo()
```
## MemoryOS-MCP Getting Started
### Core Tools
#### 1. `add_memory`
Saves the conversation between the user and the AI assistant into the memory system, building a persistent dialogue history and contextual record.
#### 2. `retrieve_memory`
Retrieves relevant historical dialogues, user preferences, and knowledge from the memory system based on a query, helping the AI assistant understand the user's needs and background.
#### 3. `get_user_profile`
Returns a user profile generated from analysis of historical dialogues, including the user's personality traits, interests, and relevant knowledge background.
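The three tools share a simple contract: write a turn, query by relevance, read the derived profile. The toy in-memory stand-in below illustrates the expected request/response shapes only; it is not the real MCP server, and `ToyMemoryStore` with its keyword matching is a deliberate simplification (the real system uses embedding-based retrieval).

```python
class ToyMemoryStore:
    """Illustrative stand-in for the MemoryOS-MCP tools; not the real server."""

    def __init__(self):
        self.dialogues = []  # persistent dialogue history
        self.profile = {"traits": [], "interests": []}

    def add_memory(self, user_input: str, agent_response: str) -> dict:
        """Store one user/assistant turn, mirroring the add_memory tool."""
        self.dialogues.append({"user": user_input, "assistant": agent_response})
        return {"status": "ok", "stored": len(self.dialogues)}

    def retrieve_memory(self, query: str) -> list:
        """Return past turns related to the query (naive keyword overlap)."""
        words = set(query.lower().split())
        return [d for d in self.dialogues
                if words & set(d["user"].lower().split())]

    def get_user_profile(self) -> dict:
        """Return the derived user profile, mirroring get_user_profile."""
        return self.profile

store = ToyMemoryStore()
store.add_memory("I love hiking in the Alps.", "Great! Noted your interest in hiking.")
print(store.retrieve_memory("hiking trips"))
```

An agent client would invoke the same three operations through the MCP server instead of a local object.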
### 1. Install dependencies
```bash
cd memoryos-mcp
pip install -r requirements.txt
```
### 2. Configuration
Edit `config.json`:
```json
{
  "user_id": "user ID",
  "openai_api_key": "OpenAI API key",
  "openai_base_url": "https://api.openai.com/v1",
  "data_storage_path": "./memoryos_data",
  "assistant_id": "assistant_id",
  "llm_model": "gpt-4o-mini",
  "embedding_model_name": "BAAI/bge-m3"
}
```
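Before starting the server, it can help to sanity-check the config file. This is a small sketch, assuming only the keys shown above are required; `validate_config` is a hypothetical helper, not part of MemoryOS.

```python
import json

# Keys taken from the sample config.json above (assumed to be the required set).
REQUIRED_KEYS = {
    "user_id", "openai_api_key", "openai_base_url",
    "data_storage_path", "assistant_id", "llm_model",
    "embedding_model_name",
}

def validate_config(raw: str) -> list:
    """Return the sorted list of missing keys; empty means the config looks complete."""
    cfg = json.loads(raw)  # raises ValueError on malformed JSON (e.g. a missing comma)
    return sorted(REQUIRED_KEYS - cfg.keys())

sample = '{"user_id": "u1", "llm_model": "gpt-4o-mini"}'
print(validate_config(sample))  # five keys are still missing
```

Running this against your real `config.json` catches typos before `server_new.py` does.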
### 3. Start the server
```bash
python server_new.py --config config.json
```
### 4. Test
```bash
python test_comprehensive.py
```
### 5. Configure it on Cline and other clients
Copy the `mcp.json` file over, and make sure the file path is correct:
```json
"command": "/root/miniconda3/envs/memos/bin/python"
```
Change this to the Python interpreter of your virtual environment.
## MemoryOS_Chromadb Getting Started
### 1. Install dependencies
```bash
cd memoryos-chromadb
pip install -r requirements.txt
```
### 2. Test
Edit the configuration in `comprehensive_test.py`:
```python
memoryos = Memoryos(
    user_id='travel_user_test',
    openai_api_key='',
    openai_base_url='',
    data_storage_path='./comprehensive_test_data',
    assistant_id='travel_assistant',
    embedding_model_name='BAAI/bge-m3',
    mid_term_capacity=1000,
    mid_term_heat_threshold=13.0,
    mid_term_similarity_threshold=0.7,
    short_term_capacity=2
)
```
Then run it:
```bash
python3 comprehensive_test.py
```
Note: use a different data storage path when switching embedding models.
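One way to honor the note about using a different data storage path per embedding model is to derive the path from the model name, so each model automatically gets its own directory. `data_path_for` is a hypothetical helper for illustration, not part of the MemoryOS API.

```python
import re

def data_path_for(base_dir: str, embedding_model_name: str) -> str:
    """Derive a per-model storage directory by sanitizing the model name."""
    # Replace path separators and other unsafe characters with '__'.
    safe = re.sub(r"[^A-Za-z0-9._-]", "__", embedding_model_name)
    return f"{base_dir.rstrip('/')}/{safe}"

print(data_path_for("./comprehensive_test_data", "BAAI/bge-m3"))
# → ./comprehensive_test_data/BAAI__bge-m3
```

Passing the derived path as `data_storage_path` keeps indexes built with different embedding models from mixing.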
## Docker Getting Started
You can run MemoryOS using Docker in two ways: by pulling the official image or by building your own image from the Dockerfile. Both methods are suitable for quick setup, testing, and production deployment.
### Option 1: Pull the Official Image
```bash
# Pull the latest official image
docker pull ghcr.io/bai-lab/memoryos:latest
docker run -it --gpus=all ghcr.io/bai-lab/memoryos /bin/bash
```
### Option 2: Build from Dockerfile
```bash
# Clone the repository
git clone https://github.com/BAI-LAB/MemoryOS.git
cd MemoryOS
# Build the Docker image (make sure Dockerfile is present)
docker build -t memoryos .
docker run -it --gpus=all memoryos /bin/bash
```
## Reproduce
```bash
cd eval
# Configure API keys and other settings in the code first
python3 main_loco_parse.py
python3 evalution_loco.py
```
## Todo List
MemoryOS is continuously evolving! Here's what's coming:
- **Ongoing**: **Integrated benchmarks**: a standardized benchmark suite with cross-model comparisons against Mem0, Zep, and OpenAI
- Enabling seamless memory exchange and integration across diverse systems.

Have ideas or suggestions? Contributions are welcome! Please feel free to submit issues or pull requests!
## Documentation
More detailed documentation is coming soon, and we will keep the [Documentation](https://bai-lab.github.io/MemoryOS/docs) page updated.
## Citation
**If you find this project useful, please consider citing our paper:**
```bibtex
@misc{kang2025memoryosaiagent,
title={Memory OS of AI Agent},
author={Jiazheng Kang and Mingming Ji and Zhe Zhao and Ting Bai},
year={2025},
eprint={2506.06326},
archivePrefix={arXiv},
primaryClass={cs.AI},
url={https://arxiv.org/abs/2506.06326},
}
```
## Contact Us
BaiJia AI is a research team led by Associate Professor Bai Ting at Beijing University of Posts and Telecommunications, dedicated to creating emotionally rich, super-memory brains for AI agents.
Cooperation and suggestions: baiting@bupt.edu.cn
Follow our **WeChat official account**, join the **WeChat group**, or join our Discord (https://discord.gg/SqVj7QvZ) to get the latest updates.
## Star History
[](https://www.star-history.com/#BAI-LAB/MemoryOS&Timeline)