# DeepSeek-Unity

**Repository Path**: titd/DeepSeek-Unity

## Basic Information

- **Project Name**: DeepSeek-Unity
- **Description**: No description available
- **Primary Language**: Unknown
- **License**: MIT
- **Default Branch**: main
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2026-02-04
- **Last Updated**: 2026-02-04

## Categories & Tags

**Categories**: Uncategorized

**Tags**: None

## README
# DeepSeek API for Unity

> A clean, modular Unity integration for DeepSeek's powerful LLMs: chat, reasoning, and task automation made easy.

> ⚠️ **Note**: This is an unofficial integration, not affiliated with or endorsed by DeepSeek.

---

## Features

- Clean, reusable SDK for the DeepSeek API
- Supports true SSE-based streaming and non-streaming chat completions
- Robust error handling and automatic resource management
- Compatible with multiple models (DeepSeek Chat, Reasoner)
- Modular, customizable UI chat component
- Secure API key storage (runtime-safe)
- Built with Unity Package Manager (UPM)
- Includes sample scene & prefabs
- **NEW**: Advanced memory management with automatic cleanup

---

### Supported Platforms & Unity Versions

| Platform | Unity 2020.3 | Unity 2021 | Unity 2022 | Unity 6 | Notes |
| --- | --- | --- | --- | --- | --- |
| **Windows** | ✅ | ✅ | ✅ | ✅ | Fully supported (tested with IL2CPP & Mono) |
| **Android** | ✅ | ✅ | ✅ | ✅ | Requires internet permission in manifest |
| **WebGL** | ⚠️ *Partial* | ⚠️ *Partial* | ✅ | ✅ | Streaming unsupported; add CORS headers on server |
| **Linux** | ❓ | ❓ | ❓ | ❓ | Likely works, but not yet tested |
| **macOS** | ❓ | ❓ | ❓ | ❓ | Not tested, expected to work |
| **iOS** | ❓ | ❓ | ❓ | ❓ | Not tested, expected to work (HTTPS required) |
| **Consoles** | ❌ | ❌ | ❌ | ❌ | Not supported (Unity license + network limitations) |

> ❓ = Not tested yet; expected to work but needs verification
>
> ⚠️ = Partial support (some limitations)

---

## Requirements

- Unity 2020.3 LTS or newer
- TextMeshPro (via Package Manager)
- DeepSeek API key from [platform.deepseek.com](https://platform.deepseek.com/)

---

## Installation

### Option 1: Via Git URL (Unity Package Manager)

1. Open your Unity project
2. Go to **Window > Package Manager**
3. Click `+` → **Add package from Git URL**
4. Paste:

   ```
   https://github.com/yagizeraslan/DeepSeek-Unity.git
   ```

5. Done

---

## Getting Started

### Setup

1. After installation, import the sample scene from the Package Manager
2. Paste your **API key** into `DeepSeekSettings.asset`
3. Hit Play and chat with DeepSeek AI in seconds

---

## Sample Scene

To test everything:

1. In **Package Manager**, under **DeepSeek API for Unity**, click **Samples**
2. Click **Import** on `DeepSeek Chat Example`
3. Open:

   ```
   Assets/Samples/DeepSeek API for Unity/1.0.1/DeepSeek Chat Example/Scenes/DeepSeek-Chat.unity
   ```

4. Press Play, and you're live.

- You can change the model type and streaming mode during Play mode; the SDK picks up changes automatically for each new message.
- You can also press **Enter** instead of clicking the Send button, which is handy for fast testing.

---

## API Key Handling

- During development: store the key via `EditorPrefs` using the DeepSeek Editor Window
- In production builds: use the `DeepSeekSettings` ScriptableObject (recommended)

**DO NOT** hardcode your key in scripts or prefabs; Unity will reject the package.
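As a rough illustration of the development-time approach, a minimal editor window like the one below could store the key in `EditorPrefs`. This is a sketch only: the menu path, window name, and `DeepSeek_ApiKey` preference name are assumptions, and the editor window actually shipped with the package may look different.

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

// Hypothetical dev-only window that keeps the API key in EditorPrefs
// (stored per machine, outside the project), so it never needs to be
// hardcoded in scripts or prefabs. Names here are assumptions, not the
// package's actual API.
public class DeepSeekApiKeyDevWindow : EditorWindow
{
    private const string PrefKey = "DeepSeek_ApiKey"; // assumed preference name
    private string apiKey;

    [MenuItem("Tools/DeepSeek/API Key (Dev)")]
    private static void Open() => GetWindow<DeepSeekApiKeyDevWindow>("DeepSeek API Key");

    private void OnEnable() => apiKey = EditorPrefs.GetString(PrefKey, string.Empty);

    private void OnGUI()
    {
        apiKey = EditorGUILayout.PasswordField("API Key", apiKey);
        if (GUILayout.Button("Save"))
            EditorPrefs.SetString(PrefKey, apiKey); // never serialized into the build
    }
}
#endif
```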
---

## Architecture Overview

| Layer | Folder | Role |
| --- | --- | --- |
| API Logic | `Runtime/API/` | HTTP & model logic |
| Data Models | `Runtime/Data/` | DTOs for requests/responses |
| UI Component | `Runtime/UI/` | MonoBehaviour & Controller |
| Config Logic | `Runtime/Common/` | Secure key storage |
| Editor Tools | `Editor/` | Editor-only settings UI |
| Example Scene | `Samples~/` | Demo prefab, scene, assets |

---

## Example Integration

### Non-Streaming (Full Response)

```csharp
[SerializeField] private DeepSeekSettings config;

private async void Start()
{
    var api = new DeepSeekApi(config);

    var request = new ChatCompletionRequest
    {
        model = DeepSeekModel.DeepSeek_V3.ToModelString(),
        messages = new ChatMessage[]
        {
            new ChatMessage { role = "system", content = "You're a helpful assistant." },
            new ChatMessage { role = "user", content = "Tell me something cool." }
        }
    };

    var response = await api.CreateChatCompletion(request);
    Debug.Log("[FULL RESPONSE] " + response.choices[0].message.content);
}
```

### Streaming (Real-Time Updates)

```csharp
[SerializeField] private DeepSeekSettings config;

private void Start()
{
    RunStreamingExample();
}

private void RunStreamingExample()
{
    var request = new ChatCompletionRequest
    {
        model = DeepSeekModel.DeepSeek_V3.ToModelString(),
        messages = new ChatMessage[]
        {
            new ChatMessage { role = "user", content = "Stream a fun fact about the ocean." }
        },
        stream = true
    };

    var streamingApi = new DeepSeekStreamingApi();
    streamingApi.CreateChatCompletionStream(
        request,
        config.apiKey,
        partial =>
        {
            Debug.Log("[STREAMING] " + partial); // Called for each streamed segment
        }
    );
}
```

---

## Advanced Usage

### Streaming Support

DeepSeek-Unity supports **reliable real-time streaming** using DeepSeek's official `stream: true` Server-Sent Events (SSE) endpoint.

- Uses Unity's `DownloadHandlerScript` for chunked response handling
- UI updates per token (no simulated typewriter effect)
- Automatic resource cleanup and memory management
- Built-in error handling with user-friendly messages
- Request timeout protection (60s default)
- No coroutines, no external libraries; works natively in Unity
- **NEW**: Smart memory limits prevent unbounded growth in long conversations

To enable:

- Check `Use Streaming` in the chat prefab or component
- Partial responses will automatically stream into the UI

You can toggle streaming on/off at runtime.
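For context on what the chunked handling looks like under the hood, the sketch below shows the general `DownloadHandlerScript` pattern for consuming an SSE stream. It is a simplified illustration of the technique, not the package's actual `DeepSeekStreamingApi` internals; the `onDelta` callback and buffer size are placeholders.

```csharp
using System;
using System.Text;
using UnityEngine.Networking;

// Simplified SSE sketch: chunks arrive in ReceiveData, complete "data:" lines
// are forwarded to a callback, and any partial line stays buffered until the
// next chunk. A real handler would also parse the JSON payload and handle
// timeouts, errors, and cleanup.
public class SseDownloadHandler : DownloadHandlerScript
{
    private readonly Action<string> onDelta;              // placeholder per-line callback
    private readonly StringBuilder buffer = new StringBuilder();

    public SseDownloadHandler(Action<string> onDelta) : base(new byte[4096])
    {
        this.onDelta = onDelta;
    }

    protected override bool ReceiveData(byte[] data, int dataLength)
    {
        if (data == null || dataLength == 0)
            return false; // nothing left to read

        buffer.Append(Encoding.UTF8.GetString(data, 0, dataLength));

        // SSE lines are newline-delimited; emit finished lines, keep the tail buffered.
        int newline;
        while ((newline = buffer.ToString().IndexOf('\n')) >= 0)
        {
            string line = buffer.ToString(0, newline).Trim();
            buffer.Remove(0, newline + 1);

            if (line.StartsWith("data:") && !line.EndsWith("[DONE]"))
                onDelta?.Invoke(line.Substring("data:".Length).Trim());
        }

        return true; // keep receiving
    }
}
```

A handler like this can be assigned to a `UnityWebRequest`'s `downloadHandler` before `SendWebRequest()`, which is what allows the UI to update per token without coroutines or third-party libraries.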
### Memory Management

DeepSeek-Unity now includes intelligent memory management to prevent memory bloat during long conversations:

**Chat History Limits:**
- Automatically caps conversation history at **50 messages** (configurable)
- Trims to **30 messages** when the limit is reached, preserving recent context
- Manual cleanup available via `controller.ClearHistory()`

**UI GameObject Management:**
- Limits message GameObjects to **100 instances** (configurable in the Inspector)
- Automatically removes the oldest UI elements when the limit is exceeded
- Prevents UI hierarchy bloat and maintains performance

**Controller Lifecycle:**
- A single controller instance is reused throughout the chat session
- Prevents memory leaks from abandoned controller instances
- Proper cleanup on component destruction

```csharp
// Access memory management features
public class CustomChat : MonoBehaviour
{
    private DeepSeekChatController controller;

    void SomeMethod()
    {
        // Check current history size
        Debug.Log($"History count: {controller.GetHistoryCount()}");

        // Manual cleanup if needed
        controller.ClearHistory();
    }
}
```

### Multiple Models

```csharp
DeepSeekModel.DeepSeek_V3
DeepSeekModel.DeepSeek_R1
```

---

## Troubleshooting

**Can't add the component?**
→ Make sure you dragged `DeepSeekSettings.asset` into the `Config` field of `DeepSeekChat.cs`.

**Streaming not working?**
→ Make sure you're on a platform that supports `DownloadHandlerScript` (Standalone or Editor).
→ WebGL and iOS may have platform limitations for live SSE streams.

**Getting error messages in the chat?**
→ Error messages now display directly in the chat interface for better debugging.
→ Check the Unity Console for detailed technical error information.

**Seeing JSON parse warnings in streaming mode?**
→ These are normal during SSE; they occur when the parser receives partial chunks. They're automatically skipped and won't affect the final output.

---

## Support This Project

If you find **DeepSeek-Unity** useful, please consider supporting its development!

- [Become a sponsor on GitHub Sponsors](https://github.com/sponsors/yagizeraslan)
- [Buy me a coffee on Ko-fi](https://ko-fi.com/yagizeraslan)

Your support helps me continue maintaining and improving this project. Thank you!

---

## License

Unofficial integration. DeepSeek™ is a trademark of Hangzhou DeepSeek Artificial Intelligence Co., Ltd.

This project is licensed under the MIT License.

---

## Contact & Support

**Author**: [Yağız ERASLAN](https://www.linkedin.com/in/yagizeraslan/)

- Email: yagizeraslan@gmail.com
- GitHub Issues welcome!