This is an experimental Rust-based implementation of the Meeting Minutes AI assistant, located in the /experiment
directory of the main project. It aims to provide better performance and native integration compared to the main implementation.
The main project structure is:
```
meeting-minutes/
├── backend/                        # Main Python backend
├── docs/                           # Project documentation
├── frontend/                       # Main Electron frontend
└── experiment/                     # Experimental implementations
    ├── rust_based_implementation/  # This implementation
    │   ├── src/                    # Next.js frontend
    │   ├── src-tauri/              # Rust backend
    │   └── whisper-server-package/ # Local transcription server
    ├── screenpipe/                 # Audio processing library
    └── simple_recorder.rs          # Basic audio implementation
```
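The Rust backend (Tauri, under src-tauri/) and the pnpm-managed Next.js frontend imply a standard toolchain: a recent Rust installation, Node.js, and pnpm. Exact version requirements are not pinned here, so treat the checks below as a quick sanity pass before building:

```bash
# Sanity-check the toolchain before building; install whatever is missing
rustc --version && cargo --version   # Rust toolchain for the Tauri backend
node --version                       # Node.js runtime for the Next.js frontend
pnpm --version                       # pnpm is used to install JS dependencies below
```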
1. Clone the main repository:
```bash
git clone <repository-url>
cd meeting-minutes
```
2. Clone screenpipe (required for audio capture):
```bash
cd experiment
git clone https://github.com/mediar-ai/screenpipe.git
cd rust_based_implementation
```
3. Install dependencies:
```bash
pnpm install
```
4. Run the app with the provided script:
```bash
./clean_build.sh
```
This script cleans previous build output, rebuilds the project, and launches the app.
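For orientation, the sketch below shows what a clean-build script of this kind typically does, assuming a pnpm + Tauri workflow; the cleaned paths and the `pnpm tauri dev` invocation are assumptions rather than the actual contents of clean_build.sh:

```bash
#!/usr/bin/env bash
set -euo pipefail

# Remove previous build artifacts (paths are assumptions; adjust to the project)
rm -rf .next node_modules/.cache
cargo clean --manifest-path src-tauri/Cargo.toml

# Reinstall JS dependencies and launch the app in development mode
pnpm install
pnpm tauri dev   # assumes @tauri-apps/cli is declared in package.json
```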
This implementation differs from the main project by swapping the Python backend for a Rust (Tauri) backend, replacing the Electron frontend with a Next.js frontend, and bundling a local whisper transcription server.
This is an experimental implementation that explores native performance through Rust, local transcription via the bundled whisper server, and audio capture built on screenpipe.
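If you want to start the local transcription server by hand, and assuming whisper-server-package wraps a whisper.cpp-style server binary (an assumption; the package's own README is authoritative), the invocation would look roughly like this:

```bash
# Hypothetical invocation: binary name, model file, and port are assumptions,
# not taken from the project; check whisper-server-package for the real entry point.
cd whisper-server-package
./whisper-server -m models/ggml-base.en.bin --host 127.0.0.1 --port 8080
```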
For the production implementation, please see the main project in the root directory.
To contribute, create a feature branch, commit your changes, and push the branch:
```bash
git checkout -b feature/amazing-feature
git commit -m 'Add some amazing feature'
git push origin feature/amazing-feature
```
This project is licensed under the MIT License - see the LICENSE file for details.