# Eywa
**Repository Path**: uesoft/Eywa
## Basic Information
- **Project Name**: Eywa
- **Description**: No description available
- **Primary Language**: Unknown
- **License**: Apache-2.0
- **Default Branch**: main
- **Homepage**: None
- **GVP Project**: No
## Statistics
- **Stars**: 0
- **Forks**: 0
- **Created**: 2026-05-06
- **Last Updated**: 2026-05-06
## Categories & Tags
**Categories**: Uncategorized
**Tags**: None
## README
# Heterogeneous (Scientific) Foundation Model Collaboration
**Bring Domain-Specific Foundation Models into Agentic Systems.**
In the *Avatar* franchise, *Eywa* ("All Mother") is a planetary-scale network that connects and coordinates diverse life forms.
We bring *Eywa* from that fictional world into the digital one, to orchestrate heterogeneous foundation models.

[Python](https://www.python.org/)
[License](LICENSE)
[Project Page](https://www.zihao.website/eywa.github.io/)
---
## ✨ What's inside
- **Three execution modes.** Eywa has three instantiations: `single-agent`, `multi-agent`, and `orchestration`.
- **A cross-domain benchmark.** `eywabench.parquet` covers scientific domains spanning material, energy, space, biology, clinic, drug, economy, business, and infrastructure.
- **Foundation-model / language-model "Tsaheylu".** Robust, stable communication channels between a domain-specific foundation model and a language model.
- **Async + worker pool.** `--num_workers` runs many tasks concurrently; each worker gets its own pair of MCP servers on isolated ports.
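The worker-pool pattern above can be sketched with plain `asyncio`. This is a minimal illustration, not the repository's actual launcher: `BASE_PORT`, `worker_ports`, and `run_task` are hypothetical names, and the real port assignment in `launch_mcp_servers.py` may differ.

```python
import asyncio

BASE_PORT = 8000  # hypothetical base port; the real launcher may use another scheme

def worker_ports(worker_id: int, base: int = BASE_PORT) -> tuple[int, int]:
    """Give each worker its own pair of ports (one per MCP server)."""
    return base + 2 * worker_id, base + 2 * worker_id + 1

async def run_task(worker_id: int, task: str) -> str:
    fm_port, lm_port = worker_ports(worker_id)
    # Placeholder for talking to the worker's dedicated MCP servers.
    await asyncio.sleep(0)
    return f"{task} via ports ({fm_port}, {lm_port})"

async def main(tasks: list[str], num_workers: int) -> list[str]:
    # Cap concurrency at --num_workers; tasks share the fixed pool of workers.
    sem = asyncio.Semaphore(num_workers)

    async def guarded(i: int, t: str) -> str:
        async with sem:
            return await run_task(i % num_workers, t)

    return await asyncio.gather(*(guarded(i, t) for i, t in enumerate(tasks)))

results = asyncio.run(main(["task-a", "task-b", "task-c"], num_workers=2))
print(results)
```

The key property is port isolation: because each worker derives a disjoint port pair from its ID, concurrent tasks never contend for the same MCP server.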
## ⚙️ Environment Setup
This repository requires several dependencies to be installed: [langchain](https://docs.langchain.com/oss/python/langchain/install), [langchain-mcp](https://docs.langchain.com/oss/python/langchain/mcp), [langchain-openai](https://docs.langchain.com/oss/python/langchain/install), [langchain_google_genai](https://pypi.org/project/langchain-google-genai/), [fastmcp](https://gofastmcp.com/getting-started/installation), [autogluon](https://auto.gluon.ai/dev/install.html), [tabpfn](https://github.com/PriorLabs/TabPFN).
Since a recent update, TabPFN requires signing up with [PriorLabs](https://priorlabs.ai/tabpfn). During execution, TabPFN will pop up a browser window for sign-in.
> **Note:** The commands below are a recommended starting point, not a fully portable installer. Dependency resolution can vary across operating systems, CUDA builds, and upstream package updates. If installation fails, create a fresh environment and follow the upstream installation guides linked above, especially for AutoGluon and TabPFN.
```bash
conda create -n eywa python=3.11
conda activate eywa
pip install autogluon
pip install tabpfn
pip install langchain langchain-openai langchain-mcp-adapters langchain-google-genai fastmcp==2.14.5
```
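After installation, a quick sanity check that the core imports resolve can save a failed run later. A minimal sketch (the package list mirrors the `pip install` lines above; `autogluon.tabular` is assumed to be the relevant AutoGluon submodule):

```python
import importlib

# Packages the install commands above should have provided.
packages = [
    "langchain",
    "langchain_openai",
    "langchain_mcp_adapters",
    "langchain_google_genai",
    "fastmcp",
    "autogluon.tabular",
    "tabpfn",
]

for name in packages:
    try:
        importlib.import_module(name)
        print(f"ok: {name}")
    except ImportError as exc:
        print(f"missing: {name} ({exc})")
```

Any `missing:` line points at the upstream installation guide to revisit before running experiments.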
## 🚀 Quickstart
```bash
# 1. Copy the env template, then drop API keys into .env
cp .env.example .env
# edit OPENAI_API_KEY / GOOGLE_API_KEY
# 2. Launch foundation-model MCP servers (one pair per worker)
python launch_mcp_servers.py --num_workers 4
# 3. Run an experiment (in a second terminal)
python main.py --eywa --eywabench_name eywabench --exp_name first-run
# 4. Aggregate per-domain metrics
python eywabench/run_eval.py --exp_name first-run_gpt-5-nano_single-agent_eywa
```
> 💡 Outputs land in `experiments/