# codag
**Repository Path**: cherng/codag
## Basic Information
- **Project Name**: codag
- **Description**: No description available
- **Primary Language**: Unknown
- **License**: MIT
- **Default Branch**: main
- **Homepage**: None
- **GVP Project**: No
## Statistics
- **Stars**: 0
- **Forks**: 0
- **Created**: 2026-02-27
- **Last Updated**: 2026-02-27
## Categories & Tags
**Categories**: Uncategorized
**Tags**: None
## README
# Codag
**See how your AI code actually works.**
[Marketplace](https://marketplace.visualstudio.com/items?itemName=codag.codag) · [MIT License](LICENSE) · [Stargazers](https://github.com/michaelzixizhou/codag/stargazers) · [Pull Requests](https://github.com/michaelzixizhou/codag/pulls)
Codag analyzes your code for LLM API calls and AI frameworks, then generates interactive workflow graphs — directly inside VSCode.
If you find Codag useful, please consider giving it a ⭐ — it helps others discover the project!
## Why Codag?
You're building an AI agent that chains 3 LLM calls across 5 files. A prompt change breaks something downstream. Which call? Which branch? You `grep` for `openai.chat`, open 8 tabs, and mentally trace the flow. Again.
Or you're onboarding onto someone's LangChain project — 20 files, tool calls inside tool calls, retry logic wrapping everything. The README says "it's straightforward." It's not.
Codag maps it out for you:
- **Extracts the workflow** — finds every LLM call, decision branch, and processing step across your entire codebase
- **Visualizes it as a graph** — interactive DAG with clickable nodes that link back to source code
- **Updates in real time** — edit a file and watch the graph change instantly, no LLM round-trip needed
**Built for** AI engineers, agent builders, and anyone maintaining code that talks to LLMs — whether it's a single OpenAI call or a multi-agent LangGraph pipeline.
## Features
### Automatic Workflow Detection
Point Codag at your files and it maps out the entire AI pipeline — LLM calls, branching logic, data transformations — without any configuration.
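To make the detection target concrete, here is a hypothetical Python pipeline of the kind Codag maps into a DAG. The `client.complete` method is a placeholder standing in for any real LLM SDK call (e.g. an OpenAI chat completion), not an actual API:

```python
# A hypothetical three-step AI pipeline: two LLM-call nodes feeding a
# decision branch. `client.complete` is a stand-in for a real SDK call.

def summarize(client, text: str) -> str:
    # Node: first LLM call
    return client.complete(f"Summarize: {text}")

def classify(client, summary: str) -> str:
    # Node: second LLM call, driving the branch below
    return client.complete(f"Label as 'bug' or 'feature': {summary}")

def triage(client, report: str) -> str:
    # Edges: summarize -> classify -> branch
    summary = summarize(client, report)
    label = classify(client, summary)
    if "bug" in label:  # decision point Codag would render as a branch
        return f"bug queue: {summary}"
    return f"backlog: {summary}"
```

Run on a file like this, Codag would emit two call nodes, one decision node, and edges tracing the data flow between them.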
### Live Graph Updates
Edit your code and the graph updates instantly using tree-sitter parsing. Changed functions get a green highlight so you can see exactly what moved.
### Click-to-Source Navigation
Every node links back to the exact function and line number. Click a node to open the side panel, then follow the source link to jump straight to the code.
### Export to PNG
Export your workflow graphs as high-resolution PNG images — the entire graph or individual workflows.
### Native Theme Support
Graphs automatically match your VS Code theme — light or dark. No configuration needed.
## Supported Providers & Frameworks
**LLM Providers:** OpenAI, Anthropic, Google Gemini, Azure OpenAI, Vertex AI, AWS Bedrock, Mistral, xAI (Grok), Cohere, Ollama, Together AI, Replicate, Fireworks AI, AI21, DeepSeek, OpenRouter, Groq, Hugging Face
**Frameworks:** LangChain, LangGraph, Mastra, CrewAI, LlamaIndex, AutoGen, Haystack, Semantic Kernel, Pydantic AI, Instructor
**AI Services:** ElevenLabs, RunwayML, Stability AI, D-ID, HeyGen, and more
**IDE APIs:** VS Code Language Model API
**Languages:** Python, TypeScript, JavaScript (JSX/TSX), Go, Rust, Java, C, C++, Swift, Lua
Don't see yours? [Adding a provider](CONTRIBUTING.md#adding-a-provider) takes 5 lines of code.
## Quick Start
### 1. Clone & Setup
```bash
git clone https://github.com/michaelzixizhou/codag.git
cd codag
cp backend/.env.example backend/.env
# Add your Gemini API key to backend/.env (free tier: https://aistudio.google.com/apikey)
make setup
```
### 2. Start the Backend
**With Docker (recommended):**
```bash
docker compose up -d
```
**Without Docker:**
```bash
make run
```
Verify: `curl http://localhost:52104/health`
### 3. Install the Extension
**VS Code:** Search **"Codag"** in Extensions, or install from the [Marketplace](https://marketplace.visualstudio.com/items?itemName=codag.codag).
**Cursor:** Build and install the `.vsix` manually:
```bash
cd frontend && npx @vscode/vsce package
cursor --install-extension codag-*.vsix
```
### 4. Use It
1. `Cmd+Shift+P` / `Ctrl+Shift+P` → **"Codag: Open"**
2. Select files containing LLM/AI code
3. Explore the graph
### MCP Server (for Cursor Agent, Claude Code, etc.)
The extension automatically registers a bundled MCP server when activated. This gives coding agents access to your workflow graph — no extra setup required.
The config is written to `.cursor/mcp.json` (Cursor) or `.mcp.json` (Claude Code) in your workspace.
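The generated file follows the standard MCP client config shape; the entry name, command, and path below are illustrative only (the extension writes the real values for you):

```json
{
  "mcpServers": {
    "codag": {
      "command": "node",
      "args": ["<path-to-bundled-mcp-server>"]
    }
  }
}
```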
## How It Works
**Analysis Pipeline:**
1. Tree-sitter parses your code into ASTs across 10+ languages
2. Pattern matching detects LLM API calls and framework usage
3. Call graph extraction maps function relationships
4. Backend (Gemini 2.5 Flash) identifies workflow semantics — nodes, edges, decision points
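Step 2 of the pipeline can be sketched as a pattern lookup. This is a deliberately simplified stand-in using regular expressions, with a two-entry pattern table invented for illustration (the extension's real pattern set and matching live in the tree-sitter layer):

```python
import re

# Illustrative provider-call patterns; not the extension's actual set.
PROVIDER_PATTERNS = {
    "openai": re.compile(r"\bopenai\.|\bchat\.completions\.create\b"),
    "anthropic": re.compile(r"\banthropic\.|\bmessages\.create\b"),
}

def detect_providers(source: str) -> set[str]:
    """Return the providers whose call patterns appear in `source`."""
    return {name for name, pat in PROVIDER_PATTERNS.items() if pat.search(source)}
```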
**Live Updates:**
- File changes trigger incremental tree-sitter re-parsing
- AST diffs determine which functions changed
- Graph updates instantly without LLM round-trip
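The AST-diff step above amounts to comparing per-function fingerprints between two parses. A minimal sketch, assuming each parse yields a map from function name to body text (the real implementation diffs tree-sitter nodes):

```python
import hashlib

def _digest(body: str) -> str:
    # Fingerprint a function body so comparisons are cheap.
    return hashlib.sha256(body.encode()).hexdigest()

def changed_functions(old: dict[str, str], new: dict[str, str]) -> set[str]:
    """Return names whose bodies differ between parses,
    including functions that were added or removed."""
    return {
        name
        for name in old.keys() | new.keys()
        if _digest(old.get(name, "")) != _digest(new.get(name, ""))
    }
```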
**Rendering:**
- ELK (Eclipse Layout Kernel) for orthogonal graph layout
- D3.js for interactive SVG rendering
- Per-file caching with content hashing — only changed files reanalyze
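The caching step can be sketched as a content-hash lookup: hash the file, reuse the cached result on a hit, reanalyze on a miss. The cache layout and `analyze` callback below are assumptions for illustration, not the extension's internals:

```python
import hashlib

# Hypothetical cache: path -> (content hash, analysis result).
_cache: dict[str, tuple[str, object]] = {}

def analyze_cached(path: str, source: str, analyze) -> object:
    """Reanalyze `source` only when its content hash has changed."""
    digest = hashlib.sha256(source.encode()).hexdigest()
    hit = _cache.get(path)
    if hit and hit[0] == digest:
        return hit[1]          # unchanged file: reuse prior result
    result = analyze(source)   # changed or new file: reanalyze
    _cache[path] = (digest, result)
    return result
```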
## Roadmap
- [ ] Hosted backend (no self-hosting required)
- [ ] Diff view: compare workflows across git commits
- [ ] Support for more languages and frameworks
Have a feature request? [Open an issue](https://github.com/michaelzixizhou/codag/issues/new).
## Development
For contributors working on the extension itself:
```bash
cd frontend
npm install
npm run compile
```
Then press `F5` in VS Code to launch the Extension Development Host.
## Contributing
Contributions welcome! See [CONTRIBUTING.md](CONTRIBUTING.md) for full development setup.
## Contact
Questions or feedback? Reach out at michael@codag.ai
## License
[MIT](LICENSE)