# genie-web-client

**Repository Path**: mirrors_openshift/genie-web-client

## Basic Information

- **Project Name**: genie-web-client
- **Description**: No description available
- **Primary Language**: Unknown
- **License**: Apache-2.0
- **Default Branch**: main
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2025-10-31
- **Last Updated**: 2026-02-07

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# Genie Web Client

An AI-powered, extensible UI framework built on the OpenShift Console dynamic plugin, enabling unified, intelligent experiences across Red Hat products.

## Prerequisites

- **Node.js 20+** and **yarn** - For frontend development
- **Python 3.12+** (requires >=3.12, <3.14) - For backend (Lightspeed services). See [lightspeed-stack requirements](https://github.com/lightspeed-core/lightspeed-stack/blob/main/pyproject.toml#L21)
- **Go 1.24.6+** - For the obs-mcp server. See [obs-mcp requirements](https://github.com/rhobs/obs-mcp/blob/main/go.mod#L3)
- **OpenShift CLI (`oc`)** - To connect to a cluster
- **Podman 3.2.0+** or **Docker** - To run the console
- **OpenAI API Key** - Or compatible LLM provider

## Container Images

| File | Purpose | When to Use |
|------|---------|-------------|
| `Dockerfile` | Production build - compiles and serves via nginx | Production, CI/CD |
| `Dockerfile.dev` | Fast cluster deploys - serves pre-built `dist/` via nginx | Quick iteration with `build-deploy.sh` |
| `Dockerfile.local-dev` | Local dev server with hot-reload | Development without installing Node.js or yarn on host |

**Note:** `Dockerfile.local-dev` allows you to develop locally using only Podman/Docker - no need to install Node.js, npm, or yarn on your machine. See [Option A: Using Podman](#option-a-using-podman-no-nodejs-or-yarn-required-on-host) for setup instructions.

## Getting Started

Genie Web Client requires both a frontend (this repo) and a backend (AI service).
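Before starting, you can sanity-check that the prerequisite tools are on your `PATH` with a quick shell loop (a sketch; it only checks presence, not versions):

```shell
# Check that each prerequisite tool is installed (presence only, not version)
for tool in node yarn python3 go oc podman; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found: $tool"
  else
    echo "MISSING: $tool"
  fi
done
```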
Follow these steps:

### 1. Set Up the Backend (One-Time)

The backend provides AI capabilities. See detailed instructions in [`backend/README.md`](./backend/README.md).

**Quick Start:**

```bash
# First, clone and start the obs-mcp server (terminal 1)

# Make sure you're logged into your OpenShift cluster
oc login

# Clone obs-mcp (one time only; skip if you already have it)
cd ~/Documents/GHRepos  # or wherever you keep repos
git clone https://github.com/rhobs/obs-mcp.git
cd obs-mcp

# Start obs-mcp (auto-discovers thanos-querier in the cluster; falls back to prometheus if not found)
go run cmd/obs-mcp/main.go --listen 127.0.0.1:9100 --auth-mode kubeconfig --insecure --guardrails none
# Runs on port 9100 - keep this terminal running

# Then, in another terminal (terminal 2), set up lightspeed-stack

# Clone lightspeed-stack
cd ~/Documents/GHRepos  # or your preferred location
git clone https://github.com/lightspeed-core/lightspeed-stack.git
cd lightspeed-stack

# Copy our configs
cp ~/Documents/GHRepos/genie-web-client/backend/lightspeed-stack/*.yaml .

# Install and start
uv sync
export OPENAI_API_KEY="sk-your-key-here"
uv run python -m src.lightspeed_stack
# Runs on port 8080 - keep this terminal running
```

### 2. Set Up the Frontend

In separate terminal windows, run:

#### Option A: Using Podman (No Node.js or yarn required on host)

**Terminal 3: Plugin Dev Server (Container)**

```bash
cd ~/Documents/GHRepos/genie-web-client
```

##### Build the dev image (first time only)

```bash
podman build -f Dockerfile.local-dev -t genie-frontend-dev .
```

##### Run the dev server

```bash
podman run --rm -it \
  -p 9001:9001 \
  -v $(pwd)/src:/usr/src/app/src:z \
  -v $(pwd)/locales:/usr/src/app/locales:z \
  genie-frontend-dev
# Runs on port 9001 - keep this terminal running
```

#### Option B: Using Node.js directly

**Terminal 3: Plugin Dev Server**

```bash
cd ~/Documents/GHRepos/genie-web-client
yarn install
yarn run start
# Runs on port 9001 - keep this terminal running
```

#### OpenShift Console

**Terminal 4:**

```bash
cd ~/Documents/GHRepos/genie-web-client
oc login  # Connect to your cluster

# If you have yarn installed:
yarn run start-console

# OR without yarn (set the plugin name manually):
npm_package_consolePlugin_name=genie-web-client ./start-console.sh
# Runs on port 9000 - keep this terminal running
```

**Access the app:** navigate to http://localhost:9000.

### Testing MCP Tool Calls

Once everything is running, you can test the obs-mcp integration with queries like:

- "What alerts are firing in the cluster?"
- "Show me CPU usage metrics"
- "What pods are running in the openshift-monitoring namespace?"

## Development

### Option 1: Local (Recommended)

In one terminal window, run:

1. `yarn install`
2. `yarn run start`

In another terminal window, run:

1. `oc login` (requires [oc](https://console.redhat.com/openshift/downloads) and an [OpenShift cluster](https://console.redhat.com/openshift/create))
2. `yarn run start-console` (requires [Docker](https://www.docker.com) or [podman 3.2.0+](https://podman.io))

This will run the OpenShift console in a container connected to the cluster you've logged into. The plugin HTTP server runs on port 9001 with CORS enabled.

**Note:** Make sure the backend is running (see the "Getting Started" section above) for full AI functionality.

Navigate to http://localhost:9000 to see the running plugin.

#### Running start-console with Apple silicon and podman

If you are using podman on a Mac with Apple silicon, `yarn run start-console` might fail since it runs an amd64 image.
You can work around the problem with [qemu-user-static](https://github.com/multiarch/qemu-user-static) by running these commands:

```bash
podman machine ssh
sudo -i
rpm-ostree install qemu-user-static
systemctl reboot
```

### Option 2: Docker + VSCode Remote Container

Make sure the [Remote Containers](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-containers) extension is installed. This method uses Docker Compose, where one container is the OpenShift console and the second container is the plugin. It requires that you have access to an existing OpenShift cluster. After the initial build, the cached containers will help you start developing in seconds.

1. Create a `dev.env` file inside the `.devcontainer` folder with the correct values for your cluster:

   ```bash
   OC_PLUGIN_NAME=console-plugin-template
   OC_URL=https://api.example.com:6443
   OC_USER=kubeadmin
   OC_PASS=
   ```

2. `(Ctrl+Shift+P) => Remote Containers: Open Folder in Container...`
3. `yarn run start`
4. Navigate to http://localhost:9000

## Testing

### Unit Tests

This project uses Jest and React Testing Library for unit testing.

#### Running Tests

```bash
# Run all tests once
yarn test

# Run tests in watch mode (re-runs on file changes)
yarn test:watch

# Run tests with a coverage report
yarn test:coverage
```

#### Writing Tests

Tests should be placed alongside the components they test, with a `.test.tsx` extension. For components with multiple test files, use a `__tests__/` directory.

**File Organization:**

- Single test file: `src/components/MyComponent.test.tsx` (co-located)
- Multiple test files: `src/components/my-component/__tests__/` (organized)

Example test:

```tsx
import { render, screen } from '@testing-library/react';
import MyComponent from './MyComponent';

describe('MyComponent', () => {
  it('renders correctly', () => {
    render(<MyComponent />);
    expect(screen.getByText('Expected Text')).toBeInTheDocument();
  });
});
```

### Integration Tests

Integration tests using Cypress are available.
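Cypress drives the running console, so the console (port 9000) and the plugin dev server (port 9001) should both be up before launching the tests. A quick preflight check (a sketch; relies on bash's `/dev/tcp` redirection, so run it with bash):

```shell
# Preflight: report whether the console (9000) and plugin dev server (9001) are listening
for port in 9000 9001; do
  if (exec 3<>"/dev/tcp/127.0.0.1/$port") 2>/dev/null; then
    echo "port $port: listening"
  else
    echo "port $port: not listening"
  fi
done
```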
See the `integration-tests` directory for more details.

```bash
# Run Cypress in interactive mode
yarn test-cypress

# Run Cypress in headless mode
yarn test-cypress-headless
```

## Troubleshooting

### "No module named 'mcp'" Error

If you get this error when starting lightspeed-stack:

```
ModuleNotFoundError: No module named 'mcp'
```

**Solution:** Install the required dependencies:

```bash
cd ~/Documents/GHRepos/lightspeed-stack
uv pip install mcp

# Or install all optional dependencies:
uv pip install pandas psycopg2-binary redis aiosqlite pillow "mcp>=1.23.0" scikit-learn pymongo matplotlib
```

This happens because `uv sync` only installs the dependencies declared in `pyproject.toml`, but llama-stack requires additional packages for MCP support.

### Other Issues

For backend-specific troubleshooting (port conflicts, API keys, etc.), see [`backend/README.md`](./backend/README.md#troubleshooting).

## Contributing

See `CONTRIBUTING.md` for guidelines. A PR template is in place (see `.github/pull_request_template.md`) prompting for a summary and testing details.