Quick Start
Get your AI development team running in under 5 minutes.
Prerequisites
- A Linux server (Ubuntu 22.04+ recommended) with 4GB+ RAM
- An OpenRouter API key (for LLM access)
- Optionally, an OpenAI API key (for embeddings)
Option 1: One-Line K3s Install
The fastest path — installs K3s, clones the repo, and deploys all 9 agents:
curl -sfL https://k8.virtualgpt.cloud/install.sh | bash
The installer will:
- Install K3s (lightweight Kubernetes) if not already present
- Clone the agent0-bmad-k8 repository
- Prompt you for your OpenRouter API key
- Create Kubernetes secrets with your API keys
- Apply all K8s manifests (namespace, configmap, deployments, ingress)
- Wait for all 9 agent pods to become ready
Tip: The installer is idempotent — you can run it again safely to update or reconfigure.
Option 2: Docker Compose
For local development or testing on any machine with Docker:
git clone https://github.com/t4tarzan/agent0-bmad-k8.git
cd agent0-bmad-k8
# Create your .env.live with API keys
cat > .env.live << EOF
API_KEY_OPENROUTER=sk-or-v1-your-key-here
OPENAI_API_KEY=sk-your-openai-key
EOF
# Start 5 agents (live demo config)
docker compose -f docker-compose.live.yml up -d
After ~60 seconds, the agents will be available on ports 50001–50005.
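A quick way to confirm all five containers are answering is to probe each port. This is only a sketch: it checks that something responds over HTTP on each port, not that the agent behind it is healthy.

```shell
# Probe each live-demo agent port (50001-50005) and print the HTTP status.
# curl reports status 000 when nothing is listening on that port yet.
for port in $(seq 50001 50005); do
  code=$(curl -s -o /dev/null -m 5 -w '%{http_code}' "http://localhost:${port}/" || true)
  echo "port ${port}: HTTP ${code}"
done
```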
Option 3: Manual Kubernetes
For existing clusters or custom configurations:
git clone https://github.com/t4tarzan/agent0-bmad-k8.git
cd agent0-bmad-k8
# Edit secrets with your API keys
vi k8s/bmad/secrets.yaml
# Deploy everything
./k8s/bmad/deploy-bmad.sh
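Once the deploy script returns, you can block until every agent pod reports Ready rather than polling by hand. A sketch, assuming the manifests use the `bmad` namespace as above; the five-minute timeout is an arbitrary choice:

```shell
# Wait for every pod in the bmad namespace to pass its readiness checks,
# failing after 5 minutes if any pod never becomes Ready.
kubectl wait --for=condition=Ready pods --all -n bmad --timeout=300s
```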
Verify Deployment
Check that all agents are running:
# Docker Compose
docker ps | grep bmad-live
# Kubernetes
kubectl get pods -n bmad
# Test an agent directly
curl -s -X POST http://localhost:50001/api_message \
-H "Content-Type: application/json" \
-H "X-API-KEY: your-api-key" \
-d '{"message":"Hello!","lifetime_hours":1}'
Start the GMeet Interface
The GMeet interface provides a Google Meet-style UI for collaborating with agents:
# Install and start backend
cd gmeet/backend
npm install
node server.js
# In another terminal — build and serve frontend
cd gmeet/frontend
npm install
npm run build
# Serve the dist/ folder with any static server
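Any static file server can host the built `dist/` folder. One option that needs no extra install on most systems is Python's built-in server; port 8080 is an arbitrary choice:

```shell
# Serve the production build on http://localhost:8080
python3 -m http.server 8080 --directory dist
```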
Or try the hosted live demo directly.
Next Steps
- Agent Profiles — Learn about each agent
- BMAD Method — Understand the 4-phase workflow
- Configuration — Customize models, temperatures, and prompts
- GMeet Interface — Using the real-time collaboration UI