Watch real-time data flow through our multi-agent system. Click any simulation button to see how requests are processed across different workflows.
Explore each autonomous agent's role, workflow, and capabilities
Port 8000
API Gateway & LangGraph Orchestrator
Receives HTTP requests and orchestrates workflows using LangGraph state machines
Receives client HTTP requests
Creates correlation IDs for tracking
Publishes requests to pub/sub topics
Manages state: route → wait_response → complete
Polls for responses using peek/acknowledge
Returns results to clients
Stateful workflow management
Non-blocking async handling
Timeout management (300s)
Proxies analytics requests
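A rough sketch of the gateway flow described above (correlation ID, pub/sub hand-off, then a bounded peek/acknowledge poll) in FastAPI. The `bus` client, topic names, and endpoint path are illustrative assumptions, not the project's actual code.

```python
import asyncio
import uuid

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

from message_bus import bus  # hypothetical in-memory pub/sub client (publish/peek/acknowledge)

app = FastAPI()


class ChatRequest(BaseModel):
    message: str


@app.post("/chat")
async def chat(req: ChatRequest) -> dict:
    correlation_id = str(uuid.uuid4())        # tag the request for end-to-end tracking
    bus.publish("chat.requests", {            # hand off to the chat agent via pub/sub
        "correlation_id": correlation_id,
        "message": req.message,
    })
    deadline = asyncio.get_running_loop().time() + 300   # the 300 s timeout noted above
    while asyncio.get_running_loop().time() < deadline:
        reply = bus.peek("chat.responses", correlation_id)   # non-destructive read
        if reply is not None:
            bus.acknowledge("chat.responses", correlation_id)
            return reply
        await asyncio.sleep(0.1)              # non-blocking wait; other requests keep flowing
    raise HTTPException(status_code=504, detail="Agent response timed out")
```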
FastAPI, LangGraph, httpx, structlog
Production-ready technologies powering the agentic architecture
Async/await for concurrent processing
High-performance async web framework
State machine workflow orchestration
Data validation and type safety
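A minimal sketch of how the route → wait_response → complete workflow could be expressed with LangGraph's StateGraph; the state fields and node bodies are placeholders, not the real orchestrator logic.

```python
from typing import Optional, TypedDict

from langgraph.graph import END, StateGraph


class GatewayState(TypedDict):
    correlation_id: str
    request: str
    response: Optional[str]


def route(state: GatewayState) -> GatewayState:
    # Publish the request to the appropriate topic (omitted here)
    return state


def wait_response(state: GatewayState) -> GatewayState:
    # Poll the response topic via peek/acknowledge (omitted here)
    return {**state, "response": "placeholder"}


def complete(state: GatewayState) -> GatewayState:
    # Final bookkeeping before the result goes back to the client
    return state


graph = StateGraph(GatewayState)
graph.add_node("route", route)
graph.add_node("wait_response", wait_response)
graph.add_node("complete", complete)
graph.set_entry_point("route")
graph.add_edge("route", "wait_response")
graph.add_edge("wait_response", "complete")
graph.add_edge("complete", END)

workflow = graph.compile()
# result = await workflow.ainvoke({"correlation_id": "abc", "request": "hi", "response": None})
```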
Graph database with vector search
768-dimensional semantic search
Graph traversal and analytics
Non-blocking database operations
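One plausible shape for the 768-dimensional semantic search combined with a graph hop, assuming Neo4j 5.x and its async Python driver (the source only says "graph database with vector search"); the index name, label, and relationship type are illustrative.

```python
from neo4j import AsyncGraphDatabase

driver = AsyncGraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

CYPHER = """
CALL db.index.vector.queryNodes('chunk_embeddings', $k, $embedding)
YIELD node, score
MATCH (node)<-[:HAS_CHUNK]-(doc:Document)   // traverse from matched chunk to its document
RETURN doc.title AS title, node.text AS text, score
ORDER BY score DESC
"""


async def semantic_search(embedding: list[float], k: int = 5) -> list[dict]:
    # Non-blocking query via the async driver
    async with driver.session() as session:
        result = await session.run(CYPHER, embedding=embedding, k=k)
        return [record.data() async for record in result]
```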
LLM for extraction and synthesis
Embedding model (768-dim)
Vector comparison for search
Multi-factor relevance scoring
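Vector comparison plus multi-factor scoring could look like the sketch below; the 0.7/0.2/0.1 weights and the extra factors are assumptions for illustration only, not the system's actual formula.

```python
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def relevance_score(query_vec: np.ndarray, chunk_vec: np.ndarray,
                    recency: float, graph_centrality: float) -> float:
    """Multi-factor score: semantic similarity dominates, other signals nudge the ranking."""
    semantic = cosine_similarity(query_vec, chunk_vec)
    return 0.7 * semantic + 0.2 * recency + 0.1 * graph_centrality


# Example: rank two candidate chunks against a 768-dim query embedding
query = np.random.rand(768)
chunks = [np.random.rand(768), np.random.rand(768)]
scores = [relevance_score(query, c, recency=0.5, graph_centrality=0.3) for c in chunks]
```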
Containerization for each agent
Local development orchestration
Serverless container platform
Secure credential management
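Reading a credential from Secret Manager at startup might look like this; the project and secret IDs are placeholders.

```python
from google.cloud import secretmanager


def load_secret(project_id: str, secret_id: str, version: str = "latest") -> str:
    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{project_id}/secrets/{secret_id}/versions/{version}"
    response = client.access_secret_version(name=name)
    return response.payload.data.decode("utf-8")


# neo4j_password = load_secret("my-gcp-project", "neo4j-password")  # hypothetical names
```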
In-memory pub/sub with deque
Request tracking across agents
Reliable message delivery
Event-driven communication
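A bare-bones version of an in-memory pub/sub bus with deque-backed topics and peek/acknowledge semantics, keyed by correlation ID; the real implementation presumably adds locking and per-subscriber cursors.

```python
from collections import defaultdict, deque
from typing import Any, Optional


class MessageBus:
    def __init__(self) -> None:
        self._topics: dict[str, deque[dict[str, Any]]] = defaultdict(deque)

    def publish(self, topic: str, message: dict[str, Any]) -> None:
        self._topics[topic].append(message)          # event-driven hand-off between agents

    def peek(self, topic: str, correlation_id: str) -> Optional[dict[str, Any]]:
        # Read without removing: the message survives if the consumer crashes
        # before acknowledging, which is what makes delivery reliable.
        for message in self._topics[topic]:
            if message.get("correlation_id") == correlation_id:
                return message
        return None

    def acknowledge(self, topic: str, correlation_id: str) -> None:
        # Remove the message only once the consumer has finished processing it.
        self._topics[topic] = deque(
            m for m in self._topics[topic]
            if m.get("correlation_id") != correlation_id
        )


bus = MessageBus()
bus.publish("chat.requests", {"correlation_id": "abc-123", "message": "hello"})
pending = bus.peek("chat.requests", "abc-123")
bus.acknowledge("chat.requests", "abc-123")
```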
Automatic sensitive data removal
Debugging with structlog
Full type coverage with TypeScript and Pydantic
Service monitoring endpoints
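Sensitive-data removal can be implemented as a structlog processor that runs before the renderer; the key list below is illustrative.

```python
import structlog

SENSITIVE_KEYS = {"password", "api_key", "token", "authorization"}


def redact_sensitive(logger, method_name, event_dict):
    # Replace sensitive values before the event is rendered or shipped anywhere
    for key in list(event_dict):
        if key.lower() in SENSITIVE_KEYS:
            event_dict[key] = "[REDACTED]"
    return event_dict


structlog.configure(
    processors=[
        redact_sensitive,
        structlog.processors.add_log_level,
        structlog.processors.TimeStamper(fmt="iso"),
        structlog.processors.JSONRenderer(),
    ]
)

log = structlog.get_logger()
log.info("user_login", user="alice", password="hunter2")  # password is redacted in output
```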
Each agent scales based on its workload (10 chat agents, 2 ingest agents)
One agent's failure doesn't affect the others (graceful degradation)
Update agents independently without system-wide redeployment
Cloud Run scales to zero - pay only for actual usage
Test agents in isolation with clear pub/sub contracts
Add new agent types without modifying existing code
Built for scale, reliability, and maintainability with modern cloud-native technologies