A multi-tenant workflow automation platform with 9 specialized agents, 25 workflow node types, a visual builder, voice interaction, and enterprise compliance. Built with Google ADK and the A2A protocol.
9 specialized system agents connected via Google ADK's A2A protocol. Create custom dynamic agents from the UI without code.
25 node types with drag-and-drop canvas. Loops, fork/join, error handling, sub-workflows, undo/redo, debugging, and version history.
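The fork/join pattern above can be sketched with a thread pool: each forked branch runs concurrently, and the join step gathers results in branch order. The `fork_join` helper and the lambda branches are illustrative, not the platform's actual node engine.

```python
from concurrent.futures import ThreadPoolExecutor

# Illustrative fork/join step: branches run in parallel,
# the join waits for every branch before the workflow continues.
def fork_join(branches, payload):
    """Run each branch callable on the payload concurrently, then join."""
    with ThreadPoolExecutor(max_workers=len(branches)) as pool:
        futures = [pool.submit(branch, payload) for branch in branches]
        return [f.result() for f in futures]  # join: results in branch order

results = fork_join(
    [lambda p: p + 1, lambda p: p * 2, lambda p: p - 3],
    10,
)
# results == [11, 20, 7]
```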
Organizations, workspaces, teams with RBAC (Owner, Admin, Member, Viewer). Full data isolation and workspace-scoped access.
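A role hierarchy like this is commonly checked by rank comparison. The four role names below come from the feature list above, but the action-to-minimum-role table is an illustrative assumption, not the platform's actual permission schema.

```python
# Sketch of workspace-scoped RBAC via role ranks (illustrative).
ROLE_RANK = {"Viewer": 0, "Member": 1, "Admin": 2, "Owner": 3}

# Hypothetical action table: the minimum role each action requires.
MIN_ROLE = {
    "view": "Viewer",
    "run": "Member",
    "edit": "Member",
    "manage_members": "Admin",
    "delete_workspace": "Owner",
}

def can(role: str, action: str) -> bool:
    """Allow an action when the role meets the minimum rank it requires."""
    return ROLE_RANK[role] >= ROLE_RANK[MIN_ROLE[action]]

# can("Admin", "manage_members") -> True
# can("Member", "delete_workspace") -> False
```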
WebSocket-based low-latency voice with Silero VAD (300ms detection). Rich content rendering: charts, tables, LaTeX, Mermaid diagrams.
Upload documents, create snippets, index URLs. Semantic search with pgvector embeddings and automatic context retrieval.
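The retrieval step can be illustrated without a database: in production this would be a pgvector nearest-neighbour query (`ORDER BY embedding <=> :query LIMIT :k`, where `<=>` is cosine distance), but plain-Python cosine distance over toy 3-dimensional vectors shows the same ranking logic. The document ids and vectors below are made up.

```python
import math

# Stand-in for a pgvector cosine-distance query (illustrative data).
def cosine_distance(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / norm

def top_k(query, docs, k=2):
    """Return the ids of the k documents nearest to the query embedding."""
    ranked = sorted(docs, key=lambda d: cosine_distance(query, d["embedding"]))
    return [d["id"] for d in ranked[:k]]

docs = [
    {"id": "faq",    "embedding": [1.0, 0.0, 0.0]},
    {"id": "policy", "embedding": [0.0, 1.0, 0.0]},
    {"id": "howto",  "embedding": [0.9, 0.1, 0.0]},
]
# top_k([1.0, 0.0, 0.0], docs) -> ["faq", "howto"]
```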
PII detection and masking, audit logging, policy enforcement, and governance reports for SOC 2, GDPR, and HIPAA.
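A minimal sketch of the detect-and-mask step, assuming simple regex rules; the PII Guardian agent's detector is presumably richer, and the two patterns here are only illustrative.

```python
import re

# Illustrative PII masking pass with two toy detection rules.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def mask_pii(text: str) -> str:
    """Replace each detected PII span with a typed placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

# mask_pii("Reach ana@example.com, SSN 123-45-6789")
#   -> "Reach [EMAIL], SSN [SSN]"
```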
LangFuse distributed tracing for workflow executions. LLM cost tracking per workflow, agent, and model with daily trend analytics.
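The per-model daily cost trend reduces to an aggregation over trace records; the field names and per-1K-token prices below are assumptions for illustration, not LangFuse's schema or real pricing.

```python
from collections import defaultdict

# Hypothetical per-1K-token prices, for illustration only.
PRICE_PER_1K = {"gpt-4o": 0.005, "claude-3-5-sonnet": 0.003}

def daily_costs(records):
    """Sum token cost per (day, model) pair from trace records."""
    totals = defaultdict(float)
    for r in records:
        cost = r["tokens"] / 1000 * PRICE_PER_1K[r["model"]]
        totals[(r["day"], r["model"])] += cost
    return dict(totals)

records = [
    {"day": "2025-06-01", "model": "gpt-4o", "tokens": 2000},
    {"day": "2025-06-01", "model": "gpt-4o", "tokens": 1000},
]
# daily_costs(records) -> roughly {("2025-06-01", "gpt-4o"): 0.015}
```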
Database (SQL), AWS S3, Google Sheets, Slack, Teams, Email, HTTP/REST. MCP tools: Shell, Playwright, Tavily.
Per-node retry with exponential backoff, dead letter queue, workflow versioning with rollback, and step-through debugging.
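Per-node retry with exponential backoff and a dead letter queue can be sketched as follows; the delays, the in-memory queue, and the `flaky` node are illustrative stand-ins for the platform's real machinery.

```python
import time

# Failed payloads land here after the final attempt (illustrative DLQ).
dead_letter_queue = []

def run_with_retry(node, payload, max_attempts=3, base_delay=0.01):
    """Run a node, retrying with exponential backoff; dead-letter on exhaustion."""
    for attempt in range(max_attempts):
        try:
            return node(payload)
        except Exception as exc:
            if attempt == max_attempts - 1:
                dead_letter_queue.append({"payload": payload, "error": str(exc)})
                raise
            time.sleep(base_delay * 2 ** attempt)  # 0.01s, 0.02s, 0.04s, ...

calls = {"n": 0}
def flaky(payload):
    """Toy node that fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient")
    return payload.upper()

# run_with_retry(flaky, "ok") -> "OK" after two retried failures
```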
Apps            Admin UI (:5003)         Voice Chat (:5001)
                       |                         |
Services        Admin API (:5002)       Voice Gateway (:9003)
                       |                         |
                       |              Orchestrator (in-process)
                       |                         | A2A Protocol
                       |    +----------+----------+----------+
                       |    |          |          |          |
                       |    Developer  Browser    Researcher Knowledge
                       |    :9000      :9004      :9005      :9006
                       |
                       |    + 4 Compliance Agents (:9007-9010)
                       |      PII Guardian, Audit, Policy, Governance
                       |
Infrastructure  PostgreSQL (:5434)   Redis (:6379)   LangFuse (:3100)
Backend: Python 3.13 · FastAPI · SQLAlchemy · Alembic
Agents: Google ADK · A2A Protocol · LiteLLM · pgvector
Frontend: React 18 · React Flow · Recharts · Framer Motion
Infrastructure: PostgreSQL · Redis · Docker · LangFuse
LLM Providers: OpenAI · Anthropic · Azure · Google Vertex · LiteLLM Proxy
Voice: WebSocket · Silero VAD · Azure Speech · PCM Streaming
Get up and running in under 5 minutes. Clone the repo, configure your LLM provider, and start building.