AI Fabric is a UX-first orchestration studio for building and running connected sub-agent workflows with:
- multi-workspace management
- visual graph editor for agent topology
- configurable worker/reviewer nodes
- conditional human-in-the-loop gate (only when worker has assigned reviewer)
- persistent run history (prompts, outputs, decisions)
- full flow execution from entry prompt (entire configured chain)
- model discovery via OpenAI `models.list()`
- local RAG memory with SQLite chunks + embeddings JSON + rerank
- runtime provider per node (`openai_api` or `cursor_bridge`)
- session controls per node (clear context, switch dialog)
- bridge API contract document at `docs/cursor-bridge-contract.md`
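As a minimal sketch of the conditional human-in-the-loop gate (all names here are hypothetical, not the app's actual types): a run pauses at a worker only when that worker has a reviewer assigned.

```typescript
// Hypothetical shapes; the real app stores nodes in the React Flow graph JSON.
interface WorkerNode {
  id: string;
  name: string;
  reviewerId?: string; // set when a reviewer node is linked to this worker
}

type GateDecision = { pause: true; reviewerId: string } | { pause: false };

// A run pauses at a worker only if that worker has an assigned reviewer;
// unreviewed workers flow straight through to the next node.
function reviewGate(node: WorkerNode): GateDecision {
  return node.reviewerId
    ? { pause: true, reviewerId: node.reviewerId }
    : { pause: false };
}

console.log(reviewGate({ id: "n1", name: "writer", reviewerId: "r1" })); // pauses
console.log(reviewGate({ id: "n2", name: "summarizer" })); // flows through
```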
Implemented now:
- workspace creation/list page
- workflow editor based on React Flow
- node configuration (name, role, model, instructions, reviewer, success criteria)
- node runtime controls (provider, temperature, clear-context-on-run)
- workspace switcher directly in studio header
- graph persistence to SQLite via Prisma
- run creation and per-node execution
- run full chain in topological order with pause/resume on review gates
- modal quality gate with Approve / Reject-with-note for reviewer-linked workers
- event persistence in `WorkflowRun`, `NodeRun`, `HumanDecision`
- API endpoint for available models with a fallback list
- local knowledge base API and UI:
- add documents
- auto-chunking
- embeddings generation (OpenAI or fallback local embedding)
- hybrid rerank search (semantic + lexical)
- RAG context injection into node execution
- agent session API:
- list sessions by workspace
- clear context
- switch dialog
- cursor bridge runtime adapter with OpenAI fallback
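The hybrid rerank above blends semantic and lexical signals. A minimal TypeScript sketch of such a scorer (the helpers and the 0.7/0.3 weighting are illustrative assumptions, not the app's actual tuning):

```typescript
// Cosine similarity between two embedding vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

// Fraction of query terms that appear in the chunk (simple lexical signal).
function lexicalOverlap(query: string, chunk: string): number {
  const q = new Set(query.toLowerCase().split(/\W+/).filter(Boolean));
  const c = new Set(chunk.toLowerCase().split(/\W+/).filter(Boolean));
  if (q.size === 0) return 0;
  let hits = 0;
  for (const t of q) if (c.has(t)) hits++;
  return hits / q.size;
}

// Weighted blend of semantic + lexical scores; weights are an illustration.
function hybridScore(qVec: number[], cVec: number[], qText: string, cText: string): number {
  return 0.7 * cosine(qVec, cVec) + 0.3 * lexicalOverlap(qText, cText);
}
```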
- Next.js 16 (App Router, API routes)
- TypeScript
- React Flow
- Prisma 7 + SQLite (`better-sqlite3` adapter)
- OpenAI Node SDK (`responses` API)
- Install dependencies:

```bash
npm install
```

- Set environment variables:

```bash
# .env
DATABASE_URL="file:./dev.db"
OPENAI_API_KEY="" # optional
CURSOR_BRIDGE_URL="http://127.0.0.1:4141" # optional, bridge adapter endpoint
# optional codex fallback controls:
# CODEX_BIN="C:\\Users\\User\\.cursor\\extensions\\openai.chatgpt-...\\bin\\windows-x64\\codex.exe"
# CODEX_EXEC_TIMEOUT_SECONDS="300"
# CURSOR_BRIDGE_TIMEOUT_MS="315000"
```

- Push the schema and generate the Prisma client:

```bash
npx prisma db push
npm run db:generate
```

- Start the development server:

```bash
npm run dev
```

- (Optional) Start the local Cursor bridge runtime:

```bash
npm run bridge:dev
```

- Open http://localhost:3000
You can also start everything by double-clicking:
start-ai-fabric.bat
This script checks:
- Node.js and npm availability
- dependency installation (`node_modules`)
- `.env` presence (creates it from `.env.example` if missing)
- Prisma client generation + DB sync
- free ports `3000` and `4141`
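The free-ports check can be reproduced in Node with a small probe (a sketch of the idea, not the `.bat` script's actual implementation):

```typescript
import net from "node:net";

// Try to bind a throwaway server; if binding fails (e.g. EADDRINUSE),
// the port is already taken by another process.
function isPortFree(port: number): Promise<boolean> {
  return new Promise((resolve) => {
    const srv = net.createServer();
    srv.once("error", () => resolve(false));
    srv.once("listening", () => srv.close(() => resolve(true)));
    srv.listen(port, "127.0.0.1");
  });
}

// Probe the two ports the studio and the bridge runtime expect.
Promise.all([isPortFree(3000), isPortFree(4141)]).then(([web, bridge]) => {
  console.log(`3000 free: ${web}, 4141 free: ${bridge}`);
});
```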
```bash
npm run dev
npm run bridge:dev
npm run lint
npm run build
npm run db:generate
npm run db:migrate
npm run db:studio
```

Data models:

- `Workspace`
- `Workflow` (stores graph JSON)
- `WorkflowRun`
- `NodeRun`
- `HumanDecision`
- `AgentSession` (prepared for per-node memory/session metadata)
- `KnowledgeDocument`
- `KnowledgeChunk`
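As a hedged sketch of how two of these models might look in `prisma/schema.prisma` (field names are illustrative assumptions; the real schema is the source of truth, and related models are omitted here):

```prisma
model Workflow {
  id          String        @id @default(cuid())
  workspaceId String
  graph       String        // React Flow graph persisted as JSON
  runs        WorkflowRun[]
}

model WorkflowRun {
  id         String   @id @default(cuid())
  workflow   Workflow @relation(fields: [workflowId], references: [id])
  workflowId String
  status     String   // e.g. running | paused_on_review | done
}
```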
- Full graph execution engine:
- topological traversal
- retries by quality gate
- conditional branching
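The planned topological traversal can be sketched with Kahn's algorithm (node/edge shapes are assumptions for illustration):

```typescript
interface Edge { from: string; to: string }

// Kahn's algorithm: repeatedly execute nodes whose dependencies are done.
function topoOrder(nodes: string[], edges: Edge[]): string[] {
  const indeg = new Map<string, number>(nodes.map((n) => [n, 0]));
  for (const e of edges) indeg.set(e.to, (indeg.get(e.to) ?? 0) + 1);
  const queue = nodes.filter((n) => indeg.get(n) === 0);
  const order: string[] = [];
  while (queue.length) {
    const n = queue.shift()!;
    order.push(n);
    for (const e of edges.filter((e) => e.from === n)) {
      const d = indeg.get(e.to)! - 1;
      indeg.set(e.to, d);
      if (d === 0) queue.push(e.to);
    }
  }
  // If some nodes never reached in-degree 0, the workflow graph has a cycle.
  if (order.length !== nodes.length) throw new Error("cycle in workflow graph");
  return order;
}

console.log(topoOrder(["a", "b", "c"], [{ from: "a", to: "b" }, { from: "b", to: "c" }]));
// → [ 'a', 'b', 'c' ]
```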
- Cursor bridge hardening:
- stream token-level events from bridge to UI
- strict bridge-only mode toggle (disable fallback by policy)
- richer bridge diagnostics per node run
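The strict bridge-only toggle could wrap the runtime adapter like this minimal sketch (names and shapes are hypothetical, not the app's modules):

```typescript
type Runner = (prompt: string) => Promise<string>;

// Try the bridge first; on failure, fall back to OpenAI unless a strict
// bridge-only policy disables the fallback.
function withFallback(primary: Runner, fallback: Runner, strict = false): Runner {
  return async (prompt) => {
    try {
      return await primary(prompt);
    } catch (err) {
      if (strict) throw err; // bridge-only mode: surface the bridge failure
      return fallback(prompt); // default: degrade gracefully to OpenAI
    }
  };
}

// Usage sketch with stand-in runtimes:
const bridge: Runner = async () => { throw new Error("bridge down"); };
const openai: Runner = async (p) => `openai:${p}`;
withFallback(bridge, openai)("hi").then(console.log); // prints "openai:hi"
```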
- Optional RAG upgrade path:
  - move from local SQLite retrieval to `pgvector` or a dedicated vector DB
  - add metadata filters and advanced rerank strategies
- Multi-user readiness:
- auth and roles
- audit trail and immutable event log
- workspace permissions