@ibetitsmike (Contributor)
Summary

Replaces the static "Welcome to Mux" landing view with a permanent "Chat with Mux" system workspace (mux-chat). This gives users a safe, always-available place to ask Mux questions and edit global agent instructions.

Implementation

  • System workspace bootstrapping: ensure a singleton system workspace exists at startup (idempotent self-healing).
  • Default home route: app now routes to /workspace/mux-chat by default, and the Mux logo always navigates back to it.
  • Sandboxing / safety:
    • System workspace is forced to Local runtime.
    • No secrets are injected for mux-chat (project secrets omitted; MCP disabled).
  • Defense-in-depth tool policy: deny all tools by default, then enable only safe utilities plus mux_global_agents_{read,write}.
  • Global instructions tools:
    • mux_global_agents_read + mux_global_agents_write provide allowlisted access to ~/.mux/AGENTS.md (write gated by explicit confirmation).
  • Onboarding: seed a synthetic welcome message into mux-chat history on first run.
  • UI guardrails: hide runtime badges and archive UI for the system workspace; backend hard-blocks destructive operations.
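For illustration, the bootstrapping and runtime-forcing steps above could be sketched roughly as follows. All names and shapes here are hypothetical stand-ins, not the actual mux implementation:

```typescript
interface Workspace {
  id: string;
  name: string;
  runtime: "local" | "remote";
  system: boolean;
}

const SYSTEM_WORKSPACE_ID = "mux-chat";

// Idempotent, self-healing bootstrap: safe to call on every startup.
function ensureSystemWorkspace(workspaces: Map<string, Workspace>): Workspace {
  const existing = workspaces.get(SYSTEM_WORKSPACE_ID);
  if (existing) {
    // Self-heal: re-assert invariants in case persisted state drifted.
    existing.runtime = "local";
    existing.system = true;
    return existing;
  }
  const created: Workspace = {
    id: SYSTEM_WORKSPACE_ID,
    name: "Chat with Mux",
    runtime: "local", // system workspace is always forced to Local runtime
    system: true,
  };
  workspaces.set(SYSTEM_WORKSPACE_ID, created);
  return created;
}
```

Calling this once per boot keeps exactly one system workspace alive regardless of what earlier runs left behind.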

Validation

  • make static-check
  • Added/updated UI integration coverage for boot, logo navigation, permanence, and welcome-message rendering.

Risks

  • Touches startup routing + workspace bootstrapping; mitigated by idempotent creation and backend guardrails for the system workspace.

Generated with mux • Model: openai:gpt-5.2 • Thinking: xhigh • Cost: $30.25

Storybook mounts the full App shell, which now defaults to /workspace/mux-chat.

Inject mux-chat into the mock ORPC workspace list so Chromatic snapshots don't get stuck on "Workspace not found".
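A minimal sketch of that injection, with illustrative names (the real mock ORPC layer in the mux codebase will differ):

```typescript
interface MockWorkspace {
  id: string;
  name: string;
}

// Ensure the mock workspace list served to Storybook/Chromatic always
// contains the system workspace, so the default /workspace/mux-chat
// route resolves instead of rendering "Workspace not found".
function withSystemWorkspace(list: MockWorkspace[]): MockWorkspace[] {
  if (list.some((w) => w.id === "mux-chat")) return list;
  return [{ id: "mux-chat", name: "Chat with Mux" }, ...list];
}
```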

---

_Generated with `mux` • Model: `openai:gpt-5.2` • Thinking: `high` • Cost: `$51.16`_

<!-- mux-attribution: model=openai:gpt-5.2 thinking=high costs=51.16 -->
- Integration: extract assistant text from stream-end.content when deltas/parts are absent.
- UI integration: give git status more time to populate in CI.
- E2E: use the concurrent toast waiter for /providers and bump stream capture timeout.
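The stream-end fallback in the first bullet could be sketched as below. The event shape is illustrative, not the actual mux stream protocol:

```typescript
interface ContentPart {
  type: string;
  text?: string;
}

interface StreamEnd {
  deltas?: string[];
  content?: ContentPart[];
}

// Prefer the text deltas accumulated during streaming; when none arrived,
// fall back to the text parts carried on the final stream-end event.
function extractAssistantText(event: StreamEnd): string {
  if (event.deltas && event.deltas.length > 0) {
    return event.deltas.join("");
  }
  return (event.content ?? [])
    .filter((p) => p.type === "text" && typeof p.text === "string")
    .map((p) => p.text as string)
    .join("");
}
```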

@ibetitsmike ibetitsmike added this pull request to the merge queue Jan 27, 2026
@ibetitsmike ibetitsmike removed this pull request from the merge queue due to a manual request Jan 27, 2026
@ibetitsmike ibetitsmike added this pull request to the merge queue Jan 27, 2026
Merged via the queue into main with commit 3d32e0d Jan 27, 2026
38 of 40 checks passed
@ibetitsmike ibetitsmike deleted the mike/chat-with-mux-system-workspace branch January 27, 2026 01:42

Labels: enhancement (New feature or functionality)