Problem Statement
Currently, the codex module requires manual configuration to route Codex CLI requests through Coder's AI Bridge: the workaround is to modify the base_config_toml input to configure a model_provider and a matching profile.
This manual configuration adds friction when deploying Codex on deployments with AI Bridge enabled.
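For context, the current workaround looks roughly like the sketch below. The module call shape and the model name are illustrative assumptions; the TOML keys mirror the proposed config in this issue:

```hcl
# Illustrative sketch of today's manual workaround: hand-writing the
# AI Bridge routing into base_config_toml on every module call.
module "codex" {
  source   = "registry.coder.com/coder-labs/codex/coder"
  version  = "1.0.0"
  agent_id = coder_agent.main.id
  workdir  = "/home/coder/project"

  base_config_toml = <<-EOT
    [model_providers.aibridge]
    name = "AI Bridge"
    base_url = "${data.coder_workspace.me.access_url}/api/v2/aibridge/openai/v1"
    env_key = "CODER_AIBRIDGE_SESSION_TOKEN"
    wire_api = "responses"

    [profiles.aibridge]
    model_provider = "aibridge"
    model = "gpt-5" # illustrative model name
  EOT
}
```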
Proposed Solution
Add a simple boolean input variable enable_coder_aibridge that automatically configures Codex to route requests through Coder AI Bridge.
When enable_coder_aibridge = true, the module should automatically append a config section with the following content:

```toml
[model_providers.aibridge]
name = "AI Bridge"
base_url = "${data.coder_workspace.me.access_url}/api/v2/aibridge/openai/v1"
env_key = "CODER_AIBRIDGE_SESSION_TOKEN" # can be injected as `coder_env` with value `data.coder_workspace_owner.me.session_token`
wire_api = "responses"

[profiles.aibridge]
model_provider = "aibridge"
model = "${var.codex_model}"
model_reasoning_effort = "medium" # this could also be exposed as a module input
```

Additionally, this input should conflict with the openai_api_key input.
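A minimal sketch of how the module could wire this up internally. The variable name enable_coder_aibridge is from this proposal; the validation block, the resource name, and the assumption that openai_api_key defaults to "" are illustrative, and cross-variable validation requires Terraform >= 1.9:

```hcl
variable "enable_coder_aibridge" {
  type        = bool
  default     = false
  description = "Route Codex requests through Coder AI Bridge."

  # Illustrative conflict check with the existing openai_api_key input
  # (cross-variable validation needs Terraform >= 1.9).
  validation {
    condition     = !(var.enable_coder_aibridge && var.openai_api_key != "")
    error_message = "enable_coder_aibridge conflicts with openai_api_key."
  }
}

# Inject the session token that Codex reads via env_key in the TOML above.
resource "coder_env" "aibridge_session_token" {
  count    = var.enable_coder_aibridge ? 1 : 0
  agent_id = var.agent_id
  name     = "CODER_AIBRIDGE_SESSION_TOKEN"
  value    = data.coder_workspace_owner.me.session_token
}
```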
Example Usage

```hcl
module "codex" {
  source                = "registry.coder.com/coder-labs/codex/coder"
  version               = "1.0.0"
  agent_id              = coder_agent.main.id
  workdir               = "/home/coder/project"
  enable_coder_aibridge = true
}
```

Notes
- AI Bridge is a Premium Coder feature that provides centralized LLM proxy management