
chore(testing): add canonical config/testing.toml with provider=router for RAPS coverage#2107

Merged
bug-ops merged 1 commit into main from feat-issue-2104-fix-testing-toml-raps on Mar 22, 2026


Conversation


bug-ops (Owner) commented on Mar 22, 2026

Summary

  • Adds config/testing.toml as a tracked canonical reference config for CI sessions
  • Fixes provider = "openai" → provider = "router" so the [llm.router.reputation] RAPS config is active
  • Keeps identical LLM behavior: [llm.router] chain is ["openai"], same model/endpoint
  • Adds copy instruction in header comment: cp config/testing.toml .local/config/testing.toml

Root cause: With provider = "openai", bootstrap creates OpenAIProvider directly, skipping [llm.router] and [llm.router.reputation] entirely. ReputationTracker is never activated, record_quality_outcome is never called.
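The before/after difference can be sketched as a minimal TOML fragment. The section and key names quoted in this PR ([llm.router], [llm.router.reputation], chain) are taken from the description; everything else below (the enabled key and its value) is an illustrative placeholder, not the actual contents of config/testing.toml:

```toml
# Before (broken): bootstrap instantiates OpenAIProvider directly,
# so [llm.router] and [llm.router.reputation] are silently ignored.
# provider = "openai"

# After (fixed): bootstrap goes through the router, activating RAPS.
provider = "router"

[llm.router]
# A single-provider chain keeps LLM behavior identical to the direct setup.
chain = ["openai"]

[llm.router.reputation]
# RAPS reputation tracking lives here; this key is a hypothetical example.
enabled = true
```

With the single-element chain, the router delegates every call to the same OpenAI model/endpoint as before, so only the reputation bookkeeping changes.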

Evidence from CI-62: with provider = "openai", router_reputation_state.json never changed. With provider = "router", alpha incremented from 5.43 → 6.21 → 6.95 across sessions.

Closes #2104

Test plan

  • cargo +nightly fmt --check — pass
  • cargo clippy --workspace --features full -- -D warnings — pass (0 warnings)
  • cargo nextest run --workspace --features full --lib --bins — 6363 passed
  • Copy config/testing.toml to .local/config/testing.toml, run a live session, verify router_reputation_state.json updates after tool calls

github-actions bot added labels on Mar 22, 2026: documentation (Improvements or additions to documentation), config (Configuration file changes), chore (Maintenance tasks), size/L (Large PR, 201-500 lines)
…r for RAPS coverage (#2104)

Previously the local .local/config/testing.toml used provider="openai", which caused the
bootstrap to create an OpenAIProvider directly, silently ignoring [llm.router] and
[llm.router.reputation] (RAPS). With provider="router" and chain=["openai"], LLM behavior
is identical, but ReputationTracker is active and record_quality_outcome is called.

Add config/testing.toml as a tracked canonical reference with the correct configuration.
Developers should copy it to .local/config/testing.toml before running CI sessions.
bug-ops force-pushed the feat-issue-2104-fix-testing-toml-raps branch from e6bb8e2 to a5bcfa8 on March 22, 2026 00:42
bug-ops enabled auto-merge (squash) on March 22, 2026 00:42
bug-ops merged commit 12e2a84 into main on Mar 22, 2026
17 checks passed
bug-ops deleted the feat-issue-2104-fix-testing-toml-raps branch on March 22, 2026 00:43


Development

Successfully merging this pull request may close these issues.

chore(testing): testing.toml uses provider="openai" which silently ignores [llm.router.reputation] RAPS config
