bug-provider-slash

Bug: provider slash-model routing test expects oai-compat but returns openai

Metadata

Status: done
Assigned: agent-2404
Agent identity: 02e879681e52e0a384106169be043416c4d946e850ab26b2269c57681b52a6e7
Created: 2026-05-04T19:07:17.948306219+00:00
Started: 2026-05-04T19:19:46.231998339+00:00
Completed: 2026-05-04T19:37:41.335265079+00:00
Tags: eval-scheduled
Eval score: 0.86
└ blocking impact0.90
└ completeness0.88
└ coordination overhead0.85
└ correctness0.94
└ downstream usability0.86
└ efficiency0.82
└ intent fidelity0.79
└ style adherence0.92

Description

cargo test test_create_provider_prefixed_is_openai --test integration_provider currently fails, independently of the agency hash migration work. The test config sets native/openai endpoints and calls create_provider(tmp.path(), "deepseek/deepseek-chat"), expecting the provider name oai-compat; the current code returns openai.
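The intended routing can be sketched as below. This is a minimal illustration inferred from the test description, not the crate's actual API: the function name route_provider, its signature, and the native-provider list are all assumptions. The idea is that a slash-prefixed model whose prefix matches a configured native provider uses that provider, while any other slash-prefixed model falls through to the generic OpenAI-compatible provider.

```rust
/// Hypothetical sketch of the slash-model routing described in the ticket.
/// The name `route_provider` and the provider strings are assumptions,
/// not the real crate API.
fn route_provider<'a>(model: &'a str, native_providers: &[&str]) -> &'a str {
    match model.split_once('/') {
        // Prefix matches a configured native provider: route to it directly.
        Some((prefix, _)) if native_providers.contains(&prefix) => prefix,
        // Any other slash-prefixed model (e.g. "deepseek/deepseek-chat")
        // routes to the generic OpenAI-compatible provider.
        Some(_) => "oai-compat",
        // No slash: fall back to the default native provider.
        None => "openai",
    }
}

fn main() {
    // "deepseek" is not in the native list, so the test expects oai-compat;
    // the bug is that the current code returns openai here instead.
    assert_eq!(route_provider("deepseek/deepseek-chat", &["openai"]), "oai-compat");
    assert_eq!(route_provider("openai/gpt-4o", &["openai"]), "openai");
    assert_eq!(route_provider("gpt-4o", &["openai"]), "openai");
    println!("routing sketch ok");
}
```

Under this reading, the fix is to make the real routing agree with the test's expectation (unknown-prefix models go to oai-compat) rather than defaulting every slash-prefixed model to openai.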

Validation

  • Reproducer: env -u WG_LLM_PROVIDER -u WG_MODEL cargo test test_create_provider_prefixed_is_openai --test integration_provider fails before the fix
  • Implementation makes the provider expectation and routing behavior consistent
  • cargo build + cargo test pass with no regressions

Depends on

Required by

Log