# #23694: fix: allow OAuth provider models in isolated sessions

Labels: agents · size: S · Cluster: Model Cooldown Management

Fixes #23677
## Problem
OAuth provider models like `openai-codex/gpt-4o` and `github-copilot/gpt-4o` were rejected with 'Unknown model' errors when explicitly selected for isolated/cron sessions or subagents, even though they worked fine in fallback chains.
## Root Cause
`buildAllowedModelSet` only checked:
1. CLI providers
2. Models in the catalog
3. Providers in `models.providers` config
Built-in OAuth providers are handled via auth profiles and don't appear in `models.providers`.
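The pre-fix behavior can be sketched as follows. This is a minimal illustration based on the description above, not the actual source; the config shape and the `buildAllowedModelSet` signature are assumptions.

```typescript
// Hypothetical sketch of the pre-fix allowlist check. A model id is
// "provider/model", e.g. "openai-codex/gpt-4o". Shapes are assumed.
type ModelsConfig = {
  cliProviders: string[];                  // providers backed by a CLI
  catalog: string[];                       // known model ids
  providers: Record<string, unknown>;      // models.providers config
};

function buildAllowedModelSet(cfg: ModelsConfig, requested: string[]): Set<string> {
  const allowed = new Set<string>();
  for (const model of requested) {
    const provider = model.split("/")[0];
    if (
      cfg.cliProviders.includes(provider) ||
      cfg.catalog.includes(model) ||
      provider in cfg.providers
    ) {
      allowed.add(model);
    }
    // Built-in OAuth providers (handled via auth profiles) match none of
    // these branches, so their models fall through and are rejected with
    // "Unknown model" when explicitly selected.
  }
  return allowed;
}
```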
## Solution
- Export `isOAuthProvider` from `auth-profiles/oauth.ts`
- Add OAuth provider check in `buildAllowedModelSet`
- Add test coverage for OAuth provider allowlisting
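The shape of the fix can be sketched like this. `isOAuthProvider` is named in the PR, but its body and the surrounding helper here are illustrative assumptions, not the actual diff.

```typescript
// Hypothetical sketch of the fix: built-in OAuth providers (handled via
// auth profiles) count as known, so their models pass the allowlist check.
// The provider list and helper shapes are assumptions.
const BUILT_IN_OAUTH_PROVIDERS = new Set(["openai-codex", "github-copilot"]);

// Exported from auth-profiles/oauth.ts per the PR description.
export function isOAuthProvider(provider: string): boolean {
  return BUILT_IN_OAUTH_PROVIDERS.has(provider);
}

function isModelAllowed(
  model: string,
  catalog: string[],
  configuredProviders: string[],
): boolean {
  const provider = model.split("/")[0];
  return (
    catalog.includes(model) ||
    configuredProviders.includes(provider) ||
    isOAuthProvider(provider) // new check added by this PR
  );
}
```

With this check in place, an explicitly selected `openai-codex/gpt-4o` is accepted for isolated/cron sessions and subagents even though it appears in neither the catalog nor `models.providers`.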
## Testing
- ✅ All existing tests pass
- ✅ Added 3 new tests covering OAuth provider model allowlisting
- ✅ Verified openai-codex models work in isolated sessions
## Most Similar PRs

- #19217: fix(auth): route openai-codex to built-in OAuth in models auth login — nabbilkhan, 2026-02-17 (71.1%)
- #9583: fix(models): allow models in agents.defaults.models even if not in ... — hotzen100, 2026-02-05 (67.6%)
- #7570: fix: allow models from providers with auth profiles configured — DonSqualo, 2026-02-03 (66.9%)
- #14508: fix(models): allow forward-compat models in allowlist check — jonisjongithub, 2026-02-12 (66.4%)
- #9822: fix: allow local/custom model providers for sub-agent inference — stammtobias91, 2026-02-05 (66.3%)
- #19774: Allow explicit model allowlist entries missing from provider catalogs — harshang03, 2026-02-18 (65.0%)
- #20388: fix(failover): don't skip same-provider fallback models when cooldo... — Limitless2023, 2026-02-18 (63.7%)
- #20593: fix: route Anthropic OAT tokens to /v1/messages endpoint (#20107) — KyleKincer, 2026-02-19 (63.4%)
- #7941: fix: scope rate-limit cooldowns per-model instead of per-provider — adrrr, 2026-02-03 (63.4%)
- #2353: fix: ensure api field is set for inline provider models — sbknana, 2026-01-26 (63.1%)