#14475: feat: Add native Azure OpenAI support
Labels: `docs`, `scripts`, `agents`, `stale`, `size: M`
## Summary
Adds native support for Azure OpenAI models via configuration.
### Changes
- **Config:** Added `azure-openai-responses` API type support.
- **Core:** Updated model resolution to pass custom `headers` from configuration to the provider, enabling `api-key` header usage required by Azure and other custom gateways.
- **Docs:** Added `docs/providers/azure-openai.md` with configuration examples.
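The header threading described above can be sketched as follows. These names and shapes are illustrative assumptions, not the repo's actual types; the real resolution logic lives in `src/agents/pi-embedded-runner/model.ts` and the config types in `src/config/types.models.ts`.

```typescript
// Hypothetical config and model shapes, for illustration only.
interface ProviderConfig {
  baseUrl?: string;
  api?: string;
  apiKey?: string;
  headers?: Record<string, string>;
  models: { id: string; name: string }[];
}

interface ResolvedModel {
  id: string;
  api?: string;
  baseUrl?: string;
  headers?: Record<string, string>;
}

// Carry provider-level headers onto the resolved model so the HTTP
// transport can send e.g. the `api-key` header Azure requires.
function resolveInlineModel(
  provider: ProviderConfig,
  modelId: string,
): ResolvedModel | undefined {
  const match = provider.models.find((m) => m.id === modelId);
  if (!match) return undefined;
  return {
    id: match.id,
    api: provider.api,
    baseUrl: provider.baseUrl,
    headers: provider.headers, // previously dropped; now threaded through
  };
}
```

The key point is simply that `headers` survives from the provider block into the resolved model, so any custom gateway that authenticates via headers (not just Azure) benefits.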
### Configuration Example
```json
{
  "models": {
    "providers": {
      "azure": {
        "baseUrl": "...",
        "api": "azure-openai-responses",
        "apiKey": "...",
        "models": [{ "id": "gpt-4", "name": "gpt-4" }]
      }
    }
  }
}
```
<!-- greptile_comment -->
<h2>Greptile Overview</h2>
<h3>Greptile Summary</h3>
This PR adds an `azure-openai-responses` model API option to the config schema/types, threads provider-configured HTTP `headers` into inline model resolution, and documents an Azure OpenAI provider configuration.
The config/type changes live in `src/config/types.models.ts` and `src/config/zod-schema.core.ts`, while the runtime wiring happens in `src/agents/pi-embedded-runner/model.ts` by carrying `headers` through the resolved `Model` object.
Main issue: `azure-openai-responses` is introduced as a selectable API value, but there is no corresponding runtime/provider implementation (no code references outside config/docs). Selecting it is therefore expected to misroute requests or fail at runtime unless it is explicitly mapped and handled.
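The gap can be illustrated with a minimal dispatch sketch. The function and transport names here are hypothetical, not the repo's actual API; the point is that a new `ModelApi` value needs an explicit case somewhere, or it falls through to an error or a wrong default.

```typescript
// Illustrative subset of the config-level API enum.
type ModelApi = "openai-responses" | "azure-openai-responses" | "openai-chat";

// Hypothetical transport selection; real routing would construct a client.
function selectTransport(api: ModelApi): string {
  switch (api) {
    case "openai-responses":
      return "openai";
    case "azure-openai-responses":
      // Without this explicit mapping, the new enum value falls through
      // to the default branch and the request fails or is misrouted.
      return "azure";
    default:
      throw new Error(`Unhandled model api: ${api}`);
  }
}
```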
Secondary issue: `resolveModel`’s fallback path uses `providers[provider]` without provider-id normalization, unlike the inline-model match path. This can cause configured provider defaults (baseUrl/headers/api) to be skipped when the caller uses an equivalent provider id that normalizes to the same value.
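A normalized fallback lookup would make the two paths consistent. This is a hypothetical sketch (names assumed, not taken from the repo) of matching provider ids case-insensitively so `"Azure"` and `"azure"` resolve to the same configured defaults:

```typescript
// Assumed normalization rule; the repo's inline-match path may differ.
function normalizeProviderId(id: string): string {
  return id.trim().toLowerCase();
}

// Try a direct hit first, then a normalized scan, so configured
// defaults (baseUrl/headers/api) are not silently skipped.
function lookupProvider<T>(
  providers: Record<string, T>,
  provider: string,
): T | undefined {
  if (providers[provider] !== undefined) return providers[provider];
  const want = normalizeProviderId(provider);
  for (const [key, value] of Object.entries(providers)) {
    if (normalizeProviderId(key) === want) return value;
  }
  return undefined;
}
```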
<h3>Confidence Score: 2/5</h3>
- This PR is not safe to merge as-is due to an unimplemented new API option.
- A new `ModelApi` value (`azure-openai-responses`) is added to config/types/docs but has no corresponding runtime/provider handling in the repo (no references outside config/docs). If users select this API, requests are expected to fail or be misrouted at runtime. Additionally, provider config fallback lookup is inconsistent with normalized matching, which can silently drop baseUrl/headers defaults.
- src/config/types.models.ts, src/config/zod-schema.core.ts, src/agents/pi-embedded-runner/model.ts
<!-- /greptile_comment -->
## Most Similar PRs
- #14239: Add Azure OpenAI Completions provider · KJFromMicromonic · 2026-02-11 (83.1%)
- #16766: fix(model): apply provider baseUrl/headers override to registry-fou... · dzianisv · 2026-02-15 (83.0%)
- #12059: feat(agents): Add Azure AI Foundry credential support · lisanyambere · 2026-02-08 (83.0%)
- #13079: feat: Add OpenAI-compatible API option to CLI for self-hosted models · MikeWang0316tw · 2026-02-10 (79.0%)
- #13295: feat: add Eternal AI provider integration · peterparkernho · 2026-02-10 (78.1%)
- #18002: fix: add Azure AI Foundry URL support for custom providers · MisterGuy420 · 2026-02-16 (77.1%)
- #8256: feat: Add rate limit strategy configuration · revenuestack · 2026-02-03 (76.6%)
- #2353: fix: ensure api field is set for inline provider models · sbknana · 2026-01-26 (75.9%)
- #7570: fix: allow models from providers with auth profiles configured · DonSqualo · 2026-02-03 (75.9%)
- #15991: feat: add Novita AI provider support with dynamic model discovery · Alex-wuhu · 2026-02-14 (75.6%)