#21437: docs(ollama): add prominent warnings about /v1 breaking tool calling
docs · size: XS
Cluster: Plugin Enhancements and Fixes
## Summary
- **Problem:** Users running Ollama on a remote host naturally configure `baseUrl: "http://host:11434/v1"` with `api: "openai-responses"`, which silently breaks tool calling
- **Why it matters:** The model outputs raw JSON as plain text instead of executing tools; no error is surfaced, which makes for a confusing UX
- **What changed:** Added prominent `<Warning>` callouts at the top of the doc, in the custom URL section, and in the legacy mode section
- **What did NOT change:** No code changes, docs-only
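For context, the misconfiguration the new warnings target looks roughly like this. This is an illustrative sketch, not the exact config schema; only the `baseUrl` and `api` fields are taken from the summary above, and the comment syntax assumes a JSON5/JSONC-style config file:

```json5
{
  // Pointing baseUrl at Ollama's OpenAI-compatible /v1 endpoint while
  // using the openai-responses API silently breaks tool calling:
  "baseUrl": "http://host:11434/v1",  // the /v1 suffix is the problem
  "api": "openai-responses"
}
```

No error is raised with this setup; the model simply emits tool-call JSON as plain text, which is why the docs now warn before the config examples rather than only in the Advanced section.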
## Change Type (select all)
- [ ] Bug fix
- [ ] Feature
- [ ] Refactor
- [x] Docs
- [ ] Security hardening
- [ ] Chore/infra
## Scope (select all touched areas)
- [ ] Gateway / orchestration
- [ ] Skills / tool execution
- [ ] Auth / tokens
- [ ] Memory / storage
- [x] Integrations
- [ ] API / contracts
- [x] UI / DX
- [ ] CI/CD / infra
## Linked Issue/PR
- Closes #21243
## User-visible / Behavior Changes
None (docs-only). Users will now see clear warnings before misconfiguring Ollama.
## Security Impact (required)
- New permissions/capabilities? No
- Secrets/tokens handling changed? No
- New/changed network calls? No
- Command/tool execution surface changed? No
- Data access scope changed? No
## Repro + Verification
### Environment
- OS: Linux (Ubuntu)
- Runtime/container: N/A (docs change)
- Model/provider: Ollama
- Integration/channel: N/A
- Relevant config: N/A
### Steps
1. View `docs/providers/ollama.md` before the change
2. Note that the warning about `/v1` is buried in the Advanced section
3. View `docs/providers/ollama.md` after the change
4. Note the prominent warning at the top and in the custom URL section
### Expected
- Clear warning visible before user configures remote Ollama incorrectly
### Actual
- Warning now appears at top of doc and in relevant config sections
## Evidence
- [x] Diff shows `<Warning>` callouts added in three locations
- Uses the standard Mintlify `<Warning>` component, so it should render correctly (live rendering not verified; see below)
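A sketch of the kind of callout added (the exact wording is not reproduced in this PR description; the `<Warning>` component is the standard Mintlify callout named in the evidence above):

```mdx
<Warning>
  Do not append `/v1` to `baseUrl` when using `api: "openai-responses"`.
  Pointing at Ollama's OpenAI-compatible `/v1` endpoint silently breaks
  tool calling: the model emits raw JSON as plain text instead of
  executing tools, with no error surfaced.
</Warning>
```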
## Human Verification (required)
- Verified scenarios: Read through doc flow as a new user setting up remote Ollama
- Edge cases checked: Ensured warning appears before any config example that could mislead
- What I did NOT verify: Live Mintlify rendering (no local Mintlify instance)
## Compatibility / Migration
- Backward compatible? Yes
- Config/env changes? No
- Migration needed? No
## Failure Recovery (if this breaks)
- How to disable/revert: Revert commit
- Files/config to restore: docs/providers/ollama.md
- Known bad symptoms: N/A (docs-only)
## Risks and Mitigations
None. Docs-only change adding warnings.
## Most Similar PRs
- #23776: feat(ollama): auto-register Ollama provider with placeholder key an... (jayy-77, 2026-02-22, 72.8%)
- #23478: feat : add VPS deployment instructions for Ollama and enhance relat... (jayy-77, 2026-02-22, 72.5%)
- #19612: feat(onboarding): add Ollama to onboarding provider list (ParthSareen, 2026-02-18, 69.4%)
- #16098: fix: omit tools param for models without tool support, surface erro... (claw-sylphx, 2026-02-14, 67.0%)
- #16961: docs: warn against storing secrets in injected workspace files (soumikbhatta, 2026-02-15, 66.8%)
- #7278: feat(ollama): optimize local LLM support with auto-discovery and ti... (alltomatos, 2026-02-02, 66.1%)
- #18587: fix(ollama): improve timeout handling and cooldown logic for local ... (manthis, 2026-02-16, 65.9%)
- #20892: docs: Fix quick wins - broken links, configure UX, Tailscale Aperture (chilu18, 2026-02-19, 65.7%)
- #21199: Models: suppress repeated vLLM/Ollama discovery warnings (#21037) (itsishant, 2026-02-19, 65.5%)
- #22107: docs: add local providers troubleshooting guide (pierreeurope, 2026-02-20, 65.1%)