#7432: Comprehensive Ollama Support PR
Labels: docs, cli, commands, docker, agents
Cluster: Ollama Model Enhancements
Consolidates most of the open Ollama PRs into a single, more manageable PR.
<!-- greptile_comment -->
<h2>Greptile Overview</h2>
<h3>Greptile Summary</h3>
This PR consolidates Ollama onboarding and provider support by:
- Adding an interactive auth choice handler for Ollama plus a non-interactive `--ollama-base-url` path.
- Introducing a native Ollama client (`/api/chat` + `/api/tags`) with unit tests.
- Changing provider resolution to auto-discover a local Ollama instance and populate an `ollama` provider when models are detected.
The changes fit into the existing onboarding/auth-choice flow (`src/commands/onboard-*`) and provider discovery (`src/agents/models-config.providers.ts`) by treating Ollama as an OpenAI-compatible provider and wiring it into the same config/profile mechanisms used by other providers.
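The discovery path described above (query a local instance via `/api/tags`, populate an `ollama` provider when models are found) can be sketched roughly as below. This is an illustrative sketch, not the PR's actual code: the function names, the default base URL, and the 2-second timeout are assumptions; only the `/api/tags` endpoint and its `{ models: [{ name }] }` response shape come from the standard Ollama API.

```typescript
// Hypothetical sketch of native Ollama model discovery.
// Assumed response shape for GET /api/tags: { models: [{ name: "llama3:8b", ... }] }.
interface OllamaTagsResponse {
  models?: Array<{ name?: string }>;
}

// Parse the /api/tags payload into model IDs, guarding against
// entries with a missing or empty name (cf. the guard added in #5115).
export function parseOllamaModels(payload: OllamaTagsResponse): string[] {
  return (payload.models ?? [])
    .map((m) => m.name)
    .filter((name): name is string => typeof name === "string" && name.length > 0);
}

// Probe a local instance; resolve to [] rather than throwing when
// Ollama is unreachable, so implicit provider resolution stays quiet.
export async function discoverOllamaModels(
  baseUrl = "http://127.0.0.1:11434", // assumed default; --ollama-base-url overrides it
): Promise<string[]> {
  try {
    const res = await fetch(`${baseUrl}/api/tags`, {
      signal: AbortSignal.timeout(2000), // short timeout so a missing daemon fails fast
    });
    if (!res.ok) return [];
    return parseOllamaModels((await res.json()) as OllamaTagsResponse);
  } catch {
    return [];
  }
}
```

Returning `[]` on any failure keeps the probe side-effect-free from the caller's perspective: no `ollama` provider entry is created unless at least one model is actually reported.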
<h3>Confidence Score: 3/5</h3>
- This PR looks reasonably safe to merge, but has a few integration mismatches that could cause confusing behavior in real Ollama setups.
- Core changes are straightforward and covered by unit tests for the new native client. However, the local API key sentinel likely mismatches between flows (`ollama` vs `ollama-local`), and provider discovery now performs a network probe during implicit provider resolution, which could degrade UX when Ollama isn’t running.
- src/commands/onboard-auth.models.ts, src/agents/models-config.providers.ts, src/commands/onboard-auth.config-core.ts
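The sentinel mismatch flagged above (`ollama` vs `ollama-local`) is the kind of drift a single shared constant would prevent. The sketch below is hypothetical: the constant name, the chosen sentinel value, and the base-URL helper are illustrative, not exports from this PR.

```typescript
// Hypothetical sketch: keep the local API-key sentinel in one place so the
// onboarding flow and provider discovery cannot silently disagree.
export const OLLAMA_LOCAL_API_KEY = "ollama-local"; // assumed sentinel value

export function isOllamaLocalKey(key: string | undefined): boolean {
  return key === OLLAMA_LOCAL_API_KEY;
}

// Normalize a user-supplied --ollama-base-url: trim whitespace and trailing
// slashes so later joins like `${base}/api/tags` never yield `//api/tags`.
export function normalizeOllamaBaseUrl(raw: string): string {
  return raw.trim().replace(/\/+$/, "");
}
```

Both the interactive auth-choice handler and the provider-discovery code would import the same constant, so a rename touches one file instead of three.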
<!-- greptile_other_comments_section -->
**Context used:**
- Context from `dashboard` - CLAUDE.md ([source](https://app.greptile.com/review/custom-context?memory=fd949e91-5c3a-4ab5-90a1-cbe184fd6ce8))
- Context from `dashboard` - AGENTS.md ([source](https://app.greptile.com/review/custom-context?memory=0d0c8278-ef8e-4d6c-ab21-f5527e322f13))
<!-- /greptile_comment -->
Most Similar PRs
- #7278: feat(ollama): optimize local LLM support with auto-discovery and ti... — alltomatos, 2026-02-02 (85.8%)
- #5195: update expose ollama to network settings in docs — MorganMarshall, 2026-01-31 (83.1%)
- #4782: fix: Auto-discover Ollama models without requiring explicit API key — spiceoogway, 2026-01-30 (80.1%)
- #9257: this is my first fork — demonking369, 2026-02-05 (79.8%)
- #16032: Local first — bczaicki, 2026-02-14 (78.5%)
- #19612: feat(onboarding): add Ollama to onboarding provider list — ParthSareen, 2026-02-18 (77.2%)
- #5115: fix: guard against undefined model.name in Ollama discovery (#5062) — TheWildHustle, 2026-01-31 (76.7%)
- #11877: feat(ollama): auto-detect vision capability via /api/show — Nina-VanKhan, 2026-02-08 (76.0%)
- #16098: fix: omit tools param for models without tool support, surface erro... — claw-sylphx, 2026-02-14 (75.8%)
- #18587: fix(ollama): improve timeout handling and cooldown logic for local ... — manthis, 2026-02-16 (74.7%)