#16032: Local first
Labels: docs, scripts, commands, agents, stale, size: XL
Cluster: Ollama Model Enhancements
<!-- greptile_comment -->
<h2>Greptile Overview</h2>
<h3>Greptile Summary</h3>
This PR transforms OpenClaw into a local-first fork (`openclaw-local`) that defaults to Ollama for zero-setup local AI inference. The changes preserve all cloud provider functionality while making local operation the default path.
**Key changes:**
- Default provider switched from Anthropic → Ollama (`llama3.3`)
- Onboarding wizard reordered to present local/Ollama first
- Ollama provider auto-enabled without requiring API keys
- Added native thinking support via Ollama's `think` API with incremental streaming events (`thinking_start`, `thinking_delta`, `thinking_end`, `text_start`, `text_delta`, `text_end`)
- New setup script and example config for streamlined local deployment
- Comprehensive test coverage for new streaming behavior
- Updated documentation with local-first focus, including architecture guide and model recommendations
- Simplified CI workflow (removed heavy platform-specific jobs)
- Added open source scaffolding (Code of Conduct, PR template, CONTRIBUTING guide)
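The default-provider switch might look roughly like this in the example config (a sketch only; the key names `provider`, `model`, `baseUrl`, and `think` are assumptions for illustration, not the fork's actual schema — `http://localhost:11434` is Ollama's default listen address):

```json
{
  "provider": "ollama",
  "model": "llama3.3",
  "ollama": {
    "baseUrl": "http://localhost:11434",
    "think": true
  }
}
```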
**Implementation quality:**
- Streaming implementation properly handles thinking/text block transitions with correct index management
- Tests verify both thinking-enabled and thinking-disabled modes
- Error handling preserved for Ollama API failures
- Backward compatible — all cloud providers still functional
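The thinking/text block transitions and index management described above can be sketched as follows. This is a hedged illustration, not the PR's code: the chunk shape (`message.thinking` / `message.content`) follows Ollama's `/api/chat` streaming response when `think` is enabled, but the event names and the index scheme here are assumptions based on the summary.

```typescript
// Minimal sketch: fold Ollama streaming chunks into block-scoped events.
// Each contiguous run of thinking or text gets its own block index, with
// matching *_start / *_end events around the deltas.

type OllamaChunk = { message: { thinking?: string; content?: string }; done?: boolean };

type StreamEvent =
  | { type: "thinking_start" | "thinking_end" | "text_start" | "text_end"; index: number }
  | { type: "thinking_delta" | "text_delta"; index: number; delta: string };

function toEvents(chunks: OllamaChunk[]): StreamEvent[] {
  const events: StreamEvent[] = [];
  let index = -1;                                  // current block index
  let open: "thinking" | "text" | null = null;     // which block is open

  const closeOpen = (): void => {
    if (open === "thinking") events.push({ type: "thinking_end", index });
    else if (open === "text") events.push({ type: "text_end", index });
  };

  for (const { message } of chunks) {
    if (message.thinking) {
      if (open !== "thinking") {                   // transition into a thinking block
        closeOpen();
        events.push({ type: "thinking_start", index: ++index });
        open = "thinking";
      }
      events.push({ type: "thinking_delta", index, delta: message.thinking });
    }
    if (message.content) {
      if (open !== "text") {                       // transition into a text block
        closeOpen();
        events.push({ type: "text_start", index: ++index });
        open = "text";
      }
      events.push({ type: "text_delta", index, delta: message.content });
    }
  }
  closeOpen();                                     // close the trailing block
  return events;
}
```

With `think` disabled, chunks carry only `message.content`, so the same fold degenerates to a single `text_start` / `text_delta`* / `text_end` sequence — consistent with the tests covering both modes.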
<h3>Confidence Score: 5/5</h3>
- This PR is safe to merge with minimal risk
- The implementation is well-tested with comprehensive coverage of the new thinking/streaming features. The changes are primarily configuration defaults and new features rather than risky refactors. Ollama integration has proper error handling, and all existing cloud provider functionality is preserved. The only minor concerns are style-related (curl piping patterns in docs), which don't affect runtime safety.
- No files require special attention — the implementation is solid with good test coverage
<sub>Last reviewed commit: ae5f5b9</sub>
<!-- greptile_other_comments_section -->
<!-- /greptile_comment -->
Most Similar PRs

| PR | Author | Date | Similarity |
| --- | --- | --- | --- |
| #9257: this is my first fork | demonking369 | 2026-02-05 | 83.1% |
| #7278: feat(ollama): optimize local LLM support with auto-discovery and ti... | alltomatos | 2026-02-02 | 79.5% |
| #7432: Comprehensive Ollama Support PR | charlieduzstuf | 2026-02-02 | 78.5% |
| #5195: update expose ollama to network settings in docs | MorganMarshall | 2026-01-31 | 77.7% |
| #4782: fix: Auto-discover Ollama models without requiring explicit API key | spiceoogway | 2026-01-30 | 77.0% |
| #18587: fix(ollama): improve timeout handling and cooldown logic for local ... | manthis | 2026-02-16 | 76.5% |
| #22797: Feat/auto thinking mode | jrthib | 2026-02-21 | 75.1% |
| #15606: LLM Task: add explicit thinking level wiring | xadenryan | 2026-02-13 | 74.7% |
| #7051: Add io-intelligence model inference provider | rajagurunath | 2026-02-02 | 74.6% |
| #16098: fix: omit tools param for models without tool support, surface erro... | claw-sylphx | 2026-02-14 | 74.0% |