#5783: fix(ollama): add streamToolCalls fallback for tool calling
Labels: agents
Cluster: Ollama Model Enhancements
## Summary
- Adds a `streamToolCalls` provider config option that, when set to `false`, disables streaming for requests that include tools
- Ollama provider now defaults to `streamToolCalls: false`
- Implements non-streaming fallback using `complete()` instead of `streamSimple()` when tools are in the context

Ollama's streaming implementation doesn't properly emit `tool_calls` delta chunks, causing tool calls to be silently lost. This fix detects when tools are present and uses non-streaming requests for affected providers.
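The routing described above can be sketched roughly as follows. This is a minimal illustration of the fallback logic, not the actual `pi-ai` API: the `StreamFn` signature, `ProviderConfig` shape, and the `streamSimple`/`complete` parameters are all simplified assumptions based on the PR description.

```typescript
// Simplified sketch of the tool-call fallback — types and signatures here
// are assumptions, not the real pi-ai/pi-embedded-runner interfaces.
type Message = { role: string; content: string };
type Tool = { name: string };
type StreamFn = (messages: Message[], tools?: Tool[]) => Promise<string>;

interface ProviderConfig {
  name: string;
  // When false, the provider cannot reliably stream tool calls.
  streamToolCalls?: boolean;
}

// True only when the provider opts out of streaming tool calls AND the
// current request actually carries tools.
function shouldDisableStreamingForTools(
  provider: ProviderConfig,
  tools?: Tool[],
): boolean {
  return provider.streamToolCalls === false && (tools?.length ?? 0) > 0;
}

// Wraps a streaming function so that requests with tools are routed to the
// non-streaming path for providers like Ollama; everything else streams.
function createOllamaAwareStreamFn(
  provider: ProviderConfig,
  streamSimple: StreamFn,
  complete: StreamFn,
): StreamFn {
  return (messages, tools) =>
    shouldDisableStreamingForTools(provider, tools)
      ? complete(messages, tools)
      : streamSimple(messages, tools);
}
```

Under this sketch, providers that never set `streamToolCalls` keep today's streaming behavior; only an explicit `streamToolCalls: false` (Ollama's new default) triggers the fallback, and only when tools are actually in the request.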
## Test plan
- [x] Added unit tests for `isOllamaProvider()`, `shouldDisableStreamingForTools()`, and `createOllamaAwareStreamFn()`
- [x] All 11 new tests passing
- [x] Existing `pi-embedded-runner` tests still pass
- [x] TypeScript compiles without errors
- [x] Linter passes
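For a sense of what the `isOllamaProvider()` helper exercised by these tests might check, here is a hypothetical sketch. The actual detection criteria are not shown in this PR description, so the name match and default-port heuristic below are assumptions for illustration only.

```typescript
// Hypothetical sketch — the real isOllamaProvider() criteria are not in
// this PR description; the name and port checks are assumptions.
interface ProviderInfo {
  name?: string;
  baseUrl?: string;
}

function isOllamaProvider(provider: ProviderInfo): boolean {
  if (provider.name?.toLowerCase() === "ollama") return true;
  // Ollama's default local endpoint listens on port 11434.
  return /\b(localhost|127\.0\.0\.1):11434\b/.test(provider.baseUrl ?? "");
}
```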
Fixes #5769
<!-- greptile_comment -->
<h2>Greptile Overview</h2>
<h3>Greptile Summary</h3>
This PR introduces a `streamToolCalls` provider config flag (defaulting to `false` for Ollama) and adds an Ollama-aware `StreamFn` wrapper that switches to non-streaming `complete()` when tools are present, working around Ollama streaming not emitting tool-call deltas. It integrates the wrapper into the embedded runner’s stream function setup and adds unit tests for the new helper logic.
The main concerns are the fallback's stream event semantics and option parity between `streamSimple()` and the `complete()`-based path (potentially changing stop reasons and dropping some options, but only when tools are present).
<h3>Confidence Score: 3/5</h3>
- Reasonably safe to merge, but verify stream event semantics in the complete() fallback.
- Core change is a targeted fallback for Ollama tool-calls and is covered by basic unit tests, but the new complete()-based stream emulation may not match pi-ai’s expected event/stop-reason conventions and may drop some options compared to the streaming path, which could cause subtle regressions only when tools are present.
- src/agents/pi-embedded-runner/ollama-stream.ts
<!-- greptile_other_comments_section -->
<!-- /greptile_comment -->
## Most Similar PRs

| PR | Author | Date | Similarity |
| --- | --- | --- | --- |
| #16098: fix: omit tools param for models without tool support, surface erro... | claw-sylphx | 2026-02-14 | 79.5% |
| #12248: fix: wire streaming config field through resolveExtraParams to stre... | mcaxtr | 2026-02-09 | 77.7% |
| #7278: feat(ollama): optimize local LLM support with auto-discovery and ti... | alltomatos | 2026-02-02 | 76.4% |
| #19394: fix(agents): normalize tool call arguments dropped to {} (#19261) | DevvGwardo | 2026-02-17 | 74.4% |
| #20665: fix(ollama): register API provider for completeSimple/compaction | BrokenFinger98 | 2026-02-19 | 74.1% |
| #19235: fix(telegram): tool error warnings no longer overwrite streamed rep... | gatewaybuddy | 2026-02-17 | 72.3% |
| #5115: fix: guard against undefined model.name in Ollama discovery (#5062) | TheWildHustle | 2026-01-31 | 71.7% |
| #20705: fix(ollama): cover remaining completeSimple gaps and add transcript... | BrokenFinger98 | 2026-02-19 | 71.1% |
| #11210: Harden Tool-Call Streaming Parser for OpenAI-Completions Backends (... | ga-it | 2026-02-07 | 71.0% |
| #18587: fix(ollama): improve timeout handling and cooldown logic for local ... | manthis | 2026-02-16 | 70.7% |