#12248: fix: wire streaming config field through resolveExtraParams to streamFn
Labels: agents · size: S · trusted-contributor · experienced-contributor
Cluster: Model Input and Streaming Fixes
## Summary
- Wire the `agents.defaults.models[key].streaming` config field through `resolveExtraParams()` so it reaches the stream function options
- Previously this field was declared in the Zod schema and TypeScript types but never consumed at runtime — `resolveExtraParams()` only read `modelConfig.params`, ignoring the top-level `streaming` boolean
- Now `resolveExtraParams()` merges `modelConfig.streaming` into the returned params object, and `createStreamFnWithExtraParams()` passes the `streaming` field through to the streamFn options
- This matches the existing Ollama workaround pattern (`params.streaming: false` in model discovery)
Fixes #12218
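The merge described above can be sketched as follows. This is a minimal illustration, not the real `extra-params.ts`: the `ModelConfig` shape and the exact return conventions are assumptions simplified from the config schema described in this PR.

```typescript
// Hypothetical, simplified model-config shape (assumption; the real type
// lives in the OpenClaw config schema/types).
type ModelConfig = {
  params?: Record<string, unknown>;
  streaming?: boolean;
};

// Sketch of the fix: previously only `modelConfig.params` was read; now the
// top-level `streaming` boolean is merged into the resolved params so it can
// reach the streamFn options downstream.
function resolveExtraParams(
  modelConfig: ModelConfig | undefined,
): Record<string, unknown> | undefined {
  if (!modelConfig) return undefined;
  const hasStreaming = typeof modelConfig.streaming === "boolean";
  if (!hasStreaming && !modelConfig.params) return undefined;
  const params = modelConfig.params ?? {};
  return hasStreaming
    ? { ...params, streaming: modelConfig.streaming }
    : params;
}
```

Note that `streaming` is merged last, so a top-level `streaming` takes precedence over any `params.streaming` entry; the real implementation may choose a different precedence.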
## Test plan
All 3 new tests fail before the fix and pass after it:
- [x] `resolveExtraParams` includes `streaming` field when set in config (without `params`)
- [x] `resolveExtraParams` merges `streaming` with `params` when both are set
- [x] `applyExtraParamsToAgent` passes `streaming: false` through to streamFn options (end-to-end)
- [x] All 4 existing tests continue to pass
- [x] `pnpm build` passes
- [x] `pnpm check` passes (lint + format)
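The end-to-end case in the test plan hinges on the streamFn wrapper. A minimal sketch of that wrapping, with the `StreamFn` signature assumed for illustration (the real one is in `src/agents/pi-embedded-runner/extra-params.ts`):

```typescript
// Assumed, simplified streamFn signature for illustration only.
type StreamOptions = Record<string, unknown>;
type StreamFn = (prompt: string, options: StreamOptions) => Promise<string>;

// Sketch of createStreamFnWithExtraParams(): spread the resolved extra
// params (including the `streaming` boolean) into the options passed to
// the underlying streamFn on every call.
function createStreamFnWithExtraParams(
  streamFn: StreamFn,
  extraParams: StreamOptions,
): StreamFn {
  return (prompt, options) => streamFn(prompt, { ...options, ...extraParams });
}

// Usage: the wrapped streamFn now sees `streaming: false` in its options.
const logOptions: StreamFn = async (_prompt, options) => JSON.stringify(options);
const wrapped = createStreamFnWithExtraParams(logOptions, { streaming: false });
```

The third test in the plan effectively asserts that a call through `wrapped` surfaces `streaming: false` in the options object the base streamFn receives.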
## Note on downstream provider consumption
The `streaming` field now flows correctly through the OpenClaw config resolution pipeline. However, whether a given pi-ai provider actually reads `options.streaming` to toggle HTTP-level streaming (`stream: true/false`) depends on the pi-ai library. Currently, providers like `openai-completions` hardcode `stream: true` in their `buildParams()` function. This is the same limitation that affects the existing Ollama workaround (`params.streaming: false`). When pi-ai adds per-request streaming control, this config path will "just work."
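To make the limitation concrete, here is a hypothetical before/after of a provider's `buildParams()`. Both functions are assumptions for illustration; they are not quotes of pi-ai's actual code.

```typescript
// Hypothetical: how a provider like openai-completions behaves today,
// hardcoding HTTP-level streaming regardless of options.streaming.
function buildParamsToday(
  model: string,
  messages: unknown[],
): Record<string, unknown> {
  return { model, messages, stream: true }; // options.streaming is ignored
}

// Hypothetical: per-request streaming control, once pi-ai supports it.
// The config path added in this PR would feed `options.streaming` here.
function buildParamsWithControl(
  model: string,
  messages: unknown[],
  options: { streaming?: boolean },
): Record<string, unknown> {
  return { model, messages, stream: options.streaming ?? true };
}
```

Until providers adopt something like the second form, setting `streaming: false` in the config changes the options object but not the HTTP request.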
<!-- greptile_comment -->
<h2>Greptile Overview</h2>
<h3>Greptile Summary</h3>
This PR fixes a gap in the embedded runner’s config plumbing: the `agents.defaults.models["<provider>/<modelId>"].streaming` boolean was present in the config schema/types but wasn’t used at runtime.
Changes:
- `src/agents/pi-embedded-runner/extra-params.ts` now includes `modelConfig.streaming` in `resolveExtraParams()` output, and `createStreamFnWithExtraParams()` passes a boolean `streaming` value through into the streamFn option object.
- `src/agents/pi-embedded-runner-extraparams.test.ts` adds focused tests covering (a) `streaming`-only model config, (b) merge with `params`, and (c) end-to-end propagation into streamFn options.
This fits into the existing embedded runner pipeline in `src/agents/pi-embedded-runner/run/attempt.ts`, where `applyExtraParamsToAgent()` wraps the session’s `streamFn` to inject provider/model-specific stream options derived from the OpenClaw config.
<h3>Confidence Score: 5/5</h3>
- This PR is safe to merge with minimal risk.
- The change is small and well-scoped: it extends existing extra-param resolution/wrapping to include a boolean `streaming` field, and adds tests to lock the behavior in. I didn’t find any breaking behavior changes beyond the intended config-to-options propagation.
- No files require special attention
<!-- greptile_other_comments_section -->
<!-- /greptile_comment -->
## Most Similar PRs
- #6136: fix(agents): cast streamParams to allow cacheRetention property ass... · by macd2 · 2026-02-01 · 77.9%
- #5783: fix(ollama): add streamToolCalls fallback for tool calling · by deepmehta11 · 2026-01-31 · 77.7%
- #2353: fix: ensure api field is set for inline provider models · by sbknana · 2026-01-26 · 76.8%
- #22175: fix: support xai tool stream and compat flags · by ShunsukeHayashi · 2026-02-20 · 76.1%
- #12999: feat(agents): Add streaming response metrics tracking · by trevorgordon981 · 2026-02-10 · 76.0%
- #14640: feat(agents): support per-agent temperature and maxTokens in agents... · by lailoo · 2026-02-12 · 75.9%
- #13626: fix(model): propagate provider model properties in fallback resolution · by mcaxtr · 2026-02-10 · 75.6%
- #19394: fix(agents): normalize tool call arguments dropped to {} (#19261) · by DevvGwardo · 2026-02-17 · 75.0%
- #5764: fix(telegram): enable streaming in private chats without topics · by garnetlyx · 2026-01-31 · 74.9%
- #9822: fix: allow local/custom model providers for sub-agent inference · by stammtobias91 · 2026-02-05 · 74.9%