#16098: fix: omit tools param for models without tool support, surface errors (#15702)
## Problem
When using local Ollama models that don't support tool/function calling, OpenClaw fails silently with `(no output)`. Direct curl to Ollama works fine. The issue is that OpenClaw sends tool definitions to models that can't handle them, and errors are swallowed.
Closes #15702
## Changes
### 1. Add `compat.supportsTools` config flag (`src/config/`)
- Added `supportsTools?: boolean` to `ModelCompatConfig` type and Zod schema
- When set to `false`, tools are omitted from API requests to the model
- Defaults to `true` (existing behavior preserved)
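The default-to-`true` behavior can be sketched as follows. This is a minimal illustration, not the actual source: the type names mirror those mentioned above, but the `toolsSupported` helper is a hypothetical name introduced here to show how an unset flag preserves existing behavior.

```typescript
// Sketch only: field shapes assumed from the PR description.
interface ModelCompatConfig {
  supportsTools?: boolean;
}

interface ModelConfig {
  id: string;
  compat?: ModelCompatConfig;
}

// Tools are included unless supportsTools is explicitly false,
// so existing configs (flag unset) keep their current behavior.
function toolsSupported(model: ModelConfig): boolean {
  return model.compat?.supportsTools !== false;
}
```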
### 2. Skip tools when model lacks support (`src/agents/pi-embedded-runner/run/attempt.ts`)
- Check `model.compat.supportsTools` before creating tool definitions
- When `supportsTools === false`, tools are omitted (same as `disableTools`)
- Logs an info message when tools are skipped for debugging
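The gating described above might look roughly like this. The function and parameter names are assumptions for illustration; only the behavior (treating `supportsTools === false` like `disableTools`, with an info log) comes from the PR description.

```typescript
// Hypothetical sketch of the tool-selection logic in attempt.ts.
type ToolDef = { name: string };

function selectTools(
  model: { compat?: { supportsTools?: boolean } },
  disableTools: boolean,
  allTools: ToolDef[],
): ToolDef[] | undefined {
  // supportsTools === false behaves like disableTools:
  // the tools param is omitted from the request entirely.
  if (disableTools || model.compat?.supportsTools === false) {
    console.info("tools omitted: model does not support tool calling");
    return undefined;
  }
  return allTools;
}
```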
### 3. Surface Ollama errors (`src/agents/ollama-stream.ts`)
- Detect and surface Ollama-level `error` field in streamed responses (e.g. model doesn't support tools)
- Detect empty responses when tools were sent and surface a helpful error message suggesting `compat.supportsTools=false`
- Added `error?` field to `OllamaChatResponse` interface
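The two detection paths above can be sketched as a single pass over collected stream chunks. The `checkStream` helper and its error wording are illustrative assumptions; only the `error?` field on `OllamaChatResponse` and the empty-response-with-tools heuristic are from the PR.

```typescript
// Sketch: surface errors from an Ollama NDJSON chat stream.
interface OllamaChatResponse {
  message?: { content?: string };
  done?: boolean;
  error?: string;
}

function checkStream(chunks: OllamaChatResponse[], toolsWereSent: boolean): string {
  let text = "";
  for (const chunk of chunks) {
    // (1) Ollama reports failures (e.g. "model does not support tools")
    // in an error field rather than an HTTP error status.
    if (chunk.error) {
      throw new Error(`Ollama error: ${chunk.error}`);
    }
    text += chunk.message?.content ?? "";
  }
  // (2) An empty response when tools were sent is a strong hint the
  // model silently choked on the tools param.
  if (text.trim() === "" && toolsWereSent) {
    throw new Error(
      "Empty response with tools enabled; if this model does not support " +
        "tool calling, set compat.supportsTools=false for it.",
    );
  }
  return text;
}
```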
### 4. Tests (`src/agents/ollama-stream.test.ts`)
- Test: Ollama error field is surfaced from streamed response
- Test: empty response with tools triggers helpful error message
- Test: empty response without tools does NOT trigger error (normal behavior)
## Usage
For Ollama models that don't support tools, add `supportsTools: false` to the model's compat config:
```json
{
  "models": {
    "providers": {
      "my-ollama": {
        "baseUrl": "http://127.0.0.1:11434",
        "api": "ollama",
        "models": [{
          "id": "tinyllama",
          "name": "TinyLlama",
          "compat": { "supportsTools": false }
        }]
      }
    }
  }
}
```
<!-- greptile_comment -->
<h3>Greptile Summary</h3>
This PR addresses issue #15702 where Ollama models without tool/function calling support fail silently with `(no output)`. It makes two main changes:
- Adds a `compat.supportsTools` config flag that, when set to `false`, omits tools from API requests to the model. This is checked in `attempt.ts` alongside the existing `disableTools` flag, and the type + Zod schema are updated accordingly.
- Surfaces Ollama-level errors that were previously swallowed: (1) detects the `error` field in streamed NDJSON responses, and (2) detects empty responses when tools were sent, providing a helpful error message pointing users to the `compat.supportsTools=false` config.
Additionally, this PR changes `sanitizeAntigravityThinkingBlocks` to convert unsigned thinking blocks to text blocks instead of dropping them entirely, preserving reasoning content. This is a separate behavioral fix bundled into the same commit (from #15681).
Test coverage is thorough across all changes, with both unit tests and e2e integration tests.
<h3>Confidence Score: 4/5</h3>
- This PR is safe to merge with minimal risk — changes are well-scoped, defensive, and backward-compatible.
- The config flag addition is backward-compatible (defaults to true). The Ollama error surfacing is a clear improvement over silent failures. The antigravity thinking block change preserves content that was previously lost. All changes have thorough test coverage. The `Record<string, unknown>` cast in attempt.ts is pragmatic given the external SDK type. No critical issues found.
- No files require special attention. The `src/agents/ollama-stream.ts` empty-response heuristic has a minor edge case with whitespace-only responses (already discussed in previous review threads), but this is unlikely to matter in practice.
<sub>Last reviewed commit: 3975256</sub>
<!-- greptile_other_comments_section -->
<!-- /greptile_comment -->
## Most Similar PRs
- #5115: fix: guard against undefined model.name in Ollama discovery (#5062) · by TheWildHustle · 2026-01-31 · 82.9%
- #16328: fix: add 'does not support tools' to format error patterns for fail... · by virtualassistanterion · 2026-02-14 · 82.1%
- #18587: fix(ollama): improve timeout handling and cooldown logic for local ... · by manthis · 2026-02-16 · 80.5%
- #4782: fix: Auto-discover Ollama models without requiring explicit API key · by spiceoogway · 2026-01-30 · 80.4%
- #7278: feat(ollama): optimize local LLM support with auto-discovery and ti... · by alltomatos · 2026-02-02 · 80.1%
- #11877: feat(ollama): auto-detect vision capability via /api/show · by Nina-VanKhan · 2026-02-08 · 79.9%
- #5783: fix(ollama): add streamToolCalls fallback for tool calling · by deepmehta11 · 2026-01-31 · 79.5%
- #9257: this is my first fork · by demonking369 · 2026-02-05 · 78.1%
- #21977: Preserve provider API for discovered Ollama models · by graysurf · 2026-02-20 · 77.6%
- #7044: feat: Add local model tool calling support · by jokelord · 2026-02-02 · 77.0%