#7981: fix(runner): use configured primary model as fallback default
Labels: `agents`, `stale`
Cluster: Model Configuration Enhancements
I am using the google-gemini model as my primary model, but whenever I click New Session, it throws `Error: No API key found for provider "anthropic"`.
## Problem
Internal tools (like `llm-slug-generator`) call `runEmbeddedPiAgent` without specifying a model. The runner was hardcoded to fall back to `anthropic/claude-opus-4-5`, so if no Anthropic API key is configured, it crashes.
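For context, the configured primary model might look like the following fragment. This is an illustrative sketch only: the exact config schema, the `{provider, model}` object shape, and the example values are assumptions mirroring the `params.config.agents.defaults.model.primary` path referenced in the fix.

```json
{
  "agents": {
    "defaults": {
      "model": {
        "primary": { "provider": "google", "model": "gemini-2.5-pro" }
      }
    }
  }
}
```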
## Solution
Modified `src/agents/pi-embedded-runner/run.ts` to check `params.config.agents.defaults.model.primary` before falling back to the hardcoded defaults.
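A minimal sketch of one reasonable precedence order, for illustration only. The type shapes, function name `resolveDefaultModel`, and constants are hypothetical, not the PR's actual code; note this sketch consults the config primary whenever either param is missing, whereas the review comment suggests the PR only does so when both are absent.

```typescript
// Hypothetical shapes; the real types live in the project's config modules.
interface ModelRef { provider: string; model: string; }
interface RunParams {
  provider?: string;
  model?: string;
  config?: { agents?: { defaults?: { model?: { primary?: ModelRef } } } };
}

// Hardcoded last-resort defaults, as described in the Problem section.
const DEFAULT_PROVIDER = "anthropic";
const DEFAULT_MODEL = "claude-opus-4-5";

// Resolution order: explicit params > configured primary > hardcoded default.
function resolveDefaultModel(params: RunParams): ModelRef {
  if (params.provider && params.model) {
    return { provider: params.provider, model: params.model };
  }
  const primary = params.config?.agents?.defaults?.model?.primary;
  if (primary) {
    return primary;
  }
  return { provider: DEFAULT_PROVIDER, model: DEFAULT_MODEL };
}
```

With this shape, an embedded caller like the slug generator that passes no model would pick up the user's configured primary instead of crashing on a missing Anthropic key.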
<!-- greptile_comment -->
<h2>Greptile Overview</h2>
<h3>Greptile Summary</h3>
This PR changes embedded runner default model resolution so that when `provider`/`model` aren’t provided in params, it falls back to `agents.defaults.model.primary` from config before using the hardcoded `DEFAULT_PROVIDER`/`DEFAULT_MODEL`. The logic is implemented in `src/agents/pi-embedded-runner/run.ts` prior to `resolveModel(...)`, affecting which model/provider the embedded agent uses by default.
<h3>Confidence Score: 4/5</h3>
- This PR is likely safe to merge, but the new precedence logic may not match the stated intent in partial-parameter cases.
- Change is localized and straightforward, but the condition that only consults config primary when both params are absent could produce unexpected model selection when only one of provider/model is specified.
- src/agents/pi-embedded-runner/run.ts
<!-- greptile_other_comments_section -->
<sub>(3/5) Reply to the agent's comments like "Can you suggest a fix for this @greptileai?" or ask follow-up questions!</sub>
**Context used:**
- Context from `dashboard` - CLAUDE.md ([source](https://app.greptile.com/review/custom-context?memory=fd949e91-5c3a-4ab5-90a1-cbe184fd6ce8))
- Context from `dashboard` - AGENTS.md ([source](https://app.greptile.com/review/custom-context?memory=0d0c8278-ef8e-4d6c-ab21-f5527e322f13))
<!-- /greptile_comment -->
## Most Similar PRs

- #5945: fix: use configured model for slug generator (AI-assisted) · HEDELKA · 2026-02-01 · 84.9%
- #9080: Fix: Use configured model for memory file slug generation · vishaltandale00 · 2026-02-04 · 84.8%
- #13401: fix: slug generator should use agent's primary model instead of har... · pahud · 2026-02-10 · 80.9%
- #23286: fix: use configured model in llm-slug-generator instead of hardcoded … · wsman · 2026-02-22 · 80.7%
- #4793: hooks: use configured model for slug generator · yoyooyooo · 2026-01-30 · 80.4%
- #7570: fix: allow models from providers with auth profiles configured · DonSqualo · 2026-02-03 · 78.8%
- #13191: pi-embedded: enable failover when per-agent fallbacks are configured · zesty-clawd · 2026-02-10 · 78.3%
- #15632: fix: use provider-qualified key in MODEL_CACHE for context window l... · linwebs · 2026-02-13 · 78.2%
- #15574: fix(hooks): use configured model for llm slug generation (#15510) · TsekaLuk · 2026-02-13 · 78.2%
- #20712: fix(subagents): prioritize agent runtime default model over global ... · sourcesavant · 2026-02-19 · 77.6%