
#14464: fix(hooks): use configured primary model for slug generation (#14272)

by lailoo · open · 2026-02-12 05:33
Labels: stale · size: S
## Summary

Fixes #14272

## Problem

`generateSlugViaLLM` in `src/hooks/llm-slug-generator.ts` hardcodes `anthropic` as the provider and `claude-haiku` as the model for slug generation. Users who configure a different primary model (e.g. `openai/gpt-4.1-mini`) still get Anthropic API calls, which fail if they don't have an Anthropic API key.

## Fix

Use `resolveConfiguredModelRef()` to read the user's configured primary model and pass its `provider`/`model` to `runEmbeddedPiAgent`. Falls back to the original Anthropic defaults when no model is configured.

## Reproduction & Verification

### Before fix (main branch): bug confirmed

- `src/hooks/llm-slug-generator.ts` hardcodes `provider: "anthropic"` and `modelId: "claude-haiku"`; no test file exists on main.
- Users with non-Anthropic configs get a `FailoverError` when the session-memory hook tries to generate slugs.

### After fix: all verified

```
✓ passes configured primary model to runEmbeddedPiAgent (#14272)
✓ falls back to default Anthropic model when no primary is configured
✓ returns cleaned slug from LLM response
✓ returns null when LLM returns no payloads
✓ returns null when LLM throws

5 tests pass (pnpm vitest run src/hooks/llm-slug-generator.test.ts)
```

## Testing

- ✅ 5 tests pass (`pnpm vitest run src/hooks/llm-slug-generator.test.ts`)
- ✅ Lint passes
