#22069: fix(agents): add provider-specific hints for local model auth errors
agents
size: S
Cluster:
Ollama Model Enhancements
When Ollama, vLLM, or LM Studio users configure a local model as their primary but forget to set an API key, they get a confusing error that points to auth-store paths and `openclaw agents add`, none of which applies to local providers.
These providers accept any dummy value as an API key (e.g. `OLLAMA_API_KEY="ollama-local"`), but the current error doesn't mention that.
**Changes:**
- `model-auth.ts`: detect known local providers (`ollama`, `vllm`, `lmstudio`) in the `resolveApiKeyForProvider` error path and show provider-specific hints with the right env var name and docs link
- 3 new tests confirming ollama/vllm get specific hints while other providers still get the generic message
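The hint lookup described above could be sketched roughly as follows. Names like `localProviderHints` come from this PR's description, but the `ProviderHint` shape, the `buildMissingKeyError` helper, and the vLLM/LM Studio docs URLs are illustrative assumptions, not the actual `model-auth.ts` code:

```typescript
// Hypothetical sketch of the provider-specific hint lookup.
// Only the ollama docs URL is confirmed by the PR; the rest are assumed.
interface ProviderHint {
  envVar: string;
  docsUrl: string;
}

const localProviderHints: Record<string, ProviderHint> = {
  ollama: { envVar: "OLLAMA_API_KEY", docsUrl: "https://docs.openclaw.ai/providers/ollama" },
  vllm: { envVar: "VLLM_API_KEY", docsUrl: "https://docs.openclaw.ai/providers/vllm" },
  lmstudio: { envVar: "LMSTUDIO_API_KEY", docsUrl: "https://docs.openclaw.ai/providers/lmstudio" },
};

function buildMissingKeyError(provider: string): string {
  const hint = localProviderHints[provider];
  if (hint) {
    // Local providers accept any placeholder key; point users at the env var.
    return (
      `No API key found for provider "${provider}". ` +
      `${provider} needs any value as API key to register as a provider. ` +
      `Set ${hint.envVar}="${provider}-local" or run "openclaw configure". Docs: ${hint.docsUrl}`
    );
  }
  // Cloud providers keep the generic auth-store guidance.
  return (
    `No API key found for provider "${provider}". ` +
    `Configure auth for this agent (openclaw agents add <id>) or copy auth-profiles.json from the main agentDir.`
  );
}
```

Keeping the hints in a plain map (rather than an `if`/`else` chain) makes adding future local providers a one-line change.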
**Before:**
```
No API key found for provider "ollama". Auth store: /path/to/auth-profiles.json (agentDir: ...).
Configure auth for this agent (openclaw agents add <id>) or copy auth-profiles.json from the main agentDir.
```
**After:**
```
No API key found for provider "ollama". Ollama needs any value as API key to register as a provider.
Set OLLAMA_API_KEY="ollama-local" or run "openclaw configure". Docs: https://docs.openclaw.ai/providers/ollama
```
Addresses the confusion in #22055, #4544, #1695. Complementary to #20872 (auto-discovery without key).
<!-- greptile_comment -->
<h3>Greptile Summary</h3>
Improves error messages for local model providers (Ollama, vLLM, LM Studio) by adding provider-specific hints that explain users need to set a dummy API key value. Replaces generic auth-store error with targeted guidance including env var names and docs links.
**Key changes:**
- Added `localProviderHints` map in `resolveApiKeyForProvider` error path with custom messages for ollama/vllm/lmstudio
- Created 3 new tests to verify specific hints appear for local providers vs generic message for cloud providers
**Issues found:**
- Test file lacks environment-variable isolation: tests will fail if `OLLAMA_API_KEY`, `VLLM_API_KEY`, or `ANTHROPIC_API_KEY` happen to be set in the environment
- LM Studio hint references `LMSTUDIO_API_KEY` env var, but this variable is not registered in the `envMap` (unlike ollama/vllm), so the suggested fix won't actually work
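One straightforward fix for the LM Studio issue is registering the variable alongside the others. The shape of `envMap` below is an assumption for illustration, not the actual codebase structure:

```typescript
// Assumed shape of the provider -> env var lookup; the lmstudio entry is the fix.
const envMap: Record<string, string> = {
  ollama: "OLLAMA_API_KEY",
  vllm: "VLLM_API_KEY",
  lmstudio: "LMSTUDIO_API_KEY", // previously unregistered, so the hint's suggested fix did nothing
};

// With the entry in place, a key lookup works uniformly for all three local providers.
function envVarFor(provider: string): string | undefined {
  return envMap[provider];
}
```

The alternative is correcting the hint text to not mention `LMSTUDIO_API_KEY`, but registering the variable keeps the three local providers consistent.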
<h3>Confidence Score: 2/5</h3>
- PR has critical issues: tests lack env isolation and will fail unpredictably, plus LM Studio hint suggests non-functional fix
- The core feature (provider-specific hints) is sound, but the implementation has two logical errors: (1) the tests don't isolate environment variables and will fail if certain env vars are set, ignoring an isolation pattern established elsewhere in the codebase; (2) the LM Studio hint tells users to set an env var that the code doesn't actually support
- Both files need attention - test file needs env isolation pattern applied to all 3 tests, and model-auth.ts needs LM Studio env var either added to envMap or hint message corrected
<sub>Last reviewed commit: f43a7e0</sub>
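The env isolation the review asks for can be handled with a pair of small helpers. These function names are hypothetical, not from the codebase; the idea is to snapshot and clear the relevant variables before each test and restore them afterward:

```typescript
// Hypothetical helpers for the env isolation the review requests.
function snapshotAndClearEnv(names: string[]): Record<string, string | undefined> {
  const saved: Record<string, string | undefined> = {};
  for (const name of names) {
    saved[name] = process.env[name];
    delete process.env[name]; // ensure ambient keys can't satisfy the lookup under test
  }
  return saved;
}

function restoreEnv(saved: Record<string, string | undefined>): void {
  for (const [name, value] of Object.entries(saved)) {
    if (value === undefined) delete process.env[name];
    else process.env[name] = value; // put the caller's environment back exactly as it was
  }
}
```

In the test file these would typically be wired into `beforeEach`/`afterEach` hooks with `["OLLAMA_API_KEY", "VLLM_API_KEY", "ANTHROPIC_API_KEY"]`, so all three tests run against a clean environment regardless of the developer's shell.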
<!-- greptile_other_comments_section -->
<!-- /greptile_comment -->
**Most Similar PRs**
- #4782: fix: Auto-discover Ollama models without requiring explicit API key · by spiceoogway · 2026-01-30 · 79.3%
- #7278: feat(ollama): optimize local LLM support with auto-discovery and ti... · by alltomatos · 2026-02-02 · 79.0%
- #7568: feat(agents): add LM Studio auto-discovery and provider support · by sjseo298 · 2026-02-03 · 77.8%
- #17469: Improve unknown-model errors for provider/model misconfiguration · by megahappyclaw · 2026-02-15 · 77.6%
- #22107: docs: add local providers troubleshooting guide · by pierreeurope · 2026-02-20 · 76.8%
- #18587: fix(ollama): improve timeout handling and cooldown logic for local ... · by manthis · 2026-02-16 · 76.3%
- #9822: fix: allow local/custom model providers for sub-agent inference · by stammtobias91 · 2026-02-05 · 75.2%
- #15632: fix: use provider-qualified key in MODEL_CACHE for context window l... · by linwebs · 2026-02-13 · 74.8%
- #21977: Preserve provider API for discovered Ollama models · by graysurf · 2026-02-20 · 74.2%
- #12059: feat(agents): Add Azure AI Foundry credential support · by lisanyambere · 2026-02-08 · 74.1%