
#13079: feat: Add OpenAI-compatible API option to CLI for self-hosted models

by MikeWang0316tw · open · 2026-02-10 03:19
Labels: cli, commands, stale
This PR adds a new OpenAI-compatible API option to the `openclaw onboard` and `openclaw configure` commands, allowing users to easily configure self-hosted models such as LM Studio, Open WebUI, and Ollama.

Key features:

- New "OpenAI-compatible API" option in the auth provider selection
- Interactive prompts for configuring self-hosted model parameters:
  * Service base URL (e.g., http://localhost:1234/v1)
  * API key (if required)
  * Model name
  * Provider name
  * Reasoning capability (true/false)
  * Context window size
  * Max tokens
- Prevents model name conflicts by disallowing '/' in model names
- Maintains alphabetical ordering of auth choice groups
- Updates both the onboard and configure flows to support the new option

This enhancement makes it much easier for users running self-hosted models to integrate them with OpenClaw without manual JSON configuration.

## Greptile Overview

### Greptile Summary

This PR introduces a new auth choice (`self-hosted-openai-api`) and wires it into the onboarding/configure flows so users can configure OpenAI-compatible self-hosted endpoints (base URL, optional API key, provider/model naming, and token limits). It adds the choice to the CLI flag help text and the auth choice groups/options, along with a new interactive apply handler and a config helper that registers the provider and model in `models.providers` and `agents.defaults.models`. The main things to double-check before merging are the correctness of the generated provider/model identifiers and whether the chosen provider API type matches how OpenClaw talks to OpenAI-compatible services (the repo already distinguishes `openai-completions` vs `openai-responses` for local providers).

### Confidence Score: 3/5

This PR is close to mergeable but has a couple of configuration correctness issues that can break self-hosted setups. Most changes are additive and follow existing onboarding patterns, but (1) provider names are not validated to keep model refs well-formed, and (2) the self-hosted provider config hard-codes an API type that appears inconsistent with existing local provider defaults in this repo, which can lead to misconfigured endpoints at runtime.

Files: src/commands/auth-choice.apply.self-hosted-openai.ts, src/commands/onboard-auth.config-core.ts
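The validation concern above can be illustrated with a minimal sketch. The PR rejects '/' in model names but reportedly not in provider names, even though both halves end up in a `provider/model` reference. The function and variable names below are hypothetical, not the PR's actual identifiers; this only shows the kind of check the review is asking for:

```typescript
// Hypothetical sketch: validate both halves of a "provider/model" ref.
// A '/' in either segment would make the combined ref ambiguous.
function isValidRefSegment(name: string): boolean {
  return name.length > 0 && !name.includes("/");
}

// Build a model ref, refusing names that would break the ref format.
function buildModelRef(provider: string, model: string): string {
  if (!isValidRefSegment(provider) || !isValidRefSegment(model)) {
    throw new Error(`invalid provider or model name: ${provider} / ${model}`);
  }
  return `${provider}/${model}`;
}

console.log(buildModelRef("lmstudio", "qwen2.5-7b-instruct"));
```

Applying the same check to the provider-name prompt as to the model-name prompt would close the gap Greptile flags.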
