
#11203: Extensions: add AbacusAI provider plugin with embedded RouteLLM proxy

by tonyhu2006 · open · 2026-02-07 14:52
Labels: commands, stale
# Pull Request Submission Report: AbacusAI Provider Plugin

> **PR Title**: `Extensions: add AbacusAI provider plugin with embedded RouteLLM proxy`
>
> **PR Type**: Feature (new bundled extension)
>
> **Base branch**: `main`
>
> **Code word**: lobster-biscuit

---

## Table of Contents

- [Summary](#summary)
- [Use Cases](#use-cases)
- [Behavior Changes](#behavior-changes)
- [Files Changed](#files-changed)
- [Architecture Overview](#architecture-overview)
- [Implementation Details](#implementation-details)
- [Existing Functionality Check](#existing-functionality-check)
- [Tests](#tests)
- [Manual Testing](#manual-testing)
- [Security Considerations](#security-considerations)
- [Labeler Coverage](#labeler-coverage)
- [Changelog Entry](#changelog-entry)
- [Evidence](#evidence)
- [Sign-Off](#sign-off)

---

## Summary

Adds a new bundled provider plugin (`abacusai-auth`) that integrates **AbacusAI** models into OpenClaw. The plugin provides:

1. An **embedded local proxy** that transparently forwards OpenAI-compatible requests to AbacusAI's RouteLLM endpoint (`routellm.abacus.ai/v1`).
2. **SSE streaming normalization** to handle RouteLLM's non-standard TCP chunking.
3. **`finish_reason` normalization** (`tool_use` → `tool_calls`, `end_turn` → `stop`) to enable full multi-tool calling support.
4. **4-tier credential auto-detection** (auth profiles → env var → Code Mode installation → manual entry).
5. **Auto-start with gateway** and idle-timeout lifecycle management.
6. **Onboarding integration**: AbacusAI appears as a provider choice during `openclaw models auth login` and in the onboarding wizard.

This gives OpenClaw users access to 16+ models (Claude, Gemini, GPT, DeepSeek, Qwen, Grok, Kimi, and AbacusAI's auto-router) through a single AbacusAI API key.

---

## Use Cases

1. **Users with AbacusAI accounts** can use their API key to access multiple model families (Claude, Gemini, GPT, etc.) through a single provider, without needing separate API keys for each.
2. **AbacusAI Code Mode users** get automatic credential detection: no manual key entry required.
3. **Multi-tool calling** works correctly with all supported models, including Claude models that return Anthropic-style `finish_reason` values.

---

## Behavior Changes

| Area | Before | After |
|---|---|---|
| Provider list | No AbacusAI option | `abacusai` provider available |
| Onboarding wizard | No AbacusAI choice | "AbacusAI (Code Mode)" option in auth choices |
| `AuthChoice` type | No `abacusai` variant | `abacusai` added |
| `OnboardOptions` type | No `abacusaiApiKey` | `abacusaiApiKey?: string` added |
| Plugin count | 30 bundled extensions | 31 bundled extensions |
| `pnpm-lock.yaml` | No abacusai-auth entry | `extensions/abacusai-auth` workspace added |

**No existing behavior is modified.** All changes are additive.

---

## Files Changed

### New Files (5)

| File | Lines | Description |
|---|---|---|
| `extensions/abacusai-auth/index.ts` | ~800 | Plugin source: proxy, auth, SSE normalizer, finish_reason normalization |
| `extensions/abacusai-auth/package.json` | 15 | Package metadata (`@openclaw/abacusai-auth`) |
| `extensions/abacusai-auth/openclaw.plugin.json` | 10 | Plugin manifest (id, providers, configSchema) |
| `extensions/abacusai-auth/README.md` | 460 | Comprehensive documentation |
| `src/commands/auth-choice.apply.abacusai.ts` | 15 | Onboarding auth choice handler |

### Modified Files (4)

| File | Changes | Description |
|---|---|---|
| `src/commands/auth-choice-options.ts` | +13 lines | Add `abacusai` to `AuthChoiceGroupId`, group defs, and options list |
| `src/commands/auth-choice.apply.ts` | +2 lines | Import and register `applyAuthChoiceAbacusAI` handler |
| `src/commands/onboard-types.ts` | +2 lines | Add `abacusai` to `AuthChoice` union and `abacusaiApiKey` to `OnboardOptions` |
| `pnpm-lock.yaml` | +6 lines | Add `extensions/abacusai-auth` workspace entry |

### Excluded from PR (4)

These files are local development artifacts and should NOT be committed:

| File | Reason |
|---|---|
| `docs/SPECIFICATION.md` | Local project documentation |
| `docs/DEVELOPMENT_ROADMAP.md` | Local project documentation |
| `docs/IMPLEMENTED.md` | Local project documentation |
| `logs/WORK_LOG.md` | Local development log |

---

## Architecture Overview

```
┌────────────────────────────────────────────────────┐
│ OpenClaw Agent (Pi Agent)                          │
│ POST /v1/chat/completions with tools[]             │
└──────────────────────────┬─────────────────────────┘
                           │ http://127.0.0.1:<dynamic-port>/v1
                           ▼
┌────────────────────────────────────────────────────┐
│ Embedded RouteLLM Proxy (abacusai-auth plugin)     │
│                                                    │
│ 1. Injects Authorization: Bearer <api-key>         │
│ 2. Strips `strict` from tool schemas               │
│ 3. Normalizes SSE streaming (TCP chunk reassembly) │
│ 4. Normalizes finish_reason (Anthropic → OpenAI)   │
│ 5. Strips non-standard native_finish_reason field  │
│ 6. Detects 401/403 → returns auth_expired error    │
└──────────────────────────┬─────────────────────────┘
                           │ https://routellm.abacus.ai/v1
                           ▼
┌────────────────────────────────────────────────────┐
│ AbacusAI RouteLLM Endpoint                         │
│ OpenAI-compatible API with function calling        │
└────────────────────────────────────────────────────┘
```

**Why a local proxy?** AbacusAI's RouteLLM is *mostly* OpenAI-compatible but has three protocol deviations that break the Agent's tool-calling pipeline:

1. Rejects the `strict` field in tool schemas
2. Returns Anthropic-style `finish_reason` (`tool_use` instead of `tool_calls`)
3. SSE streaming sends TCP chunks without newline separators

The proxy fixes all three transparently, without modifying core OpenClaw code.

---

## Implementation Details

### Credential Resolution (4-tier fallback)

1. **OpenClaw auth profiles** – `~/.openclaw/agents/*/agent/auth-profiles.json` (supports both `token` and `key` credential fields)
2. **Environment variable** – `ABACUSAI_API_KEY`
3. **Code Mode auto-detect** – platform-specific paths (Windows/macOS/Linux) for AbacusAI Code Mode credential files
4. **Manual entry** – interactive prompt during login

### SSE Streaming Normalizer

RouteLLM sends `data: {...}` as separate TCP chunks without `\n\n` delimiters. The normalizer uses **JSON brace-depth counting** (tracking `{`/`}` while respecting string literals and escape sequences) to extract complete events from arbitrarily fragmented TCP streams.

### Proxy Lifecycle

- **Auto-start**: fires asynchronously in `register()` when the gateway loads the plugin (fire-and-forget, does not block registration)
- **Safety net**: the `before_agent_start` hook restarts the proxy if it was stopped
- **Idle timeout**: 30 minutes of inactivity → graceful shutdown
- **Dynamic port**: `listen(0)` for OS-assigned ports, updated in `openclaw.json`

### Defensive Programming

| Measure | Implementation |
|---|---|
| Process residue | `closeAllConnections()` + 2s safety timeout in `stopProxy()` |
| Stale server detection | `server.listening` check (not just `server !== null`) |
| API key validation | `describeUser` call before proxy start; 401/403 detection at runtime |
| Startup timeout | 10s `Promise.race` on `server.listen()` |
| Request body limit | 10 MB max (`readBody()` destroys stream on overflow) |
| Upstream timeout | 180s `AbortSignal.timeout` on all forwarded requests |

### Onboarding Integration

The auth choice handler (`auth-choice.apply.abacusai.ts`) follows the existing `applyAuthChoicePluginProvider` pattern used by other plugin-based providers (e.g., `google-antigravity-auth`, `qwen-portal-auth`). It is a 15-line file that delegates to the shared plugin provider auth flow.

---

## Existing Functionality Check

- [x] I searched the codebase for existing functionality. Searches performed:
  - `grep -r "abacusai\|abacus.ai\|routellm" src/ extensions/` – no existing AbacusAI integration found
  - `grep -r "RouteLLM" src/` – no existing RouteLLM proxy code
  - Reviewed all 30 existing extensions in `extensions/` – no AbacusAI plugin exists
  - Reviewed `src/commands/auth-choice-options.ts` – AbacusAI not listed as a provider choice

---

## Tests

### Automated Tests

- **Full test suite**: 77/77 test files passed, 902 tests passed, 1 skipped
- **Build**: `pnpm build` (tsdown) – 4 entry points, all succeeded
- **No regressions**: all existing tests continue to pass

```
Test Files  77 passed (77)
     Tests  902 passed | 1 skipped (903)
  Duration  57.20s
```

### What's Not Tested (and why)

- **Live API calls to RouteLLM**: requires a real AbacusAI API key; suitable for `LIVE=1 pnpm test:live` but not CI. The proxy's normalization logic is deterministic and exercised via the full test suite.
- **SSE normalizer unit tests**: the normalizer is a pure function that could benefit from dedicated unit tests. This is a potential follow-up.

---

## Manual Testing

### Prerequisites

- AbacusAI account with API key (get one at https:...
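For illustration, a minimal sketch of the `strict`-stripping step from the Architecture Overview (deviation 1). The `stripStrict` name and the `Tool` type are assumptions for this sketch, not the plugin's actual exports:

```typescript
// RouteLLM rejects OpenAI's `strict` field on function tools, so the
// proxy must remove it from each tool before forwarding the body upstream.
type Tool = {
  type: string;
  function: { name: string; strict?: boolean; parameters?: unknown };
};

function stripStrict(body: { tools?: Tool[] }): { tools?: Tool[] } {
  if (!body.tools) return body; // nothing to rewrite
  return {
    ...body,
    tools: body.tools.map((t) => {
      // Drop `strict`, keep every other function property untouched.
      const { strict: _strict, ...fn } = t.function;
      return { ...t, function: fn };
    }),
  };
}
```

The rewrite is purely additive-safe: requests without `tools` pass through unchanged.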
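The `finish_reason` normalization (deviation 2) amounts to a small mapping applied to each response choice. Function and type names here are illustrative, not the plugin's actual code:

```typescript
type FinishReason = string | null;

// Anthropic-style values → OpenAI-standard values; anything else passes through.
const FINISH_REASON_MAP: Record<string, string> = {
  tool_use: "tool_calls", // signals the Agent to execute tool calls
  end_turn: "stop",       // normal completion
};

function normalizeFinishReason(reason: FinishReason): FinishReason {
  if (reason == null) return reason; // streaming deltas often carry null
  return FINISH_REASON_MAP[reason] ?? reason;
}

// Applied per choice: also drop the non-standard native_finish_reason field.
function normalizeChoice(choice: {
  finish_reason?: FinishReason;
  native_finish_reason?: string;
}): { finish_reason?: FinishReason } {
  const { native_finish_reason: _native, ...rest } = choice;
  if (rest.finish_reason != null) {
    rest.finish_reason = normalizeFinishReason(rest.finish_reason);
  }
  return rest;
}
```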
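The brace-depth SSE reassembly (deviation 3) can be sketched as a small stateful class: feed it arbitrary TCP fragments and it emits each JSON event as soon as its braces balance, tracking string literals and escapes so braces inside content don't miscount. This is a simplified model of the approach described under Implementation Details, with hypothetical names:

```typescript
class SseJsonReassembler {
  private buf = "";
  private depth = 0;       // current JSON brace depth
  private inString = false; // inside a "..." literal
  private escaped = false;  // previous char was a backslash
  private start = -1;       // index where the current object began

  /** Push a raw TCP fragment; returns any complete JSON event payloads. */
  push(fragment: string): string[] {
    const events: string[] = [];
    this.buf += fragment;
    // Only scan the newly appended region; earlier chars were already seen.
    for (let i = this.buf.length - fragment.length; i < this.buf.length; i++) {
      const ch = this.buf[i];
      if (this.inString) {
        if (this.escaped) this.escaped = false;
        else if (ch === "\\") this.escaped = true;
        else if (ch === '"') this.inString = false;
        continue; // braces inside strings don't change depth
      }
      if (ch === '"') this.inString = true;
      else if (ch === "{") {
        if (this.depth === 0) this.start = i;
        this.depth++;
      } else if (ch === "}") {
        this.depth--;
        if (this.depth === 0 && this.start >= 0) {
          events.push(this.buf.slice(this.start, i + 1));
          this.start = -1;
        }
      }
    }
    // Discard consumed input (incl. `data: ` framing) once no object is open.
    if (this.depth === 0 && this.start < 0) this.buf = "";
    return events;
  }
}
```

Because the parser keys off brace balance rather than `\n\n` delimiters, it is indifferent to where RouteLLM's TCP chunk boundaries fall.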
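Finally, the 4-tier credential fallback could look roughly like the sketch below. The profile-file shape, the `resolveAbacusAiKey` name, and the collapsed Code Mode tier are all assumptions for illustration; only the tier order and the `ABACUSAI_API_KEY` variable come from this PR:

```typescript
import * as fs from "node:fs";
import * as os from "node:os";
import * as path from "node:path";

type Credential = { key: string; source: string };

function resolveAbacusAiKey(promptUser?: () => string): Credential | null {
  // Tier 1: OpenClaw auth profiles (~/.openclaw/agents/*/agent/auth-profiles.json)
  const profilesDir = path.join(os.homedir(), ".openclaw", "agents");
  if (fs.existsSync(profilesDir)) {
    for (const agent of fs.readdirSync(profilesDir)) {
      const file = path.join(profilesDir, agent, "agent", "auth-profiles.json");
      if (!fs.existsSync(file)) continue;
      try {
        const profiles = JSON.parse(fs.readFileSync(file, "utf8"));
        const entry = profiles?.abacusai;
        const key = entry?.token ?? entry?.key; // both field names supported
        if (key) return { key, source: "auth-profiles" };
      } catch {
        // unreadable profile: fall through to the next tier
      }
    }
  }
  // Tier 2: environment variable
  const envKey = process.env.ABACUSAI_API_KEY;
  if (envKey) return { key: envKey, source: "env" };
  // Tier 3: Code Mode auto-detect (platform-specific paths) would go here.
  // Tier 4: manual entry via interactive prompt during login.
  if (promptUser) return { key: promptUser(), source: "manual" };
  return null;
}
```

Each tier fails closed into the next, so a user with no stored credentials still reaches the interactive prompt.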
