
#6484: docs: add LiteLLM + Nebius integration guide

by demianarc · open · 2026-02-01 18:08
docs
## Summary

This PR adds a comprehensive guide for running Nebius models (e.g., GLM-4.7, Qwen3) with OpenClaw, using LiteLLM as an OpenAI-compatible proxy.

## What's included

- ✅ LiteLLM systemd setup with config templates
- ✅ OpenClaw provider configuration
- ✅ Troubleshooting section for common issues
- ✅ TL;DR quick reference for emergency use

## Why this matters

- Enables open-source model support without relying on proprietary APIs
- No provider-imposed rate limits: you use your own Nebius API key
- Cost-effective alternative for local/self-hosted deployments
- The pattern applies to any OpenAI-compatible model provider

## File added

- `docs/providers/litellm-nebius.md`: complete setup guide

Let me know if anything needs adjustment!

## Greptile Overview

### Greptile Summary

Adds a new provider doc at `docs/providers/litellm-nebius.md` describing how to run Nebius-hosted models behind LiteLLM and point OpenClaw at the local OpenAI-compatible proxy. The guide includes a LiteLLM config template, an OpenClaw provider config snippet, service restart steps, and a troubleshooting/curl validation workflow.

### Confidence Score: 4/5

- This PR is generally safe to merge; the changes are docs-only, with a few consistency/usability nits.
- Only a new documentation file is added. The main risks are broken Mintlify anchors from the em dash in the title, and instructions that may not work for some readers (root-specific paths, assuming `python3`).
- `docs/providers/litellm-nebius.md`
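For readers skimming the PR, the LiteLLM side of the guide boils down to a proxy config along these lines. This is an illustrative sketch only: the model alias, the Nebius model ID, the base URL, and the environment variable name are assumptions, not copied from the guide's template.

```yaml
# litellm_config.yaml -- illustrative sketch, not the template shipped in this PR
model_list:
  - model_name: glm-4.7                  # alias that OpenClaw will request
    litellm_params:
      model: openai/zai-org/GLM-4.7      # assumed Nebius model ID (openai/ prefix = OpenAI-compatible upstream)
      api_base: https://api.studio.nebius.com/v1   # assumed Nebius endpoint
      api_key: os.environ/NEBIUS_API_KEY # LiteLLM reads the key from the environment
```

Started with something like `litellm --config litellm_config.yaml --port 4000` (or under systemd, as the guide describes), this exposes an OpenAI-compatible API on localhost that any client can target.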

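On the OpenClaw side, the provider entry described above would look roughly like the following. The field names and structure here are assumptions for illustration; OpenClaw's actual provider schema is defined in the guide, not in this sketch.

```json
{
  "models": {
    "providers": {
      "litellm": {
        "name": "LiteLLM (Nebius)",
        "baseUrl": "http://127.0.0.1:4000/v1",
        "apiKey": "dummy-key",
        "api": "openai-completions",
        "models": [{ "id": "glm-4.7", "name": "GLM-4.7" }]
      }
    }
  }
}
```

LiteLLM accepts any bearer token unless a master key is configured, hence the dummy value. Before pointing OpenClaw at the proxy, a quick `curl http://127.0.0.1:4000/v1/models -H "Authorization: Bearer dummy-key"` confirms it is reachable, which matches the curl validation workflow the Greptile summary mentions.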