← Back to PRs

#20771: feat(memory-lancedb): support custom OpenAI-compatible embedding providers

by marcodelpin open 2026-02-19 09:06 View on GitHub →
extensions: memory-lancedb size: S
## Summary Adds `baseUrl` and `dimensions` configuration to the `memory-lancedb` plugin, enabling use of **any OpenAI-compatible embedding endpoint** instead of being locked to OpenAI's API. This allows users to run memory embeddings with: - **Ollama** (`http://localhost:11434/v1`) - **LM Studio** (local endpoint) - **vLLM** / **text-generation-inference** - **Custom ONNX Runtime servers** - Any other OpenAI-compatible embedding API ### Changes **`config.ts`** - Added `baseUrl?: string` and `dimensions?: number` to `MemoryConfig.embedding` type - `vectorDimsForModel()` now accepts optional `configDimensions` fallback — unknown models work when dimensions are specified in config - `parse()` passes through `baseUrl` and `dimensions` to the returned config (previously these were silently dropped) - Updated `assertAllowedKeys` to accept the new fields - Added `uiHints` for `baseUrl` and `dimensions` (marked as `advanced`) **`index.ts`** - `Embeddings` constructor accepts optional `baseUrl` and passes it to `new OpenAI({ apiKey, baseURL })` - `register()` passes `cfg.embedding.baseUrl` and `cfg.embedding.dimensions` through to the relevant constructors **`openclaw.plugin.json`** - Removed hardcoded `enum` restriction on embedding model — any string is now accepted - Added `baseUrl` and `dimensions` properties to the JSON schema with descriptions - Updated `uiHints` labels/help text **`index.test.ts`** - Added 3 new tests: - Custom `baseUrl` + `dimensions` config is accepted - Known model works without `dimensions` - Unknown model without `dimensions` throws a helpful error ### Example Configuration ```json5 // Local Ollama embeddings { "embedding": { "apiKey": "ollama", "model": "nomic-embed-text", "baseUrl": "http://localhost:11434/v1", "dimensions": 768 } } // Local ONNX Runtime server { "embedding": { "apiKey": "local", "model": "multilingual-e5-base", "baseUrl": "http://localhost:11434/v1", "dimensions": 768 } } // OpenAI (unchanged, backwards compatible) { "embedding": { "apiKey": "sk-proj-...", "model": "text-embedding-3-small" } } ``` ### Backwards Compatibility - Fully backwards compatible — existing configs without `baseUrl`/`dimensions` work exactly as before - Known models (`text-embedding-3-small`, `text-embedding-3-large`) don't need `dimensions` - `baseUrl` defaults to OpenAI's endpoint when not specified ## Test plan - [x] All 9 existing tests pass - [x] 3 new tests added for custom config - [x] Tested end-to-end with local ONNX Runtime embedding server on CUDA GPU (multilingual-e5-base, 768 dims) - [ ] Verify with Ollama embedding endpoint - [ ] Verify with LM Studio Closes #8118 Closes #17564 🤖 Generated with [Claude Code](https://claude.com/claude-code) Testing: fully tested (automated + manual verification) I understand and can explain all changes in this PR. <!-- greptile_comment --> <h3>Greptile Summary</h3> Adds support for custom OpenAI-compatible embedding providers (Ollama, LM Studio, vLLM) by introducing `baseUrl` and `dimensions` configuration options. The implementation correctly handles backwards compatibility, validates unknown models require explicit dimensions, and passes configuration through all layers (schema → parser → constructor). Three new tests cover the key scenarios: custom providers with dimensions, known models without dimensions, and proper error handling for unknown models without dimensions. <h3>Confidence Score: 5/5</h3> - This PR is safe to merge with minimal risk - The changes are well-structured, maintain backwards compatibility, include comprehensive test coverage (3 new tests for the new functionality), and follow existing code patterns. The implementation correctly validates that unknown models require explicit dimensions, properly passes configuration through all layers, and doesn't introduce breaking changes to existing configurations. - No files require special attention <sub>Last reviewed commit: 0f4059b</sub> <!-- greptile_other_comments_section --> <!-- /greptile_comment -->

Most Similar PRs