
#19006: feat(memory-lancedb): OpenAI-compatible baseUrl + Ollama provider + optional dims [AI-assisted]

by martinsen-assistant · opened 2026-02-17 07:50
extensions: memory-lancedb size: M
## Summary

This updates the bundled `memory-lancedb` plugin so embeddings are no longer OpenAI-only.

### What's included

- Adds `embedding.baseUrl` for OpenAI-compatible endpoints (e.g. OpenRouter or local proxies).
- Adds `embedding.provider` with support for:
  - `openai` (default)
  - `ollama` (local embeddings via `/api/embeddings`)
- Adds an optional `embedding.dims` override for custom/unknown models.
- Keeps known OpenAI model dims (`text-embedding-3-small`/`text-embedding-3-large`) and auto-detects dimensions from a probe embedding when needed.
- Updates the plugin JSON schema and UI hints to expose the new fields.

## Why

The current plugin hardcodes the OpenAI client and fixed model dimensions, which blocks:

- OpenAI-compatible APIs that require custom base URLs
- local embedding workflows with Ollama
- custom models with non-default dimensions

## Testing

- `pnpm exec vitest run extensions/memory-lancedb/index.test.ts`
- 15 passed, 1 skipped (live test gate)

## AI-assisted

- [x] AI-assisted
- Degree of testing: lightly tested (targeted plugin tests above)
- I reviewed the generated code and validated behavior locally before opening this PR.

### Greptile Summary

Extends the memory-lancedb plugin to support multiple embedding providers beyond OpenAI, adding Ollama support for local embeddings and `baseUrl` for OpenAI-compatible proxies. Introduces lazy dimension detection via a probe embedding for unknown models, plus an optional `dims` override for explicit configuration. Config parsing, validation, and test coverage are solid.
- Adds an `OllamaEmbeddingConfig` type and an Ollama HTTP embedding client alongside the existing OpenAI SDK path
- Supports `baseUrl` for OpenAI-compatible endpoints (e.g., OpenRouter) and custom Ollama server URLs
- Defers vector dimension resolution to table-creation time using an async callback, enabling auto-detection via a probe embedding
- Updates the JSON schema with a `oneOf` discriminator and new UI hints for the added fields
- `vectorDimsForModel` is now dead code (no longer imported) and could be cleaned up
- `OpenAIEmbeddingConfig.provider` is optional in the type, but the runtime code relies on it being set; the parser ensures this, though the type could be tightened for safety

### Confidence Score: 4/5

- This PR is safe to merge, with minor type-level improvements recommended.
- The changes are well-structured, with good separation between config parsing and runtime usage. The parser guarantees runtime safety for all config paths. Two minor style issues were found: an optional `provider` field that weakens discriminated-union narrowing, and a dead exported function. No logic bugs or security issues were identified. Test coverage for the new config modes is adequate.
- `extensions/memory-lancedb/config.ts` has a type design concern (the optional `provider`) and dead code (`vectorDimsForModel`).

Last reviewed commit: 2007a77
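For illustration, the new fields described above might be configured as follows. The field names (`provider`, `baseUrl`, `dims`) come from the PR description; the surrounding config shape and the `model` values are assumptions, not taken from the plugin's actual schema:

```json
{
  "embedding": {
    "provider": "openai",
    "baseUrl": "https://openrouter.ai/api/v1",
    "model": "text-embedding-3-small"
  }
}
```

An Ollama setup pointing at a local server, with an explicit `dims` override for a model the plugin does not know:

```json
{
  "embedding": {
    "provider": "ollama",
    "baseUrl": "http://localhost:11434",
    "model": "nomic-embed-text",
    "dims": 768
  }
}
```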
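The deferred dimension resolution the review mentions could look roughly like this sketch. The function and constant names here are illustrative, not the plugin's actual exports; the resolution order (explicit override, then known-model table, then probe embedding) follows the PR description:

```typescript
// A provider-agnostic embedding function (OpenAI SDK or Ollama HTTP client).
type EmbedFn = (text: string) => Promise<number[]>;

// Known OpenAI model dimensions kept as a fast path (values per OpenAI docs).
const KNOWN_DIMS: Record<string, number> = {
  "text-embedding-3-small": 1536,
  "text-embedding-3-large": 3072,
};

// Resolve vector dims lazily, at table-creation time:
// 1. explicit `dims` override from config
// 2. known-model lookup
// 3. probe embedding: embed a short string and measure the vector length
async function resolveDims(
  model: string,
  embed: EmbedFn,
  dimsOverride?: number,
): Promise<number> {
  if (dimsOverride !== undefined) return dimsOverride;
  const known = KNOWN_DIMS[model];
  if (known !== undefined) return known;
  const probe = await embed("dimension probe");
  return probe.length;
}
```

Deferring this behind an async callback means the probe request is only sent when a table is actually created for an unknown model, so configurations with a known model or an explicit `dims` never pay for the extra network call.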
