LemonData exposes multiple API formats so common coding tools, SDKs, and frameworks can integrate with minimal glue code. This page is intentionally narrower than a marketing matrix:
- **Supported** means we document a concrete setup path and LemonData exposes the protocol shape that path expects.
- **Strong native path** means the repo also has direct adapter or request-format evidence for that protocol family.
- **Best-effort** means the integration can work, but the upstream client does not treat this custom gateway workflow as a stable contract.
Unsupported fields are not handled uniformly. On compatibility routes, some fields are ignored or normalized. On native Responses paths, unsupported fields can return explicit 400 or 503 errors when the selected model or routed channel does not support the required passthrough behavior.
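On compatibility routes, a client can sidestep silent normalization by stripping fields it knows the route will not honor before sending. A minimal sketch of that defensive pattern; the allow-list below is illustrative, not LemonData's documented contract:

```python
# Drop request fields that a compatibility route would ignore or normalize,
# so the payload sent matches what the gateway will actually honor.
# This allow-list is an assumption for illustration, not LemonData's schema.
CHAT_COMPLETIONS_FIELDS = {
    "model", "messages", "temperature", "top_p", "max_tokens",
    "stream", "stop", "tools", "tool_choice",
}

def strip_unsupported(payload: dict, allowed: set = CHAT_COMPLETIONS_FIELDS) -> dict:
    """Return a copy of payload containing only allow-listed fields."""
    return {k: v for k, v in payload.items() if k in allowed}

request = {
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "hi"}],
    "reasoning": {"effort": "high"},  # Responses-native field; dropped here
}
clean = strip_unsupported(request)
```

On native Responses paths, the opposite stance is safer: send the field as-is and surface the explicit 400/503 rather than masking a capability gap.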
Cursor works for BYOK standard chat/editor flows, but not as a replacement for Cursor-managed features like Tab Completion.
| Tool | Support level | Protocol | Notes |
| --- | --- | --- | --- |
| Claude Code CLI | Strong native path | Anthropic | Native `/v1/messages` route with adapter coverage for thinking and `tool_choice` |
| Codex CLI | Supported with model/channel limits | OpenAI Responses | Best when configured for `/v1/responses`; some Responses-native fields require native passthrough support from the selected model and channel |
| Gemini CLI | Best-effort / experimental | Gemini | Custom LemonData base URL flow is not a stable upstream contract |
| OpenCode | Supported | OpenAI-compatible | Use an OpenAI-compatible provider by default; move to a Responses-based provider only when you explicitly need it |
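In practice, most of the CLI rows above reduce to pointing the tool's base URL and key at the gateway. A hedged sketch using placeholder values; the endpoint and key here are assumptions, and each tool's own docs remain the authority on which variable or config file it actually reads:

```shell
# Placeholder values; substitute your real LemonData endpoint and key.
export LEMONDATA_BASE_URL="https://api.lemondata.example"
export LEMONDATA_API_KEY="sk-placeholder"

# Claude Code reads an Anthropic-style base URL and key from the environment.
export ANTHROPIC_BASE_URL="$LEMONDATA_BASE_URL"
export ANTHROPIC_API_KEY="$LEMONDATA_API_KEY"

# OpenAI-protocol tools (OpenCode, Codex CLI) generally take an
# OpenAI-compatible base URL, via environment or their own config file.
export OPENAI_BASE_URL="$LEMONDATA_BASE_URL/v1"
export OPENAI_API_KEY="$LEMONDATA_API_KEY"
```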
Other OpenAI-compatible editors and agent tools often work with the same base URL pattern, but this repo does not currently maintain tool-specific regression coverage for Windsurf, Aider, Continue.dev, Cline/Roo Code, GitHub Copilot, and similar clients.
Chat Completions, Embeddings, and common Responses workflows are documented; some Responses-native-only fields depend on model/channel passthrough support
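The documented surfaces map to standard endpoint paths under the gateway base URL. A sketch of the request shapes using plain dicts rather than any SDK; the paths and bodies follow the public OpenAI wire format, while the base URL is a placeholder:

```python
BASE_URL = "https://api.lemondata.example/v1"  # placeholder endpoint

def chat_completions_request(model: str, prompt: str):
    """URL and JSON body for a Chat Completions call."""
    url = f"{BASE_URL}/chat/completions"
    body = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return url, body

def embeddings_request(model: str, text: str):
    """URL and JSON body for an Embeddings call."""
    return f"{BASE_URL}/embeddings", {"model": model, "input": text}

url, body = chat_completions_request("gpt-4o-mini", "ping")
```

Any HTTP client or OpenAI-compatible SDK can then POST these bodies with an `Authorization: Bearer` header.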
| SDK / framework | Language | Support level | Notes |
| --- | --- | --- | --- |
| Anthropic SDK | Python/JS | Strong native path | Native Messages route with direct evidence for tools, thinking, and prompt caching |
| Vercel AI SDK | TypeScript | Recommended integration pattern | Prefer `@ai-sdk/openai-compatible`; use `@ai-sdk/openai` only when you explicitly want Responses-native behavior |
| LangChain | Python/JS | Supported standard surfaces | `ChatOpenAI` and `OpenAIEmbeddings` are the intended scope; vendor-native extras are out of scope |
| LlamaIndex | Python | Supported via `OpenAILike` | Use `OpenAILike`, not the built-in OpenAI classes, for third-party gateways such as LemonData |
| Dify | - | Supported with scope limits | OpenAI provider and chat-completions-oriented flows are the intended path; not a fit for Codex-specific Responses or WebSocket behavior |
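The split in this table follows two wire formats: the Anthropic SDK speaks the Messages shape, while the OpenAI-compatible group speaks Chat Completions. A side-by-side sketch of the two body shapes; field names follow the public Anthropic and OpenAI schemas, and the model names are placeholders:

```python
def anthropic_messages_body(prompt: str) -> dict:
    # Anthropic Messages format: max_tokens is required, system is top-level.
    return {
        "model": "claude-placeholder",
        "max_tokens": 1024,
        "system": "You are concise.",
        "messages": [{"role": "user", "content": prompt}],
    }

def openai_chat_body(prompt: str) -> dict:
    # OpenAI Chat Completions format: the system prompt travels as a message.
    return {
        "model": "gpt-placeholder",
        "messages": [
            {"role": "system", "content": "You are concise."},
            {"role": "user", "content": prompt},
        ],
    }
```

Knowing which shape a framework emits tells you which LemonData route it needs: `/v1/messages` for the first, `/v1/chat/completions` for the second.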