Overview
LiteLLM fits LemonData in two common ways:

- use LemonData as an OpenAI-compatible upstream behind LiteLLM
- put LiteLLM in front of LemonData when your team wants one more internal gateway layer for routing, virtual keys, or centralized observability
In either setup, the upstream base URL is https://api.lemondata.cc/v1.
If you specifically need Claude-native or Gemini-native request shapes, prefer LemonData’s dedicated native integrations instead of forcing those workflows through LiteLLM’s OpenAI-compatible abstraction.
Type: Framework or Platform
Primary Path: OpenAI-compatible upstream
Support Confidence: Supported path
Install
Proxy Configuration
Create a `litellm-config.yaml` like this:
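A minimal sketch that exposes one LemonData model under a local alias; the alias `lemondata-gpt` is an example name, and `<model>` is a placeholder for a real LemonData model id:

```yaml
model_list:
  - model_name: lemondata-gpt            # local alias your clients will call
    litellm_params:
      model: custom_openai/<model>       # real LemonData model id goes here
      api_base: https://api.lemondata.cc/v1
      api_key: os.environ/OPENAI_API_KEY # read the LemonData key from the environment
```

Start the proxy with `litellm --config litellm-config.yaml`; it listens on port 4000 by default.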
Call LiteLLM Through OpenAI SDK
Direct Python Usage
When you use LiteLLM as a Python library instead of the proxy, keep the same LemonData base URL.

Best Practices
Prefer custom_openai for LemonData
Treat LemonData as an OpenAI-compatible upstream unless you have a very specific reason to build a more complex provider mapping.
Use LiteLLM when you need one more gateway layer
LiteLLM makes sense when your own platform wants virtual keys, extra routing policy, or centralized logs in front of LemonData.
Keep native-provider expectations realistic
OpenAI-compatible translation layers are great for broad compatibility, but they are not the right place to promise every provider-native feature.
Troubleshooting
Connection errors
- Verify `api_base` is exactly `https://api.lemondata.cc/v1`
- Make sure LiteLLM can reach LemonData over the public internet
- If you run the proxy locally, verify the OpenAI client points to your LiteLLM port instead of LemonData directly
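A quick reachability check from the machine running LiteLLM, assuming LemonData exposes the standard OpenAI-style model listing endpoint:

```shell
# List models to confirm both network reachability and the API key in one call
curl -s https://api.lemondata.cc/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"
```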
Authentication errors
- Check that LiteLLM is reading the right `OPENAI_API_KEY`
- Confirm the LemonData key starts with `sk-`
- Confirm the key is active in the LemonData dashboard
Model not found
- Verify the LemonData model name in `custom_openai/<model>`
- Keep your LiteLLM `model_name` alias separate from the real LemonData model id