
Overview

LemonData works with the official OpenAI SDKs by pointing the client to https://api.lemondata.cc/v1. For new projects, prefer the Responses API; keep Chat Completions only when a framework, plugin, or legacy code path still expects it. This is the default LemonData SDK path for OpenAI-compatible usage, but it does not guarantee that every Responses-native-only field works identically across every model and routed channel.
Python, JavaScript, and Go have official OpenAI SDKs. PHP works well with OpenAI-compatible community clients, though there is no official OpenAI SDK for PHP.
Type: Native SDK · Primary Path: OpenAI-compatible / OpenAI Responses · Support Confidence: Supported core path

Installation

pip install openai
Responses-native-only fields such as some hosted tools, include, service_tier, and truncation_strategy depend on native /v1/responses passthrough support from the selected model and routed channel. When that support is unavailable, LemonData returns an explicit 400 or 503 rather than silently ignoring the field.

Configure the Client

from openai import OpenAI

client = OpenAI(
    api_key="sk-your-lemondata-key",
    base_url="https://api.lemondata.cc/v1",
)
response = client.responses.create(
    model="gpt-5.4",
    input="Explain what LemonData does in one sentence."
)

print(response.output_text)

Streaming with Responses

stream = client.responses.create(
    model="gpt-5.4",
    input="Write a short poem about coding.",
    stream=True,
)

for event in stream:
    if event.type == "response.output_text.delta":
        print(event.delta, end="")
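If you also need the assembled text (for logging, caching, or a non-terminal UI), the delta events can be collected with a plain helper. A sketch: collect_text is a hypothetical name, and the event type string follows the OpenAI Responses streaming protocol shown above.

```python
def collect_text(events):
    """Concatenate response.output_text.delta events into the full output text."""
    chunks = []
    for event in events:
        if event.type == "response.output_text.delta":
            chunks.append(event.delta)
    return "".join(chunks)

# full_text = collect_text(stream)
```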

Tools / Function Calling

response = client.responses.create(
    model="gpt-5.4",
    input="What's the weather in Tokyo?",
    tools=[{
        "type": "function",
        "name": "get_weather",
        "description": "Get weather for a location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string"}
            },
            "required": ["location"]
        }
    }]
)

for item in response.output:
    if item.type == "function_call":
        print(item.name)
        print(item.arguments)
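To complete the round trip, execute the function locally and send the result back as a function_call_output input item, which is the standard Responses tool-calling flow. A sketch; get_weather and run_tool_calls are hypothetical local helpers:

```python
import json

def get_weather(location):
    # Hypothetical implementation; swap in a real weather lookup.
    return {"location": location, "forecast": "sunny"}

def run_tool_calls(response):
    """Execute each function_call item and build the follow-up input items."""
    outputs = []
    for item in response.output:
        if item.type == "function_call":
            result = get_weather(**json.loads(item.arguments))
            outputs.append({
                "type": "function_call_output",
                "call_id": item.call_id,
                "output": json.dumps(result),
            })
    return outputs

# followup = client.responses.create(
#     model="gpt-5.4",
#     previous_response_id=response.id,
#     input=run_tool_calls(response),
# )
# print(followup.output_text)
```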

Vision with Responses

response = client.responses.create(
    model="gpt-4o",
    input=[{
        "role": "user",
        "content": [
            {"type": "input_text", "text": "What's in this image?"},
            {"type": "input_image", "image_url": "https://example.com/image.jpg"}
        ]
    }]
)

print(response.output_text)
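Remote URLs are not the only option: a local file can be passed as a base64 data URL in the same input_image slot. A sketch; image_data_url is a hypothetical helper:

```python
import base64

def image_data_url(path, mime="image/jpeg"):
    """Encode a local image file as a data URL usable in input_image."""
    with open(path, "rb") as f:
        encoded = base64.b64encode(f.read()).decode("ascii")
    return f"data:{mime};base64,{encoded}"

# {"type": "input_image", "image_url": image_data_url("photo.jpg")}
```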

Embeddings

response = client.embeddings.create(
    model="text-embedding-3-small",
    input="Hello world"
)

print(response.data[0].embedding[:5])
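Embeddings are typically compared with cosine similarity. A self-contained helper (not part of the SDK) for two vectors returned by calls like the one above:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# similarity = cosine_similarity(emb_1.data[0].embedding, emb_2.data[0].embedding)
```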

Chat Completions Compatibility

If your framework or plugin still expects Chat Completions, LemonData also supports the standard OpenAI-compatible path:

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"}
    ]
)

print(response.choices[0].message.content)

Use Chat Completions for compatibility only. For new integrations, prefer client.responses.create(...).
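For a simple exchange like the one above, the migration is mechanical: the system message maps to instructions and the user content maps to input. A sketch of that translation with plain dicts (no request is sent):

```python
chat_params = {
    "model": "gpt-4o",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
}

# Equivalent Responses-API parameters.
responses_params = {
    "model": chat_params["model"],
    "instructions": chat_params["messages"][0]["content"],
    "input": chat_params["messages"][1]["content"],
}
# response = client.responses.create(**responses_params)
```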

Troubleshooting

  • Verify the base URL is exactly https://api.lemondata.cc/v1
  • Check for proxy interference or custom HTTP client overrides
  • Make sure your SDK version is current before debugging provider behavior
  • Check that your API key starts with sk-
  • Verify the key is active in the LemonData dashboard
  • Confirm the SDK is sending Authorization: Bearer ...
  • responses.create(...) sends requests to /v1/responses
  • chat.completions.create(...) sends requests to /v1/chat/completions
  • If your app only supports Chat Completions today, keep it on that compatibility path