Overview

Type: Framework or Platform
Primary Path: OpenAI-compatible default
Support Confidence: Recommended integration pattern
For LemonData, the most stable default in the Vercel AI SDK is the OpenAI-compatible provider. If you specifically need Responses-native behavior, you can switch to the OpenAI provider and keep the same LemonData base URL. Treat this page as a recommended integration pattern, not as a claim that every helper in the Vercel AI SDK has dedicated end-to-end regression coverage in this repo.
npm install ai @ai-sdk/openai-compatible
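
// e.g. lemondata.ts — server-side provider setup, imported by the examples below as './lemondata'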
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';

export const lemondata = createOpenAICompatible({
  name: 'lemondata',
  apiKey: process.env.LEMONDATA_API_KEY,
  baseURL: 'https://api.lemondata.cc/v1',
});

Generate Text

import { generateText } from 'ai';
import { lemondata } from './lemondata';

const { text } = await generateText({
  model: lemondata.chatModel('gpt-5.4'),
  prompt: 'Explain LemonData in one sentence.',
});

console.log(text);

Stream Text

import { streamText } from 'ai';
import { lemondata } from './lemondata';

const result = await streamText({
  model: lemondata.chatModel('gpt-5.4'),
  prompt: 'Write a short poem about coding.',
});

for await (const textPart of result.textStream) {
  process.stdout.write(textPart);
}
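
If you also need the complete output once streaming finishes, the same result exposes it as a promise (for example, await result.text), so you do not have to concatenate the chunks yourself.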

Tool Calling

import { generateText, tool } from 'ai';
import { z } from 'zod';
import { lemondata } from './lemondata';

const result = await generateText({
  model: lemondata.chatModel('gpt-5.4'),
  prompt: 'What is the weather in San Francisco?',
  tools: {
    weather: tool({
      description: 'Get weather in a location',
      parameters: z.object({
        location: z.string(),
      }),
      execute: async ({ location }) => ({
        location,
        temperature: 72,
        condition: 'sunny',
      }),
    }),
  },
});

console.log(result.text);
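
Depending on your AI SDK major version, a single generateText call may stop right after the tool executes, leaving result.text empty. If that happens, allow a follow-up step so the model can turn the tool result into an answer; the exact option differs by version (for example, maxSteps in AI SDK 4 or stopWhen: stepCountIs(2) in AI SDK 5), so check the docs for the version you have installed.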

Structured Output

import { generateObject } from 'ai';
import { z } from 'zod';
import { lemondata } from './lemondata';

const { object } = await generateObject({
  model: lemondata.chatModel('gpt-5.4'),
  schema: z.object({
    name: z.string(),
    role: z.string(),
  }),
  prompt: 'Generate a fake developer profile.',
});

console.log(object);

If You Explicitly Need Responses-Native Behavior

npm install ai @ai-sdk/openai
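
// e.g. lemondata-responses.ts — imported below as './lemondata-responses'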
import { createOpenAI } from '@ai-sdk/openai';

export const lemondataResponses = createOpenAI({
  apiKey: process.env.LEMONDATA_API_KEY,
  baseURL: 'https://api.lemondata.cc/v1',
});

import { generateText } from 'ai';
import { lemondataResponses } from './lemondata-responses';

const { text } = await generateText({
  model: lemondataResponses('gpt-5.4'),
  prompt: 'Explain LemonData in one sentence.',
});

Use @ai-sdk/openai-compatible as the safe default for proxy-style integrations. Switch to @ai-sdk/openai only when you explicitly want a provider path built on /v1/responses.
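
Depending on your @ai-sdk/openai version, calling the provider directly as above may still route through chat completions rather than /v1/responses. Recent versions of @ai-sdk/openai expose a responses model factory to pin that path explicitly; a minimal sketch under that assumption, reusing the setup above:

import { generateText } from 'ai';
import { lemondataResponses } from './lemondata-responses';

const { text } = await generateText({
  // Explicitly select the Responses API model path; availability depends on your @ai-sdk/openai version.
  model: lemondataResponses.responses('gpt-5.4'),
  prompt: 'Explain LemonData in one sentence.',
});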

Environment Variables

# .env.local
LEMONDATA_API_KEY=sk-your-lemondata-key
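
Since the provider reads the key from process.env, a small guard in the server-side setup module can fail fast when the variable is missing. This check is an illustrative addition, not something the SDK requires:

// Optional startup guard in your server-side provider module.
if (!process.env.LEMONDATA_API_KEY) {
  throw new Error('LEMONDATA_API_KEY is not set');
}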

Best Practices

For third-party gateways and proxy backends, @ai-sdk/openai-compatible is usually the least surprising starting point.
If you need provider behavior tied to /v1/responses, switch the provider package deliberately instead of mixing both patterns in one client.
Never expose your LemonData API key in client-side code. Keep provider setup in server files or API routes, as in the sketch after this list.
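
As a concrete illustration of the last point, here is a minimal sketch of a server-side route that streams a completion. It is shown as a Next.js route handler; the route path, request shape, and the '@/lib/lemondata' import path are assumptions to adapt to your app.

// app/api/chat/route.ts (hypothetical path)
import { streamText } from 'ai';
import { lemondata } from '@/lib/lemondata'; // your server-side provider module

export async function POST(req: Request) {
  const { prompt } = await req.json();

  const result = await streamText({
    model: lemondata.chatModel('gpt-5.4'),
    prompt,
  });

  // Stream plain text back to the browser; the LemonData key never leaves the server.
  return result.toTextStreamResponse();
}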