For LemonData, the most stable default in the Vercel AI SDK is the OpenAI-compatible provider. If you specifically need Responses-native behavior, you can switch to the OpenAI provider and keep the same LemonData base URL. Treat this page as a recommended integration pattern, not as a claim that every helper in the Vercel AI SDK has dedicated end-to-end regression coverage in this repo.
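The examples below import a small local module that wires the OpenAI-compatible provider to LemonData. A minimal sketch of that `./lemondata` helper, assuming a placeholder base URL and a `LEMONDATA_API_KEY` environment variable (substitute your actual endpoint and credential names):

```ts
// lemondata.ts — provider instance for the OpenAI-compatible path.
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';

export const lemondata = createOpenAICompatible({
  name: 'lemondata',
  // Placeholder endpoint — replace with your LemonData base URL.
  baseURL: 'https://api.lemondata.example/v1',
  apiKey: process.env.LEMONDATA_API_KEY,
});
```

The provider instance exposes `chatModel(...)`, which is what the generate and stream examples below call.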
```ts
import { generateText } from 'ai';
import { lemondata } from './lemondata';

const { text } = await generateText({
  model: lemondata.chatModel('gpt-5.4'),
  prompt: 'Explain LemonData in one sentence.',
});
console.log(text);
```
```ts
import { streamText } from 'ai';
import { lemondata } from './lemondata';

const result = streamText({
  model: lemondata.chatModel('gpt-5.4'),
  prompt: 'Write a short poem about coding.',
});
for await (const textPart of result.textStream) {
  process.stdout.write(textPart);
}
```
```ts
import { generateText } from 'ai';
import { lemondataResponses } from './lemondata-responses';

const { text } = await generateText({
  model: lemondataResponses('gpt-5.4'),
  prompt: 'Explain LemonData in one sentence.',
});
```
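The Responses-native path assumes a companion helper built on `@ai-sdk/openai`. One way to sketch `./lemondata-responses`, again with a placeholder base URL and environment variable:

```ts
// lemondata-responses.ts — Responses-native model factory.
import { createOpenAI } from '@ai-sdk/openai';

const openai = createOpenAI({
  // Placeholder endpoint — replace with your LemonData base URL.
  baseURL: 'https://api.lemondata.example/v1',
  apiKey: process.env.LEMONDATA_API_KEY,
});

// Returns a model backed by the provider's /v1/responses path.
export const lemondataResponses = (modelId: string) => openai.responses(modelId);
```

This only makes sense if your LemonData endpoint actually implements `/v1/responses`; otherwise stay on the OpenAI-compatible provider above.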
Use `@ai-sdk/openai-compatible` as the safe default for proxy-style integrations. Switch to `@ai-sdk/openai` only when you explicitly want a provider path built on `/v1/responses`.