
Overview

Dify works well with LemonData through its OpenAI-compatible model provider flow. This is a chat-completions-oriented integration path. It should not be read as a guarantee that Dify exposes the same Responses or WebSocket behavior as dedicated Codex integrations. For current Dify versions, the safest path is usually:
  • choose the built-in OpenAI provider
  • set your LemonData API key
  • set a custom base URL of https://api.lemondata.cc/v1
Some older Dify builds expose this as OpenAI-API-compatible instead of the built-in OpenAI provider with a custom base URL field. If your Dify UI looks different, use the closest OpenAI-compatible custom provider flow available in that version.
  • Type: Framework or Platform
  • Primary Path: OpenAI-compatible chat path
  • Support Confidence: Supported with scope limits
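Before touching the Dify UI, you can confirm the key and base URL by replaying the same OpenAI-compatible call Dify will make once configured. A minimal Python sketch; the placeholder key and model are illustrative, and /chat/completions is the standard OpenAI-compatible route:

```python
def chat_completions_request(base_url: str, api_key: str,
                             model: str, prompt: str) -> dict:
    """Assemble the OpenAI-compatible chat-completions call that Dify
    issues once the provider is configured with these values."""
    return {
        "url": base_url.rstrip("/") + "/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "json": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }

# Placeholder key and model; substitute real values before sending.
req = chat_completions_request(
    "https://api.lemondata.cc/v1", "sk-your-lemondata-key", "gpt-4o", "ping")
print(req["url"])  # https://api.lemondata.cc/v1/chat/completions
```

Send the result with any HTTP client, e.g. `requests.post(req["url"], headers=req["headers"], json=req["json"])`; a valid JSON completion confirms the key and base URL before any Dify configuration.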

Prerequisites

  • LemonData account with API access
  • Dify Cloud or self-hosted Dify

Configuration Steps

Step 1: Get Your API Key

  1. Log into LemonData Dashboard
  2. Open API Keys
  3. Create or copy an API key that starts with sk-
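A quick local sanity check on the copied key catches the most common paste errors (wrong prefix, stray whitespace or line breaks) before they reach Dify. This is a heuristic only, not an API-side validation:

```python
def looks_like_api_key(key: str) -> bool:
    """Heuristic pre-flight check on a pasted LemonData key: must start
    with sk- and contain no whitespace anywhere in the value."""
    return key.startswith("sk-") and not any(c.isspace() for c in key)

print(looks_like_api_key("sk-abc123"))     # True
print(looks_like_api_key(" sk-abc123\n"))  # False (stray whitespace)
```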

Step 2: Configure the Provider

1. Open Model Providers

In Dify, go to Settings → Model Providers.
2. Select OpenAI

Open the OpenAI provider settings. If your Dify version does not offer a custom base URL here, use the OpenAI-compatible custom provider option exposed by that version instead.
3. Enter LemonData Settings

Use these values:
  • API Key: sk-your-lemondata-key
  • API Base URL / Custom Base URL: https://api.lemondata.cc/v1
4. Add Models

Add the models you want to use, for example:
  • gpt-5.4
  • gpt-5-mini
  • gpt-4o
  • claude-sonnet-4-6
  • claude-opus-4-6
  • gemini-2.5-flash
  • gemini-2.5-pro
  • deepseek-r1

Step 3: Test Connection

  1. Pick one model such as gpt-5-mini or gpt-4o
  2. Send a test prompt
  3. Confirm Dify receives a valid response
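The connection test can also be scripted instead of clicked through: the OpenAI-compatible model listing succeeds only when both the key and the base URL are correct. A sketch that builds the request with the standard library (GET /models is the conventional OpenAI-compatible route; the placeholder key is illustrative):

```python
import urllib.request

def models_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Build a GET request against the OpenAI-compatible /models listing,
    which fails on a bad key or a mistyped base URL."""
    return urllib.request.Request(
        base_url.rstrip("/") + "/models",
        headers={"Authorization": f"Bearer {api_key}"},
        method="GET",
    )

req = models_request("https://api.lemondata.cc/v1", "sk-your-lemondata-key")
print(req.full_url)  # https://api.lemondata.cc/v1/models
```

Running `urllib.request.urlopen(req)` from the Dify host performs the actual call; a 200 response with a model list confirms connectivity and credentials in one step.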

Embeddings for Knowledge Bases

For RAG and knowledge base indexing, add an embedding model such as:
  • text-embedding-3-small
  • text-embedding-3-large
Then set it as the default embedding model in the relevant knowledge base or application settings.
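Dify performs the retrieval internally, but the mechanism is easy to picture: the query and each stored chunk are embedded, and chunks are ranked by cosine similarity. An illustrative sketch with mock low-dimensional vectors standing in for real embedding output (text-embedding-3-small actually returns 1536-dimensional vectors):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Mock 3-d "embeddings"; a real knowledge base embeds actual document chunks.
query = [0.9, 0.1, 0.0]
chunks = {"pricing page": [0.8, 0.2, 0.1], "contact page": [0.0, 0.1, 0.9]}
best = max(chunks, key=lambda name: cosine(query, chunks[name]))
print(best)  # pricing page
```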
Suggested Models by Use Case

  • Default chat: gpt-5.4, gpt-5-mini, gpt-4o
  • Deep reasoning: gpt-5.4, claude-opus-4-6, deepseek-r1
  • Fast/cheap: gpt-5-mini, gemini-2.5-flash
  • Embeddings: text-embedding-3-small, text-embedding-3-large

Best Practices

In newer Dify versions, the built-in OpenAI provider with a custom base URL is usually the cleanest setup for LemonData.
Use gpt-5-mini or gemini-2.5-flash while iterating, then switch heavier workflows to stronger models only where needed.
Most Dify flows use OpenAI-compatible chat behavior. If you need Codex-specific Responses or WebSocket behavior, use the dedicated Codex integrations instead of Dify.

Troubleshooting

Connection errors
  • Verify the base URL is exactly https://api.lemondata.cc/v1
  • Remove trailing slashes if Dify duplicates them
  • Confirm the Dify server can reach LemonData over the public internet

Authentication errors
  • Double-check the API key
  • Confirm the key is active in the LemonData dashboard
  • Make sure the value pasted into Dify does not contain extra spaces or line breaks

Model errors
  • Verify the model name exactly
  • Re-add the model entry if the provider UI cached an older value
  • Check current model availability in the LemonData docs or dashboard
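The base URL checks above can be automated before pasting the value into Dify. A small helper, assuming only that the correct target is exactly https://api.lemondata.cc/v1:

```python
EXPECTED = "https://api.lemondata.cc/v1"

def normalize_base_url(raw: str) -> str:
    """Strip the stray whitespace and trailing slashes that most often
    break an OpenAI-compatible provider configuration."""
    return raw.strip().rstrip("/")

for candidate in ("https://api.lemondata.cc/v1/", " https://api.lemondata.cc/v1\n"):
    print(normalize_base_url(candidate) == EXPECTED)  # True for both
```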