šŸš€ Connecting CodingPlanX Custom Models to Dify

This tutorial will guide you through integrating CodingPlanX APIs and models into your Dify platform to leverage custom AI capabilities.

šŸ“‹ Prerequisites

Before starting, ensure you have:

  1. Dify Access: A self-hosted Dify instance or a Dify Cloud account.
  2. API Key: A valid API Key from the CodingPlanX platform.

āš™ļø Configuration Steps

Step 1: Access Model Provider Settings

  1. Log in to the Dify console.
  2. Click your Profile Icon in the top right and select Settings.
  3. In the left sidebar, click Model Providers.

Step 2: Add a Custom Model

Since CodingPlanX exposes an OpenAI-compatible API, we use Dify's OpenAI-API-compatible provider.

  1. Scroll down the Model Providers page to find OpenAI-API-compatible.
  2. Click the Add Model button on the card.

Step 3: Fill in Configuration Parameters

Fill in the popup window exactly as follows:

| Parameter | Value / Example | Description |
| --- | --- | --- |
| Model Type | LLM | Select the type of model (Large Language Model). |
| Model Name | e.g., `gpt-4-turbo` or `claude-3-opus` | Required. Use the exact model ID provided by CodingPlanX. |
| API Key | `sk-xxxxxxxxxxxxxxxxxxx` | Required. Your CodingPlanX API Key. |
| API endpoint URL | `https://api.codingplanx.ai/v1` | Required. (āš ļø Note: Must include the `/v1` suffix.) |
| Context Length | 8192 | Maximum token count supported by the model. |
| Max Token Limit | 4096 | Maximum tokens allowed for a single response. |

> šŸ’” Pro Tips:
> * Regarding API URL: While the base domain is https://api.codingplanx.ai, Dify requires the full path: https://api.codingplanx.ai/v1.
> * Regarding Model Name: If unsure of the ID, check the official CodingPlanX documentation for the exact Model ID string.
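To see why the `/v1` suffix matters: OpenAI-compatible clients append the API route to whatever base URL you configure. The sketch below assumes Dify appends `/chat/completions`, following the OpenAI API convention (the exact joining behavior inside Dify is an assumption; the host comes from this guide):

```python
def chat_endpoint(base: str) -> str:
    """Join a configured base URL with the OpenAI-style chat-completions route."""
    return base.rstrip("/") + "/chat/completions"

correct = chat_endpoint("https://api.codingplanx.ai/v1")
wrong = chat_endpoint("https://api.codingplanx.ai")  # missing /v1

print(correct)  # https://api.codingplanx.ai/v1/chat/completions
print(wrong)    # https://api.codingplanx.ai/chat/completions -> likely 404
```

If the base URL omits `/v1`, the final request path misses the version segment, which is the usual cause of "Connection failed" at save time.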

Step 4: Save and Verify

  1. Click Save after verifying all details.
  2. If successful, the model will appear in your provider list.

🧪 Testing the Setup

  1. Go to the Dify Studio and create or open an App (e.g., a Chat Assistant).
  2. In the Model Selection area in the top right, click the dropdown.
  3. Find the OpenAI-API-compatible group and select your CodingPlanX model.
  4. Send a message in the debug window. If the AI responds, you are all set! šŸŽ‰
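If you want to verify the endpoint independently of Dify, you can call it directly. A minimal sketch using only the Python standard library (the URL follows this guide; the model ID and API key are placeholders you must replace with your own values):

```python
import json
import urllib.request

API_KEY = "sk-xxxxxxxxxxxxxxxxxxx"  # placeholder: your CodingPlanX API Key
MODEL = "gpt-4-turbo"               # placeholder: your exact model ID
URL = "https://api.codingplanx.ai/v1/chat/completions"

def build_request() -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request (without sending it)."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": "Say hello"}],
        "max_tokens": 32,
    }
    return urllib.request.Request(
        URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# To actually send it (requires network access and a valid key):
# with urllib.request.urlopen(build_request()) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

A 200 response here but a failure inside Dify points to a Dify-side configuration issue rather than the API itself.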

ā“ Troubleshooting (FAQ)

Q1: "Connection failed" or "Invalid API Key" during save?

  • Check for extra spaces in the API Key.
  • Ensure the URL ends with /v1.
  • Verify your server can reach https://api.codingplanx.ai.
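The first two checks above can be automated before you paste values into Dify. A small sketch (`check_config` is a hypothetical helper for illustration, not part of Dify):

```python
def check_config(api_key: str, base_url: str) -> list[str]:
    """Return a list of likely configuration problems, empty if none found."""
    problems = []
    if api_key != api_key.strip():
        problems.append("API key has leading/trailing whitespace")
    if " " in api_key.strip():
        problems.append("API key contains an embedded space")
    if not base_url.rstrip("/").endswith("/v1"):
        problems.append("endpoint URL is missing the /v1 suffix")
    return problems

print(check_config(" sk-abc ", "https://api.codingplanx.ai"))
```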

Q2: "Model not found" during chat?

  • The Model Name you entered does not exist, or your key does not have access to it. Double-check the spelling (model IDs are case-sensitive).

Q3: Responses are being cut off?

  • Check if the Max Token Limit is set too low. Try increasing it to 2048 or 4096.
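You can also confirm truncation programmatically: OpenAI-compatible responses set `finish_reason` to `"length"` when the reply was cut off by the token limit, versus `"stop"` for a normal completion. A sketch with an illustrative response body (not captured from CodingPlanX):

```python
# Illustrative OpenAI-style response shape; a real one comes from the API.
response = {
    "choices": [
        {
            "message": {"role": "assistant", "content": "The answer is"},
            "finish_reason": "length",  # "stop" would mean it finished normally
        }
    ]
}

def was_truncated(resp: dict) -> bool:
    """True if the model stopped because it hit the max-token limit."""
    return resp["choices"][0]["finish_reason"] == "length"

print(was_truncated(response))  # True -> raise the Max Token Limit
```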