# Connecting CodingPlanX Custom Models to Dify
This tutorial will guide you through integrating CodingPlanX APIs and models into your Dify platform to leverage custom AI capabilities.
## Prerequisites
Before starting, ensure you have:
- Dify Access: A self-hosted Dify instance or a Dify Cloud account.
- API Key: A valid API Key from the CodingPlanX platform.
## Configuration Steps
### Step 1: Access Model Provider Settings
- Log in to the Dify console.
- Click your Profile Icon in the top right and select Settings.
- In the left sidebar, click Model Providers.
### Step 2: Add a Custom Model
Since CodingPlanX is OpenAI-compatible, we use the OpenAI-API-compatible provider.
- Scroll down the Model Providers page to find OpenAI-API-compatible.
- Click the Add Model button on the card.
### Step 3: Fill in Configuration Parameters
Fill in the popup window exactly as follows:
| Parameter | Value / Example | Description |
|---|---|---|
| Model Type | LLM | Select the type of model (Large Language Model). |
| Model Name | e.g., gpt-4-turbo or claude-3-opus | Required. Use the exact model ID provided by CodingPlanX. |
| API Key | sk-xxxxxxxxxxxxxxxxxxx | Required. Your CodingPlanX API Key. |
| API endpoint URL | https://api.codingplanx.ai/v1 | Required. Note: must include the /v1 suffix. |
| Context Length | 8192 | Max token count supported by the model. |
| Max Token Limit | 4096 | Maximum tokens allowed for a single response. |
> Pro Tips:
> * Regarding API URL: While the base domain is https://api.codingplanx.ai, Dify requires the full path: https://api.codingplanx.ai/v1.
> * Regarding Model Name: If unsure of the ID, check the official CodingPlanX documentation for the exact Model ID string.
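Before entering these values into Dify, you can sanity-check the endpoint and key outside the UI. The sketch below builds an OpenAI-compatible chat-completion request using only Python's standard library; the base URL, model ID, and API Key are the example values from the table above, so substitute your own before actually sending.

```python
import json
import urllib.request


def build_chat_request(base_url, api_key, model, messages, max_tokens=4096):
    """Build (but do not send) an OpenAI-compatible /chat/completions request."""
    # base_url must already include the /v1 suffix, as Dify requires.
    url = base_url.rstrip("/") + "/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": messages,
        "max_tokens": max_tokens,
    }).encode("utf-8")
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",
    }
    return urllib.request.Request(url, data=body, headers=headers, method="POST")


# Example values from the table above — replace with your real key and model ID:
req = build_chat_request(
    "https://api.codingplanx.ai/v1",
    "sk-xxxxxxxxxxxxxxxxxxx",
    "gpt-4-turbo",
    [{"role": "user", "content": "Hello!"}],
)
# To actually send it, uncomment:
# with urllib.request.urlopen(req, timeout=30) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

If this request succeeds from your terminal but Dify still fails to save, the problem is on the Dify side (e.g., the configured URL), not with your key or the model.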
### Step 4: Save and Verify
- Click Save after verifying all details.
- If successful, the model will appear in your provider list.
## Testing the Setup
- Go to the Dify Studio and create or open an App (e.g., a Chat Assistant).
- In the Model Selection area in the top right, click the dropdown.
- Find the OpenAI-API-compatible group and select your CodingPlanX model.
- Send a message in the debug window. If the AI responds, you are all set!
## Troubleshooting (FAQ)
Q1: "Connection failed" or "Invalid API Key" during save?
- Check for extra spaces in the API Key.
- Ensure the URL ends with
/v1. - Verify your server can reach
https://api.codingplanx.ai.
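The first two checks are mechanical and can be scripted. The helper below is a hypothetical sketch (not part of Dify or CodingPlanX) that flags the most common configuration mistakes before you retry the save:

```python
def check_config(api_key: str, endpoint_url: str) -> list[str]:
    """Return a list of likely problems with an API Key / endpoint pair."""
    problems = []
    key = api_key.strip()
    if key != api_key:
        problems.append("API Key has leading/trailing whitespace")
    if not key.startswith("sk-"):
        problems.append("API Key does not look like an sk-... token")
    if not endpoint_url.rstrip("/").endswith("/v1"):
        problems.append("Endpoint URL is missing the /v1 suffix")
    return problems


# An empty list means the basic format checks pass:
print(check_config("sk-xxxxxxxxxxxxxxxxxxx", "https://api.codingplanx.ai/v1"))
```

Network reachability (the third check) still has to be verified from the machine running Dify, since a self-hosted instance may sit behind a firewall or proxy that your workstation does not.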
Q2: "Model not found" during chat?
- The Model Name entered does not exist or you don't have access to it. Double-check the spelling (case-sensitive).
Q3: Responses are being cut off?
- Check if the Max Token Limit is set too low. Try increasing it to 2048 or 4096.
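With OpenAI-compatible APIs, a reply cut off by the token limit normally comes back with finish_reason set to "length" rather than "stop". Inspecting that field confirms the limit (and not the model itself) is truncating output; the response shape below is a hypothetical example for illustration:

```python
def was_truncated(response: dict) -> bool:
    """True if any choice hit the max-token limit (OpenAI-compatible schema)."""
    return any(
        choice.get("finish_reason") == "length"
        for choice in response.get("choices", [])
    )


# Hypothetical response payload, trimmed to the relevant fields:
sample = {"choices": [{"message": {"content": "..."}, "finish_reason": "length"}]}
print(was_truncated(sample))  # prints: True — raise Max Token Limit in Dify
```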