feat(provider): add bailian-coding-plan as native provider #15102

Jlan45 wants to merge 1 commit into anomalyco:dev from
Conversation
Add Alibaba Cloud Model Studio Coding Plan (bailian-coding-plan) as a native provider with support for Qwen3.5 Plus, Qwen3 Max, Qwen3 Coder, MiniMax M2.5, GLM-5/4.7, and Kimi K2.5 models. Uses Anthropic SDK with DashScope endpoint for API compatibility. Supports API key configuration via environment variable, auth storage, or config file.
Thanks for updating your PR! It now meets our contributing guidelines. 👍
I urgently need this feature.
Thanks for the effort, but I'd suggest closing this PR. It introduces an anti-pattern that bypasses the intended architecture.

The Problem

Hardcoding providers in

The Correct Fix

Update models.dev instead. PRs already exist: These are clean, validated, and waiting for review.

Workaround for Users

In the meantime, users can add this to their config file:

{
"provider": {
"alibaba-coding-plan": {
"name": "Alibaba Coding Plan",
"npm": "@ai-sdk/anthropic",
"api": "https://coding-intl.dashscope.aliyuncs.com/apps/anthropic/v1",
"env": ["BAILIAN_CODING_PLAN_API_KEY"],
"models": {
"qwen3.5-plus": {
"name": "Qwen3.5 Plus",
"reasoning": true,
"tool_call": true,
"modalities": { "input": ["text", "image", "video"], "output": ["text"] },
"limit": { "context": 1000000, "output": 65536 },
"cost": { "input": 0, "output": 0 }
},
"qwen3-max": {
"name": "Qwen3 Max",
"tool_call": true,
"modalities": { "input": ["text"], "output": ["text"] },
"limit": { "context": 262144, "output": 65536 },
"cost": { "input": 0, "output": 0 }
},
"qwen3-coder-next": {
"name": "Qwen3 Coder Next",
"tool_call": true,
"modalities": { "input": ["text"], "output": ["text"] },
"limit": { "context": 262144, "output": 65536 },
"cost": { "input": 0, "output": 0 }
},
"qwen3-coder-plus": {
"name": "Qwen3 Coder Plus",
"tool_call": true,
"modalities": { "input": ["text"], "output": ["text"] },
"limit": { "context": 1048576, "output": 65536 },
"cost": { "input": 0, "output": 0 }
},
"glm-5": {
"name": "GLM-5",
"reasoning": true,
"tool_call": true,
"modalities": { "input": ["text"], "output": ["text"] },
"limit": { "context": 202752, "output": 16384 },
"cost": { "input": 0, "output": 0 }
},
"glm-4.7": {
"name": "GLM-4.7",
"reasoning": true,
"tool_call": true,
"modalities": { "input": ["text"], "output": ["text"] },
"limit": { "context": 202752, "output": 16384 },
"cost": { "input": 0, "output": 0 }
},
"minimax-m2.5": {
"name": "MiniMax M2.5",
"reasoning": true,
"tool_call": true,
"modalities": { "input": ["text"], "output": ["text"] },
"limit": { "context": 1048576, "output": 32768 },
"cost": { "input": 0, "output": 0 }
},
"kimi-k2.5": {
"name": "Kimi K2.5",
"reasoning": true,
"tool_call": true,
"modalities": { "input": ["text", "image", "video"], "output": ["text"] },
"limit": { "context": 262144, "output": 32768 },
"cost": { "input": 0, "output": 0 }
}
}
},
"alibaba-coding-plan-cn": {
"name": "Alibaba Coding Plan (China)",
"npm": "@ai-sdk/anthropic",
"api": "https://coding.dashscope.aliyuncs.com/apps/anthropic/v1",
"env": ["BAILIAN_CODING_PLAN_API_KEY"],
"models": {
"qwen3.5-plus": {
"name": "Qwen3.5 Plus",
"reasoning": true,
"tool_call": true,
"modalities": { "input": ["text", "image", "video"], "output": ["text"] },
"limit": { "context": 1000000, "output": 65536 },
"cost": { "input": 0, "output": 0 }
},
"qwen3-max": {
"name": "Qwen3 Max",
"tool_call": true,
"modalities": { "input": ["text"], "output": ["text"] },
"limit": { "context": 262144, "output": 65536 },
"cost": { "input": 0, "output": 0 }
},
"qwen3-coder-next": {
"name": "Qwen3 Coder Next",
"tool_call": true,
"modalities": { "input": ["text"], "output": ["text"] },
"limit": { "context": 262144, "output": 65536 },
"cost": { "input": 0, "output": 0 }
},
"qwen3-coder-plus": {
"name": "Qwen3 Coder Plus",
"tool_call": true,
"modalities": { "input": ["text"], "output": ["text"] },
"limit": { "context": 1048576, "output": 65536 },
"cost": { "input": 0, "output": 0 }
},
"glm-5": {
"name": "GLM-5",
"reasoning": true,
"tool_call": true,
"modalities": { "input": ["text"], "output": ["text"] },
"limit": { "context": 202752, "output": 16384 },
"cost": { "input": 0, "output": 0 }
},
"glm-4.7": {
"name": "GLM-4.7",
"reasoning": true,
"tool_call": true,
"modalities": { "input": ["text"], "output": ["text"] },
"limit": { "context": 202752, "output": 16384 },
"cost": { "input": 0, "output": 0 }
},
"minimax-m2.5": {
"name": "MiniMax M2.5",
"reasoning": true,
"tool_call": true,
"modalities": { "input": ["text"], "output": ["text"] },
"limit": { "context": 1048576, "output": 32768 },
"cost": { "input": 0, "output": 0 }
},
"kimi-k2.5": {
"name": "Kimi K2.5",
"reasoning": true,
"tool_call": true,
"modalities": { "input": ["text", "image", "video"], "output": ["text"] },
"limit": { "context": 262144, "output": 32768 },
"cost": { "input": 0, "output": 0 }
}
}
}
}
}

Note: Coding Plan is a flat monthly subscription (¥7.9-200/month with request limits), not pay-per-token, so the per-token cost fields above are set to 0.

Set your API key:

    export BAILIAN_CODING_PLAN_API_KEY=sk-your-key

Then use:

    opencode -m alibaba-coding-plan/glm-5

Suggestion

I'd recommend closing this PR and helping review/merge the models.dev PRs instead. That's the proper long-term fix.
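Before pointing a client at the workaround config above, it can help to sanity-check that each provider entry carries the fields the snippet relies on. The following checker is an illustration only, not part of opencode, and the set of required fields is inferred from the JSON above rather than from any published schema.

```python
# Minimal sanity check for a provider entry shaped like the workaround
# config above. Field names are taken from that JSON; this is not an
# official opencode schema validator.
REQUIRED_MODEL_FIELDS = {"name", "tool_call", "modalities", "limit", "cost"}

def check_provider(provider: dict) -> list[str]:
    """Return a list of problems found in one provider entry (empty = ok)."""
    problems = []
    # Top-level fields every provider entry in the snippet has.
    for field in ("name", "npm", "api", "env", "models"):
        if field not in provider:
            problems.append(f"missing provider field: {field}")
    # Per-model fields, plus the context/output limits the snippet sets.
    for model_id, model in provider.get("models", {}).items():
        missing = REQUIRED_MODEL_FIELDS - model.keys()
        if missing:
            problems.append(f"{model_id}: missing {sorted(missing)}")
        limit = model.get("limit", {})
        if not {"context", "output"} <= limit.keys():
            problems.append(f"{model_id}: limit needs context and output")
    return problems
```

Running it over each entry under `"provider"` in the workaround JSON and fixing anything it reports is a quick way to catch a typo before opencode silently ignores a malformed model.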
I was digging into this, and I think this PR is trying to solve #14819.
This can now be closed, as the feature is already available: implemented by anomalyco/models.dev#1023 and anomalyco/models.dev#1030.
Issue for this PR
Closes #
Type of change
What does this PR do?
Add the Alibaba Cloud (China) Bailian Coding Plan as a native opencode AI provider.
How did you verify your code works?
Use the `opencode auth login` command; you will see `Model Studio Coding Plan` in the list.

Screenshots / recordings
If this is a UI change, please include a screenshot or recording.
Checklist
If you do not follow this template your PR will be automatically rejected.