AI Mini Program Growth Plan User Guide

Develop Mini Programs with AI Tools

Use CloudBase Skills (MCP) to let Cursor, Lobechat, OpenClaw, CodeBuddy, WorkBuddy, VS Code, Claude Code, and other AI tools operate on CloudBase resources directly, with no manual environment setup.

Step 1, run the following command in your AI tool to install CloudBase Skills:

npx skills add tencentcloudbase/cloudbase-skills -y

Step 2, tell your AI: "Use CloudBase Skills to develop a todo mini program"

The AI will automatically handle: code generation, database creation, cloud function deployment, and more.

For detailed configuration, see CloudBase AI Toolkit Usage Documentation.


Use CloudBase AI Capabilities

The tokens gifted by the Growth Plan can be used in two ways: call AI directly in your mini program, or configure CloudBase as a model provider in your AI tools to assist development.

Scenario 1: Call AI Capabilities in Mini Programs

In a mini program, you can call CloudBase large models directly via wx.cloud.extend.AI. Tokens gifted by the AI Mini Program Growth Plan are consumed here.

Initialize (in app.js):

wx.cloud.init({ env: "<CloudBase Environment ID>" });

Streaming text generation (suitable for conversational scenarios):

const model = wx.cloud.extend.AI.createModel("hunyuan-exp");

const res = await model.streamText({
  data: {
    model: "hunyuan-2.0-instruct-20251111",
    messages: [{ role: "user", content: "Write a poem about spring" }],
  },
});

for await (const text of res.textStream) {
  console.log(text); // Output text fragments progressively
}

Non-streaming text generation (suitable for short text):

const model = wx.cloud.extend.AI.createModel("hunyuan-exp");

const res = await model.generateText({
  model: "hunyuan-2.0-instruct-20251111",
  messages: [{ role: "user", content: "Introduce Li Bai in one sentence" }],
});

console.log(res.choices[0].message.content);

For more usage (multi-turn conversations, cloud function calls, etc.), see Mini Program AI Model Documentation.
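A multi-turn conversation with either call above amounts to resending the accumulated message history on each request, since the model only sees what you send. A minimal sketch of the history bookkeeping (`appendTurn` is a hypothetical helper of ours, not part of the CloudBase SDK; the actual model call is unchanged from the examples above):

```javascript
// Hypothetical helper: append one turn to the chat history (not a CloudBase API).
function appendTurn(messages, role, content) {
  return [...messages, { role, content }];
}

let history = [];
history = appendTurn(history, "user", "Write a poem about spring");
// ...await the model reply as in the streamText example, then record it:
history = appendTurn(history, "assistant", "(model reply text)");
history = appendTurn(history, "user", "Now shorten it to two lines");

// Pass `history` as `messages` in the next streamText/generateText call.
console.log(history.map((m) => m.role).join(",")); // user,assistant,user
```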

Scenario 2: Configure CloudBase Token in AI Tools

CloudBase is compatible with the OpenAI API protocol. You can enter your CloudBase API Key and Base URL directly into AI tools that support custom models, such as Lobechat, Cursor, and OpenClaw, and use CloudBase large models to assist development.

Step 1, create an API Key at CloudBase Console → Environment Management → API Key Configuration.

Step 2, fill in the model configuration in your AI tool:

  • Base URL: https://<ENV_ID>.api.tcloudbasegateway.com/v1/ai/hunyuan-exp/v1
  • API Key: your CloudBase API Key
  • Model Name: hunyuan-2.0-instruct-20251111

Replace <ENV_ID> with your CloudBase environment ID.

Step 3, use the AI tool normally — tokens consumed are from the AI Mini Program Growth Plan gift.

For detailed OpenAI SDK compatible usage, see OpenAI SDK Integration Documentation.
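Because the endpoint speaks the OpenAI protocol, you can also call it from your own code without any SDK. A minimal sketch with plain fetch (`buildChatRequest` is our own illustrative helper; the `/chat/completions` path is the standard OpenAI protocol path appended to the Base URL from the table above):

```javascript
// Hypothetical helper: assemble an OpenAI-protocol chat request against the
// CloudBase gateway. envId and apiKey are placeholders you must supply.
function buildChatRequest(envId, apiKey, userText) {
  return {
    url: `https://${envId}.api.tcloudbasegateway.com/v1/ai/hunyuan-exp/v1/chat/completions`,
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
      },
      body: JSON.stringify({
        model: "hunyuan-2.0-instruct-20251111",
        messages: [{ role: "user", content: userText }],
      }),
    },
  };
}

// Usage (needs a real environment ID and API Key):
// const { url, options } = buildChatRequest("<ENV_ID>", "<API Key>", "Hello");
// const res = await fetch(url, options);
// const data = await res.json();
// console.log(data.choices[0].message.content);
```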


Resource Management

Note: Log in with your WeChat Official Accounts Platform account and select the mini program that applied for the AI Mini Program Growth Plan.

  • If you received a CloudBase plan, view the expiration time and resource usage in CloudBase Console → Package Usage
  • If you received a CloudBase voucher, view vouchers and validity in Tencent Cloud Expense Center → Promotion Management → Vouchers
  • Text generation model token usage: View usage and integration guide in CloudBase Console → Text Generation Models
  • Image generation model usage: View usage and integration guide in CloudBase Console → Image Generation Models

Get Help

Visit the CloudBase Community to ask questions, join communication groups, or submit tickets — 1v1 dedicated customer service available.