
Guide to Integrating Mini Program with Cloud Development AI Capabilities

tip

Cloud Development AI+ seamlessly integrates the DeepSeek and Hunyuan dual models for intelligent agent development, supporting scenarios such as Mini Programs, H5 pages, Official Accounts, and WeChat Customer Service.

New users enjoy a free first-month trial and receive 1 million tokens.

Experience the AI capabilities of the Cloud Development Platform now. Click to view the Complete Guide

This article describes how Mini Programs can quickly integrate Cloud Development AI (full-featured DeepSeek) capabilities.

Unlock more core AI capabilities of the Cloud Development Platform: Intelligent Q&A Bot, AI-powered Enterprise Knowledge Base, AI-enhanced WeChat Platform Customer Service. Click to view the Quick Guide

Preparation

  • Register a WeChat Mini Program account and create a local Mini Program project
  • The Mini Program base library must be version 3.7.1 or above and include the wx.cloud.extend.AI object
  • Enable Cloud Development for the Mini Program: click the "Cloud Development" button in the toolbar of WeChat Developer Tools to activate it and create an environment (note: first-time Cloud Development users receive a free tier for the first month)

Guide 1: Call Large Models to Implement Text Generation

In Mini Programs, you can directly call the text generation capabilities of large models to achieve the simplest form of text generation. Here we take a simple demo of a "seven-character quatrain" generator as an example:

Step 1: Initialize the CloudBase environment

In the Mini Program code, initialize the CloudBase environment through the following code:

wx.cloud.init({
  env: "<CloudBase environment ID>",
});

Here, "<CloudBase environment ID>" needs to be replaced with your actual CloudBase environment ID. After successful initialization, you can use wx.cloud.extend.AI to call AI capabilities.

Step 2: Create an AI model and call it to generate text

In Mini Program base library version 3.7.1 or above, taking the DeepSeek-R1 model as an example, the Mini Program-side code is as follows:

// Create a model instance; here we use the DeepSeek large model
const model = wx.cloud.extend.AI.createModel("deepseek");

// First, set the system prompt for the AI, taking the generation of a seven-character quatrain as an example
const systemPrompt =
  "Please strictly adhere to the metrical requirements of seven-character quatrains or seven-character regulated verses when composing. The level and oblique tones must conform to rules, rhymes should be harmonious and natural, and rhyming words must belong to the same rhyme group. Create content centered around the user-given theme. A seven-character quatrain consists of four lines with seven characters each; a seven-character regulated verse has eight lines with seven characters each, requiring strict parallelism in the third/fourth and fifth/sixth couplets. Simultaneously, incorporate vivid imagery, rich emotions, and elegant artistic conception to showcase the charm and beauty of classical poetry.";

// User's natural language input, such as "Write a poem praising Yulong Snow Mountain for me"
const userInput = "Write a poem praising Yulong Snow Mountain for me";

// Pass the system prompt and user input to the large model
const res = await model.streamText({
  data: {
    model: "deepseek-r1", // specify the exact model
    messages: [
      { role: "system", content: systemPrompt },
      { role: "user", content: userInput },
    ],
  },
});

// Receive the response from the large model.
// Since the large model returns results as a stream, keep receiving until the response text is complete.
for await (let str of res.textStream) {
  console.log(str);
}
// Output result:
// "# Ode to Yulong Snow Mountain\n"
// "Snow-capped ridges pierce the clouded skies,\n"
// "Snow shadows and mountain haze enhance the scenic view,\n"

As can be seen, with just a few lines of Mini Program code, you can directly call the text generation capabilities of large models through Cloud Development.
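When the complete text is needed as a whole (for example, to render the finished poem in one go), the streamed chunks can be accumulated into a single string. The sketch below is a hypothetical helper (the collectStream name is ours, not part of the SDK); it works with any async iterable, such as res.textStream from the example above:

```javascript
// Collect all chunks from an async iterable (such as res.textStream)
// into one complete string. The helper name is illustrative.
async function collectStream(textStream) {
  let fullText = "";
  for await (const chunk of textStream) {
    fullText += chunk;
  }
  return fullText;
}

// Usage sketch with the streamText call from above:
// const res = await model.streamText({ ... });
// const poem = await collectStream(res.textStream);
```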

Guide 2: Enable Intelligent Conversations via Agent (Intelligent Agent)

By calling the text generation interface of large models, Q&A scenarios can be quickly implemented. However, for a comprehensive conversational capability, merely having the input and output of a large model is insufficient. The large model needs to be transformed into a complete Agent to better engage in dialogues with users.

Cloud Development's AI capabilities not only provide raw large model integration but also offer Agent integration capabilities. Developers can define their own Agents on Cloud Development and then directly invoke them through Mini Programs for conversations.

Step 1: Initialize the CloudBase environment

In the Mini Program code, initialize the CloudBase environment through the following code:

wx.cloud.init({
  env: "<CloudBase environment ID>",
});

Here, "<CloudBase environment ID>" needs to be replaced with your actual CloudBase environment ID. After successful initialization, you can use wx.cloud.extend.AI to call AI capabilities.

Step 2: Create an Agent

Go to Cloud Development Platform - AI+, choose a template and create an Agent.

You can create Agents through open-source frameworks like LangChain and LangGraph, or integrate with third-party Agents such as Yuanqi, Agent development platforms, and Dify.


Click "Copy ID" at the top of the page to obtain a bot-id, which is the unique identifier of the Agent and will be used in the following code.


Step 3: Implement conversations with the Agent in the Mini Program

We have just created a "Mini Program Development Expert" Agent. Let's try conversing with it to see if it can handle common error messages in Cloud Development. In the Mini Program, use the following code to invoke the Agent we just created:

// Initialization
wx.cloud.init({
  env: "<CloudBase environment ID>",
});

// User input; here we take an error message as an example
const userInput =
  "What does this error message in my mini program mean: 'FunctionName parameter could not be found'";

const res = await wx.cloud.extend.AI.bot.sendMessage({
  data: {
    botId: "xxx-bot-id", // the Agent's unique identifier obtained in Step 2
    msg: userInput, // user input
    history: [], // previous conversation turns; can be omitted in the first round
  },
});
for await (let x of res.textStream) {
  console.log(x);
}
// Output result:
// "### Error Explanation\n"
// "**Error Message:** `FunctionName parameter could not be found`\n"
// "This error usually indicates that when calling a function,\n"
// "the specified function name parameter could not be found.\n"
// "Specifically, it is likely one of the following situations:\n"
// ...

We can also record the conversation content and repeatedly invoke the Agent's interface to achieve multi-turn conversations.

const res = await wx.cloud.extend.AI.bot.sendMessage({
  data: {
    botId: "xxx-bot-id", // the Agent's unique identifier obtained in Step 2
    msg: userInput, // user input
    history: [
      { role: "user", message: "What does this error message mean?..." },
      { role: "bot", message: "### Error Explanation..." },
      { role: "user", message: "How should I proceed to fix it?" },
      // ...
    ],
  },
});
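To keep multi-turn state consistent across calls, the history array can be maintained by a small helper. This is only a sketch: the createHistory name is invented here, and the { role, message } entry shape follows the sendMessage example above:

```javascript
// Maintain the conversation history passed to bot.sendMessage.
// Each turn is stored as { role, message }, matching the history
// entry shape used in the multi-turn example. Hypothetical helper.
function createHistory() {
  const turns = [];
  return {
    addUser(message) {
      turns.push({ role: "user", message });
    },
    addBot(message) {
      turns.push({ role: "bot", message });
    },
    // Return a copy so callers cannot mutate internal state.
    list() {
      return turns.slice();
    },
  };
}

// Usage sketch: record each turn, then pass history.list()
// as the history field of the next sendMessage call.
```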

Step 4: Implement richer chat features

CloudBase provides a comprehensive set of API interfaces for Agent integration in the SDK, including basic conversation, conversation history saving, conversation feedback collection, and follow-up question recommendations.

Mini Program developers can create an Agent on the CloudBase Platform, then directly call various interfaces under wx.cloud.extend.AI in the Mini Program frontend code to interact with the Agent, including:

  • Obtain chat records
  • Send and obtain user feedback
  • Obtain recommended follow-up questions

Below are some code samples:

Obtaining Chat Records

await wx.cloud.extend.AI.bot.getChatRecords({
  botId: "botId-xxx",
  pageNumber: 1,
  pageSize: 10,
  sort: "asc",
});

Pass in the botId, pagination information, and sorting method to obtain the chat records of the specified Agent.
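If an Agent has more records than fit on one page, the pagination parameters can be driven in a loop. The sketch below is an assumption-heavy illustration: the recordList field on the response is a guess (check the SDK reference for the actual response shape), and the page fetcher is injected so the paging logic itself stays self-contained:

```javascript
// Fetch all chat records page by page. `fetchPage` is expected to
// resemble (pageNumber, pageSize) => bot.getChatRecords({ ... }) and
// resolve to { recordList: [...] } -- an assumed response shape.
async function fetchAllRecords(fetchPage, pageSize = 10) {
  const all = [];
  let pageNumber = 1;
  while (true) {
    const page = await fetchPage(pageNumber, pageSize);
    const records = page.recordList || [];
    all.push(...records);
    if (records.length < pageSize) break; // last page reached
    pageNumber += 1;
  }
  return all;
}
```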

Sending and Obtaining User Feedback

Send user feedback:

const res = await wx.cloud.extend.AI.bot.sendFeedback({
  userFeedback: {
    botId: "botId-xxx",
    recordId: "recordId-xxx",
    comment: "Excellent",
    rating: 5,
    tags: ["elegant"],
    aiAnswer: "Fallen petals scatter everywhere",
    input: "Give me an idiom",
    type: "upvote",
  },
});
Obtaining Recommended Follow-up Questions

const res = await wx.cloud.extend.AI.bot.getRecommendQuestions({
  data: {
    botId: "xxx-bot-id",
    msg: "Introduce the Python language",
  },
});
for await (let x of res.textStream) {
  console.log(x);
}

Set botId and user message msg in the data parameter, then traverse textStream to obtain recommended questions.
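Since the recommendations also arrive as a text stream, one way to turn them into an array is to buffer the stream and split it. The helper below is a sketch; it assumes the questions are newline-separated, which should be verified against the actual stream format:

```javascript
// Gather streamed recommended questions into an array.
// Assumes questions are separated by newlines -- verify this
// against the actual stream format in the SDK documentation.
async function collectQuestions(textStream) {
  let buffer = "";
  for await (const chunk of textStream) {
    buffer += chunk;
  }
  return buffer
    .split("\n")
    .map((q) => q.trim())
    .filter((q) => q.length > 0);
}
```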

Guide 3: Use the Cloud Development AI Dialog Component to Quickly Integrate AI Conversations

To help developers quickly implement an AI dialog feature in their Mini Programs, CloudBase provides a ready-to-use Mini Program source-code component for AI dialog, with the effect shown in the figure below:

Step 1: Download the component package

Method 1: Directly download the component sample package, which includes the agent-ui source code component and usage guide

Method 2: Create an agent-ui component template using WeChat DevTools, and configure and use it according to the guide

Step 2: Import the component into your mini-program project

  1. Copy the miniprogram/components/agent-ui component to your mini-program


  2. Register the component in the page's index.json configuration file
{
  "usingComponents": {
    "agent-ui": "/components/agent-ui/index"
  }
}
  3. Use the component in the page's index.wxml file
<view>
  <agent-ui agentConfig="{{agentConfig}}" showBotAvatar="{{showBotAvatar}}" chatMode="{{chatMode}}" modelConfig="{{modelConfig}}"></agent-ui>
</view>
  4. Write the configuration in the page's index.js file

Connect Agent:

data: {
  chatMode: "bot", // "bot" means chat with an Agent; "model" means chat with the large model directly
  showBotAvatar: true, // whether to display the avatar on the left side of the dialog box
  agentConfig: {
    botId: "bot-e7d1e736", // Agent ID
    allowWebSearch: true, // allow the client to enable web search
    allowUploadFile: true, // allow file uploading
    allowPullRefresh: true, // allow pull-to-refresh
    allowUploadImage: true, // allow image uploading
    showToolCallDetail: true, // whether to display tool call details
    allowMultiConversation: true, // whether to display the session list and "create session" button
  },
}

Connect Large Model:

data: {
  chatMode: "model", // "bot" means chat with an Agent; "model" means chat with the large model directly
  showBotAvatar: true, // whether to display the avatar on the left side of the dialog box
  modelConfig: {
    modelProvider: "hunyuan-open", // large model service provider
    quickResponseModel: "hunyuan-lite", // large model name
    logo: "", // model avatar
    welcomeMsg: "Welcome message", // model welcome message
  },
}

Step 3: Initialize the CloudBase environment

In app.js, initialize the SDK within the onLaunch lifecycle:

// app.js
App({
  onLaunch: function () {
    wx.cloud.init({
      env: "<CloudBase environment ID>",
    });
  },
});

After that, you can directly use the AI chat component on the page:


Summary

This article introduced the following three approaches to integrating large models via Cloud Development, each suited to different scenarios:

  1. Directly call large models via SDK: Suitable for general non-conversational scenarios such as text generation, intelligent completion, intelligent translation, etc.
  2. Invoke Agent conversation capabilities via SDK: This approach is suitable for dedicated AI conversation scenarios, supporting the configuration of capabilities required in conversations such as welcome messages, prompt words, knowledge bases, etc.
  3. Use the AI chat component: This approach is more friendly to professional front-end developers, enabling quick integration of AI conversation capabilities into mini-programs based on UI components provided by CloudBase.

For the above three approaches to integrating AI in Mini Programs, CloudBase provides complete code samples in its code repository for reference.

Beyond Mini Programs, CloudBase's AI capabilities also support invoking large models via Web applications, Node.js, and HTTP APIs; for details, refer to the corresponding documentation.

tip

Unlock more core AI capabilities of the Cloud Development Platform. Click to view the Quick Guide

In the future, Cloud Development plans to introduce more AI capabilities, such as tool calling, multi-Agent chaining, and workflow orchestration. Stay tuned.