Large Model Access
Go to the CloudBase Platform -> Environment Configuration -> API Key Configuration page to create a new API Key.
After obtaining the API Key, you can call the large model HTTP API by including the Authorization: Bearer <your API Key> header in each request.
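For example, here is a minimal Node.js sketch of such a request against the DeepSeek chat completions endpoint shown in the cURL example below (assuming Node.js 18+ with built-in fetch, run as an ES module for top-level await, and assuming the endpoint also accepts a non-streaming request with stream set to false, as other OpenAI-compatible APIs do):
// Minimal sketch: send the Authorization header to the chat completions endpoint.
// Replace the placeholders with your environment ID and API Key before running.
const response = await fetch(
  "https://<your environment ID>.api.tcloudbasegateway.com/v1/ai/deepseek/v1/chat/completions",
  {
    method: "POST",
    headers: {
      Authorization: "Bearer <your API Key>",
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "deepseek-r1",
      messages: [{ role: "user", content: "Hello" }],
      stream: false, // assumption: false returns a single JSON response rather than an event stream
    }),
  }
);
console.log(await response.json());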
Mini Program Base Library Example
// Initialize cloud capabilities
wx.cloud.init({
  env: "<your environment ID>",
});

// Create the model instance
const model = wx.cloud.extend.AI.createModel("hunyuan-exp");

// Generate text with the model
const result = await model.generateText({
  model: "hunyuan-lite",
  messages: [{ role: "user", content: "Hello, please introduce Li Bai" }],
});

// Print the generated text
console.log(result.text);
CloudBase js-sdk Example
// In the root directory of your Web project, install the SDK with npm or yarn:
// npm i @cloudbase/js-sdk@latest

// Import the SDK. The complete @cloudbase/js-sdk is imported here; modular imports are also supported.
import cloudbase from "@cloudbase/js-sdk";

const app = cloudbase.init({
  // Replace with your actual environment ID
  // Obtain the publishable key at: https://tcb.cloud.tencent.com/dev#/env/apikey
  env: "<your environment ID>",
  accessKey: "<your client publishable key>",
});
// Sign in before calling AI capabilities (anonymous sign-in shown here; other sign-in methods also work)
const auth = app.auth();
await auth.signInAnonymously();

// Obtain the AI module and create the model instance
const ai = app.ai();
const aiModel = ai.createModel("hunyuan-exp");
const res = await aiModel.streamText({
  model: "hunyuan-turbos-latest",
  messages: [{ role: "user", content: "Hello, please introduce Li Bai" }],
});

// Print the generated text content
for await (const text of res.textStream) {
  console.log(text);
}
cURL Example
Here is an example of calling the large model HTTP API with cURL:
curl -X POST 'https://<your environment ID>.api.tcloudbasegateway.com/v1/ai/deepseek/v1/chat/completions' \
  -H 'Authorization: Bearer <your API Key>' \
  -H 'Content-Type: application/json' \
  -H 'Accept: text/event-stream' \
  -d '{
    "model": "deepseek-r1",
    "messages": [
      { "role": "user", "content": "Introduce yourself" }
    ],
    "stream": true
  }'
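Because the request sets stream to true and the Accept header asks for text/event-stream, the response arrives incrementally as server-sent events (data: lines carrying JSON chunks) rather than as a single JSON document. Setting stream to false should return one complete JSON response instead, as is usual for OpenAI-compatible chat completion APIs.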
OpenAI SDK Example
After obtaining the API Key, you can also use the OpenAI SDK to access the large model service. Simply replace baseURL and apiKey. Here is an example:
const OpenAI = require("openai");

const client = new OpenAI({
  apiKey: "<your API Key>",
  baseURL: "https://<your environment ID>.api.tcloudbasegateway.com/v1/ai/deepseek/v1",
});

async function main() {
  const completion = await client.chat.completions.create({
    model: "deepseek-r1",
    messages: [{ role: "user", content: "Hello" }],
    temperature: 0.3,
    stream: true,
  });
  for await (const chunk of completion) {
    // Each chunk is an OpenAI-style chat.completion.chunk; the incremental text is typically in chunk.choices[0]?.delta?.content
    console.log(chunk);
  }
}

main();