AI Agent Integration
Feature Introduction
Whether or not you have programming experience, you can quickly build various Q&A Agents based on AI models in the CloudBase console. You can also publish an Agent to Mini Program customer service, Service Accounts, Official Accounts, or WeCom customer service so that more users can chat with it. In addition, the AI Agent is published as an API service, allowing you to interact with the Agent through HTTP requests.
This series of open APIs provides a unified way to connect to AI Agents. To integrate quickly, simply configure the token of the target Agent in the CloudBase environment. The dialogue interfaces in this series return responses in Server-Sent Events (SSE) format.
Access Guide
Calling the following APIs requires an AccessToken, passed in the HTTP header as `Authorization: Bearer <token>`. For details on how to obtain the token, see Get token.
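To illustrate the header format and the SSE response handling, here is a minimal TypeScript sketch (Node 18+) that sends a message to an Agent dialogue endpoint and prints the streamed events. The base URL, endpoint path, and request-body field (`msg`) are illustrative assumptions, not the official API surface; replace them with the values from your environment and the API reference.

```ts
// Minimal sketch: call an Agent dialogue endpoint with a Bearer token and
// read the SSE stream. BASE_URL, AGENT_ENDPOINT, and the body field "msg"
// are hypothetical placeholders -- substitute the values from your
// CloudBase environment and the API reference.
const BASE_URL = "https://<your-env-id>.example-cloudbase-gateway.com"; // placeholder
const AGENT_ENDPOINT = "/v1/agent/send-message";                        // placeholder path
const ACCESS_TOKEN = process.env.CLOUDBASE_ACCESS_TOKEN ?? "";          // obtained as described in "Get token"

async function chatWithAgent(message: string): Promise<void> {
  const res = await fetch(`${BASE_URL}${AGENT_ENDPOINT}`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${ACCESS_TOKEN}`, // required by all APIs in this series
      "Content-Type": "application/json",
      Accept: "text/event-stream",             // dialogue interfaces respond in SSE format
    },
    body: JSON.stringify({ msg: message }),    // field name is an assumption
  });

  if (!res.ok || !res.body) {
    throw new Error(`Request failed with status ${res.status}`);
  }

  // Read the SSE stream chunk by chunk and print each "data:" line.
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? ""; // keep any partial line for the next chunk
    for (const line of lines) {
      if (line.startsWith("data:")) {
        console.log(line.slice(5).trim());
      }
    }
  }
}

chatWithAgent("Hello, Agent!").catch(console.error);
```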
HTTP Status Codes
Depending on the outcome of a request, the response status code may be 2xx, 4xx, or 5xx. Note that the Agent may report an error in the response body even when the status code is 200, so always handle the response based on its actual content (see the sketch after the error code table below).
Error Codes
The following table lists the error codes specific to this series of APIs.
| Error Code | Description |
|---|---|
| AGENT_INVALID_PARAM | Invalid parameters were provided when calling the Agent |
| AGENT_GET_INFO_ERROR | Failed to query Agent information |
| AGENT_REQUEST_LLM_ERROR | The Agent encountered an exception while calling the large model |
| AGENT_REQUEST_KNOWLEDGE_ERROR | The Agent encountered an exception while querying the knowledge base |
| AGENT_SERVER_ERR | An internal Agent system error occurred |
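Because a 200 response can still carry one of the error codes above, a client should inspect the body before treating the call as successful. The sketch below assumes an error payload shaped like `{ code, message }`; the actual response schema may differ, so treat it as illustrative only.

```ts
// Hypothetical error payload shape; the real response schema may differ.
interface AgentError {
  code?: string;    // e.g. "AGENT_REQUEST_LLM_ERROR"
  message?: string;
}

// Handle a non-streaming Agent response: check the HTTP status first,
// then look for an error code in the body, which can appear even on 200.
async function handleAgentResponse(res: Response): Promise<string> {
  const text = await res.text();

  // 4xx / 5xx: surface the status together with whatever body was returned.
  if (!res.ok) {
    throw new Error(`HTTP ${res.status}: ${text}`);
  }

  // A 200 response may still describe an error in its body.
  let payload: AgentError | undefined;
  try {
    payload = JSON.parse(text) as AgentError;
  } catch {
    // Not JSON (for example an SSE event stream); treat it as a normal body.
    return text;
  }

  if (payload?.code?.startsWith("AGENT_")) {
    // Map the documented error codes to a client-side failure.
    throw new Error(`Agent error ${payload.code}: ${payload.message ?? "no message"}`);
  }

  return text;
}
```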
Authentication
- HTTP: Bearer Auth
The token corresponding to the environment ID is obtained through login authentication (v2).
| Security Scheme Type | http |
|---|---|
| HTTP Authorization Scheme | bearer |
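As a design note, it can help to centralize the bearer header in one small helper so that every call in this series is authenticated consistently. The sketch below assumes the environment token has already been obtained through login authentication (v2); the base URL and paths are placeholders.

```ts
// Minimal client wrapper that attaches the environment token to every request.
// The token is assumed to have been obtained through login authentication (v2);
// the base URL and request paths are placeholders.
class AgentApiClient {
  constructor(
    private readonly baseUrl: string,
    private readonly accessToken: string,
  ) {}

  // All APIs in this series expect `Authorization: Bearer <token>`.
  async post(path: string, body: unknown): Promise<Response> {
    return fetch(`${this.baseUrl}${path}`, {
      method: "POST",
      headers: {
        Authorization: `Bearer ${this.accessToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify(body),
    });
  }
}

// Usage (placeholder values):
// const client = new AgentApiClient("https://<your-env-id>.example.com", token);
// const res = await client.post("/v1/agent/send-message", { msg: "Hi" });
```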