# Local Development
This guide explains how to develop and debug Agents locally.
## Quick Start
### Step 1: Get Project Code
#### Method 1: Get from GitHub Templates (Recommended)
Visit awesome-cloudbase-examples for a complete collection of example projects.
```bash
# Clone the examples repository
git clone https://github.com/TencentCloudBase/awesome-cloudbase-examples.git
cd awesome-cloudbase-examples/httpfunctions

# Choose the template you need
cd <template-name>
```
Available Templates:

| Template Name | Language | Framework | Description |
|---|---|---|---|
| langchain-js | JavaScript | LangChain | LangChain Agent Example (JS) |
| langchain-ts | TypeScript | LangChain | LangChain Agent Example (TS) |
| langgraph-js | JavaScript | LangGraph | LangGraph Agent Example (JS) |
| langgraph-ts | TypeScript | LangGraph | LangGraph Agent Example (TS) |
| langgraph-python | Python | LangGraph | LangGraph Agent Example (Python) |
| crewai-python | Python | CrewAI | CrewAI Multi-Agent Collaboration Example |
| adk-python | Python | ADK | Agent Development Kit Example |
| adp-js | JavaScript | ADP | Agent Development Platform Example (JS) |
| adp-ts | TypeScript | ADP | Agent Development Platform Example (TS) |
| coze-python | Python | Coze | Coze Agent Example |
| n8n-js | JavaScript | n8n | n8n Agent Example (JS) |
| n8n-python | Python | n8n | n8n Agent Example (Python) |
For details on how to integrate a specific framework, refer to the corresponding framework's integration guide.
#### Method 2: Sync Online Services to Local
If you have already deployed an Agent service in the cloud, you can sync the code to your local machine for development.
Prerequisites: Install CloudBase CLI

```bash
npm install -g @cloudbase/cli

# Log in to CloudBase
cloudbase login
```
Sync HTTP Cloud Functions:

```bash
# List cloud functions
cloudbase fn list -e <env-id>

# Download cloud function code to local
mkdir -p ./my-agent && cloudbase fn code download -e <env-id> <function-name> ./my-agent
cd ./my-agent
```
Sync CloudBase Run Services:

```bash
# List CloudBase Run services
cloudbase cloudrun list

# Download CloudBase Run code to local
cloudbase cloudrun download -s <service-name> --targetPath ./my-agent
cd ./my-agent
```
For more CLI commands and usage, refer to the CloudBase CLI Documentation.
### Step 2: Install Dependencies
Install dependencies based on your project type. Each template's README file contains detailed installation instructions.
**Node.js Project**

```bash
# Using npm
npm install

# Or using yarn
yarn install

# Or using pnpm
pnpm install
```

**Python Project**

```bash
# Using pip
pip install -r requirements.txt
```
It's recommended to read the template's README first: it covers dependency installation, configuration, and how to run the project.
### Step 3: Configure Environment Variables
Environment variable configuration may differ between templates. Below is an example using the LangGraph template.
Create a `.env` file:

```bash
# Model configuration used by the agent
OPENAI_API_KEY=xxx
OPENAI_BASE_URL=xxx
OPENAI_MODEL=xxx

# Temperature (default: 0.7)
OPENAI_TEMPERATURE=0.7

# Optional: Enable CORS (for cross-origin debugging during local development)
# ENABLE_CORS=true

# Optional: Server port (default: 3000)
# PORT=3000
```
- API Key: Create at CloudBase Console - API Keys
- Model List: Refer to Large Model Configuration Guide
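How these variables are consumed depends on the template. As an illustrative sketch (variable names are taken from the example above; the defaults and the way `.env` is loaded are assumptions, not any template's actual code), a Python service might read them like this:

```python
import os

# Read model configuration from the environment. Templates typically load
# the .env file themselves (e.g. via a dotenv helper) before this point.
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY", "")
OPENAI_BASE_URL = os.environ.get("OPENAI_BASE_URL", "")
OPENAI_MODEL = os.environ.get("OPENAI_MODEL", "")

# Numeric values arrive as strings and must be converted explicitly.
OPENAI_TEMPERATURE = float(os.environ.get("OPENAI_TEMPERATURE", "0.7"))

# Optional flags are easiest to treat as case-insensitive booleans.
ENABLE_CORS = os.environ.get("ENABLE_CORS", "").lower() == "true"
```

Keeping defaults in one place like this makes it obvious which variables are required and which are optional.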
### Step 4: Start the Service
**Node.js Project**

```bash
node src/index.js
```

**Python Project**

```bash
# Run directly
python -u app.py
```
Once started, the service listens on http://localhost:9000 by default.
## Run with Docker
For CloudBase Run projects, you can use Docker for local development to ensure consistency between local and production environments.
```bash
# Build image
docker build -t my-agent .

# Run container
docker run -p 9000:9000 \
  -e OPENAI_API_KEY=your-api-key \
  -e OPENAI_BASE_URL=model-base-url \
  -e OPENAI_MODEL=model \
  -e OPENAI_TEMPERATURE=0.7 \
  my-agent
```
## Local Debugging
### Method 1: Debug with cURL
```bash
curl 'http://localhost:9000/send-message' \
  -H 'Accept: text/event-stream' \
  -H 'Content-Type: application/json' \
  --data-raw '{
    "threadId": "550e8400-e29b-41d4-a716-446655440000",
    "messages": [
      { "id": "msg-1", "role": "user", "content": "Hello" }
    ],
    "tools": [],
    "context": [],
    "state": {},
    "forwardedProps": {}
  }'
```
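The `send-message` endpoint responds with a `text/event-stream` body. As a minimal sketch for inspecting such a response by hand (the event payload format below is an assumption; real templates may differ), you can split the stream into events on blank lines and collect the `data:` fields:

```python
def parse_sse(stream_text: str) -> list[str]:
    """Split a text/event-stream body into data payloads.

    Events are separated by blank lines; each `data:` line carries part
    of the payload, and multi-line data fields are joined with newlines.
    """
    events = []
    for block in stream_text.split("\n\n"):
        data_lines = [
            line[len("data:"):].strip()
            for line in block.splitlines()
            if line.startswith("data:")
        ]
        if data_lines:
            events.append("\n".join(data_lines))
    return events

# Example with a hypothetical stream body:
sample = 'data: {"text": "Hel"}\n\ndata: {"text": "lo"}\n\n'
print(parse_sse(sample))  # two data payloads
```

Piping the cURL output into a helper like this makes it easier to see where a streamed reply is truncated or malformed.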
### Method 2: Proxy Online Requests to Local
This method proxies Agent UI or SDK requests aimed at the online Agent to your local service, enabling local debugging without modifying client code.
Online URL Example:

```
https://lowcode-3geceaptb6c8835b.api.tcloudbasegateway.com/v1/aibot/bots/agent-kkk-5gxr7dys9d0c5427/send-message
```
#### 1. Install Whistle

```bash
npm install -g whistle

# Start Whistle
w2 start
```
#### 2. Configure Proxy Rules

Visit http://localhost:8899 and add the following under Rules:

```
# Proxy online Agent requests to local
https://lowcode-3geceaptb6c8835b.api.tcloudbasegateway.com/v1/aibot/bots/agent-kkk-5gxr7dys9d0c5427/send-message resCors://* http://localhost:9000/send-message
```
#### 3. Configure Proxy

Where you configure the proxy depends on where the client runs:
**In WeChat Developer Tools:**

- Open WeChat Developer Tools
- Settings → Proxy Settings → Manual proxy setup
- Proxy Server: `127.0.0.1`
- Port: `8899`
- Check "Use proxy"
**In Browser:**

You can proxy webpage requests to port 8899 through a browser extension:

- Chrome: use the SwitchyOmega extension
  - Install the SwitchyOmega extension
  - Create a new profile → Proxy server
  - Proxy protocol: HTTP
  - Proxy server: `127.0.0.1`
  - Proxy port: `8899`
  - Apply the options and switch to this profile
#### 4. Install HTTPS Certificate

Visit http://localhost:8899, click **HTTPS**, then download and install the root certificate.
This setup lets you:

- Debug online Agents with Agent UI
- Debug with the SDK in real environments
- Debug locally without modifying client code
## Common Issues
### 1. Port in Use
```bash
# Check port usage
lsof -i :9000                  # macOS/Linux
netstat -ano | findstr :9000   # Windows

# Kill the process
kill -9 <PID>           # macOS/Linux
taskkill /PID <PID> /F  # Windows

# Or start on a different port
PORT=3001 node index.js
```
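As an alternative to hunting PIDs, a small Python sketch (a hypothetical helper, not part of any template) can check whether a port is already taken before you start the service:

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is accepting connections on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        # connect_ex returns 0 on a successful connection, an errno otherwise.
        return s.connect_ex((host, port)) == 0

# Example: check the default Agent port before starting the service.
print(port_in_use(9000))
```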
### 2. Proxy Not Working

- Ensure Whistle is running: `w2 status`
- Check that the proxy rules are correct
- Confirm the system proxy is configured
- Mini Programs need the proxy configured separately in WeChat Developer Tools
## Next Steps

After completing local development, you can deploy your Agent to the cloud, or learn how to invoke the Agent from clients.