Developing with LangChain
LangChain is an open-source framework for building LLM applications, offering a rich set of components and tool chains. CloudBase provides the @cloudbase/agent-adapter-langchain adapter, which lets a LangChain agent connect to the AG-UI protocol seamlessly.
Prerequisites
Install dependencies
npm install @cloudbase/agent-adapter-langchain @cloudbase/agent-server langchain @langchain/openai @langchain/langgraph express
Quick start
1. Create a LangChain Agent
// agent.ts
import { createAgent as createLangchainAgent } from "langchain";
import { MemorySaver } from "@langchain/langgraph";
import { ChatOpenAI } from "@langchain/openai";
import { clientTools } from "@cloudbase/agent-adapter-langchain";

const checkpointer = new MemorySaver();

export function createAgent() {
  // Use CloudBase's built-in LLM endpoint
  const model = new ChatOpenAI({
    model: process.env.TCB_AI_MODEL || "hunyuan-turbos-latest",
    apiKey: process.env.TCB_API_KEY!,
    configuration: {
      baseURL: `https://${process.env.TCB_ENV_ID}.api.tcloudbasegateway.com/v1/ai/hunyuan/v1`,
    },
  });

  // Create the agent; the clientTools middleware enables client-side tools
  return createLangchainAgent({
    model,
    checkpointer,
    middleware: [clientTools()],
  });
}
Tip: CloudBase provides a unified HTTP endpoint for LLMs, supporting models such as Tencent Hunyuan and DeepSeek. The endpoint format is:
https://<ENV_ID>.api.tcloudbasegateway.com/v1/ai/<PROVIDER>/v1
where PROVIDER is either hunyuan or deepseek.
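As a reference, here is a minimal sketch of a helper that assembles this baseURL; the buildBaseURL name and the Provider type are illustrative only, not part of any SDK:

// Hypothetical helper: builds the CloudBase LLM gateway baseURL
type Provider = "hunyuan" | "deepseek";

function buildBaseURL(envId: string, provider: Provider): string {
  return `https://${envId}.api.tcloudbasegateway.com/v1/ai/${provider}/v1`;
}

// buildBaseURL("my-env-id", "hunyuan")
// => "https://my-env-id.api.tcloudbasegateway.com/v1/ai/hunyuan/v1"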
2. Wrap and deploy as an HTTP service
// index.ts
import { createExpressRoutes } from "@cloudbase/agent-server";
import { LangchainAgent } from "@cloudbase/agent-adapter-langchain";
import { createAgent as createLangchainAgent } from "./agent.js";
import express from "express";

function createAgent() {
  const lcAgent = createLangchainAgent();
  return {
    agent: new LangchainAgent({
      agent: lcAgent,
    }),
  };
}

const app = express();

createExpressRoutes({
  createAgent,
  express: app,
});

app.listen(9000, () => console.log("Listening on 9000!"));
3. Configure environment variables
Create a .env file:
TCB_ENV_ID=your-env-id               # CloudBase environment ID
TCB_API_KEY=your-api-key             # CloudBase API key
TCB_AI_MODEL=hunyuan-turbos-latest   # Model name
For the list of supported models, see the LLM configuration guide.
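Depending on how you start the process, the .env file may not be loaded automatically. One common approach, assuming the dotenv package is installed, is to load it at the very top of the entry file:

// index.ts (first line) — loads variables from .env into process.env
import "dotenv/config";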
4. Start the service
npx tsx src/index.ts
Core APIs
LangchainAgent
Wraps a LangChain agent as an AG-UI-compatible agent:
import { LangchainAgent } from "@cloudbase/agent-adapter-langchain";

const agent = new LangchainAgent({
  agent: lcAgent, // the LangChain agent returned by createAgent()
});
Constructor parameters:
| Parameter | Type | Description |
|---|---|---|
| agent | ReturnType<typeof createAgent> | The return value of LangChain's createAgent() |
clientTools()
Creates a middleware that lets the client dynamically inject tools into agent invocations:
import { clientTools } from "@cloudbase/agent-adapter-langchain";
const agent = createLangchainAgent({
  model,
  checkpointer,
  middleware: [clientTools()],
});
Purpose:
- Lets the frontend client define tools and pass them to the agent dynamically at call time
- The agent merges client-side tools with server-side tools, as sketched below
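For example, server-side tools and the clientTools middleware can be used together in the same agent; the serverTool below is a placeholder for any tool created with LangChain's tool() helper (see "Agent with server-side tools" for a full definition):

const agent = createLangchainAgent({
  model,
  checkpointer,
  tools: [serverTool],         // tools defined on the server
  middleware: [clientTools()], // plus tools injected by the client per request
});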
Advanced usage
Agent with server-side tools
import { createAgent as createLangchainAgent } from "langchain";
import { MemorySaver } from "@langchain/langgraph";
import { ChatOpenAI } from "@langchain/openai";
import { tool } from "@langchain/core/tools";
import { z } from "zod";
const checkpointer = new MemorySaver();
// Define a server-side tool
const weatherTool = tool(
  async ({ city }) => {
    // In a real project, call an actual weather API here
    return JSON.stringify({
      city,
      temperature: "25°C",
      weather: "sunny",
    });
  },
  {
    name: "get_weather",
    description: "Get the weather for a given city",
    schema: z.object({
      city: z.string().describe("City name"),
    }),
  }
);

export function createAgent() {
  const model = new ChatOpenAI({
    model: process.env.OPENAI_MODEL!,
    apiKey: process.env.OPENAI_API_KEY!,
  });

  return createLangchainAgent({
    model,
    checkpointer,
    tools: [weatherTool],
  });
}
Multi-model support
CloudBase ships with several built-in LLMs, all accessed through a unified HTTP endpoint:
import { ChatOpenAI } from "@langchain/openai";
const envId = process.env.TCB_ENV_ID;
const apiKey = process.env.TCB_API_KEY;
// Tencent Hunyuan
const hunyuanModel = new ChatOpenAI({
  // Options: hunyuan-turbos-latest, hunyuan-t1-latest, hunyuan-2.0-thinking-20251109, hunyuan-2.0-instruct-20251111
  model: "hunyuan-turbos-latest",
  apiKey,
  configuration: {
    baseURL: `https://${envId}.api.tcloudbasegateway.com/v1/ai/hunyuan/v1`,
  },
});

// DeepSeek
const deepseekModel = new ChatOpenAI({
  // Options: deepseek-r1-0528, deepseek-v3-0324, deepseek-v3.2
  model: "deepseek-r1-0528",
  apiKey,
  configuration: {
    baseURL: `https://${envId}.api.tcloudbasegateway.com/v1/ai/deepseek/v1`,
  },
});
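If you want to switch providers through configuration rather than code, one option is to derive the baseURL from an environment variable; the TCB_AI_PROVIDER name below is an assumption for illustration, not a built-in convention:

// Hypothetical: choose the provider via an environment variable
const provider = process.env.TCB_AI_PROVIDER === "deepseek" ? "deepseek" : "hunyuan";

const model = new ChatOpenAI({
  model: process.env.TCB_AI_MODEL || "hunyuan-turbos-latest",
  apiKey,
  configuration: {
    baseURL: `https://${envId}.api.tcloudbasegateway.com/v1/ai/${provider}/v1`,
  },
});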
If you need to use an external model API, you can configure it yourself:
import { ChatOpenAI } from "@langchain/openai";
import { ChatAnthropic } from "@langchain/anthropic";
// OpenAI (bring your own API key)
const openaiModel = new ChatOpenAI({
  model: "gpt-4",
  apiKey: process.env.OPENAI_API_KEY,
});

// Anthropic (bring your own API key)
const anthropicModel = new ChatAnthropic({
  model: "claude-3-opus-20240229",
  apiKey: process.env.ANTHROPIC_API_KEY,
});
Streaming output
LangchainAgent supports streaming output by default, with no extra configuration required. The client receives streamed TEXT_MESSAGE_CONTENT events.
Deployment
Cloud function deployment
See HTTP cloud function deployment
Cloud hosting deployment
See cloud hosting deployment
Best practices
1. Use environment variables
const model = new ChatOpenAI({
  model: process.env.TCB_AI_MODEL || "hunyuan-turbos-latest",
  apiKey: process.env.TCB_API_KEY,
  configuration: {
    baseURL: `https://${process.env.TCB_ENV_ID}.api.tcloudbasegateway.com/v1/ai/hunyuan/v1`,
  },
});
2. Set temperature appropriately
- For deterministic output (e.g., tool calling): temperature: 0
- For creative output (e.g., writing): temperature: 0.7-1.0
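For example, a tool-calling agent can pin the temperature to 0; temperature is a standard ChatOpenAI constructor option:

const model = new ChatOpenAI({
  model: process.env.TCB_AI_MODEL || "hunyuan-turbos-latest",
  apiKey: process.env.TCB_API_KEY,
  temperature: 0, // deterministic output for tool calling
  configuration: {
    baseURL: `https://${process.env.TCB_ENV_ID}.api.tcloudbasegateway.com/v1/ai/hunyuan/v1`,
  },
});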
3. Timeout control
const model = new ChatOpenAI({
model: "gpt-4",
timeout: 30000, // 30 秒超时
});