
Developing with LangGraph

LangGraph is a graph-structured workflow framework from the LangChain team that provides fine-grained control over execution flow. CloudBase supports both the TypeScript and Python versions of the LangGraph adapter.

Prerequisites

TypeScript Version

Install Dependencies

npm install @cloudbase/agent-adapter-langgraph @langchain/langgraph @langchain/openai

Quick Start

import { LanggraphAgent, createAgentServer } from "@cloudbase/agent-adapter-langgraph";
import { StateGraph, Annotation, END } from "@langchain/langgraph";
import { ChatOpenAI } from "@langchain/openai";

// Define state
const StateAnnotation = Annotation.Root({
  messages: Annotation<string[]>({
    reducer: (x, y) => x.concat(y),
    default: () => [],
  }),
  response: Annotation<string>(),
});

// Use CloudBase's built-in LLM endpoint
const model = new ChatOpenAI({
  model: process.env.TCB_AI_MODEL || "hunyuan-turbos-latest",
  apiKey: process.env.TCB_API_KEY,
  configuration: {
    baseURL: `https://${process.env.TCB_ENV_ID}.api.tcloudbasegateway.com/v1/ai/hunyuan/v1`,
  },
});

// Define node
async function chatNode(state: typeof StateAnnotation.State) {
  const lastMessage = state.messages[state.messages.length - 1];
  const response = await model.invoke(lastMessage);
  return { response: response.content };
}

// Build graph
const graph = new StateGraph(StateAnnotation)
  .addNode("chat", chatNode)
  .addEdge("__start__", "chat")
  .addEdge("chat", END);

// Compile graph
const compiledGraph = graph.compile();

// Create Agent
const agent = new LanggraphAgent({
  name: "ChatBot",
  description: "A conversational assistant based on LangGraph",
  graph: compiledGraph,
});

// Export
module.exports = createAgentServer(agent);

Agent with Tools

import { LanggraphAgent, createAgentServer, ClientStateAnnotation } from "@cloudbase/agent-adapter-langgraph";
import { StateGraph, Annotation, END } from "@langchain/langgraph";
import { ChatOpenAI } from "@langchain/openai";
import { ToolNode } from "@langchain/langgraph/prebuilt";
import { tool } from "@langchain/core/tools";
import { z } from "zod";

// Define tools
const weatherTool = tool(
  async ({ city }) => {
    return JSON.stringify({
      city,
      temperature: "25°C",
      weather: "Sunny",
    });
  },
  {
    name: "get_weather",
    description: "Get weather information for a specified city",
    schema: z.object({
      city: z.string().describe("City name"),
    }),
  }
);

const calculatorTool = tool(
  async ({ expression }) => {
    try {
      // Note: In real projects, use a safe expression evaluation library
      const result = eval(expression);
      return String(result);
    } catch {
      return "Calculation error";
    }
  },
  {
    name: "calculator",
    description: "Calculate mathematical expressions",
    schema: z.object({
      expression: z.string().describe("Mathematical expression"),
    }),
  }
);

const tools = [weatherTool, calculatorTool];

// Define state
const StateAnnotation = Annotation.Root({
  ...ClientStateAnnotation.spec,
  messages: Annotation<any[]>({
    reducer: (x, y) => x.concat(y),
    default: () => [],
  }),
});

// Use CloudBase's built-in LLM endpoint
const model = new ChatOpenAI({
  model: process.env.TCB_AI_MODEL || "hunyuan-turbos-latest",
  apiKey: process.env.TCB_API_KEY,
  configuration: {
    baseURL: `https://${process.env.TCB_ENV_ID}.api.tcloudbasegateway.com/v1/ai/hunyuan/v1`,
  },
}).bindTools(tools);

// Define node
async function agentNode(state: typeof StateAnnotation.State) {
  const response = await model.invoke(state.messages);
  return { messages: [response] };
}

// Routing function
function shouldContinue(state: typeof StateAnnotation.State) {
  const lastMessage = state.messages[state.messages.length - 1];
  if ((lastMessage.tool_calls?.length ?? 0) > 0) {
    return "tools";
  }
  return END;
}

// Build graph
const graph = new StateGraph(StateAnnotation)
  .addNode("agent", agentNode)
  .addNode("tools", new ToolNode(tools))
  .addEdge("__start__", "agent")
  .addConditionalEdges("agent", shouldContinue)
  .addEdge("tools", "agent");

const compiledGraph = graph.compile();

// Create Agent
const agent = new LanggraphAgent({
  name: "ToolAgent",
  description: "An intelligent assistant with tool calling capabilities",
  graph: compiledGraph,
});

module.exports = createAgentServer(agent);
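The `calculator` tool above uses `eval` for brevity; as its comment notes, real projects should avoid that. One possible sketch of a guarded alternative (the `safeCalculate` helper is illustrative, not part of any SDK) whitelists arithmetic characters before evaluating:

```typescript
// A minimal guarded evaluator: reject anything beyond digits,
// whitespace, and basic arithmetic characters before evaluating.
function safeCalculate(expression: string): string {
  if (!/^[\d\s+\-*\/().%]+$/.test(expression)) {
    return "Calculation error";
  }
  try {
    // The Function constructor still executes code, but the character
    // whitelist above restricts input to arithmetic expressions.
    const result = new Function(`"use strict"; return (${expression});`)();
    return String(result);
  } catch {
    return "Calculation error";
  }
}
```

A dedicated expression-parsing library is still the safer choice for production; this sketch only shows the shape of the guard.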

Multi-Step Workflow

import { LanggraphAgent, createAgentServer } from "@cloudbase/agent-adapter-langgraph";
import { StateGraph, Annotation, END } from "@langchain/langgraph";
import { ChatOpenAI } from "@langchain/openai";

// Define state
const StateAnnotation = Annotation.Root({
  input: Annotation<string>(),
  plan: Annotation<string[]>({
    default: () => [],
  }),
  currentStep: Annotation<number>({
    default: () => 0,
  }),
  results: Annotation<string[]>({
    reducer: (x, y) => x.concat(y),
    default: () => [],
  }),
  finalOutput: Annotation<string>(),
});

const model = new ChatOpenAI({
  model: process.env.TCB_AI_MODEL || "hunyuan-turbos-latest",
  apiKey: process.env.TCB_API_KEY,
  configuration: {
    baseURL: `https://${process.env.TCB_ENV_ID}.api.tcloudbasegateway.com/v1/ai/hunyuan/v1`,
  },
});

// Planning node
async function planNode(state: typeof StateAnnotation.State) {
  const response = await model.invoke(
    `Please break down the following task into specific steps: ${state.input}\nReturn the steps as a JSON array.`
  );
  // Note: parsing may fail if the model wraps the JSON in extra text; harden this in real projects
  const plan = JSON.parse(response.content as string);
  return { plan };
}

// Execution node
async function executeNode(state: typeof StateAnnotation.State) {
  const currentStep = state.plan[state.currentStep];
  const response = await model.invoke(`Please execute the following step: ${currentStep}`);
  return {
    results: [response.content as string],
    currentStep: state.currentStep + 1,
  };
}

// Summary node
async function summarizeNode(state: typeof StateAnnotation.State) {
  const response = await model.invoke(
    `Please summarize the following execution results:\n${state.results.join("\n")}`
  );
  return { finalOutput: response.content as string };
}

// Routing function
function shouldContinueExecution(state: typeof StateAnnotation.State) {
  if (state.currentStep < state.plan.length) {
    return "execute";
  }
  return "summarize";
}

// Build graph
const graph = new StateGraph(StateAnnotation)
  .addNode("plan", planNode)
  .addNode("execute", executeNode)
  .addNode("summarize", summarizeNode)
  .addEdge("__start__", "plan")
  .addEdge("plan", "execute")
  .addConditionalEdges("execute", shouldContinueExecution)
  .addEdge("summarize", END);

const compiledGraph = graph.compile();

const agent = new LanggraphAgent({
  name: "WorkflowAgent",
  description: "Multi-step workflow Agent",
  graph: compiledGraph,
});

module.exports = createAgentServer(agent);
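The conditional edge is what turns `execute` into a loop: the graph keeps routing back to it until every plan step has run, then falls through to `summarize`. The control flow can be sketched in plain functions with the LLM calls stubbed out (all names here are illustrative, not part of the adapter):

```typescript
// Stub state and routing mirroring the workflow above, to show how the
// conditional edge drives the execute loop until the plan is exhausted.
type WorkflowState = { plan: string[]; currentStep: number; results: string[] };

function execute(state: WorkflowState): WorkflowState {
  const step = state.plan[state.currentStep];
  return {
    ...state,
    results: [...state.results, `done: ${step}`], // stand-in for the model's answer
    currentStep: state.currentStep + 1,
  };
}

function shouldContinueExecution(state: WorkflowState): "execute" | "summarize" {
  return state.currentStep < state.plan.length ? "execute" : "summarize";
}

let state: WorkflowState = { plan: ["research", "draft", "review"], currentStep: 0, results: [] };
while (shouldContinueExecution(state) === "execute") {
  state = execute(state);
}
// state.results now holds one entry per plan step
```

Because `currentStep` advances on every pass, the routing function is guaranteed to return "summarize" after `plan.length` iterations.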

Python Version

Install Dependencies

pip install cloudbase-agent-langgraph cloudbase-agent-server langgraph langchain-openai

Quick Start

from cloudbase_agent.langgraph import LangGraphAgent
from cloudbase_agent.server import create_agent_server
from langgraph.graph import StateGraph, END
from langchain_openai import ChatOpenAI
from typing import TypedDict, Annotated
import operator
import os

# Define state
class AgentState(TypedDict):
    messages: Annotated[list, operator.add]
    response: str

# Use CloudBase's built-in LLM endpoint
model = ChatOpenAI(
    model=os.environ.get("TCB_AI_MODEL", "hunyuan-turbos-latest"),
    api_key=os.environ.get("TCB_API_KEY"),
    base_url=f"https://{os.environ.get('TCB_ENV_ID')}.api.tcloudbasegateway.com/v1/ai/hunyuan/v1",
)

# Define node
def chat_node(state: AgentState) -> dict:
    last_message = state["messages"][-1]
    response = model.invoke(last_message)
    return {"response": response.content}

# Build graph
graph = StateGraph(AgentState)
graph.add_node("chat", chat_node)
graph.set_entry_point("chat")
graph.add_edge("chat", END)

compiled_graph = graph.compile()

# Create Agent
agent = LangGraphAgent(
    name="ChatBot",
    description="A conversational assistant based on LangGraph",
    graph=compiled_graph,
)

# Create service
app = create_agent_server(agent)

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=3000)

Agent with Tools

from cloudbase_agent.langgraph import LangGraphAgent
from cloudbase_agent.server import create_agent_server
from langgraph.graph import StateGraph, END
from langgraph.prebuilt import ToolNode
from langchain_openai import ChatOpenAI
from langchain_core.tools import tool
from typing import TypedDict, Annotated
import operator
import os

# Define tools
@tool
def get_weather(city: str) -> str:
    """Get weather information for a specified city"""
    return f"{city}: 25°C, Sunny"

@tool
def calculator(expression: str) -> str:
    """Calculate mathematical expressions"""
    try:
        # Note: in real projects, use a safe expression evaluation library
        result = eval(expression)
        return str(result)
    except Exception:
        return "Calculation error"

tools = [get_weather, calculator]

# Define state
class AgentState(TypedDict):
    messages: Annotated[list, operator.add]

# Use CloudBase's built-in LLM endpoint
model = ChatOpenAI(
    model=os.environ.get("TCB_AI_MODEL", "hunyuan-turbos-latest"),
    api_key=os.environ.get("TCB_API_KEY"),
    base_url=f"https://{os.environ.get('TCB_ENV_ID')}.api.tcloudbasegateway.com/v1/ai/hunyuan/v1",
).bind_tools(tools)

# Define node
def agent_node(state: AgentState) -> dict:
    response = model.invoke(state["messages"])
    return {"messages": [response]}

# Routing function
def should_continue(state: AgentState) -> str:
    last_message = state["messages"][-1]
    if hasattr(last_message, "tool_calls") and last_message.tool_calls:
        return "tools"
    return END

# Build graph
graph = StateGraph(AgentState)
graph.add_node("agent", agent_node)
graph.add_node("tools", ToolNode(tools))
graph.set_entry_point("agent")
graph.add_conditional_edges("agent", should_continue)
graph.add_edge("tools", "agent")

compiled_graph = graph.compile()

# Create Agent
agent = LangGraphAgent(
    name="ToolAgent",
    description="An intelligent assistant with tool calling capabilities",
    graph=compiled_graph,
)

app = create_agent_server(agent)

Core Concepts

State

State is the core of LangGraph, defining the data structure passed through the workflow:

// TypeScript
const StateAnnotation = Annotation.Root({
  messages: Annotation<any[]>({
    reducer: (x, y) => x.concat(y), // Define how to merge state
    default: () => [], // Default value
  }),
});

# Python
class AgentState(TypedDict):
    messages: Annotated[list, operator.add]
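In plain-function terms, LangGraph applies the channel's reducer to the current value and each node's returned update, and stores the result. A small sketch of that merge, assuming a string-array channel like the one above:

```typescript
// What a reducer does: merge a node's returned update into the
// existing channel value. LangGraph calls this for every update.
function mergeMessages(current: string[], update: string[]): string[] {
  return current.concat(update);
}

let messages: string[] = []; // starts at the annotation's default
messages = mergeMessages(messages, ["hello"]); // node A returns { messages: ["hello"] }
messages = mergeMessages(messages, ["world"]); // node B returns { messages: ["world"] }
// messages is now ["hello", "world"]
```

Channels without a reducer (like `response` above) are simply overwritten by the latest update instead of merged.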

Node

Nodes are processing units in the workflow:

// TypeScript
async function myNode(state: StateType) {
  // Processing logic
  return { key: newValue }; // Return state update
}

graph.addNode("myNode", myNode);

Edge

Edges define connections between nodes:

// Regular edge: fixed flow
graph.addEdge("nodeA", "nodeB");

// Conditional edge: dynamic routing
graph.addConditionalEdges("nodeA", routingFunction);

Client State

Use ClientStateAnnotation to support state synchronization with clients:

import { ClientStateAnnotation, ClientState } from "@cloudbase/agent-adapter-langgraph";

const StateAnnotation = Annotation.Root({
  ...ClientStateAnnotation.spec,
  // Other state fields
});

Best Practices

1. State Design

  • Keep state concise, storing only necessary information
  • Use appropriate reducer functions for state merging
  • Consider state serializability

2. Node Design

  • Each node should do only one thing
  • Nodes should be pure functions (same input produces same output)
  • Avoid side effects in nodes

3. Error Handling

async function safeNode(state: StateType) {
  try {
    // Processing logic
    return { result: value };
  } catch (error) {
    return { error: error instanceof Error ? error.message : String(error) };
  }
}
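A node that returns an `error` field pairs naturally with a conditional edge that routes failures to a recovery path instead of crashing the graph. A minimal sketch of such a routing function (the state shape and node names "handleError" and "next" are hypothetical):

```typescript
// State where a node may have recorded either a result or an error
type RecoverableState = { result?: string; error?: string };

// Route to a recovery node when the previous node recorded an error,
// otherwise continue down the normal path.
function routeAfterNode(state: RecoverableState): "handleError" | "next" {
  return state.error ? "handleError" : "next";
}
```

Registered via something like `graph.addConditionalEdges("safeNode", routeAfterNode)`, this keeps error handling inside the graph's normal control flow.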

4. Debugging

import { MemorySaver } from "@langchain/langgraph";

// Add checkpointer
const compiledGraph = graph.compile({
  checkpointer: new MemorySaver(),
});

// Get execution history (an async iterator of state snapshots)
for await (const snapshot of compiledGraph.getStateHistory({
  configurable: { thread_id: threadId },
})) {
  console.log(snapshot.values);
}

Deployment

Cloud Function Deployment

Refer to HTTP Cloud Function Deployment

CloudRun Deployment

Refer to CloudRun Deployment

Observability

CloudBase provides built-in observability capabilities for LangGraph applications, configurable via environment variables or code.

Environment Variable Method

# Enable console output
AUTO_TRACES_STDOUT=true

No code changes are needed; the SDK automatically reads the environment variables and enables tracing.

Code Configuration Method

# Python
from cloudbase_agent.server import AgentServiceApp
from cloudbase_agent.observability.server import ConsoleTraceConfig

app = AgentServiceApp(observability=ConsoleTraceConfig())

// TypeScript
import { createExpressRoutes } from "@cloudbase/agent-server";
import { ExporterType } from "@cloudbase/agent-observability/server";

createExpressRoutes({
  createAgent,
  express: app,
  observability: { type: ExporterType.Console },
});

For more configuration options (OTLP export, Serverless considerations, etc.), refer to the Agent Observability Guide.