# LangGraph Python Adapter
Make LangGraph workflows compatible with the AG-UI Protocol.
## What is AG-UI?

AG-UI is an open, lightweight, event-based protocol that standardizes how AI agents interact with user interfaces. It enables:

- Real-time streaming conversations
- Bidirectional state synchronization
- Frontend tool integration (client tools)
- Human-in-the-loop workflows
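For a flavor of the protocol, here is an illustrative slice of the event stream an agent emits during one run. The event type names follow the AG-UI spec; the payload fields shown here are simplified, not exhaustive:

```python
# Illustrative only: a few of the event types AG-UI streams to the UI.
# Payloads are abbreviated; see the AG-UI spec for the full field set.
events = [
    {"type": "RUN_STARTED", "threadId": "t1", "runId": "r1"},
    {"type": "TEXT_MESSAGE_START", "messageId": "m1", "role": "assistant"},
    {"type": "TEXT_MESSAGE_CONTENT", "messageId": "m1", "delta": "Hello"},
    {"type": "TEXT_MESSAGE_END", "messageId": "m1"},
    {"type": "RUN_FINISHED", "threadId": "t1", "runId": "r1"},
]
print([e["type"] for e in events])
```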
## What does this package solve?

- AG-UI support for LangGraph workflows: adapts a compiled LangGraph `StateGraph` into an AG-UI compatible agent
- Client state management: provides utilities for receiving tools and messages from the frontend
## Core Concepts

| Export | Description |
|---|---|
| `LangGraphAgent` | Wraps a compiled LangGraph workflow as an AG-UI compatible agent |
## Works with

| Package | Purpose |
|---|---|
| `cloudbase-agent-server` | Deploy agents as AG-UI compatible HTTP services |
| `langgraph` | LangGraph workflow framework |
| `langchain-openai` | OpenAI-compatible model integration |
## Installation

```shell
pip install cloudbase-agent-langgraph cloudbase-agent-server langgraph langchain-openai
```
## Quick Start

### 1. Create a LangGraph Workflow

```python
# agent.py
import os
from typing import Any, List

from langchain_core.messages import AIMessage, SystemMessage
from langchain_core.runnables import RunnableConfig
from langchain_openai import ChatOpenAI
from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import END, START, MessagesState, StateGraph

from cloudbase_agent.langgraph import LangGraphAgent


# Define state
class State(MessagesState):
    """Chat agent state containing messages and tools."""

    tools: List[Any]


# Define chat node
def chat_node(state: State, config: RunnableConfig = None) -> dict:
    """Generate an AI response using an OpenAI-compatible model."""
    model = ChatOpenAI(
        model=os.getenv("OPENAI_MODEL", "gpt-4o-mini"),
        api_key=os.getenv("OPENAI_API_KEY"),
        base_url=os.getenv("OPENAI_BASE_URL"),
    )

    # Bind client tools if the frontend provided any
    tools = state.get("tools", [])
    model_with_tools = model.bind_tools(tools) if tools else model

    system_message = SystemMessage(content="You are a helpful AI assistant.")
    messages = [system_message, *state["messages"]]

    try:
        response = model_with_tools.invoke(messages, config)
        return {"messages": [response]}
    except Exception as e:
        return {"messages": [AIMessage(content=f"Error: {e}")]}


# Build workflow
def build_workflow():
    graph = StateGraph(State)

    # Add node
    graph.add_node("chat_node", chat_node)

    # Define edges
    graph.add_edge(START, "chat_node")
    graph.add_edge("chat_node", END)

    # Compile with in-memory checkpointing
    return graph.compile(checkpointer=MemorySaver())


# Export a create_agent factory
def create_agent():
    workflow = build_workflow()
    agent = LangGraphAgent(
        name="ChatBot",
        description="A helpful conversational assistant",
        graph=workflow,
    )
    return {"agent": agent}
```
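One detail worth knowing: because `build_workflow()` compiles with a `MemorySaver` checkpointer, any direct invocation of the graph needs a `thread_id` in the run config — that key scopes which conversation's persisted history is loaded. A minimal sketch (the thread ID value is arbitrary):

```python
# Checkpointer-backed graphs require a thread_id in the run config;
# each thread_id gets its own persisted message history.
config = {"configurable": {"thread_id": "user-123"}}

# Then, with the compiled graph from build_workflow():
# workflow.invoke({"messages": [...]}, config)

print(config["configurable"]["thread_id"])
```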
### 2. Deploy as an HTTP Service

```python
# server.py
from cloudbase_agent.server import AgentServiceApp

from agent import create_agent

# One-line deployment
AgentServiceApp().run(create_agent, port=9000)
```
### 3. Configure Environment Variables

Create a `.env` file:

```shell
OPENAI_API_KEY=your-api-key
OPENAI_BASE_URL=https://api.deepseek.com/v1
OPENAI_MODEL=deepseek-chat
```
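Note that `agent.py` falls back to `gpt-4o-mini` when `OPENAI_MODEL` is unset, so for the default OpenAI endpoint only `OPENAI_API_KEY` is strictly required. The fallback behavior:

```python
import os

# With OPENAI_MODEL unset, chat_node's os.getenv fallback kicks in:
os.environ.pop("OPENAI_MODEL", None)
model_name = os.getenv("OPENAI_MODEL", "gpt-4o-mini")
print(model_name)  # → gpt-4o-mini
```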
### 4. Start the Service

```shell
python server.py
```

For complete project configuration (dependencies, etc.), see the example project.
## API Reference

### LangGraphAgent

Wraps a compiled LangGraph workflow as an AG-UI compatible agent.

```python
from cloudbase_agent.langgraph import LangGraphAgent

agent = LangGraphAgent(
    name="ChatBot",
    description="A helpful assistant",
    graph=compiled_graph,  # return value of StateGraph.compile()
)
```
**Constructor Parameters:**

| Parameter | Type | Description |
|---|---|---|
| `graph` | `CompiledStateGraph` | Compiled LangGraph workflow |
| `name` | `str` | Human-readable agent name (default: `""`) |
| `description` | `str` | Detailed agent description (default: `""`) |
| `use_callbacks` | `bool` | Enable callback processing (default: `False`) |
| `fix_event_ids` | `bool` | Enable automatic event ID fixing (default: `True`) |
**Example with Callbacks:**

```python
from cloudbase_agent.langgraph import LangGraphAgent

# Create agent with callback support
agent = LangGraphAgent(
    graph=compiled_graph,
    name="ChatBot",
    description="A helpful assistant",
    use_callbacks=True,
)


# Add a callback for logging
class ConsoleLogger:
    async def on_text_message_content(self, event, buffer):
        print(f"[AI] {buffer}")


agent.add_callback(ConsoleLogger())
```
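Since the callback hooks are coroutines, they can also be exercised in isolation — handy for unit-testing a callback without running the adapter. A standalone sketch (the real `event` object comes from the adapter; `None` stands in for it here):

```python
import asyncio

log = []


class MemoryLogger:
    """Collects streamed text chunks instead of printing them."""

    async def on_text_message_content(self, event, buffer):
        # `buffer` carries the streamed assistant text
        log.append(buffer)


# Drive the hook directly, outside the adapter, with a stand-in event
asyncio.run(MemoryLogger().on_text_message_content(None, "Hello"))
print(log)  # → ['Hello']
```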
### State Definition

For AG-UI compatibility, your state should include a `tools` field to receive client tools:

```python
from typing import Any, List

from langgraph.graph import MessagesState, StateGraph


class State(MessagesState):
    """State with client tools support."""

    tools: List[Any]


# Use in a workflow
graph = StateGraph(State)
```
**State Fields:**

| Field | Type | Description |
|---|---|---|
| `messages` | `List[BaseMessage]` | Message history (from `MessagesState`) |
| `tools` | `List[Any]` | Client tools passed from the frontend |
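What lands in `tools` depends on the frontend, but client tools are typically JSON tool definitions that can be handed straight to `bind_tools` (which accepts OpenAI-style dicts, among other formats). A hypothetical example — the tool name and schema here are illustrative, not part of this package:

```python
# Hypothetical shape of one client tool as a frontend might send it
# (OpenAI-style function schema; bind_tools accepts dicts like this).
client_tool = {
    "type": "function",
    "function": {
        "name": "change_background",
        "description": "Change the page background color.",
        "parameters": {
            "type": "object",
            "properties": {"color": {"type": "string"}},
            "required": ["color"],
        },
    },
}

state = {"messages": [], "tools": [client_tool]}
# In chat_node: model.bind_tools(state["tools"]) when the list is non-empty
print(len(state["tools"]))  # → 1
```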
## Advanced Usage

### With Resource Cleanup

```python
def create_agent():
    # Initialize resources
    db = connect_database()

    workflow = build_workflow()
    agent = LangGraphAgent(
        name="ChatBot",
        description="Agent with database access",
        graph=workflow,
    )

    # Define a cleanup function
    def cleanup():
        db.close()
        print("Resources cleaned up")

    return {"agent": agent, "cleanup": cleanup}
```
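The `cleanup` entry in the returned dict is presumably invoked by the server on shutdown (the exact lifecycle belongs to the server package). The contract itself can be exercised without a real database — here a stub resource stands in for `connect_database()`:

```python
class FakeDB:
    """Stand-in for a real database connection."""

    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True


db = FakeDB()


def cleanup():
    db.close()


# Simulate what happens on shutdown
cleanup()
print(db.closed)  # → True
```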
### Multiple Agents

```python
from fastapi import FastAPI

from cloudbase_agent.server import RunAgentInput, create_send_message_adapter

app = FastAPI()


@app.post("/chat/send-message")
async def chat_endpoint(request: RunAgentInput):
    return await create_send_message_adapter(create_chat_agent, request)


@app.post("/assistant/send-message")
async def assistant_endpoint(request: RunAgentInput):
    return await create_send_message_adapter(create_assistant_agent, request)
```
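A plain FastAPI app like this is started with an ASGI server rather than `AgentServiceApp` — for example (assuming the file is saved as `server.py` and `uvicorn` is installed):

```shell
uvicorn server:app --host 0.0.0.0 --port 9000
```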