This integration supports ag2 (v0.11.5+), the community-maintained continuation of the classic AutoGen framework.
## Setup
To capture the complete picture, including both the agent orchestration and the underlying token costs, we strongly recommend initializing both the AutoGen integration and your specific LLM provider integration (e.g., Google GenAI, OpenAI, Anthropic).
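The initialization pattern can be sketched as follows. Everything here except the `Integration.AUTOGEN` and `Integration.GOOGLE_GENAI` names is a stand-in (the real SDK ships its own module, enum, and init call), so consult the SDK reference for the exact signature:

```python
from enum import Enum

class Integration(Enum):
    # Enum member names taken from this page; the real SDK defines its own enum.
    AUTOGEN = "autogen"
    GOOGLE_GENAI = "google_genai"

def init(integrations):
    """Stand-in for the SDK's init call: records which frameworks are instrumented."""
    return sorted(i.value for i in integrations)

# Instrument both the agent framework and the LLM provider so that
# token usage and cost can be attributed to each chat session.
enabled = init([Integration.AUTOGEN, Integration.GOOGLE_GENAI])
print(enabled)  # ['autogen', 'google_genai']
```

The point of the pattern is that the agent-framework integration alone only sees orchestration; the provider integration is what surfaces raw completions and usage.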
## Usage

Once initialized, the standard `initiate_chat` loop and internal agent reasoning steps are captured automatically:
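As a rough mental model (a plain-Python sketch of the tracing structure, not the ag2 or SDK API), each `initiate_chat` session opens a root span, and every agent turn inside the conversation loop is recorded as a child of that root:

```python
import contextlib

spans = []  # flat, ordered record of every span opened

@contextlib.contextmanager
def span(name, parent=None):
    # Stand-in tracer: a real SDK would also time the block and export it.
    spans.append({"name": name, "parent": parent})
    yield name

# One initiate_chat session becomes the root span; each agent turn
# inside the loop is captured as a child span.
with span("initiate_chat") as root:
    for turn, agent in enumerate(["user_proxy", "assistant", "user_proxy"]):
        with span(f"{agent}.turn_{turn}", parent=root):
            pass  # message exchange and any internal reasoning happen here

print([s["name"] for s in spans])
# ['initiate_chat', 'user_proxy.turn_0', 'assistant.turn_1', 'user_proxy.turn_2']
```

No manual instrumentation of the loop is required; this sketch only shows the shape of the trace you should expect to see.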
## Usage with Tools
Function registrations and local tool executions are traced automatically as nested child spans within the executing agent's workflow.

## What Gets Captured
| Attribute | Description |
|---|---|
| Conversation Loop | The overarching `initiate_chat` session |
| Agent Steps | Individual spans for each `AssistantAgent` or `UserProxyAgent` turn |
| Messages | Full chat history, input messages, and agent replies |
| Tool calls | Function names, input arguments, and execution outputs |
| LLM calls* | Raw completion requests to the provider |
| Tokens & Cost* | Aggregated usage and pricing for the chat session |
\* Captured only when the corresponding LLM provider integration (e.g., `Integration.GOOGLE_GENAI`) is initialized alongside `Integration.AUTOGEN`.
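Putting the table together, the nesting can be visualized with the same kind of stand-in tracer (span names such as `tool.get_weather` are purely illustrative): the raw LLM call and the executed tool both land as children of the agent turn, which itself sits under the conversation root.

```python
spans = []  # stand-in trace store; a real SDK exports these to its backend

def start_span(name, parent=None):
    spans.append({"name": name, "parent": parent})
    return name

# Conversation root -> agent turn -> LLM call and tool execution as children.
root = start_span("initiate_chat")                 # conversation loop
turn = start_span("assistant.turn", parent=root)   # one agent step
start_span("llm.completion", parent=turn)          # raw provider request (needs the provider integration)
start_span("tool.get_weather", parent=turn)        # registered function executed by the agent

children_of_turn = [s["name"] for s in spans if s["parent"] == turn]
print(children_of_turn)  # ['llm.completion', 'tool.get_weather']
```

This nesting is why the starred rows depend on the provider integration: without it, the `llm.completion` child (and the token/cost aggregation derived from it) never appears under the turn.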