Trace Vercel AI SDK agents in TraceRoot. Unlike LangChain, OpenAI, or Anthropic, no `instrumentModules` configuration is needed: the Vercel AI SDK emits OpenTelemetry spans natively when `experimental_telemetry` is enabled, and TraceRoot enriches them via the OpenInference span processor automatically.
## Setup

No `instrumentModules` entry is needed for the Vercel AI SDK; TraceRoot picks up its native OpenTelemetry spans automatically once telemetry is enabled.
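As a rough sketch of what that means for configuration (the key names below are illustrative assumptions, not taken from the TraceRoot SDK reference; check TraceRoot's own setup guide for the authoritative fields), a `traceroot.config.ts` simply omits any instrumentation list:

```typescript
// traceroot.config.ts — illustrative sketch; key names are assumptions,
// not confirmed against the TraceRoot SDK reference.
const config = {
  service_name: 'my-ai-agent', // how this service appears in TraceRoot
  environment: 'development',
  // Note: no `instrumentModules` entry. The Vercel AI SDK emits
  // OpenTelemetry spans natively, so nothing needs to be patched.
};

export default config;
```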
## Usage
## What Gets Captured
| Attribute | Description |
|---|---|
| Model | The model passed to `generateText` / `streamText` / `generateObject` |
| Messages | Input prompt and message history |
| Response | Generated text output |
| Tool calls | Each tool invocation with its input and output |
| Tokens | Input and output token counts |
| Cost | Calculated from token usage and model pricing |
| Latency | Request duration per span |
| Multi-step iterations | Each `maxSteps` iteration captured as a child span |
## Run the example

Clone the repo and run a complete agent end-to-end with the TypeScript example.