Documentation Index

Fetch the complete documentation index at: https://traceroot.ai/docs/llms.txt

Use this file to discover all available pages before exploring further.

Automatically capture agent steps, tool calls, and LLM invocations within LangChain chains and LangGraph graphs.

Setup

import traceroot
from traceroot import Integration

# Enable automatic tracing for LangChain chains and LangGraph graphs
traceroot.initialize(integrations=[Integration.LANGCHAIN])
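Integrations of this kind typically work by registering a callback handler that fires on each chain, tool, and LLM event. The sketch below is purely illustrative (the `StepRecorder` class and its method names are assumptions, not TraceRoot's real internals):

```python
# Toy sketch of callback-style instrumentation (illustrative only;
# not TraceRoot's actual internals or API).
from dataclasses import dataclass, field


@dataclass
class StepRecorder:
    """Collects one record per chain/tool/LLM event."""
    events: list = field(default_factory=list)

    def on_start(self, kind: str, name: str) -> None:
        self.events.append(("start", kind, name))

    def on_end(self, kind: str, name: str) -> None:
        self.events.append(("end", kind, name))


recorder = StepRecorder()
# A traced run would emit paired start/end events for each step:
recorder.on_start("llm", "gpt-4o")
recorder.on_end("llm", "gpt-4o")
print(recorder.events)
```

Once such a handler is registered at initialization time, every subsequent chain or graph run is captured without further code changes.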

Usage with LangGraph

from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

llm = ChatOpenAI(model="gpt-4o")

# search_tool and calculator_tool are tools you define elsewhere,
# e.g. with the @tool decorator from langchain_core.tools
tools = [search_tool, calculator_tool]
agent = create_react_agent(llm, tools=tools)

# All agent steps are automatically traced
result = agent.invoke({
    "messages": [{"role": "user", "content": "What is 2 + 2?"}]
})

Usage with LangChain

from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate

llm = ChatOpenAI(model="gpt-4o")
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    ("user", "{input}"),
])

chain = prompt | llm

# Chain execution is automatically traced
result = chain.invoke({"input": "Hello!"})  # returns an AIMessage; the text is in result.content

What Gets Captured

| Attribute | Description |
| --- | --- |
| Agent steps | Each agent iteration as a span |
| Tool calls | Each tool invocation with input/output |
| LLM calls | All LLM calls within the chain/graph |
| Graph structure | Parent-child relationships between steps |
| Token usage | Aggregated across all LLM calls |
| Cost | Total cost for the full chain/graph execution |
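The parent-child span model and the aggregation behavior can be pictured with a small sketch. The `Span` class below is an assumed illustration of the concept, not TraceRoot's actual data model:

```python
# Illustrative span tree with aggregated token usage and cost
# (assumed structure; not TraceRoot's actual data model).
from dataclasses import dataclass, field


@dataclass
class Span:
    name: str
    tokens: int = 0          # tokens used by this span's own LLM call, if any
    cost: float = 0.0        # cost of this span's own LLM call, if any
    children: list = field(default_factory=list)

    def total_tokens(self) -> int:
        return self.tokens + sum(c.total_tokens() for c in self.children)

    def total_cost(self) -> float:
        return self.cost + sum(c.total_cost() for c in self.children)


# One agent iteration containing a tool call and two LLM calls:
root = Span("agent_step", children=[
    Span("llm_call", tokens=120, cost=0.0006),
    Span("tool_call"),
    Span("llm_call", tokens=80, cost=0.0004),
])

print(root.total_tokens())  # 200
print(root.total_cost())
```

Token usage and cost roll up from child spans to the root, which is how a single total is reported for the full chain or graph execution.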

Run the example

Clone the repo and run a complete agent end-to-end.

- Python: Run the Python example
- TypeScript: Run the TypeScript example