Documentation Index

Fetch the complete documentation index at: https://traceroot.ai/docs/llms.txt

Use this file to discover all available pages before exploring further.

Once you have initialized TraceRoot, LLM calls are traced automatically. To go further — tracing custom functions, identifying users, grouping sessions, or tracking cost — you can use observe and attach rich context to every trace.

Quickstart

Here’s everything you can attach in one place:
```python
import traceroot
from traceroot import Integration, observe, using_attributes
from openai import OpenAI

traceroot.initialize(integrations=[Integration.OPENAI])

client = OpenAI()

@observe(name="run_agent", type="agent", metadata={"version": "v2"})
def run_agent(query: str) -> str:
    response = client.chat.completions.create(...)  # tokens + cost tracked automatically
    return response.choices[0].message.content

with using_attributes(user_id="user-42", session_id="chat-abc", tags=["production"]):
    run_agent("What's the weather?")
    run_agent("What about tomorrow?")
```
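Conceptually, a decorator like `observe` wraps the function and records a span with the given name, type, and metadata around each call. The sketch below illustrates that general pattern only; the names and the in-memory `spans` list are hypothetical stand-ins, not TraceRoot internals.

```python
# Toy sketch of the span-recording decorator pattern.
# Illustrative only -- NOT TraceRoot's actual implementation.
import functools

spans = []  # hypothetical stand-in for a real span exporter

def observe(name=None, type="function", metadata=None):
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            # Record a span for this call, then run the wrapped function.
            spans.append({
                "name": name or fn.__name__,
                "type": type,
                "metadata": metadata or {},
            })
            return fn(*args, **kwargs)
        return wrapper
    return decorator

@observe(name="run_agent", type="agent", metadata={"version": "v2"})
def run_agent(query: str) -> str:
    return f"answered: {query}"

run_agent("What's the weather?")
# spans[0] == {"name": "run_agent", "type": "agent", "metadata": {"version": "v2"}}
```

The decorator factory shape (`observe(...)` returning a decorator) is what lets you pass `name`, `type`, and `metadata` as arguments at decoration time.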

Quick Reference

| I want to… | Attribute | Guide |
| --- | --- | --- |
| Identify which user triggered a trace | User ID | Users |
| Group traces from the same conversation | Session ID | Sessions |
| Label traces for filtering | Tags | Tags |
| Attach structured key-value context | Metadata | Metadata |
| Track token usage and LLM cost | auto via instrumentation | Cost Tracking |
| Link traces back to source code + commit | auto via `observe` | Git Context |
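Attributes set with `using_attributes` apply to everything traced inside the `with` block, including nested blocks. One plausible way such a context manager can be built is with Python's `contextvars`; the sketch below is a hedged illustration of that mechanism, not TraceRoot's actual implementation, and `current_trace_attributes` is a hypothetical helper.

```python
# Sketch of attribute scoping via a context variable.
# Illustrative only -- NOT TraceRoot's actual implementation.
from contextlib import contextmanager
from contextvars import ContextVar

_attrs: ContextVar[dict] = ContextVar("attrs", default={})

@contextmanager
def using_attributes(**attributes):
    # Merge the new attributes over whatever is already in scope,
    # then restore the previous set on exit.
    token = _attrs.set({**_attrs.get(), **attributes})
    try:
        yield
    finally:
        _attrs.reset(token)

def current_trace_attributes() -> dict:
    # Hypothetical helper: what a tracer would read when recording a span.
    return dict(_attrs.get())

with using_attributes(user_id="user-42", session_id="chat-abc"):
    with using_attributes(tags=["production"]):
        attrs = current_trace_attributes()
# attrs == {"user_id": "user-42", "session_id": "chat-abc", "tags": ["production"]}
```

Because each `with` block merges over the enclosing scope and restores it on exit, nested blocks can add attributes without clobbering the outer ones.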