

Automatically capture all Mistral AI chat completion calls, including tool/function calls and streaming responses.

Setup

import traceroot
from traceroot import Integration

traceroot.initialize(integrations=[Integration.MISTRAL])

Usage

Once initialized, all Mistral calls are captured automatically:

import os
from mistralai import Mistral

import traceroot
from traceroot import Integration

traceroot.initialize(integrations=[Integration.MISTRAL])

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

# This call is automatically traced
response = client.chat.complete(
    model="mistral-large-latest",
    messages=[
        {"role": "user", "content": "What is the capital of France?"},
    ],
)

print(response.choices[0].message.content)
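The introduction notes that streaming responses are captured as well. As a sketch (assuming the `mistralai` v1 SDK, where `client.chat.stream` yields chunked events), a streamed call needs no extra TraceRoot configuration:

```python
import os

from mistralai import Mistral

import traceroot
from traceroot import Integration

traceroot.initialize(integrations=[Integration.MISTRAL])

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

# Streamed completions are traced the same way as blocking calls;
# the integration captures the assembled response when the stream ends.
with client.chat.stream(
    model="mistral-large-latest",
    messages=[
        {"role": "user", "content": "Name three French cities."},
    ],
) as stream:
    for chunk in stream:
        delta = chunk.data.choices[0].delta.content
        if delta:
            print(delta, end="")
```

Exact event-object attributes (`chunk.data.choices[0].delta.content`) follow the v1 `mistralai` SDK; adjust for your installed version.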

What Gets Captured

| Attribute  | Description                                      |
| ---------- | ------------------------------------------------ |
| Model      | mistral-large-latest, mistral-small-latest, etc. |
| Messages   | Input messages array                             |
| Response   | Completion content                               |
| Tool calls | Function calls emitted by the model              |
| Tokens     | Input and output tokens                          |
| Cost       | Calculated from token usage and model pricing    |
| Latency    | Request duration                                 |
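The cost attribute is derived from token usage and model pricing. A minimal sketch of that calculation, with an illustrative price table (the figures below are placeholders, not TraceRoot's actual pricing data):

```python
# Illustrative USD prices per million tokens; real pricing may differ.
PRICING = {
    "mistral-large-latest": {"input": 2.00, "output": 6.00},
    "mistral-small-latest": {"input": 0.20, "output": 0.60},
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate a request's cost from token counts and a per-model price table."""
    prices = PRICING[model]
    return (
        input_tokens * prices["input"] + output_tokens * prices["output"]
    ) / 1_000_000

# 1,200 input tokens and 300 output tokens on the large model:
cost = estimate_cost("mistral-large-latest", 1200, 300)
```

The token counts themselves come from the usage block of each Mistral response, which the integration records automatically.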

Run the example

Clone the repo and run a complete agent end-to-end.

Run the Python example