LiteLLM is a unified interface for calling 100+ LLM APIs using the OpenAI format. Braintrust automatically traces LiteLLM calls across all providers, including OpenAI, Azure, Anthropic, Cohere, Replicate, and more.
This guide covers manual instrumentation. For quicker setup, use auto-instrumentation.
## Setup
Install LiteLLM alongside the Braintrust SDK:
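For example, assuming the standard PyPI package names (`braintrust` and `litellm`):

```shell
pip install braintrust litellm
```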
## Trace with LiteLLM

Braintrust provides a patch function, `patch_litellm()`, that automatically instruments LiteLLM to capture all model interactions. Alternatively, `braintrust.auto_instrument()` patches LiteLLM for you; see Trace LLM calls for details about auto-instrumentation.
Call `patch_litellm()` before importing LiteLLM to enable automatic tracing:
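A minimal sketch of the pattern. The project name and model string are illustrative, and the exact export location of `patch_litellm` may differ by SDK version, so check the Braintrust SDK reference; the key point is the import ordering:

```python
import braintrust

# Patch BEFORE importing litellm so the completion/acompletion
# entry points are wrapped when the module loads.
braintrust.patch_litellm()

import litellm

# Send traces to a Braintrust project
# (assumes BRAINTRUST_API_KEY is set in the environment).
braintrust.init_logger(project="litellm-demo")

response = litellm.completion(
    model="gpt-4o-mini",  # any LiteLLM-supported model string
    messages=[{"role": "user", "content": "Say hello in one word."}],
)
print(response.choices[0].message.content)
```

After this call completes, the request, response, token usage, and latency appear as a span in the Braintrust project.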
The patch captures:

- Chat and text completion (`completion`/`acompletion`) calls across different providers
- Audio speech (`speech`/`aspeech`) calls, with the generated audio captured as an `Attachment`
- Audio transcription (`transcription`/`atranscription`) calls
- Image generation (`image_generation`/`aimage_generation`) calls
- Request and response data
- Token usage and costs
- Latency metrics
- Error tracking
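Because LiteLLM routes on the model string, the same patched code produces traces across providers. A sketch under the same assumptions as above (model names are illustrative, and each provider's API key must be set in the environment):

```python
import braintrust

braintrust.patch_litellm()

import litellm

braintrust.init_logger(project="litellm-demo")

# Each call is traced with request/response data, token usage, and
# latency, regardless of which provider backs the model string.
for model in ["gpt-4o-mini", "anthropic/claude-3-5-haiku-latest"]:
    resp = litellm.completion(
        model=model,
        messages=[{"role": "user", "content": "One-word greeting, please."}],
    )
    print(model, "->", resp.choices[0].message.content)
```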
## Resources
- LiteLLM documentation
- DSPy integration - Combines LiteLLM tracing with DSPy-specific callbacks
- Supported providers