OpenRouter lets you call models from many providers through a single API.

TypeScript
If you’re using OpenRouter’s agent toolkit package (@openrouter/agent), see OpenRouter Agent.

Setup
Install packages
# pnpm
pnpm add braintrust @openrouter/sdk

# npm
npm install braintrust @openrouter/sdk
Set environment variables
OPENROUTER_API_KEY=<your-openrouter-api-key>
BRAINTRUST_API_KEY=<your-braintrust-api-key>
# If you are self-hosting Braintrust, set the URL of your hosted dataplane
# BRAINTRUST_API_URL=<your-braintrust-api-url>
Auto-instrumentation
Braintrust provides automatic tracing for OpenRouter calls. This is the recommended setup for most projects.

import { initLogger } from "braintrust";
import { OpenRouter } from "@openrouter/sdk";

initLogger({
  projectName: "My Project",
  apiKey: process.env.BRAINTRUST_API_KEY,
});

const client = new OpenRouter({ apiKey: process.env.OPENROUTER_API_KEY });

const response = await client.chat.send({
  chatRequest: {
    model: "openai/gpt-5-mini",
    messages: [{ role: "user", content: "What is observability?" }],
  },
});
Run with the import hook:

node --import braintrust/hook.mjs app.js

If you’re using a bundler or Next.js Turbopack, see Trace LLM calls for plugin/loader setup.

Manual instrumentation
Trace an OpenRouter client explicitly by wrapping it with wrapOpenRouter.

import { initLogger, wrapOpenRouter } from "braintrust";
import { OpenRouter } from "@openrouter/sdk";

initLogger({
  projectName: "My Project",
  apiKey: process.env.BRAINTRUST_API_KEY,
});

const client = wrapOpenRouter(
  new OpenRouter({ apiKey: process.env.OPENROUTER_API_KEY }),
);

const response = await client.chat.send({
  chatRequest: {
    model: "openai/gpt-5-mini",
    messages: [{ role: "user", content: "What is observability?" }],
  },
});
What Braintrust traces
For the @openrouter/sdk package, Braintrust traces:
- Chat completions via chat.send(), including streaming
- Embeddings via embeddings.generate()
- Responses API via beta.responses.send(), including streaming
Python

Setup
Install packages
pip install braintrust openrouter
Set environment variables
OPENROUTER_API_KEY=<your-openrouter-api-key>
BRAINTRUST_API_KEY=<your-braintrust-api-key>
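Both keys need to be set before either client is constructed. A small fail-fast check like the following can surface a missing key early; `missing_env` is a hypothetical helper for illustration, not part of the Braintrust or OpenRouter SDKs:

```python
import os


# Hypothetical helper (not part of either SDK): report which of the required
# environment variables are unset before constructing a client.
def missing_env(*names: str) -> list[str]:
    return [n for n in names if not os.environ.get(n)]


missing = missing_env("OPENROUTER_API_KEY", "BRAINTRUST_API_KEY")
if missing:
    print("Set these environment variables first:", ", ".join(missing))
```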
Auto-instrumentation
Braintrust provides automatic tracing for OpenRouter calls. This is the recommended setup for most projects.

import os

import braintrust

braintrust.auto_instrument()
braintrust.init_logger(project="My Project")

from openrouter import OpenRouter

client = OpenRouter(api_key=os.environ["OPENROUTER_API_KEY"])

response = client.chat.send(
    model="openai/gpt-5-mini",
    messages=[{"role": "user", "content": "What is observability?"}],
)
Manual instrumentation
Trace the OpenRouter Python SDK by wrapping the client with wrap_openrouter().

import os

from braintrust import init_logger, wrap_openrouter
from openrouter import OpenRouter

init_logger(project="My Project")

client = wrap_openrouter(OpenRouter(api_key=os.environ["OPENROUTER_API_KEY"]))

response = client.beta.responses.send(
    model="openai/gpt-5-mini",
    input="Summarize tracing in one sentence.",
)

# Response shape varies; adjust indexing for your model and request.
print(response.output[1].content[0].text)
OpenAI-compatible endpoint
If your app already uses the OpenAI Python SDK with OpenRouter’s OpenAI-compatible endpoint, keep that setup and use wrap_openai().

import os

from braintrust import init_logger, wrap_openai
from openai import OpenAI

init_logger(project="My Project")

client = wrap_openai(
    OpenAI(
        base_url="https://openrouter.ai/api/v1",
        api_key=os.environ["OPENROUTER_API_KEY"],
    )
)

response = client.responses.create(
    model="openai/gpt-5-mini",
    input="Explain routing in one sentence.",
)

print(response.output_text)
What Braintrust traces
For the openrouter SDK, Braintrust traces:
- Chat completions via chat.send() and send_async(), including streaming
- Embeddings via embeddings.generate() and the async variant
- Responses API calls via beta.responses.send() and send_async(), including streaming
Resources