Braintrust integrates with Vercel in two ways: through the Vercel AI SDK for code-based tracing, and through the Vercel Marketplace for dashboard-based observability.

Setup

Choose your integration method based on your needs:
| Method | Best for | Setup |
| --- | --- | --- |
| Vercel AI SDK | Fine-grained control over tracing, selective instrumentation | Install packages + add code |
| Vercel Marketplace | Quick setup, automatic tracing of all AI calls | Configure in Vercel dashboard |

Option 1: Vercel AI SDK

Install the Braintrust SDK alongside the Vercel AI SDK. The Braintrust SDK supports Vercel AI SDK v3, v4, v5, and v6.
# pnpm
pnpm add braintrust ai
# npm
npm install braintrust ai
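
The code examples below read your API key from the `BRAINTRUST_API_KEY` environment variable. A typical local setup looks like this (the key value is a placeholder; use the key from your Braintrust settings page):

```shell
# Make the Braintrust API key available to your app
export BRAINTRUST_API_KEY=<your-api-key>
```

In a deployed Vercel project, set the same variable in the project's Environment Variables settings instead.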

Option 2: Vercel Marketplace

Install the Braintrust integration from the Vercel Marketplace. No package installation required.

Trace with Vercel Marketplace

The Vercel Marketplace integration provides automatic tracing for all AI calls in your Vercel applications with minimal setup.

Installation steps

  1. Visit the Vercel Marketplace listing and select Install
  2. Create or link your Braintrust account
  3. Select a plan (Free or Pro) and enter a project name
  4. Select Add Drain to configure trace collection

Configure log drain

In the Add Drain panel:
  1. Select Traces and Next
  2. Choose which Vercel projects to trace (All Projects or specific projects)
  3. Set the sampling rate for trace collection

Enable OpenTelemetry

In your Next.js project, create an instrumentation.ts file and call registerOTel from the @vercel/otel package. See the Vercel OpenTelemetry docs for details.
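
A minimal instrumentation.ts might look like the following sketch. It assumes the default `@vercel/otel` setup; the service name is a placeholder you should replace with your own:

```typescript
// instrumentation.ts (project root, or src/ if you use a src directory)
import { registerOTel } from "@vercel/otel";

export function register() {
  // Registers the OpenTelemetry SDK for this deployment. Spans are
  // forwarded through the Vercel trace drain configured above, which
  // delivers them to Braintrust.
  registerOTel({ serviceName: "my-next-app" });
}
```

Next.js calls the exported `register` function once at server startup, so no further wiring is needed.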

Trace with Vercel AI SDK

The Braintrust SDK provides native support for the Vercel AI SDK, automatically tracing AI calls with full input/output logging, metrics, and tool execution.

Basic tracing

Use wrapAISDK to wrap the Vercel AI SDK functions (generateText, streamText, generateObject, streamObject).
trace-vercel-ai-sdk.ts
import { initLogger, wrapAISDK } from "braintrust";
import * as ai from "ai";
import { openai } from "@ai-sdk/openai";

initLogger({
  projectName: "My AI Project",
  apiKey: process.env.BRAINTRUST_API_KEY,
});

const { generateText } = wrapAISDK(ai);

async function main() {
  // This will automatically log the request, response, and metrics to Braintrust
  const { text } = await generateText({
    model: openai("gpt-5-mini"),
    prompt: "What is the capital of France?",
  });
  console.log(text);
}

main().catch(console.error);

Tool calls

wrapAISDK automatically traces tool call suggestions from the LLM and the tool execution results.
trace-vercel-ai-sdk-tools.ts
import { initLogger, wrapAISDK } from "braintrust";
import * as ai from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

initLogger({
  projectName: "Tool Tracing",
  apiKey: process.env.BRAINTRUST_API_KEY,
});

const { generateText } = wrapAISDK(ai);

async function main() {
  // Tool executions are automatically wrapped and traced
  const { text } = await generateText({
    model: openai("gpt-5-mini"),
    prompt: "What's the weather like in San Francisco?",
    tools: {
      getWeather: {
        description: "Get weather for a location",
        inputSchema: z.object({
          location: z.string().describe("The city name"),
        }),
        execute: async ({ location }: { location: string }) => {
          // This execution will appear as a child span
          return {
            location,
            temperature: 72,
            conditions: "sunny",
          };
        },
      },
    },
  });

  console.log(text);
}

main().catch(console.error);

Streaming with tools

You can also use streamText for streaming responses with tool calls:
trace-vercel-ai-sdk-streaming.ts
import { initLogger, wrapAISDK } from "braintrust";
import * as ai from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

initLogger({
  projectName: "Streaming Tool Tracing",
  apiKey: process.env.BRAINTRUST_API_KEY,
});

const { streamText } = wrapAISDK(ai);

async function main() {
  const result = streamText({
    model: openai("gpt-5-mini"),
    prompt: "What is 127 multiplied by 49?",
    tools: {
      calculate: {
        description: "Perform a mathematical calculation",
        inputSchema: z.object({
          operation: z.enum(["add", "subtract", "multiply", "divide"]),
          a: z.number(),
          b: z.number(),
        }),
        execute: async ({ operation, a, b }) => {
          switch (operation) {
            case "add": return a + b;
            case "subtract": return a - b;
            case "multiply": return a * b;
            case "divide": return b !== 0 ? a / b : 0;
          }
        },
      },
    },
    // AI SDK v5 replaces maxToolRoundtrips with stopWhen;
    // allow a tool-call step plus a final answer step
    stopWhen: ai.stepCountIs(3),
  });

  for await (const delta of result.textStream) {
    process.stdout.write(delta);
  }
}

main().catch(console.error);

Resources