Braintrust integrates with Vercel in two ways: through the Vercel AI SDK for tracing AI applications, and through the Vercel Marketplace for project setup and observability from your Vercel dashboard.

Vercel Marketplace vs. Vercel AI SDK

Both the Vercel Marketplace and the Vercel AI SDK approaches send your Vercel application's AI traces to Braintrust. The right choice depends on how you want to set up and manage the integration, and how much control you want over how your traces are logged in Braintrust.

The Vercel Marketplace integration requires almost no code changes to your project, but offers less fine-grained control over which parts of your app are traced. It is a good way to get set up with Braintrust quickly.

The Vercel AI SDK integration requires installing the Braintrust SDK and adding several lines of code to your project, but gives you more control over which parts of your application are traced. It is a better fit for complex applications where you only want to capture specific traces. The Braintrust SDK supports Vercel AI SDK v3, v4, v5, and the upcoming beta v6.

Vercel Marketplace

Braintrust is available as a native integration on the Vercel Marketplace. This integration allows you to run evals, monitor model quality and user experience, and benchmark across models from OpenAI, Anthropic, Gemini, and more from your Vercel dashboard.

Set up the Vercel Marketplace integration

Install the Braintrust integration

From the Vercel Marketplace listing, select Install. You will be prompted to create a new Braintrust account. On the next screen, choose either the Free plan or the Pro plan and select Continue. Enter a name for your Braintrust project in the Product Name field and select Create. This creates a Braintrust project and links it to your Vercel account. From here, select Done to go to your integration page, or select Add Drain to start sending logs to Braintrust.

Add Drain

In the Add Drain panel, select Traces, then Next. Create a name for the drain and choose which Vercel projects should send traces to Braintrust. You can select All Projects or one or more specific projects. Adjust the sampling rate to control what percentage of Vercel logs is sent to Braintrust.

Configure OpenTelemetry

Once you've added the integration, you need to configure OpenTelemetry in each project that sends traces to Braintrust. In your Next.js project, create an instrumentation.ts file and call registerOTel. Check out the Vercel docs on initializing OTel for an example.
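As a minimal sketch (assuming the `@vercel/otel` package and a Next.js App Router project; the service name is a placeholder), instrumentation.ts might look like:

```typescript
// instrumentation.ts (at the project root, or under src/)
import { registerOTel } from "@vercel/otel";

export function register() {
  // Registers the OpenTelemetry SDK so Vercel can forward traces
  // to any drains configured for this project, including Braintrust.
  registerOTel({ serviceName: "my-app" });
}
```

Next.js calls the exported `register` function once when the server starts, so no further wiring is needed.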

Vercel AI SDK

Braintrust natively supports tracing requests made with the Vercel AI SDK. The Vercel AI SDK is an elegant tool for building AI-powered applications.

Tracing with wrapAISDK

wrapAISDK wraps the top-level AI SDK functions (generateText, streamText, generateObject, streamObject) and automatically creates spans with full input/output logging, metrics, and tool call tracing.
trace-vercel-ai-sdk.ts
import { initLogger, wrapAISDK } from "braintrust";
import * as ai from "ai";
import { openai } from "@ai-sdk/openai";

// `initLogger` sets up your code to log to the specified Braintrust project using your API key.
// If you don't call `initLogger`, wrapping is a no-op, and you will not see spans in the UI.
initLogger({
  projectName: "My AI Project",
  apiKey: process.env.BRAINTRUST_API_KEY,
});

const { generateText } = wrapAISDK(ai);

async function main() {
  // This will automatically log the request, response, and metrics to Braintrust
  const { text } = await generateText({
    model: openai("gpt-5-mini"),
    prompt: "What is the capital of France?",
  });
  console.log(text);
}

main().catch(console.error);
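The same wrapper also covers the structured-output functions. As a sketch (the schema and prompt here are illustrative, not from the Braintrust docs), generateObject works the same way:

```typescript
import { initLogger, wrapAISDK } from "braintrust";
import * as ai from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

initLogger({
  projectName: "My AI Project",
  apiKey: process.env.BRAINTRUST_API_KEY,
});

const { generateObject } = wrapAISDK(ai);

async function main() {
  // The structured output and token metrics are logged to Braintrust
  // just like generateText calls.
  const { object } = await generateObject({
    model: openai("gpt-5-mini"),
    schema: z.object({
      city: z.string(),
      country: z.string(),
    }),
    prompt: "Name a European capital and its country.",
  });
  console.log(object);
}

main().catch(console.error);
```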

Tool calls

wrapAISDK automatically traces both the LLM’s tool call suggestions and the actual tool executions.
trace-vercel-ai-sdk-tools.ts
import { initLogger, wrapAISDK } from "braintrust";
import * as ai from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

initLogger({
  projectName: "Tool Tracing",
  apiKey: process.env.BRAINTRUST_API_KEY,
});

const { generateText } = wrapAISDK(ai);

async function main() {
  // Tool executions are automatically wrapped and traced
  const { text } = await generateText({
    model: openai("gpt-5-mini"),
    prompt: "What's the weather like in San Francisco?",
    tools: {
      getWeather: {
        description: "Get weather for a location",
        parameters: z.object({
          location: z.string().describe("The city name"),
        }),
        execute: async ({ location }: { location: string }) => {
          // This execution will appear as a child span
          return {
            location,
            temperature: 72,
            conditions: "sunny",
          };
        },
      },
    },
  });

  console.log(text);
}

main().catch(console.error);

Streaming with tools

You can also use streamText for streaming responses with tool calls:
trace-vercel-ai-sdk-streaming.ts
import { initLogger, wrapAISDK } from "braintrust";
import * as ai from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

initLogger({
  projectName: "Streaming Tool Tracing",
  apiKey: process.env.BRAINTRUST_API_KEY,
});

const { streamText } = wrapAISDK(ai);

async function main() {
  const result = streamText({
    model: openai("gpt-5-mini"),
    prompt: "What is 127 multiplied by 49?",
    tools: {
      calculate: {
        description: "Perform a mathematical calculation",
        parameters: z.object({
          operation: z.enum(["add", "subtract", "multiply", "divide"]),
          a: z.number(),
          b: z.number(),
        }),
        execute: async ({ operation, a, b }) => {
          switch (operation) {
            case "add": return a + b;
            case "subtract": return a - b;
            case "multiply": return a * b;
            case "divide": return b !== 0 ? a / b : 0;
          }
        },
      },
    },
    maxToolRoundtrips: 2,
  });

  for await (const delta of result.textStream) {
    process.stdout.write(delta);
  }
}

main().catch(console.error);
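In short-lived environments such as scripts and serverless functions, spans may still be buffered when the process exits. One way to guard against dropped logs, assuming the logger object returned by initLogger exposes a flush method as in the Braintrust SDK, is to keep a handle on it and flush before exiting:

```typescript
import { initLogger, wrapAISDK } from "braintrust";
import * as ai from "ai";
import { openai } from "@ai-sdk/openai";

// Keep a handle on the logger so it can be flushed before exit.
const logger = initLogger({
  projectName: "My AI Project",
  apiKey: process.env.BRAINTRUST_API_KEY,
});

const { generateText } = wrapAISDK(ai);

async function main() {
  const { text } = await generateText({
    model: openai("gpt-5-mini"),
    prompt: "What is the capital of France?",
  });
  console.log(text);
  // Ensure buffered spans are delivered before the process exits.
  await logger.flush();
}

main().catch(console.error);
```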