Trace with the Vercel AI SDK (JS/TS only)

You can use LangSmith to trace runs from the Vercel AI SDK with our built-in AISDKExporter OpenTelemetry trace exporter. This guide will walk through an example.

note

The AISDKExporter is only available in langsmith JS SDK version >=0.2.1.

0. Installation

Install the Vercel AI SDK. We use their OpenAI integration for the code snippets below, but you can use any of their other options as well.

yarn add ai @ai-sdk/openai zod
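
If you use npm or pnpm instead of yarn, the equivalent commands are:

npm install ai @ai-sdk/openai zod
pnpm add ai @ai-sdk/openai zod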

1. Configure your environment

export LANGCHAIN_TRACING_V2=true
export LANGCHAIN_API_KEY=<your-api-key>

# The examples below use the OpenAI API, though any provider supported by the AI SDK will work
export OPENAI_API_KEY=<your-openai-api-key>
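
Optionally, you can also set LANGCHAIN_PROJECT to send traces to a specific LangSmith project instead of the default one:

export LANGCHAIN_PROJECT=<your-project-name>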

2. Log a trace

Next.js

First, create an instrumentation.js file in your project root. You can learn more about how to set up OpenTelemetry instrumentation within your Next.js app here.

import { registerOTel } from "@vercel/otel";
import { AISDKExporter } from "langsmith/vercel";

export function register() {
  registerOTel({
    serviceName: "langsmith-vercel-ai-sdk-example",
    traceExporter: new AISDKExporter(),
  });
}

Afterwards, add the experimental_telemetry argument to the AI SDK calls that you want to trace. For convenience, we've included the AISDKExporter.getSettings() method, which appends additional metadata for LangSmith.

import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";
import { AISDKExporter } from "langsmith/vercel";

await streamText({
  model: openai("gpt-4o-mini"),
  prompt: "Write a vegetarian lasagna recipe for 4 people.",
  experimental_telemetry: AISDKExporter.getSettings(),
});
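
For reference, here is a minimal sketch of how this might look inside a Next.js App Router route handler, assuming an AI SDK version that provides result.toTextStreamResponse(). The app/api/completion/route.ts path and request shape are illustrative assumptions, not part of the integration itself:

// app/api/completion/route.ts (hypothetical path, for illustration)
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";
import { AISDKExporter } from "langsmith/vercel";

export async function POST(req: Request) {
  const { prompt } = await req.json();

  const result = await streamText({
    model: openai("gpt-4o-mini"),
    prompt,
    // Spans from this call are picked up by the exporter registered in instrumentation.js
    experimental_telemetry: AISDKExporter.getSettings(),
  });

  return result.toTextStreamResponse();
}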

Node.js

Add the AISDKExporter as the trace exporter in your OpenTelemetry setup.

import { AISDKExporter } from "langsmith/vercel";

import { NodeSDK } from "@opentelemetry/sdk-node";
import { getNodeAutoInstrumentations } from "@opentelemetry/auto-instrumentations-node";

const sdk = new NodeSDK({
  traceExporter: new AISDKExporter(),
  instrumentations: [getNodeAutoInstrumentations()],
});

sdk.start();

Afterwards, add the experimental_telemetry argument to the AI SDK calls that you want to trace.

info

Do not forget to call await sdk.shutdown() before your application shuts down in order to flush any remaining traces to LangSmith.

import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { AISDKExporter } from "langsmith/vercel";

const result = await generateText({
  model: openai("gpt-4o-mini"),
  prompt: "Write a vegetarian lasagna recipe for 4 people.",
  experimental_telemetry: AISDKExporter.getSettings(),
});

await sdk.shutdown();

Customize run name

You can customize the run name by passing the runName argument to the AISDKExporter.getSettings() method.

import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { AISDKExporter } from "langsmith/vercel";

await generateText({
  model: openai("gpt-4o-mini"),
  prompt: "Write a vegetarian lasagna recipe for 4 people.",
  experimental_telemetry: AISDKExporter.getSettings({
    runName: "my-custom-run-name",
  }),
});

Customize run ID

You can customize the run ID by passing the runId argument to the AISDKExporter.getSettings() method. This is especially useful if you want to know the run ID before the run completes. Note that LangSmith run IDs must be valid UUIDs.

import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { AISDKExporter } from "langsmith/vercel";

// LangSmith run IDs must be valid UUIDs; crypto.randomUUID() is available in Node 19+
const runId = crypto.randomUUID();

await generateText({
  model: openai("gpt-4o-mini"),
  prompt: "Write a vegetarian lasagna recipe for 4 people.",
  experimental_telemetry: AISDKExporter.getSettings({
    runId,
  }),
});
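
Because you supplied the run ID yourself, you can reference the trace immediately, for example to attach feedback with the LangSmith client. A minimal sketch, where the feedback key and score are illustrative:

import { Client } from "langsmith";

const client = new Client();

// runId is the UUID passed to getSettings() in the snippet above
await client.createFeedback(runId, "user-score", { score: 1 });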

wrapAISDKModel (deprecated)

note

The wrapAISDKModel method is deprecated and will be removed in a future release.

The wrapAISDKModel method wraps a Vercel AI SDK model and intercepts model invocations to send traces to LangSmith. This method is useful if you are using an older version of LangSmith or if you are using streamUI / Vercel AI RSC, which does not currently support experimental_telemetry.

import { wrapAISDKModel } from "langsmith/wrappers/vercel";
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";

const vercelModel = openai("gpt-4o-mini");

const modelWithTracing = wrapAISDKModel(vercelModel);

await generateText({
  model: modelWithTracing,
  prompt: "Write a vegetarian lasagna recipe for 4 people.",
});
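
Since streamUI does not support experimental_telemetry, the wrapped model is also how you would trace calls made through it. Below is a minimal sketch, assuming a React Server Components setup; the text render callback is illustrative:

import { wrapAISDKModel } from "langsmith/wrappers/vercel";
import { openai } from "@ai-sdk/openai";
import { streamUI } from "ai/rsc";

const modelWithTracing = wrapAISDKModel(openai("gpt-4o-mini"));

const result = await streamUI({
  model: modelWithTracing,
  prompt: "Write a vegetarian lasagna recipe for 4 people.",
  // Render streamed text as it arrives; the wrapper sends the trace to LangSmith
  text: ({ content }) => <div>{content}</div>,
});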
