LangSmith Observability
LangSmith is a platform for building production-grade LLM applications. It allows you to closely monitor and evaluate your application, so you can ship quickly and with confidence.
You do not need to use LangChain's open-source frameworks to use LangSmith.
A version of this guide is also available in the LangSmith documentation.
If you are using AI SDK v4 or an older version of the langsmith client, see the legacy guide linked from that page.
Setup
This guide requires `langsmith>=0.3.63`. Install an AI SDK model provider and the LangSmith client SDK. The code snippets below use the AI SDK's OpenAI provider, but you can use any other supported provider as well.
```bash
pnpm add @ai-sdk/openai langsmith
```
Next, set the required environment variables.

```bash
export LANGCHAIN_TRACING=true
export LANGCHAIN_API_KEY=<your-api-key>
export OPENAI_API_KEY=<your-openai-api-key> # The examples use OpenAI (replace with your selected provider)
```
Trace Logging
To start tracing, you will need to import and call the `wrapAISDK` method at the start of your code:
```ts
import { openai } from '@ai-sdk/openai';
import * as ai from 'ai';

import { wrapAISDK } from 'langsmith/experimental/vercel';

const { generateText, streamText, generateObject, streamObject } =
  wrapAISDK(ai);

await generateText({
  model: openai('gpt-5-nano'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
```
You should see a trace in your LangSmith dashboard like this one.
You can also trace runs with tool calls:
```ts
import * as ai from 'ai';
import { tool, stepCountIs } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

import { wrapAISDK } from 'langsmith/experimental/vercel';

const { generateText, streamText, generateObject, streamObject } =
  wrapAISDK(ai);

await generateText({
  model: openai('gpt-5-nano'),
  messages: [
    {
      role: 'user',
      content: 'What are my orders and where are they? My user ID is 123',
    },
  ],
  tools: {
    listOrders: tool({
      description: 'list all orders',
      inputSchema: z.object({ userId: z.string() }),
      execute: async ({ userId }) =>
        `User ${userId} has the following orders: 1`,
    }),
    viewTrackingInformation: tool({
      description: 'view tracking information for a specific order',
      inputSchema: z.object({ orderId: z.string() }),
      execute: async ({ orderId }) =>
        `Here is the tracking information for ${orderId}`,
    }),
  },
  stopWhen: stepCountIs(5),
});
```
The result is a trace like this one.
You can use other AI SDK methods exactly as you usually would.
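For example, a streaming call through the wrapped `streamText` works the same way as the unwrapped method. Here is a minimal sketch (the prompt is illustrative):

```ts
import * as ai from 'ai';
import { openai } from '@ai-sdk/openai';

import { wrapAISDK } from 'langsmith/experimental/vercel';

const { streamText } = wrapAISDK(ai);

// streamText returns its result object synchronously; the stream itself
// is consumed asynchronously, and the run is traced to LangSmith as usual.
const { textStream } = streamText({
  model: openai('gpt-5-nano'),
  prompt: 'Write a short poem about lasagna.', // illustrative prompt
});

for await (const chunk of textStream) {
  process.stdout.write(chunk);
}
```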
With traceable
You can wrap `traceable` calls around AI SDK calls or within AI SDK tool calls. This is useful if you want to group runs together in LangSmith:
```ts
import * as ai from 'ai';
import { tool, stepCountIs } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

import { traceable } from 'langsmith/traceable';
import { wrapAISDK } from 'langsmith/experimental/vercel';

const { generateText, streamText, generateObject, streamObject } =
  wrapAISDK(ai);

const wrapper = traceable(
  async (input: string) => {
    const { text } = await generateText({
      model: openai('gpt-5-nano'),
      messages: [
        {
          role: 'user',
          content: input,
        },
      ],
      tools: {
        listOrders: tool({
          description: 'list all orders',
          inputSchema: z.object({ userId: z.string() }),
          execute: async ({ userId }) =>
            `User ${userId} has the following orders: 1`,
        }),
        viewTrackingInformation: tool({
          description: 'view tracking information for a specific order',
          inputSchema: z.object({ orderId: z.string() }),
          execute: async ({ orderId }) =>
            `Here is the tracking information for ${orderId}`,
        }),
      },
      stopWhen: stepCountIs(5),
    });
    return text;
  },
  {
    name: 'wrapper',
  },
);

await wrapper('What are my orders and where are they? My user ID is 123.');
```
The resulting trace will look like this.
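You can also nest a `traceable` inside an AI SDK tool call. The sketch below assumes a hypothetical `fetchOrders` helper; the traced helper should appear as a child run under the tool call:

```ts
import * as ai from 'ai';
import { tool, stepCountIs } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

import { traceable } from 'langsmith/traceable';
import { wrapAISDK } from 'langsmith/experimental/vercel';

const { generateText } = wrapAISDK(ai);

// Hypothetical helper wrapped in traceable so it is traced as its own run.
const fetchOrders = traceable(
  async (userId: string) => `User ${userId} has the following orders: 1`,
  { name: 'fetchOrders' },
);

await generateText({
  model: openai('gpt-5-nano'),
  messages: [
    {
      role: 'user',
      content: 'What are my orders? My user ID is 123',
    },
  ],
  tools: {
    listOrders: tool({
      description: 'list all orders',
      inputSchema: z.object({ userId: z.string() }),
      // The traced helper runs inside the tool call.
      execute: async ({ userId }) => fetchOrders(userId),
    }),
  },
  stopWhen: stepCountIs(5),
});
```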
Tracing in serverless environments
When tracing in serverless environments, you must wait for all runs to flush before your environment shuts down. See this section of the LangSmith docs for examples.
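For illustration, one common pattern is to await pending trace batches on a `langsmith` `Client` before your handler returns. This is a minimal sketch, assuming a hypothetical `handler` entrypoint; depending on your setup you may need to ensure the wrapped calls share this client (see the linked docs for the exact wiring):

```ts
import * as ai from 'ai';
import { openai } from '@ai-sdk/openai';
import { Client } from 'langsmith';

import { wrapAISDK } from 'langsmith/experimental/vercel';

const client = new Client();
const { generateText } = wrapAISDK(ai);

// Hypothetical serverless entrypoint.
export async function handler() {
  const { text } = await generateText({
    model: openai('gpt-5-nano'),
    prompt: 'Write a haiku about observability.',
  });
  // Wait for buffered trace batches to be sent before the
  // runtime freezes or shuts down.
  await client.awaitPendingTraceBatches();
  return text;
}
```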
Further reading
For more examples and instructions for setting up tracing in specific environments, see the LangSmith documentation. Once you've set up LangSmith tracing for your project, try gathering a dataset and evaluating it.