PostHog LLM Analytics

PostHog is an open-source product analytics platform. With the PostHog LLM Analytics integration, you can track LLM usage alongside your product analytics to understand how AI features affect user behavior.

Setup

PostHog supports AI SDK telemetry data through OpenTelemetry. The PostHogTraceExporter from the @posthog/ai package sends gen_ai.* spans to PostHog's OTLP ingestion endpoint, where they are automatically converted into $ai_generation events.

Next.js has built-in support for OpenTelemetry instrumentation via the instrumentation hook.

Step 1. Install dependencies

npm install @posthog/ai @opentelemetry/sdk-node @opentelemetry/resources
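The exporter configured in the next step reads your PostHog project API key from an environment variable. A minimal `.env.local` sketch, assuming the `POSTHOG_API_KEY` variable name used below (your key comes from your PostHog project settings):

```shell
# .env.local — placeholder value; substitute your own project API key
POSTHOG_API_KEY=phc_your_project_api_key
```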

Step 2. Create an instrumentation.ts file in your project root

instrumentation.ts
import { NodeSDK } from '@opentelemetry/sdk-node';
import { resourceFromAttributes } from '@opentelemetry/resources';
import { PostHogTraceExporter } from '@posthog/ai/otel';

export function register() {
  const sdk = new NodeSDK({
    resource: resourceFromAttributes({
      'service.name': 'my-nextjs-app',
    }),
    traceExporter: new PostHogTraceExporter({
      apiKey: process.env.POSTHOG_API_KEY!,
      host: 'https://us.i.posthog.com', // use https://eu.i.posthog.com for EU
    }),
  });
  sdk.start();
}
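On Next.js 15 and later, the instrumentation hook is enabled by default and no extra configuration is needed. If you are on an older release where the hook is still experimental, you may need to opt in via `next.config.js` — a sketch, assuming Next.js 14 or earlier:

```javascript
// next.config.js — only needed on Next.js versions where the
// instrumentation hook is still behind an experimental flag
module.exports = {
  experimental: {
    instrumentationHook: true,
  },
};
```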

Step 3. Use the AI SDK with telemetry enabled in your route handlers or server actions

import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

const result = await generateText({
  model: openai('gpt-4o'),
  prompt: 'Tell me a fun fact about hedgehogs.',
  experimental_telemetry: {
    isEnabled: true,
    functionId: 'my-ai-function',
    metadata: {
      posthog_distinct_id: 'user_123', // optional: links events to a PostHog user
    },
  },
});

The integration also supports streaming functions such as streamText. Each streamed call produces ai.streamText spans, which PostHog captures in the same way.

Configuration

Linking events to PostHog users

Pass posthog_distinct_id in the metadata field to associate LLM events with a specific user in PostHog. If omitted, events are captured anonymously.

const result = await generateText({
  model: openai('gpt-4o'),
  prompt: 'Write a short story about a cat.',
  experimental_telemetry: {
    isEnabled: true,
    metadata: {
      posthog_distinct_id: 'user_123',
    },
  },
});
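If many call sites need this, the telemetry object can be built by a small helper so the distinct ID is attached consistently and omitted cleanly for anonymous sessions. A sketch — `posthogTelemetry` is a hypothetical helper name, not part of `@posthog/ai`:

```typescript
// Sketch: build an experimental_telemetry config, attaching a PostHog
// distinct ID only when one is available (hypothetical helper).
function posthogTelemetry(distinctId?: string, functionId?: string) {
  const metadata: Record<string, string> = {};
  if (distinctId) {
    metadata.posthog_distinct_id = distinctId; // links events to this user
  }
  return {
    isEnabled: true,
    ...(functionId ? { functionId } : {}),
    metadata, // empty metadata -> events are captured anonymously
  };
}
```

You would then pass the result directly: `experimental_telemetry: posthogTelemetry(user?.id, 'chat')`.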

Privacy controls

You can disable recording of inputs and outputs by setting recordInputs and recordOutputs to false:

const result = await generateText({
  model: openai('gpt-4o'),
  prompt: 'Write a short story about a cat.',
  experimental_telemetry: {
    isEnabled: true,
    recordInputs: false,
    recordOutputs: false,
  },
});
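If you want to flip these flags per environment rather than hard-code them, a small helper can derive them from a single toggle. A sketch — `AI_TELEMETRY_PRIVATE` is a hypothetical variable name, not read by the AI SDK itself:

```typescript
// Sketch: derive privacy flags from an environment toggle
// (AI_TELEMETRY_PRIVATE is a hypothetical variable name).
function telemetryPrivacy(env: Record<string, string | undefined>) {
  const redact = env.AI_TELEMETRY_PRIVATE === 'true';
  return {
    isEnabled: true,
    recordInputs: !redact,  // drop prompt contents when redacting
    recordOutputs: !redact, // drop completion contents when redacting
  };
}
```

Usage: `experimental_telemetry: telemetryPrivacy(process.env)`.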

Resources