Scorecard

Scorecard is an observability platform for monitoring and evaluating LLM applications. After integrating with the AI SDK, you can use Scorecard to trace, monitor, and analyze your LLM providers, prompts, and application flows.

Setup

Scorecard ingests AI SDK telemetry data via OpenTelemetry. To get started, sign up at https://app.scorecard.io and copy your API key from your settings page.

Next.js

To use the AI SDK to send telemetry data to Scorecard, first set these environment variables in your project:

OTEL_EXPORTER_OTLP_ENDPOINT=https://tracing.scorecard.io/otel/v1/traces
OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer <Your Scorecard API Key>"

Next, create an instrumentation.ts file in your project root to initialize OpenTelemetry (you can adjust the configuration as needed):

import { registerOTel } from '@vercel/otel';

export function register() {
  registerOTel({
    // Appears as the service name on your traces in Scorecard
    serviceName: 'my-service-name',
  });
}
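Depending on your Next.js version, you may also need to opt in to instrumentation: on Next.js 14 and earlier it sits behind an experimental flag in next.config.js, while Next.js 15 picks up instrumentation.ts by default. A minimal sketch of the flag:

/** @type {import('next').NextConfig} */
const nextConfig = {
  experimental: {
    // Required for instrumentation.ts to be loaded on Next.js 14 and earlier
    instrumentationHook: true,
  },
};

module.exports = nextConfig;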

You can then use the experimental_telemetry option to enable telemetry on supported AI SDK function calls:

import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

const result = await generateText({
  model: openai('gpt-4o-mini'),
  prompt: 'Tell me a joke',
  experimental_telemetry: { isEnabled: true },
});
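You can also attach an identifier and custom metadata to each call, which Scorecard can use to group and filter traces. A sketch using the AI SDK's functionId and metadata telemetry options (the specific names and values below are illustrative):

import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

const result = await generateText({
  model: openai('gpt-4o-mini'),
  prompt: 'Tell me a joke',
  experimental_telemetry: {
    isEnabled: true,
    // Groups traces that come from the same logical function
    functionId: 'joke-generator',
    // Arbitrary key-value context, surfaced on the trace
    metadata: { userId: 'user-123', environment: 'production' },
  },
});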

What You'll See in Scorecard

After integrating, you'll be able to view the following in your Scorecard dashboard:

  • LLM call traces: Detailed information about each AI SDK call, including recorded inputs and outputs (see the note after this list)
  • Performance metrics: Latency, token usage, and cost tracking
  • Model information: Which models and providers were used
  • Custom metadata: Any additional context you provide via telemetry
  • Error tracking: Failed requests and debugging information
  • Usage analytics: Patterns and trends in your LLM usage
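Inputs and outputs are recorded on traces by default. If a call handles sensitive data, the AI SDK lets you turn that recording off per call; a minimal sketch (sensitivePrompt is a hypothetical placeholder):

import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

// Hypothetical input you don't want exported with the trace
const sensitivePrompt = 'Summarize this medical record: ...';

const result = await generateText({
  model: openai('gpt-4o-mini'),
  prompt: sensitivePrompt,
  experimental_telemetry: {
    isEnabled: true,
    // Omit prompt and completion content from the exported spans
    recordInputs: false,
    recordOutputs: false,
  },
});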

Resources