Vercel Provider

The Vercel provider gives you access to the v0 API, designed for building modern web applications. The v0-1.0-md model supports text and image inputs and provides fast streaming responses.

You can create your Vercel API key at v0.dev.

The v0 API is currently in beta and requires a Premium or Team plan with usage-based billing enabled. For details, visit the pricing page. To request a higher limit, contact Vercel at support@v0.dev.

Features

  • Framework-aware completions: Evaluated on modern stacks like Next.js and Vercel
  • Auto-fix: Identifies and corrects common coding issues during generation
  • Quick edit: Streams inline edits as they're available
  • OpenAI compatible: Can be used with any tool or SDK that supports OpenAI's API format (see the sketch after this list)
  • Multimodal: Supports both text and image inputs
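
Because the v0 API is OpenAI compatible, you can also call it without the AI SDK. The following is a minimal sketch assuming the official openai npm package; the endpoint matches the default baseURL documented below.

import OpenAI from 'openai';

// Point the OpenAI client at the v0 API instead of api.openai.com.
const client = new OpenAI({
  apiKey: process.env.VERCEL_API_KEY,
  baseURL: 'https://api.v0.dev/v1',
});

const completion = await client.chat.completions.create({
  model: 'v0-1.0-md',
  messages: [{ role: 'user', content: 'Create a Next.js AI chatbot' }],
});

console.log(completion.choices[0].message.content);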

Setup

The Vercel provider is available via the @ai-sdk/vercel module. You can install it with:

pnpm add @ai-sdk/vercel

Provider Instance

You can import the default provider instance vercel from @ai-sdk/vercel:

import { vercel } from '@ai-sdk/vercel';

If you need a customized setup, you can import createVercel from @ai-sdk/vercel and create a provider instance with your settings:

import { createVercel } from '@ai-sdk/vercel';

const vercel = createVercel({
  apiKey: process.env.VERCEL_API_KEY ?? '',
});

You can use the following optional settings to customize the Vercel provider instance:

  • baseURL string

    Use a different URL prefix for API calls. The default prefix is https://api.v0.dev/v1.

  • apiKey string

    API key that is sent via the Authorization header. It defaults to the VERCEL_API_KEY environment variable.

  • headers Record<string,string>

    Custom headers to include in the requests.

  • fetch (input: RequestInfo, init?: RequestInit) => Promise<Response>

    Custom fetch implementation. Defaults to the global fetch function. You can use it as middleware to intercept requests, or to provide a custom fetch implementation for testing (see the sketch after this list).
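
A minimal sketch that combines the headers and fetch options to log every request sent to the v0 API. The header name below is illustrative, not a documented requirement.

import { createVercel } from '@ai-sdk/vercel';

const vercel = createVercel({
  apiKey: process.env.VERCEL_API_KEY ?? '',
  // Custom header attached to every request (illustrative name).
  headers: { 'x-request-source': 'docs-example' },
  // Log the request target, then delegate to the global fetch.
  fetch: async (input, init) => {
    console.log('v0 API request:', input);
    return fetch(input, init);
  },
});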

Language Models

You can create language models using a provider instance. The first argument is the model ID, for example:

import { vercel } from '@ai-sdk/vercel';
import { generateText } from 'ai';

const { text } = await generateText({
  model: vercel('v0-1.0-md'),
  prompt: 'Create a Next.js AI chatbot',
});

Vercel language models can also be used in the streamText function (see AI SDK Core).
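
For example, a minimal streaming sketch that prints the response to stdout as it arrives:

import { vercel } from '@ai-sdk/vercel';
import { streamText } from 'ai';

const result = streamText({
  model: vercel('v0-1.0-md'),
  prompt: 'Create a Next.js AI chatbot',
});

// Consume the text stream chunk by chunk.
for await (const textPart of result.textStream) {
  process.stdout.write(textPart);
}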

Example with AI SDK

import { generateText } from 'ai';
import { createVercel } from '@ai-sdk/vercel';

const vercel = createVercel({
  baseURL: 'https://api.v0.dev/v1',
  apiKey: process.env.VERCEL_API_KEY,
});

const { text } = await generateText({
  model: vercel('v0-1.0-md'),
  prompt: 'Create a Next.js AI chatbot with authentication',
});

Models

v0-1.0-md

The v0-1.0-md model is the default model served by the v0 API.

Capabilities:

  • Supports text and image inputs (multimodal); see the sketch after this list
  • Supports function/tool calls
  • Streaming responses with low latency
  • Optimized for frontend and full-stack web development
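
A minimal sketch of the multimodal input, passing an image part alongside text in a user message. The screenshot URL below is a placeholder.

import { vercel } from '@ai-sdk/vercel';
import { generateText } from 'ai';

const { text } = await generateText({
  model: vercel('v0-1.0-md'),
  messages: [
    {
      role: 'user',
      content: [
        { type: 'text', text: 'Recreate this layout as a Next.js component.' },
        // Image parts accept a URL, base64 string, or binary data.
        { type: 'image', image: new URL('https://example.com/screenshot.png') },
      ],
    },
  ],
});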

Model Capabilities

Model        Image Input    Object Generation    Tool Usage    Tool Streaming
v0-1.0-md    ✓              ✓                    ✓             ✓