# Ollama Provider

`nordwestt/ollama-ai-provider-v2` is a community provider that uses Ollama to provide language model support for the AI SDK.
## Setup
The Ollama provider is available in the `ollama-ai-provider-v2` module. You can install it with:

```bash
pnpm add ollama-ai-provider-v2
```
## Provider Instance
You can import the default provider instance `ollama` from `ollama-ai-provider-v2`:

```ts
import { ollama } from 'ollama-ai-provider-v2';
```
If you need a customized setup, you can import `createOllama` from `ollama-ai-provider-v2` and create a provider instance with your settings:

```ts
import { createOllama } from 'ollama-ai-provider-v2';

const ollama = createOllama({
  // optional settings, e.g.
  baseURL: 'https://api.ollama.com',
});
```
You can use the following optional settings to customize the Ollama provider instance:
- **baseURL** _string_

  Use a different URL prefix for API calls, e.g. to use proxy servers. The default prefix is `http://localhost:11434/api`.

- **headers** _Record&lt;string, string&gt;_

  Custom headers to include in the requests.
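For example, a provider instance that talks to a remote Ollama host through an authenticating proxy might look like the following sketch. The host URL and header value are placeholders, not real endpoints:

```ts
import { createOllama } from 'ollama-ai-provider-v2';

// Hypothetical remote setup: both values below are placeholders.
const remoteOllama = createOllama({
  baseURL: 'http://my-ollama-host:11434/api', // proxy or remote Ollama server
  headers: {
    Authorization: 'Bearer <token>', // e.g. required by an auth proxy
  },
});
```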
## Language Models
You can create models that call the Ollama Chat Completion API using the provider instance. The first argument is the model id, e.g. `phi3`. Some models have multi-modal capabilities.

```ts
const model = ollama('phi3');
```
You can find more models on the Ollama Library homepage.
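As a minimal end-to-end sketch (assuming a local Ollama server with the model already pulled, e.g. via `ollama pull phi3`), the model can be passed straight to `generateText`:

```ts
import { ollama } from 'ollama-ai-provider-v2';
import { generateText } from 'ai';

// Assumes a running local Ollama server with the phi3 model pulled.
const { text } = await generateText({
  model: ollama('phi3'),
  prompt: 'Explain the difference between TCP and UDP in one paragraph.',
});

console.log(text);
```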
## Model Capabilities
This provider supports hybrid reasoning models such as qwen3, which allow reasoning to be toggled on or off between messages.
```ts
import { ollama } from 'ollama-ai-provider-v2';
import { generateText } from 'ai';

const { text } = await generateText({
  model: ollama('qwen3:4b'),
  providerOptions: { ollama: { think: true } },
  prompt: 'Write a vegetarian lasagna recipe for 4 people, but really think about it',
});
```
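Because `think` is passed per call, a follow-up request can turn reasoning off again. A sketch (the prompt is illustrative):

```ts
// Follow-up call with reasoning disabled for the same model.
const { text: quickAnswer } = await generateText({
  model: ollama('qwen3:4b'),
  providerOptions: { ollama: { think: false } },
  prompt: 'Now list just the ingredients, no explanation.',
});
```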
## Embedding Models
You can create models that call the Ollama embeddings API using the `.textEmbeddingModel()` factory method:
```ts
import { embedMany, cosineSimilarity } from 'ai';
import { ollama } from 'ollama-ai-provider-v2';

const model = ollama.textEmbeddingModel('nomic-embed-text');

const { embeddings } = await embedMany({
  model,
  values: ['sunny day at the beach', 'rainy afternoon in the city'],
});

console.log(
  `cosine similarity: ${cosineSimilarity(embeddings[0], embeddings[1])}`,
);
```
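For a single value, the AI SDK's `embed` function works the same way:

```ts
import { embed } from 'ai';

// Embed one value instead of a batch.
const { embedding } = await embed({
  model: ollama.textEmbeddingModel('nomic-embed-text'),
  value: 'sunny day at the beach',
});
```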
## Alternative Providers
There is an alternative provider package called `ai-sdk-ollama` by jagreehal, which uses the official Ollama JavaScript client library instead of direct HTTP API calls.
Key differences:
- Uses the official `ollama` npm package for communication
- Provides automatic environment detection (Node.js vs browser)
- Includes built-in error handling and retries via the official client
- Supports both CommonJS and ESM module formats
- Full TypeScript support with type-safe Ollama-specific options via `providerOptions.ollama`
This approach leverages Ollama's official client library, which may provide better compatibility with Ollama updates and additional features like model management. Both providers implement the AI SDK specification, so you can choose based on your specific requirements and preferences.
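Since both packages implement the AI SDK specification, switching between them is mostly a matter of changing the import. A sketch, assuming `ai-sdk-ollama` exposes an `ollama` instance analogous to this provider's (check that package's documentation for the exact API):

```ts
// Alternative package; the export name is assumed from its docs.
import { ollama } from 'ai-sdk-ollama';
import { generateText } from 'ai';

const { text } = await generateText({
  model: ollama('phi3'),
  prompt: 'Say hello from the official Ollama client.',
});
```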