
# Mem0 Provider

The [Mem0 Provider](https://github.com/mem0ai/mem0/tree/main/vercel-ai-sdk) is a library developed by [**Mem0**](https://mem0.ai)
to integrate with the AI SDK.
This library brings enhanced AI interaction capabilities to your applications by introducing persistent memory functionality.

<Note type="info">
  🎉 Exciting news! The Mem0 AI SDK now supports <strong>Tool Calls</strong>.
</Note>

## Setup

The Mem0 provider is available in the `@mem0/vercel-ai-provider` module. You can install it with:

<Tabs items={['pnpm', 'npm', 'yarn', 'bun']}>
  <Tab>
    <Snippet text="pnpm add @mem0/vercel-ai-provider" dark />
  </Tab>
  <Tab>
    <Snippet text="npm install @mem0/vercel-ai-provider" dark />
  </Tab>
  <Tab>
    <Snippet text="yarn add @mem0/vercel-ai-provider" dark />
  </Tab>
  <Tab>
    <Snippet text="bun add @mem0/vercel-ai-provider" dark />
  </Tab>
</Tabs>

## Provider Instance

First, get your **Mem0 API Key** from the [Mem0 Dashboard](https://app.mem0.ai/dashboard/api-keys).

Then initialize the Mem0 client in your application:

```ts
import { createMem0 } from '@mem0/vercel-ai-provider';

const mem0 = createMem0({
  provider: 'openai',
  mem0ApiKey: 'm0-xxx',
  apiKey: 'provider-api-key',
  config: {
    // Configure the LLM Provider here
  },
  // Optional Mem0 Global Config
  mem0Config: {
    user_id: 'mem0-user-id',
    enable_graph: true,
  },
});
```

<Note>
  The `openai` provider is set as default. Consider using `MEM0_API_KEY` and
  `OPENAI_API_KEY` as environment variables for security.
</Note>
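For example, reading both keys from the environment (the variable names follow the convention mentioned in the note above):

```typescript
import { createMem0 } from '@mem0/vercel-ai-provider';

// Keys are read from the environment instead of being hard-coded.
const mem0 = createMem0({
  provider: 'openai',
  mem0ApiKey: process.env.MEM0_API_KEY,
  apiKey: process.env.OPENAI_API_KEY,
});
```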

<Note>
  The `mem0Config` object is optional. It sets the global configuration for the
  Mem0 client (e.g. `user_id`, `agent_id`, `app_id`, `run_id`, `org_id`,
  `project_id`).
</Note>

- Add Memories to Enhance Context:

```ts
import { LanguageModelV2Prompt } from '@ai-sdk/provider';
import { addMemories } from '@mem0/vercel-ai-provider';

const messages: LanguageModelV2Prompt = [
  { role: 'user', content: [{ type: 'text', text: 'I love red cars.' }] },
];

await addMemories(messages, { user_id: 'borat' });
```

## Features

### Adding and Retrieving Memories

- `addMemories()`: Adds user memories to enhance contextual responses.
- `retrieveMemories()`: Retrieves memory context for prompts as a system-prompt string.
- `getMemories()`: Returns raw memories from a user profile as an array of objects.

```ts
await addMemories(messages, {
  user_id: 'borat',
  mem0ApiKey: 'm0-xxx',
});
await retrieveMemories(prompt, {
  user_id: 'borat',
  mem0ApiKey: 'm0-xxx',
});
await getMemories(prompt, {
  user_id: 'borat',
  mem0ApiKey: 'm0-xxx',
});
```

<Note>
  For standalone features, such as `addMemories`, `retrieveMemories`, and
  `getMemories`, you must either set `MEM0_API_KEY` as an environment variable
  or pass it directly in the function call.
</Note>

<Note>
  `getMemories` returns raw memories as an array of objects, while
  `retrieveMemories` returns a string containing a system prompt with the
  retrieved memories injected into it.
</Note>
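Because `retrieveMemories` returns a plain string, it can be prepended as a system message before the user prompt. A minimal sketch (the `withMemoryContext` helper and `Message` type are illustrative, not part of the package):

```typescript
// Sketch: prepend a retrieved-memories string as a system message.
// `memoriesText` stands in for the string returned by retrieveMemories().
type Message = { role: 'system' | 'user'; content: string };

function withMemoryContext(memoriesText: string, userPrompt: string): Message[] {
  return [
    { role: 'system', content: memoriesText },
    { role: 'user', content: userPrompt },
  ];
}

const messages = withMemoryContext(
  'The user loves red cars.',
  'Suggest me a good car to buy!',
);
```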

### Generate Text with Memory Context

You can use language models from **OpenAI**, **Anthropic**, **Cohere**, and **Groq** to generate text with the `generateText` function:

```ts
import { generateText } from 'ai';
import { createMem0 } from '@mem0/vercel-ai-provider';

const mem0 = createMem0();

const { text } = await generateText({
  model: mem0('gpt-4.1', { user_id: 'borat' }),
  prompt: 'Suggest me a good car to buy!',
});
```

### Structured Message Format with Memory

```ts
import { generateText } from 'ai';
import { createMem0 } from '@mem0/vercel-ai-provider';

const mem0 = createMem0();

const { text } = await generateText({
  model: mem0('gpt-4.1', { user_id: 'borat' }),
  messages: [
    {
      role: 'user',
      content: [
        { type: 'text', text: 'Suggest me a good car to buy.' },
        { type: 'text', text: 'Why is it better than the other cars for me?' },
      ],
    },
  ],
});
```

### Streaming Responses with Memory Context

```ts
import { streamText } from 'ai';
import { createMem0 } from '@mem0/vercel-ai-provider';

const mem0 = createMem0();

const { textStream } = streamText({
  model: mem0('gpt-4.1', {
    user_id: 'borat',
  }),
  prompt:
    'Suggest me a good car to buy! Why is it better than the other cars for me? Give options for every price range.',
});

for await (const textPart of textStream) {
  process.stdout.write(textPart);
}
```
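Since `textStream` is a standard async iterable, you can also collect it into a full string with an ordinary `for await` loop. A generic helper (not part of the Mem0 package), shown here with a mock stream:

```typescript
// Collect an async iterable of string chunks into one string.
async function collectStream(stream: AsyncIterable<string>): Promise<string> {
  let full = '';
  for await (const part of stream) {
    full += part;
  }
  return full;
}

// Works with any async iterable of strings, e.g. a mock stream:
async function* mockStream() {
  yield 'Hello, ';
  yield 'world!';
}
```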

### Generate Responses with Tool Calls

```ts
import { generateText, tool } from 'ai';
import { createMem0 } from '@mem0/vercel-ai-provider';
import { z } from 'zod';

const mem0 = createMem0({
  provider: 'anthropic',
  apiKey: 'anthropic-api-key',
  mem0Config: {
    // Global User ID
    user_id: 'borat',
  },
});

const prompt = 'What is the temperature in the city that I live in?';

const result = await generateText({
  model: mem0('claude-3-5-sonnet-20240620'),
  tools: {
    weather: tool({
      description: 'Get the weather in a location',
      parameters: z.object({
        location: z.string().describe('The location to get the weather for'),
      }),
      execute: async ({ location }) => ({
        location,
        temperature: 72 + Math.floor(Math.random() * 21) - 10,
      }),
    }),
  },
  prompt: prompt,
});

console.log(result);
```

### Get Sources from Memory

```ts
const { text, sources } = await generateText({
  model: mem0('gpt-4.1'),
  prompt: 'Suggest me a good car to buy!',
});

console.log(sources);
```

This same functionality is available in the `streamText` function.

## Supported LLM Providers

The Mem0 provider supports the following LLM providers:

| Provider  | Configuration Value |
| --------- | ------------------- |
| OpenAI    | `openai`            |
| Anthropic | `anthropic`         |
| Google    | `google`            |
| Groq      | `groq`              |
| Cohere    | `cohere`            |
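For example, to use Google instead of the default OpenAI provider (a configuration sketch; `GOOGLE_GENERATIVE_AI_API_KEY` is the AI SDK's conventional environment variable name):

```typescript
import { createMem0 } from '@mem0/vercel-ai-provider';

const mem0 = createMem0({
  provider: 'google',
  apiKey: process.env.GOOGLE_GENERATIVE_AI_API_KEY,
  mem0ApiKey: process.env.MEM0_API_KEY,
});
```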

## Best Practices

- **User Identification**: Use a unique `user_id` for consistent memory retrieval.
- **Memory Cleanup**: Regularly clean up unused memory data.
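For instance, a stable `user_id` can be assigned once per account and reused on every call so memories accumulate under one profile (a hypothetical helper; in production you would persist the mapping rather than keep it in memory):

```typescript
import { randomUUID } from 'node:crypto';

// In-memory mapping from account identifier to a stable Mem0 user_id.
const userIds = new Map<string, string>();

function getUserId(accountEmail: string): string {
  let id = userIds.get(accountEmail);
  if (!id) {
    id = randomUUID();
    userIds.set(accountEmail, id);
  }
  return id;
}
```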

<Note>
  We also support `agent_id`, `app_id`, and `run_id`. Refer to the
  [docs](https://docs.mem0.ai/api-reference/memory/add-memories).
</Note>

## Help

- For more details on Vercel AI SDK, visit the [Vercel AI SDK documentation](/docs/introduction).
- For the Mem0 platform, visit [app.mem0.ai](https://app.mem0.ai/).
- If you need further assistance, please feel free to reach out to us.

## References

- [Mem0 AI SDK Docs](https://docs.mem0.ai/integrations/vercel-ai-sdk#getting-started)
- [Mem0 documentation](https://docs.mem0.ai)

