Nest.js
You can use the AI SDK in a Nest.js server to generate and stream text and objects to the client.
Examples
The examples show how to implement a Nest.js controller that uses the AI SDK to stream text and objects to the client.
Full example: github.com/vercel/ai/examples/nest
UI Message Stream
You can use the `pipeUIMessageStreamToResponse` method to pipe the stream data to the server response.
app.controller.ts

```ts
import { Controller, Post, Res } from '@nestjs/common';
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';
import { Response } from 'express';

@Controller()
export class AppController {
  @Post('/')
  async root(@Res() res: Response) {
    const result = streamText({
      model: openai('gpt-4o'),
      prompt: 'Invent a new holiday and describe its traditions.',
    });

    result.pipeUIMessageStreamToResponse(res);
  }
}
```
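In a typical app you would consume this stream with the AI SDK's UI hooks, but for a quick sanity check you can read the streamed response body directly from a plain TypeScript client. This is a minimal sketch, not part of the AI SDK; it assumes the server above is listening on `http://localhost:3000`.

```typescript
// Minimal sketch for inspecting the raw streamed response.
// Assumes the Nest.js server above is running on http://localhost:3000.

// Reads a streamed response body and invokes onChunk for each decoded piece.
async function readStream(
  stream: ReadableStream<Uint8Array>,
  onChunk: (text: string) => void,
): Promise<void> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    onChunk(decoder.decode(value, { stream: true }));
  }
}

// Usage against a live server (call main() yourself; not executed here):
async function main() {
  const res = await fetch('http://localhost:3000/', { method: 'POST' });
  if (!res.body) throw new Error('response has no body');
  await readStream(res.body, chunk => console.log(chunk));
}
```

Each chunk arrives as it is produced, so `onChunk` fires incrementally rather than once at the end.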
Sending Custom Data
`createUIMessageStream` and `pipeUIMessageStreamToResponse` can be used to send custom data to the client.
app.controller.ts

```ts
import { Controller, Post, Res } from '@nestjs/common';
import { openai } from '@ai-sdk/openai';
import {
  createUIMessageStream,
  streamText,
  pipeUIMessageStreamToResponse,
} from 'ai';
import { Response } from 'express';

@Controller()
export class AppController {
  @Post('/stream-data')
  async streamData(@Res() response: Response) {
    const stream = createUIMessageStream({
      execute: ({ writer }) => {
        // write some data
        writer.write({ type: 'start' });

        writer.write({
          type: 'data-custom',
          data: {
            custom: 'Hello, world!',
          },
        });

        const result = streamText({
          model: openai('gpt-4o'),
          prompt: 'Invent a new holiday and describe its traditions.',
        });

        writer.merge(
          result.toUIMessageStream({
            sendStart: false,
            onError: error => {
              // Error messages are masked by default for security reasons.
              // If you want to expose the error message to the client,
              // you can do so here:
              return error instanceof Error ? error.message : String(error);
            },
          }),
        );
      },
    });

    pipeUIMessageStreamToResponse({ stream, response });
  }
}
```
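On the client, the custom parts written above surface alongside the generated text. As a rough illustration, you can pick out the custom parts by their `data-` type prefix. The part shape here is an assumption based on what the server writes (`{ type: 'data-custom', data: ... }`), not a verbatim SDK type, and `customDataParts` is a hypothetical helper.

```typescript
// Illustration only: the part shape below is an assumption based on what the
// server writes ({ type: 'data-custom', data: ... }), not a verbatim SDK type.
interface MessagePart {
  type: string;
  data?: unknown;
}

// Returns only the parts carrying custom data (types prefixed with 'data-').
function customDataParts(parts: MessagePart[]): MessagePart[] {
  return parts.filter(part => part.type.startsWith('data-'));
}

// Example: a mix of text and custom data parts.
const parts: MessagePart[] = [
  { type: 'text', data: 'Happy Holidays!' },
  { type: 'data-custom', data: { custom: 'Hello, world!' } },
];
// customDataParts(parts) keeps only the 'data-custom' part.
```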
Text Stream
You can use the `pipeTextStreamToResponse` method to get a text stream from the result and then pipe it to the response.
app.controller.ts

```ts
import { Controller, Post, Res } from '@nestjs/common';
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';
import { Response } from 'express';

@Controller()
export class AppController {
  @Post()
  async example(@Res() res: Response) {
    const result = streamText({
      model: openai('gpt-4o'),
      prompt: 'Invent a new holiday and describe its traditions.',
    });

    result.pipeTextStreamToResponse(res);
  }
}
```
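Because this endpoint streams plain text, a client needs nothing more than `fetch` and async iteration over the response body. A minimal sketch, again assuming the server above is listening on `http://localhost:3000`:

```typescript
// Minimal sketch: accumulate a plain text stream chunk by chunk.
// Assumes the Nest.js server above is listening on http://localhost:3000.

// Decodes a byte stream into a single string, chunk by chunk.
async function collectText(stream: ReadableStream<Uint8Array>): Promise<string> {
  const decoder = new TextDecoder();
  let text = '';
  // Node 18+ ReadableStreams are async-iterable; the cast satisfies
  // TypeScript's DOM-flavored stream types.
  for await (const chunk of stream as unknown as AsyncIterable<Uint8Array>) {
    text += decoder.decode(chunk, { stream: true });
  }
  return text;
}

// Usage against a live server (call main() yourself; not executed here):
async function main() {
  const res = await fetch('http://localhost:3000/', { method: 'POST' });
  if (!res.body) throw new Error('response has no body');
  console.log(await collectText(res.body));
}
```

Unlike the incremental reader shown earlier, this helper waits for the whole stream and returns the full text at once, which is convenient when you only need the final result.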
Troubleshooting
- Streaming not working when proxied