createAgentUIStreamResponse

The createAgentUIStreamResponse function executes an agent, converts its streaming output into a UI message stream, and returns an HTTP Response object whose body streams the UI messages live. It is designed for API routes that deliver real-time agent results, such as chat endpoints or streaming tool-use operations.

Import

import { createAgentUIStreamResponse } from 'ai';

Usage

import { ToolLoopAgent, createAgentUIStreamResponse } from 'ai';

const agent = new ToolLoopAgent({
  model: 'anthropic/claude-sonnet-4.5',
  instructions: 'You are a helpful assistant.',
  tools: { weather: weatherTool, calculator: calculatorTool },
});

export async function POST(request: Request) {
  const { messages } = await request.json();

  // Optional: support cancellation. Note that nothing aborts automatically here;
  // call abortController.abort() to stop streaming (e.g. on client disconnect).
  const abortController = new AbortController();

  return createAgentUIStreamResponse({
    agent,
    uiMessages: messages,
    abortSignal: abortController.signal, // optional
    // ...other UIMessageStreamOptions such as sendSources, includeUsage,
    // experimental_transform, etc.
  });
}
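Because the manual AbortController above only cancels when something calls abortController.abort(), a common alternative is to forward the request's own signal, in runtimes that abort `Request.signal` when the client disconnects (per the Fetch spec). The sketch below is self-contained: `fakeAgentStreamResponse` is a hypothetical stand-in for createAgentUIStreamResponse, since the signal wiring, not the library call, is the point.

```typescript
// Hypothetical stand-in for createAgentUIStreamResponse, used only to
// demonstrate forwarding an AbortSignal into a streaming Response.
function fakeAgentStreamResponse(opts: {
  uiMessages: unknown[];
  abortSignal?: AbortSignal;
}): Response {
  const encoder = new TextEncoder();
  const stream = new ReadableStream<Uint8Array>({
    pull(controller) {
      // Stop emitting as soon as the signal fires (e.g. client disconnect).
      if (opts.abortSignal?.aborted) {
        controller.close();
        return;
      }
      controller.enqueue(encoder.encode('data: chunk\n\n'));
      controller.close();
    },
  });
  return new Response(stream, {
    headers: { 'content-type': 'text/event-stream' },
  });
}

// Route handler: request.signal is aborted by the runtime on client
// disconnect, so it can be passed through directly.
export async function POST(request: Request): Promise<Response> {
  const { messages } = await request.json();
  return fakeAgentStreamResponse({
    uiMessages: messages,
    abortSignal: request.signal,
  });
}
```

With the real function, the handler body would be identical except for calling createAgentUIStreamResponse with the agent.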

Parameters

agent:

Agent
The agent instance to stream responses from. Must implement `.stream({ prompt, ... })` and define the `tools` property.

uiMessages:

unknown[]
Array of input UI messages provided to the agent (e.g., user and assistant messages).

abortSignal:

AbortSignal
Optional abort signal to cancel streaming, e.g., on client disconnect. This should be an [`AbortSignal`](https://developer.mozilla.org/en-US/docs/Web/API/AbortSignal) instance.

options:

CALL_OPTIONS
Optional call options forwarded to the agent, typed by the agent's `CALL_OPTIONS` generic parameter.

experimental_transform:

StreamTextTransform | StreamTextTransform[]
Optional stream transforms to post-process text output; these are the same transforms accepted by the lower-level streaming APIs.

...UIMessageStreamOptions:

UIMessageStreamOptions
Other UI message stream options, such as `sendSources` and `includeUsage`. See [`UIMessageStreamOptions`](/docs/reference/ai-sdk-core/ui-message-stream-options).

headers:

HeadersInit
Optional HTTP headers to include in the Response object.

status:

number
Optional HTTP status code.

statusText:

string
Optional HTTP status text.

consumeSseStream:

boolean
If true, the stream is consumed as server-sent events (SSE) rather than the default streaming format.

Returns

A Promise<Response> whose body streams the agent's UI message output. Return it directly from API/server handlers in serverless, Next.js, Express, Hono, or edge runtime contexts.

Example: Next.js API Route Handler

import { createAgentUIStreamResponse } from 'ai';
import { MyCustomAgent } from '@/agent/my-custom-agent';

export async function POST(request: Request) {
  const { messages } = await request.json();

  return createAgentUIStreamResponse({
    agent: MyCustomAgent,
    uiMessages: messages,
    sendSources: true, // optional
    includeUsage: true, // optional
    // headers, status, abortSignal, and other UIMessageStreamOptions are also supported
  });
}

How It Works

    1. UI Message Validation: Validates the incoming uiMessages array against the agent's specified tools and requirements.
    2. Model Message Conversion: Converts the validated UI messages into the internal model message format for the agent.
    3. Streaming Agent Output: Invokes the agent's .stream({ prompt, ... }) to obtain a stream of chunks (steps/UI messages).
    4. HTTP Response Creation: Wraps the output stream in a readable HTTP Response object that streams UI message chunks to the client.
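The four steps above can be sketched with standard Web APIs. This is an illustrative stand-in, not the library's implementation: every helper name is hypothetical, the validation and conversion are simplified, and the stream emits a single JSON-encoded chunk where the real agent loop would emit many.

```typescript
// Hypothetical sketch of the createAgentUIStreamResponse pipeline.
// None of these helpers come from the 'ai' package.

type UIMessage = { role: string; content: string };

// Step 1: validate incoming UI messages (stand-in for tool-aware validation).
function validateUIMessages(messages: unknown[]): UIMessage[] {
  return messages.map((raw) => {
    const msg = raw as Partial<UIMessage>;
    if (typeof msg.role !== 'string' || typeof msg.content !== 'string') {
      throw new Error('Invalid UI message');
    }
    return { role: msg.role, content: msg.content };
  });
}

// Step 2: convert UI messages into a model-facing prompt (simplified).
function toModelPrompt(messages: UIMessage[]): string {
  return messages.map((m) => `${m.role}: ${m.content}`).join('\n');
}

// Steps 3 and 4: stream agent output chunks and wrap them in a Response.
function sketchAgentStreamResponse(uiMessages: unknown[]): Response {
  const prompt = toModelPrompt(validateUIMessages(uiMessages));
  const encoder = new TextEncoder();
  const stream = new ReadableStream<Uint8Array>({
    start(controller) {
      // A real agent would emit text deltas, tool calls, step events, etc.
      const chunk = { type: 'text-delta', delta: `echo: ${prompt}` };
      controller.enqueue(encoder.encode(`data: ${JSON.stringify(chunk)}\n\n`));
      controller.close();
    },
  });
  return new Response(stream, {
    headers: { 'content-type': 'text/event-stream' },
  });
}
```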

Notes

  • Your agent must implement .stream({ prompt, ... }) and define a tools property (even if it's just {}) to work with this function.
  • Server Only: This API should only be called in backend/server-side contexts (API routes, edge/serverless/server route handlers, etc.). Not for browser use.
  • Additional options (headers, status, UI stream options, transforms, etc.) are available for advanced scenarios.
  • This API is built on ReadableStream, so your platform and client must support consuming streamed HTTP responses.
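To make the last note concrete, a generic client-side reader loop for such a response looks like the sketch below. Client libraries typically handle this for you; the point is only that the consumer must be able to read the body incrementally rather than waiting for a complete payload.

```typescript
// Generic sketch of consuming a streamed HTTP response incrementally.
// `readStream` is an illustrative helper, not an SDK export.
async function readStream(
  res: Response,
  onChunk: (text: string) => void,
): Promise<void> {
  if (!res.body) throw new Error('Response has no body');
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  for (;;) {
    // Each read resolves as soon as the server flushes another chunk.
    const { done, value } = await reader.read();
    if (done) break;
    onChunk(decoder.decode(value, { stream: true }));
  }
}
```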

See Also