Cedar enables streaming responses by default in your chat interface. Responses appear in real-time as they’re generated. You can disable streaming by setting the stream prop to false on any chat component or ChatInput.
Partial Object Streaming Not Supported: Cedar currently doesn't support partial object streaming. For structured data to be handled by Cedar's automatic handlers, stream only complete objects, or turn streaming off.

Quick Start

With Pre-built Components

Streaming is enabled by default for all pre-built chat components:
import { FloatingCedarChat } from 'cedar-os-components';

function App() {
	return (
		<FloatingCedarChat
			// Streaming is enabled by default
			side='right'
			title='Streaming Assistant'
		/>
	);
}
To disable streaming, set stream={false}:
<FloatingCedarChat
	stream={false} // Disable streaming
	side='right'
	title='Non-Streaming Assistant'
/>

With Individual Components

The ChatInput component has streaming enabled by default, but you can override it:
import { ChatInput, ChatBubbles } from 'cedar-os';

function CustomChat() {
	return (
		<div className='flex flex-col h-full'>
			<div className='flex-1'>
				<ChatBubbles />
			</div>
			<ChatInput
				stream={false} // Override to disable streaming
				position='embedded'
			/>
		</div>
	);
}

How It Works

When streaming is enabled:
  1. Real-time Updates: Text streams in character by character as it’s generated
  2. Flexible Object Streaming: Structured objects and custom events are handled out of the box (see Custom Message Rendering)
  3. Smooth Animation: Built-in typewriter effect for natural reading experience
  4. Error Handling: Graceful fallbacks if streaming fails

Additional Configuration Required by Provider

  • OpenAI and AI SDK: Streaming is supported out of the box
  • Mastra and Custom backends: The backend must send a data-only SSE stream containing either streamed text or complete objects
Server-Sent Events (SSE) is a one-way streaming protocol that delivers a sequence of messages over a single HTTP connection. The stream mixes plain text and structured JSON, sent as newline-delimited chunks prefixed with data:. Under the hood, the server emits:

Text chunks for incremental message rendering
→ data: Hello, how can I help you?

JSON objects for structured data or tool outputs
→ data: {"type":"setState","command":"search","query":"nearest cafe"}

Our client handlers parse each data: line as it arrives and handle the parsed text or JSON accordingly. This enables real-time, mixed-format updates with minimal overhead.
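As a rough illustration of that client-side logic (not Cedar's actual implementation; the function and callback names here are made up for this sketch), a data-only stream can be consumed like this:

// Simplified sketch of parsing a data-only SSE stream on the client.
// consumeDataOnlyStream, onText, and onObject are illustrative names, not Cedar APIs.
export async function consumeDataOnlyStream(
	response: Response,
	onText: (text: string) => void,
	onObject: (obj: unknown) => void
): Promise<void> {
	const reader = response.body!.getReader();
	const decoder = new TextDecoder();
	let buffer = '';

	while (true) {
		const { done, value } = await reader.read();
		if (done) break;
		buffer += decoder.decode(value, { stream: true });

		// SSE events are separated by a blank line
		const events = buffer.split('\n\n');
		buffer = events.pop() ?? ''; // keep any trailing partial event for the next read

		for (const event of events) {
			for (const line of event.split('\n')) {
				if (!line.startsWith('data:')) continue; // skip non-data lines like "event: done"
				// Per the SSE spec, a single space after "data:" is optional
				const raw = line.slice('data:'.length);
				const payload = raw.startsWith(' ') ? raw.slice(1) : raw;
				if (!payload) continue; // empty data marks completion
				if (payload.startsWith('{')) {
					onObject(JSON.parse(payload)); // structured JSON event
				} else {
					onText(payload.replace(/\\n/g, '\n')); // restore escaped newlines in text chunks
				}
			}
		}
	}
}

Distinguishing JSON from text by the leading brace is a simplification, but it matches the two chunk shapes shown above.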
Feel free to drop the following handlers into your backend to take care of Cedar-OS-compatible streaming.

1. Example Usage:
// Return statement of /handleStream
return createSSEStream(async (controller) => {
	// Handle application execution logic with controller variable available

	// Stream text responses from your agent
	const streamResult = await agent.stream({
		prompt,
		temperature,
		maxTokens,
		systemPrompt,
	});

	// Stream the text result using the helper function
	await handleTextStream(streamResult, controller);

	// Example: Stream a tool call result as structured JSON
	const toolCallResult = {
		type: 'tool_call',
		tool: 'plant_seed',
		result: {
			status: 'success',
			message: 'A new Cedar tree has been planted <3',
		},
		timestamp: new Date().toISOString(),
	};

	// Stream the JSON event using the helper function
	streamJSONEvent(controller, toolCallResult);
});

// StreamTextResult is the stream result type from the AI SDK ('ai' package)
import type { StreamTextResult } from 'ai';

/**
 * Handles streaming of text chunks using data-only format.
 * Pass your agent's stream result and the controller to stream text chunks to frontend.
 */
export async function handleTextStream(
	streamResult: StreamTextResult<any, any>,
	streamController: ReadableStreamDefaultController<Uint8Array>
): Promise<string> {
	const encoder = new TextEncoder();
	const chunks: string[] = [];

	// Stream raw text chunks through data field
	for await (const chunk of streamResult.textStream) {
		chunks.push(chunk);
		// Escape literal newlines for SSE compliance
		const escaped = chunk.replace(/\n/g, '\\n');
		streamController.enqueue(encoder.encode(`data:${escaped}\n\n`));
	}

	return chunks.join('');
}

/**
 * Emit any JSON object as a data event.
 * Use this to stream structured data like tool results, progress updates, or custom events.
 */
export function streamJSONEvent<T>(
	controller: ReadableStreamDefaultController<Uint8Array>,
	eventData: T
) {
	const encoder = new TextEncoder();
	controller.enqueue(encoder.encode('data: '));
	controller.enqueue(encoder.encode(`${JSON.stringify(eventData)}\n\n`));
}
2. Core SSE Stream Creator:
/**
 * Creates a Server-Sent Events stream response.
 * Pass a callback function that receives a controller for streaming data.
 * Use this as your main endpoint return value for streaming responses.
 */
export function createSSEStream(
	cb: (controller: ReadableStreamDefaultController<Uint8Array>) => Promise<void>
): Response {
	const encoder = new TextEncoder();

	const stream = new ReadableStream<Uint8Array>({
		async start(controller) {
			try {
				// Execute your application logic with streaming controller
				await cb(controller);

				// Signal completion with empty data
				controller.enqueue(encoder.encode('event: done\n'));
				controller.enqueue(encoder.encode('data:\n\n'));
			} catch (err) {
				console.error('Error during SSE stream', err);

				const message = err instanceof Error ? err.message : 'Internal error';
				controller.enqueue(encoder.encode('data: '));
				controller.enqueue(
					encoder.encode(`${JSON.stringify({ type: 'error', message })}\n\n`)
				);
			} finally {
				controller.close();
			}
		},
	});

	return new Response(stream, {
		headers: {
			'Content-Type': 'text/event-stream',
			'Cache-Control': 'no-cache',
			Connection: 'keep-alive',
		},
	});
}
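
Putting the pieces together, a minimal route handler can return createSSEStream directly as the endpoint response. The sketch below uses a Next.js App Router-style POST handler as one example; the import paths and the agent object are placeholders for your own project, not Cedar or Mastra APIs:

// Illustrative endpoint wiring; adapt to your framework of choice.
import { createSSEStream, handleTextStream, streamJSONEvent } from '@/lib/streamUtils'; // placeholder path
import { agent } from '@/lib/agent'; // placeholder for your own agent setup

export async function POST(req: Request): Promise<Response> {
	const { prompt, temperature, maxTokens, systemPrompt } = await req.json();

	return createSSEStream(async (controller) => {
		// Stream the agent's text output to the client as it is generated
		const streamResult = await agent.stream({ prompt, temperature, maxTokens, systemPrompt });
		await handleTextStream(streamResult, controller);

		// Optionally follow up with a structured JSON event your frontend is set up to handle
		streamJSONEvent(controller, {
			type: 'tool_call',
			tool: 'plant_seed',
			result: { status: 'success' },
		});
	});
}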

Next Steps