# SSE Streaming

Server-Sent Events for long-running operations.
For long-running operations, enable streaming on a command. The handler receives an `emit()` function on the context to send progressive chunks via Server-Sent Events.
## Defining a Streaming Command
Set `stream: true` on the command definition. Inside the handler, use `ctx.emit()` to push chunks to the client; the return value becomes the final `done` event.
```typescript
const surf = await createSurf({
  name: 'AI API',
  commands: {
    generate: {
      description: 'Generate text with AI',
      stream: true,
      params: {
        prompt: { type: 'string', required: true },
        maxTokens: { type: 'number', default: 500 },
      },
      run: async ({ prompt, maxTokens }, ctx) => {
        const response = ai.stream(prompt, { maxTokens })
        let totalTokens = 0

        for await (const chunk of response) {
          totalTokens += chunk.tokens
          // Each emit() sends an SSE "chunk" event
          ctx.emit!({ text: chunk.text, tokens: chunk.tokens })
        }

        // Return value is sent as the final "done" event
        return { finished: true, totalTokens }
      },
    },
  },
})
```

## Client-Side SSE
When the client sends `stream: true` in the execute request, the response is an SSE stream instead of a single JSON body. Each chunk follows the `StreamChunk` protocol:
```
// Request streaming execution
POST /surf/execute
Content-Type: application/json

{ "command": "generate", "params": { "prompt": "Explain SSE" }, "stream": true }

// SSE response (Content-Type: text/event-stream):
data: { "type": "chunk", "data": { "text": "Server-Sent", "tokens": 2 } }

data: { "type": "chunk", "data": { "text": " Events are", "tokens": 3 } }

data: { "type": "chunk", "data": { "text": " a standard...", "tokens": 4 } }

data: { "type": "done", "result": { "finished": true, "totalTokens": 9 } }
```

## Consuming Streams with the Client SDK
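For consumers that cannot use the client SDK, the wire format above can be read with plain `fetch`: split the body on blank lines and JSON-parse each `data:` payload. A minimal sketch, assuming a Node 18+ (or browser) environment with global `fetch`; `streamExecute` and `parseSSELine` are illustrative helper names, not part of the SDK, and error handling is omitted:

```typescript
// Parse one SSE "data:" payload into a chunk object; null for non-data lines.
function parseSSELine(line: string): { type: string; [key: string]: unknown } | null {
  if (!line.startsWith('data:')) return null
  return JSON.parse(line.slice('data:'.length).trim())
}

// Consume a streaming execute response without the SDK (sketch).
async function streamExecute(baseUrl: string, command: string, params: object) {
  const res = await fetch(`${baseUrl}/surf/execute`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ command, params, stream: true }),
  })
  const reader = res.body!.getReader()
  const decoder = new TextDecoder()
  let buffer = ''
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    buffer += decoder.decode(value, { stream: true })
    // SSE events are separated by a blank line
    const events = buffer.split('\n\n')
    buffer = events.pop()! // keep any incomplete trailing event in the buffer
    for (const event of events) {
      const chunk = parseSSELine(event)
      if (chunk?.type === 'chunk') process.stdout.write(String((chunk.data as any).text))
      if (chunk?.type === 'done') return chunk.result
    }
  }
}
```

This is essentially what the SDK's stream handling does for you, plus reconnection and error handling.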
The `@surfjs/client` SDK handles SSE parsing automatically. Use `executeStream()` to get an async iterator of chunks:
```typescript
import { SurfClient } from '@surfjs/client'

const client = await SurfClient.discover('https://ai.example.com')

// Stream chunks as they arrive
const stream = client.executeStream('generate', { prompt: 'Hello world' })

for await (const chunk of stream) {
  if (chunk.type === 'chunk') {
    process.stdout.write(chunk.data.text)
  }
  if (chunk.type === 'done') {
    console.log('\nDone:', chunk.result)
  }
}
```

## Pipeline Streaming
Streaming also works inside pipelines. If a pipeline step targets a streaming command and the pipeline request includes `stream: true`, chunks from that step are emitted in real time as they arrive; the remaining steps execute normally after the stream completes.
```
// Pipeline with a streaming step
POST /surf/pipeline

{
  "steps": [
    { "command": "search", "params": { "query": "SSE tutorial" }, "as": "results" },
    { "command": "generate", "params": { "prompt": "$prev.results[0].summary" } }
  ],
  "stream": true
}

// Step 1 executes normally, step 2 streams:
data: { "type": "step", "index": 0, "result": { "results": [...] } }

data: { "type": "chunk", "index": 1, "data": { "text": "Server-Sent..." } }

data: { "type": "chunk", "index": 1, "data": { "text": " Events..." } }

data: { "type": "done", "results": [...] }
```

💡 **Tip:** SSE activates only when both the command definition sets `stream: true` and the client request includes `stream: true`. If `stream: true` is sent for a non-streaming command, the command executes normally and returns a standard JSON response.
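The chunk shapes used throughout this page can be modeled as a discriminated union, which lets TypeScript narrow on the `type` field in a consumer loop. A sketch based on the events shown above; the type and function names are illustrative, not published SDK types:

```typescript
// Discriminated union over the SSE event shapes shown on this page.
type StreamChunk =
  | { type: 'chunk'; data: { text: string; tokens?: number }; index?: number } // progressive output
  | { type: 'step'; index: number; result: unknown }                           // completed pipeline step
  | { type: 'done'; result?: unknown; results?: unknown[] }                    // final event

// Narrowing on `type` gives access to the variant-specific fields.
function describe(chunk: StreamChunk): string {
  switch (chunk.type) {
    case 'chunk':
      return `text: ${chunk.data.text}`
    case 'step':
      return `step ${chunk.index} finished`
    case 'done':
      return 'stream complete'
  }
}
```

Because the switch is exhaustive over the union, the compiler will flag any event type a consumer forgets to handle.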