Real-Time Data
Real-time features let the server push updates to clients without waiting for the next normal page/API refresh.
Typical examples:
- Chat messages
- Live dashboards (stocks, sensors)
- Notifications
- Multiplayer collaboration
The Core Problem
HTTP request/response is usually client-driven: the client sends a request, and the server responds.
For real-time updates, we want server-driven delivery: the server sends new data to the client as soon as it is available.
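The contrast can be sketched as pull vs. push with a toy in-memory example (all names here are illustrative, not part of any API used later):

```js
// Client-driven (pull): the client asks, and only sees data at ask time.
const state = { latest: null };
const pull = () => state.latest;

// Server-driven (push): the server notifies subscribers the moment data changes.
const subscribers = [];
const subscribe = (fn) => subscribers.push(fn);
const publish = (data) => subscribers.forEach((fn) => fn(data));

const received = [];
subscribe((data) => received.push(data));

state.latest = 'tick 1';
publish('tick 1'); // delivered immediately — no polling gap

console.log(pull());   // 'tick 1'
console.log(received); // [ 'tick 1' ]
```

Long polling, SSE, and WebSockets are three ways to get this push behavior over real networks, each with different trade-offs.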
Long Polling
With long polling, the client sends a request and the server holds it open until there is new data (or a timeout); the client then immediately starts a new request.
- Pros: Works almost everywhere, simple mental model.
- Cons: More overhead (many HTTP requests), can get expensive at scale.
Where It’s Used
- When you need near-real-time updates but can’t rely on WebSockets (network restrictions, old infrastructure)
- Simple bots / integrations where “poll for updates” is acceptable
Industry Examples (Companies)
- Telegram: Bot API supports long polling to receive updates
Express Integration Example
This is a minimal pattern: keep a list of “waiting” HTTP responses and complete them when an update happens.
Server (Express):
```js
import express from 'express';

const app = express();
app.use(express.json());

// For demo purposes we keep waiting clients in memory.
// In real apps, updates usually come from a DB change, queue, or pub/sub.
let waiting = [];

app.get('/updates', (req, res) => {
  // Hold the request open.
  req.setTimeout(30_000);

  // If no update arrives, return a "no update" response.
  const timeoutId = setTimeout(() => {
    waiting = waiting.filter((entry) => entry.res !== res);
    res.status(204).end();
  }, 25_000);

  waiting.push({ res, timeoutId });
});

// Demo endpoint to trigger an update.
app.post('/publish', (req, res) => {
  const update = {
    message: req.body?.message ?? 'Hello from server',
    at: new Date().toISOString(),
  };

  const clients = waiting;
  waiting = [];

  for (const { res: clientRes, timeoutId } of clients) {
    clearTimeout(timeoutId);
    clientRes.json(update);
  }

  res.status(202).json({ deliveredTo: clients.length });
});

app.listen(3000, () => console.log('http://localhost:3000'));
```

Client (browser):

```js
async function pollForever() {
  while (true) {
    const res = await fetch('/updates', { cache: 'no-store' });
    if (res.status === 204) continue; // no update, immediately poll again
    const data = await res.json();
    console.log('update', data);
  }
}

pollForever();
```

Server-Sent Events (SSE)
SSE is a one-way stream: server → client over a single long-lived HTTP connection.
- Pros: Great for streams of updates (dashboards, notifications), simpler than WebSockets.
- Cons: One-way only; client still needs normal HTTP requests to send data.
Where It’s Used
- Live dashboards and notification feeds
- Streaming text responses (server continuously sends chunks)
Industry Examples (Companies)
- OpenAI: streaming API responses commonly use SSE
Express Integration Example
SSE is just an HTTP response that stays open and streams text lines.
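Concretely, each SSE message is a small block of text lines followed by a blank line. A tiny helper (illustrative only, not a standard API) can format one:

```js
// Format one SSE message: optional event name, then data, then a blank line.
function sseFrame({ event, data }) {
  const lines = [];
  if (event) lines.push(`event: ${event}`);
  // Multi-line data becomes multiple "data:" lines, per the SSE format.
  for (const part of String(data).split('\n')) lines.push(`data: ${part}`);
  return lines.join('\n') + '\n\n';
}

console.log(JSON.stringify(sseFrame({ event: 'message', data: 'hello' })));
// "event: message\ndata: hello\n\n"
```

The trailing blank line is what tells the browser the message is complete.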
Server (Express):
```js
import express from 'express';

const app = express();
app.use(express.json());

const clients = new Set();

app.get('/events', (req, res) => {
  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Connection', 'keep-alive');
  res.flushHeaders?.();

  // Optional: an initial message so the client knows it's connected.
  res.write(`event: ready\ndata: ${JSON.stringify({ ok: true })}\n\n`);

  clients.add(res);

  const heartbeat = setInterval(() => {
    // Comment line keeps some proxies from timing out the connection.
    res.write(': keep-alive\n\n');
  }, 15_000);

  req.on('close', () => {
    clearInterval(heartbeat);
    clients.delete(res);
  });
});

// Demo endpoint to broadcast an event.
app.post('/publish', (req, res) => {
  const payload = {
    message: req.body?.message ?? 'Hello SSE',
    at: new Date().toISOString(),
  };

  for (const clientRes of clients) {
    clientRes.write(`event: message\ndata: ${JSON.stringify(payload)}\n\n`);
  }

  res.status(202).json({ deliveredTo: clients.size });
});

app.listen(3000, () => console.log('http://localhost:3000'));
```

Client (browser):

```js
const events = new EventSource('/events');

events.addEventListener('message', (e) => {
  console.log('message', JSON.parse(e.data));
});
```

WebSockets
WebSockets upgrade an HTTP connection into a persistent, two-way channel.
- Pros: Best for interactive real-time (chat, collaboration), low latency.
- Cons: More complex to deploy (proxies/load balancers), requires careful auth and scaling.
Where It’s Used
- Chat and messaging
- Multiplayer / collaborative apps (shared cursors, live edits)
- Real-time trading / live data feeds
Industry Examples (Companies)
- Slack: real-time messaging and updates rely on persistent connections
- Discord: real-time chat and presence rely on persistent connections
Express Integration Example
Express doesn’t handle WebSockets by itself. A common approach is to use Express for HTTP routes and attach a WebSocket server to the same underlying HTTP server.
Server (Express + ws):
```js
import express from 'express';
import http from 'http';
import { WebSocketServer } from 'ws';

const app = express();
const server = http.createServer(app);

app.get('/health', (req, res) => res.json({ ok: true }));

const wss = new WebSocketServer({ server });

wss.on('connection', (socket) => {
  socket.send(
    JSON.stringify({ type: 'welcome', at: new Date().toISOString() })
  );

  socket.on('message', (raw) => {
    // Broadcast to everyone (simple chat-style demo)
    for (const client of wss.clients) {
      if (client.readyState === 1) client.send(raw);
    }
  });
});

server.listen(3000, () => console.log('http://localhost:3000'));
```

Client (browser):

```js
const ws = new WebSocket('ws://localhost:3000');

ws.addEventListener('open', () => {
  ws.send(JSON.stringify({ type: 'chat', text: 'hello' }));
});

ws.addEventListener('message', (e) => {
  console.log('ws message', e.data);
});
```

Choosing the Right Approach
- Long polling: the simplest fallback when you can’t keep a connection open.
- SSE: best when the server mostly pushes updates and clients rarely send messages.
- WebSockets: best when both sides frequently send messages.
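The trade-offs above can be summarized in a small decision helper. This is a hypothetical sketch, not an official API; the function name and criteria are our own:

```js
// Hypothetical helper summarizing the trade-offs above.
function chooseTransport({ bidirectional, serverPushOnly, restrictedNetwork } = {}) {
  if (restrictedNetwork) return 'long-polling'; // plain HTTP works almost everywhere
  if (bidirectional) return 'websocket';        // both sides send frequently
  if (serverPushOnly) return 'sse';             // server streams, client rarely sends
  return 'plain-http';                          // no real-time requirement
}

console.log(chooseTransport({ bidirectional: true }));     // 'websocket'
console.log(chooseTransport({ serverPushOnly: true }));    // 'sse'
console.log(chooseTransport({ restrictedNetwork: true })); // 'long-polling'
```

In practice the decision also depends on infrastructure (proxies, load balancers), but this captures the first-order rule of thumb.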
HTTP Streaming (Chunked Responses)
“HTTP streaming” usually means the server returns a normal HTTP response, but instead of sending it all at once, it streams the body in chunks over time.
This is the idea behind many “streaming AI responses” and large downloads:
- The client makes one request.
- The server starts responding immediately.
- More data arrives over time until the server ends the response.
How This Relates to SSE
- SSE is a specific format for streaming over HTTP (text/event-stream) with named events and automatic reconnection support in the browser (EventSource).
- HTTP streaming is more general: you can stream plain text, JSON lines, or any custom format.

Common streaming formats:

- Plain text chunks (text/plain): simplest for incremental text.
- NDJSON / JSON Lines (application/x-ndjson): one JSON object per line (good for parsing incrementally).
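Incremental parsing is the point of NDJSON: because chunk boundaries can fall anywhere, the client buffers until each newline and parses one line at a time. A minimal sketch (the function names here are our own):

```js
// Minimal incremental NDJSON parser: buffer partial lines across chunks,
// parse each completed line as JSON.
function createNdjsonParser(onObject) {
  let buffer = '';
  return function push(chunk) {
    buffer += chunk;
    let newline;
    while ((newline = buffer.indexOf('\n')) !== -1) {
      const line = buffer.slice(0, newline).trim();
      buffer = buffer.slice(newline + 1);
      if (line) onObject(JSON.parse(line));
    }
  };
}

// Chunks can split a JSON object anywhere; the parser still recovers it.
const objects = [];
const push = createNdjsonParser((obj) => objects.push(obj));
push('{"id":1}\n{"id'); // first object complete, second still partial
push('":2}\n');         // second object completes
console.log(objects);   // [ { id: 1 }, { id: 2 } ]
```

The same buffering idea works on the server’s reader loop shown later with fetch streaming.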
When It’s Used
- Streaming logs or progress updates
- Streaming long-running job results as they’re produced
- Token-by-token / chunk-by-chunk text generation
Key Practical Notes
Section titled “Key Practical Notes”- Many reverse proxies buffer responses by default. If you expect “live” streaming, you often need to disable buffering at the proxy/load balancer level.
- Keep-alives and timeouts matter: long streams can be cut off by infrastructure if you don’t send data periodically.
- Streaming is still “HTTP”: it’s great for server → client data, but doesn’t replace bi-directional protocols like WebSockets.
Express Integration Example (Text Chunks)
Server (Express):
```js
import express from 'express';

const app = express();

app.get('/stream', async (req, res) => {
  res.setHeader('Content-Type', 'text/plain; charset=utf-8');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Connection', 'keep-alive');
  res.flushHeaders?.();

  for (let i = 1; i <= 5; i++) {
    res.write(`chunk ${i}\n`);
    await new Promise((r) => setTimeout(r, 500));
  }

  res.end('done\n');
});

app.listen(3000, () => console.log('http://localhost:3000'));
```

Client (browser using fetch streaming):

```js
const res = await fetch('/stream');
const reader = res.body.getReader();
const decoder = new TextDecoder();

while (true) {
  const { value, done } = await reader.read();
  if (done) break;
  console.log(decoder.decode(value, { stream: true }));
}
```

Security Notes
- Use TLS (https:// and wss://) so data is encrypted in transit.
- Authenticate the connection (e.g., cookie session or token) and enforce authorization on every message/channel.
- Validate input on real-time channels the same way you validate REST inputs.
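For instance, an incoming WebSocket message should be parsed and validated before acting on it, exactly as you would validate a REST body. A minimal sketch; the message schema here is made up for illustration:

```js
// Validate a chat-style message; return null for anything malformed.
function parseChatMessage(raw) {
  let data;
  try {
    data = JSON.parse(raw);
  } catch {
    return null; // not JSON at all
  }
  if (typeof data !== 'object' || data === null) return null;
  if (data.type !== 'chat') return null;
  if (typeof data.text !== 'string' || data.text.length === 0 || data.text.length > 2000) {
    return null;
  }
  // Return only the fields you expect, never the raw object.
  return { type: 'chat', text: data.text };
}

console.log(parseChatMessage('{"type":"chat","text":"hello"}')); // { type: 'chat', text: 'hello' }
console.log(parseChatMessage('not json'));                       // null
```

Rejecting invalid messages (and dropping unknown fields) before broadcasting prevents one client from injecting arbitrary payloads into everyone else’s stream.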