
Real-Time Data


Real-time features let the server push updates to clients without waiting for the next normal page/API refresh.

Typical examples:

  • Chat messages
  • Live dashboards (stocks, sensors)
  • Notifications
  • Multiplayer collaboration

HTTP request/response is usually client-driven:

[Diagram: client-driven request/response cycle]

For real-time updates, we want server-driven delivery:

[Diagram: server pushing updates as they happen]

Long Polling

With long polling, the client sends a request and the server holds it open until there is new data (or a timeout); the client then immediately starts a new request.

  • Pros: works almost everywhere; simple mental model.
  • Cons: more overhead (many HTTP requests); can get expensive at scale.

When to use it:

  • You need near-real-time updates but can’t rely on WebSockets (network restrictions, old infrastructure).
  • Simple bots / integrations where “poll for updates” is acceptable.
  • Real-world example: Telegram’s Bot API supports long polling to receive updates.

[Diagram: long polling request/hold/respond cycle]

This is a minimal pattern: keep a list of “waiting” HTTP responses and complete them when an update happens.

Server (Express):

import express from 'express';

const app = express();
app.use(express.json());

// For demo purposes we keep waiting clients in memory.
// In real apps, updates usually come from a DB change, queue, or pub/sub.
let waiting = [];

app.get('/updates', (req, res) => {
  // Hold the request open.
  req.setTimeout(30_000);

  // If no update arrives, return a "no update" response.
  const timeoutId = setTimeout(() => {
    waiting = waiting.filter((entry) => entry.res !== res);
    res.status(204).end();
  }, 25_000);

  waiting.push({ res, timeoutId });

  // If the client goes away, stop tracking it (otherwise entries leak
  // until the timeout fires).
  req.on('close', () => {
    clearTimeout(timeoutId);
    waiting = waiting.filter((entry) => entry.res !== res);
  });
});

// Demo endpoint to trigger an update.
app.post('/publish', (req, res) => {
  const update = {
    message: req.body?.message ?? 'Hello from server',
    at: new Date().toISOString(),
  };

  const clients = waiting;
  waiting = [];

  for (const { res: clientRes, timeoutId } of clients) {
    clearTimeout(timeoutId);
    clientRes.json(update);
  }

  res.status(202).json({ deliveredTo: clients.length });
});

app.listen(3000, () => console.log('http://localhost:3000'));

Client (browser):

async function pollForever() {
  while (true) {
    try {
      const res = await fetch('/updates', { cache: 'no-store' });
      if (res.status === 204) continue; // no update, immediately poll again
      const data = await res.json();
      console.log('update', data);
    } catch (err) {
      // Network error: wait briefly before retrying instead of hot-looping.
      console.error('poll failed, retrying', err);
      await new Promise((r) => setTimeout(r, 1000));
    }
  }
}

pollForever();

Server-Sent Events (SSE)

SSE is a one-way stream: server → client over a single long-lived HTTP connection.

  • Pros: great for streams of updates (dashboards, notifications); simpler than WebSockets.
  • Cons: one-way only; the client still needs normal HTTP requests to send data.

When to use it:

  • Live dashboards and notification feeds.
  • Streaming text responses (the server continuously sends chunks).
  • Real-world example: OpenAI’s streaming API responses commonly use SSE.

[Diagram: SSE event stream over one connection]

SSE is just an HTTP response that stays open and streams text lines.
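The wire format is simple enough to build by hand. As a sketch (formatSseEvent is an illustrative helper, not part of Express or the SSE spec), each event is optional `event:`/`id:` lines, one `data:` line per line of payload, and a blank line that terminates the frame:

```javascript
// Illustrative helper: build a text/event-stream frame by hand.
function formatSseEvent({ event, id, data }) {
  let frame = '';
  if (event) frame += `event: ${event}\n`;
  if (id) frame += `id: ${id}\n`;
  // Multi-line payloads become multiple data: lines; the browser's
  // EventSource rejoins them with "\n".
  for (const line of String(data).split('\n')) {
    frame += `data: ${line}\n`;
  }
  return frame + '\n'; // blank line ends the event
}

console.log(formatSseEvent({ event: 'message', data: JSON.stringify({ ok: true }) }));
// event: message
// data: {"ok":true}
```

This is exactly the string the server below writes with `res.write(...)` for each connected client.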

Server (Express):

import express from 'express';

const app = express();
app.use(express.json());

const clients = new Set();

app.get('/events', (req, res) => {
  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Connection', 'keep-alive');
  res.flushHeaders?.();

  // Optional: an initial message so the client knows it's connected.
  res.write(`event: ready\ndata: ${JSON.stringify({ ok: true })}\n\n`);
  clients.add(res);

  const heartbeat = setInterval(() => {
    // Comment line keeps some proxies from timing out the connection.
    res.write(': keep-alive\n\n');
  }, 15_000);

  req.on('close', () => {
    clearInterval(heartbeat);
    clients.delete(res);
  });
});

// Demo endpoint to broadcast an event.
app.post('/publish', (req, res) => {
  const payload = {
    message: req.body?.message ?? 'Hello SSE',
    at: new Date().toISOString(),
  };

  for (const clientRes of clients) {
    clientRes.write(`event: message\ndata: ${JSON.stringify(payload)}\n\n`);
  }

  res.status(202).json({ deliveredTo: clients.size });
});

app.listen(3000, () => console.log('http://localhost:3000'));

Client (browser):

const events = new EventSource('/events');

events.addEventListener('message', (e) => {
  console.log('message', JSON.parse(e.data));
});

WebSockets

WebSockets upgrade an HTTP connection into a persistent, two-way channel.

  • Pros: best for interactive real-time (chat, collaboration); low latency.
  • Cons: more complex to deploy (proxies/load balancers); requires careful auth and scaling.

When to use it:

  • Chat and messaging.
  • Multiplayer / collaborative apps (shared cursors, live edits).
  • Real-time trading / live data feeds.
  • Real-world examples: Slack and Discord both rely on persistent connections for real-time messaging and presence.

[Diagram: bidirectional messages over one WebSocket connection]

Express doesn’t “do” WebSockets by itself. A common approach is to use Express for HTTP routes and attach a WebSocket server to the same underlying HTTP server.

Server (Express + ws):

import express from 'express';
import http from 'http';
import { WebSocketServer, WebSocket } from 'ws';

const app = express();
const server = http.createServer(app);

app.get('/health', (req, res) => res.json({ ok: true }));

const wss = new WebSocketServer({ server });

wss.on('connection', (socket) => {
  socket.send(
    JSON.stringify({ type: 'welcome', at: new Date().toISOString() })
  );

  socket.on('message', (raw, isBinary) => {
    // Broadcast to everyone (simple chat-style demo). Passing isBinary
    // through keeps text frames as text instead of rebroadcasting them
    // as binary.
    for (const client of wss.clients) {
      if (client.readyState === WebSocket.OPEN) {
        client.send(raw, { binary: isBinary });
      }
    }
  });
});

server.listen(3000, () => console.log('http://localhost:3000'));

Client (browser):

const ws = new WebSocket('ws://localhost:3000');

ws.addEventListener('open', () => {
  ws.send(JSON.stringify({ type: 'chat', text: 'hello' }));
});

ws.addEventListener('message', (e) => {
  console.log('ws message', e.data);
});
Choosing an Approach

  • Long polling: simplest fallback when you can’t keep a connection open.
  • SSE: best when the server mostly pushes updates and clients rarely send messages.
  • WebSockets: best when both sides frequently send messages.

HTTP Streaming

“HTTP streaming” usually means the server returns a normal HTTP response, but instead of sending it all at once, it streams the body in chunks over time.

This is the idea behind many “streaming AI responses” and large downloads:

  • The client makes one request.
  • The server starts responding immediately.
  • More data arrives over time until the server ends the response.

How it relates to SSE:

  • SSE is a specific format for streaming over HTTP (text/event-stream) with named events and automatic reconnection support in the browser (EventSource).
  • HTTP streaming is more general: you can stream plain text, JSON lines, or any custom format.

Common streaming formats:

  • Plain text chunks (text/plain): simplest for incremental text.
  • NDJSON / JSON Lines (application/x-ndjson): one JSON object per line (good for parsing incrementally).

Typical use cases:

  • Streaming logs or progress updates.
  • Streaming long-running job results as they’re produced.
  • Token-by-token / chunk-by-chunk text generation.

Gotchas:

  • Many reverse proxies buffer responses by default. If you expect “live” streaming, you often need to disable buffering at the proxy/load balancer level.
  • Keep-alives and timeouts matter: long streams can be cut off by infrastructure if you don’t send data periodically.
  • Streaming is still “HTTP”: it’s great for server → client data, but doesn’t replace bi-directional protocols like WebSockets.
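The reason NDJSON parses well incrementally is that a newline cleanly marks the end of each object, so the client only has to buffer the trailing partial line. A minimal sketch (ndjsonParser is an illustrative name, not a library API):

```javascript
// Incremental NDJSON parser: feed it chunks as they arrive, get back
// every complete JSON object; a partial trailing line stays buffered
// until the rest of it shows up in a later chunk.
function ndjsonParser() {
  let buffer = '';
  return function push(chunk) {
    buffer += chunk;
    const lines = buffer.split('\n');
    buffer = lines.pop(); // keep the trailing partial line for next time
    return lines.filter((l) => l.trim() !== '').map((l) => JSON.parse(l));
  };
}

const push = ndjsonParser();
console.log(push('{"a":1}\n{"b')); // [{ a: 1 }]  — second object incomplete
console.log(push('":2}\n'));       // [{ b: 2 }]  — now it parses
```

A chunk boundary can fall anywhere, including mid-object, and the parser still emits each object exactly once.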

Server (Express):

import express from 'express';

const app = express();

app.get('/stream', async (req, res) => {
  res.setHeader('Content-Type', 'text/plain; charset=utf-8');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Connection', 'keep-alive');
  res.flushHeaders?.();

  for (let i = 1; i <= 5; i++) {
    res.write(`chunk ${i}\n`);
    await new Promise((r) => setTimeout(r, 500));
  }

  res.end('done\n');
});

app.listen(3000, () => console.log('http://localhost:3000'));

Client (browser using fetch streaming):

const res = await fetch('/stream');
const reader = res.body.getReader();
const decoder = new TextDecoder();

while (true) {
  const { value, done } = await reader.read();
  if (done) break;
  console.log(decoder.decode(value, { stream: true }));
}
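The `{ stream: true }` option matters more than it looks: a chunk boundary can fall in the middle of a multi-byte UTF-8 character, and without it the decoder would emit a replacement character instead of waiting for the rest of the bytes. A standalone sketch:

```javascript
// "é" is two bytes in UTF-8, so splitting "café" after 4 bytes cuts
// the last character in half.
const bytes = new TextEncoder().encode('café'); // 5 bytes total
const chunk1 = bytes.slice(0, 4); // ends mid-character
const chunk2 = bytes.slice(4);

const decoder = new TextDecoder();
// With { stream: true } the decoder buffers the incomplete byte and
// finishes the character when the next chunk arrives.
const part1 = decoder.decode(chunk1, { stream: true }); // "caf"
const part2 = decoder.decode(chunk2, { stream: true }); // "é"
console.log(part1 + part2); // "café"
```

This is why the client above reuses one `TextDecoder` across reads instead of creating a new one per chunk.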
Security

  • Use TLS (https:// and wss://) so data is encrypted in transit.
  • Authenticate the connection (e.g., cookie session or token) and enforce authorization on every message/channel.
  • Validate input on real-time channels the same way you validate REST inputs.
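As a sketch of that last point (the shape and limits here are illustrative; real apps often use a schema library like zod or ajv), the same checks you would apply to a REST body apply to every incoming socket message:

```javascript
// Illustrative validator for an incoming chat-style message: parse,
// check the type tag, and bound the payload before acting on it.
function parseChatMessage(raw) {
  let msg;
  try {
    msg = JSON.parse(raw);
  } catch {
    return { ok: false, error: 'invalid JSON' };
  }
  if (msg?.type !== 'chat') return { ok: false, error: 'unknown type' };
  if (typeof msg.text !== 'string' || msg.text.length === 0 || msg.text.length > 2000) {
    return { ok: false, error: 'invalid text' };
  }
  return { ok: true, value: { type: 'chat', text: msg.text } };
}

console.log(parseChatMessage('{"type":"chat","text":"hello"}').ok); // true
console.log(parseChatMessage('not json').ok);                       // false
```

In the WebSocket server above, a check like this would run inside the `message` handler before broadcasting.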