Web Servers & Reverse Proxies
A web server (or reverse proxy) often sits in front of your backend application.
Even if your backend is an Express app, production setups commonly look like:

Client → Nginx (ports 80/443) → Express app on `localhost:3000`
What a Reverse Proxy Does
A reverse proxy receives requests from clients and forwards them to your application.
Common jobs:
- TLS termination: handle HTTPS certificates (so your app can run plain HTTP internally)
- Static files: serve assets efficiently
- Compression: gzip/brotli responses
- Caching: cache responses for speed
- Load balancing: distribute traffic across multiple app instances
- Rate limiting: protect against abuse (see the sketch after this list)
- Request limits: cap upload sizes / header sizes
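As a concrete illustration of one of these jobs, here is a minimal rate-limiting sketch in Nginx. The zone name `perip` and the numbers are placeholder choices, not values this guide prescribes; tune them for your traffic.

```nginx
# In the http {} context: track clients by IP, allow ~10 requests/second each.
limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

server {
    listen 80;

    location /api/ {
        # Allow short bursts of up to 20 extra requests, then reject with 503.
        limit_req zone=perip burst=20 nodelay;
        proxy_pass http://127.0.0.1:3000;
    }
}
```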
Popular Choices
- Nginx: very popular, fast, flexible
- Apache: older but feature-rich
- Caddy: great developer experience; HTTPS is often very easy (see the Caddyfile sketch after this list)
- Traefik: common in container/Kubernetes environments
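To make the Caddy point concrete: a complete Caddyfile that reverse proxies to an Express app, with HTTPS certificates provisioned automatically, can be this short (domain and port are placeholders):

```caddyfile
# Caddy obtains and renews the TLS certificate for example.com on its own.
example.com {
    reverse_proxy localhost:3000
}
```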
Express Behind a Reverse Proxy (Key Notes)
- If your reverse proxy is doing TLS, your Express app may only see internal HTTP unless you configure it correctly.
- If you use cookies with `secure: true`, you usually need `app.set("trust proxy", 1)` so Express knows the original request was HTTPS (see the sketch below).
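A minimal sketch of that setting, assuming Nginx is the single proxy hop in front of the app; the `/whoami` route is purely illustrative:

```js
const express = require("express");
const app = express();

// Trust the first proxy hop (Nginx) so Express reads the
// X-Forwarded-* headers the proxy sets.
app.set("trust proxy", 1);

app.get("/whoami", (req, res) => {
  res.json({
    protocol: req.protocol, // "https" when the proxy sent X-Forwarded-Proto: https
    secure: req.secure,     // true behind a TLS-terminating proxy
    ip: req.ip,             // client IP taken from X-Forwarded-For
  });
});

app.listen(3000);
```

With `trust proxy` set, `secure: true` cookies are sent because Express treats the request as HTTPS even though the internal hop was plain HTTP.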
Example: Nginx Reverse Proxy to Express
This example sends all traffic to an Express app running on `localhost:3000`.
```nginx
server {
    listen 80;
    server_name example.com;

    location / {
        proxy_pass http://127.0.0.1:3000;
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```

WebSockets, SSE, and Streaming Through a Proxy
Real-time features often require proxy settings to avoid buffering and to support connection upgrades.
WebSockets
WebSockets require an HTTP upgrade.
```nginx
location /ws {
    proxy_pass http://127.0.0.1:3000;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_set_header Host $host;
}
```

SSE / HTTP Streaming
SSE and streaming rely on long-lived connections and sending chunks quickly.
```nginx
location /events {
    proxy_pass http://127.0.0.1:3000;
    proxy_http_version 1.1;
    proxy_set_header Connection "";
    proxy_buffering off;
    proxy_cache off;
}
```

Load Balancing (Concept)
When your app runs multiple instances, the reverse proxy can distribute incoming requests among them (Nginx defaults to round-robin).
Important: real-time connections (WebSockets/SSE) can be sensitive to load balancing. You may need sticky sessions or a shared pub/sub layer depending on your design.
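As a sketch of both ideas in Nginx, here is an upstream of two instances (the ports are hypothetical) using `ip_hash` as a simple form of sticky sessions:

```nginx
upstream app_servers {
    # ip_hash routes a given client IP to the same instance,
    # which helps long-lived WebSocket/SSE connections.
    ip_hash;
    server 127.0.0.1:3000;
    server 127.0.0.1:3001;
}

server {
    listen 80;
    server_name example.com;

    location / {
        proxy_pass http://app_servers;
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```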
Mini Case Studies
Case Study 1: Single VM Deployment (Common First Step)
- A single Linux VM runs:
  - Nginx on `:443` (HTTPS)
  - Express on `:3000`
- Nginx terminates TLS and forwards requests to Express (see the sketch after this case study).
- Logs:
  - Nginx access/error logs
  - App logs (stdout) collected by your process manager
This setup is cheap, understandable, and handles a lot of traffic for small/medium apps.
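For reference, a minimal sketch of the TLS-terminating server block in this setup; the certificate paths assume Let's Encrypt defaults and will differ on your machine:

```nginx
server {
    listen 443 ssl;
    server_name example.com;

    # Placeholder paths; point these at your real certificate files.
    ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

    location / {
        proxy_pass http://127.0.0.1:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```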
Case Study 2: Real-Time Features Behind a Proxy
- You add SSE at `/events` and/or WebSockets at `/ws`.
- You must ensure:
  - WebSockets: `Upgrade` headers are forwarded.
  - SSE/streaming: proxy buffering is off, so events arrive immediately.
If you later add multiple app instances behind a load balancer, you may need:
- Sticky sessions (same client goes to the same instance), or
- A shared pub/sub (e.g., Redis pub/sub) so updates reach clients no matter which instance they're connected to (see the sketch below).
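A minimal sketch of the pub/sub approach, assuming the node-redis (v4) client and an SSE endpoint; the channel name and wiring are illustrative:

```js
// Each instance subscribes to the same Redis channel and forwards
// messages to the SSE clients connected to *this* instance.
const { createClient } = require("redis");

const sub = createClient(); // defaults to redis://localhost:6379
const sseClients = new Set(); // open SSE response objects on this instance

async function start() {
  await sub.connect();
  await sub.subscribe("updates", (message) => {
    for (const res of sseClients) {
      res.write(`data: ${message}\n\n`);
    }
  });
}
start();

// Any instance can publish, and every instance's clients receive it:
//   await pub.publish("updates", JSON.stringify({ type: "ping" }));
```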