Building SaaS? WebSockets probably aren't the best choice
They often address problems that can be solved with much simpler alternatives.
The hidden cost
The number one cost of WebSockets is the operational overhead you take on by running them. I'd say you should look for alternatives first, reaching for WebSockets only if you genuinely need the features they provide. The overhead shows up as:
• connection lifecycles are naturally hard to track
• the need for stateful servers / sticky sessions
• load-balancer tweaks (HTTP Upgrade headers)
• each socket ≈ 70 KB kernel memory
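To make the stateful-server point concrete, here's the kind of bookkeeping most socket servers grow (a hypothetical sketch, not tied to any particular library): an in-memory map from user to sockets. It works until you run a second instance, at which point each server only knows about its own connections and you're suddenly shopping for sticky sessions or a shared pub/sub layer.

```ts
// Hypothetical sketch: the in-memory registry a socket server tends to accumulate.
// `Socket` stands in for whatever socket type your server library hands you.
type Socket = { send(data: string): void; on(event: 'close', cb: () => void): void };

const socketsByUser = new Map<string, Set<Socket>>();

function register(userId: string, socket: Socket) {
  const set = socketsByUser.get(userId) ?? new Set<Socket>();
  set.add(socket);
  socketsByUser.set(userId, set);

  // lifecycle tracking: forget the socket once it closes
  socket.on('close', () => {
    set.delete(socket);
    if (set.size === 0) socketsByUser.delete(userId);
  });
}

function pushToUser(userId: string, payload: unknown) {
  // only reaches sockets connected to THIS instance; with two instances you
  // need sticky sessions or an external broker to reach the other half
  for (const socket of socketsByUser.get(userId) ?? []) {
    socket.send(JSON.stringify(payload));
  }
}
```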
Then what should we use?
1. Short polling
```ts
// every 20 seconds, poll the server for updates
const poll = () =>
  fetch('/api/updates')
    .then((r) => (r.ok ? r.json() : Promise.reject(r.statusText)))
    .then(handleUpdates)
    .catch(console.error);

setInterval(poll, 20_000);
```
- Simple to implement; works with any HTTP/1.1 server
- Leverages HTTP caching and CDN layers for stale-while-revalidate scenarios
- No special transport or server configuration required
- Fixed polling interval leads to higher latency and staleness
- Wastes bandwidth when responses are empty (see the conditional-request sketch after this list)
- Can overwhelm the server under high client concurrency
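You can blunt the empty-response waste with conditional requests. A sketch, assuming the endpoint sets an `ETag` header (swap in `Last-Modified`/`If-Modified-Since` if that's what you have): unchanged polls come back as a body-less 304, which browser caches and CDNs handle for free.

```ts
// Conditional polling sketch: replay the last ETag, skip work on 304.
// Assumes /api/updates sets an ETag response header.
let etag: string | null = null;

async function pollConditionally() {
  const res = await fetch('/api/updates', {
    headers: etag ? { 'If-None-Match': etag } : {},
  });

  if (res.status === 304) return; // nothing new, (almost) nothing downloaded

  if (res.ok) {
    etag = res.headers.get('ETag');
    handleUpdates(await res.json());
  }
}

setInterval(() => pollConditionally().catch(console.error), 20_000);
```

The polls still happen on a fixed interval, so this only addresses bandwidth, not latency or request volume.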
2. Long polling (hanging GET)
```ts
// keep exactly one request outstanding
let lastSeen = 0; // cursor for the newest update we've processed (advance it in handleUpdates)

async function longPoll() {
  try {
    const res = await fetch(`/api/updates?since=${lastSeen}`);
    const data = await res.json();
    handleUpdates(data);
  } catch (err) {
    console.error(err);
  } finally {
    longPoll(); // recurse immediately
  }
}

longPoll();
```
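That's the client half. On the server, a hanging GET just means not answering until something happens, plus a timeout so intermediaries don't kill the idle connection. A rough sketch, assuming Express and a hypothetical in-process `updates` emitter that fires whenever new data lands:

```ts
import { EventEmitter } from 'node:events';

// assumption: something elsewhere calls updates.emit('update', data) when new data arrives
const updates = new EventEmitter();

app.get('/api/updates', (req, res) => {
  const since = Number(req.query.since ?? 0); // a real version would look up events newer than this

  const reply = (data: unknown) => {
    clearTimeout(timer);
    updates.off('update', reply);
    res.json(data);
  };

  // answer empty after ~25s so the client re-polls before proxies time us out
  const timer = setTimeout(() => reply([]), 25_000);
  updates.on('update', reply);

  // every waiting client holds a response object and a listener until one of these fires,
  // which is where the memory cost in the cons below comes from
  req.on('close', () => {
    clearTimeout(timer);
    updates.off('update', reply);
  });
});
```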
- Reduces wasted requests by holding the connection open until data arrives
- Broad compatibility, no WebSocket or SSE support needed
- Straightforward HTTP semantics using standard fetch/XHR
- Still unidirectional; client must re-establish after each event
- Susceptible to timeouts, proxy buffering, and head-of-line blocking
- Large numbers of hanging requests increase server memory usage
3. Server-sent events (SSE)
Client-side
```ts
const source = new EventSource('/api/events')

source.onmessage = ({ data }) => {
  const { type, payload } = JSON.parse(data) as {
    type: string
    payload: any
  }

  switch (type) {
    case 'chat':
      addChatMessage(payload)
      break
    case 'notification':
      showNotification(payload)
      break
    default:
      handleGenericUpdate(payload)
  }
}

source.onerror = e => console.error('SSE error', e)
source.onopen = () => console.log('SSE connected')
```
Server-side
```ts
app.get('/api/events', (req, res) => {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive',
  })

  const send = (type: string, payload: any) =>
    res.write(`data: ${JSON.stringify({ type, payload })}\n\n`)

  send('chat', { text: 'New message!' })
  send('notification', { title: 'Alert!' })
  send('generic', { value: 42 })

  // if we want to keep the connection alive c:
  const ping = setInterval(() => res.write(': ping\n\n'), 30000)
  req.on('close', () => clearInterval(ping))
})
```
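The auto-reconnect story deserves a mention too. If you attach an `id:` line to each event, the browser remembers it and sends it back in a `Last-Event-ID` request header when `EventSource` reconnects, so the server can replay whatever the client missed. A sketch, assuming a short in-memory `backlog` of recent events (hypothetical name):

```ts
// Resumable variant of the handler above. `backlog` is an assumed in-memory
// history of recent events; append to it wherever you broadcast.
type StoredEvent = { id: number; type: string; payload: any }
const backlog: StoredEvent[] = []

app.get('/api/events', (req, res) => {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive',
  })

  // the `id:` line is what makes the stream resumable
  const send = (e: StoredEvent) =>
    res.write(`id: ${e.id}\ndata: ${JSON.stringify({ type: e.type, payload: e.payload })}\n\n`)

  // after an auto-reconnect, EventSource reports the last id it processed
  const lastSeenId = Number(req.headers['last-event-id'] ?? 0)
  backlog.filter((e) => e.id > lastSeenId).forEach(send)
})
```

In a real app this replaces the handler above rather than running alongside it.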
- Native browser API with built-in auto-reconnect and last-event-ID tracking
- Efficient one-way push over a single long-lived HTTP connection
- Lower overhead than polling patterns; minimal network chatter
- Server-to-client only (no full-duplex; my example just shows sending)
- Text-only payloads; no binary frames
- Not supported in older browsers like IE11
4. Manual refreshes
Sometimes a refresh button hides seconds of latency at essentially zero backend cost. The client can handle work too, you know!
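There isn't much to show, which is rather the point. A minimal sketch (plain React, with a hypothetical `StatusPanel` component hitting the same `/api/status` endpoint as the case study further down):

```tsx
import { useCallback, useEffect, useState } from 'react';

// Hypothetical panel: data loads once on mount and only re-fetches when the user asks.
function StatusPanel() {
  const [status, setStatus] = useState('unknown');

  const refresh = useCallback(() => {
    fetch('/api/status') // a plain GET, so browser/CDN caching still applies
      .then((res) => res.json())
      .then((data) => setStatus(data.status))
      .catch(console.error);
  }, []);

  useEffect(() => {
    refresh(); // initial load
  }, [refresh]);

  return (
    <div>
      <span>Status: {status}</span>
      <button onClick={refresh}>Refresh</button>
    </div>
  );
}
```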
- Ultimate simplicity; no background tasks or open connections
- Fully leverages HTTP caching; minimal server impact
- User-driven, so avoids unnecessary polling or connections
- Poor UX for real-time updates; data may grow stale
- Requires user interaction, breaking the “live” illusion
- Not suitable for frequent or unpredictable update patterns
Decision Matrix
I created a quick decision matrix to help you decide which approach to take. This isn't exhaustive, but it should give you a good starting point.
| Characteristic | Polling | SSE | WebSocket |
|---|---|---|---|
| Direction | C → S | S → C | ↔︎ |
| Latency | interval | <1 s | <100 ms |
| Infra changes | none | none | LB/L7 |
| Stateful server | no | no | yes* |
* or implement sticky sessions / external socket layer.
Heuristics
There's some other stuff you should weigh before reaching for `npm i socket.io`:
- Do clients need to publish as well as receive?
- Will an N-second delay hurt UX?
- Expected peak concurrent users?
- Can the team run stateful infra today?
If the answers are mostly “no / few”, stick with polling or SSE: you'll ship sooner and debug less.
Some case studies
A. Polling Hook → SWR Data Fetch
```diff
- // Poll every 5s
- setInterval(async () => {
-   const res = await fetch('/api/status');
-   const { status } = await res.json();
-   setStatus(status);
- }, 5000);

+ import useSWR from 'swr';
+
+ function useStatus() {
+   return useSWR(
+     '/api/status',
+     (url) => fetch(url).then((res) => res.json()),
+     { refreshInterval: 5000, dedupingInterval: 2000 }
+   );
+ }
+
+ // Component-level:
+ const { data, error } = useStatus();
+ if (error) console.error(error);
+ <span>Status: {data?.status ?? 'Loading...'}</span>;
```
Take-away: Hooks + SWR handle caching, revalidation, and error states out of the box, but you'll incur more latency.
B. WebSocket → SSE Hook
```diff
- // Before: raw WebSocket
- const socket = new WebSocket('wss://api.example.com/updates');
- socket.onmessage = (e) => handleUpdate(JSON.parse(e.data));

+ // After: (in some frontend codebase)
+ import { useEffect } from 'react';
+
+ function useSSE(url: string, onMessage: (data: any) => void) {
+   useEffect(() => {
+     const es = new EventSource(url);
+     es.onmessage = (e) => onMessage(JSON.parse(e.data));
+     return () => es.close();
+   }, [url, onMessage]);
+ }
+
+ // Inside a component:
+ useSSE('/api/updates', handleUpdate);
```
Take-away: SSE is unidirectional but sidesteps socket infrastructure while providing server push.
C. Socket.IO → Push API
```diff
- // Before: Socket.IO for notifications
- useEffect(() => {
-   const socket = io('/notifications');
-   socket.on('notify', (msg) => addNotification(msg));
-   return () => socket.disconnect();
- }, []);

+ // After: Web Push via Service Worker
+ useEffect(() => {
+   if (!('serviceWorker' in navigator) || !('PushManager' in window)) return;
+
+   navigator.serviceWorker.ready
+     .then((reg) =>
+       reg.pushManager.subscribe({
+         userVisibleOnly: true,
+         applicationServerKey,
+       })
+     )
+     .then((sub) => {
+       // send subscription details to server
+     });
+ }, []);
+
+ // sw.js: 'push' events are delivered to the service worker, not the page
+ self.addEventListener('push', (event: PushEvent) => {
+   const data = event.data?.json();
+   event.waitUntil(
+     self.registration.showNotification(data.title, { body: data.body })
+   );
+ });
```
Take-away: Push API moves notifications off the socket layer and into browser-native channels.
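The other half of that trade is the server actually sending the push. A sketch using the `web-push` npm package (my addition, not from the original snippet; `subscriptions` and the VAPID variables are placeholders): store each subscription the client POSTs after `pushManager.subscribe()`, then fire notifications at it whenever something happens.

```ts
import webpush from 'web-push';

// VAPID keys identify your server to the browser's push service;
// generate them once and load them from the environment.
webpush.setVapidDetails('mailto:ops@example.com', VAPID_PUBLIC_KEY, VAPID_PRIVATE_KEY);

const subscriptions: any[] = []; // persist these in a real app

// assumes express.json() is mounted so req.body is the parsed subscription
app.post('/api/push/subscribe', (req, res) => {
  subscriptions.push(req.body);
  res.sendStatus(201);
});

async function notifyAll(payload: { title: string; body: string }) {
  for (const sub of subscriptions) {
    try {
      await webpush.sendNotification(sub, JSON.stringify(payload));
    } catch (err) {
      console.error('push failed (expired subscriptions end up here)', err);
    }
  }
}
```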
When you truly need WebSockets
Despite all the above, there are some cases where WebSockets are the best choice:
- Collaborative editing (Figma, Google Docs)
- Interactive multiplayer games
- Chat / messaging platforms
- Live control dashboards
They earn their keep when:
• You need true full-duplex traffic
• <50 ms latency matters
• Both client and server send spontaneously
• Team can operate stateful infra
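If you land in that bucket, go in with eyes open. A minimal server sketch, assuming the popular `ws` package; the heartbeat is exactly the kind of lifecycle bookkeeping the hidden-cost section was complaining about:

```ts
import { WebSocketServer, WebSocket } from 'ws';

const wss = new WebSocketServer({ port: 8080 });

// heartbeat: mark sockets alive on pong, cull the ones that stop answering
const alive = new WeakSet<WebSocket>();

wss.on('connection', (ws) => {
  alive.add(ws);
  ws.on('pong', () => alive.add(ws));

  // full-duplex: the client can push to us, and we can push back at any time
  ws.on('message', (raw) => {
    const msg = JSON.parse(raw.toString());
    ws.send(JSON.stringify({ echo: msg }));
  });
});

const heartbeat = setInterval(() => {
  for (const ws of wss.clients) {
    if (!alive.has(ws)) {
      ws.terminate(); // missed a heartbeat, so assume the connection is dead
      continue;
    }
    alive.delete(ws);
    ws.ping();
  }
}, 30_000);

wss.on('close', () => clearInterval(heartbeat));
```

That's roughly what "stateful infra" means in practice: every instance now owns live connections that something has to balance, drain on deploys, and keep alive.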
Looking Ahead
WebTransport (over HTTP/3) and peer-to-peer WebRTC shrink the WebSocket niche even further, but the tooling is still super young. All I ask is that you start simple, and upgrade only when numbers (not vibes) demand it.
TL;DR
Go stateless first (polling / SSE). Reach for WebSockets only when sub-second, bi-directional comms are business-critical.