OAuth: Where Frontend Developers Die
Most OAuth tutorials end at "user clicks login, gets a token". That's the easy part. Real production is where the complexity lives.
The postMessage Architecture
When we integrated Google, Facebook and X logins at NeoXonline, the naive approach was to redirect the user to the provider and back. The problem: you lose React state, pending uploads, socket connections. Everything resets.
The fix is a postMessage architecture:
- Open a small popup window: `window.open('/auth/google/popup', ...)`
- The popup completes the OAuth flow server-side
- On success, the popup calls `window.opener.postMessage({ type: 'oauth-success', token, user }, origin)`
- The main window receives the message, closes the popup, and updates state
This keeps the user on the same page. No redirect, no state loss.
```typescript
// In the main window
window.addEventListener('message', (event) => {
  if (event.origin !== window.location.origin) return; // CRITICAL security check

  if (event.data?.type === 'oauth-success') {
    setUser(event.data.user);
    setToken(event.data.token);
  }
});
```
The origin check is not optional. Without it, any site can send you a fake user object.
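On the other side, the popup posts the result back to its opener. A minimal sketch (the payload shape and the `user` fields here are illustrative, not from the article's codebase):

```typescript
// Shape of the message the main window's listener expects.
type OAuthSuccess = {
  type: 'oauth-success';
  token: string;
  user: { id: string; name: string }; // user shape is illustrative
};

function buildSuccessMessage(token: string, user: OAuthSuccess['user']): OAuthSuccess {
  return { type: 'oauth-success', token, user };
}

// In the popup, once the server-side flow has completed:
//   window.opener?.postMessage(buildSuccessMessage(token, user), window.location.origin);
// Passing an explicit target origin (never '*') is the sender-side twin
// of the origin check in the main window's listener.
```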
Silent Refresh and the 5-Minute Buffer
Access tokens expire. The naive solution: wait until a request fails with 401, then refresh. The problem: you get a flash of broken UI while the refresh happens.
The pattern that works:
```typescript
async function ensureValidToken(): Promise<string> {
  const expiresAt = getTokenExpiry(); // stored in memory, not localStorage
  const BUFFER_MS = 5 * 60 * 1000; // 5 minutes

  if (Date.now() + BUFFER_MS < expiresAt) {
    return getStoredToken(); // still valid with buffer
  }

  // Proactively refresh before expiry
  return await refreshToken();
}
```
Call `ensureValidToken()` before every API call, not just when you get a 401. The 5-minute buffer means you refresh before the user hits an expired token, not after.
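Wiring that into every call is easiest with a thin wrapper around fetch. A sketch: `apiFetch` is an illustrative name, and `ensureValidToken` is stubbed here so the snippet stands alone.

```typescript
// Stub standing in for the ensureValidToken() defined above.
async function ensureValidToken(): Promise<string> {
  return 'access-token';
}

// Every API call goes through this wrapper, so tokens are refreshed
// proactively instead of after a 401.
async function apiFetch(input: string, init: RequestInit = {}): Promise<Response> {
  const token = await ensureValidToken();
  return fetch(input, {
    ...init,
    headers: {
      ...(init.headers as Record<string, string>),
      Authorization: `Bearer ${token}`,
    },
  });
}
```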
Token Rotation and Race Conditions
Keycloak (and most modern providers) uses token rotation: every refresh returns a new refresh token and invalidates the old one. Problem: if two concurrent API calls both trigger a refresh, the second one uses an already-invalidated refresh token and the user gets logged out.
Solution: a shared refresh promise with deduplication:
```typescript
let refreshPromise: Promise<string> | null = null;

async function refreshToken(): Promise<string> {
  if (refreshPromise) return refreshPromise; // reuse in-flight refresh

  refreshPromise = doActualRefresh()
    .finally(() => { refreshPromise = null; });

  return refreshPromise;
}
```
One race condition, one logout. Ship it wrong once and you'll never forget.
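You can watch the deduplication work in isolation. A self-contained sketch where `doActualRefresh` counts how often it actually runs:

```typescript
let refreshCalls = 0;

// Stands in for the real network call to the token endpoint.
async function doActualRefresh(): Promise<string> {
  refreshCalls++;
  await new Promise((resolve) => setTimeout(resolve, 10)); // simulate latency
  return `token-${refreshCalls}`;
}

let refreshPromise: Promise<string> | null = null;

async function refreshToken(): Promise<string> {
  if (refreshPromise) return refreshPromise; // reuse in-flight refresh
  refreshPromise = doActualRefresh().finally(() => { refreshPromise = null; });
  return refreshPromise;
}

// Two concurrent callers share one network round-trip.
const [a, b] = await Promise.all([refreshToken(), refreshToken()]);
// a === b === 'token-1', and refreshCalls === 1
```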
Where to Store Tokens
The localStorage vs httpOnly cookie debate has a correct answer: httpOnly cookies for refresh tokens, memory for access tokens.
- Memory: fast, gone on tab close or refresh; injected scripts can't bulk-read it the way they can localStorage
- httpOnly cookie: unreadable by JavaScript (so XSS can't steal it), persists across sessions, requires CSRF protection
- localStorage: fast, immune to CSRF, readable by any injected script; don't use it for sensitive tokens
The hybrid: store the refresh token in an httpOnly cookie (server sets it), store the access token in memory. On page load, call a /api/refresh endpoint that reads the cookie and returns a new access token.
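The page-load half of that hybrid might look like the following sketch. The `{ accessToken, expiresAt }` response shape is an assumption about the backend, not the actual NeoXonline API:

```typescript
// In-memory storage for the access token: cleared on tab close/refresh,
// never exposed through localStorage.
let accessToken: string | null = null;
let tokenExpiresAt = 0;

// On page load: the refresh token rides in the httpOnly cookie, so the
// request body is empty; `credentials: 'include'` makes the browser send it.
async function bootstrapSession(): Promise<boolean> {
  const res = await fetch('/api/refresh', {
    method: 'POST',
    credentials: 'include',
  });
  if (!res.ok) return false; // no valid cookie: send the user to login

  const body = await res.json();
  accessToken = body.accessToken;
  tokenExpiresAt = body.expiresAt;
  return true;
}
```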
Migrating off Keycloak.js
Keycloak ships a frontend SDK that monkeypatches fetch and manages tokens internally. The problem: it creates a hidden dependency on the SDK's internal state, makes testing harder, and can conflict with your own interceptors.
We migrated NeoXonline to backend-proxied Keycloak: the frontend never talks to Keycloak directly. All token operations go through our own API. The frontend just stores a JWT and calls /api/refresh when needed.
Result: cleaner code, testable auth logic, zero Keycloak.js dependency in the bundle.
The Takeaway
OAuth seems simple until you handle production edge cases:
- popups blocked by browsers
- concurrent refresh races
- token rotation invalidating sessions
- cross-tab synchronisation
- mobile browsers killing background windows
None of this is in the OAuth spec. It's all discovered the hard way. In production. At midnight.