Problem
In the claude.ai web interface, long conversations load every message into the DOM at once. This causes:
- Page lag and unresponsiveness as conversation grows
- High RAM consumption in the browser (hundreds of MB for long sessions)
- HTML rendering slowdown — the more messages, the slower the page
- Browser tabs can crash on very long conversations (1000+ messages)
This is especially painful for users who work in extended sessions with Claude Code on the web, where conversations can reach thousands of messages with code blocks, tool results, and artifacts.
Proposed Solution
Implement virtual scrolling with lazy-loading for conversation history:
Core Behavior
- Render only visible messages — Keep only the last N messages (e.g., 50) in the actual DOM
- Remove off-screen messages from the DOM — As the user scrolls, messages that leave the viewport (plus buffer) are unmounted from the DOM (not deleted; their data stays in memory)
- Load on scroll up — When the user scrolls to the top, fetch and render the previous batch of messages (AJAX/fetch)
- "Load earlier messages" button — Optional explicit trigger at the top of conversation
- Preserve scroll position — When loading older messages, maintain current scroll position (no jumping)
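The scroll-preservation step above reduces to a small piece of arithmetic: when a batch of older messages is prepended, everything above the viewport grows by the batch's height, so the scroll offset must grow by the same amount. A minimal sketch (function and variable names are illustrative, not from the claude.ai codebase):

```typescript
// Pure helper: given the scroll offset before the prepend and the
// container's scroll height before/after, return the offset that keeps
// the same message under the user's eyes.
function restoredScrollTop(
  prevScrollTop: number,
  prevScrollHeight: number,
  nextScrollHeight: number,
): number {
  // The content above the old viewport grew by exactly the height
  // of the newly prepended batch.
  return prevScrollTop + (nextScrollHeight - prevScrollHeight);
}

// Usage inside a scroll handler (DOM calls shown for context only):
// const before = { top: el.scrollTop, height: el.scrollHeight };
// prependOlderMessages(batch);  // hypothetical render call
// el.scrollTop = restoredScrollTop(before.top, before.height, el.scrollHeight);
```

Reading `scrollHeight` immediately before and after the render forces layout, so in practice this is done once per loaded batch, not per scroll event.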
Technical Approach
- Virtual scrolling (like Discord, Slack, Telegram) — only render what's in viewport + buffer
- Libraries like `react-virtualized`, `react-window`, or `tanstack-virtual` handle this well
- Messages fetched in pages (e.g., 50 at a time) from the server
- Skeleton placeholders for messages not yet loaded
Expected Impact
- Constant DOM size regardless of conversation length
- Stable RAM usage (~50-100MB instead of growing unbounded)
- Smooth scrolling even in 10,000+ message conversations
- Faster initial page load (render last 50 messages, not all 5000)
Additional Context
- This is a standard pattern in modern chat applications (Slack, Discord, Telegram, WhatsApp Web)
- The current approach of loading everything works for short conversations but breaks down for power users
- Claude Code web sessions tend to be especially long due to tool use, code output, and iterative development