perf(timeline): restore VList cache + skip 80 ms timer on room revisit #659
Just-Insane wants to merge 8 commits into SableClient:dev
Conversation
When navigating back to a previously-visited room, save the VList
CacheSnapshot (item heights) and scroll offset on the way out, then
on the way back:
• pass the snapshot as cache= to VList so item heights do not need
to be remeasured (VList is keyed by room.roomId so it gets a fresh
instance with the saved measurements)
• skip the 80 ms stabilisation timer — the measurements are already
known, so the scroll lands immediately and setIsReady(true) is
called without the artificial delay
First-visit rooms retain the existing 80 ms behaviour unchanged.
RoomTimeline mounts fresh per room (key={roomId} in RoomView), so the
render-phase room-change block used for save/load never fires.
- Init scrollCacheForRoomRef from roomScrollCache.load() on mount so the
CacheSnapshot is actually provided to VList on first render.
- Save the cache in handleVListScroll (and after the first-visit 80 ms
timer) rather than in the unreachable room-change block.
- Trim the room-change block to just the load + state-reset path (kept as
a defensive fallback for any future scenario where room prop changes
without remount).
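The mount-time decision the points above describe can be sketched as a small pure helper (names such as `planInitialScroll` and `SavedScroll` are hypothetical, not the PR's actual code):

```typescript
// Hypothetical helper mirroring the revisit logic: a saved cache means the
// 80 ms stabilisation timer can be skipped; its absence means first visit.
interface SavedScroll {
  scrollOffset: number; // pixel offset at save time
  atBottom: boolean;    // whether the user was at the newest message
}

type InitialScroll =
  | { kind: 'restore-bottom' }                 // scrollToIndex(last, { align: 'end' })
  | { kind: 'restore-offset'; offset: number } // scrollTo(exact offset)
  | { kind: 'first-visit' };                   // keep the existing 80 ms timer

function planInitialScroll(saved: SavedScroll | undefined): InitialScroll {
  if (!saved) return { kind: 'first-visit' };
  return saved.atBottom
    ? { kind: 'restore-bottom' }
    : { kind: 'restore-offset', offset: saved.scrollOffset };
}
```

In the real component the result of `roomScrollCache.load(room.roomId)` plays the role of `saved`, and the `cache` snapshot itself is additionally handed to VList's `cache` prop.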
Added a follow-up fix in the latest commit:
…mmatic scroll-to-bottom

After setIsReady(true) commits, virtua can fire onScroll events with isNowAtBottom=false during its height-correction pass (particularly on first visit, when item heights above the viewport haven't been rendered yet). These intermediate events were driving atBottomState to false while isReady=true, flashing the 'Jump to Latest' button. Add programmaticScrollToBottomRef: set it before each scrollToIndex bottom-scroll and suppress the first intermediate false event (clearing the guard immediately), so the next event, either the corrected position or a real user scroll, is processed normally.
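The guard described above can be factored as a one-shot helper (a sketch; `shouldApplyAtBottom` is a hypothetical name, the real change works inline on `programmaticScrollToBottomRef` in the scroll handler):

```typescript
// One-shot guard: armed before a programmatic scroll-to-bottom, it swallows
// at most one intermediate isNowAtBottom=false event, then disarms itself.
function shouldApplyAtBottom(
  guard: { current: boolean }, // stands in for programmaticScrollToBottomRef
  isNowAtBottom: boolean,
): boolean {
  if (guard.current) {
    guard.current = false;            // clear immediately: suppress only once
    if (!isNowAtBottom) return false; // skip the height-correction artifact
  }
  return true; // corrected position or a real user scroll: process normally
}
```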
Pull request overview
This PR improves room timeline revisit performance by caching Virtua VList item height snapshots and restoring scroll position immediately when returning to a room, avoiding the existing 80ms “fade-in stabilisation” delay. It also addresses a race where the recovery path could previously fail to persist a cache, causing subsequent revisits to still hit the fade-in path.
Changes:
- Introduces a session-scoped `roomScrollCache` utility to store per-room VList cache + scroll position.
- Updates `RoomTimeline` to load cached VList measurements on mount, restore scroll immediately on revisit, and persist cache/offset during scrolling and recovery.
- Adds a Changeset entry documenting the perf improvement.
Reviewed changes
Copilot reviewed 4 out of 4 changed files in this pull request and generated 6 comments.
| File | Description |
|---|---|
| `src/app/utils/roomScrollCache.ts` | Adds a per-room, session-scoped Map-backed cache for VList snapshots + scroll state. |
| `src/app/utils/roomScrollCache.test.ts` | Adds Vitest coverage for basic save/load/overwrite behavior. |
| `src/app/features/room/RoomTimeline.tsx` | Restores cached VList state on revisit, saves cache on scroll and recovery, and adds a guard for programmatic scroll UI behavior. |
| `.changeset/perf-timeline-scroll-cache.md` | Documents the perf change as a patch-level update. |
```diff
@@ -0,0 +1,22 @@
+import { CacheSnapshot } from 'virtua';
```
CacheSnapshot is only used as a TypeScript type here. Importing it as a value (import { CacheSnapshot } ...) can force an unnecessary runtime dependency on virtua in the emitted JS/bundler graph. Switch this to a type-only import (e.g. import type { CacheSnapshot } ...).
```diff
-import { CacheSnapshot } from 'virtua';
+import type { CacheSnapshot } from 'virtua';
```
```ts
const scrollCacheMap = new Map<string, RoomScrollCache>();

export const roomScrollCache = {
  save(roomId: string, data: RoomScrollCache): void {
    scrollCacheMap.set(roomId, data);
  },
  load(roomId: string): RoomScrollCache | undefined {
    return scrollCacheMap.get(roomId);
  },
```
scrollCacheMap grows for every room visited and is never cleared/evicted. Since cache can be sizable (item-size snapshots), this can lead to unbounded memory growth over a long session with many rooms. Consider adding an eviction strategy (e.g., LRU with a max entries cap) and/or a way to clear caches on logout/session reset.
```diff
-const scrollCacheMap = new Map<string, RoomScrollCache>();
-
-export const roomScrollCache = {
-  save(roomId: string, data: RoomScrollCache): void {
-    scrollCacheMap.set(roomId, data);
-  },
-  load(roomId: string): RoomScrollCache | undefined {
-    return scrollCacheMap.get(roomId);
-  },
+const MAX_SCROLL_CACHE_ENTRIES = 100;
+
+const scrollCacheMap = new Map<string, RoomScrollCache>();
+
+function evictOldestScrollCacheEntry(): void {
+  if (scrollCacheMap.size <= MAX_SCROLL_CACHE_ENTRIES) return;
+  const oldestRoomId = scrollCacheMap.keys().next().value;
+  if (oldestRoomId !== undefined) {
+    scrollCacheMap.delete(oldestRoomId);
+  }
+}
+
+export const roomScrollCache = {
+  save(roomId: string, data: RoomScrollCache): void {
+    if (scrollCacheMap.has(roomId)) {
+      scrollCacheMap.delete(roomId);
+    }
+    scrollCacheMap.set(roomId, data);
+    evictOldestScrollCacheEntry();
+  },
+  load(roomId: string): RoomScrollCache | undefined {
+    const cached = scrollCacheMap.get(roomId);
+    if (cached === undefined) return undefined;
+    scrollCacheMap.delete(roomId);
+    scrollCacheMap.set(roomId, cached);
+    return cached;
+  },
+  clear(): void {
+    scrollCacheMap.clear();
+  },
```
```ts
import { PushProcessor, Room, Direction } from '$types/matrix-sdk';
import classNames from 'classnames';
import { VList, VListHandle } from 'virtua';
import { roomScrollCache, RoomScrollCache } from '$utils/roomScrollCache';
```
RoomScrollCache is used only as a type in this file. Import it with type (or split the import) to avoid pulling it into the runtime module graph unnecessarily, consistent with other files in this repo that use import type for types.
```diff
-import { roomScrollCache, RoomScrollCache } from '$utils/roomScrollCache';
+import { roomScrollCache } from '$utils/roomScrollCache';
+import type { RoomScrollCache } from '$utils/roomScrollCache';
```
```tsx
if (savedCache) {
  // Revisiting a room with a cached scroll state — restore position
  // immediately and skip the 80 ms stabilisation timer entirely.
  if (savedCache.atBottom) {
    programmaticScrollToBottomRef.current = true;
    vListRef.current.scrollToIndex(processedEventsRef.current.length - 1, { align: 'end' });
    // scrollToIndex is async; pre-empt the button so it doesn't flash for
```
In the saved-cache restore path, scrollToIndex(processedEventsRef.current.length - 1, ...) can be called when processedEventsRef.current is empty (the same TimelineReset race described for the 80ms timer path), which would pass -1 to scrollToIndex. Add a guard for processedEventsRef.current.length > 0 (or reuse pendingReadyRef to defer restore until events repopulate) before attempting the scroll.
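The suggested guard amounts to refusing to produce an index for an empty list (a sketch; `safeBottomIndex` is a hypothetical helper name):

```typescript
// Resolve the last valid index for scrollToIndex, or null when the event
// list is empty (the TimelineReset race), so -1 is never passed through.
function safeBottomIndex(eventCount: number): number | null {
  return eventCount > 0 ? eventCount - 1 : null;
}
```

The call site would then either skip the scroll or set `pendingReadyRef` to defer the restore when `null` comes back.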
```tsx
      setAtBottom(true);
    } else {
      pendingReadyRef.current = true;
      vListRef.current.scrollTo(savedCache.scrollOffset);
    }
  }, 80);
  hasInitialScrolledRef.current = true;
  setIsReady(true);
} else {
```
When restoring a non-bottom scroll position (savedCache.atBottom === false), the component leaves atBottomState at its default true until onScroll fires. Because setIsReady(true) happens immediately, this can briefly hide the “Jump to Latest” UI and can trigger at-bottom behaviors (e.g. auto-mark-as-read) even though the restored position is not at the bottom. Consider setting setAtBottom(false) (or setAtBottom(savedCache.atBottom)) as part of the restore before marking the timeline ready.
```tsx
// Keep the scroll cache fresh so the next visit to this room can restore
// position (and skip the 80 ms measurement wait) immediately on mount.
roomScrollCache.save(room.roomId, {
  cache: v.cache,
  scrollOffset: offset,
  atBottom: isNowAtBottom,
});
```
Saving the scroll cache on every onScroll callback allocates a new object and updates the Map at scroll-event frequency, which can be very hot during user scrolling. Consider throttling/debouncing these saves (e.g. per animation frame) and/or only saving when values materially change, to reduce GC pressure and keep scrolling smooth.
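One cheap variant of this suggestion is to save only when the values materially change (a sketch; `makeScrollSaver` is a hypothetical wrapper, and a requestAnimationFrame throttle could be layered on top of it):

```typescript
// Wrap the save callback so hot onScroll events only persist when the offset
// moved by at least `minDelta` pixels or the at-bottom flag flipped.
function makeScrollSaver(
  save: (offset: number, atBottom: boolean) => void,
  minDelta = 16,
): (offset: number, atBottom: boolean) => void {
  let lastOffset = Number.POSITIVE_INFINITY; // forces the first call to save
  let lastAtBottom: boolean | undefined;
  return (offset, atBottom) => {
    if (atBottom === lastAtBottom && Math.abs(offset - lastOffset) < minDelta) return;
    lastOffset = offset;
    lastAtBottom = atBottom;
    save(offset, atBottom);
  };
}
```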
Description

Caches VList item heights (via `roomScrollCache`) across visits to the same room within a session. On revisit, the stored cache is provided to VList upfront and the scroll position is restored immediately, skipping the 80 ms opacity-fade stabilisation timer entirely.

Also fixes a bug where the `pendingReadyRef` recovery path (triggered when the 80 ms timer fires while `processedEvents` is empty due to a `TimelineReset` race) never saved the scroll cache, causing rooms that hit this path to show the fade-in on subsequent visits.

Fixes #
Type of change
Checklist:
AI disclosure:
`roomScrollCache.ts`: a simple session-scoped `Map<roomId, { cache, scrollOffset, atBottom }>`. `cache` is VList's `CacheSnapshot` (measured item heights), `scrollOffset` is the pixel offset at save time, and `atBottom` is a flag used to decide whether to call `scrollToIndex(bottom)` or `scrollTo(exact offset)` on restore.

In `RoomTimeline.tsx`, the initial-scroll `useLayoutEffect` was extended to check for a saved cache before starting the 80 ms timer: if a cache exists, it restores position immediately and calls `setIsReady(true)` without the delay. The 80 ms timer path and the `pendingReadyRef` recovery path both now call `roomScrollCache.save(...)` after scrolling so subsequent visits have a cache to load.
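Under the description above, the cache module reduces to roughly this sketch (virtua's `CacheSnapshot` is stood in by `unknown` so the snippet is self-contained; the real file imports the type):

```typescript
// Session-scoped, per-room scroll cache: a plain module-level Map.
interface RoomScrollCacheEntry {
  cache: unknown;       // virtua's CacheSnapshot (measured item heights)
  scrollOffset: number; // pixel offset at save time
  atBottom: boolean;    // restore via scrollToIndex(bottom) vs scrollTo(offset)
}

const scrollCacheMap = new Map<string, RoomScrollCacheEntry>();

const roomScrollCache = {
  save(roomId: string, data: RoomScrollCacheEntry): void {
    scrollCacheMap.set(roomId, data); // later saves overwrite earlier ones
  },
  load(roomId: string): RoomScrollCacheEntry | undefined {
    return scrollCacheMap.get(roomId);
  },
};
```

Being module state rather than React state, the map survives unmounting `RoomTimeline` but not a page reload, which is exactly the "within a session" scope the PR claims.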