
perf(timeline): restore VList cache + skip 80 ms timer on room revisit#659

Open
Just-Insane wants to merge 8 commits into SableClient:dev from Just-Insane:perf/timeline-scroll-cache

Conversation

@Just-Insane (Contributor)

Description

Caches VList item heights (via roomScrollCache) across visits to the same room within a session. On revisit, the stored cache is provided to VList upfront and the scroll position is restored immediately — skipping the 80 ms opacity-fade stabilisation timer entirely.

Also fixes a bug where the pendingReadyRef recovery path (triggered when the 80 ms timer fires while processedEvents is empty due to a TimelineReset race) never saved the scroll cache, causing rooms that hit this path to show the fade-in on subsequent visits.

Fixes #

Type of change

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

Checklist:

  • My code follows the style guidelines of this project
  • I have performed a self-review of my own code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings

AI disclosure:

  • Fully AI generated (explain what all the generated code does in moderate detail).

roomScrollCache.ts — a simple session-scoped Map<roomId, { cache, scrollOffset, atBottom }>. cache is VList's CacheSnapshot (measured item heights), scrollOffset is the pixel offset at save time, and atBottom is a flag used to decide whether to call scrollToIndex (bottom) or scrollTo (exact offset) on restore.
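As described, the module reduces to a module-level Map keyed by room ID. A minimal self-contained sketch of that shape and its use (the room ID and offset values are invented for illustration, and CacheSnapshot is modelled as `unknown` so the sketch carries no virtua dependency):

```typescript
// Sketch only: CacheSnapshot stands in for virtua's measured-heights type.
type CacheSnapshot = unknown;

interface RoomScrollCache {
  cache: CacheSnapshot;  // VList's measured item heights
  scrollOffset: number;  // pixel offset at save time
  atBottom: boolean;     // restore via scrollToIndex (bottom) vs scrollTo (offset)
}

const scrollCacheMap = new Map<string, RoomScrollCache>();

const roomScrollCache = {
  save(roomId: string, data: RoomScrollCache): void {
    scrollCacheMap.set(roomId, data);
  },
  load(roomId: string): RoomScrollCache | undefined {
    return scrollCacheMap.get(roomId);
  },
};

// Save on the way out of a room, load on revisit (values invented):
roomScrollCache.save('!abc:example.org', { cache: null, scrollOffset: 1240, atBottom: false });
const saved = roomScrollCache.load('!abc:example.org');
```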

In RoomTimeline.tsx, the initial-scroll useLayoutEffect was extended to check for a saved cache before starting the 80 ms timer: if a cache exists, it restores position immediately and calls setIsReady(true) without the delay. The 80 ms timer path and the pendingReadyRef recovery path both now call roomScrollCache.save(...) after scrolling so subsequent visits have a cache to load.

When navigating back to a previously-visited room, save the VList
CacheSnapshot (item heights) and scroll offset on the way out, then
on the way back:
  • pass the snapshot as cache= to VList so item heights do not need
    to be remeasured (VList is keyed by room.roomId so it gets a fresh
    instance with the saved measurements)
  • skip the 80 ms stabilisation timer — the measurements are already
    known, so the scroll lands immediately and setIsReady(true) is
    called without the artificial delay

First-visit rooms retain the existing 80 ms behaviour unchanged.
RoomTimeline mounts fresh per room (key={roomId} in RoomView), so the
render-phase room-change block used for save/load never fires.
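The revisit branch above can be reduced to a small decision function. This is a hedged sketch, not the component's actual code: restoreScroll, VListLike, and SavedState are invented names, and in the real component this logic lives inside a useLayoutEffect.

```typescript
// Minimal subset of the VList handle this sketch needs.
interface VListLike {
  scrollToIndex(index: number, opts?: { align: 'start' | 'end' }): void;
  scrollTo(offset: number): void;
}

interface SavedState {
  scrollOffset: number;
  atBottom: boolean;
}

// Returns true when a saved cache allowed skipping the 80 ms timer.
function restoreScroll(
  vList: VListLike,
  saved: SavedState | undefined,
  eventCount: number,
): boolean {
  if (!saved) return false; // first visit: caller falls back to the timer
  if (saved.atBottom && eventCount > 0) {
    // Land on the last event; the eventCount guard avoids scrollToIndex(-1).
    vList.scrollToIndex(eventCount - 1, { align: 'end' });
  } else {
    vList.scrollTo(saved.scrollOffset); // exact pixel-offset restore
  }
  return true; // caller may call setIsReady(true) without the delay
}

// Exercising the sketch with a recording stub:
const calls: string[] = [];
const stub: VListLike = {
  scrollToIndex(i) { calls.push(`toIndex:${i}`); },
  scrollTo(o) { calls.push(`to:${o}`); },
};
const skipped = restoreScroll(stub, { scrollOffset: 120, atBottom: true }, 5);
```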

- Init scrollCacheForRoomRef from roomScrollCache.load() on mount so the
  CacheSnapshot is actually provided to VList on first render.
- Save the cache in handleVListScroll (and after the first-visit 80 ms
  timer) rather than in the unreachable room-change block.
- Trim the room-change block to just the load + state-reset path (kept as
  a defensive fallback for any future scenario where room prop changes
  without remount).
@Just-Insane Just-Insane requested review from 7w1 and hazre as code owners April 10, 2026 02:52
@Just-Insane (Contributor, Author)

Added a follow-up fix in the latest commit: setAtBottom(true) is now called synchronously when scrolling to the bottom in the cache-restore and pendingReadyRef recovery paths, immediately before setIsReady(true). This prevents the "Jump to Latest" button from flashing for one render cycle between isReady becoming true and VList's async onScroll confirming the bottom position.

…mmatic scroll-to-bottom

After setIsReady(true) commits, virtua can fire onScroll events with
isNowAtBottom=false during its height-correction pass (particularly on
first visit when item heights above the viewport haven't been rendered
yet). These intermediate events were driving atBottomState to false
while isReady=true, flashing the 'Jump to Latest' button.

Add programmaticScrollToBottomRef: set it before each scrollToIndex
bottom-scroll, suppress the first intermediate false event (clearing
the guard immediately), so the next event — the corrected position or
a real user scroll — is processed normally.
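The one-shot guard described in this commit message can be sketched as plain logic (the function names are illustrative; in the component, the guard is a ref consulted inside the onScroll handler):

```typescript
// Set before each programmatic bottom-scroll; cleared on the next event.
let programmaticScrollToBottom = false;

function beginProgrammaticScrollToBottom(): void {
  programmaticScrollToBottom = true;
}

// Returns true when the onScroll event should update at-bottom state.
function shouldProcessScrollEvent(isNowAtBottom: boolean): boolean {
  if (programmaticScrollToBottom) {
    programmaticScrollToBottom = false; // one-shot: clear immediately
    if (!isNowAtBottom) return false;   // swallow the single intermediate false
  }
  return true; // corrected position or a real user scroll passes through
}
```

Because the guard clears itself on the first event it sees, a genuine user scroll that arrives after the height-correction pass is never suppressed.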
@Just-Insane Just-Insane deleted the perf/timeline-scroll-cache branch April 12, 2026 19:33
@Just-Insane Just-Insane restored the perf/timeline-scroll-cache branch April 12, 2026 19:41
@Just-Insane Just-Insane reopened this Apr 12, 2026
Copilot AI review requested due to automatic review settings April 12, 2026 21:24

Copilot AI left a comment


Pull request overview

This PR improves room timeline revisit performance by caching Virtua VList item height snapshots and restoring scroll position immediately when returning to a room, avoiding the existing 80ms “fade-in stabilisation” delay. It also addresses a race where the recovery path could previously fail to persist a cache, causing subsequent revisits to still hit the fade-in path.

Changes:

  • Introduces a session-scoped roomScrollCache utility to store per-room VList cache + scroll position.
  • Updates RoomTimeline to load cached VList measurements on mount, restore scroll immediately on revisit, and persist cache/offset during scrolling and recovery.
  • Adds a Changeset entry documenting the perf improvement.

Reviewed changes

Copilot reviewed 4 out of 4 changed files in this pull request and generated 6 comments.

Changed files:

  • src/app/utils/roomScrollCache.ts: Adds a per-room, session-scoped Map-backed cache for VList snapshots + scroll state.
  • src/app/utils/roomScrollCache.test.ts: Adds Vitest coverage for basic save/load/overwrite behavior.
  • src/app/features/room/RoomTimeline.tsx: Restores cached VList state on revisit, saves cache on scroll and recovery, and adds a guard for programmatic scroll UI behavior.
  • .changeset/perf-timeline-scroll-cache.md: Documents the perf change as a patch-level update.


@@ -0,0 +1,22 @@
import { CacheSnapshot } from 'virtua';

Copilot AI Apr 12, 2026


CacheSnapshot is only used as a TypeScript type here. Importing it as a value (import { CacheSnapshot } ...) can force an unnecessary runtime dependency on virtua in the emitted JS/bundler graph. Switch this to a type-only import (e.g. import type { CacheSnapshot } ...).

Suggested change:
- import { CacheSnapshot } from 'virtua';
+ import type { CacheSnapshot } from 'virtua';

Copilot uses AI. Check for mistakes.
Comment on lines +13 to +21
const scrollCacheMap = new Map<string, RoomScrollCache>();

export const roomScrollCache = {
  save(roomId: string, data: RoomScrollCache): void {
    scrollCacheMap.set(roomId, data);
  },
  load(roomId: string): RoomScrollCache | undefined {
    return scrollCacheMap.get(roomId);
  },

Copilot AI Apr 12, 2026


scrollCacheMap grows for every room visited and is never cleared/evicted. Since cache can be sizable (item-size snapshots), this can lead to unbounded memory growth over a long session with many rooms. Consider adding an eviction strategy (e.g., LRU with a max entries cap) and/or a way to clear caches on logout/session reset.

Suggested change (before):

const scrollCacheMap = new Map<string, RoomScrollCache>();

export const roomScrollCache = {
  save(roomId: string, data: RoomScrollCache): void {
    scrollCacheMap.set(roomId, data);
  },
  load(roomId: string): RoomScrollCache | undefined {
    return scrollCacheMap.get(roomId);
  },

Suggested change (after):

const MAX_SCROLL_CACHE_ENTRIES = 100;
const scrollCacheMap = new Map<string, RoomScrollCache>();

function evictOldestScrollCacheEntry(): void {
  if (scrollCacheMap.size <= MAX_SCROLL_CACHE_ENTRIES) return;
  const oldestRoomId = scrollCacheMap.keys().next().value;
  if (oldestRoomId !== undefined) {
    scrollCacheMap.delete(oldestRoomId);
  }
}

export const roomScrollCache = {
  save(roomId: string, data: RoomScrollCache): void {
    if (scrollCacheMap.has(roomId)) {
      scrollCacheMap.delete(roomId);
    }
    scrollCacheMap.set(roomId, data);
    evictOldestScrollCacheEntry();
  },
  load(roomId: string): RoomScrollCache | undefined {
    const cached = scrollCacheMap.get(roomId);
    if (cached === undefined) return undefined;
    scrollCacheMap.delete(roomId);
    scrollCacheMap.set(roomId, cached);
    return cached;
  },
  clear(): void {
    scrollCacheMap.clear();
  },

import { PushProcessor, Room, Direction } from '$types/matrix-sdk';
import classNames from 'classnames';
import { VList, VListHandle } from 'virtua';
import { roomScrollCache, RoomScrollCache } from '$utils/roomScrollCache';

Copilot AI Apr 12, 2026


RoomScrollCache is used only as a type in this file. Import it with type (or split the import) to avoid pulling it into the runtime module graph unnecessarily, consistent with other files in this repo that use import type for types.

Suggested change:
- import { roomScrollCache, RoomScrollCache } from '$utils/roomScrollCache';
+ import { roomScrollCache } from '$utils/roomScrollCache';
+ import type { RoomScrollCache } from '$utils/roomScrollCache';

Comment on lines +321 to +327
if (savedCache) {
  // Revisiting a room with a cached scroll state — restore position
  // immediately and skip the 80 ms stabilisation timer entirely.
  if (savedCache.atBottom) {
    programmaticScrollToBottomRef.current = true;
    vListRef.current.scrollToIndex(processedEventsRef.current.length - 1, { align: 'end' });
    // scrollToIndex is async; pre-empt the button so it doesn't flash for

Copilot AI Apr 12, 2026


In the saved-cache restore path, scrollToIndex(processedEventsRef.current.length - 1, ...) can be called when processedEventsRef.current is empty (the same TimelineReset race described for the 80ms timer path), which would pass -1 to scrollToIndex. Add a guard for processedEventsRef.current.length > 0 (or reuse pendingReadyRef to defer restore until events repopulate) before attempting the scroll.

Comment on lines +329 to +334
setAtBottom(true);
} else {
pendingReadyRef.current = true;
vListRef.current.scrollTo(savedCache.scrollOffset);
}
}, 80);
hasInitialScrolledRef.current = true;
setIsReady(true);
} else {

Copilot AI Apr 12, 2026


When restoring a non-bottom scroll position (savedCache.atBottom === false), the component leaves atBottomState at its default true until onScroll fires. Because setIsReady(true) happens immediately, this can briefly hide the “Jump to Latest” UI and can trigger at-bottom behaviors (e.g. auto-mark-as-read) even though the restored position is not at the bottom. Consider setting setAtBottom(false) (or setAtBottom(savedCache.atBottom)) as part of the restore before marking the timeline ready.

Comment on lines +699 to +705
// Keep the scroll cache fresh so the next visit to this room can restore
// position (and skip the 80 ms measurement wait) immediately on mount.
roomScrollCache.save(room.roomId, {
  cache: v.cache,
  scrollOffset: offset,
  atBottom: isNowAtBottom,
});

Copilot AI Apr 12, 2026


Saving the scroll cache on every onScroll callback allocates a new object and updates the Map at scroll-event frequency, which can be very hot during user scrolling. Consider throttling/debouncing these saves (e.g. per animation frame) and/or only saving when values materially change, to reduce GC pressure and keep scrolling smooth.
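One possible shape for this suggestion (illustrative, not code from the PR): skip the Map write when neither the offset nor the at-bottom flag has materially changed since the last save. The maybeSaveScroll helper and the 1 px threshold are assumptions made for the sketch.

```typescript
// Last-persisted values; null until the first save goes through.
let lastSavedOffset: number | null = null;
let lastSavedAtBottom: boolean | null = null;

// Returns true when a save was actually performed.
function maybeSaveScroll(
  save: (offset: number, atBottom: boolean) => void,
  offset: number,
  atBottom: boolean,
): boolean {
  // Skip sub-pixel jitter: same at-bottom state and < 1 px of movement.
  if (
    lastSavedOffset !== null &&
    Math.abs(offset - lastSavedOffset) < 1 &&
    atBottom === lastSavedAtBottom
  ) {
    return false;
  }
  lastSavedOffset = offset;
  lastSavedAtBottom = atBottom;
  save(offset, atBottom);
  return true;
}
```

A requestAnimationFrame coalescer on top of this would further cap writes at one per frame, at the cost of needing a flush on unmount so the final position is not lost.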


2 participants