This example discusses the difficulty of computing how quickly the user sees a visual response to a finger tap, mouse click, or key press.

The platform already delivers events for these inputs, and there have even been discussions of attaching hardware-level timestamps to them.

However, that isn't enough to compute the actual response time. For that, we need to know when the visual effects of the input actually appear onscreen. In older browsers that update the screen on the main JavaScript thread right after requestAnimationFrame, you can approximate the paint time by scheduling a setTimeout(..., 0) inside rAF and timestamping when it fires. However, most browsers these days run the painting and compositing steps on another thread after rAF. That time is invisible to the platform, so the on-screen time is difficult to estimate.
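As a rough illustration, here is a minimal sketch of the setTimeout-inside-rAF trick. The function name `estimateResponseTime` and the `report` callback are hypothetical, and the non-browser fallback for `requestAnimationFrame` is only there so the sketch runs outside a browser; this estimate is only meaningful in a browser that paints on the main thread right after rAF.

```javascript
// Fall back to a timer-based rAF outside a browser (for demonstration);
// in a browser this is the real requestAnimationFrame.
const raf = typeof requestAnimationFrame !== 'undefined'
  ? requestAnimationFrame
  : (cb) => setTimeout(() => cb(performance.now()), 16);

// `inputTimeStamp` would come from the input event (ideally the
// hardware-level timestamp mentioned above).
function estimateResponseTime(inputTimeStamp, report) {
  raf(() => {
    // A 0 ms timeout queued inside a rAF callback runs right after the
    // rAF callbacks finish -- on a main-thread-painting browser that is
    // roughly when the frame reaches the screen.
    setTimeout(() => report(performance.now() - inputTimeStamp), 0);
  });
}

estimateResponseTime(performance.now(), (latency) => {
  console.log(`approximate input-to-paint latency: ${latency.toFixed(1)} ms`);
});
```

In an event handler, `inputTimeStamp` would be `event.timeStamp`; note the estimate silently breaks on browsers that composite on another thread, which is the point of this example.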

In some browsers, a second rAF cannot begin until the first is put onscreen. However, this is just an implementation accident, and browsers could switch at any moment to allowing N rAFs to be queued; Chrome, for instance, has prototyped this before.
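The implementation accident above is what the common "double rAF" trick relies on. A minimal sketch, assuming a browser where the second rAF callback cannot run until the first frame is onscreen (the function name `onFramePresented` is hypothetical, and the non-browser fallback is only so the sketch runs for demonstration):

```javascript
// Fall back to a timer-based rAF outside a browser (for demonstration).
const raf = typeof requestAnimationFrame !== 'undefined'
  ? requestAnimationFrame
  : (cb) => setTimeout(() => cb(performance.now()), 16);

// Calls `report` with a timestamp that approximates when the frame
// produced by the first rAF callback was actually displayed.
function onFramePresented(report) {
  raf(() => {
    // Frame 1: apply the visual response to the input here.
    raf((ts) => report(ts)); // Frame 2: frame 1 is (probably) onscreen.
  });
}

onFramePresented((ts) => {
  console.log(`frame presented at ~${ts.toFixed(1)} ms`);
});
```

Because this rests on an implementation detail rather than a guarantee, a browser that allows N queued rAFs would make the second callback fire before the first frame is displayed, silently invalidating the measurement.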

Further complicating this, the page may contain accelerated animations. In browsers that run CSS animations on a separate compositor thread, it is not enough to know when the compositor thread draws, since it may be drawing a frame of the CSS animation for your old page rather than the new frame containing the input's visual response. To really get the right answer, you need to know precisely which frame the compositor is drawing.