Optimizing Web Page Render

qngh
Published in The Startup
5 min read · Feb 22, 2021

For a very long time, there has been one topic I have always wanted to write about. It was among the first things I picked up while working as a front-end engineer, and it proves that small details can make a big impact: optimizing web page rendering. In this article, I am going to share some tips to improve web rendering. But before diving into the techniques, it is essential to first understand how a web page is rendered. I will briefly describe the render phase in the following section; however, for the full life cycle of a web page, I highly recommend checking out this article from MDN if you have not already.

Photo by Francesco Ungaro from Pexels

A glance at the render phase

A typical browser renders a document in 4 steps: style, layout, paint, and compositing. At the beginning of the render phase, the DOM tree and the CSSOM tree (similar to the DOM tree, but built by parsing CSS) are combined into a render tree, where CSS styles are matched to their corresponding DOM nodes (style). The browser then computes the geometry of the nodes in the render tree (layout) and finally paints those elements on the screen (paint). Under certain circumstances, the visual appearance of the document is broken into different layers. This is where compositing is needed to ensure the layers are drawn in the correct order.

Before becoming a front-end engineer, I was (and still am) a web user. I really enjoy visiting web pages with good UI/UX, and I work hard to bring such experiences to my own users. For me, animations must be smooth, user interactions must be lightning-fast, and no janky scrolling is allowed. To achieve that, I need my script computations plus the rendering phase to complete within ~16ms. Why 16ms, you may wonder? Responsive user interfaces target a frame rate of 60 frames per second, so one frame should take 1 / 60 ≈ 0.0167s ≈ 16.67ms to render.
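One rough way to check whether a page actually stays within that budget is to time frames with requestAnimationFrame. Here is a minimal sketch that logs any frame exceeding ~16.7ms:

let lastFrameTime = performance.now();

function checkFrameBudget(now) {
  // `now` is the timestamp the browser passes to rAF callbacks
  const frameDuration = now - lastFrameTime;
  lastFrameTime = now;
  if (frameDuration > 16.7) {
    console.warn(`Slow frame: ${frameDuration.toFixed(1)}ms`);
  }
  requestAnimationFrame(checkFrameBudget);
}

requestAnimationFrame(checkFrameBudget);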

Avoid unnecessary reflow & repaint

Animated elements are a common feature of web pages. I believe you can easily spot them on the pages you visit frequently: an auto-scrolling carousel, a dropdown, collapsible boxes, and so on. As they are fundamental elements of a web page, it's important to make them perform well. Let's take this example, where I am going to create a navigation drawer that slides in and out when a button is toggled.

Navigation drawer example
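As a sketch of such a drawer (placeholder class names, assuming a 170px-wide panel), this first attempt slides the drawer by animating its left offset:

// Assumes the drawer is positioned (e.g. position: fixed) with an
// initial left of -170px set in its CSS.
const drawer = document.querySelector('.drawer');
const toggle = document.querySelector('.drawer-toggle');
let open = false;

// Sliding by animating a geometry property such as `left`
drawer.style.transition = 'left 0.3s ease';

toggle.addEventListener('click', () => {
  open = !open;
  drawer.style.left = open ? '0px' : '-170px';
});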

Done. This works just as I expected. I then went a step further and examined the performance by recording a performance snapshot with the developer tools.

Use of positions’ performance snapshot

From the snapshot above, I noticed warnings about Layout Shift events. They result from the drawer's position being updated throughout the animation, which forces the browser to recalculate the element's geometry and hence re-invokes the layout, paint, and compositing phases. The process I just described is called reflow: the browser has to recalculate the positions of all elements in order to re-lay out the page.

Does that mean any animation will cause the page to reflow? The answer is no. It turns out that not all CSS properties trigger reflow. Some properties only cause the browser to repaint (and composite), e.g. background-color or background-image. Some are limited to the compositing phase, which is even better. Personally, I use this website to check which CSS properties trigger which phases.
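As a rough illustration (the exact behavior varies between browsers and depends on compositing layers), here is roughly what different style writes cost:

const box = document.querySelector('.box'); // placeholder element

box.style.left = '10px';                    // geometry change: layout + paint + composite
box.style.backgroundColor = 'tomato';       // visual-only change: paint + composite
box.style.transform = 'translateX(10px)';   // usually composite only, once the element has its own layer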

Let's get back to our drawer example above. After going through the CSS list, I found an exciting CSS property: transform. It causes neither reflow nor repaint. Its work can even be offloaded to the GPU, which reduces the pressure on the main thread. So good! This is the updated solution.
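As a sketch of that approach (again with placeholder class names), the drawer stays where it is laid out and slides with transform instead:

// Assumes the drawer starts off-screen via transform: translateX(-100%)
// in its CSS.
const drawer = document.querySelector('.drawer');
const toggle = document.querySelector('.drawer-toggle');
let open = false;

// transform animates without triggering layout or paint
drawer.style.transition = 'transform 0.3s ease';

toggle.addEventListener('click', () => {
  open = !open;
  drawer.style.transform = open ? 'translateX(0)' : 'translateX(-100%)';
});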

Now, let’s take another performance snapshot.

Use of transform’s performance snapshot

Hurray! All layout shift warnings are gone.

Avoid forced synchronous layout

The layout phase can technically be forced, via JavaScript, to happen sooner than it is supposed to. Unintended forcing like that should really be avoided. It happens when a write to a CSS property is followed by a read of a layout value. To yield the correct value for the read, the browser has to recalculate the layout immediately, which triggers a reflow. Notably, the issue only appears in the write-read order, and only with certain CSS properties. The list of affected properties can be found here.
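A minimal illustration of that write-read pattern (placeholder selector):

const drawer = document.querySelector('.drawer');

drawer.style.width = '200px';       // write: invalidates the current layout
const width = drawer.offsetWidth;   // read: forces the browser to recalculate layout right now
console.log(width);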

What's worse than a forced synchronous layout? A series of forced synchronous layouts, a.k.a. layout thrashing. Imagine a loop in which, on each iteration, the browser has to recalculate the layout of all elements and repaint the document. That's exactly what happens in the drawer implementation below.

const drawer = document.querySelector('.drawer');

for (let i = 0; i < 170; ++i) {
  drawer.style.width = `${drawer.offsetWidth + 1}px`;
  // maybe some delay needed
}

In this example, I add 1px at a time to the drawer's width to make it look like it is sliding. The loop creates a series of read-write-read-write-read-… and causes layout thrashing. A fix for this would be:

const drawer = document.querySelector('.drawer');
let drawerWidth = drawer.offsetWidth;

for (let i = 0; i < 170; ++i) {
  drawer.style.width = `${++drawerWidth}px`;
  // maybe some delay needed
}

This time, offsetWidth is read only once, and the width is written multiple times. The snippet resolves the layout thrashing issue, but it is still less preferable than the transform-based solution from the previous section.

An escape hatch

Sometimes it's unavoidable to use CSS properties that trigger reflow or repaint. The contain CSS property can be very useful in such cases. It helps ease the pain by allowing me to limit updates to a particular part of the DOM. To use the contain property, my element and its content need to be independent of the rest of the DOM. See contain and its options in detail here.
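For instance, assuming the drawer and its content are self-contained, a sketch could declare layout and paint containment so the browser can scope rendering work to the drawer's subtree:

const drawer = document.querySelector('.drawer'); // placeholder selector

// Layout and paint inside the drawer no longer affect the rest of the page,
// so the browser can limit reflow/repaint work to the drawer's subtree.
drawer.style.contain = 'layout paint';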

Key takeaways

My goal in writing this article is to raise awareness of unnecessary work that can degrade the user experience. These issues are fixable with reasonable effort and care. Be a force for good and make web pages smooth again. See you in the next article.
