The topic of web performance (sometimes called 'perf' for short) is all about how fast a web app or page loads in a user's browser—as well as how smooth it feels to use once loaded. Web performance optimization refers to the various techniques we can use to increase web performance.
Why does web performance matter?
Performance is an important consideration in user experience. Ever been frustrated trying to use a slow, clunky website? You're not alone. As time goes on, users are becoming less and less patient with how long they are prepared to wait before simply abandoning a site.
According to Google's Core Web Vitals:
The recommended Largest Contentful Paint (LCP), the time it takes for a page's main content to render, is under 2.5 seconds.
Depending on the site, a slow-loading page could mean fewer page impressions and lower advertising revenue, lower conversion rates and lost sales, or simply poor adoption of your product in general.
It's your job as a developer to know how to identify the causes of poor performance—and what you can do to fix it.
It can also come up in interviews, so it's a great idea to have things to say on the topic (you'll get bonus points for talking through a specific example of how you improved performance on a project!).
So how can you improve the performance of a website?
What can we do to make pages load more quickly, perform more efficiently, and feel fast and responsive to users?
Before making any optimizations, the first thing you should do is analyze the current load time and identify any bottlenecks. This gives you benchmarks, so you can see which areas are having the most impact on performance and measure the improvements you make.
In the Chrome Developer Tools, you can do this with the Performance tab. It shows every step the browser takes to load a page, how long each step takes, which processes block other processes, and which ones happen concurrently. (For a more high-level analysis, tools like PageSpeed Insights will test a page and suggest specific optimizations for you, which can be a better place to start for beginners.)
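You can also read the browser's own Navigation Timing data in code. Here's a small sketch, assuming a helper function (`summarizeNavigation` is a made-up name, not a browser API) that pulls a few key milestones out of a Navigation Timing entry:

```javascript
// Summarize key milestones from a Navigation Timing entry, i.e. the object
// returned by performance.getEntriesByType("navigation") in the browser.
// Field names follow the Navigation Timing Level 2 spec.
function summarizeNavigation(entry) {
  return {
    ttfb: entry.responseStart - entry.startTime, // time to first byte
    domContentLoaded: entry.domContentLoadedEventEnd - entry.startTime,
    load: entry.loadEventEnd - entry.startTime, // full page load
  };
}

// In the browser, you would use it like this:
// const [nav] = performance.getEntriesByType("navigation");
// console.log(summarizeNavigation(nav));
```

Logging numbers like these before and after each change is a cheap way to confirm an optimization actually helped.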
Pro tip: For a more thorough intro to web performance optimization, including how to use the Chrome Developer Tools, I recommend Udacity's free course: Website Performance Optimization
If a page is loading slowly, it could be that one request is taking a long time, that the sequence of requests is inefficient, or that some processes are blocking resources unnecessarily.
There are three main areas you can potentially optimize:
- Defining critical page resources
- Preparing those resources
- Fetching them from the server
Defining critical resources
So the first set of optimizations we can make involves minimizing the number of critical resources a page has to request.
Start by making sure every (internal and external) library or framework is actually necessary, and that the performance cost of requesting it is worth the benefit it's providing to your users. Then you can move onto the other optimizations.
For CSS, you can minimize critical resources by splitting your code into different files based on purpose. So styles that only apply to very large screens could go in a separate file, styles that only apply to print go in another separate file, and so on. Then, when importing with link tags, you specify a condition with the media attribute:

```html
<link href="style.css" rel="stylesheet">
<link href="large-desktop.css" rel="stylesheet" media="(min-width: 1920px)">
<link href="print.css" rel="stylesheet" media="print">
```
This way, resources not needed for the environment will still be downloaded, but won't block page rendering.
JavaScript is render-blocking by default: the browser stops parsing the page to download and execute each script it encounters. Adding the async attribute tells the browser it can keep parsing while the script downloads:

```html
<!-- Blocking -->
<script src="awesome-script.js"></script>

<!-- Non-blocking -->
<script src="awesome-script.js" async></script>
```
It's also a good idea to include scripts not responsible for the layout of your page just before the closing body tag. This is because when the browser hits a script tag, it stops to download and run it, so everything above it has already been parsed. This has the added benefit that you don't have to listen for an event like DOMContentLoaded: by the time the script runs, you know the previous page elements have loaded already. You could also defer non-essential scripts until after the page has loaded, to stop them impacting the initial render.
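Putting those two ideas together, a typical pattern looks something like this (the script names here are placeholders):

```html
<head>
  <!-- defer: downloads in parallel, runs in order after the document is parsed -->
  <script src="analytics.js" defer></script>
</head>
<body>
  <!-- page content... -->

  <!-- placed last: the elements above are guaranteed to exist by now -->
  <script src="enhancements.js"></script>
</body>
```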
Preparing your resources
Now we're not wasting time being blocked by requests for unnecessary resources, but we can save even more time by making sure the resources we do need are as small as possible. Fewer bytes over the network means faster responses.
Text resources like HTML, CSS, and JavaScript can be minified (stripping comments and whitespace, and shortening identifiers), and images can be compressed or served in modern formats. There are a lot of minifiers and optimizers to choose from, plus plugins that integrate the popular ones into build processes (with Gulp, Grunt, Webpack, etc.) so you can automate these tasks.
Fetching your resources
So we've minimized the number of critical resources, and made them as small as possible, but HTTP requests are still an expensive part of loading a webpage. There are several things we can do to ensure we're making them efficiently.
Order of requests
First, we need to look at the order in which we're requesting resources. The most important requests should happen first, so things like essential CSS aren't being blocked by some less important process. This is why link elements are usually placed at the top of a page, inside the head element.
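You can also hint priorities explicitly: rel="preload" asks the browser to fetch an important resource early, before it would otherwise be discovered. A sketch (the file names are placeholders):

```html
<head>
  <!-- critical CSS first, so rendering is unblocked as soon as possible -->
  <link rel="stylesheet" href="critical.css">
  <!-- hint the browser to fetch an important font early -->
  <link rel="preload" href="heading-font.woff2" as="font" type="font/woff2" crossorigin>
</head>
```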
Number of requests
One way to reduce the number of requests is to inline critical CSS (or small scripts) directly into the HTML with style or script tags. You save a request, but the downsides are potentially repeating code across multiple pages (which is a problem for maintenance), not being able to use CSS pre- or postprocessors, and bloating your HTML page.
An HTML page should ideally be smaller than 14KB, otherwise the browser is going to have to make multiple roundtrips to download it anyway. On that note, if you have a lot of content to show on a page, you can load your HTML incrementally with AJAX, using techniques like pagination or infinite scroll to keep initial page load time to a minimum.
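Here's a minimal sketch of incremental loading via pagination. The /api/articles endpoint and its page and pageSize parameters are hypothetical, and the fetch function is injected so it's easy to stub out:

```javascript
// Fetch one page of content at a time instead of the whole list up front.
// `fetchFn` defaults to the browser's global fetch, but can be swapped
// for a stub in tests or a wrapper that adds headers.
async function loadArticles(page, pageSize, fetchFn = fetch) {
  const url = `/api/articles?page=${page}&pageSize=${pageSize}`;
  const response = await fetchFn(url);
  if (!response.ok) throw new Error(`Request failed: ${response.status}`);
  return response.json();
}
```

You might wire this to a "Load more" button, or to an IntersectionObserver for infinite scroll, bumping the page number on each call.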
Caching requests
Finally, we can cut requests on subsequent visits to a page by saving certain resources locally. This is known as HTTP caching. If a resource hasn't changed since the last visit, why request it again?
Caching is controlled by the server response rather than the request, so has to be set server-side. In the response headers, you can specify things like whether to allow caching, and how long the resource should be cached for. Once it has been downloaded and cached, when the client needs it a second time, the browser will first check the cache for the resource. If it's there, and hasn't expired, that's one less network roundtrip the browser has to make.
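Server-side, this usually boils down to choosing a Cache-Control value per resource type. A hypothetical helper might look like this (the rules below are illustrative conventions, not a standard):

```javascript
// Pick a Cache-Control header based on the kind of resource.
// Fingerprinted assets (e.g. app.3f9c2b.js) can be cached "forever",
// because any change to the content produces a new file name.
function cacheControlFor(path) {
  if (/\.[0-9a-f]{6,}\.(js|css|woff2?)$/.test(path)) {
    return "public, max-age=31536000, immutable"; // one year
  }
  if (/\.(png|jpg|jpeg|svg|webp)$/.test(path)) {
    return "public, max-age=86400"; // one day
  }
  return "no-cache"; // HTML etc.: store, but revalidate on every request
}
```

A server would call this when building each response and set the returned string as the Cache-Control header.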
Resources that don't change very often (e.g. libraries), or code that is the same for all users, are the best candidates for caching, whereas code that's regularly updated is better cached for little or no time.
Even with these techniques, optimizing page load is still a balancing act. Is including that library worth it for the amount we're using it? Is inlining that CSS to save a request a fair trade-off for how much it increases the size of our HTML file? It takes careful analysis and decision-making to find the best compromise for both developer convenience and user experience.