As the proverb says, a journey of a thousand miles begins with a single step. Improving web performance likewise starts with identifying and prioritizing issues. The good news is that Google provides powerful free tools to facilitate this first step. This article gives an overview of these tools and shows how to access web performance data and translate it into actionable insights.
Lighthouse and Chrome UX Report: lab tests vs field data
Lab tests and field data are two complementary approaches to analyzing a page’s web performance.
A lab test records the loading process on an emulated device. Such an experiment helps discover specific web performance issues and estimate their impact. It is also the only way to evaluate the performance of an unpublished or recent page, or to observe changes in behavior immediately after an update. Google's open-source tool for synthetic testing is called Lighthouse. For mobile experience analysis, Google Lighthouse emulates a mid-tier mobile device on a slow 4G network. When analyzing the desktop experience, Lighthouse emulates a lower-end PC with a 10-megabit-per-second connection.
Lab experiments, however, can miss many of the factors and interactions that affect real users.
A more accurate picture can be obtained by collecting field data from real users. Field data takes into account the variety of devices and connection speeds of the actual audience. Google records Core Web Vitals performance from the sessions of users browsing with Chrome. These records are anonymized and aggregated in a public dataset: the Chrome UX Report (CrUX).
Depending on your needs and technical level, you might choose one of several ways to access Google Lighthouse and the Chrome UX Report:
- Get an automated audit of a page based on both these sources via the web interface of PageSpeed Insights.
- Run a Lighthouse analysis from the dedicated tab of Developer Tools in the Chrome browser.
- Explore the detailed raw data from the Chrome UX Report via the command-line interface of BigQuery.
- Call the Lighthouse API and CrUX API to integrate their data into other applications.
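To give an idea of the last option, a CrUX API request is a simple JSON POST. Below is a minimal Python sketch: the `records:queryRecord` endpoint and the `url`/`formFactor` request fields are Google's, while the helper names and the sample response values are invented here for illustration.

```python
import json
import urllib.request

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def build_crux_query(page_url, form_factor="PHONE"):
    """Build the JSON body for a CrUX API queryRecord request."""
    return {"url": page_url, "formFactor": form_factor}

def extract_p75(response, metric="largest_contentful_paint"):
    """Pull the 75th-percentile value for a metric out of a CrUX response."""
    return response["record"]["metrics"][metric]["percentiles"]["p75"]

# Actually sending the request requires a free API key from the Google Cloud console:
# body = json.dumps(build_crux_query("https://example.com/")).encode()
# req = urllib.request.Request(f"{CRUX_ENDPOINT}?key=YOUR_API_KEY", data=body,
#                              headers={"Content-Type": "application/json"})
# response = json.load(urllib.request.urlopen(req))

# Illustrative response fragment (values invented for the example):
sample = {"record": {"metrics": {
    "largest_contentful_paint": {"percentiles": {"p75": 3800}}}}}
print(extract_p75(sample))  # LCP p75, reported in milliseconds
```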
The first method is the most attractive for general use. It will be covered in the next section.
How to interpret PageSpeed Insights audits
PageSpeed Insights (also referred to as PSI) quickly gives an easy-to-interpret report on page load speed and opportunities for its improvement. The tool is accessible via a simple web interface at https://pagespeed.web.dev/.
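Besides the web form, the same audit can be requested programmatically through the PageSpeed Insights API. Here is a sketch of assembling the request URL in Python; the `v5/runPagespeed` endpoint and its `url`/`strategy` parameters are Google's, while the helper function name is ours.

```python
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url, strategy="mobile", api_key=None):
    """Assemble a PageSpeed Insights API request URL."""
    params = {"url": page_url, "strategy": strategy}
    if api_key:  # optional for occasional use, needed for higher quotas
        params["key"] = api_key
    return f"{PSI_ENDPOINT}?{urlencode(params)}"

print(psi_request_url("https://www.bloomandwild.com/"))
```

The JSON response to a GET on this URL contains both the Lighthouse lab audit and the CrUX field data for the page.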
Let’s run PageSpeed Insights for a category page of Bloom & Wild, a popular flower-delivery e-commerce site in the UK.
Just enter the URL of interest and press the blue Analyze button.
By default, the tool displays the audit for the mobile experience. (The audit for desktop experience is available in a separate tab).
The first report shows the page’s Core Web Vitals, measured for real users browsing with Chrome over the last 28 days. (The data comes from the Chrome UX Report.) The metrics are given at the 75th percentile; in other words, the page performed worse than the reported figure for 25% of its users.
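The 75th-percentile logic can be illustrated with a quick calculation on made-up numbers: given a set of recorded LCP values, the reported figure is the one that 75% of sessions match or beat.

```python
from statistics import quantiles

# Hypothetical LCP measurements (in seconds) from eight Chrome sessions
lcp_samples = [1.8, 2.1, 2.4, 2.6, 3.0, 3.4, 3.9, 5.2]

# quantiles(..., n=4) returns [Q1, median, Q3]; Q3 is the 75th percentile
p75 = quantiles(lcp_samples, n=4, method="inclusive")[2]
print(round(p75, 3))  # 3.525: the kind of value a CrUX-style report would surface
```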
We can observe that our page fails the Core Web Vitals assessment (based on real users’ experience). This is bad for conversions and is also a negative factor for SEO. In particular, the Largest Contentful Paint, Cumulative Layout Shift, and Interaction to Next Paint metrics are in the red zone.
The next part, the lab test, reproduces the loading experience in a controlled environment. This helps to identify the sources of the poor performance.
The audit is powered by Google Lighthouse. It emulates a Moto G4 phone on a slow 4G connection. The page is rated on four aspects: loading performance, accessibility, UX best practices, and SEO optimization. The scores are normalized: the highest possible score is 100. A score below 50 is considered poor and is marked in red.
The performance score is calculated by comparing the performance indicators to their benchmark values and weighting the results. As of February 2023, the indicators and their weights are as follows: Total Blocking Time (30%), CLS (25%), LCP (25%), Speed Index (10%), FCP (10%).
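The weighting itself is a plain weighted average. Below is a simplified sketch using the February 2023 weights; note that the per-metric 0-100 scores and their values are hypothetical (the real Lighthouse first maps each raw metric value onto a scoring curve before weighting).

```python
# Lighthouse performance weights as of February 2023
WEIGHTS = {
    "total_blocking_time": 0.30,
    "cumulative_layout_shift": 0.25,
    "largest_contentful_paint": 0.25,
    "speed_index": 0.10,
    "first_contentful_paint": 0.10,
}

def performance_score(metric_scores):
    """Weighted average of per-metric scores, each already on a 0-100 scale."""
    return round(sum(WEIGHTS[m] * s for m, s in metric_scores.items()))

# Hypothetical per-metric scores for a slow page
scores = {
    "total_blocking_time": 20,
    "cumulative_layout_shift": 10,
    "largest_contentful_paint": 15,
    "speed_index": 40,
    "first_contentful_paint": 50,
}
print(performance_score(scores))  # 21
```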
In practice, it makes little sense to aim for a mobile performance score close to 100: only the most lightweight pages with limited functionality can achieve it. (It is worth repeating that, in contrast to field data on Core Web Vitals, the Lighthouse score has no direct impact on SEO.) A score above 70 is considered good and seems a reasonable objective for most use cases.
The mobile performance of our test page from Bloom & Wild is assessed as rather poor: 12 out of 100. (The result might vary slightly each time you run the test.)
PageSpeed Insights lists recommendations to improve the loading experience, ranked by the potential savings in load time. (These estimates aren’t precise and should only be interpreted as relative indicators of priority.)
The limits of free tools by Google and their alternatives
The ease of use and interpretation makes Google Lighthouse a reference for lab testing of performance. However, it offers limited configuration out of the box. A free alternative with more flexibility is WebPageTest, which can emulate a much wider range of conditions: dozens of devices, different browsers, connection locations, and network speeds.
As for the field data from the Chrome UX Report, it has no publicly available alternative. Keep in mind, though, that this dataset only contains data on the Core Web Vitals metrics. If you want to investigate patterns of user interaction with your website (clicks, scrolls, conversions), you’d need to set up real user monitoring (RUM) with a different tool, such as Pingdom, SpeedCurve, or Raygun.