How does Google rate the UX of your website through the Core Web Vitals?
Google's interest in website loading speed is not recent. Since the creation of the PageSpeed browser extension 12 years ago, Google has launched multiple initiatives aimed at improving websites: APIs, modules for web servers, measurement of server response times, enhancement of the mobile offering, AMP... Things accelerated in 2017-2018 with the Speed Update and the availability of new performance indicators with data collected directly from users.
These indicators have evolved, and we are now entering a phase of clarification: a roadmap has been set, and Google provides tools to measure your results and recommendations to improve them. In 2020, Google announced three key indicators, called Core Web Vitals, to help website publishers objectively evaluate the quality of the experience they offer their users.
Google's initiative aims to:
- Make the end user the top priority
- Structure the reflection around metrics that can be understood by all employees and not only by technical teams
- Provide clear monitoring indicators and recommendations to deliver the best possible customer experience
The Core Web Vitals are made up of four key indicators, all related to the end-user experience: LCP (Largest Contentful Paint), CLS (Cumulative Layout Shift), FID (First Input Delay), and INP (Interaction to Next Paint). Each of these metrics rates one facet of the browsing experience by measuring the interactions users can have with websites.
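Google has published a "good" target for each of these metrics (2.5 s for LCP, 0.1 for CLS, 100 ms for FID, 200 ms for INP). As an illustration, those documented thresholds can be encoded in a small helper; this is only a sketch, and the constant and function names are my own:

```typescript
// Google's documented "good" thresholds for the Core Web Vitals.
// LCP, FID, and INP are in milliseconds; CLS is a unitless score.
const GOOD_THRESHOLDS = {
  LCP: 2500, // Largest Contentful Paint ≤ 2.5 s
  CLS: 0.1,  // Cumulative Layout Shift ≤ 0.1
  FID: 100,  // First Input Delay ≤ 100 ms
  INP: 200,  // Interaction to Next Paint ≤ 200 ms
} as const;

type Metric = keyof typeof GOOD_THRESHOLDS;

// A measurement is "good" when it does not exceed the threshold.
function isGood(metric: Metric, value: number): boolean {
  return value <= GOOD_THRESHOLDS[metric];
}
```

For example, `isGood("LCP", 1800)` returns `true`, while `isGood("CLS", 0.25)` returns `false`.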
The INP (Interaction to Next Paint) is the most recent indicator, introduced by Google in 2022. It is an experimental metric that assesses responsiveness: when an interaction causes a page to become unresponsive, that is a poor user experience. INP observes the latency of all interactions a user has made with the page and reports a single value that all (or nearly all) of those interactions fell below. A low INP means the page was consistently able to respond quickly to all, or the vast majority, of user interactions.
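In rough terms, INP reports the worst interaction latency of the session, except that for very long sessions a small number of outliers is discarded (on the order of one per 50 interactions). The sketch below illustrates that aggregation idea; it is a simplification of an experimental metric, and the function name is my own:

```typescript
// Hedged approximation of INP's aggregation: take the worst
// interaction latency, but skip roughly one outlier for every
// 50 interactions observed in the session.
function approximateINP(latenciesMs: number[]): number | undefined {
  if (latenciesMs.length === 0) return undefined; // no interactions yet
  const sorted = [...latenciesMs].sort((a, b) => b - a); // descending
  const outliersToSkip = Math.floor(latenciesMs.length / 50);
  return sorted[Math.min(outliersToSkip, sorted.length - 1)];
}
```

With only a few interactions, this is simply the slowest one: `approximateINP([80, 120, 300])` yields `300`.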
What impact do Core Web Vitals have on SEO?
SEO activities generally pursue one of the following two objectives:
- Indexing: transmitting information to the engines
- Ranking: positioning pages within search results
Web performance metrics have no direct impact on indexing. However, a fast site is more easily crawled than a slower one: within the time allotted to robots to browse the site, they crawl and index more pages if the content loads faster.
As far as ranking is concerned, no penalties are applied; instead, there is an SEO bonus for passing the Core Web Vitals test. Ultimately, when one site rises in the rankings, others necessarily fall behind it. Doing nothing is therefore choosing to lose ranking in the long term.
Google: "The ranking gives priority to the overall relevance and accuracy of the information provided [...]. However, when many pages offer similar content, the on-page experience becomes much more important to their visibility in search."
Google's Core Web Vitals, therefore, highlight websites that offer a better user experience.
UX & browsing context
The user experience when loading a web page or interacting within that page can be summarized in three key moments:
- Visual feedback
The user can see that loading is taking place, as the first elements are displayed on the page. With progressive loading, the page feels faster to the user.
- Confidence Indicator
The user has enough information in front of them to think they can interact with the page. For example, the most important content is available, or most of the content on the page is displayed.
- UX Frictions
The user can interact with the page, but the experience is compromised by slowdowns and content lags that degrade the smoothness of navigation.
These three moments identified by Google are not experienced identically in every user context.
Today everything is a question of adaptability: perceived performance depends on the user:
- A website's loading time can be fast for one user (4G network / latest-generation smartphone) but slow for another (3G network / low-end device).
- Two websites may finish loading in the same amount of time, but one may appear to load faster (if it loads content gradually, for example).
- A website may load quickly but respond slowly to user interaction.
That's why, when talking about UX quality, it is important to use objective criteria and therefore to take all usage contexts into account.
How to pass the Core Web Vitals test?
Google's indicators, as previously mentioned, do not only rate the UX of websites but also give us clues on the points to improve to deliver an optimal browsing experience.
For each of these indicators, Google has defined thresholds that tell us whether a result is good, needs improvement, or is poor. To validate an indicator, at least 75% of users must have had a good experience (with respect to what the indicator measures). If all three indicators are validated for a page, the test passes and the website gets the SEO bonus.
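The 75% rule is equivalent to checking that the 75th percentile of the field measurements sits within the "good" threshold. A minimal sketch, with function names of my own choosing and a simple nearest-rank percentile:

```typescript
// Nearest-rank percentile: the smallest value such that at least
// p% of the samples are less than or equal to it.
function percentile(values: number[], p: number): number {
  const sorted = [...values].sort((a, b) => a - b);
  const idx = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[Math.max(0, idx)];
}

// A metric validates when its 75th percentile is within the
// "good" threshold, i.e. at least 75% of users had a good experience.
function metricPasses(samples: number[], goodThreshold: number): boolean {
  return percentile(samples, 75) <= goodThreshold;
}
```

For instance, if 8 out of 10 recorded LCP samples are under 2.5 s, the p75 falls within the threshold and the metric validates, even though the slowest visits did not.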
Today, if we follow the data available in the Chrome UX Report 2021, only ⅓ of the origins and domains pass the Core Web Vitals test. So there is still some effort to be made.
If one wants to pass the Google test, the following strategy should be implemented:
- Know and Do
- Automate the follow-up
- Analyze upstream
Step 1: Start measuring to identify the priority
Performance metrics are measured in two ways:
- In the lab: using tools to simulate a page load in a consistent, controlled environment → reliable and essential for testing new features and releases (before rollout)
- In the field: on real users loading and interacting with the page → The only way to truly know how your site performs for your users
Neither of these options is necessarily better or worse than the other. You need to use both to ensure good performance.
Step 2: Convince your teams
Explain to your teams the importance of monitoring Core Web Vitals to improve the user experience and how they impact the company's business indicators.
Get agreement/budget internally to launch small experiments.
Create a shared goal among stakeholders to improve Core Web Vitals across teams.
Step 3: Sponsorship
Identify who is responsible for the topic: someone who trains the teams and drives the developments that demonstrate value.
Step 4: Prioritize - Test - Compare - Monitor
Prioritize: choose a high-traffic and/or high-converting page to get meaningful results (e.g., ad landing page, conversion page, or popular pages).
A/B test: use server-side testing to avoid rendering costs, and compare results between the optimized and non-optimized versions.
Monitor: use continuous monitoring to avoid regressions.
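The monitoring step above can be sketched as a simple drift check: compare the latest p75 reading of a metric against the baseline established after optimization. This is only an illustration; the function name and the 10% default tolerance are my own choices, not a Google recommendation:

```typescript
// Flag a regression when the latest p75 of a metric drifts more
// than `tolerance` (default 10%) above the post-optimization baseline.
function hasRegressed(
  baselineP75: number,
  latestP75: number,
  tolerance: number = 0.1
): boolean {
  return latestP75 > baselineP75 * (1 + tolerance);
}
```

Running this check on every deployment (or daily against field data) catches slow drifts before they push a page back over a threshold.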
Step 5: Anticipate through shared culture
Discuss web perf and UX at the beginning of projects, even before any implementation.
The limits of this monitoring model
We also note that these indicators are only measured from authenticated Chrome users. This is understandable, insofar as Chrome is the browser advancing fastest on these topics, but it also gives a very restricted view of the web and implies that the measured experience stops at Chrome (which excludes all iOS users, for example). It is therefore important that webmasters and developers continue to test their sites on varied hardware and in browsers other than Chrome.