Core Web Vitals Strategies to Improve Site Performance
Chapter 1
Imagine, as a user, visiting an eCommerce store, only to wait for a blank page or large images to load. Or imagine reading about a product, or preparing to click on a picture that catches your eye, only to have that element jump to a different place on the screen. What about clicking something and getting no response? People are used to clicking on an element and getting an instant reaction; when that expectation is not met, it creates the impression that the website is not working, which may result in the person leaving after just a few seconds. All of these problems create confusion and frustrate potential customers.
To address these and other concerns, Google introduced Core Web Vitals (CWVs) in 2020: metrics that measure the user experience of interacting with a website. These metrics assess how quickly websites load, how responsive they are to input, and how visually stable they are. By 2021, Core Web Vitals had become a factor in search engine ranking. Today, evaluating any website against its CWV scores, especially one that relies on traffic and sales, is essential.
In this article, we explore what these metrics mean and how to improve your website’s score in each one by using industry best practices.
Summary of key Core Web Vitals strategy best practices
| Best Practice | Description |
|---|---|
| Understand the three Core Web Vitals metrics | There are three Core Web Vitals: 1) Largest Contentful Paint (LCP): the time to load the largest image or text element; 2) Interaction to Next Paint (INP): the time for the web page to react to user action; 3) Cumulative Layout Shift (CLS): how much the page shifts while loading. |
| Learn the recommended tools available for measuring Core Web Vitals | Some of the most effective tools include the Chrome User Experience Report (CrUX), PageSpeed Insights, Google Search Console, Lighthouse, the SemRush site audit, WebPageTest, and CLSDebugger. |
| Improve Largest Contentful Paint (LCP) | To improve LCP, prioritize the user experience (UX) by ensuring that the most important webpage content loads quickly. This can be achieved by optimizing images, prerendering content, and using a CDN to serve images geographically closer to the user. |
| Improve Interaction to Next Paint (INP) | Start by determining the root causes of slow interactions using field data and lab diagnosis. Once the root causes are known, the goals are to reduce input delay, processing duration, and presentation delay. |
| Improve Cumulative Layout Shift (CLS) | CLS can be improved by lazy loading everything below the fold, using fixed-size containers with predictable dimensions, enabling the back/forward cache (bfcache) to load previously visited pages from memory, and preloading fonts. |
| Take a mobile-first approach | Google uses a mobile-first indexing approach, so developers must also take a mobile-first approach when designing their websites. This starts with a responsive design that stays consistent across devices and avoids distorted images or misplaced elements on smaller screens. |
| Regularly test and optimize | Each Core Web Vital has specific tools that are best suited for testing it. Consistently assess using these tools and optimize to improve performance. |
Understand the three Core Web Vitals metrics
As mentioned earlier, the three Core Web Vitals are metrics developed by Google to measure websites based on their user experience. Google uses the results of these metrics to help rank the seemingly infinite number of websites that make it onto the internet daily.
Largest Contentful Paint (LCP)
LCP measures how long it takes to render the largest element in the viewport (the “above the fold” section of the website). The element can be a text block, an image, a video, an element with a background image loaded via the url() function, or anything else the user sees on first load and before any interaction. You should strive for an LCP score of 2.5 seconds or less.
Website loading stages before LCP (source)
Invisible elements, like those with an opacity of zero, are excluded from this calculation, as are background images that are not considered part of the main content.
The size that Google uses for this calculation is only the viewable portion of the content. This means that any images that are partially cut off from the viewport will only have their viewable portions counted. Similarly, any margins, paddings, or borders added by CSS are not considered.
Because most websites load in stages—often first loading text elements, then larger elements like images or videos—Google constantly recalculates the LCP. Take a look at this example from Google, showing how the LCP is determined in a Techcrunch article:
LCP gets continuously recalculated until the page is fully loaded (source)
As you can see, at first the LCP is a block of text, but once the article’s featured image is loaded, the image becomes the LCP and the time it took to load gets calculated.
LCP score ranges (source)
Interaction to Next Paint (INP)
Quick responsiveness after loading is essential for a positive user experience. Interaction to Next Paint measures the latency of the interactions a user makes during a page visit and reports the longest one observed (ignoring occasional outliers on pages with many interactions). A good score for INP is 200 milliseconds or less.
Stages of INP (source)
Whereas its predecessor, First Input Delay (FID), measured only the input delay of the first interaction on the page, Google determines INP by observing user interactions on a page—from the input delay to the time it takes to run event handlers and, finally, until the browser has painted the next frame.
INP score ranges (source)
Cumulative Layout Shift (CLS)
The Cumulative Layout Shift metric measures how much the page shifts while loading. This can negatively impact the user experience in many ways. It can cause the user to lose the text they’re reading, force the user to click on the wrong button or link, or even cause the reader to completely lose the spatial sense of where they are on the website.
Bad CLS versus Good CLS (source)
Poor CLS often occurs when images have dynamic sizes or elements such as ads are loaded after the first load finishes. A good CLS score is 0.1 or less.
CLS score ranges (source)
Example of layout shift on a mobile device (source)
Learn the recommended tools available for measuring Core Web Vitals
Here are some important tools you’ll want to understand when improving your Core Web Vitals scores.
Chrome User Experience Report (CrUX)
This tool tells you how real users experience your website on the web. This data is available in several places, including the CrUX dashboard, PageSpeed Insights, and Google Search Console.
To use the CrUX dashboard, navigate to https://developer.chrome.com/docs/crux/dashboard, enter the target website’s URL, and press Go.
Image of the CrUX dashboard (source)
From there, the dashboard displays the Core Web Vitals, helping you explore how the origin website is experienced by real users. Filters are available to view the real-user data by month or device.
PageSpeed Insights
This product presents a consolidated view of Google’s Core Web Vitals for a URL. It displays a separate report for mobile and desktop users. The report shows the scores for each of the Core Web Vitals metrics and offers tips on how to improve them.
To access PageSpeed Insights, visit https://pagespeed.web.dev/, enter your website’s URL, and click Analyze. After a few moments, it displays a detailed report of its findings, sorted by most critical, along with the potential time savings for each area of improvement.
Image of the PageSpeed Insights results (source)
Google Search Console
The Search Console offers a more robust dashboard with many different website metrics and crawler data, including Core Web Vitals under the “Experience” heading.
To use Google’s powerful Search Console, visit https://search.google.com/search-console/welcome, where a website can be claimed as a property by its owner or webmaster. First-time users go through a short verification process to claim the property. Once verified, you can access the Search Console, which provides invaluable data about impressions, clicks, page crawlability, available sitemaps, and the queries users entered to find the website. Here, you may also request that Google index your website, which can be useful after a website overhaul.
If enough data over the last 90 days is available, the Google Search Console also provides the Core Web Vitals inside the dashboard.
Image of the Google Search Console (source)
Lighthouse
Lighthouse allows you to see the Core Web Vitals straight from Chrome DevTools. You can run Lighthouse on any website, including those requiring authentication.
Lighthouse is somewhat different from the other tools. For easy use, you will need Google Chrome to access it through the Chrome DevTools. Advanced users may also run it from the command line or as a Node module.
To do this, first visit the target website you want to analyze, then open DevTools and click on the Lighthouse tab. Before running the tool, you can select what type of report you want to see; Lighthouse can measure performance, SEO, accessibility, and more. After making a selection, click Analyze page load to run the audit. In less than a minute, Lighthouse will show you a detailed report for the categories you selected.
SemRush site audit
SemRush offers an alternative outside the Google ecosystem for viewing analytics about your website, including the Core Web Vitals metrics and tips on how to improve your scores.
After you create an account, SemRush offers limited free access to the target website’s reports and issues. It studies tags, performance, meta descriptions, broken images and links, speed, and more, and displays a detailed report with actionable steps to fix each issue.
Image of the SemRush dashboard (source)
WebPageTest
This site gives a comprehensive review of your Core Web Vitals, as well as suggestions and experiments for improving them.
After navigating to https://www.webpagetest.org/, scroll down and select Core Web Vitals from the Site Performance dropdown, or leave it as is for a more complete report. Enter the target website URL and click on Start Test. Free users get three test runs to see a detailed report with performance metrics, including charts and waterfall representations of load times. Users like the configurability of WebPageTest’s reports and the ability to run various experiments to see how changes might affect the score. For example, one available experiment adds async or defer to render-blocking scripts to show what effect that change would have on the real website.
Image of WebPageTest results (source)
CLSDebugger
Specifically designed to debug layout shifts, this tool helps calculate the CLS score and visualize any shifts that may occur.
Go to https://webvitals.dev/cls, input your website’s URL, and click on Check CLS score for a specialized report on cumulative layout shifts. The system will crawl, render, and test the target website with multiple browsers. This tool tends to be the slowest to gather data but offers a more focused view of layout shifts rather than all Core Web Vitals at once. It is useful for isolating layout shift issues on either mobile or desktop.
Example of CLS debugger (source)
Recommended tools for each Core Web Vitals metric
It’s important to constantly use tools like CrUX and the Google Search Console to see how your website behaves in the real world and how real users experience it.
Although most tools can be used for all the metrics, different tools are better suited for specific Core Web Vitals.
| Core Web Vitals Metric | Recommended Tools |
|---|---|
| LCP | Google PageSpeed Insights, Lighthouse |
| INP | Google PageSpeed Insights, Chrome DevTools |
| CLS | WebPageTest, CLSDebugger |
Improve Largest Contentful Paint (LCP)
Improving LCP isn’t as simple as making a small change. To fix LCP, one must audit the entire loading process of the webpage.
It’s important to choose a tool that reports real data from users; that is, you should optimize for what your website’s visitors are actually seeing rather than for results from a lab scenario. Google’s CrUX gives access to real user data; you can use it with a Google Chrome extension, through the developer tools, or inside the Google Search Console.
Once the LCP is determined, here are some improvements you can make.
Use the fetchpriority HTML attribute to give higher priority to the LCP resource
Timeline of resources, including LCP (source)
Ideally, the LCP resource should start loading at the same time as the first resource. In the image above, we see an example of a website where there is a delay between the first resource and the LCP resource. With the fetchpriority attribute, we can make the LCP resource load at the same time as the first resource, improving the LCP metric.
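As a minimal sketch (the image path and dimensions are hypothetical), the attribute can be set directly on the image expected to be the LCP element:
<!-- Hypothetical hero image assumed to be the LCP element. fetchpriority="high"
     tells the browser to fetch it immediately instead of at default priority. -->
<img src="/images/hero-banner.webp" fetchpriority="high" alt="Featured product" width="1200" height="600">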
Load an LCP image with the <img> element and the src or srcset attribute
The goal is to start loading your LCP resources as quickly as possible. Having images present in the initial HTML markup with <img> element tags and src or srcset attributes makes them immediately discoverable, and the browser will load them sooner.
If there’s a reason for a prominent image to not be inside an img tag, or if it’s a background image, you must eliminate any loading delay by preloading the image with a high fetchpriority, for example:
<!-- Load the stylesheet that will reference the LCP image. -->
<link rel="stylesheet" href="/path/to/styles.css">
<!-- Preload the LCP image with a high fetchpriority so it starts loading with the stylesheet. -->
<link rel="preload" fetchpriority="high" as="image" href="/path/to/hero-image.webp" type="image/webp">
Use server-side rendering to improve speed over client-side rendering
Loading resources on the server rather than in the browser can speed up the website and reduce the LCP metric. Macrometa’s PhotonIQ is capable of loading complex scripts and pages on the server for faster load times. If a resource needs to be loaded externally, add a rel attribute with preload to the link tag, like this: <link rel="preload">.
Optimize images
Images compressed using WebP and other modern formats are preferable for the web. On an ecommerce website, many plugins will compress images automatically once uploaded, but this process can also be done manually with online tools as well as local tools like ImageMagick if you need more control over the quality.
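As a rough sketch (the file names are hypothetical), a <picture> element can serve the compressed WebP version while keeping a fallback for browsers that don’t support it:
<picture>
  <!-- Serve the smaller WebP file where supported... -->
  <source srcset="/images/product.webp" type="image/webp">
  <!-- ...and fall back to a JPEG elsewhere. -->
  <img src="/images/product.jpg" alt="Product photo" width="800" height="600">
</picture>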
Prerendering
Speculating about the next page the user will visit and prerendering its resources is a great way to speed up LCP. The Speculation Rules API allows you to programmatically tell the browser what pages to prerender based on the rules you specify.
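As a minimal sketch (the URL is hypothetical), a speculation rule can be embedded in the page to ask the browser to prerender a likely next page:
<script type="speculationrules">
{
  "prerender": [
    {
      "source": "list",
      "urls": ["/checkout"]
    }
  ]
}
</script>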
An easier way to achieve prerendering, among other features, is with Macrometa’s PhotonIQ. PhotonIQ’s Prerender leverages AI to optimize a website and boost its search rankings. It uses synthetic interactions to simulate user triggers that would otherwise be missed by web crawlers. It presents the crawler with a static HTML version of JavaScript pages.
Using a content delivery network
A content delivery network (CDN) is a distributed network of servers that stores copies of a website at different locations. Then, the server closest to the user can load the website upon request. Serving your website geographically close to the user improves loading times significantly.
The rest of the website’s resources that are below the fold must be lazy-loaded, or loaded as needed. You can read more about these optimizations here.
Improve Interaction to Next Paint (INP)
Here are some steps to take for better INP scores.
Determine the root causes of slow interactions using field data and lab diagnosis
Field data, or data from Real User Monitoring (RUM), is a set of values that includes the devices, network conditions, and geographic locations of real users visiting your site. Google uses field data to generate the CrUX report.
Google calculates the INP from this field data, which also includes contextual information about which specific interaction caused the INP value, when it happened, and what type of interaction it was (click, keypress, or tap).
Lab data offers a contained and predictable environment that allows you to manually reproduce the slow interactions found in the field data.
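If you want to collect this kind of field data yourself, the open-source web-vitals JavaScript library can report INP (and the other metrics) from real sessions. Here is a minimal sketch, assuming your site bundles npm packages and that the /analytics endpoint is a hypothetical collector you control:
import { onINP, onLCP, onCLS } from 'web-vitals';

// Send each metric to a hypothetical analytics endpoint as it becomes available.
function sendToAnalytics(metric) {
  const body = JSON.stringify({ name: metric.name, value: metric.value, id: metric.id });
  // sendBeacon is more reliable than fetch for reporting just before the page unloads.
  navigator.sendBeacon('/analytics', body);
}

onINP(sendToAnalytics);
onLCP(sendToAnalytics);
onCLS(sendToAnalytics);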
Reduce input delay
Input delay starts when the user interacts with the page and ends when the interaction's event callbacks begin to run. Here are some ways to reduce it:
- Avoid render-blocking JavaScript and CSS by minifying scripts and stylesheets.
- Reduce unused JavaScript code; combine or minify files.
- Use code splitting to include only the necessary JavaScript during the first load (see the sketch below).
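A minimal sketch of code splitting with a dynamic import (the module path and element IDs are hypothetical, and a module-based build is assumed): the heavy module is fetched only when the user actually needs it, so it stays out of the first load.
// Load the chart module only when the user opens the stats panel,
// so it doesn't compete with the JavaScript needed for the first load.
document.querySelector('#show-stats').addEventListener('click', async () => {
  const { renderChart } = await import('./chart.js');
  renderChart(document.querySelector('#stats-container'));
});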
Reduce processing duration
This is the amount of time it takes for the interaction callback to run to completion. Consider a task that takes longer than 50 milliseconds to be a long task; in such cases, breaking it down into smaller, simpler tasks is advisable. Breaking up the long task into smaller tasks allows the main thread to respond to tasks with higher priority faster, later resuming the smaller, less important tasks once the thread is not busy. This is called yielding to the main thread.
If you have access to the source code, there are ways you can programmatically schedule or break apart long tasks. For example, you can use async and await to briefly pause a long task and yield to the main thread so that it can respond to higher-priority tasks.
You can also use the Scheduler API to assign priorities to tasks inside a stack:
function saveSettings () {
// Validate the form at high priority
scheduler.postTask(validateForm, {priority: 'user-blocking'});
// Show the spinner at high priority:
scheduler.postTask(showSpinner, {priority: 'user-blocking'});
// Update the database in the background:
scheduler.postTask(saveToDatabase, {priority: 'background'});
// Update the user interface at high priority:
scheduler.postTask(updateUI, {priority: 'user-blocking'});
// Send analytics data in the background:
scheduler.postTask(sendAnalytics, {priority: 'background'});
}
(From https://web.dev/articles/optimize-long-tasks#scheduler-api)
Or, if programmatically yielding to the main thread:
async function saveSettings () {
// Create an array of functions to run:
const tasks = [
validateForm,
showSpinner,
saveToDatabase,
updateUI,
sendAnalytics
];
// Loop over the tasks:
while (tasks.length > 0) {
// Shift the first task off the tasks array:
const task = tasks.shift();
// Run the task:
task();
// Yield to the main thread with the scheduler
// API's own yielding mechanism:
await scheduler.yield();
}
}
(From https://web.dev/articles/optimize-long-tasks#scheduler-api)
You may also defer non-critical updates to elements by using a timeout inside a requestAnimationFrame, like this:
textBox.addEventListener('input', (inputEvent) => {
// Update the UI immediately, so the changes the user made
// are visible as soon as the next frame is presented.
updateTextBox(inputEvent);
// Use `setTimeout` to defer all other work until at least the next
// frame by queuing a task in a `requestAnimationFrame()` callback.
requestAnimationFrame(() => {
setTimeout(() => {
const text = textBox.textContent;
updateWordCount(text);
checkSpelling(text);
saveChanges(text);
}, 0);
});
});
Avoid synchronous layout, or layout thrashing, which occurs when you update styles in JavaScript and then read layout properties in the same task, forcing the browser to recalculate layout synchronously.
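A minimal sketch of the fix (the class name is hypothetical): batch all layout reads before any writes so the browser only has to lay out the page once instead of on every iteration.
const cards = document.querySelectorAll('.card');

// Read phase: collect all layout measurements first.
const heights = [...cards].map((card) => card.offsetHeight);

// Write phase: apply all style changes afterward.
// Interleaving the read and the write in the same loop would force
// a synchronous layout on every iteration (layout thrashing).
cards.forEach((card, i) => {
  card.style.minHeight = `${heights[i]}px`;
});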
Reduce presentation delay
Presentation delay is the time it takes the browser to display the result from the interaction. There are various ways to reduce the presentation delay. One of them is using content-visibility to lazily render off-screen elements and keep DOM sizes small. Another way is offloading complex calculations and processing of third-party scripts to the edge with Macrometa’s PhotonIQ, so your website loads in a flash.
Improve Cumulative Layout Shift (CLS)
Use lazy loading
Lazy loading prevents below-the-fold content from shifting elements during the first load. Elements outside of the viewport will only be loaded when they’re needed.
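A minimal sketch using the browser’s native lazy loading (the image path is hypothetical); width and height are still specified so the browser reserves the image’s space before it loads.
<!-- loading="lazy" defers the request until the image approaches the viewport;
     width/height let the browser reserve its space so nothing shifts later. -->
<img src="/images/reviews-section.jpg" loading="lazy" width="800" height="450" alt="Customer reviews">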
Use fixed-sized containers
Use containers with fixed dimensions (explicit width and height) that are predictable and can easily be accounted for. Use grids, flexbox, and other CSS techniques to achieve this. The aspect-ratio CSS property lets the browser compute an element’s height from its width before the content loads, so space can be reserved without impacting CLS.
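A minimal sketch (the class names and dimensions are hypothetical): a container with a reserved size keeps late-loading content, such as an ad or an embed, from pushing the page around, and aspect-ratio does the same for fluid-width elements.
<!-- Reserve the ad slot's space up front; even if the ad loads late,
     the surrounding content keeps its position. -->
<div class="ad-slot" style="width: 300px; height: 250px;"></div>

<!-- aspect-ratio reserves proportional space for a fluid-width element. -->
<style>
  .hero-image { width: 100%; aspect-ratio: 2 / 1; }
</style>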
Make sure your website is bfcache ready
Bfcache, or back/forward cache, is a browser optimization that enables instant back and forward navigation. It works by restoring previously visited pages from memory instead of initiating a new request.
To understand if your website is bfcache ready, in Chrome, navigate to DevTools, and go to Application -> Back-forward Cache. Click on the Run Test button. DevTools will attempt to navigate back and forth to determine if the page can be loaded from bfcache. If the report says “Restored from back-forward cache”, you’re all set. If not, and if it’s fixable by a developer, it will display actionable steps to fix it.
Bfcache unsuccessful result (source)
By maintaining a back/forward cache, a website can load instantly from memory with no layout shifts.
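One common blocker the test reports is an unload event listener, which makes a page ineligible for the bfcache in most browsers. A minimal sketch of the usual fix (the cleanup work is hypothetical) is to listen for pagehide instead:
// 'unload' handlers prevent bfcache eligibility in most browsers.
// 'pagehide' fires in the same situations and also when the page is being cached.
window.addEventListener('pagehide', (event) => {
  if (event.persisted) {
    // The page is entering the back/forward cache; keep work to a minimum.
    return;
  }
  // The page is really going away: flush analytics, close connections, etc.
});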
Preload fonts
Preload fonts or swap them so that the screen can show the text without waiting for the actual font to load.
There are two things you can do, depending on how the website loads the fonts. In CSS, you may include font-display: swap; on the font declaration. In the HTML, you can add a preload to the link element, for example:
<link rel="preload" href="assets/fonts/xxx.woff" as="font" type="font/woff" crossorigin />
Use transforms
Not all element shifts are negative. Using CSS transformations is a good way to shift elements as part of an animation without affecting CLS.
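A minimal sketch (the class names are hypothetical): animating transform moves the element visually without changing layout, so surrounding content stays put and CLS is unaffected.
<style>
  /* Slide the banner in by animating transform instead of top or height,
     which would change layout and could register as a layout shift. */
  .cookie-banner {
    transform: translateY(100%);
    transition: transform 300ms ease-out;
  }
  .cookie-banner.visible {
    transform: translateY(0);
  }
</style>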
Take a mobile-first approach
Google uses a mobile-first indexing approach, starting with the mobile version of a website for indexing. Web developers should take this into account during design.
Here are some ways to do this:
- Implement responsive design: Use flexbox, grids, and containers so that the website flows predictably.
- Create consistency across multiple devices: Make sure your website designer takes a mobile-first approach; i.e., build the website from the smallest screen to the largest and not the other way around. This helps maintain consistency among devices.
- Avoid distorted images or incorrect placement of elements on smaller screens: Make images width: 100% in the CSS and control the dimensions with a container as needed. Use well-compressed images in the right format.
Although mobile devices are convenient and widely used, their weakness is their limited resources, a consequence of their size and portability. You can boost web performance by up to 300% with Macrometa’s PhotonIQ, which enhances the user experience across devices. Allow your users to enjoy accelerated rendering and reduced loading times, increasing your conversion rate and search engine ranking.
Last thoughts
In today’s digital world, a positive user experience is no longer nice to have; it's a fundamental requirement for a successful website. Core Web Vitals provide essential metrics for understanding how users perceive a website's performance. By focusing on loading speed (LCP), interactivity (INP), and visual stability (CLS), website owners can improve user satisfaction and business outcomes.
To achieve better scores for Largest Contentful Paint, Interaction to Next Paint, and Cumulative Layout Shift, you can leverage the AI-powered tools of Macrometa’s PhotonIQ. PhotonIQ boosts Core Web Vitals by improving performance with enhancements like prerendering, edge-side rendering, and more.