Clarigital · Clarity in Digital Marketing
SEO Foundation · Guide 7 of 8

Core Web Vitals for SEO · The Complete Guide

What Core Web Vitals are, how LCP, INP, and CLS are measured, why Google made them a ranking factor, how to measure your scores using free tools, and how to systematically improve each metric — with official threshold values and technical fixes.

SEO · 3,200 words · Updated Apr 2026

What You'll Learn

  • What Core Web Vitals are and why Google introduced them as ranking signals
  • LCP (Largest Contentful Paint) — what it measures, thresholds, and top causes of poor scores
  • INP (Interaction to Next Paint) — what replaced FID, how it is measured, and how to improve it
  • CLS (Cumulative Layout Shift) — what visual stability means, what causes it, and how to fix it
  • The difference between lab data and field data and why it matters for ranking
  • How to measure Core Web Vitals using Google Search Console and PageSpeed Insights
  • A systematic technical approach to improving each metric

What Are Core Web Vitals

Core Web Vitals are a set of specific performance metrics that Google uses to quantify real-world user experience on web pages. Introduced in May 2020, they represent Google's attempt to make page experience measurable and actionable for web developers and SEOs — translating abstract concepts like "loading", "interactivity", and "visual stability" into specific, numeric metrics with defined good/needs improvement/poor thresholds.

The three current Core Web Vitals metrics are:

  • LCP (Largest Contentful Paint) — measures loading performance
  • INP (Interaction to Next Paint) — measures responsiveness to user interactions (replaced FID in March 2024)
  • CLS (Cumulative Layout Shift) — measures visual stability

Core Web Vitals are measured from real user data collected through Chrome's usage statistics — not from synthetic lab tests. This is an important distinction. Google's ranking signals use field data (Chrome User Experience Report, or CrUX data), which reflects how actual users experience pages on real devices and network conditions.

Core Web Vitals are page-level, not site-level

Google evaluates Core Web Vitals at the individual page level, not as a single site-wide score. A site can have some pages with "Good" scores and others with "Poor" scores. Google Search Console's Core Web Vitals report groups pages by URL pattern and shows the distribution across your site — focus improvement efforts on the most important pages first.

LCP — Largest Contentful Paint


Measures the time from when a page begins loading until the largest image or text block in the viewport is rendered. This is the primary measure of perceived loading speed from the user's perspective.

Good: ≤ 2.5s · Needs Improvement: 2.5s – 4.0s · Poor: > 4.0s

What qualifies as the LCP element?

The browser considers these elements when determining the LCP element: <img> elements, <image> inside SVG, <video> poster images, background images loaded via CSS url(), and block-level elements containing text nodes. The LCP element is determined dynamically as the page loads — it can change as larger content appears.

Most common causes of poor LCP

  • Slow server response times (TTFB). Everything on the page is blocked until the server sends the first byte of HTML. A TTFB above 800ms is one of the most common causes of poor LCP.
  • Render-blocking resources. CSS stylesheets and JavaScript files that block rendering delay when the browser can start painting content.
  • Unoptimised hero images. Large, uncompressed hero images are frequently the LCP element. Images that are oversized for their display dimensions, lack fetchpriority="high", or are not served in modern formats (WebP, AVIF) are a major LCP issue.
  • Client-side rendering. Pages that require JavaScript to render content have their LCP delayed until JavaScript executes — often significantly.
  • No preloading of LCP resource. If the LCP element is a background image or late-discovered resource, adding a <link rel="preload"> hint can significantly reduce LCP.

INP — Interaction to Next Paint


Measures the latency of all user interactions (clicks, taps, keyboard presses) throughout a page visit and reports the worst-case interaction — or near-worst for pages with many interactions. Replaced FID (First Input Delay) in March 2024 as a Core Web Vital.

Good: ≤ 200ms · Needs Improvement: 200ms – 500ms · Poor: > 500ms

Why INP replaced FID

FID (First Input Delay) only measured the delay on the very first user interaction on a page. INP is a comprehensive improvement — it measures all interactions throughout a page visit and captures the worst-case responsiveness, giving a much more accurate picture of whether a page feels responsive or sluggish during use. Google announced the transition in May 2023 and completed it in March 2024.

Most common causes of poor INP

  • Long tasks on the main thread. JavaScript tasks that run for more than 50ms block the browser from responding to user input. Any heavy computation, complex DOM manipulation, or unoptimised event handlers can block the main thread.
  • Excessive JavaScript execution. Large JavaScript bundles that parse and execute on page load occupy the main thread, preventing timely responses to interactions. Reducing JavaScript payload, code-splitting, and deferring non-critical scripts all help.
  • Unoptimised event handlers. Event listeners that perform heavy synchronous work directly in response to clicks or taps delay the browser's ability to repaint.
  • Third-party scripts. Analytics scripts, ad platforms, chatbots, and other third-party scripts running on the main thread are among the most common sources of poor INP on real-world sites.
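The long-task fix mentioned above — breaking heavy work into chunks and yielding back to the main thread between them — can be sketched as follows. This is an illustrative sketch only: `processInChunks`, `items`, and `processItem` are hypothetical names, and the `scheduler.yield()` check is a feature-detection fallback for browsers that do not yet support it.

```javascript
// Yield control back to the event loop so pending user interactions
// can be handled between chunks of work.
function yieldToMain() {
  // Prefer scheduler.yield() where supported; fall back to setTimeout(0).
  if (typeof scheduler !== "undefined" && scheduler.yield) {
    return scheduler.yield();
  }
  return new Promise((resolve) => setTimeout(resolve, 0));
}

// Process a large array in small chunks, yielding between chunks so no
// single task exceeds the ~50ms long-task budget.
async function processInChunks(items, processItem, chunkSize = 50) {
  const results = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    for (const item of items.slice(i, i + chunkSize)) {
      results.push(processItem(item));
    }
    await yieldToMain();
  }
  return results;
}
```

The trade-off is slightly longer total processing time in exchange for a main thread that stays responsive to clicks and taps — which is exactly what INP measures.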

CLS — Cumulative Layout Shift


Measures the largest burst of unexpected layout shifts that occurs during a page's lifespan. Shifts are grouped into session windows (shifts less than one second apart, capped at five seconds per window), and CLS reports the window with the highest total. Each individual shift is scored as impact fraction × distance fraction. CLS is dimensionless — it is a score, not a time value.
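The per-shift formula (impact fraction × distance fraction) can be worked through numerically. The sketch below simplifies to a single full-width element moving vertically; real browsers compute the union of the element's visible rectangles before and after the shift, so treat this as an approximation for intuition, not the spec algorithm.

```javascript
// Simplified single-shift score for a full-width element that moves
// down by moveDownPx inside the viewport.
function layoutShiftScore(viewport, elementHeight, moveDownPx) {
  // Impact fraction: union of the element's before/after positions,
  // capped at the viewport height, divided by viewport height.
  const impactHeight = Math.min(elementHeight + moveDownPx, viewport.height);
  const impactFraction = impactHeight / viewport.height;

  // Distance fraction: move distance relative to the viewport's
  // largest dimension.
  const maxDimension = Math.max(viewport.width, viewport.height);
  const distanceFraction = moveDownPx / maxDimension;

  return impactFraction * distanceFraction;
}

// Example: a 300px-tall hero image loads late on a 400×800 viewport and
// pushes content down 100px → 0.5 × 0.125 = 0.0625 for that one shift.
```

A single shift of that size already consumes more than half of the 0.1 "Good" budget, which is why late-loading hero images and injected banners are so damaging.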

Good: ≤ 0.1 · Needs Improvement: 0.1 – 0.25 · Poor: > 0.25
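Taken together, the Good / Needs Improvement / Poor cut-offs quoted for all three metrics fit in one small helper. A minimal sketch (the `rate` function name and the `THRESHOLDS` table are illustrative; the numbers are the official thresholds from the sections above):

```javascript
// Official Core Web Vitals thresholds: LCP and INP in milliseconds,
// CLS as a unitless score.
const THRESHOLDS = {
  LCP: { good: 2500, poor: 4000 },
  INP: { good: 200, poor: 500 },
  CLS: { good: 0.1, poor: 0.25 },
};

// Classify a single field value into Google's three buckets.
function rate(metric, value) {
  const t = THRESHOLDS[metric];
  if (value <= t.good) return "good";
  if (value <= t.poor) return "needs improvement";
  return "poor";
}
```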

What causes layout shifts

  • Images without dimensions. When <img> elements do not have explicit width and height attributes (or equivalent CSS), the browser cannot reserve space for the image before it loads. When the image loads, it pushes surrounding content down — causing a significant layout shift.
  • Ads, embeds, and iframes without reserved space. Ad slots that load asynchronously and inject content into the DOM after initial paint are one of the largest sources of CLS on commercial sites.
  • Web fonts causing FOUT/FOIT. Custom fonts loading after text is initially rendered can cause text to shift as the font metrics change. Using font-display: optional or preloading critical fonts reduces this.
  • Content dynamically injected above the fold. Any content inserted above existing content after the page loads — banners, cookie consent bars, notification popups — causes layout shifts for content below.
  • CSS animations that affect layout properties. Animations that change width, height, top, or left properties trigger layout — use transform instead.

Primary fix: always size your media

Add explicit width and height attributes to all <img> and <video> elements. This alone resolves the most common source of high CLS scores. For responsive images, the CSS aspect-ratio property achieves the same space reservation without fixing pixel dimensions.

Core Web Vitals as a Ranking Signal

Google officially announced Core Web Vitals as a ranking signal in May 2020, with rollout beginning in June 2021 under the Page Experience update. The ranking signal uses field data from the Chrome User Experience Report (CrUX) — only pages with sufficient real-user data are evaluated. Pages without enough CrUX data fall back to origin-level aggregates.

Google has described CWV as a "tiebreaker" signal

Google's official communications have consistently described the Page Experience signal (which includes Core Web Vitals) as a relatively lightweight ranking factor that acts as a tiebreaker when content quality is equivalent. A page with excellent content but poor Core Web Vitals will typically outrank a page with mediocre content but perfect Core Web Vitals. Content quality remains the primary ranking factor. However, for competitive informational queries where many pages have similar content quality, CWV can be a meaningful differentiator.

The 75th percentile threshold is key to understanding how Google uses CWV for ranking. Google evaluates whether at least 75% of real page visits achieve "Good" scores on all three metrics. A page passes the Page Experience signal if 75% of visits are in the "Good" threshold for LCP, INP, and CLS. Pages that fail one or more metrics receive a lower page experience score.
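The 75th-percentile evaluation described above can be sketched as a pass/fail check over per-visit samples. This is a hedged approximation: the sample arrays and function names are hypothetical, and a simple nearest-rank percentile is used here — CrUX's exact aggregation method may differ.

```javascript
// Nearest-rank p-th percentile of a numeric array.
function percentile(values, p) {
  const sorted = [...values].sort((a, b) => a - b);
  const index = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[Math.max(0, index)];
}

// A page "passes" when the 75th-percentile value of every metric is
// within the Good threshold (LCP ≤ 2500ms, INP ≤ 200ms, CLS ≤ 0.1).
function passesCoreWebVitals(samples) {
  return (
    percentile(samples.lcpMs, 75) <= 2500 &&
    percentile(samples.inpMs, 75) <= 200 &&
    percentile(samples.cls, 75) <= 0.1
  );
}
```

Note how the 75th percentile tolerates a slow tail: a quarter of visits can be poor without failing the check, but one consistently slow metric fails the whole page.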

How to Measure Core Web Vitals

Core Web Vitals can be measured using two data types: field data (real user measurements from the CrUX dataset) and lab data (synthetic measurements from controlled test environments). For SEO ranking purposes, only field data matters. Lab data is valuable for diagnosing problems and testing fixes before they reach real users.

Google Search Console (Free · Field Data)

Core Web Vitals report under Experience. Shows real CrUX data for your site's pages, grouped by status (Good/Needs Improvement/Poor) and URL pattern. The most important tool for understanding your ranking signal.

PageSpeed Insights (Free · Lab + Field Data)

Combines CrUX field data (when available for the URL) with a Lighthouse lab test. Available at pagespeed.web.dev. Run individual URLs to see both real-world performance and diagnostic opportunities.

Chrome DevTools (Free · Lab Data)

Performance panel in Chrome DevTools provides detailed waterfall charts, main thread activity analysis, and layout shift identification. Essential for diagnosing the root cause of poor LCP, INP, and CLS.

Lighthouse, CLI or DevTools (Free · Lab Data)

Google's open-source automated auditing tool. Provides scores and specific recommendations for each Core Web Vital. Useful for CI/CD integration to catch regressions before deployment.

Lab data vs field data discrepancy

Lab data (PageSpeed Insights, Lighthouse) and field data (CrUX, Search Console) frequently show different scores for the same page. Lab data uses a simulated network and CPU; field data reflects the diversity of real users' devices and connections. Prioritise field data for understanding your ranking signal; use lab data to diagnose specific technical issues. A page can score well in lab tests and poorly in field data if the real user base is predominantly on slow mobile connections.

A Systematic Approach to Improving Core Web Vitals

Improving Core Web Vitals requires a methodical approach: diagnose which metric is failing and on which pages, identify the root cause using lab tools, implement the fix, and then monitor field data to confirm improvement (field data takes up to 28 days to reflect changes in Google Search Console).

Priority actions by metric

Metric · Highest-impact actions

  • LCP: Reduce TTFB (server-side caching, CDN, hosting upgrade). Preload the LCP image with <link rel="preload">. Compress and convert hero images to WebP/AVIF. Remove render-blocking CSS and JS above the fold. Use fetchpriority="high" on the LCP image element.
  • INP: Audit and reduce third-party scripts — remove or defer any not critical to initial page function. Break up long JavaScript tasks using setTimeout or scheduler.yield(). Reduce JavaScript bundle size through code splitting. Minimise DOM size (large DOMs increase interaction latency).
  • CLS: Add explicit width/height to all images and videos. Reserve space for ad slots with min-height CSS. Preload critical fonts and use font-display: optional. Avoid inserting content above existing DOM elements after page load. Use CSS transform for animations instead of layout properties.
Field data changes take up to 28 days to reflect in Search Console

Google Search Console's Core Web Vitals report uses a 28-day rolling window of CrUX data. After deploying a fix, you need to wait up to 28 days to see the impact in Search Console. Use PageSpeed Insights immediately after deployment to verify the fix in lab data, then monitor Search Console weekly for the field data trend.

Authentic Sources Used in This Guide

Official · Google web.dev — Core Web Vitals

Official definition, metrics, and thresholds for all three Core Web Vitals with measurement guidance.

Official · Google Search Central — Page Experience Update

Official announcement of Core Web Vitals as ranking signals and Page Experience update timeline.

Official · Google web.dev — Largest Contentful Paint

Official LCP documentation with threshold values, element eligibility, and optimisation guidance.

Official · Google web.dev — Interaction to Next Paint

Official INP documentation including why it replaced FID and how interaction latency is measured.

Official · Google web.dev — Cumulative Layout Shift

Official CLS documentation with the layout shift calculation formula and common causes.

Official · Chrome UX Report (CrUX) Documentation

Official documentation on the CrUX dataset that powers Google's field data measurements for ranking.
