Tools & Resources · Session 13, Guide 10

Core Web Vitals Tools · PageSpeed Insights, Lighthouse & CrUX

Core Web Vitals — LCP, INP, and CLS — are Google's official page experience metrics and a documented ranking factor in Google Search. Measuring them correctly requires understanding the difference between lab data (simulated performance measurements on a controlled device) and field data (real user experience measurements from actual Chrome users). Google provides several free official tools for both types of measurement. This guide covers how to use each tool and what the results mean.

Tools & Resources · 4,900 words · Updated Apr 2026

What You Will Learn

  • The critical difference between lab data and field data — why they can contradict each other
  • How to use Google PageSpeed Insights to get both lab and field Core Web Vitals data
  • How Chrome Lighthouse works and when to use it instead of PageSpeed Insights
  • What the Chrome User Experience Report (CrUX) is and how to access it
  • How to find Core Web Vitals data in Google Search Console
  • How to interpret Good, Needs Improvement, and Poor thresholds
  • Common causes and diagnostic approaches for LCP problems
  • Common causes and diagnostic approaches for INP problems
  • Common causes and diagnostic approaches for CLS problems
  • A practical workflow for identifying and prioritising Core Web Vitals improvements

Lab Data vs Field Data

The most important concept in Core Web Vitals measurement is the distinction between lab data and field data — because these can show dramatically different results for the same page.

Lab data is a simulated measurement: a tool (Lighthouse, WebPageTest) loads the page in a controlled environment — a specific emulated device, a specific throttled network connection, a specific browser configuration. Lab data is reproducible (you get similar results each time you test the same page) and detailed (you can identify specific resources causing problems). But it represents a single simulated scenario, not the range of real user experiences.

Field data (also called Real User Monitoring, or RUM) is aggregated from actual Chrome users visiting the page in their real environments — their actual devices, their actual network connections, their actual browser configurations. Google collects this data through the Chrome User Experience Report (CrUX) and reports it in PageSpeed Insights and Google Search Console. Field data represents the actual experience distribution of real visitors.

Google uses field data for the Core Web Vitals ranking signal — not lab data. A page that performs well in Lighthouse but has poor field data (because real users have slower devices or connections than the lab simulation) is judged on that poor field data. Always check field data, not just lab data, when assessing Core Web Vitals for SEO purposes.

  • LCP target: <2.5s (Largest Contentful Paint under 2.5 seconds = Good)
  • INP target: <200ms (Interaction to Next Paint under 200ms = Good)
  • CLS target: <0.1 (Cumulative Layout Shift score under 0.1 = Good)

Google PageSpeed Insights

PageSpeed Insights (pagespeed.web.dev) is Google's primary free Core Web Vitals measurement tool. Enter any URL and PSI returns both field data (from CrUX, if the page has sufficient traffic to be in the CrUX dataset) and lab data (from a Lighthouse simulation). PSI is the recommended starting point for Core Web Vitals assessment because it combines both data types in a single interface.

The field data section (top of the PSI report) shows the 75th percentile Core Web Vitals values from real users — categorised as Good, Needs Improvement, or Poor. The 75th percentile threshold matters: a page is assessed at the 75th percentile of its real-user distribution, so at least 75% of page loads must meet the Good threshold for the page to pass. A page where only 70% of users experience Good LCP does not qualify as Good, even though a clear majority of users had a good experience.
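The 75th-percentile logic can be illustrated with a small sketch (the LCP sample values below are hypothetical):

```python
def percentile_75(values: list[float]) -> float:
    """Nearest-rank 75th percentile: the value at or below which
    at least 75% of samples fall."""
    ordered = sorted(values)
    rank = max(0, -(-len(ordered) * 75 // 100) - 1)  # ceil(0.75 * n) - 1
    return ordered[rank]

# Hypothetical LCP values (seconds) from individual real page loads.
lcp_samples = [1.8, 2.0, 2.1, 2.2, 2.4, 2.6, 3.1, 4.2]
p75 = percentile_75(lcp_samples)
```

In this sample, 5 of 8 loads (62.5%) are under the 2.5 s Good threshold, but the 75th percentile is 2.6 s, so the page would not be assessed as Good.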

The lab data section shows Lighthouse performance scores and individual resource diagnostics — identifying specific resources (large images, render-blocking scripts, large JavaScript bundles) contributing to performance problems. Use the lab data diagnostics to identify specific fixes; use the field data to confirm whether fixes have improved the real user experience.

Chrome Lighthouse

Lighthouse is Google's open-source automated auditing tool for web page quality. It is built into Chrome DevTools (F12 → Lighthouse tab) and also runs as the lab simulation engine behind PageSpeed Insights. Lighthouse assesses Performance, Accessibility, Best Practices, and SEO (a separate Progressive Web App category existed in older versions but has since been removed).

Using Lighthouse in Chrome DevTools (rather than PSI) provides additional control: you can run it while authenticated on password-protected pages; test in a staging environment before deploying changes; and test on a development branch URL that is not publicly accessible. DevTools Lighthouse also provides more detailed diagnostic output than PSI's simplified interface.

Lighthouse performance scores (0–100) are calculated from a weighted combination of lab metrics: LCP (25%), Total Blocking Time (30%), Speed Index (10%), First Contentful Paint (10%), and CLS (25%). The score is useful as a relative benchmark but not as an absolute threshold — a score of 90 does not guarantee Good Core Web Vitals field data, and a score of 70 does not necessarily mean the page fails Core Web Vitals thresholds in the field.
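The weighting can be made concrete with a small sketch (the per-metric scores passed in are hypothetical; in practice Lighthouse derives each 0–1 score from a log-normal scoring curve before applying the weights):

```python
# Lighthouse performance score weights, per Google's published scoring model.
WEIGHTS = {"fcp": 0.10, "si": 0.10, "lcp": 0.25, "tbt": 0.30, "cls": 0.25}

def performance_score(metric_scores: dict[str, float]) -> int:
    """Combine per-metric scores (each 0-1) into the 0-100 performance score."""
    total = sum(WEIGHTS[m] * metric_scores[m] for m in WEIGHTS)
    return round(total * 100)

# Hypothetical per-metric scores for one page.
score = performance_score(
    {"fcp": 0.9, "si": 0.9, "lcp": 0.8, "tbt": 0.6, "cls": 1.0}
)
```

Note how the TBT weight dominates: a mediocre TBT score drags the total down more than any other single metric.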

Chrome User Experience Report (CrUX)

The Chrome User Experience Report (CrUX) is Google's dataset of real-user performance data collected from Chrome browsers that have opted into syncing browsing history and usage statistics. CrUX powers the field data shown in PageSpeed Insights and Google Search Console. It is available as a public BigQuery dataset (for analysts who want raw data), via a REST API (the CrUX API), and via the CrUX Dashboard in Looker Studio (a pre-built visualisation of CrUX data for any domain).

The CrUX dataset has a minimum traffic threshold for page-level data: pages must receive sufficient real-user traffic to be included (Google does not publish the exact threshold, but low-traffic pages are routinely excluded). For such pages, PSI may show "No field data available for this page" — in which case only lab data is shown. The CrUX Dashboard aggregates data at the origin (domain) level, which is more widely available than page-level data.
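For pulling field data into your own tooling, the CrUX API takes a POSTed JSON query. A sketch of building the request body in Python (the origin is a placeholder; metric names follow the API's documented identifiers):

```python
import json

# CrUX API endpoint; requests are POSTed here with ?key=<API_KEY> appended.
CRUX_ENDPOINT = (
    "https://chromeuserexperiencereport.googleapis.com/v1/records:queryRecord"
)

def build_crux_query(origin: str, form_factor: str = "PHONE") -> str:
    """Build the JSON body for an origin-level CrUX queryRecord request."""
    body = {
        "origin": origin,
        "formFactor": form_factor,  # PHONE, DESKTOP, or TABLET
        "metrics": [
            "largest_contentful_paint",
            "interaction_to_next_paint",
            "cumulative_layout_shift",
        ],
    }
    return json.dumps(body)

payload = build_crux_query("https://example.com")
```

Querying with `"url"` instead of `"origin"` returns page-level data, subject to the same traffic threshold described above.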

Core Web Vitals in Google Search Console

GSC's Core Web Vitals report (Experience → Core Web Vitals) shows Good, Needs Improvement, and Poor page counts for the site — using CrUX field data. The report groups pages with the same URL pattern together and identifies which issues are affecting performance across groups of pages, making it more actionable for large sites than checking individual pages in PSI.

Click into any issue in GSC (e.g. "Poor LCP: 2 issues") to see the specific pages affected and the specific CWV values. GSC also provides a validation workflow: after fixing issues identified in the report, request validation. GSC will monitor the affected pages for 28 days and confirm whether the fix improved the CWV values to Good status.

Interpreting Core Web Vitals Scores

Metric | Good    | Needs Improvement | Poor
LCP    | ≤2.5s   | 2.5–4.0s          | >4.0s
INP    | ≤200ms  | 200–500ms         | >500ms
CLS    | ≤0.1    | 0.1–0.25          | >0.25
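These thresholds translate directly into a small classifier (a sketch, not an official implementation):

```python
# Good / Needs Improvement boundaries from Google's published CWV thresholds.
THRESHOLDS = {
    "LCP": (2.5, 4.0),   # seconds
    "INP": (200, 500),   # milliseconds
    "CLS": (0.1, 0.25),  # unitless score
}

def classify(metric: str, value: float) -> str:
    """Classify a 75th-percentile metric value as Good / Needs Improvement / Poor."""
    good, needs_improvement = THRESHOLDS[metric]
    if value <= good:
        return "Good"
    if value <= needs_improvement:
        return "Needs Improvement"
    return "Poor"
```

For example, `classify("INP", 350)` falls in the Needs Improvement band.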

The Core Web Vitals ranking signal requires that a page pass all three metrics at the Good threshold (at the 75th percentile of real user data) to be considered Good. Passing two out of three does not provide the same ranking benefit. Prioritise the metric where the page is furthest from the Good threshold — particularly if it is in the Poor category.

Diagnosing LCP Problems

The LCP element is typically the largest image (hero image, product image) or the largest text block visible in the initial viewport. PSI and Lighthouse identify which element is the LCP element and what is delaying its rendering. Common LCP causes and fixes:

  • Large, unoptimised images. Images not compressed or not served in next-gen formats (WebP, AVIF). Fix: compress images; convert to WebP; use responsive images with srcset.
  • No preload on LCP image. The LCP image is discovered late in the page load (after stylesheets and scripts load). Fix: add <link rel="preload"> for the LCP image in the <head>.
  • Render-blocking resources. CSS stylesheets or synchronous JavaScript in the <head> blocking the browser from rendering content. Fix: defer non-critical JavaScript; inline critical CSS.
  • Slow server response time (TTFB). The server takes too long to respond, delaying the start of all rendering. Fix: improve server infrastructure; add CDN; enable caching.
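The responsive-image fix can be sketched as a helper that generates a srcset value (the filename pattern is an assumption; adapt it to however your build pipeline names image variants):

```python
def build_srcset(base: str, ext: str, widths: list[int]) -> str:
    """Build a srcset attribute value from a base filename and variant widths.

    Assumes variants are published as e.g. hero-640.webp, hero-1280.webp.
    """
    return ", ".join(f"{base}-{w}.{ext} {w}w" for w in sorted(widths))

srcset = build_srcset("hero", "webp", [1280, 640, 1920])
```

The resulting value plugs into markup such as `<img src="hero-640.webp" srcset="..." sizes="100vw" width="1920" height="1080">`, letting the browser pick the smallest adequate variant.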

Diagnosing INP Problems

INP (Interaction to Next Paint) measures how quickly the page responds to user interactions — clicks, taps, keyboard input. Poor INP is typically caused by JavaScript that blocks the main thread, preventing the browser from responding to user interactions promptly. Common causes: large JavaScript bundles executing synchronously on the main thread; third-party scripts (tag managers, analytics, chat widgets, advertising scripts) blocking the main thread; long tasks (JavaScript executing for more than 50ms continuously).

Diagnosis tool: Chrome DevTools Performance panel. Record a session and look for long tasks (flagged in red in the Main thread track). Identify the specific scripts executing in those long tasks — these are the candidates for code splitting, deferred loading, or removal.

Diagnosing CLS Problems

CLS (Cumulative Layout Shift) measures unexpected movement of page content during loading. The most common causes:

  • Images without explicit dimensions. Images loaded without width and height attributes in the HTML cause the browser to allocate no space initially and then shift content when the image loads. Fix: always include explicit width and height attributes on <img> tags.
  • Late-loading ad units. Ad slots that load after the initial render cause content below them to shift downward. Fix: reserve explicit space for ad slots.
  • Web fonts causing text reflow. Pages using web fonts where the fallback font has different metrics can cause text layout to shift when the web font loads. Fix: use font-display: optional or preload critical web fonts.
  • Dynamic content injection. JavaScript that inserts banners, cookie consent bars, or promotional strips into the visible page after load. Fix: reserve space for dynamic insertions; insert above the visible content instead of into the visible content.
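The missing-dimensions cause is easy to catch with a static check before deployment. A sketch using Python's standard html.parser (the sample markup is hypothetical):

```python
from html.parser import HTMLParser

class ImgDimensionChecker(HTMLParser):
    """Collect <img> tags that lack explicit width or height attributes."""

    def __init__(self):
        super().__init__()
        self.missing: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_names = {name for name, _ in attrs}
            if not {"width", "height"} <= attr_names:
                self.missing.append(dict(attrs).get("src", "(no src)"))

checker = ImgDimensionChecker()
checker.feed(
    '<img src="hero.webp" width="1200" height="600">'
    '<img src="logo.png">'
)
```

After feeding a page's HTML, `checker.missing` lists the offending image sources; here only `logo.png` is flagged.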

Core Web Vitals Improvement Workflow

  1. Check GSC Core Web Vitals report — identify which pages and issues are in the Poor category.
  2. Run PSI on the worst-performing URLs — review both field data (confirm the problem is real) and lab data (diagnose specific causes).
  3. Prioritise: high-traffic pages with Poor status first; Needs Improvement pages for high-traffic URLs second.
  4. Implement fixes — image optimisation, preloading, script deferral, dimension attributes.
  5. Verify with Lighthouse in DevTools that lab scores improve after fixes.
  6. Return to GSC and request validation — monitor for 28 days to confirm field data improves.

Authentic Sources

Source integrity

Every factual claim in this guide is drawn from official sources, primary documents, or directly documented historical records. We learn from official sources and explain them in our own words — we never copy.

Official · Google PageSpeed Insights

Official Google tool for measuring Core Web Vitals with both field data (CrUX) and lab data (Lighthouse).

Official · Google Developers — Lighthouse

Official documentation for Chrome Lighthouse — metrics, scoring, and diagnostic categories.

Official · Chrome for Developers — CrUX

Official documentation for the Chrome User Experience Report — data collection, field data methodology, and access.

Official · web.dev — Largest Contentful Paint

Google's web.dev official guide to LCP — causes, measurement, and optimisation techniques.
