Analytics & CRO · Session 12, Guide 11

Heatmaps & Session Recording · Understanding On-Site Behaviour

Heatmaps and session recordings are qualitative research tools that reveal what users actually do on a page — not what they say they do, not what GA4 shows in aggregate statistics, but the granular micro-behaviours: where they click, how far they scroll, where they hesitate, what they try that does not work, which form fields cause abandonment. These tools transform abstract conversion funnel drop-off data into visible, concrete user experiences that generate specific, actionable hypotheses. This guide covers how heatmaps and session recordings work, what to look for in each, and how to translate the findings into CRO actions.

Analytics & CRO · 5,000 words · Updated Apr 2026

What You Will Learn

  • What heatmaps measure and how they complement GA4's quantitative data
  • How to read click heatmaps — what patterns indicate problems vs healthy engagement
  • What scroll heatmaps reveal about fold position, content visibility, and CTA placement
  • How move and attention maps differ from click maps and what they add
  • How to conduct efficient session recording analysis — filtering sessions and what to look for
  • What rage clicks, dead clicks, and error clicks reveal about user frustration
  • How form analytics shows which specific fields cause form abandonment
  • The available tools — Microsoft Clarity (free), Hotjar (freemium) — and their capabilities
  • A systematic analysis workflow for turning heatmap and recording data into CRO hypotheses
  • Common heatmap analysis mistakes that lead to wrong conclusions

What Heatmaps Measure

Heatmaps aggregate the mouse and touch interactions of all users on a page into a visual overlay — using colour to indicate intensity (red/orange = high activity; blue/green = low activity). They answer the question that GA4 cannot: where on this page do users click, how far do they scroll, and where do they spend visual attention?

Heatmaps are not a substitute for A/B testing — they reveal patterns but do not prove causation. A click heatmap showing that users click on a non-clickable image suggests a hypothesis (make it clickable or change the visual design to reduce confusion); it does not prove that making it clickable will increase conversion. The heatmap generates the hypothesis; the A/B test validates it.

The primary value of heatmaps is efficiency: they reveal potential conversion barriers in minutes — far faster than user testing or survey analysis — while working continuously in the background on the live site without any user recruitment effort.
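Conceptually, the aggregation behind a heatmap overlay is simple: raw click coordinates from many sessions are bucketed into a grid, and each cell's click count drives its colour intensity. A minimal sketch, with entirely hypothetical coordinates:

```python
from collections import Counter

def click_intensity(clicks, cell=50):
    """Bucket raw (x, y) click coordinates into a grid of `cell`-pixel
    squares and count clicks per square - the aggregation step behind
    a click heatmap's colour overlay."""
    grid = Counter()
    for x, y in clicks:
        grid[(x // cell, y // cell)] += 1
    return grid

# Hypothetical click coordinates collected across many sessions.
clicks = [(120, 300), (130, 310), (125, 305), (800, 40)]
heat = click_intensity(clicks)
hottest = max(heat, key=heat.get)  # the "red zone" of the heatmap
```

The tool then maps counts to a colour scale; the analysis question is always what page element sits under the hottest cells.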

Click Heatmaps

Click heatmaps record every click or tap on the page and aggregate them visually. They reveal: which elements receive the most clicks; which elements receive clicks but are not links; which intended clickable elements are not being clicked; and how click patterns change across device types.

Patterns to look for

  • Clicks on non-clickable elements. When users consistently click on an image, a heading, or a block of text that is not a link, they expect it to be clickable — or they expect clicking it to reveal more information. This is a usability problem: the element is visually signalling clickability without being clickable. CRO hypothesis: make the element clickable, or redesign it to not appear interactive.
  • CTA receiving fewer clicks than expected. The primary CTA on a page should receive the most prominent click cluster. If the heatmap shows users clicking other elements more frequently than the CTA, the CTA may be visually buried, poorly positioned, or using language that does not communicate its purpose. This is a high-priority finding — CTA click rate directly affects conversion rate.
  • Navigation links capturing conversion-intent clicks. If users frequently click navigation links (back to products, homepage) from a checkout page, they may be trying to access information they could not find on the page — about returns policy, product details, or shipping. Providing this information on the conversion page may reduce navigation abandonment.
  • Link click distribution. On content pages with multiple links, the click heatmap shows which links users prefer — which topics they find most interesting and which they ignore. This informs both content strategy and internal linking priorities.
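The first pattern above — clicks landing on non-interactive elements — can be screened for programmatically once a tool exports per-element click counts. A sketch, assuming illustrative element names and a hypothetical export format:

```python
def flag_dead_click_targets(click_counts, clickable, threshold=0.05):
    """Flag elements that attract a meaningful share of all clicks but
    are not interactive - candidates for the 'clicks on non-clickable
    elements' pattern. Element names and threshold are illustrative."""
    total = sum(click_counts.values())
    return sorted(
        el for el, n in click_counts.items()
        if el not in clickable and n / total >= threshold
    )

# Hypothetical per-element click counts from a heatmap export.
clicks = {"cta_button": 420, "hero_image": 180, "nav_products": 90, "h1_title": 10}
flags = flag_dead_click_targets(clicks, clickable={"cta_button", "nav_products"})
```

Each flagged element becomes a candidate hypothesis: make it clickable, or redesign it so it no longer signals interactivity.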

Scroll Heatmaps

Scroll heatmaps show what percentage of users scroll to each point on a page — from the top (which 100% of users see) to the bottom (which typically only a small fraction reach). The depth at which the overlay shifts from warm to cold colours marks the effective "fold" — the point below which most users do not scroll.
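The underlying calculation is a reach curve: for each depth, the share of sessions whose maximum scroll position reached at least that far. A minimal sketch with hypothetical per-session depths:

```python
def scroll_reach(max_depths, step=10):
    """Per-session maximum scroll depths (0-100%) -> share of sessions
    reaching each depth. This is the data behind a scroll heatmap."""
    n = len(max_depths)
    return {d: sum(1 for m in max_depths if m >= d) / n
            for d in range(0, 101, step)}

def fold(reach, cutoff=0.5):
    """Deepest point still seen by at least `cutoff` of sessions -
    one working definition of the effective fold."""
    return max(d for d, share in reach.items() if share >= cutoff)

# Hypothetical maximum scroll depths from ten sessions.
depths = [100, 90, 60, 55, 40, 35, 20, 15, 10, 5]
reach = scroll_reach(depths)
```

Here the fold sits at 40% depth — any CTA or trust signal placed below that point is invisible to half the audience.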

Patterns to look for

  • Key content below the fold. If important value propositions, trust signals, testimonials, or CTAs are positioned below the point where 50%+ of users stop scrolling, most users never see them. These elements should be moved higher on the page — or the above-the-fold content should be redesigned to entice more scrolling.
  • Steep drop-off near the top. A sudden scroll drop near the top of the page (say, 70% of users scroll to 20% depth but only 40% scroll to 25%) indicates a content discontinuity — something at that position is causing users to decide the rest of the page is not worth reading. Identifying and addressing what's at that point can improve overall page engagement.
  • Unexpectedly high scroll depth on long pages. Pages with scroll depth above 70% for most users have strong content engagement — users are reading to the end. This validates the content quality and suggests that additional depth or a stronger closing CTA could be effective.

Move and Attention Maps

Move heatmaps track cursor movement — where users move their mouse while reading or considering the page. On desktop, cursor position correlates with eye gaze — where the cursor goes, attention tends to follow, though the correlation is moderate rather than exact. This provides a rough proxy for visual attention without requiring expensive eye-tracking equipment.

Attention maps (available in some heatmap tools) use a combination of cursor movement, click data, and time-on-element data to estimate which areas of a page receive the most visual attention. High-attention areas that contain conversion-critical information confirm the placement is working; high-attention areas that contain irrelevant elements indicate attention is being drawn away from conversion elements.

Session Recordings

Session recordings capture individual user sessions as anonymised video playback — showing the exact sequence of mouse movements, scrolls, clicks, form interactions, and page transitions for each recorded session. Unlike heatmaps (which show aggregate patterns), session recordings show individual journeys — revealing the specific sequence of events that led a user to convert or abandon.

Efficient session recording analysis

Reviewing thousands of session recordings unfiltered is inefficient. Use filters to identify the most informative sessions:

  • Sessions that visited a specific page but did not convert. Filter for sessions that include the checkout page but do not include the confirmation page — these are the checkout abandonment sessions most relevant to conversion optimisation.
  • Sessions with rage clicks. Most session recording tools allow filtering for sessions containing rage clicks — the most efficient filter for finding frustration and confusion experiences.
  • Sessions with specific UTM sources. Filter for sessions from a specific campaign or traffic source to understand how that audience's experience differs from others.
  • Mobile sessions only. Review mobile sessions separately from desktop — mobile experiences often have different friction points invisible in desktop-aggregated heatmaps.

Watch at 2× playback speed, pausing at points of hesitation or unusual behaviour. Take notes on patterns that appear repeatedly across multiple sessions — a pattern appearing in 3+ sessions warrants a hypothesis; a pattern appearing in 1 session may be idiosyncratic.
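The first filter above — reached a step, never reached the success page — is simple set logic over each session's page list. A sketch, where the page paths and session structure are assumptions for illustration:

```python
def abandonment_sessions(sessions, step_page, success_page):
    """Sessions that reached `step_page` but never reached
    `success_page` - e.g. checkout abandoners worth watching."""
    return [s for s in sessions
            if step_page in s["pages"] and success_page not in s["pages"]]

# Hypothetical recorded sessions with their visited page paths.
sessions = [
    {"id": 1, "pages": ["/product", "/checkout", "/confirmation"]},
    {"id": 2, "pages": ["/product", "/checkout"]},
    {"id": 3, "pages": ["/product"]},
]
to_watch = abandonment_sessions(sessions, "/checkout", "/confirmation")
```

In practice the recording tool applies this filter for you; the sketch just makes explicit what the filter selects.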

Rage Clicks and Frustration Signals

Rage clicks — rapid repeated clicks on the same element — are an automated frustration signal detected by session recording tools. They indicate: an element the user expects to respond (a button, a link, a form field) that is not responding as expected; a page element that appears clickable but is not; or a JavaScript error preventing a click from registering. Microsoft Clarity and Hotjar both detect and filter rage clicks automatically.
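One plausible detection rule — several clicks on the same element inside a short time window — can be sketched as follows. The thresholds are assumptions; Clarity and Hotjar do not publish their exact definitions:

```python
def detect_rage_clicks(events, min_clicks=3, window_ms=700):
    """Flag elements receiving `min_clicks`+ clicks within `window_ms`
    on the same element. Thresholds are illustrative, not the values
    any specific tool uses."""
    rage = set()
    by_element = {}
    for t, element in sorted(events):
        times = by_element.setdefault(element, [])
        times.append(t)
        # Drop clicks that have fallen out of the sliding window.
        while times and t - times[0] > window_ms:
            times.pop(0)
        if len(times) >= min_clicks:
            rage.add(element)
    return rage

# Hypothetical (timestamp_ms, element) click events from one session.
events = [(0, "buy"), (200, "buy"), (380, "buy"), (5000, "logo")]
frustrated = detect_rage_clicks(events)
```

Three rapid clicks on the "buy" element trip the detector; the single click on the logo does not.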

Other frustration signals

  • Dead clicks. Clicks on non-interactive elements — indicating the user expected interactivity that does not exist. Different from rage clicks in that they may occur only once; rage clicks involve frustrated repetition.
  • Error clicks. Clicks that produce an error state — error messages, validation errors, broken states. Concentrated error clicks at a specific form field indicate a field validation problem.
  • Excessive back navigation. Users who repeatedly navigate backward from a specific page suggest the page is not delivering what they expected based on the link or CTA that brought them there — a landing page/CTA mismatch problem.

Form Analytics

Form analytics is a specific application of session recording that focuses on how users interact with forms — the most common conversion mechanism on websites. Form analytics shows: which fields users fill in first; which fields they skip and return to; which fields they abandon the form after encountering; and how long they spend on each field.

Actionable form analytics insights

  • High abandonment after a specific field. Likely cause: the field is confusing, asks for sensitive data, or is technically broken. Hypothesis: remove the field, make it optional, add helper text, or test the validation logic.
  • Long time spent on a specific field. Likely cause: the field instructions are unclear, or the field requires information users don't have readily available. Hypothesis: add example text or a tooltip explanation, or consider removing the field.
  • High return rate to a specific field. Likely cause: validation is rejecting valid inputs, or users change their mind about what to enter. Hypothesis: check the field validation logic; test alternative field types (dropdown vs free text).
  • Users abandon at the first field. Likely cause: the first field asks for sensitive information (email, phone) before trust is established. Hypothesis: reorder the fields to ask for less sensitive information first.
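Finding the abandonment field reduces to a tally: among sessions that never submitted, which field was touched last? A sketch with hypothetical field names and session data:

```python
from collections import Counter

def last_field_before_abandon(form_sessions):
    """For abandoned form sessions, count which field was touched last.
    A concentration on one field points at that field as the barrier.
    Session structure and field names are illustrative."""
    return Counter(
        s["fields"][-1] for s in form_sessions
        if not s["submitted"] and s["fields"]
    )

# Hypothetical form sessions: fields touched, in order, plus outcome.
form_sessions = [
    {"fields": ["email", "phone"], "submitted": False},
    {"fields": ["email", "phone"], "submitted": False},
    {"fields": ["email", "phone", "address"], "submitted": True},
    {"fields": ["email"], "submitted": False},
]
drop_points = last_field_before_abandon(form_sessions)
```

Here the phone field is the dominant drop point — matching the first row of the table above, it would become the field to remove, make optional, or explain.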

Tools Overview

  • Microsoft Clarity — free (unlimited). Session recordings, heatmaps, rage click detection, dead clicks, form analytics, GA4 integration. Official product from Microsoft.
  • Hotjar — freemium (limited free tier). Session recordings, heatmaps, surveys, user interviews, funnels. A more mature product with broader research features.
  • Crazy Egg — paid. A/B testing integration with heatmaps; confetti reports (click data segmented by referral source); scroll maps.
  • FullStory — paid (enterprise). Advanced session analytics; DX Data (automatic frustration signal detection); dev tools for error analysis.

Microsoft Clarity is the recommended starting tool — it is completely free with no session or traffic limits, integrates directly with GA4 (linking Clarity sessions to GA4 user IDs for combined analysis), and provides all the core capabilities (heatmaps, recordings, frustration detection) needed for most CRO programmes.

Analysis Workflow

A structured analysis workflow turns heatmap and session recording data into actionable hypotheses:

  1. Start with quantitative data. Use GA4 funnel analysis to identify the pages and steps with the highest drop-off. These are the pages where heatmap and recording analysis should focus.
  2. Review heatmaps for those pages. Check click, scroll, and move heatmaps. Document observations: what is being clicked unexpectedly? What important elements are below the fold? Where is attention concentrated vs intended conversion elements?
  3. Filter and watch session recordings. Apply filters for sessions that visited the high-drop-off page but did not complete the next step. Watch 20–30 sessions. Document repeated patterns of hesitation, confusion, or frustration.
  4. Synthesise observations into hypotheses. For each repeated observation, form a hypothesis using the template: "We observed [X]. We believe that [change] will [improve metric] because [reason]."
  5. Prioritise hypotheses. Use ICE or PIE scoring to prioritise. Add to the test backlog.
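Step 5's ICE scoring is a simple product of three 1–10 ratings. A sketch, with entirely illustrative hypotheses and ratings:

```python
def ice_score(hypothesis):
    """ICE = Impact x Confidence x Ease, each rated 1-10. A simple way
    to rank a CRO test backlog; the ratings below are illustrative."""
    return hypothesis["impact"] * hypothesis["confidence"] * hypothesis["ease"]

# Hypothetical backlog built from heatmap and recording observations.
backlog = [
    {"name": "Move testimonials above the fold", "impact": 6, "confidence": 7, "ease": 8},
    {"name": "Make hero image clickable", "impact": 4, "confidence": 8, "ease": 9},
    {"name": "Remove phone field from form", "impact": 8, "confidence": 7, "ease": 7},
]
backlog.sort(key=ice_score, reverse=True)
```

The top-ranked hypothesis is the next A/B test to run; PIE (Potential, Importance, Ease) works the same way with different dimension names.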

Common Heatmap Analysis Mistakes

  • Not separating desktop and mobile data. Desktop and mobile users have entirely different interaction patterns — fold positions are different, tap targets behave differently, and scroll behaviour differs. Always view heatmaps separately for desktop and mobile. Combined heatmaps obscure mobile-specific problems.
  • Acting on single-session observations. One user doing something unusual is not a pattern. Only observations that repeat across multiple sessions represent meaningful user behaviour. Set a minimum threshold (3–5 occurrences) before forming a hypothesis from session recording observations.
  • Treating heatmaps as proof rather than research. A click heatmap showing that the main CTA is not receiving many clicks suggests a problem — it does not prove that any specific change will fix it. The heatmap points to a hypothesis; the A/B test validates it.
  • Using heatmaps for pages with low traffic. Heatmaps require a minimum volume of interactions to produce statistically reliable patterns. A page with 50 monthly visitors produces a heatmap with too few data points to draw reliable conclusions — the patterns are too susceptible to influence from a small number of unusual sessions.

Authentic Sources

Source integrity

Every factual claim in this guide is drawn from official vendor documentation (Google, Microsoft), regulatory bodies, or platform-published technical specifications. No third-party blogs or marketing tools are used as primary sources. All content is written in our own words — we learn from official sources and explain them; we never copy.

Official · Microsoft Clarity

Free session recording and heatmap tool from Microsoft — official product with GA4 integration.

Official · Microsoft Clarity Documentation

Official technical documentation for Clarity implementation and features.

Official · Google Analytics Help — Explorations

GA4 Explorations for the quantitative funnel analysis that identifies which pages to analyse with heatmaps.

Official · Google Analytics Help — Funnel Exploration

GA4 funnel exploration — the quantitative research layer that directs heatmap and recording analysis focus.
