Who this guide is for
Anyone with Clarity installed who wants to use session recordings as a diagnostic tool rather than just watching sessions aimlessly.
Prerequisites
Microsoft Clarity installed and collecting data. At least a few days of sessions on the pages you want to investigate. If you have not set up Clarity yet, start with Guide 01.
Works best alongside
Heatmap data from Guide 02. Heatmaps show you where patterns exist across many sessions. Recordings show you what individual visitors were actually doing at those moments.
Time per review session
20 to 40 minutes to review 10 to 15 targeted recordings on a specific page or problem. Not a daily task — do it when you have a specific question to answer.
What session recordings actually show — and what they do not
Session recordings are the closest thing to watching a real visitor use your site. They are also easy to misread if you do not understand what you are looking at.
A Clarity session recording is a reconstruction of a visitor’s session — not a video of their screen. Clarity records the DOM state of your page and the visitor’s interactions (mouse movements, clicks, scrolls, keystrokes in non-sensitive fields) and replays them as an animation. This means recordings look like video but are actually a replay of data.
What you can see in a recording: where the visitor’s mouse moved and paused, where they clicked, how far they scrolled, which pages they visited and in what order, how long they spent on each page, and where they left the site. For mobile visitors, you see taps and swipes rather than mouse movement.
What you cannot see: what the visitor was thinking, what they looked at specifically (eye tracking is not captured), what they did in other browser tabs, or anything from pages outside your site. A visitor who spent 30 seconds on your contact page and then left may have been interrupted, satisfied, or frustrated — the recording alone cannot tell you which.
The most important rule for recording review
Always go into a recording review with a specific question. “Why are visitors not completing the contact form on this page?” is a useful question. “What are visitors doing on my site?” is not — it is too broad to produce actionable conclusions. Focused review finds things. Undirected watching produces vague impressions.
The recording review workflow
Treating recordings as an investigation rather than entertainment makes the process faster and the conclusions more useful.
The workflow I use across all of my business sites follows the same sequence every time. Start from heatmap data — not from recordings. Heatmaps show you aggregate patterns across hundreds of sessions; recordings show you individual behaviour. It is much easier to find meaningful recordings when you already know which page has a problem and roughly where on that page the problem is occurring.
1. Identify the page. From heatmap data or conversion tracking, work out which page has a behaviour you do not understand.
2. Define the question. Write down exactly what you are trying to understand before opening a single recording.
3. Apply filters. Narrow recordings to sessions on that page, with the relevant frustration signal or exit behaviour.
4. Review 10 to 15 recordings. Watch enough to see whether a pattern emerges; one or two sessions do not make a pattern.
5. Record what you find. Write down the pattern in one sentence, then decide whether to act, investigate further, or leave it alone.
The workflow keeps the review process from becoming open-ended. Watching recordings without a structure tends to produce a long list of things you noticed without a clear priority order. Watching with a question produces an answer — or a reason why more investigation is needed.
How to use filters to find the most useful recordings
Filters are what separate a useful recording review session from an hour of watching people scroll. Use them before you watch a single recording.
In Clarity’s recordings section, the filter panel lets you narrow sessions by a wide range of criteria. The most useful filters for conversion investigations are below. Apply them in combination — for example, filtering for sessions on a specific page that also included a rage click will quickly surface recordings most likely to show friction.
Filter: Rage clicks
Sessions where a visitor clicked the same spot repeatedly
The most reliable frustration signal in Clarity. Rage clicks almost always mean something was not responding the way the visitor expected — a button that did not fire, a link that did not work, or an element they expected to be interactive.
Filter: Dead clicks
Sessions where clicks landed on non-interactive elements
Useful for finding elements visitors expected to be clickable. Combine with the click heatmap to confirm whether a dead-click pattern is widespread before prioritising a fix.
Filter: Exit page
Sessions where the visitor left from a specific page
Filter for your key landing page or contact page as the exit page to see sessions that ended there. This surfaces visitors who reached the page but left without converting — exactly the sessions you want to understand.
Filter: Session duration
Filter by how long the session lasted
Short sessions (under 30 seconds) on a key landing page often indicate the visitor did not find what they expected immediately. Longer sessions that still ended without a conversion are worth investigating for friction deeper in the page.
Filter: Pages visited
Sessions that included a specific page in the journey
Useful for understanding the path visitors took before reaching — or not reaching — a key page. Did visitors who converted visit the pricing page? Did visitors who left skip it entirely?
Filter: Traffic source
Sessions from a specific channel or campaign
Filter for paid traffic specifically to see whether visitors from ads behave differently from organic visitors on the same page. Particularly useful when combined with the Google Ads integration — see Guide 04.
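Traffic-source filtering gets much sharper when sessions carry explicit labels. Clarity supports custom tags via `window.clarity("set", key, value)`, and tagged sessions can then be filtered directly in the Recordings view. The small helper below is a sketch under my own assumptions — the `utmTags` name and the choice of UTM keys are mine, not part of Clarity — showing how the landing URL's campaign parameters could be turned into tags:

```javascript
// Sketch: turn UTM query parameters into Clarity custom tags.
// utmTags is a hypothetical helper name; window.clarity("set", ...) is
// Clarity's custom-tag API, available once the tracking snippet has loaded.
function utmTags(href) {
  const params = new URL(href).searchParams;
  const tags = {};
  for (const key of ["utm_source", "utm_medium", "utm_campaign"]) {
    const value = params.get(key);
    if (value) tags[key] = value; // only tag parameters actually present
  }
  return tags;
}

// In the browser, after the Clarity snippet has loaded:
// for (const [key, value] of Object.entries(utmTags(location.href))) {
//   window.clarity("set", key, value);
// }
```

With tags like these in place, "paid vs organic on the same page" becomes a one-click filter comparison rather than a guess from referrer strings.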
Filter: Device type
Mobile-only or desktop-only sessions
As with heatmaps, mobile and desktop visitors often behave very differently on the same page. If you are investigating a problem that appeared in the mobile heatmap, filter recordings to mobile sessions only.
Filter: Excessive scrolling
Sessions where the visitor scrolled up and down repeatedly
Excessive scrolling — scrolling back up the page multiple times — often indicates a visitor who is looking for something they cannot find. It is a sign of confusion rather than engagement.
The five friction signals that matter most
Most of what matters in a recording review comes down to five behaviours. Learn to spot them quickly and you will rarely need to watch more than a few minutes of any session.
When you are watching a recording, you are looking for moments that tell you something went wrong — or something the visitor expected did not happen. These five signals cover the vast majority of meaningful friction you will find in recordings on a typical business website.
Friction signal: Rage clicks
Multiple rapid clicks on the same spot. Flagged automatically by Clarity with a visual indicator in the recording. Almost always means the visitor expected something to happen and it did not. Check what element they were clicking and why it might have failed to respond.
Friction signal: U-turn behaviour
Visitor navigates to a page, spends a few seconds, and immediately goes back to the previous page or leaves entirely. Often indicates the page did not match what the visitor expected based on where they came from — a navigation label or ad that promised something the page did not deliver.
Friction signal: Form abandonment
Visitor reaches the contact form, begins interacting with it — moving the cursor to fields, sometimes starting to type — and then stops and leaves. Clarity masks form field content by default, so you see the interaction without the data. Common causes: too many fields, a required field that seems unreasonable, or a question that creates hesitation.
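Masking is worth understanding when you review form abandonment, because it also works in the other direction. Clarity's documented `data-clarity-mask` and `data-clarity-unmask` attributes control what the replay shows per element. A minimal markup sketch — the form fields, ids, and option labels here are illustrative only, not from any real page:

```html
<!-- Sketch: per-element masking control for Clarity replays.
     Field names and ids are illustrative only. -->
<form id="contact-form">
  <!-- Typed input content is masked by default; the attribute makes it explicit -->
  <input type="email" name="email" data-clarity-mask="True">

  <!-- A non-sensitive section you want visible in replays -->
  <fieldset data-clarity-unmask="True">
    <label><input type="radio" name="service" value="audit"> Website audit</label>
    <label><input type="radio" name="service" value="ads"> Ads management</label>
  </fieldset>
</form>
```

Unmasking the non-sensitive parts of a form can make abandonment recordings far more diagnostic: you can see which option a visitor chose before stopping, without ever capturing what they typed.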
Friction signal: Excessive scrolling
Visitor scrolls up and down the page repeatedly rather than in one continuous direction. This usually means they are searching for something — a price, a phone number, a specific piece of information — and cannot find it easily. The fix is often making that information more prominent or adding it to a section the visitor was returning to.
Intent signal: Deep engagement before exit
Visitor spends significant time on the page, scrolls deeply, reads carefully — and then leaves without converting. This is not a friction signal in the traditional sense. It suggests the content is compelling but something at the conversion point itself is failing: the form is not visible, the CTA label is unclear, or the ask feels premature given the content.
Contextual signal: Repeated visits to the same page
Visitor returns to the same page or section multiple times within a session. This can indicate genuine interest and consideration — or confusion about where to find something. Context matters: is it a pricing page they are returning to (likely interest), or a navigation element they keep clicking (likely confusion)?
On playback speed
Clarity allows recordings to be played back at up to 8x speed. For a first pass through a batch of recordings, 4x to 6x is usually fast enough to spot the signals above without missing them. Slow down to 1x or 2x when you reach the specific moment you want to understand in detail.
How many recordings to review before drawing conclusions
One recording is an anecdote. Ten recordings that all show the same behaviour are a pattern worth acting on.
The number of recordings you need to review depends on what you find in the first few. If you watch five recordings filtered for rage clicks on a specific page and all five show the rage click happening on the same element, that is already a strong enough signal to act on. You do not need to watch fifty more to be certain.
If you watch five recordings and each one shows something different — different exit points, different behaviour patterns, no consistent signal — then either the problem is not concentrated enough to identify from recordings alone, or the filter you applied was too broad. In that case, tighten the filter (add an additional criterion) or return to the heatmap data to narrow the investigation.
How many recordings to review
Guidance by situation
- First 5 recordings all show the same behaviour on the same element: stop and act; the pattern is clear.
- Mixed signals across the first 5, with no clear pattern: review 10 more with a tighter filter.
- Investigating a high-traffic page where you need confidence: review 15 to 20 before concluding.
- Low-traffic page with fewer than 50 sessions in total: review everything available, noting that the sample is small.
- You have watched 20+ recordings and still see no pattern: the problem may not be on this page; check the referral journey.
Remember the 30-day limit
Clarity only retains session recordings for 30 days. If you are investigating a problem that may have appeared more than a month ago, the recordings you need may no longer be available. Heatmap data (retained for up to 13 months) is more useful for historical analysis. This is another reason to review recordings regularly rather than leaving it until a problem has become obvious.
Decision points: act, investigate further, or leave alone
Not every friction signal in a recording needs fixing. Here is how to judge what is worth acting on.
The same principle that applies to heatmaps applies here: the goal is to find concentrated, repeatable friction — not to fix every imperfect visitor journey. Some visitors will always navigate in unexpected ways. What you are looking for is behaviour that appears consistently across multiple sessions and has a plausible connection to a lower conversion rate.
Decision guide
If you see this in recordings, here is what to do
- Rage clicks on the same element in multiple sessions. Act: fix the element; it is not behaving as expected.
- Multiple visitors abandon the form at the same field. Act: remove or rephrase that field.
- Visitors scroll past the CTA without clicking, then leave. Act: improve CTA visibility or label, and check the heatmap for confirmation.
- Visitors scroll back up repeatedly to the same section. Act: that section likely contains something they need; make it more prominent or easier to find.
- One session shows unusual behaviour seen nowhere else. Leave: a single outlier is not a pattern.
- Visitors exit from your contact confirmation page. Leave: expected behaviour; they have completed their goal.
- Visitors from paid ads behave differently from organic visitors. Investigate: filter by traffic source and review separately — see Guide 04.
- Visitors spend a long time on a page but never approach the form. Investigate: check whether the form is visible and whether the CTA is clear enough.
Real example: finding the ad landing page problem
Here is how session recordings — filtered by paid traffic — identified a specific problem that Google Ads data alone would never have surfaced.
Real example from paid search campaigns across the Carden businesses
The campaign that looked fine on paper but was losing visitors immediately
The situation
A Google Ads campaign was running to a service landing page. Click-through rate was acceptable and spend was within budget. But conversion rate from that campaign was lower than expected given the traffic volume.
The question
What are visitors from this campaign actually doing when they land on the page? Are they leaving immediately, or reaching the page and then not converting?
Filters applied
Recordings filtered to sessions from paid traffic, with the landing page as the entry page, and session duration under 30 seconds. This surfaced sessions most likely to show immediate exit behaviour rather than genuine consideration.
What the recordings showed
Most of these short sessions showed visitors landing, scrolling briefly through the top of the page, and leaving within 15 to 20 seconds. The page they were landing on led with general company information rather than the specific service the ad had promoted. There was a mismatch between the promise in the ad and what the visitor saw first on the page.
What changed
The landing page was updated to lead with the specific service the ad referenced — matching the language and promise of the ad — rather than the general introduction. The ad was also tightened to more accurately reflect what the page covered.
The result
Subsequent recordings from the same campaign showed longer sessions and visitors reaching further down the page — including the contact section. The behaviour change was visible in Clarity within the first two weeks of the updated page going live.
The limit
Recordings confirmed that the behaviour had changed; they could not, on their own, confirm that the change caused an improvement in conversion rate. That required monitoring enquiry volumes alongside the recording data.
Session recordings and heatmaps are two sides of the same investigation. Heatmaps show you the aggregate pattern; recordings show you the individual moment. Used together — as described in Guide 02 — they give you a complete picture of where your site is losing visitors and why.
For the full context of how session recordings fit into a website sales strategy, see my Microsoft Clarity overview. If you are running Google Ads and want to take the traffic-source filtering further, Guide 04 covers the full Google Ads integration setup.
Recording review checklist
Before you start
- Identified which page you are investigating and why — from heatmap data, conversion data, or a specific reported problem
- Written down the specific question you are trying to answer before opening recordings
- Confirmed Clarity has enough session data for meaningful review (at least 50 sessions on the page in the last 30 days)
Applying filters
- Filtered recordings to the relevant page (entry page or pages visited)
- Applied a frustration signal filter if relevant (rage clicks, dead clicks, excessive scrolling)
- Applied device filter if the investigation is mobile-specific
- Applied session duration filter if relevant to the question
During review
- Playing recordings at 4x to 6x speed for first pass, slowing to 1x to 2x at moments of interest
- Watching for the five friction signals: rage clicks, u-turn behaviour, form abandonment, excessive scrolling, deep engagement before exit
- Noting the specific element or moment where friction occurs — not just “they left”
- Reviewing at least 10 recordings before concluding a pattern exists (or does not)
After review
- Summarised the finding in one sentence: “Visitors are [doing X] at [point Y] because [probable cause Z]”
- Decided: act now, investigate further, or leave — with a reason for the decision
- If acting: made one change at a time where possible
- Set a reminder to check recordings again in two to four weeks to see whether the behaviour has changed
Start your recording review today
Open Clarity, filter recordings to your main landing page with exits, and watch ten sessions. You will find something worth fixing.
Disclosure: This is not a paid promotion. I have no affiliate or commercial relationship with Microsoft or Microsoft Clarity. It is a tool I genuinely use across my own businesses and for clients. Views are my own.