Who this guide is for
Digital marketing agencies, freelance consultants, and in-house teams who manage websites for multiple clients and want to include behavioural data in their reporting.
Prerequisites
A Clarity account with at least one project set up. Clarity installed on the client site and collecting data. If you have not set up Clarity yet, start with Guide 01.
What you will be able to do
Structure Clarity projects per client, decide when and how to give clients access, pull heatmap and recording observations into reports, and frame recommendations in language clients respond to.
Time investment
Around 30 to 45 minutes per client per month once you know what to look for and have a template in place. Less as the process becomes routine.
Why Clarity works well for client work
The practical advantage for agency use is straightforward: one Clarity account, unlimited projects, no per-client cost.
When I started using Clarity across client sites at Carden Digital, the first benefit was economic. Running a paid behavioural analytics tool across multiple client sites adds up quickly. Either the client bears that cost, or the agency absorbs it. Clarity removes that calculation entirely. A new client site gets its own project at no extra cost, and data starts flowing the moment the tracking script is installed.
Beyond cost, there is a reporting advantage. Heatmaps and session recordings are among the most immediately legible data you can put in front of a client. A graph of bounce rate requires explanation. A scroll heatmap showing that most mobile visitors stop halfway down the page before reaching the contact form requires almost none. Clients who have never engaged with analytics data will often respond to a heatmap screenshot because what it shows is visually obvious.
This makes Clarity useful not just as a diagnostic tool but as a communication tool — a way of showing clients evidence for recommendations rather than asserting them on instinct alone.
The reporting use case in one sentence
Clarity gives you visual evidence for recommendations you would otherwise have to make on gut feeling, and it costs nothing to run across as many client sites as you manage.
Setting up a separate project for each client
Each client site needs its own Clarity project. One tracking script, one project, one isolated dataset. Mixing several clients’ data into a shared project would make it impossible to separate any one client’s behaviour or report on it cleanly.
1. Create a new project for the client site
In your Clarity account, click “New project” from the dashboard. Enter the client’s website name and URL. Give the project a clear name that identifies the client — something like “Client Name — Website” so it is immediately obvious in your project list when you are managing several sites at once.
Clarity will generate a unique project ID and tracking snippet. This project is entirely separate from any other project in your account. The client’s data will only appear here.
2. Install the tracking script on the client site
Add the Clarity tracking script using whichever method suits the client’s site. For most managed client sites, Google Tag Manager is the cleanest approach — it keeps tracking centralised and does not require access to the theme or code every time something needs adjusting. If GTM is not available, install via the site’s header, a WordPress plugin, or platform-specific integration.
See Guide 01 for the full installation walkthrough across different platforms.
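For reference, the snippet Clarity generates for each project follows this general shape (shown here as the JavaScript body; Clarity wraps it in a script tag, and the placeholder at the end is replaced by the unique project ID from your own project’s setup screen — always paste the exact version Clarity gives you):

```javascript
// Clarity's standard async loader snippet. "YOUR_PROJECT_ID" is a
// placeholder for the unique ID Clarity generates per project — use the
// exact snippet from the project's setup screen on the client site.
(function (c, l, a, r, i, t, y) {
  // Queue any clarity() calls made before the script finishes loading
  c[l] = c[l] || function () { (c[l].q = c[l].q || []).push(arguments); };
  // Inject the async loader for this specific project ID
  t = l.createElement(r); t.async = 1;
  t.src = "https://www.clarity.ms/tag/" + i;
  y = l.getElementsByTagName(r)[0];
  y.parentNode.insertBefore(t, y);
})(window, document, "clarity", "script", "YOUR_PROJECT_ID");
```

In Google Tag Manager, the same snippet goes into a Custom HTML tag fired on all pages; GTM’s community template gallery also includes a Clarity tag template that only needs the project ID.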
3. Verify data is being collected before the client meeting
After installation, visit the client site yourself and check the Clarity project within 30 minutes. You should see a recording of your own session in the recordings list, and the dashboard should show “Data is being collected” under Settings. Do not wait until a client review call to discover the tracking was not working.
Consent must be in place before the script goes live
If the client’s site serves visitors from the UK or EEA, a compliant cookie consent banner that covers Clarity must be active before the tracking script collects data from those visitors. Running Clarity without consent for UK or EEA visitors has been non-compliant since October 2025. See Guide 07 for the full compliance setup.
4. Allow two weeks before using the data in a report
The first two weeks on a new project tend to include your own test visits, low session counts on individual pages, and occasional noise from the installation process. Set a reminder to review the project properly two weeks after going live. Heatmaps with fewer than 100 sessions per page are harder to interpret with confidence — waiting produces better data to work with.
Managing access: what to give clients and what to keep internal
Giving clients direct Clarity access can be genuinely useful, but it needs to be the right level — enough to see their data, not enough to disrupt your setup.
Clarity supports three access levels within a project. Understanding the difference matters for client work.
| Role | View recordings and heatmaps | Change project settings | Add or remove team members | Best for |
|---|---|---|---|---|
| Admin | Yes | Yes | Yes | Your own account — not for clients |
| Member | Yes | Yes | No | Internal team members who need to configure filters or update settings |
| Viewer | Yes | No | No | Clients — they can explore their own data without touching settings |
The Viewer role is the right level for almost every client. They can log in, navigate to their project, watch recordings, and explore heatmaps. They cannot change the tracking setup, adjust consent settings, or modify what data is being masked. This means if a curious client wants to explore their data between reporting calls, you can give them access without worrying they will alter something that affects your tracking.
Should you give every client direct Clarity access?
Not automatically. Some clients will use it and find it valuable — particularly those with an internal marketing resource or a genuine interest in their site performance. Others will not log in and will find it confusing without guidance. The default approach at Carden Digital is not to offer it unless a client asks, but to have it ready when they do. The Viewer role makes it a low-risk option either way.
Brief clients before giving access
Before inviting a client as a Viewer, set expectations about what recordings show. They will see real visitor behaviour — including hesitation, frustration, and exits without enquiry. A client watching recordings for the first time and seeing visitors leave without contacting them needs context, not just data. Frame the conversation as: “this shows us where the website is working and where it needs help” rather than leaving them to interpret it alone.
What to review each month and in what order
A monthly Clarity review for a client site should follow a consistent sequence — the same pages, the same questions, every time. Consistency is what makes month-on-month comparison meaningful.
The value of a regular review is partly in the individual findings and partly in the patterns that emerge over time. A contact form that shows high scroll depth but low CTA click activity is a clearer signal when it recurs month after month than it is as a single observation. A heatmap that shows improvement after a layout change is evidence that the change worked. Both require comparable data points across multiple reviews.
Start here — Dashboard
Rage and dead click summary
The Clarity dashboard shows a count of rage clicks and dead clicks across the site. Any notable increase compared to last month is worth investigating before anything else. A spike in rage clicks often signals a broken element — a button that stopped responding, a form field not accepting input, or a JavaScript error affecting interaction.
Then — Heatmaps
Key conversion pages on mobile
Check the scroll heatmap on the homepage, main service or product pages, and the contact page — always with the mobile filter applied first. Note whether the primary CTA or enquiry form is sitting above or below where most mobile visitors stop scrolling. This check often produces the most immediately actionable finding in a monthly review.
Then — Recordings
Sessions filtered by exit on key pages
Filter recordings to sessions that ended on the contact page or main landing page. Watch 8 to 10. Look for the common pattern: did visitors reach the form but not submit, and if so, where did they stop? This is the most consistent source of reportable findings and clients respond well because the behaviour is visually obvious in the recording.
Finally — Advertising (if applicable)
Campaign intent levels if Google Ads is active
If the client is running Google Ads and the Clarity integration is set up, check the advertising dashboard for intent level changes per campaign since last month. A campaign shifting from predominantly medium intent to predominantly low intent may signal a landing page issue or a change in the ad itself. See Guide 04 for the full workflow.
Aim for two or three findings per report, not ten
A client report that lists twelve Clarity observations loses its impact. Most will not produce clear recommendations, or they will compete with each other for the client’s attention and budget. Pick the two or three findings that point to the most specific, actionable changes — and present those well — rather than cataloguing everything you noticed.
Turning findings into recommendations clients understand
The gap between “here is what the data shows” and “here is what we should do about it” is where client reporting either adds real value or produces confusion.
Clarity data is not self-explanatory to most clients. A scroll heatmap showing a cold zone below the third section of a page is a visual that needs translation: what does it mean, why does it matter, and what should change? The same applies to a recording showing a visitor clicking repeatedly on a non-interactive element — the client needs to understand what the broken experience was, and why fixing it would improve their results.
The format that works consistently in client reports follows three parts for each finding: what the data showed, why it matters, and what the specific recommendation is. The examples below show how to apply this to the three most common Clarity findings.
Most mobile visitors are not reaching the contact form
Scroll heatmap
What we found: The scroll heatmap on mobile shows the page turning from warm to cool approximately halfway down — above where the contact form is positioned. A significant proportion of mobile visitors are not scrolling far enough to reach the form.
Why it matters: If a visitor cannot see the form, they cannot use it. Mobile traffic accounts for a large share of visitors on most business websites. Enquiries you are not receiving are likely coming from mobile visitors who left before reaching the point where they could make contact.
Recommendation: Add a contact button or short enquiry prompt to the top section of the page — visible within the first one or two scrolls on mobile. The existing form can stay in its current position for visitors who prefer to read first. A sticky “Get in touch” button fixed to the bottom of the mobile screen is also worth considering.
Visitors are clicking an element that is not interactive
Click heatmap + Recordings
What we found: The click heatmap shows a cluster of clicks on [specific image or heading] — an element that is not linked to anything. Session recordings confirm visitors are clicking it repeatedly, suggesting they expect it to take them somewhere or do something that it does not.
Why it matters: Every time a visitor clicks something and nothing happens, their confidence in the site drops. Repeated dead clicks are associated with frustration and early exit. If this element looks like it should be clickable, visitors are being misled by its appearance.
Recommendation: Either make the element a link (to a relevant page or section), or restyle it so it no longer looks like it should be clicked. The former is usually preferable — if visitors want to click on it, there is likely a page it could logically link to.
Visitors from a paid campaign are leaving significantly faster than organic visitors
Google Ads dashboard
What we found: The Clarity advertising dashboard shows that sessions from [campaign name] have an average duration considerably lower than the site average. The majority of sessions from this campaign are classified as low intent. Recordings filtered to this campaign show visitors arriving and leaving within 15 to 20 seconds.
Why it matters: Every click on this campaign costs budget. If most of those clicks result in immediate exits, the campaign is paying for visitors who are not converting and are not engaging with the page. This is a landing page problem, not a targeting problem — the page is not delivering what the ad promised.
Recommendation: The landing page needs to reflect what the ad is offering from the very first section. The ad headline should be visible immediately on the page — not buried below a general company introduction. We recommend updating the landing page before increasing this campaign’s budget further.
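To keep the three-part structure consistent across clients and months, the finding format above can be templated. A minimal sketch — the function and field names here are my own illustration, not anything Clarity provides:

```javascript
// Hypothetical helper: renders one Clarity finding in the
// "what we found / why it matters / recommendation" structure
// used throughout this guide. Field names are illustrative.
function renderFinding(finding) {
  return [
    finding.title,
    "What we found: " + finding.found,
    "Why it matters: " + finding.matters,
    "Recommendation: " + finding.recommendation,
  ].join("\n\n");
}

// Example: the mobile scroll-depth finding from earlier in this guide
const report = renderFinding({
  title: "Most mobile visitors are not reaching the contact form",
  found: "The mobile scroll heatmap turns cold halfway down the page, above the form.",
  matters: "Visitors who never see the form cannot enquire.",
  recommendation: "Add a contact CTA within the first one or two scrolls on mobile.",
});
```

Whether you template it in code, a document snippet, or a reporting tool matters less than keeping the three parts in the same order every month, so clients learn to read the reports quickly.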
Real example: a finding, a recommendation, and the result
Here is how a single heatmap finding translated into a recommendation, a page change, and a visible improvement in the following month’s Clarity data.
Client reporting example — professional services site
The enquiry form nobody was reaching on mobile
Context
A professional services client whose site was generating reasonable organic traffic but a lower-than-expected enquiry rate. The site was well designed and the content was relevant. The problem was not apparent from traffic data alone.
What Clarity showed
The monthly Clarity review included a scroll heatmap check on the main services page, mobile view. The heatmap showed the page turning cold before the enquiry form section. Most mobile visitors were not scrolling far enough to see it. The click heatmap confirmed the form button was receiving minimal interaction from mobile sessions.
The report finding
The finding was presented with a screenshot of the mobile scroll heatmap, annotated to show the fold line and the form position below it. The recommendation was to add a “Request a callback” button in the first section of the page, visible without scrolling on mobile.
Client response
The client approved the change within the same week. The heatmap made the problem immediately legible. The recommendation was not “we think you should move the form” but “look at where your mobile visitors are stopping” — and the visual made the case without requiring further explanation or persuasion.
The outcome
The following month’s Clarity review showed the new button receiving meaningful click activity in the mobile heatmap. Session recordings showed visitors interacting with the new CTA earlier in their journey. The next monthly report included the before and after heatmap comparison as evidence the change had produced a visible behavioural improvement.
The limit
Clarity confirmed the behavioural change. Attributing a specific change in enquiry volume directly to the CTA repositioning required the client’s own form submission data alongside the Clarity evidence. Clarity does not count form submissions by default — the before and after heatmap was enough evidence for the client, but a fuller picture always includes both.
What Clarity cannot do for client reporting
Being clear with clients about what the data can and cannot show keeps reporting credible and prevents over-reliance on behavioural data alone.
Clarity does not track form submissions as conversions by default. It can show that a visitor reached the form, interacted with the fields, and either continued or left — but it does not confirm whether the form was actually submitted successfully unless you set up custom events. This means Clarity data alone does not tell you how many enquiries a page generated. It tells you how many visitors engaged with it.
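If the client wants submission counts inside Clarity, a custom event can close that gap. Clarity’s JavaScript API accepts custom events via clarity("event", name). A minimal sketch, assuming the Clarity snippet is already loaded and the enquiry form has the id "enquiry-form" (a hypothetical id — substitute the client’s own):

```javascript
// Hedged sketch: tag enquiry-form submissions as a Clarity custom event
// so they become filterable in recordings and dashboards.
// "enquiry-form" is a hypothetical form id for illustration only.
var form = document.getElementById("enquiry-form");
if (form) {
  form.addEventListener("submit", function () {
    // Guard in case the Clarity snippet failed to load or was blocked
    if (typeof window.clarity === "function") {
      window.clarity("event", "enquiry_form_submitted");
    }
  });
}
```

Note that firing on the submit event counts submission attempts, not confirmed successes; for a stricter count, fire the event on the form’s thank-you page instead.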
Clarity does not reveal visitor identity. You cannot tell from a recording whether the visitor was a potential customer, a competitor researching you, or a returning client. Some clients ask about this and the honest answer is that Clarity shows behaviour patterns, not individual identities.
Clarity does not generate its own branded reports. There is no “export monthly report” function that produces a PDF. The workflow is: you review the data, take screenshots, annotate them where useful, and write the findings and recommendations yourself. The tool provides the evidence. The interpretation and communication is your work as the consultant or agency.
And Clarity does not explain why a visitor behaved the way they did. It shows what happened — where they clicked, how far they scrolled, where they left. The “why” requires your interpretation, informed by the recordings and supported by an understanding of the client’s business and audience. This distinction matters with clients who may expect the tool to explain intent.
Set expectations at the start of any engagement
Clarity is a behavioural observation tool. It shows what visitors do and provides evidence to support specific recommendations. It does not replace conversion tracking, does not count enquiries, and does not provide demographic or identity data. Clients who understand this from the start get more from the reporting than those who expect it to do more than it does.
For the full context of how Clarity fits into a website improvement strategy, see my Microsoft Clarity overview. For the GDPR compliance setup every client site needs before Clarity goes live, see Guide 07.
Client reporting checklist
Project setup (once per client)
- New Clarity project created with a clear client name
- Tracking script installed via GTM or direct install on client site
- Consent banner in place covering Clarity before script goes live (UK and EEA sites)
- Verified data is being collected — own session visible in recordings within 30 minutes
- Noted the two-week wait before first meaningful reporting review
- Clarity disclosed in client site privacy policy
Client access (optional)
- Decided whether to give client direct Clarity access based on their profile and interest
- If yes: invited as Viewer, not Member or Admin
- Briefed client on what recordings show before they explore independently
Monthly review (repeat per client)
- Checked dashboard rage and dead click counts versus previous month
- Reviewed mobile scroll heatmap on homepage and key conversion pages
- Reviewed click heatmap for CTA engagement and dead click clusters
- Filtered recordings to exits from key conversion pages — watched 8 to 10
- Checked advertising dashboard if Google Ads is running — reviewed intent levels per campaign
- Identified two to three findings to report — not every observation
Report writing
- Each finding structured as: what we found / why it matters / specific recommendation
- Heatmap screenshots included, annotated where the pattern is not immediately obvious
- Recommendations are specific — not “improve mobile experience” but “add callback button above the fold on mobile”
- Before and after comparison included if a previous recommendation has been implemented
- What Clarity cannot confirm (form submission count, visitor identity) is not overstated in the report
Add Clarity to your next client site today
Free to install, unlimited sessions, separate project per client. Start collecting the data that makes your recommendations credible.
Disclosure: This is not a paid promotion. I have no affiliate or commercial relationship with Microsoft or Microsoft Clarity. It is a tool I genuinely use in my own agency work. Views are my own.