Why Your GA4 Numbers Don't Match Google Ads (And How to Fix It)
You pull your monthly report, open GA4 and Google Ads side by side, and the conversion numbers are off by 30%. Maybe more. You are not imagining it, and you are definitely not alone. The gap between GA4 and Google Ads can range anywhere from 10% to 60%, and it is one of the most common frustrations in digital analytics today.
The Discrepancy Is Real, and It Is Structural
Before diving into fixes, it helps to understand that GA4 and Google Ads were never designed to report the same numbers. They are fundamentally different measurement systems with different scopes, different counting logic, and different attribution philosophies. A perfect match between the two would actually be suspicious.
That said, a 40% or 60% gap is not healthy either. When the discrepancy is that large, it usually points to one or more specific, fixable configuration issues layered on top of the inherent differences. Here are the seven root causes, roughly ordered by how much data they typically account for.
1. Different Attribution Models
This is the single biggest source of confusion. GA4 defaults to data-driven attribution, which uses machine learning to distribute conversion credit across multiple touchpoints in a user journey, across all channels. Google Ads, by contrast, only ever attributes conversions across its own touchpoints. Even when a conversion action uses data-driven attribution (the default for newer conversion actions), credit never leaves the Google Ads ecosystem; older actions set to last click give full credit to the last Google Ads click before conversion.
The practical effect is significant. A user clicks a Google Shopping ad on Monday, then returns via an organic search on Wednesday and converts. Google Ads gives full credit to that Shopping ad click. GA4, using data-driven attribution, might split credit between the ad, the organic visit, and possibly other touchpoints in between. The same conversion, counted differently.
How to fix it: You cannot make them match perfectly, but you can reduce the gap. In GA4, go to Admin, Attribution Settings and experiment with switching to last-click attribution to see how much closer the numbers get. This is not necessarily the right long-term choice for your reporting, but it isolates how much of the discrepancy is model-driven versus data-loss driven.
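GA4's data-driven model is proprietary machine learning, so it cannot be reproduced exactly, but the scope difference can be sketched. The snippet below is illustrative only: it contrasts Google Ads-style credit (only Google Ads touchpoints are visible, the last one gets everything) with an even split across all channels, used here as a crude stand-in for GA4's multi-touch credit. The journey data and function names are invented for the example.

```javascript
// Illustrative sketch of the attribution scope difference. The even split is a
// stand-in for GA4's data-driven model, which is proprietary ML.
const journey = [
  { channel: "google_ads", ts: "2024-04-01" }, // Shopping ad click (Monday)
  { channel: "organic",    ts: "2024-04-03" }, // organic return visit (Wednesday)
];

// Google Ads view: full credit to the last Google Ads click; other channels
// are invisible to it.
function adsCredit(touchpoints) {
  const adsClicks = touchpoints.filter(t => t.channel === "google_ads");
  if (adsClicks.length === 0) return {};
  return { google_ads: 1 };
}

// GA4-style view (simplified): spread credit across every touchpoint.
function ga4Credit(touchpoints) {
  const credit = {};
  for (const t of touchpoints) {
    credit[t.channel] = (credit[t.channel] || 0) + 1 / touchpoints.length;
  }
  return credit;
}

console.log(adsCredit(journey)); // all credit to google_ads
console.log(ga4Credit(journey)); // credit split between google_ads and organic
```

The same conversion event produces two different credit tables, which is exactly what the two reports then show you.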
2. Different Counting Methods
Google Ads and GA4 count conversions using fundamentally different logic. Google Ads can be configured to count either one conversion per click or every conversion per click. If you are tracking purchases and a user buys three times after a single ad click, Google Ads (set to "every") counts three conversions. GA4 counts each purchase event independently, regardless of what drove the initial visit.
The reverse problem also exists. If Google Ads is set to count "one conversion per click" for lead generation, but GA4 counts every form submission event, GA4 will show more conversions than Google Ads for the same user actions.
How to fix it: Check each conversion action in Google Ads under Goals, Conversions (Tools, Conversions in older versions of the interface). Verify that the counting method ("one" vs "every") fits the conversion type. Purchases should generally be "every." Lead form submissions should generally be "one." Then compare like with like when reviewing GA4 against Ads.
3. Date Attribution Differences
This one catches a lot of analysts off guard. Google Ads attributes conversions to the date of the ad click, not the date the conversion actually happened. If a user clicks an ad on April 1st and converts on April 8th, Google Ads logs that conversion under April 1st. GA4 logs it under April 8th, the day the event actually fired.
For businesses with longer consideration cycles, this creates a persistent date mismatch. Your April report in Google Ads includes conversions that actually happened in May. Your April report in GA4 shows only what happened in April. The totals might eventually converge over a long enough time window, but any single month or week comparison will look wrong.
How to fix it: Use longer comparison windows (90 days instead of 30) to smooth out the date attribution lag. If you need monthly reporting, acknowledge the lag in your reports and include a note about the expected variance. Google Ads also offers "by conversion time" reporting columns, such as "Conversions (by conv. time)," which align more closely with GA4's event-date view.
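The bucketing difference is easy to see in a sketch. This toy example (invented dates, invented field names) keys the same conversions by click date versus event date and shows them landing in different monthly buckets.

```javascript
// Illustrative: the same conversions land in different monthly buckets
// depending on whether they are keyed to the click date (Google Ads) or
// the event date (GA4).
const prompt_conv = { clickDate: "2024-04-01", eventDate: "2024-04-08" };
const lagged_conv = { clickDate: "2024-04-28", eventDate: "2024-05-05" };

const monthOf = iso => iso.slice(0, 7); // "YYYY-MM"

// Google Ads style: report under the month of the ad click.
const bucketByClickDate = convs => convs.map(c => monthOf(c.clickDate));
// GA4 style: report under the month the conversion event fired.
const bucketByEventDate = convs => convs.map(c => monthOf(c.eventDate));

console.log(bucketByClickDate([prompt_conv, lagged_conv])); // ["2024-04", "2024-04"]
console.log(bucketByEventDate([prompt_conv, lagged_conv])); // ["2024-04", "2024-05"]
```

Both systems see both conversions; only the monthly totals disagree, which is why the gap shrinks as the comparison window grows.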
4. Consent and Ad Blockers Are Eating Your Data
This is the most underestimated cause of discrepancies, and it is getting worse every year. Research consistently shows that approximately 15% of users block analytics scripts entirely through browser extensions like uBlock Origin or Brave's built-in shields. On top of that, roughly 20% of potential tracking data is lost to cookie consent banners where users decline or simply close the banner without consenting.
Google Ads is partially insulated from this because it measures clicks on its own platform before the user reaches your site. It knows a click happened regardless of whether your GA4 tag fires. GA4, however, depends entirely on the tag loading and executing on your site. If the tag is blocked, GA4 never sees the visit, let alone the conversion.
The result is that Google Ads will consistently report more conversions than GA4, because it captures the click side of the equation while GA4 misses a percentage of the site side.
How to fix it: Implement Google Consent Mode v2 to enable behavioural and conversion modelling for users who decline consent. Consider server-side tagging via Google Tag Manager to reduce exposure to client-side ad blockers. Server-side tagging moves the measurement endpoint to your own domain, making it significantly harder for browser extensions to block.
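A Consent Mode v2 implementation starts with default consent states set before any other tag fires. The sketch below follows Google's documented `gtag('consent', ...)` API; the `wait_for_update` value and the `onConsentGranted` wiring are assumptions you would adapt to your own consent management platform.

```javascript
// Consent Mode v2 defaults. This must run before any config or event gtag
// calls, typically at the top of the <head>.
window.dataLayer = window.dataLayer || [];
function gtag() { dataLayer.push(arguments); }

// Deny by default until the user makes a choice (required stance for EU/UK
// traffic under most consent frameworks).
gtag('consent', 'default', {
  ad_storage: 'denied',
  ad_user_data: 'denied',        // new signal in Consent Mode v2
  ad_personalization: 'denied',  // new signal in Consent Mode v2
  analytics_storage: 'denied',
  wait_for_update: 500           // ms to wait for the CMP before tags fire (assumed value)
});

// Hypothetical hook: your consent banner calls this when the user accepts.
function onConsentGranted() {
  gtag('consent', 'update', {
    ad_storage: 'granted',
    ad_user_data: 'granted',
    ad_personalization: 'granted',
    analytics_storage: 'granted'
  });
}
```

With defaults denied, GA4 still receives cookieless pings it can feed into conversion modelling, which is what narrows the gap for non-consenting users.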
5. GA4 Modelled Data: Polite Guesswork
To compensate for consent-related data loss, GA4 applies behavioural and conversion modelling. When a user declines cookies or when data is otherwise unavailable, GA4 estimates what likely happened based on patterns from consented users.
Google calls this "blended data." A more accurate description is polite guesswork. The modelling is statistically sound at aggregate level, but it introduces a layer of estimation that Google Ads does not apply in the same way. Google Ads has its own conversion modelling, but it operates on different signals (logged-in Google account data, for example) and arrives at different estimates.
The two systems are effectively running independent statistical models on overlapping but non-identical data sets, producing non-identical results. This is expected, but it means any row-level reconciliation between GA4 and Google Ads is fundamentally impossible once modelling is involved.
How to fix it: Accept that modelled data will never reconcile perfectly. Focus on trends rather than absolute numbers. If your GA4 modelled conversions and Google Ads conversions trend in the same direction at roughly the same rate, the measurement system is working. If they diverge in trend, that signals a real tracking issue beyond normal modelling variance.
6. Cross-Domain Tracking Gaps
If your user journey spans multiple domains, for example a marketing site on one domain and a checkout on another, cross-domain tracking configuration becomes critical. Without it, GA4 treats the domain transition as a new session with a new user, losing the original traffic source attribution.
The paid click that started the journey gets replaced by a referral from your own marketing domain. Google Ads still attributes the conversion to the ad click because it tracks via its own click ID. GA4 misattributes it to self-referral or direct traffic. The conversion shows up in both systems, but the source attribution is completely different.
How to fix it: Configure cross-domain tracking in your GA4 data stream settings. Add all domains involved in the user journey. Then add those domains to your referral exclusion list to prevent self-referrals from resetting sessions. Test thoroughly using GA4 DebugView to confirm the client ID persists across domain boundaries.
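The data stream UI is the usual place to configure this, but if you tag with gtag.js directly, the equivalent is the documented `linker` parameter on the config call. The measurement ID and domains below are placeholders for your own values.

```javascript
// Cross-domain linking via gtag.js. "G-XXXXXXXXXX" and the domains are
// placeholders; list every domain a user can cross in one journey so the
// client ID is passed along in the URL at the domain boundary.
gtag('config', 'G-XXXXXXXXXX', {
  linker: {
    domains: ['www.example.com', 'checkout.example-pay.com']
  }
});
```

Whichever route you use, verify in DebugView that the `_gl` linker parameter appears on cross-domain links and that the client ID survives the hop.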
7. Conversion Lookback Windows
Google Ads uses a configurable click-through conversion window, defaulting to 30 days for most conversion actions and extendable up to 90 days. GA4's attribution lookback window is separate and configurable in Admin, Attribution Settings: acquisition conversion events can use a 7-day or 30-day window, and all other conversion events can use 30, 60, or 90 days.
If these windows are not aligned, conversions that fall within one system's lookback but outside the other's will appear in only one report. A user who clicks an ad on January 1st and converts on March 15th would be attributed in Google Ads (with a 90-day window) but potentially missed by GA4 (if set to a 30-day attribution window).
How to fix it: Align your lookback windows. In GA4, go to Admin, Attribution Settings and set the acquisition and other conversion event windows to match your Google Ads conversion action settings as closely as the options allow. For most businesses, 30 days for acquisition events and 90 days for other conversion events (GA4's defaults) is a reasonable starting point.
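The window check itself is just date arithmetic. This illustrative sketch (invented dates, whole-day precision; real systems compare exact timestamps) shows the January-to-March example falling inside a 90-day window but outside a 30-day one.

```javascript
// Illustrative: is a conversion attributable under a given lookback window?
const DAY_MS = 24 * 60 * 60 * 1000;
const daysBetween = (a, b) => Math.round((new Date(b) - new Date(a)) / DAY_MS);

function withinWindow(clickDate, conversionDate, windowDays) {
  return daysBetween(clickDate, conversionDate) <= windowDays;
}

// Click on January 1st, conversion on March 15th: 74 days later.
const lag = daysBetween("2024-01-01", "2024-03-15");
console.log(lag);                                          // 74
console.log(withinWindow("2024-01-01", "2024-03-15", 90)); // true  (90-day window)
console.log(withinWindow("2024-01-01", "2024-03-15", 30)); // false (30-day window)
```

Any conversion whose lag falls between the two windows appears in exactly one report, which is why mismatched windows produce a steady, one-sided discrepancy.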
The Compound Effect: Why 73% of GA4 Properties Lose 30-40% of Conversion Data
Each of these issues in isolation might cause a 5-10% discrepancy. But they compound. A property with misaligned attribution models, no consent mode implementation, broken cross-domain tracking, and mismatched lookback windows can easily see 30-40% of its conversion data either missing or misattributed. Research across hundreds of GA4 implementations suggests that 73% of properties are affected by this level of data loss.
The danger is not just inaccurate reporting. It is the decisions made on inaccurate data. When Google Ads shows 200 conversions for a campaign and GA4 shows 120, which number do you use for ROI calculations? Which number do you present to the client? Which number do you use to decide whether to increase or decrease budget?
The answer is neither, until you understand why they differ and can quantify each source of discrepancy. Only then can you make an informed decision about which measurement to trust for which purpose.
How an Automated Audit Catches These Issues
Manually checking each of these seven areas across a GA4 property takes hours. You need to verify attribution model settings, cross-domain configuration, consent mode implementation, conversion counting methods, lookback window alignment, tag firing reliability, and data completeness, all while cross-referencing multiple interfaces.
An automated GA4 audit systematically validates each of these configuration points in minutes. It checks whether consent mode is implemented and firing correctly, whether cross-domain tracking covers all domains in the user journey, whether your attribution settings are internally consistent, and whether your conversion events are actually capturing the data they should be.
More importantly, an automated audit catches the issues you did not know to look for. A tag that fires twice on certain page types, inflating event counts. A consent banner that blocks GA4 but not Google Ads tags, creating a systematic gap. A referral exclusion that was removed during a site migration six months ago. These are the configuration details that explain the gap between what Google Ads reports and what GA4 shows.
Practical Steps to Reduce Your Discrepancy
- Audit your attribution settings in both GA4 and Google Ads. Document which models are active and understand the expected directional impact on each channel.
- Implement Consent Mode v2 if you have not already. This is the single highest-impact fix for most properties experiencing large discrepancies, especially in the EU and UK.
- Verify cross-domain tracking end to end using DebugView. Check that the client ID persists and that no self-referrals are resetting session attribution.
- Align lookback windows between GA4 and Google Ads. Mismatched windows are a common and entirely avoidable source of discrepancy.
- Check conversion counting in Google Ads. Make sure purchase events use "every" and lead events use "one."
- Use longer comparison periods to reduce the impact of date attribution differences. Ninety days gives a much more accurate comparison than seven.
- Run an automated audit to catch the configuration issues you cannot see from the reporting interface alone.
Find out exactly where your data is leaking
Run a free GA4 audit to identify attribution mismatches, consent gaps, cross-domain breaks, and the other configuration issues causing your GA4 and Google Ads numbers to disagree. Takes under 10 minutes.
Start Free Audit