What is attribution modelling in GA4?
Attribution modelling in GA4 determines how conversion credit is distributed across the touchpoints in a user's journey to conversion. GA4 supports four attribution models: Last click (100% credit to the last channel touched before conversion), Data-driven (ML-based credit distribution across all touchpoints, default for qualifying properties), First click (100% credit to the first channel that introduced the user), and Linear (equal credit to all touchpoints). The model you choose changes which channels appear to "cause" conversions — and therefore which channels receive budget increases.
This is one of the highest-leverage analytics decisions in any paid media strategy.
The four GA4 attribution models
Last click
Credit distribution: 100% to the last non-direct channel touched before conversion.
What it overvalues: Channels that appear at the bottom of the funnel — branded search, direct retargeting, and brand-name paid search. These are channels that "close" the sale but often didn't initiate the customer's consideration.
What it undervalues: Awareness channels — display, video, upper-funnel paid social, content marketing, affiliate. These channels introduce users to the brand but rarely appear as the last touchpoint before purchase.
When to use: Simple single-channel operations (e.g., a business that only runs Google Ads Search). When you genuinely want to optimise only toward close-stage channels. When starting Smart Bidding before hitting data-driven thresholds (last click as a stable fallback).
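The last non-direct click rule can be sketched as a small function. GA4 doesn't expose attribution as code, so this is purely illustrative; the channel names and the `last_non_direct_click` helper are assumptions for the example:

```python
def last_non_direct_click(touchpoints):
    """Assign 100% credit to the last non-direct channel in the path.

    touchpoints: ordered list of channel names for one user journey.
    """
    for channel in reversed(touchpoints):
        if channel != "Direct":
            return {channel: 1.0}
    # Path contains only direct visits: credit falls to Direct.
    return {"Direct": 1.0}

path = ["Display", "Paid Social", "Direct", "Branded Search", "Direct"]
print(last_non_direct_click(path))  # {'Branded Search': 1.0}
```

Note how Display and Paid Social receive nothing even though they opened the path — exactly the undervaluation described above.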
Data-driven (default)
Credit distribution: Machine learning distributes credit across all touchpoints based on the actual contribution of each touchpoint to conversions. Touchpoints that appear more frequently in converting paths (relative to non-converting paths) receive higher credit.
What it does better: Surfaces the value of upper-funnel touchpoints that assist conversions but rarely appear as the last click. Display campaigns, YouTube, and paid social that influence consideration get credit proportional to their actual contribution.
Requirements for activation:
- 300+ conversions in 30 days on the conversion event
- 3,000+ ad clicks in 30 days
Below these thresholds, GA4 defaults to last-click even if data-driven is selected. Check: Google Ads → Tools → Attribution → Attribution model status → "Data-driven" or "Last click (fallback)".
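The eligibility check is simple enough to express directly. A minimal sketch of the documented thresholds (the function name and call pattern are illustrative, not a Google API):

```python
def data_driven_eligible(conversions_30d, ad_clicks_30d,
                         min_conversions=300, min_clicks=3000):
    """Check the documented data-driven activation thresholds.

    Below either threshold, GA4 falls back to last click even
    when data-driven is selected.
    """
    return conversions_30d >= min_conversions and ad_clicks_30d >= min_clicks

print(data_driven_eligible(450, 5200))  # True
print(data_driven_eligible(280, 5200))  # False: under 300 conversions
```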
The key insight data-driven typically reveals: Upper-funnel channels (brand awareness display, YouTube) appear undervalued by last-click. When switching to data-driven, these channels often show 20–50% more attributed conversions, while branded search shows fewer (as it was over-credited as the last touchpoint).
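Google's actual data-driven model is proprietary machine learning, but the frequency intuition above can be sketched as a toy heuristic: weight each channel by how much more often it appears in converting paths than in non-converting ones, then normalise the weights into credit shares. All channel names, numbers, and the smoothing choice here are assumptions for illustration:

```python
from collections import Counter

def frequency_based_credit(converting_paths, non_converting_paths):
    """Toy stand-in for data-driven attribution (the real model is
    proprietary ML). Channels over-represented in converting paths
    earn a larger share of credit."""
    # Count, per channel, how many paths of each type it appears in.
    conv = Counter(ch for p in converting_paths for ch in sorted(set(p)))
    non = Counter(ch for p in non_converting_paths for ch in sorted(set(p)))
    n_conv, n_non = len(converting_paths), len(non_converting_paths)
    # "Lift": appearance rate in converting paths relative to
    # non-converting paths (+1 smoothing avoids division by zero).
    lift = {ch: (conv[ch] / n_conv) / ((non[ch] + 1) / (n_non + 1))
            for ch in conv}
    total = sum(lift.values())
    return {ch: round(w / total, 3) for ch, w in lift.items()}

credit = frequency_based_credit(
    converting_paths=[["Display", "Branded Search"],
                      ["Paid Social", "Branded Search"],
                      ["Display", "Paid Social", "Branded Search"]],
    non_converting_paths=[["Branded Search"], ["Branded Search"]],
)
print(credit)  # {'Branded Search': 0.2, 'Display': 0.4, 'Paid Social': 0.4}
```

Even this crude version reproduces the pattern described above: branded search, which appears in every path (including the non-converting ones), is down-weighted relative to the upper-funnel channels.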
First click
Credit distribution: 100% to the first channel that brought the user to the site.
What it overvalues: Awareness and prospecting channels — the channels that introduced the user to the brand.
What it undervalues: Retargeting, branded search — channels that help users complete a conversion they were already considering.
When to use: Rarely used as a primary model. Useful for assessing the "introduction" value of top-of-funnel channels. Most useful for understanding new customer acquisition paths specifically.

Linear
Credit distribution: Equal credit to every touchpoint in the path.
When to use: When you believe all touchpoints contribute equally (rare in practice). Useful as a reference model in the attribution model comparison tool to see how different models diverge.
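First click and linear are the two simplest credit rules, and a short sketch makes the contrast with last click concrete. As before, the function and channel names are illustrative only:

```python
def first_click(touchpoints):
    """Assign 100% credit to the first channel in the path."""
    return {touchpoints[0]: 1.0}

def linear(touchpoints):
    """Split credit equally across every touchpoint; a channel that
    appears twice accumulates two shares."""
    share = 1.0 / len(touchpoints)
    credit = {}
    for ch in touchpoints:
        credit[ch] = credit.get(ch, 0.0) + share
    return credit

path = ["Display", "Paid Social", "Branded Search"]
print(first_click(path))  # {'Display': 1.0}
print(linear(path))       # each channel gets one third of the credit
```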
How to use the attribution model comparison tool
GA4 → Advertising → Attribution → Model comparison
This tool shows how different models attribute conversions across channels, side by side.
The most useful comparison: Data-driven vs Last click
| Channel | Last click conversions | Data-driven conversions | Difference |
|---|---|---|---|
| Branded search | 450 | 280 | -38% |
| Organic search | 210 | 260 | +24% |
| Paid Social | 85 | 150 | +76% |
| Display | 40 | 110 | +175% |
How to read this: Paid social and display are significantly undervalued by last click. A budget allocation based on last-click attribution underfunds these channels because their contribution to conversions is credited to branded search (which usually appears last).
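The Difference column in a table like the one above is just the percentage change per channel. A minimal sketch, using the table's numbers (the function name is an assumption; the comparison tool does this for you):

```python
def model_delta(last_click, data_driven):
    """Percentage change in attributed conversions when switching
    from last click to data-driven, per channel."""
    return {ch: round((data_driven[ch] - last_click[ch]) / last_click[ch] * 100)
            for ch in last_click}

last_click = {"Branded search": 450, "Organic search": 210,
              "Paid Social": 85, "Display": 40}
data_driven = {"Branded search": 280, "Organic search": 260,
               "Paid Social": 150, "Display": 110}
print(model_delta(last_click, data_driven))
# {'Branded search': -38, 'Organic search': 24, 'Paid Social': 76, 'Display': 175}
```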
Budget implication: If you're running data-driven attribution and your results look like the table above, increasing investment in Paid Social and Display (and reducing over-investment in branded search bidding) is directionally correct — these channels are producing conversion assists that last-click is not crediting.
The recalibration period
When you change the attribution model in Google Ads (Smart Bidding optimises against whichever model is selected), allow 90 days before making major budget decisions based on the new model's data:
- Smart Bidding relearns which clicks and audiences lead to conversions under the new credit distribution
- Historical attributed conversion counts change retroactively in reports (attribution models recalculate historical data)
- CPAs and ROAS figures appear to shift as the model recalibrates
During the recalibration period:
- Do not increase or decrease channel budgets based on the immediate post-switch numbers
- Monitor for unusual bidding behaviour (automated bids may swing before the algorithm stabilises)
- Run the model comparison tool to understand the delta between the old and new models (the delta itself stays stable even while Smart Bidding recalibrates)
Data-driven vs last click: when data-driven is wrong
Despite its advantages, data-driven attribution is not always the better choice:
Data-driven may mislead when:
- Your property has barely met the 300 conversion threshold — the model has limited data and may overfit
- Your campaigns have structural changes (new ad formats, new landing pages) that make historical conversion path data unreliable
- You're running single-touchpoint campaigns (e.g., only one paid channel) — data-driven adds no value if there's nothing to distribute credit across
In these cases, last-click with micro-conversion assists is often more reliable — clear, interpretable, and stable.
FAQ: GA4 Attribution Models: Last Click vs Data-Driven vs First Click
How close should last-click and data-driven conversion numbers be before I worry?
What should I validate first when the attribution models disagree?
When is a discrepancy a tracking bug instead of a reporting difference?
Check your attribution model setup before campaign reporting gets blamed for the wrong issue.
Run a free GA4 audit to spot attribution breaks, UTM governance issues, self-referrals, and source/medium loss fast.