
ChatGPT, Atlas, Perplexity, Comet, Claude: How Each Shows Up in GA4 (2026 Reference)


How does each AI assistant appear in GA4?

In 2026, AI traffic in GA4 splits into three buckets. Browsers and assistants that pass clean referrers (Perplexity web, Perplexity Comet, Claude.ai, Copilot, Gemini standalone) appear with a recognisable source / medium like perplexity.ai / referral. Surfaces that strip the referrer (ChatGPT Atlas, ChatGPT iOS/Android apps, in-app webviews) land in (direct) / (none). Surfaces that mimic existing channels (Google AI Overviews, Gemini inside google.com) are reported as Organic Search and cannot be separated. Across a typical site, around 70% of AI-adjacent visits arrive without referrers and bucket into Direct.

This guide walks through every major AI surface, what GA4 records for each, and the custom channel group setup that captures the rest.

The 2026 reference table

| AI surface | Released | Referrer behaviour | GA4 source / medium | Custom channel group catches it? |
| --- | --- | --- | --- | --- |
| ChatGPT (web) | Nov 2022 | Inconsistent — desktop citations append utm_source=chatgpt.com since June 2025 | chatgpt.com / referral (when the UTM survives) or Direct | Yes, when source is present |
| ChatGPT (iOS / Android apps) | May 2023 | Stripped — uses WKWebView (iOS) or Chrome Custom Tabs (Android) | (direct) / (none) | No |
| ChatGPT Search | Oct 2024 | Appends utm_source=chatgpt.com on most desktop citations | chatgpt.com / referral | Yes |
| ChatGPT Atlas (browser) | Oct 2025 | Stripped — opens in sandboxed webview | (direct) / (none); Browser dimension shows Chrome 141 | No |
| Perplexity (web) | Aug 2022 | Clean | perplexity.ai / referral | Yes |
| Perplexity Comet (browser) | Jul 2025 | Clean — passes referrer normally | perplexity.ai / referral | Yes |
| Perplexity (mobile app) | 2023 | Stripped — in-app webview | (direct) / (none) | No |
| Claude (web) | Mar 2023 | Clean | claude.ai / referral | Yes |
| Gemini (gemini.google.com) | Feb 2024 (rebrand) | Clean | gemini.google.com / referral | Yes |
| Gemini in google.com (AI Mode / AIO) | 2024–2026 | Inherits google.com referrer | google / organic — indistinguishable | No |
| Microsoft Copilot (web) | Feb 2023 | Clean | copilot.microsoft.com / referral | Yes |
| Bing Copilot in Edge sidebar | 2023 | Often clean | copilot.microsoft.com / referral or bing.com / referral | Yes |
| Meta AI (web + WhatsApp) | 2024 | Mostly stripped (WhatsApp 100% Direct per SparkToro) | (direct) / (none) | No |
| Grok (x.com) | 2023 | Clean | grok.com / referral or x.ai / referral | Yes |
| DeepSeek | Jan 2025 | Clean | deepseek.com / referral | Yes |
| You.com | 2022 | Clean | you.com / referral | Yes |
| Phind | 2022 | Clean | phind.com / referral | Yes |

If you're scanning quickly, the rule of thumb is: any surface that opens links in its own webview or sandbox strips the referrer. Anything that hands the click off to the system browser keeps it.
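
A quick way to verify any single surface yourself: click one of its citations, then run document.referrer in the landing page's DevTools console. An empty string means the surface stripped the header and the session will log as Direct; a populated value is what GA4 uses to derive the source.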

Why ChatGPT Atlas is the biggest 2026 measurement story

ChatGPT Atlas launched on 21 October 2025, and within a week 27.7% of enterprises had it installed (Cyberhaven Labs, 2025). It matters more than any other AI browser for three architectural reasons, all of which hit GA4:

  1. It strips referrers. When a user clicks a link inside the ChatGPT interface in Atlas, the request arrives at your site with no Referer header. GA4 logs (direct) / (none).
  2. Its user-agent reports as Chrome 141. GA4's Browser dimension cannot distinguish Atlas from standalone Chrome — there's no clean way to filter Atlas traffic out at the report level.
  3. Its Chrome import doesn't carry GA4 cookies. A returning customer who opens your site in Atlas for the first time gets a fresh _ga cookie, a fresh client_id, and counts as a new user. Repeat customers silently inflate your new-user count.
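
To make the cookie point concrete: GA4 derives its client_id from the _ga cookie, so a returning Chrome visitor arrives carrying something like this (a made-up value for illustration):

  _ga=GA1.1.1194072907.1698105600   →   client_id 1194072907.1698105600

Atlas imports none of this, so GA4 mints a fresh value on the first visit and the same person is counted as new.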

If your GA4 new-user numbers drifted up in Q4 2025 with no traffic-volume change to explain it, Atlas adoption is the most likely cause. Add a GA4 annotation on 21 October 2025 so future-you remembers the cause when reviewing year-over-year reports.

What about Perplexity Comet?

Comet (released 9 July 2025) is the cleanest of the new AI browsers from a measurement perspective. It passes perplexity.ai as the referrer normally, so sessions classify as perplexity.ai / referral exactly as web Perplexity does. Cookies are honoured. Returning users don't reset.

The split between Atlas and Comet is the single most important 2026 insight: two AI browsers, two architectures, two completely different outcomes at the analytics layer. If the rest of the AI browser category (Arc, Dia, Brave Leo, Opera Neon) follows Comet's pattern rather than Atlas's, the measurement situation gets better. If they follow Atlas, it gets dramatically worse.

The custom channel group: what to actually paste in

GA4 doesn't surface AI traffic as a default channel — you have to build one. The setup takes 10 minutes and applies retroactively to historical data, so you immediately see prior AI sessions reclassified.

Step 1 — Open Channel Groups

In GA4 go to Admin → Data display → Channel groups. You get one custom channel group on the standard tier (two on 360). Click Copy to create new to duplicate the default group — you cannot edit the default.

Step 2 — Add a new channel called "AI Assistants"

Click Add new channel. Name it AI Assistants. Set the condition to Source matches regex. Paste:
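
A working pattern, assembled from the reference table above (treat it as a starting point and check it against the sources actually appearing in your own Traffic acquisition report before saving):

  chatgpt\.com|perplexity\.ai|claude\.ai|gemini\.google\.com|copilot\.microsoft\.com|grok\.com|x\.ai|deepseek\.com|you\.com|phind\.com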

The goal is the practical April 2026 list: the major platforms actually sending referral traffic to most sites, with no false positives. Avoid the ^.*ai pattern that floats around online; it matches mail, gmail, domain, and anything else containing the letters "ai", and pollutes your AI channel with email and other noise.

Step 3 — Reorder the channel above Referral


This is the step most setups miss. Drag the AI Assistants channel above the Referral channel in the list. GA4 evaluates rules top-down, first-match-wins — if Referral sits above AI Assistants, every AI session gets classified as Referral first and never reaches your custom channel.

Step 4 — Save and verify

Save the group. Wait ~24 hours for processing. Open Reports → Acquisition → Traffic acquisition and switch the dimension dropdown to Session custom channel group. AI Assistants now appears as its own row.

Want to separate Paid AI later?

In a few months you'll likely want to split paid AI placements (Perplexity sponsored answers, ChatGPT promoted citations) into their own channel. The cleanest approach is to require every paid AI placement to land with utm_medium=paid_ai, then add a separate channel rule that matches medium exactly paid_ai and sits above AI Assistants in the rule order.
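
A hypothetical landing URL for a sponsored Perplexity placement would then look like this (the domain and campaign name are placeholders):

  https://www.example.com/landing?utm_source=perplexity.ai&utm_medium=paid_ai&utm_campaign=sponsored_answer

Because channel rules are first-match-wins, the Paid AI rule claims these sessions before the AI Assistants source regex ever evaluates them.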

What about AI Overview impressions?

Google rolled out the AI Overview filter in Search Console Performance reports to most accounts in April 2026. This is the only place to see how often your domain appears as a citation inside an AIO. It shows impressions, clicks, position, and CTR for queries where you appeared in an AI Overview.

Two things to know:

  • AIO CTR is roughly 0.6% on average, against a typical organic CTR of 1.7% on the same queries — useful for quantifying cannibalisation when AIO appears for queries you used to rank for.
  • AI Mode (Google's standalone Gemini-powered search experience) sits under a separate Search Console filter rolled out late 2025. AI Mode produces a full conversational response page; AI Overviews appear inline above search results. They're tracked separately and behave differently.

Search Console gives you the impression side. GA4 only sees the click side — and only when the click survives the referrer chain (Gemini in google.com appears as Organic Search; AIO citations from desktop usually preserve google as the referrer too).

Mobile is where the data goes to die

The single biggest blind spot isn't browsers — it's mobile apps. SparkToro's controlled experiment across 1,113 visits found 100% of visits from TikTok, Slack, Discord, Mastodon, and WhatsApp were misattributed as Direct in GA4 (SparkToro, 2023). The same mechanism applies to:

  • ChatGPT iOS app (WKWebView strips Referer)
  • ChatGPT Android app (Chrome Custom Tabs — same outcome)
  • Perplexity mobile apps
  • Meta AI inside WhatsApp, Instagram, Facebook
  • Any AI assistant accessed inside another app's webview

Your custom channel group will not catch any of this. The only mitigation is to ask creators and partners who share your URL inside AI assistants to use UTMs — which, realistically, almost no one does. Treat your mobile AI traffic as systematically under-reported and track the trend rather than the absolute number.
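
When a partner is willing to tag links, a minimal convention that works with the channel group above is to set utm_source to the assistant's domain (the URL below is a placeholder):

  https://www.example.com/guide?utm_source=chatgpt.com&utm_medium=referral

Campaign parameters take precedence over the referrer in GA4, so the session classifies as chatgpt.com / referral even when the webview strips the header.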

How to validate your AI channel is working

Three checks once the channel group has been live for 48 hours:

  1. Traffic Acquisition cross-check. Filter the standard Traffic acquisition report by Session source matches regex, using the same pattern as the channel. The sessions returned should match your custom channel group's session count to within 2% (a scripted version of this check appears after this list).
  2. Landing page audit. Build an Exploration with Session custom channel group = AI Assistants and Landing page + query string as rows. The pages that show up are your highest-cited content — feed this list into your content strategy.
  3. Engagement-rate comparison. AI traffic typically shows 23% higher engagement than non-AI traffic, with sessions lasting 41% longer (Adobe, Q2 2025). If your AI channel's engagement is worse than your organic baseline, the regex is catching false positives — review and tighten.
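
If you'd rather script the first check than click through reports, the GA4 Data API can reproduce it. A minimal sketch in Python, assuming the official google-analytics-data client, a service account supplied via GOOGLE_APPLICATION_CREDENTIALS, and a placeholder property ID:

  # pip install google-analytics-data
  import re
  from google.analytics.data_v1beta import BetaAnalyticsDataClient
  from google.analytics.data_v1beta.types import (
      DateRange, Dimension, Metric, RunReportRequest,
  )

  # Same source pattern as the channel group rule in Step 2
  AI_SOURCES = re.compile(
      r"chatgpt\.com|perplexity\.ai|claude\.ai|gemini\.google\.com|"
      r"copilot\.microsoft\.com|grok\.com|x\.ai|deepseek\.com|you\.com|phind\.com"
  )

  client = BetaAnalyticsDataClient()
  report = client.run_report(RunReportRequest(
      property="properties/123456789",  # placeholder: your numeric property ID
      dimensions=[Dimension(name="sessionSource")],
      metrics=[Metric(name="sessions")],
      date_ranges=[DateRange(start_date="28daysAgo", end_date="yesterday")],
  ))

  # Sum sessions whose source matches the AI pattern
  ai_sessions = sum(
      int(row.metric_values[0].value)
      for row in report.rows
      if AI_SOURCES.search(row.dimension_values[0].value)
  )
  print(f"AI-source sessions, last 28 days: {ai_sessions}")

Compare the printed total against the AI Assistants row in your custom channel group report; a gap beyond a couple of percent usually means the rule order or the pattern differs between the two.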

How AI traffic compares to other channels (use this for stakeholder reporting)

When you present AI traffic to stakeholders, frame it with these numbers from 2025–26 industry data:

  • AI referral traffic is around 0.19% of total traffic on most B2C sites today, but growing 50%+ year-over-year (Rankshift, January 2026).
  • ChatGPT alone accounts for roughly 50% of all AI-referred traffic across measured sites (Ahrefs, 2025).
  • AI visitors convert at 15.9% on average vs 1.76% for Google organic search (Seer Interactive, 2025) — AI traffic is small but disproportionately valuable.
  • Gemini referral traffic grew 388% year-over-year in late 2025 (Similarweb, reported by Digiday).

The strategic takeaway: AI traffic is a small percentage today and a likely-meaningful percentage by 2027. The teams who build measurement now will have 12–24 months of clean baseline data before competitors notice it matters.

FAQ: ChatGPT, Atlas, Perplexity, Comet, Claude: How Each Shows Up in GA4 (2026 Reference)

What should a team validate first when AI traffic attribution looks wrong in GA4?

Reproduce the problem in the live implementation, isolate whether it is scoped to one report or flow, and compare it against at least one secondary source before changing the setup.

How do I know whether the fix actually worked?

You need before-and-after evidence in the browser and in the downstream report. A clean-looking dashboard without validation is not enough.

When should this become a full GA4 audit instead of a quick fix?

If the issue touches attribution, consent, revenue, campaign quality, or data trust for more than one workflow, it is usually safer to audit the surrounding implementation than patch only the visible symptom.


GA4 Audits Team


Analytics Engineering

Specialising in GA4 architecture, consent mode implementation, and multi-layer audit frameworks.
