How does each AI assistant appear in GA4?
In 2026, AI traffic in GA4 splits into three buckets. Browsers and assistants that pass clean referrers (Perplexity web, Perplexity Comet, Claude.ai, Copilot, Gemini standalone) appear with a recognisable source / medium like perplexity.ai / referral. Surfaces that strip the referrer (ChatGPT Atlas, ChatGPT iOS/Android apps, in-app webviews) land in (direct) / (none). Surfaces that mimic existing channels (Google AI Overviews, Gemini inside google.com) are reported as Organic Search and cannot be separated. Across a typical site, around 70% of AI-adjacent visits arrive without referrers and bucket into Direct.
This guide walks through every major AI surface, explains what GA4 records for each, and shows how to build the custom channel group that captures the rest.
The 2026 reference table
| AI surface | Released | Referrer behaviour | GA4 source / medium | Custom channel group catches it? |
|---|---|---|---|---|
| ChatGPT (web) | Nov 2022 | Inconsistent — desktop citations append utm_source=chatgpt.com since June 2025 | chatgpt.com / referral (when UTM survives) or Direct | Yes, when source is present |
| ChatGPT (iOS / Android apps) | May 2023 | Stripped — uses WKWebView (iOS) or Chrome Custom Tabs (Android) | (direct) / (none) | No |
| ChatGPT Search | Oct 2024 | Appends utm_source=chatgpt.com on most desktop citations | chatgpt.com / referral | Yes |
| ChatGPT Atlas (browser) | Oct 2025 | Stripped — opens in sandboxed webview | (direct) / (none) (Browser dimension shows as Chrome 141) | No |
| Perplexity (web) | Aug 2022 | Clean | perplexity.ai / referral | Yes |
| Perplexity Comet (browser) | Jul 2025 | Clean — passes referrer normally | perplexity.ai / referral | Yes |
| Perplexity (mobile app) | 2023 | Stripped — in-app webview | (direct) / (none) | No |
| Claude (web) | Mar 2023 | Clean | claude.ai / referral | Yes |
| Gemini (gemini.google.com) | Feb 2024 (rebrand) | Clean | gemini.google.com / referral | Yes |
| Gemini in google.com (AI Mode / AIO) | 2024–2026 | Inherits google.com's referrer | google / organic — indistinguishable | No |
| Microsoft Copilot (web) | Feb 2023 | Clean | copilot.microsoft.com / referral | Yes |
| Bing Copilot in Edge sidebar | 2023 | Often clean | copilot.microsoft.com / referral or bing.com / referral | Yes |
| Meta AI (web + WhatsApp) | 2024 | Mostly stripped (WhatsApp 100% Direct per SparkToro) | (direct) / (none) | No |
| Grok (x.com) | 2023 | Clean | grok.com / referral or x.ai / referral | Yes |
| DeepSeek | Jan 2025 | Clean | deepseek.com / referral | Yes |
| You.com | 2022 | Clean | you.com / referral | Yes |
| Phind | 2022 | Clean | phind.com / referral | Yes |
If you're scanning quickly, the rule of thumb is: any surface that opens links in its own webview or sandbox strips the referrer. Anything that hands the click off to the system browser keeps it.
Why ChatGPT Atlas is the biggest 2026 measurement story
ChatGPT Atlas launched 21 October 2025 and within a week 27.7% of enterprises had it installed (Cyberhaven Labs, 2025). It matters more than any other AI browser for three architectural reasons that all hit GA4:
- It strips referrers. When a user clicks a link inside the ChatGPT interface in Atlas, the request arrives at your site with no Referer header. GA4 logs (direct) / (none).
- Its user-agent reports as Chrome 141. GA4's Browser dimension cannot distinguish Atlas from standalone Chrome — there's no clean way to filter Atlas traffic out at the report level.
- Its Chrome import doesn't carry GA4 cookies. A returning customer who opens your site in Atlas for the first time gets a fresh _ga cookie, a fresh client_id, and counts as a new user. Repeat customers silently inflate your new-user count.
If your GA4 new-user numbers drifted up in Q4 2025 with no traffic-volume change to explain it, Atlas adoption is the most likely cause. Add a GA4 annotation on 21 October 2025 so future-you remembers the cause when reviewing year-over-year reports.
What about Perplexity Comet?
Comet (released 9 July 2025) is the cleanest of the new AI browsers from a measurement perspective. It passes perplexity.ai as the referrer normally, so sessions classify as perplexity.ai / referral exactly as web Perplexity does. Cookies are honoured. Returning users don't reset.
The split between Atlas and Comet is the single most important 2026 insight: two AI browsers, two architectures, two completely different outcomes at the analytics layer. If the rest of the AI browser category (Arc, Dia, Brave Leo, Opera Neon) follows Comet's pattern rather than Atlas's, the measurement situation gets better. If they follow Atlas, it gets dramatically worse.
The custom channel group: what to actually paste in
GA4 doesn't surface AI traffic as a default channel — you have to build one. The setup takes 10 minutes and applies retroactively to historical data, so you immediately see prior AI sessions reclassified.
Step 1 — Open Channel Groups
In GA4 go to Admin → Data display → Channel groups. You get one custom channel group on the standard tier (two on 360). Click Copy to create new to duplicate the default group — you cannot edit the default.
Step 2 — Add a new channel called "AI Assistants"
Click Add new channel. Name it AI Assistants. Set the condition to Source matches regex. Paste:
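A pattern along these lines, assembled from the domains in the reference table, is the kind of thing to paste — treat it as an illustrative sketch, not a canonical list, and prune it to the surfaces that send you traffic:

```python
import re

# Illustrative source regex built from the reference table's domains.
# Escape the dots so "perplexity.ai" doesn't also match lookalike strings.
AI_SOURCES = (
    r"^(chatgpt\.com|perplexity\.ai|claude\.ai|gemini\.google\.com|"
    r"copilot\.microsoft\.com|grok\.com|x\.ai|deepseek\.com|"
    r"you\.com|phind\.com)$"
)

for source in ["chatgpt.com", "perplexity.ai", "mail.google.com", "gmail.com"]:
    matched = bool(re.match(AI_SOURCES, source))
    print(f"{source}: {'AI Assistants' if matched else 'no match'}")
```

Anchoring with `^` and `$` is what keeps email domains like mail.google.com and gmail.com out of the channel.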
This is the practical April 2026 list — the major platforms actually sending referral traffic to most sites, with no false positives. Avoid the ^.*ai pattern that floats around online; it matches mail, gmail, domain, and anything else containing the letters "ai", and pollutes your AI channel with email and noise.
Step 3 — Reorder the channel above Referral
This is the step most setups miss. Drag the AI Assistants channel above the Referral channel in the list. GA4 evaluates rules top-down, first-match-wins — if Referral sits above AI Assistants, every AI session gets classified as Referral first and never reaches your custom channel.
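A toy model makes the ordering problem concrete. Rule names and structure here are illustrative — GA4's actual evaluation is internal — but the top-down, first-match-wins behaviour is what the model demonstrates:

```python
import re

# Toy model of GA4's top-down, first-match-wins channel evaluation.

def classify(source, medium, rules):
    for name, test in rules:          # evaluated top-down
        if test(source, medium):
            return name               # first match wins
    return "Unassigned"

ai_regex = re.compile(r"chatgpt\.com|perplexity\.ai|claude\.ai")

wrong_order = [
    ("Referral", lambda s, m: m == "referral"),
    ("AI Assistants", lambda s, m: bool(ai_regex.search(s))),
]
right_order = wrong_order[::-1]       # AI Assistants dragged above Referral

print(classify("perplexity.ai", "referral", wrong_order))  # Referral
print(classify("perplexity.ai", "referral", right_order))  # AI Assistants
```

Same session, same rules — only the order differs, and the AI session either reaches your custom channel or silently doesn't.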
Step 4 — Save and verify
Save the group. Wait ~24 hours for processing. Open Reports → Acquisition → Traffic acquisition and switch the dimension dropdown to Session custom channel group. AI Assistants now appears as its own row.
Want to separate Paid AI later?
In a few months you'll likely want to split paid AI placements (Perplexity sponsored answers, ChatGPT promoted citations) into their own channel. The cleanest way is to require any paid AI to land with utm_medium=paid_ai — then add a separate channel rule that fires *before* AI Assistants matching medium = paid_ai.
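In the same toy-model terms, the paid rule simply sits first in the list. The paid_ai convention follows the text; the rule structure is illustrative:

```python
# Illustrative ordered rule list: "Paid AI" fires before "AI Assistants",
# which fires before the generic "Referral" bucket.
rules = [
    ("Paid AI",       lambda source, medium: medium == "paid_ai"),
    ("AI Assistants", lambda source, medium: source in {"chatgpt.com", "perplexity.ai"}),
    ("Referral",      lambda source, medium: medium == "referral"),
]

def channel_for(source, medium):
    return next((name for name, test in rules if test(source, medium)), "Unassigned")

print(channel_for("perplexity.ai", "paid_ai"))   # Paid AI
print(channel_for("perplexity.ai", "referral"))  # AI Assistants
```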
What about AI Overview impressions?
Google rolled out the AI Overview filter in Search Console Performance reports to most accounts in April 2026. This is the only place to see how often your domain appears as a citation inside an AIO. It shows impressions, clicks, position, and CTR for queries where you appeared in an AI Overview.
Two things to know:
- AIO CTR is roughly 0.6% on average, against a typical organic CTR of 1.7% on the same queries — useful for quantifying cannibalisation when AIO appears for queries you used to rank for.
- AI Mode (Google's standalone Gemini-powered search experience) sits under a separate Search Console filter rolled out late 2025. AI Mode produces a full conversational response page; AI Overviews appear inline above search results. They're tracked separately and behave differently.
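The CTR gap gives a quick back-of-envelope cannibalisation estimate. The CTRs are the averages quoted above; the impression figure is a made-up example:

```python
# Rough click-loss estimate when an AI Overview replaces a standard
# organic listing, using the quoted average CTRs (0.6% AIO vs 1.7% organic).
AIO_CTR, ORGANIC_CTR = 0.006, 0.017

def estimated_click_loss(monthly_impressions: int) -> float:
    """Clicks lost per month, assuming AIO CTR replaces organic CTR."""
    return monthly_impressions * (ORGANIC_CTR - AIO_CTR)

print(round(estimated_click_loss(100_000)))  # 1100 clicks/month
```

Run it against your own Search Console impression counts per query cluster to see where AIO exposure is costing the most clicks.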
Search Console gives you the impression side. GA4 only sees the click side — and only when the click survives the referrer chain (Gemini in google.com appears as Organic Search; AIO citations from desktop usually preserve google as the referrer too).
Mobile is where the data goes to die
The single biggest blind spot isn't browsers — it's mobile apps. SparkToro's controlled experiment across 1,113 visits found 100% of visits from TikTok, Slack, Discord, Mastodon, and WhatsApp were misattributed as Direct in GA4 (SparkToro, 2023). The same mechanism applies to:
- ChatGPT iOS app (WKWebView strips Referer)
- ChatGPT Android app (Chrome Custom Tabs — same outcome)
- Perplexity mobile apps
- Meta AI inside WhatsApp, Instagram, Facebook
- Any AI assistant accessed inside another app's webview
Your custom channel group will not catch any of this. The only mitigation is to ask creators and partners who share your URL inside AI assistants to use UTMs — which, realistically, almost no one does. Treat your mobile AI traffic as systematically under-reported and track the trend rather than the absolute number.
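If you do get partners to tag, a small helper makes the convention easy to hand over. The parameter values here are an illustrative convention, not a GA4 requirement:

```python
from urllib.parse import urlencode, urlparse, urlunparse

# Sketch of a UTM-tagging helper for URLs shared inside AI assistants —
# the only mitigation when a webview strips the Referer header.

def tag_for_ai_share(url: str, assistant: str) -> str:
    parts = urlparse(url)
    params = urlencode({"utm_source": assistant, "utm_medium": "ai_referral"})
    query = f"{parts.query}&{params}" if parts.query else params
    return urlunparse(parts._replace(query=query))

print(tag_for_ai_share("https://example.com/guide", "chatgpt.com"))
# https://example.com/guide?utm_source=chatgpt.com&utm_medium=ai_referral
```

Because UTMs override the missing referrer, these sessions classify by source/medium even when they arrive from an app webview.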
How to validate your AI channel is working
Three checks once the channel group has been live for 48 hours:
- Traffic Acquisition cross-check. Filter standard Traffic Acquisition by Session source matches regex using the same pattern. Sessions returned should match your custom channel group's session count to within 2%.
- Landing page audit. Build an Exploration with Session custom channel group = AI Assistants and Landing page + query string as rows. The pages that show up are your highest-cited content — feed this list into your content strategy.
- Engagement-rate comparison. AI traffic typically engages 23% more than non-AI traffic and stays 41% longer (Adobe, Q2 2025). If your AI channel shows engagement worse than your organic baseline, the regex is catching false positives — review and tighten.
How AI traffic compares to other channels (use this for stakeholder reporting)
When you present AI traffic to stakeholders, frame it with these numbers from 2025–26 industry data:
- AI referral traffic is around 0.19% of total traffic on most B2C sites today, but growing 50%+ year-over-year (Rankshift, January 2026).
- ChatGPT alone accounts for roughly 50% of all AI-referred traffic across measured sites (Ahrefs, 2025).
- AI visitors convert at 15.9% on average vs 1.76% for Google organic search (Seer Interactive, 2025) — AI traffic is small but disproportionately valuable.
- Gemini referral traffic grew 388% year-over-year in late 2025 (Similarweb, reported by Digiday).
The strategic takeaway: AI traffic is a small percentage today and a likely-meaningful percentage by 2027. The teams who build measurement now will have 12–24 months of clean baseline data before competitors notice it matters.