
UTM Cardinality Crisis: When 'Other' Eats Your Reports (2026)

Intermediate

Why does GA4 say (other) in my reports?

GA4 collapses dimension values into "(other)" when a dimension's unique value count exceeds 50,000 in a single day at the property level. Once the limit is reached, every additional unique value rolls into "(other)" for that day's report.

The most common cause is high-cardinality custom dimensions (user IDs, session IDs, full URLs as parameters), followed by UTM parameters with timestamps or random IDs encoded into them, and free-text user inputs sent as event parameters. Once "(other)" appears, you cannot retrieve the granular values from standard reports — only from BigQuery exports if configured.

This is the silent killer of GA4 reports. The fix is preventing cardinality at source.

The 50,000-value threshold

GA4's cardinality limit applies per dimension, per day, per property:

  • Threshold: 50,000 unique values per dimension per day
  • Behaviour above threshold: all values beyond the first 50,000 collapse into "(other)"
  • Reset: daily at the property's reporting time zone
  • Affected reports: all standard GA4 reports, Explorations, and Hub reports
  • NOT affected: BigQuery exports (preserve full cardinality)

The threshold applies to dimensions in standard reports. Higher-cardinality values still exist in the underlying event data — they're just not surfaced in the UI once the threshold is crossed.

For Analytics 360 (paid GA4), the threshold is higher — Google doesn't publish the exact number but it's roughly 5x standard.

What dimensions hit the threshold

The cardinality offenders, in approximate order of frequency:

1. Page path with query strings. Every unique URL counts as a unique value. A site with thousands of products and heavy use of query parameters can exceed 50,000 unique values daily. Solution: strip query strings from the page_path event parameter (use page_location for the full URL where needed).

2. Custom dimensions storing IDs. User IDs, transaction IDs, session IDs as custom dimensions. Each user generates a new value per session. A site with 50,000+ daily users automatically hits the limit.

3. Search query parameters. Site search queries collected as a custom dimension. Free-text inputs are inherently high-cardinality. A search-heavy site can exceed 50,000 unique queries per day.

4. UTM parameters with random IDs. Marketing automation platforms sometimes append timestamps or random IDs to UTM values. utm_campaign=spring-2026-{{timestamp}} generates a new campaign value per session.

5. Hostname as dimension when serving multiple subdomains. A multi-tenant SaaS with thousands of customer subdomains hits the limit on the hostname dimension.

6. Event parameters from custom code. Engineers passing dynamic values as event parameters without considering cardinality. The classic: event_param: full_referrer_url capturing every unique referrer.
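Offender 4 above can often be neutralised before the hit is sent. A minimal sketch, assuming the templated suffix is a trailing timestamp or random ID — the function name and regexes are illustrative, e.g. for a GTM custom JavaScript variable:

```javascript
// Hypothetical sketch: normalise a utm_campaign value before it reaches GA4,
// stripping auto-appended suffixes like "spring-2026-1714032000".
function normaliseCampaign(value) {
  if (!value) return value;
  return value
    .replace(/-\d{9,}$/, '')         // trailing unix-style timestamps
    .replace(/-[0-9a-f]{8,}$/i, ''); // trailing hex-style random IDs
}

normaliseCampaign('spring-2026-1714032000'); // → 'spring-2026'
normaliseCampaign('spring-sale');            // unchanged
```

The suffix patterns depend on what your marketing automation platform actually appends, so inspect a sample of real UTM values before settling on the rules.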

How to detect cardinality issues

Three diagnostic methods:

1. Standard reports — look for "(other)" in any dimension. Open Reports → Engagement → Pages and screens (or any report). If "(other)" appears as a row, that dimension has hit cardinality. Critical: "(other)" with a high session count means most of your data is collapsed and reports are unreliable.

2. Cardinality warning in GA4 Admin. Under Admin → Data display, properties hitting limits show a cardinality-limits warning that lists the affected dimensions.


3. BigQuery query. For properties with BigQuery export, run a query counting distinct values per dimension per day:
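A minimal sketch of such a query, assuming the standard GA4 BigQuery export schema (`events_*` daily tables with an `event_params` array); the project, dataset, and date range below are placeholders:

```javascript
// Hypothetical sketch: build a per-day distinct-value count for one event
// parameter in a GA4 BigQuery export, e.g. for the @google-cloud/bigquery client.
function cardinalityQuery(table, paramKey, startDate, endDate) {
  return `
SELECT
  event_date,
  COUNT(DISTINCT (
    SELECT value.string_value
    FROM UNNEST(event_params)
    WHERE key = '${paramKey}'
  )) AS distinct_values
FROM \`${table}\`
WHERE _TABLE_SUFFIX BETWEEN '${startDate}' AND '${endDate}'
GROUP BY event_date
ORDER BY distinct_values DESC`;
}

// Placeholder table path — substitute your own export dataset.
const sql = cardinalityQuery(
  'my-project.analytics_123456789.events_*',
  'page_location', '20260101', '20260107');
```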

Replace page_location with whatever dimension you suspect. If the result is over 50,000 distinct values for any single day, that's the offender.

The four practical fixes

Fix 1: Strip query strings at the source

The most common cause is page paths with query strings. Configure GA4 to strip them:

In your GA4 implementation (gtag or GTM Configuration), set the page_path parameter to exclude query strings. This is typically done in the page_view event configuration — strip the query string from the URL before passing as page_path. Keep the full URL in page_location for cases where you need it.
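As a minimal sketch, assuming a standard gtag.js setup where automatic page_view collection has been disabled so this manual event replaces it (the helper name and guard are illustrative):

```javascript
// Illustrative helper: return a URL's path with the query string removed.
function pathWithoutQuery(url) {
  return new URL(url).pathname;
}

// Hypothetical wiring — assumes gtag.js is loaded and send_page_view is off.
if (typeof gtag === 'function') {
  gtag('event', 'page_view', {
    page_path: pathWithoutQuery(window.location.href), // query string stripped
    page_location: window.location.href,               // full URL preserved
  });
}
```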

Fix 2: Hash high-cardinality custom dimensions

If you must capture user IDs or similar high-cardinality data, hash them into bucketed values:

  • User ID → first 2 characters of the user ID (at most 1,296 buckets for alphanumeric IDs)
  • Transaction ID → Transaction ID prefix (date-based bucketing)
  • Search query → Search query length range (1-3 chars, 4-7, 8+) or first word

This preserves analytical utility while keeping cardinality below the threshold.
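A minimal sketch of two of the bucketing schemes above (function names are illustrative):

```javascript
// User ID → first 2 characters, lowercased to merge case variants.
function userIdBucket(userId) {
  return String(userId).slice(0, 2).toLowerCase();
}

// Search query → length range, one of three low-cardinality buckets.
function queryLengthBucket(query) {
  const len = query.trim().length;
  if (len <= 3) return '1-3';
  if (len <= 7) return '4-7';
  return '8+';
}

userIdBucket('AB12XYZ');          // → 'ab'
queryLengthBucket('shoe');        // → '4-7'
queryLengthBucket('running shoes'); // → '8+'
```

Send the bucketed value as the custom dimension instead of the raw one; keep the raw value in BigQuery (via the export) if you need it later.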

Fix 3: Don't send what you don't need

Audit your custom dimensions. Most properties have 5-10 active custom dimensions; many have unused ones still firing. Each unused dimension is cardinality risk for nothing. Disable dimensions you're not using in current reports.

Fix 4: Use BigQuery for high-cardinality analysis

For analyses that legitimately need high cardinality (full search query analysis, per-user analysis, full URL analysis), do them in BigQuery rather than GA4 standard reports. BigQuery preserves full granularity. Standard reports show aggregated views; BigQuery shows the detail.
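For example, a full search-term frequency analysis — exactly the kind of report that collapses into "(other)" in the UI — is straightforward against the export. A sketch, assuming enhanced measurement's view_search_results event and a placeholder table path:

```javascript
// Hypothetical sketch: top search terms from the raw export, untouched by
// the reporting-UI cardinality limit. Table path and dates are placeholders.
const topSearchTermsSql = `
SELECT
  (SELECT value.string_value FROM UNNEST(event_params)
   WHERE key = 'search_term') AS search_term,
  COUNT(*) AS searches
FROM \`my-project.analytics_123456789.events_*\`
WHERE event_name = 'view_search_results'
  AND _TABLE_SUFFIX BETWEEN '20260101' AND '20260131'
GROUP BY search_term
ORDER BY searches DESC`;
```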

Once "(other)" appears: recovery

Bad news: once cardinality has hit the limit, you cannot retrieve the granular values from standard reports for that day. Recovery options:

1. BigQuery export (if configured). Granular values are preserved. Build your reporting layer in BigQuery → Looker Studio rather than GA4 standard reports.

2. Apply the fix forward. Once you've fixed the cardinality cause (strip query strings, hash IDs, etc.), future days won't show "(other)". Historical data stays affected.

3. Upgrade to 360. Analytics 360 has a higher cardinality limit. Requires a contract — typical cost ~£40,000+/year — so this is enterprise-only.

The cheapest fix is preventing cardinality at source. The expensive fix is rebuilding your reporting in BigQuery. The most expensive fix is upgrading to 360.

FAQ: UTM Cardinality Crisis: When 'Other' Eats Your Reports

What should a team validate first when "(other)" rows start appearing in reports?

Reproduce the problem in the live implementation, isolate whether it is scoped to one report or flow, and compare it against at least one secondary source before changing the setup.

How do I know whether the fix actually worked?

You need before-and-after evidence in the browser and in the downstream report. A clean-looking dashboard without validation is not enough.

When should this become a full GA4 audit instead of a quick fix?

If the issue touches attribution, consent, revenue, campaign quality, or data trust for more than one workflow, it is usually safer to audit the surrounding implementation than patch only the visible symptom.

Check for cardinality collapse before campaign reporting gets blamed for the wrong issue

Run a free GA4 audit to spot attribution breaks, UTM governance issues, self-referrals, and source/medium loss fast.

These findings come from auditing thousands of GA4 properties. See how your property compares

GA4 Audits Team


Analytics Engineering

Specialising in GA4 architecture, consent mode implementation, and multi-layer audit frameworks.
