
Why Most GA4 Implementations Miss the Point Before a Single Tag is Fired

Most GA4 implementations are technically live but strategically broken.

TL;DR: GA4 is an event-based analytics system. Most teams implementing it are still thinking in sessions and pageviews. That mismatch produces properties that fire events, collect data and report numbers but can’t answer the business questions that actually matter. This applies to any event-based tool, including Adobe Analytics and Amplitude. It’s just more common in GA4 implementations because the shift from Universal Analytics was rushed, widely misunderstood and rarely revisited.


The migration is done. The tags are firing. The GA4 implementation is live and collecting data. As far as most teams are concerned, the implementation is finished.

It isn’t.

What most GA4 implementations have is the technical scaffolding of a working analytics setup without the strategic thinking that makes it useful. The data is flowing, but the questions it can answer are the wrong ones. And because everything looks fine on the surface (events are firing, reports are populating, dashboards exist) nobody goes looking for what’s missing.

The mental model that didn’t migrate

Universal Analytics trained a generation of analysts and marketers to think in sessions, pageviews and bounce rates. Those were the native units of the platform. You tracked pages, you measured how many people visited them, and bounce rate told you something rough about engagement.

A GA4 implementation is built on an entirely different model. Everything is an event. A pageview is an event. A scroll is an event. An add to cart, a form submission, a video play, a purchase. All events. The platform is designed to capture user behaviour as a sequence of actions, not as a count of page visits.

That shift requires a fundamentally different way of thinking about what you’re measuring before you touch a tag manager. Most teams skipped that step.

Instead they migrated. They moved the old tracking across, turned on enhanced measurement to pick up some auto-collected events, added a handful of custom events where the old ones didn’t transfer cleanly, and called it done. The result is a property built on a Universal Analytics mental model running inside a GA4 container. It collects data. It just doesn’t collect the right data, in the right structure, to answer anything useful.

Events without parameters are half an answer

The clearest way to see this problem is in how events are configured.

Take a common e-commerce event: add_to_cart. If that event fires without parameters, the only business question it answers is how many add-to-cart actions happened. That’s it.

With parameters (product name, category, price, SKU) the same event becomes genuinely useful. Now you can see which product categories are most likely to be added to cart. You can track whether those products reach checkout. You can follow them through to purchase and measure where that journey breaks down. You can connect upstream behaviour to downstream revenue.

Without parameters, the event fires and the data disappears into a number. With parameters, it becomes part of a thread you can actually pull.
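To make the difference concrete, here is a minimal sketch using a Tag Manager-style dataLayer push. The parameter names follow GA4’s recommended e-commerce schema; the product values are illustrative, not from any real property.

```javascript
// Simplified stand-in for window.dataLayer; on a real page this array
// is created by the Google tag snippet and read by Tag Manager.
const dataLayer = [];

// Bare event: answers only "how many add-to-cart actions happened?"
dataLayer.push({ event: 'add_to_cart' });

// Parameterised event: the same action, now carrying product, category,
// price and SKU, so it can be joined to checkout and purchase downstream.
dataLayer.push({
  event: 'add_to_cart',
  ecommerce: {
    currency: 'GBP',
    value: 49.99,
    items: [{
      item_id: 'SKU-1042',          // illustrative SKU
      item_name: 'Example Product', // illustrative product
      item_category: 'Accessories',
      price: 49.99,
      quantity: 1
    }]
  }
});
```

Both pushes count as one add-to-cart. Only the second one lets you ask which categories convert.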

This isn’t a minor gap. It’s the difference between an analytics property that records activity and one that generates insight. And the reason it happens is that teams focus on whether the event fires rather than what it needs to say when it does.

The same logic applies across the whole implementation. Every event that fires without the right parameters is a question you’ll never be able to ask. Most properties are full of them.

Conversion events are not a highlight reel

The second problem is simpler to describe and harder to fix culturally.

Marking an event as a conversion in your GA4 implementation should mean one thing: this action represents a tangible business outcome. A purchase. A qualified lead submission. A booking. Something where a real thing happened that the business cares about.

What I see regularly is conversion lists that include pageviews, video plays, button clicks, scroll depth thresholds and PDF downloads. The logic is usually that these things indicate interest, so they feel like wins worth counting.

They’re not conversions. They’re engagement signals, and there’s a meaningful difference.

When you inflate the conversion list, a few things break. Your funnel reporting becomes noise. If fifteen different actions count as conversions, you lose the ability to read the actual funnel clearly. If you’ve assigned monetary values to those actions, the reported value of the website gets inflated beyond anything the business can sense-check against real revenue. And if you’re running Google Ads, the conversion signals feeding into Smart Bidding are polluted. The algorithm optimises toward whatever you’ve told it matters. If that includes a scroll to 50% of the page, it will find people who scroll, not people who buy.
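One way to surface that pollution is a simple audit: list what’s currently marked as a conversion, list what the business would actually recognise as an outcome, and diff the two. The event names below are hypothetical, chosen to mirror the examples above.

```javascript
// Events currently marked as conversions in a hypothetical GA4 property.
const markedAsConversions = [
  'purchase', 'generate_lead', 'video_play', 'scroll_50', 'file_download'
];

// The only actions the business would recognise as tangible outcomes.
const realOutcomes = new Set(['purchase', 'generate_lead', 'book_appointment']);

// Everything left over is an engagement signal inflating the funnel
// and feeding noise into Smart Bidding.
const pollution = markedAsConversions.filter(e => !realOutcomes.has(e));
// pollution → ['video_play', 'scroll_50', 'file_download']
```

The audit takes minutes. Agreeing internally on what belongs in `realOutcomes` is the part that takes work.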

None of this is obvious from inside your GA4 implementation. The reports look full. The conversion counts are high. It feels like the data is rich. The signal that something is wrong only arrives later, when the numbers don’t connect to anything real.

The planning step that most GA4 implementations skip

GA4, and any event-based analytics platform, rewards teams that think in tiers before they build anything.

Not all events are equal. Some events are revenue events: purchase, subscription activated, contract signed. Some are intent events: pricing page viewed, add to cart, request a quote. Some are engagement events: video play, scroll depth, content download. Some are retention signals: repeat login, feature used, return visit.

Each tier tells you something different about the user journey. Each requires different parameters to be useful. And each should be treated differently in reporting, in conversion configuration and in how you interpret the data it produces.

Building this structure after the fact is painful. Building it before you touch a tag means the implementation has somewhere to go. The events have a purpose. The parameters have a logic. The conversion list reflects actual outcomes.
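A tiered measurement plan can be as simple as a small data structure agreed before any tagging work starts: each tier names its events, the parameters that make them answerable, and how the tier should be treated in conversion configuration. This is a sketch; the event and parameter names are illustrative, not a standard.

```javascript
// Illustrative measurement plan: tiers map to events and the
// parameters each event needs to be useful in reporting.
const measurementPlan = {
  revenue: {
    markAsConversion: true, // the only tier that belongs in the conversion list
    events: { purchase: ['transaction_id', 'value', 'currency', 'items'] }
  },
  intent: {
    markAsConversion: false, // report as funnel steps, not outcomes
    events: {
      add_to_cart:  ['items', 'value'],
      view_pricing: ['plan_name']
    }
  },
  engagement: {
    markAsConversion: false,
    events: {
      video_play:    ['video_title', 'video_percent'],
      file_download: ['file_name', 'file_extension']
    }
  },
  retention: {
    markAsConversion: false,
    events: { login: ['method'] }
  }
};
```

A document like this gives the tag manager work somewhere to go, and it doubles as the spec you audit the live property against later.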

Most teams skip this step because it feels like planning work rather than analytics work. It is planning work. That’s exactly why it matters.

This isn’t just a GA4 implementation problem

It’s worth being clear: the thinking in this post applies to any event-based analytics platform. Adobe Analytics, Amplitude, Mixpanel. The same trap exists in all of them. Plan poorly, implement without a measurement framework and you’ll end up with a property full of events that fire but can’t answer the questions that would change what you do.

GA4 implementations are where this problem is most widespread because the migration from Universal Analytics was done at scale, under deadline pressure and by teams who had no prior experience with event-based thinking. The shift was technically mandatory and strategically optional, so most teams handled the technical part and left the strategic part for later.

Later usually doesn’t arrive.

67% of data professionals say they don’t completely trust their organisation’s data for decision-making. In my experience, a significant part of that distrust is earned. Not because the tools are unreliable, but because the implementations feeding them weren’t built to produce data worth trusting.

The good news is that this is a solvable problem. An audit of what you’re actually capturing versus what you need to capture, a measurement plan that maps events to business questions and a conversion list that reflects real outcomes. None of that requires rebuilding from scratch. It requires stopping, thinking and being honest about whether your current setup can answer anything that would change a decision.

Most can’t. Most could, with the right work applied to the right places.

Recognise this pattern in your data?

If your GA4 implementation is live but you’re not confident it’s telling you what you need to know, that’s where I start.
