Apples to Apples and GA4

What do we mean by apples to apples in the reporting sense? It means making sure we only compare (or combine) two of the same thing, which is particularly important when building reports and dashboards. We have to be certain of what the metrics and dimensions we use actually mean and not just assume their meaning from the name or label they are given. These names represent variables that can change over time, be altered in some way, or share a similar or even identical name with something that is, in practice, very different.

Often, your data is not inaccurate. It’s accurately telling you something other than what you think it’s telling you and leading you or your audience to draw the wrong conclusion.

As we read in Chapter 6 – Plan, Plan, Plan, we need our own dimension and metric libraries to ensure that we don’t get lost in the actual definitions, and as we saw in Chapter 1 – Climbing the AMC, we need data integrity checks and active reconciliation (detailed, like-for-like comparison of metrics) to be part of our audits. The only sound way to complete a proper data integrity check (apart from regression testing against a previous benchmark) is to compare data sources against each other, and that can often mean comparing MarTech tools and reports, such as e-commerce reporting, with back-end financial systems and accountancy programs – a comparison that regularly throws up unusual results. It’s at this point that we can lose confidence and start worrying that our data might be F**ked.

If we compare the Sessions we see in our web analytics tool with the Visits reported by our website (often built into the CMS back-end), we will see that these are completely different. That doesn’t necessarily mean one of the sources is wrong. They could both be correctly measuring two different, similar-sounding metrics, each counted in its own way.
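To make this concrete, here is a minimal sketch using made-up hit data (not output from any real tool) of how the same raw log produces very different numbers depending on whether you count page views, sessions, or users – the kind of gap that opens up when two systems count at different scopes.

```python
# Illustrative only: the same raw hit log yields very different totals
# depending on what you choose to count.
hits = [
    # (user_id, session_id, page)
    ("user_a", "s1", "/home"),
    ("user_a", "s1", "/pricing"),
    ("user_a", "s2", "/home"),
    ("user_b", "s3", "/home"),
    ("user_b", "s3", "/blog"),
]

page_views = len(hits)                        # every row is a page view
sessions = len({(u, s) for u, s, _ in hits})  # distinct user/session pairs
users = len({u for u, _, _ in hits})          # distinct users

print(page_views, sessions, users)  # 5 page views, 3 sessions, 2 users
```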

Often 100% accuracy is not possible – in Chapter 4 – Cookie Apocalypse, we explore why, due to the limitations of our tools, we should only expect some metrics to be around 95% accurate anyway. The Revenue metric we measure in Google Analytics is an example; a variance of 5% or less against the revenue recorded in your bank account is more than good enough.
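As a rough illustration, a reconciliation check like the one sketched below is often all you need to decide whether a variance is worth investigating. The function name and the 5% default here are assumptions made for this example, not part of any tool.

```python
def within_tolerance(analytics_revenue: float, finance_revenue: float,
                     tolerance: float = 0.05) -> bool:
    """Return True if the analytics figure is within `tolerance` (5% by default)
    of the finance figure, treating the finance system as the source of truth."""
    if finance_revenue == 0:
        return analytics_revenue == 0
    variance = abs(analytics_revenue - finance_revenue) / finance_revenue
    return variance <= tolerance

# Example: analytics reports 47,500 against 49,800 in the bank - roughly a
# 4.6% variance, which this check accepts as "good enough".
print(within_tolerance(47_500, 49_800))  # True
```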

Several reasons can explain an unexpectedly large variance between two metrics without there being a problem. Here are some key overlapping factors that can contribute to the mismatch:

  • Data Collection Methods: Differences in data collection methods can lead to discrepancies. If the two sources use different techniques, instruments, or sampling approaches, it can result in variations in the collected data.

As we’ll see, Google Analytics 3 collects and then processes data differently from Google Analytics 4.

  • Data Processing Techniques: Variances in data processing methods can also cause discrepancies. Different algorithms, assumptions, or filters applied during data processing can lead to variations in the final results.

This was a key reason why websites used to regularly over-report Visitors: Google Analytics had much better access to information about the most recent bots (fake traffic created by spammers), which it used to filter out fake sessions automatically.

  • Data Scope and Definition: The scope and definition of the data can significantly impact its interpretation and comparison. The sources’ different definitions, categories, or timeframes for data collection can result in inconsistencies.

A common mistake is comparing Visits to Sessions or Users to Sessions. A user has a session on a website and, within that session, opens a page, which is recorded as a Page View. These are all different scopes, or levels, of the data hierarchy.

  • Data Updates and Timing: If the two sources were updated at different times, it could lead to disparities. New information, revisions, or corrections made to one source may not be reflected in another, causing discrepancies. Additionally, if the sources have different reporting frequencies or lags, this can further contribute to the mismatch.

This is again different between Google Analytics 3 and Google Analytics 4.

  • Data Governance and Standards: Inconsistent data governance practices or varying adherence to data standards can result in differences. It can lead to discrepancies if the sources follow different protocols for data validation, quality control, or normalisation. Additionally, variations in data formats, units of measurement, or encoding standards can contribute to mismatches.

You can control this part of the process by creating and maintaining your own data dictionaries and libraries and sharing them for everyone to review.
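As a purely illustrative sketch (the field names and entries below are assumptions, not a standard), a data dictionary entry can be as simple as a structured record that anyone in the business can review:

```python
# A hypothetical data dictionary: each metric gets an owned, reviewable
# definition instead of living in someone's head.
data_dictionary = {
    "sessions_ga4": {
        "label": "Sessions (GA4)",
        "source": "Google Analytics 4",
        "definition": "Count of session_start events; 30-minute inactivity timeout; "
                      "sessions do not reset at midnight or on campaign change.",
        "scope": "session",
        "owner": "analytics@example.com",   # placeholder contact
        "last_reviewed": "2023-07-01",
    },
    "visits_cms": {
        "label": "Visits (CMS)",
        "source": "Website CMS back-end",
        "definition": "Server-side visit count; may include bot traffic the CMS "
                      "does not filter out.",
        "scope": "session",
        "owner": "web@example.com",         # placeholder contact
        "last_reviewed": "2023-07-01",
    },
}
```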

We’re likely to see this misunderstanding get worse rather than better with the introduction of Google Analytics 4. A GA3 Session is calculated differently from a GA4 Session. They are not the same metric but have the same name, and there are many more examples of these differences. Here are those most likely to cause problems for us marketers.

Key Metrics:

  • Sessions: The definition and calculation of sessions can vary between GA4 and GA3, leading to differences in session counts (see the sketch after these lists).

  • Users: The way users are identified and counted can differ between the two versions, resulting in variations in user counts.

  • Conversion Rates: GA4 introduces a new way of tracking conversions, which may lead to differences in conversion rates compared to GA3.

  • Events: GA4 emphasises event tracking more, allowing more detailed event-based metrics that may not be directly comparable to GA3.

Key Dimensions:

  • Traffic Source: The way traffic sources are classified and attributed can differ between GA4 and GA3, leading to variations in traffic source dimensions.

  • Device Category: GA4 and GA3 may use different classifications or categorisations for devices, resulting in discrepancies in device category dimensions.

  • Content Grouping: GA4 introduces a new way of grouping and organising content, which may lead to differences in content grouping dimensions compared to GA3.

  • Campaign Parameters: The handling and interpretation of campaign parameters can differ between the two versions, resulting in variations in campaign dimensions.
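To show why the same-named Sessions metric diverges, here is a deliberately simplified sketch of the two sets of session rules: a GA3-style session also ends at midnight and when the campaign changes, while a GA4-style session only ends after the inactivity timeout. Real processing is more involved than this, but the effect on the counts is the same in spirit.

```python
from datetime import datetime, timedelta

# Four hits from one user, crossing midnight and switching campaign.
hits = [
    (datetime(2023, 7, 1, 23, 40), "google / cpc"),
    (datetime(2023, 7, 1, 23, 55), "google / cpc"),
    (datetime(2023, 7, 2, 0, 10), "google / cpc"),        # crosses midnight
    (datetime(2023, 7, 2, 0, 20), "newsletter / email"),   # campaign changes
]

TIMEOUT = timedelta(minutes=30)

def count_sessions(hits, reset_at_midnight, reset_on_campaign_change):
    sessions, last_time, last_campaign = 0, None, None
    for time, campaign in hits:
        new_session = (
            last_time is None
            or time - last_time > TIMEOUT
            or (reset_at_midnight and time.date() != last_time.date())
            or (reset_on_campaign_change and campaign != last_campaign)
        )
        if new_session:
            sessions += 1
        last_time, last_campaign = time, campaign
    return sessions

print(count_sessions(hits, True, True))    # GA3-style rules: 3 sessions
print(count_sessions(hits, False, False))  # GA4-style rules: 1 session
```

Neither count is wrong; they are simply two different metrics that happen to share a name.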

Our example here uses Google Analytics as it’s such a common tool and just about to go through (depending on when you read this) a huge upheaval.

It is essential to carefully evaluate and understand these factors for each of the tools we use, as this is often the only way to identify the root causes of data mismatches and ensure reliable analysis and decision-making. We must make sure we don’t mislead naïve users with poor naming, definitions, and labelling, and we can do that by both maintaining and auditing our data and libraries.

As we explored in Who Wants Data, we may also need to restrict access to raw data and platforms, ensuring that we only provide guided ‘Knowledge and Wisdom’ along with the metrics and dimensions we show in our prepared dashboards and reports. This is why we don’t leave the analysis to the executives (a key principle suggested in Plan, Plan, Plan). They don’t have the time to understand why and how two metrics with the same standard name, from the same tool, with everything working correctly, no longer match – or why, in this previously specialist field, that’s OK and not completely insane or illogical.

This blog post is a snippet of a much bigger text – Your Data Is F**KED for Marketers. You can purchase the book here in print or Kindle, or join the newsletter below to wait for the next free blog snippet or even the next free book release.

Mark McKenzie

Mark McKenzie, starting his career in media in London, has amassed over a decade of experience in the field of digital marketing and analytics. Throughout his journey, he has collaborated with SMEs, corporates, and enterprises, establishing highly specialised consultancy and agency departments that prioritise digital analytics. Serving clients across New Zealand, the United Kingdom, Australia, and the USA, Mark has encountered and tackled challenging questions from struggling marketers in diverse industries, spanning web analytics tools, platforms, connections, and databases.
