Migrating from Universal Analytics to Google Analytics 4: What You Need To Know
Reading Time: 10 minutes

To future readers of this article after July 1, 2023: good news! You no longer need to decide when to make this transition. I hope everything went smoothly for you, and you aren’t reading this article with a faraway look of regret, having missed an opportunity to get ahead of the transition. 

To the rest of you…

If you haven’t already made this transition and have logged in to Google Analytics recently, you might be concerned about this scary message:

Error message prompting user to update to GA4
Actual scary message.

I decided to find out what this ominous warning means. Why is there a hard cutoff for my old, trusty Universal Analytics? How does this affect the reporting that our marketing team relies on? Do I have to do anything? The warning tells me that it will handle the upgrade automatically. 

Why is this Google Analytics Migration Happening?

To better frame this release, let’s review a brief-ish history of Google Analytics (or click here to skip ahead to the Google Analytics 4 section):

GIF of two men
dee-del-ee-doo

Phase 1: Urchin

Urchin logo and a T-shirt. Urchin is acquired by Google
Urchin corners the analytics market and is acquired by Google.

Way back in the 90s, Urchin was a leading provider of reports generated from aggregated log data. It began as a bandwidth measurement tool, used to track costs associated with data usage – before unlimited data plans were the norm. Banner ads had exploded onto the internet, but attribution tracking was not yet a mature concept. Businesses wanted to know which ads were working, when, and why – so the Urchin team decided to pivot into ad and e-commerce tracking. Hosting companies included Urchin ad-tracking services for their customers, which quickly ballooned the Urchin customer base…and Google took notice.

Google acquired Urchin in 2005 and rebranded the solution “Urchin For Google”. The service was free for anyone to use on their website. Anyone could set up a Google account, generate a chunk of JavaScript to embed in their pages, and then track their visitors’ traffic. With very little configuration, a marketer, admin, or “webmaster” (dating myself, here) could generate reporting based on real traffic that could help their company or client make critical advertising investment decisions.

Note: the acronym “utm”, still used by GA in query string parameters, originally stood for Urchin Tracking Module.

Phase 2: Google Analytics

Google Analytics logo
Google Analytics

Google invested in enhancing its tracking tools for e-commerce. The internet was packed with e-commerce sites, and these properties needed reporting focused on conversions – reporting that could serve many levels of an organization. Released in 2005, this optional upgrade was named Google Analytics. The name stuck, and it is still the general term most people use for the service. Users who didn’t want to upgrade could continue to use their “Urchin For Google” scripts.

Some metrics that were important to sysadmin users were surfaced through a separate service named Google Webmaster Tools, which was renamed a few years later to Google Search Console. Frustratingly, some of the data that is only available in Search Console – search query strings, for example – is very useful to marketers, and there was no way to merge it with GA. And since this data was generated by the Google search engine rather than by users, it became an interesting benchmark to compare against the traffic recorded via GA.

Around this time, it became clear that the time it takes to load a page has a direct correlation with sales. Google, NYSE, Amazon, and the entire industry had become aware that latency is revenue poison. One of the primary sources of this latency was “blocking scripts”. The script that Google Analytics provided to customers to place on their websites was synchronous JavaScript; it had to be requested, downloaded, and executed in a user’s browser before the page could render.

Most heinous.

So in 2007, Google issued an asynchronous version of its script – a much more performant solution. Again, the upgrade was completely optional, and existing installations continued to function as before.
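
To illustrate the difference, here is a minimal sketch (the URL is a placeholder, not the actual historical snippet): a synchronous script tag halts HTML parsing until the file is fetched and executed, while the async attribute lets the browser keep rendering the page while the script loads in the background.

```html
<!-- Synchronous: parsing stops here until tracker.js downloads and runs -->
<script src="https://example.com/tracker.js"></script>

<!-- Asynchronous: the browser keeps parsing and rendering while the script loads -->
<script async src="https://example.com/tracker.js"></script>
```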

Phase 3: Universal Analytics

Google Universal Analytics logo
Universal Analytics

Site owners were clamoring for a way to track what was and was not helping them sell products and build engagement. And Google responded. In 2012/2013, Universal Analytics (UA) became available in a beta program. It was a major upgrade – layout changes, navigation, imagery, CTA design – all of the properties of a website that lead to a conversion could be monitored, tested, and iterated upon. UA changed everything. A basic set of tools was transformed into a highly customizable suite that, while a bit intimidating for the layperson, was exciting for expert data crunchers. Businesses began hiring data experts to help them craft reports and views that would define their marketing strategies. A/B testing could be performed with reliable results.

Many companies built complex, bespoke solutions to feed these reports and views – more on that later. And with mobile device adoption exploding, Firebase app analytics became ubiquitous for native apps. Though app analytics was frustratingly decoupled from web analytics in UA, it was a positive step – also more on that later. Around this time, Google began to push AdWords (Google Ads) tracking through UA. And, at last, Search Console data became available in UA. In 2016, all legacy analytics accounts were automatically upgraded to UA. This was an additive upgrade which did not require website owners to change the scripts on their sites. This is the Google Analytics we have been accustomed to up until 2023, despite a pivot in how the script is delivered:

Phase 4: Google Tag Manager

Google Tag Manager Logo
Google Tag Manager

Though UA remained more or less the same for a while, a big architectural shift was in the works: Google Tag Manager (GTM). The concept of GTM is to provide a central location for the input and output of analytics data so that developers and sysadmins don’t have to execute a code change or deployment to modify their tracking scripts. You inject one script into your website code, and then manage all of your tracking scripts through a Google-hosted interface – including any custom scripting that you might want to inject. This opened up opportunities for a sea of service providers offering all manner of marketing advice, reporting, monitoring, and bespoke solutions – SEO partners, for example.

The new “output” possibilities for any and all tracking data were not limited to 3rd parties. Google Data Studio (later renamed Looker Studio) allows you to connect to your analytics data, and a host of other data sources, to make custom reports that are not subject to the limitations inside the Google Analytics dashboard. And the BigQuery product offers a robust database solution that can be used to offload and aggregate all of your data – including data from point of sale and any custom sources – into one powerful data source.

Big changes! But as before, all of this was optional. If you had the UA code embedded in your website, you were never forced to upgrade to GTM. Also in play here is Analytics 360 (later renamed Google Marketing Platform), providing a higher tier of analytics support for customers who wanted it.
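
As a rough sketch of how that works in practice (the event and variable names below are made up for illustration): the GTM container snippet loads a single script, your site pushes plain objects onto the dataLayer array, and tags, triggers, and variables configured in the GTM web interface react to those pushes – no redeployment required.

```html
<script>
  // The site only talks to GTM through the dataLayer array.
  window.dataLayer = window.dataLayer || [];

  // Hypothetical example: announce a newsletter signup.
  // A GTM trigger listening for event === 'newsletter_signup' can then fire
  // any tag (GA, ad pixels, custom HTML) without touching the site's code.
  window.dataLayer.push({
    event: 'newsletter_signup',
    formLocation: 'footer'
  });
</script>
```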

Phase 2023: Google Analytics 4

Google Analytics 4 Logo
Google Analytics 4

But why, after all this time, is there a forced Google Analytics Migration?

The fundamental components that drive UA – 3rd party cookies – are no longer in play. 

Cookie monster up close
disbelief

In order to comply with privacy demands and to avoid costly legal penalties, Google is ditching 3rd party cookies. This update completely changes the implementation of analytics data collection. It changes which data is collected, as well. Without 3rd party cookies, a data-collection bridge between your website and other websites no longer exists. You may have one of two reactions to this:

Oh no! My reporting is doomed!

Hooray! A huge win for internet privacy!

A lot of us can probably relate to both.

What is happening to my UA? Let’s split the update into three categories:

The Good:

  • Cross-device tracking! You can stream different sources to the same destination. Apps and websites…living together…mass hysteria! You will likely need to rebuild your data layer to line up events across different platforms. The new data model includes enhanced “event parameters” that can be used to line up different actions (a swipe in the app == a form submit on the web) into one common “action” – a bit like “dimensions” in UA (see the sketch after this list). And don’t limit your imagination to iOS/Android – this data could be collected from a POS, OTT/IoT devices, or anything you can integrate with GA4.
  • No 3rd party cookies required (more on this, later).
  • Everything is an event. Page views, conversions, scrolls, and so on are all now considered “events”. Above all, this change makes the interface less confusing.
  • Consent mode. A built-in way to allow your users to opt in to cookie tracking.
  • Easy to plug in. Still no need to modify your source code if you use GTM. And Google Analytics 4 tracking can happen alongside UA tracking with no conflicts or penalty.
  • Predictive metrics (Insights). Machine-learning driven prediction model. I guess stay tuned to see how useful this data is?
  • Timing improvements. GA4 offers a much more accurate picture of a user’s time spent on a page. For example, when a user changed tabs, UA would not stop the timer. This led to inaccurate and misleading reporting.
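
To make the event model and consent mode above a little more concrete, here is a minimal gtag.js sketch (the measurement ID, event names, and parameter values are placeholders chosen for illustration – your property will define its own):

```html
<!-- Load gtag.js for a GA4 property; G-XXXXXXX is a placeholder ID -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){ dataLayer.push(arguments); }

  // Consent mode: default storage to denied until the user opts in.
  gtag('consent', 'default', {
    ad_storage: 'denied',
    analytics_storage: 'denied'
  });

  gtag('js', new Date());
  gtag('config', 'G-XXXXXXX');

  // Everything is an event. Sending the same event name and parameters from
  // the web and from a Firebase app lets the two line up in one property.
  gtag('event', 'generate_lead', {
    method: 'contact_form'   // hypothetical parameter value
  });

  // Later, once the user accepts your cookie banner:
  function onConsentGranted() {
    gtag('consent', 'update', {
      ad_storage: 'granted',
      analytics_storage: 'granted'
    });
  }
</script>
```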

The Bad:

  • UA data is not visible to GA4. You can view historical data in your old UA property for a limited time, but there is no way to view or compare UA data mixed with GA4 data in the web interface. If you want to visualize any historical-to-new comparisons, you will need to wire up the separate data sources externally (BigQuery/Looker Studio).
  • No Bounce Rate. Google can no longer tell you with certainty that a user left your site to read a New York Times article. As an alternative, you can measure Engagement Rate, which is effectively the inverse: Engagement Rate = engaged sessions ÷ total sessions, so a 70% Engagement Rate corresponds roughly to a 30% Bounce Rate.
  • No Views. This means that if you have set up views to exclude certain traffic (internal, for example) or if you have views set up for particular users (a department or vendor), they are gone. Report filters and custom dimension options are limited, and Behavior Flow is gone – along with its demographic filters. If you want to reproduce an existing View, crafting a custom Exploration might be the path of least complexity. Or use BigQuery/Looker Studio. Excellent video introduction to Explorations.
  • Threshold data warnings (the exclamation point in a report) appear when data signals are enabled. If you want to see small numbers in reports, enable the device-based reporting identity in the admin.
  • Ecommerce data missing amounts? Make sure “currency” is defined in the data layer for the event (see the sketch after this list).
  • Automatic migration is not very useful. You might be able to import some events from UA, but you will need to review and test each event manually. It might be less work to rebuild the events from scratch. There are implementation limitations in event tracking and also thresholds on goals that might need workarounds.
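
For the currency issue above, here is a minimal sketch of a GA4 purchase event (the transaction ID, amounts, and item fields are placeholder values, and this assumes gtag.js is already loaded as in the earlier snippet): without an explicit currency, revenue amounts may not show up in monetization reports.

```html
<script>
  // Hypothetical purchase event; note the explicit currency alongside value.
  gtag('event', 'purchase', {
    transaction_id: 'T-10001',   // placeholder ID
    value: 49.99,
    currency: 'USD',             // without this, revenue may be missing from reports
    items: [
      { item_id: 'SKU-123', item_name: 'Example Widget', price: 49.99, quantity: 1 }
    ]
  });
</script>
```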

The Ugly:

  • Up to double the time for data to show up in reports. Give it a couple of days 🙁
  • Much of the default reporting from UA is gone. Because GA4 is focused on events that you create, it does not include much out-of-the-box reporting. The new Explorations section might be a good place to start, if you want to build DIY reports. Or you could use BigQuery/Looker Studio to build your custom reports outside GA.
  • DO NOT use the enhanced measurement option for form interactions. Data from forms that fail validation will be treated like a success event, and it might not track valid submissions properly, either.
  • DO NOT enable “use Universal Analytics” events. This data will not track properly after the cutoff date. This is offered as a quick fix to aid in migration, but it will not solve the underlying issues with migration. This option is also only useful if you have analytics.js embedded directly into your source code.
  • Data retention is limited/monetized. Probably ever-changing details here. If you would like to access your data beyond these limits, you will need a BigQuery subscription.

What should I do?

Don’t delay. Get ahead of the forced Google Analytics Migration. Set up a new GA4 instance and:

  • Debug your events to make sure they still work properly (see the DebugView sketch after this list).
  • Align your app/web events into a unified property.
  • Re-analyze all of your UA reporting. Do you need this report? Is that report/view even possible in GA4? Do you rebuild or replace – via Explorations, or via BigQuery/Looker Studio?
  • Analyze your cookie policy.
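
For the event-debugging step, one simple approach (a sketch – the measurement ID and event details are placeholders) is to enable debug_mode on your config so hits appear in GA4’s DebugView in near real time while you verify each event and its parameters:

```html
<script>
  // Placeholder measurement ID; debug_mode routes these hits to DebugView.
  gtag('config', 'G-XXXXXXX', { debug_mode: true });

  // Fire a test event and confirm its parameters appear in DebugView.
  gtag('event', 'tutorial_begin', {
    source_page: '/migration-test'   // hypothetical custom parameter
  });
</script>
```

If you deploy through GTM instead, its built-in Preview mode accomplishes the same verification without any code changes.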

Depending on your current UA setup, GA4 could be a simple transition, or it could require a complete overhaul. Your mileage may vary. You might want to frame this in your mind like a vendor change, and start from scratch. The internet has changed since you created your UA strategy, and these new tools are going to be used by everyone – including your competitors. You have an opportunity to embrace the change and leverage these new features to increase your traffic or sales! You can do it!

Google’s official setup and validation docs:

 

Get in touch now for expert guidance and unlock your analytics potential.
