
GTM event tracking best practices – Solutions Corner #1

Adam Englebright · 1 October 2018 · 3 min read

This is part 1 of what I hope will become a semi-frequent series here, dedicated to reusable GTM solutions for scenarios or issues we've encountered more than once. This particular entry is a companion to Mark's piece from a couple of years back, but for Google Optimize rather than Optimizely. In short, it pushes Optimize experiment data to the data layer so you can fire Google Analytics events whenever experiments are running, which saves you from burning custom dimension slots (and similar workarounds) in the course of ordinary analysis.

Setting up the custom JS variable

Here's a bit of code to set up as a custom JS variable and use as the hitCallback field (under "Fields to set") on your default GA pageview tag, as we want it to run after GA is available and the pageview hit has been sent. It checks whether any Optimize experiments are running, then pushes the ID and variation of each one to the data layer. Unfortunately there's no human-readable name available on the page as there is with Optimizely, but if you want something readable in the GA interface, you can set up a lookup table to map the experiment ID to a name.
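In JavaScript form, that lookup might be a minimal sketch like this (the experiment IDs and names below are placeholders, not real IDs):

```javascript
// Hypothetical lookup from Optimize experiment IDs to friendly names.
// Both the IDs and the names below are placeholders -- use your own.
var experimentNames = {
  'AbCdEfGhIjKlMn123456': 'Homepage hero test',
  'OpQrStUvWxYz78901234': 'Checkout button copy'
};

function experimentName(experimentId) {
  // Fall back to the raw ID when no friendly name is configured
  return experimentNames[experimentId] || experimentId;
}
```

In GTM itself you could achieve the same with a built-in Lookup Table variable keyed on the experiment ID.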

This being a hitCallback, the whole thing is a function returning another function.

function() {
  return function() {

Defining the property id

Then we define the property ID. Set this to a string containing the ID of the GA property you've associated with Optimize.

var propertyId = ''; // YOUR PROPERTY ID HERE

Checking gaData and active experiments

We then check that the gaData global object exists, that it has active experiments for our property, and that we haven't already done this on this page.

if (typeof window.gaData !== "undefined" &&
    window.gaData[propertyId] !== undefined &&
    typeof window.gaData[propertyId].experiments !== "undefined" &&
    window.optHitCounter !== 1) {

and then, if that all checks out, we create an array of the keys in the experiments object, and set a global variable to 1 to prevent the data layer push from firing more than once,

var activeExperiments = Object.keys(window.gaData[propertyId].experiments);
window.optHitCounter = 1;
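For reference, the structure being read here looks roughly like this. This is a sketch with placeholder property and experiment IDs; the real values will be your own:

```javascript
// Approximate shape of the gaData global when Optimize experiments are
// active; the property and experiment IDs here are placeholders.
var gaDataExample = {
  'UA-XXXXXXX-X': {
    experiments: {
      'AbCdEf123456': '1', // experiment ID -> variation served to this user
      'GhIjKl789012': '0'
    }
  }
};

// Object.keys gives us the IDs of the active experiments
var activeIds = Object.keys(gaDataExample['UA-XXXXXXX-X'].experiments);
```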

set a variable equal to the length of the experiments array,

var mCnt = activeExperiments.length;

loop through them all,

for (var i = 0; i < mCnt; i++) {

set variables equal to each experiment ID and variation,

var mExp = activeExperiments[i];
var curVar = gaData[propertyId].experiments[mExp];

Looping through experiments and pushing to the data layer

Then we push the experiment ID and variation ID for each of them to the data layer; you can set up a GA event tag triggered on this event to send the information to GA.

window.dataLayer.push({
  'event': 'google_optimize',
  'eventCategory': 'Google Optimize',
  'eventAction': 'Experiment ID: ' + mExp,
  'eventLabel': 'Variation ID: ' + curVar
});
} } } } // closes the loop, the if, and both functions

Final implementation

And that's it: set that as the hitCallback on your pageview tag and you'll be tracking your Optimize experiments in no time. This solution is also available on GitHub, for those of you who don't want to copy it out of this blog post in little bits!
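If you'd rather see it in one piece, the fragments above assemble into the following. In GTM the outer function is anonymous; it's given a name here only so the sketch is self-contained, and the empty propertyId still needs filling in with your own GA property ID:

```javascript
// The fragments above, assembled. In GTM this would be the body of a
// custom JavaScript variable; the name hitCallbackFactory is only for
// illustration, and propertyId must be set to your own GA property ID.
function hitCallbackFactory() {
  return function() {
    var propertyId = ''; // YOUR PROPERTY ID HERE
    if (typeof window.gaData !== 'undefined' &&
        window.gaData[propertyId] !== undefined &&
        typeof window.gaData[propertyId].experiments !== 'undefined' &&
        window.optHitCounter !== 1) {
      var activeExperiments = Object.keys(window.gaData[propertyId].experiments);
      window.optHitCounter = 1; // only push once per page
      var mCnt = activeExperiments.length;
      for (var i = 0; i < mCnt; i++) {
        var mExp = activeExperiments[i];
        var curVar = window.gaData[propertyId].experiments[mExp];
        window.dataLayer.push({
          'event': 'google_optimize',
          'eventCategory': 'Google Optimize',
          'eventAction': 'Experiment ID: ' + mExp,
          'eventLabel': 'Variation ID: ' + curVar
        });
      }
    }
  };
}
```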

