Hey folks. The shiny new Built-In Triggers for things we used to have to use big old Custom HTML tags to track before—YouTube tracking! Scroll depth! Element visibility!—have had a bit of time to bed in now. We had a look at changing over to the new built-in solutions everywhere we were using Custom HTML—if it's being done by Tag Manager itself, it has to be better, right? Right?
Wrong.
This issue is related to a fact that I've discovered through experience but not seen actually documented except in the answers to Stack Exchange questions and the comments on Simo's blog. Those of you who are getting on a bit (like myself) will remember back in the day before GTM v2, when you had to add listeners manually. You'll still see them sometimes in old containers that haven't been cleaned out in a while, hanging around like a bad smell. Anyway: the point is, those don't exist any more. Or you can't add them, at least. Auto-event listeners are the order of the day, as they have been for a good long time now. Eagle-eyed observers, though, will have realised that those listeners are still there, they're just automatically dropped in if you have one of the relevant triggers in your container.
Now, here's the thing, the fact that I set up at the beginning of the last paragraph: the listeners are added on the gtm.js event, aka the Page View event and, barring data layer pushes, the first event to fire on page load. I discovered this when I found that someone had managed to add a data layer declaration in the wrong place, which meant gtm.js never happened. I set the pageview tags to fire on gtm.dom until it was fixed, but then we realised that the click-triggered tags weren't firing. Push an event with the name "gtm.js" into the data layer, though, and the clicks come back. Auto-event triggers are added on page view. Good to know! That makes sense though—surely you want the listeners to be listening as soon as possible, right? Right?
Wrong.
Well, wrong in certain cases, at least; for click, link click, form submission etc., it's fine. Where it becomes an issue is with listeners that aren't necessarily best fired as early as possible. Scroll depth, for instance, is dependent on the length of the page, and the length of the page can be affected by stuff loading in. Land on the page and you've racked up three scroll depth tracking events before you've even touched your mouse. At this point I'd say "especially in the case of single-page applications", but honestly I've seen this happen on my test site, which is flat HTML. It's not just scroll depth tracking either—sometimes videos are loaded in later and we want to be able to trigger the listener on a click (or sometimes even a setTimeout after a click), and I'm sure you can think of some examples you've encountered yourself.
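Incidentally, the re-arming trick mentioned a couple of paragraphs back—pushing an event named "gtm.js" into the data layer to get the auto-event listeners attached—is a one-liner. The fallback to a plain array here is just so the snippet stands alone outside a browser:

```javascript
// If gtm.js never fires (say, a botched data layer declaration swallowed
// it), GTM never attaches its auto-event listeners. Pushing an event with
// the name "gtm.js" prompts it to register them.
var dataLayer = (typeof window !== 'undefined' && window.dataLayer) || [];
dataLayer.push({ 'event': 'gtm.js' });
```

Obviously this is a sticking plaster, not a fix—the right move is to repair whatever stopped gtm.js from firing in the first place.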
As a consequence, we're still relying on Custom HTML tags in a fair few cases when we'd really rather not be. For the time being there's no good way around this, really—but there could be. If Google were to give you the option to customise their auto-event listeners for triggers, even somewhere buried deep in the advanced settings, so we could put off the scroll depth listener to gtm.dom or gtm.load, that would let us use the Built-In Triggers everywhere. As it stands, though, most people will still be better off using Custom HTML tags wherever a listener attached at gtm.js is going to fire before the page has settled.
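For what it's worth, here's a minimal sketch of the kind of Custom HTML fallback we mean for scroll depth: fire the tag on Window Loaded (gtm.load) so late-loading content has settled before we measure the page, then push our own events at each threshold. All the names here (the function, the event, the variable key) are our own illustrative choices, not a GTM API:

```javascript
// Pure helper: which thresholds has this scroll percentage newly crossed?
function thresholdsToFire(percent, thresholds, alreadyFired) {
  return thresholds.filter(function (t) {
    return percent >= t && alreadyFired.indexOf(t) === -1;
  });
}

// Browser wiring (only runs where window/document exist). Put this in a
// Custom HTML tag triggered on Window Loaded, so the page length is
// measured after late content has loaded rather than at gtm.js time.
if (typeof window !== 'undefined' && typeof document !== 'undefined') {
  var fired = [];
  window.addEventListener('scroll', function () {
    var max = document.documentElement.scrollHeight - window.innerHeight;
    if (max <= 0) return;
    var percent = (window.pageYOffset / max) * 100;
    thresholdsToFire(percent, [25, 50, 75, 100], fired).forEach(function (t) {
      fired.push(t);
      window.dataLayer = window.dataLayer || [];
      window.dataLayer.push({ event: 'customScrollDepth', scrollThreshold: t });
    });
  }, { passive: true });
}
```

It's more code than ticking a box on the built-in trigger, but it puts the timing of the listener back in your hands.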
Google, if you’re listening—how about some auto-event listener customisation?