#27 Common Google Analytics mistakes to avoid

The Measure Pod

This week Dan and Dara talk about common Google Analytics mistakes they see time and time again, from using UTMs incorrectly to having analytics siloed away from the rest of the business.

The GA4 BigQuery connector has finally started enforcing its 1 million daily event export cap, and has released an event export filtering tool to reduce the export if you are breaching the limit – https://bit.ly/34Quu2X.

GA4 has launched the first phase of its DV360 link (and for the first time in the free version of GA!) – https://bit.ly/3I4qCty.

Google Data Studio has released an updated connector to the new and improved SA360 – https://bit.ly/3gSEkDS.

Heatmaps are now supported in Google Map visualisations in GDS – https://bit.ly/3H1yril.

The BigQuery connector in GDS now supports down to 1-minute data freshness – https://bit.ly/3sODmhF.

GDS now also supports its native navigation when embedding reports (finally)!

In other news, Dan is a jackass and Dara gets (more) cultured!

Please leave a rating and review in the places one leaves ratings and reviews. If you want to join Dan and Dara on the podcast and talk about something in the analytics industry you have an opinion about (or just want to suggest a topic for them to chit-chat about), email podcast@measurelab.co.uk or find them on LinkedIn and drop them a message.

Transcript

[00:00:00] Dara: Hello and thanks for joining us on The Measure Pod, here for episode number 27, believe it or not. The Measure Pod, for those of you who don’t know, is a podcast for people in the analytics world like us. I’m Dara, I’m MD at Measurelab, and I’m joined as always by Dan, who’s an Analytics Consultant, also at Measurelab. Hey Dan, what’s new in the analytics world?

[00:00:37] Daniel: Hey Dara, a couple of updates, it’s been busy on the GMP side of things this week. So starting off with GA4, they have released their BigQuery export filtering feature. What that means is that you can now filter out the events you don’t necessarily need when exporting all of your raw data into BigQuery, and why that’s really important is because there is a daily cap on the amount of events you can export to BigQuery. So now if you are breaching, or in danger of breaching, that cap, which is 1 million daily events on the free version of GA4 at least, you can actually start removing events that you don’t necessarily need to get yourself under the threshold. It’s a feature that’s been in the pipeline for a little while now, and it’s really useful for those larger volume clients, they can get themselves underneath that cap so they’re not missing any of the important data they need in BigQuery.
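
A quick way to see which events are eating into that cap is to count daily volumes per event name in the export itself. Here’s a minimal sketch using the Python BigQuery client, with `your-project.analytics_12345` standing in as a placeholder for your own GA4 export dataset:

```python
# Minimal sketch (dataset name is a placeholder): count daily events per
# event name in the GA4 BigQuery export, so you can see which events to
# filter out before breaching the 1 million daily export cap.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT event_date, event_name, COUNT(*) AS events
FROM `your-project.analytics_12345.events_*`
WHERE _TABLE_SUFFIX BETWEEN
      FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY))
  AND FORMAT_DATE('%Y%m%d', CURRENT_DATE())
GROUP BY event_date, event_name
ORDER BY event_date DESC, events DESC
"""

for row in client.query(sql).result():
    print(row.event_date, row.event_name, row.events)
```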

[00:01:28] Dara: So is that something that you control from the BigQuery side, or in the GA4 interface?

[00:01:32] Daniel: No, this is all from within GA4. In the property settings, where you view your BigQuery linking, there’s a whole new interface now that enables you to set filters and conditions to include and exclude different event names and things like that. It also gives you a counter, kind of like the old Universal Analytics days of hit volumes per month, it gives you a rough indication of the daily events you’re averaging at the moment, so you can see if you’re hitting or coming near that threshold. What they’ve also done this week is release their DV360 link. This is big news for people using DV360, because this is the first time that this has been in the free version of Google Analytics.

[00:02:07] Daniel: So this was something that Google Analytics 360 used to have in Universal Analytics, but now you don’t need that premium price tag to connect DV360. Obviously you still pay for DV360, but not the GA360 side of things. But in the world of Data Studio, there’s a couple of things they’ve released this week. The first one is that Search Ads 360 has had a bit of a facelift, they’ve reinvented themselves with a new version of SA360. That’s by the by, but the Data Studio side is that it now connects to this new version of Search Ads 360. They’ve also released heat maps in their Google map charts, which is quite nice, I’ve had a bit of a play with that. So rather than seeing these kinds of circles or balls appearing all over the map, you can actually do a heat map with colour gradients, which is nice.

[00:02:49] Daniel: They’ve finally released navigation, which sounds ridiculous: navigation in embedded reports. If you embed your Data Studio report in a web platform or a Google Site or something like that, you can now use the built-in navigation rather than it just being a static page. And the last thing is that they’ve updated the BigQuery connector’s data freshness. You can now set your BigQuery connector to one-minute freshness, so it refreshes that data every minute. They’ve made some big waves on the data freshness side just for the BigQuery linking, but if you really, really want minute-by-minute data, you can now do that.

[00:03:20] Dara: Saves people hitting refresh over and over again.

[00:03:23] Daniel: If you’re that way inclined, hitting refresh every minute just to check your data. But yeah, I think this is the move towards more real-time data, which BigQuery has, especially with the streaming export from GA4.

[00:03:33] Dara: So definitely not a slow news week this week. 

[00:03:36] Daniel: It just sounds like they’ve banked all this stuff up over Christmas and they’ve come out in the new year swinging. So yeah, hopefully there’s more to come.

[00:03:43] Dara: So here’s my really forced tie-in to our topic this week. We’re going to talk about common mistakes to avoid with Google Analytics. So with all these new features coming out, it’s more important than ever, here’s the tenuous link, more important than ever, to not fall victim to these common mistakes.

[00:04:00] Daniel: Yeah, I mean we’ve talked about our experience and our path into analytics before, and I think it’s fair to say that between me and you, plus everyone else we work with, we can’t take credit for everything here. We’ve got a bank of stuff that we see often, common pitfalls to avoid, or mistakes that we would rather people never make again. So we’re just going to go through a couple of these, maybe some of our pet peeves as well, things that annoy us the most, and hopefully give our listeners some tips around things to avoid and to help your colleagues overcome as well.

[00:04:29] Dara: We’re going to just see what comes up as we talk about this. We each have our own mistakes that we’ve seen time and time again, and some of these hold true whether it’s GA4 or Universal, and potentially even other analytics platforms entirely. But we are going to focus on Google Analytics, because that’s our bag, as it were.

[00:04:46] Daniel: I think I’ll start, just because this is something that’s super fresh in my brain from recent memory, and something I, and I think everyone else, sees time and time again. I talk about it in terms of plumbing, where people get technology on their website. A really good example is consent banners, or cookie banners and CMPs in general, and it’s not plumbed into GTM, it’s not plumbed into their analytics platform. So all of the people on the backend, the website owners, are thinking, great, we’re now compliant, we have a cookie banner, and yet nothing’s changed in GTM. Nothing’s changed with their marketing tags or their analytics tags or anything else. So plumbing is important. One of the first common mistakes to avoid is, I suppose, not thinking of the analytics stack as part of your overall tech stack when doing things like implementing consent banners or changing platforms. Analytics always gets left until last, it’s always kind of separated out to a different team.

[00:05:39] Dara: I guess following on from that one, on the same theme of analytics not being front and centre: if a website is going to change, especially if URLs are going to change, or if new sub-domains are being added, if there’s any significant change happening to a website, or even if there’s a new campaign being launched, so often we see that analytics doesn’t get taken into account. There’s no planning aspect around what needs to get updated, whether it’s in data collection or through to the reports and dashboards that are being used to measure the effectiveness of those changes. How many times have we seen a change to a website or a new campaign, and then the tracking breaks, and then we’re going back in after the fact to correct the issue?

[00:06:18] Dara: Obviously the big problem with that is the data loss. There’s no way to go back and retroactively plug that data back in. So if analytics isn’t seen as a key part of that planning phase, then you’re going to end up with gaps in the data. You’ll fix the issue, but you’re going to have a gap, could be hours, could be days, where you’re missing vital performance data for those campaigns or website changes.

[00:06:38] Daniel: Yeah, we always get forgotten, don’t we, with all this kind of stuff. And I think this is the hardest thing to change. Adding analytics tracking, collecting data, visualising data, all of the stuff that we do with data, is actually the easiest part. I think the hardest thing is that cultural change, or cultural adoption, in terms of the internal perspective of why analytics is important and how it works, to the point where it’s not left until afterwards, people are considering it in their sprint planning, or they have developer resource to account for it.

[00:07:08] Daniel: It’s not easy, you know. We can’t get involved too early, because while it’s wireframes there’s not a lot we can do. So there’s this happy sweet spot in the middle where there’s a staging environment, but still developer resource available to tweak things like data layers. So it’s not easy, and like we said, it’s that cultural shift, because too late you’re missing stuff, and too early there’s nothing we can do. But I can guarantee, and everyone listening probably has some experience with this, it doesn’t matter how deprioritised analytics is, on day one of launch they’re like, so how’s it doing? And that’s where it’s like, but you didn’t let us put the tags or the data layer on, what do you expect us to do?

[00:07:41] Dara: And arguably worse than not having enough tracking is at times having too much. So another example of a mistake we see time and time again is double tracking, and this can cause mayhem within the reporting data because it can inflate certain metrics and deflate others. The common scenarios we see here are somebody having hard-coded tags while also firing tags through a tag manager, like Google Tag Manager for example, just to name one of the many tag management systems out there. Another example might be if your website is built on something like WordPress, you might have a plugin that’s injecting Google Analytics code, or you might have two plugins, or you might have GTM and a plugin and hard-coded tags, and then you end up with three page views for the price of one. So you’ll get inflated page views and an artificially low bounce rate. It’ll look great, because it will look like you’ve got zero percent bounce rate or close to it, but it’s absolutely not correct, and a sign that you’ve got duplicate page view tracking going on.
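
One quick way to confirm double tracking is to look for identical page_view events landing in the same second for the same session. A rough sketch against the GA4 BigQuery export (dataset name again a placeholder); run it with the client pattern above, or paste the SQL straight into the BigQuery console:

```python
# Rough duplicate page_view check (GA4 BigQuery export; dataset name assumed).
# Same user, same session, same URL, same second, more than one page_view
# usually means two tags or plugins firing at once.
sql = """
SELECT
  user_pseudo_id,
  (SELECT value.int_value FROM UNNEST(event_params)
   WHERE key = 'ga_session_id') AS session_id,
  (SELECT value.string_value FROM UNNEST(event_params)
   WHERE key = 'page_location') AS page,
  TIMESTAMP_TRUNC(TIMESTAMP_MICROS(event_timestamp), SECOND) AS hit_second,
  COUNT(*) AS page_views
FROM `your-project.analytics_12345.events_*`
WHERE event_name = 'page_view'
  AND _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
GROUP BY user_pseudo_id, session_id, page, hit_second
HAVING COUNT(*) > 1
"""
```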

[00:08:39] Daniel: I won’t name names, of course, but there was one client I worked with about four or five years ago who had this issue. They had double-counted page views and a 0% bounce rate, or very low, actually it was around 1 or 2% because of some tracking nuance on certain pages. But basically they were double counting the majority of all pages, and I highlighted it to them in an audit and said, we should remove one of these things. And they’re like, but wouldn’t that change our data? And I was like, yes, it would fix your data. And they didn’t go for it. They didn’t want their historical data to be different, they wanted to keep it going because their numbers would look worse. And I couldn’t for the life of me understand why they wouldn’t fix something that’s broken just because it would change. I was like, yes, it would change, but it’s fixing, not changing. I wouldn’t class it as a change, this is fixing it.

[00:09:22] Dara: I’m with you, obviously the right thing to do is to fix it, but that’s not to deny that it does cause a lot of headaches. Even if the data has always been wrong, once you fix it you’ve got a cleaner, more accurate version of the truth going forward. But we have to appreciate it does cause huge headaches when you’ve been used to reporting on it. You’ll have benchmarks, you’ll have historical comparisons, and suddenly they’re all out of whack because you realise something’s been wrong. But you’re right, the correct thing to do, as painful as it is, is fix the issue and accept that you may not be able to do historical comparisons, because you’d be comparing against broken data.

[00:09:57] Daniel: Wouldn’t you rather one side of the comparison be broken than both? That’s the thing I never understood.

[00:10:03] Dara: Yeah, you don’t need to convince me, I get it.

[00:10:05] Daniel: Thinking about the setup, and we’ve covered issues around double tagging, just to add a bit more on the setup side: I think there’s a couple of pieces of low-hanging fruit in terms of fixing setups, mistakes we often see where things aren’t set up correctly in the first place. And this has to do with things like cross-domain tracking or UTM tracking that are either not used, or not used correctly. When it comes to cross-domain tracking, if someone is tracking across multiple domains and they haven’t set it up, then obviously they’re going to get inflated user and session counts, and pages per session and session duration are all going to be affected, because you’re technically being seen as a different user with a different cookie on a different domain. So there’s a nuance there where you think, wow, just set this up and it’ll, quote unquote, fix your data. On the UTM side, I think it’s the same kind of thing. We see the obvious cases of people not using them; the hardest thing, first of all, is to get people into the habit of using them with a convention, with a spreadsheet or a tool to enforce it. But it’s also the other way around, where they’re using them too much, like for internal site navigation, using UTMs to link between different parts of their own website, which breaks sessions in Universal Analytics. I know that doesn’t happen in GA4 anymore, but it still messes around with campaign attribution, and with a session having two different sets of UTMs it all becomes very confusing.

[00:11:14] Daniel: So again, avoiding using things in the wrong place, and making sure you’re using them in the right place, is something we see get missed more often than we’d like, I suppose.
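
A convention only sticks if something enforces it, so even a tiny helper beats free-typing parameters into links. This sketch is entirely hypothetical (the allowed mediums and the domain are made up) but shows the idea: lowercase everything, restrict the medium vocabulary, and refuse to tag internal links:

```python
# A tiny, hypothetical helper to enforce a UTM naming convention:
# lowercase values, a fixed vocabulary for medium, and no UTMs on
# internal links to your own site (they would reset attribution).
from urllib.parse import urlencode, urlparse

ALLOWED_MEDIUMS = {"cpc", "email", "social", "display", "affiliate"}
OWN_DOMAIN = "measurelab.co.uk"  # placeholder: your own site

def build_campaign_url(base_url: str, source: str, medium: str, campaign: str) -> str:
    if OWN_DOMAIN in urlparse(base_url).netloc:
        raise ValueError("Don't use UTMs on internal links")
    if medium.lower() not in ALLOWED_MEDIUMS:
        raise ValueError(f"medium must be one of {sorted(ALLOWED_MEDIUMS)}")
    params = urlencode({
        "utm_source": source.lower().strip(),
        "utm_medium": medium.lower().strip(),
        "utm_campaign": campaign.lower().replace(" ", "_"),
    })
    sep = "&" if "?" in base_url else "?"
    return f"{base_url}{sep}{params}"

print(build_campaign_url("https://example.com/landing", "newsletter", "email", "Spring Sale"))
# https://example.com/landing?utm_source=newsletter&utm_medium=email&utm_campaign=spring_sale
```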

[00:11:23] Dara: Just something to add, and this isn’t strictly an analytics mistake, but it relates to the use of UTMs: if you’ve got bad redirects in place, they can strip those UTMs. This has more to do with the website itself and how redirects are being handled, but I’ve seen that issue numerous times as well, where old URLs, often old ad URLs, are being redirected to a new page. Whether it’s the Google Ads GCLID parameter or manual UTM tags, they get stripped off if they’re going through a bad redirect. So if you are redirecting old URLs to new ones, make sure they correctly pass through any UTM parameters and values, or any other auto-tagging that’s used.
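
Checking a batch of redirects only takes a few lines of Python. A sketch assuming the third-party `requests` library: follow each redirect chain and confirm the tracking parameters survive to the final URL:

```python
# Sketch: follow redirects for old campaign URLs and confirm tracking
# parameters survive to the final destination. Assumes `pip install requests`.
import requests
from urllib.parse import urlparse, parse_qs

TRACKING_KEYS = {"utm_source", "utm_medium", "utm_campaign", "gclid"}

def check_redirect(url: str) -> None:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    sent = TRACKING_KEYS & set(parse_qs(urlparse(url).query))
    kept = TRACKING_KEYS & set(parse_qs(urlparse(resp.url).query))
    lost = sent - kept
    status = "OK" if not lost else f"LOST {sorted(lost)}"
    print(f"{url} -> {resp.url} [{status}]")

check_redirect("https://example.com/old-page?utm_source=news&utm_medium=email&utm_campaign=jan")
```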

[00:12:06] Daniel: Yeah, just riffing on that again, this actually came up when I was doing some GA training today as well, and that’s the usage of vanity URLs and UTMs for offline media. Let’s say you’re running a TV ad and you’ve got “visit measurelab.co.uk/tv”, and we set up an internal redirect that sends that to a URL with query string parameters, with the UTM parameters, which is great. But some people just use slash TV or slash radio as a page, and then they don’t unlist that page from things like SEO. So the more people that visit that page, the more people are going to find it organically or share it via social media, and it’s just going to become more of a common page, and now you can’t attribute the traffic to the original intended use case. It’s things like that where, again, we’re using these tools, but they’re not achieving their desired outcome, they’re being manipulated or used incorrectly. We had an example recently where one of our clients had a QR code, I think it was on a print ad, that redirected to a page with UTMs, but then someone just copied that URL and posted it on Twitter. So now that same set of UTMs is posted on Twitter, looking like print ad traffic, and we can’t identify the correct source of traffic because people, not understanding the feature, are using it incorrectly.

[00:13:19] Dara: Another quick campaign-related, or channel-related, one is making sure you use referral exclusions correctly. Where you’ll use UTMs to assign attribution to the right channel and the right campaign, there are times when you don’t want a new referral to take over. If you’re using a third-party payment gateway, for example, you’ll want to exclude that from triggering a new session, preserve the original traffic source for that session, and let it carry through when somebody goes off to the payment gateway and then comes back to your website to complete their transaction.

[00:13:48] Daniel: That’s literally maybe the first thing I check: the referral exclusions, or going into the Source/Medium report and instantly looking for PayPal, Sage Pay, those kinds of things.

[00:13:56] Dara: Yeah, and the domain itself, so you can avoid that issue of having self-referrals, which can be a pain, especially if you’ve got a complicated setup with different sub-domains or cross-domain tracking.

[00:14:07] Daniel: And the same for GA4, actually, they’ve got a kind of spiritual successor to the referral exclusion list, and by default it does not add your domain there. In UA now, if you add a new property, it does automatically add it, but in GA4, if you’ve already got a GA4 property, it hasn’t been added. So I always go in and check to make sure that your root domain is added, as well as any other domains, like you said, Dara.
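
The same check works against the GA4 export too. A sketch (dataset name and domain are placeholders) that surfaces payment gateways, or your own domain, turning up as session sources:

```python
# Sketch: look for payment gateways or your own domain showing up as a
# traffic source in the GA4 export (dataset name assumed) - a sign that
# referral exclusions / unwanted referrals aren't configured.
sql = """
SELECT
  (SELECT value.string_value FROM UNNEST(event_params)
   WHERE key = 'source') AS source,
  COUNT(DISTINCT CONCAT(user_pseudo_id, '-',
    CAST((SELECT value.int_value FROM UNNEST(event_params)
          WHERE key = 'ga_session_id') AS STRING))) AS sessions
FROM `your-project.analytics_12345.events_*`
WHERE event_name = 'session_start'
  AND _TABLE_SUFFIX >= FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 28 DAY))
GROUP BY source
HAVING REGEXP_CONTAINS(LOWER(source), r'paypal|sagepay|yourdomain')
ORDER BY sessions DESC
"""
```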

[00:14:27] Dara: Before we move on, this isn’t a mistake that we, or our listeners, might make, but an issue at the moment with the BigQuery export is that Google paid traffic is incorrectly showing up as organic traffic.

[00:14:40] Daniel: Yeah, one of many issues right now: the export doesn’t understand that traffic is Google CPC from the Source/Medium if there’s only a GCLID. So although you’ve got auto-tagging with Google Ads, and in the interface it says google / cpc and gives you the campaign information, the campaign parameters in the BigQuery export are probably going to say google / organic or direct, it’s not going to understand that it’s Google CPC. There are a number of nuances, shall we say, with the GA4 BigQuery export as it currently stands, and that’s probably one of the most annoying and frustrating things, one you’d think shouldn’t be there for Google Analytics’ flagship product right now. So right now, if you are doing anything in BigQuery, especially around campaign data: first of all, understand that sessions can have multiple sets of UTMs. You can have multiple different campaign parameters per session, it’s not one-to-one anymore like in Universal Analytics. But also, Google CPC isn’t currently accounted for, so you have to do a bit of fudging and look at the GCLID parameter in the export, which does exist. If that’s present, then it’s Google CPC; if not, then use the original values. It’s a bit of a workaround, but hopefully they’ll just fix that soon.
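
The workaround Dan describes looks roughly like this. A sketch (dataset name is a placeholder) that pulls the gclid out of the landing page URL and reclassifies those events as google / cpc, falling back to the event-scoped source and medium otherwise:

```python
# Sketch of the gclid workaround (dataset name assumed): if a gclid is
# present on the page URL, treat the traffic as google / cpc; otherwise
# fall back to the event-scoped source/medium the export gives you.
sql = """
SELECT
  event_date,
  CASE
    WHEN REGEXP_EXTRACT(
      (SELECT value.string_value FROM UNNEST(event_params)
       WHERE key = 'page_location'), r'[?&]gclid=([^&]+)') IS NOT NULL
    THEN 'google / cpc'
    ELSE CONCAT(
      IFNULL((SELECT value.string_value FROM UNNEST(event_params)
              WHERE key = 'source'), '(direct)'), ' / ',
      IFNULL((SELECT value.string_value FROM UNNEST(event_params)
              WHERE key = 'medium'), '(none)'))
  END AS source_medium,
  COUNT(*) AS events
FROM `your-project.analytics_12345.events_*`
WHERE event_name = 'session_start'
  AND _TABLE_SUFFIX >= FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY))
GROUP BY event_date, source_medium
ORDER BY events DESC
"""
```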

[00:15:41] Dara: You’re right, hopefully that second issue is just something that will get ironed out so the workaround won’t be needed anymore. But the first point you mentioned, and I don’t want to go off too much on a tangent here, is interesting and something I think a lot of people won’t be aware of. Maybe it’s not something to go into in too much detail on this particular topic, but this point about there being multiple source, medium and campaign combinations within a single, call it a session, you’ll probably tell me off and say it isn’t officially a session, but maybe it’s worth briefly explaining in a tiny bit more detail, and then it’s something we can pick up at a future date and talk about properly.

[00:16:19] Daniel: Yeah, for sure, and definitely a wider conversation to have, maybe with people way more clever than us as well. In Universal Analytics, if you were to use multiple different UTM parameters, it would restart a session every time you came back via a different set of UTMs, even a slightly different one. So there was always only ever one set of Source, Medium and Campaign per session. GA4 doesn’t do that. It just uses that 30-minute timer, a default which you can change, to identify and work out sessions. So if I click on a paid ad and go to the website, and then click through from an email within five minutes, that continues the same session, but now that session ID in BigQuery will have two different sets of UTM values. So for source, medium and campaign there’ll be two sets of those: there will be the first one, there’ll be the second one, there could be a third one. Think about it from a GA4 perspective, everything’s event based. Every event can have a different set of UTM parameters, and there is no session-level campaign data within the GA4 dataset. There is a session ID in the export, but there are no session-level campaign parameters.

[00:17:17] Daniel: So this is one of the nuances: the GA4 user interface has a session acquisition report, and trying to re-engineer that in BigQuery is actually really hard, one of the hard things right now within the industry. “How do I just get my campaign report in BigQuery?” is proving to be a lot harder than it would seem.
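
To get anything like the UI’s session acquisition report back out of the export, you have to pick one set of campaign parameters per session yourself. One rough approach (a sketch, dataset name assumed) is to take the first non-null source and medium seen in each session:

```python
# Sketch (dataset name assumed): collapse event-scoped campaign params down
# to one set per session by taking the first non-null values seen, ordered
# by event timestamp - roughly what the GA4 UI's session reports do.
sql = """
WITH events AS (
  SELECT
    CONCAT(user_pseudo_id, '-', CAST(
      (SELECT value.int_value FROM UNNEST(event_params)
       WHERE key = 'ga_session_id') AS STRING)) AS session_key,
    event_timestamp,
    (SELECT value.string_value FROM UNNEST(event_params)
     WHERE key = 'source') AS source,
    (SELECT value.string_value FROM UNNEST(event_params)
     WHERE key = 'medium') AS medium
  FROM `your-project.analytics_12345.events_*`
  WHERE _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
)
SELECT
  session_key,
  ARRAY_AGG(source IGNORE NULLS ORDER BY event_timestamp LIMIT 1)[SAFE_OFFSET(0)] AS first_source,
  ARRAY_AGG(medium IGNORE NULLS ORDER BY event_timestamp LIMIT 1)[SAFE_OFFSET(0)] AS first_medium
FROM events
GROUP BY session_key
"""
```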

[00:17:33] Dara: That was well done, trying to summarise that, it’s not an easy thing to summarise. I was going to say it had taken us off topic, but it is actually related to a common mistake: not fully understanding the data, how it’s being collected, how it’s working. That’s a really good example of one that relates to GA4, where something used to work a certain way, and people will have that set in their heads, but actually it’s a change you need to get your head around with GA4. Gone are the days of having a single set of UTM values apply to the whole session. Now, as you mentioned, it could potentially be different for every event that’s collected within that session.

[00:18:10] Dara: There’s other examples of this too. One that always bugged me was when people would try to add up unique users over different time periods. This is the one where, if you look at a month of data in analytics, you’ll get a truly unique count of users. And let’s, for now at least, not even go into what “users” means, is it people, what is this? But it will give you a truly unique count over that period of time. A common mistake is when people take, say, weekly or daily data, then add up the number of users for those days or weeks and think that gives them the unique count for the month. But that’s not a valid sum. You can’t add users that way, they need to be unique over the full time frame that you want to look at.
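
The arithmetic is easy to demonstrate with made-up IDs: anyone active in more than one period gets counted once per period in the sum, but only once in a true unique count.

```python
# Toy illustration with made-up IDs: summing weekly "users" double counts
# anyone active in more than one week; a true monthly count needs the union.
week1 = {"anna", "ben", "carl"}
week2 = {"anna", "dee"}
week3 = {"ben", "dee", "eve"}

summed = len(week1) + len(week2) + len(week3)   # 8 - the invalid "sum of users"
unique = len(week1 | week2 | week3)             # 5 - the real unique user count

print(summed, unique)  # 8 5
```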

[00:18:54] Daniel: I know very well how you feel about this, Dara. There’s been a couple of occasions where we’ve, never come to blows, but you’ve argued very strongly that we cannot sum users, even if it is 10 hours more work to do it the other way. But yeah, the whole uniqueness thing is such a complicated thing to explain to people that aren’t aware of it. And it’s not just users, there are other unique metrics in there too, but users is the common one. A couple of other things on that theme of people thinking they know what the number they’re looking at means when actually it means something else: this is the very hard thing within Google Analytics specifically, because there’s this kind of Google halo effect where people just assume Google is right. A lot of the time people look at Google Analytics and it’s like, oh, it’s a known brand, it’s Google, it’s a thing that I trust or respect, I assume it has correct data or is correct in some way, but actually it’s always nuanced.

[00:19:43] Daniel: First of all, you should be validating your data, validating it against a proper source of truth, to know, and to be confident, and have trust that it is correct. Especially when it’s purchase data or revenue data, for sure. But the stuff that’s a bit more nuanced is when it says, for example, new versus returning users, and shows it as a pie chart or a percentage split. Assuming that Google Analytics has an accurate view of that, and that it does what it says on the tin, can be quite damaging. The new versus returning user split is really just “is there a cookie already there?”, and setting a cookie reliably is a really hard thing to do, especially with ITP, ETP and all the browser nuances and technologies, let alone multi-device and multi-browser users. It’s just something that isn’t really accurate anymore. Then people see new versus returning and they’re still using that as a KPI, and I think this is where it can be really damaging, to action that and say, 10% of our users are new, wow, let’s go do something with that information, not understanding that the data itself is flawed.

[00:20:39] Daniel: It’s not just the new versus returning user split, either. If you go a bit deeper and look at things like attribution modelling, relevant now because GA4’s just pushed everyone onto data-driven attribution, the lookback window is actually 90 days, about three months. So an ad click from 80 days ago can convert today and get attributed value, but obviously your ad spend was 80 days ago. If you’re measuring ROI from that campaign, your cost exists two and a half months ago and your revenue exists today. How the hell do you marry that data up? And if you didn’t know that was happening, you’d just look at revenue today and say, oh look, this campaign has got this much revenue, there’s very little cost there, if any cost at all, wow, that’s performing really well, the return on ad spend is huge. But actually you’ve just got a load of negative ROAS a couple of months ago and a bunch of positive ROAS this month. It can be quite misleading, and that’s the worst of it: being armed with good data but a poor understanding could be the worst combination.
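
A toy illustration with made-up numbers: if you book attributed revenue back to the month the click cost you money, the per-month ROAS picture changes completely.

```python
# Toy illustration (made-up numbers): a click costs money in March but the
# attributed conversion lands in late May. Reporting revenue by conversion
# month makes May look great and March look terrible; re-dating revenue to
# the click month lines cost and return up again.
from collections import defaultdict

conversions = [
    # (click_date, conversion_date, cost, attributed_revenue)
    ("2022-03-10", "2022-05-29", 120.0, 300.0),  # converted ~80 days after click
    ("2022-06-01", "2022-06-03", 100.0, 150.0),
]

by_conv_month = defaultdict(lambda: [0.0, 0.0])   # month -> [cost, revenue]
by_click_month = defaultdict(lambda: [0.0, 0.0])

for click, conv, cost, rev in conversions:
    by_conv_month[click[:7]][0] += cost           # cost always sits on the click month
    by_conv_month[conv[:7]][1] += rev             # ...but revenue lands months later
    by_click_month[click[:7]][0] += cost
    by_click_month[click[:7]][1] += rev           # re-dated: revenue joins its cost

print("By conversion month:", dict(by_conv_month))
print("By click month:     ", dict(by_click_month))
```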

[00:21:37] Dara: Yeah, and I think Google Ads has always attributed back to the click rather than to the subsequent conversion, so it’s not just about understanding the data within the platform itself, it’s also how that compares against other systems or platforms you might be getting data from. You mentioned validating GA data, and that is something people don’t do often enough. It’s sometimes seen as something you do once, maybe as part of an audit, and then you assume those two systems are going to continue to be comparable. Obviously they won’t be the same, and you don’t expect them to be the same, but you expect them to be comparable within a certain margin of error.

[00:22:15] Dara: But you should be going back and rechecking that. The frequency might depend on how likely, or how often, something is going to knock them out of line, so it might be once a quarter, once a month, once every six months, it depends, but you should be validating on an ongoing basis. Because even if everybody understands that the GA data, for example, is going to be 10% lower, or 10% different, from your actual sales data, you can’t just assume it’s always going to be 10% different, because something could change and it might suddenly be 50% or 100% different. Even though you’re not going to use it as your financial reporting tool, you are going to use it to make decisions based on what’s driving sales. So if you’re not continuing to validate that data, you could get yourself into some really hot water and be making decisions based on very inaccurate data.
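
That ongoing check can be as simple as comparing the two revenue figures against an agreed tolerance. A sketch with hypothetical numbers; how you fetch each figure is up to you:

```python
# Sketch: compare GA-reported revenue to backend sales against an agreed
# tolerance, flagging days that drift. The inputs are hypothetical
# stand-ins for however you pull each number.
def validate_revenue(ga_revenue: float, backend_revenue: float,
                     tolerance: float = 0.10) -> None:
    if backend_revenue == 0:
        print("No backend revenue - nothing to compare")
        return
    diff = abs(ga_revenue - backend_revenue) / backend_revenue
    if diff > tolerance:
        print(f"ALERT: GA is {diff:.1%} off backend (limit {tolerance:.0%})")
    else:
        print(f"OK: GA within {diff:.1%} of backend")

validate_revenue(ga_revenue=9_020.0, backend_revenue=10_000.0)   # OK: within 9.8%
validate_revenue(ga_revenue=5_100.0, backend_revenue=10_000.0)   # ALERT: 49.0% off
```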

[00:23:03] Daniel: Yeah, for sure, that’s always the case with any change to a website, any browser change even. It might not be something you’re even aware of, or a release that you’ve done yourself. If Chrome just releases a new version and all of a sudden your basket stops working, that’s not something you would have been able to test. This is why it’s really important to validate intermittently, or just keep an eye on the compatibility side. It doesn’t have to be your fault that things stopped working; there’s all sorts of technology that gets rolled out and changed that can affect this kind of stuff, and it’s worth just checking in, again from a confidence perspective, a trust perspective. It’s hard to trust a dataset when you’re not entirely sure if it’s right, and that’s actually most of the time when people speak to us, they’re like, ah, we just don’t trust it. It could be right, it could be wrong, but we just don’t trust it. Help us understand, or help us trust, this data so that we can make decisions on it.

[00:23:53] Daniel: Sometimes the answer is, by the way, it’s all good, you’re good to go. Sometimes it’s, we’ve uncovered X, Y and Z, right.

[00:23:58] Dara: Yeah, and you made a good point about the different browsers, and actually something I’ve found very useful over the years: if you’re an e-commerce website and your checkout’s underperforming, either against what you expect it to do or against some historical comparison, often the knee-jerk response is to think there’s something wrong with the user journey or the design, and people jump to thinking, we need to do some AB testing, we need to redesign the checkout. It can be as simple as a change to a browser version breaking the e-commerce journey, and it’s so quick to check that in GA. Go into the browser report, look at it by conversion rate, maybe sorted using weighted sort, and find out which browser or browser versions have the lowest weighted conversion rate. It might be that you realise you’re getting a security error, or the checkout button isn’t visible, in the latest version of Chrome, and there’s actually no problem with your checkout as such, it’s just the design of it for that particular browser version. Then you can save yourself a lot of time and money by just correcting for that browser version, rather than redesigning your entire website or your entire checkout journey.
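
The GA4 equivalent of that browser report is a short query against the export. A sketch (dataset name assumed) of purchase conversion rate by browser and version; plenty of traffic plus a near-zero rate is your suspect:

```python
# Sketch (dataset name assumed): purchase conversion rate by browser and
# version over the last 28 days. A browser version with plenty of users
# but a near-zero rate points at a broken checkout, not a bad design.
sql = """
SELECT
  device.web_info.browser AS browser,
  device.web_info.browser_version AS version,
  COUNT(DISTINCT user_pseudo_id) AS users,
  COUNTIF(event_name = 'purchase') AS purchases,
  SAFE_DIVIDE(COUNTIF(event_name = 'purchase'),
              COUNT(DISTINCT user_pseudo_id)) AS cvr
FROM `your-project.analytics_12345.events_*`
WHERE _TABLE_SUFFIX >= FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 28 DAY))
GROUP BY browser, version
HAVING users > 500
ORDER BY cvr ASC
"""
```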

[00:25:03] Daniel: I think the last thing I wanted to highlight, and I see this time and time again, maybe because of our unique position of being consultants rather than in-house or part of, I suppose, a media agency, is documentation not being up to scratch. I don’t want to use these kinds of terms because they feel a bit loaded, but things like an analytics centre of excellence, an analytics suite, some kind of repository or documentation hub, whatever you call it, it doesn’t matter. The point is having what you’ve got implemented, why and how, from when, and who has access, written down somewhere so that it doesn’t ever get lost and someone doesn’t have to come in and start from scratch. Time and time again we’ve worked with really complex and amazing analytics setups, event tracking, GTM containers that are amazing, but whoever set them up didn’t document it or write anything down. They’ve left the organisation, or they’re an agency that’s been parted ways with, and then everyone’s gone, I don’t understand what’s there, I don’t know how this all works, and they get us or someone else involved and say, hey look, can you figure this out? Quite often you end up starting again, because it’s so complicated that you say, look, let’s just start again. Let’s do another framework session, let’s understand what it is you want to track, let’s now implement the tracking. Data gets reset and starts from square one. They’ve probably sat through three of these framework sessions already, but it was never written down anywhere.

[00:26:19] Daniel: So it doesn’t sound glamorous, and it’s not super sexy, but not having documentation is an easily avoidable mistake. Whoever is responsible for this: you set up a new custom dimension, put it in a Google Doc, put in the date it was added and the date it has good data from, put an annotation in GA, and move on. That’s all it needs. Having some reference points is really important, because it will not be you that maintains this forever. Whoever you are, if you’re the one that set it up, it will not be you. That’s specifically relevant for agencies and consultancies like ourselves; we won’t be working on this forever, so we need to make sure it can be picked up by someone else. I suppose it’s more for the people on the agency side or consultancy side: just be aware that documentation is key, because if you don’t document it, unfortunately, inevitably, all of your great work will be reset the next time someone comes along.

[00:27:08] Dara: Yeah, so true, and not just the documentation side of that point, but also the desire, or the urge, for analytics people to overcomplicate. Especially if you’re brought in to implement best-in-class analytics, it’s very easy to think that because you can do something, you should do it. Actually, the sign of a really good analytics team is knowing that it’s almost as much about what not to do as what to do. Just because you can do something doesn’t mean you should. We often see these overly convoluted setups when they’re layered over time, with lots of different people adding analytics over the years. So, hand in hand with documentation goes regularly reviewing what’s in place, and if nobody knows what something is or what it’s doing, then it has no place in the implementation. Make sure you’re regularly reviewing the implementation you’ve got, the setup you’ve got, the reports being fed by the data you’re collecting, and make sure it’s doing what you need and nothing else. If there’s any surplus, convoluted tracking, or filters, or tags in there that aren’t being used, get them out of there.

[00:28:14] Daniel: Yeah, I couldn’t agree more. The amount of times we spend hours building a Data Studio dashboard and then realise no one’s looking at it. Whenever I’m doing any kind of GTM work, training is a really good example, I’m telling people that as important as when something should go live is when it should be removed, because there’s this culture of adding and adding and adding and never removing when it comes to analytics. We’ve been involved with, and even added to, accounts where there are hundreds of Google Analytics events, and you think, wow, you could not be using hundreds of analytics events now. You might have 10, at most, that you’re interested in right now. The rest of them are things that maybe used to be important, and they’re still running, but you don’t use them anymore. Or they’re just, oh, it would be a good idea if we tracked every button on the website, wouldn’t it? Yeah, yeah, let’s go do that. And then all of a sudden the amount of data they’ve got is overwhelming and they just don’t know how to consume it.
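
A periodic “what is actually firing” audit can start from a query like this sketch (dataset name assumed): event names that have gone quiet, or that nobody can explain, are the candidates to strip back out of GTM:

```python
# Sketch (dataset name assumed): when did each event name last fire, and how
# often? Events that have gone quiet - or that nobody can explain - are the
# candidates to remove from GTM.
sql = """
SELECT
  event_name,
  COUNT(*) AS events_last_90_days,
  MAX(PARSE_DATE('%Y%m%d', event_date)) AS last_seen
FROM `your-project.analytics_12345.events_*`
WHERE _TABLE_SUFFIX >= FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY))
GROUP BY event_name
ORDER BY events_last_90_days ASC
"""
```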

[00:29:02] Daniel: Again, it’s these things to avoid. Be smart with what you’re doing, and I think this comes with a level of experience as well. More is more to a lot of people, until you realise that actually more is not more.

[00:29:12] Dara: That probably leads into a nice summary. So: regular reviews of your implementation and your setup; removing anything you don’t need; using the right features in the right way at the right time; not using the features you don’t need. It’s a case of less is more, or rather, the right amount is the right amount, maybe that’s the way to put it. It’s not about tracking less or tracking more, it’s about tracking the right amount. A lot of these mistakes can be avoided by regularly reviewing the implementation and making sure the people using the data understand it, and are actually using it. If you don’t understand it, figure out what it means; and if you’re not using it, and nobody else is, then potentially your setup could be streamlined.

[00:29:51] Dara: I’m sure there’s tons of others, so if there’s anything glaring we missed, get in touch and let us know if there are mistakes you’ve seen that are bigger, scarier or worse than the ones we’ve talked about. We’ve covered the ones we’ve seen time and time again, no matter how the technology changes. Even moving to GA4, we still see a lot of these issues; they may come up in a slightly different way, but it’s the same themes.

[00:30:12] Daniel: Yeah, I think what’s really interesting is that most of it is actually not due to a lack of technical expertise, it’s a lack of cultural awareness, or internal adoption, of analytics. It’s about people not knowing whether what they’re looking at is correct or wrong. It’s about accounting for analytics earlier in the process rather than too late. It’s about making sure it’s accounted for in the tech stack, so you don’t go and do things like consent management without analytics being considered. Very few of these common mistakes are about having implemented something wrong; if anything, some are about doing it too much. Like I say, the hardest thing out of all of this, in the whole industry, is changing corporate culture towards analytics, the adoption of analytics, and the understanding of the value it can provide.

[00:30:57] Dara: Okay, I think we’ve covered enough. We could probably come back to this topic again and cover another set of issues, and maybe again and again. Or let’s see how things shape up once GA4 is being used more commonly, and maybe we can do a GA4-specific episode on the common issues we’ve seen. But I think we’ve covered this enough for now, at least. So let’s switch gears and move out of analytics. What have you been doing outside of work in the last week?

[00:31:22] Daniel: The cinema, Dara, I’ve been to the cinema again, and it has to be Jackass Forever. It’s so much fun. It’s something that shouldn’t work as a film, none of them should have worked, but they are, all of them, amazing. So yeah, how about you, Dara? What have you been up to?

[00:31:37] Dara: Well, I’m going to give the wrong impression, I’m going to give everyone the impression that I’m really cultured, because for the second week in a row I went and did cultural stuff. Last week I talked about going to the ballet. This Sunday just gone, we went to two different musical performances. My partner Hannah played in the daytime at an art gallery as part of an exhibition. She played a harp set and also two violin sets, one a solo violin set, and the other playing alongside a guitarist. So that was really interesting, and there was art on show in the gallery as well, it was all a joined-up experience. Then in the evening we went to a performance by a classical string ensemble, which was really good as well. Then we went out for dinner and drinks with some friends, it was somebody’s birthday. So I had a fun day and night out in London, which was really good.

[00:32:24] Daniel: It does sound very cultured.

[00:32:26] Dara: Yeah, next week it’s probably going to be back to binge-watching something on Netflix.

[00:32:30] Dara: Right, that’s a wrap for this week. As always, you can find out more about us at measurelab.co.uk, or get in touch via email at podcast@measurelab.co.uk, or just look us up on LinkedIn. Do get in touch if you want to suggest a topic, or better still, if you want to come on The Measure Pod and discuss it with Dan and myself. Otherwise, join us next time for more analytics chit-chat. I’ve been Dara, joined by Dan, so it’s bye from me.

[00:32:56] Daniel: And bye from me.

[00:32:57] Dara: See you next time.

Written by

Daniel is the innovation and training lead at Measurelab - he is an analytics trainer, co-host of The Measure Pod analytics podcast, and overall analytics fanatic. He loves getting stuck into all things GA4, and has most recently been exploring app analytics via Firebase by building his own Android apps.
