#79 Quant and qual data, different but complementary
This week Dan and Dara chat about quantitative and qualitative data, the merits of each type, and how they need to be used together and not siloed away in different teams to get the most valuable insights. This topic was suggested via our Feedback Form by Hugh – thanks Hugh!
Enrolment is now open for June’s cohort of the GA4 Immersion 6-week training, with early bird pricing at 25% off!
The Marketoonist is a must-follow with regular funny cartoon strips – like this one Dan mentions on the podcast.
In other news, Dan is always sunny and Dara snookers themself!
Intro music composed by Confidential – check out their lo-fi beats on Spotify.
If you like what we’re doing here, please show some support and leave a rating on Apple, Spotify, or wherever really.
Quote of the episode from Dan:
“…you’re working with a marketer and they’re all about driving all the traffic to the website. It’s like, oh, this isn’t converting, let’s put our spend over here. But often you might find that, well, maybe it’s the website that’s crap and maybe it’s the website that could be optimised, there’s nothing wrong with the marketing.”
Quote of the episode from Dara:
“…optimization is often used wrongly in a very narrow sense. It’s like, oh, you know, if you’re talking about optimization, you’re talking about, you know, landing page optimization or checkout optimization or whatever. But it isn’t, it’s the whole thing, you don’t just look at it from one piece of the journey.”
The full transcript is below, or you can view the Google Doc.
[00:00:15] Dara: On today’s episode, we talk about quantitative versus qualitative data. We talk about the merits of each and give our views on how they should be used together, and also talk about how changes to privacy legislation are affecting both.
[00:00:26] Daniel: This was submitted by one of you wonderful listeners. So if you want to do the same and get us talking about something that you are interested in, or want to come on and talk to us about it, then fill in the feedback form in the show notes. So click the link in whatever app you are listening to and follow the link through to a form and get in touch with us that way. Or you can always find me and Dara floating around on LinkedIn and Twitter as well, if you want to do it that way instead. Last thing to mention is we’re filling up seats for our Google Analytics 4 Immersion training course very quickly for June. This is the last training course you can get in on before they switch off Universal Analytics. We’re actually going to be doing the training over the magical date of the 1st of July, when they turn off Universal Analytics. So join us if you would like to grab the last couple of seats for that. Head over to measurelab.co.uk/training and check out the details there for availability and pricing. And maybe we’ll arrange some kind of online party to see off the 1st of July, who knows? Anyway, enjoy.
[00:01:16] Dara: Hello, and welcome back to The Measure Pod, a podcast for analytics and data enthusiasts. I’m Dara, I’m CEO at Measurelab.
[00:01:23] Daniel: And I’m Dan, I’m an analytics consultant and trainer here at Measurelab.
[00:01:27] Dara: So the topic we’re going to discuss today is something that’s been submitted through our contact form by somebody called Hugh. So Hugh suggested a really good topic for us to discuss, which is the differences, the pros, the cons, the merits of quantitative versus qualitative data or analysis. This is something we’ve probably discussed ourselves, Dan, and we’ve undoubtedly talked to lots of clients about in the past. It is a really interesting topic, and it’s probably two areas that should actually blend together a lot better than they often do in practice. So that’s probably something that we’re going to talk about too. But maybe to kick things off, I’ll pose you a question: in your view, what are we talking about when we talk about qualitative versus quantitative analytics? Especially in our kind of space, where we’re talking about web and app.
[00:02:13] Daniel: Good question, Dara, and thank you for submitting it, Hugh. The thing for me is that there’s always this perception that there’s like a fence, and you’re on one side of the fence or the other, you’re on the quant side or the qual side. That’s not the reality, there is no big divide or gap separating the two things. But to answer your question, well, what do we mean when we talk about qual or qualitative data? For me it’s just anything that doesn’t fit in a spreadsheet, or maybe is just really hard to put into one. If I can’t put it down into a tabular form, then that’s qualitative data. So it’s going to be things like free-form survey results, or interviews, or a lot of kind of in-person or remote UX testing. Heat maps, scroll maps, you might have come across things like Microsoft Clarity or Hotjar or SessionCam, these kinds of tools from a digital perspective at least. They’re a qualitative data set whereby they’ve got lots of interesting stuff to say, it’s just not as black and white as a Google Analytics or Adobe Analytics, which can output stuff into a data table very easily I might add. So there is grey area, but if you can’t put it into a spreadsheet, then it’s qualitative.
[00:03:12] Dara: Yeah, I’d agree with that. And obviously, for the sake of simplicity, that’s a pretty good distinction, but there’s a lot of overlap as well. So with a lot of the qualitative tools, you can export data from them too. So if you only like to work in a spreadsheet, you can still pull data out of qualitative tools, and vice versa, some of the quantitative tools do provide some elements of qualitative data as well. Which reminds me of the pretty poor attempt that Google Analytics made at including some qualitative data back in the day. I was trying to remember what it was called, was it ‘In-Page Analytics’? Is that what it was called?
[00:03:46] Daniel: Yeah, that was it. They demoted it to a Chrome extension, didn’t they? And then they eventually, sort of got rid of it completely.
[00:03:51] Dara: Yeah, and it just had so many different issues, like it would not overlay properly and it would often break. And then the biggest issue of all was that it wouldn’t differentiate between different clicks to the same destination. So you could have a big button that links off to a page, and then you could have a tiny link down the bottom, and it would give them both the same percentage.
[00:04:11] Daniel: Yeah, that was its click mapping, right?
[00:04:13] Dara: Yeah exactly, and it was always kind of, I don’t know if it’s died down a bit now, or maybe I’m just a bit less close to these conversations, but back then, when Google had that, again, I’m going to be mean here, that pretty poor attempt at some kind of qualitative click mapping or heat mapping data in there, there was a lot of talk about why Google didn’t acquire somebody like Hotjar, or Clicktale was another one that I was using a number of years ago. Why didn’t Google acquire one of these tools and fully integrate it into GA? And maybe the reason why is back to what you were saying about people wrongly thinking there’s this fence between the two, and it’s a case of people staying in their lanes. Maybe that’s the right thing as well. Maybe you shouldn’t get both from the same tool. But there’s no reason why you can’t use the two data sets, or the two tools, together.
[00:05:02] Dara: And I’ve done that in the past successfully many times where you might use something like GA or another analytics, kind of quantitative tool, to try and help you figure out where the problems are and quantify those problems. And then you might use something like Hotjar to then try and get a better understanding of why that’s happening. So the two should work well together, in practice they often don’t. And my number one theory for that is probably the same as what you’ve suggested, which is that, you know, people kind of tend to just feel like they have to, for one reason or another, stick on one side of the fence.
[00:05:33] Daniel: I think it’s very siloed, and a lot of it goes alongside team structures too, or at least traditional team structures, maybe less so now, whereby you might have a data or analytics team, and that will almost exclusively be quantitative analytics and data. It’s not explicitly said in the name, but it’s assumed. And so it becomes more of a science, you know, the science teams, the maths teams, they’ll have the quant data. And then often you might have more CRO or UX-based teams, or design teams, that might have the qualitative data, like the Hotjars and the SessionCams and the ones with the heat maps and the scroll maps. And I think there’s often a perception there that, well, that’s the numbers people, and that’s the design people or the experience people, and these are the tools that we use.
[00:06:13] Daniel: But as you said, there’s huge overlap between them. And I think it all comes down to adding in, or filling in, gaps in the narrative that you are getting from the data. It’s all a narrative, it’s a story that we tell, right? And I think with Google Analytics and equivalent tools, when we’re talking about quantitative data, it’s very much about counting things, right? Very clever counting sometimes, but it’s just counting the number of times something happened or didn’t happen. So it’s just quantifying, I suppose that’s where the whole word comes from, it’s just quantifying what has happened, but not why. We can’t tell you why things have happened, and over time you can get really clever at inferring things, and you can start using things like machine learning to recognise patterns to understand why.
[00:06:51] Daniel: But out of the box, we’re looking at, this is a factual thing that’s happened this many times. You know that 10 purchases happened and 5 times someone abandoned a basket. But it’s not telling you why, and I think that’s what the qualitative data tools are really good at. They’re not necessarily good at quantifying how many times something’s happened, but they can tell you why something happened, or happened less effectively, or maybe this time it didn’t happen because of X, Y, Z. And I think this is really where we join those two pieces together. Actually, it reminds me, it’s often when we’re working with marketing teams and you’ve got a web team or a CRO team, and you’re working with a marketer and they’re all about driving all the traffic to the website. It’s like, oh, this isn’t converting, let’s put our spend over here. But often you might find that, well, maybe it’s the website that’s crap and maybe it’s the website that could be optimised, there’s nothing wrong with the marketing. Or vice versa, you know, the page might be working perfectly, but the quality of the traffic from the marketing is pretty bad.
[00:07:40] Daniel: And I think there’s this idea of, I’ve done my job, now it’s over to you. It’s like a relay race. But actually it’s just one thing, right? We’re all working towards the same end goal within the same organisation, or at least hopefully. And I think this is where the quantitative and the qualitative user research, the user analysis, the user data, is going to be really important. And you touched on something, Dara, around the way they work together, because it’s easy for me to say, yeah, they work together, but how, and why, and what the use cases are is going to be the hard thing if you’ve only used one and not the other. As you said, the data side, the quant stuff, can be really good at finding issues or successes, but understanding why, what to change, what to do, or where exactly the issue might be, that’s where the qualitative stuff comes in.
[00:08:18] Daniel: So I wouldn’t suggest, for example, if we’re talking about session recordings or heat mapping, looking at every session ever on your website. But I would, based on the quantitative analysis, categorise or segment the different types of sessions that are of interest, and then do a watching session of those. And I think that’s where we tie the two together. You identify groups or patterns in the data, then you go look at those and come up with the ideation, or ideas, around what to improve, what to change. And then it goes back to the quant to measure the impact of your change, and it’s a cycle, right? It goes round in cycles. Okay, we’ve identified something, we’ll go figure out what we need to do, we’ll go do something, we’ll measure the change, we’ll go back round again. And often it’s the way that we put these things into teams, like analytics teams or data teams or UX teams, but the word optimization is so broad and so ambiguous, and it’s all about optimization.
[00:09:09] Daniel: It doesn’t matter what you’re optimising, whether it’s websites, marketing campaigns, experiences, revenue, you know, we’re all about optimization, and they all tie together in some way, using these two avenues. I’ve talked a lot, I’m very aware of that. So I’ll hand it over to you now, but that’s my thoughts in a nutshell.
[00:09:24] Dara: I agree, and I think you’ve hit on a really key point there, which is that optimization is often used wrongly in a very narrow sense. It’s like, oh, you know, if you’re talking about optimization, you’re talking about landing page optimization or checkout optimization or whatever. But it isn’t, it’s the whole thing, you don’t just look at it from one piece of the journey. And it’s the same thing with using the data, you don’t just look at your analytics data, because, like we’ve covered, it helps you quantify a problem that you’ve identified, but it doesn’t always give you that extra layer of insight, or it doesn’t necessarily tell you what to do next. It might help you come up with a few hypotheses that you could then test, but if you just look at it through that one lens, then you’re going to miss all of the things that those hard numbers don’t tell you, and that’s where you can use something like screen recordings.
[00:10:15] Dara: And I agree with you, I think where I often saw it go wrong when people were using something like Hotjar, or any other screen recording tool, is that the heat maps were good in that you could go in and look at the heat map, and it was still aggregated data across all the users that were tracked. But the screen recordings were often where you would really find what was happening. People were put off because they thought they had to just go through and watch video after video after video, but of course you don’t. You can narrow your search down, you can filter, and you can hone in on those journeys that you think are going to help you uncover why something is happening. And then I’d add a third one, and there are going to be way more than three, but a third one that came straight to mind: as opposed to screen recordings, doing remote user testing as well. A drawback of the screen recordings is you don’t necessarily know, well, you probably don’t know at all, what the motivation of the user was, you don’t necessarily even know what they were trying to find.
[00:11:08] Dara: And when you’ve watched those screen recordings, like we both have, you often see very erratic behaviour, and you see people clicking around to lots of different pages. Say it’s an ecommerce site, you’re not really even sure if they’re looking to buy something or if they’re just looking to compare products. You don’t necessarily know what the motivation is. So if you can do pretty tightly controlled user testing, and ask users to perform a certain function on the website, you can find issues that you wouldn’t have found through either screen recording or just looking at the analytics data. And like I said, they’re just three things you can do. Then you’ve got surveys and lots of other stuff too. But if you want to improve the end result, you’ve got to look at it from every angle. You can’t just go in with blinkers on and think, I’m just going to log into GA or Adobe Analytics or whatever, and I’m going to figure out all the problems and then go and fix them. It doesn’t work that way.
[00:11:56] Daniel: I completely agree. And I think there’s an infinite variety of qualitative data that you can use, and almost a finite, very small amount of things to count, right? So yeah, it’s definitely a broader spectrum on the qualitative side. Focusing very much on the web side, or the user behaviour side on websites, for the moment though, using things like Hotjar, and you mentioned watching the sessions and stuff. One of my favourite things that I think has only ever come up from using qualitative data is this notion of challenging assumptions, or disproving or proving assumptions.
[00:12:24] Daniel: So often you’ve got this idea of the HiPPO, right? The highest paid person’s opinion in a room. And often we’ve been in scenarios where there’ll be someone that’s like, I know how this works, I know how people use our website. I know how X, Y or Z happens because I’ve done this for long enough, or I’m important and I get these things. But you know, when you watch someone struggle to find where the add to cart button is, and everyone’s like, well, obviously it’s just there. You’ve got a bunch of people that have been in the industry for a while, work with that company and have built the website, all sitting around going, well, it’s obvious. But the reality is, until you put it in front of someone for the first time, whether it’s through user testing or through something a bit broader like session recordings, you don’t actually see that people can’t find it. Or, for example, 90% of your traffic’s on mobile, and when you look on a mobile, people are struggling to find where the information is because you’ve designed it on desktop.
[00:13:06] Daniel: And it’s one of those things where, even if you try to use all the quantitative data in the world, you’ll find a way to prove a point. You can make it sing to your own tune. Whereas with the qualitative data, you’re literally showing them: this person can’t find it, look at them clicking everywhere but the add to cart button, it’s impossible. And you can see them leaving, right? You can see this happen over and over again, you can get multiple examples, and I think that’s something that’s so powerful, it really resonates a bit harder than numerical, or at least quantitative, data. And I think that’s my favourite part of this whole process of integrating the qualitative stuff alongside the quantitative. I can find that there’s a bad conversion rate on mobile, and I can find that it’s on these certain devices, whatever. But then to say, look here, look at them struggle, this is why it’s happening, look at this. I think that’s the real powerful, aha moment you can get from these kinds of things.
[00:13:51] Dara: A great example of that that I saw, it was an ecommerce website, quite a complicated website. There were lots of different sections to it, selling very different products. And the user testing basically asked pretty simple stuff, like, go on and find these two different products from these two different sections. And the majority of the users, instead of using the site search or the navigation on the site, went back to Google each time. So let’s think up a hypothetical example now. Let’s say it was a gardening website, and the first thing they had to buy was a plant pot. They would Google plant pot, go through to that section, try and find the product, and then the next thing was a rake, and they went back to Google and searched for it again. So the company was looking at it all wrong. They were trying to figure out why people weren’t effectively using the site search, or whether the navigation was good enough, or whether they needed to improve it. And the reality, the surprising user behaviour, was that people were just thinking, I’m not even going to bother to use the navigation, I’m just going to go back to Google. So they’d search for the brand and then the product they were looking for.
[00:14:53] Dara: So the screen recordings won’t catch you leaving the site, or rather, they’ll show you leaving, but won’t show what you’ve done next, and then they’ll potentially track you as a new session when you come back in. And the same in analytics: if you’re going back and searching with a different keyword, you’re going to trigger a new session when you reappear back on the site. So it was messing up all of the other data sources, and the only way to find it was by actually looking at what users genuinely do, and then it showed this slightly odd behaviour. Maybe it isn’t odd, maybe it’s actually more effective.
[00:15:21] Daniel: Maybe, yeah. I mean, Google invests more than any other company I know into their search capabilities, right? I’ve got so many fun stories like that. One of my favourites that comes to mind is a company I was working with, from a consulting perspective that is, and they had a one and a half hour meeting every week to discuss what was going to be cycled on their homepage, right? So they had content blocks, and it was all about fighting for, like, my product suite. You know, you had merchandisers there, you had content people there, it’s this whole thing. And then we managed to show them that, I think it was something crazy, like only 20% of all the traffic to that page even saw it, it was below the fold. And as soon as you say that, you know, you only get this much traffic… because a lot of people, and this is not necessarily pointing the finger at these people, I think it’s a maturity thing, but a lot of people still use things like page views as a KPI, right? Like how many times has this page been viewed. But like we said, that’s the quantitative side, that’s counting.
[00:16:12] Daniel: But then the qualitative side is saying, well, look, okay, that is true objectively, but at the same time only, you know, 10% of those even scrolled down beyond the fold to see the content, regardless of what device they’re on. And it’s these kinds of things that, all of a sudden, completely blew that weekly recurring meeting out of their calendar, which they loved, or at least the people I was working with did, because it’s like, why are we having this argument every week? Actually, we could just worry about something different. And it was another thing which is quite common actually, or at least it used to be more so than now. A lot of people still designed and developed websites for desktop devices, or on desktop devices, not realising, or maybe realising but not putting two and two together, that if 80% of their traffic is on mobile, they should be designing for mobile and making it desktop responsive.
[00:16:52] Daniel: But when it comes to working on their website, they would always be on their desktop because they’re at work, right? Or they’re on their laptop, and so they never truly opened the mobile version of the website to experience it that way. And I think this is where the qualitative and quantitative can join together and say, look, here’s what’s objectively happening, let’s go have a look at those, and I think this is where the segmentation happens. But yeah, disproving myths, hypotheses, assumptions, I think this is a really good way of doing it. But when we talk about digital analytics in the broadest sense, we’re still looking at things like identifying users, especially with things like session recordings, we need to define a session, which needs to define a user. And so cookies, consent, legalities, even ethics come into play, in terms of, are you letting your customers know that you’re recording them when they come in? Like you would with a CCTV camera, right? You wouldn’t just secretly record them, you’d say, hey look, smile, you’re on camera, or something like that.
[00:17:41] Daniel: So what happens in the future? We’ve talked a lot about the quantitative side and the Google Analytics side, and some of the solutions that Google are releasing, like Consent Mode and things like that, to account for it. What’s going to happen with the other side of it, the qualitative side, the Hotjars and things like that?
[00:17:53] Dara: Ooh, you’ve asked me that as if you expect me to give you a very detailed, technical answer. I agree that’s an important question. They are obviously going to be affected in all of the same ways, although we’re more familiar with navigating through these issues from an analytics perspective. But yeah, all of the same legislation will apply to anybody who’s doing any of this kind of heat mapping and screen recording. And on the survey side, I’m going to go off on a slight tangent and then come back in again, but I was just thinking before you asked that question about how, and this is probably because we’re analytical, but when we’ve been talking about qual, we’ve been focusing more on the stuff that’s maybe qualitative compared to pure analytics, but is still numbers based.
[00:18:35] Dara: So with things like Hotjar, you’re still collecting click data, it is still quantitative data, it’s just presented in a more visual way. And then you’ve got tools like Heap, which can track everything that happens, so every click that happens in the browser gets tracked. You could build a heat map from that, but it’s not really qualitative, or, well, it is, but it’s quantitative and qualitative, it sits across the two. But on the true or pure qualitative stuff, like surveys, again, you’re obviously going to have to have permission. And there’d be a difference maybe, because obviously you’re going to have your consented, trusted users, who you know are potentially already customers, and you could reach out to them and get survey data from them, and that’s fine. But what about the anonymous users, who you probably really want that feedback from? So I don’t know where that’s going to sit in terms of having to gain consent to then survey people.
[00:19:26] Dara: And just like with cookie banners, if there’s an increase in surveys popping up, then people in general are probably going to get pretty dismayed with that, and are probably going to disengage and not want to be surveyed. I’m not really answering your question, I’m just saying more things. But I guess where my mind was going is, are we going to see an increase in things like exit surveys, or even just general onsite surveys, because of these gaps that are being plugged with machine learning? So you’re not necessarily going to have the full trust that you maybe did in some of the quantitative data in the past.
[00:19:59] Daniel: Trust is going to play a huge part across all of this, right? I was thinking as you were saying that, there’s two types of data, right? Two types of users. So we’ve got the anonymous users and then we’ve got the customers. It’s just like going into a store and having a loyalty card, going into a Tesco or something with a Clubcard. You’ve got the people that will never get a Clubcard, they’re just going to walk in and walk out and buy their stuff. Some people are going to walk in and be like, here’s my points, I’m going to sign up and get the scanner thing at the door, so that you can know exactly where I’m going and what I’m scanning and everything. And I think there’s an element of trust there, you know, like the people that accept cookies on websites. I don’t know if people act the way I do, but for me it’s based on the brand and the trust. So I will happily let you track stuff if I respect and trust the brand. But if you’re some random news site that’s just going to do whatever you’re going to do with the data, and not be very clear about it, I’m absolutely not going to do that.
[00:20:44] Daniel: Especially if it’s something that I’m only visiting for this piece of content, and not in any way regularly associated with the brand. And so you’ve got the people that accept these cookies, the people that are going to be tracked in your quantitative data, and they’re going to be the ones making up your qualitative data too, in the SessionCams, the heat maps, everything else. And so their behaviour is going to be so different to the behaviour of an anonymous user, but they’re the ones we can’t track. Yes, there are gap filling mechanisms like machine learning, but that’s not going to help anyone really, it’s not going to truly get the qual side. You’re not going to use machine learning to get the heat map, or the session recording, of users that didn’t consent, right? I’m wondering how much time there is before you have to change everything. I’m just thinking in terms of processes, like we’re talking about finding opportunities through quantitative research and then qualitative analysis.
[00:21:27] Daniel: But the thing is, that’s one skew of your customers, that’s one version of them. How the hell are we going to access the valuable nuggets of insight from the unconsented users? How the hell are we going to know what drove them away, or why they didn’t buy, or did buy, given that they didn’t consent? It’s going to be an interesting future for all of the industries, how much the skew, the audience skew, there is going to be towards those that have consented, that you have the data for. And I’m wondering how much of a bad thing it could be if you run, and I’m not saying everyone is, but if you run with the assumption that everyone visits your website like these people, and you change things according to the people that already like your brand, how much damage that could potentially do by pushing those other people further away. It’s an interesting conundrum, I don’t have any answers, it’s just a question that’s come up.
[00:22:11] Dara: You’re right, if you just keep basing it all on the people that are your ideal segment, then you’re just going to narrow your view down further and further and further. If anyone listening has any views on this, like where is that qualitative data going to come from in the future? Are there other ideas and creative ways of gathering that feedback from users who aren’t consenting to being tracked?
[00:22:33] Daniel: Maybe it’s a bit like how we have been talking lots about marketing analytics and Google Analytics and, you know, the shift away from, for example, multi-touch attribution to media mix modelling. These are technologies and ideas and approaches that have existed from the very beginning, pre-digital analytics. You see a lot of this kind of full circle stuff, going back into these ideas of, you know, econometric modelling in a sense, but with a fresh lens on it. And these are ideas that have been around since, you know, people started advertising, when it was all billboards, and even into the TV and radio days. So I’m wondering if there’s a parallel on the qual side, which is like, okay, this technology is kind of like our multi-touch attribution. Things like session recording is just a technology that’s going to disappear over time, we can’t keep using it and it’s going to be of limited use. So maybe there’s more of a thing going back into surveys, going back into NPS (net promoter score).
[00:23:24] Daniel: Just getting NPSs scores on the websites through in un-invasive popups and things like that, you know, rather than these big exit polls or surveys and things like that. Or maybe it’s just to their CRM and their email database and I’m wondering if there’s like a shift in ideas and approaches and tool sets alongside, just like there is in the kind of MarTech side alongside this. If you are in this world, then let us know, we’ve got that feedback form in our show notes, click the link into whatever app you are listening to and it’ll be there just like you did and made us talk about this today. But yeah, I think this is an interesting perspective and how it’ll, this kind of almost like regression back into older technologies, but they are almost new technologies for us because we haven’t used them before in a very long time.
[00:24:01] Dara: Yeah, and things like focus groups and, again, remote user testing. These will still work because you’re bringing in people, you’re getting the consent that you need. You’re just not getting the people who anonymously visit the website, where you don’t know why they’ve come to the website and why they’ve left. But that’s no different, again, to the real world. You only get to learn information about either customers or people who agree to talk to you and give you survey data in the real world.
[00:24:25] Daniel: Do you ever look at the Marketoonist? The cartoon strips. Ah, they’re so funny. Check ’em out on LinkedIn, and they’ve got a website too, I’ll put a link in the show notes. It actually reminds me of a recent one they released, a cartoon strip around this kind of user research testing. It’s two people sitting together behind one of those one-way mirrors, with a bunch of people in a room doing the user research who they’ve pulled in from the street, and this person says, this decision is so important, let’s leave it up to whoever we can find willing to give up their afternoon for free snacks and a Starbucks gift card. I’ll put a link in the show notes, but it’s worth checking them out on LinkedIn and following them. But yeah, it’s just, it’s that kind of stuff, isn’t it?
[00:24:59] Daniel: Again, these ideas of having a representative sample were so important even then, you know, before there was the digital analytics side or the digital qual or quant analysis. Whereas we’re kind of moving away from that again now. We used to have everyone, so we could get, in a sense, your total customer base’s views, that voice of customer being the full extent. Whereas now the voice of customer stuff is only going to be for the customers that are, in a sense, already relatively engaged and already like your brand, and maybe that’s going to skew the voice of the customer more positive, right. So going back into these new/old approaches and technologies like econometric modelling or surveys and things like that, we’re just going to have to be hyper aware of the impact on representation and on the voice of the customer. We’re never truly going to get the broader sense of the voice of customer again, and maybe that’s something we just learn to live with.
[00:25:49] Dara: I mean, there’s probably some creative marketing campaigns, plus some incentivization maybe, that could get people to complete surveys, so you could reach out to people who aren’t customers. Again, you run the risk that you’re never going to know how clean that data is, but that will be the same with any survey. But I’m sure there are examples of people who are being very clever and getting that feedback from both customers and non-customers through a mixture of clever marketing and some incentivization.
[00:26:17] Daniel: Well, I think we’ve covered quant versus qual, or at least talked about where we think it’s going and where it’s come from. But again, if you are that person that knows this inside and out and wants to come and talk to us about it, we’d love to hear from you. So check out that feedback form in the show notes, please do reach out. But onto other news, Dara, and more important news than the quant versus qual debate, how have you been winding down? What have you been doing outside of this world of analytics to get away from everything?
[00:26:40] Dara: Well, I mean, how long do we give it? We’ve both enjoyed the World Snooker Championship, so I think we should probably do a podcast episode just on that. Well, I was watching it last night and my partner Hannah came and sat down for about 20 seconds and then said, wow, this is even more boring than I thought it was, and then she left. So you either love snooker or you really, really don’t. It seems to be a bit of a Marmite thing, I don’t know anyone who’s kind of like, yeah, I could take it or leave it.
[00:27:09] Daniel: Oh, for sure, for sure. You either make the time for it or you don’t. And I like the kind of slow, quiet, calming nature of it. It’s almost quite therapeutic actually, meditative is what I meant. It’s quite meditative in that state where, you know, everyone’s dressed smart and it’s very silent and it’s very calm and it’s very slow. One person at a time, no talking. It’s nice, I like it, but it’s not for everyone. Some people want a bit more action, or stuff happening, but I quite like the tactical battles. But enough about snooker, I do have something to mention. Recently, it was last week or the week before, I can’t remember now, time is a blur, I went to see the It’s Always Sunny in Philadelphia live podcast show in London at the Royal Albert Hall, which was incredible.
[00:27:48] Daniel: So a podcast about a TV show that’s been going for 16 years sold out the Royal Albert Hall two times. And we went to go see it and they just sat on stage talking about stuff, and it was just a live podcast. It sounds so lame when you say it out loud, but it was really fun, really entertaining, they had a bunch of clips from the show, they did loads of games and stuff and yeah, it was an incredible experience actually. Yeah, I can’t wait for the new series coming out in a couple of weeks.
[00:28:10] Dara: I feel a little bit like the people who don’t like snooker now I’m sitting here thinking, I don’t get it. What did they do? How is that a live show?
[00:28:18] Daniel: Well just, you know, what would we do if we had a live show? So think of it that way. You would have to come up with something that’s a little bit more interesting than me and you picking our noses in, you know, an audio format where no one can tell what we’re doing.
[00:28:29] Dara: Speak for yourself.
Dara: That’s it for this week. To hear more from me and Dan on GA4 and other analytics related topics, all our previous episodes are available in our archive at measurelab.co.uk/podcast. Or you can simply use whatever app you’re using right now to listen to this to go back and listen to previous episodes.
Daniel: And if you want to suggest a topic for something me and Dara should be talking about, or if you want to suggest a guest who we should be talking to, there’s a Google Form in the show notes that you can fill out and leave us a note. Or alternatively, you can just email us at firstname.lastname@example.org to get in touch with us both directly.
Dara: Our theme is from Confidential, you can find a link to their music in the show notes. So on behalf of Dan and I, thanks for listening. See you next time.