
#137 Which LLMs stand the test of time?

Will Hayes · 20 February 2026

In this episode of The Measure Pod, Dara and Matthew ride the wave of technological change, diving into the fast-moving dynamics of AI model competition and what it means for the industry. They explore how tools like Claude Cowork are reshaping the way developers write code, and reflect on the broader trend of removing friction from technology - touching on recent innovations like OpenClaw. The conversation shifts toward the enterprise world, unpacking governance and security challenges in SaaS before turning to Google's evolving AI strategy, its infrastructure advantages, and the role of Tensor Processing Units in giving it a unique edge.


Show notes

More from The Measure Pod

Share your thoughts and ideas on our Feedback Form.

Follow Measurelab on LinkedIn


Transcript (AI-generated)
"You are only limited by your own imagination." Dara
"If that bubble burst, you could see someone like OpenAI collapsing because they just don't have the revenue to support what it is they're trying to do." Matthew


[00:00:00] Lizzie: Hello and welcome to The Measure Pod by Measurelab, a podcast dedicated to the ever-changing world of data and analytics, with your hosts, Dara Fitzgerald and Matthew Hooson. Between them, they've spent more years than they'd like to admit wrestling with dashboards, data quality, and the occasional Google curveball.

[00:00:32] Lizzie: So join us as we share stories about how analytics really works today and where it might be headed tomorrow. Let's get into it.

[00:00:41] Dara: Hello, and welcome back to the Measure Pod. I'm Dara, joined as always by Matthew. Matthew, hi, how are you today?

[00:00:53] Matthew: This week. Alright. I'm surviving. I've got a cough. That's my health update. 

[00:00:55] Dara: I feel like, I feel like this is a theme. You need to stop mentioning how weak and sickly you are.

[00:00:58] Matthew: I know. It's the two young children that keep bringing diseases home continuously.

[00:01:05] Dara: Yeah, I hear they do that. 

[00:01:07] Matthew: Yeah. Just in case I splutter halfway through talking about something, that's why I'm mentioning it. But apart from that, I'm okay. How are you?

[00:01:15] Dara: So, yeah, I'm okay. I was going to, I was going to ask an extra qualifying question. We, you know, we were always saying about this like existential dread, and then on the other side you've maybe got this like, enthusiasm and, and, and drive for what's happening in the industry. I don't even know if it is a scale. I think they're maybe not on opposite ends, but what's your, where are you on the fizz versus dread scale?

[00:01:38] Matthew: I, I think I've just sort of settled into a sense of inevitability with everything. Like, I kind of have a theory of where it's going down the line, in years to come, and that there's not a huge amount that can be done to change that, and to just work with it and ride the wave and do good work.

[00:02:01] Matthew: That's kind of where I'm at. It's like, I dunno. Peace. Contentment Nirvana, how about you? 

[00:02:10] Dara: Yeah, I think that's a good way to put it. I, yeah, I think right. Ride the wave. 

[00:02:14] Matthew: I think. Well, I think you've been fizzy. 

[00:02:16] Dara: I, yeah. I'm maybe a bit fizzy. I was going to ride the wave. I don't actually know what it is.

[00:02:20] Dara: Is that, what's the thing surfers do? The hacker, is it the hacker? 

[00:02:23] Matthew: I dunno. 

[00:02:23] Dara: Hacker. Hacker. Dunno, dunno. I shouldn't, I shouldn't try and do things that I don't understand. 

[00:02:27] Matthew: No. 

[00:02:28] Dara: Anyway, ride the wave. Yeah. I am a bit fizzy. I think you're right. I'm a bit fizzy, Fizzy Fitzgerald, just swept up in it. And this is, we do love a segue, but the danger is we segue too early.

[00:02:41] Dara: So this is going to be a segue for later, but this is related to what we're going to be talking about today. All the moving and shaking and everything else, and riding the wave. But yeah, I'm, I'm excited. A bit like you, I'm having a, you know, a clarification. I feel like I'm seeing a clearer and clearer picture of where things are going.

[00:03:03] Dara: Obviously nobody knows. I mean, no one ever knows where things can end up, but I think more than ever, nobody really knows for sure. But it's exciting and I think just being part of it and just being swept up in it is quite exciting. 

[00:03:20] Matthew: It's a very interesting time to be alive. It does, it is, it is. 

[00:03:23] Dara: It does make being a podcast host. More difficult though. I'm sure you'll agree because even listening back to the last one that we did, which was, what, 25 years ago? Was it? 26 years ago? Something like that. Yeah. It already feels like there's so much more that we could have included or so much that's changed. So that's probably a sensible start.

[00:03:42] Dara: Let's, let's just take a little step back. It's probably worth saying, right, well, what's actually happened? What are the kind of key things that have happened since the last episode came out? What's the, what's the big news? We obviously couldn't possibly cover everything that's happened, but what are the big things that have happened since the last one?

[00:03:58] Matthew: There's two, I think the two headline pieces are two big models that came out since the last time we talked. So literally on the same day, maybe even hours apart, Anthropic released Claude Opus 4.6, and then, I think a couple of hours later, OpenAI released 5.3 on Codex, which is their, like, local coding platform thing.

[00:04:28] Matthew: Both very good models by all accounts, particularly good at coding and, and just churning through tasks with one hit and producing really good stuff. The context is huge, right? Yeah. Context window, million tokens. Yeah, massive. And, and that's something like Google's been leading on that for a while in terms of the context windows, but then Anthropic, I think yeah.

[00:04:49] Matthew: Was a, they've jumped up to that million number and yeah, it's, it's a pattern of a, it's a pattern of some trading blows between open AI and anthropic a little bit recently. They don't seem to be punching much towards Google, but they definitely seem to be trying to swing at each other a lot more. 

[00:05:04] Dara: Yeah. And it's kind of a little bit of a case of, it's like, you know, that kind of fight where it's slapping rather than, you know, it's a bit of a cliché or whatever, that kind of slapping at each other rather than a real cat fight.

[00:05:22] Dara: Exactly, it's been a bit snippy, hasn't it? I'd almost say I'm a bit disappointed in Anthropic. We've talked before about how we quite like their kind of moral direction and the way they conduct themselves. But to me it felt a little bit, you know, a little bit snippy to be taking some of the shots at them that they are.

[00:05:41] Dara: maybe shows, and it's just my personal view, but maybe shows a little bit of insecurity on their side, to be, to be kind of attacked, not attacking, but taking shots of them through advertising and yeah, it de definitely feels like there's a, maybe they're just sensing a little bit of weakness. 

[00:05:56] Matthew: In OpenAI that hasn't been there before. Like it definitely seemed for a few years that OpenAI was untouchable. Well, a few years, not that long.

[00:06:04] Dara: Just a few seconds. 

[00:06:06] Matthew: Still alive. It's been about two years since this whole thing started. Yeah. But it felt like, it felt like OpenAI were untouchable for a while, and nobody could catch 'em. And Google was endlessly releasing models in the hopes of getting close and never quite could.

[00:06:18] Matthew: And Anthropic was there, or thereabouts, but not really ever competing. And it just seems, I can't quite tell if it's because I don't really use OpenAI at all anymore. I mean, I'm in Gemini and I'm in Claude for, for coding work. So I can't tell if me being moved out of that ChatGPT, OpenAI world is leading into my thinking of, like, they're in trouble, they're not performing as well as they should, or whether it's the reality.

[00:06:49] Matthew: 'cause I'm pretty sure I saw a chart the other day. Active users of chat, GPT in comparison to the others, are still massively higher on those fronts.

[00:06:58] Dara: I wonder if that's a case of, and you know, obviously this works in Google's favor, but I wonder if this is a case of it being like the established brand in a way.

[00:07:08] Dara: So for the, for the massive people who are using LLMs for, you know, maybe slightly more, Superficial is the wrong word, but people who are just using it as a, as a kind of, almost like a, like a car, like a beefed up search engine. They probably don't have any reason to move away from chat GBT because that was the first leader out of the gate.

[00:07:30] Dara: You know those people who went, right, okay, great. I've got this thing now and it's a bit better than Google for certain things. I'll use that. They probably don't have any reason why they would move to anthropic. You know, it's like. It's like when, or with browsers, you know, when, when Firefox was the main browser, why would anyone move to Chrome at, at the beginning, you know, or, or you know, whatever.

[00:07:50] Dara: It's like there's so many examples of this isn't there where it's like maybe the masses are just still bought into the one that's familiar to them? Because I think that stats are a bit different when you break it down by use case, right? So I think maybe as a coding topic. Yeah. Yeah, I think so.

[00:08:06] Dara: So it's like maybe just that overall usage is still very much in. In, in an open AI course. 

[00:08:11] Matthew: Yeah. There's a, and it might just be the circles I'm moving in as well, like I am more, more exposed to coders and developers and engineers, but there's, this term was getting flung around recently called Claude Pilling.

[00:08:27] Matthew: Like, you've taken the Claude pill and seen the light. And there's a lot of people canceling their Cursor or their OpenAI or whatever accounts and moving fully into Claude Code and using that flow. And I think, coupled with the 4.6 stuff, people were just finding it extremely powerful.

[00:08:47] Matthew: But I, I've also heard that 5.3 code X is really, really good as well. 

[00:08:51] Dara: I heard, yeah, I heard that too. Just the other thing with Claude, obviously, is Cowork. And I think that could be a game changer in terms of pulling maybe more people who aren't coding over to Anthropic, to Claude. Because

[00:09:06] Dara: You know, you could basically do a lot of the things you could do in cloud code, but it doesn't have to be, it doesn't have to be for code. So, I dunno if you've seen any, I haven't seen anything yet. It's probably too early to know around like, stats to do with how well that's doing and you know, whether it is potentially moving people over, but I guess that will come out in due course.

[00:09:23] Matthew: Yeah, it's, I've read something the other day that it's not a coincidence that all these models started to be, were good at code first and that was all, it was intentional. Moved by the foundation, AI companies because they wanted to get to a point of self-improvement with models and anything they were producing in terms of UI and interfaces and things.

[00:09:49] Matthew: And now they're kind of there. They've got to a point where I think open AI and anthropic, they're all pretty much stating that more than half of the code that's being produced by them internally is being done via these agents. So the next frontier is the rest of white collar work a little bit, almost like, and now everything else.

[00:10:11] Matthew: And that's what it feels like. cowork is a stab at and maybe codex in time like that. And now local environment, just getting really good at just using a computer to deliver work and access different sources and 

[00:10:26] Dara: all that stuff. Well, that's it, isn't it? 'cause that's the foundational layer. Like all of these, you, you talk about like, you know, knowledge work or computer based work.

[00:10:34] Dara: Well, what, what's that doing? It's running code, you know, so it's like no matter what you, you might think as the user, you might think, no, no, no, I'm writing a document, or I'm writing a blog, or I'm working in a spreadsheet, or I'm doing whatever. I'm booking tickets on a, you know, for, for, for my colleagues scheduled a conference or, you know, whatever you're doing, booking flights, doing this, doing that, I'm ordering supplies in.

[00:10:55] Dara: But you're doing all of that through, through code. So it makes sense, doesn't it? It's like that foundational layer. If they were able to crack that. Everything else is, I'm going to say, easy. Obviously it's not easy, but relatively speaking, it's easy then because all of those, you know, that kind of core foundation is in place, everything else just gets layered on top of that.

[00:11:14] Matthew: Yeah. And this, I, I had a play with, I had a play with Claude coworker the other day for the first time. Really? 'cause I, I don't, most of what I do. Generally we'll be either just sort of maybe like strategic thinking and just chatting. I'll just be chatting to an AI around thoughts and ideas, et cetera, or coding.

[00:11:35] Matthew: And for the most part, I've, they've just been happening in Claude code or just within the chat interface, but I, I, I got it. I went to Claude Cowork to help me produce a document the other day, and the output was really pretty brilliant. Like, it, it, it gave me like a good. 10 page document, headers, tables, the whole works like it.

[00:11:58] Matthew: Moved through various different plugins and pulled in different information and went and searched and did some research and it was just quite polished. Piece. The, the, the problem I still had is I sat and watched it because it took a while and I just sat, I was watching the process. I still need to get to my head, like, go away and do something else while it's doing all that.

[00:12:17] Dara: Yeah, yeah. Stop micromanaging. 

[00:12:18] Matthew: Exactly. Which, yeah, so, so it's really interesting and I'm sure it's only going to get better and better. And it's that thing we keep saying about where you're just, you're just removing or, or reducing down. The level of the barrier to entry to the point where it is, it is the iPhone moment of like a person can just open up a screen, type in plain English a thing, and something brilliant gets passed back to them as a result.

[00:12:48] Dara: And that's the inflection point where things start to change dramatically. Yeah. That's when things get really interesting. But it's true, isn't it? I kind of think about that from the perspective of like. Friction. And you know, the thing about friction is you don't really know. You don't necessarily know it was there until it's taken away.

[00:13:08] Dara: And you know, like even with the stuff that we are doing and we're kind of playing around with either internally a measure lab or just on our own even, and it's like every new capability you add takes away a previous point of friction. And once you take that point away, you realize how frustrating that friction was.

[00:13:27] Dara: So, yeah, which is almost a good and a bad thing because then if something goes wrong, you're like, I don't understand why, why can't I just click a button and have that thing work like it did before? I don't want to spend time trying to fix it, you know? So it's like every, every single iteration of this stuff is just making it easier and easier, and just removing more and more friction.

[00:13:46] Dara: And it's like the, the, with the trajectory of it, it's like, it, it really is crazy to think about where that's going to end up. And it's like, what, what is the final destination of that? Is it, is it literally. You know, oh, we always go to stop. Or maybe it's my fault, you know, we're all going to have a chip, we're going to have a chip in our head, or we're like, we're going to be like the people from Wally.

[00:14:06] Dara: we're just all going to be, you know, floating around little, little space pods or something all connected to the grids. 

[00:14:12] Matthew: People watching tell all day. Yeah. I don't know. It's, I don't, I don't know. It, I, I, it feels, and I think one thing is true, and, and I think this is exemplified by. What we said at the top of the podcast where we, we both kind of half listened to the last podcast and it felt so outdated.

[00:14:32] Matthew: Like, not like it hasn't been done before, like occasionally. Yeah. There'll be some, there'll be some big piece of news. Think even one podcast we recorded last year, I was saying, oh, Gemini three might be out soon, and Gemini three was already out because we'd prerecorded it. That didn't feel as out of date.

[00:14:48] Matthew: Then this last one. And maybe that's, maybe that's because of our own learning and, and the journeys we've gone on a little bit internally over the past sort of three or four weeks, or maybe it's the industry, but it, it just feels like something has shifted a little bit in terms of the use and the, and the sort of putting this stuff to work and hooking systems together and, and just the maturation of the.

[00:15:15] Matthew: Of the utilization of things that wasn't necessarily there November, December, that is suddenly there. And, open closed is a good example, right, because Sorry, malt bar. Sorry. open No, yeah, sorry. 

[00:15:30] Dara: Yeah, yeah. You, yeah. You, you've gone the other way around. 

[00:15:33] Matthew: Yeah. There was a third name. There was one that they only had for a day club.

[00:15:37] Matthew: Bott was the original, wasn't it? And then they were briefly Mulbar. Yeah. And then now open 12. Yeah. And, and, and that is like a, I was reading about the guy who created it who has just been hired by open air. That's a piece of news in there. But he just started that 'cause he just thought, well, I, I'd, I'd like to be able to just chat to it on my phone when I go and make a cup of tea.

[00:16:00] Matthew: So he just vibed a, a little, message handler to be able to talk to the, to talk to the LLM via his telegram or his WhatsApp. Things. And then from there, spawn this whole infrastructure, which has, we mentioned it briefly, last podcast since then. It is, I mean, the GitHub clone map is hilarious.

[00:16:27] Matthew: It is a vertical, completely vertical line. It's beyond exponential. It just does that, And yeah, we've been, I mean, we've been playing with it, our, our ourselves internally just to, to see the, the use of that. But it's such a good realization to pull all these things together, remove the frictions from all over the place.

[00:16:49] Matthew: The friction of just being, having to just be sat at a computer to talk to something versus just messaging it on telegram and it going off and being able to code and deploy something for you. Like there's so many frictions have fallen over in the past few weeks. 

[00:17:06] Dara: I went off on one a bit there, but I just said, "No, no, no, you're right. And I, I think a point you made before, I'm not going to try and guess when, but it was definitely on a previous podcast episode, that even if we, even if things stopped today, there's so much more potential of what already exists that's unexplored because it's just happening. It's such a breakneck speed that nobody can possibly keep up.

[00:17:28] Dara: And the potential of what's already there now and the way it makes you think about what's possible and over what kind of timescales is mind boggling. 'cause even when you said there about how, oh, you know, open Cloud was developed because the guy wanted to have a way of, you know, reducing friction for himself.

[00:17:44] Dara: So he has done this and it integrates with all these channels. You can use Slack, you can use Telegram, whatever. and I've been using Telegram. I think you have been too. And I've already reached a point where Telegram is, bothering me 'cause of some of its limitations. So I'm going to build my own custom interface.

[00:18:01] Dara: Which, and I do, I do appreciate some, sometimes the advice from Theis could be a bit patronizing and it can be a bit annoying, but sometimes it's quite helpful. I basically ran the idea by my personal AI and it said, this is a really great idea, and I actually thought this was inevitable that you would think of this.

[00:18:20] Dara: But this is not something to get sucked into right now, so you should park this one. And I thought, wow, that was actually, yes. Thank you for that. You've, you've done me a solid there because it was like this. You are going to lose a week on this if you, if you decide to do it now. 

[00:18:34] Matthew: Yeah. Yeah. Easily. I mean, the part of what I've been doing in the intervening two weeks is, and I think you've done, you've also done this, is like I built a Kanban.

[00:18:45] Matthew: Like I know there's various versions of these kanbans existing in the world, but I've got a Kanban that I can ping it off the agents and Claude code and all this sort of stuff. There's chatting with the faces within the Kanban. There's. I've managed to get other people and other agents and other systems into that Kanban, and it's a demonstration of what we've been saying for a while as well around the landscape for software as a service.

[00:19:10] Matthew: Like this is the first time I've really seen it and go, like, I saw that Kanban thing over there. It wasn't quite what I wanted and I've just completely invented one. And there's, there's, there's tons of people who are doing the same thing, who've got zero coding experience that are spinning up. Varied, bespoke specific solutions just for them, that serve the purpose. That feels like the way most problem solving is going to go in, in short order. 

[00:19:34] Dara: Yeah, and there's a whole other topic, and we've talked about this, you know, in bits before, and we won't get into it now, but there's the whole kind of flip side of that, which is around the kind of governance and security and, you know, having guardrails in place.

[00:19:46] Dara: Because on the one hand it's, it's, it's, it's, it's wild, isn't it? It's like you've got this kind of. Hugely powerful. It's like having some kind of big supercar or something, but you've got to know how to drive it. And you've got to have barricades, or what do they call them on the F1 track, like, chicanes or whatever.

[00:20:05] Dara: And if you don't have that in place, you're going to drive your Formula One car off the, off the cliff or into someone's house or whatever. so it's like the power is insane. The potential is insane. But there obviously is a. There's a flip side to that. People can be churning things out and even pushing them out, publicly sharing them, and they could have inherent security issues.

[00:20:25] Dara: This was a big thing around open claw. and some of it was obviously quite click basey because it became a, a trend, you know, a, a trending topic, but around the vulnerabilities of using open claw if you don't know what you're doing. 

[00:20:37] Matthew: Yeah, a hundred percent. And, and as. That's the problem with any discourse around any of this stuff at the minute is it, it instantly polarizes into, try to grab clicks for foreign, git.

[00:20:51] Matthew: And so you get, and it doesn't help. I, I, I don't think it really helps. Then with, so we've opened the cloth. There's, I, I think we've both got isolated environments when we've been using it. I've got like a home server, I've got it on and it's all sort of hidden away and all, all of that sort of stuff.

[00:21:06] Matthew: But basic things that people who aren't necessarily used to programming and things like that, like pushing a load of API keys up into GitHub or, or opening up various channels around knowing about it. People were giving it credit card information to go off and make spending and purchasing decisions for them and finding that it had spent a lot of money.

[00:21:26] Matthew: I mean, there's some stuff that you just just wouldn't do. But yeah, I think sensible, considered use is good. And it's a pretty much a limitless sort of horizon of the things you could plug together and, and, and work with with it. 

[00:21:44] Dara: This is very cliche, but you are only limited by your own imagination. I think it's, you know, most things, not everything obviously, but most things you can do. Now, it's just a case of do you want to do them? Does it make sense to do them, and can you do them without causing any harm? Yeah. Yeah, exactly. So this is a good point probably to get into, to weave this back into our main topic.

[00:22:09] Dara: unless there's anything, there might be more you want to share on the news front or personal front, but, we're talking a lot here around current potential but also future potential. but we want to try and say what we do think. Nobody can say what will happen, but we do want to kind of focus a little bit on what we do think are some of the certainties, and I might use them in quotes.

[00:22:31] Dara: So what are we seeing, that we think is going to maybe stand the test of time or who, who are the, you know, who are the players that we think are going to have a good chance of still being, you know, in the race at the finish line. 

[00:22:44] Matthew: Yeah. It feels like we talked about that feeling of fragility with, say, open ai, for example. Uh uh, earlier. Because they're running massive deficits of revenue, and it feels like if a bubble bursts, and I think we're both in agreement that a bubble exists, it's just not necessarily a bubble that undermines the technology. It's a bubble. It's a bubble of over speculation in various parts of the market.

[00:23:14] Matthew: And round robin, Nvidia backhanders and all that sort of stuff. If that bubble burst, you could see someone like OpenAI collapsing because they just don't have the revenue to support what that is they're trying to do. They're putting huge bets on massive infrastructures and all the rest of it, and hiring people for obscene amounts of money.

[00:23:36] Matthew: But I can't see past the dust settling. And Google is not still standing there. There may be more than one person, more than one person standing there, person to company, metaphorical ai, body people. but yeah, I can't see past Google still being, still being alive and well. and got and hoovering up anything that gets left behind because of, well.

[00:24:04] Dara: A lot of reasons I suppose. I don't know where to start on that front, but No, I don't, I don't either. Maybe if I talk for a minute, it'll give you an idea. It'll give you a chance to think about where to start. I think you're right. I think one of the main, and maybe this will actually cover some of the individual things we'll talk about, but I think Google have, at a kind of high level, maybe they've got, they've got such a good chance because they're not, they're not all in on one aspect of this.

[00:24:29] Dara: They've got a little bit more. If they're hedging their bet a little bit, whether this is strategic or just the nature of how their business is built. They're not new to this space. So Anthropic are new to this space. OpenAI is new to this space. Meta and Microsoft aren't either. and maybe we'll get to them, maybe we won't.

[00:24:47] Dara: But Google isn't just a frontier AI model company. They're an established global respected technology company. So that maybe is like the overarching view of this, and then obviously there's, there's a lot of, a lot of aspects to that. 

[00:25:06] Matthew: Yeah, they, they, I mean the, their earnings call recently highlighted a lot of where their, their co, where they are seeing big wins.

[00:25:14] Matthew: So that, I think Google Cloud has seen 48% increase in revenue or I forget the exact, the exact metric we were seeing. And I think that is, and, and all of that really, if you think about it over the past couple of years. They've very much pivoted pretty much every aspect of the cloud platform to AI in some way, shape, or form, whether it's layering it into BigQuery and, and pretty much every day you open BigQuery.

[00:25:40] Matthew: Now there's a new button, a new interface, a new way to interact with Gemini in some way, shape or form. they baked in all of the, the ai, functions into BigQuery to call, like generate and things like that over large pieces of data. They've built out all the Vertex and everything that goes on there, and that's, that's just cloud.

[00:26:05] Matthew: That's, that's, if that's, if we don't start to mention things like the fact they have the Android operating system or they have, or they have Chrome or they have workspace with email and docs and sheets and. Colabs and, and they, they've just got so many different surfaces and such a wide plethora of things to, to stuff AI into and see what, what does or does not stick search.

[00:26:32] Matthew: So I can't believe I forgot about the search. Oh yeah. That little, that little thing. Yeah. and, and it did feel like for a while they, they definitely had the most to lose, but then I think they, they moved in, eventually moved in a relatively decisive way and. At the end of last year, it really felt like a big old gut punch to open AI when Gemini three came out and was good and everyone was saying what a leap it was.

[00:27:01] Matthew: I think since then there's been a, there's been a model from each other, each of the different frontiers that have kept up with it. But it's everything else, isn't it? 

[00:27:09] Dara: And I think that, yeah, maybe that leads on to the next point, but the. Obviously they are, you're right, like Gemini. I even just had a thought there of like, are they going to rebrand to Gemini at some point?

[00:27:21] Dara: Because it's, it's, it's plastered all over everything and you know, is it, could this be part of the longer term transformation? Are they going to, are they going to morph into just being Gemini at some point? But it's, it's, it's absolutely like that, it's running through everything now, not just. Not just cloud, but you know, like you said, workspace.

[00:27:40] Dara: And I guess, I mean, all of these things do actually sit on clouds. So maybe cloud is the way to, you know, the, the, the umbrella term for it. But there, they obviously are doing, you know, groundbreaking work around, ai, you know, the stuff that Deep Mind do and emerge Deep Mind with the other. All their names are similar.

[00:27:58] Dara: It was the other, they had another AI arm or shallow mind. I don't know. Me. Yeah. Merged and merged them together and brought them all under the kind of one umbrella. and they're obviously doing really groundbreaking work there, but they, it again, 

[00:28:11] Matthew: they just won a Nobel pr they know won a Nobel Prize. That's not even lms. 

[00:28:15] Dara: Yeah, exactly. Yeah. So it's, it's not like they don't know what they're doing, on the, on the AI front, but it's not their only horse in the game or horse in the race, or it's not the only race they're in. That's one horse in one race, but they're in. You know, they've got coverage in other places and, and their infrastructure is also supporting some of the other AI companies.

[00:28:36] Dara: So it's not even like they've got a fallback outside of AI. They've got a fallback within the AI arms race.

[00:28:43] Matthew: Yeah. Well, it's like we keep saying, I keep saying, and I think you agree with this, but there's essentially two things that feel like they have the longest tail in all of this, as work starts to

[00:28:58] Matthew: get gobbled up and produced and done by AI more and more. The two underlying, underpinning pieces that are going to still be needed are data and infrastructure. It's a way of pulling together and facilitating these systems, and it's the fuel to make it good and personalized and better.

[00:29:21] Matthew: So take out the model entirely, and let's just assume that we stay where we are right now. And we've already said there's a lot of utility left in these models that we already have. Is there anybody better positioned than Google when it comes to data? Particularly because they have BigQuery, because they've built an entire infrastructure around analytics and data storage and data movement and data modeling and pipelining,

[00:29:48] Matthew: semantics and glossaries and all of this sort of stuff that's going to add fuel to that fire. Is there anyone better positioned than Google on that front? And is there anyone better positioned than Google on the infrastructure side, as in the modeling for this stuff that they're building?

[00:30:06] Matthew: So think about all the stuff they're building with Vertex and their Agent Development Kit and all of this. I dunno if it's accidental, but they're so well positioned for both of those things, to make the most of what's here and what's coming.

[00:30:23] Dara: Yeah, you'd love to know, wouldn't you? If you asked someone from Google, they'd obviously tell you, oh yeah, we saw where it was headed and this was all planned. Yeah.

[00:30:31] Matthew: 2002 we decided. 

[00:30:32] Dara: Yeah. We thought, you know what we're going to do. Yeah. I mean, maybe it's a little bit of both. But you're right. And I think, you know, that

[00:30:40] Dara: that underlying data piece, that's at the core of it, isn't it? Because we were talking about how Gemini is plastered all over everything, but BigQuery really is the kind of engine. We throw out so many metaphors and analogies, but, you know, BigQuery is the engine, or it's the backbone, or it's something in some kind of analogy or metaphor.

[00:31:00] Dara: It's the shin.

[00:31:02] Matthew: BigQuery is the shin of Google Cloud Platform. Yeah, it is. And they even relabeled it, right? They changed what it was called to, oh yeah, an autonomous data and AI platform or something.

[00:31:17] Dara: Something like that. It was really un-catchy, but yeah, something like that. Yeah.

[00:31:25] Matthew: Yeah. So if you just think about those two pieces, whether it's the intelligence bit, or the place where you interact with it, or whatever it is. Does it exist yet? I don't know, but it definitely feels like they've got the bones of it, the shin and the models.

[00:31:41] Matthew: And I think that's why they are so happy to put pretty much any model within Vertex. Like, we've got Claude Code at the company, and we supply that to staff who need it via Vertex, because we can centralize it all and it can all just work nicely, seamlessly within there. And they've got all these models from everyone you can think of, really, apart from OpenAI. Yes.

[00:32:11] Dara: Yeah. Look, read into that what you will. The other thing I think about BigQuery, because we're maybe quite zoomed out here, which is partly the point of this episode.

[00:32:23] Dara: But to zoom in a little bit for a second: Google really did quite a clever move in opening GA4 up to BigQuery. You know, different people have different, very strong views around GA4, but one thing you can't fault Google for is opening up that pipeline from GA4 to BigQuery to everybody, because companies of all sizes and shapes can benefit.

[00:32:52] Dara: From day one, if they've got GA4, they've got all of that data pumping into BigQuery. So, you know, people will be using it to various extents, and people will be layering other data in and pulling in other data sources. And whether you have a fully fledged data warehouse or not, the fact is the door is open.

[00:33:12] Dara: Whereas it's not quite so straightforward to think, oh well, we've got our analytics platform, we've got our CRM, whatever. Now where do we go? What cloud do we plug that into? It's almost like a hook for Google: they get you straight away if you're already using that very popular analytics product that they've been offering for the last 20 years or whatever.

[00:33:36] Dara: Yeah.

[00:33:38] Matthew: And then they've carried on extending that out, growing the Data Transfer Service in BigQuery. You've got Google Ads in there, Search Console. They've extended it out to things like Meta ads, and connected across the other cloud networks, so from BigQuery I can query Snowflake, or I can query AWS or Azure directly, and just have the results appear.

[00:34:04] Matthew: I mean, I think a few of them are doing that, but it's to that point of just making it as simple as possible to have that data centralized. And all of that, I suppose, also feeds into the fact that with OpenAI or Anthropic currently, your data still lives in silos to a certain extent. You have to go off to these different sources and pull it into a system, and then talk to that system to get an output, whereas Google and the warehouse approach does mean that

[00:34:37] Matthew: it's your data. You've centralized it and you own it, and you can do whatever you want with it: manipulate it how you want, model it, all of these things. Having just these other LLM surfaces is not quite as straightforward, unless you then went and hooked it up. You could hook BigQuery up to OpenAI or Anthropic, I suppose.

[00:34:56] Dara: But Google owns the whole journey. That's the point, isn't it? They're able to say, well look, you come to us for this and we can offer you our version, or you can use an alternative if you don't want that. And the same applies if you want to use a different model over there, you can still use our data warehousing.

[00:35:13] Dara: If you want to use AWS for your data, you can use our Gemini models. They've got broad coverage, and with each of those, if they get you in on one of them, over time they've got a good chance of getting you over to the other aspects of it.

[00:35:31] Dara: So I mean, that's what it's about, it's about getting your foot in the door. And then once you've got that, you can obviously start to offer more. So they've got a lot of, I need to flip this, but it's like attack surfaces.

[00:35:45] Dara: They've got the opposite of attack surfaces, haven't they? It's like they've got all those different areas where they can

[00:35:51] Matthew: Yeah, they can grab people, they can attack those. 

[00:35:54] Dara: Yeah. Maybe attack surface is the right way to put it. Yeah,

[00:35:57] Matthew: because I was thinking about it when you said about them renaming themselves Gemini. Like, I've got a

[00:36:02] Matthew: Pixel phone. So they also own the hardware, they've got laptops, they've got an operating system in ChromeOS. But when I turn on my Pixel phone, the logo that pops up is Gemini. So that's the boot. The boot is Gemini. It turns into Google after the fact, but it is Gemini in the first instance.

[00:36:23] Matthew: And then we've seen it from a Google Workspace front as well, because we've got Google Workspace at Measurelab. We got access to Gemini, not free access, but we didn't have to pay anything extra, we were already paying for Workspace. And then here you go, here's Gemini 3 Pro and the ability to hook up all these things.

[00:36:42] Matthew: And here's NotebookLM and all these other services that we just started using. And the really simple ability for it to start pulling in 10 years' worth of Drive information and things like that, that we've not necessarily structured in any way. It can just go and grab it and pull it in. Yeah, they've got it made.

[00:37:04] Dara: They've got it made. Yeah. And we haven't even mentioned TPUs. 

[00:37:08] Matthew: No. Yeah, exactly. That's the next big one, isn't it? Because, I think I'm right in saying, pretty much every other company's frontier models are beholden to Nvidia. Now, I am 99% positive that Google will have very heavily used

[00:37:26] Matthew: H100s or whatever it is from Nvidia in the training of their models. But they're also using and developing their own Tensor Processing Units, and I know that they did use some of those TPUs in the training of Gemini 3. And you could imagine, you know, a bubble bursts and Nvidia goes down, or all the companies surrounding it go down and they're no longer the richest company in the world.

[00:37:50] Matthew: Google still has a chip, a development pipeline and all the rest of it, to build out these processing units for AI, which nobody else could possibly claim. I know OpenAI were thinking about trying to make their own chips a couple of years ago, probably for this exact reason.

[00:38:06] Dara: But it seems a bit, yeah, I mean maybe they are working on it in the background, but you just can't imagine they are.

[00:38:11] Dara: It's almost like it's too late on that front, you would think, wouldn't you? Especially with OpenAI, with everything else that they've got going on. And perhaps, as you added the disclaimer, maybe it's just our perspective, but it does seem like they're under pressure at the moment.

[00:38:26] Dara: So it's harder to imagine they're off developing their own chips when, maybe they're making a pencil. Yeah. A pencil.

[00:38:37] Matthew: Is that true? I don't know. 

[00:38:38] Dara: I don't know. 

[00:38:39] Matthew: I think I heard that secondhand from you, to be honest. 

[00:38:41] Dara: But I thought I heard it secondhand from you. No, there was something.

[00:38:44] Matthew: We just made up a lie to each other. That's what we do.

[00:38:47] Dara: That's what we do here on the Measure Pod. No, there is talk of it, the Jony Ive, or Ives, collaboration. And I don't know, this is the thing with a lot of this stuff: these supposed leaks, are they leaks from people who actually know what they're talking about?

[00:39:05] Dara: Or is it speculation that then gets framed as a leak? But yeah, it's been presented as being like a pencil. It just has to make you laugh. You could imagine them sitting in a room thinking, right, we've got this crazy futuristic way of interfacing with all this technology, and then after about two hours and seven cups of coffee one of them is chewing a pencil and it's like, Eureka, wait a minute, let's make a digital pencil.

[00:39:32] Matthew: Yeah, I hope it isn't that, because it did sound like it was so simple and groundbreaking. But again, just to take this back to Google: are there any other frontier model owners, as in people who develop their own models, that have a device, other than Google? As in, they have phones and laptops.

[00:39:50] Matthew: I know Microsoft use OpenAI's models for Copilot, don't they? And that's not gone very well. I'm just trying to think of anyone who's already in the market, not the big ones.

[00:40:02] Dara: I don't think there are any smaller fringe companies that do hardware as well. I don't know, like consumer hardware. I'm not sure.

[00:40:10] Matthew: No. Yeah. And obviously Google is now supplying it for Apple as well. But just to zoom back in a little bit, I think for a lot of the reasons we've just discussed: the ease and ability to get data centralized into BigQuery, the power and growing utility of BigQuery, the really helpful AI features they're actually starting to add in there.

[00:40:36] Matthew: It's starting to feel less like bloat and more like really useful assets and new things being added in there, plus all the add-ons that can then get built around that. Once you've got that data centralized and modeled and pulled together, they have their own modeling framework and solution in Dataform that's baked into BigQuery, which makes it really nice and easy to interact with things.

[00:41:00] Matthew: It feels like that is the best place and solution for data and analytics specifically, you know, centralizing your analytics, growth, marketing, whatever you want to call it, data, and using it for analysis or to build out solutions. Conversational analytics, they've also released conversational analytics.

[00:41:27] Matthew: Every time I say a new sentence, I remember something else. Yeah. But it just feels like that is a really safe, growing, ever-improving platform that is already the top end of what you can do with data and analytics, and it's only improving. And zooming out of that, well, data and analytics is becoming everything.

[00:41:51] Dara: It's a broader thing now, because data and analytics isn't just within the realms of marketing, customer, product. You're talking about personal data, personal analytics. So, what is it, a truism? I dunno, a cliché maybe, but it's like everything is data.

[00:42:11] Dara: Everything online is data. Anything and everything people use, you know, fitness trackers, or it could be journaling software or whatever. Anything people use in their own lives, the data that's collected behind that all needs to sit somewhere as well.

[00:42:30] Dara: So again, maybe Google is seeing the bigger picture and thinking data and analytics isn't just a business requirement or a business function. There's actually going to be a growing need for personal analytics, personal data. And that's all going to have to sit somewhere, and you're going to need all the things Google provides, like

[00:42:48] Dara: semantic layers and governance and security and all the rest of it. And even if the end customer doesn't know that, it could be that the people building the products for customers are using the Google infrastructure. So, you know, it's almost like it is the backbone, or the shin. The shin bone of everything.

[00:43:10] Matthew: I wonder if Google is going to adopt this marketing strategy that we put forward for them. 

[00:43:14] Dara: Don't be the spine, that's overrated. People get a lot of spine issues. Have you ever heard of anyone with a shin problem?

[00:43:19] Matthew: No. And if you do, they collapse instantly. So, you know, you need the shin. It's integral. Yeah. No, I think you're right. I think there's definitely so much more data that's useful to a business than just the traditional structured stuff that you might stick into BigQuery. I mean, BigQuery has extended itself out to be multimodal now. You can put in unstructured data and use it alongside the other stuff that's in there to contextualize things a little bit more.

[00:43:51] Matthew: So there's AI note takers everywhere, things like Fathom and Gemini and Otter and Fireflies, fire files, Fireflies. I always get that the wrong way around. Years of documentation in Slack and things like that could all be looked at, pulled together and explored in new and interesting ways that we haven't done before. Nothing is lost. Everything can be

[00:44:22] Dara: used. I think that's the key point, isn't it? That's something that's coming up more and more, this idea that nothing is lost. In the past, whether it's business data again, or actual personal data, you'd have it spread across so many different places.

[00:44:39] Dara: And then even when things started to get digitized, they still didn't live in one place. And now it's almost getting to a point where it's not even about whether it sits in one place or not. It's about whether it sits somewhere and can be pulled together, and some semantics can be layered on top of that, and then you can get some intelligence from it.

[00:44:56] Dara: So it doesn't really matter where things are, as long as they are somewhere and you can funnel them into something like BigQuery, especially now that it's got multimodal support. You could just have all of that unstructured data, but you can still make sense of it and actually query it. Whether it's through a chat interface or a pencil doesn't really matter.

[00:45:23] Matthew: Or a pair of creepy-looking glasses. Yeah, you can film people without their consent, all those good things. And what I might do, just to round that off, is undermine everything we just said.

[00:45:37] Dara: Makes sense. 

[00:45:37] Matthew: Say that all of that's a guess, right? It's looking at Google and looking at the way they're positioned.

[00:45:44] Matthew: It feels like they are in a really strong position moving forward, to really take advantage of the new technology, the growing market, the use of it, the infrastructure, the data. But to say this is the most unpredictable time in technological history is perhaps not too much of an overstatement. Who knows

[00:46:06] Matthew: where this goes in the next few weeks. Google will almost certainly be part of the future, but what part of any given layer they take down the line is hard to say. You know, like we said earlier, there's the output layer, the interaction layer. The intelligence layer could live somewhere else, and Google just settles into infrastructure and data management and modeling. Or

[00:46:30] Matthew: they could release things that are fantastic, or somebody else could release something that no one's possibly thought of, and this blows the whole thing up again. You don't know, but that's the current lay of the land. Yeah, exactly.

[00:46:41] Dara: Nobody knows for sure, but I think, you know, what you said at the very start: if you had to place a bet, you would bet on Google being there when the dust settles, because of that kind of broad coverage that they have, and all those different strengths, and the kind of heritage and the infrastructure that they offer. But as a follow-on point from saying that nothing is lost and everything is available and connected, we can look back and see if we've made fools of ourselves with this.

[00:47:13] Dara: Because by the next episode, we might have to start issuing an apology. We'll have to start doing apology episodes and say, oh, all that stuff we said, Google has collapsed. Yeah, Google collapsed and AI bought Britain. So we need to say apologies to our new overlords.

[00:47:30] Dara: Yes. We don't know. We do not know. But it's exciting to be part of it. And I think, just to kind of wrap up, I think this is where we're going to start to bring other points of view in. You made the point about, are we in a bubble? So our kind of guest strategy for this season, if we call them seasons, is going to be trying to get, obviously, people who are in this space, but maybe try and see if we can get different perspectives, so that we can

[00:47:59] Dara: make sure that we're being somewhat objective about where we see things going. But I think for today we've done enough speculating and enough future gazing. Yes, I think anything else is just,

[00:48:10] Matthew: In fact, I just don't know anything. Alright, until next time. See you later. 

[00:48:15] Dara: That's it for this week's episode of the Measure Pod. We hope you enjoyed it and picked up something useful along the way. If you haven't already, make sure to subscribe on whatever platform you're listening on so you don't miss future episodes. And if you're enjoying the show, we'd really appreciate it if you left us a quick review.

[00:48:30] Matthew: It really helps more people discover the pod and keeps us motivated to bring you more. So thanks for listening, and we'll catch you next time.