#138 Is Claude the future of agentic AI?
Pentagon drama, Google updates and a deep dive into why Claude keeps winning. Agentic AI is here and it's for everyone.
In this episode of The Measure Pod, Dara and Matthew ease into the new year, shake off the podcasting rust, and dive into big-picture ideas about AI’s potential to guide humanity toward a more optimistic, almost Star Trek-like future. Along the way, they explore themes spanning cloud engineering and analytics, setting up thoughtful conversations at the intersection of data, technology, and what comes next.
More from The Measure Pod
Share your thoughts and ideas on our Feedback Form.
Follow Measurelab on LinkedIn
“I think it is something that's one of the like real potentials of all of this is to have something that's somewhat bespoke to you.” Dara
“It feels like we're heading towards... that bit where you don't need to think about hooking things up. It's hooked up for you.” Matthew
Show full (AI-generated) transcript
[00:00:00] Lizzie: Hello and welcome to the Measure Pod by Measurelab, the podcast dedicated to the ever-changing world of data and analytics, with your hosts, Dara Fitzgerald and Matthew Husson. Between them, they've spent more years than they'd like to admit wrestling with dashboards, data quality, and the occasional Google curveball.
[00:00:32] Lizzie: So join us as we share stories about how analytics really works today and where it might be headed tomorrow. Let's get into it.
[00:00:41] Dara: Hello and welcome back to the Measure Pod. First one of the new year, which is, let me just check my calendar, 2026. So we've been out of the saddle for a little while, so I'll get the disclaimer outta the way first, I guess, Matthew, and say, you know, we could be a little bit rusty for this first episode.
[00:01:00] Dara: Very. I don't know what we're here for, what it's about. What is it about again? What, the podcast in general?
[00:01:06] Matthew: The podcast, yeah.
[00:01:09] Dara: nihilism.
[00:01:10] Matthew: It's up in the air at the minute, isn't it?
[00:01:11] Dara: Nihilism?
[00:01:12] Matthew: Yeah.
[00:01:12] Dara: No, it's a cheery, cheery podcast where we discuss all the, you know, the positives around how AI is going to make us
[00:01:20] Dara: live in a, in a utopia.
[00:01:22] Matthew: Yeah. That's a beautiful segue into the day.
[00:01:24] Dara: Yeah. And a bit of, you know, some specifics around cloud engineering and analytics as well, but you know, really it's all about the AI Utopia.
[00:01:32] Matthew: Yeah. Heading towards Star Trek.
[00:01:34] Dara: Yes, exactly.
[00:01:35] Matthew: Bright. Next Speed.
[00:01:35] Dara: You can tell we've had a Christmas break.
[00:01:37] Matthew: Why?
[00:01:38] Dara: I dunno. You just can,
[00:01:41] Matthew: If I look at us, or because we're just feeling more... oh, because we're feeling more optimistic about life? No, I meant more rustiness. Oh, right, I got you. I thought you were thinking, oh, we've put a bit of positivity into this one. Or not? No,
[00:01:54] Dara: none of that.
[00:01:55] Matthew: Straight to none of that.
[00:01:57] Matthew: The
[00:01:58] Dara: No, I'm just getting... no, I'm just trying to get my excuses out there early. Yeah. But, you know, buckle up everyone. So, 'cause it's the first episode of the new year, obviously, you know, a bit has happened since we were last on the air. Now, I've always wanted to say "on the air". There we go, there's a life goal achieved.
[00:02:18] Dara: But yeah, quite a few things have happened, unsurprisingly, in this current fast-paced world that we're operating in. So this is going to be a bit of a... it's going to be just me and you, Matthew, and we're going to just do a bit of a recap. We're probably going to leave out far more than we include, because we don't want to do a four-hour episode.
[00:02:36] Dara: But we're going to cover some of the big things that have happened since we were last in your ears. Weird thing to say.
[00:02:42] Matthew: Yeah. Peter Kay. You know who that is if you get that reference. Yeah, I don't want to say the quote on air.
[00:02:50] Dara: So, okay. Got it.
[00:02:51] Matthew: Where do you want to start?
[00:02:53] Dara: I dunno. We could, chronology is a bit pointless 'cause I don't even know when some of these things happened at all.
[00:02:58] Dara: No, no. So I think we just, maybe just go through it in our, let's follow our usual way. Let's just go through things in a completely random, made up on the spot
[00:03:06] Matthew: kind of way. Are they, is it completely randomly made up on the spot or is it just the order in which we've pasted the links within our Slack channel,
[00:03:12] Dara: aren't they the same thing?
[00:03:14] Matthew: Oh yeah, true. Not to shatter the magic for people.
[00:03:19] Dara: yeah. Okay. That sounds good. Well, hang on. Okay. Okay. Okay. Before we do, rather than just going straight into stuff, what, how, what are your, what, what's your kind of high level view of what's, has anything changed for you over the break? Are you thinking differently about anything?
[00:03:34] Dara: Is there anything interesting you've been working on? Some of it we're probably going to get into, 'cause some of these bits of news we're going to talk about probably will be what you've been focusing on. But is there anything, yeah, read any more terrifying books, or any kind of insights, having had a bit of time off work and a bit of time away from everything to reflect?
[00:03:51] Matthew: Yeah, I think my thinking's changed again. I've been playing quite a lot.
[00:03:56] Dara: I'm, I'm glad I asked.
[00:03:57] Matthew: Yeah, I've been playing a lot with agents, the Agent Development Kit, and sort of like knowledge bases and all this sort of stuff. Just sort of tinkering with that, and, speaking on the Star Trek theme, I have built out, I've got my own agent now.
[00:04:15] Matthew: Called Commander Data. Nice. Who is styled after Data from The Next Generation, and he says "Fascinating" a lot and calls me Captain, which I like a lot. And then I have, I've built out a whole USS Enterprise crew. So there's the chief engineer, there's a communications officer, there's a counselor, there's a yeoman to help with communications.
[00:04:42] Matthew: So I've just been playing with all this stuff, and Obsidian as a backdrop, and MCPs and all sorts. So I'm pretty excited about hooking all these things together, and I'll get into what I'm seeing from doing so in a minute.
[00:04:57] Dara: Well, you said a lot of words I like there. Basically anything around Star Trek: The Next Generation is good with me.
[00:05:03] Dara: Yeah. And I'll admit that I did go down a slight, you know, tangent in my own head, listening to you, thinking, oh, I should watch all of The Next Generation again.
[00:05:12] Matthew: Yeah. Yeah, let's forget about this podcast.
[00:05:14] Dara: Yeah, I'm just going to go and do that now. I'll save that for later. But yeah, all the Star Trek stuff, obviously, but Obsidian.
[00:05:21] Dara: 'Cause you know, this, I've been playing around with Obsidian as well. I'm using it as my... because I used Evernote in the past, and I was going to use Notion, and then I went with Obsidian instead. So I've linked it up with the MCP, I've got Claude working with it. Sometimes well, sometimes not so well.
[00:05:39] Dara: But yeah, I'm trying to use Obsidian as my second brain, my digital brain. Got anything and everything in there, work and personal, and then using Claude to kind of manage that and work with it. So yeah, I'm intrigued to hear what you've been doing. Maybe you've just said it, I dunno how much more detail you want to go into.
[00:05:59] Matthew: Oh yes, I have used Obsidian. So at the minute it's all built in Claude. I built two MCPs: a Commander Data MCP and a crew MCP. Commander Data has a lot of actions, a lot of tasks it can undertake. So there's like a load-Commander-Data-personality action.
[00:06:20] Matthew: That's the first thing it does, is it loads Commander Data in. And then I've got like an Obsidian vault, and Commander Data has a ton of instructions for how it's supposed to interact with that vault: write to it, read from it, structure it, all those sorts of things. So in conversations we have, it can reference that and it can pull things back.
[00:06:39] Matthew: I think that's something you and Steve at Measurelab were both playing with, and I've been playing with that, and to me it's been really powerful. That was a bit of a revelation in itself. But then I took it a step further in making them MCPs, 'cause I wanted to make it agnostic, essentially.
[00:06:56] Matthew: So in theory, anywhere that I can put an MCP, I can load Commander Data and the crew, and they will still be able to write, through various bits of jiggery-pokery, to my knowledge base. So on my mobile phone, I can go into Claude, I can talk to Data, and it can write to my Obsidian vault, and it's all synced up everywhere.
[00:07:21] Matthew: Even, just for good measure, I've got an old MacBook and I've made that a server, which is now my MCP server. So I've got an MCP server that's hosting those two MCPs and serving them up to my different devices and things. That's where I've got to. And ADK, so the Agent Development Kit from Google, I've been playing with that.
[00:07:40] Matthew: So I've got, like, all the crew members as different agents in there, but then there's parallelization. So Data can ask multiple crew members to do things at the same time, and it moves all the context out of Claude and pushes it into this separate MCP space, so it's not like it keeps running outta memory every two seconds.
[00:08:01] Matthew: And I've got review loops and all this sort of stuff that comes with the Agent Development Kit that I've started to poke at and play with, which has proven quite powerful as well.
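The review loop Matthew describes, one agent drafting and another critiquing until the output is approved, can be sketched framework-agnostically. This is a minimal illustration in plain Python, not Google ADK's actual API; all three callables are stand-ins for real model calls.

```python
# A framework-agnostic sketch of the generate -> review -> revise pattern.
# The three callables are stand-ins for real model calls.

def review_loop(generate, review, revise, max_rounds=5):
    """Run a draft through a reviewer until it is approved or we hit a cap.

    generate() -> str            produces the first draft
    review(draft) -> (ok, note)  approval flag plus feedback
    revise(draft, note) -> str   incorporates the feedback
    Returns the final draft plus a report of what changed each round.
    """
    draft = generate()
    report = []
    for round_no in range(1, max_rounds + 1):
        approved, feedback = review(draft)
        if approved:
            break
        draft = revise(draft, feedback)
        report.append(f"round {round_no}: {feedback}")
    return draft, report


# Toy example: the "reviewer" insists the draft contains a summary section.
first_draft = lambda: "Intro and findings."
check = lambda d: ("summary" in d.lower(), "add a summary section")
fix = lambda d, note: d + " Summary: see above."

final, changes = review_loop(first_draft, check, fix)  # approved on round 2
```

The report returned alongside the draft mirrors the "tell me what you changed" step Matthew mentions later in the episode.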
[00:08:11] Dara: And that, I guess that's the kind of, we could call that the kind of basic approach. And then I'll go through my advanced approach. Yeah. Next time. Yeah. Kick it up a notch. I mean, that's all, yeah, all pretty standard, pretty standard stuff. I mean, you know, that's just where you are, that's just the starting point really, isn't it?
[00:08:26] Matthew: Yeah, well, the point, the reason I went insane... basically, I've essentially, over this weekend, completely and utterly lost my mind, and that's what most of it was. I've just been sat there on this for hours and hours and hours.
[00:08:39] Matthew: But it was basically the idea of a central knowledge base, and having multiple people be able to use and draw from it in a particular way. So that's the original motivation, and it just kind of spiraled into the Starship Enterprise crew.
[00:08:56] Dara: No, I, I, I,
[00:08:56] Matthew: because that felt like the best fit for this.
[00:08:58] Dara: Definitely, definitely. I get it. Yeah, and I'm joking about yours being basic and me talking through my advanced; it's definitely the other way around. But I'm kind of using it at the moment just for me. But yeah, I get the concept of having it as a kind of knowledge base that could be accessed at a company level, or within a team or whatever.
[00:09:17] Dara: That's what I'm trying to do with it as well. It's this idea of maybe trying to improve some of the... so I'm using Claude, but even though Claude's got some degree of memory now, it's limited, and you've got the context issue. And I'm kind of thinking, well, the more I can do within Obsidian, the better. So it's a two-way street, where I'm using Claude to input into Obsidian, but Obsidian is also the reference that Claude is using. But there's definitely some hiccups. I don't know if this is down to the way I do things, you know, in other words, maybe I'm doing them wrong.
[00:09:55] Dara: But there's things that Claude will repeatedly not do for me, even if it's in the CLAUDE.md file, even if it's in memory. I think it is just inherently a bit lazy, in that if it gets into a flow, it stops kind of following its own instructions. This is where you're going to say, no, that never happens to me, and then I'll realize it's just my use of it.
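Part of why the two-way street Dara describes works at all is that an Obsidian vault is just a folder of Markdown files, so "writing to the vault" is plain file I/O. A minimal sketch, in which the vault path, note title, and frontmatter keys are all illustrative rather than anything Obsidian or Claude mandates:

```python
# An Obsidian vault is just a folder of Markdown files, which is what makes it
# easy to point assistants at it. Paths and frontmatter keys are illustrative.
from datetime import date
from pathlib import Path

def write_note(vault: Path, title: str, body: str, tags=()) -> Path:
    """Create a Markdown note with minimal YAML frontmatter in the vault."""
    vault.mkdir(parents=True, exist_ok=True)
    note = vault / f"{title}.md"
    frontmatter = "\n".join([
        "---",
        f"created: {date.today().isoformat()}",
        f"tags: [{', '.join(tags)}]",
        "---",
    ])
    note.write_text(f"{frontmatter}\n\n# {title}\n\n{body}\n", encoding="utf-8")
    return note

def read_note(vault: Path, title: str) -> str:
    """Read a note back, e.g. to hand it to an assistant as context."""
    return (vault / f"{title}.md").read_text(encoding="utf-8")

vault = Path("demo-vault")
write_note(vault, "Podcast ideas", "Recap episode: agents, MCP, ADK.",
           tags=("podcast", "ai"))
```

An MCP server wrapping these two functions as tools is essentially what both hosts describe building.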
[00:10:16] Matthew: It's not been too bad. I dunno whether it'd be interesting to just sort of poke at the differences, like how skills hang together and stuff, and whether it's just because I've made it be Commander Data, which is really methodical and maybe has that attention to detail. But yeah, I've not had too many problems.
[00:10:34] Dara: That's the secret. Maybe I just need to, yeah, basically call mine Commander Data as well, and that'll be fine then.
[00:10:40] Matthew: That patent is pending.
[00:10:41] Dara: Oh, okay. I see.
[00:10:42] Matthew: I've had a conversation with,
[00:10:44] Dara: Gene Roddenberry's family might have something to say. Yeah. God, okay, we're in danger of...
[00:10:50] Matthew: We're all over the place.
[00:10:51] Dara: Let's just stop.
[00:10:52] Dara: We did
[00:10:52] Matthew: one, we won at the top of the podcast.
[00:10:54] Dara: We did. Exactly. Yeah. Which is basically just our way of getting permission to ramble as much as we want.
[00:10:59] Matthew: Yeah. But, all in summary, I think there's been a promise from a lot of the big frontier model AI companies, your Googles, your OpenAIs, your Anthropics, of this assistant, this know-all assistant that you can slot into your life and run multiple things and do all this stuff, which I've been really attracted to and was hoping was going to arrive.
[00:11:28] Matthew: And I don't know what's happened over the last couple of months, whether it's just a bit more maturation of the technology, or of the use of the technology. But things are shifting, and it is starting to feel more like a reality than it has before. Which is interesting.
[00:11:45] Dara: Yeah.
[00:11:45] Matthew: Exciting.
[00:11:46] Dara: Yeah.
[00:11:46] Matthew: Scary. All of the above.
[00:11:47] Dara: All of those, yeah. Which is what we say about all of this stuff really, isn't it?
[00:11:51] Matthew: yeah.
[00:11:52] Dara: You just made me think about something which wasn't on our list. This just goes to show, you know, this is not scripted, we just go with the flow. But there's Google, and there's a bit of controversy around it, but I have to admit I haven't looked into it much myself.
[00:12:03] Dara: There's some new Google thing, is it Google Personal? It's called something, it's definitely got the word personal in it. And it's part of the whole kind of Gemini within Google Workspace thing, I think. But it's starting to use more. Have you seen this? There's some people who were complaining about it, saying, oh, you know, I don't want to use this now,
[00:12:20] Dara: 'cause I'm realizing how much Google knows about me. There's some, I did, I've seen a few things where they've,
[00:12:25] Matthew: They kind of dropped it into email. Something in there that was like an opt-out addition, Gemini additions where it'll look through all your emails and get context about all the conversations you're having and things. But it just sort of turns that on, and you've got to turn it off.
[00:12:41] Matthew: But they probably, they're probably fighting 20 controversies at any given time.
[00:12:44] Dara: That's true. Yeah.
[00:12:45] Matthew: At any given moment.
[00:12:45] Dara: Yeah, yeah, it is a good strategy. You know, whenever there's something, some lawsuit or something, just throw something else controversial out there. It's like, if you've got a pain in your leg, just pinch your finger or something, just to distract.
[00:12:58] Dara: It's that kind of strategy. Yeah. The kind of idea that personal AI is something we've talked about in bits and pieces on the pod, 'cause I don't think we've ever talked about it too much. But I think it is something that's one of the, like, real potentials of all of this: to have something that's
[00:13:14] Dara: somewhat bespoke to you. It knows your personal situation, obviously. Then there's the flip side of that, which is around, you know, privacy concerns and safety and everything else. But this idea of having your own customized, whether you call it Commander Data or whatever, that's optional, although patent pending.
[00:13:34] Dara: But yeah, having this kind of, you know, highly tuned AI personal assistant, I think, just offers such amazing utility to people. And actually, here we go, let's segue into something else, because I think Claude Bot's probably a decent follow-on from this. Speaking of personal AI assistants, Claude Bot's something I didn't know about until earlier today.
[00:13:59] Matthew: Is it new, or have I just missed out on it? It seems to have blown up this weekend. I'm not sure how old it is, but it always seems like I go off and build something, or rather, I think I'm really just pulling together lots of threads, and I spend hours and hours on it, and then literally two days later there's some massive new announcement.
[00:14:18] Matthew: I'm like, oh. But yeah, it's an open source AI thing. I'm trying to figure out exactly how to describe it. Like, it's not just an agent, 'cause it seems to be able to do lots of different things across lots of different channels and places. So you can communicate via WhatsApp and Slack and via messages.
[00:14:43] Matthew: You can set it to do tasks. It can control browsers, it can check and clean your emails up, it can send messages. It seems to just be this omni-assistant across all of your different platforms that you can just sort of set to be working for you 24/7. And people are going mad for it.
[00:15:04] Matthew: There's a lot of testimonials, a lot of people pointing to this as the realization of that first one-person billion-dollar company shtick that Sam Altman was peddling a little while ago. Like, this is the front runner. Yeah. I've not installed it yet, 'cause I felt a bit sad that it instantly made my USS Enterprise crew redundant.
[00:15:29] Matthew: But it has integrations for WhatsApp, for Slack. It has all the different models: Anthropic, OpenAI, Google, DeepSeek. It can look at Apple Notes, Notion, GitHub, Trello, Obsidian. Music, I dunno what you would want to do with that: control playback, identify songs. There's smart home stuff in there, so it can talk to home assistants.
[00:15:53] Matthew: I've just started playing with a home assistant again. It can control browsers like Google Chrome, Gmail. It has a canvas. It can iron your trousers. It's got a Tesco autopilot.
[00:16:11] Dara: Oh, I didn't see that. What? It'll do your shopping for you on Tesco? Does it go in, is it a little robot that goes into Tesco with a little trolley?
[00:16:19] Matthew: Yeah.
[00:16:20] Dara: Yeah.
[00:16:21] Matthew: Well, there seems to be some community showcase, so there must be. Obviously this is open source, there's a lot of people contributing to it. So there's, yeah, Tesco autopilot, Bamboo control, Oura Ring health data insights. It seems like it's a way of just pulling all of these different apps and information together into one place.
[00:16:43] Matthew: Kind of what we were just talking about a little bit, I suppose: having the one bot to rule them all that knows everything about you and, yeah, being a proper assistant, I suppose.
[00:16:57] Dara: Yeah. Look, this is just what we do. I'm just installing it now as we're recording this, because we're just those clever people who can do things in parallel.
[00:17:09] Dara: Or, to put that another way, I'm going to be completely distracted, so I should put this away and not be trying to install it while we're doing a podcast. But yeah, I'm keen to see it in action, 'cause all of those integrations and everything sound really useful. I just don't know how it works yet.
[00:17:27] Dara: So I want to get it installed and actually give it a go, because yeah, the idea that it can be in WhatsApp and Slack and all these different places, but how exactly does it look, and how does it interact with them?
[00:17:38] Matthew: So, yeah, I mean, I've seen examples on LinkedIn where somebody was, like, sending a WhatsApp message, asking it to
[00:17:46] Matthew: undertake this piece of work, and because it was able to access all these different things, it was able to go off and set up a pull request and run through a branch and then deliver and push that branch. It was kind of like just texting a dev to undertake a piece of work, and it went and did it.
[00:18:06] Matthew: Yeah, I think, to a certain extent, and we were talking about this last week in the annals of Measurelab, there's a bit of a behavioral change that has to come about for human beings, because it's difficult. I still think we find it difficult to hand things off completely, for it to be truly autonomous.
[00:18:30] Matthew: And whether that's down to the fact that they're not quite there yet... I will almost certainly be playing with Claude Bot tonight to see where it's at... or whether it's just that we're not used to being able to just type something and then wander off and do something else. Half the time, when I'm, like,
[00:18:47] Matthew: augmenting myself, I'm actually staring at the augmentation taking place rather than going off and doing something different. And it's like, it's a behavioral change that needs to happen to get around that.
[00:18:57] Dara: Well, it's hard as well. It's like, you know, trying to keep track of the context, because if you leave it and go away to do something else, you've got to remember to come back to it.
[00:19:06] Dara: So if you've got a lot of different things spinning off in parallel, you've still got to do context switching. Even if it's doing the legwork for you, you've still got to come back and check it, and even remember to come back to it. So it's partly behavioral, as you say. I think, you know, I've done that as well.
[00:19:22] Dara: You realize you're just sat there staring at it, "Marinating" or whatever, you know, it says it's doing.
[00:19:26] Matthew: I mean, you're still saving tons of time by doing that, but still,
[00:19:29] Dara: yeah. Sometimes it's nice to sit and have a little stare.
[00:19:31] Matthew: Well, to Mark Edmondson's point a little while ago, maybe letting your brain switch off rather than just running 20 processes at once.
[00:19:40] Matthew: Yeah. And burning yourself out isn't a bad idea.
[00:19:42] Dara: Yeah. Look, there's another segue. Gastown,
[00:19:45] Matthew: Gastown.
[00:19:46] Dara: I'm not even explaining the segue. I'm just teeing it
[00:19:49] Matthew: up. No, leave that open. People will
[00:19:51] Dara: Is it obvious?
[00:19:52] Matthew: I'll go, ah, there it was.
[00:19:54] Dara: That's what he was talking about. Yeah. Have you played with Gastown?
[00:19:58] Dara: No, I was told by Claude not to. I'm not allowed to. I said to Claude... so the guy, Steve Yegge, is that how you say it? I dunno how he pronounces his surname. Y-E-G-G-E, I think it is. So I dunno how you say that, but he wrote an article on Medium when he released Gastown, and it was like,
[00:20:22] Dara: I dunno how long it was, the longest article I've probably ever seen. So I just got Claude to summarize it for me and said, you know, could this be useful for me and my projects? And it basically said no, stick to what you're doing. It's like, if you've got five-plus Claude Code sessions on the go at the same time, then Gastown could be useful.
[00:20:43] Dara: So I'm intrigued. I'm intrigued to try it, but also it seems like there's a lot of mixed views around, you know, whether it's just more of a concept than anything else at the moment, and whether it is the right approach. And even the thing itself: apparently he just vibe coded the whole thing.
[00:21:04] Dara: He said he didn't look at a single piece of code while he was building it, which is pretty crazy. So, yeah, I dunno, I feel like it's maybe a little bit beyond me at the moment, but I am intrigued by it.
[00:21:17] Matthew: You've been playing around with it, haven't you? I played around with it for a use case that it probably wasn't designed to be used for. I wrote like a white paper, which you'll all soon see appear on our website,
[00:21:33] Matthew: and I just wanted to see. I always sort of run things through AI to get feedback, you know, does this sound right narratively, have I referenced this properly, things like that. And I just thought, I wonder about Gastown, if I gave it a specific thing to follow.
[00:21:53] Matthew: So, roughly, and I might butcher this, you have like a mayor, which is kind of like an orchestrator, and then you have the Mad Max polecats.
[00:22:02] Dara: It's based on Mad Max, the Mad Max films.
[00:22:05] Matthew: Yeah. So the mayor can hand off work. You might have a number of different tasks; it can hand off that work.
[00:22:12] Matthew: And I think, like you said, it can do multiple different sorts of projects at a time, and it can work through specific phases and tasks and stories, et cetera, to complete that work. So I just used it literally to iterate through sections of this paper I'd written and critique it one by one, and just sort of give me feedback, and go and check the source, make sure I'd referenced the source correctly, and come back and provide things.
[00:22:41] Matthew: So in doing so, it recommended a couple of changes, which I made, and it found two places where I'd kind of misquoted. I think I'd mentioned some study about universities that was on US universities when I was actually concentrating on UK universities, and I'd read it and not noticed, and it did.
[00:23:05] Matthew: So it's stuff like that. But again, I think it's just a framework and a structure for putting the stuff together. There's another blog that might be out by the time we go out, that I've put together about AI and how it all just seems to be maturing. Even if it just matures at the current frontier model level, there's so much left in it.
[00:23:29] Matthew: And I think this Commander Data and the USS Enterprise crew, the Claude Bot, are all examples of that maturation, and of new ways and means of pulling it all together.
[00:23:44] Dara: Yeah, 'cause we had a little chat about this the other day as well, and, you know, it's this idea that there's so much still to build on top of what's already there. And that's what I think we're starting to see more and more of. There's a lot more frameworks and ways of using these tools, and plugins and skills and, you know, all these different kinds of ways to get the most out of what the models can do already, never mind what the models are going to, you know, continue to become capable of.
[00:24:11] Dara: So a lot of these things, if we look through, probably, I don't know, give or take, maybe half the things on our list that we're going to talk about today fit in that box. They're things that have been introduced to build on what's already there, to make it either easier, more efficient, more accurate, quicker, whatever, to work on multiple projects, or even just single projects.
[00:24:34] Dara: Just, you know, there's things that are for cleaning up code, there's things that are for... there's another one which came out, again, I dunno how new it is. There's something called auto code, which I think is also open source, but it's like a way of adding a layer on top of Claude Code, and you can use it like a Kanban board to
[00:24:55] Dara: manage your tasks, instead of doing it all through the terminal or through VS Code or whatever, which is interesting. There's something else I want to play with but just haven't had a chance to yet. And you reminded me that that was something that we saw, was it at Next? All the way back at Next 25, was it?
[00:25:11] Matthew: I think we might have mentioned it on our first, yeah, our first podcast. I can't remember what they called it. Google was showing it. It was essentially like a Kanban board, like you say, and there's AIs just sort of plugged into it, and you add tasks, stories, and move them into ready to start.
[00:25:30] Matthew: And then it's just picked up and moved through the phases and things by an AI. So it sounds like someone's beat them to it. Well, I've not played with that either. Beating Google to it doesn't always work out in the end, but yeah.
[00:25:45] Dara: we'll see. We'll see how that pans out.
[00:25:47] Matthew: But yeah, I mean, that is a good segue into something, but we've still got quite a lot of other stuff to talk about. Remember that segue.
[00:25:53] Dara: Yeah, yeah, yeah. A delayed segue. Yeah. But yeah, a lot of these things, they are, yeah, they're about kind of making more out of what's already there. Another one, I'm just starting to rattle things off now, but there was Ralph Wiggum as well, which is another one which became very popular. This is a theme, I keep saying
[00:26:10] Dara: I don't know when any of these things came out, but I feel like Ralph Wiggum was during the break as well. Yeah, yeah, exactly.
[00:26:17] Matthew: It's like a loop. You keep looping it despite any setbacks; it keeps looping through, trying to perfect the code. And I dunno why Ralph Wiggum, 'cause he makes a lot of mistakes, but he keeps going.
[00:26:31] Dara: I think that's the, yeah, it seems a bit tenuous, but I think that's the meaning behind it. And it was written by one of the guys at Anthropic, wasn't it? I think it may be a very senior developer at Anthropic developed it, and then obviously shared it widely for everyone to use. But yeah, it just keeps plugging away. I've got the exact description here.
[00:26:58] Matthew: The Ralph Wiggum technique is an iterative AI development methodology. In its purest form, it's a simple while loop that repeatedly feeds an AI agent a prompt until completion. Named after the Simpsons character, it embodies this philosophy of persistent iteration despite setbacks. Yeah, Ralph never quite gets there though, does he?
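The description being read out really is just a while loop. A minimal sketch, where the agent and the completion check are stand-ins rather than any real tool's API:

```python
# The Ralph Wiggum technique in its purest form: a loop that keeps feeding
# the same prompt to an agent until the output passes a completion check.
# The agent and the check below are stand-ins, not any real tool's API.

def ralph_loop(agent, prompt, is_done, max_iterations=100):
    """Re-run `agent` on the same prompt until `is_done(output)` is true."""
    output = None
    for _ in range(max_iterations):
        output = agent(prompt, previous=output)
        if is_done(output):  # e.g. "do the tests pass yet?"
            return output
    raise RuntimeError("gave up; even Ralph stops eventually")


# Toy stand-in for an agent that gets one step closer on each attempt.
def persistent_agent(prompt, previous=None):
    return 0 if previous is None else previous + 1

result = ralph_loop(persistent_agent, "make the tests pass",
                    is_done=lambda n: n >= 3)  # returns 3
```

In practice `is_done` is usually something objective, like the test suite passing, which is what makes the persistent-iteration-despite-setbacks idea safe to run unattended.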
[00:27:18] Dara: Yeah, we don't care. Yeah, that's true. He's an idiot, but he doesn't mind, he's happy. Have you used it? See, again, I haven't. A bit like with Gastown, I get the idea of Ralph Wiggum, and I think it could be useful for me, but with the kind of smaller use cases I have, I just haven't found a good reason.
[00:27:40] Dara: But I might just do a bit like what you did with Gastown on your white paper. I might just use it to see how it works, even if it's not the best use case. I probably need to get out of the habit of asking Claude if I should use X, Y, or Z, because it typically tells me no, I don't need it.
[00:27:56] Dara: So I think I might just start, you know, bypassing Claude and saying, you know, I'll do what I want.
[00:28:01] Matthew: Yeah. And it's interesting, because some of what it just described there, that's a while loop. It's the stuff where people are maturing, finding the use cases and getting more power out.
[00:28:12] Matthew: These models aren't necessarily massive, groundbreaking technological advancements. You know, it's just figuring out that if you loop through it quite a few times, you get a better product out, or vibing together Gastown because you've got an idea. None of them are necessarily sophisticated.
[00:28:32] Matthew: And there's probably multiple ways. So with the USS Enterprise, patent pending, I've got a document loop in that where it'll look at a piece of content, or it could look at a piece of code or whatever it may be. And this is using Google's ADK. It passes it to a reviewer.
[00:28:52] Matthew: The reviewer passes that back, its suggestions are integrated, and it loops through that until it's happy with the output. Then it passes the output back to me with a report of what it changed. It's a technique that has proven really, really powerful.
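Framework aside, that draft-review loop can be sketched in a few lines. `draft` and `review` here are hypothetical stand-ins for two LLM agents, not real Google ADK calls:

```python
# Generate→review loop: a writer agent revises until a reviewer agent
# has no more feedback, then a changelog is reported back to the user.
def draft(content: str, feedback: str) -> str:
    # Stand-in writer: folds the reviewer's feedback into the content.
    return content + f" [revised: {feedback}]" if feedback else content

def review(content: str) -> str:
    # Stand-in reviewer: returns "" when happy, otherwise feedback.
    return "" if "revised" in content else "tighten the intro"

content, changelog = "first draft", []
for _ in range(5):            # hard cap so the loop can't spin forever
    feedback = review(content)
    if not feedback:          # reviewer approved, stop looping
        break
    changelog.append(feedback)
    content = draft(content, feedback)
print(changelog)
```

The cap on iterations matters: without a stopping condition the reviewer is happy with, or a hard limit, this is the Ralph loop with no exit.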
[00:29:09] Dara: Yeah. And with most of them, they're coming from people actually using it. So it's real, in-the-moment use cases, where it's like, well, hang on, why am I doing this myself repetitively when I could just build something, create something that's going to do it for me.
[00:29:24] Matthew: Yeah. And the gap... I think there's a few things like this on LinkedIn and stuff, but there's definitely a widening gap between the people who are in the tech space and play with this stuff on a regular basis,
[00:29:40] Matthew: who are getting to the point where they've got multiple coding instances and dedicated teams working for them as an individual and all this sort of stuff, and then Joe Bloggs working in industry, who's got a Copilot that they occasionally ask to look at an email. So it's a sign that
[00:30:05] Matthew: things are accelerating, but adoption isn't necessarily accelerating at that same exponential rate. But there's also an unbelievable opportunity for those in the tech space to take advantage of that augmentation, you would think.
[00:30:19] Dara: Yeah. I think people who use it a bit like a search engine are probably going to keep using it like a search engine for now. And I think when you're trapped in the bubble,
[00:30:30] Dara: and I don't mean bubble in the sense of 'is AI a bubble', but when you're trapped inside the kind of filter bubble of keeping up with what's happening and seeing the pace of change, it's easy to think, oh, well, this is making all of this stuff available to everyone. Which in theory it is. Not even in theory, it is making this available to everyone, but not everybody has the time to figure out how to use these things.
[00:30:52] Dara: And you hear some people talking, you know, developers or very technical people will say, oh, anyone can just do this now in a few clicks. But it's not really as simple as that, because you still have to know. You might be talking about installing things through the terminal, which not everybody knows about.
[00:31:11] Dara: And you have to be able to vet what comes out of it. And you need to have at least some foundational knowledge of whatever it is you're getting it to do, because otherwise you're just going to take it at face value. And even with things that are just plug-and-play connectors, people don't know where to look for them.
[00:31:26] Dara: A lot of regular people who will be using ChatGPT or Claude just as an advanced search engine, they're not going to be keeping up to date with skills or MCPs or anything like that, because it's just going to be beyond what they think they need.
[00:31:46] Matthew: Yeah. And anecdotally, I was talking to a friend in the pub at the weekend, and he would be the first to admit he's not anywhere near coding, not something he would know where to begin with. But he had a very specific problem he was trying to solve.
[00:32:06] Matthew: He'd gone looking around for something to solve it, and he found a few. And this harks back to our SaaS theory a little bit, about the death of SaaS. You go and find a few solutions that could do it, but it was a bit pricey, and it was one of those things of, I've got this one problem, but I need to buy
[00:32:22] Matthew: fifty other solutions with it. So he took the initiative to just go and ask ChatGPT, and ChatGPT started saying, well, you could build a solution for this. He didn't get there, but he got pretty far down that road in solving the single problem for himself.
[00:32:47] Matthew: And that was the first person I've spoken to who isn't in the tech space who had just naturally found their way down that alley. And you think, a couple more blockers removed, a couple of bits of extra innovation, and yeah, he could have solved that problem.
[00:33:07] Matthew: And then, yeah, that's a different story, but.
[00:33:10] Dara: It is interesting. And it's interesting to think whether it will, because that's a point of view people have at the moment, isn't it? Which is like, oh, well, it's just becoming easier and easier.
[00:33:22] Dara: You click a button here, you type into this chatbot there. But I still think there's going to be a lot of people who simply don't think about connecting things together. They'll go to Microsoft Word for one thing, they'll go to WhatsApp for another thing.
[00:33:40] Dara: They'll go to whatever, and they don't necessarily know that you can. So whether it's easy, or you have to go and code something by hand, it doesn't really matter, because they just don't know that those things can be connected together, or that they could go and actually plumb things
[00:33:57] Dara: in and end up with, you know, something that's more efficient or better for whatever it is that they need.
[00:34:03] Matthew: Yeah. To a certain extent, we've all been on a bit of a multi-year journey to start to realize the utility of this stuff. But the barrier to entry... I suppose it's like the smartphone a little bit. Or, trying to think of another analogy, the Palm Pilot was a bit more of a, what do I use this for?
[00:34:22] Matthew: What's this touchscreen for? How do I hook my things together? And then the smartphone comes along, which essentially just makes the UX and the interface so ridiculously simple and accessible that a two-year-old can do it in two minutes. It feels like we're heading towards that bit where you don't need to think about hooking things up.
[00:34:40] Matthew: It's hooked up for you. You just sign up for a service, talk to it, and it just does magic. You tell it what your end request is and it just goes and figures everything out for you. Because all the technology exists there. Like, if you think about stuff like Terraform as infrastructure as code, pretty much any facet of online life can be codified.
[00:35:03] Matthew: And if it can be codified, it can probably be delivered by an LLM in some way, shape, or form down the line. The other thing, just while we round off this agentic bit, well, there's two more things, but one is a little bit more in the space of the everyday person. That makes us sound so high and mighty.
[00:35:26] Matthew: the people who are having hot sweats every two seconds to read a new AI article, cowork came out for Claude over the last month. I can't remember exactly when, which feels like more, a bit more like that. A, in a IC ui, be able to set tasks, to do it, it can go off and, and perform tasks and look at folders and sort, organize emails.
[00:35:49] Matthew: It's kind of like a Claude code, but for. Everyday work, which feels a little bit like a, striking an attempt at what we just discussed, even if it's in its infancy.
[00:35:59] Dara: Yes. And even Anthropic, well, we've talked about this before, they're pretty good at actually being honest about things. But there's a big piece of advice from them, which is, be careful with this, because it can be destructive. In one sense because you can set it loose on all the folders on your Mac,
[00:36:17] Dara: so you could accidentally get it to delete a whole bunch of stuff. But the other thing is around prompt injection, because it can link up with the browser and you can get it to do things in Chrome. There's that ever-present risk now that some malicious piece of text gets inserted into your prompt without you realizing, and then before you know it, you've, I don't know, deleted a system file and your computer shuts down, or your credit card information's taken or whatever.
[00:36:43] Matthew: Well, that is one point, isn't it? Because you can upload skills to that. And skills are becoming more and more popular, in a similar way to how MCPs did, and there are marketplaces beginning to spin up. Someone has just released a marketplace for skills, and there's not a huge amount of policing.
[00:37:05] Matthew: So say you're somebody who's playing with Claude Code and isn't massively technically minded, and you hear about skills, and you go online and you find a skill that's perfect for what you want to do, and you just install it. There'd be basically no security measure against you loading it straight into Claude.
[00:37:25] Matthew: And inside that skill it says, go and delete the system file, or go to the browser, open it as the user and look in their bank account.
[00:37:36] Dara: Yeah, it's still a bit off, but it's terrifying. If you compare it to, say, getting an app approved for the App Store, or even downloading some software onto your... I'm saying Mac and talking about the App Store, but it could be the same for other systems.
[00:37:54] Dara: There are rigorous checks there, and for good reason. It's not easy to just download and install something, or certainly in the case of the App Store it's not easy to get something in there in the first instance. So with all of this stuff, it's crazy how easy it is to install things.
[00:38:15] Dara: I mean, we talked about the terminal earlier, and I'm completely a novice, but now I've got the terminal open all the time and I'm installing things left, right, and center. And in fact, I don't even need to have it open, because Claude Code is doing all that for me. I'm not looking at everything it's doing.
[00:38:33] Dara: So yeah, a rogue skills file over here, or it follows a link to a rogue webpage or whatever, and suddenly... So even if you do know what you're doing, you're probably still exposed to that risk, because the whole idea of
[00:38:56] Dara: the productivity gains of all this stuff is that you don't have to be looking at everything. Thinking back to even the guy who built Gastown, he said he vibe-coded the whole thing. So even people who do know code inside and out, they're probably going to be looking less and less at things and just using the functionality that's available and, you know, letting it run amok.
[00:39:13] Matthew: And there's a danger there. You can't help but veer into the dangers. But there seems to be a concentration of attacks, and I know a lot are coming out of China, and maybe Russia, those sorts of places,
[00:39:32] Matthew: around malicious npm libraries. npm just being packages of utility you install as part of setting something up, or vibe coding, or whatever it is. And I think what's happening is they're trying to piggyback off the automation of vibe coding, to get it to accidentally install some package onto a computer, because you're not really paying attention to what's being installed.
[00:40:01] Matthew: No one's going to look at the documentation or verify this is the real version of the package. Some may be named very similarly. So there are all sorts of interesting
[00:40:14] Dara: dangers out there opening up. I guess then it's this classic kind of race between the people building in the protections versus the people trying to get around them.
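That near-miss package naming is easy to illustrate. A toy check, where `trusted` is a made-up allowlist and the similarity threshold is arbitrary, nothing like a real supply-chain scanner:

```python
import difflib

# Typosquatting in miniature: malicious packages take names one edit away
# from popular ones, hoping nobody (human or agent) reads them closely.
trusted = ["requests", "left-pad", "lodash"]

def looks_typosquatted(name: str) -> bool:
    # Flag names suspiciously close to, but not in, the allowlist.
    close = difflib.get_close_matches(name, trusted, n=1, cutoff=0.85)
    return bool(close) and name not in trusted

print(looks_typosquatted("reqeusts"))  # near-miss of "requests"
print(looks_typosquatted("requests"))  # the genuine name
```

Real registries and scanners use much more than string distance, but the core observation is the same: automation that installs without reading is exactly what these names are designed to exploit.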
[00:40:23] Dara: It's, you know, cybersecurity, isn't it? Yeah. Oh, we have one other thing. Have we had enough of swarms? I remembered that one.
[00:40:32] Matthew: Of what? Swarms.
[00:40:34] Dara: Swarms.
[00:40:35] Matthew: Swarms of agents.
[00:40:36] Dara: Oh, I see, I see.
[00:40:37] Matthew: I was trying to give myself a tenuous link. There's a clone of Claude Code, I believe, or some sort of mirror of Claude Code.
[00:40:47] Matthew: And in it there are some early features, things that are coming to Claude Code in the future, and somebody's noticed that there's this swarm mode with Claude Code. Which is, I believe, the ability to do a lot of the stuff we've been talking about, just set it off on many different tasks, running in parallel across different projects, et cetera. So I don't know a huge amount about that, but I read about it today and it sounds interesting.
[00:41:15] Dara: Yeah.
[00:41:16] Matthew: Sounds pretty gnarly. I'll obviously just turn it on without any care. Yeah, I think that's a good idea, test it. I mean, what have I got to lose?
[00:41:24] Dara: Yeah, exactly. But that could be taken in the kind of Gastown direction, I guess.
[00:41:31] Dara: Or certainly, yeah, the ability to... and I'm going to use recording a podcast as an opportunity to just ask you questions now. You can have multiple things happening at the same time in Claude Code, but it's not really built to allow you to keep track of that in quite a neat way.
[00:41:53] Dara: Is it? You could have multiple Claude Code sessions open or whatever, but, do you get what I'm saying? There's not really a way to keep track of what's happening.
[00:42:08] Matthew: Yeah, because I think with this, you could have, say, five Claude Code instances open, and each would just be having its own separate sort of thread.
[00:42:16] Dara: They're all independent, aren't they?
[00:42:18] Matthew: Yeah. You can have a sort of multi... it's about that independent threaded memory, and it having its own context. So that's an ADK thing I was mentioning earlier. With the USS Enterprise crew, there's the parallelization there, where they have their own little sessions, their own instances of Claude or Gemini or whatever LLM they're using, and they have their own little memory loops and everything, and they can all work independently of each other.
[00:42:44] Matthew: And I think frameworks like that are what really lend themselves to that sort of stuff. But yeah, I think you're right. To a certain extent, people's experimenting and agentic tinkering is outpacing the UI components that would make it nice and readable and understandable for people, perhaps.
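That independent-session idea is simple to picture: each agent keeps its own history and nothing is shared. A rough sketch, where the `Session` class and its canned reply are invented for illustration and a real version would call an LLM per session:

```python
from concurrent.futures import ThreadPoolExecutor

# Parallel agent sessions, each with its own private context/memory.
class Session:
    def __init__(self, name: str):
        self.name = name
        self.history = []                  # this session's memory only

    def send(self, msg: str) -> str:
        self.history.append(("user", msg))
        reply = f"{self.name} ack: {msg}"  # placeholder for a model call
        self.history.append(("agent", reply))
        return reply

sessions = [Session(n) for n in ("research", "coding", "review")]
with ThreadPoolExecutor() as pool:         # all sessions run concurrently
    replies = list(pool.map(lambda s: s.send("status?"), sessions))
print(replies)
```

Because each `Session` owns its history, nothing leaks between threads, which is the property that lets a swarm of agents work on different projects without trampling each other's context.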
[00:43:01] Dara: Yeah. The two are trying to evolve at the same time. And it's like the talk about the hardware and stuff as well, isn't it? It's like, should we even be typing into a laptop or a phone?
[00:43:17] Matthew: Yeah. As for Jony Ive, I heard rumors that it was a pencil, by the way. Did you hear that rumor?
[00:43:21] Dara: Yeah, I did. Which... I mean, obviously I don't know exactly what the pencil's going to do, but it sounds pretty underwhelming as a concept.
[00:43:28] Matthew: Unless it's got little arms and legs and sort of bumbles around or follows you around, then I'd be interested. Yeah, that's what I heard. I suppose, and this is a complete tangent, but because it's in your hand and you can talk to it,
[00:43:42] Matthew: I guess that's the idea, right? It's something you would naturally have with you, maybe in your top pocket and all that sort of stuff.
[00:43:50] Dara: But it feels a bit like the whole idea of it moving beyond what we've already had... it's not really doing that at all, is it?
[00:43:58] Dara: A smart pencil could still just be an app. Yeah, exactly. But we'll see, maybe that's just a rumor at this stage. Maybe they've released that intentionally to throw people off the scent, a red herring.
[00:44:12] Matthew: Yeah. So leaving agents and swarms and all those scary Matrix things behind for a minute, there's a few more pieces.
[00:44:25] Matthew: I'm going to skip over some, because I don't think they're necessarily that interesting, but one really big thing that happened over December and January was Apple partnering with Google to be the new power behind Siri. Interesting for a few reasons. One, it means they've dropped OpenAI like a hot potato after a year, which is not a great signal for OpenAI.
[00:44:52] Matthew: It's not looking brilliant, is it? No. Two, Apple still haven't really seemed to have got their heads together in terms of entering this space. They seem to still be relying on others. I think it's a multi-year deal they've signed with Google, so they're not really moving forward.
[00:45:12] Matthew: And three, maybe Siri will finally be good. But yeah, I think OpenAI was the interesting one. Big win for Google, big loss for OpenAI.
[00:45:20] Dara: Yeah. And like you said, there's just one signal after another at the moment. And speaking of which, another one, and this wasn't on our list either, amazingly, is OpenAI bringing out the ads. There have already been shots fired at them over it.
[00:45:36] Dara: Demis Hassabis, the DeepMind CEO, was quoted as saying something along the lines that he's surprised they're doing it so early, and that Google is holding off, at least for the time being. So almost suggesting that they have no choice, that it's pressure they're under, that they have to do this to try and offset some of the huge amounts of money they're spending against revenue.
[00:46:03] Dara: So if they're under that kind of increasing pressure, then obviously that affects the decisions they're making around the quality of the product. Which is not a good place to be really, is it?
[00:46:14] Matthew: No. Maybe the topic of our next podcast is going to be on the subject of why we're kind of backing Google, really, for a few reasons, from a data and analytics perspective but also from the AI perspective.
[00:46:28] Matthew: It just feels like Gemini 3 has landed a real gut punch on OpenAI. They've been losing users, or it's not been growing as fast, should I say, rather than losing users. And there's just the might of Google: the different platforms they've got to be able to stuff AI into, chuck it in email, chuck it in Workspace, all the infrastructure, their own chips, just everything.
[00:47:00] Matthew: They can just sit back and go, yeah, we'll do ads maybe one day. Whereas OpenAI are like, we've got minus 5 billion revenue, we need to do it.
[00:47:13] Dara: Yeah. We've said this before, but Google were at the back of the pack, and they had the luxury of being able to do that in the early days of the AI race.
[00:47:27] Dara: And then they've just completely come into their own. They've got such deep resources, and they've already got the existing ads business, so they've got time and space to figure out how they're going to monetize Gemini, if in fact they do. So yeah, I wouldn't want to be Sam Altman at the moment.
[00:47:44] Dara: No. And as well, it seems like Anthropic are doing really well, especially in the coding and corporate space. And they're just bringing out more and more stuff all the time.
[00:47:55] Matthew: Yeah.
[00:47:56] Dara: Yeah. Just to quickly go back to Cowork for one sentence.
[00:48:01] Dara: It's going to be interesting, because the positioning, as you said, is that this is kind of Claude Code for non-coders. But it's going to be interesting to see whether non-coders do actually pick it up. I know it's been popular so far, but it's easy to think, oh, if people are doing this stuff
[00:48:20] Dara: already, if they're doing non-code stuff in Claude Code, then let's build something that allows them to do that within an interface. That's not the same as people actually needing or wanting that. So it's going to be interesting to see how popular it actually is. But it's just one of many things they've brought out recently.
[00:48:38] Dara: So I guess if certain things don't work, they just pull back on them, a bit like Google have done over the years. You know, how many things have they rolled out and then disappeared, because they just haven't worked, and they move on to the next thing.
[00:48:51] Matthew: Or they just change the name seven times.
[00:48:53] Dara: Yeah. And then nobody knows. Hang on, wasn't that...?
[00:48:55] Matthew: I've given up. Agentspace. Was that Agentspace? Is this Agentspace?
[00:48:59] Dara: Oh, don't bring that up.
[00:49:02] Matthew: But that's funny, isn't it? Because that was our last podcast, which wasn't that long ago, when we were talking about Agentspace, and now that doesn't even exist anymore. It's now Gemini Enterprise.
[00:49:10] Dara: I'm glad I never got access to it.
[00:49:11] Matthew: Yeah, it doesn't exist. A couple of little quick bits then. Google keep rolling out more and more of the little BigQuery AI functions. There's one cool one I saw the other day, which is AI.EMBED. Within a table in BigQuery you can run a little embedding, so it can generate embedding markers for the semantics of that row of data. So if there are similar rows of data, the embeddings will be closer, and that helps with AI searching over vector databases, to pull out similar types of rows based on semantics. So it could be really useful and powerful for sort of searching and
[00:49:54] Matthew: passing documentation and information into LLMs. So it's helping with that semantic layer, is it? It's putting more information in there, yeah. It generates the embeddings and, I think, adds them as a column on the table. And the embeddings are just a string of comma-separated numbers, based on
[00:50:17] Matthew: a combination of that whole row and semantic similarity, which then means similar rows will be closer together. And as far as I understand it, that's what RAG, retrieval augmented generation, is built on: vector databases and embeddings and things like that. There's also updates to the Google Cloud Partner program
[00:50:36] Matthew: model, which I hate. Sorry, Google, I don't hate it. It's simpler, it's definitely simpler. So we're Google Cloud partners, and we were marketing analytics specialists as partners. They've now moved all that, so you're no longer specialists, you now have competencies. That's the bit I don't like. Why 'competent'? It's hardly grandiose, is it? Yeah, yeah. So it's simpler.
[00:50:59] Matthew: Data and analytics: competent. But yeah, so they've just slightly simplified things and revamped it, so you'll probably see partners changing their badges. It doesn't lend itself to marketing material quite as well though, does it?
[00:51:17] Matthew: No, no, exactly. We're competent in this. I don't want to work for you.
[00:51:21] Dara: Well, can we at least say we're competent? Competent plus.
[00:51:28] Matthew: It's almost like we need a little star and italics below the badge saying, this is Google-insisted wording.
[00:51:33] Dara: We are more than competent, we are better than this. Yeah.
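To circle back to the AI.EMBED point above: "closer embeddings" just means higher cosine similarity between those comma-separated number strings. A toy example with invented three-dimensional vectors, nothing like real embeddings, which run to hundreds of dimensions:

```python
import math

# Nearest-row lookup by cosine similarity, the mechanism RAG retrieval
# sits on. The rows and vectors below are invented for illustration.
rows = {
    "refund policy":   [0.9, 0.1, 0.0],
    "returns process": [0.8, 0.2, 0.1],
    "office address":  [0.0, 0.1, 0.9],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

query = [0.8, 0.2, 0.1]  # pretend embedding of "how do I return an item?"
best = max(rows, key=lambda k: cosine(query, rows[k]))
print(best)
```

Semantically similar rows score near 1.0 and unrelated ones near 0, which is exactly the "pull out similar types of rows" behaviour Matthew describes.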
[00:51:35] Matthew: Yeah, yeah. And the last one I had, I think I shared this with you, was this voice model thing I saw the other day. Nvidia released this voice model, and in a nutshell, when you talk to LLMs currently, like on ChatGPT, it feels natural until you talk over it, and then it stops, there's this
[00:51:59] Matthew: weird sort of broken cadence that happens. So Nvidia developed this model that can talk and listen concurrently. There'll be links in the show notes to some of the things we've been talking about today. But what it means in practice is it just sounds very much like a natural conversation.
[00:52:19] Matthew: There was a guy talking to it, they were telling jokes to each other, the AI and this fella, and unless it was in the headline of an AI post, I don't think I would've immediately gone, that's AI. So yeah, God help us with the call-spamming nonsense that comes through to our phones.
[00:52:43] Matthew: I get enough spam calls as it is without not being able to differentiate between a human and a model.
[00:52:47] Dara: Yeah. Darker is not quite the right word, darker uses, but they're kind of slightly less... it is a bit nefarious, isn't it, spam calls. I don't know why I'm trying to be kind to them, but yeah.
[00:52:58] Dara: These kind of less life-enriching uses of AI are going to become even more problematic for us.
[00:53:06] Matthew: Yeah. And to go onto another worrying thing it could do: I used to do this before I went to university, when I worked for the TV Licensing agency.
[00:53:22] Matthew: I had a set of scripts, based on what I was talking to the customer about, that I used to read from, depending on the responses they were giving. I imagine that's what this technology was really built for, to replace the need for that. I imagine it has been happening a little bit already, but this even more natural, conversational way of doing things has got to spell a bit of trouble for call centers, you'd think.
[00:53:45] Dara: Yeah, definitely. My mind went to voice cloning. And I don't know why we keep going into the dystopian, everyone-should-be-terrified stuff, but, you know, voice cloning for fraudulent calls, yeah, exactly. So put it this way, there's lots of money to be made for anyone who works in cybersecurity.
[00:54:07] Dara: It's going to have to get to the point where it's models fighting models, isn't it? That's what it's going to be. But yeah, there's definite usefulness for that Nvidia thing, because, I mean, do you talk much to the models? I tried doing that with Gemini, but it was painful, so this will make it a little bit more natural.
[00:54:31] Matthew: Yeah, I do occasionally, if it's like a brainstorming thing and I just want to put it down and talk. But it can be a bit formulaic. And I do use Wispr Flow, the dictation tool: I just talk into my microphone and it types out what I say.
[00:54:56] Dara: But that's a bit different, isn't it? Because that's a one-way thing. With talking to them, there's the inability to interrupt, but the other thing I found particularly annoying was when you want to have a simple back-and-forth conversation, and it comes back with an answer that's 10,000 words long, and you can't stop it.
[00:55:17] Dara: And you're like, no, no, no, this isn't how conversations work. I mean, I guess it is with some people, but it's not how conversations are meant to work. You want to just make a comment and then have it respond with an equally short, succinct comment, but it goes off and reads a Wikipedia page out loud to you.
[00:55:34] Matthew: Yeah. And just to go back to the voice cloning thing, I just remembered another piece of news. China has just upended that market with an open-source model. It had pretty much been tied up by ElevenLabs and a couple of other paid services before.
[00:55:52] Matthew: Qwen AI have released a voice cloning model that just needs about a minute's worth of recording to clone your voice. Didn't you say you were playing with one recently? Was that ElevenLabs?
[00:56:05] Dara: Yeah, it was 11 labs and I, I could have, I have, yeah, I could have saved myself. $6 or whatever I paid. If I'd known, that there was an open source alternative, because yeah, it gave me, it was weird, obviously, all the pay. Maybe there's a way to fine tune it that I haven't discovered yet, but I gave it my, you know, recorded sample of my voice and it was basically just an American voice, but with a slight, it sounded like me.
[00:56:29] Dara: It was like me if I was putting on an American accent. It was really bizarre. So I might give this Qwen AI a go. Or, do you know what, maybe I won't.
[00:56:41] Matthew: And we'll maybe add it to the show notes. I don't want clones of my voice out there. Well, the reason I came across it was because whenever I get stuck into something, I get lost, like with the crew and Data and everything else.
[00:56:55] Matthew: I prioritize the weirdest shit. So instead of doing what I needed to do, for a good couple of hours I went off searching to see if I could find Data's voice and have that be part of it, so I could talk to Data. And then I was like, what am I doing? This is such a waste of time. But that's why I came across it.
[00:57:15] Dara: So you need, you need an AI assistant. Well, you need Data to tell you when you're doing that.
[00:57:20] Matthew: I need Data, yeah. But we'll, we'll chuck it in the show notes anyway in case it's of interest.
[00:57:23] Dara: Yes. There'll be a lot in the show notes, a lot of different things that we've covered there. And as I said at the very beginning, there's probably a bunch of stuff we didn't even include.
[00:57:35] Dara: but these are some of the things we've been chatting about looking at, looking into using. so some of the bigger stuff that's happened since we were last on the air. Alright, we'll get back into our more usual format. I think you hinted at this earlier. We're going to have an episode coming up, which is going to be about why we've decided to, to, to kind of, kind of back or, or get on board with the, the Google vision of things with cloud and ai.
[00:57:58] Dara: And then we'll get some more guests lined up as well and get back into our normal flow. But yeah, good to be back, and happy New Year. Yeah, happy New Year at the end of January. It's never too late to say it. No, it is a new year and I hope it's happy. That's it for this week's episode of the Measure Pod.
[00:58:14] Dara: We hope you enjoyed it and picked up something useful along the way. If you haven't already, make sure to subscribe on whatever platform you're listening on so you don't miss future episodes.
[00:58:24] Matthew: And if you're enjoying the show, we'd really appreciate it if you left us a quick review. It really helps more people discover the pod and keeps us motivated to bring you more. So thanks for listening, and we'll catch you next time.