
Your analytics stack is dying. Here's what's going to replace it

Steven Elliott · 10 March 2026 · 6 min read

If you caught our Stop the Martech Madness webinar, you'll know we've had questions about the future of the modern marketing technology stack for a while. The last two months have seen some clearer answers to those questions emerge.

tl;dr: the monolithic SaaS model for marketing analytics is creaking. What's going to replace it is faster, better, cheaper, and more honest. But it needs engineering properly.

You already know your stack is bloated

Tally up your tools. There's an analytics platform, a tag manager, a BI tool (or three), data pipelines, probably an attribution product, possibly a CDP, and almost certainly a handful of point solutions someone signed up for back in 2022 that nobody dares cancel in case something breaks.

Each tool works fine on its own, with an easy-to-use interface. But together, they don't work without significant effort. Every integration is a project. Every project is a cost. Every non-renewal is a risk.

The SaaS vendors know this. They're banking on low barriers to entry and high switching costs to keep you locked in.

Meanwhile, they're all rushing to add features that overlap with the neighbouring piece of the jigsaw. Your CRM system wants to do analytics. Your analytics platform wants to do activation. Your BI tool wants to do data prep.

It's like maintaining subscriptions to Netflix, Amazon Prime, Disney+, Discovery. They each have one great series you couldn't live without. But you end up paying for a catalogue of duplicated dross you know you're never going to watch.

And for all the ease-of-use promises, life doesn't get any easier. You spend more time on data prep than analysis. More time hacking tools than using them. More time explaining why the numbers don't match than actually interpreting what they mean.

It's not your fault (honest!). And you're not alone.

The stack you're paying for

Let's name some names.

The ads team can't operate without Google Analytics. The product team is hooked on Amplitude. Marketing ops is running Funnel to pull spend data from every platform under the sun. Fivetran is pushing some of that data to the enterprise data warehouse on Snowflake. Someone signed an enterprise Tableau deal three years ago and now nobody can quite remember why. And sitting on top you've got Tealium as a CDP providing identity resolution you're not entirely sure you trust.

Each of these tools has a legitimate use. That's not the problem. The issue is what they cost together, what they don't do together, and what happens to all that fragmented data when you stop paying your dues.

Funnel's pricing is opaque by design. It looks reasonable until your data volumes grow, and then it doesn't. Fivetran's pricing is just going up - despite the fact you're ingesting less data. Tableau Enterprise is the kind of line item that makes CFOs ask questions at board meetings. CDPs like Tealium promise a unified view of your customer but deliver it inside a platform you rent, with matching logic you can't audit, sitting on data you don't truly own.

And critically: none of these platforms talk to each other without effort. The data sits in silos, each platform looking inward at its own slice of truth. And aligning on answers to the critical questions takes ages.

Your analysts aren't slow. Your stack is.

What replaces it

The alternative isn't a shinier set of SaaS tools. It's a fundamentally different model. Warehouse-first, code-native and agent-ready.

Raw event data collected via server-side GTM or Snowplow, pushed straight into BigQuery. No sampling. No obfuscation. No black box between you and what actually happened. Ad spend data, CRM data, product data - all of it flowing into the same warehouse via open source connectors.

Transformation and business logic defined in dbt or Dataform. Version controlled. Tested. Documented. BigQuery as your zero-copy single source of truth. Terms and metrics defined once and for all in Dataplex. Identity resolution handled in SQL, not inside a CDP. Code-first BI tools like Lightdash, Evidence or Rill sitting on top - dashboards built on your models, not disconnected from them.
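As a flavour of what "identity resolution handled in SQL" looks like in practice, here's a minimal dbt-style model sketch. The table and column names (`stg_events`, `user_id`, `anonymous_id`) are illustrative, not a prescribed schema:

```sql
-- Illustrative identity-stitching model (dbt-style). Wherever an anonymous
-- ID has ever co-occurred with a known user ID, resolve it to that user.
-- All names here are hypothetical placeholders.
with id_pairs as (

    select distinct
        anonymous_id,
        user_id
    from {{ ref('stg_events') }}
    where user_id is not null

)

select
    e.event_id,
    e.event_timestamp,
    coalesce(e.user_id, p.user_id, e.anonymous_id) as resolved_user_id
from {{ ref('stg_events') }} as e
left join id_pairs as p
    on e.anonymous_id = p.anonymous_id
```

The point isn't this particular stitching rule - it's that the matching logic lives in version-controlled, testable SQL you can read and audit, rather than inside a CDP's black box.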

In this new analytics stack, agents aren't stuck inside individual platforms looking at siloed data. BigQuery's native data agents can query and act on your warehouse autonomously. Pipelines monitored. Anomalies surfaced before the Monday morning meeting. Questions answered in plain English. Decisions informed by the full picture, not whichever tool someone happens to log into.

Your data never leaves your environment. The intelligence layer sits on top of a unified data set, not fragmented subscriptions. And when you stop paying, you keep everything. Because it was all yours from the start.

This is viable now. It's happening.

Let's not make out this is a simple swap. Inertia is real. Known brands feel safe. Nobody gets fired for renewing the Tableau licence. And it's simply not realistic to throw everything out all at once.

But the emerging toolset is mature and production-tested. Once the target architecture is established, you can build and test in parallel, cancel licences as they come up for renewal, and bring the new tools online. The more you shift, the more cost you save and the more upside you unlock.

If you move decisively and early, you'll not only save a fortune, you'll have first-mover advantage over everyone still paying rent to access their own data.

In my next blog, we'll tackle the trickier question: what does it take to build and maintain this properly? A clue for you: engineering discipline, agentic practices, and encoded expertise are what separate a platform that compounds in value from a side project at risk of collapse.

FAQs

What is the benefit of a warehouse-first analytics stack?

The primary benefits include total data ownership, significantly lower long-term licensing costs, and a "single source of truth" where business logic is defined in code (SQL/dbt) rather than hidden inside proprietary UI settings.

Can I migrate from Google Analytics to a warehouse-first model?

Yes. By using server-side tagging or Snowplow to stream raw event data directly into BigQuery, you bypass the sampling and data-sharing limitations of standard GA4 implementations while maintaining full control over your raw datasets.
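For a taste of what "full control over your raw datasets" means, once the GA4 BigQuery export (or a Snowplow equivalent) is flowing, raw events are a plain query away. This sketch assumes the standard GA4 export schema (`events_*` tables); the project and dataset names are placeholders:

```sql
-- Daily purchase counts and revenue straight from raw GA4 export data.
-- `my-project.analytics_123456` is a placeholder for your export dataset.
select
    event_date,
    countif(event_name = 'purchase') as purchases,
    sum(ecommerce.purchase_revenue) as revenue
from `my-project.analytics_123456.events_*`
where _table_suffix between '20260101' and '20260131'
group by event_date
order by event_date
```

No sampling, no thresholding, no interface quotas - just the events as they were collected.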

Is a code-native BI tool better than Tableau or Power BI?

Code-native tools (like Lightdash or Evidence) are better for teams seeking version control and "logic-as-code." While Tableau is powerful for visualization, code-native tools ensure your metrics are defined once in the warehouse, preventing different dashboards from showing conflicting numbers.
