You’re Not Switching Analytics Tools. You’re Switching How You Work.
Analytics tools aren’t the center of decision-making anymore, cloud infrastructure is. Here’s why the shift is happening and what it means for the future of measurement.
On my mind this week
For the past two years, the conversation around GA4 has been loud. Marketers complained. Analysts were frustrated. Competitors like Piwik PRO, Amplitude, and Matomo gained tons of momentum and market share as businesses scrambled for alternatives.
But now? The noise has faded. Not because GA4 suddenly got better, but because something much bigger happened.
As an industry, we didn’t just swap Universal Analytics for GA4 or jump from one tool to another.
We switched how we think about digital analytics entirely.
I’ve felt this shift more than ever lately. It’s been on my mind constantly these last two weeks.
As a commercial data leader, I knew that commercially we were not where we needed to be, but now, more than ever, from a technical POV, we are disrupted AF.
The way businesses handle web and app analytics has fundamentally changed. Whether you’ve fully realized it or not, we’ve all moved away from tool-centric analytics to a cloud-centric model, one where the analytics tool itself is just a small piece of a much larger data ecosystem.
The End? of the Analytics Tool as the ‘Source of Truth’
For years, Google Analytics, Adobe Analytics, and other standalone analytics tools were the source of truth. Reports came straight from their UIs. Marketers and analysts practically lived inside these platforms; they were the go-to for insights, strategy, and performance tracking.
But something has fundamentally changed.
Analytics tools are no longer where the real decision-making happens. Instead, they’re just one part of a larger cloud data ecosystem. The true source of truth has shifted to data warehouses like BigQuery, Snowflake, and Redshift, where data is aggregated, processed, and analyzed at scale.
The analytics tool itself? It’s becoming a data collection mechanism more than anything else.
Like my friend Jason Packer, author of the Google Analytics Alternative book, said: “Connect BigQuery, setup server-side GTM, design a custom event model, create a GCP data pipeline, build our own Looker Studio dashboards. If we’re doing all of this, what exactly is GA4 doing for us other than data collection?
If I say I’m “doing analytics” does that mean I’m doing all this setup… and then explaining to stakeholders why it still doesn’t match the system of record?
When do we get to the part where we use the data to help people make decisions?”
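Jason’s point lands even harder when you see what “GA4 as data collection” actually looks like downstream. Here’s a minimal sketch, assuming a hypothetical event record shaped like the GA4 BigQuery export (the nested `event_params` array): the flattening work happens entirely outside the analytics tool, before anything reaches a warehouse table or dashboard.

```python
# A sketch of typical downstream work on GA4-export-style events:
# flattening the nested event_params array into a flat row.
# The record below is made up; the shape mirrors the BigQuery export schema.

def flatten_event(event):
    """Turn one nested event record into a flat dict ready for loading."""
    row = {
        "event_name": event["event_name"],
        "user_pseudo_id": event["user_pseudo_id"],
        "event_timestamp": event["event_timestamp"],
    }
    # event_params is a list of {"key": ..., "value": {...}} structs,
    # where only one typed value field is populated per param.
    for param in event.get("event_params", []):
        value = next(v for v in param["value"].values() if v is not None)
        row[param["key"]] = value
    return row

raw_event = {
    "event_name": "page_view",
    "user_pseudo_id": "123.456",
    "event_timestamp": 1740700800000000,
    "event_params": [
        {"key": "page_location",
         "value": {"string_value": "https://example.com/", "int_value": None}},
        {"key": "engagement_time_msec",
         "value": {"string_value": None, "int_value": 1200}},
    ],
}

print(flatten_event(raw_event))
```

None of this touches the GA4 interface. The tool collected the event; everything that makes it usable happens in your own pipeline.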
The truth is that this whole shift changes everything about how we think about analytics tools. And the way we look at digital analytics as a whole.
What Google Actually Did With GA4 (It Was Never About a Better Analytics Tool)
When Google shut down Universal Analytics and forced everyone onto GA4, the industry focused on the idea that GA4 was an upgrade, albeit a controversial one LOL!
But in reality, this move was never about making a better analytics tool. It was a calculated push to drive cloud adoption. Which, for the record, I am not mad about at all. GCP is amazing.
It forced all of us to step up, to learn more about cloud data, rethink our workflows, and break out of our comfort zones.
I will say this a lot in this newsletter, GA4’s interface wasn’t designed to keep us inside the tool. Google made its built-in reporting more limited, ensuring that anyone serious about analysis would have to export their data to BigQuery.

And of course, BigQuery exports are “free”. Why? Because once companies start storing their data in Google Cloud, they’re far more likely to expand their use of Google’s entire cloud ecosystem. D’uh. Forget about the future of analytics lol, this was a move toward treating analytics data as raw input into a larger cloud architecture. Again, not mad about it. We needed this badly.
But this whole switcheroo means companies aren’t just comparing GA4 to Amplitude, Piwik PRO or Matomo anymore.
Instead, they’re asking: How does analytics fit into my entire data ecosystem?
This is bigger than GA4. It’s a complete redefinition of how digital analytics fits into business intelligence.
Where Does This Leave Piwik PRO, Amplitude, Matomo, and Snowplow?
Let’s pause here for a bit. While GA4 forced (assertively pushed?) the industry toward cloud-based analytics, some tools have thrived in this shift by leaning into their strengths rather than competing on GA4’s terms.
Let’s look at some market data on deployments for these tools, courtesy of Mr. Jason Packer’s amazing research on this. The data is from 28.02.2025.
I will say that Piwik PRO has carved out a strong position in privacy-conscious industries. Unlike GA4, which locks businesses into Google Cloud, Piwik PRO offers flexible hosting options: on-premise, private cloud, or within a company’s own infrastructure. This makes it a go-to choice for finance, healthcare, and government organizations that need strict data governance. Their direct integrations with BI tools and cloud warehouses keep them relevant in a post-UA world. I’ve been using Piwik PRO for my blog for the last three years now.
Meanwhile, Amplitude and Snowplow were built for this shift from the start. They never positioned themselves as closed, UI-driven analytics platforms but as data infrastructure solutions designed to integrate directly into cloud ecosystems.
Amplitude’s event-based model fits seamlessly into modern data stacks, with built-in integrations for Snowflake, BigQuery, and AI-driven insights. Plus I love their Experimentation module. It’s the best on the market IMO.
Snowplow has always been about custom first-party data tracking, giving businesses full control over event-level data and streaming it into their cloud of choice. Love the team there.
And finally, Matomo remains a trusted choice for companies prioritizing full data ownership. It has always been an advocate for open-source, independent analytics, enabling businesses to track data without third-party cloud reliance. Over time, Matomo has evolved; it now integrates with cloud data warehouses and BI tools, making it more adaptable to modern data infrastructure needs. (Wow, first time writing something nice about them LOL! But hey, I am trying to be as objective as possible here, and they are doing good work, and have been for a long time, for their audience. So, here are your flowers, Matomo team 🙂)
OK, so rather than competing with GA4 on interface or reporting, these tools are doubling down on data ownership, segmentation, and flexibility, exactly what businesses need in a cloud-first world.
The real takeaway? GA4 didn’t start this shift, it just made it unavoidable.
And while it’s easy to frame GA4 as the villain, it’s actually been a forcing function that has accelerated industry-wide innovation and made cloud-based analytics the new standard.
Beyond The Mean
Are We Moving From Analytics Tools to Data Readiness?
If analytics tools are no longer the source of truth, then what actually matters?
This is the question companies are now asking as analytics infrastructure shifts to the cloud. The focus has moved beyond the tool itself to the entire data pipeline, how data is collected, cleaned, and structured before it’s ever analyzed.
Data readiness refers to the state of preparedness and quality of data within an organization to support effective decision-making and operational processes. It encompasses several key aspects, including data accuracy, completeness, timeliness, consistency, and accessibility. (Definition)
Organizations, especially large enterprises, are no longer treating web and app analytics as isolated systems. Instead, analytics data is just one of many inputs feeding into a centralized cloud data warehouse or lake, whether that’s Snowflake, BigQuery, or Azure Synapse.
With massive volumes of data coming in from websites, mobile apps, and customer interactions, the real challenge isn’t choosing the right analytics platform, it’s ensuring the data is clean, structured, and reliable before it reaches a dashboard.
This is why data readiness is the new priority. Companies are heavily investing in data engineering, making sure:
Events are tracked consistently.
Identifiers are unified across platforms.
Data is accurate and structured before it enters any reporting system.
At the core of this are the ETL and ELT pipelines of the world, where raw data is first stored in a warehouse and then transformed as needed. These pipelines consolidate disparate data sources into a single source of truth, applying cleaning and transformation rules so that every downstream analysis is built on reliable, consistent data.
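To make the transform step concrete, here’s a toy ELT-style sketch. Everything in it is hypothetical: two made-up raw sources (web and app events) have already landed in a warehouse, and the transform dedupes records and stitches source-specific identifiers into one canonical `user_id`, so every downstream report reads from the same cleaned table.

```python
# Toy ELT transform: dedupe raw events and unify identifiers.
# All names and data are illustrative, not from any real pipeline.

def transform(raw_events, id_map):
    """Dedupe raw events and map source-specific IDs to a canonical user_id."""
    seen = set()
    cleaned = []
    for e in sorted(raw_events, key=lambda e: e["ts"]):
        key = (e["source_id"], e["event"], e["ts"])
        if key in seen:  # drop exact duplicate hits
            continue
        seen.add(key)
        cleaned.append({
            # identity stitching: fall back to the raw ID if unmapped
            "user_id": id_map.get(e["source_id"], e["source_id"]),
            "event": e["event"],
            "ts": e["ts"],
        })
    return cleaned

web_events = [
    {"source_id": "cookie_abc", "event": "page_view", "ts": 1},
    {"source_id": "cookie_abc", "event": "page_view", "ts": 1},  # duplicate
]
app_events = [{"source_id": "device_xyz", "event": "screen_view", "ts": 2}]

# Hypothetical identity map: both IDs belong to the same customer.
id_map = {"cookie_abc": "user_42", "device_xyz": "user_42"}

print(transform(web_events + app_events, id_map))
```

In a real stack this logic lives in SQL or dbt models inside the warehouse, but the principle is the same: clean once, centrally, before anyone reports on it.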

Analytics tools no longer function as all-in-one solutions. Their role has been deconstructed, they are now simply data collection mechanisms feeding into a broader infrastructure.
Companies that fail to adapt, clinging to outdated workflows that rely solely on UI-based analytics reports, will struggle to compete in a world where data flexibility and readiness define success. Why?
Because the Analytics UI is Becoming a Commodity….and the Business World Has Already Moved On
Executives and decision-makers aren’t inside analytics platforms anymore. They’re not logging in and pulling reports. Instead, they’re looking at custom dashboards in their favorite BI tools that pull data from multiple sources, not just web analytics.
This has resulted in a growing detachment between business users and traditional analytics UIs. They do not care.
For them, analytics tools are no longer the end destination.
With reliable data pipelines and cloud warehouses in place, the choice of analytics UI is no longer a strategic decision, it’s just a tactical one. (Let’s be strategic about being tactical as Doug Hall famously said once haha)
Businesses now treat analytics dashboards like interchangeable interfaces rather than the final source of truth. Whether it’s GA4 or another platform, the real question isn’t what reports it offers, it’s how well it integrates into the broader data stack.
And as I said before, this is exactly why Google made GA4’s reporting interface deliberately weaker to push businesses into BigQuery and Google’s broader cloud ecosystem.
This means one thing: Analytics UI is now a commodity.
The current expectation is that analysts will pull raw data into a warehouse and conduct deeper analysis using SQL, Python, or BI tools.
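To show what that looks like in practice, here’s a made-up example: a daily active users metric computed directly from flat event rows, the kind of question that used to be answered by an analytics UI. I’m using plain Python for readability; in a real warehouse this would be a SQL or dbt model.

```python
# Warehouse-style analysis on raw rows instead of an analytics UI.
# The event rows are illustrative, not real data.

from collections import defaultdict

def daily_active_users(events):
    """Count distinct users per day from flat event rows."""
    users_by_day = defaultdict(set)
    for e in events:
        users_by_day[e["date"]].add(e["user_id"])
    return {day: len(users) for day, users in sorted(users_by_day.items())}

events = [
    {"date": "2025-02-27", "user_id": "u1"},
    {"date": "2025-02-27", "user_id": "u1"},  # same user, counted once
    {"date": "2025-02-27", "user_id": "u2"},
    {"date": "2025-02-28", "user_id": "u3"},
]

print(daily_active_users(events))  # {'2025-02-27': 2, '2025-02-28': 1}
```

The point isn’t the metric; it’s that nothing here depends on which analytics vendor collected the events.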
This is what matters:
How flexible the data access is.
How well the analytics tool integrates into the cloud stack.
How strong the underlying cloud infrastructure is.
Analytics vendors that fail to prioritize warehouse-native analytics and flexible data governance will become irrelevant.
What This Means for Web & App Analytics Tools (Commoditization Is Here)
Let’s be clear, I’m not saying GA4 is bad. If anything, GA4 disrupted the market in a way no other analytics tool could.
People reading this might think I’m speaking ill of GA4, but honestly, it’s more of a backhanded compliment. So, to the competitor vendors ready to celebrate reeeeelax, this isn’t the GA4 takedown you were hoping for. LOL.
GA4 forced an industry-wide shift that was already inevitable. By design, it pushed businesses to rethink their entire approach to analytics. It nudged (or shoved) companies into treating analytics data as raw input for a broader cloud ecosystem, rather than a standalone reporting tool.
And that’s exactly why analytics tools as we know them are becoming a commodity.
As analytics shifts to cloud-based ecosystems, there is no more GA vs another tool, or tool vs tool, the competition is between Google Cloud, AWS, and Azure.
At the end of the day, the future of web and app analytics is how well your analytics stack integrates into your broader data ecosystem. And that’s why I shouted out Piwik PRO, Amplitude, Snowplow, and Matomo above: they have been actively working to stay ahead of the game instead of competing on fancy UIs.
Industry Adoption & Shifting Measurement Strategies
The transition to cloud-first analytics is forcing companies to rethink how they measure success.
The traditional approach, relying solely on an analytics platform’s pre-built reports, is no longer viable. Businesses are shifting toward customized measurement frameworks that emphasize data portability, integration across systems, and multi-touch attribution.
Restructuring Analytics Teams
As measurement strategies evolve, analytics teams are being restructured to prioritize data engineering and integration over front-end analytics expertise.
Organizations are hiring data engineers, cloud architects, and BI developers, shifting away from reliance on analytics tool specialists. This aligns with the broader trend of treating analytics as a data engineering function rather than a marketing-led initiative.
Hybrid Measurement & Attribution Models
Companies are adopting hybrid attribution models that leverage data across multiple sources instead of relying on a single analytics tool. For example:
First-party data tracking is being integrated directly into cloud warehouses.
Incrementality testing and predictive modeling are being used alongside last-click attribution.
Marketing mix modeling (MMM) is hot AF as companies try to account for privacy-driven measurement gaps.
Industry-Specific Adoption Trends
Adoption of cloud-first analytics strategies isn’t happening at the same pace everywhere, of course; different industries are moving at different speeds:
Finance & Healthcare → Leading adoption due to strict data privacy and regulatory concerns. These industries favor solutions like Piwik PRO and Snowplow, which offer greater control over data storage and compliance.
Ecommerce & Retail → Rapid adoption of warehouse-native analytics, often using GA4+BigQuery or Snowflake. These industries need scalable, high-performance data analysis to optimize demand forecasting, supply chains, and personalization strategies.
SaaS & Tech → At the forefront of real-time analytics and AI-driven insights. Companies in this space are integrating tools like dbt, Amplitude, and Snowflake for real-time customer journey analysis and advanced cohort modeling.
This shift is accelerating the decline of pre-packaged analytics UIs, as businesses seek flexible, cloud-driven measurement strategies that scale beyond a single analytics tool.
Cloud Analytics Market Trends
The global cloud analytics market is growing fast AF.
Gartner forecasts that global end-user spending on public cloud services will reach $723.4 billion in 2025, up from $595.7 billion in 2024, with Google Cloud, AWS, and Azure competing for dominance in enterprise analytics infrastructure.
Key trends driving cloud analytics adoption:
Multi-cloud strategies are becoming the norm. Businesses are moving away from single-cloud reliance to avoid vendor lock-in, replicating GA4 and other analytics data across BigQuery, Snowflake, and Redshift.
Data privacy regulations are accelerating the shift to first-party data storage. With increasing regulatory scrutiny (GDPR, CCPA), businesses are moving toward self-hosted analytics solutions like Piwik PRO and Snowplow, giving them full control over data governance.
Warehouse-native analytics is reshaping decision-making. Analysts now work directly within data warehouses using SQL, dbt, and Python, making front-end analytics UIs secondary.
AI-powered analytics is expanding. Enterprises are using cloud-based AI tools to automate anomaly detection, predictive analytics, and customer segmentation within cloud warehouses, reducing the need for manual reporting workflows.
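As a feel for what that automated anomaly detection replaces, here’s a toy version: flag days whose event counts sit far from the mean using a simple z-score rule. This is a deliberately naive sketch with made-up traffic numbers; real cloud setups use far richer models.

```python
# Toy anomaly detection: flag days with unusual event counts.
# Simple z-score rule over illustrative daily totals.

from statistics import mean, stdev

def flag_anomalies(daily_counts, threshold=1.5):
    """Return the days whose counts sit more than `threshold` std devs from the mean."""
    values = list(daily_counts.values())
    mu, sigma = mean(values), stdev(values)
    return [day for day, v in daily_counts.items()
            if sigma > 0 and abs(v - mu) / sigma > threshold]

# Made-up week of traffic with a Friday spike.
counts = {"mon": 100, "tue": 104, "wed": 98, "thu": 101, "fri": 400}
print(flag_anomalies(counts))  # ['fri']
```

The value of the cloud-warehouse version is that this runs continuously over every metric, instead of an analyst eyeballing a dashboard once a week.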
The net effect is that traditional web analytics tools are becoming commoditized. The real battle is for control over enterprise analytics infrastructure.
The Future Outlook (Predictions)
The next five years will see analytics tools take a backseat to cloud-based decision intelligence. Companies that once debated whether to use GA, Adobe, or an alternative will instead focus on how well their analytics stack integrates into their broader cloud ecosystem.
My predictions for the future of digital analytics:
Web analytics tools are almost fully modularized. Companies will pick and choose tracking components that best fit their cloud stack, rather than committing to an all-in-one analytics suite.
First-party data strategies are a competitive differentiator. Businesses that own and control their customer data pipelines will have a major advantage over those that still rely on external analytics providers.
Data engineering is the core function of analytics teams. The role of analytics will shift further toward data preparation, transformation, and automation rather than front-end reporting.
Google’s dominance in cloud analytics will be challenged. While GA4 and BigQuery are deeply embedded in enterprise ecosystems, multi-cloud strategies will continue to rise, ensuring that companies maintain flexibility.
AI-powered analytics will replace manual reporting…and basic analysis. Predictive modeling and automation will significantly reduce the need for manually built reports, allowing organizations to focus on strategic decision-making rather than data wrangling.
The future of analytics isn’t about the tool you use, it’s about the infrastructure that powers your decision-making.
Companies that invest in flexible, scalable, and cloud-native analytics architectures will lead the next wave of data-driven innovation.
Also, let’s not get twisted. Not all companies need this level of complexity, in fact Jason Packer told me: “There are millions and millions of websites, and 99% of them don't need anything more than the basic kind of stats you can get from tools like Plausible or Fathom. 99% of websites also won't find good ROI in hiring an analytics consultant, and using GA4 for those sites is a waste of time and effort. There's still plenty of wins out there for the 1% of sites that do need this kind of analytics engineering work though!"
Until next time,
x
Juliana
PS: Totally unrelated, but I promised Dave Mannheim I will share the unique research on eCommerce websites that they’ve published at Made With Intent. How we sell on ecommerce websites is a little disconnected from customers’ perception. With stats like 63% of online shoppers feel the tactics eCommerce websites use to influence purchase decisions are inappropriate and/or manipulative. (Only 11% disagree.) Crazy, right? Or 83% of online shoppers report using a discount code when they would have bought at full price. Check it out here.