Do We Really Need Another Analytics Tool?
The short answer is no. You don’t need another analytics tool — however, it’s likely that there’s a major gap in your workload when it comes to this area of your business. Let’s set the scene: you have an analytics setup (your BI platform and a team of developers or data scientists who interact with it) designed to provide you with clear, meaningful data and insights.
You also have a post-analytics process where you take action on that insight. Generally, this post-analytics process is semi-automated and based on the output of your analytics. Finally, you have your operations workload – the data-driven decision or action you actually take as a result of the entire analytics process.
If they can’t build it, they’ll buy it…
For example, hospitals may ‘automatically’ divert patients to another hospital if their analytics shows that bed capacity or care units are full. The process isn’t actually automatic – steps must be taken to actually divert the traffic – but the outcome of the analytics (that a threshold has been reached) triggers a specific event (patients sent elsewhere for care).
Hospitals also use advanced analytics to predict and prepare for scenarios like this: for example, forecasting and modelling data surrounding discharges vs. admissions.
So, you have an analytics workload (an automated process that evaluates bed capacity) and a post-analytics workload (the manual process of reviewing this information), and they both feed into a resulting operational workload (the process of diverting patients to appropriate locations).
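The three workloads above can be sketched in a few lines of code. This is a minimal, hypothetical illustration – the names (`check_capacity`, `DIVERT_THRESHOLD`, `review_and_decide`) and the 95% threshold are assumptions for the example, not anything a real hospital system uses:

```python
# Hypothetical sketch of the three workloads: an automated analytics check,
# a manual post-analytics review, and the resulting operational action.

DIVERT_THRESHOLD = 0.95  # assumed occupancy ratio that triggers a diversion review


def check_capacity(occupied_beds: int, total_beds: int) -> bool:
    """Analytics workload: automatically evaluate whether bed occupancy
    has crossed the threshold."""
    return occupied_beds / total_beds >= DIVERT_THRESHOLD


def review_and_decide(threshold_reached: bool, staff_approves: bool) -> str:
    """Post-analytics workload: a person reviews the signal before anything
    happens. The returned decision is the operational workload."""
    if threshold_reached and staff_approves:
        return "divert"  # patients sent elsewhere for care
    return "admit"


# 96 of 100 beds occupied, and staff confirm the diversion:
decision = review_and_decide(check_capacity(96, 100), staff_approves=True)
print(decision)  # divert
```

The point of the sketch is the seam between `check_capacity` and `review_and_decide`: the analytics output has to be carried across that boundary by hand, and that handoff is exactly where the gap described below appears.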
And that’s the gap! Your analytics are disconnected from your operations. And when the flow of your insights stops before your operational workload, the likelihood that insights and actions will fall through the cracks is high. Think of it like the handoff in a relay: this is the part of the race where the baton is most often dropped.
Part of the reason for this gap in process is the fact that traditional do-it-all BI tools simply weren’t designed to run the whole race.
When the only thing you have is a hammer, everything looks like a nail
Traditional, legacy BI tools are known for doing it all. For decades, they’ve been trying to run every leg of the relay. Sure, they offer comprehensive functionality and good value for money, but as time goes on, they lock you in.
It can be a struggle to move data from legacy BI infrastructure to Snowflake or Databricks, especially when you’re riddled with local files and business logic hidden away inside unreadable preparation “load scripts” tucked into dashboard files. This migration (or lack thereof) quickly becomes a Pandora’s box, and it can significantly inhibit your ability to move to the cloud, or deter you from trying in the first place.
Other issues arise from this “do it all” mentality. Over time, the “solution for everything” approach has become a one-team job – the BI team’s, specifically. But when it comes time to scale, this approach hits a wall. Coding in BI tools is often done by ‘citizen developers’ who are typically unaware of best practices around scalability, clean code, and version control.
The result is that business logic stored in the tool can’t be easily shared elsewhere, when it probably should never have been in the tool in the first place! I know the pitfalls that can flow from this approach, because early in my career, I was an unaware citizen developer. I thought that the BI tool I used could do anything, but just because I could get the tool to perform a certain job didn’t mean it was the right tool for the job.
Put simply: these “do it all” tools should never have been asked to do it all. There are certain workloads or processes that they simply weren’t built for, and that’s OK!
Choosing the right tool for the job
When approaching a task, the almost-too-obvious trick is to choose a tool or tools that are meant for the job. Legacy BI tools simply try to do too much, including performing workloads like ETL, which modern data warehouses can do on their own.
To visualize data, why use an expensive, clunky legacy BI tool, when other more modern, lower-cost tools can do it? And why invest in a tool that performs unnecessary functions? The best tool for any job is one that’s efficient and helps to streamline the work that needs to be done – legacy BI tools simply don’t fit that bill, especially when compared to modern, purpose-built solutions.
The reality is that many customers that still work with legacy BI tools have multiple analytics and BI solutions to cover gaps in their workloads, and their data is siloed and fragmented across multiple systems as a result. This lack of synchronicity inhibits efficiency and prevents the streamlining of critical workflows.
But the right tool will always 1) enable you to accelerate or enhance your analytics and 2) support taking actions on your analytics (making data-driven decisions). If you’re offering guided analytics dashboards to your users, the right tool will also certainly offer the option to 3) create a look and feel that matches what users are used to or expecting. To meet these evolving needs of customers, it’s critical to find a tool that can do the heavy lifting for you, and work in ways that make your life easier.
Creating the right tool for the job
Wearing my Product Manager hat, my job is to look at what customers and prospects do day-to-day with their analytics, and beyond. I regularly see customers pushing the boundaries of traditional BI tools: trying to use their legacy tools for writeback and data science, and working out ways to enable interaction with other tools.
Of course, these are all things that customers want and deserve, but they belong in a modern analytics tool: one that enables them to take effective action, without sacrificing the classic BI features that customers have come to love and expect. Critically, the right tool will let you look at all of your data, without performance challenges or gaps in workflows.
This may sound too good to be true, but it’s not! Astrato is an Analytics Data App solution – one that bridges the gap between analytics, operations, and data science workloads. It solves problems that legacy BI tools simply can’t, and it does so without sacrificing the familiar functionality of a quintessential BI solution. Astrato is built on a simple concept: the simplification of BI. With world-class dashboarding, writeback, and advanced analytics/ML, Astrato solves well-known BI and analytics challenges in a low/no-code, creator-friendly tool purpose-built to drive fast, easy analytics adoption across your organization.
Do I need another analytics tool?
Unless your analytics tool is 30 years old (and we have onboarded a customer in this exact situation!), then purchasing and adopting a new tool purely for visualizing data probably isn’t worth the hassle. Any new tool that you invest in should do more with data: more could be writeback. More could be data science. More could be both.
More could be – and should be – Astrato: encapsulating dashboarding, writeback, and advanced analytics all in one product.
Because what you really need is an Analytics Data App solution: a one-stop-shop for everything that should come with a modern analytics tool; the features and functions that don’t normally come with the read-only analytics/BI products that have dominated the industry for the past 10 years.
It’s time to stop retrofitting your Business Intelligence and Analytics. Analytics at its core is about reading data to draw insight and then using that insight to take an action. If you’re asking your legacy analytics tool to do more than that, you’re probably asking too much.