Business intelligence modernization, more commonly known as BI modernization, is not a new term, but it still creates a fair amount of confusion for our customers. Some vendors would have you think it’s some sort of massive shift in the analytics technology we use. While there is some truth to that thinking, given the accelerated growth of cloud, the true unlock is how much easier it has become to free up the data that is the basis of the analytics.
According to a 2021 HBR survey of Fortune 1000 companies: “Only 29.2% report achieving transformational business outcomes, and just 30% report having developed a well-articulated data strategy.” Here’s how BI modernization can change that, and why organizations need to pivot now more than ever.
3 major drivers affecting the data-driven enterprise: data access, BI tooling, and adoption
The success of any analytics project depends on getting users closer to the data. Yes, we can use the old trope that “data is the new oil” (and at Atrium, we like to replace “oil” with “soil”). And storing data in the cloud isn’t new; cloud platforms such as AWS, Azure, and GCP, along with open-source alternatives, have been around for a decade or longer. But they also require large amounts of IT investment and additional specialized tools to manage and unify the data. Essentially, this moves IT dollars from on-premises to the cloud without enabling the everyday business user to access the data itself.
To achieve that nirvana of self-service, unified BI, our customers are leaning on Snowflake, the leader in the data cloud wars, which complements and enhances a customer’s investment in the aforementioned cloud platforms. We’ve also seen customers adopt Snowflake as their entire cloud data infrastructure. Either approach is feasible, and both share the common goal of reducing compute and storage costs while freeing up access to the enterprise’s most valuable asset: its data.
Let’s begin by recognizing that BI tools are now built with the end user in mind and should be cloud-based, instead of IT-centric, premises-based server technology. In the past, IT was forced both to act as data steward and to build static dashboards through a request process. This creates bottlenecks, as IT has to balance analytics requests against running business-critical applications. Additionally, IT, through no fault of its own, is furthest from the business context of the questions behind those data requests. The result is that most analytics fall short of expectations, which establishes an unhealthy tension between business users and IT.
Over the past several years, BI tools have become more business-user-friendly, and the movement of these platforms to the cloud has enabled a more self-service analytics orientation. This is clearly seen in Salesforce’s investment in Tableau Cloud (formerly Tableau Online). Tableau Cloud is where all future application enhancements will be released first, enabling our customers to unlock advanced features such as embedded AI/data science and natural language display through infographics and narratives. These features give business users predictive and prescriptive insights, with data displayed in a more digestible way and delivered in the workflows where those users live and make decisions. This ultimately increases adoption of the data-driven mindset.
While BI tooling has become far more end-user friendly, we must also keep in mind that the perception of “end-user friendly” will vary (widely) between those who are familiar with BI workflows and business users whose BI literacy falls at the novice-to-intermediate level. Technical users who have seen the evolution of the tooling can appreciate how much more accessible it is today compared to two years ago. The target audience, however, is not comparing it to what it used to be, but to the best-in-class consumer apps they use every day, which rarely need to support processes as demanding as BI design, development, and deployment.
This is not to say that the gap is insurmountable; quite the opposite. With an intentional, targeted effort to link the outcomes of the BI business case to a tactical end-user adoption plan, the path from current state to future success can be clearly defined and effectively supported, with alignment across business and technical executive sponsors. The key is to avoid underselling the organizational change effort required to achieve those outcomes. The transformation team must celebrate the vast improvement in the end-user experience of BI tools while at the same time underscoring that the next step is a BI literacy and governance initiative that establishes and activates the educational and governance infrastructure. Without it, the BI business case will fail to meet its targets.
Why migrate? Reduce technical debt, boost productivity, and more with cloud BI solutions
With ongoing economic uncertainty, IT and Finance are most certainly going to look at reducing technical debt. That usually starts with vendor consolidation. They will also look at organizational inefficiencies, such as ongoing capital expenditures on IT infrastructure, and ask questions like: Why are we spending money trying to be the experts in data cataloging, governance, security, and server performance? Why do we keep buying additional hardware and appliances to manage our data and analytics tools for a customer 360 view, when all of our client-facing products and tools (e.g., CRM, ecommerce, call center) are cloud-based and we outsource that management to a trusted vendor?
At the end of the day, the C-suite wants a 360-degree view of the customer, with data that is virtually (or literally) real time and consistent. They also realize you can’t do it all in one fell swoop. So it is critical to start with the easy wins while working on the harder problems, delivering incremental value to business users and demonstrating success to secure continued investment.
Now to determine what analytics come with us as we migrate to the cloud…
The questions we work through with our customers determine what has actually been used, what needs to be (near) real time, and how much of that data must be presented to answer the questions the business needs answered (in order to get actionable insights). In some instances, only small amounts of data, refreshed frequently, are needed for business users to achieve their goals. As we move from individual contributors to executives, the complexity and volume of data will change. Assessing those various needs must happen concurrently, so we’re not wasting valuable resources moving data that very few people or systems use simply to reduce server costs.
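As an illustrative sketch only (the field names, thresholds, and categories below are hypothetical assumptions, not a prescribed methodology), the assessment described above can be thought of as a simple triage pass over a report inventory:

```python
# Hypothetical triage of a BI report inventory ahead of a cloud migration.
# Field names and decision rules are illustrative, not a standard.

from dataclasses import dataclass

@dataclass
class Report:
    name: str
    monthly_views: int         # how often end users actually open it
    needs_near_real_time: bool # does the business need fresh data?
    gb_behind_report: float    # rough data volume backing the report

def triage(report: Report) -> str:
    """Classify a report as 'retire', 'rebuild', or 'migrate'."""
    if report.monthly_views == 0:
        return "retire"    # unused: don't pay to move it
    if report.needs_near_real_time:
        return "rebuild"   # redesign for near-real-time delivery in the cloud
    return "migrate"       # actively used and batch-friendly: move as-is

inventory = [
    Report("Exec revenue dashboard", 420, True, 2.0),
    Report("Legacy regional export", 0, False, 150.0),
    Report("Pipeline summary", 85, False, 5.5),
]

for r in inventory:
    print(f"{r.name}: {triage(r)}")
```

The point of the sketch is the ordering of the checks: usage first (so unused data never makes the truck), freshness second (so real-time needs are redesigned rather than lifted), and only then a default migration path.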
How we approach BI modernization projects at Atrium
When we approach BI modernization projects, there are a few key dynamics we guide our clients through. Specifically, within your BI environment today:
- What should stay?
- What should go?
- What should be redone?
To bring it into focus, we use the following metaphor: if you were to move houses, would you just pack up each room as is, load it on the truck, and move it to your new abode? Most likely not. So, in short, BI modernization should start with questions as simple as:
- What analytics do your end users actually need?
- What technology do you have in-house today that best aligns with those needs? (Or do you need to look outside at something new and modern?)
- What data access can we provide to improve analytics quickly and build on over time?
- How quickly can we move it to the cloud?
Reach out and let’s discuss how we can help you with BI modernization, wherever you are on your journey. In the meantime, learn more about our BI modernization expertise: