- May 8, 2017
- Janke van der Vyver
- Digital Strategy
Standardised cloud applications, the Internet of Things, social media and burgeoning network and storage capacity have resulted in an explosion of available data. As a consequence, organisations have become less precious about putting their data in the cloud, and even about selling or sharing abstractions thereof. Compute, data retention and analytics processes have also evolved to cope cost-effectively with the challenges and opportunities of big data, whether through algorithmic changes or distributed scale-out models. All of this represents a significant opportunity for industry leaders and innovators: not only can they now react faster (even ahead of time) to opportunities or issues, they can also gain game-changing insight into their customers and markets, which allows them to pull further and further ahead of their competitors. Let me explain.
The value of reaction time
Let’s say you have a burning issue, for example an under-performing department. The longer it takes to find out about it, figure out what is causing it, decide what to do about it and then act, the more value is destroyed. The same is true of opportunities. For instance, most marketing departments will tell us that there are untapped markets and customers out there that would jump at the right product or marketing approach; we just need to identify and reach them. Every minute we don’t costs us those potential gains and increases the chance that a competitor will target them before us.
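To make that concrete, here is a back-of-the-envelope sketch in Python. All of the figures (daily cost, days per stage) are hypothetical assumptions purely for illustration, but the arithmetic shows how latency compounds across the decision cycle.

```python
# Back-of-the-envelope cost of decision latency (all figures hypothetical).
value_lost_per_day = 10_000  # assumed daily cost of the under-performing department

# Assumed days spent in each stage of the decision cycle
latency_days = {
    "detect": 30,    # e.g. waiting for the monthly report cycle
    "diagnose": 10,  # e.g. waiting on IT for an ad-hoc report
    "decide": 5,
    "act": 5,
}

total_days = sum(latency_days.values())
print(f"Total decision latency: {total_days} days")
print(f"Value destroyed: ${total_days * value_lost_per_day:,}")

# Halving detection time alone (e.g. with a real-time dashboard)
saving = (latency_days["detect"] // 2) * value_lost_per_day
print(f"Saved by halving detection time: ${saving:,}")
```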
There are a few excellent articles and publications available on the subject of decision latency, so I won’t rehash them here. I would, however, highly encourage you to have a look at them if you haven’t yet. This one is an excellent place to start.
The changing role of BI and the rise of self-service analytics
So, how do we quickly identify or act on our highest risks and biggest opportunities?
We all know traditional BI is great at giving us the answers that we regularly need. It doesn’t affect the production systems and you can take the answers to the bank. Unfortunately, we also know that traditional BI suffers from a few challenges:
1) High dependency on constrained IT resources. How often does a business person ask a question, only for IT to respond that the report would take too long, would cost too much and is not a priority? So they end up spending a lot of their own time in MS Excel or Access instead.
2) Business requirements and value are unclear. It is hard to know what you need until you see what has been produced and realise that’s not quite it. And even if you get a valid answer, that answer will only raise another question. Similarly, it is hard to predict what problems or opportunities will arise and what information will be required to address them. The cumulative effect is lengthy waits for expensive BI solutions that don’t deliver to expectations, and the common response is even more exhaustive requirements gathering, integration and prototyping for the mother of all BI solutions, in the hope that that will be enough. BI, like any other project, is an investment that we perceive will have a pay-off, but there is a risk it won’t. The fact is, if we don’t change our logic but only the size of the engagement, all we are doing is changing the commensurate size of our ‘bet’, not the ‘odds’.
3) You can’t keep everything. BI platforms are expensive repositories of data, because of the need to run high-speed compute over the top, so we aggregate the data to the level of granularity that we think we will need, don’t integrate systems we don’t think are material, and purge aged data or data of unknown value. If someone asks a question that would require access to data outside this scope, the chances are we would have told them, as above, that it is not a priority anyway. Bringing third-party data into a BI platform is generally not even a consideration.
Real-time, self-service analytics and visualisation tools such as Tableau have made significant inroads into addressing these gaps. Self-service means IT is less of a cost or bottleneck. Rapid real-time analytics and superior visualisation with drill-down mean insight is gained faster, which in turn reduces decision latency. Deep, self-service analytics tools such as Alteryx and distributed workloads (“in-database”, Hadoop) mean that users can now ask complex, even predictive, questions of the data themselves and expect rapid answers. Hadoop and cloud storage also mean that those user queries can be run against raw internal and third-party data retained in data lakes for just that purpose.
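To illustrate what that self-service pattern looks like in practice, here is a minimal sketch using Python and pandas. The data-lake path and column names are hypothetical, and a real analyst would more likely do the same drill-down visually in Tableau or Alteryx; the point is that no ETL change request is needed.

```python
# Sketch: ad-hoc drill-down over raw data-lake files, no ETL change request needed.
# Path and column names are hypothetical; reading from S3 assumes s3fs is installed.
import pandas as pd

sales = pd.read_parquet("s3://data-lake/raw/sales/2017/")  # raw, un-aggregated events

# Revenue by region, then product detail within the worst-performing region
by_region = sales.groupby("region")["revenue"].sum().sort_values()
worst_region = by_region.index[0]
detail = (
    sales[sales["region"] == worst_region]
    .groupby("product")["revenue"]
    .sum()
    .sort_values()
)
print(f"Worst-performing region: {worst_region}")
print(detail.head(10))
```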
These “smart” analytics capabilities are changing the BI data and application landscape. They are, in effect, turning users into analysts, allowing visionaries to wrestle with “holy grail” questions and breaking down the traditional barriers to insight, such as “sorry, we don’t have the information” or “it is not viable to get that answer”. Instead, visionary organisations are asking, “Where can we get the information, or how can we generate it? … with a customer app? … IoT devices? … purchased from third parties?”
It is now possible to go through multiple insight cycles in the time it takes to get a single new answer using traditional ETL (Extract, Transform, Load) and reporting tools, and at a fraction of the development cost. That is understandable, because insight only requires a “statistically accurate” answer, not the compliance-level results that ETL is designed for. By quickly providing answers that would take ten times the cost and time to deliver on an ETL platform, “smart” analytics can release value early to prove the value in a BI business case, prototype the BI design much more rapidly than using the BI platform itself for prototyping, and of course ensure that the business releases far more value within a given period.
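A toy example of the “statistically accurate, not compliance-grade” point: the sketch below (simulated data, Python standard library only) estimates average order value from a 1% sample and compares it to the answer from the full data set.

```python
# Sketch: a sampled estimate versus a full-data answer. Data is simulated.
import random
import statistics

random.seed(42)
orders = [random.lognormvariate(4, 1) for _ in range(1_000_000)]  # stand-in fact table

sample = random.sample(orders, 10_000)  # a 1% random sample
mean = statistics.mean(sample)
ci = 1.96 * statistics.stdev(sample) / len(sample) ** 0.5  # rough 95% interval

print(f"Sampled estimate : {mean:.2f} +/- {ci:.2f}")
print(f"Full-data answer : {statistics.mean(orders):.2f}")
# The sampled answer typically lands within a fraction of a percent of the
# full scan at 1% of the compute: plenty for a go/no-go business decision,
# though not for a compliance report.
```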
Gartner predicts that by 2020 these “smart” analytics capabilities will be mainstream for large BI players such as Microsoft, SAP and IBM, who are already incorporating user-driven, visual-based analytics tools into their portfolios. Significantly, Gartner postulates double the business value from “smart”-capable BI solutions compared with traditional tools, specifically where users are given access to the data and the tools. In the meantime, though, Gartner also warns of the current immaturity of several of these vendors’ offerings.
I think double the value is conservative. On average that is probably true, but for the few companies that truly embrace “smart” analytics, I think the reward will be ten-fold.
Implications
What does this all mean for Business, Digital and IT strategy?
We are finding that organisations’ knowledge practices are still too much in the mode of looking backwards: “What happened? … Let’s do better” rather than “If we knew …, then we could … Let’s find out.” Chances are the data to answer ambitious questions is obtainable. Nor are organisations really geared up to act on such answers. That would require a philosophy change among the leadership of organisations, one that IT is in a prime position to influence.
All the big vendors are currently doing a good job selling their version of the new visual, user-enabled analytics world. This will be great for changing the mindset of the executive, so that organisations begin thinking of loftier information goals and faster cycles of prediction, insight and action. The caution, however, remains: where vendors advocate “bringing in all the data” or embarking on a lengthy (i.e. expensive) design-and-build initiative in order to realise their new vision, bear in mind that many of those tools are still quite immature, and that insight should not be shackled to a long lead time. Not only might the value at the end not be there, but there is no need for bullet-proof monolithic systems in order to generate insight. In fact, they are more likely to hinder it, due to a dependency on integrated data or IT input.
Given the current maturity of BI, what it comes down to is:
1) Ask for the dream: Encourage your executives and users to ask the questions whose answers will really change the business. Understand your customers and those who aren’t your customers: who should be, how to reach them, what they really need or want, and how you really compare to your competitors in their minds. You don’t need the mother of all BI, CRM or loyalty systems to begin to do so. Not anymore. Predictive tools have become exceptionally powerful, reliable and fast. How would a change in product mix or pricing affect the bottom line? Have a look (see the sketch after this list). Gartner also says that by 2020 the number of “citizen” data scientists will grow five times faster than the number of professional ones. Will they be your users, or your competitors’?
2) Get the data: If the data required to answer the questions isn’t immediately available, be creative about how to make it available. Buy it, generate it by building customer apps, think about changing your line-of-business platform, IP-enable your physical assets. Build your digital strategy on the information you want to generate in order to answer the questions that will change the game. IT’s role is no longer keeping the lights on and the cost down (that is assumed), as evidenced by the rise of the CDO.
3) Give users the data and smart tools: Keep all the raw data you could possibly use (ideally in a data lake) and don’t change it. Let users answer their questions themselves.
4) Lock in the value: Once you know there is value in an answer AND that that answer is something you will need regularly, then build the process into your ETL/BI/reporting platform (assuming the “smart” analytics vendors haven’t stepped up their game to eat the big boys’ lunch by then).
5) Do it all again: The analytics life cycle is another article entirely, but suffice it to say the faster the cycle of question, answer, action and result repeats, the more value will be released.
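As flagged under point 1, here is a minimal sketch of the kind of “what if” pricing question users can now answer themselves. Everything here is simulated for illustration, including the demand elasticity; a real exercise would fit the same model to your own sales history from the data lake.

```python
# Sketch: projecting the revenue impact of a price change. Data is simulated;
# the "true" price elasticity of -1.5 is an assumption baked into the simulation.
import numpy as np

rng = np.random.default_rng(0)

# Two years of simulated weekly price and unit-sales history
price = rng.uniform(8.0, 12.0, size=104)
units = 5000 * price ** -1.5 * rng.lognormal(0.0, 0.05, size=104)

# Fit log(units) = a + b*log(price); the slope b estimates elasticity
b, a = np.polyfit(np.log(price), np.log(units), 1)
print(f"Estimated price elasticity: {b:.2f}")

def weekly_revenue(p):
    """Projected weekly revenue at price p under the fitted demand model."""
    return p * np.exp(a) * p ** b

for p in (10.0, 10.5):  # current price versus a 5% increase
    print(f"Price {p:.2f} -> projected weekly revenue {weekly_revenue(p):,.0f}")
```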
In 2017, let us measure the business case for BI by proven value delivered, rather than in the hope of nebulous value next year.