5th April 2016 • James Para
Within the top Business Intelligence and Analytics trends tipped for 2016, one of the prevalent themes is the growing usage of Predictive and Prescriptive Analytics. As the most discussed topic among Business Intelligence professionals, this new trend has resulted in 2016 being christened “The Year of Prescriptive Analytics”.
As the market for Big Data technologies and services grows at a 23.1% compound annual growth rate, the value of the sector is set to reach $48.6 billion by the year 2019*. One key facet of this growth is the development of the prescriptive analytics software market.
Predicted to reach $1.1 billion by 2019, with a 22% compound annual growth rate* from 2014, the prescriptive analytics market is rapidly expanding. Whilst just 10% of organisations currently use some form of prescriptive analytics, by 2020 this figure is set to grow to 35%. That’s a jump of 25 percentage points.
Currently, only 30% of companies are using predictive analytics to provide insights into their data. This is set to grow over the next few years.
Extracting information from datasets to forecast future probabilities, predictive analytics is an extension of data mining. One of its particularly effective aspects is that it accounts for uncertainty – offering alternative scenarios and levels of risk assessment. This gives organisations the capability to analyse their current data alongside historical information to develop a greater understanding of their customers and partners, and to isolate potential risks.
Predictive analytics is, in itself, named as one of the key emerging trends within Business Intelligence and Analytics in 2016. And it’s not surprising.
The product of two methods – Artificial Neural Networks (ANN) and the Autoregressive Integrated Moving Average (ARIMA) – predictive analytics helps to uncover underlying patterns within both unstructured and structured data. It’s amazing.
ANN mimics neurons firing within our brains. It works like this: information flows into a mathematical “neuron”, where it’s combined and squashed through an activation function, and the result flows out. That process is captured in a mathematical formula (look it up, it’s intense) which is repeated across many interconnected neurons. That’s a mathematical method for processing data based on the way in which we process information in real life!
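As a rough sketch of that idea (the inputs, weights and bias below are made-up numbers, not from any real model), a single artificial neuron boils down to a weighted sum plus a squashing function:

```python
import math

def neuron(inputs, weights, bias):
    """One artificial 'neuron': weighted sum of inputs, then a sigmoid
    activation that squashes the result into the range (0, 1)."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))

# Two made-up input signals with made-up weights.
out = neuron([0.5, 0.8], [0.4, -0.2], 0.1)
```

A full network just wires thousands of these together in layers, feeding each neuron’s output into the next layer’s inputs.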
ANN and ARIMA tackle different parts of the model for predictive analytics. ARIMA handles time series analysis – this means it takes past data and then inspects the autocorrelations within that data. It looks at things like how current values compare to past values, and then it predicts the future. Or at least, the future pertaining to the data it just analysed.
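The autocorrelation idea at the heart of ARIMA is simple enough to sketch by hand. This toy example (the data is an invented, steadily rising series) measures how strongly each value tracks the one before it:

```python
def autocorrelation(series, lag=1):
    """Lag-k autocorrelation: how strongly current values track past ones.
    Values near +1 mean the series is highly predictable from its own past."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series)
    cov = sum((series[i] - mean) * (series[i - lag] - mean)
              for i in range(lag, n))
    return cov / var

# An invented upward-trending series: each value echoes the last.
trend = [10, 12, 13, 15, 16, 18, 19, 21]
r1 = autocorrelation(trend, lag=1)
```

A strongly positive result here is exactly the signal an ARIMA model exploits: the past is informative, so it can be projected forward.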
Organisations then use this data to inform their decision making and reduce the possibility of error.
One step on from predictive analytics (like a smarter, younger sibling), prescriptive analytics doesn’t just predict the future – it offers insights based on potential decisions which haven’t even been made yet. It looks at the future outcomes of numerous actions.
“Essentially, we’ve moved from ‘why did this happen?’ to ‘what will happen?’ and we’re now moving to ‘how do we make this happen?’ as an analytics methodology.” *
Examining content or data to see what kind of decision should be made, or how to take steps towards the completion or fulfilment of a goal, prescriptive analytics utilises techniques like simulation, graph analysis, neural networks, machine learning, and recommendation engines. That’s right, I said machine learning. Helping to optimise scheduling, inventory and supply chain design, among other things, prescriptive analytics is tipped to be the future of analytics. Think of it as Business Intelligence 2.0.
“In prescriptive analytics, algorithms basically run wild.”*
By machine learning, we’re talking about the use of extremely sophisticated algorithms, created to adapt to changes within established parameters. Big Data is just too… well, big to manage with fixed rules, so the point of prescriptive analytics is to reduce the number of rules in play. You still have them, but they exist as triggers for the algorithms to respond to.
“It’s called dynamic calibration, and some of that is guided by rules. A rule could be, ‘If you see this data change by more than 10%, trigger recalibration.’” – Atanu Basu
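A rule like the one in that quote is trivial to express in code. This is only an illustration of the trigger logic (the function name and threshold default are mine, not from any particular product):

```python
def needs_recalibration(previous, current, threshold=0.10):
    """Dynamic-calibration trigger: return True when the new reading has
    shifted by more than `threshold` (10% by default) relative to the
    previous one."""
    change = abs(current - previous) / abs(previous)
    return change > threshold

small_shift = needs_recalibration(100, 108)  # 8% change
big_shift = needs_recalibration(100, 115)    # 15% change
```

The point of dynamic calibration is that the rule stays dumb and cheap; the expensive model retraining only happens when the rule fires.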
Intelligence as a Service (incidentally another trend to watch this year) vendors have already started to incorporate aspects of predictive and prescriptive analytics into their packages – particularly the use of ARIMA.
Demonstrable incorporations of prescriptive analytics into these offerings include Python (whose statsmodels package includes time series analysis – that’s ARIMA), R (whose forecast package automatically selects an ARIMA model) and Ruby (whose statsample-timeseries gem includes ARIMA models). Other notable packages drawing upon predictive and prescriptive analytics include MATLAB, SAS and SQL. IBM is also offering prescriptive analytics capabilities through SPSS and other software products.*
What’s forecast as a result of this continued integration of predictive and prescriptive analytics tools into these vendor packages is that organisations will be able to purchase BI Analytics packages that make the decisions for them.
“As it gets increasingly difficult to find data scientists to mine the rapidly growing volumes of data for business insights, companies will turn to autonomous services for machine learning.” *