Predictive analytics: que sera, sera
Danish physicist Niels Bohr once said: “It is very hard to predict, especially the future.” Ian McMurray believes things may be changing.
There is a school of thought – to which I subscribe – that the more information you have, the harder it is to make a decision. I well understand, though, that this isn’t necessarily a mainstream position – and that it’s one generally frowned upon in the world of business, where people like to talk about ‘data-driven decision making’.
Except, of course, their decisions aren’t based on data. Well, indirectly they are. They’re based on information, which – for the most part, ideally – derives from data. Data is only useful when we can turn it into information (and information is only useful when we can turn it into timely action).
And, in this day and age – boy, do we have a lot of data. Is there anything not capable of capturing and transmitting data anymore? I recently bought a new fridge-freezer – and to my shock and amazement, I can control it (albeit in a limited fashion) with my phone. Why? Not only that: I called up the manufacturer’s service centre, and the woman I spoke to was able to interrogate the bloody thing, and could tell me how many times I open the fridge door each day, how many times I open the freezer door, what the average temperature in each compartment has been over the past week… Good grief.
Because we can
But here’s the thing. She was using data, collected by sensors in my fridge-freezer, to try to diagnose a fault. (She didn’t, but that’s another matter – except that it proves that not all the data that gets collected is valuable. I wonder sometimes if we collect it just because we can and because, increasingly, it’s very easy to do.)
The Internet of Things (IoT) is, of course, pretty much all about sensors – temperature, motion, pressure and so on. Historically, those sensors acted purely locally – but with the addition of, for example, WiFi, they became capable of transmitting the data they were sensing.
When it comes to a world awash with data, though, it’s not just about the IoT. Our online lives also generate huge amounts of data about us. That’s how, for example, Google, Facebook and Amazon have built their businesses, generating billions of dollars in ad revenue purely because they know so much about us. I’ve read somewhere that, if you downloaded the file Google holds on you – which you can apparently do – it runs to multiple gigabytes of data. (And, scarily, that includes data that you thought you’d deleted…) Now that’s what you call ‘big data’.
A British politician, way back when, opined that “the best qualification of a prophet is to have a good memory”. It’s always been true: the best predictions about what will happen are based on what has happened: it’s all about extrapolating the data. Now, we have pretty much all the data we could ever need – all stored in a memory somewhere. In fact, it’s an overwhelming amount. The challenge is making sense of it – turning it into information that in turn will form the basis of insights and actions, whether it’s data from sensors or from any other source.
I think I’ve written previously in this column about the early days of expert systems. An early application of AI, their claim to fame was that, while humans could hold a maximum of about seven pieces of information in mind when making a decision, an expert system could make better decisions by weighing many more than seven.
That was back in the 1980s. By comparison with AI today, it was pretty rudimentary stuff. Now, AI is mainstream, enabled by levels of computing power we could previously only dream of. Many of the latest mobile phones feature AI for managing resources, optimising the camera and so on. The natural language processing on which, for example, Amazon’s Alexa relies, is enabled by AI. Of more significance, however, when it comes to dealing with massive amounts of data, are neural networks and machine learning.
They’re at the heart of the resurgence in predictive analytics, an area of statistics that deals with extracting information from data and using it to predict trends and behaviour patterns. It’s nothing new: we’ve been doing it since man first observed that, when there were dark clouds, rain was likely to follow. The enthusiasm we’re seeing for predictive analytics today has been brought about by the confluence of big data on the one hand, and AI on the other – supported by a ubiquitous communications infrastructure. Now, we have the computing technology that enables us to make sense of unimaginable amounts of data – and turn it into actionable information, where and when it’s needed.
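At its simplest, ‘extracting information from data and using it to predict trends’ means fitting a line through what has already happened and reading it forward. A minimal sketch – with entirely hypothetical sales figures, and a plain least-squares fit rather than anything a real analytics platform would use – looks like this:

```python
def fit_trend(xs, ys):
    """Ordinary least-squares fit of y = a + b*x --
    the simplest possible form of 'extrapolating the data'."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Hypothetical monthly sales figures for six months...
months = [1, 2, 3, 4, 5, 6]
sales = [100, 104, 109, 113, 118, 121]

a, b = fit_trend(months, sales)
# ...extrapolated to predict month seven
print(round(a + b * 7, 1))  # -> 125.9
```

Real predictive analytics replaces the straight line with neural networks and machine-learning models, and the six data points with billions – but the principle is the same.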
The list of potential applications is almost unending – from customer relationship management to medicine, from economics to the weather, from credit scoring and airline ticket pricing to fraud detection. One application that has particularly caught many people’s attention is predictive maintenance. In the old days, we fixed things when they were broken. Then, we moved to preventive maintenance: servicing your car at regular intervals, for example, is designed to avoid it breaking down. It’s a relatively clumsy, but generally effective, way of doing it.
With predictive maintenance, we take advantage of all the data gathered by sensors to predict when a failure will occur – and take remedial action ahead of time. Cars, for example, are approaching the point where each has some 200 sensors onboard: by 2020, it is said, the automotive industry will be deploying 22 billion sensors each year. Collectively, they’re capable of capturing all the data about how the engine is performing, detecting patterns and identifying out-of-range anomalies, comparing those with historical data – and recommending action.
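One crude way to picture what such a system does is to compare today’s sensor readings against a historical baseline and flag anything that drifts out of range. The sketch below is a toy version of that idea – the readings are invented, and a production system would use far more sophisticated pattern detection than a simple standard-deviation threshold:

```python
import statistics


def flag_anomalies(history, current, k=3.0):
    """Flag readings that fall more than k standard deviations
    from the historical baseline -- a crude stand-in for the
    pattern-matching a real predictive-maintenance system does."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return [(i, reading) for i, reading in enumerate(current)
            if abs(reading - mean) > k * stdev]


# A week of (hypothetical) engine-temperature readings, degrees C
history = [88.1, 89.4, 87.9, 88.6, 89.0, 88.3, 88.8]
# Today's readings: one value drifting well out of range
today = [88.5, 89.1, 97.2, 88.7]

print(flag_anomalies(history, today))  # -> [(2, 97.2)]
```

Spotting the out-of-range reading before the component actually fails is what turns maintenance from preventive into predictive.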
And exactly the same principles could be applied to any piece of AV equipment – digital signage, for example, or a codec, a matrix or a projector. There’s no reason why any of them can’t be equipped with appropriate sensors, and an onboard AI capability. No more dark screens. No more routine maintenance. As AV becomes increasingly mission critical to many organisations, the prevention of downtime will become imperative. Predictive analytics has a role to play there.
Where do we go from here? It’s hard to conclude anything other than “more of the same”. A market research report issued by Technavio earlier this year forecasts that the predictive analytics market will grow by 23% per year over the coming years. (I imagine I’m not the only person to see the irony in that…) Sensors will be deployed in growing numbers – and our digital exposure online is unlikely to diminish; we’ll continue to have more data available to us; and we’ll have increasing hardware/software capability to make sense of it all.
I guess when the ancient Chinese philosopher Lao Tzu – a man who, it seems, spoke entirely in quotable quotes – said “those who have knowledge, don’t predict – and those who predict, don’t have knowledge” – he didn’t foresee just how much knowledge we would have in the future.