The press and worldwide media have been full of reports over the last couple of years about a major breakthrough in information technology. We’re referring, of course, to Big Data. Now, if you believe all you read, then Big Data will be the key factor in determining how quickly enterprises grow in the next decade. But is this factually correct? Will Big Data hold the key to the future, as has been claimed? Will the technology unlock a wealth of untapped enterprise insight?
Well, not according to Irfan Khan, chief technology officer for Sybase, an industry leader in delivering enterprise and mobile software to manage, analyse and mobilise information across many data-intensive industries, systems, networks and devices. Speaking at Whitehall Media’s Big Data Analytics Conference in June, Mr Khan expanded on an article he had previously written which was published on Forbes, in which he maintained that Big Data is nothing new. In fact we’ve had access to Big Data for decades and have managed to cope without all the new technologies that we are now told are crucially important.
That, according to Mr Khan, is the big lie about Big Data: it isn’t new. We’ve always had lots of information on hand historically, and have always managed to deal with it effectively by increasing processing speeds, expanding memory and building faster connections to the information that we needed to mine for insight. So, if we’ve managed to cope up until now, why do we feel the need to constantly call for new technologies? In fact, he argued, do we really need new technologies at all? Is it not more sensible to simply process information more effectively to deliver greater and more valuable business insights?
There’s no doubt that there will be a huge explosion in the amount of data in the next 3 to 5 years, particularly unstructured data like machine data, point-of-sale data and RFID data. All of this will add to the vast amounts of data we already hold. Yet, according to Irfan Khan, we should not be overwhelmed or worried by the explosion of new data that comes on stream. All we need to do is ensure that our existing technologies are used more effectively and practically. As an example he highlighted the financial crash of 2007, and referred to an article published in the Wall Street Journal under the telling headline: “We had all the information, but no insights.”
The Wall Street Journal claimed that if a deep analysis of the information held had been undertaken prior to the bottom dropping out of the market, there was a fair chance that the crisis and subsequent crash might have been avoided. The data available about the experiences of financial crises in both the 1960s and 1980s should have made it self-evident that the markets were about to implode. However, because we look at subsets of data from a macro perspective, we often fail to see what’s there in front of us.
What’s needed is change, he argued: not just a change in emphasis, but also a change in approach. If we want to progress from being a nation of data hoarders and avoid subsequent market crashes and other crises, we have to move forward and learn to process information more practically and effectively. That’s the only way businesses and enterprises will ever manage to gain valuable commercial insights. It might sound like a simple principle, but it will become increasingly important in the next decade or so.