Data volumes undoubtedly increase all the time. Experts estimate that 2.5 quintillion bytes of data are created every day from a variety of sources around the world, including sensors, social media, and mobile devices. IDC estimates the market for “big data” technology and services will grow at an annual rate of nearly 40 percent to reach $16.9 billion by 2015. The reason big data is getting big attention is that data offers the promise of insight. And insight often comes from determining what’s real and what’s not. If insight is what you’re ultimately seeking, then sorting the truth from the myths is mandatory.
Myth #1: It’s About the Size
IDC estimates that by 2015, 7.9 zettabytes of data will be stored globally and 80% of it will be enterprise-managed data. That may be true, but so what? If you’re focused on the size, the speed, and the variety, you’re missing the point. The important thing about data is the value it can potentially deliver. Yes, by definition, it’s larger than organizations can handle, but when hasn’t that been the case? Big or small, we need to focus on what the data is telling us, what we’re learning from it, and, perhaps most importantly, what to do with the information once we have it.
Myth #2: It’s About the Data
Data, in and of itself, is not an answer to anything. The answer lies in the insights you can glean from the data. But how do you get those insights? Current tools and methodologies are failing when it comes to finding the most critical information in a time-sensitive, cost-effective manner. The reason? Many of these emerging tools and technologies are trying to approach this challenge with new ways of doing the same old things. But a bigger shovel, a bigger bucket, or more people to do the digging is not what’s needed. What is needed is a completely new approach to Big Data Analytics.
The size and speed of Big Data demand true automation, in which work is offloaded from human to machine. That automation comes from algorithms, which are designed for calculation, data processing, and automated reasoning: tasks that are beyond human capacity and require the speed of machines. This is the realm of Big Data. In this new approach, the only way to find the gold is to automate the process of converting data into insight. And automation requires algorithms: fast, highly optimized algorithms that leverage sophisticated mathematics to solve complex problems of unimaginable size, thereby pushing the frontier of innovation and competition.
Myth #3: Big Data Requires Bigger, Faster Hardware
A 2010 White House Advisory Report states that, in a study of progress over a 15-year span, the speed of completing calculations improved by a factor of 43 million. Of that total, a factor of roughly 1,000 was attributable to faster processor speeds, while a factor of 43,000 was due to improvements in the efficiency of software algorithms. So, yes, bigger, faster machines are helpful. But algorithms are the answer, not bigger, faster machines.
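A small, self-contained illustration of why algorithmic improvements dominate hardware improvements (this example is mine, not from the report): counting duplicate pairs in a dataset the naive way takes time proportional to n², while a hash-table reformulation takes time proportional to n. No faster processor closes that gap once n grows.

```python
import time
from collections import Counter

def count_duplicates_naive(items):
    """O(n^2): compare every pair -- the 'bigger shovel' approach."""
    count = 0
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                count += 1
    return count

def count_duplicates_fast(items):
    """O(n): a hash table answers the same question in one pass."""
    return sum(c * (c - 1) // 2 for c in Counter(items).values())

data = [i % 300 for i in range(3_000)]  # 3,000 records with many repeats

t0 = time.perf_counter()
slow = count_duplicates_naive(data)
t1 = time.perf_counter()
fast = count_duplicates_fast(data)
t2 = time.perf_counter()

assert slow == fast
print(f"naive: {t1 - t0:.3f}s  fast: {t2 - t1:.5f}s  same answer: {slow}")
```

Scale the input by 1,000x and the naive version slows by 1,000,000x while the fast one slows by only 1,000x. That is the kind of gain no processor upgrade can match, and it is exactly the asymmetry the White House figures describe.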
Myth #4: It Requires Lots of Data Scientists
According to McKinsey & Company, “There will be a shortage of talent necessary for organizations to take advantage of big data. By 2018, the United States alone could face a shortage of 140,000 to 190,000 people with deep analytical skills as well as 1.5 million managers and analysts with the know-how to use the analysis of big data to make effective decisions.”
Currently, most companies depend on data scientists to mine these stores of information. That’s a beginning, and many front-line companies have seen significant early revenue gains and cost savings with the help of data scientists. But, let’s face it: data scientists are scarce and expensive, they can’t work in real time, and they don’t scale. Bottom line: any method that puts the burden on the user is a game-stopper.
The volume of Big Data demands a change in the human relationship to data. The algorithms have to do the work, not the humans. In this brave new world, algorithms are the protagonists. The role of analysts will be to select the best algorithms and approve the results based on speed, quality, and economics.
The Big Data Analytics revolution is underway. This revolution is an historic and game-changing expansion of the role that information plays in business, government and consumer realms. To harness the power of this data revolution, a paradigm shift is required. And that shift demands the use of automation through algorithms.