When Big Data Gets Real Big: On Operationalizing and Optimizing Information

Dianna Labrien
May 30, 2016

The global Big Data market is expected to reach $67 billion by 2021, growing at a compound annual rate of 18.45%. While most companies already have the capabilities for big data analysis, they still struggle to break down data silos and make data more accessible.

A recent survey conducted by Attivio revealed a telling gap. While 98% of respondents said their company encourages employees to ground business decisions in data and evidence, only 23% felt their company was extremely successful at leveraging big data to make decisions.

Jammed Data In All The Wrong Places

Almost 41% of respondents said the data they work with is too siloed within business functions for quick access. Just 23% said their enterprises utilize more than three quarters of the big data at their disposal. Even when data seems accessible, 37% admitted it takes a day or more to access the sources needed for analytics; in extreme cases it can take more than a week.

Among the additional challenges, 59% of respondents said their legacy data storage systems require too much processing to keep up with modern business requirements. Data scientists also appear to be in short supply: 66% of respondents said that finding and hiring top talent is extremely difficult.

Sasha Williamson, CEO of Luminet, one of the UK's leading managed services providers, comments on the results: "We are still steps away from establishing truly data-driven companies. While the tech side of storing and processing data has already been solved by modern hardware, at least that's the case without virtual data center services, most enterprises still lack in-house tools and standards for unifying and digging through their silos. I believe more development of object-oriented data analysis combined with automation of SDN and NFV technologies will help towards the goal of more data-driven companies."

IoT Industry As The Next Data Hoarder?

According to Kevin Kalish, IoT Domain Lead at SAS, most companies have got it absolutely wrong when it comes to collecting big data. Kalish argues that having more data is not always a good idea, especially when the organization isn't sure how it plans to use it. Data storage and processing costs quickly add up unless the business model involves some form of big data monetization.

"When we talk about big data in the context of IoT, I actually think big data is a big miss because the focus, to a degree, has been and still is on storing the data cheaper," says Kalish.

Operationalizing Big Data: The Key Challenges

While it's rather obvious that big data is, well, big, most companies still underestimate the scale. Over 2.5 exabytes of new data are created each day, and that figure has been growing steadily month by month. As a result, many companies that collect large amounts of customer data end up sitting on huge data lakes without the slightest clue of how to proceed.
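To put that daily figure in perspective, a quick back-of-the-envelope calculation (assuming the 2.5-exabyte-per-day estimate above and decimal units, where one exabyte is 10^18 bytes):

```python
# Rough scale of global data creation, assuming 2.5 exabytes (10^18 bytes) per day.
EXABYTE = 10**18
daily_bytes = 2.5 * EXABYTE

seconds_per_day = 24 * 60 * 60          # 86,400
bytes_per_second = daily_bytes / seconds_per_day

# Express the per-second rate in terabytes (10^12 bytes).
terabytes_per_second = bytes_per_second / 10**12
print(f"{terabytes_per_second:.1f} TB of new data every second")  # ~28.9 TB/s
```

In other words, humanity produces roughly the contents of a large enterprise storage array every single second, which is why "just collect everything" quickly becomes untenable.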

Take Data Diversity Into Account

The problem is that data is now growing in multiple directions: through social platforms, GPS signals, IoT devices, smartphones, push notifications and more. All of these channels are still relatively new, yet new ones seem to emerge every other day. Most companies are now migrating to the cloud, for instance, which will lead to even greater diversity in Big Data and more challenges in controlling it.

Lack of Proper Methodology

As Williamson noted above, the lack of an established framework for processing big data is a common challenge for large enterprises. Most companies try to force huge data sets into patterns that can be analyzed with existing methodologies. Yet businesses keep evolving, and it's only a matter of time before this approach becomes a major obstacle.

Companies should work on creating new, efficient ways of structuring data to allow greater accessibility. One such strategy is to integrate data as a foundation for cross-functional analysis while simultaneously developing ways to track and measure the key data that can inform the business.
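As a minimal sketch of that integration idea (all silo names and records here are hypothetical), the snippet below merges customer records from two departmental silos into one cross-functional view keyed on a shared customer ID:

```python
# Hypothetical records held separately by two departmental silos.
crm_silo = [
    {"customer_id": 1, "name": "Acme Ltd", "segment": "enterprise"},
    {"customer_id": 2, "name": "Foo GmbH", "segment": "smb"},
]
support_silo = [
    {"customer_id": 1, "open_tickets": 3},
    {"customer_id": 2, "open_tickets": 0},
]

def integrate(*silos):
    """Merge records that share a customer_id into one unified view."""
    unified = {}
    for silo in silos:
        for record in silo:
            unified.setdefault(record["customer_id"], {}).update(record)
    return unified

view = integrate(crm_silo, support_silo)
print(view[1])
# {'customer_id': 1, 'name': 'Acme Ltd', 'segment': 'enterprise', 'open_tickets': 3}
```

The key design point is the shared key: once every silo agrees on a common identifier, cross-functional questions ("which enterprise customers have open support tickets?") become a simple lookup rather than a multi-week data-wrangling project.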

"By collecting a wide variety of customer interaction data, including social media interactions, organizations can leverage data to understand the customer and customer experience better to improve customer retention and customer experience," comments Dan Vesset, vice president of IDC's Business Analytics and Big Data program.

In the modern world, every customer can be treated as a data transmitter. As information pours into ever-growing data lakes, we need to establish ground rules that define a basic structure for storing all the available data in an efficient, easy-to-access manner. And as the current state of affairs shows, there is still plenty of room for improvement.
