
5 Big Data Predictions for 2017


By Nitin Donde, Founder and CEO, Talena, Inc.

Increasingly, companies understand the central role data can play in their success, and they treat data as a critical business asset. As a result, companies are more committed than ever to protecting, managing and analyzing data within modern architectures. One thing is certain: for many, big data offers opportunity and challenge in equal measure.

So which big data trends will move past their natural cycles of hype to real-world deployment? How will we ensure that essential data is easily accessible and always secure? And what role will machine learning play in the months and years ahead?

Here are our picks for five areas that will remain at the top of the list in 2017:

1. Understanding the value of long-lived and short-lived data

Not all data is relevant all the time. Server metrics and log data are highly relevant to understanding how a particular piece of infrastructure is performing at a given point in time, but are typically kept for at most 30 or 60 days. Customer purchase records are often kept for several quarters for trend and pattern analysis, and sometimes for years for compliance reasons. Speed and depth of analysis: most companies today need both. Understanding the value of short- and long-lived data, how business-critical each data set is, and when purging makes sense are all essential considerations. Each has a role to play in a company’s success, and that will remain true going forward.

These considerations will be increasingly relevant for companies as they think about how to store and process these different data sets. Multiple concerns come into play: identifying the necessary data management infrastructure, investing in the right deployment model and optimizing storage consumption.
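
As a rough illustration, retention windows like these can be encoded as an explicit, age-based purge policy. The sketch below is a minimal example in Python; the category names and windows are hypothetical assumptions for illustration, not drawn from any particular product:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention windows by data category (illustrative only).
RETENTION = {
    "server_logs": timedelta(days=60),              # short-lived operational data
    "customer_purchases": timedelta(days=7 * 365),  # long-lived, kept for compliance
}

def should_purge(category: str, created_at: datetime) -> bool:
    """Return True if a record has outlived its retention window."""
    window = RETENTION.get(category)
    if window is None:
        return False  # unknown categories are retained by default
    return datetime.now(timezone.utc) - created_at > window

# A 90-day-old log record is past its 60-day window; a purchase record is not.
old_record = datetime.now(timezone.utc) - timedelta(days=90)
print(should_purge("server_logs", old_record))         # True
print(should_purge("customer_purchases", old_record))  # False
```

Separating the policy (the table of windows) from the mechanism (the purge check) keeps retention decisions auditable as new data categories are added.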

2. The public cloud juggernaut

According to Cisco’s recent Global Cloud Index (GCI) forecast, public cloud computing is growing at a much faster rate than the private cloud. The report projects that by 2020, 68 percent of all cloud workloads will be in public cloud data centers, up from 49 percent in 2015.

The recent announcement that Workday chose AWS over its own private cloud is a clear indicator of how even large, established companies increasingly trust the public cloud with critical workloads. What’s more, the total volume of data generated by IoT alone is expected to reach 600 zettabytes per year by 2020, 275 times the projected traffic from data centers to end users.

As cloud-based deployments become increasingly mainstream, companies must understand the scale, security and compliance implications of how data is processed and stored as part of their new or forklifted workloads.

3. Security and agility: moving beyond the breach

Today’s growing data assets rarely stay put in a single infrastructure environment. For greater business agility, they are always on the move: from a storage environment to an analytics cluster, or from an on-premises data center to the public cloud.

The combination of volume and velocity in these larger data assets makes many privacy advocates nervous about the potential not just for data breaches (hello, Yahoo!) but also for data exposure to employees who are not authorized to view these assets. Yet there are steps enterprises can take to bridge the agility-privacy divide and still thrive in a world increasingly built on rapid processing of large data sets.

The common dictum that “no enterprise is an island” applies equally to big data infrastructure. Companies will continue to use different big data systems for different business purposes, and because these data assets are often on the move, there is a need for transparency around the customer privacy contract. Most privacy concerns center on the notion that data is “secure” and won’t be knowingly or unwittingly shared with any third party, but that framing often overlooks unintended internal access to these data sets.

Thinking beyond breaches to better understand and address compliance exposure will become a bigger part of security discussions and strategies. And given the speed at which data moves, it is imperative to think about, and protect, data as it travels across the enterprise.
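
One concrete way to limit unintended internal access is to gate every read of a sensitive data set behind an explicit grant check. Here is a minimal sketch of that idea in Python; the roles, data set names and dictionary-backed grant table are hypothetical simplifications, not any vendor’s access model:

```python
# Hypothetical role-to-dataset grants; a real deployment would back this
# with a directory service or policy engine rather than an in-memory dict.
GRANTS = {
    "analytics_cluster": {"clickstream", "purchases"},
    "backup_operator": {"backups"},
}

def authorize(role: str, dataset: str) -> None:
    """Raise PermissionError unless `role` was explicitly granted `dataset`."""
    if dataset not in GRANTS.get(role, set()):
        raise PermissionError(f"{role} may not read {dataset}")

authorize("analytics_cluster", "purchases")   # allowed: explicit grant exists
# authorize("backup_operator", "purchases")   # would raise PermissionError
```

The point of the sketch is the default-deny posture: access exists only where a grant was deliberately recorded, which is what closes the unintended-internal-access gap.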

4. Analytics crossing new boundaries

While most companies agree that the real value of data lies in the ability to filter, analyze and interpret it, Forrester Research found that most of the companies it surveyed analyze just 12 percent of the data they have.

What’s more, the “adoption of technology to continuously analyze streams of events” continues to accelerate as it is “applied to IoT analytics, which is expected to grow at a five-year compound annual growth rate (CAGR) of 30%” according to IDC.
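
To make “continuously analyzing streams of events” concrete, here is a minimal sliding-window sketch in Python; the event timestamps and ten-second window are illustrative assumptions, not tied to any particular IoT platform:

```python
from collections import deque

class SlidingWindowCounter:
    """Counts events seen in the last `window_seconds` as a stream arrives."""

    def __init__(self, window_seconds: float):
        self.window = window_seconds
        self.timestamps = deque()

    def record(self, ts: float) -> int:
        """Record an event at time `ts` (seconds) and return the current count."""
        self.timestamps.append(ts)
        # Evict events that have fallen out of the window.
        while self.timestamps and ts - self.timestamps[0] > self.window:
            self.timestamps.popleft()
        return len(self.timestamps)

# Sensor readings arriving over time, counted against a 10-second window.
counter = SlidingWindowCounter(window_seconds=10)
for t in [0.0, 2.5, 4.0, 11.0, 12.0]:
    print(t, counter.record(t))  # prints counts 1, 2, 3, 3, 4
```

Production systems use dedicated stream processors rather than a hand-rolled counter, but the core idea, incremental computation over a bounded window instead of repeated batch scans, is the same.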

Optimizing procedures and IT operations to deliver better business outcomes is key, and companies should focus their efforts on understanding which information is valuable and which can be largely ignored. Applying analytics to typically idle, underused data assets such as backup copies is another way companies can shed light on customer interactions and revenue trends while simultaneously reducing data processing costs.

Analytics will continue to have an enormous impact in almost all industries, including health care, retail, e-commerce and financial services. The speed with which companies can analyze and react will be increasingly critical — and for many will mean the difference between success and failure.

5. Machine learning becoming machine intelligence

In a recent survey of executives at 168 companies with more than $500 million in sales, 76 percent said they are targeting higher growth through machine learning. Without question, big data has greatly accelerated machine learning’s relevance. Companies that have already addressed processing and storing large, diverse data sets now need to understand what that data means, and to do so at a granular level. Machine intelligence built on machine learning is proving to be an invaluable asset in this effort.

The coming years will see a rapid intertwining of big data technologies and processes with machine learning techniques, enabling the more than 35 zettabytes of data expected to be generated in 2020 to deliver useful, impactful business outcomes. If machine learning isn’t affecting your organization today, we predict it soon will be.

2017 and Beyond

Those of us in IT are used to riding the waves of change and preparing for them. 2017 will be no different, no matter what the hottest tech trends happen to be. Those who successfully manage the challenges of running a business in today’s data-centric world will reap big rewards.
