Is There a Big Data Bubble?

The hype and new investment dollars flowing toward Big Data are both vast. Will it live up to the anticipation?



Posted February 25, 2014

Jeff Vance


Other examples include Google finding that search queries about the flu are a quicker way to predict where the flu is spreading than previous methods (such as hospital admission records); and data analytics firm Evolv learning that employees with a criminal record actually perform slightly better in the workplace than everyone else.

One of the places “unknown unknowns” are a major problem is security. There’s a reason zero-day threats are such a concern in the security community: if you don’t know what it is, how can you block it?

However, by applying Big-Data-driven pattern recognition to threat analysis, security companies are quickly closing the zero-day window.

“When a security incident happens, one question customers always have is what happened in the moments leading up to it. In the past, we couldn’t always tell them, at least not right away,” said Mike Hrabik, president and CTO of managed security services provider Solutionary.

A problem with legacy security systems is that they can’t pull both structured and unstructured data into a single platform for analysis. Thus, when an event happens, it can take days or weeks of forensics work to figure out what happened.

To address this problem, Solutionary deployed a Hadoop platform from MapR and Cisco’s Unified Computing System for high-performance computing.

Hadoop has significantly increased the amount of data and contextual analysis that Solutionary can access, giving it greater visibility into attack indicators and a better understanding of attackers’ goals and techniques. This capability also enables Solutionary to more quickly identify global patterns across its client base.

When new threats are discovered, Solutionary can now detect and analyze related activity across all clients within milliseconds. In the previous environment, even this seemingly simple task was considerably more difficult and costly, taking as long as 30 minutes even with a preplanned, optimized environment.
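The kind of sweep described above can be pictured as a single query across a unified event store: once logs from every client live in one platform, checking a newly discovered indicator of compromise is just a filter-and-group pass. The sketch below is purely illustrative and assumes a made-up event schema (`client`, `src_ip`, `msg`); it is not Solutionary's actual system or data model, and a real deployment would run the equivalent as a distributed job on Hadoop.

```python
from collections import defaultdict

# Hypothetical aggregated events from multiple clients, mixing a few
# structured fields with a free-text log message.
events = [
    {"client": "acme",   "src_ip": "203.0.113.9",  "msg": "login failed for admin"},
    {"client": "acme",   "src_ip": "198.51.100.4", "msg": "GET /index.html 200"},
    {"client": "globex", "src_ip": "203.0.113.9",  "msg": "login failed for root"},
]

def sweep(events, indicator_ip):
    """Group every event matching a known-bad IP by client.

    With all clients' data in one store, a newly published indicator
    can be checked across the whole customer base in a single pass.
    """
    hits = defaultdict(list)
    for event in events:
        if event["src_ip"] == indicator_ip:
            hits[event["client"]].append(event["msg"])
    return dict(hits)

matches = sweep(events, "203.0.113.9")
# Both 'acme' and 'globex' saw traffic from the indicator IP.
```

The point is the architecture, not the loop: because the data is centralized, one new indicator immediately tells you which clients were touched, instead of requiring separate forensics work per client.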

“In the past, due to the sheer amount of data, our analysts were often limited to examining log data, which misses a lot. Now, our experts are unshackled. They can see the big picture and can put the incident into context,” Hrabik said.

Big-Data-driven innovation

Big Data is teaching us more about ourselves each and every day. Facebook may now know when you’ll be entering a romantic relationship before you do, simply based on your online activities. After the Super Bowl, Pornhub proudly told us what we already could have guessed: the fans of the losing team had to find other ways to entertain themselves, since the party for them was over. Less frivolously, analysis of search engine queries can now discover unknown drug side effects.

Many of the Big Data insights you’ll hear about on the news are just noise. They’re publicity tools that add little meaning to our lives.

But that probably tells us more about ourselves than about the capabilities of Big Data. (The Pornhub study showed up everywhere. Conversely, I don’t recall a single mention of the drug study; I just learned about it doing research for this story.)

Moreover, Big Data is already spawning startups looking to do everything from identifying the key influencers in social networks (in order to focus marketing efforts on them) to understanding social value in gaming.

The main difference I see between the dotcom boom and the Big Data one is that most dotcom companies targeted consumers first (as did laptops, WiFi, smartphones, tablets, etc.). The Consumerization of IT trend has taught us that consumer dollars are often easier to capture than business ones, especially for new, unproven tech products.

Big Data is the opposite. There are no real Big Data tools for consumers. The game is in the enterprise, yet the enterprise is far more cautious than your average consumer.

What will this mean?

My guess is that it means Big Data will evolve more slowly and sanely than many preceding trends. Fewer science projects will get funded. We’ll skip a dotcom-sized bubble, and companies will soon know a heck of a lot more about us than we know about them.

History may not be repeating, but I can certainly hear plenty of rhymes.


Jeff Vance is a technology journalist based in Santa Monica, California. Connect with him on LinkedIn, follow him on Twitter @JWVance, or add him to your cloud computing circle on Google Plus.



