Monday, May 27, 2024

5 Big Data Apps with Effective Use Cases

Datamation content and product recommendations are editorially independent. We may make money when you click on links to our partners.
Even as organizations feel compelled to become more data-driven, many don’t know how to transform themselves out of the use-your-gut mentality and into a data-first one.

The easiest way? Take shortcuts by refusing to reinvent the wheel and following the trails blazed by early adopters. Here are 5 cool Big Data apps, along with the use cases (and end users) that are helping to change the meaning of “business as usual.”

1. Big Data application: Roambi

How this Big Data app works: One thing often overlooked in the rush toward data-driven decision making is mobility. Increasingly mobile workforces need more than basic business tools, which are so often stripped down for mobile, to manipulate data from a smartphone. Mobile workers need the ability to access and analyze the same business data they use in the office in order to make smart, on-the-go decisions.

Roambi contends that it was founded to solve this very problem. Roambi’s goal is to reinvent the mobile business app to improve the productivity and decision-making of on-the-go employees. Roambi re-designs the way people interact with, share, and present data from a completely mobile perspective.

Use case of note: The Phoenix Suns.  In addition to their goal of consistently performing at an elite level on the court, the Phoenix Suns are making big strides off the court through the use of analytics, which they use to help drive strategy for both business and basketball decisions.

While some in the NBA consider the Suns a small business in terms of infrastructure and processes, over the past three years the organization has invested significant resources not only in organizing the data it accumulates, but also in guaranteeing the accuracy of that data and ensuring it is used by decision makers across the organization.

As is the nature of any professional sports team, a majority of the Suns’ work, whether an off-site meeting or a long road trip, is done away from the office. The organization’s ownership was looking for a way to make critical business data available wherever its decision makers were located.

As the Suns began taking steps to become more mobile, there was a healthy amount of skepticism that a mobile solution could be found that was both valuable and, more importantly, easy enough for end users in the organization (most of whom don’t have a very technical background) to adopt.

That changed when the Suns adopted Roambi. The Suns started using Roambi Analytics with their front office, organizing and visualizing key player scouting information all in one place, as well as making this information available in real time.

After the success of the initial rollout, the Suns decided to expand their use of Roambi to their back office. On the business side, the Suns optimized their operations by providing KPIs across sales and marketing, reporting on everything from ticket sales to game summary reports to in-stadium promotions to customer buying behavior to inventory, all via mobile devices. Executives were all working off the same set of numbers and were able to make critical business decisions at a moment’s notice.

2. Big Data application: Esri ArcGIS

How this Big Data app works: Esri ArcGIS, as the name implies, is a Geographic Information System (GIS) that makes it easy to create data-driven maps and visualizations.

Use case of note (in this case it is more of a partnership): In mid-July at the Esri User Conference, the company radically updated its Urban Observatory project. Developed in partnership with Richard Saul Wurman and Radical Media and originally launched last year, the Urban Observatory helps cities use the common language of maps to understand patterns in diverse datasets.

“Our world has always had Big Data surrounding us that, until recently, has remained untapped for any real understanding,” said Wurman. “We are several iterations into developing a common language for mapping urbanization. It will allow cities to understand not only the major threads of their performance, land use, and contents comparatively but [also,] eventually, the nuance of change and action.”

I attended the Esri UC last week and spent plenty of time playing with (and before that standing in line to get access to) the Urban Observatory exhibit, an interactive exhibit that makes it easy to compare and contrast data from cities worldwide, all on a touch screen.

At least half of the world’s population is currently living in urbanized areas. The Global Health Observatory (GHO) projects that by 2050, 7 out of 10 people will live in a city. This year, nearly 60 cities are part of the Urban Observatory.

Participation in Urban Observatory is open to every city around the globe. Any city that has data its officials would like to share is eligible to be included. In February 2015, Urban Observatory will go on permanent display in the Smithsonian Institution.

3. Big Data Application: Cloudera Enterprise

How this Big Data app works: Not long after closing one of the biggest (if not the biggest) funding rounds in history, Cloudera is now making inroads into the Internet of Things market with its app, locking down a deal with a major home automation company in mid-July. And its close partnership with, and funding from, Intel is something you just can’t ignore either.

Use case of note: Cloudera has a ton of customers, but Wells Fargo and home automation company Vivent are two to pay attention to.  Wells Fargo has used Cloudera Enterprise to build an enterprise data hub.

Vivent is the use case that really caught my attention, though, since it ties together two of the hottest, most promising tech trends of the moment. Vivent is using Cloudera Enterprise to glean insights from the data generated by intelligent devices and systems embedded with sensors in and around homes. “[With Cloudera, we can now] look across many data streams simultaneously for behaviors, geo-location, and actionable events in order to better understand and enrich our customers’ lives. This platform has differentiated our business and given us a tremendous competitive advantage,” said Brandon Bunker, senior director, Customer Analytics and Insights.

Vivent says that it has acquired more than 800,000 customers using a variety of third-party smart-enabled devices – roughly 20-30 sensors per home. Many of those devices come in the form of thermostats, smart appliances, video cameras, window and door sensors, and smoke and carbon monoxide detectors. Without a central internal repository to gather and analyze the data generated from each sensor, Vivent was previously limited in its ability to innovate and to add higher intelligence to its security offerings.

For example, knowing when a home is occupied or vacant is important to security – but when tied into the HVAC system (which tends to be the largest contributor to a home’s energy bill and carbon emissions), you can add a layer of energy cost savings by cooling or heating a home based on occupancy. Similarly, by adding geo-location into the equation, you can begin to adjust temperature changes to a home based on the proximity to an owner’s arrival, for instance, when the owner has a connected vehicle. Studies have shown that consumers could see 20 to 30 percent energy savings by turning off HVAC systems when residents are away or sleeping.
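The occupancy- and proximity-based logic described above can be sketched in a few lines. This is a hypothetical illustration, not Vivent's actual system: the function, field names, and thresholds are all assumptions made for the example.

```python
# Hypothetical sketch of occupancy- and proximity-aware HVAC control.
# All names and thresholds are illustrative, not Vivent's API.

def target_temperature(occupied: bool, distance_km: float,
                       comfort_temp: float = 21.0,
                       setback_temp: float = 16.0,
                       preheat_radius_km: float = 10.0) -> float:
    """Choose a heating set point from occupancy and owner proximity."""
    if occupied:
        return comfort_temp   # someone is home: hold the comfort temperature
    if distance_km <= preheat_radius_km:
        return comfort_temp   # owner is nearby: pre-condition the home
    return setback_temp       # home is empty: set back to save energy

print(target_temperature(occupied=False, distance_km=42.0))  # → 16.0
```

The energy savings the studies describe come from the third branch: holding a setback temperature whenever the home is empty and no one is on the way back.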

4. Big Data application: Zaloni Bedrock

How this Big Data app works: Many businesses know they want to implement a Hadoop data lake, but don’t know how to do so in a cost-effective, scalable way. Moreover, simply putting data into Hadoop does not make it ready for analytics. To use common analytics toolsets, you must know where the data is, how it’s structured (or not), and where it came from.

You may also need to prepare it by filtering, joining datasets together, or masking out parts that are sensitive in nature. This typically takes a significant amount of time and effort and can be highly error-prone. If you’ve done a poor job ingesting, organizing, and preparing data for analytics, the results of your analytics will be equally poor. Flawed analytics can lead to flawed business decisions, and making better business decisions was the whole point of the data lake in the first place.
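The preparation steps named above, joining datasets and masking sensitive fields, can be sketched in plain Python. The records, field names, and masking rule here are illustrative assumptions, not any particular product's API; tools like Bedrock exist precisely to automate this kind of hand-written plumbing.

```python
# Illustrative data-prep sketch: join two toy datasets on a key and
# mask a sensitive field. All field names and rules are assumptions.

customers = [
    {"id": 1, "name": "Ada", "ssn": "123-45-6789"},
    {"id": 2, "name": "Ben", "ssn": "987-65-4321"},
]
orders = [
    {"customer_id": 1, "total": 120.0},
    {"customer_id": 1, "total": 35.5},
    {"customer_id": 2, "total": 80.0},
]

def mask_ssn(ssn: str) -> str:
    # A typical masking rule: keep only the last four digits.
    return "***-**-" + ssn[-4:]

by_id = {c["id"]: c for c in customers}          # index for the join
joined = [
    {"name": by_id[o["customer_id"]]["name"],
     "ssn": mask_ssn(by_id[o["customer_id"]]["ssn"]),
     "total": o["total"]}
    for o in orders
]
print(joined[0])  # → {'name': 'Ada', 'ssn': '***-**-6789', 'total': 120.0}
```

Even this toy version shows why hand-rolled preparation is error-prone: every new source means more join keys, more masking rules, and more places for mistakes to hide.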

With Zaloni Bedrock, the process is automated. According to Zaloni, you set it up once and you’re done. It doesn’t matter how much data you are adding to the lake, since there is no technical limit.

Zaloni argues that without a product like Bedrock to help you along, 60 percent or more of the time and effort you spend to build an analytics system using a Hadoop data lake will be spent on data management and data preparation alone.

Use case of note: UnitedHealth Group’s Optum division, an IT and tech-enabled health services business, uses Bedrock as part of its data platform to manage services like data ingest and workflow execution. Bedrock enables Optum to monitor multiple data sources, capture and store schema and operational metadata, and offer features like data catalog search to end users.

5. Big Data application: Tamr

How this Big Data app works: Tamr is a data-connection and machine-learning platform designed to make enterprise data as easy to find, explore, and use as a Google search. According to Tamr, due to the cost and complexity of connecting and preparing the vast, untapped reserves of data sources available for analysis, most organizations use less than 10 percent of the relevant data available to them.

It’s just too manual, too inefficient and too expensive to connect and ready the massive variety of internal and external data for analytics and other applications critical for business growth. Tamr argues that if the industry is going to be successful at helping customers manage the growth and variety of data that lies ahead – from internal sources, external public and private sources, Internet of Things feeds, etc. – a complete overhaul of traditional methods of information integration and quality management will be required.
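At the heart of that integration problem is deciding whether two records refer to the same real-world entity. As a toy illustration of the kind of matching decision Tamr's machine learning automates (its actual models are far richer), a bare-bones version can be built from a string-similarity score and a threshold; the threshold value here is an arbitrary assumption.

```python
# Toy entity-matching sketch: score two organization names with a
# string-similarity ratio and apply a threshold. This stands in for
# Tamr's machine-learning approach only as an illustration.
from difflib import SequenceMatcher

def same_entity(a: str, b: str, threshold: float = 0.8) -> bool:
    """Guess whether two name strings refer to the same organization."""
    score = SequenceMatcher(None, a.lower(), b.lower()).ratio()
    return score >= threshold

print(same_entity("Thomson Reuters Corp.", "Thomson Reuters Corporation"))  # → True
print(same_entity("Thomson Reuters", "Reuters Health"))                     # → False
```

Scaling this from a pair of strings to millions of records across dozens of sources is exactly where manual rules break down and learned models earn their keep.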

Use case of note: Multinational media and information company Thomson Reuters faced challenges maintaining critical, accurate data. It had outgrown its manual curation processes and looked to Tamr to provide a better solution for continuously connecting and enriching its core enterprise information assets (data on millions of organizations with more than 5.4 million records pulled from internal and external data sources).

Using Tamr, one project that Thomson Reuters estimated would take six months was completed in just two weeks, requiring only forty hours of manual review time, a 12x improvement over the previous process. The number of records requiring manual review shrank from 30 percent to 5 percent, and the number of identified matches across data sources increased by 80 percent, all while achieving Thomson Reuters’ 95-percent precision benchmark.

Tamr says that the disambiguation rate (or the rate of resolving conflicts) rose from 70 percent to 95 percent. Furthermore, the knowledge Tamr gleaned from its machine learning activities means that future data integration will take even less time per source.

Photo courtesy of Shutterstock.
