Supersize Me: Hadoop Upgrade Will Handle Even Bigger Data

Apache is planning a version of Hadoop that will support 6,000-node clusters.


InfoWorld: According to Hortonworks, the next version of the Apache Software Foundation's Hadoop big data processing framework will handle even more data and deliver faster performance. Version 0.23 of the open source tool, currently in alpha, will eventually run across clusters of 6,000 machines, each with 16 or more cores, and process 10,000 concurrent jobs. It will also add federation and high availability to the Hadoop Distributed File System (HDFS), and it will incorporate the "YARN" (Yet Another Resource Negotiator) overhaul of MapReduce.
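For readers unfamiliar with the MapReduce jobs the article refers to, the canonical example is word counting: a mapper emits a count of 1 for each word, and a reducer sums the counts per word. The sketch below uses Hadoop Streaming conventions (tab-separated key/value pairs on stdin/stdout); it is a generic illustration, not tied to version 0.23 or any Hortonworks release, and the script name `wc.py` is hypothetical.

```python
#!/usr/bin/env python
# Minimal word-count mapper/reducer in the Hadoop Streaming style.
# Generic illustration only -- not specific to any Hadoop release.
import sys
from itertools import groupby

def map_lines(lines):
    """Mapper: emit one tab-separated '<word>\t1' pair per word."""
    for line in lines:
        for word in line.split():
            yield "%s\t1" % word

def reduce_pairs(pairs):
    """Reducer: sum the counts for each word. Input must arrive
    sorted by key, which Hadoop's shuffle phase guarantees."""
    parsed = (p.split("\t") for p in pairs)
    for word, group in groupby(parsed, key=lambda kv: kv[0]):
        yield "%s\t%d" % (word, sum(int(count) for _, count in group))

if __name__ == "__main__":
    # Run as the mapper or the reducer depending on the first argument,
    # e.g. (hypothetical invocation):
    #   hadoop jar hadoop-streaming.jar \
    #       -mapper "wc.py map" -reducer "wc.py reduce" \
    #       -input /data/in -output /data/out
    stage = sys.argv[1] if len(sys.argv) > 1 else "map"
    step = map_lines if stage == "map" else reduce_pairs
    for record in step(line.rstrip("\n") for line in sys.stdin):
        print(record)
```

Between the two stages, the framework sorts and groups the mapper output by key and distributes it across the cluster, which is what lets the same two small functions scale to the thousands of nodes described above.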

The Hadoop update should become available as a general release later this year.

Tags: Hadoop, open source, big data
