HP has announced a new line of ProLiant servers designed specifically for big data. The ProLiant SL4500 comes equipped with enough storage and multi-core processing power to handle multiple petabytes of data.
ZDNet’s Sam Shead reported, “Hewlett-Packard is harnessing the brains of massively multicore chips from Intel and Nvidia to power a server designed to work with Big Data applications. The purpose-built ProLiant SL 4500 server crams more storage into the box than its rivals. By using either Xeon Phi or Nvidia Kepler processors, the server will help customers to handle vast amounts of data, HP said in a statement.”
Sean Michael Kerner from ServerWatch explained, “The HP ProLiant SL4500 server platform can provide up to 240 Terabytes of Big Data capacity in a 4.3U server chassis. The system can be scaled up to a nine-chassis configuration that delivers up to 2.16 Petabytes of capacity. Each chassis has 12 DIMM slots that can provide 192 GB of node memory. The SL4540 can be configured with 2.4 GHz Intel Xeon E5-2400 processors, while the SL4545 can be configured with AMD Opteron 4200 series 8 core 3.3 GHz processors.”
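The scale-out math in Kerner's figures is easy to verify: nine chassis at 240 TB each gives the quoted 2.16 PB. A quick back-of-the-envelope check (assuming decimal units, i.e. 1 PB = 1,000 TB, as vendor capacity figures typically are):

```python
# Sanity-check the ProLiant SL4500 capacity figures quoted above:
# 240 TB per chassis, scaled out to a nine-chassis configuration.
per_chassis_tb = 240
chassis_count = 9

total_tb = per_chassis_tb * chassis_count  # 2160 TB across all chassis
total_pb = total_tb / 1000                 # decimal petabytes

print(f"{total_tb} TB = {total_pb} PB")    # prints "2160 TB = 2.16 PB"
```

Note that marketing capacity numbers use decimal (SI) prefixes; in binary units (1 PiB = 1,024 TiB) the usable figure would be somewhat lower.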
Chris Preimesberger from eWeek observed:
Most enterprise servers already have the capability to handle processing, storage and analysis of sizable workloads in the terabytes-to-hundreds-of-terabytes neighborhood. But HP is talking about multiples of that: petabytes and multiple petabytes.
“Conventional siloed servers and storage just doesn’t work anymore,” Jim Ganthier, HP Vice President of Marketing and Operations for Servers and Software, told eWEEK. “With increasing storage requirements and business needs changing, we figure that this is an unsustainable approach. These siloed architectures are no longer viable. They cost you too much, require too many admins, use too many different tools — and frankly, from a physicality perspective, it’s no longer applicable.
“What we decided to do is to use converged infrastructure optics to come up with the ProLiant SL 4500 server. The thing that separates this one from the others is that it is built new and from the ground up for big data workloads.”
The Register also quoted Ganthier, who added, “Customer demand is our biggest problem right now, and we can’t make them fast enough.”