The pace of IT development is fast and relentless. What was bleeding edge just a few short years ago becomes old hat, replaced by the next big thing. Generally, no one can predict in detail what that next advancement will be, either. Since no one wants to replace everything in their data center to accommodate the newest technology, maintaining flexibility is extremely important. Remaining flexible means that you can continue to utilize the capital investment you’ve already made, while rolling in the next wave of new products and services with minimal cost and disruption.
Most companies don’t have the luxury of starting from scratch every few years and building out a new IT infrastructure from the ground up. Instead, they have to work with what they’ve got. That means dealing with an existing array of operating systems, hypervisors, servers and storage. It means adding in cloud connections and services to existing architectures, systems and processes. It means moving forward into an ever more virtualized world while still supporting the status quo.
What Makes DP Flexible?
What does it mean to talk about flexibility with respect to data protection (DP)? Your DP solution has to provide the options you need to support what you already have, while allowing you to change and add products and services as your needs evolve. Flexibility enables IT to protect these investments and to scale and dynamically improve the infrastructure. Given today’s huge data growth and highly mixed workloads, the ability to do this quickly and cost-effectively is highly valuable.
Let’s talk briefly about the evolution of backup and what challenges still remain. (There are a lot of them.) Backup used to be all about meeting time constraints within backup windows. That is still an issue given the growth of big data, but today’s data protection environment is vastly more complex.
With the introduction of virtualization, many companies chose to use their existing backup applications to back up virtual machines as well. This worked for smaller, simpler virtualized networks but became problematic as virtualization grew. Eventually companies turned to virtualization-optimized backup: they either adopted a new backup application that protected both virtual and physical servers, or they kept their physical backup product and added a new virtual backup product alongside it. At the same time, companies bought additional backup products for specific application needs, such as a continuous backup system for mission-critical applications.
Backup target choices also grew more complex. Companies could back up to on-premises disk and/or tape, to an external data center, and/or to the cloud — assuming that they could sufficiently accelerate their WAN. Even in the cloud, backup storage decisions remain: when it becomes too expensive to store massive amounts of data there, IT must either delete large volumes of old data or create yet another storage tier by moving old files to cold cloud storage.
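To make that last option concrete, here is a minimal sketch of tiering aging backup data to cold cloud storage with an S3 lifecycle rule via boto3. The bucket name, prefix, and day counts are illustrative assumptions, not recommendations, and other clouds offer equivalent archive tiers.

```python
"""Sketch: automatically tier aging backup objects to cold cloud storage.

The bucket name, prefix, and retention periods below are hypothetical;
adjust them to match actual retention policy.
"""
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-backup-bucket",              # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-old-backups",
                "Filter": {"Prefix": "backups/"},  # apply only to backup objects
                "Status": "Enabled",
                # After 90 days, move objects to the Glacier archive tier.
                "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
                # After a year, expire (delete) them entirely.
                "Expiration": {"Days": 365},
            }
        ]
    },
)
```

A rule like this keeps recent backups on warm cloud storage for fast restores while pushing rarely touched copies to a far cheaper tier, avoiding the delete-or-pay dilemma described above.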
Today’s DP environment has morphed into something highly complex: physical and virtual servers, multiple backup applications serving different needs, massive backup sizes, and widely varying data types. Now add a plethora of backup targets ranging from on-premises disk and tape, to remote hot sites, to the cloud. All of this represents a massive shift in traditional infrastructure, and introducing new software and hardware into the architecture can be a risky and expensive undertaking.
Many companies with this complex infrastructure would be happy to replace it with a comprehensive backup application platform — if they did not have to rip and replace. All too often they do, resulting in a nightmare of retiring existing assets, scheduling time-consuming deployments and optimization, retraining staff, reassigning responsibilities, and changing many processes.
An Alternative: Flexible Introduction to the Data Center
A far more workable plan is to adopt a backup platform that has comprehensive capabilities but deploys in simple steps that protect existing investments. Over time, more processes move to the new platform; in the meantime, flexible deployment lets IT go at its own pace. This kind of solution protects hardware and software investments by installing on non-proprietary hardware and by working with existing processes. It offers a centralized management console that is simple to manage even in a small IT organization, and that seamlessly admits new modules and processes. Below are the critical features that a flexible platform should provide:
· Dynamic scaling. The platform should scale from a small data center or small implementation in a larger one, up to thousands of physical and virtual servers. Scaling should also allow for hot upgrades and linear performance and storage improvements.
· Pre- and post-command support. Flexibility also means being able to support existing, well-established backup programs such as a widely deployed array-based backup or a specific application backup process. A flexible backup platform will support pre- and post-command sequences to launch, process, and then close the third-party backup utility (a minimal sketch of such a wrapper appears after this list). This is not a rare feature — Unitrends, Symantec, CommVault, and IBM all support it — but it is an important one.
· Tape support. Some newer backup vendors prefer to ignore tape. This can work in the rarified world of continuous backup and immediate recovery objectives. But a comprehensive backup platform cannot afford to ignore customer investment in expensive tape libraries. Tape remains a large and important backup domain in many corporations.
· Software compatibility. The DP platform ideally integrates with multiple fabrics, operating systems, server and storage types, and many applications. Flexibility is also important with hypervisors: not only VMware and Hyper-V but also XenServer, KVM, and Oracle VM.
· Single pane of glass management. A single-pane-of-glass management environment expands with additional modules or services, and integration should be seamless. It should also present policies, schedules, and reporting functionality clearly.
· Non-proprietary servers and storage. A flexible system will not force proprietary hardware on its customers. The more flexible the system, the more investment protection it offers IT for capital and ongoing costs and staff expertise. And when IT does decide to retire hardware in favor of new purchases, the flexible backup system easily runs on the new systems.
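To illustrate the pre- and post-command idea from the list above, here is a minimal, hypothetical sketch of a wrapper that quiesces an application, launches an existing third-party backup utility, and then resumes the application. The command paths are placeholders, not any vendor’s actual hooks.

```python
"""Sketch of a pre-/post-command wrapper around a third-party backup utility.

All command strings are hypothetical placeholders; a real platform would
supply its own hook points and invoke the actual backup binary.
"""
import subprocess
import sys

PRE_COMMAND = ["/usr/local/bin/app-quiesce", "--freeze"]     # placeholder
BACKUP_COMMAND = ["/opt/vendor/bin/array-backup", "--full"]  # placeholder
POST_COMMAND = ["/usr/local/bin/app-quiesce", "--thaw"]      # placeholder


def run(step_name, command):
    """Run one step, report its exit code, and return True on success."""
    result = subprocess.run(command, capture_output=True, text=True)
    print(f"{step_name}: exit code {result.returncode}")
    return result.returncode == 0


def main():
    # Pre-command: put the application into a consistent, backup-ready state.
    if not run("pre-command", PRE_COMMAND):
        sys.exit("Pre-command failed; aborting backup.")
    try:
        # Launch the existing third-party backup utility unchanged.
        if not run("backup", BACKUP_COMMAND):
            sys.exit("Backup utility reported an error.")
    finally:
        # Post-command always runs so the application is never left frozen.
        run("post-command", POST_COMMAND)


if __name__ == "__main__":
    main()
```

The point of the pattern is that the established backup process itself does not change; the flexible platform simply schedules it, wraps it, and records the outcome.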
Flexible DP: Where do you need it?
Some organizations like the scalability, ease of deployment and isolation that physical appliances provide, whereas others simply want to build an all-virtual environment. Choosing a data protection vendor that offers both physical and virtual appliances allows IT to mix and match under a common management interface.
For example, suppose a data center uses Hyper-V or XenServer in its development and testing environment and runs vSphere for production workloads. A flexible backup configuration lets IT use the same data protection platform to protect all of its hypervisors.
The cloud offers many possibilities for DP services. IT may simply copy/archive backups to the cloud, or may actively replicate VMs and use the cloud as a failover destination. They may want to run cloud-based sandbox tests on backups and VMs to make sure they’re recoverable, or may go even further and incorporate the cloud as an active component in the organization’s DR plan. Given all of the available options concerning cloud functionality, best practice is to choose a data protection vendor that supports all of the above, preferably with the same product set for both local and cloud environments.
Many data centers run a mix of flat, snapshot-based backups and selective file-level backups. The environment may sport a mix of legacy hardware and applications with a set of virtualized workloads, or a set of backup types in a completely virtualized environment. In both of these cases, admins should be on the lookout for a data protection vendor who does two things and does them well: 1) supports the total environment instead of forcing admins to mix and match vendors and products, and 2) easily integrates with the legacy equipment and processes that admins are not yet ready to retire.
Product Flexibility
Every backup and recovery vendor on the planet has some flexibility built in. The question is where does the flexibility exist and how far does it extend? What operating systems does it support? How many hypervisors? Is it array-specific? How well does it integrate with other software in the data center? Does it run on appliances?
CommVault Simpana is an integrated platform that provides DP, storage management, array-based snapshots, and indexing/search for stored data. As a large and extensible platform, it works across many operating systems and applications. CommVault is not in the appliance business, but it does offer appliance-based solutions in partnership with STORServer. Wide product choices grant flexibility within the suite, but integration is a demanding process.
EMC provides a wide range of data protection and management hardware and software. It’s a major cloud player, both in building private/hybrid clouds and in cloud-based backup and recovery. Like Simpana, EMC’s very broad hardware and software portfolio will meet every DP need. However, flexible introduction and integration into a heterogeneous data center are not EMC’s strong suit.
Dell offers wide DP choices including NetVault, AppAssure, deduplication and compression appliances, vRanger agentless protection for VMware and Hyper-V, and cloud-based backup and DR. However, integrating systems, networks and storage between Dell products is specialist admin territory.
Unitrends’ physical and virtual appliances back up data on-premises, remotely, or in the cloud. Running on commodity hardware, the platform scales to thousands of physical and virtual servers with wide hypervisor support. Centralized administration easily integrates new services into the management console with support for existing processes. Unitrends treats flexibility as a fundamental feature and strategic driver, making it one of the most flexible DP platforms available.
Making the Decision
Most organizations don’t have the luxury of starting from scratch and building the ideal solution for every project. Instead, the data center has a collection of legacy equipment, operating systems, hypervisors, and applications that it needs to keep running and productive. How do you protect it all without selecting a different data protection vendor for each?
If the environment is a simple one consisting of, say, VMs running in a vSphere environment, one OS, and no legacy hardware or applications, admins can afford to choose a vendor that only supports simple environments. If, however, the data center runs multiple OSes, a variety of legacy hardware and applications, and a couple of different hypervisors, admins need a vendor that covers all of it. The alternative is that dark place that involves multiple DP vendors, each supporting bits and pieces of the environment.
Even if IT is starting out with something simple, they need to consider that they might end up with a more complex data center architecture in the future. Again, it is to the company’s benefit to select a data protection vendor that is capable of doing it all and that integrates easily with legacy hardware and software. That way, IT can seamlessly add support for whatever it decides to do in the future.