MOUNTAIN VIEW, Calif. — As enterprises move more deeply into virtualization, they will have to overhaul their data backup and disaster recovery strategies, because traditional approaches do not translate well to the virtualized world.
That’s the case Deepak Mohan, senior vice president of Symantec’s (NASDAQ: SYMC) data protection group, made in a press briefing here at the company’s offices where he discussed its strategies for disaster recovery, high availability and data protection.
There are two major reasons why virtualization requires a new approach to data backup and disaster recovery, Mohan said. One is virtual sprawl, the unchecked proliferation of virtual machines (VMs). “Virtual machines are easy to deploy and propagate like rabbits, and that causes complexity of management from the data perspective,” Mohan explained.
The other reason is the difficulty of protecting and recovering applications in virtual environments. Distributing applications across VMs, or across a mix of VMs and physical servers, puts further strain on backup and recovery systems. In addition, VMs can easily be moved from one physical server to another with tools such as VMware’s VMotion, which makes them harder to track and back up.
Mohan recommended that CIOs consider restructuring their data backup and disaster recovery strategies as soon as they begin to virtualize. In the traditional approach, where perhaps 20 virtual machines run on a single physical server, IT would have to back up each of those VMs separately, taking a snapshot of each, before it could recover a single file or group of files with a data protection product, Mohan said.
Symantec’s flagship enterprise backup product, NetBackup, offers a new approach: it lets users take a single snapshot of the entire environment (instead of one per VM) and perform granular recovery of individual files from that single snapshot image.
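Neither Mohan nor Symantec detailed NetBackup’s interfaces in the briefing, but the contrast between the two models is easy to see in miniature. The Python sketch below is purely illustrative: every name in it (VM, SnapshotImage, per_vm_backup, granular_restore) is hypothetical and does not correspond to any actual NetBackup API. It shows why one indexed, image-level snapshot can stand in for a pile of per-VM backup jobs while still allowing single-file restores.

    # Hypothetical sketch only; these names are illustrative and are not
    # NetBackup's actual API. It contrasts per-VM backup jobs with a single
    # image-level snapshot that is indexed for granular file recovery.
    from dataclasses import dataclass, field

    @dataclass
    class VM:
        name: str
        files: dict  # path -> contents

    @dataclass
    class SnapshotImage:
        """One image-level snapshot of the whole host, with a file index."""
        index: dict = field(default_factory=dict)  # (vm, path) -> contents

    def per_vm_backup(vms):
        """Traditional model: one backup job (and one image) per VM."""
        return {vm.name: dict(vm.files) for vm in vms}  # N images for N VMs

    def single_snapshot(vms):
        """New model: one snapshot of the host; index every guest file once."""
        image = SnapshotImage()
        for vm in vms:
            for path, data in vm.files.items():
                image.index[(vm.name, path)] = data
        return image  # one image, still file-level addressable

    def granular_restore(image, vm_name, path):
        """Recover one file from the single snapshot, no full-VM restore."""
        return image.index[(vm_name, path)]

    if __name__ == "__main__":
        host = [VM("vm01", {"/etc/app.conf": "port=8080"}),
                VM("vm02", {"/var/db/orders.db": "<binary>"})]
        image = single_snapshot(host)  # one snapshot covers every VM
        print(granular_restore(image, "vm01", "/etc/app.conf"))  # port=8080

The point of the index is that recovering a single configuration file does not require mounting or restoring the whole VM, which is what makes the single-snapshot model cheaper than running 20 separate backup jobs.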
This sort of granular recovery capability is becoming more important as virtualization moves out of development and test labs and into production environments running transaction-intensive applications.
“Before, people were virtualizing print and other servers and testing and development, where losing data wasn’t that important, or consolidating legacy applications into smaller, newer servers,” 451 Group analyst Henry Baltazar told InternetNews.com. “Now, they’re moving into e-mail servers and transaction-oriented applications, where problems get magnified,” he added.
This article was first published on InternetNews.com.