Tuesday, August 3, 2021

Virtualization Needs a New Backup Strategy

MOUNTAIN VIEW, Calif. — As enterprises move more heavily into virtualization, they will have to overhaul their data backup and disaster recovery strategies, because traditional approaches do not translate well to the new virtualized world.

That’s the case Deepak Mohan, senior vice president of Symantec’s (NASDAQ: SYMC) data protection group, made in a press briefing here at the company’s offices where he discussed its strategies for disaster recovery, high availability and data protection.

There are two major reasons why virtualization requires a new approach to data backup and disaster recovery, Mohan said. One is virtual sprawl, the unchecked proliferation of virtual machines (VMs). “Virtual machines are easy to deploy and propagate like rabbits, and that causes complexity of management from the data perspective,” Mohan explained.

The other reason is the difficulty of protecting and recovering applications in virtual environments. Distributing applications across VMs, or across a mix of VMs and physical servers, further strains backup and recovery systems. In addition, VMs can easily be moved from one physical server to another with tools such as VMware’s VMotion, which makes them harder to track and back up.

Mohan recommended that CIOs consider restructuring their data backup and disaster recovery strategies as soon as they begin to virtualize. In the traditional approach, where perhaps 20 virtual machines run on one physical server, IT would have to back up each of those VMs individually, in addition to taking a snapshot of the entire environment, in order to recover one or more files with a data protection product, Mohan said.

Symantec’s flagship enterprise product, NetBackup, offers a new approach: it lets users take only the one snapshot of the environment (instead of many) and conduct granular recovery of files from that single snapshot image.
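The single-snapshot, granular-recovery idea can be sketched in a few lines of code. This is a toy model for illustration only, not NetBackup’s actual API; every class and method name here is invented. The point is the data flow: one image captures the files of all VMs on a host, and individual files are later pulled out of that single image.

```python
from dataclasses import dataclass


@dataclass
class Snapshot:
    """One point-in-time image of an entire virtualized host."""
    files: dict  # maps "vm_name/path" -> file contents


class BackupCatalog:
    """Toy model of single-snapshot backup with granular recovery.

    Rather than backing up each of ~20 VMs separately, a single
    snapshot of the whole environment is taken; individual files
    are then recovered from that one image.
    """

    def __init__(self):
        self.snapshots = []

    def take_snapshot(self, vms: dict) -> int:
        """Capture every VM's files in one image; return a snapshot id."""
        image = {
            f"{vm}/{path}": data
            for vm, files in vms.items()
            for path, data in files.items()
        }
        self.snapshots.append(Snapshot(files=image))
        return len(self.snapshots) - 1

    def recover_file(self, snap_id: int, vm: str, path: str):
        """Granular recovery: pull one file out of the single image."""
        return self.snapshots[snap_id].files[f"{vm}/{path}"]


catalog = BackupCatalog()
snap = catalog.take_snapshot({
    "vm01": {"/etc/app.conf": "mode=prod"},
    "vm02": {"/var/log/app.log": "ok"},
})
catalog.recover_file(snap, "vm01", "/etc/app.conf")  # -> "mode=prod"
```

The design choice being illustrated is that recovery granularity (one file) is decoupled from backup granularity (one whole-environment image), which is what spares IT from running a separate backup job per VM.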

This sort of granular recovery capability is becoming more important as virtualization moves from development and testing labs into production environments running transaction-intensive applications.

“Before, people were virtualizing print and other servers and testing and development, where losing data wasn’t that important, or consolidating legacy applications into smaller, newer servers,” 451 Group analyst Henry Balthazar told InternetNews.com. “Now, they’re moving into e-mail servers and transaction-oriented applications, where problems get magnified,” he added.

This article was first published on InternetNews.com.
