Backup has been turned on its head over the last decade by the rise of the cloud, the increased frequency of cyberattacks, and the need for speed.
Companies increasingly want recovery to happen fast, and all of this must happen at a time when there is more data to store and back up. As a result, backup storage has risen in importance in recent years.
Here are some of the top trends in the backup storage market.
1. Data creation surge
There is no question that data creation continues to mushroom.
Business is becoming more digitalized, generating vast amounts of data from product videos, social media posts, customer transactions, and data from Internet of Things (IoT) devices. Data management is also growing more complicated as remote work becomes the norm.
With this explosion of data, storage costs are rising significantly. The average cost of storing a single terabyte (TB) of file data is now pegged at $3,351 per year, which poses problems for the storage of backups.
“There are ways that organizations can manage data more cost effectively, including exploring data tiering, which enables organizations to move data used less often to less-expensive storage levels,” said Ahsan Siddiqui, director of product management, Arcserve.
“It also reduces storage costs while safeguarding an organization’s most valuable data.”
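As a simple illustration of the age-based tiering Siddiqui describes, the sketch below assigns files to storage tiers by how recently they were accessed. The tier names and thresholds are invented for the example; real policies vary by organization.

```python
from datetime import datetime, timedelta

# Hypothetical tier thresholds and names -- real policies vary by organization.
TIERS = [
    (timedelta(days=30), "hot"),    # accessed within the last 30 days
    (timedelta(days=180), "warm"),  # accessed within the last 6 months
]
COLD_TIER = "cold-archive"          # everything older moves to cheap storage

def choose_tier(last_accessed: datetime, now: datetime) -> str:
    """Pick a storage tier based on how long ago a file was last accessed."""
    age = now - last_accessed
    for threshold, tier in TIERS:
        if age <= threshold:
            return tier
    return COLD_TIER
```

A lifecycle job would run a function like this periodically over file metadata and move anything whose tier has changed to the cheaper level.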
2. AI-enabled storage
Artificial intelligence (AI) can help mitigate the impact of larger storage volumes, according to Siddiqui with Arcserve.
AI-enabled storage applies intelligence and machine learning (ML) to help companies determine which pieces of data are critical to their business and which are less important and may not need to be stored — or which datasets can be offloaded to the cloud and which should be stored locally, Siddiqui said.
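As a rough illustration of that idea, the hypothetical rule below stands in for what a trained model might decide about a dataset. The thresholds and labels are invented for the example; a real AI-enabled system would learn them from access logs and business context.

```python
def classify_dataset(access_per_month: float, tagged_critical: bool) -> str:
    """Decide where a dataset belongs, standing in for a trained ML model.

    Hypothetical rule: critical or frequently accessed data stays local,
    moderately used data is offloaded to the cloud, and rarely touched
    data is flagged as a candidate for deletion rather than backup.
    """
    if tagged_critical or access_per_month >= 100:
        return "store-locally"
    if access_per_month >= 5:
        return "offload-to-cloud"
    return "candidate-for-deletion"
```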
Looking ahead, therefore, expect AI to play a larger role in the optimized storage of backups.
3. Data loss
Data loss is increasingly an issue, due in part to the rise in cyberattacks and ransomware.
For most organizations, it is not a matter of if but when they will face an attack.
According to recent Arcserve research of IT decision makers, 35% reported their organizations were asked to pay over $100,000 in ransom payments, and 35% were asked to pay between $1 million and $10 million.
The loss of critical data, then, continues to disrupt businesses. In the same study, 76% of respondents reported a severe loss of critical data, and of that number, 45% suffered permanent data loss.
These findings underscore the importance of building data resilience through a robust backup and recovery plan, with data integrity at its core, to prevent severe business disruptions.
4. Continuous scanning of backup and storage
Cybercriminals' attack tactics have changed, putting larger organizations with legacy backup environments at major risk.
“Malware authors, like Locky and Crypto, are adapting ransomware, so it actively targets backups, prevents data recovery, or immediately targets any attempt to use recovered files by encrypting them,” said Doron Pinhas, CEO, Continuity.
While immutability can certainly help, it is just the beginning of a comprehensive cyber resilience strategy. The ability to continuously scan storage and backup devices to detect security misconfigurations and vulnerabilities is a critical element of any storage and backup strategy.
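At its simplest, a continuous scanner compares each device's configuration against a secure baseline. The sketch below is a minimal illustration; the setting names are hypothetical, and real scanning tools check far more conditions than this.

```python
# Hypothetical secure baseline -- real scanners cover far more settings.
SECURE_BASELINE = {
    "immutability_enabled": True,          # backups cannot be altered
    "encryption_at_rest": True,            # stored data is encrypted
    "default_admin_password_changed": True # no factory credentials
}

def scan_device(config: dict) -> list[str]:
    """Return a list of misconfiguration findings for one storage/backup device."""
    findings = []
    for setting, required in SECURE_BASELINE.items():
        if config.get(setting) != required:
            findings.append(f"{setting} should be {required}")
    return findings
```

Run on a schedule across every storage and backup device, an empty findings list means the device still matches the baseline; anything else is a drift alert.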
According to a recent Continuity paper analyzing the state of storage and backup security within the financial services sector, over two-thirds of respondents said that securing storage and backup was specifically addressed in recent external audits.
More auditors and cyber-insurance firms are becoming aware of the importance of securing storage and backup environments and adding it to their security assessments and policies. This is trickling down to other verticals.
5. Multicloud backup
The multicloud approach to cloud storage has gained serious ground.
According to the Flexera 2022 “State of the Cloud Report,” 89% of respondents reported having a multicloud strategy.
Because data management and backup requirements call for maintaining more copies and replicas and storing information for longer periods, data stores can quickly expand beyond planned capacity.
The resulting data sprawl can be expensive, requiring the purchase of additional cloud capacity or on-premises systems, neither of which may be the most cost-effective option for backup or data protection use cases.
This is where the value of adopting and managing multiple cloud providers begins to shine, especially from an infrastructure perspective.
“Organizations can safely keep master copies of files or data in one location — perhaps to meet application performance requirements, regulatory compliance requirements, or data sovereignty requirements,” said Andrew Smith, senior manager of strategy and market intelligence, Wasabi Technologies.
“Then tier secondary/tertiary copies and replicas of data to a cost-efficient cloud service. And by storing replicas outside the primary application or platform environment, organizations take advantage of new price points, performance characteristics, storage locations, and avoid being locked into one application or infrastructure environment.”
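The placement strategy Smith describes can be sketched in a few lines: keep the master copy on the primary provider, then place additional replicas on the cheapest remaining clouds. The provider names and per-terabyte costs below are purely illustrative, not real pricing.

```python
from dataclasses import dataclass

@dataclass
class Target:
    provider: str       # hypothetical provider names in the usage example
    cost_per_tb: float  # illustrative monthly $/TB, not real pricing

def plan_replicas(targets: list[Target], primary: str, copies: int) -> list[str]:
    """Keep the master on the primary provider and place extra copies
    on the cheapest other clouds, avoiding lock-in to a single vendor."""
    secondaries = sorted(
        (t for t in targets if t.provider != primary),
        key=lambda t: t.cost_per_tb,
    )
    return [primary] + [t.provider for t in secondaries[:copies]]
```

For example, with a primary chosen for performance or data-sovereignty reasons, the secondary and tertiary replicas automatically land on the lowest-cost alternatives.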