
Remote Possibilities, Part 1


Let me state right from the start that this article represents a change of mind for me on some aspects of remote storage. In the past, transfer speeds were woefully inadequate for moving large amounts of data. And before recent economic changes brought about a shakeout, some of the companies pushing this technology had questionable staying power and capabilities. Well, things have changed.

I’ve always been an advocate of backing up data. We all know that it’s not a question of whether our hard drive will fail but a question of when it will fail. Everyone has had a piece of software corrupt a file, and there are even rumors that some of us have been heard to mutter, “Wait a minute, what file name did I just save that to?”

Backups are essential, but they’re also vulnerable. Computers are a prime target for burglars these days, and airports abound with thieves who specialize in stealing notebook computers. If your only recent backup sits next to your computer at home or in the office, or in the same bag your notebook was in when it was stolen, the hardware loss is probably secondary to the data loss. A backup that is not itself safe and accessible in an emergency is of no use.

Add to this the various natural disasters that can befall computing systems, from standalone SOHO machines to server farms, and the wisdom of storing your backups at some distance becomes evident.

Remember Alexandria
The remote storage of important data did not begin in the twentieth century with the invention of computers. Storing important records at some distance from the place where they originated appears to have begun shortly after the ancient civilizations invented writing. The most famous ancient archive was the library at Alexandria, Egypt, which held copies of books from all over the known world. It burned, twice, and many ancient works were lost. There are lessons to be learned from this.

Since security of the backup is important, the location of the backup is as crucial as the medium on which it’s stored. A remote storage facility located in a locale that experiences frequent floods or has an unreliable power grid may give you nothing more than a false sense of security.

One of the most troubling aspects of remote storage has always been the expense involved. For remote storage you need to have a place to shelter the backups, and that has been a stumbling block for many companies and most individuals.

Of course, large companies have been handling their own remote storage for years. When I administered a network for a very large bank in the eighties, we used to mail one of its full 120 MB backup tapes to a storage facility once a month. Since it was a bank, real estate with air-conditioned vaults was no problem. Smaller firms had to hope that someone responsible could be persuaded to take a backup home, and not lose it.

Things have evolved over the past several years. Desktop machines that once held 30 to 40 MB of storage now hold a thousand times that, and server capacity has grown at an even greater rate. The days of mailing 120 MB tapes to a remote vault are over. Remote storage today usually means accessing a Storage Area Network (SAN). Larger firms do regular, enterprise-wide backups to SANs, and they do them electronically. In the best of circumstances they have dedicated fiber-optic links to their SAN server farm and everything is automated. These large firms also have the personnel needed to oversee and maintain their SAN.
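To make the idea of automated electronic backups concrete, here is a minimal sketch in Python. It is not any particular firm's setup; the paths are hypothetical, and it simply assumes the remote storage shows up as a locally mounted volume, the way a leased SAN or NAS share often does.

```python
# Minimal sketch of an automated off-site backup, assuming the remote
# storage appears as a locally mounted volume. All paths are hypothetical.
import shutil
from datetime import datetime
from pathlib import Path

SOURCE_DIR = Path("/data/critical")          # local data to protect
REMOTE_MOUNT = Path("/mnt/remote-backups")   # remote volume mounted locally

def run_backup():
    # Timestamp the archive name so earlier backups are not overwritten.
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    archive_base = Path("/tmp") / f"backup-{stamp}"

    # Create a compressed tarball of the source directory.
    archive_path = shutil.make_archive(str(archive_base), "gztar",
                                       root_dir=str(SOURCE_DIR))

    # Copy the archive to the remote volume, then delete the local copy.
    shutil.copy2(archive_path, REMOTE_MOUNT / Path(archive_path).name)
    Path(archive_path).unlink()

if __name__ == "__main__":
    run_backup()
```

Scheduled nightly with cron or a similar scheduler, a script along these lines gives a small office a crude version of what the large firms automate across their dedicated fiber links.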

Of course, there’s a great deal of expense involved in owning your own SAN and making use of it. You need the real estate to house the SAN itself, and you need to be able to lay that fiber underground, which is no easy feat in some larger cities. Additionally, you need the computer hardware and staff to maintain and operate it. In this expensive scenario you own and control everything associated with your remote storage, and the security measures you take are up to you.

Most companies cannot begin to afford this type of solution. But the need for remote storage remains.

In part 2 of this story, which will appear tomorrow, Martin will look at the various types of fee-based SAN operators and assess the inevitable security risk associated with storing the company’s most sensitive data somewhere else.

R. Paul Martin has been a network administrator for a Fortune 100 company. He works as a freelance writer and as a technology consultant.
