Easing Server Sprawl and Storage Traffic Load

According to most analysts, the amount of stored information is growing at a rate anywhere between 50 percent and 125 percent annually.

This places an enormous burden on the storage infrastructure. Servers are typically thrown at the problem, new storage arrays added, and additional SAN islands implemented. Yet performance suffers, disk space remains scarce, backup windows swell, and storage administrators stay stuck on an endless firefighting treadmill.

Such challenges faced Southern Wine & Spirits of California (SWS), a business unit of Southern Wine & Spirits of America, the nation’s largest wine and spirits distributor with more than $4 billion in revenue. Operating in 10 states, the company represents more than 300 suppliers and 5,000 brands and services 125,000 retail and restaurant customers.

“We were suffering serious amounts of downtime and having to add two or three servers each month just to keep up with storage demands,” said Robert Madewell, director of networks at SWS. “As much as 10 percent of our storage resources were eaten up in non-business and duplicate files, and backup times were killing us.”

SWS employs more than 2,000 people in California and maintains a sizeable distribution network that is managed from two sites — Union City in Northern California and Cerritos in Southern California.

In total, 70 HP ProLiant servers are installed, and traffic is divided functionally rather than regionally between the two sites: some applications are served from Union City while others are housed at Cerritos. The application mix includes Exchange Server, proprietary collections software, a sales reporting system, SQL Server databases, fax servers, web servers and call center servers. Cisco networking gear predominates. Most servers run Windows 2000, though some NT units remain to be upgraded.

Organizationally, the storage management load was placed on an already overworked networking crew. The result was severe storage problems, such as losing a couple of servers each month when they ran out of disk space.

Unfortunately, SWS had no way of tracking disk utilization. Thus, when a server filled up, users could be left with nowhere to store newly created documents. That generated a large volume of calls to IT to remedy system downtime. IT would spend hours manually correcting the problem.
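The kind of central tracking SWS lacked is, at its core, a scheduled free-space check. The sketch below is a minimal Python illustration of that idea, not anything SWS actually ran; the volume list and the 90 percent threshold are assumptions chosen for the example.

```python
# Minimal free-space check -- a sketch of the central disk-utilization
# tracking SWS lacked. Volume list and threshold are assumptions.
import shutil

VOLUMES = ["C:\\", "D:\\", "E:\\"]   # hypothetical server volumes
THRESHOLD = 0.90                     # warn once a volume is 90 percent used

def nearly_full(volumes, threshold=THRESHOLD):
    """Return (volume, fraction_used) pairs for volumes over the threshold."""
    alerts = []
    for vol in volumes:
        usage = shutil.disk_usage(vol)      # total, used, free in bytes
        fraction = usage.used / usage.total
        if fraction >= threshold:
            alerts.append((vol, fraction))
    return alerts

if __name__ == "__main__":
    for vol, fraction in nearly_full(VOLUMES):
        print(f"WARNING: {vol} is {fraction:.0%} full")
```

Run from a scheduler against each file server, a check like this turns a surprise outage into an early warning.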

“IT staff spent an average of two or three hours each day putting out storage-related fires and troubleshooting network storage issues,” said Madewell. “At the same time, our eight staff had seen storage consumption grow fivefold without any more personnel being added to the unit.”

Storage usage grew from 170GB to 400GB in one year in the northern part of the state and from 90GB to 200GB in the south. Two or three servers were being added each month to cope with storage demands. And if a disk filled up, Madewell and his staff were forced to spend hours poring through it to see what space could be freed up.

But IT couldn’t just delete anything that looked suspicious. PowerPoint presentations could be either valuable sales tools or unnecessary junk. MP3s could be personal music or marketing department tools for the Web. Lacking any central tracking of storage consumption and its contents, two IT staffers would routinely go through the various directories, checking files one by one to determine which were of value and which could be deleted.
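That manual review is essentially a filesystem scan, and a scan is easy to automate even if the keep-or-delete call still belongs to a person. The sketch below is an illustration only; the share path, extension list, and size cutoff are assumptions. It flags large files of commonly questionable types and reports byte-identical duplicates by hashing their contents.

```python
# Illustrative scan for large files of "suspect" types and for byte-identical
# duplicates. The share path, extensions, and size cutoff are assumptions.
import hashlib
import os

ROOT = r"\\fileserver\shares"           # hypothetical file share to scan
SUSPECT_EXT = {".mp3", ".ppt", ".avi"}  # types worth a human look
MIN_SIZE = 10 * 1024 * 1024             # only consider files over 10 MB

def sha256_of(path, chunk=1 << 20):
    """Hash a file's contents in 1 MB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            digest.update(block)
    return digest.hexdigest()

def scan(root):
    seen = {}  # content hash -> first path found with that content
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                size = os.path.getsize(path)
                if size < MIN_SIZE:
                    continue
                if os.path.splitext(name)[1].lower() in SUSPECT_EXT:
                    print(f"REVIEW     {size / 1e6:8.1f} MB  {path}")
                content_hash = sha256_of(path)
            except OSError:
                continue  # skip files we cannot read
            if content_hash in seen:
                print(f"DUPLICATE  of {seen[content_hash]}: {path}")
            else:
                seen[content_hash] = path

if __name__ == "__main__":
    scan(ROOT)
```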

Backup Woes

The amount of data being backed up had increased from 260GB to 600GB over the course of one year. A full backup ran from Friday night until late Sunday. Even a differential backup took an entire night.
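For readers unfamiliar with the distinction: a full backup copies everything, while a differential backup copies only the files modified since the last full backup, which is why it is faster yet still grows all week as changes pile up. The sketch below is a miniature illustration of that selection step, not SWS's backup software; the paths and the marker file are assumptions.

```python
# Miniature illustration of differential-backup selection: copy only files
# modified since the last full backup. Paths and marker file are assumptions.
import os
import shutil

SOURCE = r"D:\data"                       # hypothetical data volume
DEST = r"E:\backups\differential"         # hypothetical backup target
MARKER = r"E:\backups\last_full.marker"   # touched when a full backup ends

def differential_backup(source, dest, since):
    """Copy files under source modified after 'since' (a Unix timestamp)."""
    copied = 0
    for dirpath, _, filenames in os.walk(source):
        for name in filenames:
            src = os.path.join(dirpath, name)
            try:
                if os.path.getmtime(src) <= since:
                    continue              # unchanged since the last full
            except OSError:
                continue                  # skip unreadable files
            dst = os.path.join(dest, os.path.relpath(src, source))
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            shutil.copy2(src, dst)        # copy with timestamps preserved
            copied += 1
    return copied

if __name__ == "__main__":
    last_full = os.path.getmtime(MARKER)
    print(f"{differential_backup(SOURCE, DEST, last_full)} changed files copied")
```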

“Differential backups during the week wouldn’t get completed as some users were logging on at 6 or 7 a.m.,” said Madewell. “We really struggled to get any kind of a backup done.”

The result: many open files weren’t getting backed up, and early-morning users contended with sluggish applications while backups were still running. SWS was forced to purchase a large number of tapes and two additional HP DLT tape drives to supplement its existing Qualstar tape library.

This situation prompted Madewell to look into storage resource management (SRM) as a possible solution. He loaded a demo copy of BrightStor SRM from Islandia, N.Y.-based Computer Associates onto one server for testing. The initial results led to its purchase.

Madewell says the cost came to about $4,000 for software, $2,000 for maintenance and another $3,000 to $4,000 for agents placed on the nine servers used for storage. After two days of installation and configuration, SWS had eliminated almost 25GB of data from those nine servers.

Backup, too, has improved markedly. A full backup now takes a day less than before, and differential backups have been shortened by several hours.

“We haven’t had a server down due to storage issues in almost a year,” said Madewell. “Additionally, we haven’t needed to add any more servers in order to provide user storage.”

Not everything was rosy with the implementation, however. SWS encountered reporting issues, due mainly to the language used to write scripts, Enterprise Definition Language (EDL). The first elaborate script Madewell tried to assemble took almost 90 minutes to write.

“The basic reports were a bit too high-level for us so we needed more granular data such as file owner name and a list of the top 20 storage users,” said Madewell. “As we didn’t know EDL, the syntax was the biggest killer and learning it proved to be really challenging.”
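The report Madewell describes is simple in concept even if EDL made it painful in practice. As a hedged illustration only (this is not EDL and not BrightStor output), the sketch below approximates a top-20 storage consumers report by summing each user's home directory on a share; real SRM tools resolve actual file ownership from the filesystem's security information. The share path is an assumption.

```python
# Illustrative "top 20 storage users" report. The home-directory share is an
# assumption; per-user folders stand in for true file-ownership data.
import os

HOME_SHARE = r"\\fileserver\home"   # hypothetical per-user home directories
TOP_N = 20

def directory_size(path):
    """Total bytes under path, skipping anything that cannot be read."""
    total = 0
    for dirpath, _, filenames in os.walk(path):
        for name in filenames:
            try:
                total += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                pass
    return total

def top_users(share, n=TOP_N):
    usage = {
        entry.name: directory_size(entry.path)
        for entry in os.scandir(share)
        if entry.is_dir()
    }
    return sorted(usage.items(), key=lambda item: item[1], reverse=True)[:n]

if __name__ == "__main__":
    for user, size in top_users(HOME_SHARE):
        print(f"{user:20s} {size / 1e9:8.2f} GB")
```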

Madewell notes, though, that a newer version of the software comes with a scripting wizard. With it, he says, he can complete complex scripts in 5 to 10 minutes.

Another problem involved Windows XP: reports were hard to run from XP clients. That bug was recently fixed in a service pack, he says, so Madewell can now run storage reports from his laptop no matter where he connects to the network.

Overall, Madewell says, he is happy with the system. He estimates savings of $30,000 per year from not having to purchase additional storage servers. That figure, he says, doesn’t take into account the three or four hours of IT staff time saved each day.

“Instead of manually searching each directory for rogue files, we can now monitor the proper usage of our entire storage environment from one console,” said Madewell. “We only have to add hardware when we truly need it.”

Story courtesy of Datamation.com.
