Sunday, June 16, 2024

Best Practices in Data Archiving


Databases usually have two things in common: they have grown far larger than their creators envisioned, and they are filled with unneeded data. Many companies, therefore, are seeking to pare them down to the essentials and archive little-used content. But what should you archive, how should you do it, and what best practices apply?

“Data is growing at 125 percent a year yet up to 80 percent of this data remains inactive in production systems where it cripples performance,” said Charlie Garry, senior program director at Meta Group. “To compound this problem, many enterprises are in the midst of compliance initiatives that require the retention of more data for longer periods of time, as well as consolidation projects that result in significant data growth.”

This article highlights the approach taken by one company to improve Oracle performance by implementing archiving technology and policy, while at the same time coping with the rigors of global compliance issues. In doing so, it evolved the following best practices:

1. Set a data retention policy that is tailored to each country.
2. Integrate the data retention policies for each country into one archiving system.
3. Enforce data retention based on a published central retention document.
4. Store data in classes appropriate to its age and access requirements.
5. Identify inactive business transactions in the database.
6. Separate data into transaction categories, each with its own predefined archiving constraints.
7. Relocate inactive data to the archive.
8. Retain application transparency for users regardless of where the data resides.
9. Get your archiving methodology approved as legal in each country and by each agency concerned.
10. Run the archive DB as a separate instance within one Oracle system, if possible, to reduce costs.
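Steps 5 through 8 can be sketched in miniature. The following Python/SQLite sketch is purely illustrative (the table names, columns, and cutoff date are assumptions, not Tektronix's actual schema): it identifies transactions older than a cutoff, relocates them to an archive table, and exposes a union view so queries see all data transparently regardless of where it resides.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions_active (id INTEGER, posted TEXT, amount REAL)")
conn.execute("CREATE TABLE transactions_archive (id INTEGER, posted TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO transactions_active VALUES (?, ?, ?)",
    [(1, "1999-03-01", 100.0), (2, "2003-06-15", 250.0), (3, "2003-07-01", 75.0)],
)

CUTOFF = "2000-01-01"  # illustrative: transactions posted before this are inactive

# Steps 5-7: identify inactive transactions and relocate them to the archive.
conn.execute(
    "INSERT INTO transactions_archive "
    "SELECT * FROM transactions_active WHERE posted < ?",
    (CUTOFF,),
)
conn.execute("DELETE FROM transactions_active WHERE posted < ?", (CUTOFF,))

# Step 8: a union view keeps application queries transparent, whether the
# row lives in the active table or the archive.
conn.execute(
    "CREATE VIEW transactions AS "
    "SELECT * FROM transactions_active UNION ALL SELECT * FROM transactions_archive"
)
conn.commit()

active = conn.execute("SELECT COUNT(*) FROM transactions_active").fetchone()[0]
archived = conn.execute("SELECT COUNT(*) FROM transactions_archive").fetchone()[0]
total = conn.execute("SELECT COUNT(*) FROM transactions").fetchone()[0]
print(active, archived, total)  # 2 1 3
```

In a production Oracle environment the archive would typically live in a separate, cheaper instance (step 10), but the move-then-union pattern is the same.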

Worldwide Headache

Tektronix Inc. of Beaverton, Ore., focuses on test, measurement and monitoring products. It operates in 27 countries and has annual revenues of $850 million. All company financial data resides in one huge Oracle database. Previously, Tektronix operated 486 different legacy systems internationally. Countries like Japan, India and China used complex systems and unique character sets.

“Every legacy system required its own hardware, software and IT resources,” said Lois Hughes, senior business systems analyst for Tektronix. “As we kept adding countries, we magnified the complexity.”

Tektronix's current production environment involves Oracle 10.7 running in all countries. This means two instances of Oracle:

1. Oracle Financials for accounts payable (AP), general ledger (GL) and other transactional systems, running on a Sun UE5000 database server with 8 x 336 MHz processors and 3 GB of memory. This instance is 35 GB.

2. Oracle Customer Fulfillment for invoicing, accounts receivable (AR), etc. This production database is approximately 85 GB and runs on a Sun UE6000 box with 12 x 248 MHz processors and 6 GB of memory. It also includes a forms server, a Sun S2000 with 10 x 60 MHz processors and 2 GB of memory.

This system handles a fairly constant load of about 800 concurrent sessions around the clock. Hughes notes that the fewer the instances of Oracle Financials, the lower the costs. Standardization also meant the company could improve its customer responsiveness, gain worldwide inventory visibility and reduce worldwide IT expenditures. The monthly closing of financials can now be done in three and a half days.

However, this philosophy brought its own set of problems. The database soon expanded to 60 GB, rising at a rate of 1.25 GB per month. Performance began to suffer.

“Despite tuning exercises and hardware upgrades, the increased growth rate caused performance to decline,” said Hughes. “Run times for batch programs increased despite fewer numbers of executions.”

Faced with rising storage costs and declining performance, the company investigated its data usage patterns. It realized that keeping data from all time periods in one system was slowing down current transactional traffic. Usage of older data declined sharply over time, yet users reported that simple queries of current transactions took ages: enough time to go for a coffee, have a chat and then return to your terminal to view the results.
Tektronix compared the costs and potential results obtainable from its two routes forward:

a) Keep buying disks, networks, servers, processes and people or
b) Implement best practices to manage data growth via intelligent archiving.

As b) appeared the more attractive route, the IT department's first inclination was to use its own resources and existing software. It looked at purging as the best way to reduce the data footprint.

However, international finance regulations meant that purging would have to be paralleled by archiving. Hughes reports that Oracle's own purging functions contained several bugs. The company attempted to develop its own purge/archive software, but when management realized that could take two years, it looked elsewhere.

The goal was to be able to manage data by country, while at the same time centralizing it. As well as the differing languages and character sets, many countries have wildly divergent data retention regulations.
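A centralized system with country-specific rules amounts to a single lookup that every archiving job consults. The sketch below illustrates the idea in Python; the country codes and retention periods are placeholders for illustration, not actual statutory requirements, and `archive_cutoff` is a hypothetical helper, not part of any real product.

```python
from datetime import date

# Illustrative retention periods in years, keyed by ISO country code.
# These values are placeholders, NOT real regulatory requirements.
RETENTION_YEARS = {"US": 7, "JP": 10, "DE": 10, "DEFAULT": 7}

def archive_cutoff(country: str, today: date) -> date:
    """Return the date before which transactions may be archived for a country."""
    years = RETENTION_YEARS.get(country, RETENTION_YEARS["DEFAULT"])
    try:
        return today.replace(year=today.year - years)
    except ValueError:
        # Feb 29 has no counterpart in a non-leap year; clamp to Feb 28.
        return today.replace(year=today.year - years, day=28)

print(archive_cutoff("JP", date(2004, 6, 1)))  # 1994-06-01
print(archive_cutoff("FR", date(2004, 6, 1)))  # falls back to DEFAULT: 1997-06-01
```

Keeping the rules in one published table, rather than scattered through per-country scripts, is what makes central enforcement (best practice 3 above) auditable.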

The company adopted LiveArchive from OuterBay Technologies, which works in conjunction with Oracle to purge and archive data. Hughes embarked on a successful pilot project to solicit buy-in for corporate-wide international adoption.

As a result, the company now archives transactional data every three months. Initially, information is recategorized (reduced in priority) within the same Oracle instance, then moved to less expensive infrastructure. Users, however, can access all data from one screen without any extra effort.
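The two-stage flow described here, first demoting data within the live instance and only later moving it to cheaper storage, reduces to a simple age-based tiering rule. The tier names and thresholds below are assumptions for illustration; the article does not state Tektronix's actual thresholds beyond the quarterly archiving cycle.

```python
def storage_tier(age_months: int) -> str:
    """Map a transaction's age to a storage tier (thresholds are illustrative)."""
    if age_months < 3:
        return "production"      # current data stays in the live instance
    elif age_months < 24:
        return "low-priority"    # recategorized within the same Oracle instance
    else:
        return "archive"         # relocated to less expensive infrastructure

print(storage_tier(1), storage_tier(12), storage_tier(36))
```

A quarterly batch job would simply re-evaluate this rule for every transaction and relocate any row whose tier has changed.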

The results: improved database efficiency from the end-user perspective for queries and reports; reduced storage requirements; full data retention compliance by country; and reduced backup times.

“Queries are now instant,” said Hughes. “Overall, we have benchmarked a 46-percent improvement in our financial performance by implementing archiving.”
