This article can also be found in the Premium Editorial Download "Storage magazine: Hot tips for buying storage technology."


Why upgrade when you can archive?


Here is a typical database performance scenario: A key application gradually becomes less responsive. Eventually the system, storage and database administrators meet to discuss the problem. Like the blind men describing the different parts of an elephant, each begins to analyze the problem from his own perspective.

The system administrator analyzes CPU, memory, paging and I/O utilization, and (if trending data is available) confirms that utilization has been steadily increasing. The storage administrator reviews capacity and performance data, such as storage area network (SAN) switch port utilization, storage system cache parameters and physical disk utilization. The review shows that the amount of data has grown significantly, and while there's plenty of SAN bandwidth, there's definitely a high rate of disk activity for the physical storage assigned to the database. The DBA examines the problem from the application perspective and finds that queries and batch jobs are taking much longer to complete.

At this point, the next step is usually to go through several iterations of tuning. This often includes bringing in expertise from the vendors involved to make sure the system is optimized. It also isn't uncommon to begin hearing talk of the need for hardware upgrades. After several attempts, the conclusion is drawn that tuning has been exhausted and that upgrading is the only way forward.

However, there is another option: purge and archive the database. There are three classes of database archiving tools: ad hoc, native and third-party products.

An ad hoc process should only be considered for those rare instances when an archive will be required only infrequently and there isn't a need for long-term retention or frequent retrieval. Another consideration is the number of applications: More than one or two ad hoc database archiving processes will likely prove too difficult to manage.
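To make the ad hoc approach concrete, here is a minimal sketch of a purge-and-archive step: copy rows older than a cutoff date into an archive table, then delete them from the active table in one transaction. The table and column names (orders, order_date) and the cutoff are hypothetical; a real process would target your application's schema and run against your production RDBMS rather than SQLite.

```python
import sqlite3

# Ad hoc purge-and-archive sketch using an in-memory SQLite database.
# Table and column names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, order_date TEXT, total REAL)")
conn.execute("CREATE TABLE orders_archive (id INTEGER PRIMARY KEY, order_date TEXT, total REAL)")

conn.executemany(
    "INSERT INTO orders (order_date, total) VALUES (?, ?)",
    [("2001-05-01", 100.0), ("2003-11-15", 250.0), ("2004-01-20", 75.0)],
)

cutoff = "2003-01-01"  # rows older than this get archived, then purged

# Copy old rows to the archive table, then delete them from the active
# table inside a single transaction so the move is atomic.
with conn:
    conn.execute(
        "INSERT INTO orders_archive SELECT * FROM orders WHERE order_date < ?",
        (cutoff,),
    )
    conn.execute("DELETE FROM orders WHERE order_date < ?", (cutoff,))

active = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
archived = conn.execute("SELECT COUNT(*) FROM orders_archive").fetchone()[0]
print(active, archived)  # two rows stay active, one is archived
```

Even at this scale the management burden is visible: every application needs its own script, schema knowledge and schedule, which is why more than one or two such processes quickly become unwieldy.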

Native tools are typically used for less complex, custom applications where data relationships are well understood within the organization. As long as data retrieval and retention requirements remain low, this approach is acceptable.

Third-party tools can make the archiving process more repeatable. They are particularly suited to large commercial applications, or wherever regular access to or long-term retention of archived data is required.

Getting started
Developing an archiving strategy is a significant undertaking. As mentioned earlier, a cross-functional team will be required to ensure that the business and technical needs are adequately addressed. The phases for developing and deploying an archiving strategy include:

  • Assess your archiving needs
  • Understand application data characteristics
  • Classify data
  • Establish data handling and retention policies for each class of data
  • Determine requirements for archival data format and retrieval
  • Evaluate tools
  • Conduct a pilot program
  • Conduct a wide-range rollout
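The classification and policy steps above can be sketched in code. The following is an illustrative retention-policy table mapping hypothetical data classes to active and archive retention windows; the class names and periods are assumptions for the example, not recommendations.

```python
# Hypothetical retention policies: data class -> days kept in the active
# database, then days kept in the archive. Values are illustrative only.
POLICIES = {
    "transactional": {"active_days": 365, "archive_days": 365 * 7},
    "reference":     {"active_days": 730, "archive_days": 365 * 10},
    "audit":         {"active_days": 90,  "archive_days": 365 * 25},
}

def disposition(data_class: str, record_age_days: int) -> str:
    """Return what the policy says to do with a record of a given age."""
    policy = POLICIES[data_class]
    if record_age_days <= policy["active_days"]:
        return "keep"      # still within the active retention window
    if record_age_days <= policy["active_days"] + policy["archive_days"]:
        return "archive"   # move out of the production database
    return "purge"         # past all retention requirements

print(disposition("audit", 30))                # keep
print(disposition("audit", 200))               # archive
print(disposition("transactional", 365 * 10))  # purge
```

Writing the policies down in this form, whatever tool ultimately enforces them, is what turns the cross-functional agreement on "what to archive" into something testable and auditable.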
The greatest challenge to database archiving isn't technology; for most organizations, it's coming to an agreement on what to archive and establishing appropriate policies for archived data. To address this, the first step is to assess your needs. Is the primary factor regulatory compliance and long-term retention, or is it capacity and performance? For some industries, compliance issues may require adopting records-management applications in addition to data archiving techniques.

Application data characteristics and dependencies can greatly impact the feasibility and cost to implement archiving. A major application like PeopleSoft can have thousands of tables, and understanding the business rules and logic to determine how to archive this data isn't trivial. Application complexity is a major driver for the adoption of third-party archiving tools. The classification of data and the development of policies for retention, migration to the archive and capabilities for retrieval are essential.

The phases of product evaluation and piloting should not just focus on the technology. They should also include the development and testing of standard operating procedures and the identification of roles and responsibilities needed to ensure that archiving policy requirements can be met. A wide-range rollout of an archiving solution demands regular monitoring and measurement to ensure policy compliance and evaluate whether performance and data capacity levels are meeting expectations.

The benefits of a successfully deployed database archiving strategy can be far-reaching. Performance improvements, better storage management and improved data retention are significant paybacks. Third-party database archiving products are starting to play a more prominent role in automating the archiving process. Take the necessary time to properly evaluate, design and test these new database archiving applications to achieve success.

This was first published in March 2004
