Data storage administrators are grappling with the challenges of data growth. With some storage volumes increasing at 90% to 100% per year, there's an incessant need to provide dramatically greater storage capacity, ensure that all storage is adequately allocated and achieve those objectives within shrinking labor and capital budgets. Enterprise storage administrators often monitor performance and employ capacity planning to stay one step ahead of storage needs -- sometimes using creative purchasing schemes, such as leasing, to keep costs in line. Migration tools can then move and consolidate existing data onto the new storage, and data provisioning tools allocate that capacity among users and applications within the business.
Performance and capacity planning
When storage resources are stretched too thin, users may experience network or application performance problems along with service-level failures. Conversely, overbuying storage wastes capital. The goal of planning is to ensure that adequate storage upgrades are deployed periodically, before performance suffers. This eliminates the financial burden and service disruption of sudden, major storage upgrades; instead, storage is acquired and added as a regularly budgeted resource. Advance planning can also help you budget and schedule related infrastructure improvements involving power, cooling, training and maintenance, while minimizing any disruption to storage users.
Unfortunately, predicting data storage needs accurately in today's burgeoning storage environments is extremely difficult. Many organizations consider the planning process to be costly, time-consuming and cumbersome -- often they will not invoke formal performance or capacity planning guidelines unless a high-profile mission-critical project is underway, or until serious performance issues arise.
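Even without a formal planning tool, the core arithmetic behind capacity planning is simple compound growth. As a rough illustration (the figures and function names below are invented for this sketch, not drawn from any vendor product), you can project future demand and estimate how long a newly purchased array will last:

```python
import math

def project_demand(current_tb: float, annual_growth: float, years: float) -> float:
    """Projected storage demand under compound annual growth.

    annual_growth is a fraction, e.g. 0.9 for the 90% yearly growth
    some volumes are seeing.
    """
    return current_tb * (1 + annual_growth) ** years

def years_until_full(installed_tb: float, current_tb: float,
                     annual_growth: float) -> float:
    """Years until projected demand catches up with installed capacity.

    Solves installed = current * (1 + g)^t for t.
    """
    return math.log(installed_tb / current_tb) / math.log(1 + annual_growth)
```

For example, at 90% annual growth, a 50 TB estate needs roughly 95 TB within a year, and a 200 TB array purchased today would be exhausted in a little over two years -- which is why planners treat upgrades as a recurring budget line rather than a one-off event.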
Part of this difficulty is a lack of good planning tools. The tools that are available are often integrated into storage resource management (SRM) products, such as EMC Corp.'s ControlCenter software. MonoSphere Inc.'s Storage Horizon software develops storage forecasts and automates storage planning. Other planning products include AppIQ's SRM and SAN management software (now owned by Hewlett-Packard Co.), Storability business analytics software from Sun Microsystems Inc. and HiCommand Storage Services Manager from Hitachi Data Systems (HDS), among others.
Data migration strategies
Data migration is the process of moving data from one place to another within the storage infrastructure. There are many compelling reasons to migrate data. For example, migration may help boost storage performance by moving data from an older, slower disk to a newer, faster disk or storage platform. Migration can free critical disk space by shifting older and less critical data from Fibre Channel disk to SATA or SAS disk storage. Migration can also support storage consolidation by combining the data from disparate disks, or help to tier storage by shuffling data types to their respective storage platforms. Although migration is often implemented under special circumstances (e.g., when a new storage system is deployed), storage administrators are using migration tools for more routine maintenance tasks such as weekly or monthly disk cleanups.
Migration typically results in downtime for the systems involved in any data move, so administrators are constantly looking for ways to minimize downtime. One tactic is to minimize the amount of data involved in a move. For example, it may not be necessary to migrate an entire disk if only a few gigabytes (GB) of files must be moved. An increasing number of platforms can also migrate data in the background while the data is still in use, which is a significant boon for frequent migration tasks.
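The "move only what changed" tactic can be sketched in a few lines. This is a simplified illustration of delta migration (the helper names are invented; real migration tools use block-level change tracking rather than whole-file hashes): files are copied to the target only if they are missing there or their contents differ, so a repeat run after an initial bulk copy moves almost nothing.

```python
import hashlib
import shutil
from pathlib import Path

def file_digest(path: Path) -> str:
    """SHA-256 of a file's contents, read in 1 MB chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def migrate_changed(src: Path, dst: Path) -> list[str]:
    """Copy only files that are missing or different on the target.

    Returns the relative paths that were actually moved, so the bulk of
    an unchanged data set never crosses the wire twice.
    """
    moved = []
    for f in src.rglob("*"):
        if not f.is_file():
            continue
        target = dst / f.relative_to(src)
        if not target.exists() or file_digest(target) != file_digest(f):
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # preserves timestamps and permissions
            moved.append(str(f.relative_to(src)))
    return moved
```

Run once to seed the target, then again during a short maintenance window: only the files that changed in between need to move, which is the same principle that lets background-migration platforms keep downtime to minutes.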
Storage acquisition and decision making
The costs of upgrading data storage capacity and replacing aging subsystems can have a serious impact on company finances, so administrators must be savvier when considering storage acquisitions. As a rule, it is better to purchase (or finance) technologies intended for long-term deployment, while leasing is often better for short-term acquisitions that will be disposed of quickly. The final choice of buying versus leasing depends on many business factors including cash flow and tax benefits, so administrators should have a keen understanding of the company's business objectives before recommending a new storage product.
Another key to successful acquisition planning is to understand the total cost of ownership (TCO). The product with the lowest TCO is not necessarily the one with the lowest price tag, but rather the one that costs less to implement and use over time. For example, a storage system that includes data deduplication may cost more initially than a conventional storage array, but if the new system uses 60% less storage capacity, its TCO over its working life may be considerably lower than the conventional array's. Training, installation, upgrades, maintenance, software patches and retirement/disposal costs are also frequently considered in TCO evaluations.
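The deduplication example works out as straightforward arithmetic. In this sketch (all dollar figures and the cost model are hypothetical, chosen only to illustrate the comparison), TCO is modeled as purchase price plus an ongoing per-TB operating cost, with an optional capacity-reduction factor for deduplicating systems:

```python
def tco(purchase: float, usable_tb: float, cost_per_tb_year: float,
        years: int, capacity_reduction: float = 0.0) -> float:
    """Total cost of ownership: purchase price plus ongoing capacity costs.

    capacity_reduction is the fraction of raw capacity saved by features
    such as deduplication (0.6 means the system needs 60% less space).
    """
    effective_tb = usable_tb * (1 - capacity_reduction)
    return purchase + effective_tb * cost_per_tb_year * years

# Hypothetical comparison over a five-year working life:
conventional = tco(purchase=100_000, usable_tb=100, cost_per_tb_year=500, years=5)
dedupe_array = tco(purchase=150_000, usable_tb=100, cost_per_tb_year=500, years=5,
                   capacity_reduction=0.6)
```

Here the deduplicating system costs $50,000 more up front but, storing the same data in 60% less space, comes out roughly $100,000 cheaper over five years -- the kind of result a sticker-price comparison would miss.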
Data provisioning tools
Once storage is installed, it must be provisioned before use so that the space is available to the correct network users and applications. For example, some amount of data center storage may be provisioned for an Oracle database that is accessible only to the purchasing department. Data provisioning can thus improve performance and security by preventing users from accessing storage that isn't intended for them on the SAN. Provisioning can be a difficult, tedious and error-prone process, requiring detailed knowledge of the storage infrastructure and user needs, so storage administrators typically rely on software tools to support provisioning tasks.
Another disadvantage of data provisioning is that storage space allocated to one application cannot be used by other applications. That is, if 500 GB is set aside for a database that is currently using only 100 GB, the 400 GB difference is essentially wasted until the application actually calls for it. Thin provisioning eliminates the cost of allocated-but-unused space by assigning physical space to an application on an "as needed" or "just in time" basis.
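The difference between the two models can be made concrete with a toy pool (this is a teaching sketch of the concept, not how any real array implements it): volumes are promised their full size up front, but physical space is consumed only as data is written, so promised capacity can safely exceed what's installed -- up to the point the pool runs dry.

```python
class ThinPool:
    """Toy model of a thin-provisioned storage pool."""

    def __init__(self, physical_gb: int):
        self.physical_gb = physical_gb  # capacity actually installed
        self.allocated_gb = 0           # sum of promised (virtual) volume sizes
        self.used_gb = 0                # physical space actually consumed

    def create_volume(self, size_gb: int) -> None:
        # The promise can exceed physical capacity (oversubscription);
        # no physical space is consumed yet.
        self.allocated_gb += size_gb

    def write(self, gb: int) -> None:
        # Physical space is drawn down only as data lands.
        if self.used_gb + gb > self.physical_gb:
            raise RuntimeError("pool exhausted -- add physical capacity")
        self.used_gb += gb

    @property
    def oversubscription(self) -> float:
        return self.allocated_gb / self.physical_gb
```

With 300 GB of physical disk, an administrator can still promise the database its 500 GB volume; after the database writes its 100 GB, only 100 GB of real capacity is consumed. The trade-off is that the pool must be monitored, since writes fail once physical capacity is truly exhausted.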
There are several types of provisioning tools. Point products are the most straightforward, handling data provisioning tasks across the network and storage arrays. Point products often accompany specific storage products (such as EMC's Celerra Automated Volume Management software), though many tools accommodate a wide variety of storage systems, allowing a degree of heterogeneity in the provisioning process.
More sophisticated tools, such as the InForm Operating System that accompanies products from 3Par Data Inc., integrate provisioning capabilities with other features like virtualization and SAN management. The most complex software tools, like HP's OpenView, include provisioning as part of a comprehensive storage automation and management suite. Given the rapid consolidation taking place among provisioning vendors, it's important to consider your vendor carefully when evaluating any provisioning tool.