

Be mindful when working with public cloud storage

Some enterprises are using public cloud storage as a tier for secondary storage or archived data, but Scott Sinclair advises caution before taking the leap.


If there are any universal truths to managing enterprise data, they are that data is always growing and that not all data is created equal. So it shouldn't come as a surprise that we at ESG continue to find data growth a top concern among IT professionals, regardless of whether a study investigates storage, data protection or overall IT priorities.

Fortunately, the IT industry has seen a wealth of innovations to deal with the challenge of data growth and the complexity of efficiently supporting diverse data types. Flash storage emerged to address the requirements of high-performance workloads, for example, while the development of scale-out NAS and object storage tackled the needs of lower-performing, yet demanding, higher-capacity data sets.

As a result, storage is bifurcated into separate tiers, which helps reduce the total cost of ownership (TCO) of IT. ESG research shows that nearly half of enterprises surveyed that employ solid-state storage report a reduction in TCO, while the top reason for deploying object storage is to lower capital storage expenditures.

Bottom line: Leveraging different storage tiers works and saves money.

Not all storage tiers created equal

Extending the concept of storage tiers a little further -- outside the data center, in fact -- some enterprises have turned to public cloud storage providers as a tier for secondary storage or archive data. Often, for these environments, the thinking goes, "We have all this stagnant data, let's move it to the cloud and get it out of our data center." This approach sounds logical and makes sense theoretically, but it can lead to the most expensive storage tier in your ecosystem if not done carefully.

The fundamental assumption behind the tiered-storage model is that you can accurately discern which data set belongs in which storage tier. For many workloads, however, that may not be the case, for the following reasons:

  • The data access pattern doesn't always easily align to a specific storage tier.
  • Data access rates often do not stay constant over time.
  • Organizations frequently lack accurate data on the true performance demands of their workloads.
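
The sketch below makes that fragility concrete. It is a toy Python classifier that assigns data sets to tiers from measured access rates; the thresholds, data-set names and tier labels are illustrative assumptions, not a recommendation.

    # Toy illustration: mapping data sets to storage tiers from measured
    # access rates. Thresholds and names here are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class DataSet:
        name: str
        accesses_per_day: float  # measured -- or, too often, guessed
        size_gb: float

    def pick_tier(ds: DataSet) -> str:
        """Classify a data set into a storage tier by access rate."""
        if ds.accesses_per_day >= 10_000:
            return "all-flash"            # hot, performance-sensitive
        if ds.accesses_per_day >= 100:
            return "scale-out NAS"        # warm, capacity-oriented
        return "public cloud archive"     # presumed cold

    datasets = [
        DataSet("oltp-db", 250_000, 800),
        DataSet("media-archive", 40, 120_000),
        DataSet("analytics-raw", 90, 50_000),  # looks cold today
    ]
    for ds in datasets:
        print(f"{ds.name}: {pick_tier(ds)}")

If "analytics-raw" later starts serving thousands of reads a day, the on-premises fix is a quick re-tier; in the cloud, the same mistake shows up as latency and a data-transfer bill.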

For an on-premises storage infrastructure, the penalty for misdiagnosing a data set's performance characteristics is usually minimal. When dealing with off-premises cloud resources, though, the cost and complexity can skyrocket. That's because the physical separation introduced by public cloud storage providers results in a significant latency penalty when moving data back and forth between local and off-premises resources. As a result, the cost of a storage tier can increase dramatically when a data set thought to require low access rates is migrated to the cloud, only to discover after the fact that it's serving far more transactions than anticipated.
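
A back-of-the-envelope Python sketch shows how the math turns against you; the per-gigabyte storage and egress rates below are illustrative placeholders, not any provider's actual price list.

    # How a "cold" cloud tier gets expensive when access rates are misjudged.
    # All prices are illustrative assumptions, not a real rate card.
    capacity_tb          = 100
    storage_per_gb_month = 0.004  # archive-class storage (assumed)
    egress_per_gb        = 0.09   # transferring data back on premises (assumed)

    def monthly_cost(read_back_tb: float) -> float:
        storage = capacity_tb * 1024 * storage_per_gb_month
        egress  = read_back_tb * 1024 * egress_per_gb
        return storage + egress

    print(f"Planned, 1 TB read back per month:  ${monthly_cost(1):,.2f}")
    print(f"Actual, 30 TB read back per month:  ${monthly_cost(30):,.2f}")

Under these assumed rates, the storage line stays around $410 a month either way; the surprise is the transfer line, which jumps from roughly $92 to about $2,765 once the supposedly dormant data turns out to be active.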


This happens far more often than you might expect, unfortunately, and can result in a storage tier that is ultimately far more expensive than anything residing on premises. This scenario can lead to any number of the following issues:

  • Public cloud storage providers may issue a higher-than-expected bill, sometimes far exceeding budget expectations.
  • Cloud storage users may need to launch an investigation into the source of the increase in demand, requiring cycles from IT personnel.
  • Enterprises may need to decide whether to proceed with a costly migration to move data back on premises, onto storage infrastructure that may no longer be available.

Thankfully, there are a number of options you can leverage to help gain a greater understanding of workload performance characteristics prior to a migration and ease data movement across the WAN:

  • Invest in tools to better understand the performance characteristics of applications. There are multiple software-defined storage (SDS) products that reside in the control plane that can provide insights into the resource demands of applications or virtual machines. These SDS technologies also let you virtualize or aggregate multiple on- and off-premises storage resources.
  • Evaluate hybrid cloud services. If you want to use public cloud storage providers as a storage repository, investigate services that can deploy a local high-performance storage cache while virtualizing off-premises storage resources. These technologies can greatly reduce the number of data transactions required over the WAN should performance demands increase (a simple caching sketch follows this list).
  • Investigate the latest on-premises storage offerings before starting a cloud migration. Storage innovation has been rampant in recent years, reducing both the cost of capacity and performance. For example, 2016 has seen multiple announcements for new high-density, all-flash storage arrays that deliver massive amounts of high-performance capacity with incredibly low space and power footprints.
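
To illustrate the caching idea behind those hybrid cloud services, here is a minimal Python sketch of a least-recently-used read cache in front of an object store; the cache size and the fetch_from_cloud stand-in are assumptions for illustration, not any vendor's API.

    # Minimal sketch of a hybrid cloud gateway's read cache: hot objects are
    # kept locally so repeat reads don't cross the WAN. Purely illustrative.
    from collections import OrderedDict

    class LocalReadCache:
        def __init__(self, max_objects: int = 1000):
            self.max_objects = max_objects
            self._cache = OrderedDict()  # object key -> bytes, in LRU order

        def get(self, key: str, fetch_from_cloud) -> bytes:
            if key in self._cache:
                self._cache.move_to_end(key)     # local hit: no WAN round trip
                return self._cache[key]
            data = fetch_from_cloud(key)         # miss: pay WAN latency once
            self._cache[key] = data
            if len(self._cache) > self.max_objects:
                self._cache.popitem(last=False)  # evict least recently used
            return data

    # Usage sketch: fetch_from_cloud would wrap the provider's object API.
    cache = LocalReadCache(max_objects=2)
    for key in ["report.pdf", "report.pdf", "video.mp4"]:
        cache.get(key, fetch_from_cloud=lambda k: f"<contents of {k}>".encode())

Commercial services layer on write handling, prefetching and consistency, but the principle is the same: repeat reads are served locally rather than across the WAN.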

Data transactions over the WAN slow performance and add a level of permanence to off-premises architecture decisions. Sure, there are many benefits to public cloud storage providers, but if data sets are migrated hastily, problems can occur and costs can increase. A way to mitigate these concerns is to invest in technologies that provide better insight into workload requirements or greater flexibility in hybrid cloud data movement.

About the author:
Scott Sinclair is a storage analyst with Enterprise Strategy Group in Austin, Texas.


This was last published in November 2016
