
Smart data storage types needed for future workloads

Data storage types that offer a flexible infrastructure and intelligent data are proving more beneficial than speed or scale alone.


When researching storage demands with IT executives, one theme I commonly encounter is a desire to transition to a "modern" data storage architecture. Typically, when I hear this, my immediate follow-up questions are, "What do you mean by modern?" and "How will you know when you get there?"

With technology advancing rapidly, identifying the data storage types that would comprise a modern storage system may not be as easy as it used to be. Speed and scale are typically the two parameters that dictate storage purchasing decisions, but depending on your needs, simply upgrading to the next generation of storage controllers or adopting the next jump in Fibre Channel bandwidth may no longer be good enough.

When looking for speed to address low-latency, high-transaction workloads, the answer often falls to solid-state storage; whether the best fit is an all-flash array, a hybrid array or a server-side flash product depends on the specific workload. When the challenge is scalability, file- or object-based storage with a scale-out architecture is often the answer. However, storage architectures that offer infrastructure flexibility and data intelligence are more likely to address modern IT needs than data storage types that offer speed or scale alone.

Data storage types must grow with workloads

Different workloads have different storage requirements, and as those workloads evolve, storage must as well. The optimal media or infrastructure type will likely change over the course of data's lifecycle. Storage systems flexible enough to present consistent data access across these infrastructure changes can be tailored more closely to the workload and therefore deliver more value. For example, off-premises public cloud storage offers different benefits than on-premises storage, but keeping public and private storage isolated limits what can be achieved as business and workload requirements change.
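To make the lifecycle idea concrete, here is a minimal sketch of the kind of age- and activity-based placement decision such a system might make. The tier names and thresholds are hypothetical, chosen only for illustration, and do not reflect any vendor's actual policy engine.

```python
from datetime import datetime, timedelta

# Hypothetical tier names and thresholds, chosen purely for illustration.
def choose_tier(last_access: datetime, reads_per_day: float) -> str:
    """Pick a storage tier from simple age and activity rules."""
    age = datetime.utcnow() - last_access
    if reads_per_day > 100 or age < timedelta(days=7):
        return "flash"          # hot data: low-latency media
    if age < timedelta(days=180):
        return "disk"           # warm data: capacity-oriented on-premises media
    return "cloud_archive"      # cold data: off-premises, lowest cost per gigabyte

# Data untouched for a year with almost no reads lands in the archive tier.
print(choose_tier(datetime.utcnow() - timedelta(days=365), reads_per_day=0.1))
```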


Data storage products that can abstract the underlying storage infrastructure from the persistent data management layer are often referred to as software-defined storage. The net result can be a storage system that incorporates a wide variety of data storage types, such as on-premises solid-state, spinning disk on commodity hardware and even tape, as well as multiple types of off-premises resources, while presenting a consistent level of data accessibility.
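As a rough sketch of that abstraction, assuming hypothetical class and method names rather than any actual software-defined storage API, the Python below separates a thin data management layer from interchangeable backends:

```python
from abc import ABC, abstractmethod

class StorageBackend(ABC):
    """Any media type: flash, spinning disk, a cloud object store, even tape."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryBackend(StorageBackend):
    """Stand-in for a real driver (local SSD pool, S3 bucket, tape library)."""
    def __init__(self) -> None:
        self._blobs = {}
    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data
    def get(self, key: str) -> bytes:
        return self._blobs[key]

class DataLayer:
    """Persistent data management layer: callers see one namespace,
    no matter which backend currently holds the bytes."""
    def __init__(self, backends: dict) -> None:
        self._backends = backends
    def write(self, key: str, data: bytes, tier: str = "default") -> None:
        self._backends[tier].put(key, data)
    def read(self, key: str, tier: str = "default") -> bytes:
        return self._backends[tier].get(key)

# Swapping a backend (say, disk for cloud) leaves callers of read/write untouched.
layer = DataLayer({"default": InMemoryBackend(), "archive": InMemoryBackend()})
layer.write("reports/q4.pdf", b"...", tier="archive")
print(layer.read("reports/q4.pdf", tier="archive"))
```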

It's obviously difficult to predict the optimal storage media type for a particular workload 10, or even 5, years from now. So, a storage software layer that offers the flexibility to support a variety of data storage types translates into greater value and should be a key consideration when selecting a storage architecture for your organization.

Vendors unveil storage tools in quest for flexible infrastructure

Established and emerging storage providers alike are innovating in the pursuit of infrastructure flexibility. For example, infrastructure flexibility is a core tenet of NetApp's hybrid cloud Data Fabric vision built on its clustered ONTAP technology. EMC is discussing a next-generation data lake that extends storage pools to include public cloud and remote-office resources with IsilonSD Edge and Isilon CloudPools.

Meanwhile, a number of software-defined storage providers are entering the marketplace with storage software fully abstracted from the underlying hardware. Examples include Hedvig, with its distributed storage platform, and Formation Data Systems, with its FormationOne Dynamic Storage Platform.

New storage types offer real-time data analytics

In addition to infrastructure flexibility, integrated data intelligence has started to gain traction in the industry. Data-aware storage players DataGravity and Qumulo have recently begun touting the benefits of running real-time data analytics at the storage device level. DataGravity's system monitors data as it is written to automatically flag compliance violations and data security gaps, while Qumulo uses its insights to identify and resolve performance and accessibility issues in real time.
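As a toy illustration of inline inspection at write time (not how DataGravity or Qumulo actually implement their analytics), a write path might scan payloads against simple patterns before acknowledging the write:

```python
import re

# Hypothetical inline check: flag writes that appear to contain U.S. Social
# Security or payment card numbers before they land on unclassified storage.
SENSITIVE_PATTERNS = {
    "possible_ssn": re.compile(rb"\b\d{3}-\d{2}-\d{4}\b"),
    "possible_card_number": re.compile(rb"\b(?:\d[ -]?){13,16}\b"),
}

def inspect_on_write(path: str, payload: bytes) -> list:
    """Return the names of any sensitive patterns found in this write."""
    return [name for name, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(payload)]

findings = inspect_on_write("/exports/hr/new_hires.csv", b"Jane Roe, 123-45-6789")
if findings:
    print("compliance review needed:", findings)
```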

When designing a storage system for the modern data center, speed and scalability are musts, but they are only part of the equation. The flexibility to incorporate a variety of on- and off-premises resources, now and in the future, and the intelligence to understand data in real time are quickly becoming just as important.

Next Steps

Identifying types of data storage for server virtualization

Big data and the storage type it needs

How to choose the right type of storage media

