Smarter data storage tools needed for smarter storage

Yesterday's data storage tools are ill-equipped to handle today's data storage needs, but a handful of vendors have new products that can help.

Are storage pros really just a bunch of bit-pushers -- shoving gazillions of zeroes and ones from here to there, hoping that they'll arrive intact and on time where they're needed? I don't think anyone who's been a keeper of the corporate jewels -- all that invaluable IP -- would ever picture themselves like that. But the lack of smart, effective data storage tools is handcuffing anyone who's responsible for storing, saving and protecting company data.

And as long as we look at the data that ends up on disks, flash, tape or whatever as a commodity -- a block, a chunk, a thing -- we're doomed to a Sisyphean routine of moving bits and bytes around, like checkers on a checkerboard. But most of us don't see data that way. We know that those collections of aughts and ones add up to the information that our companies run on.

Old data handling tools can't cut it

But, for the most part, we're stuck with clumsy pre-millennium data storage tools that still treat information as if it were just a bunch of dumb lumps with no intrinsic value.

Backup is a great example of how a brute-force approach effectively devalues the data that it is protecting. Most backup processes, including the latest flat backup techniques, see data the same way that backup apps from 40 or more years ago did. Sure, the newer methods are far more efficient and leave less clutter in the data center, but they still don't have an intelligent perspective of what the data really represents and how important -- or unimportant -- it is to the company.

In the best of all possible worlds, the data itself should be smart enough to know how it should be used, who or what will use it, how long it should live and if it has any practical usefulness beyond its immediate application.

But until we can build that kind of intelligence into the data, even a potentially efficient operation like flat backup is essentially a "blind process." It's still just bumping around a bunch of things with no sense of what's actually inside.

When you think about how application- and data-centric we're becoming and how that new perspective is reshaping data centers, you've got to wonder why so few vendors have rolled out data storage tools that stop treating data so opaquely. The processes for handling data may continue to improve, but we're approaching a time when those improvements will be nullified by a lack of insight. As long as data is treated as some kind of commodity rather than the valuable information it may represent, there will be more missed opportunities for making better use of that data.

But there is some hope.

Object storage lends intelligence

Object storage -- with its ability to include expanded, more detailed metadata with the objects it stores -- is a promising development. Of course, adding that metadata is the responsibility of the applications and users that create and use the data objects. But there's no denying the potential benefits. You could, for example, tag a data object with an appropriate shelf life based on the type of file, its contents, the people who created and modified it, their departments and so on. That metadata could, in turn, inform a storage system or a data handling application that the object can't be copied to the cloud, that it should be deleted on a certain date or that it should be archived to a data warehouse.

That kind of intelligence could eliminate a ton of "touches" and manual operations, and could cut down on the number of copies of data that are retained. The smarter the data gets, the smarter the processes surrounding it become, and data handling operations turn into policy-driven events.
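
To make the idea concrete, here's a minimal sketch -- in Python, with invented field and function names rather than any real product's metadata schema -- of how expanded metadata could drive lifecycle decisions without a manual touch:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical metadata tags attached to a stored object. The field names
# are invented for illustration; they mirror the kinds of tags described
# above: creator, department, shelf life and placement restrictions.
@dataclass
class ObjectMetadata:
    name: str
    creator: str
    department: str
    cloud_copy_allowed: bool  # e.g., regulated data must stay on premises
    archive_after: date       # move to an archive tier past this date
    expires_on: date          # the object's shelf life

def lifecycle_action(meta: ObjectMetadata, today: date) -> str:
    """Turn metadata into a policy decision instead of a manual operation."""
    if today >= meta.expires_on:
        return "delete"
    if today >= meta.archive_after:
        return "archive"
    return "retain"

# A finance report that should be archived after a year and deleted
# after seven -- the storage system acts on the tags, not on a human.
report = ObjectMetadata(
    name="q3-results.xlsx",
    creator="jsmith",
    department="finance",
    cloud_copy_allowed=False,
    archive_after=date(2016, 12, 31),
    expires_on=date(2022, 12, 31),
)
print(lifecycle_action(report, date.today()))
```

A real system would evaluate rules like these continuously, but the point stands: once the tags exist, retention and placement stop being manual chores.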

Smarter storage has arrived

A handful of vendors have also begun to address some of these issues with new data storage tools.

DataGravity's Discovery Series of storage arrays has built-in intelligence that allows the systems to do more than just store data. They maintain detailed metadata about individual files to track and control who has access to particular information, apply policies to determine data retention, and produce audit reports that detail file-related activities. DataGravity's systems represent a significant step toward making data smarter.
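
Purely as an illustration of the audit idea -- this is not DataGravity's implementation or API -- a bare-bones access trail might look like this:

```python
from collections import defaultdict
from datetime import datetime

# A hypothetical audit trail: every read or write against a file is
# recorded so you can later ask "who touched this, and when?"
class AccessAudit:
    def __init__(self):
        # path -> list of (timestamp, user, action) events
        self._events = defaultdict(list)

    def record(self, path: str, user: str, action: str) -> None:
        self._events[path].append((datetime.now(), user, action))

    def report(self, path: str):
        """Return the recorded activity for one file."""
        return self._events[path]

audit = AccessAudit()
audit.record("/finance/q3-results.xlsx", "jsmith", "write")
audit.record("/finance/q3-results.xlsx", "mlee", "read")
for when, user, action in audit.report("/finance/q3-results.xlsx"):
    print(f"{when:%Y-%m-%d %H:%M} {user} {action}")
```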

Another data storage tool vendor, Qumulo, recently introduced what it calls data-aware, scale-out NAS. Its systems analyze the data stored on them and use the metadata to classify files and other objects. They also provide detailed performance information that can drill down to specific clients and data paths, revealing potential bottlenecks.
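
Again as a hypothetical sketch rather than Qumulo's actual analytics, aggregating per-client, per-path I/O samples is enough to surface the hot spots:

```python
from collections import Counter

# Invented I/O samples of the kind a data-aware NAS might collect:
# (client address, path, latency in milliseconds).
samples = [
    ("10.0.0.5", "/projects/render", 42.0),
    ("10.0.0.5", "/projects/render", 55.0),
    ("10.0.0.9", "/home/jsmith",      3.0),
    ("10.0.0.5", "/projects/render", 61.0),
]

total_latency = Counter()
op_count = Counter()
for client, path, latency_ms in samples:
    total_latency[path] += latency_ms
    op_count[path] += 1

# Rank paths by average latency to point at likely bottlenecks.
for path in sorted(op_count, key=lambda p: total_latency[p] / op_count[p],
                   reverse=True):
    avg = total_latency[path] / op_count[path]
    print(f"{path}: {op_count[path]} ops, {avg:.1f} ms avg")
```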

Tarmin uses an object storage system as the basis of its data-defined storage, draping a global namespace over object stores that may span several geographic locations. Processes like data tiering, archiving, retention and encryption can all be controlled via policies.

Newcomer Primary Data also makes stored data a little more intelligent by employing a single namespace that can stretch across DAS, NAS, block or cloud storage. It uses policies to determine data placement and can attach substantial metadata to the files it manages; applications can tap into that metadata via the APIs Primary Data provides. User-defined policies can govern a range of rights and activities -- including the use of files, directories and volumes -- with safeguards such as limiting where particular files can be copied.
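
In the same illustrative spirit -- the tier names and rules below are invented, not Tarmin's or Primary Data's -- policy-driven placement under a single namespace can be sketched in a few lines:

```python
# One logical namespace, several backing tiers; a policy decides which
# tier actually holds the bytes for a given file.
TIERS = {"flash": [], "nas": [], "cloud": []}

def place(path: str, size_bytes: int, hot: bool, cloud_allowed: bool) -> str:
    """Pick a backing tier for a file from simple, invented policy rules."""
    if hot and size_bytes < 1 << 30:   # hot data under 1 GiB goes to flash
        tier = "flash"
    elif not cloud_allowed:            # restricted data stays on premises
        tier = "nas"
    else:                              # everything else can live in the cloud
        tier = "cloud"
    TIERS[tier].append(path)
    return tier

print(place("/finance/q3-results.xlsx", 2_000_000, hot=True, cloud_allowed=False))
print(place("/archive/2012-logs.tar", 50 << 30, hot=False, cloud_allowed=True))
```

Applications see one path either way; the policy, not an administrator, decides where the data lands.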

These are all steps in the right direction and deserve our attention. Until the value of data is understood by the systems and processes that are trying to manipulate it, we will undoubtedly store more than we should, make too many copies of that data and probably lose track of what's really important. Smart storage depends on smarter data ... and smarter data storage tools.
