When storage managers are asked about their challenges, dealing with data growth always tops the list. Next-generation data storage technology could make a difference.
It seems there has been more change in the data storage industry in the past five years than in the previous quarter century. Thanks to an increased focus on data analytics and advances in compute capabilities, storage managers must deal with more data -- and data-related tasks -- than ever before.
The challenge for IT is how to secure, manage and harness that information while containing both capital and operational expenses. Given that a line-of-business manager can swipe a credit card and have new IT resources up and running in no time, IT must find ways to become more agile, responsive and cost-effective.
How will IT meet this challenge? Server virtualization has enabled IT to build an elastic compute environment that can be provisioned quickly to meet new application needs, and software-defined networking is making similar inroads at the network layer. But what about storage? Traditional storage technologies may not fit the bill for a number of reasons, but several next-generation data storage technologies show potential to fill the gap.
Beyond the sheer volume of data, modern business needs and applications are driving huge demands for extreme performance, continual uptime, data interchange, mobile access and agility. And it's not just block storage: increasingly, file and nascent object storage are pushing the need for new, flexible and cost-effective storage approaches to address the data deluge.
While traditional storage models still have a stronghold in enterprise IT, a number of alternative storage models are gaining traction, for several reasons.
- Cloud as a viable cost-containment measure. Many IT organizations are turning to the cloud to drive down costs. The challenge of dealing with burgeoning types and volumes of data -- often arriving unpredictably -- is likely a key factor in users' interest in the cloud. When Enterprise Strategy Group asked respondents which IT initiatives they thought would significantly impact their organizations' storage spending in the next 12-18 months, the most frequently chosen response was "To use cloud storage service[s] to source storage capacity without buying new infrastructure." This is clearly a reactive move, and might not be the first choice if more "cloud-like" storage systems (flexible, on-demand, self-service and so on) were available to IT organizations. What's more, organizations differ in their maturity, sophistication and thoughtfulness when planning how best to integrate off-premises cloud storage with a traditional on-premises infrastructure.
- Emerging software-defined technologies enable agility. Software-defined storage technologies hold promise. But what does software-defined storage actually entail and deliver? The answer varies depending on which vendor or customer you talk to. Is the software-defined storage/data center vision understood by and resonating with customers? The vision itself seems to resonate. Certainly, IT has benefited from server virtualization, or "software-defined servers." And the vision of a software-defined data center that essentially pools resources and allocates them where and when needed is powerful. But due to the lack of a single meaningful definition -- or perhaps the abundance of competing ones -- software-defined storage has become more of a buzzword than a meaningful solution.
- Object storage helps contain costs. Object storage is maturing, and has moved beyond the cloud infrastructure and archive space. It's emerging as a platform for big data analytics, and even as a primary storage layer. It certainly has a role as a back end for sync-and-share applications that need scalable, manageable and cost-efficient back-end storage. Object storage has been hampered by the requirement for proprietary RESTful APIs; but with Amazon S3 becoming a de facto standard, augmented by OpenStack Swift as an open standards API, and with many vendors now providing NFS and SMB/CIFS interfaces, barriers to adoption are falling.
- Integrated compute platforms are a factor. Perhaps software-defined storage is just an interim step on the way to a fully centralized and managed integrated compute platform, where storage functionality -- the "storage app," if you like -- becomes an integrated component of the operating system.
- Solid-state storage is growing faster than hard disk drives. Do growing amounts of increasingly affordable flash mean we can gradually return storage from a "peripheral" to be adjacent to, and integral with, servers and apps? There are indications we're headed there.
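The "pool resources and allocate them where and when needed" idea behind the software-defined data center can be illustrated with a toy model. The class and method names below are invented for illustration and don't correspond to any vendor's product or API: backing devices contribute capacity to one logical pool, and volumes are provisioned from whichever device has room, so consumers never deal with individual arrays.

```python
# Toy sketch of software-defined "pool and allocate on demand" storage.
# Purely illustrative; CapacityPool and its methods are hypothetical names,
# not a real product's interface.
class CapacityPool:
    def __init__(self):
        self.devices = {}   # device name -> free capacity in GB
        self.volumes = {}   # volume name -> (backing device, size in GB)

    def add_device(self, name, capacity_gb):
        # A new array or disk shelf simply grows the logical pool.
        self.devices[name] = capacity_gb

    def provision(self, volume, size_gb):
        # First-fit allocation: callers ask the pool for capacity and
        # never choose (or see) a specific backing device themselves.
        for dev, free in self.devices.items():
            if free >= size_gb:
                self.devices[dev] = free - size_gb
                self.volumes[volume] = (dev, size_gb)
                return dev
        raise RuntimeError("pool exhausted")

pool = CapacityPool()
pool.add_device("array-a", 100)
pool.add_device("array-b", 500)
pool.provision("db-vol", 80)     # lands on array-a (first fit)
pool.provision("backup-vol", 300)  # array-a is too small now; lands on array-b
print(pool.devices)  # -> {'array-a': 20, 'array-b': 200}
```

A real implementation would add thin provisioning, placement policies and rebalancing, but the self-service contract is the same: capacity in, volumes out.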
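What makes the object model different from a file system can also be sketched in a few lines. The snippet below is an illustrative in-memory model, not any vendor's implementation or the actual S3 API: objects live in a flat namespace addressed by bucket and key, carry user-defined metadata, and are listed by key prefix, which is how S3-style stores emulate folders without a real directory tree.

```python
# Toy in-memory model of an object store's flat namespace -- illustrative
# only; the class and method names are invented for this sketch.
from typing import Dict, Optional, Tuple

class ObjectStore:
    def __init__(self):
        # bucket name -> {key -> (data, metadata)}
        self._buckets: Dict[str, Dict[str, Tuple[bytes, dict]]] = {}

    def put_object(self, bucket: str, key: str, data: bytes,
                   metadata: Optional[dict] = None) -> None:
        # Flat namespace: the key is an opaque string; "/" has no
        # special meaning to the store itself.
        self._buckets.setdefault(bucket, {})[key] = (data, metadata or {})

    def get_object(self, bucket: str, key: str) -> Tuple[bytes, dict]:
        return self._buckets[bucket][key]

    def list_objects(self, bucket: str, prefix: str = "") -> list:
        # Prefix listing emulates folders without a directory tree.
        return sorted(k for k in self._buckets.get(bucket, {})
                      if k.startswith(prefix))

store = ObjectStore()
store.put_object("media", "photos/2015/beach.jpg", b"...", {"camera": "X100"})
store.put_object("media", "photos/2015/dunes.jpg", b"...")
store.put_object("media", "docs/report.pdf", b"...")
print(store.list_objects("media", prefix="photos/"))
# -> ['photos/2015/beach.jpg', 'photos/2015/dunes.jpg']
```

Because every operation is just "put, get or list by key," this model maps naturally onto a stateless RESTful API -- which is why S3-compatible interfaces have spread so easily, and why file gateways (NFS, SMB) can be layered on top for legacy applications.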
With all these new technology options, storage is no longer just a necessary evil. Storage, and how it's approached, can have a huge impact on the top and bottom line. There seems to be a greater willingness on the part of IT organizations to shift from making a safe storage investment (no one was ever fired for buying EMC, Hitachi or IBM) to investing in new technologies that can benefit the business. There are many new technologies that can lower Capex and Opex, and also help the business to become more responsive and agile. Thanks to all this innovation, the next five years promise to be more interesting and dynamic than the last five.
About the author:
Terri McClure is a senior storage analyst at Enterprise Strategy Group, Milford, Mass.