Organizations require reliable storage at every phase of the application delivery process. Those who participate in this process -- developers, testers and operations professionals -- need storage that delivers the performance and capacity necessary to stay productive, while minimizing administrative overhead and storage provisioning delays that can take them away from their primary tasks.
Two important storage technologies have emerged that can significantly benefit the application delivery process: cloud storage and intelligent storage. Although they work in different ways, both can help speed up application delivery and make it easier to manage storage so DevOps team members can focus on building and deploying high-quality applications.
Storage provisioning and application development
The four phases of the application delivery process are development, testing, deployment and production. In traditional approaches to application delivery, developers, testers and operations professionals have distinct roles. With the emergence of DevOps and the increase in automation, the lines between roles have blurred a bit. Even the phases themselves have become less distinct. Even so, one remaining constant is the necessity for robust storage at every step of the way.
Application delivery teams rely on storage for their application code, tools, test environments, build data, production data, images and any other data necessary to support the delivery process. In addition, teams often require multiple copies of data, such as duplicate sample databases. They also require source control repositories that track versions and change histories, and maintain multiple branches of development. Not only do these factors contribute to the amount of storage required, but they also point to the importance of storage provisioning at every stage of the operation.
The types of applications being developed can also affect storage. A DevOps team might be building web, desktop or mobile apps, each requiring varying amounts of test and production data. The more data the applications will handle once in production, the more data is needed during development and testing to ensure proper operation at greater data loads. You cannot fully test a big data analytics project, for example, without data sets large enough to verify it will behave as expected once in production. Applications that incorporate AI technologies likewise require adequate storage provisioning for modeling and training data.
The data's format plays an important role. Applications might require storing data as file, block or object storage. They might also need sophisticated systems to manage that data, such as SQL Server or Elasticsearch, which can affect storage operations even further. In addition, application workflows can affect storage. For example, high-throughput transactional processing results in very different I/O patterns from those of a business intelligence product accessing a data warehouse.
The type of development can also have an impact on storage requirements. For instance, Agile or DevOps development methodologies are much less linear than a waterfall approach, resulting in different data storage requirements and workflow patterns. Some teams might employ technologies such as virtualization, containerization, microservices or infrastructure as code, all of which call for a more flexible approach to storage provisioning.
In today's world of application delivery, enterprises need storage that speeds up operations while minimizing the time spent managing storage systems. That's where cloud storage and intelligent storage enter the picture.
Traditional approaches to storage worked fine for more linear and predictable application development models. But even in those circumstances, storage provisioning could be a challenge. Getting systems in place to support application development could take weeks, if not longer. And changing direction midstream could result in lengthy delays, wasted money and a failed project.
Today's much more dynamic development efforts must adapt quickly to changing requirements, which can be difficult for traditional storage systems. Cloud storage has become a natural ally to application delivery teams, offering increased agility, easier management, system integration and data redundancy.
Cloud storage supports on-demand elastic scalability, making it possible for teams to respond immediately to fluctuating requirements. Teams can provision capacity and performance within minutes rather than weeks, enabling the application delivery process to continue unencumbered. Cloud storage also provides a global platform that can better accommodate distributed teams and applications while supporting collaborative workflows. In addition, storage services often include quality-of-service mechanisms so users can better control how resources are allocated to workloads, helping to ensure the storage performance necessary to support changing circumstances.
Cloud storage services
Many organizations turn to the cloud for storage because it offers the capacity, scalability and flexibility they need to support their applications and storage requirements.
Subscribers can easily manage and provision their storage services from a single web-based interface. In addition, the service usually provides standards-based APIs for programmatically managing operations. Cloud services are also metered, making it easier to control and estimate costs.
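Because usage is metered, monthly costs can be estimated directly from provisioned capacity and activity. The sketch below uses hypothetical per-unit rates -- real prices vary by provider, region and tier, so treat the numbers purely as placeholders:

```python
# Rough monthly cost estimate for a metered cloud storage service.
# All rates below are hypothetical placeholders, not any provider's pricing.

RATE_PER_GB_MONTH = 0.023      # storage, USD per GB-month (assumed)
RATE_PER_10K_REQUESTS = 0.005  # API requests, USD per 10,000 calls (assumed)
RATE_PER_GB_EGRESS = 0.09      # data transfer out, USD per GB (assumed)

def estimate_monthly_cost(stored_gb, requests, egress_gb):
    """Return an estimated monthly bill in USD for the given usage."""
    storage = stored_gb * RATE_PER_GB_MONTH
    api = (requests / 10_000) * RATE_PER_10K_REQUESTS
    egress = egress_gb * RATE_PER_GB_EGRESS
    return round(storage + api + egress, 2)

# Example: 500 GB stored, 2 million requests, 50 GB transferred out.
print(estimate_monthly_cost(500, 2_000_000, 50))  # 17.0
```

Because every line item is metered, the same arithmetic works in reverse: a team can budget for a test environment before provisioning it.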
Enterprises can find cloud services that offer file, block or object storage. With file storage, data is stored as files in a hierarchical format, much like the files on a laptop. File storage supports such use cases as content management, file sharing, database backups and web hosting.
In block storage, data is stored in individual blocks, with each block assigned a unique identifier. The blocks provide fixed-sized raw storage capacity that can be presented as mounted drive volumes to multiple OSes. Block storage is fast, scalable and well suited to essential applications, such as database systems or virtualization offerings.
With object storage, data is stored as self-contained units called objects in a flat address space known as a storage pool. Each object includes an expandable set of metadata and an identifier that uniquely identifies the object within the address space. Because requests do not have to traverse hierarchical layers, object storage is highly scalable and fast, making it a good fit for analytics.
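The flat-namespace model can be sketched in a few lines. This toy in-memory "object store" is illustrative only; real services such as Amazon S3 expose the same ideas -- a flat pool, unique keys and per-object metadata -- through HTTP APIs:

```python
import uuid

# Toy illustration of an object store: a flat pool keyed by unique IDs,
# where each object carries its data plus an expandable metadata dict.
class ObjectStore:
    def __init__(self):
        self._pool = {}  # flat address space: no directories, no hierarchy

    def put(self, data: bytes, **metadata) -> str:
        object_id = str(uuid.uuid4())  # identifier unique within the pool
        self._pool[object_id] = {"data": data, "metadata": metadata}
        return object_id

    def get(self, object_id: str) -> bytes:
        return self._pool[object_id]["data"]

    def head(self, object_id: str) -> dict:
        # Metadata travels with the object and can be extended at will.
        return self._pool[object_id]["metadata"]

store = ObjectStore()
oid = store.put(b"sensor readings", content_type="text/csv", source="plant-7")
print(store.head(oid)["content_type"])  # text/csv
```

Note that every lookup is a single key access into the pool, which is exactly why object storage scales without the traversal cost of a file hierarchy.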
Storage providers generally rely on virtualization and multi-tenant architectures to deliver services as efficiently as possible. They can dynamically provision storage to meet each customer's requirements, while optimizing resource use in the underlying infrastructure.
Cloud storage services take care of the back-end issues that come with supporting a storage platform, freeing team members from the more mundane tasks. Most management procedures are simple point-and-click operations that can be carried out with little effort. A team can provision storage in minutes without worrying about setting up, optimizing or monitoring storage arrays. Not only does this help free up managers, but it makes it possible for developers and testers to provision their own resources.
Because less time is spent managing back-end systems, more time can be spent on application delivery. Managed storage services also benefit teams that follow more traditional development methodologies because operations professionals can focus on tasks specific to application delivery, such as deployment and updates, while ensuring developers and testers have the resources they need to do their jobs.
Another way cloud services can help speed up application delivery is through their ability to integrate with automated systems used to make delivery more efficient. Most storage providers offer standards-based APIs for integrating with their services. Organizations can take advantage of these APIs to incorporate storage-related operations into their automated application delivery systems. For example, they can automate such operations as storage provisioning, snapshot creation or storage performance monitoring.
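As a sketch of that kind of integration, the function below automates snapshot creation across a set of volumes. It follows the shape of the AWS EC2 `create_snapshot` call, but takes the client as a parameter so any SDK -- or, as here, a demonstration stub -- can be plugged in; treat the details as illustrative, not as a definitive implementation:

```python
# Automate snapshot creation across volumes via a provider SDK.
# With AWS this would be client = boto3.client("ec2"); here the client
# is injected so the sketch stays provider-neutral and runnable offline.

def snapshot_volumes(client, volume_ids, description="nightly build snapshot"):
    """Create a snapshot of each volume and return the new snapshot IDs."""
    snapshot_ids = []
    for vol in volume_ids:
        resp = client.create_snapshot(VolumeId=vol, Description=description)
        snapshot_ids.append(resp["SnapshotId"])
    return snapshot_ids

# Minimal stub standing in for a real SDK client, for demonstration only.
class StubClient:
    def create_snapshot(self, VolumeId, Description):
        return {"SnapshotId": f"snap-{VolumeId}"}

print(snapshot_volumes(StubClient(), ["vol-1", "vol-2"]))
# ['snap-vol-1', 'snap-vol-2']
```

Wired into a CI pipeline, a step like this can capture a consistent copy of test data before every build without anyone touching a console.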
The more operations can be automated, the faster and more efficiently applications can be developed, tested and deployed. If development and deployment tools cannot easily integrate with the storage platforms, managers must either perform the operations manually or spend time figuring out workarounds to fit all the pieces together, either of which results in wasted time and slower application delivery.
Cloud storage makes it easier to copy data and ensure its availability. With most cloud services, you can quickly and easily clone data, create snapshots, generate backups or make copies of your data sets in other ways, processes that can take hours or even days on an in-house system. Not surprisingly, the faster those copies are made available, the sooner team members can get on with application delivery.
The cloud's capacity for data redundancy makes it possible to implement a disaster recovery strategy and ensure data is always available when and where it's needed. Cloud services are often more adept at copy data management, so storage resources are better utilized and data is delivered more efficiently.
Although organizations can carry out many of these operations on premises, the process can be slow and cumbersome, taking time away from the important tasks at hand. Application delivery teams must be able to access the data they need when they need it while accommodating changing application requirements. Cloud storage can go a long way in making all this possible.
Much like cloud storage, intelligent storage frees up DevOps teams so they can focus on application delivery. Intelligent storage can help users better manage arrays, prevent problems from occurring, resolve issues after they've occurred and optimize configurations to deliver better performance and cost savings.
Intelligent storage, or smarter storage, incorporates AI and machine learning to analyze data collected from across storage infrastructures and other systems. The technology relies on the advanced monitoring capabilities built into storage systems to provide the detailed metrics necessary to perform the analysis. In this way, AI can discover patterns and make predictions about possible issues and their resolution. As more data is collected, the AI algorithms become better at these predictions.
A good example of how intelligent storage works comes from Hewlett Packard Enterprise (HPE), which offers its InfoSight service with many of its storage platforms, such as HPE Nimble Storage and HPE Primera. Using predictive analytics, InfoSight analyzes data collected from HPE storage and server systems from around the world. The service then uses the results of this analysis to predict and prevent issues, resolve problems, optimize performance and maximize resource usage. InfoSight takes a preemptive approach to storage management that results in quicker resolutions, better performing applications and more reliable storage provisioning.
It's easy to see how these services can benefit the application delivery team. With intelligent storage, they experience fewer issues and spend less time resolving any issues that do occur, resulting in more time focused on application delivery. InfoSight's ability to optimize systems makes the application delivery process smoother and more efficient by providing better performance for all data-related operations.
It's not just HPE that's entered the intelligent storage arena. For example, NetApp's Active IQ intelligence engine uses AI and predictive analytics to help protect and optimize systems, such as the NetApp AFF A-Series of all-flash arrays. Active IQ collects data from the NetApp user base to create actionable insights to help prevent problems, accelerate issue resolution and optimize configurations, saving time and speeding application delivery.
Intelligent storage systems
Intelligent storage provides self-managing capabilities that help address the complexities that come with maintaining a storage system. Although vendors take different approaches to delivering intelligent storage, all of them rely on AI technologies, such as machine learning and predictive analytics, to analyze relevant data sets and predict outcomes that can then be used to prevent or resolve issues, as well as optimize system configurations.
An intelligent storage system collects data from relevant sources in preparation for the analytics. Much of the data comes from the storage systems themselves, but it can also come from other data center systems, such as servers or virtualized environments. This enables the intelligent storage system to determine the root cause of a problem, even if it lies outside the storage system itself.
When data is prepared for analysis, it is cleansed, transformed and organized according to the specific needs of the analytical process. A portion of the data is used to train data models and improve predictive algorithms. This enables the algorithms to constantly learn in order to arrive at better predictions.
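A drastically simplified version of this train-then-predict loop: learn a metric's normal range from historical telemetry, then flag readings that fall outside it. Real intelligent storage platforms apply far richer machine learning models across many metrics, so this is only an illustration of the principle:

```python
import statistics

# Toy 'train on history, then flag anomalies' loop over one storage metric.
# The latency figures are hypothetical telemetry, purely for illustration.

def train(history):
    """Learn the normal range (mean +/- 3 std devs) from historical readings."""
    mean = statistics.fmean(history)
    stdev = statistics.pstdev(history)
    return mean - 3 * stdev, mean + 3 * stdev

def is_anomalous(reading, bounds):
    low, high = bounds
    return not (low <= reading <= high)

# Historical I/O latencies in milliseconds (assumed sample data).
history = [1.9, 2.1, 2.0, 2.2, 1.8, 2.0, 2.1, 1.9]
bounds = train(history)
print(is_anomalous(2.05, bounds))  # False: within the learned range
print(is_anomalous(9.0, bounds))   # True: flags a possible developing problem
```

As the history grows, the learned bounds tighten around true normal behavior, which mirrors how the predictions of intelligent storage systems improve as more data is collected.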
Each vendor implements intelligent storage differently. HPE, as noted, delivers it through a separate service, InfoSight, that works in conjunction with its storage platforms. Dell EMC, on the other hand, embeds AI capabilities directly into its storage arrays for autonomous decision-making based on a predefined service level.
There are pros and cons to both intelligent storage approaches. A system with built-in AI might be able to respond more quickly to a particular issue, but a centralized offering outside the storage system tends to be more current and comprehensive. When evaluating intelligent storage, you should verify how intelligence is implemented and which system delivers the storage services your organization requires.
Cloud storage services are also starting to provide intelligent storage provisioning features. For instance, Amazon S3 now offers Intelligent-Tiering, a storage class that adjusts services when data access patterns change. The new storage class automatically moves data between two access tiers -- frequent access and infrequent access -- without incurring performance or operational overhead. Application delivery teams can ensure the performance they need when they need it, while reducing the cost of storage when they don't.
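Opting into Intelligent-Tiering is typically just a matter of setting the storage class at upload time or adding a lifecycle rule. The dictionaries below follow the shape of the S3 API's `PutObject` and lifecycle-configuration parameters (as used with boto3); the bucket and key names are hypothetical, and the exact fields should be verified against current AWS documentation:

```python
# Upload parameters selecting the Intelligent-Tiering storage class,
# shaped like the arguments to boto3's s3.put_object call.
put_object_params = {
    "Bucket": "example-build-artifacts",    # hypothetical bucket name
    "Key": "builds/app-release.tar.gz",     # hypothetical object key
    "Body": b"...artifact bytes...",
    "StorageClass": "INTELLIGENT_TIERING",
}

# Alternatively, a lifecycle rule can transition existing objects, shaped
# like the payload for boto3's put_bucket_lifecycle_configuration call.
lifecycle_config = {
    "Rules": [
        {
            "ID": "tier-build-artifacts",
            "Status": "Enabled",
            "Filter": {"Prefix": "builds/"},
            "Transitions": [
                {"Days": 0, "StorageClass": "INTELLIGENT_TIERING"}
            ],
        }
    ]
}

print(put_object_params["StorageClass"])  # INTELLIGENT_TIERING
```

Once objects carry this storage class, the tiering happens server-side with no further action from the team.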
Intelligent storage takes data storage to the next level, making it a compelling option for any application delivery team looking to speed up the process.
Storage provisioning today
Today's application development efforts require storage provisioning that is as flexible as the development processes themselves. A storage system should help speed up application delivery, not encumber it in any way.
Cloud storage services offer agility, simplified management, standards-based integration and data redundancy. In addition, intelligent storage can simplify management, while preventing and resolving issues and optimizing storage systems.
All of these factors can help free up application delivery teams to focus on what they're supposed to be doing: delivering applications. Storage that does anything less can encumber the entire application delivery process.