Editor's Note: This is one of the last columns longtime Storage magazine and TechTarget contributor Jon Toigo wrote for us before he died of natural causes in February at age 59. Please read "Remembering Jon Toigo" -- near the end of this column -- for more about his life and career.
Storage was one of the original service offerings of clouds -- focused initially on general-purpose content distribution, then on file sharing or mobile-device-capacity expansion. Today, storage-related service offerings run the gamut from backup and archive data repositories to more complex storage solutions that support cloud-based applications and virtual machines.
Internal private clouds -- infrastructure built by companies in their own on-premises data centers using many of the same technologies as public cloud providers -- developed concurrently with public cloud outsourcing services. Recent surveys report that these on-premises private clouds support as much as 75% of the application workloads hosted in enterprise data centers today, and it is the private cloud that is the foundation for the leading cloud trends of 2019.
Beginning in late 2018, analysts started to report another trend: businesses redistributing their workloads from outsourced or public clouds to hosted private or on-premises clouds.
IDC coined the expression "cloud repatriation" in a 2018 report citing the trend. The report said 80% of the IT planners surveyed had repatriated workloads over the 2017 to 2018 timeframe and, on average, planned to move half of their public cloud applications to hosted private or on-premises locations over the next two years. Furthermore, IDC predicted that 75% of enterprises using the public cloud would also use a private cloud by 2020.
The reasons for repatriating workloads are many, ranging from a desire to exert more direct control over data, to realizing the perceived security benefits of local hosting, to capturing the cost savings expected from re-platforming apps and data. The latter is the most common rationale. Among the cloud trends of 2019, this one shows that, despite years of marketing and hype, public clouds have not always proven to be the low-cost alternative to the on-premises data center they are cracked up to be.
Hybrid clouds becoming popular
Shifting cloud-based workloads between on- and off-premises infrastructure isn't a new concept, technically speaking. Larger enterprises never fully adopted public clouds, mainly because business leaders did not want to entrust their bread-and-butter legacy workloads and mission-critical data -- including financials, enterprise resource planning, logistics, customer relationship management, manufacturing automation and so on -- to an outsourcing agent. Analysts also linked this reluctance to earlier experiences with workload virtualization on distributed servers in the data center, a precursor to cloud that often had the side effect of driving up platform costs and degrading application performance. In other cases, these legacy applications, sometimes called systems of record, were simply too complex to virtualize readily without a significant recoding effort.
Whatever the reason, vendors sought to create technologies ranging from gateways to modular systems that would enable a hybrid computing model to accommodate the virtualized-non-virtualized bifurcation in many data centers. Hybrid data centers, evangelists claimed, would combine private cloud and traditional hosting for workloads in the same on-premises data center, while building in capabilities to use public cloud resources and services on an as-needed basis. To the extent it made sense, public cloud services could be attached to the hybrid data center to use public cloud software services -- such as backup, archive or disaster recovery -- or to use clouds during peak load periods when local processor or storage resources were stretched thin.
Two cloud trends of 2019, the rise of the hybrid cloud and the uptick of cloud repatriation, are driving a broader strategy called multi-cloud. That is, organizations with hybrid data centers want to take advantage of public cloud services from multiple providers based on performance, cost and other characteristics and need a way to manage their services and resources coherently and efficiently. The keys to success will likely include modular technology.
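The placement logic at the heart of such a multi-cloud strategy can be sketched in a few lines. This is purely illustrative; the provider names, prices and latency figures below are invented for the example, not real vendor pricing:

```python
# Illustrative multi-cloud placement: pick the cheapest provider that
# still meets a workload's latency requirement. All figures here are
# made up for the example.

PROVIDERS = {
    "cloud_a": {"cost_per_gb_month": 0.023, "latency_ms": 45},
    "cloud_b": {"cost_per_gb_month": 0.020, "latency_ms": 120},
    "cloud_c": {"cost_per_gb_month": 0.026, "latency_ms": 30},
}

def place_workload(max_latency_ms):
    """Return the cheapest provider whose latency meets the requirement."""
    candidates = [
        (attrs["cost_per_gb_month"], name)
        for name, attrs in PROVIDERS.items()
        if attrs["latency_ms"] <= max_latency_ms
    ]
    if not candidates:
        return None  # no provider qualifies; keep the workload on premises
    return min(candidates)[1]

print(place_workload(60))   # cheapest of the providers fast enough
print(place_workload(200))  # all qualify, so lowest cost wins
```

Real multi-cloud management tools weigh many more characteristics -- egress fees, compliance, data gravity -- but the shape of the decision is the same: rank qualifying providers and place the workload accordingly.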
Modular is a prerequisite for a multi-cloud strategy
The concept of modular storage technology as a means to facilitate multi-cloud strategies, while reducing the complexity inherent in managing multiple collections of resources deployed in geographically dispersed facilities, is among the cloud trends of 2019. Vendors use similar terminology, however, without sharing a common definition.
For example, in late 2018, Infinidat announced its Neutrix Cloud 2.0 service, which it described as "an enterprise-class storage service for multi-cloud computing." Neutrix Cloud is a public cloud storage service that augments Google Cloud, Microsoft Azure, AWS, IBM Cloud and VMware Cloud on AWS compute environments by brokering the use of these third-party services. With Neutrix Cloud 2.0, businesses can centralize the management of cloud storage while providing simultaneous access to multiple public cloud compute assets.
Neutrix Cloud 2.0 is intended to make multi-cloud environments as easy to use as on-premises ones. The design of the Neutrix Cloud RESTful APIs, GUI and command-line interfaces resembles what the company provides with its proprietary InfiniBox on-premises arrays, but the hardware management aspects are removed entirely, and deep integration with cloud resources from multiple providers is offered. One typical Neutrix Cloud use case is closely tied to resiliency: creating writable snapshots of block- or file-based datasets that are accessible across cloud compute resources, using whichever service is most cost-effective or performs best for the workload.
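The writable-snapshot pattern can be sketched as a REST call. The endpoint path and field names below are hypothetical -- Infinidat's actual Neutrix Cloud API may differ -- the point is the shape of the request: a snapshot created without write protection, so compute in any attached cloud can mount and modify it.

```python
# Sketch of a writable-snapshot request in the style of a RESTful storage
# API. The path and field names are hypothetical, for illustration only.

def build_snapshot_request(dataset_id, snapshot_name, writable=True):
    """Return (method, path, body) for a snapshot-create call."""
    return (
        "POST",
        f"/api/v1/datasets/{dataset_id}/snapshots",  # hypothetical path
        {"name": snapshot_name, "write_protected": not writable},
    )

method, path, body = build_snapshot_request("fs-42", "nightly-copy")
print(method, path, body)
# A real client would send this with an HTTP library plus auth headers,
# then export the snapshot to whichever cloud's compute will process it.
```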
Remembering Jon Toigo
On Feb. 12, Jon William Toigo, Storage magazine columnist, conference speaker, technologist and raconteur, passed away at his home in Dunedin, Fla. The news, of course, was shocking and sad beyond words, for all of us here at TechTarget who worked with Jon over the years considered him as much a friend as a professional colleague.
About a dozen years ago, I got a call from Jon. I knew about Jon, but didn't know him personally. Jon had contributed to SearchStorage and was a speaker at one or two Storage Decisions conferences before I joined TechTarget and became involved with the site, the conferences and Storage magazine.
Jon was interested in ending his hiatus from our publications and conferences. And I knew enough about Jon's reputation as being opinionated, outspoken and maybe just a little bit outrageous to invite him back into the TechTarget fold immediately. In December 2007, Jon was one of our headliners at Storage Decisions San Francisco, presenting a session on disaster recovery plan testing. He also took center stage for a keynote addressing data provisioning and protection.
Not surprisingly, both sessions were enormously successful, cementing Jon's presence as a key fixture of all Storage Decision conferences and seminars to come.
It was also clear to me that our readers and attendees were eager to hear more from Jon, so I invited him to write a regular column for Storage magazine that turned out to be a long-running engagement.
Jon was no stranger to controversy. At times, it seemed like he courted it -- whether in print or on stage -- but he wasn't just trying to pick a fight. He approached every topic he wrote about with the same intelligence, insight and authority that he applied to his presentations.
Jon was a master at using entertainment to educate. He might make you laugh or scoff, nod in agreement or shake your head in disbelief, but you listened. And even if Jon's ideas were at odds with yours, he provoked you to think and just maybe consider a different point of view.
However outrageous or bombastic Jon might have been striding the stage, he was a gentleman through and through when he stepped off it, where, inevitably, a cluster of attendees waited to tap him for some one-on-one advice. He was never too busy to listen to others and to share from his seemingly limitless pool of knowledge.
We at TechTarget will miss Jon and grieve for his loss. Our hearts go out to his wife, Margaret, and their six children.
-- Rich Castagna, former TechTarget vice president of editorial
Editor's Note: Jon Toigo wrote thousands of articles, authored 11 books and regularly discussed storage and data management technologies and issues on his blog, DrunkenData.com. You can also explore all the articles he authored for TechTarget by accessing his contributor page.
Cloudian, on the other hand, points to IBM's 2018 acquisition of Red Hat as validation of its strategy for enabling multi-cloud interoperability and seamless integration of hybrid data center storage with cloud storage. The IBM-Red Hat deal centered on application portability across environments. Cloudian focuses on data portability -- enabling hybrid cloud management, providing API compatibility between services and cloud types, consolidating data searches, providing unified access and enabling the convergence of storage for unstructured data across a multi-cloud environment. IBM acquired Red Hat, in part, to provide a full stack platform for IBM's multi-cloud strategy. From a storage perspective, Cloudian already provides similar capabilities with its management tools.
Cloudian claims to offer an additional service that competitors might want to think about: a common data management toolkit that works across all clouds. That means the technology goes beyond the management of modular storage infrastructure resources to address the management of data itself.
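Data portability of this kind rests on a common protocol: if every storage target, public cloud or on-premises object store, speaks the same S3-style API, one request builder serves them all and only the endpoint changes. A minimal sketch, with illustrative endpoints (the on-premises hostname is made up for the example):

```python
# Data portability through a common API: the same S3-style request works
# against any compliant endpoint, cloud or on premises. The on-premises
# endpoint below is a hypothetical example.

ENDPOINTS = {
    "aws": "https://s3.amazonaws.com",
    "on_prem": "https://s3.storage.example.internal",  # hypothetical
}

def put_object_request(target, bucket, key):
    """Build an S3-style PUT request; only the endpoint differs per target."""
    return ("PUT", f"{ENDPOINTS[target]}/{bucket}/{key}")

# The same call shape applies regardless of where the data lives:
for target in ENDPOINTS:
    print(put_object_request(target, "backups", "2019/db.dump"))
```

In practice an S3-compatible client library handles authentication and retries, but the design point stands: managing data, rather than each provider's storage plumbing, is what makes a multi-cloud toolkit portable.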