- Dave Raffo, Editorial News Director
Before the term took on sinister political tones, we were starting to hear a lot of talk about repatriation as the newest cloud trend.
Jon Toigo addressed cloud repatriation in his last column for Storage magazine before his untimely passing earlier this year, and analyst firms and vendors threw the term around a lot in late 2018 and early 2019. They described a trend where a large number of organizations that put data into the cloud over the past decade were now pulling that data back on premises.
It's likely the cloud repatriation trend was overblown, just as the breadth of companies moving into the cloud was once exaggerated. Although many organizations put data in public clouds over the years, few put the bulk of their data there. And while companies are pulling data back today, they're still leaving the right type of data in public clouds.
The cloud trend we see now is companies taking a hard look at what data they need in the cloud and what belongs on premises, and then making adjustments, which may or may not include cloud repatriation. Talking to storage managers and other IT pros, we found that many discovered the public cloud to be more expensive than expected or a poor fit for their needs. The public cloud also wasn't always compatible with their data center infrastructure, and it remains up for debate whether it is useful for more than secondary storage.
With more cloud options than ever, it is smart for IT to consider the best way to run each application and type of data. For midsize and large companies, that will likely mean a mix of public clouds, private clouds and on-premises storage. That's why we're hearing so much these days about multi-cloud and the hybrid cloud trend taking off again.
Controlling cloud repatriation costs
Plex Systems has maintained its own manufacturing ERP cloud -- hosting ERP software as a service -- for customers since it began in 2001. At the same time, the company is considering how to use the public cloud for its own infrastructure and considerable data needs.
"We're looking at how to have a large storage footprint on our own cloud, and a large footprint in public clouds," said Todd Weeks, Plex Systems' group vice president of cloud operations and chief security officer.
Like other companies, Plex Systems is struggling to control the costs associated with moving data after it is put into the public cloud.
"The big thing now from a cost savings and cost control perspective is the public cloud costs a lot to transfer data," Weeks said. "They come in with 'Hey, data is very cheap to store here,' but it's not cheap to transfer data around. The challenge we have is now that we have large amounts of storage in both, how do we write the application so we don't lose cost on data transfer? We will have data all over the place, so how will we make sure to control costs when we transfer it and interact with it?"
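The imbalance Weeks describes, cheap storage but costly data movement, is easy to see with back-of-envelope arithmetic. The sketch below uses illustrative per-GB rates that are assumptions for the example, not any provider's actual price list; real rates vary by provider, region and tier.

```python
# Rough comparison of monthly storage cost vs. data-transfer (egress)
# cost. Both per-GB rates below are illustrative assumptions only.

STORAGE_RATE_PER_GB = 0.023  # assumed $/GB-month for object storage
EGRESS_RATE_PER_GB = 0.09    # assumed $/GB for data transferred out

def monthly_cost(stored_gb: float, egress_gb: float) -> dict:
    """Return the storage and transfer components of a monthly bill."""
    storage = stored_gb * STORAGE_RATE_PER_GB
    egress = egress_gb * EGRESS_RATE_PER_GB
    return {"storage": storage, "egress": egress, "total": storage + egress}

# Storing 100 TB is cheap relative to reading half of it back out:
# the transfer line item roughly doubles the storage line item.
print(monthly_cost(stored_gb=100_000, egress_gb=50_000))
```

At these assumed rates, moving 50 TB out in a month costs roughly twice what it costs to store 100 TB for that month, which is why application design that minimizes cross-cloud transfers matters so much to a bill.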
In a June 2019 survey of 900 IT leaders conducted by U.K. research firm Vanson Bourne, 32% of respondents said the cloud met all expectations, while 58% said they received some benefit. Only 39% use the public cloud efficiently, while 40% struggle to benefit from the public cloud.
Writers for our TechTarget storage sites have been talking to executives to find out why.
Like Plex Systems' Weeks, Palmaz Vineyards CEO Christian Gastón Palmaz said his challenge came from moving data off the cloud. Palmaz told Johnny Yu of SearchDataBackup that he moved data off the public cloud because of egress charges and latency.
"To put a petabyte on the cloud is one thing, but pulling that data off the cloud was expensive," Palmaz said.
Such excessive cloud costs aren't limited to winemakers or the public cloud. Craft brewer New Belgium Brewing Co. found a private cloud too costly to continue using, for example. New Belgium Brewing IT operations manager Jake Jakel told Garry Kranz of SearchConvergedInfrastructure that a private cloud hosted by a managed service provider "was holding us hostage" when the brewery tried to upgrade its VMware deployment.
Jakel said the brewery will save approximately $1 million over three years after switching to its in-house private cloud built on Dell EMC VxRail hyper-converged appliances.
"That's a huge amount of money in our IT budget," Jakel said.
Finding the right cloud fit
When our writer Carol Sliwa went to the AWS Summit in New York in July 2019, she came across several large companies still trying to figure out how best to use the public cloud. These included JP Morgan, whose cloud data footprint is small compared with its overall storage capacity. Still, Kevin Lamb, who oversees resiliency for cloud at JP Morgan, said the company has a large migration plan for Amazon Web Services.
"The biggest problem is the way applications interact," Lamb said. "When you put something in the cloud, you have to think, 'Is it going to reach back to anything that you have internally? Does it have high communication with other applications? Is it tightly coupled? Is it latency sensitive? Do you have compliance requirements?' Those kind of things are key decision areas to say this makes sense or it doesn't."
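The questions Lamb raises work well as a screening checklist. The sketch below turns them into code; the `Workload` fields and the flagging rule are hypothetical illustrations of that checklist, not JP Morgan's actual migration methodology.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    """Hypothetical placement-screening attributes for one application."""
    reaches_back_on_prem: bool   # calls back to internal systems?
    tightly_coupled: bool        # chatty with other applications?
    latency_sensitive: bool      # sensitive to added network latency?
    compliance_restricted: bool  # subject to data-residency rules?

def migration_flags(w: Workload) -> list[str]:
    """Return the criteria that argue against moving this workload."""
    flags = []
    if w.reaches_back_on_prem:
        flags.append("reaches back on premises")
    if w.tightly_coupled:
        flags.append("tightly coupled to other apps")
    if w.latency_sensitive:
        flags.append("latency sensitive")
    if w.compliance_restricted:
        flags.append("compliance requirements")
    return flags

# A standalone, loosely coupled app raises no flags and is a
# reasonable cloud candidate; each flag is a reason to keep it local.
print(migration_flags(Workload(False, False, False, False)))
```

The point of a checklist like this is the one Lamb makes: placement is decided application by application, not for the data center as a whole.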
It's hard to say if many companies are cloud repatriates these days. In a lot of cases, they're just trying to find the right fit. One thing is clear, however: The high costs associated with cloud repatriation are but one aspect of the larger cloud trend of finding the best mix of cloud and on-premises storage for your needs.