
IBM Spectrum storage beefs up hybrid cloud

IBM plans to add a new cloud service for capacity planning and performance monitoring, real-time compression for its XIV array, and a multi-tier file and object storage service.

IBM beefed up its hybrid cloud capabilities this week by launching a cloud-based data management service and previewing a multi-tier file and object storage service.

The vendor also added native real-time compression to its XIV storage array. The IBM storage announcements came at its Edge Conference in Las Vegas.

IBM Spectrum Control Storage Insights, which is due to become generally available on June 29, will enable users to do capacity planning and performance monitoring of their on-premises storage through a software application that runs on the IBM Cloud.

IBM gave an early look at its "Big Storage Technology" project using IBM Spectrum Scale and Spectrum Archive software. Spectrum Scale is based on General Parallel File System and was known by the code name "Elastic Storage."

The Big Storage service aims to lower the cost of storing and retrieving large amounts of data. It will consist of file and object storage for on-premises use, hybrid cloud deployments and IBM's cloud, according to Bernie Spang, vice president of software-defined infrastructure for IBM Systems.

IBM Spectrum Control Storage Insights

Spectrum Control Storage Insights has its origins in IBM's former Tivoli Storage Productivity Center product, but the cloud service's capabilities are new. Storage Insights is the first addition to the IBM Spectrum Storage family since the company announced the $1 billion, five-year initiative in February.

Spang said customers can download and install lightweight agents to enable the cloud-based Spectrum Control Storage Insights service to perform analytics and provide guidance on how to optimize data placement and reclaim underutilized storage. Pricing will be announced at the time the service becomes generally available, he said.

"Customers are asking us to help them simplify their storage management and give them greater visibility into how to optimize it," Spang said. "This is the main way to apply analytics to their environment to make recommendations for how they can optimize capacity planning going forward."

Cal Braunstein, CEO and executive director of research at Robert Frances Group in Wilton, Conn., said most companies waste more than 40% of their storage capacity.

"There are software offerings out there that do examine systems for storage efficiency, but most products are not at the application level and therefore do not optimize storage as efficiently," Braunstein wrote in an email.
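IBM has not detailed how Storage Insights' analytics work, but the reclamation guidance Spang describes boils down to flagging volumes whose utilization falls below some threshold and totaling the stranded capacity. A minimal sketch with hypothetical volume data (the field names are invented; the 40% threshold simply echoes Braunstein's waste figure):

```python
# Hypothetical reclamation check -- not IBM's analytics, just the general idea:
# flag volumes using less than a threshold fraction of their capacity.

def reclaimable(volumes, threshold=0.4):
    """Return underutilized volumes and the total space (GB) they tie up."""
    flagged = [v for v in volumes if v["used_gb"] / v["capacity_gb"] < threshold]
    freed = sum(v["capacity_gb"] - v["used_gb"] for v in flagged)
    return flagged, freed

volumes = [
    {"name": "vol1", "capacity_gb": 500, "used_gb": 100},   # 20% utilized
    {"name": "vol2", "capacity_gb": 1000, "used_gb": 900},  # 90% utilized
    {"name": "vol3", "capacity_gb": 200, "used_gb": 50},    # 25% utilized
]

flagged, freed = reclaimable(volumes)
# vol1 and vol3 are flagged, tying up 400 + 150 = 550 GB of reclaimable space
```

A real service would feed such logic from agent-collected telemetry rather than a static list, but the placement recommendation itself is this kind of thresholding.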

XIV Gen 3 offers real-time compression

IBM claims XIV Gen 3's real-time compression will allow users to store 50% to 80% more data with little or no performance impact on applications. IBM will include a compression guarantee of up to 5-to-1 data reduction with the product, according to Spang.
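As a back-of-the-envelope check on those figures, effective capacity scales linearly with the data-reduction ratio: storing "50% to 80% more data" corresponds to ratios of roughly 1.5-to-1 to 1.8-to-1, while the guaranteed 5-to-1 ceiling would yield five times the raw capacity. A quick sketch (the example array size is illustrative, not IBM's):

```python
# Simple compression-ratio arithmetic for effective storage capacity.

def effective_capacity(raw_tb, ratio):
    """Usable capacity at a given data-reduction ratio."""
    return raw_tb * ratio

def extra_data_pct(ratio):
    """Percentage more data stored versus uncompressed."""
    return (ratio - 1) * 100

# "50% more data" is a 1.5-to-1 ratio; "80% more" is 1.8-to-1:
assert round(extra_data_pct(1.5)) == 50
assert round(extra_data_pct(1.8)) == 80

# A hypothetical 100 TB raw array at the guaranteed 5-to-1 ceiling:
print(effective_capacity(100, 5))  # -> 500
```

Actual ratios depend heavily on the data: already-compressed or encrypted workloads reduce far less than databases or text.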

Real-time compression was available in previous versions of XIV only by using IBM Spectrum Virtualize, formerly known as SAN Volume Controller, with the arrays.

Spang said real-time compression will not be available with IBM's software-only version of XIV, Spectrum Accelerate. Spectrum Accelerate, which can run on commodity hardware, became available this year, and IBM plans to add it as a service on the IBM Cloud in the second half of 2015.

Big Storage Technology preview

IBM did not disclose the estimated availability date of its Big Storage Technology. Spang said it will be made available "when our partners tell us it's ready." The company is running a pilot of the cloud service with Iron Mountain and design partner clients.

"We'll enable a multi-tier environment so that clients can have data that's in the IBM Cloud, on SoftLayer, on disk, or in a lower-cost tier that's managed with Spectrum Scale or Spectrum Archive in a tape library that's a fully active part of the global namespace," Spang said.

Spang said IBM's points of differentiation include Spectrum Scale's support of the OpenStack Swift object storage interface, automated policy-driven data placement and movement across storage tiers, and integration with Spectrum Archive that extends those tiers to the tape library. He claimed the price point will also set Big Storage apart from competitors' products.
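The policy-driven placement Spang describes can be pictured as rules mapping a file's access age to a tier. Spectrum Scale's actual information lifecycle management policies use an SQL-like rule language; the sketch below, with invented thresholds and tier names, just illustrates the placement decision:

```python
# Hypothetical tiering policy keyed on days since last access.
# Thresholds and tier names are illustrative, not Spectrum Scale defaults.

TIERS = [
    (30, "flash/disk"),      # hot: accessed within 30 days
    (180, "object store"),   # warm: 31-180 days
    (float("inf"), "tape"),  # cold: older data lives in the tape tier
]

def place(days_since_access):
    """Return the tier a file should land in, given its access age."""
    for max_age, tier in TIERS:
        if days_since_access <= max_age:
            return tier

print(place(400))  # -> tape
```

The point of Spang's "fully active" tape tier is that files placed there remain visible in the same global namespace, so a recall is transparent to applications rather than a separate restore operation.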

Mark Peters, a senior analyst at Enterprise Strategy Group Inc., based in Milford, Mass., said IBM's Big Storage Technology offers the ability to take advantage of the economics of tape behind an object store, while retaining the metadata locally to speed up metadata operations such as search.

"With the level of data growth the industry is seeing currently, embracing object storage is becoming critical for any storage portfolio," Peters wrote in an email. He added, "For on-premises [storage], IBM may be hitting the market right in stride. While object technology has been around for some time, it still represents a small portion of the market. There have been some signs recently that point to the potential for growth in the object space moving forward."

Charles King, president and principal analyst at Pund-IT Inc., said via an email that IBM is taking the right approach by reinforcing its tape, disk and flash storage platforms in partnership with Iron Mountain, one of the highest profile names in data archiving.

"As one of the few vendors to still be involved in developing every major form of storage media, IBM's strategy is to emphasize the value of multi-dimensional tiering -- automatically moving data to the platform that most cost effectively suits its use cases," King wrote. "There aren't many vendors who can pull this off, and it's a value proposition that IBM's enterprise customers are likely to get without too much trouble."

Next Steps

IBM CTO says strategy revolves around Spectrum, flash, virtualization

IBM takes aim at hybrid cloud with Elastic Storage

IBM strategist discusses importance of flash, cloud
