
IBM Spectrum Storage platform adds analytics software

The new Spectrum Computing platform highlights IBM's strategy to bring together compute and storage management for file and object scale-out environments.

IBM extended its Spectrum software brand beyond storage with a new line of resource and workload management products designed for use with high-performance analytics and data-driven applications.

Spectrum Computing complements the IBM Spectrum Storage software portfolio that IBM introduced in February 2015. The new Spectrum Computing line includes IBM Spectrum Conductor, with integrated IBM Spectrum Scale file and object storage capabilities.

Spectrum Scale is based on IBM's General Parallel File System. Spectrum Conductor installs on a scale-out cluster of x86 or IBM Power servers running Linux, according to Nick Werstiuk, IBM's director of software-defined infrastructure strategy. The application enables cloud applications and open source software frameworks, such as Hadoop, to share compute resources, accelerate analytics results, and protect and manage data and storage in a multi-tenant environment.

The Spectrum Computing platform, which is due to become generally available this month, also includes:

  • IBM Spectrum Conductor with Spark, a version of Conductor tailored to work with the open source Apache Spark big data analytics framework.
  • IBM Spectrum LSF, an updated and rebranded version of IBM Platform LSF, which accelerates throughput for simulation, design and research. The Spectrum LSF software, which is geared for high-performance computing (HPC) environments, stems from IBM's 2012 acquisition of Platform Computing.
  • IBM Spectrum Symphony, formerly known as IBM Platform Symphony, provides infrastructure as a shared service with workload and resource management and analytics acceleration. IBM Spectrum Symphony is a rebranded product that includes no technology updates at this time. The Symphony software is also rooted in the Platform Computing acquisition.

"We see this need for a distributed, scale-out infrastructure and the software to allow people to manage that effectively both on the software-defined storage portfolio, with Spectrum Storage, and now by introducing this set of Spectrum Computing offerings," Werstiuk said.

Werstiuk said IBM sees the lines of storage management and compute and workload management starting to blur as customers build scale-out, shared infrastructure environments. Customers need a multi-scale platform to manage compute and storage "simultaneously, but independently," he said.

"Multi means I have the flexibility to add in more storage-centric capacity, or I have the flexibility to add more compute-centric capacity into that environment. And we'll manage that situation with the software we provide," Werstiuk said.

Spectrum Conductor is an example of how the storage and compute software can work together. The application takes aim at enterprises using general-purpose analytics and service- or container-based environments supporting cloud-native workloads, Werstiuk said.

"We aggregate the local disk in that pool of resources into a usable storage environment," Werstiuk said. "And we aggregate the usable compute resources into a unified resource pool, which we then allow the organizations to divide up and allocate to their applications and manage the application deployment."

Mike Gualtieri, a principal analyst at Forrester Research, spotlighted the significance of IBM's integration work for Apache Spark users. He said Spark is a popular cluster computing and batch processing engine, but it doesn't have its own file system. Users often turn to the Hadoop Distributed File System (HDFS), but HDFS has limitations, he said.

IBM Spectrum Scale gives users an alternative to HDFS, and they can manage the compute side with Spectrum Conductor. Users don't always know how many server nodes they need, and Spectrum Conductor can help to "elastically spin up the compute needed and monitor that job," Gualtieri said.
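
Gualtieri's point about file system choice shows up directly in how a Spark job addresses its data. The following PySpark sketch contrasts the two approaches under stated assumptions: the /gpfs/analytics mount point, the namenode address and the file names are hypothetical, and the connector IBM actually ships for Spectrum Scale may differ.

```python
# Minimal PySpark sketch contrasting an HDFS path with a Spectrum Scale (GPFS)
# POSIX mount. The mount point, namenode address and file names are assumptions
# for illustration; the connector IBM ships for Spectrum Scale may differ.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("scale-vs-hdfs-sketch").getOrCreate()

# A conventional Spark-on-Hadoop job reads through an hdfs:// URI.
events_hdfs = spark.read.parquet("hdfs://namenode:8020/data/events.parquet")

# With Spectrum Scale mounted as a regular POSIX file system, the same job can
# read through a file:// path instead, without standing up a separate HDFS tier.
events_gpfs = spark.read.parquet("file:///gpfs/analytics/data/events.parquet")

events_gpfs.groupBy("event_type").count().show()
```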

IBM Spectrum Conductor aids big data analytics

Ashish Nadkarni, a program director with IDC's infrastructure practice, said IBM Spectrum Conductor can "disrupt siloed solutions that are based on Hadoop." He said Conductor essentially enables businesses to "combine and natively analyze data on a shared, highly scalable infrastructure platform without the dependency of Hadoop."

"We are entering an era where data cannot be stored and analyzed in silos," Nadkarni said. "Next-generation applications take an analytics-first approach and require a data management platform that not only hosts the analytics workload, but also the components of the application itself. This platform has to be cloud-enabled, distributed, scalable and unified, supporting structured, unstructured and semi-structured data sets."

Steve Conway, research vice president for HPC and high-performance data analysis at IDC, said IBM's integration of LSF, Conductor and Symphony into the IBM Spectrum Storage and Computing portfolio makes sense.

"They all belong together," Conway said. "And when you think of the data explosion that's happening, certainly storage plays a very, very big part in dealing with that. So, to integrate those pieces together into something coherent is a big advantage for a buyer. Trying to do that on a do-it-yourself basis can lead to some very big problems."

Pricing for Spectrum Conductor is based on compute and storage, at $3,825 per compute socket and $625 per terabyte.
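
As a rough illustration of how that pricing adds up, the following back-of-the-envelope calculation applies the quoted list prices to a hypothetical cluster; the node count and storage capacity are invented for the example.

```python
# Back-of-the-envelope cost estimate using the list prices quoted above:
# $3,825 per compute socket and $625 per terabyte. The cluster size
# (20 dual-socket nodes, 200 TB of storage) is invented for the example.
PRICE_PER_SOCKET = 3825
PRICE_PER_TB = 625

nodes = 20
sockets_per_node = 2
storage_tb = 200

compute_cost = nodes * sockets_per_node * PRICE_PER_SOCKET  # 40 sockets -> $153,000
storage_cost = storage_tb * PRICE_PER_TB                    # 200 TB -> $125,000
print(f"Estimated license cost: ${compute_cost + storage_cost:,}")  # $278,000
```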

IBM is working on a reference configuration for Spectrum Conductor with Spark on its Power-based servers. The company is also working with several other vendors on reference architectures, according to an IBM spokesman.

Super Micro Computer is building analytics appliances using Supermicro server hardware and IBM Spectrum Conductor, according to Marc Théberge, director of the rack solutions division at Super Micro. Théberge said the performance-optimized analytics appliances integrate compute, storage and networking hardware. Super Micro also builds hyper-converged appliances with IBM Spectrum Accelerate block storage. Accelerate, based on XIV technology, is part of the IBM Spectrum Storage family.

