Coho Data Inc. has injected storage quality of service (QoS) into its DataStream software-defined storage nodes, allowing customers to dynamically allocate resources as service levels evolve.
DataStream 2.8, available Tuesday, builds on the vendor's previous integration of software-defined networking and support for Docker container storage. The upgrade adds Hadoop Distributed File System (HDFS) support for big data analytics alongside existing scale-out file and iSCSI block multi-tenant storage.
Prior to the addition of QoS, DataStream relied on the native logic of software-defined network switches to arbitrate any resource contention.
DataStream software matures
Version 2.8 provides a single cluster-wide namespace per tenant. CPU, network and storage can be isolated and set aside for different use cases. DataStream presents compute, network and storage as a disaggregated resource pool. That allows capacity and performance to be scaled higher or lower independently.
"Our story revolves around scale-out (capabilities) at the rack level. We give you the ability to run multi-tenant storage that scales at a granular level, as dictated by your workloads," said Sanjay Jagad, Coho Data senior manager of product marketing.
Coho Data hybrid and all-flash nodes package DataStream software on white box Supermicro servers. Coho refers to DataStream as rack-converged data center infrastructure.
Scalable multi-tenant storage for Docker, Hadoop cluster consolidation
DataStream 2.8 expands support for Google Kubernetes-orchestrated Docker containers that run directly on Coho Data storage. For persistence, DataStream arrays mount a subset of the storage system namespace to keep data intact as a container migrates between nodes.
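In Kubernetes terms, that persistence model resembles a volume that outlives any single container: a pod's claim binds to storage the array exposes, so data follows the workload as it is rescheduled. A minimal sketch of the pattern, using generic Kubernetes objects over iSCSI; the names, capacity and addresses below are illustrative placeholders, not Coho-specific values:

```yaml
# Hypothetical PersistentVolume backed by a slice of the array's namespace.
apiVersion: v1
kind: PersistentVolume
metadata:
  name: datastream-pv                  # illustrative name
spec:
  capacity:
    storage: 100Gi
  accessModes:
    - ReadWriteOnce
  iscsi:                               # DataStream exposes iSCSI block storage
    targetPortal: 10.0.0.10:3260       # placeholder portal address
    iqn: iqn.2016-01.com.example:lun0  # placeholder target name
    lun: 0
    fsType: ext4
---
# Claim that a pod references; the data survives pod rescheduling.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: datastream-pvc
spec:
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: 100Gi
```

A pod then mounts the claim by name, and Kubernetes reattaches the same backing volume wherever the pod lands.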
One capability Coho Data lacks is predictive analytics for container storage consumption; the company said the feature may be added to its roadmap. For now, Coho Data profiles workloads, including containers, over time and places performance-intensive applications in DataStream's NVMe flash tier.
Jim Ensell, Coho Data chief marketing officer, said QoS gives enterprises the flexibility to consume multi-tenant storage as a managed service for containers and Hadoop analytics.
"We provide the capability to isolate resources and guarantee service at both the container and virtual machine level, not through a hypervisor," Ensell said.
"You don't need a dedicated Hadoop cluster anymore. You can now run Hadoop closer to where the data resides and access it through HDFS."
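Consolidating in that way generally means pointing Hadoop clients at the storage system's HDFS endpoint rather than at a dedicated cluster's NameNode. A hypothetical client-side core-site.xml fragment showing the pattern; the host name and port are placeholders, not a documented Coho Data endpoint:

```xml
<!-- Hypothetical Hadoop client configuration: fs.defaultFS directs
     standard Hadoop tools at an external HDFS-compatible endpoint
     instead of a dedicated cluster's NameNode. -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://datastream.example.com:8020</value>
  </property>
</configuration>
```

With that in place, ordinary commands such as `hadoop fs -ls /` would operate against the storage system's namespace.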