Essential Guide

Best data storage products 2015: Products of the Year


Qumulo Inc. Qumulo Core

The Qumulo Core Scalable File System uses a "flash-first" hybrid design to provide real-time data analytics.

Gold winner in the Storage magazine/SearchStorage.com 2015 Products of the Year Storage Systems: Disk and Hybrid Systems category.

Startup Qumulo Inc. combines data-aware scale-out NAS with built-in analytics for traditional file systems and object storage. Designed for petabyte-scale deployments, Qumulo Core intelligently manages billions of data objects. Engineered by the inventors of the Isilon scale-out NAS (now owned by EMC), Qumulo Core is a software-only product that runs atop a Linux-based hybrid flash storage system on pre-validated commodity hardware, dedicated servers or virtual machines.

The Qumulo Scalable File System curates, manages and stores data, with the Core database system files served from flash storage. Core's analytics are embedded within the data and storage layers, enabling users to identify the data that is most valuable, see where it is stored and how it is being accessed, as well as orchestrate archiving, backup or deletion of files.

"Scale-out NAS meets data management, analytics and storage resource management. Very innovative," is how one judge described Qumulo Core.

Another judge praised Qumulo Core for its "great value and innovative data utilization awareness that scales to (support) billions" of objects.

A Qumulo Core cluster scales from four nodes to more than 1,000 nodes, creating a single file system and single global namespace. Each node runs Qumulo Core and participates in the cluster as a fully symmetric peer, managing read and write requests and coordinating transactions with other nodes. Nodes get added non-disruptively for linear scaling of storage capacity and storage performance.


Each Qumulo node is a modular building block comprising processing power, memory, networking, flash and spinning disk, along with two 10 Gigabit Ethernet ports. Core supports NFS and SMB file storage and features a programmable REST-based object management API.
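The article does not document the API itself, so the short Python sketch below only illustrates the general shape of driving a storage cluster's object-management and analytics interface over REST. The cluster address, endpoint path, query parameters and bearer-token authentication are illustrative assumptions, not Qumulo's published interface.

    # Minimal sketch of querying a REST-based storage analytics endpoint.
    # Endpoint path, parameters and auth scheme are assumptions for illustration.
    import requests

    CLUSTER = "https://qumulo.example.com:8000"   # hypothetical cluster address
    TOKEN = "session-token"                       # obtained from a prior login call

    def hottest_directories(path="/", limit=5):
        """Ask the cluster which directories under 'path' see the most activity."""
        resp = requests.get(
            f"{CLUSTER}/v1/analytics/activity",   # hypothetical endpoint
            headers={"Authorization": f"Bearer {TOKEN}"},
            params={"path": path, "limit": limit},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()

    if __name__ == "__main__":
        for entry in hottest_directories("/projects"):
            print(entry)

A script like this is the kind of thing a programmable object management API makes possible: pulling usage data out of the file system and feeding it into archiving, backup or chargeback workflows.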

Qumulo sells Core under a software-as-a-service model, priced as an annual subscription. Entry pricing for a four-node, 100 TB raw capacity Qumulo QC24 hybrid storage cluster begins at $50,000.
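At that entry configuration, the subscription works out to roughly $500 per raw TB, or about $12,500 per node, per year, before any capacity added as the cluster scales out.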

Next Steps

Qumulo uses data-aware storage

Qumulo among intelligent storage vendors

Qumulo touts real-time data analytics

This was last published in February 2016


Join the conversation

6 comments


What innovations excite you most about the Qumulo Core Scalable File System?

Interesting. It is something I will probably never run across. Most companies I have worked for are nowhere near that volume of data. It would be many years before they came close to 1 petabyte. My current company has an i-Series with our historical data going back to 1999, and we are only at 41% capacity of our 2 TB of disk.

Hi ToddN2000: The use of scale-out storage is on the rise. I think Qumulo's built-in data analytics is a key differentiator, especially in the era of big data.

I'd be curious what percentage of today's businesses would be considered big data. Sounds like it may be the right direction if the trend continues that way. This might just be a small portion of the market, so they can adjust things as they grow, since it does not affect the mainstream. It's kind of scary that I have twice the storage capacity on my PC as my company does.

I think big data is still thought of mainly as complex data sets that require a high degree of fast analytics. If you aren't running Hadoop clusters, a large SQL farm or OLTP systems, you may not have encountered big data (and scale-out storage) yet. Interesting... what kind of storage are you running?

Nothing like you mentioned. Just the standard default configuration for an IBM i-Series.
