Editor's Note: This column on AI's use in storage management is one of the last pieces longtime Storage magazine and TechTarget contributor Jon Toigo wrote for us before he passed away of natural causes in February 2019 at age 59. As CEO and managing principal of Toigo Partners International and chairman of the Data Management Institute, Jon presented at conferences around the world, including TechTarget's own Storage Decisions conferences. Jon wrote thousands of articles, authored 11 books, and regularly discussed storage and data management technologies and issues on his blog, DrunkenData.com. You can explore all the articles he authored for TechTarget on his contributor page.
While judging TechTarget's storage products of the year competition late last year, it hit me that nearly every vendor touted some sort of component in their products aimed at using AI for storage management.
Storage array vendors said they had machine learning, cognitive computing or AI functionality built into their array controllers -- that is, into software running as applications on the servers that control their flash or disk media, which may or may not actually form an array. The same went for most disaster recovery and storage management products.
After reading 20 or so of the entry forms, I started to feel like I was in an episode of The Oprah Winfrey Show: "Everybody gets an AI!"
In 2019, AI will move from the lexicon of terms with actual technical meaning into the realm of meaningless market speak. That's unfortunate because we're getting close to a time when machine intelligence will become critical to managing storage infrastructure and the data that resides in it.
Beyond automating routine tasks
Many of the features described as AI in products probably aren't AI at all. An algorithm that looks at the date-last-modified metadata tag on files and then moves those exceeding a certain measure of inactivity from one part of a storage infrastructure to another probably shouldn't be categorized as artificial intelligence. Yes, it performs a task that once required human intelligence, but it doesn't make decisions using independent reasoning or logic. Like any program or policy, it routinely applies predefined logic to every case.
The above typifies the preponderance of so-called AI for storage: machine functionality that uses human logic or reasoning as a model, not as the end goal of system design. Automating routine tasks with well-defined parameters and predictable outcomes is the least lofty objective I can think of for AI today.
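To make the point concrete, the kind of "AI" tiering feature described above amounts to a few lines of conventional code. This is a minimal sketch, not any vendor's actual implementation; the directory names and the 180-day threshold are hypothetical:

```python
import shutil
import time
from pathlib import Path

# Hypothetical policy: files untouched for 180 days are "inactive."
INACTIVE_DAYS = 180


def tier_inactive_files(hot_dir, cold_dir, now=None):
    """Move files whose last-modified time exceeds the threshold.

    This is the entirety of the 'intelligence': a fixed, predefined
    rule applied to every file, with no learning or independent
    reasoning involved.
    """
    now = time.time() if now is None else now
    cutoff = now - INACTIVE_DAYS * 86400
    Path(cold_dir).mkdir(parents=True, exist_ok=True)
    moved = []
    for f in Path(hot_dir).iterdir():
        # Compare the date-last-modified metadata against the cutoff.
        if f.is_file() and f.stat().st_mtime < cutoff:
            shutil.move(str(f), str(Path(cold_dir) / f.name))
            moved.append(f.name)
    return moved
```

However useful, a policy engine like this only emulates what a storage administrator would do by hand; it never decides anything the rule didn't already decide for it.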
Since John McCarthy introduced the concept of AI in 1956, real AI has been thought to mean the construction of machines that think and learn as humans do -- so-called strong AI. Or it was thought to mean machines that work without necessarily emulating the full spectrum of human thought processes -- so-called weak AI.
The third category, illustrated above, strives to automate tasks that were once performed by humans. The more routine the task, the less intelligence it requires and the more amenable it is to automation.
AI for storage management
AI brings to mind many images, from HAL 9000 to C-3PO to Skynet, all the stuff of science fiction.
In the real world, Amazon focuses on machine learning, pattern recognition and problem solving as a way to improve the customer experience and optimize the speed and quality of order fulfillment and product delivery. Amazon has joined Apple, DeepMind, Google, IBM and Microsoft to create the Partnership on AI. While aspirational, the emphasis of this collaboration appears to be augmented intelligence rather than artificial intelligence -- less emulating the human brain and more performing routine tasks at scale.
One mostly unreported benefit of the Partnership on AI is that it's an effort to standardize approaches and define best practices for the industry as a whole. However, it mainly applies to AI applications for business; I see no comparable efforts in the storage industry.
In the realm of storage, where AI is a buzzword mostly used to market the latest kit or software, vendors are using their proprietary algorithms, protocols and models to keep up with the cool kids.
IBM recently applied its Watson technology to enhance its storage management software and provision its cloud storage in an elastic fashion. With this effort, IBM has set something of a gold standard. But Hewlett Packard Enterprise (HPE), with its deep learning initiative, and Hitachi, with its H technology, are starting to reap the benefits of their research. Other vendors are deploying AI for storage management technologies they hope will be alternatives to the big guys that have deep pockets to fund R&D or acquisitions.
An array of proprietary products
The other goal of AI development for storage is to create proprietary functionality that requires customers to use a particular vendor's products exclusively to gain value. For example, Dell EMC PowerMax boasts a built-in machine learning engine that the vendor says is designed to optimize performance by analyzing 40 million data sets per day, making 6 billion decisions per array. However, this functionality doesn't extend to non-Dell EMC arrays or even to a Dell EMC kit that isn't part of the PowerMax family.
HPE takes a similar approach with its 3PAR array, which uses HPE InfoSight's AI platform to drive performance, availability and system-level efficiencies. Like Dell EMC's PowerMax, HPE InfoSight is proprietary and only works with HPE hardware, including its Nimble product family.
Other products providing AI support at the array level include IBM with its AI-powered Storage Insights management software and Infinidat with what it describes as a neural caching architecture. Pure Storage has its AIRI -- AI-Ready Infrastructure -- technology. Quantum offers the AI-enabled Xcellis Scale-out NAS. And Vexata has its VelocityAI, which uses Nvidia's reference architecture for AI workloads. Regardless of how they improve on system-level performance, none of these tools provide heterogeneous infrastructure support.
One question yet to be answered is whether any of these AI for storage management technologies will impede storage management at the overall infrastructure level. Just as innovations in the early 2000s, such as thin provisioning and deduplication, impaired the efficient management of storage infrastructure by siloing arrays from one another, I worry that the sudden AI explosion will create similar problems.
That's one more issue for IT planners to puzzle over as they struggle to integrate multi-cloud and on-premises infrastructure.