One day, organizations will be using all-solid-state data centers, according to Steve Duplessie, founder and senior analyst at Enterprise Strategy Group. But until then, flash and spinning disk will coexist in storage systems. In this Storage Decisions video, Duplessie discusses the various options for deploying solid-state in a storage environment, particularly as a storage cache.
Duplessie noted that organizations should use solid-state storage to lower latency on specific, latency-sensitive tasks, such as database indexes, and not for less-intensive workloads, such as supporting an Exchange server or handling video. He also said that when choosing solid-state storage, hardware won't be the most critical factor going forward.
"With all of these vendors at the end of the day, the true value will become who has the best caching algorithms. That's all it's going to be about; it's all about software, because the hardware game is going to be over. It's going to be Samsung, Toshiba … Micron -- those are the guys who make these chips," Duplessie said.
He said that using solid-state as a network cache is the "most interesting" option, because it can serve as a cache for all of the storage behind it.
"You can extend the useful life of the assets you already have ad infinitum, and secondarily, as these things evolve, they will include and incorporate data management functions, so you can migrate in real time. So being able to take or get rid of or retire a system becomes significantly easier, significantly less painful and, all the while, make everything behind it go faster," Duplessie said, and later added, "That to me is a really interesting means of adding a lot of performance really inexpensively on all of that existing file infrastructure that you have behind it.
"We're going to have disks for a long time; it's not all going away, but ultimately, we're going to be in a solid-state data center," Duplessie said.