
How key factors in storage performance and cost spur adoption

It may take years, but a storage product's performance, cost and the problems it solves can converge to make its sales take off. See how a storage technology's tipping point is reached.

A tipping point happens when a concept, service or product takes off. For tech products, including storage, we see this happen when there's a sharp increase in sales, forming a hockey stick growth curve. Sometimes that hockey stick takes years to develop, such as with NAND flash SSDs. Other times, it happens overnight, as with smartphones when the iPhone came out in 2007.

Malcolm Gladwell, writing 20 years ago in The Tipping Point: How Little Things Can Make a Big Difference, launched the modern discussion of tipping points. His concepts and assertions have drawn plenty of discussion and controversy, but they have also been shown to have merit. Applying the book's primary concepts to storage provides a potential way to measure the tipping points in storage performance factors, cost and adoption.

Who makes a tipping point happen?

Gladwell cited three principles behind a tipping point. The first is the Law of the Few. Economists refer to this as the 80/20 rule or Pareto principle, where 20% of the people do 80% of the work. It's the work of the 20% that leads to tipping points. And the 20% is made up of a few specific types of people, whom Gladwell referred to as connectors, mavens and salespeople.

Connectors, like social media influencers with millions of followers, have an outsized impact on how others accept products and services. A tech world analogy would be when a major bank adopts a new storage technology and gains a competitive advantage, making other financial institutions more likely to quickly deploy that technology to stay competitive.

Mavens collect massive amounts of product, vendor and market data. In the technology world, they would be analysts, such as those at Gartner, Enterprise Strategy Group and IDC, and independents like me. At times, we can have an outsized influence on the market.

Salespeople are self-explanatory, but the term must be qualified. Good salespeople have impressive persuasion, negotiation and selling skills that can accelerate market tipping points. Mediocre and bad salespeople don't. That 80/20 rule is frequently applied to salespeople -- 20% of the salespeople make 80% of the sales.

What makes a tipping point happen?

The second principle is the Stickiness Factor, which refers to the concept, service or product having longevity. Storage is a good example and, more specifically, storage systems. Once data is stored on a storage system, moving it to a different storage system is not a trivial matter. With storage-to-storage, cloud storage and tape storage data migration, moving data is manual, labor-intensive and costly. That makes it sticky. In the case of cloud storage, its high egress fees can make data even stickier.

The third principle behind tipping points is the Power of Context, which refers to the conditions, circumstances, times and places in which the concept, service or product occurs. Context can have a much bigger influence than the Law of the Few or the Stickiness Factor. A high-speed 15K RPM HDD with high capacity released 20 years ago would be far more successful than if that same drive were released today with competition from flash SSDs. Context matters. It's the 800-pound gorilla in the tipping point room for storage technologies.

The essence of context

Context has a huge influence on determining when or if a tipping point will happen. For instance, an emerging storage technology's ability to solve a market problem could determine its tipping point. Other contextual factors in this scenario include:

-- the cost of the problem to the customer;
-- how the problem is currently solved;
-- the cost of solving the problem today; and
-- the cost of solving the problem with the newer storage technology.

Remember, a solution's cost is different from the cost of the problem itself. For instance, the problem of downtime has a cost associated with it that ranges from thousands of dollars per hour to millions, or even billions, of dollars. There is a separate cost to mitigate or eliminate that problem. They are different costs.

If the cost of solving the problem exceeds the cost of the problem itself, the context makes this technology unlikely to reach a tipping point where widespread adoption happens.
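The comparison above can be sketched as a simple cost check. This is a minimal illustration, not a real pricing model; the function name and all dollar figures are hypothetical placeholders.

```python
# Hypothetical tipping-point check: adoption context is favorable only when
# the new storage technology costs less than the problem it solves and less
# than the incumbent solution. All numbers below are illustrative.

def likely_to_tip(problem_cost_per_year: float,
                  current_solution_cost_per_year: float,
                  new_solution_cost_per_year: float) -> bool:
    """Return True when the new technology is cheaper than both the
    unsolved problem and the way the problem is solved today."""
    return (new_solution_cost_per_year < problem_cost_per_year
            and new_solution_cost_per_year < current_solution_cost_per_year)

# Example: downtime costing $50,000 per hour, 10 hours per year.
problem_cost = 50_000 * 10   # $500,000/year if the problem goes unsolved
current_cost = 400_000       # incumbent mitigation approach
new_cost = 250_000           # emerging storage technology

print(likely_to_tip(problem_cost, current_cost, new_cost))  # True
```

Note that the check keeps the cost of the problem and the cost of the solution as separate inputs, matching the distinction drawn above.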

Storage performance factors, capacity and cost converge

The rise of commercial flash SSDs sheds light on this topic. SunDisk, which later became SanDisk and is now part of Western Digital, shipped the first commercial flash SSD in 1991. That was nearly three decades ago, and yet SSDs didn't reach a tipping point until roughly 25 years later, around 2015.

It's here that the relationship between storage performance factors, capacity and cost becomes clearer. Flash SSDs have had and continue to have three to four orders of magnitude greater transactional performance than high-performance HDDs. But the cost per gigabyte had to come closer to that of high-performance HDDs for widespread adoption to occur. The significant increase in NAND flash capacities, in both planar and 3D fabrications, made this possible.

[Figure: SSD latency vs. HDD. SSDs significantly reduced storage media latency compared with HDDs.]

Note that the technologies that enabled the massive increase in capacities also reduced storage performance in terms of latency, IOPS and throughput. Performance is generally an important factor; however, the value of that performance links back to the cost context of the problem being solved. For example, high-frequency trading puts an extremely high value on keeping performance latency low. For companies in this market, latency translates into millions, and even billions, of dollars lost. They are more likely to adopt higher cost-per-capacity storage technology than a laptop user or a website operator.

As the flash SSD cost per capacity declined, more IT pros began to see SSDs as a good solution to various problems. The cost per capacity didn't have to be lower than or even equal to that of high-performance HDDs to gain widespread adoption, nor did it happen all at once. As the flash SSD cost-per-capacity curve dropped toward HDDs', the perceived value of storage performance increased. Consider that high-performance HDDs are disappearing from the market as a direct result. This pattern is playing out again with NVMe SSDs vs. SATA SSDs.
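The convergence described above can be expressed as a price-premium threshold: a buyer adopts SSDs once the SSD-to-HDD cost-per-gigabyte ratio falls below the multiple they place on the performance gain. The sketch below is illustrative only; the function name and prices are assumptions, not historical market data.

```python
# Illustrative sketch of cost-per-gigabyte convergence driving adoption.
# Different buyers tolerate different price premiums, so the same prices
# can tip one market segment while leaving another unmoved.

def adopts_ssd(ssd_cost_per_gb: float, hdd_cost_per_gb: float,
               performance_value_multiple: float) -> bool:
    """A buyer adopts SSDs once the price premium per gigabyte is no
    larger than the value they place on the performance gain."""
    return ssd_cost_per_gb / hdd_cost_per_gb <= performance_value_multiple

# Hypothetical prices: SSD at $0.10/GB, HDD at $0.03/GB (a 3.3x premium).
print(adopts_ssd(0.10, 0.03, 10.0))  # high-frequency trader: True
print(adopts_ssd(0.10, 0.03, 2.0))   # laptop user: False
```

As the SSD price falls, the premium shrinks and progressively more price-sensitive buyers cross their threshold, which is the widening breadth of problems solved discussed below.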

The importance of contextual value

There is clearly a contextual tipping point for storage where performance is a compelling factor for solving specific problems. As the cost declines, the breadth of problems solved increases. Widespread adoption demands the breadth of the problems solved be large and the contextual value of solving those problems be much greater than the contextual cost of doing so.

This doesn't mean connectors, mavens and good salespeople aren't important or that the stickiness value isn't also important. They are indeed important, but the contextual value is essential.

When new storage technologies, such as persistent memory, storage class memory and computational storage drives, come to market, remember these tipping point rules for widespread adoption.
