Most all-flash arrays on the market rely on compression, deduplication or a combination of the two to reduce data and improve flash performance. Although these are the best-known approaches, two other data reduction strategies appear in some products.
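To make the deduplication idea concrete, here is a minimal sketch of fixed-block deduplication: data is split into blocks, each block is fingerprinted with a hash, and duplicate blocks are stored only once. The function name, block size and use of SHA-256 are illustrative assumptions, not how any particular array implements it.

```python
import hashlib

def dedupe_blocks(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks and store each unique block once.

    Returns a 'recipe' (the ordered list of block hashes needed to rebuild
    the data) and a store mapping hash -> unique block contents.
    """
    store = {}
    recipe = []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # keep only the first physical copy
        recipe.append(digest)
    return recipe, store

# Three logical 4 KB blocks, two of them identical:
data = b"A" * 4096 + b"A" * 4096 + b"B" * 4096
recipe, store = dedupe_blocks(data)
# recipe holds 3 hashes, but only 2 physical blocks are stored
```

Real arrays do this inline or post-process, often with variable block sizes, but the space saving comes from the same pointer-to-unique-block structure shown here.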
Pattern removal. This data reduction method works at the binary level, removing frequently occurring patterns of zeros and ones -- including long strings of zeros that may indicate empty space or null data. While some observers would classify this technique as a type of compression, pattern removal engines differ in their granularity. Some of the more aggressive data reduction engines use 8-bit granularity for pattern removal, which eliminates more redundant data than is possible at a coarser granularity.
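A minimal sketch of the zero-run case described above: runs of null bytes are replaced with a short count token so that empty space costs almost nothing to store. This is a simplified illustration at byte (8-bit) granularity; the function names and token format are assumptions for the example, not a vendor's actual encoding.

```python
def encode_zero_runs(data: bytes):
    """Tokenise data into ('lit', bytes) and ('zeros', count) tokens so a
    long run of null bytes is stored as a count instead of the run itself."""
    tokens, i = [], 0
    while i < len(data):
        j = i
        if data[i] == 0:
            while j < len(data) and data[j] == 0:
                j += 1
            tokens.append(('zeros', j - i))
        else:
            while j < len(data) and data[j] != 0:
                j += 1
            tokens.append(('lit', data[i:j]))
        i = j
    return tokens

def decode_zero_runs(tokens):
    """Rebuild the original bytes from the token stream."""
    out = bytearray()
    for kind, val in tokens:
        out.extend(b'\x00' * val if kind == 'zeros' else val)
    return bytes(out)

# 4 KB of null padding collapses to a single ('zeros', 4096) token
payload = b"header" + b"\x00" * 4096 + b"data"
tokens = encode_zero_runs(payload)
```

Production engines generalize this to other frequently occurring bit patterns, not just zeros, but the replace-pattern-with-reference mechanism is the same.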
Instant cloning. This technology goes by many different names, but the basic idea is that data is often used for more than one purpose. For instance, a database might be used by a production application while a copy of the database is used on a development and test server. From a capacity-reduction standpoint, it makes no sense to create multiple copies of the same database. An instant cloning feature allows copies of data to be created through the use of pointers or snapshots, rather than through physical data cloning.
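The pointer-based cloning described above can be sketched as follows: a volume is just a list of pointers into a shared block store, so a clone copies only metadata, and a later write to the clone allocates a fresh block (copy-on-write) without disturbing the original. The class and method names here are hypothetical, chosen only to illustrate the mechanism.

```python
class BlockStore:
    """Volumes are lists of pointers into a shared pool of physical blocks."""

    def __init__(self):
        self.blocks = []    # shared physical blocks
        self.volumes = {}   # volume name -> list of block indices

    def create(self, name, data_blocks):
        """Write a new volume, allocating one physical block per chunk."""
        self.volumes[name] = []
        for block in data_blocks:
            self.blocks.append(block)
            self.volumes[name].append(len(self.blocks) - 1)

    def clone(self, src, dst):
        """'Instant' clone: duplicate the pointer list, not the data."""
        self.volumes[dst] = list(self.volumes[src])

    def write(self, name, block_no, data):
        """Copy-on-write: new data gets a fresh physical block, so the
        other volume's view of the old block is unchanged."""
        self.blocks.append(data)
        self.volumes[name][block_no] = len(self.blocks) - 1

# Clone a two-block "production" volume for dev/test use
store = BlockStore()
store.create("prod", [b"db-page-1", b"db-page-2"])
store.clone("prod", "dev")          # no new physical blocks allocated
store.write("dev", 0, b"patched")   # only now is one extra block used
```

This is why the dev/test copy in the database example consumes almost no additional capacity until it actually diverges from production.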
Array vendors have been working to strike a balance between data reduction strategies and ways to improve flash performance. Early on, the goal for flash vendors was to reduce data at all costs. Flash storage was very expensive, so there was a great deal of pressure to minimize the data footprint. Unfortunately, some data reduction algorithms were very CPU- and memory-intensive, which decreased performance.
Some modern algorithms attempt to decrease memory and CPU consumption by examining data first to estimate how much benefit a data reduction algorithm would actually deliver. In the past, arrays would usually attempt to reduce the data footprint by globally applying data reduction strategies, even to nonredundant data that could not be reduced. The end result of these modern techniques is comparable data reduction rates with decreased CPU and memory usage.
Related Q&A from Brien Posey