read cache definition

Contributor(s): Carol Sliwa

A read cache is a computer storage component that temporarily keeps a copy of data from a slower permanent storage location so that future requests for that data can be fulfilled more quickly.

A read cache typically retains copies of the data in fast media such as dynamic random access memory (DRAM) or flash. The main distinguishing characteristic of a read cache is the mechanism by which data populates the cache: data enters the cache only after it has been retrieved, or read, at least once from its permanent storage location on a hard disk drive (HDD) or solid-state drive (SSD). An algorithm tracks the data reads and determines which data is deposited in the read cache.
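As a rough illustration of that read-through population mechanism, the minimal Python sketch below copies a block into the cache only after it has been read from a slower backing store. The backing_store mapping, the block IDs and the small capacity are illustrative assumptions, not any particular vendor's implementation.

```python
from collections import OrderedDict

class ReadCache:
    """Minimal read-through cache: a block is copied into fast memory only
    after it has been read at least once from the slower backing store."""

    def __init__(self, backing_store, capacity=4):
        self.backing_store = backing_store  # stands in for the slower HDD/SSD
        self.capacity = capacity            # maximum number of cached blocks
        self.cache = OrderedDict()          # tracks recency for simple eviction

    def read(self, block_id):
        if block_id in self.cache:
            self.cache.move_to_end(block_id)   # cache hit: served from fast memory
            return self.cache[block_id]
        data = self.backing_store[block_id]    # cache miss: read from slow storage
        self.cache[block_id] = data            # deposit a copy for future reads
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)     # evict the least recently used block
        return data


# Example: the first read of "block-1" is a miss, the second is a hit.
hdd = {"block-1": b"alpha", "block-2": b"beta"}
cache = ReadCache(hdd, capacity=2)
cache.read("block-1")   # fetched from hdd, then cached
cache.read("block-1")   # served from the cache copy
```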

As the cache receives more data, or warms up, more data requests can be accelerated. In a typical scenario, the algorithm orchestrates the placement of copies of the most frequently accessed data into the read cache. A data request that is fulfilled from the read cache is faster than a request fulfilled from the original storage location.
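To make the "most frequently accessed" idea concrete, here is a hedged sketch of one possible frequency-based policy: read counts are tracked per block, a block is admitted to the cache only after it has been read a few times, and the least frequently read cached block is evicted when space runs out. The class name, the admit_after threshold and the eviction rule are assumptions for illustration, not a specific product's algorithm.

```python
from collections import Counter

class FrequencyReadCache:
    """Sketch of a frequency-based read cache: blocks are admitted only after
    they have been read admit_after times, and the least frequently read
    cached block is evicted when the cache is full."""

    def __init__(self, backing_store, capacity=4, admit_after=2):
        self.backing_store = backing_store
        self.capacity = capacity
        self.admit_after = admit_after     # reads required before a block is cached
        self.read_counts = Counter()       # per-block read tally
        self.cache = {}

    def read(self, block_id):
        self.read_counts[block_id] += 1
        if block_id in self.cache:
            return self.cache[block_id]            # hit: served from fast memory
        data = self.backing_store[block_id]        # miss: read from slow storage
        if self.read_counts[block_id] >= self.admit_after:
            if len(self.cache) >= self.capacity:
                coldest = min(self.cache, key=self.read_counts.__getitem__)
                del self.cache[coldest]            # evict the least-read block
            self.cache[block_id] = data            # cache the warming block
        return data
```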

Read caches differ in the caching algorithm they use. A read cache can also span multiple layers of storage technology, such as DRAM and SSD paired to accelerate data requests to an HDD-based array. Another variable is where the cache is implemented, such as on the server side or the storage side.
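The sketch below, under the same illustrative assumptions as the earlier examples, shows how such a layered cache might check a small DRAM tier first, then an SSD tier, before falling back to the HDD-based store.

```python
from collections import OrderedDict

class TieredReadCache:
    """Sketch of a two-layer read cache: a small DRAM tier in front of a
    larger SSD tier, both sitting in front of an HDD-backed store."""

    def __init__(self, backing_store, dram_capacity=2, ssd_capacity=8):
        self.backing_store = backing_store  # stands in for the HDD-based array
        self.dram = OrderedDict()           # smallest, fastest tier
        self.ssd = OrderedDict()            # larger tier, still far faster than HDD
        self.dram_capacity = dram_capacity
        self.ssd_capacity = ssd_capacity

    def read(self, block_id):
        if block_id in self.dram:
            return self.dram[block_id]                 # fastest path
        if block_id in self.ssd:
            data = self.ssd.pop(block_id)              # promote a warm block to DRAM
            self._put(self.dram, self.dram_capacity, block_id, data)
            return data
        data = self.backing_store[block_id]            # slowest path: the HDD store
        self._put(self.ssd, self.ssd_capacity, block_id, data)
        return data

    def _put(self, tier, capacity, block_id, data):
        tier[block_id] = data
        if len(tier) > capacity:
            tier.popitem(last=False)                   # drop the oldest entry in the tier
```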

This was first published in December 2013
