cache (computing)

Definition: Learn what a cache is and how it’s used to shorten data access times, reduce latency and improve input/output (I/O).

A cache (pronounced CASH) is a place to store something temporarily in a computing environment.

In computing, active data is often cached to shorten data access times, reduce latency and improve input/output (I/O). Because almost all application workloads depend on I/O operations, caching is used to improve application performance. For example, when you visit a website’s home page, the files that your browser requests are stored on your computing device’s disk drive in the browser’s cache. If you click “back” and return to that home page, your browser can retrieve most of the files it needs from the cache instead of requesting that they all be sent again. This approach is called read caching. Because your browser saves time getting the files it needs, the website’s home page loads faster.
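The read-caching pattern described above can be sketched in a few lines of Python. This is a minimal illustration, not how any particular browser is implemented; the `slow_fetch` function is a hypothetical stand-in for a real network request.

```python
def make_cached_fetch(fetch):
    """Wrap a fetch function with a read cache: each URL is fetched once,
    then served from the local cache on repeat requests."""
    cache = {}

    def cached_fetch(url):
        if url not in cache:       # cache miss: go to the network and store the result
            cache[url] = fetch(url)
        return cache[url]          # cache hit: served locally, no network round trip

    return cached_fetch


calls = []

def slow_fetch(url):
    calls.append(url)              # hypothetical stand-in for a real network request
    return f"contents of {url}"

fetch = make_cached_fetch(slow_fetch)
fetch("https://example.com/")      # first visit: goes to the "network"
fetch("https://example.com/")      # clicking "back": served from the cache
print(len(calls))                  # the underlying fetch ran only once
```

The second call returns the same content without touching the network, which is exactly the latency saving the browser example describes.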

A cache's storage capacity is determined by the application's developer, and cache algorithms provide instructions for how the cache should be maintained. For example, the Least Frequently Used (LFU) algorithm uses a counter to keep track of how often an entry is accessed; the entry with the lowest count is removed first. The Least Recently Used (LRU) algorithm keeps recently used items near the top of the cache; when the cache limit has been reached, the items that have been accessed least recently are removed. The Most Recently Used (MRU) algorithm removes the most recently used items first; this approach is good for situations in which older items are more likely to be accessed.
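The LRU policy described above can be sketched with Python's `collections.OrderedDict`, which keeps insertion order and lets us move a touched entry to the end. This is a minimal illustration of the eviction rule, not a production cache; the class name and capacity are arbitrary choices for the example.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache: when full, the least recently used entry is evicted."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()   # most recently used items sit at the end

    def get(self, key):
        if key not in self.entries:
            return None                # cache miss
        self.entries.move_to_end(key)  # mark as recently used
        return self.entries[key]

    def put(self, key, value):
        if key in self.entries:
            self.entries.move_to_end(key)
        self.entries[key] = value
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)  # evict the least recently used entry


cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")          # touching "a" makes it the most recently used
cache.put("c", 3)       # cache is full, so "b" (least recently used) is evicted
print(cache.get("b"))   # None: "b" was evicted
print(cache.get("a"))   # 1: "a" survived because it was used recently
```

An LFU variant would keep a hit counter per entry instead of reordering, and an MRU variant would call `popitem(last=True)` to evict from the recently-used end.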

The three main approaches to caching are write-through, write-around and write-back. With write-around cache, write operations bypass the cache and go directly to storage; only data that has been read is written into the cache. With write-through cache, data is written into the cache and into the corresponding main memory location at the same time. With write-back cache, data is written into the cache every time a change occurs, but it is written into the corresponding location in main memory only at specified intervals or under certain conditions.
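The difference between write-through and write-back can be made concrete with a small sketch. This is an assumption-laden illustration (the class names, the dict-backed store and the explicit `flush()` method are all inventions for the example), but it shows the essential trade-off: write-through keeps the backing store current on every write, while write-back defers that cost.

```python
class WriteThroughCache:
    """Write-through: every write goes to the cache AND the backing store at once."""

    def __init__(self, backing_store):
        self.cache = {}
        self.backing_store = backing_store

    def write(self, key, value):
        self.cache[key] = value
        self.backing_store[key] = value   # main storage is always up to date


class WriteBackCache:
    """Write-back: writes land in the cache only; dirty entries are flushed later."""

    def __init__(self, backing_store):
        self.cache = {}
        self.dirty = set()
        self.backing_store = backing_store

    def write(self, key, value):
        self.cache[key] = value
        self.dirty.add(key)               # backing store is stale until flush()

    def flush(self):
        """Write dirty entries back, e.g. at an interval or on eviction."""
        for key in self.dirty:
            self.backing_store[key] = self.cache[key]
        self.dirty.clear()


disk = {}
wt = WriteThroughCache(disk)
wt.write("page", "v1")
print(disk)          # write-through: the store already holds the new value

disk2 = {}
wb = WriteBackCache(disk2)
wb.write("page", "v1")
print(disk2)         # write-back: the store is still empty until a flush
wb.flush()
print(disk2)         # now the deferred write has reached the store
```

Write-back gives faster writes at the cost of a window in which the cache and main storage disagree, which is why real systems pair it with conditions (timers, eviction, shutdown) that force a flush.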

Popular uses for cache include:

Cache server - a dedicated network server, or service acting as a server, that saves Web pages or other Internet content locally.

Disk cache - holds data that has recently been read and perhaps adjacent data areas that are likely to be accessed next.

Cache memory - random access memory (RAM) that a computer microprocessor can access more quickly than it can access regular RAM.

Flash cache - temporary storage of data on NAND flash memory chips to enable requests for data to be fulfilled with greater speed.

Also see: buffer, which, like a cache, is a temporary place for data, but with the primary purpose of coordinating communication between programs or hardware rather than improving I/O speed.

This was first published in August 2014
