cache (computing)

A cache (pronounced CASH) is a place to store something temporarily in a computing environment.

In computing, active data is often cached to shorten data access times, reduce latency and improve input/output (I/O). Because almost all application workloads depend on I/O operations, caching is used to improve application performance.

For example, Web browsers such as Internet Explorer, Firefox, Safari and Chrome use a browser cache to improve performance for frequently accessed webpages. When you visit a webpage, the files your browser requests are stored on your computer's local storage in the browser's cache. If you click "back" and return to that page, your browser can retrieve most of the files it needs from the cache instead of requesting that they all be sent again. This approach is known as read caching: it is much faster for the browser to read data from its local cache than to re-download the files from the web server.
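In code terms, a read cache is simply a lookup that is tried before the slower data source. Here is a minimal Python sketch, where fetch_from_server() is a hypothetical stand-in for a real network request:

  # A minimal read cache: serve repeat requests from memory instead of the network.
  cache = {}

  def fetch_from_server(url):
      # Hypothetical stand-in for an actual HTTP request.
      return "<html>contents of " + url + "</html>"

  def get_page(url):
      if url in cache:                  # cache hit: no network round trip
          return cache[url]
      page = fetch_from_server(url)     # cache miss: fetch the page once
      cache[url] = page                 # remember it for next time
      return page

  get_page("https://example.com")   # slow path: fetched from the server
  get_page("https://example.com")   # fast path: served from the read cache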

Cache algorithms

Cache algorithms provide instructions for how the cache should be maintained. Some examples of cache algorithms include the following; a short code sketch of the LRU approach appears after the list:

  • Least Frequently Used (LFU) uses a counter to keep track of how often an entry is accessed; the entry with the lowest count is removed first.
  • Least Recently Used (LRU) keeps recently used items near the top of the cache; when the cache limit has been reached, the items that have been accessed least recently are removed.
  • Most Recently Used (MRU) removes the most recently used items first; this approach is good for situations in which older items are more likely to be accessed.
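To make the eviction rule concrete, here is a minimal LRU sketch in Python. It is an illustration rather than a production implementation; the class name LRUCache and the two-entry capacity are arbitrary choices for the example.

  from collections import OrderedDict

  class LRUCache:
      """Evicts the least recently used entry once capacity is exceeded."""
      def __init__(self, capacity):
          self.capacity = capacity
          self.entries = OrderedDict()   # insertion order doubles as recency order

      def get(self, key):
          if key not in self.entries:
              return None                    # cache miss
          self.entries.move_to_end(key)      # mark as most recently used
          return self.entries[key]

      def put(self, key, value):
          if key in self.entries:
              self.entries.move_to_end(key)
          self.entries[key] = value
          if len(self.entries) > self.capacity:
              self.entries.popitem(last=False)  # evict the least recently used entry

  cache = LRUCache(2)
  cache.put("a", 1)
  cache.put("b", 2)
  cache.get("a")      # "a" becomes the most recently used entry
  cache.put("c", 3)   # over capacity: "b" is evicted, not "a"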

Types of cache

Write-around cache allows write operations to be written to storage, skipping the cache altogether. This keeps the cache from becoming flooded when large amounts of write I/O occur. The disadvantage is that data is not cached unless it is read from storage. As such, the initial read operation will be comparatively slow because the data has not yet been cached.

Write-through cache writes data to both the cache and storage. The advantage to this approach is that newly written data is always cached, thereby allowing the data to be read quickly. A drawback is that write operations are not considered to be complete until the data is written to both the cache and primary storage. This causes write-through caching to introduce latency into write operations.

Write-back cache is similar to write-through caching in that all write operations are directed to the cache. The difference is that once the data is cached, the write operation is considered complete. The data is later copied from the cache to storage. In this approach, there is low latency for both read and write operations. The disadvantage is that, depending on the caching mechanism used, the data may be vulnerable to loss until it is committed to storage.
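The three write policies differ only in where a write goes first and when it is considered complete. The following Python sketch makes those differences explicit; the CachedStore class and its dict-based "storage" are illustrative assumptions, not any particular product's design.

  class CachedStore:
      """Illustrates write-through, write-back and write-around against a backing store."""
      def __init__(self, policy):
          self.policy = policy   # "write-through", "write-back" or "write-around"
          self.cache = {}
          self.storage = {}      # stands in for disk or other primary storage
          self.dirty = set()     # write-back data not yet copied to storage

      def write(self, key, value):
          if self.policy == "write-through":
              self.cache[key] = value
              self.storage[key] = value    # complete only after both writes finish
          elif self.policy == "write-back":
              self.cache[key] = value      # complete as soon as the cache is updated
              self.dirty.add(key)          # vulnerable to loss until flushed
          elif self.policy == "write-around":
              self.storage[key] = value    # skip the cache entirely

      def flush(self):
          # Write-back only: copy dirty entries from the cache to storage.
          for key in self.dirty:
              self.storage[key] = self.cache[key]
          self.dirty.clear()

      def read(self, key):
          if key in self.cache:            # fast path: cache hit
              return self.cache[key]
          value = self.storage[key]        # slow path, e.g. write-around's first read
          self.cache[key] = value          # data is cached once it has been read
          return value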

Popular uses for cache

Cache server: A dedicated network server, or service acting as a server, that saves webpages or other Internet content locally. This is sometimes referred to as a proxy cache.

Disk cache: Holds data that has recently been read and perhaps adjacent data areas that are likely to be accessed soon. Some disk caches are designed to cache data based on how frequently it is read. Storage blocks that are read frequently are referred to as hot blocks and are automatically moved to the cache.
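A frequency-based disk cache of this kind can be sketched in a few lines. In this illustration, the HOT_THRESHOLD value and the read_block_from_disk() helper are assumptions made for the example:

  from collections import Counter

  HOT_THRESHOLD = 3        # reads before a block is considered "hot"
  read_counts = Counter()  # per-block read counter
  cache = {}

  def read_block_from_disk(block_id):
      # Hypothetical stand-in for a real disk read.
      return "data for block %d" % block_id

  def read_block(block_id):
      if block_id in cache:
          return cache[block_id]
      data = read_block_from_disk(block_id)
      read_counts[block_id] += 1
      if read_counts[block_id] >= HOT_THRESHOLD:
          cache[block_id] = data           # promote the hot block into the cache
      return data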

Cache memory: Random access memory (RAM) that a computer microprocessor can access more quickly than it can access regular RAM. Cache memory is usually tied directly to the CPU and is used to cache instructions that are frequently accessed by the processes that are currently running. Although a RAM cache is much faster than a disk-based cache, cache memory is much faster than a RAM cache because of its proximity to the CPU.

Flash cache: Temporary storage of data on NAND flash memory chips -- often in the form of solid-state drive (SSD) storage -- to enable requests for data to be fulfilled with greater speed than would be possible if the cache were located on a traditional hard disk drive (HDD).

How to increase cache memory

Cache memory is a part of the CPU complex and is therefore either included on the CPU itself or is embedded into a chip on the system board. Typically, the only way to increase cache memory is to install a next-generation system board and a corresponding next-generation CPU. Some older system boards included vacant slots that could be used to increase the cache memory capacity, but most newer system boards do not include such an option.

Also see: buffer, which, like cache, is a temporary place for data; however, the main purpose of a buffer is to absorb demand spikes. For instance, a write buffer might use flash storage to temporarily store write operations and then move the recently written data to the system’s main storage when resources are more readily available. In this situation, the SSD storage is faster than HDD storage and can complete write operations more quickly. The data isn’t cached on the SSD, as the SSD is only used as a temporary data repository.
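The distinction can be seen in a small Python sketch of a write buffer: writes return as soon as they are queued, and a later drain step moves the data to slower primary storage. The function names and the dict standing in for main storage are illustrative assumptions.

  import queue

  write_buffer = queue.Queue()  # absorbs a burst of incoming writes
  slow_storage = {}             # stands in for the system's main storage

  def buffered_write(key, value):
      write_buffer.put((key, value))   # fast: returns once the write is buffered

  def drain_buffer():
      # Runs later, when resources are more readily available.
      while not write_buffer.empty():
          key, value = write_buffer.get()
          slow_storage[key] = value    # the slow write to main storage happens here

Unlike a cache, nothing is served back from the buffer afterwards; once drained, the buffered copy is gone.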

This was first published in April 2015
