A cache (pronounced CASH) is a place to store something temporarily in a computing environment.
In computing, active data is often cached to shorten data access times, reduce latency and improve input/output (I/O). Because almost all application workloads depend on I/O operations, caching is used to improve application performance. For example, when you visit a website’s home page, the files that your browser requests are stored on your computing device’s disk drive in the browser’s cache. If you click “back” and return to that home page, your browser can retrieve most of the files it needs from the cache instead of requesting that they all be sent again. This approach is called read cache. Because your browser saves time getting the files it needs, the website’s home page loads faster.
A cache's storage capacity is determined by the application's developer, and cache algorithms provide instructions for how the cache should be maintained. For example, the Least Frequently Used (LFU) algorithm uses a counter to keep track of how often an entry is accessed; the entry with the lowest count is removed first. The Least Recently Used (LRU) algorithm keeps recently used items near the top of the cache; when the cache limit has been reached, items that have been accessed least recently are removed. The Most Recently Used (MRU) algorithm removes the most recently used items first; this approach is good for situations in which older items are more likely to be accessed.
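The LRU policy described above can be sketched with Python's `collections.OrderedDict`, which remembers insertion order. This is an illustrative toy, not a production implementation: accessed entries are moved to the end, so the entry at the front is always the least recently used and is evicted first when the capacity limit is reached.

```python
from collections import OrderedDict

class LRUCache:
    """Toy LRU cache: evicts the least recently used entry when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()   # oldest entry first, most recent last

    def get(self, key):
        if key not in self.items:
            return None              # cache miss
        self.items.move_to_end(key)  # mark as most recently used
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)  # evict least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")       # touching "a" makes "b" the least recently used
cache.put("c", 3)    # capacity exceeded: "b" is evicted
```

An LFU cache would differ only in its bookkeeping: each entry would carry an access counter, and eviction would remove the entry with the lowest count rather than the oldest access.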
The three main approaches to caching are write-through, write-around and write-back. With write-around cache, write operations go directly to main storage, bypassing the cache; only data that is subsequently read is copied into the cache. With write-through cache, data is written into the cache and the corresponding main memory location at the same time. With write-back cache, data is written into the cache every time a change occurs, but it is written into the corresponding location in main memory only at specified intervals or under certain conditions.
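The difference between write-through and write-back can be shown with a small sketch. These are hypothetical classes using a plain dictionary to stand in for main memory; a real storage controller adds dirty-bit tracking, flush triggers and failure handling.

```python
class WriteThroughCache:
    """Every write updates the cache and main memory together."""

    def __init__(self, backing):
        self.backing = backing       # dict standing in for main memory
        self.cache = {}

    def write(self, key, value):
        self.cache[key] = value
        self.backing[key] = value    # main memory is always up to date

class WriteBackCache:
    """Writes land in the cache; main memory is updated later, on flush."""

    def __init__(self, backing):
        self.backing = backing
        self.cache = {}
        self.dirty = set()           # keys changed but not yet flushed

    def write(self, key, value):
        self.cache[key] = value      # only the cache is updated now
        self.dirty.add(key)

    def flush(self):
        """Copy dirty entries to main memory, e.g. at a timed interval."""
        for key in self.dirty:
            self.backing[key] = self.cache[key]
        self.dirty.clear()

mem1, mem2 = {}, {}
WriteThroughCache(mem1).write("x", 1)   # mem1 holds "x" immediately
wb = WriteBackCache(mem2)
wb.write("x", 1)                        # mem2 is still empty here
wb.flush()                              # now mem2 holds "x"
```

The trade-off is visible in the sketch: write-through keeps main memory consistent at the cost of slower writes, while write-back makes writes fast but risks losing dirty data if the cache fails before a flush.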
Popular uses for cache include:
Disk cache - holds data that has recently been read and perhaps adjacent data areas that are likely to be accessed next.
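The "adjacent data areas" behavior above is often called read-ahead, and it can be sketched as follows. This is a simplified illustration with hypothetical names; real disk caches prefetch based on observed access patterns rather than always grabbing the next block.

```python
DISK = {n: f"block-{n}" for n in range(10)}   # stand-in for disk blocks
disk_reads = []                               # log of physical disk accesses
cache = {}

def read_block(n):
    """Read block n, prefetching the adjacent block on a cache miss."""
    if n not in cache:
        disk_reads.append(n)
        cache[n] = DISK[n]
        if n + 1 in DISK and n + 1 not in cache:   # read-ahead
            disk_reads.append(n + 1)
            cache[n + 1] = DISK[n + 1]
    return cache[n]

read_block(3)   # reads block 3 and prefetches block 4
read_block(4)   # served from the cache; no extra disk access
```

Because sequential reads are common, the prefetched block is frequently the next one requested, turning what would have been a second disk access into a cache hit.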
Also see: buffer, which, like a cache, is a temporary place for data, but with the primary purpose of coordinating communication between programs or hardware rather than improving I/O speed.