"Cache" is an over-used term, describing everything from extremely fast memory built directly into the computer's CPU (called "Level 1" or "L1") or on an adjacent chip (L2 or L3), to RAM accessed across a motherboard backplane, to flash or disk drives used to store frequently accessed data (read caching) or to organize a lot of write operations for greater efficiency (write cache). Where tape is used for purposes such as active archiving,...
a front-end rank of disk may be used as a tape cache.
Generally speaking, cache is an optimization technology deployed to balance the performance differences between components. The CPU uses cache memory to hold instructions and data that are used repeatedly in the operation of programs. L1 cache (and, in some configurations, L2 cache) is built directly into the CPU chip to provide the fastest possible access to memory locations, supporting faster CPU performance. In other cases, adjacent chips are architected with direct pathways to the CPU, again to optimize performance. When L1 and L2 are built into the CPU chip, the adjacent chip is often referred to as "L3 cache." If the CPU has only L1 cache, the adjacent chip might play the role of L2 cache.
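The tiered lookup described above can be sketched as a toy simulation. Everything here is illustrative, not real hardware behavior: the levels are plain dictionaries, and the addresses and contents are made up.

```python
# Toy simulation of a multi-level cache lookup: the CPU checks the
# fastest level first and falls back to slower levels on a miss.

def lookup(address, levels):
    """Search each cache level in order; return (value, level_name).

    `levels` is an ordered list of (name, dict) pairs, fastest first.
    On a hit in a slower level, the value is copied into the levels
    that missed (a simple cache-fill policy), so the next access to
    the same address is served from the fastest level.
    """
    missed = []
    for name, store in levels:
        if address in store:
            for _, faster in missed:
                faster[address] = store[address]  # fill faster levels
            return store[address], name
        missed.append((name, store))
    raise KeyError(address)  # in real hardware: go to main memory


l1 = {}                               # fastest, smallest
l2 = {0x10: "add"}                    # slower, larger
l3 = {0x10: "add", 0x20: "mul"}      # slowest cache level
hierarchy = [("L1", l1), ("L2", l2), ("L3", l3)]

value, hit_level = lookup(0x10, hierarchy)  # misses L1, hits L2
```

After that first lookup, address 0x10 also sits in L1, so a repeat access hits the fastest level, which is the entire point of the hierarchy.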
RAM is dynamic and usually volatile memory (meaning its contents are lost if power is discontinued) that users can install on a motherboard. It is significantly slower than L1, L2 or L3 cache, and much less expensive. Because it is accessed by the CPU across the motherboard, it is subject to the speed limits of the bus. RAM is nonetheless much faster at data access than mechanical storage devices such as hard disks or tape, and over the last few decades it has come into wide use as a location for storing frequently accessed disk data, with the goal of expediting I/O performance.
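A minimal sketch of that idea, caching frequently accessed disk blocks in RAM. The `read_block` helper and the block size are assumptions for illustration, not a real storage API; the first access to a block pays the disk cost, and repeat accesses are served from memory.

```python
from functools import lru_cache

BLOCK_SIZE = 4096  # illustrative block size

@lru_cache(maxsize=1024)  # keep up to 1,024 recently used blocks in RAM
def read_block(path, block_number):
    """Read one fixed-size block from a file.

    Thanks to the LRU cache, rereading the same (path, block_number)
    returns the RAM copy instead of touching the disk again.
    """
    with open(path, "rb") as f:
        f.seek(block_number * BLOCK_SIZE)
        return f.read(BLOCK_SIZE)
```

`read_block.cache_info()` reports hits and misses, which is a handy way to see how often the cache is actually absorbing I/O.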
Flash memory has come into vogue in part as an alternative to traditional RAM caching. Flash is less expensive than RAM and it is non-volatile. In addition, disk vendors are showing great interest in pairing disk with flash to improve the performance of higher-capacity disks or disk pools, ahead of what some analysts claim will be the wholesale replacement of all magnetic media with solid-state storage in the medium-term future. There is also substantial discussion of storage architectures such as "flape" (flash plus tape), in which data is written both to flash storage and to tape. When access to the data decreases, the data is deleted from the flash tier and retained only on tape.
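The flape pattern described above can be sketched as follows. This is a toy model under stated assumptions: the two tiers are plain dictionaries standing in for real devices, the class and method names are invented for illustration, and "cold" is decided by a simple idle-time threshold.

```python
import time

class FlapeStore:
    """Toy sketch of a flash-plus-tape ("flape") layout: every write
    lands on both tiers, and cold data is evicted from the flash tier
    while the tape copy is retained."""

    def __init__(self, max_idle_seconds=60.0):
        self.flash = {}   # fast tier: key -> (value, last_access_time)
        self.tape = {}    # archival tier: key -> value
        self.max_idle = max_idle_seconds

    def write(self, key, value):
        now = time.monotonic()
        self.flash[key] = (value, now)   # fast copy
        self.tape[key] = value           # archival copy

    def read(self, key):
        if key in self.flash:
            value, _ = self.flash[key]
            self.flash[key] = (value, time.monotonic())  # refresh access
            return value
        return self.tape[key]            # slow path: recall from tape

    def evict_cold(self):
        """Drop flash copies not accessed within max_idle seconds;
        the tape copies are retained. Returns the evicted keys."""
        now = time.monotonic()
        cold = [k for k, (_, t) in self.flash.items()
                if now - t > self.max_idle]
        for k in cold:
            del self.flash[k]
        return cold
```

The design choice worth noting is that eviction never loses data: because every write went to tape up front, dropping the flash copy only changes access latency, not durability.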