"Cache" is an over-used term, describing everything from extremely fast memory built directly into the computer's CPU (called "Level 1" or "L1") or on an adjacent chip (L2 or L3), to RAM accessed across a motherboard backplane, to flash or disk drives used to store frequently accessed data (read caching) or to organize many write operations for greater efficiency (write caching). Where tape is used for purposes such as active archiving, a front-end rank of disk may serve as a tape cache.
Generally speaking, cache is an optimization technology, usually deployed to balance the performance differences between components. The CPU uses cache memory to hold instructions that are executed repeatedly as programs run. L1 caches (and, in some configurations, L2 caches) are built directly into the chip to provide the fastest possible access to memory locations, supporting faster CPU performance. In other cases, adjacent chips are architected with direct pathways to the CPU, again to optimize performance. When L1 and L2 are built into the CPU chip, the adjacent chip is often referred to as "L3 cache." If the CPU has only L1 cache, the adjacent chip might play the role of L2 cache.
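The tiered lookup described above can be sketched in a few lines. This is a simplified model, not how hardware actually works: the latency figures are arbitrary cost units, and the dictionaries stand in for cache levels purely to show the check-fastest-tier-first pattern.

```python
# Hypothetical two-level cache lookup; latency values are illustrative cost
# units, not measurements.
L1_LATENCY, L2_LATENCY, RAM_LATENCY = 1, 10, 100

def lookup(address, l1, l2, ram):
    """Return (value, cost): check the fastest tier first, falling back outward."""
    if address in l1:
        return l1[address], L1_LATENCY
    if address in l2:
        value = l2[address]
        l1[address] = value          # promote into L1 for next time
        return value, L1_LATENCY + L2_LATENCY
    value = ram[address]             # miss in both cache levels
    l2[address] = value              # fill both levels on the way back
    l1[address] = value
    return value, L1_LATENCY + L2_LATENCY + RAM_LATENCY

ram = {addr: addr * 2 for addr in range(100)}
l1, l2 = {}, {}
_, first_cost = lookup(7, l1, l2, ram)   # cold miss: pays the full RAM cost
_, second_cost = lookup(7, l1, l2, ram)  # L1 hit: cheapest path
print(first_cost, second_cost)  # 111 1
```

The second lookup is two orders of magnitude cheaper than the first in this toy model, which is the whole point of the hierarchy: repeated accesses land in the fastest tier.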
RAM is dynamic and usually volatile memory (meaning its contents are lost if power is discontinued) that users can install on a motherboard. It is considerably slower than L1, L2 or L3 cache, and much less expensive. Because the CPU accesses it across the motherboard, it is subject to the speed limits of the bus. RAM is, however, much faster at data access than mechanical storage devices such as hard disks or tape, and over the last few decades it has come into wide use as a place to store frequently accessed disk data, with the goal of expediting I/O performance.
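Using RAM to hold frequently accessed disk data usually means some form of least-recently-used (LRU) read cache. A minimal sketch, assuming a `slow_read` callable that stands in for an actual disk read (the class and parameter names here are illustrative, not a real API):

```python
from collections import OrderedDict

class ReadCache:
    """Toy RAM read cache in front of a slow store, with LRU eviction."""

    def __init__(self, slow_read, capacity=4):
        self.slow_read = slow_read
        self.capacity = capacity
        self.entries = OrderedDict()   # block -> data, oldest first
        self.hits = self.misses = 0

    def read(self, block):
        if block in self.entries:
            self.hits += 1
            self.entries.move_to_end(block)    # mark as most recently used
            return self.entries[block]
        self.misses += 1
        data = self.slow_read(block)           # fall through to "disk"
        self.entries[block] = data
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)   # evict the least recently used
        return data

cache = ReadCache(slow_read=lambda b: f"data-{b}", capacity=2)
cache.read(1); cache.read(2); cache.read(1)    # second read of block 1 hits
print(cache.hits, cache.misses)  # 1 2
```

Real caching layers add write handling, dirty-block tracking and smarter eviction policies, but the hit/miss accounting above is the essence of why a RAM cache speeds up I/O.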
Flash memory has come into vogue in part as an alternative to traditional RAM caching. Flash is less expensive than RAM, and it is non-volatile. Disk vendors are also showing great interest in pairing disk with flash to improve the performance of higher-capacity disks or disk pools, ahead of what some analysts claim will be the wholesale replacement of all magnetic media with solid-state storage in the medium term. There is also substantial discussion of storage architectures such as flape (flash plus tape), in which data is written to flash storage and to tape. When access to the data decreases, the data is removed from flash and retained only on tape.
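The flape pattern can be sketched as a two-tier store: every write lands on both flash and tape, and a demotion pass drops the flash copy of blocks that have gone cold. This is a rough illustration of the idea only; the class name, tier dictionaries and age threshold are all assumptions for the sketch.

```python
class FlapeStore:
    """Toy flash-plus-tape store: flash holds hot copies, tape holds everything."""

    def __init__(self, cold_after_seconds=3600):
        self.flash = {}                 # block -> (data, last_access_time)
        self.tape = {}                  # block -> data (durable copy)
        self.cold_after = cold_after_seconds

    def write(self, block, data, now):
        self.flash[block] = (data, now)  # fast copy for active reads
        self.tape[block] = data          # durable copy written in parallel

    def read(self, block, now):
        if block in self.flash:
            data, _ = self.flash[block]
            self.flash[block] = (data, now)  # refresh the access time
            return data
        return self.tape[block]              # slow path: recall from tape

    def demote_cold(self, now):
        """Drop flash copies that have not been accessed recently."""
        cold = [b for b, (_, t) in self.flash.items()
                if now - t > self.cold_after]
        for block in cold:
            del self.flash[block]            # tape copy remains

store = FlapeStore(cold_after_seconds=10)
store.write("a", "payload", now=0)
store.demote_cold(now=100)           # block "a" has gone cold
print("a" in store.flash, store.read("a", now=100))  # False payload
```

Note that demotion never deletes data: because tape already holds a copy from the original write, expelling a cold block from flash is just dropping a duplicate.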