One response to this question might be one of a real estate agent's old favorites: “Location, location, location!”...
Cache memory is usually part of the central processing unit, or of a complex that includes the CPU and an adjacent chipset, where it holds the data and instructions an executing program accesses most frequently -- usually copies of RAM-based memory locations. In the classic von Neumann computer, RAM was the “chalkboard” where the processor did a program's math. Placing this data store closer to the processor itself -- so data requests and responses don't have to traverse the motherboard bus -- reduces the wait time, or latency, associated with processing and delivers faster chip performance.
A RAM cache, by contrast, typically combines some memory permanently embedded on the motherboard with memory modules the consumer can install into dedicated slots or attachment locations. These memories are accessed via the mainboard bus -- channels or conduits etched into the motherboard that interconnect different devices and chipsets. CPU cache memory operates 10 to 100 times faster than RAM, requiring only a few nanoseconds to respond to a CPU request. RAM cache, of course, is much speedier in its response time than magnetic media, which delivers I/O at rates measured in milliseconds.
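The payoff of this hierarchy shows up in how programs touch memory: reads that stay within data the CPU cache has already pulled in are served in nanoseconds, while scattered reads keep falling through to slower RAM. The sketch below is a minimal, illustrative Python example of that principle -- summing the same 2D array in the order it sits in memory versus jumping across rows. (In CPython the effect is muted by interpreter overhead, so no timing claim is made; the function names and sizes are illustrative, not from the original article.)

```python
import time

def sum_row_major(matrix):
    # Visits elements in the order they are laid out in memory;
    # consecutive accesses tend to hit data already in the CPU cache.
    total = 0
    for row in matrix:
        for value in row:
            total += value
    return total

def sum_column_major(matrix):
    # Jumps to a different row on every access, so each read is more
    # likely to miss the cache and fall through to slower RAM.
    total = 0
    n_rows = len(matrix)
    n_cols = len(matrix[0])
    for col in range(n_cols):
        for row in range(n_rows):
            total += matrix[row][col]
    return total

n = 1000
matrix = [[1] * n for _ in range(n)]

start = time.perf_counter()
row_total = sum_row_major(matrix)
row_time = time.perf_counter() - start

start = time.perf_counter()
col_total = sum_column_major(matrix)
col_time = time.perf_counter() - start

# Both traversal orders compute the same sum; only the
# memory access pattern -- and thus cache behavior -- differs.
print(row_total == col_total)
```

In a compiled language such as C, where array elements are contiguous in memory and there is no interpreter overhead, the row-major version is typically several times faster for large arrays -- a direct consequence of the cache latencies described above.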
It should be noted that somewhat slower flash memory is now being used to provide an additional cache at the magnetic media level -- on disk controllers -- in an effort to improve the latency characteristics of disk, especially as disks become more capacious and data access increases. Considerable ink has been spilled suggesting that flash -- or solid-state disks -- will at some point displace magnetic disks altogether as a production storage medium.