
RAM (random access memory)

Contributor(s): Rodney Brown

For additional information, see Fast Guide to RAM.

RAM (random access memory) is the place in a computing device where the operating system (OS), application programs and data in current use are kept so they can be quickly reached by the device's processor. RAM is much faster to read from and write to than other kinds of storage in a computer, such as a hard disk drive (HDD), solid-state drive (SSD) or optical drive. Data remains in RAM as long as the computer is running. When the computer is turned off, RAM loses its data. When the computer is turned on again, the OS and other files are once again loaded into RAM, usually from an HDD or SSD.

You can compare RAM to a person's short-term memory and a hard disk to long-term memory. Short-term memory focuses on the work at hand, but it can keep only so many facts in view at one time. When short-term memory fills up, your brain can sometimes refresh it from facts stored in long-term memory. A computer works the same way: if RAM fills up, the processor must repeatedly go to the hard disk to overlay old data in RAM with new, which slows the computer's operation. Unlike a hard disk, which can become completely full of data and unable to accept any more, RAM rarely refuses new data outright, because the operating system keeps swapping data between RAM and storage -- but the combined capacity of RAM and that storage can still be used up.
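
To put rough numbers on that picture, most operating systems report how full RAM and its disk-backed overflow (swap) currently are. The sketch below is one way to read those figures from Python; it assumes the third-party psutil package is installed, and the output format is purely illustrative.

    import psutil

    ram = psutil.virtual_memory()    # physical RAM statistics
    swap = psutil.swap_memory()      # disk-backed swap statistics

    print(f"RAM:  {ram.used / 2**30:.1f} GiB used of {ram.total / 2**30:.1f} GiB ({ram.percent}%)")
    print(f"Swap: {swap.used / 2**30:.1f} GiB used of {swap.total / 2**30:.1f} GiB ({swap.percent}%)")

When the swap figure keeps climbing while RAM sits near 100%, the machine is in exactly the slow read-from-disk cycle described above.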

DRAM vs. SRAM

RAM comes in two primary forms:

    • Dynamic random access memory. DRAM is what makes up the typical computing device's RAM and, as noted above, requires constant power -- and periodic refreshing of its memory cells -- to hold on to stored data.

    • Static random access memory. SRAM also needs constant power to hold on to data, but unlike DRAM it does not have to be continually refreshed. The way its memory cells are made means an equivalent amount of SRAM takes up far more chip area and costs considerably more than DRAM; however, SRAM is significantly faster. The price and speed differences mean SRAM is mainly used in small amounts as cache memory inside a device's processor.

History of RAM

RAM is called random access because any storage location -- also known as a memory address -- can be accessed directly. Originally, the term distinguished regular core memory from offline memory, usually magnetic tape, on which an item of data could be reached only by starting at the beginning of the tape and searching sequentially until the address was found. RAM is organized and controlled in a way that enables data to be stored and retrieved directly at specific locations. Note that other forms of storage -- such as the hard disk and CD-ROM -- are also accessed directly or randomly, but the term random access is not applied to them.
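
As a toy illustration of the distinction -- a sketch only, since real memory hardware works nothing like a Python list -- random access means an item can be fetched by its address in a single step, while sequential access means reading past everything stored before it:

    data = list(range(1_000_000))        # stand-in for a block of stored items

    # Random access: jump straight to any address in one step, like RAM.
    value_direct = data[742_913]

    # Sequential access: everything before the target must be read first,
    # like finding a record partway along a magnetic tape.
    def read_sequentially(tape, target_index):
        for index, item in enumerate(tape):
            if index == target_index:
                return item

    value_scanned = read_sequentially(iter(data), 742_913)
    assert value_direct == value_scanned

For a million items, the sequential lookup performs on the order of a million reads where the direct lookup performs one.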

RAM started out as asynchronous: the memory chips ran on their own clock, separate from the processor's. This became a problem as processors grew more powerful and RAM couldn't keep up with their requests for data. In the early 1990s, the clocks were synchronized with the introduction of synchronous dynamic random access memory. SDRAM reached its limit quickly, since it transferred data only once per clock cycle. Around the year 2000, double data rate random access memory (DDR RAM) was developed. It moves data twice in a single clock cycle -- on the rising and falling edges of the clock signal. The introduction of DDR RAM also seems to have shifted the definition of SDRAM, as many sources now use the term to mean single data rate RAM.

DDR RAM has since evolved three times, through DDR2, DDR3 and DDR4. Each iteration has improved data throughput and reduced power use, but the versions are not compatible with one another, because each handles data in larger batches than the one before.

DDR vs. DDR2 vs. DDR3: a visual representation of the various DDR formats.
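
To give a feel for what these data rates mean, peak module bandwidth can be estimated as the effective transfer rate times the width of the memory bus. The arithmetic below uses DDR3-1600 as an example; the result is a back-of-the-envelope peak that ignores refresh overhead and real access patterns.

    # Rough peak bandwidth: effective transfers per second x bus width in bytes.
    transfers_per_second = 1600 * 10**6   # DDR3-1600: 1,600 MT/s (an 800 MHz clock, two transfers per cycle)
    bus_width_bytes = 8                   # a standard 64-bit module

    peak_bytes_per_second = transfers_per_second * bus_width_bytes
    print(peak_bytes_per_second / 10**9, "GB/s")   # 12.8 GB/s -- hence the PC3-12800 module label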

How big is RAM?

RAM is small, both in physical size -- it's stored in microchips -- and in the amount of data it can hold. A typical laptop computer may come with 4 gigabytes of RAM, while a hard disk can hold 10 terabytes.

RAM comes in the form of discrete or separate microchips, and in modules that plug into slots in the computer's motherboard. These slots connect through a bus or set of electrical paths to the processor. The HDD, on the other hand, stores data on a magnetized surface that looks like a phonograph record, while the SSD stores data in memory chips that, unlike RAM, are not dependent on having power all the time and won't lose data once the power is turned off.



Most PCs allow users to add RAM modules up to a certain limit. Having more RAM in a computer reduces the number of times the processor has to read data from the hard disk, an operation that takes much longer than reading data from RAM: RAM access times are measured in nanoseconds, while storage access times are measured in milliseconds.
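
The sketch below is one crude way to see that gap on a desktop system: it times a pass over data already held in RAM against a pass that re-reads the same data from a file. The scratch file name is arbitrary, and because the operating system caches recently written files in RAM, the disk figure on a real machine may land far closer to the RAM figure than the raw hardware latencies suggest.

    import os, time

    payload = os.urandom(64 * 1024 * 1024)         # 64 MiB of data held in RAM
    with open("scratch.bin", "wb") as f:           # hypothetical scratch file on disk
        f.write(payload)

    start = time.perf_counter()
    total = sum(payload[::4096])                   # touch the copy already in RAM
    ram_seconds = time.perf_counter() - start

    start = time.perf_counter()
    with open("scratch.bin", "rb") as f:
        total = sum(f.read()[::4096])              # read the same data back from storage
    disk_seconds = time.perf_counter() - start

    print(f"RAM pass:  {ram_seconds * 1000:.1f} ms")
    print(f"Disk pass: {disk_seconds * 1000:.1f} ms")
    os.remove("scratch.bin")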
