While operating, computers store active data in Random Access Memory (RAM) chips. RAM chips plug into a computer's motherboard and are linked to the processor, traditionally over the front side bus, providing what is essentially a direct highway for the exchange of variables and program data. The memory controller is a chip traditionally found on the motherboard's northbridge, although in many newer systems it is built into the processor itself. It manages read and write operations to system memory and keeps the RAM active by supplying it with electric current.
RAM is generally much faster than other types of storage such as hard drives and optical discs. One of its drawbacks, however, is that it must be supplied with a constant flow of power in order to retain data: as soon as the power stops, the information stored in RAM chips is lost. In addition, the dynamic RAM (DRAM) used as system memory stores each bit as a tiny electric charge that leaks away over time even while the power is on. The memory controller addresses this by "refreshing" the RAM at a constant rate while the computer is powered on.
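As a rough illustration of why refresh matters, the sketch below models a single memory cell as a leaking capacitor. The leak rate, read threshold, and simulation length are made-up values chosen only to show the effect, not real hardware parameters.

/* Toy model of a single DRAM cell: charge stored in a tiny capacitor
 * leaks away over time, and only the memory controller's periodic
 * refresh keeps the stored bit readable.  All numbers here (leak rate,
 * read threshold, simulation length) are illustrative, not taken from
 * real hardware. */
#include <stdio.h>

/* Simulate one cell for total_ms milliseconds.  refresh_ms == 0 means
 * "no refresh".  Returns the time the bit became unreadable, or -1 if
 * it survived. */
static int simulate_cell(int total_ms, int refresh_ms) {
    double charge = 1.0;                   /* fully charged = logical 1 */
    const double leak_per_ms = 0.005;      /* assumed 0.5% loss per ms  */
    const double threshold   = 0.5;        /* below this, reads as 0    */

    for (int t = 1; t <= total_ms; ++t) {
        charge -= charge * leak_per_ms;    /* exponential leakage       */
        if (refresh_ms > 0 && t % refresh_ms == 0)
            charge = 1.0;                  /* controller refresh pulse  */
        if (charge < threshold)
            return t;                      /* data lost                 */
    }
    return -1;
}

int main(void) {
    int lost = simulate_cell(1000, 0);
    printf("No refresh:    bit lost after %d ms\n", lost);

    lost = simulate_cell(1000, 64);
    printf("64 ms refresh: %s\n", lost < 0 ? "bit survived 1000 ms"
                                           : "bit lost");
    return 0;
}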
During a "refresh," the memory controller sends a pulse of electronic current through the RAM chips. The amount of current sent through RAM is selected through the computer's Binary Input Output System (BIOS). This occurs at least every 64 milliseconds, keeping the RAM active and the data stored within secure against loss due to power interruptions. Without the memory controller, your data would be lost in fractions of a second.
The memory controller also manages read and write operations to the RAM chips, driving the demultiplexer circuits that select the correct location for data storage and retrieval. Think of the memory cells on RAM chips as houses and the decoded address as a street address: in order to "mail" information to a specific house, or to retrieve information from it, the computer must know which address to use. The memory controller acts as the middleman in these operations, ensuring that the right information is read from and written to the right locations.
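The sketch below illustrates the idea of address decoding in code: a flat physical address is split into column, bank, row, and rank fields, much like breaking a mailing address into house number, street, and city. The specific bit widths and layout here are invented for the example; real memory controllers use many different mappings and interleaving schemes.

/* Illustrative address decoding: the memory controller splits a flat
 * physical address into the "street address" pieces the DRAM needs.
 * The bit layout below is made up for the example. */
#include <stdio.h>
#include <stdint.h>

struct dram_address {
    unsigned column; /* which cell within the row      */
    unsigned row;    /* which row (the "street")       */
    unsigned bank;   /* which bank inside the chip     */
    unsigned rank;   /* which group of chips on a DIMM */
};

/* Assumed layout (low bits to high): 10 column bits, 3 bank bits,
 * 15 row bits, 1 rank bit. */
static struct dram_address decode(uint64_t phys_addr) {
    struct dram_address a;
    a.column = phys_addr & 0x3FF;          /* bits 0-9   */
    a.bank   = (phys_addr >> 10) & 0x7;    /* bits 10-12 */
    a.row    = (phys_addr >> 13) & 0x7FFF; /* bits 13-27 */
    a.rank   = (phys_addr >> 28) & 0x1;    /* bit  28    */
    return a;
}

int main(void) {
    uint64_t addr = 0x12345678;
    struct dram_address a = decode(addr);
    printf("address 0x%llx -> rank %u, bank %u, row %u, column %u\n",
           (unsigned long long)addr, a.rank, a.bank, a.row, a.column);
    return 0;
}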
Some systems use dual-channel memory controllers, in which two memory controllers work in tandem. Each controller sits on its own "bus," also called a channel, allowing multiple read and write operations to occur concurrently. In theory, this doubles the total memory bandwidth; in practice, other factors such as bus speed and the processor's capabilities usually limit how much of that theoretical maximum can actually be used.
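The arithmetic behind the "doubled bandwidth" claim can be shown in a few lines. The example below assumes DDR4-3200 memory on a 64-bit data bus, which is just one common configuration; the printed figures are theoretical peaks, not measured throughput.

/* Back-of-the-envelope peak bandwidth for single- vs. dual-channel
 * operation.  DDR4-3200 with a 64-bit (8-byte) data bus is used as an
 * example configuration; real-world throughput is lower because of the
 * bus, processor, and workload limits mentioned above. */
#include <stdio.h>

int main(void) {
    const double transfers_per_sec  = 3200e6; /* DDR4-3200: 3200 MT/s     */
    const double bytes_per_transfer = 8.0;    /* 64-bit channel = 8 bytes */

    double one_channel = transfers_per_sec * bytes_per_transfer / 1e9;
    double two_channel = 2.0 * one_channel;   /* theoretical doubling     */

    printf("Single channel peak: %.1f GB/s\n", one_channel); /* 25.6 */
    printf("Dual channel peak  : %.1f GB/s\n", two_channel); /* 51.2 */
    return 0;
}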