Recently Rambus announced that it was using cryogenic temperatures to boost computer performance in large datacenters. This research is being done in a joint project with Microsoft, which is developing a processor based on Josephson junctions.
This is an effort to provide performance increases greater than can be attained through standard semiconductor scaling. The research project aims to improve cycle time, power consumption, and compute density, leading to better energy efficiency and cost of ownership (COO). The companies also hope for a side benefit: squeezing more bits onto a DRAM chip, which would reduce the cost per bit while improving performance.
The system these two companies are researching uses a memory system cooled to 77 kelvins (77 K) with a processor that operates at 4 K. To do this, the memory system is bathed in liquid nitrogen while the processor is cooled by liquid helium; these temperatures are the boiling points of the two liquids.
Surprisingly, the fact that these two subsystems are in different temperature environments doesn’t require a large physical separation, something that would reduce the speed of the memory bus.
In an interview The Memory Guy was told that, at liquid nitrogen temperatures, the energy efficiency of DRAMs is ten times that of the same DRAMs at room temperature. I was a little puzzled that a data center could save energy by cooling the environment to cryogenic temperatures; after all, refrigeration systems are notorious for their inefficiency. I was informed that large refrigerators can achieve efficiencies as high as 30%. Although it's not completely clear how these two figures relate to each other, the researchers are confident that the approach will lead to overall energy savings.
According to Rambus, 20-30% of a computer's power is consumed by its DRAM. Most DRAM power is lost to refresh, and since data only needs to be refreshed when it isn't being accessed, refresh power is wasted power. That means most of this 20-30% of power is simply wasted.
Interestingly enough, the interval between refresh cycles can be lengthened as the temperature drops, so a cold DRAM doesn't need to be refreshed as often as one at room temperature. Although the need to refresh doesn't completely disappear at 77 K, the junction leakage component becomes small enough to ignore, and refresh cycles can be performed every few minutes instead of every few milliseconds. That leaves disturb, which is pattern dependent, as the main remaining concern.
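To see why cooling stretches the refresh interval so dramatically, junction leakage can be modeled as an Arrhenius process, where leakage falls exponentially as temperature drops. The sketch below uses an assumed activation energy of 0.6 eV purely for illustration (Rambus has not published a figure); the exact value changes the magnitude, but any reasonable choice makes leakage at 77 K vanishingly small compared with room temperature:

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def leakage_reduction(t_hot, t_cold, e_a=0.6):
    """Arrhenius estimate of the factor by which junction leakage
    falls when a DRAM cell is cooled from t_hot to t_cold (kelvins).
    e_a is an assumed activation energy in eV -- illustrative only."""
    return math.exp((e_a / K_B) * (1.0 / t_cold - 1.0 / t_hot))

# Room temperature (300 K) vs. liquid nitrogen (77 K):
print(f"Leakage falls by roughly {leakage_reduction(300, 77):.0e}x")
```

With leakage suppressed by that many orders of magnitude, retention is no longer limited by the junction at all, which is why disturb effects become the dominant concern.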
(In fact, this post’s picture, provided courtesy of Hacker10, is really from a post about using liquid nitrogen to allow someone to steal your DRAM data. At liquid nitrogen temperatures a hacker can freeze a DRAM, then move it from one system to another to surreptitiously read its contents: the data survives even though power is momentarily lost when the DIMM is removed from the system.)
Other power savings can be achieved by reducing the power supply voltage (Vcc) to capture CV² switching-energy savings, although the researchers didn’t want to disclose how low Vcc could be taken.
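Since switching energy scales as CV², the payoff from lowering Vcc is quadratic: at a fixed capacitance and access rate, power falls with the square of the voltage ratio. The voltages below are hypothetical, since Rambus didn't disclose its targets:

```python
def dynamic_power_ratio(v_new, v_old):
    """Switching energy per access scales as C*V^2, so at a fixed
    capacitance and access rate, dynamic power scales with the
    square of the supply-voltage ratio."""
    return (v_new / v_old) ** 2

# Hypothetical example: dropping a DRAM rail from 1.2 V to 0.6 V
print(dynamic_power_ratio(0.6, 1.2))  # -> 0.25, i.e. a 4x power reduction
```

This is why even a modest Vcc reduction is worth pursuing: halving the voltage cuts dynamic power to a quarter.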
Altogether I was told that the cryogenic approach is 10,000 times more energy efficient at only 100 times the cost.
I knew that The SSD Guy would be concerned about power failure, so I asked about that. Power failures are a major concern in transaction processing systems and other compute environments. If power is lost the liquid nitrogen will eventually boil off and data will be lost. The Rambus researchers indicated that those concerns would have to wait for a later date, and that the current effort is simply to make cryogenic computing feasible and economical.
This is certainly a technology to watch. Hyperscale data centers may find it worthwhile to spend 100 times as much on a datacenter if they can cut their power consumption by a factor of 10,000, to 0.01% of today’s level.
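Whether that trade pays off depends on how a datacenter's power bill compares with its capital cost. The sketch below works through the arithmetic with entirely hypothetical dollar figures (nothing here comes from Rambus or Microsoft), just to show how the 100x-cost / 10,000x-efficiency trade could be evaluated:

```python
def payback_years(extra_capex, annual_power_cost, power_reduction):
    """Years for power savings to recoup the added capital cost.
    All inputs are hypothetical illustrations, not vendor figures."""
    annual_savings = annual_power_cost * (1.0 - 1.0 / power_reduction)
    return extra_capex / annual_savings

# Hypothetical: a $10M system that costs 100x ($990M extra) while
# cutting a $50M/year power bill by a factor of 10,000:
print(f"Payback in about {payback_years(990e6, 50e6, 10_000):.1f} years")
```

At a 10,000x reduction the residual power bill is negligible, so the annual savings are essentially the entire power budget; the payback period then comes down to how large that budget is relative to the extra capital outlay.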