Microsoft has been exploring innovative ways to cool data center servers for a few years now. The company previously made waves by cooling an offshore data center with seawater in its Project Natick experiment. Now it is showing off a two-phase liquid cooling solution that it says enables even higher server densities.
The new system uses a non-conductive coolant. Microsoft does not identify it exactly, but it sounds like 3M's Novec 1230, given its very low boiling point of around 122F (Novec 1230 boils at 120.6F). The boiling coolant forms a cloud of vapor, which rises until it contacts a cooled condenser in the tank lid. The condensed fluid then rains back down onto the server chassis, closing the loop and supplying the systems with freshly cooled coolant. Heat is also transferred from the server tank to a dry cooler outside the cabinet and dissipated there. Immersion cooling works because direct contact with a non-conductive liquid provides far better heat transfer than a conventional air cooler.
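To see why boiling is such an effective heat-removal mechanism, a back-of-envelope calculation helps. The sketch below is illustrative only: the latent heat figure of roughly 88 kJ/kg is an assumed, Novec 1230-like value, not a number Microsoft has published for its coolant.

```python
# Back-of-envelope: how much coolant must vaporize to carry away server heat.
# LATENT_HEAT_J_PER_KG is an assumed Novec 1230-like value (~88 kJ/kg);
# the real fluid and its properties are not disclosed by Microsoft.

LATENT_HEAT_J_PER_KG = 88_000.0

def boil_off_rate_kg_per_s(server_power_w: float) -> float:
    """Mass of coolant vaporized per second to absorb server_power_w of heat."""
    return server_power_w / LATENT_HEAT_J_PER_KG

# A single 700 W GPU vaporizes roughly 8 grams of coolant per second,
# all of which condenses on the lid and rains back down -- no pumps needed.
print(f"{boil_off_rate_kg_per_s(700) * 1000:.1f} g/s")
```

Because every gram that boils off absorbs the full latent heat at a constant temperature, the fluid never has to warm up the way air or single-phase liquid does, which is the core advantage of a two-phase loop.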
“We are the first cloud vendor to run two-phase immersion cooling in a production environment,” said Husam Alissa, a senior hardware engineer on Microsoft’s advanced data center development team in Redmond, Washington.
Microsoft’s blog post describes the growth of immersion cooling as a good thing, and highlights the fact that it can reduce a server’s power consumption by 5-15 percent. The company also notes that immersion cooling makes it possible to direct burst workloads to specific servers, because those servers can be overclocked to deliver requests faster.
Microsoft’s Project Natick experiment showed that filling a sealed data center module with nitrogen and dropping it into the ocean can be quite useful: the submerged servers suffered one-eighth the failure rate of identical servers on land. The lack of humidity and oxygen is thought to be responsible for the superior reliability under water, and a sealed immersion tank should offer similar benefits. The company envisions distributed data centers with low latency, high performance, and minimal maintenance needs if this liquid cooling system proves sustainable.
Microsoft’s blog post claims that immersion cooling allows data centers to follow a separate “Moore’s Law” because the shift will reduce power consumption and enable increased server density, but this seems like a reach. The reason companies are considering techniques such as immersion cooling is that CPUs and GPUs now struggle to deliver higher performance without drawing ever-increasing amounts of power. CPUs can now hit 300W at the socket, while data center GPUs scale up to 700W. CPUs continue to become more efficient, but rising core counts and additional on-die functionality increase their absolute power consumption even with those gains.
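The density argument is easy to illustrate with the per-part figures above. The server and rack configurations in this sketch are hypothetical, chosen only to show how quickly modern parts outrun air cooling; the 300 W and 700 W numbers come from the text.

```python
# Illustrative rack-power arithmetic. Per-part figures (300 W CPU, 700 W GPU)
# are from the article; server/rack counts and the 400 W overhead allowance
# are hypothetical assumptions for the sake of the example.

CPU_W, GPU_W = 300, 700

def server_power_w(cpus: int, gpus: int, overhead_w: float = 400.0) -> float:
    """Rough board power: CPUs + GPUs plus a flat allowance for RAM, NICs, fans."""
    return cpus * CPU_W + gpus * GPU_W + overhead_w

def rack_power_kw(servers: int, cpus: int = 2, gpus: int = 4) -> float:
    """Total rack heat load in kilowatts."""
    return servers * server_power_w(cpus, gpus) / 1000.0

# Just ten 2-CPU, 4-GPU servers put 38 kW of heat in one rack -- well beyond
# what a typical air-cooled rack (often in the 10-15 kW range) can dissipate.
print(f"{rack_power_kw(10):.0f} kW")
```

Every watt of that load ends up as heat in the room, which is why density-focused operators are turning to liquids long before any single chip becomes impossible to cool on its own.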
An interesting question is whether this type of cooling will ever reach the consumer market. Technologies that debut in data centers often scale down to personal computing over time, but building an affordable aftermarket immersion kit for home users is a tall order. This type of cooling solution will never be cheap, but there may be a market for it in boutique gaming PCs and high-end workstations.
Feature image by Gene Twedt for Microsoft.