Advanced Computing in the Age of AI | Friday, March 29, 2024

Green Revolution Powered by (Mineral) Oil 


Datacenters normally use air-conditioning or a liquid cooling system to keep things from overheating. Austin-based Green Revolution Cooling (GRC) is the only one to use mineral oil as the liquid in its system.

The idea came to Christiaan Best, CEO and Technical Founder of GRC, as a friend described to him the cooling systems that were being installed at North American Aerospace Defense Command (NORAD).

"Doing submersion in liquid was not something we came up with," Best told the Texas Advanced Computing Center. "People have been submerging electronics for 50 years. But now, people who run datacenters are starting to build dedicated buildings just to move the air through computers. Where does this madness end?"

A standard datacenter is full of rack servers, with enough space between them to allow for airflow, and the room is positively pressurized to keep out dust. These datacenters generate so much heat that half the cost of building a new one goes to the specialized infrastructure required to air-cool the servers.

GRC's liquid cooling system, known as CarnotJet, has one thing in common with other liquid cooling systems: all of them use a liquid, which is far better than air at picking up heat and carrying it out of the datacenter. What differs is the liquid itself, which can range from water to exotic chemicals, and how that liquid is kept away from the electronics to avoid damage.

There is no single, universally accepted method of cooling a datacenter with liquid. Some datacenters use water only to carry the heat generated inside to the outdoors. Another approach is to chill the water and move the heat through pipes and out of the datacenter.

Seawater is also used to cool datacenters. Pumping seawater through a datacenter's HVAC system cools the air circulating within the facility, lowering the inside temperature, and it eliminates the need to chill the water.

A more targeted approach involves supplying cool water to the rack or cabinet. In the case of enclosed cabinets, only the space surrounding the servers need be cooled; the remainder of the room is largely unexposed to the heat produced by the IT equipment. This approach is similar to whole-room cooling with computer room air handlers (CRAHs), except that only small spaces (the interiors of the cabinets) are cooled.
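The physics behind water's effectiveness at the rack level is simple: the heat a coolant stream can carry is the product of its mass flow, specific heat, and temperature rise (Q = ṁ·cp·ΔT). The sketch below is a back-of-the-envelope estimate for a hypothetical 30 kW rack with an assumed 10-degree water temperature rise; none of the figures come from the article.

```python
# Back-of-the-envelope coolant flow estimate: Q = m_dot * c_p * delta_T
# All figures below are illustrative assumptions, not values from the article.

RACK_LOAD_W = 30_000           # assumed IT load of one rack, in watts
WATER_CP_J_PER_KG_K = 4186     # specific heat of water
WATER_DENSITY_KG_PER_M3 = 997  # density of water near room temperature
DELTA_T_K = 10                 # assumed coolant temperature rise across the rack

# Mass flow needed to carry the heat away: m_dot = Q / (c_p * delta_T)
mass_flow_kg_s = RACK_LOAD_W / (WATER_CP_J_PER_KG_K * DELTA_T_K)

# Convert to litres per minute for a more familiar plumbing figure
volume_flow_l_min = mass_flow_kg_s / WATER_DENSITY_KG_PER_M3 * 1000 * 60

print(f"Mass flow:   {mass_flow_kg_s:.2f} kg/s")
print(f"Volume flow: {volume_flow_l_min:.1f} L/min")
# Roughly 0.72 kg/s, or about 43 L/min, for a 30 kW rack at a 10 K rise
```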

Cooling can be targeted even more directly by integrating the liquid system into the servers themselves. For instance, Asetek provides a system in which the CPU is cooled by a self-contained liquid apparatus inside the server. Essentially, it's a miniature version of the larger datacenter cooling system, except the interior of the server is the "inside" and the exterior is the "environment." Of course, the heat must still be removed from the rack or cabinet, which can be handled by any of the systems above.

Since mineral oil is a poor conductor of electricity, servers can be submerged in a tank of the stuff without shorting out. The coolant also has about 1,200 times the heat capacity of air, can cool up to 100 kW per 42U rack, and doesn't evaporate. The mineral oil itself is non-toxic, available anywhere, and used in a variety of industries, including medicine and food. The only requirements for a mineral oil cooling system are tanks, a pump for the oil, and a heat exchanger, whereas most liquid cooling systems require custom plumbing.
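The 1,200-times figure is roughly what a volumetric heat-capacity comparison gives. The sketch below uses typical textbook property values for mineral oil and air (assumptions, not GRC data) to show where a ratio of that order comes from.

```python
# Rough check of the "~1,200x the heat capacity of air" claim on a volumetric basis.
# Property values are typical textbook figures and are assumptions, not GRC data.

# Mineral oil (approximate values at room temperature)
OIL_DENSITY_KG_M3 = 850
OIL_CP_J_KG_K = 1900

# Air at roughly 25 C and atmospheric pressure
AIR_DENSITY_KG_M3 = 1.18
AIR_CP_J_KG_K = 1005

# Volumetric heat capacity: how much heat a cubic metre absorbs per kelvin
oil_volumetric = OIL_DENSITY_KG_M3 * OIL_CP_J_KG_K   # ~1.6 MJ/(m^3*K)
air_volumetric = AIR_DENSITY_KG_M3 * AIR_CP_J_KG_K   # ~1.2 kJ/(m^3*K)

print(f"Oil:   {oil_volumetric / 1e6:.2f} MJ/(m^3*K)")
print(f"Air:   {air_volumetric / 1e3:.2f} kJ/(m^3*K)")
print(f"Ratio: {oil_volumetric / air_volumetric:,.0f}x")  # on the order of 1,300x
```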

"We're essentially replacing the 'managed air flow' infrastructure of today's datacenters (including state of the art datacenters) with 'managed coolant flow," said David Banys, Director of Marketing for GRC. "Compared to other liquid cooling solutions, we're much, much less expensive because of the simplicity of our cooling system and the ability to use commodity servers from any OEM."

According to GRC's Andy Price, their cooling system can cut the initial cost of building a datacenter in half, since the need for air conditioning and specialized construction is eliminated. With the servers sitting in tanks of mineral oil, a datacenter can be located in a dirty loading dock, as it is at the Texas Advanced Computing Center (TACC). There, Green Revolution's coolant has shown a 40 percent reduction in total power usage, even though the ambient temperature sometimes exceeds 100 degrees Fahrenheit.
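One way to see how removing air conditioning and server fans could add up to a roughly 40 percent reduction in total power is a short PUE (power usage effectiveness) calculation. All of the figures in the sketch below are illustrative assumptions, not measurements from TACC or GRC.

```python
# Illustrative arithmetic for a ~40% total-power reduction with immersion cooling.
# PUE = total facility power / IT power. All numbers here are assumptions,
# not measurements from TACC or GRC.

IT_POWER_KW = 1000       # assumed IT load of a small air-cooled datacenter
AIR_COOLED_PUE = 1.6     # typical PUE for a conventionally air-cooled facility
IMMERSION_PUE = 1.05     # near-unity PUE once chillers and CRAHs are gone
FAN_FRACTION = 0.10      # assumed share of server power spent on internal fans

# Baseline: air-cooled facility
air_total_kw = IT_POWER_KW * AIR_COOLED_PUE

# Immersion: server fans are removed, shrinking the IT load itself,
# and the cooling overhead nearly disappears
immersion_it_kw = IT_POWER_KW * (1 - FAN_FRACTION)
immersion_total_kw = immersion_it_kw * IMMERSION_PUE

saving = 1 - immersion_total_kw / air_total_kw
print(f"Air-cooled total: {air_total_kw:.0f} kW")
print(f"Immersion total:  {immersion_total_kw:.0f} kW")
print(f"Reduction:        {saving:.0%}")  # ~41% with these assumed figures
```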

"The point is all we really need is a flat floor, a roof over our head and electrical utilities," Price said.

Companies have been working to create a "datacenter in a box" that can be placed in a shipping container and easily transported. Reportedly, GRC is the first company able to use standard-sized shipping containers.

"You can get them anywhere in the world," Price said. "We're just cutting them up for access and plumbing and to pick up the floor, to bolt down our racks."

EnterpriseAI