Data Center Power and Cooling: What’s Hot Sponsored Content by Dell EMC
The baseball legend Yogi Berra once quipped, “It ain’t the heat. It’s the humility.” He could have been describing today’s data centers. Heating and cooling challenges are humbling a great many infrastructure managers as the industry faces a reckoning over the limits of air cooling.
The Heat Conundrum
We’re not at a complete meltdown yet, but it’s coming. Data centers face an interlocking set of limitations with power and cooling. Servers consume vast amounts of power and generate an abundance of heat, which then requires even more power to remove with air conditioning. At the same time, businesses are straining to build new data center facilities and fill existing ones to the brim. The push for higher processing speeds and hardware density concentrates more heat in smaller spaces, which further ramps up cooling costs.
Space and power are limited. Yet the Internet of Things (IoT), artificial intelligence (AI), machine learning and high-performance computing (HPC) are forcing data centers to max out power and cooling capacity. For instance, processors rated above 165 watts, which are common in HPC, are almost impossible to cool with air. Making the challenge more urgent, processor-based software licensing creates pressure to “scale up” servers with more compute power versus “scaling out” with more boxes. The result is a stubbornly high Power Usage Effectiveness (PUE), the ratio of total facility power to the power consumed by the IT equipment itself.
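The PUE metric is simple arithmetic: divide everything the facility draws by what actually reaches the IT gear. A minimal sketch, with illustrative (not Dell EMC-sourced) load figures:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by the
    power consumed by IT equipment alone. 1.0 is the theoretical ideal;
    the gap above 1.0 is overhead such as cooling and power delivery."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical example: a 1,000 kW facility draw, of which 550 kW
# reaches servers, storage and networking; the rest goes largely
# to cooling.
print(round(pue(1000, 550), 2))  # → 1.82
```

Cutting cooling power lowers the numerator directly, which is why cooling efficiency dominates most PUE-reduction efforts.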
Working smarter does help. For example, Google uses its DeepMind AI to cut the energy spent cooling its data centers by as much as 40 percent. Facebook is building data centers in cooler climates where it can cool server racks with outside air. However, it’s clear that a new approach to cooling and power management is needed in enterprise IT.
Déjà Vu All Over Again
The new approach to cooling might have Yogi Berra remarking, “It’s like déjà vu all over again.” Liquid cooling, a staple of the “big iron” mainframe era, is making an impressive comeback. The technology is different now, though the underlying mechanics are the same.
Liquid cooling is inherently superior to air cooling. The properties of liquid as a heat conductor make it significantly better at removing heat from a processor than air, while using a fraction of the electrical power. “If it were purely a matter of physics, you would never use air to cool a server,” said Geoff Lyon, CEO and CTO of CoolIT Systems, a Dell EMC partner and pioneer of cooling innovation.
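Lyon’s point is grounded in basic thermodynamics: water stores and moves vastly more heat per unit volume than air. A back-of-envelope comparison using standard textbook properties at roughly room temperature (the figures below are generic physical constants, not vendor data):

```python
# Approximate properties near 25 °C
WATER_DENSITY = 997.0        # kg/m^3
WATER_SPECIFIC_HEAT = 4186   # J/(kg*K)
AIR_DENSITY = 1.2            # kg/m^3
AIR_SPECIFIC_HEAT = 1005     # J/(kg*K)

# Volumetric heat capacity: energy absorbed per cubic meter of
# coolant for each degree of temperature rise.
water_vhc = WATER_DENSITY * WATER_SPECIFIC_HEAT  # ~4.17e6 J/(m^3*K)
air_vhc = AIR_DENSITY * AIR_SPECIFIC_HEAT        # ~1.2e3 J/(m^3*K)

print(f"Water carries ~{water_vhc / air_vhc:,.0f}x more heat "
      f"per unit volume than air")
```

The ratio works out to roughly 3,500:1, which is why a thin tube of water can do the job of a massive volume of forced air.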
CoolIT uses the exceptional thermal conductivity of liquid to provide dense, concentrated cooling. With its microchannel architecture, CoolIT maximizes coolant flow, directing the coolest liquid to the hottest area of the processor first. Cold plates as small as 2.4 mm in height are easily integrated into extremely compact, low-profile blade architectures, providing optimal performance.
Lyon’s success shows that liquid cooling is gaining traction in the cooling business, though obstacles to adoption are plentiful. The biggest objection to liquid cooling is what Lyon refers to as “hydrophobia,” a concern that water will leak and ruin the hardware. CoolIT reduces the risk of a leak (and attendant anxiety) through a closed loop system with liquid sealed in specially engineered tubing.
Fear of water in the data center may be overblown, though, according to Jim Hearnden, a Data Centre Power & Cooling specialist at Dell EMC. Having worked with liquid cooling systems in mainframes since 1979, Hearnden likes to ask those balking at liquid cooling a simple question: “Do you think there are pipes in the ceiling and walls around your data center?”
Unless the data center is purpose-built and state-of-the-art, the answer is invariably “yes.” There are bathrooms and kitchens above many data centers. A common plumbing leak could cause far more damage to infrastructure than a liquid cooling failure. With this awareness, many IT managers grow more accepting of liquid cooling. The advent of virtualization has also made previously complex strategies like hot sites and seamless failover much easier and cheaper—further reducing the risk of service level impact from a failure in a liquid cooling solution.
Dell PowerEdge – Innovations in Cooling and Energy
Power efficiency and low heat put the resources and innovations of Dell PowerEdge at the forefront of power and cooling trends. The high-performing Dell PowerEdge C6420 is literally one of the coolest servers on the market. The PowerEdge C6420 features four server nodes in a 2U form factor with the latest Intel® Xeon® Scalable Processors inside. With CoolIT Systems’ rack-based Direct Contact Liquid Cooling technology (Rack DCLC), it can run high-wattage processors for increased performance. This provides measurable energy efficiency and rack-level density for data centers filled to capacity.
While both Intel® and Dell EMC continually strive to achieve more performance while using less energy, liquid cooling offers a way out of the heat and power conundrum. It makes it possible to increase computing performance and density: more, faster machines fit in the same rack while requiring less power for cooling. This outcome is great for the IT department and the bottom line. It also helps solve some bigger-picture problems.
Businesses, research centers and governments are expected (or required) to reduce their energy consumption and overall carbon footprints. In IT, the response has been the growing popularity of “Green” data centers, especially in Europe. Forward-looking enterprises are starting to convert to greener processes and use electricity generated from sustainable sources such as wind power and hydro.
Some Dell EMC customers are even starting to recycle the heat generated by their data centers for other uses. In cold regions like Scandinavia, hot liquid from data centers is being used to heat buildings. This ingenious process saves energy costs beyond the data center. The Dell EMC and CoolIT solutions facilitate this practice by allowing coolant to enter the loop already warm (around 100°F) and exit hot enough (around 150°F) to feed building heating systems.
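How much heat is actually available for reuse? A rough sketch using the inlet and outlet temperatures cited above and the standard relation Q = ṁ·c·ΔT; the coolant flow rate here is an illustrative assumption, not a figure from Dell EMC or CoolIT:

```python
WATER_SPECIFIC_HEAT = 4186  # J/(kg*K)

def fahrenheit_delta_to_kelvin(delta_f: float) -> float:
    """A temperature *difference* in °F converts at 5/9 K per °F."""
    return delta_f * 5.0 / 9.0

def recoverable_heat_kw(flow_kg_per_s: float,
                        t_in_f: float, t_out_f: float) -> float:
    """Q = m_dot * c * dT: the rate at which heat leaves the loop."""
    delta_k = fahrenheit_delta_to_kelvin(t_out_f - t_in_f)
    return flow_kg_per_s * WATER_SPECIFIC_HEAT * delta_k / 1000.0

# Coolant entering at 100°F and leaving at 150°F, at an assumed
# flow of 0.5 kg/s:
print(f"{recoverable_heat_kw(0.5, 100, 150):.1f} kW")  # → 58.1 kW
```

Even at this modest assumed flow, tens of kilowatts of otherwise wasted heat become available to warm nearby buildings.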
Taking the Next Step
The decision to move to liquid cooling will vary by organization, since each has its own unique needs. Not every data center is ready for liquid cooling, and not all infrastructures face the same heat and power constraints. However, the trend is clear. The quest for processing speed and scale makes the dynamics of heating and cooling an issue that will have to be confronted sooner rather than later. Dell EMC can help customers implement liquid cooling incrementally. With this approach, it is possible to learn where the technology is the best fit.
To learn more about Dell EMC liquid cooled servers, visit dellemc.com/servers.