Heat-Trapping and NREL’s Green Datacenter Leadership Effort 

Virtually every new data center planned or built in 2013 is labeled as having some energy-efficient qualities. While that is beneficial to the progress of green computing, it can make it difficult to distinguish the facilities that are actually pushing the envelope on green technologies from those that are simply advertising energy efficiency for the press.

The National Renewable Energy Laboratory (NREL) is in no danger of being accused of the latter, as its $10 million HPC facility seeks to set the standard for green computing worldwide. The data center is part of the lab's new Energy Systems Integration Facility (ESIF), whose HPC systems will be installed in multiple phases – the first began in November of 2012 and will reportedly reach petascale capability by this summer.

"We took an integrated approach to the HPC system, the data center, and the building as part of the ESIF project," said NREL's Computational Science Center Director Steve Hammond. Then NREL worked with HP and Intel to develop computing systems that could operate in higher temperatures.

Further, as Hammond continued to explain, those systems are to be complemented by an HP-developed liquid cooling system:

"First, we wanted an energy-efficient HPC system appropriate for our workload," he said. "This is being supplied by HP and Intel. A new component-level liquid cooling system, developed by HP, will be used to keep computer components within safe operating range, reducing the number of fans in the backs of the racks."

Data centers that find novel cooling methods generally receive the most praise in the green community. Many institutions have found success by placing data centers in locations where outside air can cool the HPC systems, eliminating the need for expensive chilled-water operations; Google's Finnish data center and the Massachusetts Green High Performance Computing Center are two key examples. What ESIF is doing, however, is closer to the approach of the Pawsey Centre in Western Australia, which runs groundwater through its system for cooling purposes.

According to Hammond, however, the ESIF will be the first of its kind. "eBay, Facebook, and others have data centers that are water capable, but there aren't any products on the market now that are providing liquid cooling," Hammond said. "NREL is getting the first product that is direct-component liquid cooled. We're going to show it's possible, efficient, safe, and reliable."

The typical method of air conditioning servers is markedly inefficient in Hammond's view. He compares it to pouring a drink for oneself and relying on the air conditioning system in a house to cool it. "In traditional computer systems, you have a mechanical chiller outside that delivers cold water into the data center, where air-conditioning units blow cold air under a raised floor to try to keep computer components from overheating," Hammond said.

After all, the air in such systems is itself cooled or heated by water: a chiller changes the water's temperature, and heat is exchanged between the liquid and the air, because water is far better than air at absorbing and carrying heat.

As a result, Hammond wonders why it would not be simpler to eliminate the air from the equation and just use water. "Getting the heat directly to liquid rather than through air first and then to liquid is the most efficient way to utilize it," he noted.
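The arithmetic behind that intuition is straightforward. The back-of-envelope sketch below uses textbook material properties and an assumed 10 K coolant temperature rise to compare how much air versus water must flow to carry away roughly one megawatt of heat:

```c
/* Back-of-envelope comparison: volume flow of air vs. water needed to
 * carry away 1 MW of heat with an assumed 10 K coolant temperature rise.
 * Q = rho * cp * V_dot * dT  =>  V_dot = Q / (rho * cp * dT)
 * Material properties are textbook values near room temperature. */
#include <stdio.h>

int main(void) {
    const double Q  = 1.0e6;  /* heat load, W (the ~1 MW HPC system)   */
    const double dT = 10.0;   /* assumed coolant temperature rise, K   */

    const double rho_air = 1.2,    cp_air = 1005.0;  /* kg/m3, J/(kg*K) */
    const double rho_h2o = 1000.0, cp_h2o = 4186.0;

    double v_air = Q / (rho_air * cp_air * dT);
    double v_h2o = Q / (rho_h2o * cp_h2o * dT);

    printf("air:   %8.2f m3/s\n", v_air);   /* ~83 m3/s               */
    printf("water: %8.4f m3/s\n", v_h2o);   /* ~0.024 m3/s (~24 L/s)  */
    printf("ratio: %8.0f x\n", v_air / v_h2o);
    return 0;
}
```

Water comes out several thousand times more compact as a heat carrier, which is why skipping the air stage pays off.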

In another effort to cut down on wasted energy, the facility will use the excess heat from the one megawatt of power required to run the system to warm office spaces as well as outside walkways. In the winter, for example, that heat can be carried by water to the entrances, where it melts ice and snow, making the walkways safer for workers to traverse.

"Most data centers simply throw away the heat generated by the computers," Hammond said. "An important part of the ESIF is that we will capture as much of the heat as possible that is generated by the HPC system in the data center and reuse that as the primary heat source for the ESIF office space and laboratories."

As a result, the facility will be able to cut its operational and energy expenses by about a million dollars per year, a fifth of which is made possible by the recapture of heat.
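As a rough sanity check on those figures, here is an order-of-magnitude sketch. The electricity price, the heat-reuse fraction, and the conventional facility's PUE are assumptions chosen for illustration, not NREL's numbers:

```c
/* Illustrative arithmetic only: none of the rates below come from NREL.
 * Assume a 1 MW IT load running year-round, an assumed electricity
 * price, and an assumed conventional PUE of 1.9 for comparison. */
#include <stdio.h>

int main(void) {
    const double it_load_kw = 1000.0;  /* ~1 MW HPC system       */
    const double hours      = 8760.0;  /* hours per year         */
    const double price      = 0.07;    /* assumed $/kWh          */
    const double pue_old    = 1.9;     /* assumed typical PUE    */
    const double pue_new    = 1.06;    /* ESIF design target     */

    /* Overhead (cooling, power delivery) energy avoided per year. */
    double overhead_kwh = it_load_kw * hours * (pue_old - pue_new);
    printf("overhead avoided: %.1f GWh/yr -> $%.0fk/yr\n",
           overhead_kwh / 1e6, overhead_kwh * price / 1000.0);

    /* Heat reuse: if, say, a quarter of the 1 MW of waste heat
     * displaces purchased heating, that is on the order of $150k/yr
     * at this price, the same ballpark as the article's one-fifth. */
    double reuse_kwh = 0.25 * it_load_kw * hours;
    printf("heat reuse offset: $%.0fk/yr\n", reuse_kwh * price / 1000.0);
    return 0;
}
```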

"We're quite enamored with NREL being able to reuse the heat for the building and parts of the campus," said Stephen Wheat, general manager of high performance computing at Intel. "While others have done this before, here we are looking at a combined total efficiency goal and not just harvesting heat. We're looking to see how this can be the best solution for the entire campus."

Not only does NREL seek to make strides in manipulating environmental conditions for energy efficiency gains, it also looks to its partners at HP and Intel to develop an HPC system that is as efficient as possible.

"The methods of code optimization for Xeon Phi are identical to what one does to make the most of Xeon processors. Finely tuned optimizations for Xeon Phi almost always result in a better-performing source code for Xeon processors," Wheat said in discussing his company's contribution to the center.

Finally, this center, which is projected to come in at a power usage effectiveness (PUE) of 1.06, will be used to research even more ways to advance the green computing movement, such as improving clean technologies like wind and solar photovoltaics. The 1.06 PUE, if achieved, would be the lowest for a petascale system in the world. "NREL isn't one of a kind in what we are doing – but we've set out to be the first of a kind," Hammond concluded. "For us, it just makes sense. Others can follow suit if it makes dollars and sense."
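For readers unfamiliar with the metric, PUE is total facility energy divided by the energy delivered to the computing equipment, so a 1.06 on a roughly one-megawatt system leaves strikingly little overhead, as a quick calculation shows:

```c
/* PUE = total facility energy / IT equipment energy. A quick check of
 * what a 1.06 PUE implies for a ~1 MW system: only ~60 kW spent on
 * everything that is not computation (cooling, power delivery, lights). */
#include <stdio.h>

int main(void) {
    const double it_power_kw = 1000.0;  /* ~1 MW HPC load     */
    const double pue         = 1.06;    /* ESIF design target */

    double total_kw    = it_power_kw * pue;
    double overhead_kw = total_kw - it_power_kw;

    printf("total facility draw: %.0f kW\n", total_kw);    /* 1060 kW */
    printf("non-IT overhead:     %.0f kW\n", overhead_kw); /* ~60 kW  */
    return 0;
}
```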
