
Intel’s Bryant Outlines Drive to Exascale 

It was a celebratory review of the HPC industry’s achievements to date, a look at the industry’s evolution from its roots in scientific research to providing tools for business insight, and a philosophical discussion of “recursion” – innovation in HPC propelled by machine learning – delivered by Diane Bryant, senior vice president and general manager of Intel's Data Center Group, at Monday night’s plenary session at SC15.

Bryant also showcased a cast of Intel technology all-stars, who touted the company’s drive to exascale based in part on its efforts to perpetuate Moore’s Law, despite the doubts of critics who point to the slowing of peak performance numbers on the twice-yearly Top500 list of the world’s most powerful supercomputers. In fact, Bryant’s presentation included Gordon Moore himself on a recorded phone call from his retirement in Hawaii.

But most of all, Bryant highlighted the transformative impact HPC has had on scientific discovery and innovation in business.

“Arguably the last time science was so radically changed was over 400 years ago when Galileo invented the telescope,” Bryant said, “extending the range of visual observation beyond the unaided eye. Scientific computing did something quite similar, extending the range of exploration beyond the intuitive and analytical processing capabilities of the human brain. With HPC, we can now see things that just a decade ago we couldn’t possibly imagine.”

That radical change is driven by relentless enhancements in processing power, and Bryant provided several of those familiar, but nevertheless impressive, comparisons between systems of yore (roughly 20 years ago) and those of today. For example, two 1997-vintage Intel ASCI Red supercomputers combined have the processing power of a single Intel “Knights Landing” Xeon Phi co-processor, which features up to 8 billion transistors. China’s Intel-based Tianhe-2, ranked No. 1 on the Top500 list, has 25,000 times the processing power of ASCI Red, consumes just 20 times more electrical power and costs less than $1,500 per teraflop, compared with more than $13 million per teraflop for ASCI Red.

The result of improved price/performance is the extension of HPC systems into broader spheres, including medium-sized businesses. One example: BMI Corp.’s SmartTruck technology used Oak Ridge National Laboratory’s Jaguar supercomputer to improve the aerodynamics of 18-wheeler trucks, resulting in $19 billion in annual fuel cost savings.

Intel's Diane Bryant talks exascale at SC15.

While the democratization of HPC is clear, Bryant said, HPC still has a long way to go. She cited studies showing that only 10 percent of all servers worldwide run in HPC clusters, and that fewer than 10 percent of U.S. manufacturers have adopted HPC technology.

“The HPC market of today, albeit very powerful, is still relatively small,” Bryant said. “The commercial use of HPC is still nascent. The shift from physical to digital for product modeling, prototyping and simulation has yet to reach the tipping point.”

This prompts the question: Why? “The answer is complexity. It’s still too hard to use, too hard to access.”

What’s needed is a catalyst for change, and Bryant predicts the “maker movement,” with more than 135 million enthusiasts and $29 billion in revenue in the U.S. alone, could be a force that brings about far more user-friendliness in HPC technology. Makers, many involved in engineering-oriented pursuits such as electronics, robotics, 3-D printing and the use of CNC tools, are enabled by access to HPC-optimized infrastructure delivered through the cloud, Bryant said, and that access is transforming the way innovators create, collaborate, manufacture and reach their customers.

“The maker community, as a counterculture movement, can transform HPC,” she said. “It will force the installation of that ‘easy button’ to access the computational power through new seamless workflows with expansive and secure collaboration capabilities. The makers I believe will influence and transform the HPC community.”

More significant, according to Bryant, has been the onset of machine learning, arguably the most talked-about application for HPC in recent years. It, in turn, is associated with recursion, which Bryant defined this way: “As HPC transforms the science, the science transforms HPC, and it propels us forward.”

“We are rapidly reaching the tipping point stage where data is the game changer,” she said, “which places HPC in a whole new realm of possibilities. HPC fundamentally is required to train these complex machine learning models. Without HPC those models would remain useless in practice. So HPC has ushered in our ability to predict future events by leveraging intense compute power running on massive datasets. And then in turn HPC is being transformed by machine learning. So the classic HPC approach of simulation-driven science is now being transformed into data model-driven science.”

She cited agriculture, education and transportation as three industries that are evolving from human-led to machine intelligence-led services, with the hyperscale cloud service providers leading the pack in driving broad access to machine learning.

About the author: George Leopold

George Leopold has written about science and technology for more than 30 years, focusing on electronics and aerospace technology. He previously served as executive editor of Electronic Engineering Times. Leopold is the author of "Calculated Risk: The Supersonic Life and Times of Gus Grissom" (Purdue University Press, 2016).
