HPC Myth Busters: The High Cost of Misconceptions 

Incorrect or outdated information about high performance computing may prevent organizations from leveraging these powerful -- and affordable -- systems for competitive advantage.

Misinformation and FUD cost enterprises the chance to take advantage of a growing selection of high performance computers designed specifically to solve today's advanced scale computing challenges. Big data, analytics, and other high-compute applications are changing some technology leaders' minds, while those who have already embraced HPC leverage these powerful solutions for competitive gain.

Overall adoption of these systems is growing fast, with 2018 worldwide sales expected to reach $14.7 billion, according to IDC. Sales of workgroup HPC systems will increase almost 10 percent between 2013 and 2018, and divisional HPC systems are predicted to grow almost 7 percent in that period, the research firm said. Yet some IT professionals still hold onto misconceptions about supercomputers. As a result, their organizations could well invest in the wrong solutions for the jobs at hand, leading to slower results, larger datacenter footprints, or higher compute expenditures.

"The computer and HPC have advanced so much in the past few years yet the attitude toward HPC is still one of [the] last century," said Tim Lynch, CEO of Psychsoftpc, whose high performance products include the Tesla Personal Supercomputer, Psychlone Linux Hadoop Cluster, and Psychlone Linux Cluster.

A growing number of HPC vendors have crafted lower-cost systems specifically designed for business applications such as cloud and big data. Just this week, for example, Dell launched a new mid-tier hyperscale division and plans to release products for this market segment soon. A day later, Intel unveiled its latest software developer kit for HPC and emphasized its integration, accessibility, and multi-tiered pricing. New vendors will enter the space and market leaders will continue partnering to meet enterprises' need for reliable, secure, compute-intensive solutions, experts said. Developers seek to simplify and open up processes and software to make systems more accessible to traditional coders and business users.

"We see ultrascale vendors as driving a generation of less expensive, more energy efficient, and more specialized cluster nodes," wrote Christopher Willard, Addison Snell, Laura Segervall, and Michael Feldman in Intersect360 Research's "Top Six Predictions for HPC in 2015." "This will most likely lead to another scalable architecture cycle in which the HPC market embraces the less expensive, more energy- efficient and space-efficient nodes developed for ultrascale users."

Despite developers' advances, misconceptions persist. Whether because of a lack of communication or education, long-standing bias, or a preference for the status quo, solution providers, progressive IT professionals, and HPC vendors can face resistance when recommending advanced scale computing solutions to overcome business hurdles or seize enterprise opportunities.

The Menu of Misconceptions

Sometimes enterprise IT professionals mistakenly believe HPC systems always cost far more than traditional datacenter solutions, said Per Nyberg, senior director of worldwide business development at Cray, in an interview.

"[People think] it's tens of millions of dollars. That's not true," he said. "We certainly sell systems in that size, but we also sell systems in the hundreds of thousands of dollars, very much what organizations spend in enterprise computing."

Similarly, there's the idea that HPC solutions are overly complex, a concept numerous developers have addressed with upgrades and new software designed to simplify use and open the tools to business users. Intel designed Parallel Studio XE 2016 so data scientists could use the software update easily, James Reinders, chief evangelist of developer products at Intel, told EnterpriseTech. Dell's "usability engineers" work across all product lines, including HPC, to ensure continuity throughout the brand and ease of use, the company said.

"Another misconception is [that] these systems are composed of specialty components that are difficult to use. There are certainly parts of the machine that are custom designed or custom built using commodity components but with the end-user applications in mind," said Cray's Nyberg. "They today use standard operating systems like Linux so they're like running any enterprise cluster."

HPC traditionally has often been associated with supercomputing alone, said Bill Mannel, vice president and general manager of High-Performance Computing and Big Data, Hewlett Packard Enterprise, via email. As a result, CIOs may overlook HPC's benefits and opt for more general-purpose hardware to perform all workloads using the same infrastructure they've maintained for years.

"The are hesitant to move to a new solution that, in many cases, is perceived as complex," said Mannel. "Second, traditional HPC clusters have typically been designed, installed, managed, and utilized by groups outside traditional IT. Even today, the high end of the HPC market continues to be dominated by academic and research use, with staffs having high-end administrative and computer science capabilities. Often the HPC clusters are hidden from the traditional IT management because the users do not want IT to initiate procedures and infrastructure that are viewed as an impediment. This cloak of secrecy causes traditional CIOs to perceive HPC as too complex."

Although scientific research and engineering are traditional bastions of HPC, the technology has much wider appeal, Brian Freed, vice president and general manager of high-performance data analytics at SGI, told EnterpriseTech. Now that most organizations use big data and analytics to succeed – and must adroitly and creatively leverage this data for competitive insight that allows them to thrive – HPC increasingly plays an important role across verticals and geographies, he said.

"Most IT professionals don’t think the advantages of a HPC system would carry over to their organization, but the reality is that the same technologies that allow research organizations to crunch large quantities of scientific data can also be applied to analyzing business data," said Freed. "The true value of analyzing big data sets isn’t coming up with a trend here and there within a small subset of data that you typically see in research. It means looking at all the data and getting a more holistic view of your business. It’s about seeing all the emerging trends that may not be visible when the information is distributed across several disparate data sets."

Traditional IT environments generally don't deliver total analysis, where organizations scrutinize all their information at scale to discover outliers that impact and differentiate their business, he said.

"This 360-degree view of the situation changes how a business operates and how they prepare for those issues down the line," added Freed.

About the author: Alison Diana

Managing editor of Enterprise Technology. I've been covering tech and business for many years, for publications such as InformationWeek, Baseline Magazine, and Florida Today. A native Brit and longtime Yankees fan, I live with my husband, daughter, and two cats on the Space Coast in Florida.
