Advanced Computing in the Age of AI | Tuesday, April 23, 2024

AI Drives Adoption of Accelerated Computing Architectures 
Sponsored Content by Dell Technologies

With the global artificial intelligence (AI) software market growing by 47% in 2022, thousands of IT environments are adopting accelerated computing infrastructure to handle the demands of not only AI, but also machine learning (ML) and high-performance computing (HPC) workloads. They're gravitating toward a variety of server accelerators that provide faster parallel compute performance for the world's most demanding applications.

The Benefits of Real-time Responsiveness

Just as consumers expect near instantaneous responses from virtual assistants like Siri and Alexa that mimic the speed of human thought, businesses expect ultra-high performance from systems that detect credit card fraud, provide network services, recommend products and services, and deliver products to your door. The growing adoption of advanced, data-intensive technologies is greatly increasing the strain on server CPUs.

Enter Dell PowerEdge servers with GPUs for accelerated parallel computing. These electronic circuits are designed for parallel processing, working alongside CPUs to accelerate applications. Because of their architecture, GPUs can process large amounts of data simultaneously; however, applications must be written to take advantage of the acceleration. For example, if all the software code knows how to do is drive on a one-lane road, it won't use the other available lanes. The good news is that this kind of coding is easier than ever, opening applications up to new, faster capabilities.
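As an illustration (a generic sketch, not drawn from Dell documentation), the CUDA example below shows the "multi-lane" idea: instead of one sequential loop, the work is expressed so that thousands of GPU threads each handle one element.

```cuda
// Minimal, hypothetical sketch of GPU-parallel code: a vector add where each
// thread processes one element, instead of a single CPU loop walking the array.
#include <cstdio>
#include <cuda_runtime.h>

// Each thread computes one element of c = a + b.
__global__ void vectorAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        c[i] = a[i] + b[i];
    }
}

int main() {
    const int n = 1 << 20;                 // one million elements
    size_t bytes = n * sizeof(float);

    // Unified memory keeps this sketch short; explicit host/device copies also work.
    float *a, *b, *c;
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // Launch enough thread blocks to cover all n elements in parallel.
    int threadsPerBlock = 256;
    int blocks = (n + threadsPerBlock - 1) / threadsPerBlock;
    vectorAdd<<<blocks, threadsPerBlock>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %.1f\n", c[0]);         // expect 3.0

    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```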

Customer Success Stories

From financial transactions to video editing, medical imaging, fluid simulations, and enterprise applications, companies are using accelerated computing, AI, and ML to handle a wide variety of jobs. Here are some real-world results from organizations around the world.

Hardware Speed Factors

Today’s workloads demand hardware that can keep pace with them. For example, Dell PowerEdge servers are built to take advantage of the latest technological advances, including a wide range of GPUs suited to different types of workloads. Accelerated PowerEdge servers let AI and ML applications leverage a parallel processing environment that runs the compute-intensive, data-plane portions of applications on GPUs, while CPUs run the control-plane code.
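To make the control-plane/data-plane split concrete, here is a hedged CUDA sketch (illustrative only, not a Dell reference design): the CPU makes the lightweight scheduling decisions, while each heavy numerical pass is handed to the GPU.

```cuda
// Illustrative sketch of the split: CPU = control plane (setup, decisions),
// GPU = data plane (bulk arithmetic over large arrays).
#include <cstdio>
#include <cuda_runtime.h>

// Data-plane work: y = alpha * x + y across a large array, one thread per element.
__global__ void saxpy(float alpha, const float *x, float *y, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        y[i] = alpha * x[i] + y[i];
    }
}

int main() {
    const int n = 1 << 22;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 0.0f; }

    int threads = 256;
    int blocks = (n + threads - 1) / threads;

    // Control plane: the CPU decides how many passes to run and with what
    // coefficient, then dispatches each heavy pass to the GPU.
    for (int pass = 0; pass < 4; ++pass) {
        float alpha = 0.5f * (pass + 1);            // lightweight decision logic on the CPU
        saxpy<<<blocks, threads>>>(alpha, x, y, n); // heavy parallel work on the GPU
    }
    cudaDeviceSynchronize();

    printf("y[0] after 4 passes = %.1f\n", y[0]);   // 0.5 + 1.0 + 1.5 + 2.0 = 5.0

    cudaFree(x); cudaFree(y);
    return 0;
}
```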

Another consideration in accelerated computing hardware is heat. State-of-the-art Dell infrastructure designed for high throughput uses thermal designs that address heat-producing components, including front-to-back air-cooled chassis and multi-vector cooling pathways.

A Growing Performance Differential

Comparing chips, GPUs currently beat CPUs roughly ten to one on bandwidth and floating-point operations, according to computational physicist Vincent Natoli. And while that gap is widening, GPUs still don’t function without CPUs; the two need each other and work together.
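For readers who want to check the bandwidth side of that comparison on their own hardware, the short CUDA sketch below queries the GPU's reported memory clock and bus width and computes the theoretical peak. It assumes a CUDA toolkit where cudaDeviceProp still reports memoryClockRate; it is a rough estimate, not a benchmark.

```cuda
// Sketch: estimate a GPU's theoretical peak memory bandwidth from its
// reported memory clock and bus width (assumes these fields are populated).
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, 0);   // query device 0

    // Peak bandwidth = 2 (double data rate) * memory clock * bus width in bytes.
    double peakGBs = 2.0 * prop.memoryClockRate * 1e3   // kHz -> Hz
                     * (prop.memoryBusWidth / 8.0)       // bits -> bytes
                     / 1e9;                              // bytes/s -> GB/s

    printf("GPU: %s\n", prop.name);
    printf("SMs: %d, memory bus: %d-bit\n",
           prop.multiProcessorCount, prop.memoryBusWidth);
    printf("Theoretical peak memory bandwidth: %.0f GB/s\n", peakGBs);
    return 0;
}
```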

According to the IEEE, “Accelerated architectures such as GPUs…have been proven to increase the performance of many algorithms compared to their CPU counterparts and are widely available in local, campus-wide and national infrastructures.” As for business benefits, Gartner analysis projects that by 2025, businesses that combine accelerated computing, integrated datasets, and AI solutions designed to inform decision-making will generate at least three times more value than companies that don’t use those solutions.

For more information on AI in business, read “Winning in the Age of AI – How organizations are accelerating intelligent outcomes everywhere.”


EnterpriseAI