Advanced Computing in the Age of AI | Saturday, February 24, 2024

ExxonMobil, NCSA, Cray Scale Reservoir Simulation to 700,000+ Processors 


In a scaling breakthrough for oil and gas discovery, ExxonMobil geoscientists report they have harnessed the power of 717,000 processors – the equivalent of 22,000 32-processor computers – to run complex oil and gas reservoir simulation models.

This is more than four times the previous number of processors used in energy exploration HPC implementations, according to ExxonMobil, which worked with the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign and its Cray XE6 “Blue Waters” petascale system.

“Reservoir simulation has long been seen as difficult to scale beyond a few thousand processors,” John D. Kuzan, manager, reservoir function for ExxonMobil Upstream Research Company, told EnterpriseTech, “and even then, ‘scale’ might mean (at best on a simple problem) ~50 percent efficiency. The ability to scale to 700,000-plus is remarkable – and gives us confidence that in the day-to-day use of this capability we will be efficient at a few thousand processors for a given simulation run (knowing that on any given day in the corporation there are many simulations being run).”
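Kuzan's "~50 percent efficiency" figure refers to strong-scaling parallel efficiency: the speedup over a serial run divided by the number of processors used. A minimal sketch of that arithmetic (the timing numbers below are illustrative, not from the ExxonMobil runs):

```python
def parallel_efficiency(t_serial, t_parallel, n_procs):
    """Strong-scaling efficiency: achieved speedup divided by processor count.

    1.0 means perfect linear scaling; 0.5 matches the ~50 percent
    figure quoted for historical reservoir-simulation runs.
    """
    speedup = t_serial / t_parallel
    return speedup / n_procs

# Illustrative numbers: a job that would take 1,000 hours on one processor
# and finishes in 0.5 hours on 4,000 processors has run 2,000x faster,
# for an efficiency of 2000 / 4000 = 50%.
eff = parallel_efficiency(1000.0, 0.5, 4000)
print(f"{eff:.0%}")  # 50%
```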


The objective of the scaling effort is to let ExxonMobil geoscientists and engineers make more, and better, drilling decisions by predicting reservoir performance more efficiently. The company said the run produced data output thousands of times faster than typical oil and gas industry reservoir simulations, and that it used the largest processor count yet reported by the energy industry.

ExxonMobil’s scientists, who have worked with the NCSA on various projects since 2008, began work on the “half million” challenge – i.e., scaling reservoir simulations past half a million processors – in 2015. NCSA’s Blue Waters system is one of the most powerful supercomputers in the world. Scientists and engineers use the system on a range of engineering and scientific problems. It uses hundreds of thousands of computational cores to achieve peak performance of more than 13 quadrillion calculations per second and has more than 1.5 PB of memory, 25 PB of disk storage and 500 PB of tape storage.

The reservoir simulation benchmark ran a series of multi-million- to billion-cell models on Blue Waters using hundreds of thousands of processors simultaneously. The project required optimization of all aspects of the reservoir simulator, from input/output to communications across hundreds of thousands of processors.
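Runs like these typically rely on domain decomposition: the grid is partitioned so each processor owns a contiguous block of cells and communicates only with its neighbors. The sketch below (my own illustration, not ExxonMobil's code) shows one common way to split a cell count evenly across ranks, and why a billion-cell model still leaves each of 700,000 processors with only on the order of a thousand cells:

```python
def partition_1d(n_cells, n_procs, rank):
    """Return the half-open [start, stop) cell range owned by `rank`.

    Distributes the remainder one extra cell at a time to the lowest
    ranks, so block sizes differ by at most one cell.
    """
    base, extra = divmod(n_cells, n_procs)
    start = rank * base + min(rank, extra)
    stop = start + base + (1 if rank < extra else 0)
    return start, stop

# A 1-billion-cell model spread over 700,000 processors leaves each
# rank with roughly 1,400 cells -- small enough that communication
# and I/O, not arithmetic, dominate the optimization effort.
start, stop = partition_1d(1_000_000_000, 700_000, rank=0)
print(stop - start)
```

At that granularity, per-rank compute shrinks while halo exchanges and collective I/O do not, which is consistent with the article's note that every aspect of the simulator, from input/output to communications, had to be optimized.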

“The partnership with NCSA was important because we had the opportunity to use ‘all’ of Blue Waters,” said Kuzan, “and when trying to use the full capability/capacity of a machine the logistics can be a challenge. It means not having the machine for some other project (even if it is for only a few minutes per run). The NCSA was willing to accommodate this and worked very hard not to disrupt others using the machine.”

The simulations were run on a proprietary ExxonMobil application, one that Kuzan said has not yet been named but is referred to as the “integrated reservoir modeling and simulation platform.”

Reservoir simulation studies are used to guide decisions, such as well placement, the design of facilities and development of operational strategies, to minimize financial and environmental costs. To accurately model the flow of oil, water, and natural gas through the reservoir, simulation software must solve a number of complex equations. Current reservoir management practices in the oil and gas industry are often hampered by the slow speed of reservoir simulation.
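At the core of those equations is pressure diffusion through porous rock, typically advanced implicitly, which turns each time step into a large sparse linear solve. A deliberately tiny illustration of that pattern (a single-phase 1D toy problem of my own construction, not ExxonMobil's formulation):

```python
import numpy as np

def pressure_step(p, alpha):
    """One backward-Euler step of dp/dt = alpha * d2p/dx2 on a 1D grid
    with no-flow boundaries, solved as a dense system for clarity.

    Production simulators solve the same kind of system, but sparse,
    nonlinear, multiphase, and with billions of unknowns.
    """
    n = len(p)
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = 1.0 + 2.0 * alpha
        if i > 0:
            A[i, i - 1] = -alpha
        if i < n - 1:
            A[i, i + 1] = -alpha
    # No-flow (Neumann) boundaries: mirror the missing outside neighbor.
    A[0, 0] -= alpha
    A[-1, -1] -= alpha
    return np.linalg.solve(A, p)

# A pressure spike relaxes toward its mean; the total is conserved
# (no flow in or out), as the no-flow boundaries require.
p0 = np.array([10.0, 0.0, 0.0, 0.0])
p1 = pressure_step(p0, alpha=0.5)
print(p1.sum())  # ~10.0, same total as p0
```

Each step of a real field model involves millions to billions of coupled unknowns rather than four, which is why the linear solver and its parallel communication pattern dominate run time and why scaling it to 700,000-plus processors is notable.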

“NCSA’s Blue Waters sustained petascale system, which has benefited the open science community so tremendously, is also helping industry break through barriers in massively parallel computing,” said Bill Gropp, NCSA’s acting director.