
Oil Exploration at Petascale 


We’ve been reading about it, researching it and discussing it, maybe even rolling out small implementations, for years. Now growing numbers of enterprises across every vertical are adopting data analytics and artificial intelligence technologies as essential tools for short- and long-term business success. Business requirements to grow revenue while keeping costs low, combined with the need to leverage data efficiently as a competitive advantage, are driving the trend toward digitalization.

While many external sources and industry publications focus primarily on the resulting increase in compute power, an interesting pattern is emerging across AI, deep learning and high-performance computing (HPC).

Successful environments embrace compute solutions that are tightly integrated with storage infrastructure. These IT organizations often attempt to leverage a combination of commodity-based hardware and storage components to achieve the greatest gains. The prospect of cost savings appears to be the biggest driver for this approach, but many in the industry are realizing that an appliance-based approach (still leveraging commodity components) is the most expedient and effective way to go.

The supporting storage shifts in the oil industry serve as a good example. Even in the face of weak crude prices, the oil industry is seeing signs of recovery and growth, including increased mergers and acquisitions and an uptick in projects that were delayed just a few years ago.

But if we look back at the worst of the oil downturn, when capital spending remained under extreme pressure, investment in HPC technology stayed steady. Why? CIOs and CTOs told us that data center managers were being asked to deliver better results for less money without sacrificing organizational agility.

To have the biggest impact on profitability and drive more efficient oil exploration, they needed to increase the effectiveness of seismic data interpretation. To do that, they turned to HPC technology.

For oil companies, a better understanding of subsurface structures directly translates to reduced exploration risk and cost. To get the best subsurface view, oil companies rely on high-fidelity simulations and modeling. Beyond the use of higher-fidelity images, the discernible factors in faster discovery are the mathematical algorithms applied to the data, the amount of data modeled, and the speed at which it can be modeled.

To do this competitively, oil companies have two options:

  • Improve the mathematical algorithms, which takes considerable time, or
  • Expand the infrastructure to increase data processing size and speed, which can be achieved in just a few months or even weeks.

Improving HPC infrastructure is the fastest route to gaining competitive advantage in oil exploration. The goal is to analyze and re-analyze the largest possible amount of data in the least amount of time. In addition, emerging storage technologies, such as Non-Volatile Memory Express (NVMe), can be strategically applied to directly improve application performance.
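To make that concrete, here is a rough back-of-envelope sketch of how aggregate storage throughput bounds the time needed to stream a seismic survey through another round of processing. All figures and names in the snippet are illustrative assumptions, not measured benchmarks or vendor numbers.

    # Illustrative sketch (assumed figures): how aggregate storage throughput
    # bounds the wall-clock time of one full pass over a seismic survey.

    def reprocess_hours(survey_tb: float, throughput_gbs: float) -> float:
        """Lower bound on hours to stream survey_tb terabytes through
        storage that delivers throughput_gbs gigabytes per second."""
        seconds = (survey_tb * 1000) / throughput_gbs  # TB -> GB
        return seconds / 3600

    survey_tb = 500        # assumed survey slice size, in terabytes
    disk_array_gbs = 20    # assumed aggregate throughput of a disk array
    nvme_tier_gbs = 200    # assumed aggregate throughput of an NVMe flash tier

    print(f"Disk array: {reprocess_hours(survey_tb, disk_array_gbs):.1f} hours per pass")
    print(f"NVMe tier:  {reprocess_hours(survey_tb, nvme_tier_gbs):.1f} hours per pass")

Under these assumed numbers, each reprocessing pass drops from roughly seven hours to well under one, which is the kind of turnaround gain that lets teams re-analyze data far more often.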

There are a number of petascale systems in existence in the oil and gas industry. Competition and the need to maximize the return on exploration activity are frequently the business drivers for these petascale operators. In fact, it’s common for the industry to see high-performance storage capacity growing at 120 to 500 percent year-over-year.

Storage is enabling oil and gas companies to keep tens of thousands of compute cores running at an extremely high rate of performance. The key metric is efficiency: how well the storage infrastructure is matched to the performance the applications require. With the extensive investment required for AI, deep learning and HPC GPU-based compute platforms, keeping them saturated is essential for timely success. Traditional storage technologies must be re-examined and evaluated for their appropriateness.
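One hedged way to frame that matching exercise is a simple sizing estimate, sketched below with hypothetical figures for GPU count, per-GPU data ingest rate and delivered storage efficiency; real sizing would start from measured application I/O profiles.

    # Hypothetical sizing sketch (all numbers are assumptions): estimate the
    # aggregate read bandwidth storage must sustain to keep a GPU cluster fed.

    def required_read_bw_gbs(num_gpus: int, per_gpu_ingest_gbs: float,
                             efficiency: float = 0.8) -> float:
        """Aggregate GB/s needed so num_gpus are not left waiting on I/O,
        derated by the fraction of peak bandwidth storage actually delivers."""
        return num_gpus * per_gpu_ingest_gbs / efficiency

    # Assumed cluster: 64 GPUs, each consuming about 3 GB/s of data.
    print(f"{required_read_bw_gbs(64, 3.0):.0f} GB/s aggregate read bandwidth required")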

Major oil companies with petascale-plus compute systems aren’t the only ones building a competitive advantage on this kind of infrastructure. Mid-sized producers and independent operators are exhibiting similar trends and adopting high-performance storage. These companies are finding that storage can be optimized to balance application performance requirements with capacity demands, without necessitating a huge storage environment or an enormous investment.

Machine learning and artificial intelligence technologies will increasingly play a vital role in the future of the oil industry. As the dynamics of oil exploration change, the ability to capture, manage and analyze the massive volumes of data needed for accurate predictive modeling and optimized exploration is a key competitive advantage.

Storage solutions designed for artificial intelligence enhance precision exploration results by using flash storage, parallel data processing, GPUs and intelligent software to accelerate data analysis. By parallelizing I/O and applying new levels of flash performance to remove the bottlenecks of the past, companies can truly maximize the performance of their exploration initiatives.
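A minimal sketch of the parallel-I/O idea appears below, assuming a hypothetical seismic volume file and a simple fixed chunk size; a production pipeline would typically go through a parallel file system client or an I/O library rather than raw file reads.

    # Minimal parallel-read sketch (file path and chunk size are hypothetical):
    # issue many concurrent reads against one large volume instead of a single
    # serial stream, so fast flash and parallel file systems can be driven
    # closer to their aggregate bandwidth.

    import os
    from concurrent.futures import ThreadPoolExecutor

    CHUNK_BYTES = 64 * 1024 * 1024  # assumed 64 MiB per read request

    def read_chunk(path: str, offset: int, length: int) -> bytes:
        """Read one chunk at a byte offset; each worker opens its own handle."""
        with open(path, "rb") as f:
            f.seek(offset)
            return f.read(length)

    def parallel_read(path: str, workers: int = 16) -> int:
        """Read the whole file with `workers` concurrent readers; returns bytes read."""
        size = os.path.getsize(path)
        offsets = range(0, size, CHUNK_BYTES)
        with ThreadPoolExecutor(max_workers=workers) as pool:
            futures = [pool.submit(read_chunk, path, off, CHUNK_BYTES) for off in offsets]
            return sum(len(f.result()) for f in futures)

    # Example usage (hypothetical file name): parallel_read("survey_volume.segy")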

With oil price volatility, the industry must look for ways to increase operational efficiency and the success of oil exploration. While seismic data interpretation remains the largest differentiator, investment in HPC and artificial intelligence, as well as the high-performance storage to support them, is poised to play an increasingly vital role in future oil exploration innovation and advancement.

Kurt Kuckein is director of marketing, DDN Storage.
