GE Looks to Hadoop for the Industrial Internet 

With a name as big as “The Industrial Internet,” it shouldn't come as a surprise that implementing the sensor-filled, waste-reducing, machine-optimizing network will require a great deal of research and development from its incubator, General Electric. While we've already learned about the vast array of sensor technologies and operational dashboards in place at GE's Schenectady battery plant, what's been missing thus far is the powerful analytics that would turn what is currently a reactive system into a predictive tool capable of heading off failures on the factory floor.

This week, the industrial giant announced such a platform at the D: All Things Digital conference in San Francisco, California. The Hadoop-based big data and analytics platform will include expanded partnerships with Accenture and Pivotal, as well as a new partnership with Amazon Web Services, which will provide cloud storage. Together, they will serve as the infrastructure and analytics behind GE’s push for the Industrial Internet.

Thus far the spotlight has fallen on GE's efforts to outfit equipment with the sensors and interconnects necessary to gather and transmit key data from the factory floor and beyond. But with this latest announcement, GE said it will provide "real-time data management, analytics, and machine-to-operations connectivity in a secure, closed-loop architecture so critical global industries can move from a reactive to a predictive industrial operating model."

Specifically, the toolkit is designed to optimize equipment for longevity, energy consumption, and throughput, and to predict when a part must be replaced in order to avoid failure. To accomplish this, GE split the analytics into two roles: asset health and process performance.

Brian Courtney, General Manager of GE Intelligent Platforms' Industrial Data Intelligence Software group, explained that the goal of asset health is to go beyond merely predicting equipment failure 90 minutes ahead of time to instead accurately forecast failures months in advance. For some industries this means significant reductions in downtime and a boost in productivity, but for others such as aerospace and energy, the power to predict failure in a jet engine or across the power grid could be life-saving.
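GE hasn't published the models behind this kind of forecasting, but the flavor of asset health analytics can be sketched with a simple trend projection: fit a line to a slowly degrading sensor signal and estimate when it will cross a known failure threshold. Everything below, from the vibration signal to the threshold to the linear model itself, is an illustrative assumption rather than GE's method.

```python
# Illustrative sketch only -- not GE's actual models. Fits a linear trend
# to a degrading sensor signal and projects when it will cross a failure
# threshold, turning raw readings into a lead time measured in days.
import numpy as np

def days_until_failure(day_index, readings, failure_threshold):
    """Project when a degrading signal crosses its failure threshold."""
    slope, intercept = np.polyfit(day_index, readings, 1)
    if slope <= 0:
        return None  # no upward degradation trend; nothing to forecast
    crossing_day = (failure_threshold - intercept) / slope
    return max(crossing_day - day_index[-1], 0.0)

# Hypothetical bearing vibration (mm/s RMS) sampled once per day.
days = np.arange(90)
vibration = 2.0 + 0.03 * days + np.random.normal(0.0, 0.1, days.size)
lead_time = days_until_failure(days, vibration, failure_threshold=7.1)
print(f"Projected threshold crossing in ~{lead_time:.0f} days")
```

Even this toy version captures the shift Courtney describes: instead of an alarm minutes before a bearing seizes, the trend line yields a lead time measured in weeks or months.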

Meanwhile, process performance analytics will help tune equipment and processes to deliver the best possible machine output under current operating conditions.

Today GE offers data collection through Historian, predictive analytics for condition-based monitoring through SmartSignal, and process-level analytics through CSense. And now that GE has been gathering data across its businesses with these tools, Courtney said, the remaining question was how to tie the data and the analytics together in a useful way.

The answer to GE's question was their newly released Monitoring and Diagnostics Suite, which combines those products with several new offerings: Proficy Knowledge Center and the Hadoop-based Proficy Historian HD. Together they allow for the storage of big data, process visibility, asset health assessment and process optimization.

Knowledge Center is a model-driven, browser-based visualization application designed for data mining on an asset: it presents asset and process health alongside the corresponding analytics, such as advisories and the predictions currently available based on past failures.

“Many large-scale manufacturers have so much data that the first thing they do is ignore it,” explained Courtney, which ultimately leads to data being overlooked that could have been used to predict equipment failures. Historian HD is expected to mitigate this problem because it offers the elasticity of the cloud, whereby manufacturers can simply add nodes to their Hadoop cluster in order to accommodate a growing data set.

Turning to Hadoop has allowed GE to expand into the petabyte range, something that wasn't possible before and a necessity for a company processing some 5 terabytes of data per day.
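A back-of-the-envelope calculation shows why that ingest rate forces the move. At 5 terabytes per day, a single year of raw data approaches two petabytes before replication; the replication factor and per-node disk capacity below are illustrative assumptions, not GE's published figures.

```python
# Capacity arithmetic for a growing Hadoop cluster. The replication factor
# and per-node capacity are illustrative assumptions, not GE's figures.
INGEST_TB_PER_DAY = 5          # rate cited in the article
HDFS_REPLICATION = 3           # HDFS's default replication factor
NODE_CAPACITY_TB = 24          # assumed usable disk per DataNode

raw_per_year_tb = INGEST_TB_PER_DAY * 365              # 1,825 TB (~1.8 PB)
stored_per_year_tb = raw_per_year_tb * HDFS_REPLICATION
nodes_per_year = stored_per_year_tb / NODE_CAPACITY_TB

print(f"Raw data per year:        {raw_per_year_tb:,} TB")
print(f"With 3x replication:      {stored_per_year_tb:,} TB")
print(f"DataNodes added per year: {nodes_per_year:.0f}")
```

Under those assumptions, keeping a year of fully replicated data online means adding a couple hundred nodes annually, which is exactly the scale-by-adding-nodes elasticity Historian HD is meant to provide.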

“We use the software ourselves in our own Monitoring & Diagnostics centers to manage trillions of dollars in asset value,” Courtney explained. “Today, in the GE Industrial Performance and Reliability Center, GE engineers monitor thousands of mission critical assets for our customers to ensure uptime, asset reliability and overall production throughput.”

Courtney says GE had to make some changes to how Hadoop works so that it understands data collected at regular intervals, but all the Hadoop applications that sit on top of the data still work, meaning those working with R, Pig Latin, and Hive aren't out of luck.
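The article doesn't describe those modifications, but the claim that standard tooling keeps working atop the data can be illustrated with an ordinary Hadoop Streaming job. The record format and the hourly-averaging logic below are hypothetical stand-ins for whatever GE's sensors actually emit.

```python
#!/usr/bin/env python3
# mapper.py -- hypothetical Hadoop Streaming mapper over time-series
# records of the form "sensor_id,timestamp,value" (format assumed).
import sys

for line in sys.stdin:
    try:
        sensor_id, timestamp, value = line.strip().split(",")
        float(value)  # validate the reading before emitting it
        hour = timestamp[:13]  # "2013-06-20T14" from an ISO-8601 stamp
        # Key on (sensor, hour) so the reducer can compute hourly averages.
        print(f"{sensor_id}|{hour}\t{value}")
    except ValueError:
        continue  # skip malformed records
```

The matching reducer averages each key's values; Hadoop's shuffle delivers them to the reducer grouped and sorted by key:

```python
#!/usr/bin/env python3
# reducer.py -- averages the values for each (sensor, hour) key.
import sys

current_key, total, count = None, 0.0, 0
for line in sys.stdin:
    key, value = line.rstrip("\n").split("\t")
    if key != current_key and current_key is not None:
        print(f"{current_key}\t{total / count:.3f}")  # flush previous key
        total, count = 0.0, 0
    current_key = key
    total += float(value)
    count += 1
if current_key is not None:
    print(f"{current_key}\t{total / count:.3f}")  # flush the final key
```

A pair of scripts like this would be submitted with Hadoop's standard streaming jar, which is the point: once the interval data sits in the Hadoop-based Historian HD, the usual MapReduce, Hive, and Pig workflows apply unchanged.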

Jeff Immelt, CEO of GE, described advanced analytics such as these as the foundation of GE's future. While he insisted that “GE will never become a software company,” he did say that investing in analytics “will be the only way an industrial company can guarantee that the products it sells will be successful.”
