Advanced Computing in the Age of AI | Tuesday, October 3, 2023

Funneling Big Data through the Grid 

As we push our power grid harder with every passing year, the grid's infrastructure—its backbone—remains a relic of decades past, with many transformers dating back to the 1970s. To build a more reliable, up-to-date system, Brett Sargent, CTO and vice president of Products and Solutions at LumaSense Technologies, believes the answer lies in smart sensors, but as he explains, not every sensor is working to make the grid smarter.

Still, the importance of updating the grid sits right under our noses, as Sargent pointed out.

“The number of significant power outages have increased significantly over the past few years… growing from 76 in 2007 to 307 in 2011,” Sargent said of our aging grid in a recent interview with AZoSensors’ Kal Kaur. “The average age of a transformer in North America is over 40 years old.”

So what do we do about it? One option, he explains, is to build new infrastructure, but this is often met with so much resistance from home and property owners that it’s impractical. Updating existing equipment with sensors, on the other hand, Sargent pegged as a more viable option.

But Sargent noted that slapping smart meters at every point on the grid doesn’t actually make it “smart.” While it may enable a utility to offer more sophisticated pricing options, based on time of use and demand, it doesn’t guarantee reliable electricity flow for the customer.

Utilities are already deploying a number of sensor technologies to achieve this smart grid, such as dissolved gas analysis monitors, but Sargent said that one of the biggest technologies on the horizon is thermal imaging that can be performed continuously rather than once or twice per year.

Of course, with so many sensors comes a big data problem. To avoid this, Sargent said that utilities should take advantage of what LumaSense is calling “intelligent sensing at the edge,” which keeps data storage and analytics near the sensors, only transporting data back to the utility on a “report by exception” basis.
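The "report by exception" idea can be sketched in a few lines: readings are analyzed locally at the edge, and only anomalies travel back to the utility. This is a minimal illustration, not LumaSense's actual logic; the rolling-baseline rule, window size, and threshold are all assumptions made for the example.

```python
from statistics import mean

def report_by_exception(readings, window=10, threshold=5.0):
    """Yield only readings that deviate from a rolling baseline.

    `window` and `threshold` are hypothetical tuning parameters:
    how many recent readings form the baseline, and how far a new
    reading must stray before it is worth transmitting.
    """
    history = []
    for value in readings:
        # Only flag a reading once enough history exists to compare against.
        if len(history) >= window and abs(value - mean(history)) > threshold:
            yield value  # exception: send this one back to the utility
        history.append(value)
        history = history[-window:]  # keep the baseline rolling

# Example: steady transformer temperatures (degrees C) with one spike.
temps = [60.0] * 10 + [61.0, 82.5, 60.5]
print(list(report_by_exception(temps)))  # only the spike is reported
```

The payoff is exactly the trade-off Sargent describes: the edge device stores and analyzes every reading, while the network carries only the rare values that matter.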

In the interview, Sargent listed three places where data can be processed: at the edge, in the cloud, and on servers at utility headquarters. "Each has pros and cons," he said. "But keep in mind the further you move data, the more likely it is something will go wrong or the data will get hacked, lost, corrupted, etc. It also costs more money in order to move data a greater distance."