Advanced Computing in the Age of AI | Monday, June 24, 2024

Distributed Approaches Touted for IoT Data Storage 

The importance of data storage, and of the ability to scale storage resources, has only grown as use cases and reliable estimates roll in about the volume of sensor and other data that Internet of Things (IoT) deployments will generate.

For instance, a geared turbofan engine being developed by Pratt & Whitney (NYSE: UTX) will be fitted with about 5,000 sensors generating an estimated 10 Gb of data per second. An average 12-hour flight could generate an astounding 844 Tb of data.
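For a sense of scale, a sustained 10 Gb/s stream works out to roughly 432 Tb per engine over a 12-hour flight; doubling that for a twin-engine aircraft lands in the same ballpark as the 844 Tb figure. A minimal back-of-envelope sketch (the single- vs. twin-engine interpretation is our assumption, not Pratt & Whitney's stated math):

```python
# Back-of-envelope check of the per-flight data volume quoted above.
# Assumes a sustained 10 Gb/s sensor stream for a 12-hour flight;
# the twin-engine doubling is an illustrative assumption.

SENSOR_RATE_GB_PER_S = 10   # quoted rate, per engine
FLIGHT_HOURS = 12

seconds = FLIGHT_HOURS * 3600
per_engine_tb = SENSOR_RATE_GB_PER_S * seconds / 1000   # Gb -> Tb (decimal)

print(f"Per engine: {per_engine_tb:.0f} Tb")                 # 432 Tb
print(f"Twin-engine aircraft: {2 * per_engine_tb:.0f} Tb")   # 864 Tb
```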

Where to store it all?

A growing number of software-defined storage startups, along with industry consortia, are beginning to tackle the challenges posed by IoT storage requirements via approaches like distributed storage platforms that oversee virtualized infrastructure in datacenters in a manner similar to public cloud services.

Storage startups such as Hedvig Inc. note that customers increasingly worry about how to anticipate and map out future storage requirements as they gird for the IoT data onslaught. "In only relatively few instances, it’s possible that organizations can somewhat accurately ballpark what they expect their storage requirements will be, provided there’s a well-defined and specific use case," Rob Whiteley, Hedvig's vice president of marketing, noted in a blog post.

The startup based in Santa Clara, Calif., argues that IoT storage technical challenges are twofold, with network connectivity, processing horsepower and storage representing "first order" challenges. Beyond these are data security, privacy, compliance with data sovereignty rules and a growing list of other requirements.

Based on his previous experience at Amazon Web Services (NASDAQ: AMZN) and Facebook (NASDAQ: FB), Hedvig CEO Avinash Lakshman (who founded the company in 2012 but kept it in stealth mode until last year) reckoned that a distributed storage platform was the best approach to taming IoT data.

Assuming that all or most IoT data is valuable to an enterprise, Hedvig and other storage and IoT vendors argue that the emerging storage architecture must be "sized" with scale in mind. "This means accurately sizing both your primary and secondary storage tiers," Hedvig's Whiteley argued.
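Whiteley's sizing advice can be sketched as a rough capacity model. Every parameter below — the retention windows, the replication factor, the assumed data reduction before the secondary tier — is an illustrative assumption, not Hedvig's methodology:

```python
# Rough capacity model for sizing primary (hot) and secondary (cold)
# storage tiers from a raw IoT ingest rate. All parameters are
# illustrative assumptions, not vendor guidance.

def size_tiers(ingest_tb_per_day: float,
               hot_retention_days: int = 7,
               cold_retention_days: int = 365,
               replication_factor: int = 3,
               cold_reduction: float = 0.25) -> dict:
    """Return raw capacity (TB) needed per tier.

    cold_reduction: fraction of raw data kept (after downsampling
    or compression) before it lands on the secondary tier.
    """
    hot = ingest_tb_per_day * hot_retention_days * replication_factor
    cold = (ingest_tb_per_day * cold_reduction
            * cold_retention_days * replication_factor)
    return {"primary_tb": hot, "secondary_tb": cold}

# Example: 2 TB/day of raw sensor data
print(size_tiers(ingest_tb_per_day=2.0))
# {'primary_tb': 42.0, 'secondary_tb': 547.5}
```

Even this toy model makes the "incremental scaling" point: the secondary tier dwarfs the primary as retention grows, so adding capacity in small steps beats a one-time up-front buy.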

In pitching its own distributed storage platform, Hedvig asserts that IoT storage architectures should be designed for "incremental scaling." Added Whiteley: "IoT data may be the first killer use case for a software-defined storage pilot."

Once IoT data is prioritized and storage tiers are established, vendors like Cisco Systems (NASDAQ: CSCO) argue that computing and storage need to be as close as possible to real-time sensor data for immediate processing. Cisco refers to the approach as "fog computing," and a coalition of IoT ecosystem developers that include Cisco launched an OpenFog Consortium late last year to accelerate deployment of the distributed resources approach.
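The "fog" idea — keep compute and storage next to the sensors and forward only what matters — can be illustrated with a toy edge filter. The threshold-and-aggregate logic below is a hypothetical example, not part of any OpenFog specification:

```python
# Toy "fog" node: process sensor readings locally and forward only
# a per-window summary, plus out-of-range alerts, to the cloud tier.
# Thresholds and window size are hypothetical, not OpenFog-defined.

from statistics import mean

def fog_process(readings, low=20.0, high=80.0):
    """Summarize one window of readings; flag anomalies immediately."""
    alerts = [r for r in readings if not (low <= r <= high)]
    summary = {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "min": min(readings),
        "max": max(readings),
    }
    return summary, alerts   # summary goes upstream; alerts go now

window = [42.0, 55.5, 61.2, 95.3, 48.7]
summary, alerts = fog_process(window)
print(summary)   # five raw readings collapsed into one record
print(alerts)    # [95.3] -> forwarded in real time
```

The design choice mirrors the consortium's pitch: the raw 10 Gb/s firehose stays local, while the cloud sees only summaries and exceptions.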

Along with storage, the OpenFog architecture would distribute computation, networking and control resources and services closer to systems at or near users. The initiative promises to deliver those IoT resources "along a continuum from Cloud to Things," the group said.

Other consortium members include ARM Ltd., Dell, Intel Corp. (NASDAQ: INTC), Microsoft (NASDAQ: MSFT) and Princeton University's Edge Laboratory.

About the author: George Leopold

George Leopold has written about science and technology for more than 30 years, focusing on electronics and aerospace technology. He previously served as executive editor of Electronic Engineering Times. Leopold is the author of "Calculated Risk: The Supersonic Life and Times of Gus Grissom" (Purdue University Press, 2016).