
NetApp Stresses Consistency in Hybrid Deployments 

Cloud providers seeking to cash in on the accelerating shift to hybrid and multi-cloud deployments are emphasizing their ability to provide consistent performance across in-house and public cloud installations. That feature, for example, allows developers to migrate and access data regardless of where it is stored.

NetApp Inc. is taking the consistency theme a step further with the release this week of a data fabric approach that links hybrid and multiple public cloud deployments while allowing customers to use its cloud data services on a consumption basis.

The company (NASDAQ: NTAP) said Tuesday (June 18) its new “hybrid multi-cloud” offering, which includes cloud data services running on its hyperconverged infrastructure, is being offered on a consumption basis. Users running application containers, for example, could adjust persistent storage across public clouds.

Persistent storage has been identified as a key requirement for adoption of software containers. Hence, storage must be configured to handle those stateful applications. As part of its consistency pitch, NetApp said it is extending that capability to datacenters via its hyperconverged infrastructure.
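To make that requirement concrete, the sketch below uses the Kubernetes Python client to create a persistent volume claim for a stateful application. The storage class name, volume size, and namespace are illustrative assumptions, not details from NetApp's announcement.

```python
# Minimal sketch: a stateful container workload requesting persistent
# storage through a Kubernetes persistent volume claim. The StorageClass
# name "netapp-cloud-volumes" is hypothetical.
from kubernetes import client, config


def request_persistent_storage(namespace: str = "default") -> None:
    # Load credentials from the local kubeconfig (the cluster could be
    # on-premises or in a public cloud).
    config.load_kube_config()

    pvc = client.V1PersistentVolumeClaim(
        metadata=client.V1ObjectMeta(name="stateful-app-data"),
        spec=client.V1PersistentVolumeClaimSpec(
            access_modes=["ReadWriteOnce"],
            storage_class_name="netapp-cloud-volumes",  # hypothetical class
            resources=client.V1ResourceRequirements(
                requests={"storage": "20Gi"}  # illustrative size
            ),
        ),
    )

    # The claim only names a storage class; whichever backing volume that
    # class provisions satisfies it.
    client.CoreV1Api().create_namespaced_persistent_volume_claim(
        namespace=namespace, body=pvc
    )


if __name__ == "__main__":
    request_persistent_storage()
```

Because the claim references only a storage class, the same code runs unchanged against any cluster where an equivalent class is defined, which is the portability point the consistency pitch rests on.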

Application developers are grabbing data from a variety of on-premises and cloud platforms, “so consistency becomes a very important construct,” Anthony Lye, general manager of NetApp’s cloud unit, noted in an interview.

Along with the ability for developers to work either on-premises or in the public cloud, NetApp is also offering a new Kubernetes cluster orchestration service.

NetApp also expanded its partnerships with public cloud vendors with the beta release of a cloud volumes service running on Google Cloud (NASDAQ: GOOGL). Previously, Microsoft (NASDAQ: MSFT) announced support for a service called Azure NetApp Files.

Meanwhile, NetApp is among the cloud vendors promoting the Istio service mesh intended to link application components, thereby boosting the capabilities of Kubernetes and the micro-services it orchestrates.
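As a rough illustration of the traffic-management layer a service mesh adds to Kubernetes, the sketch below applies an Istio VirtualService through the Kubernetes Python client to split requests between two versions of a micro-service. The service name, subsets, and weights are illustrative assumptions, not details of NetApp's products.

```python
# Rough sketch of Istio-style traffic routing: a VirtualService that
# splits requests between two versions of a micro-service. Service and
# subset names are illustrative.
from kubernetes import client, config


def split_traffic(namespace: str = "default") -> None:
    config.load_kube_config()

    virtual_service = {
        "apiVersion": "networking.istio.io/v1alpha3",
        "kind": "VirtualService",
        "metadata": {"name": "orders-routing"},
        "spec": {
            "hosts": ["orders"],  # the in-mesh service being routed
            "http": [
                {
                    "route": [
                        # Most traffic goes to the stable version, a small
                        # share to the canary.
                        {"destination": {"host": "orders", "subset": "v1"},
                         "weight": 90},
                        {"destination": {"host": "orders", "subset": "v2"},
                         "weight": 10},
                    ]
                }
            ],
        },
    }

    # Istio resources are Kubernetes custom objects, so they are created
    # through the generic custom-objects API.
    client.CustomObjectsApi().create_namespaced_custom_object(
        group="networking.istio.io",
        version="v1alpha3",
        namespace=namespace,
        plural="virtualservices",
        body=virtual_service,
    )


if __name__ == "__main__":
    split_traffic()
```

In practice the subsets referenced here would also need a matching Istio DestinationRule, omitted for brevity; the point is that routing policy lives in the mesh rather than in the application code.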

Based on that model, NetApp also unveiled a data fabric this week designed to provide a framework for hybrid and multi-cloud deployments. The data fabric is intended to cover everything from data integration to security. “It’s really about the life cycle of data,” Lye said.

Data fabric services include an orchestration tool kit for managing data stored in-house or in public clouds, as well as a pair of consumption options for on-premises systems or data storage in the cloud.

Also this week, Hewlett Packard Enterprise (NYSE: HPE) announced an expansion of its competing GreenLake hybrid cloud service that also includes pay-as-you-go pricing.

Lye said NetApp’s hybrid multi-cloud services are being launched in public preview, during which potential customers can test them. The new services are scheduled for general availability during the summer of 2019.

About the author: George Leopold

George Leopold has written about science and technology for more than 30 years, focusing on electronics and aerospace technology. He previously served as executive editor of Electronic Engineering Times. Leopold is the author of "Calculated Risk: The Supersonic Life and Times of Gus Grissom" (Purdue University Press, 2016).
