Advanced Computing in the Age of AI | Friday, March 29, 2024

DataCore Storage Platform Scales To 64 PB 

Software-defined storage specialist DataCore Software said the latest release of its storage services platform could double the scale of hyper-converged storage systems to 64 nodes while significantly bumping up the performance of write-heavy workloads.

The company released a new version of its SANsymphony-V10 storage platform along with a virtual storage area network that works with any hypervisor. The release also supports deployment of 64 PB configurations, delivering more than 100 million IOPS across the virtual array.

The new release adds momentum to the company's strategy of virtualizing data storage the way most enterprises have virtualized servers in the datacenter. The company's software resides between application servers and existing storage hardware. The services platform helps storage hardware, including solid-state drives, work in concert to reduce the amount of wasted storage capacity.

The virtualization software runs directly on application servers so that storage cards and flash act as a virtual SAN. One result, the vendor claims, is a four-fold increase in storage capacity utilization.
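The capacity-utilization claim rests on pooling: drives stranded on individual servers are aggregated into one shared namespace, so a volume can land wherever free space exists. The sketch below illustrates that general idea with hypothetical classes; it is not DataCore's actual implementation or API.

```python
# Conceptual sketch of capacity pooling in a virtual SAN.
# Hypothetical model -- illustrative only, not DataCore's design.

class Server:
    def __init__(self, name, drives_gb):
        self.name = name
        self.drives_gb = drives_gb      # capacities of this server's local drives
        self.used_gb = 0

class VirtualPool:
    """Aggregates local drives across servers into one shared pool."""
    def __init__(self, servers):
        self.servers = servers

    def total_capacity_gb(self):
        return sum(sum(s.drives_gb) for s in self.servers)

    def allocate(self, size_gb):
        # Place the volume on the server with the most free space,
        # instead of failing when one machine happens to be short.
        target = max(self.servers,
                     key=lambda s: sum(s.drives_gb) - s.used_gb)
        if sum(target.drives_gb) - target.used_gb < size_gb:
            raise RuntimeError("pool exhausted")
        target.used_gb += size_gb
        return target.name

pool = VirtualPool([Server("a", [500, 500]), Server("b", [1200])])
print(pool.total_capacity_gb())   # 2200 -- all drives counted cluster-wide
print(pool.allocate(800))         # "b" -- placed where the most space is free
```

The point of the sketch is that allocation decisions are made against cluster-wide free space rather than any single server's drives, which is what lets otherwise-stranded capacity be used.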

The storage vendor's approach to boosting performance also focuses on workloads with lots of random writes, allowing solid-state and other drives, for example, to grab larger chunks of a workload. That approach aims to reduce the number of updates that are normally required when working with databases. The company claims its Random Write Accelerator delivers a 30-fold performance boost for write-heavy workloads such as database updates, enterprise resource planning and online transaction processing.
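The general technique behind such accelerators, absorbing many small random writes in memory and committing them as fewer, larger, sequential batches, can be sketched as follows. This is a generic illustration of write coalescing, not DataCore's implementation:

```python
# Generic write-coalescing sketch: buffer small random writes and
# flush them as one large, sorted batch. Illustrative only.

class CoalescingWriter:
    def __init__(self, backend, batch_size=64):
        self.backend = backend          # list standing in for a block device
        self.pending = {}               # block -> data; rewrites are absorbed
        self.batch_size = batch_size
        self.flushes = 0                # count of actual device commits

    def write(self, block, data):
        self.pending[block] = data      # a later write to the same block
                                        # overwrites the earlier one in RAM
        if len(self.pending) >= self.batch_size:
            self.flush()

    def flush(self):
        if not self.pending:
            return
        # One sequential pass over sorted blocks instead of many seeks.
        for block in sorted(self.pending):
            self.backend[block] = self.pending[block]
        self.pending.clear()
        self.flushes += 1

backend = [None] * 1024
w = CoalescingWriter(backend, batch_size=8)
for i in [5, 900, 5, 42, 7, 300, 5, 100, 64, 2]:
    w.write(i, f"v{i}")
w.flush()
print(w.flushes)   # 1 -- ten writes issued, one batched device commit
```

Two effects reduce device work here: repeated writes to the same block collapse into one, and the surviving writes are committed in sorted order rather than scattered across the device.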

Augie Gonzalez, DataCore's director of product marketing, said the storage vendor's approach focuses on leveraging less expensive techniques along with a customer's existing servers and other equipment to "mitigate the need for expensive flash memory."

The DataCore scaling approach also leverages a server's built-in RAM as a read and write cache. That technique is also said to reduce the need to add storage hardware, with the cache absorbing hot data in memory while the storage area network handles bulk storage.
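Using server DRAM as a cache in front of slower bulk storage is a well-established tiering technique. A minimal write-back sketch of the idea is below; the class and policy are assumptions for illustration, not DataCore's design:

```python
from collections import OrderedDict

# Minimal write-back RAM cache in front of a slow bulk store.
# Illustrative sketch of the general technique only.

class RamCache:
    def __init__(self, store, capacity=4):
        self.store = store              # dict standing in for the bulk tier
        self.cache = OrderedDict()      # block -> (data, dirty); LRU order
        self.capacity = capacity

    def read(self, block):
        if block in self.cache:
            self.cache.move_to_end(block)         # hit: served from RAM
            return self.cache[block][0]
        data = self.store.get(block)              # miss: fetch from bulk tier
        self._insert(block, data, dirty=False)
        return data

    def write(self, block, data):
        self._insert(block, data, dirty=True)     # absorbed in RAM first

    def _insert(self, block, data, dirty):
        self.cache[block] = (data, dirty)
        self.cache.move_to_end(block)
        if len(self.cache) > self.capacity:
            old, (old_data, old_dirty) = self.cache.popitem(last=False)
            if old_dirty:
                self.store[old] = old_data        # write back on eviction

    def flush(self):
        for block, (data, dirty) in self.cache.items():
            if dirty:
                self.store[block] = data
                self.cache[block] = (data, False)

store = {}
c = RamCache(store, capacity=2)
c.write(1, "a"); c.write(2, "b"); c.write(3, "c")  # evicts block 1
print(store)   # {1: 'a'} -- only the evicted dirty block reached bulk storage
```

The write-back policy is what makes writes appear fast: data lands in RAM immediately and reaches the slower tier only on eviction or an explicit flush.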

In doubling the scale of hyper-converged systems up to 64 nodes, the virtual SAN is said to support large-scale configurations by spreading workloads over more servers in a cluster. The approach also allows for sharing storage resources across multiple clusters to crank up performance.

Doubling the number of possible nodes in a virtual SAN, the company said, is critical to latency-sensitive applications spread out over large-scale clusters. Also, scaling out the configuration across more nodes is said to enable better distribution of workloads as well as the segmentation of physical storage capacity.

As more storage moves to the cloud, DataCore noted that the latest release of its storage platform can also be configured with Microsoft's Azure cloud platform. That permits a hybrid cloud storage setup in which active data remains on-premises while inactive data is stored in the cloud.

DataCore said the latest version of its software services platform is available and shipping now.

About the author: George Leopold

George Leopold has written about science and technology for more than 30 years, focusing on electronics and aerospace technology. He previously served as executive editor of Electronic Engineering Times. Leopold is the author of "Calculated Risk: The Supersonic Life and Times of Gus Grissom" (Purdue University Press, 2016).

EnterpriseAI