Liqid and MemVerge Collaborate with Intel for Big Memory Computing Solutions 

BROOMFIELD, Colo. & MILPITAS, Calif., March 25, 2022 — Liqid, a leading software company delivering data center composability, has announced it is collaborating with big memory solutions pioneer MemVerge and technology industry leader Intel to deliver composable memory solutions for big memory computing. With Liqid Matrix composable disaggregated infrastructure (CDI) software and MemVerge Memory Machine software, Liqid and MemVerge can pool and orchestrate DRAM and storage-class memory (SCM) devices such as Intel Optane Persistent Memory (PMem) in flexible configurations with GPUs, NVMe storage, FPGAs, and other accelerators to precisely match unique workload requirements. The joint solutions, available today, deliver unparalleled scale for memory-intensive applications across a variety of customer use cases, including AI/ML, HPC, in-memory databases, and data analytics.

“We are pleased to collaborate with technology industry leaders like MemVerge and Intel, leveraging our shared expertise in software-defined data center architectures to deliver composable, big-memory solutions today,” said Ben Bolles, Executive Director, Product Management, Liqid. “Intel Optane-based solutions from MemVerge and Liqid provide near-memory data speeds for applications that must maximize compute power to effectively extract actionable intelligence from the deluge of real-time data.”

Pool and Orchestrate Intel Optane Persistent Memory for Memory-Hungry Applications Today

DRAM costs and the physical limitations of the media have hampered deployment of powerful, memory-centric applications across use cases in enterprise, government, healthcare, digital media, and academia. Customers who need big memory can now deploy fully composable, disaggregated, software-defined big memory systems that combine MemVerge’s in-memory computing expertise with Liqid’s ability to compose Intel Optane PMem alongside other valuable accelerators.
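For context on how applications see Optane PMem: in App Direct mode the media is byte-addressable and is commonly exposed through a DAX-enabled (fsdax) file system, while tiering software such as Memory Machine manages data placement transparently so applications need no code changes. The minimal Python sketch below assumes a hypothetical fsdax mount at /mnt/pmem0 and simply illustrates direct load/store access to persistent memory; it is not the Liqid or MemVerge interface.

    import mmap
    import os

    PMEM_FILE = "/mnt/pmem0/bigmem_demo.dat"  # hypothetical path on an fsdax (DAX) mount
    SIZE = 64 * 1024 * 1024                   # 64 MiB region for illustration

    # Create and size a backing file on the persistent-memory namespace.
    fd = os.open(PMEM_FILE, os.O_CREAT | os.O_RDWR, 0o600)
    os.ftruncate(fd, SIZE)

    # Memory-map it; on a DAX mount, loads and stores reach PMem directly,
    # bypassing the page cache, so the region behaves like ordinary memory.
    buf = mmap.mmap(fd, SIZE, prot=mmap.PROT_READ | mmap.PROT_WRITE)

    buf[:16] = b"big-memory demo\x00"  # byte-addressable write
    print(buf[:15].decode())           # reads back "big-memory demo"

    buf.close()
    os.close(fd)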

“Our composable big memory solutions are an important part of an overall CDI architecture,” said Bernie Wu, VP Business Development, MemVerge. “The solutions are also a solid platform that customers can leverage for deployment of future memory hardware from Intel, in-memory data management services from MemVerge, and composable disaggregated software from Liqid.”

With the joint big memory CDI solutions from Liqid and MemVerge, users can:

  • Achieve exponentially higher utilization and capacity for Intel Optane PMem with MemVerge Memory Machine software;
  • Enable ultra-low latency, PCI-Express-based composability for Intel Optane-based MemVerge solutions with Liqid Matrix CDI software;
  • Granularly configure and compose memory resources in tandem with GPUs, FPGAs, or network resources on demand to support unique workload requirements, then release those resources for use by other applications once the workload is complete (as sketched after this list);
  • Pool and deploy Intel Optane PMem in tandem with other data center accelerators to reduce big memory analytical operations from hours to minutes;
  • Increase VM and workload density for enhanced server consolidation, resulting in reduced capital and operational expenditures;
  • Enable far more in-memory database computing for real-time data analysis and improved time-to-value.
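The compose-and-release cycle described above can be pictured as a short orchestration script. The sketch below is a minimal illustration against a hypothetical REST endpoint (fabric-mgmt.example.com) with placeholder functions compose_node and release_node; it does not reproduce the actual Liqid Matrix or Memory Machine APIs.

    import requests  # third-party HTTP client: pip install requests

    MATRIX_API = "https://fabric-mgmt.example.com/api"  # hypothetical management endpoint

    def compose_node(node, gpus, pmem_gib):
        """Attach GPUs and a pooled PMem allocation to a bare-metal node."""
        payload = {"node": node, "gpus": gpus, "pmem_gib": pmem_gib}
        resp = requests.post(f"{MATRIX_API}/compositions", json=payload, timeout=30)
        resp.raise_for_status()
        return resp.json()["composition_id"]

    def release_node(composition_id):
        """Return the composed devices to the fabric pool for other workloads."""
        resp = requests.delete(f"{MATRIX_API}/compositions/{composition_id}", timeout=30)
        resp.raise_for_status()

    if __name__ == "__main__":
        comp_id = compose_node(node="analytics-01", gpus=4, pmem_gib=2048)
        try:
            # Run the memory-intensive job on the freshly composed server here,
            # e.g. an in-memory database load or an AI/ML training step.
            pass
        finally:
            release_node(comp_id)  # devices go back to the pool when the job ends

Whether resources are requested interactively or through automation tooling, the pattern is the same: devices are bound to a bare-metal host for the life of the job and then returned to the shared pool.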

Building a Bridge to Future CXL Performance

CXL, a next-generation, industry-supported interconnect protocol built on the physical and electrical PCIe interface, promises to disaggregate DRAM from the CPU. This allows both the CPU and other accelerators to share memory resources at previously impossible scales and delivers the final disaggregated element necessary to implement truly software-defined, fully composable data centers.

“The future transformative power of CXL is difficult to overstate,” said Kristie Mann, vice president of Product for Intel Optane Group at Intel. “Collaboration between organizations like MemVerge and Liqid, whose respective expertise in big memory and PCI-Express (PCIe) composability is well-recognized, delivers solutions that provide today the functionality that CXL will bring in the future. Their solution creates a layer of composable Intel Optane-based memory for true tiered memory architectures. Solutions such as these have the potential to address today’s cost and efficiency gaps in big-memory computing, while providing the perfect platform for the seamless integration of future CXL-based technologies.”

To learn more about joint composable big memory solutions from Liqid and MemVerge, schedule an appointment with an authorized Liqid representative or reach out at [email protected].

About Liqid

Liqid’s composable infrastructure software platform, Liqid Matrix, unlocks cloud-like speed and flexibility plus higher efficiency from data center infrastructure. Now IT can configure, deploy, and scale physical, bare-metal servers in seconds, then reallocate valuable accelerator and storage resources via software as needs evolve. Dynamically provision previously impossible systems or scale existing investments, and then redeploy resources where needed in real-time. Unlock cloud-like datacenter agility at any scale and experience new levels of resource and operational efficiency with Liqid.

About MemVerge

MemVerge is pioneering Big Memory Computing and Big Memory Cloud technology for the memory-centric and multi-cloud future. MemVerge Memory Machine is the industry’s first software to virtualize memory hardware for fine-grained provisioning of capacity, performance, availability, and mobility. On top of the transparent memory service, Memory Machine provides another industry first, ZeroIO in-memory snapshots which can encapsulate terabytes of application state within seconds and enable data management at the speed of memory. The breakthrough capabilities of Big Memory Computing and Big Memory Cloud Technology are opening the door to cloud agility and flexibility for thousands of Big Memory applications. To learn more about MemVerge, visit www.memverge.com.


Source: Liqid, MemVerge