
Three reasons to adopt persistent memory in your data centre

(Image credit: Welcomia / Shutterstock)

Workloads are becoming ever more data-intensive, driven in particular by the growth of AI, machine learning and deep learning. Edge environments, meanwhile, struggle to keep up with capacity demands, processing requirements and the speed at which data must be moved. The right infrastructure is critical to the success of data-intensive applications in the enterprise, and traditional infrastructure alone cannot effectively support the needs of modern workloads.

This year, Intel announced the commercial availability of its Optane DC persistent memory, bringing storage and memory functions together for the first time in the history of computing. This is significant because persistent memory technologies have the potential to revolutionise the data centre over the next decade: they could redefine the software stack to better support modern data-intensive applications, making them faster, more affordable and more predictable than ever before.

As defined by SNIA, persistent memory is non-volatile, byte-addressable, low-latency memory with densities greater than or equal to DRAM. There are a number of reasons why organisations should consider adopting persistent memory, also known as storage-class memory, in their data centres. This new technology can help enterprise infrastructure withstand data-intensive applications, meet the real-time requirements of next-generation workloads, and simplify and empower application development by removing two of the biggest bottlenecks in the processing of machine-generated data: storage I/O and memory capacity.

1. Data-intensive applications are testing traditional enterprise infrastructures

Modern applications such as AI model training, machine learning, big data analytics, IoT and data warehousing have put a strain on traditional enterprise infrastructures, pushing them beyond their limits. These applications produce a constant flood of machine-generated data and require a more performant solution than today’s traditional storage systems to process the enormous numbers of small files that are particularly characteristic of AI training.

Data-intensive applications tend to be memory-centric: they need large memory capacities, and when data spills out of memory they hit the storage I/O bottleneck. Persistent memory addresses the memory-capacity constraint by delivering a larger memory pool that better meets the demands of modern applications, offering more capacity than DRAM at a much lower price point.

AI training with checkpointing, for example, poses several challenges. Training on large datasets takes a long time to complete, and data preprocessing and importing can also be time-intensive, which delays model deployment. By applying persistent memory in their storage infrastructure, enterprises can speed up both checkpointing and data loading.
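To make the checkpointing idea concrete, here is a minimal, hypothetical sketch in Python. A regular file stands in for a persistent-memory region exposed through a DAX filesystem; the point is that checkpoint state is written with byte-addressable stores into mapped memory rather than serialised through the block I/O stack. The file name and the toy "model state" are illustrative assumptions, not part of any real framework.

```python
import mmap
import os
import struct

# Hypothetical sketch: checkpoint model weights into a memory-mapped file.
# On real persistent memory this file would live on a DAX filesystem, so
# the stores below would reach persistence without block I/O.
CKPT_PATH = "model.ckpt"          # assumed path, for illustration only
weights = [0.1, 0.2, 0.3, 0.4]    # toy stand-in for model state

size = 8 * len(weights)
with open(CKPT_PATH, "wb") as f:
    f.truncate(size)              # pre-size the "persistent" region

with open(CKPT_PATH, "r+b") as f:
    mm = mmap.mmap(f.fileno(), size)
    # Byte-addressable stores: write each float directly into mapped memory.
    for i, w in enumerate(weights):
        struct.pack_into("<d", mm, 8 * i, w)
    mm.flush()                    # analogous to flushing CPU caches to pmem
    mm.close()

# Restart path: map the checkpoint back and read state in place.
with open(CKPT_PATH, "r+b") as f:
    mm = mmap.mmap(f.fileno(), size)
    restored = [struct.unpack_from("<d", mm, 8 * i)[0]
                for i in range(len(weights))]
    mm.close()

print(restored)  # -> [0.1, 0.2, 0.3, 0.4]
os.remove(CKPT_PATH)              # clean up the example file
```

Because the checkpoint is just mapped memory, the restart path reads the state in place rather than replaying a serialised file, which is where the restart-speed benefit described above comes from.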

2. Meeting real-time requirements while satisfying the fast-paced, dynamic needs of the enterprise

Data-centric workloads and low-latency applications require storage with fast sequential and random access. Today’s systems built on HDDs or SSDs suffer very slow random access to machine-generated data, which enterprises can no longer tolerate in an on-demand economy where competitive advantage comes from immediate insights from data.

Persistent memory’s non-volatile, high-endurance architecture lets enterprises meet the real-time requirements of next-generation workloads: large numbers of small files can be accessed as quickly as small numbers of large files. Persistent memory also offers greater data centre reliability, withstanding 100X more full writes than NAND flash SSDs, so enterprises can worry less about drive replacement or data loss caused by drive failure. Better disaster recovery capabilities also allow faster restarts after a power failure.

3. Persistent memory offers a new programming model so developers can build simpler and more powerful applications

While the volatile-memory and storage I/O programming models will remain, a new programming model that lets applications take direct advantage of persistent (non-volatile) memory is fundamental for modern computing. Persistence in memory can enable a wide range of new applications that serve as a differentiator for organisations in competitive or crowded markets.

The current SNIA NVM programming model describes its scope as follows:

“This SNIA specification defines recommended behaviour between various user space and operating system (OS) kernel components supporting NVM. This specification does not describe a specific API. Instead, the intent is to enable common NVM behaviour to be exposed by multiple operating system specific interfaces.”

- SNIA NVM Programming Model (NPM) v1.2

Both Windows and Linux operating systems support the use of persistent memory today. Over time, further integration with these and other operating systems is expected to occur.
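The common thread across these operating-system interfaces is that persistent memory is exposed as a file the application maps into its address space and then accesses with ordinary loads and stores. The sketch below illustrates that model, under stated assumptions: a regular file stands in for a DAX-mapped persistent-memory region, the file name is invented, and `mm.flush()` stands in for the cache-flush step real persistent-memory code performs.

```python
import mmap
import os
import struct

PATH = "counter.pmem"  # assumed path standing in for a pmem region


def bump_counter(path):
    """Increment a persistent counter in place via load/store access."""
    if not os.path.exists(path):
        with open(path, "wb") as f:
            f.write(struct.pack("<Q", 0))     # initialise the region once
    with open(path, "r+b") as f:
        mm = mmap.mmap(f.fileno(), 8)
        n = struct.unpack_from("<Q", mm)[0]   # load directly from mapped memory
        struct.pack_into("<Q", mm, 0, n + 1)  # store in place, no write() path
        mm.flush()  # stand-in for the cache flush real pmem code issues
        mm.close()
        return n + 1


print(bump_counter(PATH))  # -> 1
print(bump_counter(PATH))  # -> 2  (the value survived the unmap/remap)
os.remove(PATH)
```

The second call sees the first call’s update even though the mapping was torn down in between, which is the behaviour the NVM programming model is designed to expose in an OS-neutral way.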

By way of example, developers of in-memory databases are already taking advantage of persistent memory to reduce their restart times. Others are rewriting their applications to boost performance by using persistent memory as a cache between DRAM and slower storage such as SSDs or even HDDs. In both cases, the code changes required are minimal.
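The caching pattern can be sketched as a tiered lookup. This is a hypothetical illustration, not any vendor’s implementation: plain Python dicts stand in for the persistent-memory tier and the slow SSD/HDD tier, and the class name and capacities are invented for the example.

```python
from collections import OrderedDict


class TieredCache:
    """Hypothetical two-tier lookup: a small, fast DRAM tier in front of a
    larger persistent-memory tier, with misses falling through to slow
    storage. Dicts stand in for the real hardware tiers."""

    def __init__(self, dram_capacity, pmem_store, slow_storage):
        self.dram = OrderedDict()          # small, fast tier (DRAM), LRU order
        self.dram_capacity = dram_capacity
        self.pmem = pmem_store             # larger byte-addressable tier
        self.slow = slow_storage           # SSD/HDD tier

    def get(self, key):
        if key in self.dram:               # DRAM hit
            self.dram.move_to_end(key)
            return self.dram[key]
        value = self.pmem.get(key)         # persistent-memory hit?
        if value is None:
            value = self.slow[key]         # fall through to slow storage
            self.pmem[key] = value         # promote into the pmem tier
        self._put_dram(key, value)
        return value

    def _put_dram(self, key, value):
        self.dram[key] = value
        if len(self.dram) > self.dram_capacity:
            self.dram.popitem(last=False)  # evict least-recently-used


slow = {"a": 1, "b": 2, "c": 3}
cache = TieredCache(dram_capacity=2, pmem_store={}, slow_storage=slow)
print(cache.get("a"), cache.get("b"), cache.get("a"))  # -> 1 2 1
```

Because the pmem tier is both larger than DRAM and faster than the slow tier, promoting hot data into it narrows the performance gap on misses; this is the sense in which the code changes for such a cache can be small.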

Another set of applications that can benefit from persistent memory falls under graph analytics. Use cases as diverse as fraud detection in banking transactions, airline route optimisation and supply chain disruption mitigation all share a common problem: massive data sets with many-to-many relationships. When analysing billions of data points, persistent memory can hold these huge data sets entirely in memory, eliminating the need to “chunk” the data into “bite-size” pieces.

Further down the road, more extensive application changes are anticipated as developers find better ways to take full advantage of the capacity, speed and durability of persistent memory.

In conclusion, new technologies are emerging to support persistent memory, and customers are already reaping its early benefits. With the explosive growth of data-centric applications, the future of data storage is certainly memory.

Charles Fan, co-founder and CEO, MemVerge

Charles Fan is co-founder and CEO of MemVerge. Prior to MemVerge, Charles was the CTO of Cheetah Mobile leading its global technology teams, and an SVP/GM at VMware.