
How NVMe Helps Manage and Maintain the Data Boom

By Ron Herrmann

Every day, a nearly unfathomable amount of data – roughly 2.5 quintillion bytes – is generated all around us. Some of it we see directly: pictures and videos on our phones, social media posts, banking and other apps.

On top of this, data is being generated behind the scenes by ubiquitous sensors and algorithms – whether to process transactions faster, gain real-time insights, crunch big data sets or simply meet customer expectations. Traditional storage architectures are struggling to keep up with all this data creation, leading IT teams to investigate new solutions that let them stay ahead of, and take advantage of, the data boom.

Some of the main challenges are understanding performance, removing data-throughput bottlenecks and planning for future capacity. Architecture choices can lock businesses into legacy solutions, and performance needs shift as data sets grow.

Architectures designed and built around NVMe SSDs and NVMe over Fabrics (NVMe-oF) can provide the balance of performance and capacity these workloads need, particularly for data-intensive applications that demand fast performance. This is extremely important for organizations that depend on speed, accuracy and real-time data insights.

Industries such as healthcare, autonomous vehicles, AI/ML and genomics are at the forefront of the transition to high-performance NVMe storage solutions, which deliver the fast data access that high-performance computing systems need to drive new research and innovation.

Genomics

With traditional storage architectures, a detailed genome analysis can take upwards of five days to complete – understandable, considering that an initial analysis of one person’s genome produces approximately 300GB-1TB of data, and a single round of secondary analysis on just one person’s genome can require upwards of 500TB of storage capacity. With an NVMe solution in place, however, it’s possible to get results in just one day.

In a typical study, genome research and life sciences companies need to process, compare and analyze the genomes of between 1,000 and 5,000 people. That is a huge amount of data to store, but storing it is imperative: these studies are working toward revolutionary scientific and medical advances, from personalized medicine to advanced cancer treatments. Such work is only now becoming practical thanks to the speed with which NVMe lets researchers explore and analyze the human genome.
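
To give a rough sense of the scale involved, here is a minimal back-of-the-envelope sketch. The cohort sizes and per-genome data volumes come from the figures above; everything else is simple arithmetic:

```python
# Back-of-the-envelope storage estimate for a genomics study,
# using the per-genome figures cited in this article.

# Initial analysis output per genome: roughly 300GB-1TB, i.e. 0.3-1.0 TB.
RAW_PER_GENOME_TB = (0.3, 1.0)

# A single round of secondary analysis on one genome can require
# upwards of 500TB of working capacity.
SECONDARY_PER_GENOME_TB = 500

def study_storage(cohort_size: int) -> None:
    """Print the low/high raw-data totals for a cohort of this size."""
    low = cohort_size * RAW_PER_GENOME_TB[0]
    high = cohort_size * RAW_PER_GENOME_TB[1]
    print(f"{cohort_size:>5,} genomes: {low:,.0f}-{high:,.0f} TB raw data, "
          f"plus up to {SECONDARY_PER_GENOME_TB} TB per secondary-analysis run")

# Typical study sizes cited above: 1,000-5,000 people.
for cohort in (1_000, 5_000):
    study_storage(cohort)
```

Even at the low end, a 1,000-person study can approach a petabyte of raw data before secondary analysis begins, which is why the throughput of the storage tier matters as much as its raw capacity.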

Autonomous Vehicles

Autonomous vehicles are a growing trend in the tech industry. Self-driving cars are the next big thing, and various companies are working tirelessly to perfect them. To function properly, these vehicles need very fast storage to accelerate the applications and data that ‘drive’ autonomous vehicle development. Core requirements for autonomous vehicle storage include:

  • Must have a high capacity in a small form factor
  • Must be able to accept input data from cameras and sensors at “line rate” – that is, with extremely high throughput and low latency
  • Must be robust and survive media or hardware failures
  • Must be “green” and have minimal power footprint
  • Must be easily removable and reusable
  • Must use simple but robust networking

What kind of storage meets all these requirements? That’s right – NVMe.
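
To make “line rate” concrete, here is a minimal sketch of the ingest-bandwidth math involved. The sensor counts, resolutions and data rates below are illustrative assumptions, not the specs of any real vehicle platform:

```python
# Hypothetical ingest-bandwidth estimate for an autonomous-vehicle
# sensor rig. All sensor counts, resolutions, and rates below are
# illustrative assumptions, not measurements of a real platform.

def camera_bandwidth_mb_s(width: int, height: int,
                          bytes_per_pixel: int, fps: int) -> float:
    """Uncompressed video bandwidth for one camera, in MB/s."""
    return width * height * bytes_per_pixel * fps / 1e6

CAMERAS = 8            # assumed number of cameras
LIDARS = 2             # assumed number of lidar units
LIDAR_MB_S = 70.0      # assumed per-lidar stream, MB/s

per_camera = camera_bandwidth_mb_s(1920, 1080, 2, 30)  # ~124 MB/s each
total_mb_s = CAMERAS * per_camera + LIDARS * LIDAR_MB_S

print(f"Per camera: {per_camera:.0f} MB/s")
print(f"Total ingest: {total_mb_s:.0f} MB/s "
      f"(~{total_mb_s * 3600 / 1e6:.1f} TB per hour)")
```

Even under these modest assumptions, the storage tier has to sustain on the order of a gigabyte per second of sequential writes for hours at a time – a workload profile well matched to NVMe.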

Artificial Intelligence

Artificial Intelligence (AI) is gaining traction in a variety of industries – from finance to manufacturing, and beyond. In finance, AI does things like predict investment trends; in manufacturing, AI-based image recognition software checks for defects during product assembly. Wherever it’s used, AI needs a high level of computing power coupled with a high-performance, low-latency architecture that can process data in parallel and in real time.

Once again, NVMe steps up to the plate, providing the speed and throughput that are critical during training and inference. Without NVMe to prevent bottlenecks and latency issues, these stages can take much, much longer. That, in turn, can tempt teams to take shortcuts, which can cause software to malfunction or make incorrect decisions down the line.
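
To illustrate why storage latency matters here, the toy simulation below models a training loop that prefetches the next batch from storage while the current batch is being processed. The timings are placeholder assumptions; the point is only that fast reads disappear behind the compute, while slow reads stall every step:

```python
# Toy simulation of a training loop that prefetches the next batch from
# storage while the current batch is being processed. The sleep() times
# are illustrative assumptions, not measurements of any real system.

import time
from concurrent.futures import ThreadPoolExecutor

COMPUTE_S = 0.05  # assumed time to process one batch

def read_batch(batch_id: int, read_s: float) -> int:
    """Stand-in for reading one batch from storage."""
    time.sleep(read_s)
    return batch_id

def train(num_batches: int, read_s: float) -> float:
    """Overlap each batch read with the previous batch's compute."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(read_batch, 0, read_s)
        for i in range(num_batches):
            future.result()                  # wait for batch i to arrive
            if i + 1 < num_batches:
                future = pool.submit(read_batch, i + 1, read_s)
            time.sleep(COMPUTE_S)            # stand-in for the real work
    return time.perf_counter() - start

# When reads are faster than compute, storage time hides behind the
# compute; when reads are slower, every step stalls on storage.
fast = train(20, read_s=0.01)
slow = train(20, read_s=0.20)
print(f"fast storage: {fast:.2f}s, slow storage: {slow:.2f}s")
```

The job of NVMe in a training pipeline is, in effect, to keep the read time on the fast side of that comparison so the accelerators never sit idle waiting for data.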

The rapid increase in data creation has put traditional storage architectures under heavy pressure, exposing their lack of the scalability and flexibility required to meet future capacity and performance requirements. This is where NVMe comes in, breaking through the barriers of existing designs by offering unanticipated density and performance. Those breakthroughs are exactly what organizations need to manage and maintain the data boom.
