
When NVMe is Simply Not Enough: The Future of Storage for Edge Workloads

By Scott Shadley

SSDs have evolved over the past decade to meet the growing demands of AI and edge-related workloads, and now computational storage takes intelligent storage to the next level.

Today, buzzwords like “AI” and “Edge Computing” have taken the technology industry by storm, and to the casual observer it all seems very cool and “fashion-forward.” The truth is that a tsunami of data has taken over, and the ability to quickly process and analyze terabytes, and sometimes petabytes, of data “in real time” (e.g., in an hour versus a week) presents both a challenge and an opportunity for the computer storage sector.

The computer storage industry is evolving: it is no longer just about capacity and data recovery, but about intelligently storing and analyzing data in real time.

Here is the thing: the oft-overlooked computer storage function has suddenly gone from being a “supporting player” to having a starring role in this new, interconnected world of AI and edge workloads. IoT devices are now generating five quintillion bytes of data every day (that is 5 million terabytes), and this will only grow as the number of connected IoT devices reaches 30 billion by next year (2020), according to Cisco.

Let’s take a real-world example: today’s modern airliners can generate up to a terabyte of data per flight. Even taking small snapshots of this data, which can reduce it to below a gigabyte per flight, still leaves far too much to transmit in-flight, so this massive amount of data must be analyzed at the edge if there is any chance of using it in real time.
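As a rough back-of-the-envelope check, consider how long those volumes would take to move over an air-to-ground connection. The link speeds in the sketch below are assumptions chosen purely for illustration (they come from no airline or vendor), but they show why the numbers simply do not work in-flight:

# Back-of-the-envelope sketch; both link speeds below are illustrative assumptions.
def transfer_hours(data_bytes, link_kbps):
    """Hours needed to move data_bytes over a link running at link_kbps kilobits/s."""
    seconds = (data_bytes * 8) / (link_kbps * 1_000)
    return seconds / 3600

ONE_TB = 10**12
ONE_GB = 10**9

# Full flight data over an assumed 5 Mbit/s satellite link:
print(round(transfer_hours(ONE_TB, 5_000)))      # ~444 hours
# A sub-gigabyte snapshot over an assumed 100 kbit/s telemetry share:
print(round(transfer_hours(ONE_GB, 100), 1))     # ~22.2 hours

Under either assumption, the data could not leave the aircraft before landing, which is exactly why the analysis has to happen at the edge.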

Another way “intelligent storage” is saving the day is by helping in life-and-death situations. One of the biggest worries a parent can suffer is losing a child in a crowd. Fortunately, the ability to track and find people keeps improving, as cameras paired with facial recognition and object detection continue to provide these enhancements. However, AI is needed to manage these tools and the data they generate, and the need to store and analyze data across multiple cameras and angles requires intelligent storage.

The growing need to store and analyze data at the edge has spawned a need for intelligent storage solutions that deliver lower-power, more efficient compute without straining the edge platforms.

Computer Storage Memory Lane

To meet the needs of intelligent storage, Computational Storage has emerged as a new trend that can deftly organize raw information into meaningful data. Computational Storage allows an organization to ingest as many bits as possible and churn out just the right information, on command and in real time, at the storage level instead of in the CPU.

How did we get here? 

Let’s take a trip down computer storage memory lane to when SSDs, or solid-state drives, otherwise known as flash, made an appearance in 2005 with the debut of the flash-based Apple iPod. Suddenly, flash technology rendered clunky mechanical hard drives (HDDs) nearly obsolete: SSDs were more stable, lasted longer, and had no moving parts that could break. SSDs were also able to reduce storage media latency and improve storage reliability, reducing the need for huge RAM buffers.

However, in just the past few years, as IoT and AI-powered devices have become standard, SSDs have needed to evolve.

That is where NVMe (Non-Volatile Memory Express) emerged, marking one of the first major developments for SSDs. NVMe is a streamlined, flash-focused interface that operates at a much higher performance level, removing existing storage protocol bottlenecks for platforms churning out terabytes of data on a regular basis.

Is NVMe Enough?  Welcome Computational Storage

Analyzing data today often means finding a needle in a very large data haystack, and that is where Computational Storage comes in, augmenting the host CPUs. If an organization is trying to analyze a small portion of data from a huge data lake, processing can take days or weeks, even with high-capacity NVMe SSDs.

A recent survey by Dimensional Research of more than 300 computer storage professionals demonstrated that bottlenecks can occur at under 10 terabytes.

As such, Computational Storage puts more robust processing power alongside each host CPU, allowing an organization to ingest all the data it can generate but deliver only what is necessary, keeping the “pipes” as clear as possible. When raw data is needed for analytics, organizations have the freedom to pull out only what is needed rather than having to deal with the entire data set.

This approach is especially important with high-capacity NVMe SSDs, which need help managing their data locality and storage compute needs. Computational Storage increases efficiency through In-Situ Processing of massive datasets, which reduces network bandwidth demands and is ideal for hyperscale environments, edge processing, and AI/data applications.
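To make the idea concrete, here is a minimal, illustrative Python sketch of the difference between hauling everything to the host and pushing a query down to the drive. The csd_device object and its query() method are hypothetical stand-ins for whatever on-drive compute interface a given computational storage product exposes; they are not an API described in this article.

# Illustrative sketch only: csd_device and query() are hypothetical.
def host_side_filter(read_block, num_blocks, predicate):
    """Conventional path: every block crosses the PCIe/NVMe link and the host
    CPU discards most of what it reads."""
    matches = []
    for i in range(num_blocks):
        block = read_block(i)  # full block transferred to the host
        matches.extend(record for record in block if predicate(record))
    return matches

def in_situ_filter(csd_device, predicate):
    """Computational storage path: the predicate is pushed down to the drive,
    which scans its own media and returns only the matching records."""
    return csd_device.query(predicate)  # only results cross the link

In the first path, the entire data set travels over the storage link before it is filtered; in the second, only the matching records do, which is where the bandwidth and host-CPU savings of In-Situ Processing come from.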

The data tsunami is not lessening; it is intensifying. Storage architects must therefore look at data throughput not just as physically moving or storing data, but as intelligently organizing it so that critical analysis can increase efficiency within an organization and, at times, even save lives.
