Computational storage and the new direction of computing


Stress, unexpected delays, lost time, high costs: For many people around the world, the daily commute is the worst part of the day, and it is one of the big drivers behind work-from-home policies.

Computers feel the same way. Computational storage is part of an emerging trend to make datacenters, edge servers, IoT devices, cars and other digitally enhanced things more productive and efficient by moving less data. In computational storage, a complete computing system – with DRAM, I/O, application processors, dedicated storage, and system software – is squeezed inside the confines of an SSD to handle repetitive, preliminary, and/or data-intensive tasks locally.

Why? Because not moving data saves an enormous amount of money, time, energy and computational resources. “For some applications like compression in the drive, hardware engines consuming less than one watt can achieve the same throughput as more than 140 traditional server cores,” said JB Baker, VP of marketing and product management at ScaleFlux. “That’s 1,500 watts, and we can do the same thing with one watt.”
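The scale of that saving is simple arithmetic. A minimal sketch using the figures from the quote above (the core count and wattages come from the article; the comparison itself is just illustrative division):

```python
# Back-of-the-envelope energy comparison for in-drive compression,
# using the figures quoted above.
server_cores = 140       # traditional server cores for the same throughput
server_watts = 1500      # quoted power draw of those cores
drive_engine_watts = 1   # in-drive hardware compression engine

ratio = server_watts // drive_engine_watts
print(f"~{ratio}x less power for the same compression throughput")
```

Multiplied across thousands of drives in a datacenter, ratios like this are what make the economics compelling.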

Moving unnecessary data is also bad for the environment. A 2018 Google-sponsored study found that 62.7% of computing energy, across a wide range of applications, is consumed shuttling data between memory, storage and the CPU. Computational storage can thus reduce emissions as well.

And then there is the problem of capacity. Cloud workloads and internet traffic have grown 10x and 16x, respectively, over the past decade, and will grow at an even faster rate in the coming years as AI-enhanced medical imaging, autonomous robots and other data-heavy applications move from concept to commercial deployment.

Unfortunately, servers, rack space and operating budgets struggle to grow at the same exponential rate. Amsterdam and other cities, for example, have imposed strict limits on the size of datacenters, forcing cloud providers and their customers to figure out how to do more within the same footprint.

Consider a traditional two-socket server with 16 drives. A typical server of this kind might have up to 64 computing cores (two processors with 32 cores each). With computational storage, the same server could potentially have 136: 64 server cores and 72 application accelerators inside its drives for preliminary tasks. Multiply that by the number of servers per rack, racks per datacenter, and datacenters per cloud empire, and computational drives have the potential to sharply increase the ROI of millions of square feet of real estate.
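The core budget in that example works out as follows; a minimal sketch, where the socket, core, and accelerator counts are the ones given above:

```python
# Core budget for the example two-socket server with 16
# computational storage drives.
sockets = 2
cores_per_cpu = 32
host_cores = sockets * cores_per_cpu   # 64 traditional server cores

accelerator_cores = 72                 # in-drive application accelerators,
                                       # spread across the 16 drives
total_cores = host_cores + accelerator_cores
print(total_cores)  # 136
```

The point is not the exact count but that the accelerator cores come "for free" in rack space: they live inside drive bays the server already has.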

The fine print

So if computational storage is so beneficial, why isn’t it already widespread? The reason is simple: a confluence of advances, from hardware to software to standards, must come together to change entrenched business practices. All of those factors are now aligning.

For example, a computational storage drive has to fit within the same power and space limits as a regular SSD in a server. That means the computational element can consume only two to three of the roughly eight watts allocated to the drive.

While some early computational SSDs relied on FPGAs, companies such as NGD Systems and ScaleFlux are adopting systems-on-chip (SoCs) built around Arm processors originally developed for smartphones. (An eight-core computational drive SoC might dedicate four cores to managing the drive and the rest to applications.) SSDs typically already contain a little DRAM – roughly 1GB for each terabyte of capacity. In some cases the computational unit can use it as a resource, and manufacturers can also add more.
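Put together, the per-drive resource split described above looks like this; a minimal sketch, where the drive capacity is a hypothetical value and the other numbers are the rules of thumb just given:

```python
# Illustrative per-drive resource split for a computational SSD.
soc_cores = 8               # eight-core drive SoC
mgmt_cores = 4              # dedicated to managing the drive itself
app_cores = soc_cores - mgmt_cores   # left over for application work

capacity_tb = 8             # hypothetical drive capacity
dram_gb = capacity_tb * 1   # rule of thumb: ~1 GB DRAM per TB of flash
print(app_cores, dram_gb)   # 4 8
```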

Additionally, computational storage drives can support standard cloud-native software stacks: containers built on Linux, managed with Kubernetes or Docker. Databases and machine learning algorithms for image recognition and other applications can also be loaded onto the drive.

Standards will also need to be finalized. The Storage Networking Industry Association (SNIA) last year released its 0.8 specification, which covers a wide range of issues such as security and configuration; the full specification is expected later this year.

Other innovations to expect: more ML acceleration and specialized SoCs, faster interconnects, improved on-chip security, better software for analyzing data in real time, and tools for merging data from distributed networks of drives.

Over time, we may also see the emergence of computational capabilities added to traditional rotating hard drives, which are still a workhorse of storage in the cloud.

Double-edged

Some of the early use cases will come at the edge, where computational drives are a natural fit. Microsoft Research and NGD Systems, for example, have found that computational storage drives can dramatically increase the number of image queries that can be performed by processing data directly on the CSD – one of the most discussed use cases – and that throughput scales linearly as drives are added.

Other prime targets are bandwidth-constrained devices with low-latency requirements, such as airplanes or autonomous vehicles. More than 8,000 aircraft carrying more than 1.2 million people are in the air at any given time. With computational storage, machine learning for predictive maintenance can run effectively during flight, increasing safety and reducing turnaround time.

Cloud providers are also experimenting with computational storage drives and will soon begin shifting to commercial deployments. Beyond offloading tasks from the more powerful application processors, computational drives can enhance security by running scans for malware and other threats locally.


Some may argue that the solution is obvious: reduce the computing workload! Companies collect more data than they use anyway.

However, that approach ignores an unfortunate truth about the digital world: We don’t know what data we need until we already have it. The only real choice is to devise ways to efficiently process the deluge of data coming our way. Computational drives will be an important linchpin in allowing us to filter through data without getting bogged down in the details. The insights generated from that data can unlock capabilities and use cases that could transform entire industries.

Mohammed Awad is Vice President of IoT and Embedded at Arm.

