Tackling the Volume Challenge, Computational Storage Sets Up for Mainstream Adoption

  • JB Baker 
  • 2 min read

Computational storage embeds processors inside the drive itself, offloading burdensome tasks such as compression and encryption from the server CPU.
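
To make the offload concrete, here is a minimal, purely illustrative Python sketch. The `ConventionalDrive` and `ComputationalStorageDrive` classes and their `write` methods are hypothetical stand-ins, not a real device API; the point is only where the compression work lands: on the host CPU in the conventional path, and on the drive's embedded processor in the CS path.

```python
import zlib

PAYLOAD = b"sensor reading 42, status=OK; " * 10_000  # sample data

class ConventionalDrive:
    """Plain block device: any compression must run on the host CPU."""
    def write(self, data: bytes) -> int:
        return len(data)  # bytes written to media

class ComputationalStorageDrive:
    """Hypothetical CS drive (illustrative only, not a real device API):
    the host hands over raw data and the drive's embedded processor
    compresses it in-line, freeing host CPU cycles for the application."""
    def write(self, data: bytes) -> int:
        stored = zlib.compress(data)  # stands in for on-drive compression hardware
        return len(stored)

# Conventional path: the host burns CPU cycles compressing before the write.
host_compressed = zlib.compress(PAYLOAD)
ConventionalDrive().write(host_compressed)

# CS path: the host issues an ordinary write of the raw payload;
# compression happens on the drive, transparently to the application.
ComputationalStorageDrive().write(PAYLOAD)
```

On actual CS drives this typically happens behind a standard block interface, so the application sees an ordinary drive while the on-drive processor does the work.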

The one constant in data storage over the years is the inexorable increase in the volume of data that needs to be stored. While the industry at large has done an exceptional job of keeping pace, consistently driving down the cost per GB, other factors become more limiting as data continues to grow. After all, as Albert Einstein reputedly said, "Only two things are infinite: the universe and human stupidity." A dubious attribution, to be sure, but the point stands: storage capacity is not endless, as systems administrators everywhere will readily attest. That is one of the biggest reasons computational storage (CS) is ripe for a surge into the mainstream. The other is the growth of edge computing, where capacity, footprint, power, and lifespan are all limiting factors.

Market research firm Gartner has acknowledged the growing importance of CS adoption in its Hype Cycle for Storage and Data Protection Technologies, 2022. Gartner also notes that more than 40 percent of enterprise storage will be deployed at the edge within the next few years, and predicts that by 2026 large enterprises will triple their unstructured data capacity stored as file or object storage on-premises, at the edge, or in the public cloud. […]

JB Baker

JB Baker is a successful technology business leader with a track record of driving top- and bottom-line growth through new products for enterprise and data center storage. He joined ScaleFlux in 2018 to lead Product Planning & Marketing as the company expands the capabilities of Computational Storage and its adoption in the marketplace.