
AI models are rapidly consuming power

  • ScaleFlux 
  • 2 min read

AI and machine learning models like ChatGPT require large amounts of power due to the intensive processing involved. ChatGPT handles roughly 200 million requests daily and consumes over 500,000 kilowatt-hours of electricity, about 17,000 times the average daily consumption of a U.S. household (29 kilowatt-hours).
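Those figures can be sanity-checked with simple arithmetic. The sketch below uses only the numbers quoted above; the derived energy-per-request value is an illustration, not a figure from the article.

```python
# Back-of-the-envelope check of the figures quoted above.
daily_requests = 200_000_000      # ChatGPT requests per day (from the article)
daily_kwh = 500_000               # electricity consumed per day, kWh (from the article)
household_daily_kwh = 29          # average U.S. household daily use, kWh (from the article)

# Ratio of ChatGPT's daily consumption to a single household's:
ratio = daily_kwh / household_daily_kwh
print(f"~{ratio:,.0f}x a household's daily use")   # ~17,241x, i.e. roughly 17,000x

# Implied energy per request, in watt-hours (derived, not quoted in the article):
wh_per_request = daily_kwh * 1000 / daily_requests
print(f"{wh_per_request:.1f} Wh per request")      # 2.5 Wh
```

The 500,000 / 29 ratio comes out at about 17,241, consistent with the "17,000 times" claim.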

AI models use extensive data for training, and their energy consumption rises with the complexity and popularity of the model.

  • Training a single AI model can consume more electricity than 100 U.S. households use in a year.
  • By 2030, AI model development is projected to account for 13% of global electricity usage and 6% of global carbon emissions.

This combination of data-heavy workloads and ever-growing energy demand for complex data processing calls for changes to IT architecture.

Boosting AI efficiency with computational storage and Arm-based tech

Innovative solutions like SSDs with computational storage and Arm-based computing enhance performance and energy efficiency. These technologies optimize data processing and reduce energy consumption, supporting sustainability efforts in IT infrastructures.

“Managing large AI datasets presents significant storage efficiency and power challenges. Enhancing the storage, memory, and GPU pipeline with computational storage-enabled SSDs is crucial for companies to achieve energy efficiency and sustainability goals,” explains JB Baker, Vice President of Products at ScaleFlux.

ScaleFlux is a pioneer in deploying Computational Storage at scale. Computational Storage is the foundation for modern, data-driven infrastructure, enabling responsive performance, affordable scaling, and agile platforms for compute- and storage-I/O-intensive applications. Founded in 2014, ScaleFlux is a well-funded startup whose leaders have a proven record of deploying complex computing and solid-state storage solutions in volume.