NVIDIA’s H100 AI GPUs are projected to hit a deployed capacity of 3.5 million units by next year, but this comes with a caveat: the electricity consumption associated with these high-performance GPUs.
While the growing install base may be a plus for CEO Jensen Huang’s AI vision, analysts warn that the GPUs will collectively consume more electricity each year than some entire countries.
High energy consumption
According to estimates from market analysis source Stock Talk, the collective power demand of 3.5 million H100 units is anticipated to reach approximately 13,000 gigawatt-hours (GWh) annually.
This energy consumption is even greater than the annual consumption of some countries, like Lithuania and Guatemala.
If this number comes to pass, the fleet would draw 13 terawatt-hours a year, a notable share of the scale set during the 2020 cryptocurrency mining boom, when an estimated 173.42 terawatt-hours were consumed by cryptocurrency mining alone.
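As a rough sanity check (the utilization figure below is our assumption, not Stock Talk’s), the 13,000 GWh estimate implies an average draw of a bit over 400 W per GPU, well under the H100 SXM’s 700 W rated TDP, which is plausible since few GPUs run flat out around the clock:

```python
# Back-of-envelope check of the ~13,000 GWh/year estimate.
# Assumption (ours, not Stock Talk's): average draw is a fraction
# of the H100 SXM's 700 W TDP over a full year of mixed workloads.

UNITS = 3_500_000        # projected deployed H100 GPUs
TDP_W = 700              # H100 SXM thermal design power, watts
AVG_UTILIZATION = 0.61   # assumed average draw as a fraction of TDP
HOURS_PER_YEAR = 8_760

avg_draw_w = TDP_W * AVG_UTILIZATION
annual_gwh = UNITS * avg_draw_w * HOURS_PER_YEAR / 1e9  # 1 GWh = 1e9 Wh

print(f"Average draw per GPU: {avg_draw_w:.0f} W")
print(f"Fleet consumption: {annual_gwh:,.0f} GWh/year")  # ~13,090 GWh

# For scale against the 2020 crypto mining boom (173.42 TWh):
crypto_2020_twh = 173.42
print(f"Share of 2020 crypto mining: {annual_gwh / 1e3 / crypto_2020_twh:.1%}")
```

Raising or lowering the assumed utilization moves the total proportionally, so the published figure is best read as an order-of-magnitude estimate rather than a measurement.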
“By next year, Nvidia’s ~3.5 million units of deployed H100 GPUs will consume a whopping 13,000 GWh of electricity annually, which is greater than the power consumption of some entire countries like Guatemala and Lithuania,” wrote Stock Talk in a post on the X platform.
A Wccftech report indicates that Nvidia’s leadership in the market is cemented by the H100 AI GPUs, which have been widely deployed globally and are unquestionably at the forefront of the AI hardware and software ecosystem.
The chip-making firm plans to sell between 1.5 and 2 million units next year, which could triple the electricity consumption currently attributed to H100 GPUs. The surge in demand for AI accelerators is a testament to the growing global need for advanced AI capabilities.
Demand for AI to spur demand for energy
Nvidia is the leading player in AI hardware, and demand for these components is poised to surge as several supercomputers gear up for operations in the coming years. Competitors, in turn, are working overtime to bring rival products to market.
According to Wccftech, power consumption across AI-related industries and data centers is expected to rise beyond earlier projections. Some users on the X platform seem to agree.
“The crazy thing is that $NVDA (Nvidia) is just getting started. Imagine that a few years from now, Nvidia will be consuming enough energy to power multiple nations,” wrote a user identified as Mon on the X platform.
But Nvidia is not alone; it’s an industry-wide challenge.
An industry-wide challenge
The AI industry and related sectors are grappling with high energy and water consumption as demand for AI services surges this year.
A JLL report revealed that data centers are consuming ever more energy due to increased demand for generative AI services, spurred by the launch of OpenAI’s ChatGPT.
The report further shows the costs of running data centers are likely to continue skyrocketing as demand for AI services surges.
Data centers are also reportedly consuming large volumes of water to cool servers as usage of the technology climbs, according to a SiliconANGLE report.
Microsoft alone saw a jump in water consumption in 2023 compared to the prior year. The company reportedly used 6.4 billion liters of water, enough to fill more than 2,500 Olympic-sized swimming pools.
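The pool comparison is easy to verify: an Olympic-sized swimming pool holds roughly 2,500 cubic meters, or 2.5 million liters (the pool volume is a standard figure, not from the report). A quick conversion:

```python
# Convert Microsoft's reported water use into Olympic-sized pools.
# An Olympic pool (50 m x 25 m x 2 m deep) holds ~2,500 m^3 = 2.5e6 liters.

water_liters = 6.4e9   # Microsoft's reported annual water use
pool_liters = 2.5e6    # volume of one Olympic-sized swimming pool

pools = water_liters / pool_liters
print(f"Equivalent Olympic pools: {pools:,.0f}")  # -> 2,560
```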
Google saw a 20% jump in water consumption at its data centers this year compared to 2022, although the increase was uneven across facilities.
Experts have called for greater adoption of liquid cooling systems and for ultra-efficient data centers powered by renewable energy.