Nvidia is estimated to ship around 550,000 of its latest H100 compute GPUs worldwide in 2023, with the majority going to American tech firms, according to sources close to Nvidia and TSMC.
Nvidia's A100 GPU has a maximum power consumption of 250W in its PCIe form factor and 400W in SXM, while the H100 draws considerably more: 300-350W for PCIe and up to 700W for SXM, which is 75% higher than the A100 SXM. This rising per-chip power draw is the result of boosted compute performance in each new generation of AI accelerators.
The peak power consumption of AMD's MI300X accelerator is 750W, roughly two-thirds higher than the 450W rating of Nvidia's RTX 4090. This increase in power consumption reflects the growing demands of AI workloads and the push for ever greater compute performance.
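As a quick sanity check of the percentage figures above, the following sketch computes the generational increases from the spec-sheet wattages cited in this article (the helper function and dictionary names are illustrative, not from any vendor tool):

```python
def percent_increase(old_w: float, new_w: float) -> float:
    """Return the percentage increase from old_w to new_w."""
    return (new_w - old_w) / old_w * 100

# Peak power figures (watts) as cited in the article.
specs_w = {
    "A100 SXM": 400,
    "H100 SXM": 700,
    "RTX 4090": 450,
    "MI300X": 750,
}

# H100 SXM vs. A100 SXM: the "75% higher" figure.
h100_vs_a100 = percent_increase(specs_w["A100 SXM"], specs_w["H100 SXM"])

# MI300X vs. RTX 4090: roughly two-thirds higher, not double.
mi300x_vs_4090 = percent_increase(specs_w["RTX 4090"], specs_w["MI300X"])

print(f"H100 SXM vs A100 SXM: +{h100_vs_a100:.0f}%")   # +75%
print(f"MI300X vs RTX 4090: +{mi300x_vs_4090:.1f}%")   # +66.7%
```

The arithmetic shows why "nearly double" would overstate the MI300X comparison: 750W is about 1.67 times 450W.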