AI Energy Usage Simulator (Bottoms Up)

Explore how AI query demand and model efficiency affect energy consumption over time.

Query Demand

Define how AI usage (queries) grows over time

Current settings: 300.0B, 80%, 40%, 8×
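As a minimal sketch of how a demand curve like this could be generated, the function below assumes compound monthly growth from an initial daily query volume. The growth model and the parameter names (initialDailyQueries, monthlyGrowthRate) are illustrative assumptions, not the simulator's actual internals.

```typescript
// Hypothetical demand curve: compound monthly growth from an initial volume.
// Both the growth model and the parameter names are assumptions for illustration.
function dailyQueriesAtMonth(
  initialDailyQueries: number, // e.g. 300e9, if the 300.0B setting is daily queries
  monthlyGrowthRate: number,   // e.g. 0.05 for 5% growth per month (assumed)
  monthIndex: number           // months since the start of the projection
): number {
  return initialDailyQueries * Math.pow(1 + monthlyGrowthRate, monthIndex);
}
```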

AI Model & Efficiency Factors

Select AI model and set efficiency parameters

Energy: 8.68 Wh per 1k queries
Overhead/PUE factor: 1.6× (includes idle power overhead and data center PUE; typical range 1.2-2.5×)
Training ratio: 30× (how much more energy-intensive training is compared to inference)
Monthly retraining rate: 30% (fraction of models retrained each month; 25% = retrain every 4 months)
Additional settings: 20%, 5
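If the 8.68 Wh figure is the model's score before overhead, the wall-plug energy per single query is that score scaled by the PUE/overhead factor. A minimal sketch, with the function name as an assumption:

```typescript
// Wall-plug energy per single query, assuming the energy score excludes overhead.
function wallPlugWhPerQuery(energyScoreWhPer1k: number, pueFactor: number): number {
  return (energyScoreWhPer1k * pueFactor) / 1000;
}

// With the values shown above: (8.68 * 1.6) / 1000 ≈ 0.0139 Wh per query.
```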

Energy Usage Formula

AI Energy Score: the selected model's energy use, in Wh per 1k queries
PUE Factor: the overhead/PUE multiplier set above
The calculated energy per query (wall-plug, i.e. including PUE and the initial efficiency setting) is derived from these values.

Inference Energy (Wh) = (Query Demand / 1000) × AI Energy Score × PUE Factor

(Note: AI Energy Score is in Wh per 1k queries and Query Demand is queries per day, so Inference Energy here is daily energy in Wh.)

Training Energy = Inference Energy × Training Ratio × Monthly Retraining Rate
Total Energy = Inference Energy + Training Energy
Annual efficiency gains further reduce the actual energy per query over time.
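As a minimal sketch, the formulas above translate directly into code; the interface and parameter names are assumptions, but the arithmetic follows the three lines as stated.

```typescript
// Direct translation of the formulas above. Parameter names are assumptions.
interface EnergyInputs {
  dailyQueries: number;        // Query Demand (queries per day)
  energyScoreWhPer1k: number;  // AI Energy Score (Wh per 1k queries)
  pueFactor: number;           // overhead / PUE multiplier
  trainingRatio: number;       // training vs. inference energy intensity
  monthlyRetrainRate: number;  // fraction of models retrained each month
}

function energyUsageWh(p: EnergyInputs) {
  // Inference Energy (Wh) = (Query Demand / 1000) × AI Energy Score × PUE Factor
  const inference = (p.dailyQueries / 1000) * p.energyScoreWhPer1k * p.pueFactor;
  // Training Energy = Inference Energy × Training Ratio × Monthly Retraining Rate
  const training = inference * p.trainingRatio * p.monthlyRetrainRate;
  // Total Energy = Inference Energy + Training Energy
  return { inference, training, total: inference + training };
}
```

For example, if the 300.0B setting above is the daily query demand, inference energy comes to (300e9 / 1000) × 8.68 × 1.6 ≈ 4.17 GWh per day before any efficiency gains are applied.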

Monthly Energy Usage Projection


Projection starts from ChatGPT launch (November 30, 2022)
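A sketch of how such a projection could be assembled, reusing energyUsageWh and dailyQueriesAtMonth from the sketches above. The monthly growth rate, the continuous compounding of annual efficiency gains, and the 30-day month are all assumptions rather than the simulator's actual implementation.

```typescript
// Hypothetical projection loop from the ChatGPT launch month onward.
// Assumes efficiency gains compound continuously (annual rate applied as m/12).
function monthlyProjection(
  months: number,
  initialDailyQueries: number,
  monthlyGrowthRate: number,     // assumed demand growth per month
  annualEfficiencyGain: number,  // e.g. 0.20 -> 20% less energy per query per year
  base: Omit<EnergyInputs, "dailyQueries">
) {
  const rows: { month: string; totalWh: number }[] = [];
  for (let m = 0; m < months; m++) {
    const date = new Date(2022, 10 + m, 1); // month buckets starting November 2022
    const efficiency = Math.pow(1 - annualEfficiencyGain, m / 12);
    const daily = energyUsageWh({
      ...base,
      dailyQueries: dailyQueriesAtMonth(initialDailyQueries, monthlyGrowthRate, m),
      energyScoreWhPer1k: base.energyScoreWhPer1k * efficiency,
    });
    rows.push({
      month: `${date.getFullYear()}-${String(date.getMonth() + 1).padStart(2, "0")}`,
      totalWh: daily.total * 30, // approximate month as 30 days
    });
  }
  return rows;
}
```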