Data Center Electricity (2024)

415 TWh

~1.5% of global electricity

More than South Africa's entire annual electricity consumption

Projected 2030

945 TWh

More than doubling in 6 years

Equivalent to Japan's total electricity consumption

US Data Center Water (2023)

64 B liters

Could quadruple by 2028

Enough to fill 25,600 Olympic swimming pools

Daily AI Users

600M

600 million people daily

More than the population of the European Union

Projected Data Center & AI Energy Demand

Source: IEA, Goldman Sachs projections

  • AI Share (TWh)
  • Total Data Center (TWh)
[Chart: projected demand by year, 2024–2030; y-axis 0–1000 TWh]
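The "more than doubling" claim can be checked directly from the two headline figures (415 TWh in 2024, 945 TWh projected for 2030). A quick sketch of the arithmetic:

```python
# Check the growth math behind the 2024 -> 2030 projection.
start_twh = 415   # global data center electricity, 2024
end_twh = 945     # IEA projection for 2030
years = 6

growth_factor = end_twh / start_twh      # ~2.28x: "more than doubling"
cagr = growth_factor ** (1 / years) - 1  # implied compound annual growth rate

print(f"Growth factor: {growth_factor:.2f}x")  # → 2.28x
print(f"Implied annual growth: {cagr:.1%}")    # → 14.7%
```

A ~15% compound annual growth rate sustained for six years is what turns 415 TWh into roughly Japan's total electricity consumption.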

Carbon Intensity by Region

The same AI query can produce 15x more CO2 depending on where the data center is located.

  • Renewable %
  • g CO2/kWh
[Chart: regions from cleanest to dirtiest grid — Northern Europe (Sweden), Brazil, US West (Oregon), Western Europe, Germany, US Central, US East (Virginia), Australia, India; y-axis 0–800 g CO2/kWh]
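The 15x figure follows from multiplying a fixed per-query energy by each region's grid carbon intensity. The numbers below are illustrative round values (not measurements from the chart), chosen to show how the spread arises:

```python
# Illustrative sketch: the same query on three different grids.
# Per-query energy and grid intensities are assumed round numbers.
query_kwh = 0.0003  # ~0.3 Wh per query (assumed)

grid_g_per_kwh = {
    "Northern Europe (Sweden)": 45,   # hydro/nuclear-heavy grid
    "US East (Virginia)": 390,        # mixed grid
    "India": 675,                     # coal-heavy grid
}

for region, intensity in grid_g_per_kwh.items():
    grams = query_kwh * intensity
    print(f"{region}: {grams * 1000:.1f} mg CO2 per query")

spread = max(grid_g_per_kwh.values()) / min(grid_g_per_kwh.values())
print(f"Spread: {spread:.0f}x")  # → 15x
```

The query itself does the same computation everywhere; only the grid behind the data center changes.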

Hardware & Infrastructure Impact

CO2 per GPU Manufactured

200 kg

Equivalent to driving 500 miles

GPU Obsolescence

2.5 years

Rapid hardware turnover

E-waste Recycling Rate

25%

75% of e-waste is not recycled

Inference vs Training

85%

of total AI energy is inference

AI Provider Sustainability Scorecard

Note: Corporate emissions may be 7.62x higher than reported due to favorable accounting (market-based vs location-based).

Provider | Renewable Energy | Emissions | Water | Transparency | Notes
--- | --- | --- | --- | --- | ---
Google | 100% matched | +50% since 2019 | +88% since 2019 | High | Published per-query energy data for Gemini; 33x efficiency gain in one year.
Microsoft | 34 GW contracted | +23.4% since 2020 | WUE improved 39% | High | Carbon-negative-by-2030 goal; launched zero-water cooling designs.
Amazon (AWS) | 100% matched (claimed) | Rising in 2024 | Not disclosed in detail | Medium | Largest corporate renewable buyer five years running.
Meta | Grid-based matching | Growing with AI investment | Not disclosed in detail | Medium | Pursuing nuclear power for data centers.
OpenAI | Not published | Not published | Not published | Low | No dedicated sustainability report as of 2026.
Anthropic | Not published | Not published | Committed to water-efficient cooling | Low | Claude scored highest eco-efficiency on AWS; no emissions data published.

Key Facts

Inference dominates: 85% of AI energy goes to running queries, not training models.

Power demand growth: Goldman Sachs projects 165% increase in data center power demand by 2030.

Fossil fuel reliance: ~60% of global data center energy still comes from fossil fuels (30% coal, 26% natural gas).

Water crisis: US data center water use (17 billion gallons in 2023) could quadruple by 2028.

Efficiency gains: the cost of inference at GPT-3.5-level performance dropped 280x in two years (Stanford AI Index).

Nuclear renaissance: Three Mile Island may reopen in 2028 to power Microsoft data centers.
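The water figures above can be cross-checked with a unit conversion (one Olympic pool holds roughly 2.5 million liters):

```python
# Cross-check the water figures: gallons -> liters -> Olympic pools.
gallons_2023 = 17e9        # US data center water use, 2023
liters_per_gallon = 3.785
pool_liters = 2.5e6        # one Olympic swimming pool, approx.

liters = gallons_2023 * liters_per_gallon  # ~64 billion liters
pools = liters / pool_liters

print(f"{liters / 1e9:.0f} billion liters ≈ {pools:,.0f} Olympic pools")
```

This recovers both headline numbers: 17 billion gallons is about 64 billion liters, or on the order of 25,600 Olympic pools.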

Where the Energy Goes

Inference — 85%
Training — 15%

Inference — responding to user queries — now accounts for 80–90% of AI's total energy consumption. Every question you ask contributes to the larger slice.

What's Being Done

Google achieved a 33x reduction in Gemini's per-query energy between May 2024 and May 2025 through a combination of hardware upgrades (TPU v5e/v6e), model distillation, and inference optimization.

Mixture-of-Experts architectures (like DeepSeek-V3) activate only ~5% of their parameters per token, dramatically reducing computation. This architectural shift is one of the most promising avenues for reducing inference energy.
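The ~5% figure can be sketched from DeepSeek-V3's published parameter counts (671B total, 37B activated per token); the FLOPs rule of thumb (~2 x active parameters per token) is a standard approximation, not a measured value:

```python
# Rough sketch of why Mixture-of-Experts cuts per-query compute.
total_params_b = 671   # DeepSeek-V3 total parameters (billions)
active_params_b = 37   # parameters activated per token (billions)

active_fraction = active_params_b / total_params_b
print(f"Active per token: {active_fraction:.1%}")  # → 5.5%

# Transformer inference FLOPs scale roughly with 2 * active params
# per token, so compute (and energy) drops proportionally versus a
# dense model of the same total size.
dense_flops = 2 * total_params_b * 1e9
moe_flops = 2 * active_params_b * 1e9
print(f"Compute reduction: {dense_flops / moe_flops:.1f}x")  # → 18.1x
```

The model keeps the capacity of 671B parameters while paying the inference cost of a 37B-parameter model.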

Hardware efficiency improves approximately 2x every 18 months, but total demand grows faster. This is a classic example of the Jevons paradox — as AI becomes more efficient, we use more of it, and total energy consumption continues to rise.
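The Jevons dynamic is easy to see numerically. The 2x/18-month efficiency figure comes from the paragraph above; the demand growth rate below is an illustrative assumption, not a forecast:

```python
# Jevons paradox in miniature: efficiency improves 2x every 18 months,
# but demand grows faster. Demand growth rate is illustrative.
eff_gain_per_year = 2 ** (12 / 18)  # ~1.59x per year from the 2x/18mo figure
demand_gain_per_year = 2.0          # assumed: usage doubles every year

energy = 100.0  # energy index, year 0
for year in range(1, 4):
    energy *= demand_gain_per_year / eff_gain_per_year
    print(f"Year {year}: energy index {energy:.0f}")
```

Even with hardware getting ~59% more efficient every year, total energy in this scenario still climbs about 26% annually, doubling in roughly three years.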

Curious about your own impact?

Calculate your AI footprint →