Power Consumption AI News & Updates
AI Data Centers Projected to Reach $200 Billion Cost and Nuclear-Scale Power Needs by 2030
A new study from Georgetown, Epoch AI, and RAND indicates that AI data centers are growing at an unprecedented rate, with computational performance more than doubling annually alongside power requirements and costs. If current trends continue, the leading AI data center in 2030 could contain 2 million AI chips, cost $200 billion, and draw 9 gigawatts of power—roughly the output of nine nuclear reactors.
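To make the trend concrete, the compounding can be sketched in a few lines of Python. The 2030 endpoints (2 million chips, $200 billion, 9 GW) come from the study as reported above; the intermediate per-year values and the exact 2x growth factor are illustrative assumptions, since the study says only that the figures "more than double" annually.

```python
# Sketch: back-project the study's 2030 figures under an assumed
# doubling-per-year trend. Only the 2030 endpoints are from the
# reported study; the yearly path is an illustration.

END_YEAR = 2030
GROWTH = 2.0  # assumed annual multiplier ("more than doubling")

endpoints_2030 = {
    "chips": 2_000_000,   # AI chips in the leading data center
    "cost_usd": 200e9,    # total cost
    "power_gw": 9.0,      # power draw, in gigawatts
}

def back_project(value_2030: float, year: int) -> float:
    """Value implied in `year` by compounding to the 2030 endpoint."""
    return value_2030 / GROWTH ** (END_YEAR - year)

for year in range(2025, END_YEAR + 1):
    chips = back_project(endpoints_2030["chips"], year)
    gw = back_project(endpoints_2030["power_gw"], year)
    print(f"{year}: ~{chips:,.0f} chips, ~{gw:.2f} GW")
```

Under these assumptions, each year's figures are half the next year's, which is what makes the 2030 totals so sensitive to whether the doubling trend actually holds.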
Skynet Chance (+0.04%): The massive scaling of computational infrastructure enables training increasingly powerful models whose behaviors and capabilities may become more difficult to predict and control, especially if deployment outpaces safety research due to economic pressures.
Skynet Date (-2 days): The projected annual doubling of computational resources could compress timelines for developing systems with potentially uncontrollable capabilities, especially given the pressure to recoup enormous infrastructure investments.
AGI Progress (+0.1%): The dramatic increase in computational resources directly enables training larger and more capable AI models, which has historically been one of the most reliable drivers of progress toward AGI capabilities.
AGI Date (-4 days): The projected sustained doubling of AI compute resources annually through 2030 significantly accelerates AGI timelines, as compute scaling has been consistently linked to breakthrough capabilities in AI systems.