That’s almost the scale of electricity that Sam Altman and his partners say will be devoured by their next wave of AI data centers—a single corporate project consuming more power, every single day, than two American cities pushed to their breaking point.
The announcement is a “seminal moment” that Andrew Chien, a professor of computer science at the University of Chicago, says he has long seen coming.
“I’ve been a computer scientist for 40 years, and for most of that time computing was the tiniest piece of our economy’s power use,” Chien told Fortune. “Now it’s becoming a large share of what the whole economy consumes.”
He called the shift both exciting and alarming.
“It’s scary because computing was always the tiniest piece of our economy’s power use,” he said. “Now it could be 10% or 12% of the world’s power by 2030. We’re coming to some seminal moments for how we think about AI and its impact on society.”
“It’s pretty amazing,” Chien said. “A year-and-a-half ago they were talking about five gigawatts. Now they’ve upped the ante to 10, 15, even 17. There’s an ongoing escalation.”
Fenqi You, an energy systems professor at Cornell University, who also studies AI, agreed.
“Ten gigawatts is more than the peak power demand in Switzerland or Portugal,” he told Fortune. “Seventeen gigawatts is like powering both countries together.”
“So you’re talking about an amount of power that’s comparable to 20% of the whole Texas grid,” Chien said. “That’s for all the other industries—refineries, factories, households. It’s a crazy large amount of power.”
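Chien’s comparison holds up to back-of-the-envelope arithmetic. The sketch below checks it in Python; the ERCOT record peak demand figure (roughly 85 gigawatts, set in summer 2023) is an outside assumption, not a number from the article.

```python
# Rough sanity check of the "20% of the whole Texas grid" comparison.
project_gw = 17        # upper end of the data-center build-out cited above
ercot_peak_gw = 85     # assumed ERCOT record peak demand, ~85 GW

share = project_gw / ercot_peak_gw
print(f"{share:.0%} of Texas peak demand")  # → 20%
```

In other words, the upper end of the plans would rival a fifth of everything the Texas grid has ever had to supply at once.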
Altman has framed the build-out as necessary to keep up with AI’s runaway demand.
“This is what it takes to deliver AI,” he said in Texas. Usage of ChatGPT, he noted, has jumped tenfold in the past 18 months.
Chien, however, is blunt about the near-term limits.
“As far as I know, the amount of nuclear power that could be brought on the grid before 2030 is less than a gigawatt,” he said. “So when you hear 17 gigawatts, the numbers just don’t match up.”
With projects like OpenAI’s demanding 10 to 17 gigawatts, nuclear is “a ways off, and a slow ramp, even when you get there.” Instead, he expects wind, solar, natural gas, and new storage technologies to dominate.
You struck a middle ground. He said nuclear may be unavoidable in the long run if AI keeps expanding, but cautioned that “in the short term, there’s just not that much spare capacity,” whether fossil, renewable, or nuclear. “How can we expand this capacity in the short term? That’s not clear,” he said.
He also warned that the timeline may be unrealistic.
“A typical nuclear plant takes years to permit and build,” he said. “In the short term, they’ll have to rely on renewables, natural gas, and maybe retrofitting older plants. Nuclear won’t arrive fast enough.”
The environmental costs loom large for these experts, too.
“We have to face the reality that companies promised they’d be clean and net zero, and in the face of AI growth, they probably can’t be,” Chien said.
Ecosystems could come under stress, Cornell’s You said.
“If data centers consume all the local water or disrupt biodiversity, that creates unintended consequences,” he said.