Welcome to Eye on AI, with AI reporter Sharon Goldman. In this edition…OpenAI’s gigawatt arms race is underway in Abilene, Texas…Nscale announces a record-breaking $1.1 billion Series B…OpenAI and Databricks strike an AI agent deal…the Trump administration will make Elon Musk’s xAI available to federal agencies.
Altman and OpenAI have been relentless in their drive to “scale compute.” By this, they don’t mean chasing the next algorithmic breakthrough or elegant line of code. They mean brute industrial force: millions of chips, sprawling campuses wired with fiber, and gigawatts of electricity — along with the gallons of water needed to help cool all that equipment. To OpenAI, scaling compute means piling on ever more of this horsepower, betting that sheer scale — not software magic — is what will unlock not just artificial general intelligence (AGI), which the company defines as “highly autonomous systems that outperform humans at most economically valuable work,” but what it calls artificial superintelligence (ASI), which would hypothetically surpass human capabilities in all domains.
That’s why OpenAI keeps pointing to a number: 10 gigawatts of capacity across the Stargate project sites by the end of 2025. Ten gigawatts — enough to power roughly 7.5 million homes, or an entire regional grid — marks a shift in how AI capacity is measured. At this scale, Altman explained to me with a quick handshake before the press gaggle, companies like OpenAI don’t even bother counting GPUs anymore. The unit of measure has become gigawatts: how much electricity the entire fleet of chips consumes. That number is shorthand for the only thing that matters: how much compute the company can keep running.
To put Altman’s longer-term ambition into perspective: 250 gigawatts would be roughly a fifth of the entire U.S. electrical generation capacity, which hovers around 1,200 GW. And Altman isn’t just talking about electricity. The number is shorthand for the entire industrial system required to use it: the chips, the data centers, the cooling and water, the networking fiber and high-speed interconnects to tie millions of processors into supercomputers.
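The back-of-the-envelope math behind those figures is easy to check. A minimal sketch, assuming an average U.S. household draws about 1.33 kW on a continuous basis (a rough national average — actual consumption varies widely by region and season):

```python
# Sanity-check the newsletter's electricity figures.
# Assumption: an average U.S. home draws ~1.33 kW continuously
# (roughly 11,600 kWh per year); this is an illustrative estimate.

AVG_HOME_KW = 1.33        # assumed average household draw, in kW
US_CAPACITY_GW = 1200     # approximate U.S. generation capacity, in GW

def homes_powered(gigawatts: float) -> float:
    """Number of average homes a given capacity could supply."""
    return gigawatts * 1e6 / AVG_HOME_KW  # convert GW to kW, then divide

print(f"10 GW ≈ {homes_powered(10) / 1e6:.1f} million homes")
print(f"250 GW ≈ {250 / US_CAPACITY_GW:.0%} of U.S. generation capacity")
```

Under these assumptions, 10 GW works out to about 7.5 million homes, matching the figure above, and 250 GW to roughly a fifth of U.S. capacity.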
Heath reported that Altman’s Slack note announced OpenAI is “formalizing the industrial compute team,” led by Peter Hoeschele, who reports to president Greg Brockman. “The mission is simple: create and deliver massive usable compute as fast as physics allows, to power us through ASI,” Altman wrote. “In several years, I think this could be something like a gigawatt per week, although that will require us to completely reimagine how we build compute.”
“Industrial compute should be considered a new core bet (like research, consumer devices, custom chips, robotics, applications, etc.) which will hire and operate in the way it needs to run at maximum effectiveness for the domain,” Altman continued. “We’ve already invested hundreds of billions of dollars, and doing this right will cost trillions. We will need support from team members across OpenAI to help us move fast, unlock projects, and clear the path for the buildout ahead.”
A quarter of the U.S. power grid. Trillions in cost. Does that sound bonkers to you? It does to me — which is precisely why I hopped on a plane to Dallas, rented a car, and drove three hours through rolling hills and ranches to Abilene to see for myself. The scale of this one site is staggering. Imagining it multiplied by dozens is nearly impossible.
I told Altman that the scene in Abilene reminded me a bit of a tour I recently took of Hoover Dam, one of the great engineering feats of the 20th century that produces about 2 gigawatts of power at capacity. In the 1930s, Hoover Dam was a symbol of American industrial might: concrete, turbines, and power on a scale no one had imagined.
Altman acknowledged that “people like to pick their historical analogies” and thought the “vibe was right” to compare Stargate to Hoover Dam. It wasn’t his personal favorite, however: “A recent thing I’ve thought about is airplane factories,” he said. “The history of what went into airplane factories, or container ships, the whole industry that came around those. And certainly, everything that went into the Apollo program.”
That’s when I realized: whether you think Altman’s goals make sense, seem nuts, or feel downright reckless really comes down to what you believe about AI itself. If you think supercharged versions of AI will change everything — and mostly for the good, like curing cancer — or you are a China hawk who wants to win the new AI ‘cold war,’ then Altman’s empire of data centers looks like a necessary bet. If you’re skeptical, it looks like the biggest boondoggle since America’s grandest infrastructure follies: think California’s long-awaited high-speed rail. If you’ve read Karen Hao’s Empire of AI, you might also be shouting that scaling isn’t inevitable — that building a ‘compute empire’ risks centralizing power, draining resources, and sidelining efficiency and safety. And if you think AGI will kill us all, like Eliezer Yudkowsky? Well, you won’t be a fan.
No one can predict the future, of course. My greater concern is that there isn’t nearly enough public awareness of what’s happening here. I don’t mean just in Abilene, with its mesquite shrubland ground into dust, or even OpenAI’s expanding Stargate ambitions around the U.S. and beyond. I mean the vast, almost unimaginable infrastructure buildout across Big Tech — the buildout that’s propping up the stock market, fueling a data center arms race with China, and reshaping energy, land, and labor around the world. Are we sleepwalking into the equivalent of an AI industrial revolution — not a metaphorical one, but one of actual physical construction — without truly reckoning with its costs versus its benefits?
Even Sam Altman doesn’t think enough people understand what he’s talking about. “Do you feel like people understand what ‘compute’ is?” I asked him outside of Building 2. That is, does the average citizen really grok what Altman is saying about the physical manifestation of these mega data centers?
“No, that’s why we wanted to do this,” he said about the Abilene media event. “I don’t think when you hit the button on ChatGPT…you think of walking the halls here.”
Of course, Hoover Dam, too, was divisive, controversial, and considered risky. But I wasn’t alive when it was built. This time I could see the dust rising in Abilene with my own eyes — and while Altman talked about walking the newly built halls filled with racks of AI chips, I walked away unsettled about what comes next.
With that, here’s the rest of the AI news.