Welcome to Eye on AI! In this edition...OpenAI releases GPT-5 and strikes a $1 government deal. AI homework helpers duke it out. Zoox gets a special exemption.
One of the defining truths about the world of generative AI is that even when you’re on top, the lead doesn’t last for long.
OpenAI says GPT-5 delivers “more accurate answers than any previous reasoning model,” and is “much smarter across the board,” reflected by strong performance on academic and human-evaluated benchmarks. Its research blog boasts of new state-of-the-art performance across math, coding, and health questions, and reports that GPT-5 outperformed other OpenAI models on tasks spanning more than 40 occupations, including law, logistics, sales, and engineering.
“GPT-5 really feels like talking to a PhD level expert in any topic,” OpenAI CEO Sam Altman told journalists in a pre-briefing on Wednesday. “Something like GPT-5 would be pretty much unimaginable in any other time in history.”
Altman described GPT-5 as a “significant step” along the path to artificial general intelligence (AGI), which, according to OpenAI’s mission statement, is defined as “highly autonomous systems that outperform humans at most economically valuable work.”
Whether GPT-5 propels OpenAI back to the top of the AI hill will become clear in the days and weeks ahead, as researchers put the model through its paces, testing it against other elite models, including Anthropic’s latest Claude model and Google’s Gemini.
With GPT-5 finally out, Altman acknowledged that staying at the frontier means one thing: relentless scaling.
In AI, scaling refers to the idea that models get more powerful as you increase the amount of data, computing power, and model parameters used during training. It’s the underlying principle that drove progress from GPT-2 to GPT-3 to GPT-4—and now GPT-5. The catch is that each leap requires exponentially more investment, particularly in AI infrastructure—for OpenAI, that includes its Stargate Project, a joint venture it announced in January with SoftBank, Oracle, and investment firm MGX, with a goal of investing up to $500 billion by 2029 in AI-specific data centers across the U.S.
When asked whether scaling laws still hold, Altman said they “absolutely” do. He pointed to better models, smarter architectures, higher-quality data, and significantly more computing power as the path to “order-of-magnitude” improvements still ahead.
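The scaling laws Altman refers to are empirical power laws: training loss falls predictably as parameters and data grow, but each doubling buys a smaller gain, which is why the next leap demands so much more compute. A minimal sketch of this shape, using the functional form from the Chinchilla paper (Hoffmann et al., 2022) purely for illustration — the constants are that paper’s published fits, not anything OpenAI has disclosed about GPT-5:

```python
def expected_loss(params: float, tokens: float) -> float:
    """Power-law loss estimate: loss falls as parameters and data grow.

    Form: L(N, D) = E + A / N**alpha + B / D**beta
    Constants are the Chinchilla fits (illustrative only).
    """
    E, A, B = 1.69, 406.4, 410.7   # irreducible loss + fitted coefficients
    alpha, beta = 0.34, 0.28       # fitted exponents
    return E + A / params**alpha + B / tokens**beta

# Scaling both model size and training data 10x lowers predicted loss,
# but the absolute improvement shrinks with each successive scale-up.
small = expected_loss(1e9, 2e10)    # ~1B params, ~20B tokens
large = expected_loss(1e10, 2e11)   # ~10B params, ~200B tokens
print(small > large)  # bigger run has lower predicted loss
```

The diminishing-returns shape of the curve is the whole economic story: the gains keep coming, but only against exponentially growing compute bills — hence the “eyewatering amount of compute” Altman describes below.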
But that kind of progress comes at a cost. “It’s going to take an eyewatering amount of compute,” he admitted. “But we intend to continue doing it.”
That, of course, requires massive amounts of money and partnerships.
There’s much more to say about GPT-5. Journalists received the full set of materials from OpenAI, including a research blog, system card, and safety card, a mere 90 minutes before the model’s release, so I still have plenty to go through. Stay tuned for more!
With that, here’s the rest of the AI news.