Welcome to Eye on AI, with AI reporter Sharon Goldman. In this edition: SoftBank plans to list a new AI and robotics company in the US…AI model’s goblin habit, explained…Putting Google’s AI to the test as a trip planner.
If Big Tech’s AI spending spree were a climb up Mount Everest, the companies would still be ascending toward the summit, dizzy from the altitude.
The market reaction has been mixed. Shares of Meta fell sharply after its earnings report as investors focused on the scale of its AI spending plans, and Microsoft also slipped. By contrast, Alphabet and Amazon rose on strong cloud growth—highlighting a growing divide on Wall Street over whether this buildout is justified or getting ahead of itself.
It’s important to understand how much of that spending is going directly into the physical infrastructure that supports AI—both training frontier models and running them. But it can be hard to wrap your mind around the scale of this buildout.
One key piece of that infrastructure, beyond the chips themselves, is networking: the cables and switches that connect thousands of chips so they can work together. Training and running modern AI models requires constant, high-speed communication between machines, using specialized switches, fiber-optic or Ethernet connections, and network cards. Without that, even the most powerful chips can’t do much.
This AI spending race is now in its third year and still shows no signs of slowing. In 2024, the combined capex of the four biggest hyperscalers was just over $200 billion. Two years later, it’s on track to approach $700 billion.
If this is a climb, there’s still no clear view of the summit.
With that, here’s more AI news.