OpenAI CEO Sam Altman isn’t worried about AI’s increasingly glaring resource consumption, arguing that humans require a lot of resources too.
In an on-stage interview at the India AI Impact summit, he went on the defensive after he was asked about ChatGPT’s water needs.
Altman was then asked about the electricity needed for AI. In contrast to the issue of water, he claimed it was “fair” to bring up the technology’s energy requirements, saying “We need to move toward nuclear, or wind, or solar [energy] very quickly.”
But he pointed out that comparing AI’s power needs to humans isn’t exactly apples to apples.
“It also takes a lot of energy to train a human,” he said, prompting some in the crowd to laugh. “It takes, like, 20 years of life, and all of the food you eat during that time before you get smart.”
Altman went further, noting that today’s humans wouldn’t even be here were it not for ancestors stretching back hundreds of thousands of years to when modern humans first emerged.
“Not only that, it took, like, the very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to, like, figure out science or whatever to produce you,” he added.
When comparing humans to ChatGPT’s potential, you have to take this context into account, he argued. A fairer comparison would pit the energy a human uses to answer a query against what an AI uses once it is trained. On that measure, “probably, AI has already caught up on an energy efficiency basis measured that way.”
In a June 2025 blog post, Altman claimed each ChatGPT query takes about 0.34 watt-hours of electricity, or around what an oven uses in about a second. Still, he published this figure before OpenAI released its newest GPT-5 model and its subsequent upgrades. Energy consumption can also vary with the complexity of a query: answering a question versus creating an image, for example.
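The oven comparison holds up to a quick back-of-the-envelope check. Note that the oven's power draw is an assumption here (a typical electric oven heats at roughly 1–3 kW; 1.2 kW is used for illustration):

```python
# Sanity check of the "oven for about a second" comparison.
query_wh = 0.34                  # Altman's quoted per-query figure, in watt-hours
query_joules = query_wh * 3600   # 1 Wh = 3600 J
oven_watts = 1200                # assumed oven power draw (hypothetical, ~1.2 kW)
seconds_of_oven_use = query_joules / oven_watts
print(f"{query_joules:.0f} J is about {seconds_of_oven_use:.2f} s of oven use")
# 1224 J works out to about one second at 1.2 kW, consistent with the claim
```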
Over that same period, rising electricity demand is expected to increase the water used to generate power for data centers by about 18%, to roughly 22.3 trillion liters (5.8 trillion gallons) per year. Meanwhile, the ever more complex chips data centers use will need more water during manufacturing, driving the amount they require up by about 600%, from roughly 4.1 trillion liters (1.8 trillion gallons) today to 29.3 trillion liters (7.7 trillion gallons) annually.
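The roughly 600% figure follows from the liter totals quoted above, as a quick calculation shows:

```python
# Check the chip-manufacturing water growth implied by the quoted totals.
today_tl = 4.1    # trillion liters per year today
future_tl = 29.3  # projected trillion liters per year
pct_increase = (future_tl - today_tl) / today_tl * 100
print(f"Increase: {pct_increase:.0f}%")  # about 615%, i.e. roughly 600%
```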
While OpenAI has moved away from evaporative cooling, 56% of all data centers globally still use the method in some form, according to the Xylem and Global Water Intelligence report.