OpenAI boss says AI is energy efficient because humans take '20 years' to get smart

OpenAI CEO Sam Altman compares the energy it takes to train and run AI models to the idea that it takes a human 20 years, and a lot of energy, to get smart.

TL;DR: OpenAI CEO Sam Altman defended AI's high energy use by comparing it to the extensive energy humans consume over a lifetime to learn and perform tasks. He argued that once trained, AI models like ChatGPT are more energy-efficient per query than humans, highlighting AI's long-term efficiency despite initial training costs.

Recently, OpenAI CEO Sam Altman sat down for a lengthy interview with The Indian Express, where he gave a rather strange and cold response to a question about the energy required to train complex AI models. This has become a significant concern in many markets, as AI energy consumption is now dwarfing that of most, if not all, other industries.

The question underlying Sam Altman's bizarre response is, more or less, this: if a human can complete a task in a few seconds or minutes, while an AI model takes vast amounts of energy to train, more than, say, a small city consumes, what's the point or benefit?

"One of the things that is always unfair in this comparison, where people talk about how much energy it takes to train an AI model relative to how much it costs for one human to do an inference query," Sam Altman says. "It also takes a lot of energy to train a human; it takes 20 years of life, and all of the food that you eat during that time, before you get smart."

He then goes on to suggest that once you add in the entirety of human evolution, covering everything from being chased by predators in the prehistoric era to the development of science and an understanding of the universe, AI is actually a lot more efficient at what it does than humanity as a whole. Or something along those lines.

It's an impressive demonstration of verbal gymnastics to go from comparing the energy consumption of modern AI data centers, which draw more power than the cities they're situated in, to all of the energy it took humanity to get to this point. From there, he implies that a single "inference," an AI-like task completed by a human, requires decades of energy to produce a result. Aside from making it sound like people only exist to complete tasks as part of a white-collar job, it is quite the leap.

"The fair comparison," Sam Altman adds, is "if you ask ChatGPT a question, how much energy does it take once a model is trained to answer that question, versus a human. Probably, AI has already caught up on energy efficiency when measured that way." "That way," in this case, seems, hypocritically, to account for the person's entire life of energy consumption and "training," but not the AI model's.
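For context, the per-query comparison Altman describes can be sketched as a back-of-envelope calculation. The figures below are assumptions for illustration only, not numbers from the interview: roughly 0.34 Wh per ChatGPT query (an estimate Altman himself has floated elsewhere), a human metabolic power draw of about 100 W, and 10 seconds for a human to answer a comparable question.

```python
# Back-of-envelope sketch of the "per-query" energy comparison.
# All three figures are assumptions, not data from the interview.

CHATGPT_WH_PER_QUERY = 0.34        # assumed average energy per ChatGPT query, in Wh
HUMAN_METABOLIC_W = 100.0          # assumed average human metabolic power, in watts
HUMAN_SECONDS_PER_ANSWER = 10.0    # assumed time for a human to answer

# Energy a human "spends" answering: power (W) * time (s) -> Wh
human_wh = HUMAN_METABOLIC_W * HUMAN_SECONDS_PER_ANSWER / 3600.0

print(f"ChatGPT: ~{CHATGPT_WH_PER_QUERY:.2f} Wh per query")
print(f"Human:   ~{human_wh:.2f} Wh per answer")
```

Under these assumed numbers, the two come out in the same ballpark (a few tenths of a watt-hour each), which is roughly the shape of the claim Altman is making, though, as noted above, it conveniently excludes the model's training energy while counting the human's.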

News Source: x.com


Kosta is a veteran gaming journalist who cut his teeth on well-respected Aussie publications like PC PowerPlay and HYPER back when articles were printed on paper. A lifelong gamer since the 8-bit Nintendo era, it was the CD-ROM-powered 90s that cemented his love for all things games and technology, from point-and-click adventure games to RTS games with full-motion video cut-scenes and FPS titles referred to as Doom clones, genres he still loves to this day. Kosta is also a musician, releasing dreamy electronic jams under the name Kbit.
