“It Takes a Lot of Energy to Train a Human Too”: Altman’s Controversial Take on AI Power Use – And the Indian Billionaire Who Just Shut It Down

The recent exchange between OpenAI CEO Sam Altman and Zoho founder Sridhar Vembu has ignited fresh debate on the environmental toll of artificial intelligence, particularly its surging electricity demands. During a candid session at Express Adda in New Delhi on February 20, 2026, Altman defended AI’s power consumption by drawing an unconventional parallel to human development.

Altman argued that critiques often unfairly spotlight the massive upfront energy required to train large AI models while ignoring the “training” costs for humans. He noted that it takes roughly 20 years of life—encompassing all the food, care, and resources consumed—to develop a capable human mind ready to tackle complex tasks. Extending this, he referenced humanity’s collective evolutionary journey across billions of lives to accumulate knowledge, from evading dangers to advancing science.

His key point: the more balanced metric is energy per task once training is complete. For a single query to ChatGPT (or similar models), the inference cost is minimal compared to a human performing the same intellectual effort. Altman dismissed exaggerated claims, such as one linked to Bill Gates suggesting a ChatGPT query equals 1.5 iPhone battery charges, calling it far off base. He referenced his own earlier disclosure that an average query uses about 0.34 watt-hours—comparable to a high-efficiency bulb running for a couple of minutes.
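
Altman’s figure is straightforward to sanity-check with back-of-the-envelope arithmetic. The short Python sketch below assumes a 10 W draw for a “high-efficiency bulb” (a typical LED rating, an assumption rather than a number Altman cited):

    # Back-of-the-envelope check of the 0.34 Wh-per-query figure.
    # Assumption: a "high-efficiency bulb" draws about 10 W (typical LED).

    query_energy_wh = 0.34   # Altman's stated average energy per query, in watt-hours
    bulb_power_w = 10        # assumed LED bulb power draw, in watts

    # Energy (Wh) = power (W) x time (h), so time = energy / power
    runtime_minutes = (query_energy_wh / bulb_power_w) * 60

    print(f"A {bulb_power_w} W bulb runs about {runtime_minutes:.1f} minutes on {query_energy_wh} Wh")
    # -> roughly 2 minutes, consistent with the "couple of minutes" comparison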

Altman acknowledged the broader validity of concerns over AI’s aggregate energy footprint as adoption explodes. He stressed the urgent need to scale clean sources like nuclear, wind, and solar to meet rising demand without curbing innovation.

This perspective met direct resistance from Indian tech leader Sridhar Vembu. The Zoho founder rejected any equivalence between machines and humans, saying he envisions a future in which technology quietly supports people from the background rather than overshadowing human life or being compared to it. Vembu emphasized that technologists’ responsibility lies in fostering innovation that enhances society without letting it dominate, or equating tools with people.

Online reactions have been polarized. Some view Altman’s analogy as a pragmatic reframing of intelligence-building costs—biological or artificial—while others criticize it as reductive, stripping human growth of its emotional, social, and experiential depth.

The discussion unfolds amid alarming projections for AI’s energy appetite. The International Energy Agency estimates global data center electricity use could approach or exceed 1,000 terawatt-hours by 2026 in aggressive scenarios—roughly matching Japan’s entire annual consumption—up sharply from around 460 TWh in recent years. In the U.S., AI-related computing is on track to outstrip the combined power needs of energy-intensive industries like aluminum, steel, cement, and chemicals by decade’s end.

Altman, who has also backed the nuclear-focused startup Oklo, consistently advocates accelerating clean energy infrastructure to fuel AI’s growth rather than restricting it. Vembu’s pushback highlights a counterview: prioritizing human-centric design in which technology remains subordinate.

This clash underscores a deeper tension in the AI era—balancing transformative potential against sustainability and societal values—as the world grapples with how to power intelligence without compromising what makes us human.
