OpenAI CEO Sam Altman on Friday called concerns about data center water usage “bogus” and defended the resource demands of artificial intelligence, comparing the energy used by AI systems to that of humans.
Speaking to The Indian Express on the sidelines of the India AI Impact Summit, Altman was asked to address common criticisms of AI, such as energy and water consumption.
The CEO responded that claims circulating online that ChatGPT uses a gallon of water per query are “completely false and completely insane” and have “nothing to do with reality.”
Data centers have traditionally relied on large amounts of water to cool electrical components and prevent them from overheating. While data center cooling technology promises to reduce consumption, and some new data centers don’t rely on water at all, many operational facilities still use water.
Despite improvements in efficiency, a report released last month by water technology company Xylem and Global Water Intelligence predicts that rising computing demands will more than triple the amount of water withdrawn for cooling over the next 25 years, putting pressure on water systems.
Altman did note, however, that energy consumption remains a concern. "Not per query, but overall, because there's so much AI being used in the world, and we need to move to nuclear power, wind power, and solar power very quickly."
Microsoft founder Bill Gates has previously suggested that the efficiency of the human brain shows AI can become more energy efficient over time. Asked about that comment, Altman pushed back.
“One thing that’s always unfair about this comparison is that people argue about how much energy it takes to train an AI model. But it also takes a lot of energy to train a human.”
“It takes about 20 years of your life to become smarter, and you have to eat everything you’ve ever eaten,” he added.
"A fair comparison with a human would be: if you ask ChatGPT a question, how much energy does it take for the model to be trained to answer that question? On an energy efficiency basis, measured that way, AI is probably already catching up."
The process Altman is referring to is known as inference: using an AI model that has already been trained to generate new outputs. Inference typically consumes much less power than training.
Altman’s comments, particularly the comparison between AI and humans, sparked debate online amid growing concerns about AI’s ability to replace human jobs.
Sridhar Vembu, co-founder and chief scientist at Indian software company Zoho, who attended the summit, criticized the equivalence between humans and AI. "I don't want to see a world where we equate some piece of technology with a human being," the billionaire said in a post on X.
The debate comes as governments and companies pour billions of dollars into new data centers to support the computing needs of AI systems.
According to a May report from the International Monetary Fund, global data center power consumption in 2023, shortly after the launch of OpenAI's groundbreaking ChatGPT, had already reached levels comparable to those of Germany and France.
In response, some governments are working to speed up approval processes for bringing new, cheaper energy online, with some environmentalists warning such moves could clash with global net-zero goals.
Some communities in countries such as the United States are also delaying development projects over concerns that they will strain the power grid and raise overall electricity costs.
Last week, the city council in San Marcos, Texas, rejected a proposed $1.5 billion data center project after months of public opposition.
Amid this backlash, many technology leaders, including OpenAI’s Altman, say data centers need to get more energy from a variety of sources, including renewable and nuclear energy.
