OpenAI CEO Advocates Swift Shift to Nuclear, Wind, or Solar Power for AI's Energy Needs

Deep News | 02-24 15:50

OpenAI CEO Sam Altman recently stated that he is not concerned about the increasingly evident resource consumption of artificial intelligence, pointing out that humanity itself also consumes vast amounts of resources.

During a live interview at the AI Impact Summit in India, when questioned about the water requirements for ChatGPT, Altman adopted a defensive stance. He dismissed claims that the chatbot consumes significant amounts of water per query as "completely untrue and utterly absurd." He explained that the data centers powering ChatGPT have largely moved away from water-intensive "evaporative cooling" methods, adopting alternative techniques to prevent overheating.

Altman was then asked about the electricity demands of AI. Unlike the water issue, he acknowledged that discussing the technology's energy needs was "reasonable" and stated, "We must transition to nuclear, wind, or solar energy as quickly as possible."

However, he argued that directly comparing AI's energy consumption to that of humans is not entirely appropriate. "Training a human also consumes a lot of energy," he remarked, eliciting laughter from the audience. "It takes about 20 years, and all the food you eat during that time, to make you intelligent."

Altman further noted that without the ancestors of modern humans who first appeared hundreds of thousands of years ago, humanity would not exist today. "Furthermore, it required extensive evolution from the 100 billion people who have lived throughout history, learning to evade predators, understand science, and more, to ultimately produce you," he added.

He emphasized that this context must be considered when comparing human potential to that of ChatGPT. A fair comparison, he suggested, would be the energy a human expends to answer a question versus the energy a trained AI uses for the same task. From this perspective, "AI may have already caught up to humans in terms of energy efficiency."

In a blog post last June, Altman claimed that each ChatGPT query consumes approximately 0.34 watt-hours of electricity, equivalent to the energy an oven uses in one second. However, he shared this figure before OpenAI released its latest GPT-5 model and subsequent upgrades. Energy consumption can also vary based on query complexity, such as the difference between answering a question and generating an image.
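The oven comparison can be sanity-checked with simple arithmetic. A minimal sketch, assuming a typical electric oven drawing roughly 1,200 watts (the wattage is an assumption, not from the article):

```python
# Check whether 0.34 Wh per query matches "the energy an oven uses in one second".
OVEN_WATTS = 1_224            # assumed oven power draw in watts (hypothetical figure)
QUERY_WH = 0.34               # Altman's stated per-query consumption, watt-hours

query_joules = QUERY_WH * 3600            # 1 watt-hour = 3,600 joules
oven_seconds = query_joules / OVEN_WATTS  # seconds the oven takes to use that energy

print(f"{query_joules:.0f} J, about {oven_seconds:.1f} s of oven time")
```

At that assumed wattage, 0.34 Wh works out to roughly 1,224 joules, or about one second of oven operation, so the comparison is plausible for ovens in the 1 to 1.5 kW range.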

According to reports from Xylem and Global Water Intelligence, although OpenAI no longer uses evaporative cooling, 56% of data centers worldwide still employ some form of this cooling method.

Local media reported that OpenAI's 800-acre data center campus in Abilene, Texas, will use a more efficient closed-loop system that continuously recirculates its cooling water. To fill that system initially, the data center will draw 8 million gallons of water from the city of Abilene.

