As OpenAI’s GPT-5 takes the world by storm with its new, advanced capabilities, a pressing question is emerging from the research community: at what environmental cost? With the company providing no official data, independent experts are stepping in to measure the model’s energy footprint. Their findings reveal a dramatic increase in power consumption, suggesting that the quest for smarter AI is creating a hidden and potentially unsustainable environmental burden.
The numbers are stark. According to a team at the University of Rhode Island’s AI lab, a medium-length response from GPT-5 consumes an average of 18 watt-hours, a significant jump from previous models and “significantly more energy than GPT-4o.” To put this into context, 18 watt-hours is enough to run a 60-watt incandescent light bulb for 18 minutes. And because ChatGPT handles billions of requests a day, the total daily energy consumption could match that of 1.5 million US homes, a staggering figure that brings the environmental impact of AI into sharp focus.
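The figures above can be sanity-checked with a quick back-of-envelope calculation. Note that the bulb wattage, the daily request volume, and the per-household figure below are illustrative assumptions, not numbers published by OpenAI or the Rhode Island team:

```python
# Back-of-envelope check of the article's figures. All values marked
# "assumed" are illustrative, not official data.

WH_PER_RESPONSE = 18        # University of Rhode Island estimate
BULB_WATTS = 60             # assumed incandescent bulb rating
REQUESTS_PER_DAY = 2.5e9    # assumed volume ("billions of requests a day")
HOME_KWH_PER_DAY = 30       # assumed average US household daily usage

# How long one response's energy would run the bulb, in minutes.
bulb_minutes = WH_PER_RESPONSE / BULB_WATTS * 60

# Total daily energy across all requests, in kWh, and the number of
# average US homes that much electricity would supply for a day.
daily_kwh = WH_PER_RESPONSE * REQUESTS_PER_DAY / 1000
homes = daily_kwh / HOME_KWH_PER_DAY

print(f"One response: bulb for {bulb_minutes:.0f} minutes")
print(f"Daily total: {daily_kwh / 1e6:.0f} GWh, about {homes / 1e6:.1f} million homes")
```

Under these assumptions the totals land at roughly 45 GWh per day, which is where the 1.5-million-homes comparison comes from.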
This surge in power consumption is directly tied to the model’s size. Although OpenAI has not released a parameter count for GPT-5, experts believe it is “several times larger than GPT-4.” This is consistent with a study from the French AI company Mistral, which found a “strong correlation” between a model’s size and its resource consumption: a model ten times bigger would have an impact one order of magnitude larger, implying that consumption scales roughly in proportion to size. This suggests that the trend of building ever-larger AI models, championed by many in the industry, will continue to drive up resource usage at an alarming rate.
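Mistral’s reported correlation can be read as roughly linear scaling. A minimal sketch under that assumption (the baseline per-response figure here is illustrative, not a published number):

```python
# Linear size-to-impact scaling, as implied by Mistral's reported
# "strong correlation". Baseline impact is an assumed placeholder.

def scaled_impact(base_impact_wh: float, size_multiplier: float) -> float:
    """Estimated per-response impact for a model `size_multiplier`
    times larger, assuming impact scales in proportion to size."""
    return base_impact_wh * size_multiplier

# A model 10x larger: one order of magnitude more impact per response.
print(scaled_impact(2.0, 10))  # 20.0
```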
The new capabilities of GPT-5 also play a significant role in its high energy demands. Its advanced “reasoning mode” and its ability to process video and images require far more intensive computation than simple text generation. A professor studying the resource footprint of AI models noted that using the reasoning mode could increase resource usage by a factor of “five to 10.” So even though a “mixture-of-experts” architecture offers some efficiency gains, these new, more complex tasks are driving the overall energy footprint to new heights, raising serious questions about the long-term sustainability of the AI industry and the need for greater transparency.