A growing concern among researchers and experts revolves around the energy consumption of artificial intelligence (AI), particularly proprietary models like ChatGPT. Sam Altman, CEO of OpenAI, previously mentioned that an average ChatGPT query consumes 0.34 watt-hours of energy, comparable to the energy used by an oven in just over one second. With OpenAI’s significant user base of 800 million weekly active users, the total energy expenditure is rapidly becoming a pressing issue.
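Altman's oven comparison and the scale of the user base can be sanity-checked with simple arithmetic. The sketch below assumes a typical oven draws about 1 kW and, purely for illustration, that each weekly user issues one query per day; neither assumption comes from OpenAI.

```python
# Back-of-envelope check of the figures quoted above.
# Assumptions (not from OpenAI): a typical oven draws ~1 kW, and each
# weekly user issues one query per day -- both purely illustrative.
QUERY_WH = 0.34          # Altman's per-query figure, in watt-hours
OVEN_WATTS = 1000        # assumed oven power draw

# How long the oven takes to use the same energy as one query:
seconds = QUERY_WH * 3600 / OVEN_WATTS
print(f"Oven equivalent: {seconds:.2f} s")  # ~1.2 s, i.e. "just over one second"

# Scale to 800 million weekly users at one query per user per day:
USERS = 800_000_000
daily_kwh = USERS * QUERY_WH / 1000
print(f"Daily energy at one query/user: {daily_kwh:,.0f} kWh")
```

At roughly 1.2 seconds per query, the oven comparison checks out; even the deliberately conservative one-query-per-day assumption yields hundreds of megawatt-hours per day.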
However, without additional context about how OpenAI arrived at that figure, its reliability is questionable. Experts, including Sasha Luccioni of Hugging Face, are skeptical of Altman's estimate, noting that it leaves key terms undefined: what counts as an "average" query, whether it includes image generation, and whether it accounts for the energy consumed by model training or server cooling.
Research is underway to quantify AI’s carbon emissions accurately. Luccioni’s recent analysis points out that a staggering 84% of large language model (LLM) usage comes from models with no disclosed environmental impacts. This lack of transparency means consumers remain unaware of the actual efficiency metrics and emissions factors related to the AI tools they use daily.
The disparity is stark compared with other regulated industries. Car buyers, for instance, can consult a vehicle's miles-per-gallon rating, while AI users lack even basic energy-consumption information. The implications of this secrecy are profound: unverified claims circulate freely, such as the assertion that a ChatGPT query requires ten times the energy of a Google search, a figure traced to a casual remark by Google's chairman that entered public discourse without substantiation.
To shed light on energy consumption, researchers are turning to open-source models that allow for independent verification. A recent study evaluated 14 open-source LLMs and found substantial energy discrepancies among them. Some models consumed up to 50% more energy, particularly those employing complex reasoning processes, which inherently use more energy due to the intensive internal computations they perform.
Maximilian Dauner from the Munich University of Applied Sciences highlighted the potential for AI systems to evolve toward energy efficiency by directing simpler queries to low-energy models and reserving complex challenges for more intensive ones, delivering effective responses while minimizing carbon footprints.
Some companies, including Google and Microsoft, already apply this strategy, using smaller models where feasible to improve speed and reduce energy usage. The broader industry, however, rarely tells users how various factors influence the energy consumed when generating AI outputs.
Calculating the true cost of AI energy usage involves many variables, including physical infrastructure like data centers, which require significant energy for operations and cooling, often depending on the local energy grid. The computing hardware also matters significantly; newer GPUs designed for AI can be more energy-intensive.
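The variables above combine multiplicatively: the energy the chips draw, an overhead factor for cooling and infrastructure (commonly expressed as PUE, power usage effectiveness), and the carbon intensity of the local grid. The numbers in this sketch are assumptions chosen to show the mechanics, not measurements.

```python
# Illustrative sketch of how the cost variables combine. All numbers are
# assumptions: PUE (power usage effectiveness) captures cooling and
# infrastructure overhead, and grid carbon intensity varies by region.
def query_emissions_g(query_wh: float, pue: float, grid_g_per_kwh: float) -> float:
    """Grams of CO2 per query: IT energy x facility overhead x grid factor."""
    facility_wh = query_wh * pue              # total energy, incl. cooling
    return facility_wh / 1000 * grid_g_per_kwh

# The same hypothetical 0.34 Wh query on two hypothetical grids:
low_carbon = query_emissions_g(0.34, pue=1.2, grid_g_per_kwh=50)
coal_heavy = query_emissions_g(0.34, pue=1.2, grid_g_per_kwh=700)
print(f"Low-carbon grid: {low_carbon:.4f} g CO2")
print(f"Coal-heavy grid: {coal_heavy:.4f} g CO2")
```

The roughly fourteen-fold spread between the two grids illustrates why the local energy mix, which companies rarely disclose, matters as much as the model itself.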
Understanding the full scope of AI’s environmental impact is hindered by a lack of clarity on energy consumption during model training and updates — details many companies keep confidential. Luccioni argues for mandatory disclosure of carbon emissions for any AI system in production, emphasizing the pressing need for accountability as AI technologies continue to proliferate.