The academic literature currently contains few details regarding the precise power consumption of large AI models. However, most researchers agree that AI services come at a high environmental cost.
Model Training
- Training a common large AI model can emit the equivalent of over 626,000 pounds of carbon dioxide, roughly the lifetime emissions of five gas-powered American cars (1).
- Training ChatGPT-3 is estimated to have required as much energy as an average American household consumes in 120 years (2); a rough back-of-envelope check of this figure follows.
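As a sanity check on the 120-year figure, the sketch below converts household-years into kilowatt-hours. The household consumption value (about 10,500 kWh per year, an approximate EIA figure) and the comparison estimate of roughly 1,300 MWh for GPT-3's training run are assumptions used for illustration, not numbers taken from the sources cited above.

```python
# Back-of-envelope check of the "120 household-years" training figure.
# ASSUMPTION: an average U.S. household uses roughly 10,500 kWh of electricity
# per year (approximate EIA figure); this value is not from the cited sources.

household_kwh_per_year = 10_500   # assumed average U.S. household consumption
household_years = 120             # figure cited in the text (2)

implied_training_kwh = household_kwh_per_year * household_years
print(f"Implied training energy: {implied_training_kwh:,} kWh "
      f"(~{implied_training_kwh / 1_000:,.0f} MWh)")
# -> 1,260,000 kWh (~1,260 MWh), in the same ballpark as widely cited estimates
#    of roughly 1,300 MWh for GPT-3's training run.
```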
Queries: ChatGPT-4 vs. Google
- The energy consumption of a Large Language Model (LLM) like ChatGPT-4 varies with the number of tokens (units of data processed) a query requires. An AI query response (or inference) is generally considered to require significantly more energy than a standard Google search.
- An average Google text query requires 0.0003 kWh (2, 3).
- An average ChatGPT-4 query is estimated to require 0.001 to 0.01 kWh (4), roughly 3 to 33 times more than a Google query (see the arithmetic sketch after this list).
- According to the Allen Institute for Artificial Intelligence, the energy consumed by one query to the most popular chatbots is about the same as running a light bulb for 20 minutes, which by their estimation is more than ten times that of a Google search (6).
- It is estimated that the inferences ChatGPT-4 currently produces in a single day consume enough energy to power 52 average American homes for a year (5).
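The ratio quoted above can be reproduced directly from the per-query figures cited in this list; the sketch below simply divides the estimated ChatGPT-4 range by the Google figure. All of the underlying values are themselves rough estimates.

```python
# Ratio of per-query energy: ChatGPT-4 vs. a Google text search, using only
# the estimates cited above.

google_kwh = 0.0003                                # average Google text query (2, 3)
chatgpt_kwh_low, chatgpt_kwh_high = 0.001, 0.01    # estimated ChatGPT-4 range (4)

low_ratio = chatgpt_kwh_low / google_kwh
high_ratio = chatgpt_kwh_high / google_kwh
print(f"A ChatGPT-4 query uses roughly {low_ratio:.0f}x to {high_ratio:.0f}x "
      "the energy of a Google query")
# -> roughly 3x to 33x
```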
Image Creation
The energy required to create an AI image varies by model. Simpler image models are more energy efficient than more powerful counterparts such as Stable Diffusion XL.
- According to an October 2024 joint study (not yet peer-reviewed as of this writing) with Carnegie Mellon University, generating an image with a powerful AI model requires roughly as much energy as fully charging a smartphone (7).
- By comparison, more than 6,000 AI text responses can be generated with the energy required to produce a single image from a powerful model.
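To put the image-versus-text comparison in per-response terms, the sketch below divides an assumed full smartphone charge (about 0.012 kWh, an illustrative value not taken from the study) across 6,000 text responses. The resulting per-response figure is far below the ChatGPT-4 per-query estimates in the previous section, likely because the study measured smaller models, so the two sets of numbers are not directly comparable.

```python
# Illustration only: implied energy per text response from the figures above.
# ASSUMPTION: a full smartphone charge is roughly 0.012 kWh (~12 Wh battery);
# this value is illustrative and not taken from the cited study.

image_kwh = 0.012                   # one powerful-model image ~= one smartphone charge (7)
text_responses_per_image = 6_000    # "over 6,000" text responses per image (see above)

per_text_wh = image_kwh / text_responses_per_image * 1_000
print(f"Implied energy per text response: ~{per_text_wh:.3f} Wh")
# -> about 0.002 Wh per response under these assumptions.
```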