
Artificial Intelligence

This guide discusses generative AI tools available for research, with tips for writing prompts and citing AI-generated content in your work.

Power Consumption and Emissions by the Numbers

The academic literature currently contains few details regarding the precise power consumption of large AI models. However, most researchers agree that AI services carry a significant environmental cost.

Model Training 

  • Training a typical large AI model emits the equivalent of over 626,000 pounds of carbon dioxide, roughly five times the lifetime emissions of an average American gas-powered car (1). 
  • Training GPT-3 is estimated to have required the energy an average American household consumes in 120 years (2). 

Queries: ChatGPT-4 vs. Google 

  • The energy consumption of a query to a Large Language Model (LLM) such as ChatGPT-4 varies with the number of tokens (units of data) the query requires. A single AI query response (or inference) is generally considered to consume significantly more energy than a standard Google search. 
  • An average Google text query requires 0.0003 kWh (2, 3). 
  • The estimated average ChatGPT-4 query requires 0.001–0.01 kWh (4), roughly 3 to 33 times the energy of a Google query. 
  • According to the Allen Institute for Artificial Intelligence, one query to the most popular chatbots consumes the same energy needed to power a light bulb for 20 minutes, which by their estimate is more than ten times that of a Google search (6). 
  • It is estimated that the inferences ChatGPT-4 currently produces in a single day consume enough energy to power 52 average American homes for a year (5). 
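The ratios above can be checked with simple arithmetic. A minimal sketch using the per-query figures cited in this section (all values are estimates from the cited sources, not measurements):

```python
# Energy-per-query figures cited above, in kWh (estimates).
GOOGLE_QUERY_KWH = 0.0003            # average Google text query
CHATGPT_QUERY_KWH = (0.001, 0.01)    # estimated range for a ChatGPT-4 query

low_ratio = CHATGPT_QUERY_KWH[0] / GOOGLE_QUERY_KWH
high_ratio = CHATGPT_QUERY_KWH[1] / GOOGLE_QUERY_KWH
print(f"A ChatGPT-4 query uses roughly {low_ratio:.0f}x to {high_ratio:.0f}x "
      f"the energy of a Google query.")
```

This reproduces the "3 to 33 times" range stated above.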

Image creation 

The power required to create an AI image varies. Simpler AI image models are more energy-efficient than more powerful counterparts such as Stable Diffusion XL. 

  • According to an October 2024 joint study (not yet peer-reviewed as of this writing) with Carnegie Mellon University, an AI image created with a powerful model requires energy equivalent to a full smartphone charge (7). 
  • By comparison, over 6,000 AI text query responses can be generated with the power required to produce one AI image from a powerful model. 

Footprint Reduction Strategies

Some strategies for reducing the carbon footprint of generative AI include:

  • Utilize energy-efficient ("green") algorithms to reduce computational overhead (15). 
  • Optimize data centers by using more efficient hardware and limiting power distribution so that hardware runs cooler (12, 16). 
  • Increase investment in renewable energy to power data centers (17). 
  • Apply AI to find ways to improve data center efficiency (18). 
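One small illustration of the "green algorithms" idea: identical queries need not be recomputed. A minimal sketch using memoization (the `answer` function here is a stand-in placeholder, not a real model API):

```python
from functools import lru_cache

@lru_cache(maxsize=1024)
def answer(query: str) -> str:
    # Stand-in for an expensive model inference. Caching identical
    # queries avoids paying the compute (and energy) cost twice.
    return query.upper()  # placeholder "inference"

answer("what is a token?")       # computed once
answer("what is a token?")       # served from cache, no recomputation
print(answer.cache_info().hits)  # number of cache hits
```

Caching is only one of many such techniques; pruning, quantization, and early stopping similarly trade a small amount of engineering effort for lower computational overhead.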

Environmental Impact of AI Development

Manufacturing 

  • AI chips and GPUs require minerals such as cobalt, nickel, and lithium, which must be mined. Deforestation and water pollution are often consequences of these mining efforts (8). Some countries do not regulate mining labor and produce “conflict minerals”, the extraction of which involves human exploitation (9). 
  • The demand for faster chips results in accelerated hardware turnover. Materials that are not recycled end up in landfills (9).  

LLM Training Phase 

  • The duration required for training an LLM is dependent on its size, that is, the number of parameters to be trained; the efficiency of the algorithms; and the number of training passes required to optimize (tune) the model (11).   
  • Training and tuning a model can take weeks or months. 
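A back-of-the-envelope way to relate training duration to energy use, assuming loudly hypothetical numbers (the GPU count, power draw, duration, and overhead factor below are illustrative assumptions, not figures from the cited sources):

```python
# Hypothetical training run: every number below is an illustrative assumption.
num_gpus = 1000        # accelerators running in parallel
gpu_power_kw = 0.4     # average draw per GPU in kilowatts (~400 W)
training_days = 30     # wall-clock training time
pue = 1.5              # power usage effectiveness: data center cooling/overhead

hours = training_days * 24
energy_kwh = num_gpus * gpu_power_kw * hours * pue
print(f"Estimated training energy: {energy_kwh:,.0f} kWh")
# 1000 * 0.4 kW * 720 h * 1.5 = 432,000 kWh under these assumptions
```

The multiplicative structure shows why longer training runs and larger clusters drive energy costs up so quickly, and why data center efficiency (the PUE factor) matters.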

Data Centers 

  • Data centers provide the physical infrastructure for LLMs. Given the immense data processing involved in training an LLM, and the increasing demand for inferences by consumers, large amounts of water are required to cool the hardware systems in data centers (12, 13). 
  • From 2018 to 2024, investment in, and consumer demand for, AI-assisted services roughly doubled the energy consumption of data centers to about 4.59% of all energy used in the United States. Carbon emissions tripled over the same period, such that in 2023–24 data center emissions were comparable to the combined emissions of domestic commercial airlines (14). 
  • Many U.S. data centers are built in regions dependent on fossil fuels (14). 
