Explained: Generative AI’s environmental impact
This article explains how the rapid development and deployment of powerful generative AI models come with environmental consequences, including increased electricity demand and water consumption.
The computational power required to train generative AI models, which often have billions of parameters (such as OpenAI's GPT-4), can demand a staggering amount of electricity, driving up carbon dioxide emissions and straining the electric grid.
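The link between electricity use and emissions above is, at its simplest, energy consumed multiplied by the grid's carbon intensity. The figures in the sketch below are illustrative assumptions chosen for the example, not numbers from the article.

```python
# Illustrative: CO2 from a training run = energy used x grid carbon intensity.
# Both input figures are assumed for the example, not taken from the article.

def training_emissions_tonnes(energy_mwh: float, kg_co2_per_mwh: float) -> float:
    """Tonnes of CO2 emitted for a training run on a grid with the given intensity."""
    return energy_mwh * kg_co2_per_mwh / 1000.0  # kg -> tonnes

# Example: a hypothetical 1,000 MWh training run on a grid emitting 400 kg CO2/MWh
print(training_emissions_tonnes(1_000, 400))  # 400.0 tonnes
```

The same arithmetic also shows why *where* a model is trained matters: the identical run on a low-carbon grid (say, 50 kg CO2/MWh) would emit an eighth as much.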
Beyond electricity demands, a great deal of water is needed to cool the hardware used for training, deploying, and fine-tuning generative AI models, which can strain municipal water supplies and disrupt local ecosystems. The increasing number of generative AI applications has also spurred demand for high-performance computing hardware, adding indirect environmental impacts from its manufacture and transport.
The Growth of Data Centers
The electricity demands of data centers are one major factor contributing to the environmental impacts of generative AI, since data centers are used to train and run the deep learning models behind popular tools like ChatGPT and DALL-E.
A data center is a temperature-controlled building that houses computing infrastructure, such as servers, data storage drives, and network equipment. For instance, Amazon has more than 100 data centers worldwide, each of which has about 50,000 servers that the company uses to support cloud computing services.
While data centers have been around since the 1940s (the first was built at the University of Pennsylvania in 1945 to support the first general-purpose digital computer, the ENIAC), the rise of generative AI has dramatically increased the pace of data center construction.
By 2026, the electricity consumption of data centers is expected to approach 1,050 terawatt-hours (which would rank data centers fifth on the global list of electricity consumers, between Japan and Russia).
While not all data center computation involves generative AI, the technology has been a major driver of increasing energy demands.
“The demand for new data centers cannot be met in a sustainable way. The pace at which companies are building new data centers means the bulk of the electricity to power them must come from fossil fuel-based power plants,” says Noman Bashir, lead author of the impact paper, who is a Computing and Climate Impact Fellow at the MIT Climate and Sustainability Consortium (MCSC) and a postdoc in the Computer Science and Artificial Intelligence Laboratory (CSAIL).
Increasing Energy Demand
Generative AI models also have an especially short shelf-life, driven by rising demand for new AI applications. Companies release new models every few weeks, so the energy used to train prior versions goes to waste, Bashir adds. New models often consume even more energy for training, since they usually have more parameters than their predecessors.
While electricity demands of data centers may be getting the most attention in research literature, the amount of water consumed by these facilities has environmental impacts, as well.
Chilled water cools a data center by absorbing heat from computing equipment. For each kilowatt-hour of energy a data center consumes, it needs an estimated two liters of water for cooling, says Bashir.
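The two-liters-per-kilowatt-hour estimate lends itself to a quick back-of-the-envelope calculation. The facility size below is a hypothetical example, not a figure from the article; only the 2 L/kWh ratio comes from the text.

```python
# Back-of-the-envelope cooling water estimate, using the article's
# cited ratio of roughly 2 liters of water per kWh consumed.

LITERS_PER_KWH = 2.0  # ratio cited in the article

def cooling_water_liters(energy_kwh: float) -> float:
    """Rough cooling water consumed for a given energy draw."""
    return energy_kwh * LITERS_PER_KWH

# Example: a hypothetical 10 MW facility running flat out for one day
daily_kwh = 10_000 * 24  # 10 MW = 10,000 kW, over 24 hours
print(f"{cooling_water_liters(daily_kwh):,.0f} liters/day")  # 480,000 liters/day
```

Scaled up to a campus of such facilities running year-round, this is the kind of volume that can strain a municipal water supply.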
“Just because this is called ‘cloud computing’ doesn’t mean the hardware lives in the cloud. Data centers are present in our physical world, and because of their water usage they have direct and indirect implications for biodiversity,” he says.
While all machine-learning models must be trained, one issue unique to generative AI is the rapid fluctuations in energy use that occur over different phases of the training process, Bashir explains.
Power grid operators must have a way to absorb those fluctuations to protect the grid, and they usually employ diesel-based generators for that task.
Once a generative AI model is trained, the energy demands don’t disappear.
Each time a model is used, perhaps by an individual asking ChatGPT to summarize an email, the computing hardware that performs those operations consumes energy. Researchers have estimated that a ChatGPT query consumes about five times more electricity than a simple web search.
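The five-to-one ratio above can be turned into a rough aggregate comparison. The per-search energy figure and the query volume below are assumptions made for illustration; only the 5x multiplier comes from the article.

```python
# Rough aggregate electricity comparison: chatbot queries vs. web searches.
# WH_PER_SEARCH and the query volume are assumed for illustration;
# the 5x multiplier is the ratio cited in the article.

WH_PER_SEARCH = 0.3      # assumed ballpark: watt-hours per conventional web search
CHATGPT_MULTIPLIER = 5   # article: a ChatGPT query uses ~5x a web search

def daily_kwh(queries_per_day: int, wh_per_query: float) -> float:
    """Total daily electricity in kWh for a given query volume."""
    return queries_per_day * wh_per_query / 1000.0  # Wh -> kWh

queries = 10_000_000  # hypothetical daily query volume
print(daily_kwh(queries, WH_PER_SEARCH))                       # as web searches
print(daily_kwh(queries, WH_PER_SEARCH * CHATGPT_MULTIPLIER))  # as chatbot queries
```

Whatever the exact per-query figure, the multiplier means the gap between the two totals grows linearly with usage, which is why inference energy is a concern at scale.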
There are also environmental implications of obtaining the raw materials used to fabricate GPUs, which can involve dirty mining procedures and the use of toxic chemicals for processing.
The industry is on an unsustainable path, but there are ways to encourage responsible development of generative AI that supports environmental objectives, Bashir says.