A request made through an AI-based virtual assistant consumes roughly 10 times the electricity of a regular search engine query. This is because Generative AI (GenAI) platforms rely on data centers, both to train their models and to generate responses, and that computational power demands staggering amounts of electricity. Recent studies highlight this considerable energy consumption, as well as the carbon emissions and electronic waste associated with GenAI, raising questions about the technology's sustainability.
How Is GenAI Different from Traditional AI?
Before diving into the sustainability challenges of AI, it is helpful to understand what the technology entails. Traditional AI systems (e.g., recommendation engines, spam filters, virtual assistants, and fraud detection systems) are designed to excel within a defined scope, automating tasks and making decisions based on structured data and pre-set parameters.
Whereas traditional AI relies on pre-programmed rules or algorithms for analyzing and classifying existing data, GenAI learns from patterns in vast datasets to generate novel outputs—creating new, original content like text, audio, images, or code.
How Are the Sustainability Implications of GenAI Different from Traditional AI?
What differentiates GenAI from traditional AI is the power density it requires. Fundamentally, it is just computing, but a GenAI training cluster might consume seven or eight times more energy than a typical computing workload. Thus, underlying the excitement surrounding GenAI's transformative potential are increasingly large and energy-intensive systems that raise important concerns about the sustainability of large AI models. These concerns encompass energy consumption, resource depletion, and electronic waste generation.
What Are the Sustainability Challenges GenAI Poses?
Training and operating AI models, especially large language models (LLMs), require substantial energy, contributing to greenhouse gas (GHG) emissions and water usage. Furthermore, the manufacturing and disposal of AI hardware like GPUs and data centers introduce resource extraction challenges and electronic waste concerns.
Energy Consumption from Data Centers
Most large-scale AI deployments are housed in data centers that use massive amounts of electricity, spurring the emission of planet-warming GHGs. A data center is a temperature-controlled building that houses computing infrastructure, such as servers, data storage drives, and network equipment. While data centers have been around since the 1940s (the first was built to support the first general-purpose digital computer, the ENIAC), the rise of GenAI has dramatically increased the pace of data center construction.
In 2023, data centers consumed 4% of US electricity (a share that could triple by 2028). Worldwide, data centers account for 2.5 to 3.7% of global GHG emissions, exceeding even those of the aviation industry.
Energy Consumption from LLMs
Large language models (LLMs) are AI models trained on vast amounts of text data, which enables them to understand, generate, and manipulate human language. They are typically developed and deployed within data centers, which provide the necessary infrastructure, including powerful computing resources and high-speed networking. Training can run continuously for months, consuming large amounts of electricity and requiring enormous computational resources: central processing units (CPUs), graphics processing units (GPUs), tensor processing units (TPUs), video random access memory (VRAM), and so on.
Training a large AI model can consume as much electricity as hundreds of homes use over several months. In some cases, the energy consumed to train a single model equates to the yearly power usage of 5,000 US homes.
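As a rough sanity check on that comparison, the implied training energy can be estimated with simple arithmetic. The 5,000-home equivalence is from the text above; the average annual US household consumption used below (about 10,500 kWh) is an assumed, EIA-style figure, not a number from this article:

```python
# Rough magnitude of the "5,000 US homes" comparison.
homes = 5_000
kwh_per_home_per_year = 10_500   # assumed average annual US household usage

# Total implied training energy, converted from kWh to GWh.
training_gwh = homes * kwh_per_home_per_year / 1_000_000
print(f"~{training_gwh:.1f} GWh to train one large model")  # ~52.5 GWh
```

A figure in the tens of gigawatt-hours is consistent with published estimates for frontier-scale model training runs, which is why a single training job can register on a regional power grid.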
GHG Emissions from GenAI
This energy demand for training LLMs is contributing to increased GHG emissions and puts a strain on power grids. One report found that training certain AI models can produce about 626,000 pounds of CO₂, equivalent to approximately 300 round-trip flights between New York and San Francisco and nearly five times the lifetime emissions of an average car. AI's energy consumption is thus a growing concern because its reliance on electricity, especially electricity generated from fossil fuels, leads to significant CO₂ output.
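The equivalence figures above can be checked with quick arithmetic. Both inputs (626,000 lb of CO₂ and roughly 300 round-trip flights) come from the report cited in the text; the only addition here is the standard pound-to-kilogram conversion:

```python
# Back-of-envelope check of the CO2 equivalence figures.
LB_TO_KG = 0.45359237

training_co2_lb = 626_000   # CO2 from training one large model (per the report)
round_trips = 300           # equivalent NY-SF round-trip flights cited

training_co2_t = training_co2_lb * LB_TO_KG / 1000   # metric tons of CO2
per_round_trip_t = training_co2_t / round_trips      # implied tons per round trip

print(f"Training emissions: {training_co2_t:.0f} t CO2")        # ~284 t
print(f"Implied per round trip: {per_round_trip_t:.2f} t CO2")  # ~0.95 t
```

The implied figure of just under one metric ton of CO₂ per passenger round trip is in line with common airline emissions estimates for a transcontinental US flight, so the report's comparison is internally consistent.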
Water Use for GenAI
The rising water demands of data centers, driven largely by cooling requirements, underscore growing concerns about water scarcity. Data centers consume vast amounts of water for cooling, with every 20–50 inference queries on an LLM using approximately 500 milliliters of water. A great deal of water is therefore needed to cool the hardware used for training, deploying, and fine-tuning generative AI models, which can strain municipal water supplies and disrupt local ecosystems. Since many data centers are located in water-stressed regions (e.g., Arizona, Texas, India, Chile), this high water use can intensify local scarcity and community conflict.
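To get a feel for how that per-query figure scales, the range from the text can be multiplied out. The 500 mL per 20–50 queries figure is from the article; the daily query volume below is a hypothetical assumption chosen purely for illustration:

```python
# Illustrative scaling of the per-query water figure.
ml_per_batch = 500                                 # mL per batch of queries (from the text)
queries_low, queries_high = 20, 50                 # queries per 500 mL (from the text)

ml_per_query_high = ml_per_batch / queries_low     # worst case: 25 mL per query
ml_per_query_low = ml_per_batch / queries_high     # best case: 10 mL per query

assumed_daily_queries = 100_000_000                # hypothetical: 100 million queries/day

liters_per_day_low = assumed_daily_queries * ml_per_query_low / 1000
liters_per_day_high = assumed_daily_queries * ml_per_query_high / 1000
print(f"{liters_per_day_low:,.0f} to {liters_per_day_high:,.0f} liters/day")
```

Even at the low end, the assumed query volume implies on the order of a million liters of cooling water per day, which helps explain why siting data centers in water-stressed regions draws scrutiny.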
A 2025 projection estimates that AI (mainly via data centers) may require 4.2–6.6 billion cubic meters of water annually by 2027, more than half of England's total annual water withdrawal.
Electronic Waste from GenAI
The demand for high-performance GPUs, TPUs, servers, and memory modules contributes to electronic waste (e-waste), especially because these components have short lifespans (2–5 years) due to rapid hardware advancements. Unfortunately, many of them are underutilized post-training and promptly discarded. A Nature Computational Science study estimates that e-waste from generative AI (especially large language models) could reach between 1.2 and 5 million metric tons by 2030.
E‑waste is rife with toxic metals like lead, mercury, and chromium, which can leach into ecosystems, contaminating soil, water, and air. In developing regions, informal recycling exposes workers—especially children—to dangerous chemicals, causing serious health issues.
Making GenAI Sustainable
To make GenAI sustainable, there is a need for proactive solutions: streamlining AI models, developing greener infrastructure, and fostering collaboration across disciplines. Implementing these solutions requires changes across the entire GenAI lifecycle, from model design and training through hardware use, deployment, and operational practices. By adopting sustainable measures and fostering innovation in AI, we can harness GenAI's potential for positive change while minimizing its negative sustainability consequences.
Sustainable practices for AI are laid out in ISO/IEC 42001, which focuses on establishing an Artificial Intelligence Management System (AIMS) within organizations. This standard provides a framework for organizations to manage risks, comply with regulations, and build trust in their AI applications, ultimately promoting responsible and ethical AI practices. ISO/IEC 42001 also addresses global challenges like ethical considerations, data privacy, and security risks.
ISO/IEC 42001:2023—Information technology – Artificial intelligence – Management system is available on the ANSI Webstore and in the following Standards Packages:
- ISO/IEC 42001 / ISO/IEC 42005 / ISO/IEC 42006 – Artificial Intelligence Package
- ISO/IEC 42001 / ISO/IEC 22989 / ISO/IEC 23894 – Artificial Intelligence Package
- ISO/IEC 42001 / ISO/IEC 23894 – Artificial Intelligence Set
- ISO/IEC 42001 / ISO/IEC 23894 / ISO/IEC 42006 – Artificial Intelligence Package
- ISO/IEC 5338 / ISO/IEC 8183 / ISO/IEC 42001 – Artificial Intelligence Package