Generative AI and the Environment
What if the very technology poised to save the planet is also contributing to its demise?
Today, our host Carter Considine unpacks the ecological conundrum of generative AI's massive energy consumption, which is, ironically, pushing tech titans like Google and Microsoft further from their professed sustainability goals.
Yet, there's a silver lining: AI could also be the key to unlocking unprecedented efficiencies and emissions cuts. Our host dissects the double-edged nature of generative AI, exploring both the challenges it poses and the innovative solutions on the horizon.
From a pioneering generative AI startup striving to stay carbon neutral, to global giant PwC using AI to steer decarbonization efforts, it’s a tug-of-war between AI's environmental impact and its potential to revolutionize business processes and climate change mitigation.
As we continue to debate technology's role in our ecological future, let’s find out if AI might just turn out to be our unlikely hero in the fight against climate change.
Key Topics:
Generative AI consumes significant computing resources, both electricity and water, and generates a significant amount of carbon emissions as a result. However, AI also has the potential to help reduce carbon emissions by creating efficiencies and accelerating climate change mitigation. This podcast focuses on evaluating both the positive and negative impacts, as well as what future research and policy changes mean for this technology and environmental sustainability.
The Negative Impacts
AI has always had a tumultuous relationship with the environment. With generative AI tools becoming part of our daily lives, understanding this relationship is increasingly important. In 2023, 22% of business leaders cited sustainability as a top issue in generative AI deployment. Yet companies such as Microsoft, and now Google, have failed to hit their environmental goals in the past year because of generative AI. Google's emissions have soared 48% compared to 2019, with the company citing AI's electricity use as the main reason for missing its climate targets.
Intensive Resource Consumption
Generative AI has two significant sources of energy consumption: the model's initial training and the energy consumed while running the model (inference costs).
Electricity consumption
Foundation models are trained on vast amounts of data, which requires enormous computational power and data centers to build and run them. However, PwC estimates that the biggest contributor to increasing emissions will come from usage: any time you write a prompt to ChatGPT or create an image with Stable Diffusion, the model's response consumes energy.
Sweden estimates that power demand from data centers will roughly double over this decade and double again by 2040. In the UK, AI is expected to consume 500% more energy over the next decade. In the US, data centers are projected to use 8% of total power by 2030, up from 3% in 2022, according to Goldman Sachs, which described it as "the kind of electricity growth that hasn't been seen in a generation."
Water consumption
Another resource consumed by generative AI systems is fresh water, used to cool processors and generate electricity. For example, West Des Moines, Iowa, is home to a giant data center cluster that serves OpenAI's GPT-4. A lawsuit was filed against OpenAI by local residents in July 2022, the month before OpenAI finished training the model, claiming that the cluster had used about 6% of the district's water.
When Google and Microsoft trained Bard and Bing, both saw spikes in water use (20% and 34%, respectively) in a single year, according to the companies' reports.
Use case: power consumption in a gen AI startup
Let's run through the energy consumption of a specific seed-stage B2C generative AI startup. This startup uses both the ChatGPT and Stable Diffusion base models to generate text and images for its clients.
For generating images, the startup uses NVIDIA A100 40GB GPUs, which are well known for efficiency and high performance. The GPUs scale down (i.e., turn off) when not in use. On average, this startup's GPUs run for about 400 hours each month. Here is an estimate of their energy consumption:
- Power Consumption: The NVIDIA A100 GPU has a power consumption of approximately 400 watts (0.4 kW) under load
- Monthly Usage: Running for 400 hours per month at 0.4 kW, the total energy consumption is approximately 0.4 kW × 400 h = 160 kWh
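The estimate above can be sketched as a small back-of-the-envelope calculation. The power draw (400 W) and monthly runtime (400 hours) come from the figures in the text; the grid carbon intensity used to translate energy into emissions (0.4 kg CO2 per kWh, a rough US-average figure) is an assumption added here for illustration.

```python
# Back-of-the-envelope energy and emissions estimate for the startup's GPUs.
GPU_POWER_KW = 0.4       # NVIDIA A100 draws ~400 W under load (from the text)
HOURS_PER_MONTH = 400    # average monthly runtime (from the text)

def monthly_energy_kwh(power_kw: float, hours: float) -> float:
    """Energy consumed in a month: power (kW) x time (h) = energy (kWh)."""
    return power_kw * hours

def monthly_emissions_kg(energy_kwh: float, grid_kg_per_kwh: float = 0.4) -> float:
    """Estimated CO2 emissions, given an ASSUMED grid carbon intensity."""
    return energy_kwh * grid_kg_per_kwh

energy = monthly_energy_kwh(GPU_POWER_KW, HOURS_PER_MONTH)
print(energy)                        # 160.0 kWh per month
print(monthly_emissions_kg(energy))  # 64.0 kg CO2 per month at 0.4 kg/kWh
```

At this scale the footprint is modest, which matches the point below: the numbers only become material as the company scales its GPU hours.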
Since the startup is quite early-stage, its overall power consumption and carbon footprint are small, but this will change as the company scales. The startup recently became carbon neutral, though, to be transparent, it achieved this by purchasing carbon offsets through TerraPass.
The Positive Impacts
Optimizing business processes
Generative AI has the potential to make workflows and business processes much more efficient, reducing both the manual activities of individual workers and non-generative-AI compute workloads. These changes could produce a significant drop in emissions across enterprise organizations, including in initiatives working directly to reduce carbon emissions.
One example of this is PwC using generative AI and analytics to help clients prioritize decarbonization efforts. Generative AI can offer multiple options in various departments, helping key decision-makers drive well-informed decisions.
Fighting climate change with AI startups
Besides having a transformative impact on business operations, generative AI has the potential to help address some of the planet’s most daunting climate-related challenges.
Monitoring energy consumption within large organizations, for example, can be tricky. The startup CarbonBright uses AI to help companies calculate the complete carbon footprint of their consumer products. In a similar vein, BrainBox AI uses autonomous AI to lower commercial buildings' carbon emissions.
KoBold Metals uncovers new sources of lithium, cobalt, copper, and nickel with the help of AI, intending to help decrease global reliance on fossil fuels by ensuring 60% of all new light cars and trucks are electric by 2030, with a target of achieving 100% by 2050.
Moving forward
Reducing Energy Costs
Although their motivations may differ from those of climate and sustainability advocates, companies deploying generative AI are also strongly incentivized to reduce energy costs. AI companies spend significant capital on both training and running models, since both require GPUs, and high energy consumption is one of the biggest financial sinks in running a generative AI model.
Research is underway to better optimize the pre-training of generative models and to reduce the energy cost of inference, which will lower overall energy consumption and shrink the associated carbon footprint.
Changing Regulations
The EU AI Act has also been officially published, with specific requirements related to sustainability:
- Requires providers of general-purpose AI models to disclose energy consumption and training compute
- Calls for the development of harmonized standards for AI energy disclosure
- Encourages AI applications that mitigate climate change and biodiversity loss
- Calls for the creation of a Code of Conduct for assessing and minimizing AI's impact on the environment
Very few general-purpose AI providers currently disclose energy consumption and training compute publicly. By requiring base model providers to disclose energy consumption, the Act will incentivize companies to reduce carbon emissions. However, not all base models are equal: models can differ in energy efficiency by a factor of up to 6,000, depending on the specific model and how it was trained.
Although implementation and enforcement will take time, changing legislation in the EU is paving the way for more transparent and sustainable AI.