⚙️ Can AI and the environment coexist?

Good Morning,

Welcome to this special weekend edition of The Deep View, presented in partnership with Athyna.

The environmental cost of AI

Since OpenAI released ChatGPT to the world, AI companies have been racing to build ever-larger large language models (LLMs) to deliver more powerful AI products.

In the pursuit of more advanced AI capabilities, the environmental implications have become increasingly significant. Training OpenAI's GPT-4, for instance, consumed approximately 50 gigawatt-hours (GWh) of electricity—equivalent to the annual energy consumption of about 4,600 average U.S. households. This substantial energy requirement underscores the growing environmental footprint associated with developing and deploying large language models.
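The household comparison above can be checked with simple arithmetic. The sketch below assumes an average U.S. household consumption of roughly 10,800 kWh per year (an EIA-style figure not stated in the article); the 50 GWh training number is itself an external estimate.

```python
# Back-of-envelope check of the GPT-4 training-energy comparison.
# Both inputs are estimates, not confirmed figures.
TRAINING_GWH = 50                    # estimated GPT-4 training energy
KWH_PER_US_HOUSEHOLD_YEAR = 10_800   # assumed average U.S. household use

training_kwh = TRAINING_GWH * 1e6    # 1 GWh = 1,000,000 kWh
households = training_kwh / KWH_PER_US_HOUSEHOLD_YEAR

print(f"~{households:,.0f} U.S. households for a year")  # ≈ 4,600
```

The result lands on roughly 4,600 households, matching the comparison in the text.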

Bringing AI to consumers is just the beginning. Each query on an advanced model like GPT-4o is now estimated to consume around 0.3 watt-hours of electricity, roughly on par with a typical Google search, though earlier estimates put a ChatGPT query closer to 2.9 watt-hours, nearly ten times as much. Either way, the broader integration of AI into everyday activities could significantly raise global electricity demand. According to the International Energy Agency (IEA), whose estimate relies on the higher per-query figure, replacing all 9 billion daily searches with AI-powered ones could increase global electricity usage by approximately 10 terawatt-hours annually, equivalent to the total yearly electricity consumption of about 1.5 million European Union residents.
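The IEA comparison can be reproduced from the per-query figures. This is a sketch using the article's estimates (2.9 Wh per AI query, 0.3 Wh per Google search, 9 billion searches a day); the per-resident figure assumes EU per-capita electricity use of roughly 5,500 to 6,000 kWh per year, which is not stated in the article.

```python
# Back-of-envelope check of the IEA search-energy figures.
# All inputs are estimates quoted or assumed, not measured values.
WH_PER_AI_QUERY = 2.9        # earlier ChatGPT per-query estimate (Wh)
WH_PER_GOOGLE_SEARCH = 0.3   # typical Google search (Wh)
SEARCHES_PER_DAY = 9e9       # daily Google searches
DAYS_PER_YEAR = 365

extra_wh_per_year = (WH_PER_AI_QUERY - WH_PER_GOOGLE_SEARCH) \
    * SEARCHES_PER_DAY * DAYS_PER_YEAR
extra_twh_per_year = extra_wh_per_year / 1e12  # 1 TWh = 1e12 Wh
print(f"Added demand: ~{extra_twh_per_year:.1f} TWh/year")  # ~8.5

# Spread over 1.5 million EU residents:
kwh_per_resident = extra_wh_per_year / 1e3 / 1.5e6
print(f"Per-resident equivalent: ~{kwh_per_resident:.0f} kWh/year")
```

The arithmetic yields roughly 8.5 TWh a year, consistent with the IEA's "approximately 10 TWh," and about 5,700 kWh per resident, in line with typical EU per-capita consumption.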

The cost of construction: mining and manufacturing impacts

When most of us think of an AI model, we picture a chatbot. At an enterprise level, however, AI does much more than that: it can automate repetitive tasks and make companies more efficient. All of this takes an immense amount of computational power.

To sustain this performance, AI models run in data centers that house huge supercomputers, each containing anywhere from 1,000 to 1,000,000 specialized GPU chips.

Now consider this: around 800 kg of raw materials go into the construction of a single computer that weighs 2 kg. And that is only a fraction of the raw material that goes into a single supercomputer, making it a mountain of materials, not just of computational power.

Materials such as silicon, copper, gold, silver, platinum, and nickel are mined to create a supercomputer, a process that contributes significantly to carbon emissions, deforestation, and water pollution. This environmental toll doesn't even account for accidents or abandoned mining sites, which can continue releasing pollutants long after valuable minerals have been extracted.


Throughout training and inference, these computations are performed on GPUs, which require a constant flow of energy.

As AI models become increasingly sophisticated, their training demands more computational resources, driving up energy consumption. Currently, most AI companies prioritize enhancing their models' performance, often overlooking improvements in energy efficiency.

According to OpenAI researchers, the amount of computing power used for the largest deep learning training runs has doubled roughly every 3.4 months since 2012. The result is mounting pressure on the global net-zero carbon emission goals set for 2050.
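To put that doubling period in perspective, it compounds to more than a tenfold increase in training compute every year. A quick sketch (the 3.4-month figure is OpenAI's reported 2012-2018 trend, not a current measurement):

```python
# Compute growth implied by a 3.4-month doubling time.
DOUBLING_MONTHS = 3.4

doublings_per_year = 12 / DOUBLING_MONTHS   # ≈ 3.5 doublings per year
growth_per_year = 2 ** doublings_per_year   # compounding growth factor

print(f"~{growth_per_year:.0f}x more training compute per year")  # ~12x
```

At that pace, compute demand grows by roughly an order of magnitude annually, far faster than efficiency gains in chips or data centers.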

It is estimated that the rise of AI could see data centers account for nearly 35% of Ireland’s energy use by 2026.

Data centers’ unquenchable thirst for water

Data centers also require large quantities of water for cooling. Very little data on data center water use is publicly available, but estimates can be drawn from a lengthy legal battle involving Google’s data centers in The Dalles, Oregon. During the lawsuit, it was revealed that Google’s data centers in the region consumed more than 355 million gallons of water in 2021, an amount that had tripled since 2016 and represented more than one-quarter of the town’s annual water consumption. Data centers figure among the top 10 ‘water-consuming industrial or commercial industries,’ according to a study led by Landon Marston, an assistant professor of civil and environmental engineering at Virginia Tech. The study further noted that data centers draw water from nearly 90% of U.S. watersheds, and their thirst will only grow as AI adoption increases.

More AI will mean more e-waste

An important side effect of the worldwide adoption of AI systems is the push it has given to upgrade cycles, the length of time a consumer holds on to their devices. With AI in the picture, more users are willing to let go of older devices before the end of their useful lives in order to access AI features.

The clearest example is the smartphone market, where newer devices feature dedicated neural engines capable of running on-device AI features. Because older devices lack chips with these neural engines, more users are willing to part with them early.

At the time of its launch, the iPhone 15 series was among the best-selling smartphones on the market, but it was overshadowed by the iPhone 16 series launched just a year later. The major difference between the two series was the introduction of AI features.

A similar trend can be seen in the PC market, where manufacturers are rushing to introduce AI features to boost sales.

All of this contributes to e-waste, the majority of which still ends up in landfills. The inability of older devices to run AI features not only hurts their resale value but also means more of them are discarded rather than refurbished.

AI’s potential for environmental sustainability

Despite their environmental impact, AI models can help humanity recognize patterns to better combat the challenges posed by climate change. AI can also be used to reduce the carbon emissions of industries like mining, refining, and manufacturing, and to improve closed-loop manufacturing, in which more materials are effectively reclaimed and reused.

Stronger legislation around the sources of energy used for running data centers can also help reduce the impact of AI on the environment.

However, this is more speculative, as AI could also have unintended consequences. For example, AI-powered self-driving cars could lead more people to drive instead of cycling or taking public transport, increasing carbon emissions. AI could also spread misinformation about climate change, downplaying its impact on people’s lives and further exacerbating the problem. Whatever the future holds, AI wields tremendous power, depending on how it is used, not just on social and economic fronts but also on the environmental one.

Final Thoughts

While AI promises to revolutionize numerous sectors, its rapid growth comes with significant environmental costs. It's crucial to balance innovation with sustainable practices, ensuring technology progresses without sacrificing our planet’s health.

For companies in growth mode, remote hiring offers access to top global talent—particularly across Latin America—while reducing emissions from commuting and office operations. It also presents a cost-efficient alternative, with potential savings of up to 70% compared to the U.S. market. Learn more here.