Cloud Computing’s Climate Impact: AI’s Role in Rising Carbon Emissions

Artificial intelligence (AI) has been touted as the tech industry’s latest marvel, with the potential to revolutionize several trillion-dollar industries, ranging from retail to medicine. But every new chatbot and image generator takes a substantial amount of electricity to build, and that demand may be responsible for a growing share of planet-warming carbon emissions, according to Bloomberg News.

To train the AI algorithms known as models, Microsoft, Google, and OpenAI rely on cloud computing: thousands of chips inside servers in massive data centers worldwide churn through data to help the models “learn” to perform tasks. The AI sector is growing so fast, and operates with so little transparency, that no one knows precisely how much total electricity use and carbon emissions can be attributed to it.

AI consumes more energy than other forms of computing, and training a single model can use more electricity than a hundred US homes consume in a year. Emissions also vary with the source of that electricity: data centers drawing on coal- or natural gas-fired power plants are responsible for far higher emissions than those running on solar or wind farms.

Researchers have tallied the emissions from the creation of individual models, and some companies have disclosed data about their energy use, but there is no overall estimate of how much power the technology consumes. Sasha Luccioni, a researcher at the AI firm Hugging Face Inc., wrote a paper quantifying the carbon impact of the company’s BLOOM model, a rival to OpenAI’s GPT-3.

Researchers like Luccioni call for greater transparency about the power usage and emissions of AI models. Governments and companies could then decide whether the electricity and emissions of GPT-3 or other large models are worth it for goals such as researching cancer cures or preserving Indigenous languages.

The growth of AI models may draw the kind of scrutiny already directed at the cryptocurrency industry. Bitcoin has faced criticism for its outsized power consumption, using as much electricity annually as Argentina, according to the Cambridge Bitcoin Electricity Consumption Index. That massive energy appetite led China to prohibit crypto mining, and New York has imposed a two-year ban on new licenses for fossil fuel-based crypto mining.

GPT-3, a single general-purpose AI program that can generate language and has numerous applications, took 1.287 gigawatt-hours of electricity to train, about the amount 120 US homes consume in a year. Training also generated 502 tons of carbon emissions, roughly what 110 US cars emit in a year. OpenAI is already working on GPT-4, and models must be retrained regularly to stay aware of current events.
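Those household and car equivalences can be checked with simple arithmetic. The sketch below is only a rough illustration: the per-home and per-car reference values (about 10,600 kWh of electricity per US home per year and about 4.6 metric tons of CO2 per US passenger car per year, typical published averages) are assumptions, not figures from the article.

```python
# Back-of-envelope check of the household and car equivalences cited above.
# The per-home and per-car reference values are assumed typical US averages,
# not numbers taken from the article itself.

GPT3_TRAINING_ENERGY_KWH = 1.287e6   # 1.287 gigawatt-hours expressed in kWh
GPT3_TRAINING_EMISSIONS_T = 502      # metric tons of CO2

KWH_PER_US_HOME_PER_YEAR = 10_600     # assumed average US household electricity use
TONS_CO2_PER_US_CAR_PER_YEAR = 4.6    # assumed average US passenger-car emissions

home_equivalents = GPT3_TRAINING_ENERGY_KWH / KWH_PER_US_HOME_PER_YEAR
car_equivalents = GPT3_TRAINING_EMISSIONS_T / TONS_CO2_PER_US_CAR_PER_YEAR

print(f"~{home_equivalents:.0f} US homes' annual electricity")  # ~121 homes
print(f"~{car_equivalents:.0f} US cars' annual emissions")      # ~109 cars
```

Under those assumed averages, the result lands close to the article’s figures of roughly 120 homes and 110 cars.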

Google’s researchers found that AI accounted for 10 to 15% of the company’s total electricity consumption, equivalent to about 2.3 terawatt-hours annually. Microsoft, Google, and Amazon, the three largest US cloud companies, have all pledged to be carbon negative or neutral.
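For a sense of scale, the 10 to 15% share and the 2.3 terawatt-hour figure together imply a company-wide total somewhere in the mid-teens to low twenties of terawatt-hours per year. The minimal sketch below does only that arithmetic, using the numbers quoted above and nothing else.

```python
# Implied company-wide electricity total, derived solely from the figures above:
# AI at 10-15% of the total, with the AI share put at about 2.3 TWh per year.

ai_share_twh = 2.3

total_if_ai_is_15_percent = ai_share_twh / 0.15   # ~15.3 TWh
total_if_ai_is_10_percent = ai_share_twh / 0.10   # 23.0 TWh

print(f"Implied total: {total_if_ai_is_15_percent:.1f} to "
      f"{total_if_ai_is_10_percent:.1f} TWh per year")
```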

In a statement, OpenAI emphasized its efforts to enhance the efficiency of the ChatGPT application programming interface, resulting in reduced electricity usage and lower costs for its customers. The company recognizes its obligation to combat climate change and is continuously exploring ways to optimize computing power usage. OpenAI runs on the Azure platform and collaborates closely with Microsoft’s team to boost efficiency and minimize the environmental impact of running extensive language models.

Net Zero Pledges

Google is pursuing net-zero emissions across its operations by 2030, with a goal to run its offices and data centers entirely on carbon-free energy. Microsoft, for its part, is purchasing renewable energy and taking other measures to meet its previously stated goal of being carbon negative by 2030.
