
  • 27 Apr, 2024

The energy required to run advanced processing chips and cooling systems makes AI like oil: useful to humans but harmful to the environment.

Humanity is enthusiastically embracing artificial intelligence despite academic and safety concerns, but the technology's hunger for energy and its carbon emissions give further cause for worry. AI is often compared to fossil fuels: once extracted and refined, oil is a highly profitable commodity. Like oil, AI has a major environmental impact, one that may surprise many.

According to an article in MIT Technology Review, the life cycle of training a large AI model has a significant environmental impact: "The whole process can emit more than 626,000 pounds of carbon dioxide equivalent, nearly five times the lifetime emissions of the average American car (and that includes the manufacture of the car itself)."
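Those figures are easy to sanity-check. The short Python sketch below converts the quoted emissions into metric tonnes and works the "five times" ratio backwards; the per-car number is derived from the article's own ratio, not an independent estimate.

```python
# Back-of-the-envelope check of the MIT Technology Review figures.
POUNDS_PER_KG = 2.20462

training_emissions_lb = 626_000  # CO2e from training one large NLP model
training_emissions_t = training_emissions_lb / POUNDS_PER_KG / 1000

# "nearly five times the lifetime emissions of the average American car"
car_lifetime_t = training_emissions_t / 5

print(f"Training: ~{training_emissions_t:.0f} t CO2e")        # ~284 t
print(f"Implied car lifetime: ~{car_lifetime_t:.0f} t CO2e")  # ~57 t
```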

Research by Alex de Vries of the VU Amsterdam School of Business and Economics also raises concerns about the energy consumption of rapidly advancing computing and the potential environmental impact of AI and data centers. "In recent years, data center electricity consumption has been relatively stable at around 1% of global electricity consumption, excluding cryptocurrency mining," De Vries said.

How AI data centers work

According to an earlier MIT study, "most natural language processing (NLP) models can be trained and developed on a regular laptop or server." Today's AI data centers, however, require many instances of specialized hardware such as graphics processing units (GPUs) or tensor processing units (TPUs). "The purpose of a large language model is to predict what comes next in a text," says a paper from the Columbia Climate School. "To achieve that, it must first be trained. Training involves exposing the model to the enormous amounts of data (potentially hundreds of billions of words) available on the internet and in books, articles, social media, and custom datasets."

This learning process can take weeks or months, during which the AI model is measured against various datasets to determine how accurately it performs a particular task.

Initially, the AI model makes essentially random guesses at the correct answer. With continued training, however, it detects more of the patterns and relationships in the data and produces increasingly accurate and relevant results.
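That trial-and-error loop is easier to picture with a toy example. The sketch below (an illustration only, not how production LLMs are built) trains a character-level bigram model: before it sees any data its guesses are no better than chance, and after counting transitions in a small corpus it reliably picks up patterns such as 'e' following 'h'.

```python
from collections import defaultdict

# Toy next-token "training": count which character tends to follow which.
corpus = "the model learns to predict what comes next in the text " * 50

counts = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1  # training: tally observed transitions

def predict(prev: str) -> str:
    """Guess the most likely next character given the previous one."""
    following = counts[prev]
    if not following:  # unseen context: the model can only guess
        return " "
    return max(following, key=following.get)

print(predict("h"))  # 'e' -- the model has learned the pattern 'he'
```

Real LLMs do the same job at vastly larger scale, adjusting billions of parameters by gradient descent instead of keeping simple counts, which is what makes the process so compute- and energy-intensive.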

In recent years, advances in neural network architectures and training techniques have "produced impressive accuracy gains in many basic NLP tasks."

"Consequently, training modern models requires significant computational resources that require energy and significant financial and environmental costs," the MIT study added. The energy demand and carbon footprint of AI data centers

After OpenAI launched ChatGPT in late 2022, the rapid expansion and widespread adoption of artificial intelligence through 2023 drove the development of large language models (LLMs) at major technology companies such as Microsoft and Alphabet (Google).

De Vries notes in his article that the success of ChatGPT (which reached an unprecedented 100 million users in two months) prompted Microsoft and Google to release their own chatbots, Bing Chat and Bard, respectively. "We know that data centers account for 1% of global electricity consumption," De Vries told VOU. "Thanks to digital trends like cryptocurrency mining and artificial intelligence, that share could easily grow to over 2% in the next few years."
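For scale, those percentage shares can be turned into absolute numbers. In the sketch below, the global consumption figure (roughly 25,000 TWh a year) is an assumed round number for illustration, not a figure from the article.

```python
# Rough scale of the 1% -> 2% claim. Global electricity consumption of
# ~25,000 TWh/year is an assumed round figure, not from the article.
GLOBAL_TWH_PER_YEAR = 25_000

for share in (0.01, 0.02):
    twh = GLOBAL_TWH_PER_YEAR * share
    print(f"{share:.0%} of global electricity ≈ {twh:,.0f} TWh/year")
```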

An MIT study found that cloud computing now has a larger carbon footprint than the entire aviation sector. Furthermore, a single data center can consume as much electricity as approximately 50,000 homes.

Power is needed to drive high-performance chips and their cooling systems, because processors run hot when analyzing large amounts of data to produce accurate answers. According to De Vries' research, Hugging Face's BigScience Large Open-science Open-access Multilingual (BLOOM) model consumed 433 MWh of electricity during training.

"Other LLMs, including GPT-3, Gopher, and Open Pre-trained Transformer (OPT), used 1,287, 1,066, and 324 MWh respectively for training. Each of these LLMs was trained on terabytes of data and has 175 billion or more parameters," the study adds. In his article, De Vries cited research firm SemiAnalysis, which estimated that OpenAI would need 3,617 NVIDIA HGX A100 servers to support ChatGPT, corresponding to an energy demand of 564 MWh per day.
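Taken at face value, those two numbers imply a continuous draw of about 6.5 kW per server. The sketch below works that out and, for scale, compares the daily total with household use; the per-home figure (about 29 kWh a day, a rough US average) is an assumption, not a number from the article.

```python
# What De Vries' ChatGPT estimate implies per server.
servers = 3_617          # NVIDIA HGX A100 servers (SemiAnalysis estimate)
daily_energy_mwh = 564   # estimated daily energy to serve ChatGPT

kw_per_server = daily_energy_mwh * 1_000 / servers / 24
print(f"~{kw_per_server:.1f} kW continuous draw per server")  # ~6.5 kW

HOME_KWH_PER_DAY = 29    # assumption: rough US household average
homes = daily_energy_mwh * 1_000 / HOME_KWH_PER_DAY
print(f"Daily total ≈ {homes:,.0f} homes' electricity use")   # ~19,400
```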

"Google reported that from 2019 to 2021, 60% of its AI-related energy consumption came from inference (the phase in which trained models respond to fresh data in real time). Alphabet, Google's parent company, has also raised concerns about the costs of inference and training," De Vries added.

A study by researchers at the University of California, Berkeley estimates that GPT-3, the 175-billion-parameter model behind ChatGPT, produced 502 tonnes of CO2 during training and may have a daily carbon footprint of around 50 pounds (8.4 tons per year).
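The annual figure follows directly from the daily one, and it also shows how one-off training emissions compare with ongoing inference. A quick consistency check (a sketch; the years-of-inference comparison is my arithmetic, not the study's):

```python
LB_PER_TONNE = 2_204.62

daily_lb = 50       # estimated daily inference footprint (article figure)
yearly_t = daily_lb * 365 / LB_PER_TONNE
print(f"~{yearly_t:.1f} t CO2 per year")  # ~8.3 t, close to the quoted 8.4

training_t = 502    # one-off training emissions for GPT-3
print(f"Training ≈ {training_t / yearly_t:.0f} years of daily inference")
```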

The debate over AI's viability and future steps

De Vries says the high energy requirements of data centers are largely met by fossil fuels. "We have a limited supply of renewable energy and we have already prioritized its use, so the additional demand is coming from fossil fuels, which we need to eliminate," he told VOU. "Even if we put renewable energy into AI, we will have to use fossil fuels elsewhere, which will make climate change worse."

Avik Sarkar, a professor at the Indian School of Business and former head of the data analytics cell at India's NITI Aayog, played down the debate over AI's energy demands and carbon footprint. In 2018, he worked with the International Energy Agency (IEA) to analyze data center growth in India and its impact on the country's energy consumption. "The impact of AI on energy consumption is very small; many other technologies consume large amounts of energy," he told VOU. "If you look at a shopping street in a major city, the amount of billboard lighting is so great that it can be seen from space. This night-time lighting is actually a good indicator of economic development and growth. Energy consumption is a natural consequence of urbanization, capitalism, and economic growth. We have to accept this reality."

As for the energy demands and CO2 impact of AI data centers, De Vries says climate change is a global issue, not one confined to India. "The increase in both energy demand and CO2 emissions due to AI will affect vulnerable countries everywhere," he said. Sarkar acknowledges that AI will be power-intensive, since large data centers provide its storage and computing infrastructure, and that the water used to cool those centers carries an additional environmental cost. But he pointed out that most of the world's data centers are located outside India and argued that the country does not currently face major challenges: apart from personal data, much Indian data can be stored in centers abroad.

"Sensitive data related to financial transactions, Aadhaar, or healthcare will have to reside in India, and the scale will be huge. India has diverse climate zones, and high energy consumption can be reduced by locating data centers in the country's cool, earthquake-free regions," he said (the sketch below illustrates this cooling effect). De Vries says the good news is that there are bottlenecks in the AI server supply chain that will somewhat limit growth in the short term. "We need to take this opportunity to think about the responsible use of artificial intelligence and to ensure transparency about where it is being used, so that we can properly assess the impact of this technology," he said.
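Sarkar's siting argument is usually expressed through Power Usage Effectiveness (PUE): total facility energy divided by the energy that actually reaches the IT equipment, with cooling the main overhead. The sketch below uses illustrative PUE values, not figures from the article, to show how a cooler site trims the bill for the same IT load.

```python
# Illustrative PUE comparison; the 1.7 and 1.2 values are assumptions.
def facility_energy_mwh(it_energy_mwh: float, pue: float) -> float:
    """Total energy a facility draws to deliver a given IT load."""
    return it_energy_mwh * pue

IT_LOAD_MWH_PER_DAY = 564  # reusing the ChatGPT-scale estimate above
for site, pue in [("hot climate", 1.7), ("cool climate", 1.2)]:
    total = facility_energy_mwh(IT_LOAD_MWH_PER_DAY, pue)
    overhead = total - IT_LOAD_MWH_PER_DAY
    print(f"{site}: {total:.0f} MWh/day ({overhead:.0f} MWh overhead)")
```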

Sanjeev Kumar is a journalist based in Shimla, India, specializing in environment, climate change, and politics.