Why could ChatGPT destroy the planet?

ChatGPT is one of the most exciting and promising technologies of our time. However, its rapid development could have unexpected and potentially disastrous consequences for the planet.

Although ChatGPT is currently being used to enhance the user experience, it could potentially cause damage to the planet. Indeed, training and running AI models requires massive amounts of data and energy.

ChatGPT: the potential environmental impact of AI

Language models such as ChatGPT require the processing and analysis of massive volumes of data. The quantity of data will grow even larger as Google and Microsoft integrate these models into their search engines. The training process involves careful analysis of the relationships within the data, which requires significant computing power.


Only large companies with considerable resources can afford to fund such projects. However, the increasing use of AI in search engines could have adverse environmental consequences.

Alan Woodward, professor of cybersecurity, asserts that “every time there is a step change in online processing, we see a significant increase in energy resources and cooling resources required by large processing centers.”

The additional processing power, storage and search capacity required to integrate AI into search engines could lead to a considerable increase in the amount of energy consumed. This increase in energy demand driven by ChatGPT could have significant negative repercussions on the planet.

ChatGPT: a major source of carbon emissions

In addition to requiring ever more computing power, training language models also generates a significant carbon footprint. The data centres needed to process the information consume large amounts of energy and generate CO2 emissions.

Independent research has revealed that the creation of GPT-3, the basis of ChatGPT, consumed a considerable amount of energy. Training this AI consumed 1,287 MWh of electricity and generated around 550 tonnes of CO2 emissions. This is equivalent to 550 round trips between New York and San Francisco. And let’s not forget the resource costs of providing and running ChatGPT for millions of users. According to Carlos Gómez-Rodríguez, a computer scientist, “it’s not only necessary to train this model”; it’s also necessary to “execute it and provide it to a large number of users.”
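To put these figures in perspective, here is a rough back-of-envelope sketch (purely illustrative, not part of the cited research) that derives what the article’s numbers imply per flight and per kilowatt-hour:

```python
# Back-of-envelope check of the figures quoted above (illustrative only).
# The per-flight and per-kWh values are implied by the article's comparison,
# not official measurements.

TRAINING_ENERGY_MWH = 1_287    # energy reported for training GPT-3
TRAINING_EMISSIONS_T = 550     # CO2 emissions reported for training GPT-3
ROUND_TRIPS_NY_SF = 550        # round trips the article compares this to

# Implied CO2 per New York-San Francisco round trip
co2_per_round_trip_t = TRAINING_EMISSIONS_T / ROUND_TRIPS_NY_SF

# Implied carbon intensity of the electricity used for training
carbon_intensity_kg_per_kwh = (TRAINING_EMISSIONS_T * 1_000) / (TRAINING_ENERGY_MWH * 1_000)

print(f"Implied CO2 per round trip: {co2_per_round_trip_t:.1f} t")
print(f"Implied grid carbon intensity: {carbon_intensity_kg_per_kwh:.2f} kg CO2/kWh")
```

The implied intensity of roughly 0.4 kg of CO2 per kWh is close to the average for a fossil-heavy electricity grid, which is why the choice of energy source matters so much in the discussion that follows.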

Reducing the carbon footprint of large language models

Nafise Sadat Moosavi, Professor of Natural Language Processing at the University of Sheffield, is currently working on the sustainability of natural language processing models such as ChatGPT. “We need to work on how to reduce the inference time required for such large models,” she explains.

To achieve this, it is important to use lighter language models, such as the first version of Bard published by Google. With the aim of limiting the carbon footprint of machine learning systems, Google conducted detailed research into the energy costs of large language models. The results showed that a combination of efficient models, efficient processors and data centres powered by clean energy sources can reduce the carbon footprint by up to 1,000 times.
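The key point is that these savings compound: gains from the model, the hardware and the data centre multiply rather than add. The sketch below uses hypothetical placeholder factors (not figures from Google’s study) simply to show how independent improvements can combine into a reduction of that order:

```python
# Illustrative only: how independent efficiency gains multiply.
# The individual factors below are hypothetical placeholders chosen for
# readability; the compounding effect is the point, not the exact values.

model_efficiency_gain = 10        # e.g. a lighter or sparser model architecture
hardware_gain = 10                # e.g. processors optimised for machine learning
datacenter_clean_energy_gain = 10 # e.g. efficient data centres on clean energy

combined_reduction = (
    model_efficiency_gain * hardware_gain * datacenter_clean_energy_gain
)
print(f"Combined carbon-footprint reduction: ~{combined_reduction}x")
```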

Although the increase in computing power for ChatGPT may have a detrimental impact on the planet, Moosavi believes it is important to weigh this against the gains in search accuracy for users. In her view, these improvements will enable more people to access large language models, which were previously reserved for an elite.