Artificial intelligences consume a lot of energy


Last May, Google introduced AI Overviews, a service capable of automatically answering users’ questions using generative artificial intelligence. At the time, the main topic of discussion was the bizarre mistakes these AIs made – Google recommended adding glue to pizza to make the cheese stringier, for example – one of the many manifestations of the so-called “hallucinations” typical of these tools. But there is another worrying aspect of a sector that has been growing continuously for years: the energy consumption required to operate it.

According to a study by the researcher Alex de Vries, every time Google generates a response with AI Overviews it consumes about three watt-hours, an amount of energy equal to that needed for an hour-long landline phone call, or ten times greater than that needed for a traditional Google search.

Generative AI, however, does not consume energy only when users query it. These technologies are built on large language models (LLMs), a type of artificial intelligence that uses deep neural networks to learn, from large quantities of documents of various kinds, how to generate text, images and video. This training phase of LLMs is also expensive in terms of energy: according to the site The Verge, “training” GPT-3 (a language model by OpenAI that has since been superseded by GPT-4) consumed just under 1,300 megawatt-hours of electricity, «roughly as much as 130 US homes consume annually» (or the energy needed to watch Netflix for 1.625 million hours).
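These figures can be sanity-checked with some back-of-the-envelope arithmetic. A minimal sketch, assuming the 0.3 Wh per traditional search implied by the “ten times greater” comparison, and checking what annual consumption per US home the Verge comparison implies:

```python
# Back-of-the-envelope check of the article's energy figures.

AI_OVERVIEWS_WH = 3.0  # watt-hours per AI Overviews response (de Vries)
TRADITIONAL_SEARCH_WH = AI_OVERVIEWS_WH / 10  # "ten times greater" implies ~0.3 Wh

GPT3_TRAINING_MWH = 1300  # just under 1,300 MWh reported by The Verge
US_HOMES = 130            # "roughly as much as 130 US homes consume annually"
NETFLIX_HOURS = 1_625_000  # "Netflix for 1.625 million hours"

# Implied annual consumption per US home, in kilowatt-hours:
kwh_per_home = GPT3_TRAINING_MWH * 1000 / US_HOMES

# Implied energy per hour of Netflix streaming, in watt-hours:
wh_per_netflix_hour = GPT3_TRAINING_MWH * 1_000_000 / NETFLIX_HOURS

print(f"Per home: {kwh_per_home:.0f} kWh/year")          # 10,000 kWh/year
print(f"Per Netflix hour: {wh_per_netflix_hour:.0f} Wh")  # 800 Wh
```

The implied 10,000 kWh per home per year is close to the actual US household average (a little over 10,500 kWh according to the US Energy Information Administration), so the comparisons are internally consistent.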

These language models are capable of producing various types of content, and generating text is the least energy-intensive application. According to a study carried out by researchers from the AI company Hugging Face and Carnegie Mellon University, producing images and – above all – videos consumes even more energy. By their calculations, generating a single image with AI consumes, on average, enough energy to charge a smartphone.

The consequences of all this are starting to be felt in the electricity grids of some countries, which must manage a significant increase in demand from technology companies. This is especially true in Virginia, in the United States, home to dozens of data centers, essential elements of the internet’s infrastructure. Virginia has a close relationship with the sector (ARPANET, the military project from which the internet was born, was created there in 1969), and the main cloud computing companies, such as Amazon Web Services, Google Cloud and Microsoft Azure, still operate there today. However, the number of data centers has grown so much in recent years that the sector now absorbs a fifth of the energy consumed by the state, as reported by the magazine Jacobin.

In particular, Bloomberg reported on the case of DataBank, a company that builds and manages data centers. The increase in energy consumption recorded by the company was sudden and substantial: «This is the problem with artificial intelligence», explained James Mathes, head of DataBank, «it needs a lot of energy, and as soon as it needs it, it needs it right away».

– Read also: We need to understand what “artificial intelligence” is

These kinds of surges in demand aren’t unique to Virginia: data center demand growth is outpacing supply in many parts of the world, driving up prices and the risk of blackouts. In Sweden, data center demand is set to double between now and the end of this decade, then double again by 2040; in the U.K., it’s expected to grow 500% by 2030; and in the U.S., the sector is set to account for 8% of total consumption by 2030, up from 3% in 2022. This is “the kind of surge in electricity demand that hasn’t been seen in a generation,” Goldman Sachs said.

John Ketchum, CEO of NextEra Energy, which produces solar and wind energy, told Bloomberg that artificial intelligence is undoubtedly behind this spike in demand. He pointed in particular to some of its applications, including inference, the process by which language models draw conclusions from new data (data absent from the initial training material). Already today, data centers collectively use more energy than most countries, including Italy: only 16 nations, among them China and the United States, consume more.

All this represents a problem for the AI sector, which is already dealing with early skepticism about the real applications of these technologies and will also have to reckon with their economic, energy and environmental impact. The demand for energy also risks canceling out much of the progress the sector has made in recent years toward making data centers more sustainable. Google, for example, plans to power its computing centers entirely with renewable energy by 2030: the development of services such as AI Overviews risks making those goals harder to meet.

The sector is also increasingly populated by smaller and lesser-known companies, often born in the wake of generative AI’s success, which are proving far less scrupulous about the energy sources they use. The Washington Post reported on the network of about 2,700 data centers operating in the United States alone, many of them owned by smaller companies with an “agnostic” approach to sustainability that also buy energy from fossil fuels. Tech companies promised that “clean energy would be a magical, infinite resource,” commented Tamara Kneese, director of the nonprofit Data & Society, while in reality “coal plants are being revitalized thanks to the boom in artificial intelligence.”

Every time a data center comes online, tech companies say they purchase wind, solar or geothermal energy to offset its emissions. As The Washington Post explained, however, some critics compare these announcements to a shell game, because “the companies are operating on the same electrical grid as everyone else, while taking away much of the finite amount of clean energy.” This in turn forces energy companies to buy fossil fuel energy to meet collective demand.

– Read also: The enthusiasm for artificial intelligence is waning a bit

For this reason, in addition to investing in new chips and more efficient servers, some of the major technology companies are pinning their hopes on radical energy breakthroughs that are still far off. In 2018, Sam Altman, co-founder of OpenAI, invested 375 million dollars in Helion Energy, a startup that aims to build a nuclear fusion plant by 2028. Fusion, the same nuclear reaction that powers the stars, is a goal pursued for decades by various scientists and companies, and it could represent an enormous source of clean energy. But industry experts are very skeptical about both the timeline and the startup’s concrete chances.

Altman considers nuclear fusion essential for the development of AI: in an interview with CNBC, he said that “if we can lower the cost of AI and the cost of energy by a lot, it will dramatically improve the quality of life for all of us.” Microsoft, a longtime partner of OpenAI, has already promised to buy energy from Helion Energy as soon as it is safe to do so.
