AIs aren’t just stealing jobs: we may soon not have enough power for everyone

As AI continues to revolutionize entire areas of our lives, it is crucial to consider its impact on energy consumption.

The technological evolution of recent years has brought artificial intelligence (AI) to the center of numerous discussions, often focused on its impact on the job market. While many fear that AI could replace human labor in various industries, there is another aspect of its development that is less explored but equally critical: the basic infrastructure that allows AI to operate, namely electricity.

In academic and industrial circles, there is growing awareness of the energy required to support the advancement of AI-based technologies. These considerations raise important questions about the sustainability of this relentless progress, especially in an era in which the management of energy resources is becoming increasingly critical.

AI’s energy consumption puts our future at risk

AI technologies, particularly generative language models such as OpenAI’s GPT-4, require a huge amount of electricity. Arm Holdings marketing director Ami Badani has highlighted how these systems require tens of thousands of computing clusters to operate effectively. This “insatiable demand” for energy raises serious doubts about the future of electricity supply, especially considering that the goal is to make AI increasingly accessible and integrated into everyday devices.

AI could consume a quarter of total US electricity by 2030 – biopianeta.it

During the Fortune Brainstorm AI conference in London, it was highlighted that widespread adoption of AI could lead to energy consumption accounting for a quarter of total U.S. electricity by 2030.

This scenario underscores the urgency of developing more energy-efficient technological solutions. Arm Holdings itself is working on semiconductor chips optimized to reduce power consumption, an essential move to ensure that technological progress does not exceed the capacity of existing infrastructure.

Despite the benefits that AI can bring, the energy cost of operating it is considerable and potentially unsustainable. Training advanced models like Sora (OpenAI’s video generator) requires 100,000 AI chips working at full capacity. This level of consumption not only puts a strain on current energy resources but also raises environmental concerns related to rising carbon emissions. It is clear that, without a significant change in the design and use of the devices that power these technologies, we could be facing an energy crisis of unimaginable proportions.
