Is AI Draining the World's Power? The Worst Is Yet to Come

In recent years, the rise of artificial intelligence (AI) has sparked widespread discussion and concern. Many people worry that AI will cause unemployment to soar, while some optimists joke that "as long as electricity costs more than steamed buns, AI will never fully replace humans."

 

Although this is a joke, it points to a real issue: AI's energy consumption. More and more people worry that high energy consumption will become a bottleneck restricting the development of AI.

 

Kyle Corbitt, a technology entrepreneur and former Google engineer, said on the social media platform X that Microsoft has already run into this problem.

 

How power-hungry is AI?

 

According to Corbitt, the Microsoft engineers training GPT-6 were busy building an InfiniBand (IB) network to connect GPUs located in different regions. The job is difficult, but they have no choice: deploying more than 100,000 H100 chips in a single region would collapse the power grid.

 

Why would concentrating these chips bring down the power grid? Let's do some simple math.

 

Data published on NVIDIA's website shows that each H100 chip has a peak power draw of 700 W, so 100,000 H100s would draw up to 70 megawatts at peak. An energy industry practitioner in the comments on X pointed out that this is equivalent to the entire output of a small solar or wind power plant. On top of that comes the energy consumed by the facilities supporting all those chips, including servers and cooling equipment. With so many power-hungry facilities concentrated in a small area, the pressure on the power grid is easy to imagine.
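To make the arithmetic concrete, here is a minimal sketch of the estimate above. The 700 W figure is NVIDIA's published peak power for the H100; the 1.5x facility-overhead multiplier is an illustrative assumption, not a number from the article.

```python
# Back-of-the-envelope estimate for a 100,000-GPU H100 cluster.
# 700 W peak per H100 is NVIDIA's published figure; the 1.5x facility
# overhead (servers, cooling, power delivery) is an assumption for illustration.
H100_PEAK_W = 700
NUM_GPUS = 100_000
ASSUMED_OVERHEAD = 1.5  # hypothetical multiplier, not from the article

gpu_peak_mw = H100_PEAK_W * NUM_GPUS / 1_000_000   # chips alone: 70 MW
facility_peak_mw = gpu_peak_mw * ASSUMED_OVERHEAD  # with overhead: 105 MW

print(f"GPUs alone: {gpu_peak_mw:.0f} MW; with facility overhead: {facility_peak_mw:.0f} MW")
```

Even the chips-alone figure, 70 MW, is on the order of a small power plant's entire output, which is why grid operators care where such a cluster is sited.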


AI's power consumption is only the tip of the iceberg


On the issue of AI energy consumption, a report in The New Yorker attracted widespread attention, estimating that ChatGPT consumes more than 500,000 kWh of electricity per day. (See: "ChatGPT consumes more than 500,000 kilowatt-hours of electricity per day, and it is energy that is killing the development of AI.")
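For a sense of scale, that daily figure can be compared against household consumption. The 29 kWh/day per-household figure below is an assumed U.S. average (roughly 10,600 kWh per year), not a number from The New Yorker's report.

```python
# Putting the estimated 500,000 kWh/day for ChatGPT in perspective.
# HOUSEHOLD_KWH_PER_DAY is an assumed U.S. average, not from the report.
CHATGPT_KWH_PER_DAY = 500_000
HOUSEHOLD_KWH_PER_DAY = 29  # assumption

households = CHATGPT_KWH_PER_DAY / HOUSEHOLD_KWH_PER_DAY
print(f"Roughly {households:,.0f} average U.S. households' worth of daily electricity")
```

Under that assumption, the estimate works out to the daily electricity use of roughly seventeen thousand homes.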

 

In reality, AI's current power consumption, while seemingly astronomical, is still far lower than that of cryptocurrencies and traditional data centers. The problem Microsoft's engineers ran into also shows that what constrains the development of AI is not only the energy consumption of the technology itself, but also the energy consumption of the supporting infrastructure and the carrying capacity of the power grid.

 

A report released by the International Energy Agency (IEA) shows that global electricity consumption by data centers, AI, and cryptocurrencies reached 460 TWh in 2022, nearly 2% of global electricity consumption. The IEA predicts that, in the worst-case scenario, consumption in these areas will reach 1,000 TWh in 2026, comparable to the electricity consumption of all of Japan.

 

However, the report also shows that the energy currently going directly into AI R&D is far lower than that consumed by data centers and cryptocurrencies. NVIDIA, which holds about 95% of the AI server market, supplied about 100,000 chips in 2023, consuming about 7.3 TWh of electricity per year. Cryptocurrencies, by contrast, consumed 110 TWh in 2022, comparable to the electricity consumption of the entire Netherlands.

 

Cooling energy consumption must not be ignored

 

The energy efficiency of a data center is commonly evaluated with Power Usage Effectiveness (PUE): the ratio of all energy consumed by the facility to the energy consumed by the IT load alone. The closer the ratio is to 1, the less energy the data center wastes. According to a report published by the Uptime Institute, a data center standards organization, the global average energy efficiency ratio for large data centers in 2020 was approximately 1.59. That is, for every kilowatt-hour consumed by a data center's IT equipment, its ancillary equipment consumed 0.59 kilowatt-hours.
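The PUE relationship described above can be sketched as a one-line calculation, using the 1.59 average reported by the Uptime Institute:

```python
# PUE = total facility energy / IT equipment energy, so the ancillary
# (cooling, power delivery, lighting) share per kWh of IT load is PUE - 1.
def overhead_per_it_kwh(pue: float) -> float:
    """kWh consumed by ancillary equipment for each kWh of IT load."""
    return pue - 1.0

# 2020 global average for large data centers, per the Uptime Institute:
print(round(overhead_per_it_kwh(1.59), 2))
```

A facility at the Chinese target of 1.3 would waste roughly half as much per IT kilowatt-hour as the 2020 global average.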


The vast majority of the extra energy consumed by data centers goes to cooling systems. One study found that cooling can account for up to 40% of a data center's total energy consumption. In recent years, as chips have been upgraded and the power of individual devices has increased, the power density (power consumption per unit area) of data centers has kept rising, placing higher demands on heat dissipation. At the same time, better data center design can significantly reduce this wasted energy.


Energy efficiency ratios vary widely from one data center to another because of differences in cooling systems, structural design, and so on. The Uptime Institute reports that European countries have brought their ratios down to 1.46, while more than one in ten data centers in the Asia-Pacific region still run above 2.19.


Countries around the world are taking steps to push data centers toward their energy efficiency goals. The European Union requires large data centers to install waste heat recovery equipment; the U.S. government is funding research into more energy-efficient semiconductors; and the Chinese government has introduced measures requiring data centers to achieve an energy efficiency ratio of no more than 1.3 by 2025 and to raise their share of renewable energy year by year, reaching 100 percent by 2032.


For tech companies' electricity use, cutting consumption is harder than expanding supply


With the growth of cryptocurrencies and AI, major tech companies are expanding their data centers. According to the IEA, the U.S. had 2,700 data centers in 2022, consuming 4% of the nation's electricity, and it predicts that this share will reach 6% by 2026. As land on the east and west coasts of the U.S. grows scarce, data centers are gradually shifting to central regions such as Iowa and Ohio, but industry in these second-tier regions is less developed, and the power supply may not be able to meet demand.


Some technology companies are trying to bypass the grid entirely by buying electricity directly from small nuclear power plants, but both this way of purchasing power and the construction of new nuclear plants involve complex administrative approval processes. Microsoft is trying to use AI to help complete those applications, while Google uses AI to schedule computing tasks, improving grid operating efficiency and reducing corporate carbon emissions. As for when controlled nuclear fusion will enter practical use, that remains unknown.

 

Climate warming adds insult to injury

 

AI research and development requires a stable and robust power grid, but as extreme weather becomes more frequent, grids in many areas are growing more fragile. Warming will lead to more frequent extreme weather events, which not only cause surges in electricity demand and add to the grid's burden but also directly damage grid facilities. The IEA reports that, owing to droughts, scarce rainfall, and early snowmelt, the global capacity factor of hydropower fell below 40 percent in 2023, its lowest level in three decades.


Natural gas is often seen as a bridge in the transition to renewable energy, but it is unreliable in extreme winter weather. In 2021, a cold snap struck the U.S. state of Texas, causing widespread power outages that left some residents without electricity for more than 70 hours. A major cause of the disaster was frozen natural gas pipelines, which forced gas-fired power plants to shut down. The North American Electric Reliability Corporation (NERC) predicts that between 2024 and 2028, more than 300 million people in the U.S. and Canada will face a growing risk of power outages.

 

To ensure energy security while pursuing energy conservation and emission reduction, many countries are also treating nuclear power as a transitional measure. At COP28, the 28th United Nations Climate Change Conference held in December 2023, 22 countries signed a joint statement pledging to triple nuclear power generation capacity from its 2020 level by 2050. Meanwhile, with China, India, and other countries vigorously pushing forward the construction of nuclear power plants, the IEA predicts that global nuclear power generation will reach a record high by 2025.

 

The IEA report states: "In the face of changing climate patterns, it will become increasingly important to diversify energy sources, improve the grid's ability to dispatch power across regions, and adopt more resilient generation methods." Safeguarding grid infrastructure matters not only for the development of AI but also for people's daily lives.
