According to a report by the International Energy Agency (IEA), data centers are set to roughly double their electricity consumption by 2030, driven largely by the rise of artificial intelligence, a surge that poses real challenges for climate goals and energy security. Data centers currently account for only about 1.5% of global electricity use, but that consumption has grown by 12% over the past five years, with the US, China, and Europe accounting for the largest shares.
Big tech companies need ever more electricity for their data centers as AI grows. To meet that demand, Google signed a deal last year to buy electricity from small nuclear reactors, Amazon also plans to power its data centers with nuclear energy, and Microsoft is set to draw electricity from a revived reactor at Three Mile Island, the site of a 1979 nuclear accident. If current trends continue, data centers will consume around 3% of the world's electricity by 2030, with usage reaching 945 terawatt hours (TWh) by then.
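As a rough sanity check, the 945 TWh projection lines up with the roughly 3% share if global electricity consumption is assumed to be on the order of 30,000 TWh a year; that ballpark total is an assumption for illustration, not a figure from the report:

```python
# Back-of-envelope check of the projected data-center share of electricity.
GLOBAL_ELECTRICITY_TWH = 30_000   # assumed global annual consumption (ballpark)
DATA_CENTER_TWH_2030 = 945        # IEA projection cited above

share = DATA_CENTER_TWH_2030 / GLOBAL_ELECTRICITY_TWH
print(f"Projected data-center share of global electricity: {share:.1%}")
# -> roughly 3%, consistent with the share quoted in the report
```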
The IEA also projects that by 2030, data centers will use more electricity than Japan does today, driven by digital services and AI products. A 100-megawatt data center can consume as much electricity as 100,000 households, and the largest facilities now under construction could use as much as two million households. At the same time, the agency notes that AI offers significant potential to boost efficiency, cut energy costs, and reduce emissions.
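The household comparison can be checked with similarly rough arithmetic, assuming the facility runs at its full 100 MW rating around the clock and that an average household uses about 10,000 kWh of electricity a year; both assumptions are illustrative, not figures from the report:

```python
# Rough household-equivalence check for a 100 MW data center.
DATA_CENTER_MW = 100
HOURS_PER_YEAR = 8_760
HOUSEHOLD_KWH_PER_YEAR = 10_000   # assumed average annual household use

annual_kwh = DATA_CENTER_MW * 1_000 * HOURS_PER_YEAR   # MW -> kW, then kWh over a year
households = annual_kwh / HOUSEHOLD_KWH_PER_YEAR
print(f"Equivalent households: {households:,.0f}")
# -> roughly 88,000, the same order of magnitude as the 100,000 cited
```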
US President Donald Trump has launched the 'National Energy Dominance Council', in part to keep the US ahead of China in the AI race. Coal currently supplies about 30% of data centers' power, but natural gas and renewable sources are expected to take on a larger share. Carbon emissions from data centers are also rising and are projected to grow from 180 million tonnes of CO2 today to 300 million tonnes by 2035, still a small fraction of the roughly 41.6 billion tonnes of CO2 emitted globally each year.
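Set against the global total, the projected data-center emissions work out to well under 1%, assuming the 41.6 billion tonnes refers to annual worldwide CO2 output as the article implies:

```python
# Quick share calculation for projected data-center emissions.
DATA_CENTER_MT_2035 = 300       # projected data-center emissions, million tonnes CO2
GLOBAL_EMISSIONS_MT = 41_600    # 41.6 billion tonnes = 41,600 million tonnes

share = DATA_CENTER_MT_2035 / GLOBAL_EMISSIONS_MT
print(f"Data-center share of global CO2 emissions: {share:.2%}")
# -> about 0.7%, which is why the figure is described as a small share
```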