Even though AI has brought advantages to many sectors, powering these models demands enormous amounts of electricity. Training GPT-4 reportedly required 25,000 NVIDIA A100 GPUs and consumed about 62,000 megawatt-hours of energy, enough to power 1,000 US homes for five years.
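As a quick sanity check on those figures (a sketch only: the 62,000 MWh total is the article's reported number, and the roughly 10-11 MWh average annual US household consumption is an external EIA estimate used here for comparison):

```python
# Back-of-envelope check: does 62,000 MWh really equal
# 1,000 US homes for 5 years?
total_mwh = 62_000  # reported energy to train GPT-4 (article's figure)
homes = 1_000
years = 5

mwh_per_home_per_year = total_mwh / (homes * years)
print(f"{mwh_per_home_per_year:.1f} MWh per home per year")  # prints 12.4
```

That works out to about 12.4 MWh per household per year, in the same ballpark as the average US home's annual consumption, so the comparison is plausible.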
And that is the energy required for just one model. Other projects operate at an even larger scale: Meta's AI supercluster, for example, is planned to use 350,000 NVIDIA H100 GPUs, which will demand far more electricity than GPT-4's training did. Google and X are also pursuing hardware projects that require large amounts of energy.
Electricity for these projects is therefore a significant resource burden, demanding huge investment from the companies involved, and generating it can seriously harm the environment. Below are some insights into Microsoft's electricity needs as it works with OpenAI on various projects.
Microsoft's 2024 Sustainability Report, which covers 2020 to 2023, shows that the company's energy consumption rose from 11 TWh to 24 TWh in just four years. For perspective, Jordan, a country of 11 million people, uses about 20 TWh (terawatt-hours) of electricity per year.
(Chart: Microsoft's Power Use and CO₂ Emissions, 2020-2023)
Microsoft has partnered with OpenAI to train AI models such as ChatGPT on Microsoft Azure, and it is investing hundreds of millions of dollars in a supercomputer that links thousands of NVIDIA GPUs for ChatGPT's development. Building data centers on this scale makes electricity a crucial requirement, and generating that electricity produces carbon emissions that harm the environment. Microsoft's data centers accounted for 30% of its carbon emissions from 2020 to 2023. Both Microsoft and Google are aiming for carbon neutrality by 2030, yet Google's carbon emissions have risen by nearly 50% since 2019.
Read next:
• Report Shows Many Businesses Are Quick in Implementing AI But They Are Struggling to Scale It
• Mobile Use Soars: 98% Adults, 90% Kids; Children Dominate Tablets, Consoles