Tech giant Google is pushing back against expert claims made during this month’s climate change debate.
Reports claimed that the growing demand for AI has driven a 13% rise in data center emissions, worsening climate change. However, the company’s chief scientist, Jeff Dean, says those claims are misleading, adding that the reality is far from the findings presented.
Dean, who heads both Google Research and Google DeepMind, has been clear that the company is not straying from its goal of running entirely on clean, sustainable energy, a target it aims to reach by 2030.
He conceded that progress on this front has not been as great as he had envisioned, noting that it will not follow a linear path. He also pointed to ongoing efforts to improve matters whose results may not be visible now but should bear fruit over the next several years.
He placed special emphasis on the tech giant’s projects with clean energy providers, which likewise need time to mature but can produce significant jumps in the share of carbon-free energy.
The focus, he added, is on making these systems as efficient as possible, and AI is not to blame for the heavy use of data centers.
He noted that AI accounts for only a small fraction of overall emissions, and while its use may appear to be growing fast, it cannot be blamed for the overall rise in carbon emissions from these data centers.
To get a clearer picture, Dean says all data sources must be thoroughly examined to identify the trends actually driving the increase, though he did not spell out what those trends are.
Dean is a leading figure in the tech world and one of Google’s longest-serving employees. He joined the company in 1999 and is credited as a pioneer who helped turn the early search engine into one of the largest and most impactful systems in use today, indexing the web and serving billions of people every day.
The conversation also turned to Google’s Project Astra, a research effort unveiled at this year’s developer conference. It lets users point their smartphone cameras at almost anything and ask the AI model questions about what they see.
From identifying your location to finding lost belongings, it is designed to handle a wide range of tasks. Google has said it could come to the Gemini chatbot by the end of this year, and Dean added that testing would begin by then.
Dean concluded with a look at the future of AI, noting that models are still evolving and will need time to scale before they can take on more tasks, while also stressing the need for further algorithmic breakthroughs.
That means enabling AI models to imagine plausible outcomes and provide reasoning that actually makes sense, the kinds of changes needed to make models more capable and reliable than what we are seeing today.
Image: DIW-Aigen