GenAI: A Spotlight on Cloud Carbon Emissions
To embrace the revolutionary capabilities of Generative AI, we must also face the accompanying surge in computational demands on data centres.
According to ChatGPT, Generative AI is “a type of artificial intelligence that is capable of creating new content, such as text, images, or music, by learning patterns from existing data and then generating original output that aligns with those patterns.”
Forbes has predicted that GenAI will push data centre infrastructure and operating costs above $76 billion by 2028, more than double the current annual operating cost of Amazon's AWS. This dramatic increase is a consequence of GenAI's resource-intensive systems, which process trillions of “tokens” for millions of users.
Balancing Innovation and Sustainability
When probed on the carbon impact of GenAI, ChatGPT was honest enough to concede that “the extensive computational resources required for training these models can lead to significant carbon emissions, contributing to environmental concerns.”
So how can we responsibly satisfy the growing demand for GenAI services?
Innovation has always been a cycle of problem-solving and challenge-creating, with cloud computing as a prime example. On one hand, the cloud has been a hugely important environmental innovation: replacing resource-intensive processes and fuel-dependent activities with efficient digital alternatives, as well as optimising energy usage through shared resources, reducing the need for physical infrastructure. On the other, it presents a new challenge: managing its own enormous carbon footprint.
Now, the growing subset of cloud emissions attributable to GenAI only increases the need for innovation in the field of cloud sustainability.
GenAI and Cloud Emissions
As with the cloud itself, the rise of GenAI is not merely a question of resources in terms of hardware and computing power, but fundamentally one of energy consumption and carbon emissions. As the power demand of data centres increases (estimated to reach 4,250 megawatts by 2028, a 212-fold increase), so does their carbon footprint.
So, if the challenge of cloud emissions (already projected to hit 10% of global emissions in the next decade) did not seem urgent enough, a peek at the promises of GenAI will surely put things in perspective. Almost all of the computation behind LLMs (Large Language Models) runs in the cloud, yet accurate cloud emission measurement is still lacking across the industry.
The Need for Carbon-Conscious AI Development
As the industry continues to innovate, the need for carbon-conscious AI development is as urgent as it is for the rest of the cloud, if not more so. Where exponential scaling is likely, even small gains in efficiency can make a huge difference.
That's why, as with all areas of cloud sustainability, the first step forward is well-researched cloud emissions data at transaction-level detail.
By leveraging comprehensive and detailed data on cloud emissions, we can develop more carbon-efficient practices and apply these to the development and deployment of GenAI services. The quest for cloud efficiency is already understood to be more than a cost reduction exercise and is increasingly framed as a crucial effort towards minimising the environmental impact of the sector.
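To make the idea of transaction-level emissions data concrete, here is a minimal sketch of the widely used estimation approach: energy consumed per request, multiplied by the data centre's Power Usage Effectiveness (PUE) and the carbon intensity of the local electricity grid. The function name and all numeric figures below are illustrative assumptions, not measured values from any provider.

```python
# Minimal sketch of per-request operational emissions estimation.
# Formula: energy (kWh) x PUE x grid carbon intensity (gCO2e/kWh).
# All figures are illustrative placeholders, not measured values.

def request_emissions_gco2e(
    energy_per_request_kwh: float,        # compute energy for one request
    pue: float,                           # data-centre Power Usage Effectiveness
    grid_intensity_gco2e_per_kwh: float,  # carbon intensity of the local grid
) -> float:
    """Operational emissions for a single request, in grams of CO2-equivalent."""
    return energy_per_request_kwh * pue * grid_intensity_gco2e_per_kwh

# Hypothetical example: 0.003 kWh per request, PUE of 1.2, grid at 400 gCO2e/kWh
per_request = request_emissions_gco2e(0.003, 1.2, 400)
print(f"{per_request:.2f} gCO2e per request")

# Aggregated over monthly traffic, small per-request differences compound:
monthly_tonnes = per_request * 50_000_000 / 1e6  # 50M requests, grams -> tonnes
print(f"{monthly_tonnes:.1f} tonnes CO2e per month")
```

Even this simple model shows why transaction-level detail matters: the same workload emits very differently depending on facility efficiency and grid mix, so aggregate billing data alone cannot reveal where the savings are.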
The next chapter of AI's evolution must not be written without considering climate implications. With the right tools, we can better understand and control the carbon footprint of this historic innovation.
In the end, our planet's sustainability is the ultimate benchmark of success.