Business Manchester

Balancing AI Ambitions and Green Goals in West London


Microsoft’s endeavour to expand its AI capabilities has found a new focal point: the Park Royal Data Centre in west London. This project underscores the company’s dual commitment to innovation and sustainability.

However, as the tech giant grapples with AI's rising energy demands, its ambitious goal of becoming carbon negative by 2030 grows increasingly complex. The construction of this datacentre, despite its reliance on renewable energy, highlights the broader challenge of aligning technological advancement with environmental responsibility.

Construction of Microsoft’s Park Royal Data Centre

Microsoft’s commitment to AI expansion is evident in the construction of the new Park Royal Data Centre in west London, which aims to be powered entirely by renewable energy. The challenge lies in balancing AI’s energy demands with the company’s green ambitions, particularly its goal of achieving carbon negativity by 2030.

Even with renewable power, the construction and operation of these datacentres add significantly to CO2 emissions. Specifically, scope 3 emissions, such as the CO2 generated by building materials and by users’ electricity consumption, have risen by more than 30% since 2020. This uptick puts Microsoft above its overall emissions target by a similar margin.

AI’s Contradictory Role in Climate Goals

Bill Gates, co-founder of Microsoft, recently spoke about AI’s potential in combating climate change. He emphasised that the tech industry is eager to adopt clean electricity sources, even at a higher cost, so that it can claim to be using green energy. However, this stance conflicts with the immediate energy-consuming nature of AI.

Brad Smith, Microsoft’s president, admitted that their AI strategy has complicated their green goals, likening it to a ‘moonshot’ that’s now even more distant. The company plans significant investments, approximately £2.5bn over the next three years, to expand its AI datacentre infrastructure in the UK and other countries like the US, Japan, and Germany. These datacentres are essential for training and operating AI models like ChatGPT, which consume substantial amounts of electricity.

This electricity is needed not just for running the models but also for cooling the hardware. Additionally, the production and transportation of related equipment add to the carbon emissions. Alex de Vries, founder of Digiconomist, highlights that AI is a technology that inherently increases energy consumption.

Global Energy Impact of Datacentres

The International Energy Agency predicts that the total electricity consumption of datacentres could double by 2026, reaching 1,000 TWh. This would equate to Japan’s current energy demand. According to SemiAnalysis, AI will drive datacentres to use 4.5% of global energy generation by 2030.
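The scale of these projections can be sanity-checked with rough arithmetic. A minimal sketch, using the article’s figures where given; the baseline and global-generation totals below are assumptions for illustration, not figures from the article:

```python
# Rough arithmetic behind the projections cited above.
# Assumed baseline: ~500 TWh of datacentre consumption today,
# doubling by 2026 per the IEA projection.
baseline_twh = 500
projected_twh_2026 = baseline_twh * 2   # 1,000 TWh
japan_demand_twh = 1000                 # approximate annual demand (article's comparison)

print(projected_twh_2026)                       # 1000
print(projected_twh_2026 == japan_demand_twh)   # True

# SemiAnalysis estimate: datacentres at 4.5% of global generation by 2030.
# Assuming ~30,000 TWh of global generation by then (an assumption),
# that share would come to:
global_generation_twh_2030 = 30_000
datacentre_share_twh = global_generation_twh_2030 * 0.045
print(round(datacentre_share_twh))              # 1350
```

On those assumptions, the 4.5% share would exceed even the doubled 2026 figure, underscoring how quickly demand is expected to grow.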

Given concerns about AI’s broader impacts, including job displacement and potential risks to humanity, environmental considerations are increasingly coming to the fore. The International Monetary Fund recently suggested that governments should introduce carbon taxes to account for AI’s environmental footprint. This could be implemented through general carbon levies or specific taxes on CO2 emissions from AI equipment.

Major tech companies like Meta, Google, Amazon, and Microsoft are all actively seeking renewable energy resources to meet their climate targets. For instance, Amazon has invested in offshore wind farms in Scotland, and Microsoft has committed to $10bn in renewable energy projects. Google aims to run its datacentres entirely on carbon-free energy by 2030.

Microsoft’s Renewable Energy Strategy

A Microsoft spokesperson reaffirmed the company’s commitment to its climate goals, despite the challenges posed by AI’s energy demands. Bill Gates also believes that AI can directly contribute to fighting climate change, arguing that increased electricity demand will be balanced by new investments in green energy.

However, a recent UK government-backed report pointed out that the carbon intensity of energy sources is crucial in determining AI-related emissions. The report noted that a significant portion of AI training globally still relies on high-carbon sources such as coal and natural gas. Moreover, cooling the servers poses its own challenge; one study estimated that AI could account for up to 6.6 billion cubic metres of water use by 2027.

Sustainable Computing and Renewable Energy

Alex de Vries argues that the quest for sustainable computing power strains the supply of renewable energy, potentially causing fossil fuels to fill the gap in other sectors of the global economy. As overall consumption rises, there may simply not be enough renewable capacity to meet the increased demand.

NexGen Cloud, a UK firm offering sustainable cloud computing, advocates for situating datacentres near hydro or geothermal power sources instead of urban areas. The company’s co-founder, Youlian Tzanev, suggests that building datacentres around economic hubs complicates achieving carbon goals.

Amazon, the largest cloud computing provider, aims to achieve net zero carbon emissions by 2040 and match its global electricity use with 100% renewable energy by 2025. Similarly, Google and Meta are pursuing net zero targets by 2030. OpenAI, the developer of ChatGPT, relies on Microsoft datacentres to train and operate its AI products.

Energy Consumption in AI Training and Inference

There are two primary ways that large language models consume energy: training and inference. Training requires huge amounts of data and energy to build a model that can understand language and generate responses. Inference, or running the trained model, consumes even more energy in aggregate as users interact with the AI.

The energy cost of training AI is substantial, making it difficult for smaller companies and governments to enter the field without significant investment. According to Brent Thill, an analyst at Jefferies, 90% of AI’s energy cost is in the inference phase, where models respond to user queries.

This high energy usage is facilitated by a growing digital infrastructure of datacentres filled with specialised servers. For example, a single training server may have a basic CPU paired with dozens of GPUs or TPUs designed to handle AI workloads. When a user interacts with a chatbot, a powerful GPU consumes about a quarter of the power required to boil a kettle.
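The kettle comparison can be made concrete with a back-of-the-envelope calculation. The wattages and the per-reply GPU time below are assumptions chosen to match the article’s "quarter of a kettle" ratio, not measured figures:

```python
# Illustrative GPU-vs-kettle arithmetic (assumed wattages; the article
# gives only the "quarter of the power to boil a kettle" ratio).
kettle_watts = 3000              # typical UK kettle element (assumption)
gpu_watts = kettle_watts / 4     # ~750 W, in line with high-end AI accelerators

# Energy for a single chatbot reply taking ~5 seconds of GPU time (assumption):
seconds_per_reply = 5
joules = gpu_watts * seconds_per_reply
watt_hours = joules / 3600
print(round(watt_hours, 2))      # 1.04 Wh per reply
```

Roughly a watt-hour per reply sounds small, but multiplied across millions of daily queries it helps explain why, as Brent Thill notes, inference dominates AI’s energy cost.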

Potential Energy Savings with AI

Despite the high energy costs, some argue that AI can lead to overall energy savings. A study in Nature’s Scientific Reports suggested that AI systems could produce written and illustrated content with significantly lower carbon emissions compared to humans.

The study found that AI emits between 130 and 1,500 times less carbon dioxide per page of text and up to 2,900 times less per image. However, this raises questions about the future roles of human writers and illustrators, who may need to shift to other fields, potentially contributing to new ‘green jobs.’


Microsoft’s journey to balance advanced AI innovation and its carbon-negative goals by 2030 remains a formidable challenge. The Park Royal Data Centre exemplifies these difficulties, reflecting the broader global struggle of aligning technological progress with environmental sustainability.

While strides are being made in renewable energy investments and innovative strategies, the inherent energy-intensive nature of AI technologies necessitates ongoing efforts. Continued collaboration among tech giants, government bodies, and environmental organisations is essential to achieving these green ambitions. Therefore, sustainable development must remain a priority as AI continues to evolve.
