
AI is racing to fix the future. It may be breaking the planet in the process.

  • Writer: Richard Tyler
  • 6 days ago
  • 4 min read

Artificial intelligence has been cast as a solution to some of the world’s most complex problems, from accelerating scientific discovery to boosting productivity across entire industries. But beneath the optimism lies a contradiction that is becoming harder to ignore. The technology designed to help solve global crises is quietly intensifying one of the biggest: the strain on the planet itself.


The rapid rise of generative AI has triggered a surge in demand for computing power on a scale that few outside the industry fully grasp. Training a single large model, often containing billions of parameters, requires vast amounts of electricity. That energy demand does not end once the model is built. It continues as systems are deployed, used, updated and replaced.


What appears frictionless to users is underpinned by an infrastructure that is anything but.

At the centre of this system are data centres: vast, temperature-controlled facilities packed with servers, storage systems and networking equipment. These sites have existed for decades, but the demands of generative AI have accelerated their expansion dramatically.


The difference now is intensity. AI workloads require significantly higher power density than traditional computing, pushing energy consumption to levels that are increasingly difficult to sustain. In North America alone, the electricity demand of data centres more than doubled in the space of a year. Globally, their consumption has reached a level comparable to that of entire nations and is expected to more than double again by 2026.


Much of that energy still comes from fossil fuels. The pace of construction has outstripped the development of clean energy infrastructure, meaning new facilities are often powered by the fastest available option rather than the most sustainable one. The result is a growing dependence on carbon-intensive energy sources at the very moment the world is trying to reduce them.


Even the process of training a single model can carry a significant environmental cost. Estimates suggest that training large systems can consume enough electricity to power hundreds of homes for a year, generating substantial carbon emissions in the process. And as companies race to release newer, larger models, that energy is repeatedly spent, often rendering previous versions obsolete within months.


But the environmental impact of AI is not confined to electricity. Water, an increasingly scarce resource in many parts of the world, plays a crucial role in keeping these systems running. Data centres rely on large volumes of water to cool their hardware, preventing overheating as machines process enormous amounts of information.

For every unit of energy consumed, significant quantities of water are required. At scale, this places pressure on local supplies and ecosystems, particularly in regions already facing environmental stress. The “cloud”, in reality, has a very physical footprint.


There are further hidden costs. The hardware powering AI, particularly high-performance processors like GPUs, requires complex manufacturing processes involving rare materials, energy-intensive production, and global supply chains. Mining and processing these materials can involve environmentally damaging practices, while the rapid turnover of hardware adds to the overall footprint.


Demand is rising quickly. Millions of these processors are now being shipped to data centres each year, with growth expected to continue as AI adoption spreads. And unlike previous waves of computing, the energy demand does not stabilise once systems are built.


Generative AI introduces a new phase of continuous consumption. Every query, every generated image, every automated task draws power. While individual interactions may seem trivial, their cumulative effect is significant and growing. Early estimates suggest a single AI query can consume several times more electricity than a standard web search.


As these tools become embedded across everyday applications, from search engines to office software, the long-term energy burden shifts from development to constant use. At the same time, the lifecycle of AI models is shrinking. New iterations are released at increasing speed, each larger and more complex than the last. The resources used to train previous models are effectively discarded, replaced by systems that demand even more.

The result is a cycle of escalating consumption. This presents a broader question about how progress is being measured. The benefits of AI are often framed in terms of efficiency, speed and capability. But the environmental cost of achieving those gains is less visible and rarely factored into the equation.


The issue is not simply the electricity consumed when a system is used. It is the cumulative impact of the entire ecosystem: the energy required to build, run and replace infrastructure; the water needed to sustain it; and the material resources embedded within it.


Taken together, these pressures are reshaping the environmental footprint of the digital economy.

There is growing recognition that this trajectory is unsustainable. But the pace of development has left little time to fully understand, let alone mitigate, the consequences.


Artificial intelligence is being positioned as a tool to address climate change, optimise resource use and improve global systems. Yet its own expansion is placing additional strain on those same systems. It is, in effect, solving everything except the problem it is helping to create.
