Published: July 21, 2025 · Length: 10 min

Given its significant environmental impact, the AI and data centre industry is actively looking for strategies to shrink that footprint - with a focus on energy efficiency, water conservation and resource optimization. These solutions address the challenges described in the previous section.
One example is rethinking traditional approaches to cooling. A Google study suggests that excessive cooling of components, particularly hard disk drives (HDDs), can be counterproductive: excessively low temperatures can cause mechanical and electrical problems, which paradoxically increases the failure rate.
These findings support the trend of raising operating temperatures in data centres. Google, for example, runs some of its data centres at temperatures of up to 27 °C, which improves energy efficiency without compromising equipment reliability.
Taking advantage of natural climatic conditions offers significant potential for reducing energy consumption:
In addition to efficient cooling, the focus is also on the smart use of waste heat from data centres:
Wider corporate commitments and investments also play an important role:
Water positivity targets: Technology companies such as Microsoft and Google have committed to becoming "water positive" by 2030 - that is, returning more water than they use.
Integration of renewable energy sources: Data centres are increasingly using renewable energy sources - solar and wind - to reduce the carbon footprint associated with electricity generation.
In addition to hardware and infrastructure, optimizing the AI models themselves plays an important role and can significantly reduce resource consumption:
Quantization: reducing the numerical precision of model computations (e.g. from 32-bit to 8-bit), which lowers computational demands, energy consumption and carbon footprint - without significantly affecting the quality of the results.
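As an illustration, here is a minimal sketch of post-training dynamic quantization in PyTorch; the toy model and layer choices are assumptions for the example, not taken from the article.

```python
# A minimal sketch of post-training dynamic quantization with PyTorch.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(512, 512),
    nn.ReLU(),
    nn.Linear(512, 128),
)

# Convert eligible layers (here nn.Linear) from 32-bit floats to 8-bit integers.
quantized_model = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 512)
print(quantized_model(x).shape)  # same interface, lower-precision weights
```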

Knowledge distillation: training smaller, efficient models (students) to imitate the behavior of larger models (teachers). The result is models with high accuracy and significantly lower resource requirements.
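A minimal sketch of a typical distillation loss is shown below: the student is trained against a blend of the teacher's softened outputs and the true labels. The temperature and weighting values are illustrative assumptions.

```python
# A minimal sketch of a knowledge-distillation loss in PyTorch.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft targets: KL divergence between softened teacher and student outputs.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the true labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
print(distillation_loss(student_logits, teacher_logits, labels).item())
```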

Sparse activation (mixture of experts): only selected parts of the model are activated for a given task, which reduces the number of computations and energy consumption.
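The sketch below shows one common way this is implemented: a small gating network routes each input to only the top-k experts, so only a fraction of the parameters is used per input. Layer sizes and the number of experts are illustrative assumptions.

```python
# A minimal sketch of a sparsely activated (mixture-of-experts style) layer.
import torch
import torch.nn as nn

class SparseMoE(nn.Module):
    def __init__(self, dim=64, num_experts=8, top_k=2):
        super().__init__()
        self.experts = nn.ModuleList(nn.Linear(dim, dim) for _ in range(num_experts))
        self.gate = nn.Linear(dim, num_experts)
        self.top_k = top_k

    def forward(self, x):                       # x: (batch, dim)
        scores = self.gate(x)                   # (batch, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)
        weights = weights.softmax(dim=-1)
        out = torch.zeros_like(x)
        # Only the top-k experts are evaluated for each input.
        for k in range(self.top_k):
            for e in range(len(self.experts)):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * self.experts[e](x[mask])
        return out

layer = SparseMoE()
print(layer(torch.randn(4, 64)).shape)
```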

Prompt caching: frequently repeated parts of a prompt are stored and reused, significantly reducing latency and computation costs.
For example, OpenAI implemented prompt caching in its APIs, which led to 50% lower costs and faster prompt processing.
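A hedged sketch of how this is typically exploited with the OpenAI Python SDK: the long, stable part of the prompt is kept identical across requests so the provider can cache its prefix. The model name is illustrative and the usage field names may differ between SDK versions.

```python
# A hedged sketch of working with OpenAI's automatic prompt caching:
# keep the long, stable part of the prompt identical across requests so
# its prefix can be served from the cache.
from openai import OpenAI

client = OpenAI()

# Stands in for a long (1024+ token) instruction block; caching applies to
# the shared prefix of sufficiently long prompts.
STABLE_SYSTEM_PROMPT = "You are a support assistant. Follow the company policies below. ..."

def ask(question: str) -> str:
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": STABLE_SYSTEM_PROMPT},  # cache-friendly prefix
            {"role": "user", "content": question},                # variable suffix
        ],
    )
    # On a cache hit, part of the input is billed and processed at a discount;
    # field names reflect recent SDK versions and may differ in yours.
    details = resp.usage.prompt_tokens_details
    print(getattr(details, "cached_tokens", 0), "prompt tokens served from cache")
    return resp.choices[0].message.content

print(ask("How do I reset my password?"))
```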

Pruning: removing less important neurons or connections from the model, resulting in a smaller model and lower computational requirements.
Studies show that careful pruning can reduce model size - and thus energy consumption - by up to 90% with minimal loss of performance.
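As a sketch, PyTorch's built-in pruning utilities can zero out low-magnitude weights; amount=0.9 mirrors the 90% figure above, although production pipelines usually prune gradually and fine-tune afterwards.

```python
# A minimal sketch of unstructured magnitude pruning with PyTorch.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

layer = nn.Linear(256, 256)

# Zero out the 90% of weights with the smallest magnitude (L1 criterion).
prune.l1_unstructured(layer, name="weight", amount=0.9)
prune.remove(layer, "weight")  # bake the pruning mask into the weight tensor

sparsity = (layer.weight == 0).float().mean().item()
print(f"weight sparsity: {sparsity:.0%}")
```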

Speculative decoding: speeds up text generation by having a smaller, faster "draft" model propose several tokens ahead of time, which the larger "verification" model then checks and corrects where needed.
Thanks to this parallel token processing, inference is significantly accelerated without retraining the model.
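A minimal sketch of this draft-and-verify scheme using Hugging Face transformers' assisted generation; the model pair is illustrative - any target and draft model that share a tokenizer can be used.

```python
# A minimal sketch of speculative (assisted) decoding with Hugging Face
# transformers: a small draft model proposes tokens that the larger target
# model verifies in parallel.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2-large")
target = AutoModelForCausalLM.from_pretrained("gpt2-large")  # larger "verification" model
draft = AutoModelForCausalLM.from_pretrained("gpt2")         # smaller "draft" model

inputs = tokenizer("Data centres can reduce their energy use by", return_tensors="pt")
outputs = target.generate(
    **inputs,
    assistant_model=draft,  # enables assisted / speculative decoding
    max_new_tokens=40,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```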

vLLM: an open-source library that optimizes inference of large language models using the PagedAttention algorithm, which manages memory efficiently by partitioning attention keys and values into smaller blocks.
It achieves up to 24x higher throughput than traditional libraries without any change to the model architecture.
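A minimal usage sketch of vLLM's offline inference API; the model name and sampling settings are illustrative, not prescribed by the article.

```python
# A minimal usage sketch of vLLM's offline inference API.
from vllm import LLM, SamplingParams

llm = LLM(model="facebook/opt-125m")  # PagedAttention is applied under the hood
params = SamplingParams(temperature=0.8, max_tokens=64)

outputs = llm.generate(["How can data centres cut their energy use?"], params)
for out in outputs:
    print(out.outputs[0].text)
```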

While systemic change and technological innovation are key to reducing the environmental impact of AI, we as individuals and as a society also have an important role to play. Each of us can contribute to a more sustainable digital ecosystem through our approach:
These comprehensive approaches, both at an industry-wide level and in our individual actions, show that the industry is taking the environmental impacts of AI seriously and moving towards more responsible and sustainable technologies and approaches. Sustainability in AI is not just a technical task, it is a shared responsibility - and every thoughtful step counts.
What other innovative approaches would you see as key to reducing the ecological footprint of AI in the future?