
Data centres: the heart of the digital world and its environmental footprint

Length: 12 min

Published: June 23, 2025

The resource requirements we described in the previous parts are directly related to how data centres are designed and operated.

Data centres are dedicated complexes that host the servers and infrastructure that power our digital world - including AI-powered applications. Their operation requires significant power and advanced cooling systems to ensure optimal performance and protect key hardware from overheating.

How data centres work

Data centres are essential infrastructure for the operation of large language models (LLMs) such as GPT or LLaMA. These models require significant computing resources, which places high demands on the energy efficiency of data centres.

The standard metric for assessing data centre energy efficiency is PUE (Power Usage Effectiveness), which shows how much of the total power consumed is actually used by IT equipment (servers, storage, network devices, GPUs, etc.).

PUE is calculated as the ratio of the total power consumption of the data centre to the power consumption of the IT equipment:

PUE = Total data centre energy consumption / IT equipment energy consumption

Power Usage Effectiveness (PUE)

A value of 1.0 means perfect efficiency: all energy goes directly to computing and nothing is wasted on cooling, lighting, etc. In practice, however, this value is very difficult to achieve. Google's data centres, for example, achieved an average PUE of 1.1 in 2022, with the best values reaching 1.07. The global average, however, was around 1.57, indicating significant room for improvement.
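
As a minimal illustration, the following Python sketch computes PUE from annual energy figures. The kWh numbers are hypothetical, chosen only so the results roughly match the Google and global averages mentioned above:

    def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
        """Power Usage Effectiveness: total facility energy divided by IT energy."""
        return total_facility_kwh / it_equipment_kwh

    # Hypothetical annual figures (kWh) for two facilities
    efficient_dc = pue(total_facility_kwh=110_000_000, it_equipment_kwh=100_000_000)
    typical_dc = pue(total_facility_kwh=157_000_000, it_equipment_kwh=100_000_000)

    print(f"Efficient facility PUE: {efficient_dc:.2f}")  # ~1.10, comparable to Google's 2022 average
    print(f"Typical facility PUE:   {typical_dc:.2f}")    # ~1.57, roughly the global average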

Key components

  • Connectivity - Provides connectivity between devices inside the data centre and to the outside world. This includes routers, switches, firewalls and application delivery controllers. Fast and reliable data transfer is essential for effective LLM training and inference.

  • Storage - Used for data storage and backup. This includes hard drives, SSDs, tape drives, and data backup and recovery systems. Fast NVMe SSDs are often used to minimize latency.

  • Computing equipment - Provides the computing power and memory needed to run applications and services. This consists of servers and other high-performance machines, including GPU servers, capable of handling the heavy computation involved in LLM training and inference.

Key components of data centres

Types and categories of data centres

  • Enterprise data centres - Owned and operated by individual organisations for their internal needs. They can be located on the organisation's own premises or at external sites.

  • Colocation data centres - Offer space and infrastructure to organizations that host their own hardware there. The provider supplies power, cooling, physical security and connectivity.

  • Cloud data centres - Operated by cloud service providers such as AWS, Microsoft Azure or Google Cloud. They allow customers to use on-demand compute and storage capacity without the need to manage physical hardware.

  • Edge data centres - Smaller facilities located closer to end users or data-generating devices. They minimize latency and increase data processing speed, which is key for applications such as IoT or autonomous vehicles.

  • Hyperscale data centres - High-capacity facilities designed for large-scale operation, often owned by technology giants. They contain thousands of servers and provide massive computing and storage capacity for services such as cloud computing and big data analytics.

  • On-premise data centres - They provide full control over the infrastructure, which is advantageous for organizations with strict data security and privacy requirements.

Uptime Institute data center classification

The Uptime Institute has created a four-tier system for classifying data centers based on their ability to provide service availability:

  • Tier I - Basic Capacity - Availability: 99.671% (max. 28.8 hours of outage per year). Single path for power and cooling, no redundant components. Suitable for small businesses with low availability requirements.

  • Tier II - Redundant Capacity Components - Availability: 99.741% (max. 22 hours of outage per year). Single path for power and cooling with some redundant components. Suitable for small to medium-sized businesses looking for reliability.

  • Tier III - Simultaneously maintainable - Availability: 99.982% (max. 1.6 hours of outage per year). Multiple paths for power and cooling, allowing maintenance without interrupting operation. Suitable for organisations requiring high service availability.

  • Tier IV - Fault Tolerant - Availability: 99.995% (max. 26.3 minutes of outage per year). Fully redundant systems (2N), able to withstand outages without service interruption. Suitable for critical applications where downtime is unacceptable (e.g. financial institutions, hospitals).
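
The maximum outage figures above follow directly from the availability percentages. A quick Python sketch of the conversion, assuming a 365-day (8,760-hour) year:

    HOURS_PER_YEAR = 365 * 24  # 8,760 hours

    tiers = {
        "Tier I": 99.671,
        "Tier II": 99.741,
        "Tier III": 99.982,
        "Tier IV": 99.995,
    }

    for tier, availability_pct in tiers.items():
        downtime_hours = HOURS_PER_YEAR * (1 - availability_pct / 100)
        print(f"{tier}: up to {downtime_hours:.1f} h of outage per year "
              f"({downtime_hours * 60:.0f} minutes)")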

Infrastructure requirements for LLMs

  • Network throughput - Fast and reliable data transfer is crucial for LLM training and inference, especially when working with large datasets.

  • Security and compliance - Ensuring data security and compliance with regulatory requirements such as GDPR is essential when processing sensitive information.

  • Performance and cooling - LLM models require high computing power, which generates significant heat. Efficient cooling is essential to maintain optimal operation and equipment lifetime.

Cooling systems

Data centres use a variety of cooling methods, each with specific impacts on energy and water consumption:

Air cooling

Air cooling uses air (often via large HVAC systems) to remove heat from servers. This method can be energy intensive because of the electricity needed to run the fans and cooling units.

Air cooling system

Water cooling

Water cooling circulates coolant directly past the components or through "cold plates" that absorb and carry away heat. This method is increasingly being adopted due to its higher efficiency, although it can mean higher water consumption depending on the specific system.

Water cooling system

Evaporative cooling

Evaporative cooling uses the evaporation of water to cool the air or coolant. It is energy efficient but can increase water consumption, which is especially important to consider in areas with limited water resources.

Evaporative cooling system

Hybrid systems

Hybrid systems combine air, water and evaporative methods to optimise performance and energy efficiency while balancing water and electricity consumption.

The ecological impact of AI and data centres

The operational specifics of data centres directly translate into significant environmental impacts. As artificial intelligence technologies become more and more involved in our daily lives, it is important to understand the cumulative environmental consequences of data center operations - especially due to their massive electricity and water consumption.

Energy consumption

Total consumption - Data centres are huge consumers of energy. In 2023 they consumed approximately 4.4% of total electricity in the US, with an expected increase to 6.7-12% by 2028, mainly due to growing demand driven by AI applications.

Training AI models - Training large language models (LLMs) is extremely energy intensive. For example, training GPT-3 required approximately 1,287 megawatt-hours (MWh) of electricity - equivalent to the annual consumption of about 120 average American households.
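
The household comparison can be reproduced with simple arithmetic. The training figure is the one quoted above; the average household consumption of roughly 10.7 MWh per year is an assumed value (commonly cited for US households) used here only for illustration:

    GPT3_TRAINING_MWH = 1_287          # reported training energy for GPT-3
    US_HOUSEHOLD_MWH_PER_YEAR = 10.7   # assumed average annual US household consumption

    equivalent_households = GPT3_TRAINING_MWH / US_HOUSEHOLD_MWH_PER_YEAR
    print(f"~{equivalent_households:.0f} household-years of electricity")  # ~120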

Inference - In addition to training, the actual generation of answers (inference) is also energy intensive, and its footprint grows with the number of users and queries the system serves.

Water consumption

Cooling requirements - In data centres, water is used primarily for cooling, which is essential given the amount of heat that servers generate. In 2021, Google's data centres consumed approximately 16.3 billion litres of water, on average about 1.7 million litres per day per centre.
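
A rough back-of-the-envelope check of these figures, using only the numbers quoted above, shows they are consistent with a fleet of a few dozen facilities:

    TOTAL_LITRES_2021 = 16.3e9   # reported annual water use across Google's data centres
    PER_CENTRE_PER_DAY = 1.7e6   # quoted average litres per day per centre

    total_per_day = TOTAL_LITRES_2021 / 365
    implied_centres = total_per_day / PER_CENTRE_PER_DAY

    print(f"Fleet-wide use: ~{total_per_day / 1e6:.0f} million litres per day")  # ~45 million
    print(f"Implied number of centres: ~{implied_centres:.0f}")                  # ~26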

Impact of AI workloads - The deployment of AI systems has been shown to increase water consumption. For example, Microsoft's water consumption rose by 34% between 2021 and 2022, partly due to the cooling needs of applications like ChatGPT.

Water shortage problems - Worryingly, approximately two thirds of new data centres built from 2022 onwards are located in areas already facing high water scarcity, further exacerbating local water shortages.

Linking energy and water consumption

The choice of cooling method in data centres directly affects the ratio between energy and water consumption, and it is important to consider both direct and indirect consumption of these resources.

  • Air cooling often leads to higher electricity consumption but has lower direct water consumption.
  • By contrast, liquid cooling is more energy efficient but consumes a significant amount of water directly in the data centre.
  • This trade-off between energy and water consumption means that reducing one usually means increasing the other.

Visualization and water intensity of data centers

The following figure gives a better idea of how data centres work. Their high energy intensity, coupled with their connection to power plants, partly shifts the water load away from the centres themselves. However, the direct water consumption, especially in cooling systems, cannot be ignored.

Cooling system

Summary

It is therefore important to strike a balance between the internal and external water requirements of data centres, taking into account the availability of natural resources in a particular region. As we mentioned in the previous article, excessive water consumption can in some cases be mitigated by shifting the load to less vulnerable areas or by using technological solutions - for example, the use of excess heat or closed cooling systems.

These approaches and innovations, which respond to growing environmental demands, are the focus of the entire AI and data centre sector today. In the final article of this trilogy, we take a closer look at specific sustainability strategies and at how wider society can contribute to reducing our environmental footprint.

