Understanding Data Centre Cooling Systems: A Beginner’s Guide


Data centres are the backbone of many businesses.

They host everything from websites and apps to cloud services and critical business operations.

These centres house large amounts of equipment that require significant cooling to prevent overheating and maintain optimal performance. Without proper cooling systems, servers and other hardware could fail, causing costly downtime and even permanent damage.

Let’s explore the importance of cooling systems in data centres and break down the different methods used to manage temperature and humidity.

Why Cooling is Crucial for Data Centres

A data centre is essentially a large facility filled with servers, storage devices, and networking equipment. All of these machines generate heat when in use, and excessive heat can degrade their performance or even cause outright failures. That is why it is important to control the temperature and humidity wherever servers are kept.

Cooling systems in data centres are designed to remove excess heat and maintain a stable environment where equipment can function properly. If cooling is inadequate, servers can overheat, leading to slower performance, crashes, or even data loss.

So, efficient cooling not only reduces the risk of equipment failure but also improves the longevity and reliability of the systems. And controlling humidity is just as important as managing heat.

Too much moisture in the air can cause electrical components to short-circuit, while too little humidity can lead to static electricity build-up, damaging sensitive equipment. Data centres, therefore, must have precise control over both temperature and humidity.

The Ideal Temperature for Data Centres

The ideal temperature for most data centres is between 70 and 75°F (21 to 24°C). This temperature range is considered optimal for both energy efficiency and equipment performance. However, maintaining an optimum temperature in a data centre is not as easy as it sounds.

Some studies suggest that keeping the temperature below 70°F (21°C) may be a waste of energy, increasing cooling costs without providing significant benefits to the equipment.
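
To make the numbers concrete, here is a minimal Python sketch that converts readings between the two scales and flags anything outside the band above. The thresholds simply restate the article's 70 to 75°F range; the sample sensor readings are invented for illustration.

```python
# A minimal sketch: convert readings and flag anything outside the
# 70-75°F (about 21-24°C) band discussed above. The sample readings
# below are invented for illustration.

RECOMMENDED_F = (70.0, 75.0)  # recommended range in Fahrenheit

def f_to_c(temp_f: float) -> float:
    """Convert Fahrenheit to Celsius."""
    return (temp_f - 32.0) * 5.0 / 9.0

def check_reading(temp_f: float) -> str:
    low, high = RECOMMENDED_F
    if temp_f < low:
        return "below range: likely overcooling and wasting energy"
    if temp_f > high:
        return "above range: equipment at risk of overheating"
    return "within the recommended range"

for reading in (68.0, 72.5, 78.0):  # hypothetical sensor readings
    print(f"{reading}°F ({f_to_c(reading):.1f}°C): {check_reading(reading)}")
```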

 


That said, factors like the layout of the space, airflow, and even the amount of heat generated by the servers can influence the required cooling.

In some cases, data centres may need to keep the temperature slightly lower to compensate for heat from servers and equipment. Additionally, managing how air flows through the room is a key factor in optimizing cooling and efficiency.

How Cooling Systems Work

Several different methods are used to cool data centres, each with its own advantages. The size of the data centre, its location, and the particular requirements of the equipment all influence which system is selected.

Air Conditioning (AC)

Air conditioning (AC) is the most basic cooling technique used in data centres. It works by chilling the air and controlling the facility's humidity levels. Computer Room Air Conditioning (CRAC) and Computer Room Air Handler (CRAH) units are two popular types of air conditioning equipment found in data centres.

Like conventional air conditioners, CRAC units cool the air using refrigerants. CRAH units, by contrast, cool the air using chilled water. Which system is the better fit depends on the particular requirements of the data centre.

In-Row Cooling

In-row cooling places the cooling units directly between rows of servers. This method improves efficiency by reducing the distance air has to travel to cool the equipment: hot air is captured immediately at its source, and cooled air is delivered straight to where it is needed. By reducing the amount of hot and cold air mixing, this can greatly increase the data centre's overall cooling efficiency.

Rear Door Heat Exchangers

Rear door heat exchangers are specialized doors attached to the back of server racks. These doors are designed to absorb the heat produced by the servers. The heat is then transferred to a coolant, typically water, which flows through the door and carries the heat away. This system helps remove hot air from the servers before it can mix with the cooler air in the data centre, improving overall efficiency.

Cold and Hot Aisle Containment

In large data centres, maintaining a proper airflow pattern is essential for keeping things cool.

One popular method is cold and hot aisle containment. This system arranges servers in alternating rows, with cold aisles where cool air is drawn in and hot aisles where the warm air is expelled. Physical barriers, such as walls or curtains, are used to separate the cold and hot aisles, ensuring that the cool air doesn’t mix with the hot air, which would reduce cooling efficiency.

Liquid Cooling

While more complex, liquid cooling systems are highly efficient and are increasingly used in high-performance data centres. In liquid cooling, a coolant (usually water or a dielectric fluid) carries heat away from the server hardware directly rather than through the room's air.

Direct-to-chip cooling and immersion cooling are the two primary liquid cooling techniques.

In direct-to-chip cooling, the coolant is piped straight to the chips and other parts that produce the most heat. Immersion cooling goes a step further: by submerging servers in the coolant, heat is drawn directly from the hardware. Liquid cooling is more effective than air-based cooling, but it costs more and needs to be handled carefully.
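
A rough back-of-envelope calculation shows why liquids carry heat so much more effectively than air. The sketch below uses the standard relation Q = m_dot * c_p * delta_T; the 30 kW rack load and 10°C coolant temperature rise are illustrative assumptions, not figures from the article.

```python
# Back-of-envelope comparison of the coolant flow needed to remove a
# given heat load with water versus air, using Q = m_dot * c_p * dT.
# The 30 kW rack load and 10°C temperature rise are assumptions.

HEAT_LOAD_W = 30_000.0   # assumed rack heat load, watts
DELTA_T = 10.0           # assumed coolant temperature rise, °C (K)

CP_WATER = 4186.0        # specific heat of water, J/(kg*K)
CP_AIR = 1005.0          # specific heat of air, J/(kg*K)
RHO_WATER = 1000.0       # density of water, kg/m^3
RHO_AIR = 1.2            # density of air at room conditions, kg/m^3

def volumetric_flow(q_watts: float, cp: float, rho: float, dt: float) -> float:
    """Volumetric flow (m^3/s) needed to carry q_watts at a given delta-T."""
    mass_flow = q_watts / (cp * dt)   # kg/s, from Q = m_dot * c_p * dT
    return mass_flow / rho

water = volumetric_flow(HEAT_LOAD_W, CP_WATER, RHO_WATER, DELTA_T)
air = volumetric_flow(HEAT_LOAD_W, CP_AIR, RHO_AIR, DELTA_T)

print(f"Water: {water * 1000:.2f} L/s")   # roughly 0.72 L/s
print(f"Air:   {air:.2f} m^3/s")          # roughly 2.49 m^3/s
print(f"Air needs ~{air / water:.0f}x the volume flow of water")
```

Under these assumptions, a thin water loop moves the same heat as thousands of times its volume in airflow, which is the core appeal of liquid cooling.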

Free Cooling

Free cooling is an energy-efficient technique that cools the data centre using ambient outdoor conditions.

Air-side economization and water-side economization are the two categories of free cooling systems.

Air-side economization uses cool outside air to bring down the temperature inside the data centre. Water-side economization uses nearby water sources, like lakes or rivers, to cool the water used in cooling systems. Free cooling can significantly reduce energy costs, especially in locations with cool climates.
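
In practice, an air-side economizer only takes over when the outside air is both cool enough and within humidity limits. The sketch below shows that decision in simplified form; the setpoints and margins are hypothetical, and real controllers weigh more factors (such as dew point and air quality).

```python
# A simplified sketch of air-side economization decision logic:
# use outside air for cooling only when its temperature and humidity
# fall inside an acceptable window. All setpoints are hypothetical.

def can_free_cool(outside_temp_c: float, outside_rh: float,
                  supply_setpoint_c: float = 22.0,
                  approach_margin_c: float = 3.0,
                  rh_range: tuple = (20.0, 80.0)) -> bool:
    """Return True if outside air alone can meet the cooling setpoint."""
    cool_enough = outside_temp_c <= supply_setpoint_c - approach_margin_c
    humidity_ok = rh_range[0] <= outside_rh <= rh_range[1]
    return cool_enough and humidity_ok

# Hypothetical conditions: a cool dry day vs. a hot humid one.
print(can_free_cool(12.0, 55.0))  # True  -> economizer can take over
print(can_free_cool(30.0, 85.0))  # False -> mechanical cooling required
```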

Monitoring and Managing Data Centre Cooling

To keep things running efficiently, it's important to constantly monitor temperature and humidity levels in the data centre. Many data centres use software tools, such as Data Centre Infrastructure Management (DCIM) platforms, to monitor conditions in real time. These tools help facility managers adjust the cooling system as needed to maintain optimal conditions.
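
At its simplest, this kind of monitoring boils down to polling sensors and alerting on out-of-range readings, as in the toy loop below. The sensor feed and humidity band are invented for illustration; real DCIM platforms expose their own APIs and alerting rules.

```python
# A toy monitoring loop in the spirit of what DCIM tools automate:
# poll temperature/humidity sensors and raise an alert when readings
# drift out of range. The sensor source and humidity band are invented
# for this demo; real DCIM platforms provide their own interfaces.

import random
import time

TEMP_RANGE_C = (21.0, 24.0)   # from the article's recommended band
RH_RANGE = (40.0, 60.0)       # assumed humidity band

def read_sensors() -> dict:
    """Stand-in for a real sensor feed (values randomised for the demo)."""
    return {"temp_c": random.uniform(19.0, 27.0),
            "rh": random.uniform(30.0, 70.0)}

def check(sample: dict) -> list:
    alerts = []
    if not TEMP_RANGE_C[0] <= sample["temp_c"] <= TEMP_RANGE_C[1]:
        alerts.append(f"temperature out of range: {sample['temp_c']:.1f}°C")
    if not RH_RANGE[0] <= sample["rh"] <= RH_RANGE[1]:
        alerts.append(f"humidity out of range: {sample['rh']:.0f}% RH")
    return alerts

for _ in range(3):            # a few polling cycles for the demo
    for alert in check(read_sensors()):
        print("ALERT:", alert)
    time.sleep(1)             # real systems poll on a schedule
```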


The Impact of Climate Change on Cooling Costs

As climate change continues to impact global temperatures, cooling costs are expected to rise.

In regions like Singapore, average temperatures are steadily increasing, making it more challenging and expensive to cool data centres.

This is one of the reasons why businesses are looking for locations with cooler climates to host their data centres. Effective cooling systems are more essential than ever, since they help offset rising energy costs.

Conclusion

One of the most crucial elements of running a successful data centre is efficient cooling. By installing the proper cooling system, businesses can keep their equipment in optimal condition, reducing downtime and preventing expensive damage.

As climate change continues to affect temperatures worldwide, investing in efficient cooling systems will only become more crucial for data centres.

About Ng Wei Khang

Ng Wei Khang is the CEO of APIXEL IT Support, a Singapore-based IT consultancy that has been in operation for over 8 years. Apixel provides fixed-price IT support services with unlimited packages, including small business server setup, cloud solution configuration, network management, and data security & theft prevention. The company also provides expert IT consultancy to SMEs in Singapore.

