In July 2022, the heatwave in Europe brought record-breaking temperatures to the UK. The village of Coningsby in Lincolnshire hit 40.3°C – the hottest temperature ever recorded in the UK – and the heat was severe enough for the government to declare a national emergency.
It wasn't just the energy system that felt the strain – it was also the internet. Cooling failures hit data centres run by Google and Oracle, knocking services offline, and overheating servers took down IT systems at two London hospitals.
Guy's and St Thomas' hospitals were forced temporarily to become "paper hospitals" and operations had to be cancelled. All because, in the words of the hospitals' chief digital information officer, "the servers couldn't handle the heat and they collapsed in an unmanaged and uncoordinated way".
In the world of cloud computing, we often talk about the need for "five nines" (99.999%) availability as a way of avoiding disruption to business – but outages can also disrupt public services, with potentially fatal consequences.
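Those availability figures translate directly into a downtime budget. As a rough, back-of-the-envelope illustration (not tied to any particular provider's SLA), this short Python sketch converts an availability percentage into the downtime it allows per year:

```python
# Rough illustration: how much downtime per year a given availability
# percentage actually allows. Figures are for explanation only.

MINUTES_PER_YEAR = 365 * 24 * 60

def downtime_minutes_per_year(availability_percent: float) -> float:
    """Maximum minutes of downtime per year at a given availability."""
    return MINUTES_PER_YEAR * (1 - availability_percent / 100)

for availability in (99.9, 99.99, 99.999):
    minutes = downtime_minutes_per_year(availability)
    print(f"{availability}% availability allows ~{minutes:.1f} minutes of downtime per year")
```

At "five nines", that works out at barely five minutes a year – which is why a single cooling failure can blow through an annual downtime budget in one hot afternoon.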
All of this goes to show the importance of effective cooling. Data centres are the engine room of our digital world – but temperature and humidity controls are essential for them to function. Without cooling, overheating can lead to automatic server shutdowns, fires and damaged equipment.
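To make the difference between a managed and an unmanaged shutdown concrete, here's a hypothetical monitoring sketch in Python – the sensor path and temperature thresholds are illustrative assumptions, not settings from any real facility – that shuts a Linux server down cleanly before it hits its thermal limit:

```python
# Illustrative sketch only: watch a CPU temperature sensor and trigger a
# controlled shutdown before the hardware's emergency cut-off kicks in.
# The sensor path and thresholds below are hypothetical examples.
import subprocess
import time

SENSOR_PATH = "/sys/class/thermal/thermal_zone0/temp"  # millidegrees C on many Linux hosts
WARN_AT_C = 75       # start raising alerts
SHUTDOWN_AT_C = 90   # shut down gracefully rather than risk an uncontrolled crash

def read_cpu_temp_c() -> float:
    """Read the CPU temperature in degrees Celsius."""
    with open(SENSOR_PATH) as sensor:
        return int(sensor.read().strip()) / 1000

while True:
    temp = read_cpu_temp_c()
    if temp >= SHUTDOWN_AT_C:
        print(f"CPU at {temp:.1f}°C – initiating managed shutdown")
        subprocess.run(["shutdown", "-h", "+1", "Thermal limit reached"])
        break
    if temp >= WARN_AT_C:
        print(f"Warning: CPU at {temp:.1f}°C")
    time.sleep(30)
```

In a real facility, of course, this kind of logic typically lives in the building management and orchestration layers, so that workloads can be drained and servers powered down in a coordinated order rather than all at once.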
The picture is complicated, however, by their potentially deleterious environmental impacts. In this article, we'll take a look at how data centre temperatures are controlled – and what cloud providers are doing to address the environmental question.
How are data centres kept cool?
There are four main ways of keeping data centres cool.
The first is a CRAC (computer room air conditioner). Typically placed along the walls, these units force cold air under a raised floor; the air then rises through vents in the floor and cools the servers.
The second involves arranging the racks of servers in alternating rows so that the cold air intakes at the front are facing each other in a "cold aisle" and the hot air exhausts at the back face each other in a "hot aisle". This is frequently used in conjunction with CRAC units.
This hot/cold aisle layout was introduced by IBM in the early 1990s and is one of the earliest attempts to conserve energy in data centres.
In places where the outside temperature is lower than the server room temperature all year round, free cooling – also known as fresh-air cooling – is a greener option that does away with internal cooling systems. It's a more high-tech version of airing your house in winter by opening a window – cool air is drawn in and then sent around the data centre.
Finally, there's direct-to-chip – or direct-on-chip – cooling. This method helps to lower energy consumption by using liquid coolant rather than electricity-guzzling air con. Coolant is piped to cold plates mounted directly on the processors – "direct to chip", as the name suggests.
You can see a real-life cooling system in action in a video showing the inside of a Google data centre. Rather than placing the air con units around the perimeter, Google butts the server racks right up against the cooling unit. Heat rises from the servers and passes across copper coils filled with cool water, which absorbs it. The now-warm water runs to an external cooling plant and, once chilled, is returned to the data centre.
Are there green alternatives?
Of these four methods, free cooling is by far the greenest – but it's only possible in cold climates where the data centre can do away with electrically powered air con.
One out-of-the-box solution is Microsoft's underwater data centres, which are cooled by the surrounding seawater rather than by electrically powered cooling plant. Whether these pods' energy efficiency is outweighed by their potential negative effects on marine habitats and ocean temperature is a question yet to be answered.
Overall, tech brands like Microsoft and Google have reached their 100% renewable energy targets – but as Greenpeace has noted, their supply chains are still heavily reliant on fossil fuels. So while, for example, Google's data centres are now carbon-neutral, they're not yet zero-carbon.
One green energy solution is the passive cooling system developed and patented by Katrick Technologies – a heat removal system that cuts carbon emissions by 50%. At the time of writing, it has been installed at iomart's Glasgow data centre, replacing two on-site condensers.
How do data centres cope in cold weather?
Most data centres deal just fine with cold weather – but only with the right measures in place.
Cologix runs data centres across Canada and in Minnesota, where below-freezing winters are the norm. Its chief technology officer has said that "cold weather is hard on equipment, can create unique challenges for fuel and cooling and can limit access to a site for customers and staff".
If the temperature is too cold, it's difficult to take in any outside air. Cooling towers, air conditioning units and generators can freeze and vents become clogged with snow. All of this increases the risk of shutdown.
Data centres in Minnesota or other cold climates are run with these challenges in mind – but facilities facing a cold wave will need to take extra measures.
Why you need a disaster recovery plan
It's not only extreme weather that can lead to service outages. There's also the risk of flood, fire and other natural or man-made disasters.
Downtime isn't just a pain in the neck. It's a huge drain on resources as your technicians drop what they're doing and focus all their energies on putting out fires.
This is why having robust disaster recovery safeguards in place is so important. There's no one way to achieve this – but managed cloud migration services are a tried-and-tested option.
Ascend Cloud Solutions is a cloud consultancy that offers this service and a whole lot more. We're past masters at cloud migration, with over 400 successful migrations under our belt. Interested? Please don't hesitate to get in touch to find out more.