One of the best ways to keep a data center cool is likely no longer a surprise. Although the amount of gaming, socializing, online video binging, and more has increased significantly, that portion accounts for only about 30% of the internet's demand on data centers. The other 70% is still corporate computing of various kinds, and that data crunching has gone through the roof over the last decade. We now talk about how many zettabytes of data center IP traffic we expect to add year over year, with the growth still increasing steadily. High performance computing groups are advancing quickly as demand grows for exascale supercomputers, AI, and quantum computing, so data center demand is set to grow at the same or an even greater pace.
All of this translates into more power use each year, with U.S. electricity devoted to data centers now at about 80 billion kWh. That growth has flattened, though: power consumption increased 25% from 2005 to 2010, then 5% through 2015, and another 4% through 2020. Does this mean demand has slowed? Not at all. Data centers have been targeting their energy use and improving efficiency immensely during that time, and one of the main areas of progress has been cooling innovation.
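As a rough sanity check on those figures, a minimal Python sketch can walk the stated growth rates backwards from the roughly 80 billion kWh figure. The intermediate values are illustrative, derived only from the percentages quoted above, not from measured data:

```python
# Back-of-envelope check: work backwards from the ~80 billion kWh
# 2020 figure using the growth rates quoted above (25%, 5%, 4% per
# five-year period). Intermediate values are illustrative only.

growth_by_period = {
    "2005-2010": 0.25,
    "2010-2015": 0.05,
    "2015-2020": 0.04,
}

estimate = 80.0  # billion kWh in 2020, per the estimate above
print(f"2020: ~{estimate:.0f} billion kWh")
for period, rate in reversed(list(growth_by_period.items())):
    estimate /= 1 + rate
    print(f"{period.split('-')[0]}: ~{estimate:.0f} billion kWh")
```

The implied 2005 baseline lands near 59 billion kWh, which makes the point: consumption kept climbing even as the growth rate cooled off.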
The cooling advances often begin with a look at the root need: the IT loads and how much cooling is actually required to support them. Designers have been working more closely with IT professionals to challenge stated cooling needs and shift the dominant paradigm away from conservative oversizing of infrastructure capacity.
Among the new innovations is immersing servers in a liquid bath for better cooling performance. This seems counterintuitive at first, since water and electronics don't mix, but as engineers know, liquids are far more effective at moving heat than air. Dielectric fluids, primarily nonconductive synthetic oils, allow the heat exchange to happen directly at the chips on the servers. Cooling the servers directly also takes less volume, since only the liquid around the servers is cooled rather than an entire data hall. Testing is already underway to solve galvanic corrosion and other issues, but overall the potential of this cooling solution is vast.
In the cooling loop, the hot fluid (110 degrees F or even warmer) circulates to and from a heat exchanger, where the heat is rejected via a mechanical plant to the atmosphere or even reused throughout a building. Compared with air cooling, this can save an estimated 90% of the cooling energy, with actual performance in some installations reaching 75%. The mechanical cooling energy stays low, at roughly 27-35 kW per MW of IT load. That energy savings translates into operational cost savings of $400,000 or more per year, depending on the efficiency of the original system.
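To see how the arithmetic reaches that figure, here is a minimal sketch. The 300 kW-per-MW air-cooled baseline, the 2.5 MW facility size, and the $0.07/kWh electricity rate are all assumptions for illustration, not values from any specific installation:

```python
# Hedged sketch of the annual cost-savings arithmetic. The air-cooled
# baseline (~300 kW of cooling per MW of IT load), the facility size,
# and the electricity rate are assumptions for illustration.

HOURS_PER_YEAR = 8760

def annual_cooling_cost(cooling_kw_per_mw: float, it_load_mw: float,
                        dollars_per_kwh: float = 0.07) -> float:
    """Yearly cost of running the mechanical cooling plant."""
    return cooling_kw_per_mw * it_load_mw * HOURS_PER_YEAR * dollars_per_kwh

it_load_mw = 2.5                                   # assumed facility size
air_cooled = annual_cooling_cost(300, it_load_mw)  # assumed baseline
immersion = annual_cooling_cost(31, it_load_mw)    # midpoint of 27-35 kW/MW

print(f"Air-cooled plant:  ${air_cooled:,.0f}/yr")
print(f"Immersion plant:   ${immersion:,.0f}/yr")
print(f"Estimated savings: ${air_cooled - immersion:,.0f}/yr")
```

Under those assumptions the savings land just above the $400,000 mark; a less efficient original system pushes the number higher.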
One item often left out of the savings is the reduction in the servers' own energy needs: the servers no longer need fans, which can cut server energy use by up to 20%. Another that often goes unnoticed is the reduction in space and mechanical systems construction costs, which can amount to another 20-50% of the data center's cost.
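The fan savings stack on top of the plant savings. A quick sketch, again assuming a hypothetical 1 MW of server load and a $0.07/kWh rate:

```python
# Quick estimate of the fan-removal savings noted above, assuming a
# hypothetical 1 MW of server load and a $0.07/kWh rate (illustrative).

server_load_kw = 1000.0
fan_fraction = 0.20            # up to 20% of server energy, per above

fan_kw = server_load_kw * fan_fraction
kwh_per_year = fan_kw * 8760
print(f"Fan energy avoided: {kwh_per_year:,.0f} kWh/yr "
      f"(~${kwh_per_year * 0.07:,.0f}/yr)")
```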
Despite the cost and efficiency benefits, there are still many who are 'hydrophobic' and do not want water or liquids anywhere within the data halls. Sometimes a compiled list of the major advantages can help sway the decision makers: power savings; reduced losses; reduction or elimination of mechanical equipment; and space savings, with potentially more.
Then we can also add in the potential to densify the IT equipment. Instead of being limited to about 20-25 kW per rack by cooling, the racks can be configured at almost whatever density the IT team wishes with today's servers. Liquid cooling has a limit too, but one an order of magnitude higher: 200 kW per rack and beyond is possible.
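As a simple illustration of what that densification means for footprint, the sketch below compares rack counts for a hypothetical 2 MW deployment; the densities echo the figures above, and the load is an assumption:

```python
# Illustration of how rack density shrinks the footprint: racks needed
# for a given IT load at air-cooled vs. immersion densities (the 2 MW
# deployment size is an assumption for illustration).

import math

def racks_needed(it_load_kw: float, kw_per_rack: float) -> int:
    return math.ceil(it_load_kw / kw_per_rack)

it_load_kw = 2000.0  # hypothetical 2 MW deployment

for label, density in [("air cooled", 25), ("immersion", 200)]:
    print(f"{label:>10} @ {density:>3} kW/rack: "
          f"{racks_needed(it_load_kw, density)} racks")
```

Eighty racks versus ten for the same load is the kind of difference that shows up directly in building size and construction cost.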
As the servers went under liquid, a new type of surrounding infrastructure began to emerge. Portable data centers built on standardized shipping containers have been introduced, which lend themselves to being moved easily by truck, rail, and even air. Containerization has already been used for air-cooled data centers, so it has been readily accepted for deployment with liquid cooling as well. The container can be made liquid- or air-tight, meaning it can be deployed nearly anywhere without worry about dust, water, or most weather, as long as it remains powered and has heat rejection.
Liquid cooling also lends itself more easily to transporting the waste heat to where it can be useful. This brings not just improved energy savings for the data center and its associated equipment, but also a reduction in the heating energy needed elsewhere. Several projects have developed solutions that feed each side the temperature it needs, cooling for the data center and heating for the building, using storage tanks and small supplemental heating systems.
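A rough sketch of the scale involved, assuming a hypothetical 1 MW data hall and a 90% heat-exchanger capture fraction (both values are illustrative):

```python
# Sketch of the heat-reuse potential: nearly all IT power leaves the
# fluid loop as low-grade heat, so a 1 MW data hall (an assumed size)
# offers roughly that much continuous ~110 F heat to a neighboring
# building, less whatever the exchanger cannot capture.

it_load_kw = 1000.0
capture_fraction = 0.9   # assumed heat-exchanger capture efficiency

heat_kw = it_load_kw * capture_fraction
mwh_per_year = heat_kw * 8760 / 1000
print(f"Recoverable heat: ~{heat_kw:.0f} kW continuous, "
      f"~{mwh_per_year:,.0f} MWh/yr")
```

That steady stream of low-grade heat is why the paired designs above lean on storage tanks and only small supplemental heaters.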
