The AI boom has taken off worldwide, with new and established companies in the sector investing in the latest technology, such as the NVIDIA platform, driving those manufacturers' valuations ever higher.  Now the equipment that so many have purchased is starting to be delivered, and the resulting surge of hardware is gobbling up data center space even as average densities step higher still.

What are those averages?  It seems a lot of resellers and brokers have no idea; they are simply asking around for whatever space and power is available right now.  However, like most in real estate, data center developers are reluctant to build a new facility on speculation without at least a client in mind or a letter of intent for a certain capacity.  That is because the data centers of today are not the same as those of years ago, and certain design or build decisions might go against some clients' best practices or become overly expensive to flex to potential needs.

Back to the averages: a check of the most common servers being built around the most popular GPU setups quickly reveals a pattern of roughly 4-6U servers drawing up to 10-12 kW each.  Stacking two in a rack pushes the average above 20 kW per rack, a threshold at which many data center spaces need to look more closely at their power and cooling schemes.  Doubling the count per cabinet and adding network gear can reach 40-45 kW per rack, a density that triggers many operators to investigate supplemental cooling solutions.  With server manufacturers offering their own rack and row configurations, 10-14 racks can require a commitment of 500 kW from a facility.  Since many AI companies have already invested in the equipment, with more being delivered weekly, they are looking to deploy tens of megawatts now and hundreds over the years ahead.
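To put those figures in one place, here is a minimal sketch of the rack-density arithmetic in Python.  The per-server draw, network allowance, and rack counts are illustrative assumptions taken from the ranges above, not vendor specifications.

```python
# Rough rack-density arithmetic for AI deployments.
# All figures are illustrative assumptions drawn from the ranges above.

KW_PER_SERVER = 11.0       # a 4-6U GPU server drawing roughly 10-12 kW
NETWORK_OVERHEAD_KW = 2.0  # assumed allowance for top-of-rack networking

def rack_power_kw(servers_per_rack: int, include_network: bool = True) -> float:
    """Estimated power draw of one rack, in kW."""
    power = servers_per_rack * KW_PER_SERVER
    if include_network:
        power += NETWORK_OVERHEAD_KW
    return power

def row_commitment_kw(racks: int, servers_per_rack: int) -> float:
    """Estimated power commitment for a row of identical racks, in kW."""
    return racks * rack_power_kw(servers_per_rack)

print(f"2 servers per rack:              {rack_power_kw(2, include_network=False):.0f} kW")  # ~22 kW
print(f"4 servers per rack plus network: {rack_power_kw(4):.0f} kW")                         # ~46 kW
print(f"11 racks at 4 servers per rack:  {row_commitment_kw(11, 4):.0f} kW")                 # ~506 kW
```

Swap in different server wattages or rack counts and the same pattern emerges: a handful of AI racks quickly turns into a multi-hundred-kilowatt commitment.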

AI has launched across so many industries that it has been difficult to track, let alone vet, whether applying a given model actually works to anyone's benefit.  Remember the early days of adding 'green' (or blockchain, cloud, convergent, sustainability, XaaS, etc.) to every product and service?

For data center operators, it will be necessary to see the results of their latest model and whether switching or upgrading to a new one would be advantageous, and why.  This harkens back to the days of deeper engineering assessments to understand how the current system is operating and where the gaps for improvement lie.  This time, though, it means judging the latest AI software against the past performance of what is currently operating, to evaluate whether and how it can serve you better while lowering cost and energy use.
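As a concrete illustration of that kind of before-and-after judgment, the sketch below compares a historical baseline period against an AI-assisted one on energy and cost.  The field names and sample values are hypothetical; in practice the inputs would come from BMS or DCIM exports and utility bills.

```python
# Hypothetical comparison of a baseline operating period against an
# AI-assisted one. All names and numbers are illustrative, not real data.

from dataclasses import dataclass

@dataclass
class OperatingPeriod:
    label: str
    it_energy_kwh: float        # energy delivered to the IT load
    facility_energy_kwh: float  # total facility energy, including cooling
    energy_cost_usd: float

    @property
    def pue(self) -> float:
        """Power Usage Effectiveness: facility energy divided by IT energy."""
        return self.facility_energy_kwh / self.it_energy_kwh

def compare(baseline: OperatingPeriod, candidate: OperatingPeriod) -> None:
    """Print the deltas so 'does it actually help?' becomes a numbers question."""
    print(f"PUE:    {baseline.pue:.2f} -> {candidate.pue:.2f}")
    print(f"Energy: {baseline.facility_energy_kwh:,.0f} kWh -> {candidate.facility_energy_kwh:,.0f} kWh")
    print(f"Cost:   ${baseline.energy_cost_usd:,.0f} -> ${candidate.energy_cost_usd:,.0f}")

before = OperatingPeriod("baseline", 1_000_000, 1_500_000, 180_000)
after = OperatingPeriod("ai-assisted", 1_000_000, 1_380_000, 165_600)
compare(before, after)
```

The point is not the specific numbers but the discipline: hold the IT load constant where possible, measure the same metrics over comparable periods, and only then decide whether the new model earns its keep.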

There are open-source AI models hitting the market (not to be confused with the company OpenAI, whose models are proprietary) that allow communities of users to modify and build upon large language models and similar AI models at no cost.  Those 'free' models already have the advantage of being robust, and they now have legions of users adding to and improving them daily.  Commercial models are less desirable by comparison, and their vendors' efforts to supersede the open-source offerings with better versions are soon likely to be more hype and sales effort than better results.
