
 Yash Yagnik

Professor Horgan

HST 401

3 May 2025


From Ocean Floor to Industry Standard: Microsoft’s Cooling Breakthrough

The backbone of every technological advancement throughout history, including the current AI race that every company is fighting tooth and nail to win, is data. With data growing at a compounding rate across industries, its storage relies on massive data centers: facilities spanning hundreds of thousands of square feet (Microsoft operates sites of more than 700,000 square feet) that house and manage a company's vital IT infrastructure. As the central hub of a company's technological operations, a data center consists of the computer systems, servers, and networking equipment that support a wide range of functions, such as storing user and company data, hosting websites, and maintaining cybersecurity. But the demand for faster, more energy-efficient, and cheaper data centers is growing exponentially.


In proportion to the amount of data they hold and will continue to gain, major technology companies such as Amazon, Google, and Microsoft operate hundreds of data centers worldwide, each with its own purpose. But as helpful as they are, data centers carry their own environmental and economic burden, starting with the sheer scale at which they must be built. Take Microsoft's newest data center project, currently under construction in Mount Pleasant, Wisconsin. The building materials alone will cost $3.3 billion, and the facility will draw 450 megawatts of power. At roughly $25 million per megawatt of capacity, the total build-out comes to about $11.25 billion. For reference, those 450 megawatts could power more than 300,000 homes. This mind-boggling energy demand not only illustrates the double-edged sword that data centers are, but also makes the tech industry a major contributor to global warming. With growing pressure from both climate activists and governments, companies are forced to search for greener, more efficient solutions. This raises a critical question: can these costs be reduced?
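To make those numbers concrete, here is a minimal back-of-the-envelope sketch in Python. The per-megawatt cost and household figures come from the sources above; the per-home draw at the end is a figure I derive from them, not one the sources state.

```python
# Back-of-the-envelope costs for the Mount Pleasant facility,
# using the figures cited above.
capacity_mw = 450              # facility power draw, megawatts
cost_per_mw = 25_000_000       # rough build cost per megawatt of capacity, USD
homes_powered = 300_000        # homes the same power could supply

build_cost = capacity_mw * cost_per_mw
print(f"Estimated build-out: ${build_cost / 1e9:.2f}B")   # $11.25B

# Implied average draw per home (a derived figure, not from the sources):
kw_per_home = capacity_mw * 1_000 / homes_powered
print(f"Implied draw per home: {kw_per_home:.1f} kW")     # 1.5 kW
```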


One of the most resource- and cost-intensive aspects of operating data centers is cooling. Because the servers run constantly, and work overtime during peak hours, the hardware overheats, degrading performance, and must be cooled. Circling back to the energy costs of a data center, cooling is responsible for at least 40% of the electricity bill. Traditional land-based centers often relied on evaporating large amounts of water; the approach showed promise, but the water loss proved too significant. There was also a real concern about humidity, since evaporative cooling loads the cooled air with water vapor, a concern we'll revisit later. So, in an attempt to shrink that 40% figure, and recognizing the environmental and operational limitations of traditional cooling, several tech giants began building data centers in colder parts of the world, closer to the Arctic, where they could be cooled by the ambient air. Google invested $230 million in a facility in Finland that strategically leverages the region's naturally cool air to cut energy costs and consumption. Similarly, Meta committed $1 billion to a data center in Sweden, where the Arctic air again serves as a free cooling source.
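To see why evaporative water loss adds up so quickly, here is a hedged order-of-magnitude sketch of my own (not a figure from the sources). It assumes, very simply, that a fixed fraction of a facility's heat is removed purely by evaporating water, using water's latent heat of vaporization.

```python
# Order-of-magnitude estimate of evaporative water loss.
# Assumption (illustrative only): 30% of the facility's heat load
# is rejected purely by evaporating water.
LATENT_HEAT = 2.26e6     # J per kg, latent heat of vaporization of water
heat_load_w = 450e6      # a 450 MW facility, treating all power as heat
evap_fraction = 0.30     # assumed share of heat removed by evaporation

kg_per_second = heat_load_w * evap_fraction / LATENT_HEAT
liters_per_day = kg_per_second * 86_400      # 1 kg of water ~ 1 liter
print(f"~{liters_per_day / 1e6:.1f} million liters/day")  # ~5.2 million L/day
```

Even under these rough assumptions, the loss runs to millions of liters a day, which is why the water had to stop leaving the system.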


However, cold climates were not the answer; for several reasons, they fell short. First, the terrain in some regions made it hard to find flat land for even a simple data center. Second, their remoteness increased latency for users (a quick estimate of that effect follows below). Third, governments in these regions pushed back, arguing that building more data centers there would violate environmental protection laws, so scalability was questionable. Overall, it was the right idea, but it seemed like it could be improved. And who better to go back to than your ex? Water.
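As a quick aside on that latency point, here is a sketch under assumed numbers: a hypothetical user 2,000 km from an Arctic facility, with signals moving through optical fiber at roughly two-thirds the speed of light.

```python
# Rough propagation-delay estimate for a remote Arctic data center.
# The distance and the 2/3-of-c fiber speed are illustrative assumptions.
SPEED_OF_LIGHT = 3.0e8                 # m/s in vacuum
fiber_speed = SPEED_OF_LIGHT * 2 / 3   # light in optical fiber, ~2e8 m/s
distance_m = 2_000e3                   # hypothetical user 2,000 km away

round_trip_s = 2 * distance_m / fiber_speed
print(f"Best-case round trip: {round_trip_s * 1e3:.0f} ms")  # ~20 ms

# Real routes are longer and add switching and queuing delays,
# so the practical penalty is larger still.
```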


Enter Microsoft's Project Natick. The project aimed to build the first underwater data center, and execution began in 2015. Phase One ran in the Pacific Ocean off the coast of California, and after it showed promising results, Phase Two began in 2018. This time, Microsoft wanted to test the data center's full capabilities, placing it off the coast of Scotland in much rougher waters. Though there was speculation that the data center might fail, the project's metrics proved it a viable model for next-generation data centers. The underwater facility housed 855 servers, of which only six failed, a 0.7% failure rate, beating the land-based control group, where eight of 135 servers failed, a 5.9% failure rate. With this exceptional reliability demonstrated, the sea-based data center looked like the future. It also ran entirely on renewable energy, onshore wind, offshore tides, and wave power, and required no maintenance for five years. Furthermore, the data center could leave the factory and be operational in under 90 days. But the best part was that it cut cooling's share of costs from 40% to 20%. The experiment was a home run, and it seemed Microsoft had finally found a viable solution to the cooling problem, one that could save them upwards of $2.25 billion.
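The reliability numbers reduce to simple ratios, reproduced here for clarity; the server counts are from Microsoft's published Natick results, and the code is just arithmetic.

```python
# Failure rates for Project Natick Phase Two vs. its land-based control group.
underwater_failed, underwater_total = 6, 855
land_failed, land_total = 8, 135

underwater_rate = underwater_failed / underwater_total   # ~0.7%
land_rate = land_failed / land_total                     # ~5.9%

print(f"Underwater: {underwater_rate:.1%}")
print(f"Land:       {land_rate:.1%}")
print(f"Land servers failed ~{land_rate / underwater_rate:.0f}x as often")
```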


Despite the promising outcomes, in 2024 Microsoft formally announced that it would no longer pursue underwater data centers. The glaring issue was that an underwater data center was not commercially viable at larger scale. The pace of innovation in AI chips outpaces the five-year maintenance window: with new chips arriving every few months, paired with growing demand for AI data centers, by the time an underwater data center resurfaced for updates, its hardware would be an ancient relic. And pulling the center out of the water to upgrade the chips would bring its own logistical hurdles and ultimately defeat the purpose of the design. Another major problem was scalability. On land, expanding a facility is as easy as putting up additional buildings or enlarging existing rooms. Underwater, expansion is far more complicated and almost guarantees the data center has to be resurfaced. So, the data center would have to be on land. Microsoft's Cloud Operations and Innovation chief, Noelle Walsh, weighed in on the project: “My team worked on it, and it worked. We learned a lot about operations below sea level and vibration, and impacts on the server. So we'll apply those learnings to other cases.”


But what other cases could she be talking about? Beneath the shallow takes that deemed the project a failure, Microsoft extracted a golden nugget from this research. The cooling system itself worked beautifully. Recirculating the internal coolant and transferring its heat to the surrounding seawater through a heat exchanger solved the original problem of losing water to evaporation. Another takeaway from Natick was that the external pressure of submersion helped the internal coolant manage heat: the liquid could get hotter before it evaporated, because the higher the pressure, the higher the boiling point. It's important to understand that the coolant never touches the hardware directly, since water and electronics don't exactly have a good history. Instead, air cools the hardware, and the coolant's job is to absorb the heat from that air afterward. With that cleared up and the evaporation problem solved, the only thing that remained was to replicate this underwater environment on land, specifically, to achieve the same pressure inside the data center.
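To put a number on the pressure effect, here is a hedged sketch using the Clausius-Clapeyron relation. The 117-foot depth matches the Natick deployment mentioned later; the seawater density and latent heat are standard textbook values; and the result is an idealized estimate for plain water, not for whatever coolant Microsoft actually used.

```python
import math

# How much does ~117 ft of seawater raise water's boiling point?
# Idealized Clausius-Clapeyron estimate; Microsoft's actual coolant
# and operating parameters are not public at this level of detail.
R = 8.314                  # J/(mol K), gas constant
L_MOLAR = 2.26e6 * 0.018   # J/mol, latent heat of vaporization of water
RHO_G = 1025 * 9.81        # seawater density (kg/m^3) times gravity

depth_m = 117 * 0.3048                     # 117 ft in meters
p_ratio = 1 + RHO_G * depth_m / 101_325    # absolute pressure / 1 atm

t_boil_1atm = 373.15                       # K, boiling point at 1 atm
inv_t = 1 / t_boil_1atm - R * math.log(p_ratio) / L_MOLAR
t_boil = 1 / inv_t

print(f"Pressure at depth: {p_ratio:.1f} atm")              # ~4.5 atm
print(f"Estimated boiling point: {t_boil - 273.15:.0f} C")  # ~148 C vs. 100 C
```

The headroom is the point: a coolant that boils near 150 C instead of 100 C can absorb far more heat before any of it turns to vapor.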


With this new goal, Microsoft delivered what it calls a closed-loop cooling system. The system simulates the pressure of Natick's surrounding seawater by pressurizing the cooling loop inside the land-based facility. As before, this lets the internal coolant's temperature be regulated, and when the coolant gets too hot, a heat exchanger transfers the heat out of the system. Not only does this cut costs by using less energy to cool the data center, it also solves a problem Microsoft considered just as big: water loss. Evaporated water is gone for good and must be replaced, and data centers consume it in huge amounts, so that problem is avoided too. And remember when I mentioned humidity? It's no longer a concern in this system, because none of the internal coolant evaporates. Dry, cool air can therefore be blown directly onto the critical spots of the hardware, unlike previous methods, where moisture-laden air could only be used to cool the entire room rather than target a specific area.
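As a final sketch of the idea, here is a minimal closed-loop energy balance, entirely my own illustration: it uses the standard Q = m·c·ΔT relation to ask how much coolant flow a loop needs to carry away one rack's heat. Every number here is an assumption, not a Microsoft specification.

```python
# Minimal closed-loop energy balance: how much coolant flow does it
# take to carry away a rack's heat? All figures are illustrative
# assumptions, not Microsoft specifications.
C_P_WATER = 4186        # J/(kg K), specific heat of a water-like coolant
rack_heat_w = 40_000    # assumed 40 kW rack
delta_t = 10            # assumed coolant temperature rise across the rack, K

# Q = m_dot * c_p * delta_T  ->  m_dot = Q / (c_p * delta_T)
m_dot = rack_heat_w / (C_P_WATER * delta_t)
liters_per_min = m_dot * 60    # 1 kg of coolant ~ 1 liter

print(f"Required flow: {m_dot:.2f} kg/s (~{liters_per_min:.0f} L/min)")
# The same coolant circulates indefinitely; only heat leaves the loop,
# via the heat exchanger, so no water is evaporated or lost.
```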


Through years of trial and error, setbacks, and brilliant engineering, Microsoft finally created a solution to a problem that it, and the world, had been trying to address for over a decade. The finished product, the closed-loop cooling system, was born from the lessons of Project Natick, Microsoft's underwater data center research project. The system slashed cooling costs by 50% and eliminated water waste, cementing itself as the next generation of data center cooling. By simulating, on land, the pressure of sitting 117 feet deep in the ocean, Microsoft redefined what is possible for sustainable and commercially viable data centers in a world that demands them at an unprecedented rate: around 18% growth year over year. Alongside data centers, the need for AI and cloud computing will only increase. Microsoft has set a benchmark for future success, proving that technological advancement can prioritize both environmental responsibility and high performance.

[Figure: graph of water consumed by the closed-loop system versus alternative methods that lose water to evaporation]



Sources

https://datacenters.microsoft.com/wp-content/uploads/2023/05/Azure_Modern-Datacenter-Cooling_Infographic.pdf


https://www.jsonline.com/story/money/business/energy/2025/01/09/microsoft-data-center-will-need-power-equal-to-more-than-300000-homes/77481855007/


https://www.sciencedirect.com/science/article/pii/S1876610215009467


https://www.webwerks.in/blogs/it-good-idea-build-data-center-cold-regions


https://www.datacenterknowledge.com/hyperscalers/microsoft-pays-800m-more-in-data-centers-energy-costs


https://www.latitudemedia.com/news/microsoft-plans-80b-for-data-centers-as-power-constraints-loom/


https://natick.research.microsoft.com/


https://redmondmag.com/Articles/2024/06/24/Project-Natick-Dries-Up.aspx


https://tech.yahoo.com/general/articles/microsoft-waves-goodbye-underwater-data-113352588.html


https://www.all-about-industries.com/microsoft-abandons-the-concept-of-reliable-underwater-data-centers-a-df4646fadf7fbdfdf4f8b2603a06f2f2/


https://www.microsoft.com/en-us/microsoft-cloud/blog/2024/12/09/sustainable-by-design-next-generation-datacenters-consume-zero-water-for-cooling/


https://www.datacenterknowledge.com/investing/microsoft-abandons-more-data-center-projects-td-cowen-says


