Liquid Dreams: The Rise of Immersion Cooling and Underwater Data Centers
By Omkar Ashok Bhalekar with Ajay Lotan Thakur
As demand for data keeps rising, driven by generative AI, real-time analytics, 8K streaming, and edge computing, data centers face an escalating dilemma: how to maintain performance without overheating. Traditional air-cooled server rooms that were once perfectly adequate for straightforward web hosting and storage are being pushed to their thermal limits by modern compute-intensive workloads. While the world’s digital backbone burns hot, innovators are diving deep, all the way to the ocean floor. Say hello to immersion cooling and undersea data farms, two technologies poised to revolutionize how the world stores and processes data.
Heat Is the Silent Killer of the Internet
In every data center, heat is the unobtrusive enemy. When racks of high-performance GPUs, CPUs, and ASICs operate simultaneously, they generate massive amounts of heat. The old approach of gigantic HVAC systems and chilled-air manifolds is reaching its technological and environmental limits.
In the majority of installations, 35-40% or more of total energy consumption is spent simply cooling the hardware rather than running it. As model sizes and inference loads explode (think ChatGPT, DALL·E, or Tesla FSD), traditional cooling infrastructure simply isn’t up to the task without costly upgrades or environmental degradation. This is why a paradigm shift is underway.
Liquid cooling is not an option everywhere, owing to infrastructure gaps, expense, and geography, so every player in the ecosystem still has to step up when it comes to energy efficiency. The burden crosses multiple domains: chip manufacturers need to deliver far greater performance per watt through advanced semiconductor design, and software developers need to write code that is fundamentally low power by optimizing algorithms and reducing computational overhead.
Alongside these foundational improvements, memory manufacturers are designing low-power solutions, system manufacturers are building more efficient power-delivery networks, and cloud operators are making their data center operations more efficient while increasing the use of renewable energy sources. As Microsoft Chief Environmental Officer Lucas Joppa said, “We need to think about sustainability not as a constraint, but as an innovative driver that pushes us to build more efficient systems across every layer of the technology stack.”
However, despite these multifaceted efficiency gains, thermal management remains a significant bottleneck with a profound impact on overall system performance and energy consumption. Ineffective cooling forces processors to throttle, undoing the gains from better chips and optimized software. The result is a self-defeating loop in which wasteful thermal management cancels out efficiency improvements made elsewhere in the system.
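To make that feedback loop concrete, here is a deliberately simplified toy model, entirely hypothetical and not drawn from any real processor or from the sources cited in this post: a chip produces heat in proportion to its clock speed, the cooler removes a fixed number of watts, and the chip throttles whenever it overheats. With undersized cooling, the clock bounces around whatever frequency the cooler can sustain and total useful work drops.

```python
# Toy model of the throttling feedback loop described above. All numbers are
# arbitrary illustrative assumptions, not measurements of any real system.

AMBIENT_C     = 25.0   # inlet/ambient temperature (assumed)
T_THROTTLE_C  = 90.0   # temperature at which throttling kicks in (assumed)
WATTS_PER_GHZ = 20.0   # heat generated per GHz of clock (assumed)

def simulate(cooling_watts, steps=300):
    """Return (final clock in GHz, total 'work' done) for a given cooling capacity."""
    temp_c, clock_ghz, work = AMBIENT_C, 3.0, 0.0
    for _ in range(steps):
        heat_in = clock_ghz * WATTS_PER_GHZ
        temp_c = max(AMBIENT_C, temp_c + 0.1 * (heat_in - cooling_watts))  # crude thermal mass
        if temp_c > T_THROTTLE_C:                  # too hot: throttle down
            clock_ghz = max(1.0, clock_ghz - 0.1)
        elif temp_c < T_THROTTLE_C - 10.0:         # comfortably cool: ramp back up
            clock_ghz = min(3.0, clock_ghz + 0.1)
        work += clock_ghz                          # clock cycles as a proxy for useful work
    return clock_ghz, work

for cooling in (40.0, 80.0):  # undersized vs adequate cooling, in watts removed per step
    clock, work = simulate(cooling)
    print(f"cooling={cooling:.0f} W -> final clock {clock:.1f} GHz, total work {work:.0f}")
```

Running the sketch shows the undersized-cooling case completing noticeably less work over the same interval, which is the point of the paragraph above: cooling capacity caps effective compute.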
In this blog post, we focus on the cooling side of energy consumption and consider how future thermal management technology can act as an efficiency multiplier across the entire computing infrastructure. We will explore how proper cooling strategies not only reduce the direct energy consumed by the cooling equipment itself but also allow the other components of the system to operate at their maximum efficiency.
What Is Immersion Cooling?
Immersion cooling submerges servers in carefully engineered, non-conductive (dielectric) fluids that transfer heat much more efficiently than air. These liquids are harmless to electronics; they allow direct liquid contact cooling with no risk of short-circuiting or corrosion.
Two general types exist:
- Single-phase immersion, with the fluid remaining liquid and transferring heat by convection.
- Two-phase immersion, wherein the fluid boils at a low temperature, carries heat away as vapor, and condenses back into liquid in a closed loop (a rough numerical comparison with air cooling follows below).
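To put “much more efficiently than air” in rough numbers, the sketch below compares how much sensible heat the same volumetric flow of coolant can carry away, using Q = ρ · V̇ · c_p · ΔT. The property values are approximate textbook figures chosen for illustration (a mineral-oil-like single-phase fluid stands in for a generic dielectric coolant); none of them come from the studies cited in this post.

```python
# Rough, illustrative comparison of how much sensible heat a given volumetric
# flow of coolant can carry away: Q = rho * V_dot * c_p * delta_T.
# Property values are approximate figures used for illustration only.

def heat_removed_kw(density_kg_m3, cp_j_per_kg_k, flow_m3_per_s, delta_t_k):
    """Sensible heat carried away by a coolant stream, in kilowatts."""
    return density_kg_m3 * flow_m3_per_s * cp_j_per_kg_k * delta_t_k / 1000.0

FLOW_M3_S = 0.05   # volumetric flow past the electronics (assumed)
DELTA_T_K = 10.0   # allowed coolant temperature rise (assumed)

air_kw   = heat_removed_kw(1.2,   1005.0, FLOW_M3_S, DELTA_T_K)   # air at roughly room temperature
fluid_kw = heat_removed_kw(850.0, 1670.0, FLOW_M3_S, DELTA_T_K)   # mineral-oil-like dielectric fluid

print(f"Air:              {air_kw:7.1f} kW")
print(f"Dielectric fluid: {fluid_kw:7.1f} kW")
print(f"Ratio:            {fluid_kw / air_kw:7.0f}x  (roughly three orders of magnitude)")
```

The exact ratio depends on the fluid and flow assumptions, but the takeaway matches the claim above: per unit of flow, a dielectric liquid can carry away orders of magnitude more heat than air.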
According to Vertiv’s research on high-density data centers, liquid cooling improves the energy efficiency of both IT and facility systems compared to air cooling. In their fully optimized study, introducing liquid cooling produced a 10.2% reduction in total data center power and a more than 15% improvement in Total Usage Effectiveness (TUE).
Total Usage Effectiveness is calculated with the formula below:
TUE = ITUE × PUE, where ITUE (IT Usage Effectiveness) = total energy into the IT equipment / total energy into the compute components, and PUE (Power Usage Effectiveness) = total facility energy / total energy into the IT equipment.
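As a quick illustration of how these metrics combine, the sketch below computes PUE, ITUE, and TUE from hypothetical meter readings; the numbers are assumptions for illustration, not figures from the Vertiv study.

```python
# Minimal sketch of the TUE arithmetic above, using hypothetical meter readings.

facility_energy_kwh = 1_500_000  # total energy delivered to the facility (assumed)
it_equipment_kwh    = 1_200_000  # total energy into the IT equipment (assumed)
compute_energy_kwh  = 1_000_000  # energy reaching the compute components themselves (assumed)

pue  = facility_energy_kwh / it_equipment_kwh   # Power Usage Effectiveness
itue = it_equipment_kwh / compute_energy_kwh    # IT Usage Effectiveness
tue  = itue * pue                               # Total Usage Effectiveness

print(f"PUE  = {pue:.2f}")   # 1.25
print(f"ITUE = {itue:.2f}")  # 1.20
print(f"TUE  = {tue:.2f}")   # 1.50
```

Because TUE multiplies facility overhead (PUE) by in-server overhead (ITUE), a cooling technology that trims either factor shows up directly in the final number.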
Reimagining Data Centers Underwater
Imagine shipping an entire data center in a steel capsule and sinking it to the ocean floor. That’s no longer sci-fi.
Microsoft’s Project Natick demonstrated the concept by deploying a sealed underwater data center off the Orkney Islands, powered entirely by renewable energy and cooled by the surrounding seawater. Over its two-year lifespan, the submerged facility showed:
- A server failure rate 1/8th that of land-based centers.
- No need for on-site human intervention.
- Efficient, passive cooling by natural sea currents.
Why underwater? Seawater is a vast, naturally replenished heat sink, and the undersea environment is inherently shielded from the temperature fluctuations, dust, vibration, and power surges that plague land-based facilities. Most coastal metropolises, which are among the biggest consumers of cloud services, lie within 100 miles of a viable deployment site, which would dramatically reduce latency.
Why This Tech Matters Now
Data centers already account for about 2-3% of the world’s electricity, and with the rapid growth of AI and metaverse workloads, that figure will only grow. AI training and generative inference workloads can consume up to 10x the power per rack of regular server workloads, putting tremendous pressure on cooling gear and sustainability goals. Legacy air cooling is reaching its thermal and density limits, making immersion cooling a critical enabler of future scalability. According to Submer, a Barcelona-based immersion cooling company, immersion cooling can reduce the energy consumed by cooling systems by up to 95% and enable higher rack density, providing a path to sustainable data center growth under AI-driven demand.
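Combining Submer’s claim with the 35-40% cooling share mentioned earlier gives a rough sense of the facility-level impact. The arithmetic below is purely a back-of-the-envelope sketch using assumed midpoints, not a measured result.

```python
# Back-of-the-envelope sketch: if cooling accounts for roughly 38% of facility
# energy (midpoint of the 35-40% range cited earlier) and immersion cooling cuts
# cooling energy by up to 95% (Submer's claim), the facility-level saving would
# be on the order of a third. Illustrative arithmetic only.

cooling_share   = 0.38  # fraction of facility energy spent on cooling (assumed midpoint)
cooling_savings = 0.95  # claimed reduction in cooling energy

facility_savings = cooling_share * cooling_savings
print(f"Potential facility-level energy savings: {facility_savings:.0%}")  # ~36%
```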
Advantages & Challenges
Immersion and submerged data centers possess several key advantages:
- Sustainability – Lower energy consumption and lower carbon footprints are paramount as ESG (Environmental, Social, Governance) goals become business necessities.
- Scalability & Efficiency – Immersion allows more density per square foot, reducing real estate and overhead facility expenses.
- Reliability – Liquid-cooled and underwater systems suffer fewer mechanical failures thanks to lower thermal stress, fewer moving parts, and less oxidation.
- Security & Autonomy – Underwater encased pods or autonomous liquid systems are difficult to hack and can be remotely monitored and updated, ideal for zero-trust environments.
While immersion cooling and submerged data centers offer these advantages, they come with challenges and limitations as well:
- Maintenance and Accessibility Challenges – Both options complicate hardware maintenance. Immersion cooling requires carefully removing components from the dielectric fluid and cleaning them before service, while underwater data centers offer essentially no physical access: entire modules must be retrieved to fix anything, which translates into longer downtimes.
- High Initial Costs and Deployment Complexity – Building immersion tanks or underwater enclosures involves significant capital investment in specially designed equipment, infrastructure, and deployment techniques. Underwater data centers additionally demand marine engineering, watertight modules, and intricate site preparation.
- Environmental and Regulatory Concerns – Both approaches raise environmental issues and regulatory hurdles. Immersion systems must contend with fluid disposal regulations, while underwater data centers require marine environmental impact assessments, permits, and ongoing ecosystem protection measures.
- Technology Maturity and Operational Risks – Both are still immature technologies with little long-term data on performance and reliability. Potential problems include coolant leaks in immersion systems and physical damage or biofouling in underwater installations, which makes large-scale adoption uncertain.
Industry Momentum
Various companies are leading the charge:
- GRC (Green Revolution Cooling) and Submer offer immersion cooling solutions to hyperscalers and enterprises.
- Iceotope delivers precision liquid cooling for HPC workloads.
- Alibaba, Google, and Meta are testing immersion cooling at scale to support AI and ML clusters.
- Through Project Natick, Microsoft has researched the commercial viability of off-grid, modular underwater data centers.
Hyperscalers are starting to design entire zones of their new data centers specifically for liquid-cooled GPU pods, while smaller edge data centers are adopting immersion tech to run quietly and efficiently in urban environments.
The Future of Data Centers: Autonomous, Sealed, and Everywhere
Looking ahead, the trend is clear: data centers are becoming more intelligent, compact, and environmentally integrated. We’re entering an era where:
- AI-based DCIM software predicts and prevents failures in real time.
- Edge nodes with immersion cooling can be located almost anywhere: smart factories, offshore oil rigs, and beyond.
- Entire data centers might be built as prefabricated modules and deployed in oceans, deserts, or even space.
- The general principle? Compute must not be limited by land, heat, or humans.
Final Thoughts
In the fight to enable the digital future, air is a luxury. Immersed in liquid or bolted to the seafloor, data centers are shifting to cool smarter, not harder.
Underwater installations and liquid cooling are no longer out-there ideas; they’re lifelines to a scalable, sustainable web.
So tomorrow’s “Cloud” won’t be in the sky; it will hum quietly under the sea.
References
- https://news.microsoft.com/source/features/sustainability/project-natick-underwater-datacenter/
- https://www.researchgate.net/publication/381537233_Advancement_of_Liquid_Immersion_Cooling_for_Data_Centers
- https://en.wikipedia.org/wiki/Project_Natick
- https://theliquidgrid.com/underwater-data-centers/
- https://www.sunbirddcim.com/glossary/submerged-server-cooling
- https://www.vertiv.com/en-us/solutions/learn-about/liquid-cooling-options-for-data-centers/
- https://submer.com/immersion-cooling/
About the Author:
Omkar Bhalekar is a senior network engineer and technology enthusiast specializing in data center architecture, manufacturing infrastructure, and sustainable solutions. With extensive experience designing resilient industrial networks and building smart factories and AI data centers with scalable networks, Omkar writes to simplify complex technical topics for engineers, researchers, and industry leaders.