Liquid Dreams: The Rise of Immersion Cooling and Underwater Data Centers

By Omkar Ashok Bhalekar with Ajay Lotan Thakur

As demand for data keeps rising, driven by generative AI, real-time analytics, 8K streaming, and edge computing, data centers face an escalating dilemma: how to maintain performance without overheating. Traditional air-cooled server rooms that were once sufficient for straightforward web hosting and storage are being pushed to their thermal limits by modern compute-intensive workloads. While the world’s digital backbone burns hot, innovators are diving deep, all the way to the ocean floor. Say hello to immersion cooling and undersea data farms, two technologies poised to revolutionize how the world stores and processes data.

Heat Is the Silent Killer of the Internet

In every data center, heat is the unobtrusive enemy. When racks of high-performance GPUs, CPUs, and ASICs all operate at once, they generate massive amounts of heat. The old approach of gigantic HVAC systems and chilled-air manifolds is reaching its technological and environmental limits.

In the majority of installations, over 35-40% of total energy consumption is spent on simply cooling the hardware, rather than running it. As model sizes and inference loads explode (think ChatGPT, DALL·E, or Tesla FSD), traditional cooling infrastructure simply isn’t up to the task without costly upgrades or environmental degradation. This is why a paradigm shift is underway.

Liquid cooling is not an option everywhere due to lack of infrastructure, expense, and geography, so we must still rely on every player in the ecosystem to raise the bar on energy efficiency. The burden crosses multiple domains: chip manufacturers need to deliver far greater performance per watt through advanced semiconductor design, and software developers need to write software that is fundamentally low power by optimizing algorithms and reducing computational overhead.

Along with these basic improvements, memory manufacturers are designing low-power solutions, system manufacturers are building more efficient power-delivery networks, and cloud operators are making their data center operations more efficient while increasing the use of renewable energy sources. As Microsoft Chief Environmental Officer Lucas Joppa said, “We need to think about sustainability not as a constraint, but as an innovative driver that pushes us to build more efficient systems across every layer of the stack of technology.”

However, despite these multifaceted efficiency gains, thermal management remains a significant bottleneck with a deep impact on overall system performance and energy consumption. Ineffective cooling can force processors to throttle their performance, which undermines the very gains delivered by better chips and optimized software. The result is a self-defeating loop in which wasteful thermal management cancels out efficiency gains made elsewhere in the system.
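To make that throttling loop concrete, here is a minimal Python sketch; the temperature threshold, throttle curve, and frequencies are assumptions for illustration, not vendor specifications.

```python
# Illustrative only: assumed thresholds and frequencies, not vendor specifications.
THROTTLE_TEMP_C = 90.0   # assumed junction temperature where throttling begins
BASE_FREQ_GHZ = 3.0      # assumed nominal clock
MIN_FREQ_GHZ = 1.2       # assumed floor clock under sustained thermal stress

def effective_frequency(die_temp_c: float) -> float:
    """Return the clock speed the chip can sustain at a given die temperature."""
    if die_temp_c <= THROTTLE_TEMP_C:
        return BASE_FREQ_GHZ
    # Simple linear throttle curve: lose ~0.1 GHz per degree over the limit.
    return max(BASE_FREQ_GHZ - 0.1 * (die_temp_c - THROTTLE_TEMP_C), MIN_FREQ_GHZ)

# Weak cooling keeps the die hot, so the same silicon does less work per watt.
for temp in (85, 92, 100):
    print(f"{temp} C -> {effective_frequency(temp):.1f} GHz sustained")
```

The exact curve varies by chip, but the shape of the problem is the same: every degree the cooling system cannot remove is paid for in lost clock cycles.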

In this blog post, we will address the cooling aspect of energy consumption, considering how future thermal management technology can act as an efficiency multiplier across the entire computing infrastructure. We will explore how proper cooling strategies not only reduce the direct energy consumed by the cooling components themselves but also enable other parts of the system to operate at their maximum efficiency.

What Is Immersion Cooling?

Immersion cooling submerges servers in carefully engineered, non-conductive fluids (typically dielectric liquids) that transfer heat far more efficiently than air. These liquids are harmless to electronics; in fact, they allow direct liquid-contact cooling with no risk of short-circuiting or corrosion.

Two general types exist:

  • Single-phase immersion, in which the fluid remains liquid and transfers heat by convection.
  • Two-phase immersion, in which the fluid boils at a low temperature, carries heat away as vapor, and condenses back into the bath in a closed loop.
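The reason liquids win comes down to the convective heat-transfer coefficient in Newton’s law of cooling, Q = h · A · ΔT. The sketch below plugs in rough, order-of-magnitude coefficients (assumed textbook-style values, not measurements of any particular fluid) to show how much more heat a dielectric bath can move for the same surface area and temperature rise.

```python
# Rough, order-of-magnitude convective heat-transfer coefficients in W/(m^2*K).
# Assumed illustrative values, not properties of any specific coolant or product.
H_FORCED_AIR = 50.0
H_SINGLE_PHASE_LIQUID = 500.0
H_TWO_PHASE_BOILING = 5000.0

def heat_removed_watts(h: float, area_m2: float, delta_t_c: float) -> float:
    """Newton's law of cooling: Q = h * A * (T_surface - T_coolant)."""
    return h * area_m2 * delta_t_c

AREA_M2 = 0.05    # assumed effective heatsink / cold-plate area
DELTA_T_C = 20.0  # assumed surface-to-coolant temperature difference

for name, h in [("forced air", H_FORCED_AIR),
                ("single-phase immersion", H_SINGLE_PHASE_LIQUID),
                ("two-phase immersion", H_TWO_PHASE_BOILING)]:
    print(f"{name:>22}: ~{heat_removed_watts(h, AREA_M2, DELTA_T_C):,.0f} W removable")
```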

According to Vertiv’s research, in high-density data centers, liquid cooling improves the energy efficiency of IT and facility systems compared to air cooling. In their fully optimized study, the introduction of liquid cooling created a 10.2% reduction in total data center power and a more than 15% improvement in Total Usage Effectiveness (TUE).

Total Usage Effectiveness is calculated using the formula below:

TUE = ITUE × PUE

where ITUE = total energy into the IT equipment / total energy into the compute components, and PUE is Power Usage Effectiveness (total facility energy / total energy into the IT equipment).
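As a quick worked example, the formula can be evaluated directly. The structure of the calculation follows the definition above; the wattages are hypothetical numbers chosen only to illustrate it.

```python
# Hypothetical facility numbers, chosen only to illustrate the TUE formula.
total_facility_power_kw = 1200.0    # everything drawn by the data center
it_equipment_power_kw = 1000.0      # power delivered to the IT equipment
compute_component_power_kw = 850.0  # power that reaches CPUs, GPUs, and memory

pue = total_facility_power_kw / it_equipment_power_kw      # Power Usage Effectiveness
itue = it_equipment_power_kw / compute_component_power_kw  # IT Usage Effectiveness
tue = itue * pue                                           # Total Usage Effectiveness

print(f"PUE = {pue:.2f}, ITUE = {itue:.2f}, TUE = {tue:.2f}")
# The lower the TUE, the more of every watt drawn from the grid ends up doing compute work.
```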

Reimagining Data Centers Underwater
Imagine shipping an entire data center in a steel capsule and sinking it to the ocean floor. That’s no longer sci-fi.

Microsoft’s Project Natick demonstrated the concept by deploying a sealed underwater data center off the Orkney Islands, powered entirely by renewable energy and cooled by the surrounding seawater. Over its two-year lifespan, the submerged facility showed:

  • A server failure rate 1/8th that of land-based centers.
  • No need for on-site human intervention.
  • Efficient, passive cooling by natural sea currents.

Why underwater? Seawater is a vast, open heat sink, and the subsea environment is naturally shielded from temperature fluctuations, dust, vibration, and power surges. Coastal metropolises are among the biggest consumers of cloud services, and most lie within 100 miles of a viable deployment site, so placing compute just offshore would dramatically reduce latency.
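A back-of-the-envelope estimate shows why proximity matters. The sketch below computes round-trip propagation delay over optical fiber, assuming light travels at roughly two-thirds of c in glass; the distances are illustrative, not tied to any specific deployment.

```python
# Round-trip propagation delay over optical fiber, ignoring routing and queuing delays.
SPEED_OF_LIGHT_KM_PER_S = 299_792.0
FIBER_VELOCITY_FACTOR = 0.67  # light travels at roughly 2/3 c in glass fiber

def round_trip_ms(distance_km: float) -> float:
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_PER_S * FIBER_VELOCITY_FACTOR)
    return 2 * one_way_s * 1000

for km in (160, 1600):  # ~100 miles offshore vs. ~1,000 miles inland (illustrative)
    print(f"{km:>5} km: ~{round_trip_ms(km):.1f} ms round trip (propagation only)")
```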

Why This Tech Matters Now

Data centers already account for about 2-3% of the world’s electricity, and with the rapid growth in AI and metaverse workloads, that figure will grow. Generative inference workloads and AI training models consume up to 10x the power per rack of regular server workloads, putting cooling gear and sustainability goals under tremendous pressure. Legacy air cooling technologies are reaching thermal and density thresholds, and immersion cooling is a critical solution for future scalability. According to Submer, a Barcelona-based immersion cooling company, immersion cooling can reduce the energy consumed by cooling systems by up to 95% and enable higher rack density, providing a path to sustainable growth in data centers under AI-driven demand.
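To see what a cut of that size does to facility overhead, here is a minimal calculation. The 95% figure is the claim cited above; the baseline cooling load and the fixed overhead are assumptions chosen only for illustration.

```python
# Assumed baseline: cooling draws 40% as much power as the IT load it protects.
it_load_kw = 1000.0
cooling_kw_air = 0.40 * it_load_kw            # conventional air cooling (assumed)
cooling_kw_immersion = 0.05 * cooling_kw_air  # 95% reduction claimed for immersion

def simple_pue(it_kw: float, cooling_kw: float, other_overhead_kw: float = 50.0) -> float:
    """PUE = total facility power / IT power (overhead covers lighting, UPS losses, etc.)."""
    return (it_kw + cooling_kw + other_overhead_kw) / it_kw

print(f"Air-cooled PUE:       {simple_pue(it_load_kw, cooling_kw_air):.2f}")
print(f"Immersion-cooled PUE: {simple_pue(it_load_kw, cooling_kw_immersion):.2f}")
```

Under these assumptions, PUE drops from roughly 1.45 to about 1.07, which is exactly the kind of headroom that makes denser AI racks feasible without a proportional jump in grid demand.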

Advantages & Challenges

Immersion and submerged data centers possess several key advantages:

  • Sustainability – Lower energy consumption and lower carbon footprints are paramount as ESG (Environmental, Social, Governance) goals become business necessities.
  • Scalability & Efficiency – Immersion allows more density per square foot, reducing real estate and overhead facility expenses.
  • Reliability – Liquid-cooled and underwater systems suffer fewer mechanical failures thanks to reduced thermal stress, fewer moving parts, and less oxidation.
  • Security & Autonomy – Underwater encased pods or autonomous liquid systems are difficult to hack and can be remotely monitored and updated, ideal for zero-trust environments.

While there are advantages to immersion cooling and submerged data centers, there are some challenges and limitations as well:

  • Maintenance and Accessibility Challenges – Both options complicate hardware maintenance. Immersion cooling requires careful removal of components from the dielectric fluid and cleaning before servicing, while underwater data centers offer extremely limited physical access, with entire modules having to be retrieved for repairs, which translates to longer downtimes.
  • High Initial Costs and Deployment Complexity – Construction of immersion tanks or underwater enclosures involves significant capital investment in specially designed equipment, infrastructure, and deployment techniques. Underwater data centers are also accompanied by marine engineering, watertight modules, and intricate site preparation.
  • Environmental and Regulatory Concerns – Both approaches raise environmental issues and compliance questions. Immersion systems must contend with fluid disposal regulations, while underwater data centers require marine environmental impact assessments, permits, and ongoing ecosystem protection measures.
  • Technology Maturity and Operational Risks – These are still young technologies with little long-term data on performance and reliability. Potential problems include coolant leaks in immersion systems and damage or biofouling in underwater installations, making large-scale adoption uncertain.

Industry Momentum

Various companies are leading the charge:

  • GRC (Green Revolution Cooling) and Submer offer immersion cooling solutions to hyperscalers and enterprises.
  • Iceotope delivers precision liquid cooling for HPC workloads.
  • Alibaba, Google, and Meta are testing immersion cooling at scale to support AI and ML clusters.
  • Microsoft’s Project Natick has explored the commercial viability of underwater data centers as off-grid, modular deployments.

Hyperscalers are starting to design entire zones of their new data centers specifically for liquid-cooled GPU pods, while smaller edge data centers are adopting immersion tech to run quietly and efficiently in urban environments.

The Future of Data Centers: Autonomous, Sealed, and Everywhere

Looking ahead, the trend is clear: data centers are becoming more intelligent, compact, and environmentally integrated. We’re entering an era where:

  • AI-based DCIM software predicts and prevents failures in real time (a minimal sketch of the idea appears after this list).
  • Edge nodes with immersion cooling can be located almost anywhere: smart factories, offshore oil rigs.
  • Entire data centers might be built as prefabricated modules and dropped into oceans, deserts, or even space.

The general principle? Compute must not be limited by land, heat, or humans.
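As a minimal illustration of that DCIM idea, the sketch below flags rack inlet temperatures that drift well above their recent norm. The telemetry values, threshold, and function name are invented for the example; a production platform would use far richer models.

```python
import statistics

def flag_anomalies(temps_c: list[float], max_delta_c: float = 3.0) -> list[int]:
    """Return indices of readings more than max_delta_c above the series median.

    Illustrative only: a real DCIM system would model seasonality, workload,
    and sensor drift rather than using a fixed band around the median.
    """
    baseline = statistics.median(temps_c)
    return [i for i, t in enumerate(temps_c) if t - baseline > max_delta_c]

# Invented inlet-temperature telemetry for one rack (degrees C, one reading per minute).
readings = [24.1, 24.3, 24.0, 24.2, 24.4, 24.1, 29.8, 24.3]
print("Suspect readings at indices:", flag_anomalies(readings))  # -> [6]
```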

Final Thoughts

In the fight to enable the digital future, air is a luxury. Immersed in liquid or bolted to the seafloor, data centers are shifting to cool smarter, not harder.

Underwater installations and liquid cooling are no longer out-there ideas; they’re lifelines to a scalable, sustainable web.

So, tomorrow’s “Cloud” won’t be in the sky; it will hum quietly under the sea.

About the Author:
Omkar Bhalekar is a senior network engineer and technology enthusiast specializing in data center architecture, manufacturing infrastructure, and sustainable solutions. With extensive experience in designing resilient industrial networks and building smart factories and AI data centers with scalable networks, Omkar writes to simplify complex technical topics for engineers, researchers, and industry leaders.

7 thoughts on “Liquid Dreams: The Rise of Immersion Cooling and Underwater Data Centers”

  1. Amazon Building Huge, Super Sized AI Data Centers
    A year ago, a 1,200-acre stretch of farmland outside New Carlisle, Ind., was an empty cornfield. Now, seven Amazon data centers rise up from the rich soil, each larger than a football stadium.

    Over the next several years, Amazon plans to build around 30 data centers at the site, packed with hundreds of thousands of specialized computer chips. With hundreds of thousands of miles of fiber connecting every chip and computer together, the entire complex will form one giant machine intended just for artificial intelligence.

    The facility will consume 2.2 gigawatts of electricity — enough to power a million homes. Each year, it will use millions of gallons of water to keep the chips from overheating. And it was built with a single customer in mind: the A.I. start-up Anthropic, which aims to create an A.I. system that matches the human brain.

    The complex — so large that it can be viewed completely only from high in the sky — is the first in a new generation of data centers being built by Amazon, and part of what the company calls Project Rainier, after the mountain that looms near its Seattle headquarters. Project Rainier will also include facilities in Mississippi and possibly other locations, like North Carolina and Pennsylvania.

    Project Rainier is Amazon’s entry into a race by the technology industry to build data centers so large they would have been considered absurd just a few years ago. Meta, which owns Facebook, Instagram and WhatsApp, is building a two-gigawatt data center in Louisiana. OpenAI is erecting a 1.2-gigawatt facility in Texas and another, nearly as large, in the United Arab Emirates.

    These data centers will dwarf most of today’s, which were built before OpenAI’s ChatGPT chatbot inspired the A.I. boom in 2022. The tech industry’s increasingly powerful A.I. technologies require massive networks of specialized computer chips — and hundreds of billions of dollars to build the data centers that house those chips. The result: behemoths that stretch the limits of the electrical grid and change the way the world thinks about computers.

    Amazon, which has invested $8 billion in Anthropic, will rent computing power from the new facility to its start-up partner. An Anthropic co-founder, Tom Brown, who oversees the company’s work with Amazon on its chips and data centers, said having all that computing power in one spot could allow the start-up to train a single A.I. system.
    https://www.nytimes.com/2025/06/24/technology/amazon-ai-data-centers.html?searchResultPosition=1

  2. How might the shift toward immersion cooling and underwater data centers influence the future design of hardware components, especially in terms of materials, form factors, and maintenance requirements?

    1. The shift toward immersion cooling and underwater data centers is driving significant changes in server and facility design, primarily focused on eliminating air-cooling infrastructure to improve energy efficiency, density, and reliability. This fundamental change influences hardware components in terms of materials compatibility, specialized form factors, and revised maintenance requirements.

      Materials

      Hardware component materials must be re-evaluated for long-term compatibility with dielectric fluids to prevent degradation.
      Elimination of Thermal Grease/Pads: Traditional thermal greases often dissolve in dielectric fluids, compromising heat transfer. Alternatives such as graphite pads or malleable metallic sheets (like indium foil) are being used as solid thermal interface materials (TIMs).
      Specialized Plastics and Adhesives: Standard plastics, seals, gaskets, and adhesives found on circuit boards, cables, and connectors may swell, shrink, or chemically break down when exposed to certain coolants. Component manufacturers are developing specialized, immersion-rated materials and conformal coatings to ensure reliability.
      Corrosion Resistance: While dielectric fluids are non-corrosive, underwater data centers require robust, pressure-resistant external housing (e.g., the sealed, nitrogen-filled container in Microsoft’s Project Natick) that can withstand the corrosive and high-pressure ocean environment for extended periods.

      Form Factors

      The removal of air-cooling constraints enables higher-density configurations and different physical layouts.
      Removal of Fans and Heatsinks: Server fans are no longer needed and are removed, along with bulky air-optimized heatsinks. New heatsink designs are optimized for the thermal conductivity and viscosity of dielectric fluids, often featuring different fin architectures to maximize heat dissipation in a liquid medium.
      Increased Density: The superior heat transfer of liquids allows for ultra-high-density compute per square foot. This leads to more compact server designs and the ability to pack more powerful components (like high-TDP AI accelerators) into a smaller space than is possible with air cooling.
      Vertical Orientation: In some open-bath immersion systems, servers may be oriented vertically to optimize fluid dynamics and natural convection within the tank, a departure from traditional horizontal rack-mounting.
      Modular and Sealed Designs (Underwater): Underwater data centers, such as those in the Project Natick experiment, use sealed, autonomous, and modular “pods” or “containers” that can be deployed remotely and operate without human intervention for years, fundamentally changing the traditional “hot/cold aisle” data hall layout.

      Maintenance Requirements

      Maintenance procedures are fundamentally altered, moving away from routine, hands-on server swaps to specialized, condition-based servicing.
      New Workflows and Training: Day-to-day operations require new staff training on safety procedures and specialized tools for handling submerged equipment. Technicians must remove equipment from tanks, allow it to drain, and service it in a separate, controlled environment.
      Reduced Routine Maintenance: The stable, sealed environment of both immersion and underwater systems significantly reduces common issues like dust accumulation, humidity fluctuations, and thermal stress, leading to fewer hardware failures and a longer lifespan for components.
      Fluid Management: New maintenance tasks focus on monitoring, testing, and managing the quality of the dielectric fluid itself (e.g., purity checks, filtration, and replenishment due to minor evaporation losses in two-phase systems).
      Remote and Specialized Servicing (Underwater): For underwater data centers, most maintenance is performed remotely through monitoring and automated updates. Physical servicing often requires specialized marine equipment or retrieving the entire module, making in-situ repairs virtually impossible and necessitating designs built for extreme reliability.

  3. It’s fascinating how immersion cooling in high density data centers improves the energy efficiency of IT and facility systems compared to air cooling. Meanwhile, underwater data centers are no longer science fiction with Microsoft’s Project Natick. The energy efficiency gains and environmental benefits could truly revolutionize the way we handle heat in data centers.

    The move towards innovative cooling solutions like these is both exciting and necessary as this article points out: “over 35-40% of total energy consumption is spent on simply cooling the hardware, rather than running it.”

  4. The carbon footprint of everyday digital activities and AI applications is huge:

    - Streaming an hour of YouTube or Netflix produces 42 grams of CO₂, 500 times more than sending 2 texts to Gemini.
    - A one-hour Zoom call generates 17 grams of CO₂, roughly the same amount as creating a short AI video.
    - AI image generation creates 1 gram of CO₂ per image, about 10 times more than asking ChatGPT a question.

    https://docs.google.com/spreadsheets/u/0/d/e/2PACX-1vR_zrU51G3BGQU4KtOXqQ-fpfLLKnjivkzN13GzWeyIdIAU_vfLxa2bGV4E8a_8pJCp-q2X8mXQPp5R/pubhtml?urp=gmail_link&pli=1

  5. This article effectively highlights the growing thermal challenges in modern data centers, but it should include more specific examples or data to support the claims. Adding a clear comparison between traditional air cooling and immersion cooling will make the argument stronger and more credible. To further improve readability and avoid AI-detection issues, rewrite some sentences with varied structure and a more human tone. Overall, incorporating concrete statistics and a balanced tone will help fix AI-flagged writing while keeping the message compelling.

    1. Eva, the article you’re criticizing was written by Omkar Ashok Bhalekar with Ajay Lotan Thakur and not me! It is quite readable and will NOT be rewritten to suit your frivolous complaints.
      Here are a few examples of immersion cooling data centers:

      Microsoft’s Quincy Datacenter: Microsoft installed a two-phase immersion cooling solution developed with Wiwynn using an engineered fluid from 3M for high-performance computing (HPC) and AI workloads.

      KDDI Micro Data Center: The Japanese telecommunications leader, KDDI, built a micro data center using GIGABYTE’s single-phase immersion cooling solution.

      NorthC Datacenters: NorthC is implementing immersion cooling as an energy-efficient method for AI data centers and has a project in Rotterdam Zestienhoven to reuse the waste heat for district heating systems.

      Shell & Intel Collaboration: Shell and Intel collaborated to develop and validate a full immersion cooling solution for data centers using Intel Xeon-based servers and Shell’s specialized dielectric fluids.
