Muon Space in deal with Hubble Network to deploy world’s first satellite-powered Bluetooth network

Muon Space, a provider of end-to-end space systems specializing in mission-optimized satellite constellations, today announced its most capable satellite platform, MuSat XL, a high-performance 500 kg-class spacecraft designed for the most demanding next-generation low Earth orbit (LEO) missions. Muon also announced its first customer for the XL Platform: Hubble Network, a Seattle-based space-tech pioneer building the world’s first satellite-powered Bluetooth network. IEEE Techblog reported Hubble Network’s first Bluetooth-to-space satellite connection in this post.

The XL Platform adds a dramatically expanded capability tier to the flight-proven Halo™ stack, delivering more power, agility, and integration flexibility while preserving the speed, scalability, and cost-effectiveness needed for constellation deployment. Optimized for Earth observation (EO) and telecommunications missions for commercial and national security customers that require multi-payload operations, extreme data throughput, high-performance inter-satellite networking, and cutting-edge attitude control and pointing, the XL Platform sets a new industry benchmark for mission performance and value. “XL is more than a bigger bus – it’s a true enabler for customers pushing the boundaries of what’s possible in orbit, like Hubble,” said Jonny Dyer, CEO of Muon Space. “Their transformative BLE technology represents the future of space-based services and we are ecstatic to enable their mission with the XL Platform and our Halo stack.”

The Muon Space XL platform combines exceptional payload power, precise pointing, and high-bandwidth networking to enable advanced space capabilities across defense, disaster response, and commercial missions.

Enhancing Global BLE Coverage:

In 2024, Hubble became the first company to establish a Bluetooth connection directly to a satellite, fueling global IoT growth. Using MuSat XL, it will deploy a next-generation BLE payload featuring a phased-array antenna and a receiver 20 times more powerful than its CubeSat predecessor, enabling BLE detection at 30 times lower power and direct connectivity for ultra-low-cost, energy-efficient devices worldwide. MuSat XL’s large payload accommodation, multi-kW power system, and cutting-edge networking and communications capabilities are key enablers for advanced services like Hubble’s.
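Those sensitivity and power figures are easier to compare on a logarithmic scale, as RF engineers usually do. A quick sketch of the conversion (the dB formula is standard; treating the announced 20x and 30x figures as raw power ratios is an assumption on our part):

```python
import math

def ratio_to_db(ratio: float) -> float:
    """Convert a linear power ratio to decibels."""
    return 10 * math.log10(ratio)

# Illustrative figures from the announcement: a receiver 20x more
# powerful and BLE detection at 30x lower device power than the
# CubeSat predecessor (exact link-budget terms are assumptions here).
receiver_gain_db = ratio_to_db(20)   # ~13.0 dB of extra link margin
device_power_db = ratio_to_db(30)    # ~14.8 dB less transmit power

print(f"Receiver improvement: {receiver_gain_db:.1f} dB")
print(f"Device power reduction: {device_power_db:.1f} dB")
```

Roughly 13 dB of receiver improvement is what lets far weaker (and thus cheaper, longer-lived) devices close the link to orbit.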

“Muon’s platform gives us the scale and power to build a true Bluetooth layer around the Earth,” said Alex Haro, Co-Founder and CEO of Hubble Network.

The first two MuSat XL satellites will provide a 12-hour global revisit time, with a scalable design for faster coverage. Hubble’s BLE Finding Network supports critical applications in logistics, infrastructure, defense, and consumer technology.

A Next Generation Multi-Mission Satellite Platform:

MuSat XL is built for operators who need real capability – more power, larger apertures, more flexibility, and more agility – and with the speed to orbit and reliability that Muon has already demonstrated with its other platforms in orbit since 2023. Built on the foundation of Muon’s heritage 200 kg MuSat architecture, MuSat XL is a 500 kg-class bus that extends the Halo technology stack’s performance envelope to enable high-impact, real-time missions.

Key capabilities include:

  • 1 kW+ orbit average payload power – Supporting advanced sensors, phased arrays, and edge computing applications.
  • Seamless, internet-standards based, high bandwidth, low latency communications, and optical crosslink networking – Extremely high volume downlink (>5 TB / day) and near real-time communications for time-sensitive operations critical for defense, disaster response, and dynamic tasking.
  • Flexible onboard interface, network, compute – Muon’s PayloadCore architecture enables rapid hardware/software integration of payloads and deployment of cloud-like workflows to onboard network, storage, and compute.
  • Precise, stable, and agile pointing – Attitude control architected for the rigorous needs of next-generation EO and RF payloads.

In the competitive small satellite market, MuSat XL offers standout advantages in payload volume, power availability, and integration flexibility – making it a versatile backbone for advanced sensors, communications systems, and compute-intensive applications. The platform is built for scale: modular, manufacturable, and fully integrated with Muon’s vertically developed stack, from custom instrument design to full mission operations via the Halo technology stack.

Muon designed MuSat XL to deliver exceptional performance without added complexity. Early adopters like Hubble signal a broader trend in the industry: embracing platforms that offer operational autonomy, speed, and mission longevity at commercial scale.

About Muon Space:

Founded in 2021, Muon Space is an end-to-end space systems company that designs, builds, and operates mission-optimized satellite constellations to deliver critical data and enable real-time compute and decision-making in space. Its proprietary technology stack, Halo™, integrates advanced spacecraft platforms, robust payload integration and management, and a powerful software-defined orchestration layer to enable high-performance capabilities at unprecedented speed – from concept to orbit. With state-of-the-art production facilities in Silicon Valley and a growing track record of commercial and national security customers, Muon Space is redefining how critical Earth intelligence is delivered from space.  Muon Space employs a team of more than 150 engineers and scientists, including industry experts from Skybox, NASA, SpaceX, and others.  SOURCE: Muon Space

About Hubble Network:

Founded in 2021, Hubble is creating the world’s first satellite-powered Bluetooth network, enabling global connectivity without reliance on cellular infrastructure. The Hubble platform makes it easy to transmit low-bandwidth data from any Bluetooth-enabled device, with no infrastructure required. Their global BLE network is live and expanding rapidly, delivering real-time visibility across supply chains, fleets, and facilities.  Visit www.hubble.com for more information.

References:

https://www.muonspace.com/

https://www.prnewswire.com/news-releases/muon-space-unveils-xl-satellite-platform-announces-hubble-network-as-first-customer-302523719.html

https://www.satellitetoday.com/government-military/2025/05/16/muon-space-advances-to-stage-ii-on-nro-contract-for-commercial-electro-optical-imagery/

https://www.satellitetoday.com/manufacturing/2025/06/12/muon-space-expands-series-b-and-buys-propulsion-startup-in-a-bid-to-scale-production/

Hubble Network Makes Earth-to-Space Bluetooth Satellite Connection; Life360 Global Location Tracking Network


Emerging Cybersecurity Risks in Modern Manufacturing Factory Networks

By Omkar Ashok Bhalekar with Ajay Lotan Thakur

Introduction

With the advent of new Industry 5.0 standards and ongoing advancements in Industry 4.0, the manufacturing landscape faces a revolutionary challenge: it must not only use environmental resources sustainably but also constantly adapt industrial security postures to tackle modern threats. Technologies such as the Internet of Things (IoT) in manufacturing, private 4G/5G, cloud-hosted applications, edge computing, and real-time streaming telemetry are fueling smart factories and making them more productive.

Although this evolution facilitates industrial automation, innovation, and high productivity, it also greatly expands the exposure footprint for cyberattacks. Industrial cybersecurity is quintessential for mission-critical manufacturing operations; it is a key cornerstone for safeguarding factories and avoiding major downtime.

With the rapid amalgamation of IT and OT (Operational Technology), a hack or data breach can cause operational disruptions such as line-down situations, halted production lines, theft or loss of critical data, and huge financial damage to an organization.

Industrial Networking

Why does modern manufacturing demand cybersecurity? A few of the reasons:

  • Convergence of IT and OT: Industrial control systems (ICS) which used to be isolated or air-gapped are now all inter-connected and hence vulnerable to breaches.
  • Enlarged Attack Surface: Every device or component in the factory which is on the network is susceptible to threats and attacks.
  • Financial Loss: Cyberattacks such as WannaCry or targeted Blue Screen of Death (BSOD) attacks can cost millions of dollars per minute and result in a complete shutdown of operations.
  • Disruptions in Logistics Network: Supply chain can be greatly disarrayed due to hacks or cyberattacks causing essential parts shortage.
  • Legislative Compliance: Strict laws and regulations such as CISA directives, NIST, and ISA/IEC 62443 are proving crucial, mandating frameworks to safeguard industries.

It is important to understand and adapt to changing trends in the cybersecurity domain, especially with so much at stake. History shows that mankind learns from past mistakes while continuing to advance at a fast pace; external threats should not be allowed to limit that advancement, but they must be taken into cognizance.

This attitude of adaptability needs to become an integral part of the mindset and practices of cybersecurity, and it should not be limited to industrial security; such practices can scale across other technological fields. Moreover, securing industries is not just about physical security: it also opens avenues for cybersecurity experts to learn and innovate in applications and software such as Manufacturing Execution Systems (MES), which are crucial for critical operations.

The Greatest Cyberattacks in Manufacturing of All Time:

Familiarizing ourselves with the different categories of attacks, and the scale at which they have historically hampered the manufacturing domain, is pivotal. This section highlights some real-world cybersecurity incidents.

Ransomware (Colonial Pipeline, 2021):

This attack brought the U.S. East Coast to a standstill through an extreme shortage of fuel and gasoline after attackers compromised employee credentials.

Cause: The root cause was compromised VPN account credentials. A VPN account that had not been used for a long time and lacked Multi-Factor Authentication (MFA) was breached; its credentials were part of a password leak on the dark web. The ransomware group “DarkSide” exploited this entry point to gain access to Colonial Pipeline’s IT systems. They did not initially penetrate operational technology systems; however, the interdependence of IT and OT systems caused operational impacts. Once inside, attackers escalated privileges and exfiltrated 100 GB of data within 2 hours. Ransomware was deployed to encrypt critical business systems, and Colonial Pipeline proactively shut down the pipeline fearing lateral movement into OT networks.
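The root cause here, a dormant account without MFA, is exactly the kind of thing a periodic audit can catch before an attacker does. A minimal sketch of such an audit pass over hypothetical account records (the field names and data are illustrative, not from any real VPN product’s API):

```python
from datetime import date, timedelta

# Hypothetical account records; field names are illustrative only.
accounts = [
    {"user": "alice", "mfa_enabled": True,  "last_login": date(2025, 6, 1)},
    {"user": "bob",   "mfa_enabled": False, "last_login": date(2024, 1, 15)},
]

def flag_risky_accounts(accounts, today, stale_after_days=90):
    """Flag accounts that lack MFA, are dormant, or both -- the same
    combination exploited in the Colonial Pipeline breach."""
    stale_cutoff = today - timedelta(days=stale_after_days)
    risky = []
    for acct in accounts:
        reasons = []
        if not acct["mfa_enabled"]:
            reasons.append("no MFA")
        if acct["last_login"] < stale_cutoff:
            reasons.append("dormant")
        if reasons:
            risky.append((acct["user"], reasons))
    return risky

print(flag_risky_accounts(accounts, today=date(2025, 7, 1)))
```

In practice the account list would come from the identity provider or VPN concentrator; the point is that the check itself is trivial once the inventory exists.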

Effect: The pipeline, which supplies nearly 45% of the fuel to the U.S. East Coast, was shut down for 6 days. Mass fuel shortages occurred across several U.S. states, leading to public panic and fuel hoarding. Colonial Pipeline paid a $4.4 million ransom, of which approximately $2.3 million was later recovered by the FBI. The incident led to a Presidential Executive Order on Cybersecurity and heightened regulations around critical infrastructure, and it exposed how business IT network vulnerabilities can produce real-world critical infrastructure impacts even without OT being directly targeted.

Industrial Sabotage (Stuxnet, discovered 2010):

This unprecedented software worm hijacked an entire critical facility and sabotaged its machines, rendering them defunct.

Cause: Nation-state-developed malware specifically targeting Industrial Control Systems (ICS), with an unprecedented level of sophistication. Stuxnet was developed jointly by the U.S. (NSA) and Israel (Unit 8200) under operation “Olympic Games”. The target was Iran’s uranium enrichment program at the Natanz Nuclear Facility. The worm was introduced via USB drives into the air-gapped network and exploited four zero-day vulnerabilities in Windows, an unprecedented number at the time. It specifically targeted Siemens Step7 software running on Windows, which programs Siemens S7-300 PLCs. Stuxnet identified systems controlling centrifuges used for uranium enrichment and reprogrammed the PLCs to intermittently change the centrifuges’ rotational speed, causing mechanical stress and failure, while reporting normal operations to operators. It used rootkits at both the Windows and PLC level to remain stealthy.

Effect: Destroyed approximately 1,000 IR-1 centrifuges (~10% of Iran’s nuclear capability). Set back Iran’s nuclear program by 1-2 years. Introduced a new era of cyberwarfare, where malware caused physical destruction. Raised global awareness about the vulnerabilities in industrial control systems (ICS). Iran responded by accelerating its cyber capabilities, forming the Iranian Cyber Army. ICS/SCADA security became a top global priority, especially in energy and defense sectors.

Update spoofing (SolarWinds Orion supply chain attack, 2020):

Attackers injected malicious code into legitimate software updates, compromising thousands of organizations that installed them.

Cause: Compromise of the SolarWinds build environment, leading to a supply chain attack. Attackers known as Cozy Bear, linked to Russia’s foreign intelligence service, gained access to SolarWinds’ development pipeline. Malicious code was inserted into Orion Platform updates released between March and June 2020. Customers who downloaded the updates installed malware known as SUNBURST, which created a backdoor in Orion’s signed DLLs. Over 18,000 customers were potentially affected, including about 100 high-value targets. After the exploit, attackers used manual lateral movement, privilege escalation, and custom C2 (command-and-control) infrastructure to exfiltrate data.

Effect: The breach included major U.S. government agencies: DHS, DoE, DoJ, Treasury, the State Department, and more. Top corporations were also affected, including Cisco, Intel, Microsoft, and FireEye; FireEye discovered the breach after noticing unusual two-factor authentication activity. The attack exposed critical supply chain vulnerabilities and demonstrated how a single point of compromise could lead to nationwide espionage. It prompted Cybersecurity Executive Order 14028, Zero Trust mandates, and widespread adoption of Software Bill of Materials (SBOM) practices.

Spyware (Pegasus, 2016-2021):

Cause: Zero-click and zero-day exploits leveraged by NSO Group’s Pegasus spyware, sold to governments. Pegasus can infect phones without any user interaction (so-called zero-click exploits). It gains access through vulnerabilities in WhatsApp, iMessage, or browsers like Safari on iOS, as well as zero-day flaws on Android devices, and is delivered via SMS, WhatsApp messages, or silent push notifications. Once installed, it provides complete surveillance capability: access to microphone, camera, GPS, calls, photos, texts, and encrypted apps. The zero-click iOS exploit ForcedEntry allowed complete compromise of an iPhone, bypassing Apple’s BlastDoor sandbox and Android’s hardened security modules. The malware is extremely stealthy, often removing itself after execution.

Effect: Used by multiple governments to surveil activists, journalists, lawyers, opposition leaders, and even heads of state. The 2021 Pegasus Project, led by Amnesty International and Forbidden Stories, revealed a leaked list of 50,000 potential targets. Phones of high-profile individuals, including international journalists, the French president, and Indian opposition figures, were allegedly targeted, triggering legal and political fallout. NSO Group was blacklisted by the U.S. Department of Commerce, Apple filed a lawsuit against NSO Group in 2021, and debates over the ethics and regulation of commercial spyware were renewed.

Other common types of attacks:

Phishing and Smishing: These attacks send links or emails that appear legitimate but are crafted by bad actors for financial gain or identity theft.

Social Engineering: Shoulder surfing may sound funny, but it is a tale as old as time in which even expert security personnel have been outsmarted into data or credential leaks. Rather than relying on technical vulnerabilities, this attack targets human psychology to gain access or break into systems. The attacker manipulates people into revealing confidential information using techniques such as reconnaissance, engagement, baiting, or offering quid pro quo services.

Security Runbook for Manufacturing Industries:

To ensure ongoing enhancements to industrial security postures and preserve critical manufacturing operations, the following 11 security procedures and tactics, based on established frameworks, provide 360-degree protection:

A. Incident Handling Tactics (First Line of Defense) Teams should continuously improve incident response with the help of documentation and response tooling. Coordination between teams, communications, root cause analysis, and reference documentation are key to successful incident response.

B. Zero Trust Principles (Never Trust, Always Verify) Use strong device management tools to ensure all end devices are in compliance, with trusted certificates, NAC, and enforcement policies. Perform regular and random checks on users’ access patterns, and assign role-based policies limiting access to critical resources.
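The zero-trust decision above can be sketched as a default-deny check on both user role and device posture; nothing is granted unless a policy explicitly allows it. Resource names, roles, and policy fields here are illustrative assumptions, not from any specific NAC product:

```python
# Minimal zero-trust admission sketch: every request is evaluated
# against both role and device posture, with default deny.
POLICY = {
    "mes_server": {"roles": {"operator", "admin"}, "require_compliant": True},
    "guest_wifi": {"roles": {"operator", "admin", "guest"}, "require_compliant": False},
}

def admit(resource: str, role: str, device_compliant: bool) -> bool:
    rule = POLICY.get(resource)
    if rule is None:
        return False  # default deny: unknown resources are never granted
    if role not in rule["roles"]:
        return False  # role not authorized for this resource
    if rule["require_compliant"] and not device_compliant:
        return False  # posture check failed (e.g. missing cert, NAC failure)
    return True

print(admit("mes_server", "admin", device_compliant=True))   # allowed
print(admit("mes_server", "admin", device_compliant=False))  # denied: posture
```

The essential design choice is that denial is the default path; every grant must be traceable to an explicit policy rule.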

C. Secure Communication and Data Protection Use endpoint- or cloud-based security sessions with IPsec VPN tunnels so that all traffic can be controlled and monitored. All user data must be encrypted using data protection and recovery software such as BitLocker.

D. Secure IT Infrastructure Harden network equipment such as switches, routers, and WAPs with 802.1X, port security, and EAP-TLS or PEAP. Implement edge-based monitoring solutions to detect anomalies, and redundant network infrastructure to minimize MTTR.

E. Physical Security Locks, badge readers, or biometric systems for all critical rooms and network cabinets are a must. A security operations center (SOC) can help monitor internal theft or sabotage incidents.

F. North-South and East-West Traffic Isolation Safety traffic and external traffic can be rate-limited using firewalls or edge compute devices. 100% isolation is wishful thinking, so measures need to be taken to constantly monitor for security punch-holes.

G. Industrial Hardware for Industrial Applications Use IP67- or IP68-rated industrial-grade network equipment to avoid breakdowns due to environmental factors. Localized industrial firewalls can provide the desired granularity at the edge, avoiding the need to follow the full Purdue model.

H. Next-Generation Firewalls with Application-Level Visibility Incorporate stateful, application-aware firewalls, which provide more control over zones and policies and can differentiate applications’ behavioral characteristics. Deploy tools that perform deep packet inspection and function as intrusion prevention/detection platforms (IPS/IDS).

I. Threat and Traffic Analyzers Network traffic analyzers can provide Layer 1 through Layer 7 security monitoring by detecting and responding to malicious traffic patterns. Self-healing networks built on automation and monitoring tools can detect traffic anomalies and rectify non-compliance automatically.
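As a minimal illustration of the anomaly detection described above, the sketch below flags telemetry samples that deviate sharply from a trailing baseline. Real analyzers use far richer models; the packets-per-second data, window size, and threshold here are synthetic assumptions:

```python
import statistics

def anomalies(samples, window=5, threshold=3.0):
    """Flag indices whose value deviates from the trailing window's mean
    by more than `threshold` standard deviations -- a deliberately
    simple baseline-deviation heuristic for network telemetry."""
    flagged = []
    for i in range(window, len(samples)):
        hist = samples[i - window:i]
        mu = statistics.mean(hist)
        sigma = statistics.pstdev(hist) or 1e-9  # avoid division by zero
        if abs(samples[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# Synthetic packets/sec telemetry with a sudden spike at index 7.
pps = [100, 102, 98, 101, 99, 100, 103, 900, 101, 100]
print(anomalies(pps))  # the spike stands out against the quiet baseline
```

A self-healing loop would feed such flags into an automated response, for example quarantining the offending port, rather than just logging them.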

J. Information Security and Software Management Companies must maintain a repository of trusted certificates, software, and releases, and push regular patches for critical bugs. Keep constant track of release notes and CVEs (Common Vulnerabilities and Exposures) for all vendor software.
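Tracking an inventory against vendor advisories can be partially automated. A minimal sketch, using made-up component names and version numbers, that flags anything still running a pre-fix release (real workflows would pull advisory data from vendor release notes or the NVD feed):

```python
# Compare installed firmware versions against an internally maintained
# advisory list. All names and versions below are illustrative.
def parse_version(v: str) -> tuple:
    """Turn '15.2.7' into (15, 2, 7) so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))

advisories = {
    # component: first release that ships the fix
    "edge-router-os": "15.2.7",
    "plc-firmware": "4.1.0",
}

inventory = {
    "edge-router-os": "15.2.4",
    "plc-firmware": "4.1.3",
}

needs_patching = [
    name for name, installed in inventory.items()
    if parse_version(installed) < parse_version(advisories[name])
]
print(needs_patching)  # components still on a vulnerable release
```

Tuple comparison is used deliberately: naive string comparison would rank "15.2.10" below "15.2.9".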

K. Idiot-Proofing (How NOT to Get Hacked) Regular training that familiarizes employees with cyberattacks and jargon like cryptojacking or honeynets helps create awareness. Encourage and provide a platform for employees and workers to voice their opinions and resolve their queries regarding security threats.

Current Industry Perspective and Software Response

In response to the escalating tide of cyberattacks in manufacturing, from the Triton malware striking industrial safety controls to LockerGoga shutting down production at Norsk Hydro, there has been a sea change in how the software industry supports operational resilience. Security companies are combining cutting-edge threat detection with ICS/SCADA systems, delivering purpose-built solutions like zero-trust network access, behavior-based anomaly detection, and encrypted machine-to-machine communications. Companies such as Siemens and Claroty are leading the way, treating security as a design principle rather than an afterthought. A prime example is Dragos’s OT-specific threat intelligence and incident response offerings, which have become focal points in the fight against nation-state attacks and ransomware operations targeting critical infrastructure.

Bridging the Divide between IT and OT: Two way street

With the intensification of OT and IT convergence, perimeter-based defense is no longer sufficient. Manufacturers are embracing emerging strategies such as Cybersecurity Mesh Architecture (CSMA) and applying IT-centric philosophies such as DevSecOps within the OT environment to foster secure-by-default deployment habits. The trend also brings attention to IEC 62443 conformity and NIST-based risk assessment frameworks tailored to manufacturing. With legacy PLCs now networked and exposed to internet-borne threats, companies are embracing micro-segmentation, secure remote access, and real-time monitoring solutions that unify security across both environments. Schneider Electric, for example, is empowering manufacturers to securely link IT/OT systems with scalable cybersecurity programs.

Conclusion

In a nutshell, modern manufacturing, contrary to the past, is not just about quick-in, quick-out systems that scale and produce; it is an ecosystem in which cybersecurity and manufacturing harmonize. Just as the healthcare system is considered critical to humans, secure factories are quintessential to manufacturing. Thirty years of cyberattacks on critical infrastructure such as pipelines, nuclear plants, and power grids not only warrant the world’s attention but also call for regulatory standards that every entity in manufacturing must follow.

As mankind keeps sprinting toward the next industrial revolution, it is an absolute exigency to make industrial cybersecurity a keystone of upcoming critical manufacturing facilities and a strong foundation for operational excellence. Now is the time to invest in industrial security; the leaders who choose to be “Cyberfacturers” will survive to tell the tale, and the rest may serve as stark reminders of what happens when pace outruns security.

About Author:

Omkar Bhalekar is a senior network engineer and technology enthusiast specializing in data center architecture, manufacturing infrastructure, and sustainable solutions, with extensive experience designing resilient industrial networks and building smart factories and AI data centers with scalable networks. He is also the author of the book Autonomous and Predictive Networks: The Future of Networking in the Age of AI and co-author of Quantum Ops: Bridging Quantum Computing & IT Operations. Omkar writes to simplify complex technical topics for engineers, researchers, and industry leaders.

Countdown to Q-day: How modern-day Quantum and AI collusion could lead to The Death of Encryption

By Omkar Ashok Bhalekar with Ajay Lotan Thakur

Behind the quiet corridors of research laboratories and the whir of supercomputer data centers, a stealth revolution is gathering force, one with the potential to reshape the very building blocks of cybersecurity. At its heart are qubits, the building blocks of quantum computing, and the accelerant force of generative AI. Combined, they form a double-edged sword capable of breaking today’s encryption and opening the door to an era of both vast opportunity and unprecedented danger.

Modern Cryptography is Fragile

Modern-day computer security relies on the assumed intractability of certain mathematical problems. RSA encryption, introduced in 1977 by Rivest, Shamir, and Adleman, relies on the principle that factoring a 2048-bit number into primes is computationally infeasible for classical computers (RSA paper, 1978). Diffie-Hellman key exchange, described by Whitfield Diffie and Martin Hellman in 1976, offers secure key exchange over an insecure channel based on the discrete logarithm problem (Diffie-Hellman paper, 1976). Elliptic-Curve Cryptography (ECC), described independently in 1985 by Victor Miller and Neal Koblitz, relies on the hardness of elliptic curve discrete logarithms and resists brute-force attacks with smaller key sizes for the same level of security (Koblitz ECC paper, 1987).
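The arithmetic behind RSA can be shown end to end with toy-sized primes. This is the textbook construction scaled down so the numbers are visible; real RSA uses 2048-bit moduli precisely because factoring them is infeasible, whereas factoring this toy modulus is trivial:

```python
# Toy RSA with tiny primes, purely to illustrate the math.
p, q = 61, 53            # secret primes (trivially small here)
n = p * q                # public modulus: 3233
phi = (p - 1) * (q - 1)  # Euler's totient: 3120
e = 17                   # public exponent, coprime with phi
d = pow(e, -1, phi)      # private exponent: modular inverse of e mod phi

msg = 65
ciphertext = pow(msg, e, n)          # encrypt with the public key (e, n)
decrypted = pow(ciphertext, d, n)    # decrypt with the private key (d, n)
print(decrypted)  # recovers 65; anyone who factors n can recompute d
```

The whole scheme stands or falls on one fact: recovering d requires phi, and phi requires the factors of n. Shor’s Algorithm attacks exactly that step.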

But quantum computing flips the script. Thanks to Shor’s Algorithm, a sufficiently powerful quantum computer could factor large numbers exponentially faster than classical computers, rendering RSA and ECC utterly useless. Meanwhile, Grover’s Algorithm gives attackers a quadratic speedup against symmetric-key systems like AES.

What would take classical computers centuries or millennia, quantum computers could boil down to days or even hours at the right scale. In fact, experts reckon that cracking RSA-2048 using Shor’s Algorithm could take just 20 million physical qubits, a number whose estimates shrink each year.

Generative AI adds fuel to the fire

While quantum computing threatens to undermine encryption itself, generative AI is playing an equally insidious and no less revolutionary role. By mass-producing artifacts such as malware, phishing emails, and synthetic identities, generative AI models (large language models and diffusion-based image synthesizers, for example) are lowering the bar for sophisticated cyberattacks.

Even worse, generative AI can be applied to model and probe vulnerabilities in cryptographic implementations, including post-quantum cryptography. It can help train reinforcement-learning agents that optimize side-channel attacks or profile quantum circuits to uncover new behaviors.

With quantum computing on the horizon, generative AI is both a sophisticated research tool and a weaponization risk. On one hand, security researchers use generative AI to produce, examine, and predict vulnerabilities in cryptographic systems, informing the development of post-quantum-resistant algorithms. On the other, malicious actors exploit its ability to automate the production of complex attack vectors, advanced malware, phishing campaigns, and synthetic identities, radically lowering the barrier to high-impact cyberattacks. This dual use shortens the timeline for adversaries to exploit breached or transitional cryptographic infrastructure, practically closing the window of opportunity for defenders to deploy effective quantum-safe security solutions.

Real-World Implications

The impact of broken cryptography is real, and it puts the foundations of everyday life at risk:

1. Online Banking (TLS/HTTPS)

When you visit your bank’s website, the “https” in the address bar signifies encrypted communication over TLS (Transport Layer Security). Most TLS implementations rely on RSA or ECC keys to securely exchange session keys. A quantum attack on those exchanges would let an attacker recover the session keys and read internet traffic, including sensitive banking data.

2. Cryptocurrencies

Bitcoin, Ethereum, and other cryptocurrencies use ECDSA (Elliptic Curve Digital Signature Algorithm) for signing transactions. If quantum computers can crack ECDSA, a hacker could forge signatures and steal digital assets. Scientists have already performed simulations showing that a quantum computer might extract private keys from public blockchain data, enabling theft or the rewriting of transaction history.

3. Government Secrets and Intelligence Archives

National security agencies worldwide rely heavily on encryption algorithms such as RSA and AES to protect sensitive information, including secret messages, intelligence briefs, and critical infrastructure data. Of these, AES-256 remains secure even in the presence of quantum computing: as a symmetric-key cipher it enjoys quantum resistance because Grover’s algorithm yields only a quadratic speedup against it, so brute-force attacks remain gigantic in terms of resources and time. Conversely, asymmetric cryptographic algorithms like RSA and ECC, which underpin the majority of public-key infrastructure, are fundamentally vulnerable to quantum attacks that can solve the hard mathematical problems they rely on for security.
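Grover’s quadratic speedup is exactly why key sizes matter here: searching N keys takes on the order of sqrt(N) quantum operations, so a k-bit symmetric key retains roughly k/2 bits of post-quantum security. A short sketch of that arithmetic:

```python
import math

def post_quantum_security_bits(key_bits: int) -> float:
    """Effective security of a symmetric key against Grover's algorithm:
    searching a 2**k keyspace takes ~sqrt(2**k) = 2**(k/2) operations."""
    keyspace = 2 ** key_bits
    grover_ops = math.isqrt(keyspace)   # ~sqrt(N) Grover iterations
    return math.log2(grover_ops)

print(post_quantum_security_bits(128))  # 64.0  -> uncomfortable margin
print(post_quantum_security_bits(256))  # 128.0 -> still considered safe
```

This is why AES-256 survives the quantum transition while RSA and ECC, whose hard problems Shor’s Algorithm solves outright rather than merely speeding up, do not.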

Such a disparity opens a huge security gap. Information that is in excellent safekeeping today may not be once sufficiently powerful quantum computers become accessible, a scenario sometimes referred to as the “harvest now, decrypt later” threat. Intelligence agencies and adversaries alike could be quietly hoarding encrypted communications, confident that quantum technology will soon be able to decrypt the stockpile. The Snowden disclosures placed this threat in the limelight by revealing that the NSA captures and keeps vast amounts of global internet traffic, including diplomatic cables, military orders, and personal communications. These repositories of encrypted data, unreadable as they stand now, are an unseen vulnerability: when Q-Day, the arrival of practical quantum computers that can defeat RSA and ECC, comes around, the confidentiality of decades’ worth of sensitive communications could be irretrievably lost.

Such a compromise would have apocalyptic consequences for national security and geopolitical stability, exposing classified negotiations, intelligence operations, and war plans to adversaries. Such a specter has compelled governments and security entities to accelerate the transition to post-quantum cryptography standards and explore quantum-resistant encryption schemes in an effort to safeguard the confidentiality and integrity of information in the era of quantum computing.

Arms Race Toward Post-Quantum Cryptography

In response, organizations like NIST are leading the development of post-quantum cryptographic standards, selecting algorithms believed to be quantum resistant. But migration is glacial. Retrofitting billions of devices and services with new cryptographic foundations is a logistical nightmare: not merely software updates, but hardware upgrades, re-certifications, interoperability testing, and compatibility testing with worldwide networks and critical infrastructure systems, all while minimizing downtime and security vulnerabilities.

Building a quantum computer large enough to factor RSA-2048 is an enormous task; it is estimated to require millions of physical qubits with very low error rates. Today’s high-end quantum machines have fewer than 100 operational qubits, and their error rates are too high to support complicated computations over long periods. However, with continued development of quantum error correction, materials research, and qubit coherence times, specialists warn that effective quantum decryption capability may appear sooner than most organizations are prepared for.

This transition window, when old and new environments coexist, is where the danger is greatest. Attackers can use generative AI to hunt for hybrid environments in which legacy encryption is still employed: automating the identification of outdated crypto implementations, producing targeted exploits en masse, and choreographing multi-step attacks that overwhelm conventional security monitoring and patching mechanisms.

Preparing for the Convergence

To defend against this coming storm, security strategy must evolve:

  • Inventory Cryptographic Assets: Take stock of where and how encryption is used across the environment.
  • Adopt Crypto-Agility: Design systems so they can switch between encryption algorithms without a full redesign.
  • Stress-Test Against Quantum Threats: Use AI tools to stress-test encryption schemes against quantum-style attacks.

  • Adopt PQC and Zero-Trust Models: Shift toward quantum-resistant cryptography and architectures that treat breach as the default state.
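The crypto-agility point above can be made concrete with a thin dispatch layer. A minimal Python sketch (hypothetical names; standard hash functions stand in here for real cipher or PQC suites):

```python
import hashlib

# Hypothetical crypto-agility sketch: callers select an algorithm by policy
# name rather than hard-coding it, so a future PQC migration becomes a
# one-line configuration change instead of a full redesign.
ALGORITHMS = {
    "legacy-sha1": lambda data: hashlib.sha1(data).hexdigest(),
    "modern-sha3": lambda data: hashlib.sha3_256(data).hexdigest(),
    # a post-quantum scheme would register here the same way
}

POLICY = {"digest": "modern-sha3"}  # swap algorithms without touching callers

def digest(data: bytes) -> str:
    """Compute a digest using whatever algorithm current policy selects."""
    return ALGORITHMS[POLICY["digest"]](data)
```

Re-pointing `POLICY["digest"]` at a newly registered quantum-resistant entry migrates every caller at once, which is the property crypto-agility aims for.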

In Summary

Quantum computing is not only a looming threat; it is a countdown to a new cryptographic arms race. Generative AI has already reshaped the cyber threat landscape, and in conjunction with quantum power it becomes a force multiplier. This is a two-front challenge that requires more than incremental adjustment; it requires a change of cybersecurity paradigm.

Panic will not help us. Preparation will.

Abbreviations

RSA – Rivest, Shamir, and Adleman
ECC – Elliptic-Curve Cryptography
AES – Advanced Encryption Standard
TLS – Transport Layer Security
HTTPS – Hypertext Transfer Protocol Secure
ECDSA – Elliptic Curve Digital Signature Algorithm
NSA – National Security Agency
NIST – National Institute of Standards and Technology
PQC – Post-Quantum Cryptography

References

Google’s Gemini was used in this post to paraphrase some sentences and add context.

About Author:

Omkar Bhalekar is a senior network engineer and technology enthusiast specializing in data center architecture, manufacturing infrastructure, and sustainable solutions, with extensive experience in designing resilient industrial networks and building smart factories and AI data centers with scalable networks. He is also the author of the book Autonomous and Predictive Networks: The Future of Networking in the Age of AI and co-author of Quantum Ops – Bridging Quantum Computing & IT Operations. Omkar writes to simplify complex technical topics for engineers, researchers, and industry leaders.

Liquid Dreams: The Rise of Immersion Cooling and Underwater Data Centers

By Omkar Ashok Bhalekar with Ajay Lotan Thakur

As demand for data keeps rising, driven by generative AI, real-time analytics, 8K streaming, and edge computing, data centers are facing an escalating dilemma: how to maintain performance without overheating. Traditional air-cooled server rooms that were once adequate for straightforward web hosting and storage are being pushed to their thermal limits by modern compute-intensive workloads. While the world’s digital backbone runs hot, innovators are diving deep, all the way to the ocean floor. Say hello to immersion cooling and undersea data centers, two technologies poised to revolutionize how the world stores and processes data.

Heat Is the Silent Killer of the Internet – In every data center, heat is the unobtrusive enemy. When racks of high-performance GPUs, CPUs, and ASICs operate simultaneously, they generate massive amounts of heat. The old approach of gigantic HVAC systems and chilled-air manifolds is reaching its technological and environmental limits.

In the majority of installations, 35-40% of total energy consumption goes to simply cooling the hardware rather than running it. As model sizes and inference loads explode (think ChatGPT, DALL·E, or Tesla FSD), traditional cooling infrastructure simply isn’t up to the task without costly upgrades or environmental degradation. Hence the paradigm shift.

Liquid cooling is not an option everywhere, due to gaps in infrastructure, cost, and geography, so every player in the ecosystem must still raise the bar on energy efficiency. The burden crosses multiple domains: chip manufacturers need to deliver far greater performance per watt through advanced semiconductor design, and software developers need to write software that is fundamentally low power by optimizing algorithms and reducing computational overhead.

Along with these basic improvements, memory manufacturers are designing low-power solutions, system manufacturers are making more power-efficient delivery networks, and cloud operators are making their data center operations more efficient while increasing the use of renewable energy sources. As Microsoft Chief Environmental Officer Lucas Joppa said, “We need to think about sustainability not as a constraint, but as an innovative driver that pushes us to build more efficient systems across every layer of the stack of technology.”

However, despite these multifaceted efficiency gains, thermal management remains a significant bottleneck with a profound impact on overall system performance and energy consumption. Ineffective cooling can force processors to throttle, undoing the gains from better chips and optimized software. The result is a self-defeating loop in which wasteful thermal management cancels out efficiency gains elsewhere in the system.

In this blog post, we address the cooling side of energy consumption, considering how future thermal-management technology can be an efficiency multiplier across the entire computing infrastructure. We explore how proper cooling strategies not only reduce the direct energy consumed by the cooling components themselves but also let other parts of the system operate at peak efficiency.

What Is Immersion Cooling?

Immersion cooling cools servers by submerging them in carefully designed, non-conductive fluids (typically dielectric liquids) that transfer heat much more efficiently than air. Immersion liquids are harmless to electronics; in fact, they allow direct liquid contact cooling with no risk of short-circuiting or corrosion.

Two general types exist:

  • Single-phase immersion, in which the fluid remains liquid and transfers heat by convection.
  • Two-phase immersion, in which the fluid boils at a low temperature, carrying heat away as vapor that condenses back to liquid in a closed loop.

According to Vertiv’s research, in high-density data centers, liquid cooling improves the energy efficiency of IT and facility systems compared to air cooling. In their fully optimized study, the introduction of liquid cooling created a 10.2% reduction in total data center power and a more than 15% improvement in Total Usage Effectiveness (TUE).

Total Usage Effectiveness is calculated by using the formula below:

TUE = ITUE × PUE, where ITUE = Total Energy Into the IT Equipment / Total Energy Into the Compute Components, and PUE = Power Usage Effectiveness.
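Plugging illustrative numbers into this formula shows how lower facility overhead flows through to TUE (the energy figures below are assumptions for the sake of the arithmetic, not Vertiv’s measurements):

```python
# Illustrative TUE calculation using the definitions above.
it_equipment_energy = 1_000.0   # kWh delivered to IT equipment (assumed)
compute_energy = 900.0          # kWh reaching the compute components (assumed)
facility_energy = 1_300.0       # kWh drawn by the whole facility (assumed)

itue = it_equipment_energy / compute_energy   # IT-internal overhead (fans, PSUs)
pue = facility_energy / it_equipment_energy   # facility overhead (cooling, power)
tue = itue * pue
print(round(tue, 2))  # prints 1.44
```

Liquid cooling improves both factors at once: less fan energy inside the servers lowers ITUE, and less chiller energy in the facility lowers PUE.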

Reimagining Data Centers Underwater
Imagine shipping an entire data center in a steel capsule and sinking it to the ocean floor. That’s no longer sci-fi.

Microsoft’s Project Natick demonstrated the concept by deploying a sealed underwater data center off the Orkney Islands, powered entirely by renewable energy and cooled by the surrounding seawater. Over its two-year lifespan, the submerged facility showed:

  • A server failure rate 1/8th that of land-based centers.
  • No need for on-site human intervention.
  • Efficient, passive cooling by natural sea currents.

Why underwater? Seawater is a vast natural heat sink, and underwater environments are less prone to temperature fluctuations, dust, vibration, and power surges. Coastal metropolises, among the biggest consumers of cloud services, are typically within 100 miles of a viable deployment site, which would dramatically reduce latency.
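The latency benefit of nearby deployment is easy to quantify. A rough sketch, assuming signals propagate through fiber at about two-thirds the speed of light in vacuum:

```python
# Rough one-way fiber latency for a deployment ~100 miles offshore.
distance_km = 160                # ~100 miles
speed_in_fiber_km_s = 200_000    # light in fiber ≈ 2/3 of c in vacuum
one_way_ms = distance_km / speed_in_fiber_km_s * 1000
print(round(one_way_ms, 2))  # prints 0.8
```

At well under a millisecond one way, the propagation delay to a coastal pod is negligible compared with a round trip to a distant inland region.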

Why This Tech Matters Now

Data centers already account for about 2–3% of the world’s electricity, and with the rapid growth in AI and metaverse workloads, that figure will grow. Generative inference workloads and AI training models consume up to 10x the power per rack of regular server workloads, putting cooling gear and sustainability goals under tremendous pressure. Legacy air cooling is reaching thermal and density thresholds, making immersion cooling a critical solution for future scalability. According to Submer, a Barcelona-based immersion cooling company, immersion cooling can reduce the energy consumed by cooling systems by up to 95% and enable higher rack density, providing a path to sustainable growth in data centers under AI-driven demand.

Advantages & Challenges

Immersion and submerged data centers possess several key advantages:

  • Sustainability – Lower energy consumption and lower carbon footprints are paramount as ESG (Environmental, Social, Governance) goals become business necessities.
  • Scalability & Efficiency – Immersion allows more density per square foot, reducing real estate and overhead facility expenses.
  • Reliability – Liquid-cooled and underwater systems have fewer mechanical failures including less thermal stress, fewer moving parts, and less oxidation.
  • Security & Autonomy – Underwater encased pods or autonomous liquid systems are difficult to hack and can be remotely monitored and updated, ideal for zero-trust environments.

While immersion cooling and submerged data centers offer clear advantages, they also come with challenges and limitations:

  • Maintenance and Accessibility Challenges – Both options complicate hardware maintenance. Immersion cooling requires careful removal of components from the dielectric fluid and cleaning before service, while underwater data centers offer extremely poor physical access: entire modules must be retrieved for repairs, which translates to longer downtimes.
  • High Initial Costs and Deployment Complexity – Construction of immersion tanks or underwater enclosures involves significant capital investment in specially designed equipment, infrastructure, and deployment techniques. Underwater data centers are also accompanied by marine engineering, watertight modules, and intricate site preparation.
  • Environmental and Regulatory Concerns – Both approaches involve environmental issues and regulatory adherence. Immersion systems struggle with fluid waste disposal regulations, while underwater data centers have marine environmental impact assessments, permits, and ongoing ecosystem protection mechanisms.
  • Technology Maturity and Operational Risks – These are immature technologies with minimal historical data on long-term performance and reliability. Potential problems include leakage of liquids in immersion cooling or damage and biofouling in underwater installation, leading to uncertain large-scale adoption.

Industry Momentum

Various companies are leading the charge:

  • GRC (Green Revolution Cooling) and Submer offer immersion solutions to hyperscalers and enterprises.
  • Iceotope provides precision liquid cooling for HPC.
  • Alibaba, Google, and Meta are testing immersion cooling at scale to support AI and ML clusters.
  • With Project Natick, Microsoft has researched the commercial viability of underwater data centers as off-grid, modular facilities.

Hyperscalers are starting to design entire zones of their new data centers specifically for liquid-cooled GPU pods, while smaller edge data centers are adopting immersion tech to run quietly and efficiently in urban environments.

The Future of Data Centers: Autonomous, Sealed, and Everywhere

Looking ahead, the trend is clear: data centers are becoming more intelligent, compact, and environmentally integrated. We’re entering an era where:

  • AI-based DCIM software predicts and prevents failures in real time.
  • Edge nodes with immersion cooling can be located anywhere: smart factories, offshore oil rigs.
  • Entire data centers might be built as prefabricated modules, inserted into oceans, deserts, or even space.

The general principle? Compute must not be limited by land, heat, or humans.

Final Thoughts

In the fight to enable the digital future, air is a luxury. Immersed in liquid or bolted to the seafloor, data centers are shifting to cool smarter, not harder.

Underwater installations and liquid cooling are no longer out-there ideas; they’re lifelines to a scalable, sustainable web.

So, tomorrow’s “Cloud” won’t be in the sky; it will hum quietly under the sea.

About Author:
Omkar Bhalekar is a senior network engineer and technology enthusiast specializing in Data center architecture, Manufacturing infrastructure, and Sustainable solutions. With extensive experience in designing resilient industrial networks and building smart factories and AI data centers with scalable networks, Omkar writes to simplify complex technical topics for engineers, researchers, and industry leaders.

Indosat Ooredoo Hutchison and Nokia use AI to reduce energy demand and emissions

Indonesian network operator Indosat Ooredoo Hutchison has deployed Nokia Energy Efficiency (part of the company’s Autonomous Networks portfolio, described below) to reduce energy demand and carbon dioxide emissions across its RAN using AI. Nokia’s energy control system uses AI and machine learning algorithms to analyze real-time traffic patterns, enabling the operator to automatically power down idle and unused radio equipment during periods of low network demand.

The multi-vendor, AI-driven energy management solution can reduce energy costs and carbon footprint with no negative impact on network performance or customer experience. It can be rolled out in a matter of weeks.

Indosat is aiming to transform itself from a conventional telecom operator into an AI TechCo—powered by intelligent technologies, cloud-based platforms, and a commitment to sustainability. By embedding automation and intelligence into network operations, Indosat is unlocking new levels of efficiency, agility, and environmental responsibility across its infrastructure.

Earlier this year Indosat claimed to be the first operator to deploy AI-RAN in Indonesia, in a deal involving the integration of Nokia’s 5G cloud RAN solution with Nvidia’s Aerial platform. The Memorandum of Understanding (MoU) between the three firms covered the development, testing, and deployment of AI-RAN, with an initial focus on moving AI inferencing workloads onto the AI Aerial platform, followed by the integration of RAN workloads on the same platform.

“As data consumption continues to grow, so does our responsibility to manage resources wisely. This collaboration reflects Indosat’s unwavering commitment to environmental stewardship and sustainable innovation, using AI to not only optimize performance, but also reduce emissions and energy use across our network,” said Desmond Cheung, Director and Chief Technology Officer at Indosat Ooredoo Hutchison.

Indosat was the first operator in Southeast Asia to achieve ISO 50001 certification for energy management—underscoring its pledge to minimize environmental impact through operational excellence. The collaboration with Nokia builds upon a successful pilot project, in which the AI-powered solution demonstrated its ability to reduce energy consumption in live network conditions.

Following the pilot project, Nokia deployed its Energy Efficiency solution across the entire Nokia RAN footprint in Indonesia, including Sumatra, Kalimantan, and Central and East Java.

“We are very pleased to be helping Indosat deliver on its commitments to sustainability and environmental responsibility, establishing its position both locally and internationally. Nokia Energy Efficiency reflects the important R&D investments that Nokia continues to make to help our customers optimize energy savings and network performance simultaneously,” said Henrique Vale, VP for Cloud and Network Services APAC at Nokia.

Nokia’s Autonomous Networks portfolio, including its Autonomous Networks Fabric solution, utilizes Agentic AI to deliver advanced security, analytics, and operations capabilities that provide operators with a holistic, real-time view of the network so they can reduce costs, accelerate time-to-value, and deliver the best customer experience.

Autonomous Networks Fabric is a unifying intelligence layer that weaves together observability, analytics, security, and automation across every network domain; allowing a network to behave as one adaptive system, regardless of vendor, architecture, or deployment model.

References:

https://www.nokia.com/newsroom/indosat-ooredoo-hutchison-and-nokia-partner-to-reduce-energy-demand-and-support-ai-powered-sustainable-operations/

https://www.telecoms.com/ai/nokia-to-supply-indosat-ooredoo-hutchison-with-ai-powered-energy-efficient-ran-software

Analysts weigh in: AT&T in talks to buy Lumen’s consumer fiber unit – Bloomberg

Bloomberg News reports that AT&T is in talks to acquire Lumen Technologies’ consumer fiber operations, citing people with knowledge of the matter. The companies are in exclusive discussions about a transaction valuing the unit at more than $5.5 billion, said one of the people, who asked not to be identified discussing confidential information. The terms of the unfinalized deal could change, or the talks might still collapse, according to the report.

“If the rumored price is correct, it is a great deal for AT&T,” wrote the financial analysts at New Street Research in a note to investors. “The value per [fiber] location at $5.5 billion would be about $1,300 which compares to Frontier at $2,400, Ziply at $3,800, and Metronet at $4,700,” the analysts continued.
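Those per-location figures also imply the size of the footprint being valued; a quick back-of-the-envelope check of the analysts’ arithmetic:

```python
# Back out the implied fiber-location count from the analysts' figures.
deal_value = 5.5e9            # rumored price for Lumen's consumer fiber unit
value_per_location = 1_300    # New Street's per-location estimate for this deal
implied_locations = deal_value / value_per_location
print(f"{implied_locations / 1e6:.1f}M locations")  # prints 4.2M locations
```

That roughly 4.2 million-location footprint is what makes the $1,300 figure look cheap next to the Frontier, Ziply, and Metronet comparables quoted above.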

The potential move to offload Lumen’s fiber business, which provides high-speed internet services to residential customers, comes as Lumen is focusing on the AI boom for business customers for growth, while grappling with a rapid decline of its legacy business.  Lumen initiated the process to sell its consumer fiber operations, Reuters reported in December.  “We’re looking at all possible arrangements,” Lumen CFO Chris Stansbury said during the company’s quarterly conference call, according to a Seeking Alpha transcript.   “Ultimately, that consumer asset was going to sit in the space where the market was going to consolidate and at that point of consolidation, we were not going to be a consolidator,” Stansbury said.
Bundling of fiber-to-the-home and wireless gives large providers lower churn and more pricing strength, Stansbury said, adding that the asset has garnered “a great deal of interest.” Any transaction is likely to help Lumen lighten its debt load, he added.
…………………………………………………………………………………………………………………………
Lumen’s mass market business served 2.6 million residential and small business customers at the end of the third quarter of 2024. Roughly 1 million of them were on fiber connections, while the rest were on the operator’s copper network.  The fiber-optic based network provider has over 1,700 wire centers across its total network, with consumer fiber available in about 400 of them.
“For Lumen, a sale at $5.5 billion would be disappointing,” the New Street analysts wrote. “The rumored range was $6-9 billion. Most clients seemed to focus on the low end of that range, anticipating perhaps $6 billion for a sale of just the fiber asset.”
Source: Panther Media GmbH/Alamy Stock Photo
………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………
Sidebar – Lots of fiber deals:
Verizon is pursuing Frontier Communications for $20 billion; Canada’s BCE is hoping to acquire Ziply Fiber for $5 billion; and T-Mobile and KKR are seeking to buy Metronet.  Earlier this month Crown Castle sold its small cell business to the EQT Active Core Infrastructure fund for $4.25 billion and its fiber business to Zayo for $4.25 billion.
………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………
AT&T has been investing in its high-speed fiber internet offerings to help drive faster subscriber and revenue growth. Earlier this month, it had forecast first-quarter adjusted profit in line with analysts’ estimate.  If AT&T does close on a purchase of Lumen’s fiber business, the deal would surely solidify AT&T’s position as the nation’s largest fiber network operator.
……………………………………………………………………………………………………………………………………..
References:

https://www.bloomberg.com/news/articles/2025-03-25/at-t-said-in-talks-to-buy-lumen-s-consumer-fiber-unit?embedded-checkout=true  (paywall)

https://www.reuters.com/markets/deals/att-talks-buy-lumens-consumer-fiber-unit-bloomberg-news-reports-2025-03-25/

https://www.lightreading.com/fttx/is-at-t-getting-a-screaming-deal-on-lumen-s-fiber-

Lumen Technologies to connect Prometheus Hyperscale’s energy efficient AI data centers

Microsoft choses Lumen’s fiber based Private Connectivity Fabric℠ to expand Microsoft Cloud network capacity in the AI era

Lumen, Google and Microsoft create ExaSwitch™ – a new on-demand, optical networking ecosystem

ACSI report: AT&T, Lumen and Google Fiber top ranked in fiber network customer satisfaction

Lumen to provide mission-critical communications services to the U.S. Department of Defense

AT&T sets 1.6 Tbps long distance speed record on its white box based fiber optic network

WiFi 7: Backgrounder and CES 2025 Announcements

Backgrounder:

Wi-Fi 7, also known as IEEE 802.11be-2024 [1.], is the latest generation of Wi-Fi technology, offering significantly faster speeds, increased network capacity, and lower latency than previous versions like Wi-Fi 6, by utilizing features such as wider 320 MHz channels, Multi-Link Operation (MLO), and 4K-QAM modulation across all frequency bands (2.4 GHz, 5 GHz, and 6 GHz).  Wi-Fi 7 is designed to use huge swaths of unlicensed spectrum in the 6 GHz band, first made available in the Wi-Fi 6E standard, to deliver a maximum data rate of up to 46 Gbps.

Note 1. The Wi-Fi Alliance began certifying Wi-Fi 7 devices in January 2024. The IEEE approved the IEEE 802.11be standard on September 26, 2024. The standard supports at least one mode of operation capable of a maximum throughput of at least 30 Gbps, as measured at the MAC data service access point (SAP), with carrier frequency operation between 1 and 7.250 GHz, while ensuring backward compatibility and coexistence with legacy IEEE Std 802.11 compliant devices operating in the 2.4 GHz, 5 GHz, and 6 GHz bands.
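The 46 Gbps headline figure can be reproduced from 802.11be PHY parameters. A sketch assuming the maximal configuration (16 spatial streams, 4096-QAM with rate-5/6 coding, 3,920 data subcarriers in a 320 MHz channel, and 13.6 µs symbols including a 0.8 µs guard interval):

```python
# Reproduce Wi-Fi 7's ~46 Gbps headline PHY rate from 802.11be parameters.
data_subcarriers = 3920      # 4 x 980 data tones across a 320 MHz EHT channel
bits_per_symbol = 12         # 4096-QAM carries 12 bits per subcarrier
coding_rate = 5 / 6          # highest MCS coding rate
symbol_duration = 13.6e-6    # 12.8 us OFDM symbol + 0.8 us guard interval
spatial_streams = 16         # maximum allowed by the standard

per_stream_bps = data_subcarriers * bits_per_symbol * coding_rate / symbol_duration
total_bps = per_stream_bps * spatial_streams
print(f"{total_bps / 1e9:.1f} Gbps")  # prints 46.1 Gbps
```

Real devices ship with far fewer streams (typically 2–4), which is why practical laptop rates land closer to the 5.8 Gbps Intel cites below than to the theoretical maximum.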

………………………………………………………………………………………………………………………………………………………………………………………………..

The role of 6 GHz Wi-Fi in delivering connectivity is changing and growing. A recent report from OpenSignal found that smartphone users spend 77% to 88% of their screen-on time connected to Wi-Fi. Further, the latest generations of Wi-Fi (largely thanks to support for 320 MHz channels and critical features like Multi-Link Operation) are increasingly reliable and deterministic, making them viable options for advanced applications like extended reality in both the home and the enterprise.

New features:

  • 320 MHz channels: Double the bandwidth compared to Wi-Fi 6E.
  • Multi-Link Operation (MLO): Allows devices to connect over multiple channels across different bands simultaneously.
  • 4K-QAM modulation: Enables more data to be transmitted per signal.
CES 2025 WiFi 7 Announcements:

1.  TP-Link unveiled the Deco BE68 Whole Home Mesh Wi-Fi 7 solution, which it claims delivers speeds of up to 14 Gbps, covering 8,100 sq. ft. and supporting up to 200 connected devices. “Featuring 10G, 2.5G, and 1G ports, it ensures fast, reliable wired connections. With Deco Mesh technology, the system delivers seamless coverage and uninterrupted performance for streaming, gaming, and more,” stated the company.

TP-Link also announced an outdoor mesh system to address the increasing demand for outdoor Wi-Fi connectivity. The Deco BE65-Outdoor and Deco BE25-Outdoor nodes are equipped with weatherproof, waterproof and dustproof enclosures. Combined with the Deco indoor models, they form a cohesive, reliable indoor-outdoor mesh network that lets a user move seamlessly between the two environments.

2.  Intel’s latest laptop processors (Intel Core Ultra Series 2) are all equipped with Wi-Fi 7 capabilities integrated into the silicon; Intel has made Wi-Fi 7 its standard choice. On its website, the company explains that a “typical” Wi-Fi 7 laptop has a potential maximum data rate of almost 5.8 Gbps. “This is 2.4X faster than the 2.4 Gbps possible with Wi-Fi 6/6E and could easily enable high quality 8K video streaming or reduce a massive 15 GB file download to roughly 25 seconds vs. the one minute it would take with the best legacy Wi-Fi technology,” Intel added.
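Intel’s “roughly 25 seconds” claim is consistent with simple arithmetic once protocol overhead is allowed for; a quick check:

```python
# Sanity-check Intel's download-time claim at the quoted Wi-Fi 7 laptop rate.
file_gigabytes = 15     # file size in gigabytes (15 GB = 120 gigabits)
link_gbps = 5.8         # quoted potential maximum data rate
raw_seconds = file_gigabytes * 8 / link_gbps
print(round(raw_seconds, 1))  # prints 20.7
```

About 20.7 s is the floor at the raw PHY rate; MAC and transport overhead plausibly stretch that to the ~25 s Intel quotes.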

3. ASUS  New Wi-Fi 7 Router Lineup

ASUS unveiled a range of new networking products at CES 2025, including the ASUS RT-BE58 Go travel router and ASUS 5G-Go mobile router – both recipients of the CES 2025 Innovation Award – alongside the ROG Rapture GT-BE19000AI gaming router and the ZenWiFi Outdoor series for home Wi-Fi setups.

  • The RT-BE58 Go is a dual-band, Wi-Fi 7-capable mobile router that supports three use cases: 4G/5G mobile tethering, public Wi-Fi hotspot (WISP), and conventional home router. It also supports VPNs from up to 30 service providers and subscription-free Trend Micro security for online protection, while AiMesh compatibility allows the router to be paired with other ASUS routers for wider signal coverage.
  • The ROG Rapture GT-BE19000AI is an iteration of last year’s GT-BE19000 router, this time with an onboard NPU coupled with a CPU and MCU. This tri-core combination enables features like ROG AI Game Booster and Adaptive QoS 2.0 to reduce network latency by up to 34% in supported games, plus 46% power savings through an AI Power Saving mode that adapts to usage patterns. Additional features include advanced ad and tracker blocking, network insights, and RF scanning.

References:

https://standards.ieee.org/ieee/802.11be/7516/

https://en.wikipedia.org/wiki/Wi-Fi_7

https://www.mathworks.com/help/wlan/ug/overview-of-wifi-7-or-ieee-802-11-be.html

[CES 2025] ASUS Presents New Wi-Fi 7 Router Lineup

Google, MediaTek team up; a new Wi-Fi HaLow chip; Wi-Fi 7 becomes standard — Top Wi-Fi news from CES 2025

WiFi 7 and the controversy over 6 GHz unlicensed vs licensed spectrum

Highlights of GSA report on Private Mobile Network Market – 3Q2024

According to GSA, the private mobile network market (PMNM) continued to grow in 3Q2024, as the number of unique customer references for deployments reached 1,603. The market is being driven by sectors like manufacturing, education, and mining, which use these networks for enhanced data, security and mobility needs.

On average, 71% of references included in the GSA database are non-public and unique to this database, submitted by members of the GSA Private Mobile Networks Special Interest Group (SIG). This number can be higher for certain industries, with more than 80% of sectors such as military and defense, maritime and power plants not visible in the public domain. The referenced SIG includes 16 companies: 450Alliance, 5G-ACIA, AI-Link, Airspan, Celona, Dell, Ericsson, GSMA, JMA Wireless, Keysight Technologies, Mavenir, Nokia, OnGo Alliance, OneLayer, PrivateLTEand5G.com and TCCA. GSA would like to thank its members 450Alliance, Airspan, Celona, Ericsson, Keysight Technologies, Mavenir, Nokia and OneLayer for sharing general information about their network deployments to enable this report and data set to be produced. New data has resulted in a significant uplift in this update.

Other PMNM highlights in the 3rd quarter 2024 include:

• There are 80 countries around the world with at least one private mobile network.

• Of the top 10 reporting countries, the United States reported growth of 24%, followed by the United Kingdom, up 11%, Sweden, up 9%, and Japan and Australia, up 5% each. Finland and the Republic of Korea grew by 4% each.

• Seaports and oil and gas were the fastest-growing industries, up 9%. Manufacturing, education and academic research and mining remain the top three sectors for customer references, although this does not represent the actual size and scale of deployments, which vary by user type.

• There is typically a strong, positive correlation between the number of private mobile network references and countries with dedicated spectrum. Private mobile networks are mainly in high- and upper-middle-income regions so far, with the United States, Germany, the United Kingdom, China and Japan having the most references. It is sometimes reported that China has a high number of networks, reaching up to 30,000, but GSA believes a large portion use the public network and therefore do not meet our definition.

Image Credit: GSA

Notes:

The definition of a private mobile network used in this report is a 3GPP-based 4G LTE or 5G network intended for the sole use of private entities, such as enterprises, industries and governments. They can use only physical elements, RAN or Core, or a combination of physical and virtual elements — for example hosted by a public land mobile network — but as a minimum, a dedicated network core must be implemented. The definition includes MulteFire or Future Railway Mobile Communication System. The network must use spectrum defined in 3GPP, be generally intended for business-critical or mission-critical operational needs, and where it is possible to identify commercial value, the database includes contracts worth more than €50,000 and between €50,000 and €100,000 to filter out small demonstration network deployments. Private mobile networks are usually not offered to the general public, although GSA’s analysis does include the following: educational institutions that provide mobile broadband to student homes; private fixed wireless access networks deployed by communities for homes and businesses; city or town networks that use local licenses to provide wireless services in libraries or public places (possibly offering Wi-Fi with 3GPP wireless backhaul), which are not an extension of the public network.

Non-3GPP networks such as those using Wi-Fi, TETRA, P25, WiMAX, Sigfox, LoRa and proprietary technologies are excluded from the data set. Network implementations using solely network slices from public networks or placement of virtual networking functions on a router are also excluded. Where identifiable, extensions of the public network (such as one or two extra sites deployed at a location, as opposed to dedicated private networks) are excluded. These items may be described in the press as a type of private network.

References:

PMN December 2024 Summary

SNS Telecom & IT: Private 5G and 4G LTE cellular networks for the global defense sector is a $1.5B opportunity

SNS Telecom & IT: $6 Billion Private LTE/5G Market Shines Through Wireless Industry’s Gloom

SNS Telecom & IT: Private 5G Network market annual spending will be $3.5 Billion by 2027

Dell’Oro: Private RAN revenue declines slightly, but still doing relatively better than public RAN and WLAN markets

Pente Networks, MosoLabs and Alliance Corp collaborate for Private Cellular Network in a Box

HPE Aruba Launches “Cloud Native” Private 5G Network with 4G/5G Small Cell Radios

U.S. Weighs Ban on Chinese-made TP-Link Routers and China Telecom

Today, the Wall Street Journal (WSJ) reported that the U.S. is considering banning the sale of Chinese-made TP-Link internet routers over concerns that the home networking devices pose a security risk. Government authorities may ban the popular routers, which have been linked to Chinese cyberattacks. TP-Link has roughly 65% of the U.S. market for home and small-business routers. It is also the top choice on Amazon.com, and powers internet communications for the Defense Department and other federal government agencies.

Investigators at the U.S. Commerce, Defense and Justice departments have opened their own probes into the company, and authorities could ban the sale of TP-Link routers in the U.S. next year, according to people familiar with the matter. An office of the Commerce Department has subpoenaed TP-Link, some of the people said. If its routers are banned from the U.S., it would mark the biggest extraction of Chinese telecom equipment from the country since the Trump administration in 2019 ordered Huawei Technologies ripped out of American infrastructure.

TP-Link routers are routinely shipped to customers with security flaws, which the company often fails to address, according to people familiar with the matter. While routers often have bugs, regardless of their manufacturer, TP-Link doesn’t engage with security researchers concerned about them, the WSJ said.  However, TP-Link told CBS MoneyWatch that the company’s “security practices are fully in line with industry security standards in the U.S.”

 

TP-Link router. Photo: Meghan Petersen/WSJ

…………………………………………………………………………………………………………………………………………………………………………………………………………

TP-Link has also partnered with more than 300 internet providers in the U.S., which mail its routers to new customers who sign up for their services. Federal contracting documents show TP-Link routers supply everything from the National Aeronautics and Space Administration to the Defense Department and Drug Enforcement Administration, and the routers are sold at online military exchanges.  The company’s market dominance has been achieved in part through lower prices: its routers are often less than half the price of competitors’, according to market data.

TP-Link sells in the U.S. through a business unit based in California. According to business records, TP-Link co-founder Zhao Jianjun is the chief executive of the California operation and he and his brother still ultimately control all global TP-Link entities. A spokeswoman for that unit said TP-Link assesses potential security risks and takes action to address known vulnerabilities.

“We welcome any opportunities to engage with the U.S. government to demonstrate that our security practices are fully in line with industry security standards, and to demonstrate our ongoing commitment to the U.S. market, U.S. consumers, and addressing U.S. national security risks,” the spokeswoman said.

Asked to comment about potential actions against TP-Link, Liu Pengyu, a spokesman for the Chinese Embassy in Washington, said the U.S. was using the guise of national security to “suppress Chinese companies.” He added that Beijing would “resolutely defend” the lawful rights and interests of Chinese firms.

TP-Link’s U.S. growth took off during the pandemic, when people were sent home to work and needed reliable internet. The company climbed from around 20% of the U.S. market for home and small-business routers in 2019 to around 65% this year. It took an additional 5% of the market in just the third quarter of this year, according to industry data. The TP-Link spokeswoman disputed the industry data but said the company’s market share has grown in the U.S.

An analysis from Microsoft published in October found that a Chinese hacking entity maintains a large network of compromised network devices mostly comprising thousands of TP-Link routers. The network has been used by numerous Chinese actors to launch cyberattacks. These actors have gone after Western targets including think tanks, government organizations, nongovernment organizations and Defense Department suppliers.

The Defense Department earlier this year opened an investigation into national-security vulnerabilities in Chinese routers, according to people familiar with the matter. The House Select Committee on the Chinese Communist Party in August urged the Commerce Secretary to investigate TP-Link because it presents an “unusual degree of vulnerabilities.” The House of Representatives in September passed legislation that called for a study of the national-security risks posed by routers with ties to foreign adversaries, on which the Senate has yet to act.

………………………………………………………………………………………………………………………………………………………………………………………………

Separately, the U.S. Commerce Department is moving to further crack down on China Telecom’s U.S. unit over concerns that the company could provide Beijing with American data accessed through its U.S. cloud and internet businesses, a source told Reuters. The source confirmed a New York Times report that the department last week sent China Telecom Americas a preliminary determination that its presence in U.S. networks and cloud services poses U.S. national security risks, and gave the company 30 days to respond.

Previously, the FCC moved to shrink China Telecom’s presence in the U.S.  In October 2021, nine months into Mr. Biden’s term, the Commission revoked all licenses for China Telecom Americas to provide ordinary phone services in the United States, saying it was “subject to exploitation, influence and control by the Chinese government.”  That left in place China Telecom’s network nodes on U.S. telecom networks and carrier neutral data centers with the power to “peer in” to internet and phone traffic. That ability would be stripped under the Commerce Department order, assuming that the Trump administration went along. China Telecom Americas did not respond to messages left at its office in Herndon, Va.

“We’ve been taking a hard look at where Chinese technologies are in the United States and asking ourselves the question of, is this an acceptable level of risk?” Anne Neuberger, the deputy national security adviser for cyber and emerging technologies, said in an interview on Monday. “For a number of years, these companies have operated networks and cloud service businesses in the U.S., which involved network equipment that’s co-located with our internet infrastructure. And while in the past we may have viewed this as an acceptable level of risk, that is no longer the case.”

The FCC action to block China Telecom from most of its business in the United States did not prevent Volt Typhoon — China’s placement of malicious code in the electric grid and water and gas pipeline networks — or Salt Typhoon, the surveillance effort that was uncovered over the summer. Taken together, officials say, they amount to the most significant assault on American critical infrastructure in the digital age.

Speaking last week at the Paley Center for Media in Manhattan, Gen. Timothy D. Haugh, the director of the National Security Agency and commander of U.S. Cyber Command, said, “If I look at today, the PRC is not deterred,” using the initials for the People’s Republic of China.  He declined to say whether his forces were conducting offensive operations against China in retaliation for any of its recent incursions into American networks.

On Sunday, President-elect Donald J. Trump’s incoming national security adviser, Representative Mike Waltz, a Florida Republican, suggested on CBS’s “Face the Nation” that the new administration would be much more tempted to use offensive cyber-actions against China.  “We need to start going on offense and start imposing, I think, higher costs and consequences to private actors and nation-state actors that continue to steal our data, that continue to spy on us and that, even worse, with the Volt Typhoon penetration, that are literally putting cyber time bombs on our infrastructure, our water systems, our grids, even our ports,” he said.

Officials have said they do not believe that the Chinese hackers have been ousted from the networks of at least eight telecommunications firms, including the nation’s two largest, Verizon and AT&T. That suggests that China’s hackers retain the capability to escalate.

Since Microsoft first alerted the telecommunications firms over the summer that they had found evidence of hackers deep in their systems, the Biden administration has struggled to come up with a response. It created a task force inside the White House, and the issue is considered so serious that the group meets almost daily. Chief executives of the affected firms have been summoned to the Situation Room to come up with a joint plan of action.

https://www.wsj.com/politics/national-security/us-ban-china-router-tp-link-systems-7d7507e6?st=SEX5iL&reflink=desktopwebshare_permalink

https://www.cbsnews.com/news/tp-link-router-china-us-ban/

https://www.reuters.com/business/media-telecom/us-moves-boost-crackdown-china-telecoms-us-unit-source-says-2024-12-17/

https://www.nytimes.com/2024/12/16/us/politics/biden-administration-retaliation-china-hack.html

Aftermath of Salt Typhoon cyberattack: How to secure U.S. telecom networks?

WSJ: T-Mobile hacked by cyber-espionage group linked to Chinese Intelligence agency

China backed Volt Typhoon has “pre-positioned” malware to disrupt U.S. critical infrastructure networks “on a scale greater than ever before”

FBI and MI5 Chiefs Issue Joint Warning: Chinese Cyber Espionage on Tech & Telecom Firms

Quantum Technologies Update: U.S. vs China now and in the future

SKT-Samsung Electronics to Optimize 5G Base Station Performance using AI

SK Telecom (SKT) has partnered with Samsung Electronics to use AI to improve the performance of its 5G base stations and upgrade its wireless network.  Specifically, they will apply AI-based 5G base station quality optimization technology (an AI-RAN Parameter Recommender) to commercial 5G networks.

The two companies have worked throughout the year to apply AI and deep learning to lessons from past mobile network operations, and recently completed development of technology that automatically recommends optimal parameters for each base station environment.  When applied to SKT’s commercial network, the new technology unlocked additional performance from 5G base stations and improved the customer experience.

Mobile base stations are affected by different wireless environments depending on their geographical location and surrounding facilities. For the same reason, there can be significant differences in the quality of 5G mobile communication services in different areas using the same standard equipment.

Accordingly, SKT used deep learning to analyze and learn the correlation between the statistical data accumulated in existing wireless networks and base station operating parameters, enabling it to predict diverse wireless environments and service characteristics and automatically derive the optimal parameters for improving perceived quality.
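To make the idea concrete, here is a toy sketch of a parameter recommender: given KPI statistics for a cell, it looks up the most similar historical cell and reuses the parameter set that worked well there. The real SKT/Samsung system uses deep learning models trained on commercial-network data; the feature names, parameter names, and values below are invented for illustration:

```python
import math

# Historical cells: (features [avg_user_count, interference_dBm, urban_density],
# parameter set that performed well there). All values are invented.
history = [
    ([120.0, -95.0, 0.9], {"tx_power_dbm": 40, "handover_margin_db": 2}),
    ([15.0, -110.0, 0.1], {"tx_power_dbm": 46, "handover_margin_db": 4}),
    ([60.0, -100.0, 0.5], {"tx_power_dbm": 43, "handover_margin_db": 3}),
]

def recommend(features):
    """Return the parameter set of the nearest historical cell (L2 distance)."""
    best = min(history, key=lambda h: math.dist(h[0], features))
    return best[1]

# A dense, high-interference cell matches the first historical record.
print(recommend([100.0, -97.0, 0.8]))
# -> {'tx_power_dbm': 40, 'handover_margin_db': 2}
```

A nearest-neighbor lookup stands in here for the deep-learning model only to show the input/output shape of the task: network statistics in, recommended per-cell parameters out.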

Samsung Electronics’ ‘Network Parameter Optimization AI Model,’ used in this demonstration, makes more efficient use of the resources invested in optimizing wireless network environments and performance, and enables optimal management of mobile communication networks organized at scale into cluster units.

The two companies are conducting additional learning and verification by diversifying the parameters applied to the optimized AI model and expanding the application to subways where traffic patterns change frequently.

SKT is also advancing methods that improve quality by automatically adjusting base station transmit power, or by resetting the allowed range of radio retransmissions, when radio signals are weak or interference causes data transmission errors.
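A minimal rule-of-thumb sketch of those two adjustments: raise transmit power when the signal is weak, and widen the retransmission allowance when errors spike. The thresholds, caps, and parameter names are illustrative assumptions, not SKT's actual values:

```python
def adjust(params, sinr_db, block_error_rate):
    """Return a copy of params tuned for the measured radio conditions."""
    params = dict(params)
    if sinr_db < 0:                      # weak signal: boost output power (capped)
        params["tx_power_dbm"] = min(params["tx_power_dbm"] + 3, 46)
    if block_error_rate > 0.10:          # interference: allow more retransmissions
        params["max_harq_retx"] = min(params["max_harq_retx"] + 2, 8)
    return params

print(adjust({"tx_power_dbm": 40, "max_harq_retx": 4},
             sinr_db=-3, block_error_rate=0.15))
# -> {'tx_power_dbm': 43, 'max_harq_retx': 6}
```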

In addition, SKT plans to continue refining the technology by expanding the scope of what can be optimized with AI, such as parameters related to future beamforming*, and by developing real-time application capabilities.

* Beamforming: a technology that focuses the antenna’s signal toward a specific receiving device so that signals can be transmitted and received more strongly.
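The footnote's idea can be illustrated numerically: a uniform linear array of N antennas applies per-element phase shifts so that signals add coherently toward one chosen direction, yielding up to N-fold power gain there and strong attenuation elsewhere. This is a generic textbook sketch, not SKT's beamforming implementation:

```python
import numpy as np

N = 8                               # antenna elements, half-wavelength spacing
theta = np.deg2rad(30)              # direction we steer the beam toward
n = np.arange(N)
weights = np.exp(1j * np.pi * n * np.sin(theta))  # steering weights

def array_gain(direction_rad):
    """Normalized power gain of the weighted array toward a given direction."""
    steering = np.exp(1j * np.pi * n * np.sin(direction_rad))
    return abs(np.vdot(weights, steering)) ** 2 / N  # vdot conjugates weights

print(round(array_gain(np.deg2rad(30)), 2))   # full N-fold gain at the target
print(array_gain(np.deg2rad(-30)) < 1.0)      # strongly attenuated off-target
```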

SKT is expanding the application of AI technology to various areas of the telecommunications network, including ‘Telco Edge AI’, network power saving, spam blocking, and operations automation, in addition to this base station quality improvement. In particular, its AI-based network power saving technology was recently recognized as an outstanding technology at the ‘Network X Award 2024’.

Ryu Tak-ki, head of SK Telecom’s infrastructure technology division, said, “This is a meaningful achievement that has confirmed that the potential performance of individual base stations can be maximized by incorporating AI,” and emphasized, “We will accelerate the evolution into an AI-Native Network that provides differentiated customer experiences through the convergence of telecommunications and AI technologies.”

“AI is a key technology for innovation in various industrial fields, and it is also playing a decisive role in the evolution to next-generation networks,” said Choi Sung-hyun, head of the advanced development team at Samsung Electronics’ network business division. “Samsung Electronics will continue to take the lead in developing intelligent and automated technologies for AI-based next-generation networks.”
