From LPWAN to Hybrid Networks: Satellite and NTN as Enablers of Enterprise IoT – Part 2

By Afnan Khan (ML Engineer) and Mehsam Bin Tahir (Data Engineer)

Introduction:

This is the second of two articles on the impact of the Internet of Things (IoT) on the UK Telecom industry. The first is at:

Enterprise IoT and the Transformation of UK Telecom Business Models – Part 1

Executive Summary:

Early Internet of Things (IoT) deployments relied heavily on low-power wide-area networks (LPWANs) to deliver low-cost connectivity for distributed devices. While these technologies enabled initial IoT adoption, they struggled to deliver sustainable commercial returns for telecom operators. In response, attention has shifted towards hybrid terrestrial–satellite connectivity models that integrate Non-Terrestrial Networks (NTN) directly into mobile network architectures. In 2026, satellite connectivity is increasingly positioned not as a universal coverage solution but as a resilience and continuity layer for enterprise IoT services (Ofcom, 2025).

The Commercial Limits of LPWAN-Based IoT:

LPWAN technologies enabled low-cost connectivity for specific IoT use cases but were typically deployed outside mobile core architectures. This limited their ability to support quality of service guarantees, enterprise-grade security and integrated billing models. As a result, LPWAN deployments often remained fragmented and failed to scale into durable enterprise business models, restricting their long-term commercial value for telecom operators (Ofcom, 2025).

Satellite and NTN as Integrated Mobile Extensions:

In contrast, satellite and NTN connectivity extends existing mobile networks rather than operating as a parallel IoT layer. When non-terrestrial connectivity is integrated into 5G core infrastructure, telecom operators are able to deliver managed IoT services with consistent security, performance and billing models across both terrestrial and remote environments. This architectural shift allows satellite connectivity to be packaged as part of a unified enterprise service rather than sold as a standalone or niche connectivity product (3GPP, 2023). Figure 1 illustrates this hybrid terrestrial–satellite model, showing how satellite connectivity functions as an extension of mobile networks to support continuous IoT services across urban, rural and remote environments.

Figure 1: Hybrid terrestrial–satellite connectivity supporting continuous IoT services across urban, rural and remote environments.

Industrial Use Cases and Hybrid Connectivity:

In sectors such as offshore energy, agriculture, logistics and remote infrastructure monitoring, IoT deployments prioritise coverage continuity and service resilience over peak data throughput. Hybrid terrestrial–satellite connectivity enables operators to offer coverage guarantees and service level agreements that LPWAN-based models could not reliably support. In 2026, Virgin Media O2 launched satellite-enabled services aimed at supporting rural connectivity and improving resilience for IoT-dependent applications, reflecting a broader operator strategy to monetise non-terrestrial coverage where reliability is a core requirement (Real Wireless, 2025).

The commercial implications of this transition are further illustrated in Figure 2, which contrasts siloed LPWAN deployments with integrated mobile and satellite IoT services delivered through a unified network core.

Figure 2: Transition from siloed LPWAN deployments to integrated mobile and satellite IoT services delivered through a unified network core.

Satellite Connectivity and Enterprise IoT at Scale:

The UK Space Agency has identified hybrid terrestrial–satellite connectivity as an enabling layer for remote industrial operations, environmental monitoring and agricultural IoT systems. UK-based firms such as Open Cosmos are contributing to this model by integrating Low Earth Orbit satellite connectivity with existing mobile core networks. This approach allows telecom operators to deliver end-to-end managed connectivity for enterprise customers without deploying separate IoT network stacks, converting coverage limitations from a cost burden into chargeable, service-based revenue opportunities (Open Cosmos, 2024; UK Space Agency, 2025).

Conclusion:

In 2026, IoT is reshaping the UK telecom sector primarily by enabling new revenue models rather than by driving incremental network expansion. Following the limited commercial success of LPWAN-based IoT strategies, satellite and Non-Terrestrial Network integration is increasingly deployed as an extension of mobile networks to provide coverage continuity and service guarantees for industrial and remote use cases. When integrated into 5G core architectures, satellite connectivity enables telecom operators to monetise resilience and reliability as part of managed enterprise services rather than offering standalone connectivity. Taken together, these developments show that satellite and NTN integration has become a critical enabler of scalable, enterprise-led IoT business models in the UK (Ofcom, 2025; 3GPP, 2023).

…………………………………………………………………………………………………………………………………………………………………………

References:

Ofcom. (2025). Connected Nations UK report.
https://www.ofcom.org.uk

Real Wireless. (2025). Satellite to mobile connectivity and the UK market.
https://real-wireless.com

UK Space Agency. (2025). Connectivity and space infrastructure briefing.
https://www.gov.uk/government/organisations/uk-space-agency

Open Cosmos. (2024). Satellite solutions for IoT and Earth observation.
https://open-cosmos.com

3GPP. (2023). Non-Terrestrial Networks (NTN) support in 5G systems.
https://www.3gpp.org/news-events/ntn


Enterprise IoT and the Transformation of UK Telecom Business Models – Part 1

By Afnan Khan (ML Engineer) and Raabia Riaz (Data Scientist)

Introduction:

This is the first of two articles on the impact of the Internet of Things (IoT) on the UK Telecom industry. The second is at:

From LPWAN to Hybrid Networks: Satellite and NTN as Enablers of Enterprise IoT – Part 2

Executive Summary:

In 2026, the Internet of Things (IoT) is fundamentally changing the UK telecom sector by enabling new business models rather than simply driving incremental network upgrades.

As consumer mobile markets show limited year-on-year growth between 2025 and 2026, telecom operators have prioritised IoT-led enterprise services as a source of new revenue (Ofcom, 2025; GSMA, 2024). Investment has shifted away from consumer-facing upgrades towards private networks, managed connectivity and long-term service contracts for industry and infrastructure. This change reflects a broader move from usage-based connectivity towards service-based delivery.

IoT and Enterprise Connectivity through Private 5G:

Figure 1: Transition from consumer mobile connectivity to enterprise IoT services in the UK telecom sector, highlighting the shift towards managed connectivity and long-term service contracts.

The growth of private 5G and managed enterprise networks represents one of the clearest IoT-driven business shifts. Industrial customers increasingly require predictable performance, low latency and enhanced security, which are not consistently available through public mobile networks. 5G Standalone architecture enables features such as network slicing and low-latency communication, allowing operators to sell connectivity as a managed service rather than a commodity product (Mobile UK, 2024).

In the UK, this model is visible in projects such as the Port of Felixstowe private 5G trials supporting automated port operations and asset tracking (BT Group, 2023), the Liverpool City Region 5G programme focused on connected logistics (DCMS, 2022), the West Midlands 5G transport and connected vehicle projects (WM5G, 2023) and Network Rail 5G rail monitoring trials supporting safety and asset management (Network Rail, 2024). These deployments are typically delivered through long-term enterprise contracts.

Together, these projects illustrate how connectivity is increasingly sold as a managed operational capability embedded within enterprise workflows rather than priced through consumer-style data usage, as illustrated in Figure 1.

IoT and Long-Term Infrastructure Revenue:

IoT enables telecom operators to participate in long-term infrastructure-based revenue models. The UK national smart meter programme illustrates this shift. By the third quarter of 2025, more than 40 million smart and advanced meters had been installed across Great Britain, with around 70% operating in smart mode (Department for Energy Security and Net Zero, 2025).

These systems rely on continuous, secure connectivity over long lifecycles. The Data Communications Company network processes billions of encrypted messages each month, creating sustained demand for resilient connectivity (DCC, 2024). Ofcom has linked the growth of such systems to increased regulatory focus on network resilience where connectivity underpins critical national infrastructure, while the National Cyber Security Centre has highlighted security risks associated with large IoT deployments (Ofcom, 2025; NCSC, 2024).

For telecom operators, these deployments favour long-term service contracts and regulated infrastructure partnerships over short-term retail revenue models.

Conclusions:

In 2026, IoT is transforming the UK telecom sector primarily by reshaping how connectivity is monetised rather than by driving incremental network upgrades. As consumer mobile markets show limited growth, telecom operators have increasingly aligned investment with enterprise IoT demand through private 5G deployments and long-term infrastructure connectivity. These models prioritise predictable performance, security and service continuity over mass-market scale.

Private 5G projects across ports, transport networks and logistics hubs demonstrate how IoT demand has accelerated the commercial adoption of 5G Standalone capabilities, allowing operators to sell connectivity as a managed operational service embedded within enterprise workflows (Mobile UK, 2024). At the same time, national smart infrastructure programmes such as smart metering illustrate how IoT supports long-duration connectivity contracts that favour regulated partnerships and resilient network design over short-term retail revenue (Department for Energy Security and Net Zero, 2025; DCC, 2024).

Taken together, these developments indicate that IoT is no longer an adjunct to UK telecom networks. Instead, it has become a central driver of enterprise-led, service-based business models that align network investment with stable, long-term revenue streams and critical infrastructure requirements.

…………………………………………………………………………………………………………………………………………………………..

References:

BT Group. (2023). BT and Hutchison Ports trial private 5G at the Port of Felixstowe.
https://www.bt.com/about/news/2023/bt-hutchison-ports-5g-felixstowe

Data Communications Company. (2024). Annual report and accounts 2023–24.
https://www.smartdcc.co.uk/our-company/our-performance/annual-reports/

Department for Digital, Culture, Media and Sport. (2022). Liverpool City Region 5G Testbeds and Trials Programme.
https://www.gov.uk/government/publications/5g-testbeds-and-trials-programme

Department for Energy Security and Net Zero. (2025). Smart meter statistics in Great Britain Q3 2025.
https://www.gov.uk/government/collections/smart-meters-statistics

GSMA. (2024). The Mobile Economy Europe.
https://www.gsma.com/mobileeconomy/europe/

Mobile UK. (2024). Unleashing the power of 5G Standalone.
https://www.mobileuk.org

National Cyber Security Centre. (2024). Cyber security principles for connected places.
https://www.ncsc.gov.uk

Network Rail. (2024). 5G on the railway connectivity trials.
https://www.networkrail.co.uk

Ofcom. (2025). Connected Nations UK report.
https://www.ofcom.org.uk


Huawei’s Electric Vehicle Charging Technology & Top 10 Charging Trends

Huawei EV Charging Backgrounder – from Google Gemini:

Huawei is a major player in the electric vehicle (EV) charging infrastructure market, focusing primarily on developing and supplying ultra-fast, liquid-cooled charging solutions and related smart energy management systems. Their involvement includes manufacturing core charging hardware and developing software/AI for intelligent network management.
Key aspects of Huawei’s involvement in charging technology:
  • Ultra-Fast Charging Technology: Huawei’s flagship product is the FusionCharge system, which uses a fully liquid-cooled design to enable ultra-fast DC charging at high power levels, including up to 600 kW and even experimental 1.5 MW chargers. This technology is designed to add significant range (e.g., over 200 km in 5 minutes) and is compatible with most EV models.
  • Integrated Energy Solutions: A core part of their strategy is the integration of EV charging with renewable energy (photovoltaics or PV) and energy storage systems (ESS). This “PV+ESS+Charger” solution helps maximize green power consumption, reduces the impact of high-power charging on the main power grid, and allows for intelligent peak shaving to optimize operational costs.
  • Hardware and Components: Huawei designs and supplies key charging components, including power units, charging dispensers, and silicon carbide (SiC) chips that enhance efficiency and power density. Their modular designs allow for scalable power output and a service life of over 10 years.
  • Smart Network Management: Huawei provides platforms for smart charging network management that enable remote monitoring, data analysis, and intelligent power distribution among multiple vehicles at a single station. This intelligent power pooling improves efficiency and ensures optimal use of available power.
  • Innovation in Convenience: Huawei has showcased an experimental prototype of a robotic charging arm that can automatically locate and plug into a vehicle’s charging port, facilitating a seamless “self-charging” experience that would work well with autonomous vehicles.
  • Strategic Partnerships and Market Deployment: Huawei works with partners, including logistics companies and car manufacturers, to deploy its charging solutions across China and other markets. They are also involved in joint ventures for manufacturing EVs, such as with Chery under the Harmony Intelligent Mobility Alliance (HIMA).
  • Battery Technology Research: The company holds patents for advanced battery technology, including a solid-state battery with a high energy density, which could further revolutionize EV range and charging times if commercialized. 

………………………………………………………………………………………………………………………………………………………………………………….

Here are Huawei’s top 10 trends in charging systems for electric vehicles:

  1. From passenger vehicles to commercial vehicles, “high quality” has become a must for ultra-fast charging infrastructure, driving large-scale upgrade of legacy charging devices to meet the energy needs of different vehicle models. High-quality development will extend from “cities of ultra-fast charging” to “cities of megawatt charging” through unified planning, standards, supervision, and O&M, enabling industry partners to turn high quality into high returns.
  2. Ultra-fast-charging vehicle models, once premium necessities, will be embraced by everyone. The extensive application of third-generation power semiconductor materials and high-C-rate traction batteries will further increase the market share of ultra-fast-charging vehicles. Megawatt-charging commercial vehicles will dominate the market.
  3. Megawatt-Scale Logistics Electrification: “Fuel-to-electricity” conversion for viable business will rapidly expand Heavy Goods Vehicle (HGV) from limited, closed applications to widespread, all-scenario adoption. The cost reduction of traction batteries and the innovation of megawatt charging technologies will make megawatt-scale logistics electrification an unstoppable trend, bringing significant economic and social values.
  4. 100 Megawatt Scale Charging Stations: For electrified logistics, 100 MW-scale charging stations will become the essential infrastructure for high-throughput operations. Factors such as technical strengths, competitive electricity pricing, and scalable deployment will unlock powerful cluster effects and secure long-term, sustainable profitability for charging station investments.
  5. Security and Trustworthiness:  Compared with passenger vehicles, commercial vehicles require higher charging power and a greater proportion of energy storage system (ESS) capacity in charging stations. Therefore, security and trustworthiness will become fundamental requirements for charging networks. The comprehensive electrical safety protection architecture will seamlessly safeguard people, vehicles, and chargers, reinforced by a robust cybersecurity foundation.
  6. Liquid-cooled ultra-fast charging delivers superior heat dissipation and protection, enabling reliable performance across increasingly distributed charging scenarios. In contrast, conventional air-cooled systems struggle in demanding environments such as high heat, humidity, salt fog, and heavy dust. In the future, the liquid cooling technology will be applied in vehicles and chargers, enabling efficient megawatt charging and contributing to overall vehicle cost reduction.
  7. A DC-based ESS+charger system can effectively increase power capacity, helping customers quickly and cost-effectively deploy ultra-fast charging stations, even in locations with limited grid power. This system is ideal for upgrading legacy low-capacity stations, enabling ultra-fast charging stations to be rapidly repurposed or newly deployed with minimal grid power, and maximizing the capability to meet vehicle charging demands.
  8. Modular Station Construction: The station-level modular solution is built for engineering construction and device commissioning, adapting to a wide range of charging scenarios. Its low cost, rapid deployment, and easy relocation make it a flexible choice, while its durable design ensures long-term value and protection for investors.
  9. Campus Microgrid: The grid-forming PV+ESS system integrates the liquid-cooled ultra-fast charging technology, and can operate in on-grid or off-grid mode. This forms a one-stop “PV+ESS+charger+vehicle+network” solution that boosts power capacity, maximizes the use of green energy, and enhances revenue through time-of-use arbitrage.
  10. AI Empowerment: The intelligent evolution of charging networks will enable seamless collaboration across networks, stations, chargers, and vehicles. By breaking down digital silos, it will elevate the end-to-end charging experience for vehicle owners and enhance overall logistics and transportation efficiency.

Huawei says they will continue to work with partners to accelerate the rollout of seamless, high-quality ultra-fast charging networks, and capture opportunities of mobility electrification.

SOURCE Huawei Digital Power

………………………………………………………………………………………………………………………………………………………………………………………………………………

References:

https://www.prnewswire.com/apac/news-releases/jointly-charging-the-road-ahead–huawei-releases-top-10-trends-of-charging-network-industry-2026-302663360.html

https://interestingengineering.com/energy/china-huawei-worlds-first-100mw-charging


Fiber Optic Networks & Subsea Cable Systems as the foundation for AI and Cloud services

Introduction:

A foundational enabler of global AI infrastructure and cloud service expansion is the mesh of high-capacity fiber-optic networks interconnecting data centers worldwide. These optical systems form the invisible backbone of modern digital society, carrying the data that powers real-time financial transactions, mission-critical enterprise traffic, national security systems, entertainment platforms, and everyday personal communications.

Cloud-based AI services only become meaningful when users, enterprises, and machines can reach them with low latency, high reliability, and predictable performance. As AI workloads proliferate across industries and continents, the unifying role of optical fiber is increasingly strategic: it determines who can participate in the AI economy, and at what scale.

Subsea (fiber) cable systems as digital unifier:

The massive capacity and spectral efficiency of optical fiber have driven its deployment from access networks to backbone routes and across the world’s oceans. Today, more than 570 subsea cables carry over 99% of international traffic, effectively stitching together a single global fabric for AI and cloud connectivity.

New subsea systems highlight how infrastructure investments are closing regional gaps rather than just adding raw terabits: the Medusa submarine cable system will help narrow the digital divide between Europe and North Africa, the Bangladesh Private Cable System (BPCS) will establish the country’s first private subsea on-ramps to global cloud and AI ecosystems, and a new Jakarta–Singapore route by PT Solusi Sinergi Digital Tbk (Surge) is set to increase data center interconnectivity while expanding affordable broadband to tens of millions of Indonesians.

As multiple new subsea cable system build outs enter planning and deployment, global bandwidth growth is expected to remain strong, extending the reach of AI and cloud platforms to more geographies, users, and industries.

From PoPs to data centers:

The traffic matrix of the AI era looks very different from that of legacy telecom networks. Instead of primarily connecting PoPs, carrier hotels, and central offices, modern optical networks are being engineered around dense, high-capacity flows between data centers.

More than 11,000 data centers, including over one thousand hyperscale facilities, now form the core nodes of the global digital infrastructure, generating on the order of thousands of petabytes of WAN traffic daily. Subsea bandwidth demand is expected to grow at roughly 30% per year as AI and cloud services scale, placing new design pressure on how subsea and terrestrial backhaul networks are engineered end-to-end.
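To give a rough sense of what ~30% annual growth compounds to, the short sketch below projects subsea bandwidth demand over a decade. The starting figure (1,000 Tbps) and the 10-year horizon are illustrative assumptions chosen for the example, not cited forecasts:

```python
# Illustrative compounding of subsea bandwidth demand at ~30% per year.
# The 1,000 Tbps starting point and 10-year horizon are assumptions for
# illustration only, not figures from any cited forecast.
def project_demand(start_tbps: float, annual_growth: float, years: int) -> list[float]:
    """Return projected demand for year 0 through `years`, compounding annually."""
    return [start_tbps * (1 + annual_growth) ** y for y in range(years + 1)]

demand = project_demand(start_tbps=1000.0, annual_growth=0.30, years=10)
for year, tbps in enumerate(demand):
    print(f"Year {year:2d}: {tbps:10.0f} Tbps")
# At 30% per year, demand grows roughly 13.8x in a decade (1.3**10 ≈ 13.79),
# which is why end-to-end subsea/terrestrial design pressure keeps mounting.
```

The point of the arithmetic is that sustained 30% growth is not linear: capacity planning that looks comfortable for three years is an order of magnitude short after ten.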

Unifying subsea and terrestrial backhaul:

This shift is driving a deliberate architectural pivot: instead of treating subsea and terrestrial backhaul as separate domains, leading operators and cloud providers are moving toward unified, end-to-end design philosophies. Traffic no longer “terminates” at a cable landing station or central office; it flows optically and logically from data center to data center across continents.

By optimizing subsea and terrestrial segments as a single system, operators can simplify their networks, reduce CapEx and OpEx, and unlock higher effective capacity. Approaches such as optical pass-through at cable landing sites reduce cost, footprint, and power, while spectrum expansion into C+L bands can deliver a twofold or greater increase in per-fiber capacity, significantly lowering the cost of backhauling subsea traffic to inland data centers.

An ever-increasing number of data centers powering AI services is driving significant bandwidth growth over subsea fiber optic cables. Image Credit: Nokia

Unified optical platforms for the AI supercycle:

Realizing this vision at scale requires platforms that unify roles traditionally split across multiple, specialized systems. For Nokia’s customers, this means leveraging the 1830 Global Express (GX) compact modular portfolio as a single, DCI-optimized solution for transponders, open optical line systems (OLS), and submarine line terminal equipment (SLTE) across both subsea and terrestrial applications.

High-performance coherent transponders on the 1830 GX support 800 Gigabit Ethernet across trans-oceanic distances, using techniques such as Probabilistic Constellation Shaping, Nyquist filtering, and continuous baud rate tuning to push performance toward the Shannon limit. The integrated OLS delivers the full suite of SLTE capabilities, including ROADM-based wavelength switching and spectrum management, ASE or CW idler insertion, and optical channel monitoring, while C+L operation on terrestrial backhaul provides step-function increases in capacity per fiber and reduces the cost of leased backhaul infrastructure.

Photo Credit: Nokia
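The Shannon limit referenced above is C = B·log₂(1 + SNR) for an additive-noise channel. A minimal sketch of that calculation follows; the 150 GHz channel bandwidth and 16 dB SNR are assumed values chosen purely for illustration, not Nokia 1830 GX specifications:

```python
import math

def shannon_capacity_gbps(bandwidth_ghz: float, snr_db: float) -> float:
    """Shannon limit C = B * log2(1 + SNR) for an AWGN channel, in Gbit/s.

    Bandwidth in GHz yields capacity in Gbit/s directly, since the
    log2 term is dimensionless (bits per symbol per Hz).
    """
    snr_linear = 10 ** (snr_db / 10)  # convert dB to linear power ratio
    return bandwidth_ghz * math.log2(1 + snr_linear)

# Assumed, illustrative values: a 150 GHz optical channel at 16 dB SNR.
c = shannon_capacity_gbps(150.0, 16.0)
print(f"Shannon limit: {c:.0f} Gbit/s")  # on the order of 800 Gbit/s
```

This is why techniques like Probabilistic Constellation Shaping matter: they close the gap between what a real transponder achieves per channel and what the formula says is theoretically available.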

Operational simplicity and resilience:

Beyond raw capacity, unified platforms enable operators to rationalize operations. Using a common hardware and software stack across subsea and terrestrial domains simplifies planning, training, sparing, deployment, and lifecycle management.

Capabilities such as constant-power ILAs for stable end-to-end DC-to-DC transport, integrated OTDR for proactive fiber monitoring and fault localization, and a rich set of optical protection schemes for service protection and restoration help operators build networks that are not only faster and denser, but also more resilient and easier to run.

What’s next: pluggables and sensing:

The industry is now entering a phase where innovation in optics is tightly coupled to AI and automation. At PTC 2026 in Honolulu, discussions will highlight how pluggable coherent optics and fiber sensing are being introduced into subsea environments to further collapse layers and enhance awareness.

ICE-X 800G coherent pluggables are already enabling 400G, 600G, and 800G per wavelength over regional subsea spans exceeding 4,000 km, and future advances in chromatic dispersion tolerance are expected to extend the thin transponder layer paradigm to trans-Atlantic routes. In parallel, operators are exploring fiber sensing, powered by machine learning and advanced coherent techniques, to transform existing fiber assets into distributed sensors capable of supporting security, integrity monitoring, and new data-driven services.

Connectivity for all:

“Advancing connectivity for the AI supercycle” is more than a tagline; it captures two simultaneous imperatives: scaling networks for performance, efficiency, and sustainability, while extending those networks to every region and community. As described herein, fiber-optic connectivity is becoming the strategic control point for value creation in the age of large-scale AI.

Nokia’s Role in Subsea Fiber Optic Networks:

Nokia has invested for more than 15 years in helping subsea operators and their customers design, deploy, and operate end-to-end SLTE and terrestrial optical networks, backed by global services and multi-country program support. Following its unification with Infinera, Nokia has emerged as the number-two global vendor of subsea optical transport equipment, earning the confidence of a large majority of operators involved in the latest wave of Asia-Pacific subsea builds. These partnerships position Nokia to help the industry scale and unify networks for the AI supercycle—and to ensure that the benefits of AI-era connectivity reach as many people, countries, and enterprises as possible.


……………………………………………………………………………………………………………………..

References:

https://www.nokia.com/blog/the-unifying-role-of-subsea-fiber-networks/

https://www.nokia.com/optical-networks/1830-global-express/


Automating Fiber Testing in the Last Mile: An Experiment from the Field

By Said Yakhyoev with Sridhar Talari & Ajay Thakur

The December 23, 2025 IEEE ComSoc Tech Blog post on AI-driven data center buildouts [1] highlights the urgent need to scale optical fiber and related equipment. While much of the industry focus is on manufacturing capacity and high-density components inside data centers, a different bottleneck is emerging downstream: a sprawling last-mile network that demands testing, activation, and long-term maintenance. The AI-driven fiber demand coincides with historic federal broadband programs to bring fiber to the premises for millions of customers [2]. This not only adds near-term pressure on fiber supply chains, but also creates a longer-term operational challenge: efficiently servicing hundreds of thousands of new fiber endpoints in the field.

As standard-setting bodies and vendors are introducing optimized products and automation inside data centers, similar future-proofing is needed in the last-mile outside plant. This post presents an example of such innovation from a field perspective, based on hands-on experimentation with a robotic tool designed to automate fiber testing inside existing Fiber Distribution Hubs (FDHs).

While central-office copper-terminating DSLAMs—and Optical Line Terminals (OLTs) in Passive Optical Networks (PONs)—aggregate subscribers and automate testing and provisioning, FDHs function as passive patch panels [3] that deliberately omit electronics to reduce cost. Between an OLT and the subscriber, the passive distribution network remains fixed. As a result, accessing individual ports at a local FDH—and anything downstream of it—remains a manual process. Where active DSLAMs and OLTs can electronically manage thousands of subscribers, this manual access becomes a bottleneck during construction. There are likely tens of thousands of FDHs deployed nationwide.

Consider this problem from a technician’s perspective: suburban and urban Fiber to the Home (FTTH) networks are often deployed using a hub-and-spoke architecture centered around FDHs. These cabinets carry between 144 and 432 ports serving customers in a neighborhood, and each line must be tested bidirectionally[4]. In practice, this typically requires two technicians: one stationed at the FDH to move the test equipment between ports, and another at the customer location or terminal.

Testing becomes difficult during inclement weather. Counterintuitively, the technician stationed at the hub—often standing still for long periods—is more exposed than technicians moving between poles in vehicles. In addition to discomfort, there is a real economic penalty: either a skilled technician is tied up performing repetitive port switching, or an additional helper must be assigned. Above all, dependence on both favorable weather and helper availability makes testing schedules unpredictable and slows network completion.

To mitigate this bottleneck, we developed and tested Machine2 (M2)—a compact, gantry-style robotic tool that remotely connects an optical test probe inside an FDH, allowing a single technician to perform bidirectional testing independently.

M2 was designed to retrofit into a commonly deployed 288-port Clearfield FDH used in rural and small-town networks. The available space in front of the patch panel—approximately 9.5 × 28 × 4 inches—constrained the design to a flat Cartesian mechanism capable of navigating between ports and inserting a standard SC connector. Despite the simple design, integrating M2 into an unmodified FDH in the field proved more challenging than expected. Several real-world constraints shaped the redesign.
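A port-indexing scheme is central to any flat Cartesian design like this: the gantry must translate a port number into nominal coordinates before vision-based correction takes over. The sketch below is a minimal illustration; the cassette layout, pitch values, and origin offsets are hypothetical placeholders, not measurements of the Clearfield hub.

```python
# Sketch: map a port number on a 288-port FDH patch panel to nominal
# gantry (x, y) coordinates. All layout constants are hypothetical;
# a real panel would be measured in the field.

PORTS_PER_ROW = 12      # assumed ports per cassette row
ROW_PITCH_IN = 0.65     # assumed vertical spacing between rows, inches
PORT_PITCH_IN = 0.45    # assumed horizontal spacing between ports, inches
X_ORIGIN_IN = 0.5       # assumed offset of port 1 from the gantry home
Y_ORIGIN_IN = 0.5

def port_to_xy(port: int) -> tuple[float, float]:
    """Return nominal (x, y) in inches for a 1-based port number."""
    if not 1 <= port <= 288:
        raise ValueError("port out of range for a 288-port panel")
    row, col = divmod(port - 1, PORTS_PER_ROW)
    return (X_ORIGIN_IN + col * PORT_PITCH_IN,
            Y_ORIGIN_IN + row * ROW_PITCH_IN)

print(port_to_xy(1))    # (0.5, 0.5)
print(port_to_xy(13))   # first port of the second row
```

In practice, these nominal positions are only a starting point; the vision-based micro-adjustment described later corrects for the loosely constrained connector positions.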

Figures: the FDH cabinet; the space available to fit an automated switch; M2 installed for dry-run testing.

Space and geometry constraints: The patch panel occupies roughly 80% of the available volume, leaving only a narrow strip for motors, electronics, and cable routing. This forced compromises in pulley placement, leadscrew length, and motor orientation, limiting motion and requiring multiple iterations. The same constraints also limited battery size, making energy efficiency a primary design concern.

Port aiming: The patch panel is composed of cassettes with loosely constrained SC connectors. Small variations in connector position led to unreliable insertions. After repeated attempts, small misalignments accumulated, rendering the system ineffective without corrective feedback.

Communications reliability: A specialized cellular modem intended for IoT applications performed poorly for command-and-control. Message latency ranged from 1.5 seconds to over 12 seconds (and in some cases minutes), making real-time control impractical. In rural areas of Connecticut and Vermont, cellular coverage was also inconsistent or absent. As a result, the effort was shelved between 2022 and 2024.

When the project resumed, an unexpected solution emerged. A low-cost consumer mobile hotspot proved more reliable than the specialized modem when cellular signal was available, providing predictable latency and stable Wi-Fi connectivity inside the FDH—even with the all-metal cabinet door closed and locked.

To further reduce latency, we explored using the fiber under test itself as a communication channel, a kind of temporary orderwire. When a two-piece Optical Loss Test Set (OLTS) is connected across an intact fiber, the devices indicate link readiness via an LED. By tapping this status signal, M2 can infer when a technician at the far end disconnects the meter and automatically connects to the next port. While this cue-based mode is limited, it enables near-zero-latency coordination and rapid testing of multiple ports without spoken or typed commands, which proved effective for common field workflows.
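The cue-based mode amounts to a small state machine: watch the link-ready signal and step to the next port on each falling edge. A minimal sketch, with hypothetical names and a boolean sample stream standing in for the tapped LED status:

```python
# Sketch of M2's cue-based port advance, assuming a boolean "link ready"
# signal tapped from the OLTS LED. The function and variable names are
# hypothetical illustrations, not M2's actual firmware API.

def cue_based_sequence(led_samples, ports):
    """Advance to the next port each time the far-end meter is
    disconnected (link drops after having been up).

    led_samples: iterable of booleans, True = link ready.
    ports: ordered list of ports to test.
    Returns the list of ports connected, in order.
    """
    connected = []
    it = iter(ports)
    current = next(it, None)
    if current is not None:
        connected.append(current)   # start on the first port
    link_was_up = False
    for up in led_samples:
        if up:
            link_was_up = True
        elif link_was_up:           # falling edge: far end disconnected
            link_was_up = False
            current = next(it, None)
            if current is None:
                break               # sequence complete
            connected.append(current)
    return connected

# Two up/down cycles walk through three ports:
print(cue_based_sequence([True, True, False, True, False], [101, 102, 103]))
# [101, 102, 103]
```

The appeal of this design is that coordination needs no spoken or typed commands: the only shared state is the fiber itself.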

A second breakthrough came from addressing port aiming with vision. Standard computer-vision techniques such as edge detection were sufficient to micro-adjust the probe position at individual ports. To detect and avoid dust caps, M2 also uses a lightweight edge-ML[5] model trained to recognize caps under varying illumination. Using only 30 positive and 30 negative training images, the model correctly detected caps in over 80% of cases.
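The micro-adjust step reduces to a centroid calculation: locate the bright connector feature in the region of interest and compute how far it sits from the frame center. The synthetic frame and threshold below are illustrative; M2's actual pipeline runs standard edge detection on real camera images.

```python
# Sketch of the micro-adjust step: find the connector feature's centroid
# and compute the pixel correction needed to center the probe on it.
# Synthetic data; the threshold value is an arbitrary assumption.

def centering_offset(gray, threshold=128):
    """gray: 2-D list of brightness values. Returns (dx, dy) pixels to
    move so the bright feature sits at the image center, or None."""
    pts = [(x, y)
           for y, row in enumerate(gray)
           for x, v in enumerate(row) if v > threshold]
    if not pts:
        return None                         # nothing detected
    cx = sum(p[0] for p in pts) / len(pts)  # feature centroid
    cy = sum(p[1] for p in pts) / len(pts)
    h, w = len(gray), len(gray[0])
    return round(w / 2 - cx), round(h / 2 - cy)

# A 64x64 frame with the "connector" blob offset right and down:
frame = [[0] * 64 for _ in range(64)]
for y in range(44, 52):
    for x in range(40, 48):
        frame[y][x] = 255
print(centering_offset(frame))   # (-12, -16): move left and up
```

This kind of correction, applied once per port, prevents the small misalignments from accumulating across the panel.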

In our experience, lightweight vision models proved sufficient for practical field tasks, suggesting that accessibility—not sophistication—may drive adoption of automation in outside-plant environments.

Figure: M2’s simplified vision sequence, accounting for nonuniform connector positions. Camera view clipped to the region of interest; rough position; processed view after a micro-adjustment (12 px left, 16 px up).

What building M2 revealed:

  1. Overcoming communications issues led to an intriguing idea: optical background communication, where modulated laser light subtly changes ambient illumination inside the FDH that a camera can detect and extract instructions.
  2. M2 also proved useful beyond testing. For example, in a verify-as-you-splice workflow, M2 can lase a specific fiber as confirmation before splicing. Interactive port illumination and detection allow a single technician to troubleshoot complex situations.
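The optical background communication idea in item 1 can be sketched as simple on-off keying: modulated illumination raises and lowers the mean brightness the camera sees, and bits are recovered by thresholding each bit period. The threshold, bit timing, and framing below are hypothetical assumptions, not a tested protocol.

```python
# Sketch: recover on-off-keyed instructions from a series of mean
# brightness readings taken by a camera inside the FDH. Threshold and
# samples-per-bit values are arbitrary assumptions.

def decode_ook(brightness, threshold=100, samples_per_bit=3):
    """Majority-vote each bit period, then pack bits into bytes."""
    bits = []
    for i in range(0, len(brightness) - samples_per_bit + 1, samples_per_bit):
        window = brightness[i:i + samples_per_bit]
        bits.append(1 if sum(v > threshold for v in window) > samples_per_bit // 2 else 0)
    out = bytearray()
    for i in range(0, len(bits) - 7, 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return bytes(out)

# Encode 'N' (0x4E = 01001110) at 3 samples per bit, then decode it:
signal = []
for bit in [0, 1, 0, 0, 1, 1, 1, 0]:
    signal += [200 if bit else 20] * 3
print(decode_ook(signal))   # b'N'
```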

The comparison below is illustrative and reflects observed workflows rather than controlled benchmarking.

Illustrative comparison of testing workflows in our experience

| | Human helper (remote) | M2 |
| Connect next port | 1–1.5 s | 2.5–4 s |
| Connect random / distant port | 8–24 s | ~11–30 s |
| Ease of deployment | Requires flat ground, fair weather, ground-level FDH | ~15 min setup; requires software familiarity |
| Functionality | Highly adaptable | Limited to 2–3 functions |
| Economics | Inefficient for small networks | Well-suited for small and medium networks |
| Independence factor | Low; requires two people | High; largely weather-independent |
| Best use | Variable builds, high adaptability | Repetitive builds, independent workflows |
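The per-port figures above can be turned into rough full-sweep totals for a 288-port hub. M2 is slower per port, but the sweep runs without a second technician tied up at the cabinet. The arithmetic below is illustrative, not a benchmark:

```python
# Back-of-envelope sweep totals implied by the observed per-port times.
PORTS = 288

def sweep_minutes(per_port_s):
    """Total time in minutes to step through every port sequentially."""
    return PORTS * per_port_s / 60

print(f"helper, best case: {sweep_minutes(1.0):.0f} min")
print(f"helper, worst case: {sweep_minutes(1.5):.0f} min")
print(f"M2, best case: {sweep_minutes(2.5):.0f} min")
print(f"M2, worst case: {sweep_minutes(4.0):.0f} min")
```

Even in the worst case, the extra machine time per sweep is a matter of minutes, which is easily recovered by eliminating the helper and the weather dependency.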

Early insights for OSP vendors and standards

Building M2 revealed two broader lessons relevant to operators and vendors. First, following developments in the power industry and in data centers [6], there are now practical opportunities for automation to enter outside-plant workflows. Second, infrastructure design choices can facilitate this transition.

More spacious or reconfigurable FDH cabinets would simplify retrofitting active devices. Standardized attachment points on cabinets, terminals and pluggable components would allow mechanized or automated fiber management, reducing the risk of damage in dense installations.

Fiducial marks are among the lowest-cost adaptations. QR marks conveying dimensions and part architecture would help machines determine part orientation and position easily. Although fiducials are already common in electronics manufacturing, it may be time to adopt them more broadly in telecom infrastructure maintenance.
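To illustrate the fiducial idea, a QR payload could carry part identity and dimensions in a directly machine-parsable form. The payload format below is entirely hypothetical:

```python
# Sketch: parse a hypothetical QR fiducial payload of 'KEY=value;...'
# pairs into a dict, converting numeric fields. No real vendor encoding
# is implied.

def parse_fiducial(payload: str) -> dict:
    """Parse 'KEY=value;...' pairs, converting numeric values to floats."""
    out = {}
    for field in payload.split(";"):
        key, _, value = field.partition("=")
        try:
            out[key] = float(value)
        except ValueError:
            out[key] = value        # non-numeric fields stay strings
    return out

tag = "PART=FDH-288;W_IN=9.5;H_IN=28;D_IN=4"
print(parse_fiducial(tag))
```

A machine reading such a tag would know the part's identity and envelope before its cameras resolve any geometry.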

Aerial terminals may benefit the most from machine-friendly design. Standardized port spacing and swing-out or hinged caps would significantly simplify autonomous or remotely assisted connections. Such cooperative interfaces could enable standoff connections without requiring a technician to climb a pole, improving safety and reducing access costs. Retrofitting aerial infrastructure to make it robot-friendly has been recommended [7] in the power industry and is also needed for broadband utilities.

Conclusion

A growing gap is emerging between rapidly evolving data-center infrastructure and the more traditional telecom networks downstream. As fiber density increases, testing, activation, and maintenance of last-mile networks are likely to become bottlenecks. One way ISPs and vendors can future-proof outside-plant infrastructure is by proactively incorporating automation- and robot-friendly design features. M2 is one practical example that helps inform how such transitions might begin.

Short video clip from our early field trial in Massachusetts:
https://youtube.com/shorts/MiDoQd_S6Kw

References:

[1] IEEE ComSoc Technology Blog, “How will fiber and equipment vendors meet the increased demand for fiber optics in 2026 due to AI data center buildouts?,” Dec. 23, 2025.

[2] U.S. Dept. of Commerce Office of Inspector General, “NTIA Broadband Programs: Semiannual Status Report,” Washington, DC, USA, Rep. no. OIG-25-031-I, Sept. 24, 2025.

[3] For an overview of FTTH architecture, see: Fiber Optic Association (FOA), FTTH Network Design Considerations, and Fiber Optic Association (FOA), FTTH and PON Applications.

[4] Corning Optical Communications, “Corning Recommended Fiber Optic Test Guidelines,” Hickory, NC, USA, Application Engineering Note LAN-1561-AEN, Feb. 2020.

[5] See the easy-to-use edge computing tools available from Edge Impulse.

[6] See state-of-the-art indoor optical switches such as ROME from NTT-AT and G5 from Telescent.

[7] Andrew Phillips, “Autonomous overhead transmission line inspection robot (TI) development and demonstration,” IEEE PES General Meeting, 2014.

About the Author:

Said Yakhyoev is a fiber optic technician with LightStep LLC in Colorado and a developer of the experimental Machine2 (M2) platform for automating fiber testing in outside-plant networks.

The author acknowledges the use of AI-assisted tools for language refinement and formatting.

Virtual RAN gets a boost from Samsung demo using Intel’s Granite Rapids/Xeon 6 SoC

Samsung is the fifth-largest worldwide RAN equipment vendor, behind Huawei, Ericsson, Nokia and ZTE. This week, the South Korean conglomerate claimed to have reached a virtual RAN (vRAN) milestone with the completion of a commercial phone call using Granite Rapids, Intel’s Xeon 6700P-B SoC processor series. The call took place on the network of a large, undisclosed U.S. network operator, widely believed to be Verizon. Samsung said, “this builds upon the company’s previous achievement in 2024, when it completed the industry-first end-to-end call in a lab environment with Intel Xeon 6 SoC.”

Samsung’s cloud-native vRAN with Intel’s latest Xeon SoC ran on a single commercial off-the-shelf (COTS) server from Hewlett Packard Enterprise with a cloud platform from Wind River. This milestone, coming only a few months after the first wave of Intel Xeon 6 SoC was made commercially available, presents an innovative pathway for single-server vRAN deployments for next-generation networks.

The commercial readiness of vRAN technology promises to give network operators the ability to run RAN and AI workloads on fewer, more powerful servers. 

Samsung wrote: “As operators accelerate their transition to software-driven, flexible architectures while seeking more sustainable infrastructure, the ability to run RAN and AI workloads on fewer, more powerful servers becomes critical. On a single server of Samsung’s AI-powered vRAN with enhanced processors, operators can consolidate software-driven network elements such as mobile core, radio access, transport and security, which traditionally required multiple servers, significantly simplifying the management of complex site configuration.”

Image Credit: Samsung

“This breakthrough represents a major leap forward in network virtualization and efficiency. It confirms the real-world readiness of this latest technology under live network conditions, demonstrating that single-server vRAN deployments can meet the stringent performance and reliability standards required by leading carriers,” said June Moon, Executive Vice President, Head of R&D, Networks Business at Samsung Electronics. “We are not only deploying more sustainable, cost-effective networks, but also laying the foundation to fully utilize AI capabilities more easily and prepare for 6G with our end-to-end software-driven network solutions.”

Samsung’s vRAN leverages the latest Intel Xeon 6 SoC with Intel Advanced Matrix Extensions (Intel AMX), Intel vRAN Boost and up to 72 cores, delivering significant improvements in AI processing, memory bandwidth and energy efficiency compared to the previous generation.

“With Intel Xeon 6 SoC, featuring higher core counts and built-in acceleration for AI and vRAN, operators get the compute foundation for AI native, future ready networks,” said Cristina Rodriguez, VP and GM, Network & Edge, Intel. “This collaborative achievement with Samsung, HPE and Wind River enables greater consolidation of RAN and AI workloads, lowering power and total cost while speeding innovation.”

Samsung has been leading the deployment of vRAN solutions with major network operators worldwide and has achieved many industry breakthroughs, including the industry’s first call on a commercial network and large-scale deployments utilizing Intel Xeon processors with Intel vRAN Boost. The company continues to push the boundaries of network virtualization, working closely with ecosystem partners like Intel to deliver solutions that help operators build networks that are more efficient and sustainable.

“This successful first call is an important milestone for the industry,” said Daryl Schoolar, Analyst and Director at Recon Analytics. “By demonstrating multiple network functions running on next-generation processing technology, Samsung is showing what future networks look like — more cloud-native, more scalable and significantly more efficient. This achievement moves the industry beyond theoretical performance gains and into practical, deployable innovation that operators around the world can leverage to modernize their networks, accelerate automation and better support AI-driven use cases.”

“With Samsung’s vRAN and Intel’s Xeon 6 SoC running on a single server, Samsung expects enhanced cost savings for operators,” said a Samsung spokesperson via email to Light Reading, when asked what cost impact Granite Rapids would have. “The ability to consolidate multiple network functions including RAN, core, transport and security onto a single, high-performance COTS server reduces hardware footprint, simplifies site design and lowers power consumption.”

Vodafone is one Samsung customer that now expects to benefit from the availability of Granite Rapids. In November, Paco Pignatelli, Vodafone’s head of open RAN, told Light Reading that the new Intel platform offers “much better capacity and efficiency” than its predecessors. That was several weeks after the telco had announced plans to deploy Samsung’s virtual RAN technology in Germany and other European markets, starting in 2026.

…………………………………………………………………………………………………………………………………….

vRAN Market Assessment:

Virtual RAN still accounts for a very small share of the overall market. In 2023, data from Omdia put its share at just 3% of the total RAN market. If vRAN is considered as part of the baseband RAN subsector, its share was about 10% that year, implying baseband represents about 30% of total expenditure on RAN products.
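The share arithmetic implied by those Omdia figures can be checked directly: a 3% share of total RAN combined with a 10% share of baseband RAN implies baseband is roughly 30% of total RAN spend.

```python
# Sanity check on the implied baseband share: vRAN was 3% of total RAN
# revenue but 10% of the baseband subsector, so baseband's share of the
# total is the ratio of the two.

vran_share_of_total = 0.03
vran_share_of_baseband = 0.10
baseband_share_of_total = vran_share_of_total / vran_share_of_baseband
print(f"{baseband_share_of_total:.0%}")   # 30%
```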

Hardware still dominates the RAN equipment business, but there is a rapid shift toward Commercial Off-The-Shelf (COTS) servers, particularly those using Intel’s Xeon 6 processors. Regionally, North America and Asia-Pacific are expected to remain the largest markets in 2026, together accounting for over 70% of global vRAN revenue.

Analyst Insights by Leading Telecom Market Research Firms:
  • Dell’Oro Group:
    • 2026 Stability: Predicts overall RAN revenues will remain “mostly stable” in 2026, but identifies AI-RAN, Cloud RAN, and Open RAN as favorable growth segments within that flat topline.
    • Market Share: Expects vRAN to account for 5% to 10% of the total RAN market by 2026.
    • Private Wireless: Forecasts that private wireless campus network RAN revenue will surpass USD 1 billion in 2026.
  • Omdia:
    • Growth Surge: Anticipates a doubling of vRAN’s market share by 2028. Specifically, it expects Open vRAN to reach a 16% share of the total RAN market in 2026, up from 7% in 2022.
    • Automation Focus: Forecasts the Service Management and Orchestration (SMO) category to grow at a massive 99% CAGR through 2030 as operators align with O-RAN architectures.
  • Research and Markets:
    • Estimates the global Open RAN market size will reach between USD 5.0 billion and USD 10.0 billion by 2026, driven by aggressive greenfield deployments. 
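To put the forecast figures above in perspective, compound growth at the quoted rates scales quickly: a 99% CAGR sustained for five years (the base year is assumed here purely for illustration) implies roughly a 31x multiple.

```python
# Compound-growth arithmetic on the Omdia SMO forecast cited above.
# The five-year horizon is an illustrative assumption.

def growth_multiple(cagr, years):
    """Total growth multiple after compounding at `cagr` for `years`."""
    return (1 + cagr) ** years

print(f"{growth_multiple(0.99, 5):.0f}x over 5 years")   # ~31x
```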

…………………………………………………………………………………………………………………………

References:

https://news.samsung.com/global/samsung-achieves-another-industry-first-virtualized-ran-milestone-accelerating-ai-native-6g-ready-networks

https://www.lightreading.com/5g/intel-and-samsung-add-to-pressure-on-purpose-built-5g

vRAN market disappoints – just like OpenRAN and mobile 5G

RAN silicon rethink – from purpose built products & ASICs to general purpose processors or GPUs for vRAN & AI RAN

LightCounting: Open RAN/vRAN market is pausing and regrouping

Dell’Oro: Private 5G ecosystem is evolving; vRAN gaining momentum; skepticism increasing

Heavy Reading: How network operators will deploy Open RAN and cloud native vRAN

Comparing AI Native mode in 6G (IMT 2030) vs AI Overlay/Add-On status in 5G (IMT 2020)

Executive Summary:

AI integration in 6G specifications (3GPP) and standards (ITU-R IMT 2030) highlights a strategic shift in the telecom industry towards AI-native networks, with heavyweights like Huawei, Samsung, Ericsson, and Nokia actively developing foundational technologies. Unlike 5G, where AI and machine learning were limited to add-on applications or features layered over the existing architecture, 6G will incorporate AI from the outset with an “AI-native” approach, in which embedded intelligence makes the network smart, agile, and able to learn and adapt to changing network dynamics.

This transformation is necessary because future 6G networks will be too complex for human operators to manage, requiring AI-empowered and learning-driven networks that can facilitate zero-touch network management through capabilities including learning, reasoning, and decision-making.

Key Developments and Analysis:
  • AI-Native Networks: The industry consensus is that 6G will be “AI-native,” meaning artificial intelligence will be built directly into the core functions of network control, resource management, and service orchestration. This moves AI from an optimization layer in 5G to a foundational element in 6G.

AI Native Image Courtesy of Ericsson

…………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………

  • Company Initiatives:
    • Huawei is focused on making AI a native element of the network architecture (AI-native 6G) rather than an overlay technology, integrating communication, sensing, computing, and intelligence. This vision, called “Connected Intelligence,” involves two aspects: AI for 6G (network automation) and 6G for AI (AI as a Service, AIaaS).  More in Huawei Research Areas below.
    • Samsung is a major proponent of AI-RAN (Radio Access Network) technology. The company hosted a summit in November 2025 to showcase working AI-RAN technology that autonomously optimizes network performance and is conducting joint research with SK Telecom (SKT) on AI-supported RAN. Samsung sees vRAN (virtualized RAN) as a key enabler for “AI-native, 6G-ready networks”.
    • Ericsson emphasizes the necessity of a strong 5G Standalone (5G SA) foundation for an AI future, using AI to manage and automate current networks in preparation for 6G’s demands. Ericsson is also integrating agentic AI into its platforms for more autonomous network management.
    • Nokia is deepening its AI push, licensing software to expand AI use in mobile networks and preparing for early field trials in 2026 by porting baseband software to platforms like NVIDIA’s, which opens the door for more advanced AI use cases.
  • Industry Analysis and Trends:
    • Standardization: 2026 is crucial as formal 6G specification work begins in earnest within 3GPP with Release 21. In WP5D, the IMT 2030 RIT/SRIT standardization work will commence at the February 2027 meeting with the final deadline for submissions at the February 2029 meeting.  More in the ITU-R WP5D section below. 
    • The AI-RAN Alliance is an industry initiative (not a traditional SDO) focused on accelerating real-world AI applications and integration within the RAN. It works alongside SDOs, providing industry insights and pushing for rapid validation and testing of AI-RAN technologies, with a specific focus on leveraging accelerated computing.
    • Automation and Efficiency: AI-native algorithms in 6G are expected to deliver extreme spectrum and energy efficiency, significantly reducing operational costs for telcos while improving reliability and performance.
    • Monetization Challenges: Despite the technological promise, analysts caution that 6G remains largely theoretical for now. Some operators are stalling on full 5G SA deployment, waiting to move to 6G-ready cores later in the decade, leading to concerns that 5G SA might become an “odd generation.”
    • Infrastructure Constraints: The physical demands of AI infrastructure, particularly energy consumption and construction timelines, are becoming operational realities that may bound the pace of AI growth in 2026, regardless of software advancements. 
    • ITU-R Working Party (WP) 5D is making AI a native and foundational element of the 6G (IMT-2030) system, rather than the “add-on” or “overlay” status it had in 5G (IMT 2020). This shift is being achieved through the definition of specific AI capabilities and requirements that future 6G technologies must inherently support. In particular:
  • Defining AI as a Core Capability: The Recommendation ITU-R M.2160 (“Framework and overall objectives of the future development of IMT for 2030 and Beyond”) officially defines “Artificial Intelligence and Communication” as one of the six major usage scenarios and an overarching design principle for IMT-2030.
  • Integrating AI into the Radio Interface: WP 5D is actively developing technical performance requirements (TPRs) and evaluation criteria for proposed 6G radio interface technologies (RITs) that inherently incorporate AI/Machine Learning (ML). This includes work on:
    • AI-enabled air interface design: This involves the physical layer, potentially moving towards AI-native physical (PHY) layers that can dynamically adapt waveforms and network parameters in real-time, rather than relying on predefined, static configurations.
    • AI-driven resource management: AI/ML algorithms will be crucial for real-time optimization of spectral and energy efficiency, managing complex traffic, and ensuring Quality of Service (QoS).
  • Enabling AI-Driven Services: The framework for IMT-2030 is designed to support the full lifecycle of AI components, from data collection and model training to deployment and performance monitoring, enabling new AI-driven services and applications directly within the network infrastructure.
  • Establishing a Formal Timeline: WP 5D has established a clear timeline for 6G standardization, with specific stages for vision, requirements, evaluation methodology, and specifications. This structured approach ensures that all proposed RITs/SRITs are evaluated against the new AI-native requirements, promoting global alignment and preventing AI from becoming a fragmented, proprietary solution.
    • Stage 1 (Vision): Completed in June 2023.
    • Stage 2 (Requirements & Evaluation): Targeted for completion in 2026.
    • Stage 3 (Specifications): Expected by the end of 2030.
6G, as envisioned in the ITU-R’s IMT-2030 framework, is being designed from the ground up as an “AI-native” system. 
  • Purpose: AI is integral to the entire network lifecycle, from initial design and deployment to autonomous operation and service creation.
  • Integration Level: Intelligence is embedded across all layers of the network stack, including the physical layer (air interface), control plane, and data plane.
  • Scope: AI enables core functionalities such as real-time self-optimization, self-healing capabilities, and dynamic resource allocation, rather than static, predefined configurations.
  • Outcome: The creation of a fully cognitive, self-managing, and highly adaptable “intelligence fabric” capable of supporting advanced use cases like real-time holographic communication, digital twins, and autonomous systems with ultra-low latency. 
Comparing AI as an overlay in 5G (IMT 2020) vs AI native mode in 6G (IMT 2030):
| Feature | 5G (IMT-2020) | 6G (IMT-2030) |
| AI Role | Optimization tool (overlay) | Foundational and native element |
| Network Operation | Manual configuration with AI assistance | Autonomous and self-managing |
| Air Interface | Human-designed with some ML optimization | AI/ML-designed and managed |
| Complexity Management | Relies on standard protocols | Manages complexity through embedded AI/ML |
| Services Supported | Enhanced mobile broadband, basic IoT | Integrated AI & Communication, sensing, holographic comms |

By embedding AI into the fundamental design principles and technical requirements of IMT-2030, ITU-R WP 5D is ensuring that 6G will be an AI-native network capable of self-management and self-optimization and able to support a vast ecosystem of AI applications, a significant shift from the supplementary role AI played in 5G.

……………………………………………………………………………………………………………………………………………………………………………………………………………………………………………..
Huawei’s Research Areas and Activities:
  • Agentic-AI Core (A-Core): Huawei unveiled a blueprint for a 6G core network (which will be specified by 3GPP and NOT ITU) where services are managed by specialized AI agents using a large-scale network AI model called “NetGPT”. This allows the network to program, update, and execute its own control procedures automatically without human intervention, based on natural language instructions.
  • Network Architecture Redesign: Huawei proposes the NET4AI system architecture, a service-oriented design that moves beyond the 5G service-based architecture. It introduces a dedicated data plane (DP) to handle the massive volume of data generated by AI and sensing services, enabling flexible and efficient many-to-many data flow for distributed learning and inference.
  • Integrated Sensing and Communication (ISAC): A core pillar of Huawei’s 6G work is the native integration of sensing with communication. This allows the network to use radio waves for high-resolution sensing, localization, and imaging, creating a “digital twin” of the physical world. The large volume of data collected from sensing then serves as a source for AI model training and real-time environmental monitoring.
  • Distributed Machine Learning: Huawei researches deep-edge architecture to enable massive, distributed, and collaborative machine learning (ML). This includes the development of frameworks like a two-level learning architecture that combines federated learning (FL) and split learning (SL) to optimize computing resources and ensure data privacy by keeping raw data local to devices.
  • AI as a Service (AIaaS): The 6G network is designed to provide AI capabilities as a service, allowing the training and inference of large AI models to be distributed across the network (edge and cloud). This offers low-latency performance and access to rich data for AI-driven applications like collaborative robotics and autonomous driving.
  • Energy Efficiency and Sustainability: The company is researching how native AI capabilities can improve overall energy efficiency by up to 100 times compared to 5G. This involves smart energy control, dynamic resource scaling, and optimizing communication paths for lower power consumption.
  • Standardization and White Papers: Huawei is actively contributing to global 6G discussions and standardization bodies like the ITU-R, sharing its vision through publications such as the book 6G: The Next Horizon – From Connected People and Things to Connected Intelligence and various technical white papers. The goal is to define the technical specifications and use cases for 6G that will drive industry-wide innovation by around 2030. 
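Huawei's two-level learning architecture combines federated and split learning; the federated half can be sketched in a few lines. Each device computes a gradient on its local data, and only the sample-count-weighted gradients are aggregated, so raw data never leaves the device. The linear model and toy data below are illustrative, not Huawei's framework.

```python
# Minimal federated-averaging sketch: devices share gradients, not data.
# The model y = w*x and the toy datasets are illustrative assumptions.

def local_gradient(w, data):
    """Mean-squared-error gradient for y = w*x on local (x, y) pairs."""
    return sum(2 * (w * x - y) * x for x, y in data) / len(data)

def fedavg_round(w, devices, lr=0.01):
    """Average per-device gradients, weighted by local sample count."""
    total = sum(len(d) for d in devices)
    g = sum(len(d) * local_gradient(w, d) for d in devices) / total
    return w - lr * g

# Two devices whose local data both follow y = 3x; raw data stays local.
devices = [[(1, 3), (2, 6)], [(3, 9)]]
w = 0.0
for _ in range(200):
    w = fedavg_round(w, devices)
print(round(w, 2))   # converges toward 3.0
```

In a real deep-edge deployment the same pattern applies to neural network weights, with split learning used to offload part of each model to edge servers.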
In summary, the telecom industry is laying the critical groundwork for an AI-native 6G era through research, standard setting, and strategic investments in AI-powered network solutions, even as commercial deployment remains several years away. Decisions must be made on spectrum use (especially in the FR3 range of 7-24 GHz), silicon roadmaps, and network architectures which will have lasting impact.
………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………………..

References:

https://www.ericsson.com/en/reports-and-papers/white-papers/ai-native

Roles of 3GPP and ITU-R WP 5D in the IMT 2030/6G standards process

AI wireless and fiber optic network technologies; IMT 2030 “native AI” concept

ITU-R WP5D IMT 2030 Submission & Evaluation Guidelines vs 6G specs in 3GPP Release 20 & 21

ITU-R WP 5D Timeline for submission, evaluation process & consensus building for IMT-2030 (6G) RITs/SRITs

ITU-R WP 5D reports on: IMT-2030 (“6G”) Minimum Technology Performance Requirements; Evaluation Criteria & Methodology

AI wireless and fiber optic network technologies; IMT 2030 “native AI” concept

Highlights of 3GPP Stage 1 Workshop on IMT 2030 (6G) Use Cases

Should Peak Data Rates be specified for 5G (IMT 2020) and 6G (IMT 2030) networks?

GSMA Vision 2040 study identifies spectrum needs during the peak 6G era of 2035–2040

Highlights and Summary of the 2025 Brooklyn 6G Summit

NGMN: 6G Key Messages from a network operator point of view

Nokia and Rohde & Schwarz collaborate on AI-powered 6G receiver years before IMT 2030 RIT submissions to ITU-R WP5D

Verizon’s 6G Innovation Forum joins a crowded list of 6G efforts that may conflict with 3GPP and ITU-R IMT-2030 work

Nokia Bell Labs and KDDI Research partner for 6G energy efficiency and network resiliency

Deutsche Telekom: successful completion of the 6G-TakeOff project with “3D networks”

Market research firms Omdia and Dell’Oro: impact of 6G and AI investments on telcos

Qualcomm CEO: expect “pre-commercial” 6G devices by 2028

Ericsson and e& (UAE) sign MoU for 6G collaboration vs ITU-R IMT-2030 framework

KT and LG Electronics to cooperate on 6G technologies and standards, especially full-duplex communications

Highlights of Nokia’s Smart Factory in Oulu, Finland for 5G and 6G innovation

Nokia sees new types of 6G connected devices facilitated by a “3 layer technology stack”

Rakuten Symphony exec: “5G is a failure; breaking the bank; to the extent 6G may not be affordable”

India’s TRAI releases Recommendations on use of Tera Hertz Spectrum for 6G

New ITU report in progress: Technical feasibility of IMT in bands above 100 GHz (92 GHz and 400 GHz)

 

Dell’Oro: Fixed Wireless Access revenues +10% in 2025 & will continue to grow 10% annually through 2029

5G Fixed Wireless Access (FWA), along with Private 5G, has become quite popular despite not being among the ITU-R use cases for IMT 2020 (5G). According to a new report by the Dell’Oro Group, FWA is experiencing continued strong growth. The technology’s straightforward deployment and the increasing availability of 4G LTE and 5G Sub-6GHz networks are driving its adoption for both residential and enterprise connectivity. Sub‑6 GHz 5G in particular combines wide‑area coverage with better indoor penetration and capacity, making it attractive for operators as a mass‑market broadband alternative to DSL and cable.
Preliminary projections indicate that total FWA revenues—encompassing RAN equipment, residential CPE, and enterprise router and gateway revenue—are poised to grow by 10% in 2025. This advancement is fueled by mobile operators expanding FWA service availability into new markets, aiming to capture subscribers currently using DSL and cable broadband services.
“In the US, we continue to see the largest mobile operators expand their availability of FWA services in both existing and new markets, especially as FWA service revenue has boosted overall earnings,” said Jeff Heynen, Vice President with the Dell’Oro Group. “Mobile operators in India, Southeast Asia, Europe, and the Middle East are taking a page from the US operators’ book and are quickly expanding their own FWA offerings, especially with the imminent threat of Starlink, Amazon, OneWeb, and other LEO satellite broadband providers,” added Heynen.

Additional highlights from the Fixed Wireless Access Infrastructure and CPE Advanced Research Report:

  • Total FWA subscriptions, which include residential, SMB, and large enterprises, are expected to grow steadily, surpassing 191 million by 2029.
  • 5G Sub-6GHz and mmWave units will dominate the global residential CPE market.

About the Report:

The Dell’Oro Group Fixed Wireless Access Infrastructure and CPE Report includes 5-year market forecasts for FWA CPE (Residential and Enterprise) and RAN infrastructure, segmented by technology, including 802.11/Other, 4G LTE, CBRS, 5G sub-6GHz, 5G mmWave, and 60GHz technologies. The report also includes regional forecasts for FWA subscriptions, including for both residential and enterprise markets, with the enterprise subscriptions segmented by SMB and Large Enterprise. To purchase this report, please contact us by email at [email protected].

………………………………………………………………………………………………………………………………………………………………………

Independent Analysis via Perplexity.ai:

Fixed Wireless Access Schematic Diagrams

……………………………………………………………………………………………………………………………………………………………

Demand-side drivers:

  • Rising demand for high‑speed home and enterprise broadband, including video streaming, gaming, and cloud/SaaS, in areas poorly served by DSL or legacy cable.

  • Customer appetite for quick‑install, no‑truck‑roll broadband that can be activated using wireless CPE instead of waiting for fiber construction.

  • Growing need for reliable connectivity for remote work, distance learning, and SME digitization, especially in suburban and rural regions.

Supply-side / operator economics:

  • Ability to leverage existing 4G LTE macro grids and sub‑6 GHz spectrum, with incremental capex mainly in CPE and software rather than full new access builds.

  • Refarming of LTE spectrum and overlay of 5G NR on the same bands allows operators to run both mobile broadband and FWA on a common RAN/core.

  • Attractive ROI relative to fiber in low‑density areas, since one macro site at sub‑6 GHz can cover large rural or ex‑urban footprints.

Technology and spectrum factors (4G & sub‑6 GHz 5G):

  • 4G LTE coverage ubiquity: years of investment mean LTE already reaches most urban, suburban, and many rural markets, making LTE‑FWA immediately deployable.

  • Sub‑6 GHz 5G propagation: better penetration through buildings and walls than higher bands, enabling more reliable indoor FWA without extensive outdoor CPE alignment.

  • Massive MIMO and beamforming on sub‑6 GHz bands increase sector capacity and improve non‑line‑of‑sight performance, which is critical for FWA quality at cell edge.
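The propagation advantage of sub‑6 GHz over mmWave in the bullets above can be roughly quantified with the standard free‑space path loss formula, FSPL(dB) = 20·log10(d_km) + 20·log10(f_MHz) + 32.44. A quick sketch (free space only; real links add wall and foliage losses that widen the gap further for mmWave):

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB, for distance in km and frequency in MHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Identical 1 km link: mid-band 5G (3.5 GHz) vs mmWave (28 GHz)
gap = fspl_db(1.0, 28_000) - fspl_db(1.0, 3_500)
print(round(gap, 1))  # ~18.1 dB more free-space loss at 28 GHz
```

An 18 dB deficit must be recovered with antenna gain (beamforming) or shorter cell radii, which is exactly why sub‑6 GHz dominates wide-area FWA while mmWave is reserved for dense, short-range deployments.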

Competitive and regulatory drivers:

  • Mobile operators using FWA to attack cable and DSL bases; in several markets FWA contributes a high share of net broadband additions, pressuring incumbents on price and speed.

  • Government rural‑broadband programs and subsidies (e.g., U.S. RDOF‑type initiatives) encourage use of FWA as a cost‑effective tool to close the digital divide.

  • Regulatory allocation of additional mid‑band and sub‑6 GHz spectrum (e.g., 3–4 GHz bands) increases usable capacity and supports scaling FWA to millions of homes.

Market growth indicators:

  • FWA market value is growing at double‑digit CAGRs, with 4G still a large share today but 5G FWA projected to dominate new subscriptions by the late 2020s.

  • Sub‑6 GHz FWA gateways and CPE are a rapidly expanding device segment, driven by operator deployments targeting residential and SME broadband.

…………………………………………………………………………………………………………………………………………………………………….

References:

https://www.delloro.com/news/fwa-infrastructure-and-cpe-spending-will-remain-above-10-billion-annually-through-2029/

Fiber and Fixed Wireless Access are the fastest growing fixed broadband technologies in the OECD

Ookla: FWA Speed Test Results for big 3 U.S. Carriers & Wireless Connectivity Performance at Busy Airports

Point Topic: Global Broadband Subscribers in Q2 2025: 5G FWA, DSL, satellite and FTTP

Aviat Networks and Intracom Telecom partner to deliver 5G mmWave FWA in North America

T-Mobile’s growth trajectory increases: 5G FWA, Metronet acquisition and MVNO deals with Charter & Comcast

Dell’Oro: 4G and 5G FWA revenue grew 7% in 2024; MRFR: FWA worth $182.27B by 2032

Latest Ericsson Mobility Report talks up 5G SA networks and FWA

Highlights of Qualcomm 5G Fixed Wireless Access Platform Gen 3; FWA and Cisco converged mobile core network

Ericsson: Over 300 million Fixed Wireless Access (FWA) connections by 2028

 

China ITU filing to put ~200K satellites in low earth orbit while FCC authorizes 7.5K additional Starlink LEO satellites

China has submitted regulatory filings with the International Telecommunication Union (ITU) to put approximately 200,000 satellites in orbit.  It’s part of a national strategy to secure orbital positions and radio frequencies for a massive low-Earth orbit (LEO) broadband satellite network (aka a Non-Terrestrial Network or NTN).
The vast majority of these new satellites are from a new joint government-industry body called the Radio Spectrum Development and Technology Innovation Institute (RSDTII), discussed below, which has applied to launch a total of 193,000 satellites for two non-geostationary constellations, CTC-1 and CTC-2. It is the first disclosure of these two constellations, about which no other details have been confirmed.
The ITU filings were made in December by various Chinese entities, with two constellations alone accounting for nearly 97,000 satellites each. These applications are subject to strict ITU “use it or lose it” provisions, which mandate that operators deploy the first satellite within seven years of application and complete the entire constellation rollout within 14 years.
  • Purpose: The planned systems are intended to provide global broadband connectivity, data relay, and positioning services, directly competing with U.S. efforts like SpaceX’s Starlink network.
  • Filing Entities: The primary filings were submitted by the state-backed RSDTII, along with other commercial and state-owned companies like China Mobile and Shanghai Spacecom.
  • Status: These filings are an initial step in a long international regulatory process and serve as a claim to limited spectrum and orbital slots. They do not guarantee all satellites will ultimately be built or launched. The actual deployment will be a gradual process over many years.
  • Context: The move is part of an escalating “space race” to dominate the LEO environment. Early filings are crucial for securing priority access to orbital resources and avoiding signal interference. The sheer scale of the Chinese proposal would, if realized, dwarf most other planned constellations.
  • Regulations: Under ITU rules, operators must deploy a certain percentage of the satellites within seven years of the initial filing to retain their rights.
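The “use it or lose it” deadlines described above reduce to simple date arithmetic. A sketch, using a hypothetical December 1, 2025 filing date (the article confirms only that the filings were made in December, not the exact day or year of record):

```python
from datetime import date

def itu_deadlines(filing: date) -> dict[str, date]:
    """Deadlines implied by the ITU provisions described above: first satellite
    deployed within 7 years of filing, full constellation rollout within 14.
    (Naive year arithmetic; a Feb 29 filing date would need special handling.)"""
    return {
        "first_satellite": filing.replace(year=filing.year + 7),
        "full_rollout": filing.replace(year=filing.year + 14),
    }

d = itu_deadlines(date(2025, 12, 1))  # hypothetical filing date
print(d["first_satellite"], d["full_rollout"])  # 2032-12-01 2039-12-01
```

Under these assumptions, the CTC constellations would need a first satellite on orbit by late 2032 and all ~193,000 satellites deployed by late 2039, which underscores how aggressive the filing is relative to current Chinese launch capacity.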
Several Chinese entities are actively pursuing the expansion of their low-Earth orbit (LEO) satellite constellations, signaling a significant push in the nation’s space technology sector. 
  • Shanghai Yuanxin (Qianfan), currently China’s most advanced LEO satellite operator, has submitted a regulatory request for an additional 1,296 satellites.
  • Telecommunications giant China Mobile is planning two separate constellations totaling 2,664 satellites.
  • ChinaSat, the established state-owned satellite provider, is focusing on a 24-satellite medium-Earth orbit (MEO) system.
  • GalaxySpace, a private satellite manufacturer based in Beijing, has applied for 187 satellites, and China Telecom has applied for 12. 

Image Credit: Klaus Ohlenschlaeger/Alamy Stock Photo

The RSDTII (Radio Spectrum Development and Technology Innovation Institute) is a hybrid entity merging government bodies—including the Ministry of Industry and Information Technology’s (MIIT) State Radio Monitoring Center—with local Xiongan departments, the military-affiliated electronics conglomerate CETC, and ChinaSat. The RSDTII’s creation appears to be the latest governmental restructuring effort aimed at stimulating domestic satellite development and closing the technological gap with international competitors like Starlink. 
The RSDTII’s application for an exceptionally large number of orbital slots (200,000) for projects still in the conceptual phase represents an ambitious strategic claim. To contextualize, SpaceX’s Starlink currently operates approximately 9,500 satellites and has FCC approval for a further 7,500 Gen2 satellites, with long-term plans potentially reaching 42,000 satellites. 
Achieving China’s projected deployment schedule faces logistical challenges, primarily regarding current launch vehicle capacity. China’s commercial LEO initiatives have only recently matured: 303 commercial satellites were launched in the past year, out of a total national fleet of 800 satellites in orbit. China currently manages three primary LEO constellations: the GW system (operated by China Sat-Net), the G60 system (operated by Shanghai Yuanxin/Qianfan), and the smaller Honghu-3 project. 
…………………………………………………………………………………………………………………………………………………..
In the U.S., the FCC has authorized 7,500 additional Starlink satellites in low Earth orbit, giving parent company SpaceX options to add capacity for fixed Internet and direct-to-device (D2D) mobile services.  The FCC order increases the number of satellites Starlink can launch by roughly 50%, expanding approved launches from approximately 12,000 to 19,000. Half of the new satellites are required to be in orbit and operational by December 1, 2028, and the remainder by December 1, 2031.
At the end of December 2025, the Starlink system comprised more than 9,000 fixed broadband satellites in orbit and over 650 that support D2D mobile services.  SpaceX originally requested permission for nearly 30,000 new satellites, but the FCC decided to proceed “incrementally” and defer approval for the roughly 15,000 remaining satellites, which include those proposed to operate above 600 km (373 miles).

“This gives SpaceX what they need for the next couple of years of operation. They’re launching a bit over 3,000 satellites a year, so 7,500 satellites being authorized is potentially enough for SpaceX to do what they want to do until late 2027,” said Tim Farrar, satellite analyst and president at TMF Associates.
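Farrar’s back-of-the-envelope estimate is easy to verify. A sketch using the two figures quoted above (both approximate):

```python
# Figures quoted above; both are approximations
newly_authorized = 7_500     # satellites in the new FCC authorization
launches_per_year = 3_000    # "a bit over 3,000 satellites a year"

headroom_years = newly_authorized / launches_per_year
print(headroom_years)  # 2.5 years of launch headroom, i.e. roughly late 2027
```

Since the actual launch rate is "a bit over" 3,000 per year, the real headroom is slightly under 2.5 years, consistent with Farrar's "until late 2027" estimate.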

SpaceX has plans for a larger D2D satellite constellation that would use the AWS-4 and H-block spectrum it is acquiring from EchoStar. It is awaiting FCC approval for the US$17 billion deal, but the spectrum is not expected to be transferred until the end of November 2027. 

The FCC noted that the changes will allow the Starlink system to serve more customers and deliver “gigabit speed service.” Along with permission for another tranche of satellites, the FCC has set new parameters for frequency use and lower orbit altitudes. The modified authorizations will also apply to new satellites to be launched. 

Starlink’s LEO satellite network competitors are Amazon Leo, OneWeb and AST SpaceMobile.

………………………………………………………………………………………………………………………………………………………..

References:

U.S. BEAD overhaul to benefit Starlink/SpaceX at the expense of fiber broadband providers

Huge significance of EchoStar’s AWS-4 spectrum sale to SpaceX

Telstra selects SpaceX’s Starlink to bring Satellite-to-Mobile text messaging to its customers in Australia

SpaceX launches first set of Starlink satellites with direct-to-cell capabilities

SpaceX has majority of all satellites in orbit; Starlink achieves cash-flow breakeven

Amazon Leo (formerly Project Kuiper) unveils satellite broadband for enterprises; Competitive analysis with Starlink

NBN selects Amazon Project Kuiper over Starlink for LEO satellite internet service in Australia

GEO satellite internet from HughesNet and Viasat can’t compete with LEO Starlink in speed or latency

Amazon launches first Project Kuiper satellites in direct competition with SpaceX/Starlink

Vodafone and Amazon’s Project Kuiper to extend 4G/5G in Africa and Europe
