Nvidia AI-RAN survey results; AI inferencing as a reinvention of edge computing?

An increasing focus on deploying AI into radio access networks (RANs) was among the key findings of Nvidia’s third annual “State of AI in Telecommunications” survey, with more than a third of respondents indicating they are investing or planning to invest in AI-RAN. The survey polled more than 450 telecommunications professionals worldwide, revealing continued momentum for AI adoption, including growth in generative AI use cases, and how the technology is helping operators optimize customer experiences and increase employee productivity. The percentage of network operators planning to use open source tools increased from 28% in 2023 to 40% in 2025. AvidThink founder and principal Roy Chua said one of the biggest challenges network operators will face when using open source models is vetting the outputs they get during training.

Of the telecommunications professionals surveyed, almost all stated that their company is actively deploying or assessing AI projects. Here are some top insights on impact and use cases:

  • 84% said AI is helping to increase their company’s annual revenue
  • 77% said AI helped reduce annual operating costs
  • 60% said increased employee productivity was their biggest benefit from AI
  • 44% said they’re investing in AI for customer experience optimization, which is the No. 1 area of investment for AI in telecommunications
  • 40% said they’re deploying AI into their network planning and operations, including RAN

The percentage of respondents who indicated they will build AI solutions in-house rose from 27% in 2024 to 37% this year. “Telcos are really looking to do more of this work themselves,” said Chris Penrose, Nvidia’s global head of business development for telco [1]. “They’re seeing the importance of them taking control and ownership of becoming an AI center of excellence, of doing more of the training of their own resources.”

With respect to AI inferencing, Penrose said, “We’ve got 14 publicly announced telcos that are doing this today, and we’ve got an equally big funnel.” He noted that the AI skills gap remains the biggest hurdle for operators: an AI scientist is not necessarily a generative AI or agentic AI specialist. And to attract the right talent, operators need to demonstrate that they have the infrastructure, such as GPUs and data center facilities, that will allow top-tier employees to do amazing work.

Note 1.  Penrose represented AT&T’s IoT business for years at various industry trade shows and events before leaving the company in 2020.

Rather than in the large data centers that process AI large language models (LLMs), AI inferencing could be done more quickly at smaller “edge” facilities closer to end users. That’s where telecom operators might step in. “Telcos are in a unique position,” Penrose told Light Reading. He explained that many countries want to ensure that their AI data and operations remain inside national borders, so telcos can be “the trusted providers of [AI] infrastructure in their nations.”

“We’ll call it AI RAN-ready infrastructure. You can make money on it today. You can use it for your own operations. You can use it to go drive some services into the market. … Ultimately your network itself becomes a key anchor workload,” Penrose said.


Nvidia proposes that network operators can not only run their own AI workloads on Nvidia GPUs but also sell those inferencing services to third parties at a profit. “We’ve got lots of indications that many [telcos] are having success, and have not only deployed their first [AI compute] clusters, but are making reinvestments to deploy additional compute in their markets,” Penrose added.

Nvidia specifically pointed to AI inferencing announcements by Singtel, Swisscom, Telenor, Indosat and SoftBank.

Other vendors are hoping for similar sales.  “I think this vision of edge computing becoming AI inferencing at the end of the network is massive for us,” HPE boss Antonio Neri said last year, in discussing HPE’s $14 billion bid for Juniper Networks.

That comes after multi-access edge computing (MEC) failed to live up to its potential, partly because it requires a 5G standalone (SA) core network and few of those have been commercially deployed. Edge computing disillusionment is evident among hyperscalers and network operators alike. For example, Cox folded its edge computing business into its private networks operation, AT&T no longer discusses the edge computing locations it was building with Microsoft and Google, and Verizon has admitted to edge computing “miscalculations.”

Will AI inferencing be the savior for MEC? The jury is still out. However, Nvidia said that 40% of its revenue already comes from AI inferencing. Presumably that inferencing is happening in larger data centers and then delivered to nearby users, meaning a significant amount of inferencing is being done today without additional facilities, distributed at the network’s edge, that could enable speedier, low-latency AI services.
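To put the latency question in rough perspective, here is a back-of-envelope sketch comparing the network round trip to a hypothetical metro edge site versus a distant regional cloud data center. The distances, fiber propagation speed and per-request model compute time are illustrative assumptions, not measured figures, but they show why propagation delay can be small relative to model compute time for generative AI workloads.

```python
# Back-of-envelope latency comparison: edge site vs. regional cloud data center.
# All numbers below are illustrative assumptions, not measured figures.

FIBER_SPEED_KM_PER_MS = 200.0  # light in optical fiber covers roughly 200 km per millisecond


def one_way_propagation_ms(distance_km: float) -> float:
    """One-way propagation delay over fiber for the given distance."""
    return distance_km / FIBER_SPEED_KM_PER_MS


# Hypothetical placements of the inferencing hardware
scenarios = {
    "metro edge site": 50,         # km from the user
    "regional data center": 1500,  # km from the user
}

MODEL_COMPUTE_MS = 200  # assumed per-request compute time for a generative AI response

for label, distance_km in scenarios.items():
    network_rtt_ms = 2 * one_way_propagation_ms(distance_km)
    total_ms = network_rtt_ms + MODEL_COMPUTE_MS
    print(f"{label:22s} network RTT ~ {network_rtt_ms:5.1f} ms, "
          f"total with compute ~ {total_ms:6.1f} ms")
```

With these assumed numbers, the edge site trims roughly 15 ms of round-trip network time, which matters for latency-critical services but is a small fraction of the total when model compute dominates.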

“The idea that AI inferencing is going to be all about low-latency connections, and hence stuff like AI RAN and MEC and assorted other edge computing concepts, doesn’t seem to be a really good fit with the current main direction of AI applications and models,” argued Disruptive Wireless analyst Dean Bubley in a LinkedIn post.

References:

https://blogs.nvidia.com/blog/ai-telcos-survey-2025/

State of AI in Telecommunications

https://www.lightreading.com/ai-machine-learning/telcos-profiting-from-ai-inferencing-we-ve-been-here-before

https://www.fierce-network.com/premium/whitepaper/edge-computing-powered-global-ai-inference

https://www.fierce-network.com/cloud/are-ai-services-telcos-magic-revenue-bullet

The case for and against AI-RAN technology using Nvidia or AMD GPUs

Ericsson’s sales rose for the first time in 8 quarters; mobile networks need an AI boost

AI RAN Alliance selects Alex Choi as Chairman

Markets and Markets: Global AI in Networks market worth $10.9 billion in 2024; projected to reach $46.8 billion by 2029

AI sparks huge increase in U.S. energy consumption and is straining the power grid; transmission/distribution as a major problem

Tata Consultancy Services: Critical role of Gen AI in 5G; 5G private networks and enterprise use cases
