Will Google Cloud’s AI and data analytics revenue and TPU IP licensing income offset huge AI CAPEX to produce a decent ROI?

An April 24th Investor’s Business Daily (IBD) article asserts that Google’s AI position is strong, but the real test will be monetization. Specifically, can Gemini translate technical lead and user scale into durable profits for parent company Alphabet? The company has benefited from AI enthusiasm and Google Cloud momentum, but investors are now focused on whether heavy AI spending will generate sufficient revenues to justify the enormous capex ramp-up. The article highlights Gemini’s growing traction, Google Cloud’s rapid expansion, and a very large backlog as signs of demand, but it also stresses that those positives must offset rising infrastructure costs.

With its Gemini family, Google continues to push its AI technology across the “stack” (see quote below), deploying it to Google Maps, enterprise Workspace productivity tools, and YouTube’s content and ad platforms. AI technology is even making Google’s autonomous vehicle company, Waymo, better and safer amid its large market expansion.

A key theme is that Google has multiple ways to earn revenue from AI, including consumer subscriptions, enterprise software, and cloud services. The article points to Gemini Advanced as an example of paid AI packaging, while also implying that the larger opportunity is converting AI usage into higher-value cloud and platform revenue rather than just user growth. However, Alphabet is planning very large AI infrastructure spending (much more below), and the article questions whether the company can turn that investment into sustainable high-margin revenue fast enough to satisfy investors.

Google has also ventured into AI semiconductors with its Tensor Processing Unit (TPU) AI accelerator, co-developed with Broadcom and manufactured by TSMC (Taiwan Semiconductor Manufacturing Company). Google is shifting future TPU generation designs to include MediaTek for design support, with TSMC continuing as the primary fabrication partner at advanced 5nm, 3nm, and 2nm nodes.

Google has recently introduced the 7th-gen “Ironwood” TPU and revealed plans for the 8th-gen TPU 8t (Sunfish) and TPU 8i (Zebrafish) for 2027. Longtime colleague Amin Vahdat, PhD wrote in a blog post, “We are introducing the eighth generation of Google’s custom Tensor Processing Unit (TPU), coming soon with two distinct, purpose-built architectures for training and inference: TPU 8t and TPU 8i. These two chips are designed to power our custom-built supercomputers, to drive everything from cutting-edge model training and agent development, to massive inference workloads. TPUs have been powering leading foundation models, including Gemini, for years. These 8th generation TPUs together will deliver scale, efficiency and capabilities across training, serving and agentic workloads.”

Image credit:  Google.

Indeed, Google’s TPUs have emerged as a threat to Nvidia’s dominance in the AI chip market. Anthropic has licensed Google’s TPU accelerators for use in data centers. Broadcom will modify the TPUs for Anthropic before the customized chips are made by TSMC. Wells Fargo estimates that Google could bring in over $10 billion in high-margin intellectual property (IP) licensing fees from TPUs in 2026 and 2027.

“What stands out about Google is that they’ve been investing up and down the technology stack, from silicon to the AI models,” said Daniel Flax, managing director at investment management firm Neuberger Berman. “While competition is fierce, they’ve been able to innovate. What we’re focused on is (Google’s) ability to execute on their product road map from one generation of AI models to the next.”

………………………………………………………………………………………………………………………………………………………………………………………………………………………………………….

AI Competition from OpenAI and Anthropic:

Google faces lots of AI competition from other hyperscalers (Amazon, Microsoft, Meta, etc.) and especially from two private AI companies:

  1. OpenAI remains a major AI player, powered by the rapid advance of ChatGPT, which launched in 2022. In its latest funding round, OpenAI landed $122 billion in capital commitments, which values the company at $852 billion. GPT-6, OpenAI’s next-generation AI model, could arrive as soon as late 2026. GPT-6 is expected to include new memory features that support the personalization of AI chatbots. It’ll also offer more support for autonomous AI agents that perform tasks over the internet.
  2. Anthropic’s Claude AI model family has grabbed the spotlight this year. With Claude-based coding and other AI tools, Anthropic shook up the enterprise software market. Anthropic is preparing a next-generation, more powerful AI model called Mythos. Anthropic recently raised $30 billion in a funding round that valued the AI company at $380 billion.

………………………………………………………………………………………………………………………………………………………………………………………………………………………………………….

AI Cloud Competition:

Google’s cloud computing business is one area that should benefit from the company’s AI spending. The unit has excellent momentum. Cloud revenue climbed 47% to over $16 billion in the December quarter, up from 34% growth in the previous quarter. And Google’s cloud computing sales backlog grew 55% to $240 billion from the September quarter. AWS still has the largest cloud market share, with Azure second and Google Cloud third. Google Cloud’s edge is AI and data analytics, especially through Vertex AI, Gemini-related services, and TPU-based infrastructure. The company has developed Gemini AI models targeting specific industries, such as financial services and pharmaceutical companies. With the recent $32 billion purchase of Wiz, Google plans to offer AI-based cybersecurity threat detection tools.

Google Cloud is growing faster than AWS on an AI-driven basis, but it still trails Azure in the most AI-sensitive growth comparisons and remains third in overall cloud share. The broad pattern is: AWS leads in scale, Azure leads in AI momentum and enterprise pull, and Google Cloud is the strongest “AI-first” challenger with faster growth than AWS but a smaller base.  Recent comparisons show AWS revenue growth around 18% year over year, while Google Cloud grew about 32%, and Azure’s estimated growth was about 39% in the same period.

Microsoft’s Intelligent Cloud segment growth was also faster than AWS’s. The rough share split cited in recent coverage is AWS about 30%, Azure about 20%, and Google Cloud about 13%. Azure’s edge is enterprise distribution and the Azure OpenAI ecosystem, while AWS offers the broadest infrastructure catalog and strong AI tooling but is less clearly identified as the AI growth leader. The investor takeaway: Google Cloud looks like the fastest-improving AI cloud franchise relative to its size, but not the biggest one. The real question is whether Google’s AI-led growth can stay above AWS while also narrowing the gap with Azure’s enterprise AI momentum.

Monetization is a Major Issue:

Many analysts say it’s unclear how many consumers will pay for AI. Only about 5% of ChatGPT’s user base is paid. “Consumer AI is becoming a distribution channel and brand builder, while enterprise agents are where the high-margin, sticky revenue is actually getting locked in,” Ben Lorica, editor of the Gradient Flow AI newsletter, told IBD in an interview. “Widespread platform promiscuity across ChatGPT, Gemini and Claude signals low switching costs and thin margins, which is not a great recipe for durable revenue.”

“Cloud, AI revenues have to scale fast enough for people to say, ‘OK, this is actually working,'” said Michael Landsberg, chief executive of Landsberg Bennett Private Wealth Management. “With Google, a lot of things are going very well, but when is it going to translate into money in the pocket? Gemini is doing really well gaining market share from ChatGPT. But there’s no money yet,” Landsberg added. “The big issue around Google search is, ‘Are they going to be able to put advertising in Gemini?'”

“I think most people want free AI because we’ve been trained that free is how we do this computer thing,” said Kimberly Forrest, Bokeh Capital Partners’ chief investment officer. “Facebook, Instagram — it’s all free now. There might be some people willing to spend $20 monthly on AI, but probably not enough to generate the income that these models need to be continually improved.”

Alphabet has historically monetized consumer products through advertising rather than subscriptions. “I think the average consumer doesn’t want to pay for AI, and if they do, they certainly don’t want to pay much for AI,” said Tim Ghriskey, senior portfolio strategist at Ingalls & Snyder.

Author’s Note:  I regularly use Gemini for Home on my Google Smart Speaker and a different Gemini on PCs and my Samsung phone. There’s a huge difference in performance, with the former making many more mistakes and “AI hallucinations” than the latter. The reason is that Gemini for Home and regular Gemini run on two totally different AI systems. For reasons neither I nor Gemini for Home can explain, the Home version is severely deficient, with many wrong answers and hallucinations that you don’t get when you use Gemini on a PC or the Gemini app on a smartphone.

One particularly bothersome Gemini for Home response to a question or complaint is: “These pictures should match,” “Here are your photos,” or “Check out these pictures,” with corresponding pics/photos displayed on the speaker’s screen.

–>THAT HAS ABSOLUTELY NOTHING TO DO WITH ANYTHING yet it happens frequently AFTER the Google speaker promises never to repeat it!  Ugggh!!!!

……………………………………………………………………………………………………………………………………….

Google/Alphabet’s Surging CAPEX and ROI:

Alphabet said its 2026 capex will be $175 billion to $185 billion, and management has framed the spending as overwhelmingly AI/infrastructure-related, supporting revenue growth in Google Cloud, Gemini, and AI-enhanced Search.

The clearest breakdown disclosed to date is roughly 60% to servers and 40% to data centers and networking equipment. Using the company’s forward guidance ranges:

  • AI Compute Servers: about $105 billion to $111 billion.

  • Data centers and networking equipment: about $70 billion to $74 billion.
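The dollar ranges above follow directly from applying the disclosed split to the guidance range; a minimal arithmetic sketch, assuming the roughly 60/40 split applies uniformly across the $175B–$185B guidance:

```python
# Sketch: apply the disclosed ~60/40 capex split to Alphabet's 2026 guidance.
# Assumption: the split applies uniformly across the guidance range.

capex_low, capex_high = 175, 185      # 2026 capex guidance, $ billions
server_share, dc_share = 0.60, 0.40   # rough disclosed split

servers = (capex_low * server_share, capex_high * server_share)
datacenters = (capex_low * dc_share, capex_high * dc_share)

print(f"AI compute servers: ${servers[0]:.0f}B to ${servers[1]:.0f}B")
print(f"Data centers/networking: ${datacenters[0]:.0f}B to ${datacenters[1]:.0f}B")
```

This reproduces the roughly $105B–$111B server range and $70B–$74B data center/networking range cited above.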

That means most of the spend is going into fast-depreciating compute hardware, with the rest funding the physical and network buildout needed to host AI workloads. Google says the investment is meant to expand AI compute, support Google Cloud demand, and scale Gemini and enterprise AI offerings.

The company also pointed to a $240 billion cloud backlog and strong cloud revenue growth as signs that the spending is tied to real demand rather than just speculative buildout.  The key issue for investors is whether this capital intensity converts into enough cloud and AI revenue to justify the return profile.  Alphabet has not given a specific ROI number for its 2026 AI investments. What it has said, and what analysts infer, is that the return should come from faster cloud growth, higher AI-related search usage, and paid enterprise adoption rather than a near-term accounting yield.

In conclusion, 2026 is an AI scale-up year for Google, but the ROI question is still open.

………………………………………………………………………………………………………………………………………………………..

References:

https://www.investors.com/news/technology/google-stock-artificial-intelligence-ai-models-gemini/

https://blog.google/innovation-and-ai/infrastructure-and-cloud/google-cloud/eighth-generation-tpu-agentic-era/

Will billions of dollars big tech is spending on Gen AI data centers produce a decent ROI?

Big tech spending on AI data centers and infrastructure vs the fiber optic buildout during the dot-com boom (& bust)

AI infrastructure spending boom: a path towards AGI or speculative bubble?

Expose: AI is more than a bubble; it’s a data center debt bomb

China vs U.S.: Race to Generate Power for AI Data Centers as Electricity Demand Soars

Anthropic’s Project Glasswing aims to reshape IT cybersecurity

IDC Survey of Networking Leaders: Enterprise AI progress stalls despite ambitious goals

Will “AI at the Edge” transform telecom or be yet another telco monetization failure?

Nvidia Survey Reveals How Telcos Plan to Use AI; Quantifying ROI is a Challenge

Analysis: Cisco, HPE/Juniper, and Nvidia network equipment for AI data centers

Networking chips and modules for AI data centers: Infiniband, Ultra Ethernet, Optical Connections