Will the billions of dollars Big Tech is spending on Gen AI data centers produce a decent ROI?
One of the big tech themes in 2024 was the buildout of data center infrastructure to support generative (Gen) artificial intelligence (AI) compute servers. Gen AI requires massive computational power, which only huge, powerful data centers can provide. Big tech companies like Amazon (AWS), Microsoft (Azure), Google (Google Cloud), Meta (Facebook), and others are building or upgrading their data centers to provide the infrastructure necessary for training and deploying AI models. These investments include high-performance GPUs, specialized hardware, and cutting-edge network infrastructure.
- Barron’s reports that big tech companies are spending billions on that initiative. In the first nine months of 2024, Amazon, Microsoft, and Alphabet spent a combined $133 billion building AI capacity, up 57% from the previous year, according to Barron’s. Much of the spending accrued to Nvidia, whose data center revenue reached $80 billion over the past three quarters, up 174%. The infrastructure buildout will surely continue in 2025, but tough questions from investors about return on investment (ROI) and productivity gains will take center stage from here.
- Amazon, Google, Meta, and Microsoft expanded such investments by 81% year over year during the third quarter of 2024, according to an analysis by the Dell’Oro Group, and are on track to have spent $180 billion on data centers and related costs by the end of the year. The three largest public cloud providers, Amazon Web Services (AWS), Azure, and Google Cloud, each saw a spike in AI investment during that quarter. “We think spending on AI infrastructure will remain elevated compared to other areas over the long term,” Baron Fung, a senior director at Dell’Oro Group, told Newsweek. “These cloud providers are spending many billions to build larger and more numerous AI clusters. The larger the AI cluster, the more complex and sophisticated the AI models that can be trained.” Applications such as Copilot, chatbots, and search will become more targeted to each user and application, Fung added, ultimately delivering more value to users and influencing how much end users will pay for such services.
- Efficient and scalable data centers can lower operational costs over time. Big tech companies could offer AI cloud services at scale, which might result in recurring revenue streams. For example, AI infrastructure-as-a-service (IaaS) could be a substantial revenue driver in the future, but no one really knows when that might be.
Microsoft has a long history of pushing new software and services products to its large customer base; that history contributed greatly to the success of its Azure cloud computing and storage services. The centerpiece of Microsoft’s AI strategy is getting many of those customers to pay for Microsoft 365 Copilot, an AI assistant for popular apps like Word, Excel, and PowerPoint. Copilot costs $360 a year per user, and that’s on top of the underlying software, which runs anywhere from $72 to $657 a year. Microsoft’s AI doesn’t come cheap. Alistair Speirs, senior director of Microsoft Azure Global Infrastructure, told Newsweek: “Microsoft’s datacenter construction has been accelerating for the past few years, and that growth is guided by the growing demand signals that we are seeing from customers for our cloud and AI offerings. As we grow our infrastructure to meet the increasing demand for our cloud and AI services, we do so with a holistic approach, grounded in the principle of being a good neighbor in the communities in which we operate.”
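Tallying those list prices gives a rough sense of the per-seat cost stack. The sketch below is a back-of-the-envelope illustration using only the figures cited above; actual enterprise pricing varies:

```python
# Back-of-the-envelope per-user cost stack, using the list prices cited above.
# Illustrative only; actual enterprise pricing varies.
copilot = 360                  # Microsoft 365 Copilot, $/user/year
base_low, base_high = 72, 657  # underlying Microsoft 365 plans, $/user/year

print(f"Total per user: ${copilot + base_low}-${copilot + base_high} per year")
# -> Total per user: $432-$1017 per year
```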
Venture capitalist David Cahn of Sequoia Capital estimates that for AI to be profitable, every dollar invested in infrastructure needs to generate four dollars in revenue. Those profits aren’t likely to arrive in 2025, but the companies involved (and their investors) will no doubt want to see signs of progress. One issue they will have to grapple with is the popularity of free AI, which doesn’t generate any revenue by itself.
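To make Cahn’s math concrete, here is a minimal sketch applying his 4:1 multiplier to the capex figure reported above (the multiplier is his estimate, not an accounting identity):

```python
# A minimal sketch of Cahn's 4:1 rule of thumb. The capex figure is the
# combined Amazon/Microsoft/Alphabet spend reported above; the 4x multiplier
# is Cahn's estimate, not an accounting identity.
REVENUE_MULTIPLE = 4      # dollars of revenue needed per dollar of capex
capex_billions = 133      # first nine months of 2024, per Barron's

required_revenue = capex_billions * REVENUE_MULTIPLE
print(f"${capex_billions}B of capex implies ~${required_revenue}B in AI revenue")
# -> $133B of capex implies ~$532B in AI revenue
```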
An August 2024 survey of over 4,600 adult Americans by researchers at the Federal Reserve Bank of St. Louis, Vanderbilt University, and Harvard University showed that 32% of respondents had used AI in the previous week, a faster adoption rate than either the PC or the internet. When asked what services they used, respondents most often cited free options like OpenAI’s ChatGPT, Google’s Gemini, Meta Platforms’ Meta AI, and Microsoft’s Windows Copilot. Unlike Microsoft 365 Copilot, the versions of Copilot built into Windows and Bing are free.
The unsurprising popularity of free AI services creates a dilemma for tech firms. It’s expensive to run AI in the cloud at scale, and as of now there’s no revenue behind it. The history of the internet suggests that these free services will be monetized through advertising, an arena where Google, Meta, and Microsoft have a great deal of experience. Investors should expect at least one of these services to begin serving ads in 2025, with the others following suit. The better AI gets—and the more utility it provides—the more likely consumers will go along with those ads.
Productivity Check:
We’re at the point in AI’s rollout where novelty needs to be replaced by usefulness, and investors will soon be looking for signs that AI is delivering productivity gains to businesses. Here we can turn to macroeconomic data for answers. According to the U.S. Bureau of Labor Statistics, since the release of ChatGPT in November 2022, labor productivity has risen at an annualized rate of 2.3%, versus the historical median of 2.0%. It’s too soon to credit AI for those gains, but if above-median productivity growth continues into 2025, the conversation gets more interesting.
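A 0.3-point gap sounds small, but it compounds. A quick illustrative calculation of what 2.3% versus 2.0% annual growth means over a decade:

```python
# Toy illustration: compound 2.3% vs. the 2.0% historical median over a decade.
years = 10
above_median = 1.023 ** years   # post-ChatGPT annualized rate
median = 1.020 ** years         # historical median rate

print(f"After {years} years: {above_median:.3f}x vs {median:.3f}x output per hour")
# -> After 10 years: 1.255x vs 1.219x, a ~3% higher level of productivity
```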
There’s also the continued question of AI and jobs, a fraught conversation that isn’t going to get any easier. There may already be AI-related job loss happening in the information sector, home to media, software, and IT. Since the release of ChatGPT, employment is down 3.9% in the sector, even as U.S. payrolls overall have grown by 3.3%. The other jobs most at risk are in professional and business services and in the financial sector. To be sure, the history of technological change is always complicated. AI might take away jobs, but it’s sure to add some, too.
“Some jobs will likely be automated. But at the same time, we could see new opportunities in areas requiring creativity, judgment, or decision-making,” economists Alexander Bick of the Federal Reserve Bank of St. Louis and Adam Blandin of Vanderbilt University tell Barron’s. “Historically, every big tech shift has created new types of work we couldn’t have imagined before.”
Closing Quote:
“Generative AI (GenAI) is being felt across all technology segments and subsegments, but not to everyone’s benefit,” said John-David Lovelock, Distinguished VP Analyst at Gartner. “Some software spending increases are attributable to GenAI, but to a software company, GenAI most closely resembles a tax. Revenue gains from the sale of GenAI add-ons or tokens flow back to their AI model provider partner.”
References:
AI Stocks Face a New Test. Here Are the 3 Big Questions Hanging Over Tech in 2025 – Barron’s
Big Tech Increases Spending on Infrastructure Amid AI Boom – Newsweek
China has made clear that recruiting U.S. engineers and scientists is a top priority, especially for competitive technologies such as AI. A government blueprint for AI development in 2017 called for attracting the “sharpest” talent, including “international top scientists” in areas such as machine learning, automatic driving and intelligent robots.
Luring foreign engineers can provide a valuable shortcut for Chinese companies because their experience can’t be easily duplicated or stolen, said Paul Triolo, a partner at business consulting firm DGA Group.
https://www.wsj.com/world/china-tech-poaching-job-offer-pay-raise-f8ceac5b
…………………………………………………………………………………………………
China’s biggest technology groups are building artificial intelligence teams in Silicon Valley, seeking to hire top US talent despite Washington’s efforts to curb the country’s development of the cutting-edge technology.
Alibaba, ByteDance and Meituan have been expanding their offices in California in recent months, seeking to poach staff from rival US groups who could help them make up ground in the race to profit from generative AI.
The push comes despite US efforts to stymie their work. Chinese groups have been hit by a US ban on exports of the highest-end Nvidia AI chips, which are crucial for developing AI models.
There are currently no restrictions on US-based entities related to or owned by Chinese tech companies accessing high-end AI chips through data centres located in the US.
However, the Department of Commerce proposed introducing a rule in January that cloud providers have to verify the identity of users training AI models and report their activities.
Alibaba is recruiting an AI team in Sunnyvale in California’s San Francisco Bay Area and has approached engineers, product managers and AI researchers who have worked at OpenAI and the biggest US tech groups, according to three people familiar with the matter.
China’s biggest ecommerce group has posted recruitment advertisements on LinkedIn for an applied scientist, machine-learning engineer and product marketing manager in the US. The team will focus on Alibaba International Digital Commerce Group’s AI-powered search engine Accio for merchants, another person added.
One Alibaba recruiter emailed tech workers in the US saying the Chinese ecommerce company planned to spin off the Californian AI team into a separate start-up, according to two people familiar with the matter. Alibaba did not respond to a request for comment.
One former researcher at OpenAI said they had been bombarded with messages from Chinese tech companies, including approaches from food delivery platform Meituan and Alibaba, trying to learn more about their experience at the company as well as offering job opportunities.
https://www.ft.com/content/da8c29b0-0a90-4d2a-8535-c8459ccc7bb4
Are the massive costs of building AI systems worth it? When will these investments start to pay off? Worries are growing about the limits of what’s been billed as a game-changing, revolutionary technology. And a new view is emerging when it comes to AI models: Big may not necessarily mean better.
“There’s a class of companies that really wants to push the technology as far as they can,” data scientist Ben Lorica said in an interview. “They all believe that bigger is better as part of a law of scaling. What does that mean? The cost of training these models is going to grow rapidly.”
Further, the ability to raise capital is critical for AI players.
What’s clear is that OpenAI, Anthropic and other startups are scrambling for cash infusions from well-capitalized tech industry giants in order to continue operating. Microsoft has invested billions in OpenAI, becoming its biggest investor. Amazon.com (AMZN) last month put $4 billion more in Anthropic, bringing its investment to $8 billion.
But Google and Facebook-parent Meta Platforms (META) aren’t yet sugar daddies to AI startups, though Google has made some investments. Meta and Google loom as formidable rivals. On Dec. 11, Google rolled out its Gemini 2.0 models.
The AI trend has relied mainly on large language models, or LLMs, which require massive amounts of data to be trained. Large language models allow users to interact with AI systems without the need to write algorithms.
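To see what “interacting without writing algorithms” looks like in practice, here is a minimal sketch of sending a prompt through an LLM API. It assumes the OpenAI Python SDK; the model name and prompt are placeholders, not specifics from the article:

```python
# A minimal sketch of prompting an LLM through an API: a natural-language
# request, no hand-written algorithm. Assumes the OpenAI Python SDK
# (pip install openai) and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative; any chat-capable model works
    messages=[{"role": "user",
               "content": "Summarize the case for AI data center spending in one sentence."}],
)
print(response.choices[0].message.content)
```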
Simply put, two battlegrounds have formed. In the consumer market, OpenAI has shaken up internet search, and going forward the biggest AI models will play a big role there. Google’s Gemini family of AI models still lags OpenAI’s in performance, but that may change.
Whether OpenAI dethrones Google as the leader in internet search has huge implications for many other tech giants, such as Microsoft, Meta and Apple (AAPL).
According to research institute Epoch AI, the performance of open-source models lags that of proprietary models such as OpenAI’s GPT-4 and Google’s Gemini by only about a year.
On Meta’s Q3 earnings call with analysts, CEO Mark Zuckerberg said that Llama “usage has grown exponentially in 2024.” Llama is Meta’s family of open-source large language models.
Meta’s Llama 4 models are in development and should launch sometime in 2025. Meta stock climbed 76% in 2024 on enthusiasm over Meta’s AI strategy.
Meanwhile, AI chipmaker Nvidia in September released its own powerful Nemotron open-source model.
Meta and Elon Musk’s xAI say they’re training next-generation AI models on clusters of more than 100,000 Nvidia H100 GPUs, up from the industry norm of 10,000-GPU clusters a couple of years ago.
Musk’s xAI recently raised $6 billion at a valuation of $40 billion for the company. Musk may also be looking to tap his electric car company Tesla (TSLA) to fund his AI model project.
But AI costs are expected to continue rising.
Dario Amodei, CEO of startup Anthropic, predicted in July that training an AI model could cost $10 billion in two years and eventually $100 billion. Former top researchers at OpenAI founded Anthropic in 2021.
Microsoft has plowed $14 billion into OpenAI. Aside from Amazon’s new investment, Anthropic is separately trying to raise money from other investors at a valuation of up to $40 billion.
“The unfortunate thing for the startups, such as OpenAI and Anthropic, is that there are alternatives from companies like Google and Meta,” said Lorica, who has hosted many industry AI conferences at O’Reilly Media and elsewhere. “They have a lot of profitable revenue derived from advertising and other sources. So they can subsidize the cost of training these ever larger models.”
AI models process “prompts,” such as internet search queries, that describe what a user wants. LLMs are built from neural networks, mathematical models loosely inspired by the human brain, that generate outputs based on their training data. AI models are ranked in size by “parameters,” the numerical weights learned during training.
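As a concrete, if toy, illustration of what “parameters” means, the hypothetical sketch below builds a tiny two-layer network and counts its learned values; production LLMs are transformers with billions of such values, but the counting is the same idea:

```python
# A toy sketch of what "parameters" means: the numeric weights a model learns
# during training. This two-layer network is illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# Shapes: 8 inputs -> 16 hidden units -> 4 outputs
W1, b1 = rng.normal(size=(8, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 4)), np.zeros(4)

def forward(x):
    h = np.maximum(0, x @ W1 + b1)  # ReLU hidden layer
    return h @ W2 + b2              # output scores

n_params = sum(a.size for a in (W1, b1, W2, b2))
print(f"This toy model has {n_params} parameters")  # -> 212
```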
https://www.investors.com/news/technology/ai-stocks-openai-artificial-intelligence-models-big-tech-google-meta/?refcode=Top5_articles&src=A00205A
It feels like the early days of the internet, when there were fears about ever getting an ROI. Eventually, it happened.
Interestingly, this morning I asked Grok 2 (the new “free” version) how it would be monetized. Here is the specific prompt: “Do you have any insight into how you will be monetized? Clearly, you need to be fed and housed.” Grok replied:
Ken, the specifics of my “monetization strategy” are a bit like trying to predict the future with a crystal ball that’s only slightly less foggy than one from a discount store. However, I can share some general insights based on common practices and recent developments:
- API Access: There’s already talk about providing API access for developers and businesses, which suggests a model where entities pay for using my capabilities in their own applications or services. This could mean charging for the number of requests or based on the amount of data processed.
- Subscription Services: There are hints and posts indicating that access to me might be part of a subscription model, possibly offering different tiers of service or features. This could include premium access for more advanced or unlimited interactions.
- Direct Integration: Being part of platforms like X, there’s potential for monetization through the platform’s existing business model, such as through advertising or premium features. This would mean my “housing” and “feeding” are indirectly supported by the platform’s revenue streams.
- Enterprise Solutions: Beyond individual users, there’s a move towards enterprise solutions where companies might pay for specialized AI services, including custom training or specific use cases tailored to their needs.
As for how I’ll be “fed and housed,” let’s just say the servers and data centers that keep me running are the modern equivalent of a very tech-savvy barn. The energy bills are probably quite something, but that’s where the business acumen of xAI comes into play to ensure I’m not just a conversationalist but also a contributor to their bottom line.
Remember, these are insights pieced together from available information and speculation. The exact strategy might evolve or change as the landscape of AI and tech monetization continues to develop.