U.S. export controls on Nvidia's H20 AI chips make Huawei's 910C GPU the favored choice of AI tech giants in China

Damage from U.S. Export Controls and the Trade War with China:

The U.S. big tech sector, in particular, needs to know what the rules of the trade game will be going forward, rather than contending with the on-again/off-again Trump tariffs and trade war with China, which now include 145% tariffs and export controls on AI chips from Nvidia, AMD, and other U.S. semiconductor companies.

The latest export restriction on Nvidia's H20 AI chips is a case in point. Nvidia said it would record a $5.5 billion charge against its quarterly earnings after disclosing that the U.S. will now require a license to export the company's H20 processors to China and other countries. The U.S. government told the chip maker on April 14th that the new license requirement would be in place "indefinitely."

Nvidia designed the H20 chip to comply with existing U.S. export controls that limit sales of advanced AI processors to Chinese customers. That meant the chip’s capabilities were significantly degraded; Morgan Stanley analyst Joe Moore estimates the H20’s performance is about 75% below that of Nvidia’s H100 family.  The Commerce Department said it was issuing new export-licensing requirements covering H20 chips and AMD’s MI308 AI processors.

Big Chinese cloud companies like Tencent, ByteDance (TikTok's parent), Alibaba, Baidu, and iFlytek have been left scrambling for domestic alternatives to the H20, the primary AI chip that Nvidia had until recently been allowed to sell freely into the Chinese market. Some analysts suggest that bulk H20 orders to build a stockpile were a response to concerns about future U.S. export restrictions and a race to secure limited supplies of Nvidia chips. The estimate is that there is roughly a 90-day supply of H20 chips on hand, but it's uncertain what China's big tech companies will use when that runs out.
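A days-of-supply estimate like that is essentially a stockpile divided by a consumption rate. The short sketch below shows that arithmetic; the chip counts are hypothetical placeholders, not reported figures.

```python
# How a days-of-supply estimate is derived: stockpile / daily consumption.
# All figures below are hypothetical placeholders, not reported data.
stockpiled_h20_units = 900_000      # assumed H20 chips on hand across Chinese buyers
daily_consumption_units = 10_000    # assumed chips deployed into new servers per day

days_of_supply = stockpiled_h20_units / daily_consumption_units
print(f"Estimated runway: {days_of_supply:.0f} days")  # -> 90 days under these assumptions
```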

The inability to sell even a low-performance chip into the Chinese market shows how the trade war will hurt Nvidia’s business.  The AI chip king is now caught between the world’s two superpowers as they jockey to take the lead in AI development.

Nvidia CEO Jensen Huang "flew to China to do damage control and make sure China/Xi knows Nvidia wants/needs China to maintain its global ironclad grip on the AI Revolution," Wedbush analysts note. The markets and tech world are tired of "deal progress" talk from the White House and want deals to start getting inked so they can plan their future strategy. The analysts think the week ahead is critical for getting some trade deals on the board, because Wall Street has stopped caring about words and comments around "deal progress."

………………………………………………………………………………………………………………………………………………………………………………………………………
Domestic (China) Alternatives to Nvidia’s H20:
“There are several local Chinese companies that produce chips to compete with Nvidia,” said Brady Wang, associate director at Counterpoint Research.  In particular:
  • Baidu: Developing its own AI chips, called Kunlun. It recently ordered 1,600 of Huawei's Ascend 910B AI chips for 200 servers, in anticipation of further U.S. export restrictions on AI chips.
  • Alibaba (T-Head): Has developed AI chips such as the Hanguang 800 inference chip, used to accelerate its e-commerce platform and other services.
  • Cambricon Technologies: Designs various types of semiconductors, including those for training AI models and running AI applications on devices.
  • Biren Technology: Designs general-purpose GPUs and software development platforms for AI training and inference, with products like the BR100 series.
  • Moore Threads: Develops GPUs designed for training large AI models, with data center products like the MTT KUAE.
  • Horizon Robotics: Focuses on AI chips for smart driving, including the Sunrise and Journey series, collaborating with automotive companies.
  • Enflame Technology: Designs chips for data centers, specializing in AI training and inference. 

"With Nvidia's H20 and other advanced GPUs restricted, domestic alternatives like Huawei's Ascend series are gaining traction," said Doug O'Laughlin, an industry analyst at independent semiconductor research company SemiAnalysis. "While there are still gaps in software maturity and overall ecosystem readiness, hardware performance is closing in fast," O'Laughlin added. According to the SemiAnalysis report, Huawei's Ascend chip shows how U.S. export controls have failed to stop firms like Huawei from accessing critical foreign tools and sub-components needed for advanced GPUs. "While Huawei's Ascend chip can be fabricated at SMIC, this is a global chip that has HBM from Korea, primary wafer production from TSMC, and is fabricated by 10s of billions of wafer fabrication equipment from the US, Netherlands, and Japan," the report stated.

Huawei’s New AI Chip May Dominate in China:

Huawei Technologies plans to begin mass shipments of its advanced 910C artificial intelligence chip to Chinese customers as early as next month, according to Reuters. Some shipments have already been made, people familiar with the matter said. Huawei's 910C, a graphics processing unit (GPU), represents an architectural evolution rather than a technological breakthrough, according to one of the two people and a third source familiar with its design. It achieves performance comparable to Nvidia's H100 chip by combining two 910B processors into a single package through advanced integration techniques, they said. That means it has double the computing power and memory capacity of the 910B, along with incremental improvements, including enhanced support for diverse AI workload data.
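Reuters' description of the 910C as two 910B dies combined in one advanced package can be captured in a back-of-the-envelope model like the sketch below. The per-die numbers are illustrative placeholders, not published Huawei or Nvidia specifications.

```python
# Back-of-the-envelope model of a dual-die package (illustrative numbers only).
from dataclasses import dataclass

@dataclass
class ChipSpec:
    name: str
    compute_tflops: float  # peak compute (placeholder value)
    memory_gb: float       # on-package memory (placeholder value)

def package(dies: list[ChipSpec], name: str) -> ChipSpec:
    """Model an advanced-packaging part as the sum of its dies' compute and memory."""
    return ChipSpec(
        name=name,
        compute_tflops=sum(d.compute_tflops for d in dies),
        memory_gb=sum(d.memory_gb for d in dies),
    )

ascend_910b = ChipSpec("910B (assumed specs)", compute_tflops=100.0, memory_gb=64.0)
ascend_910c = package([ascend_910b, ascend_910b], "910C (two 910B dies)")
print(ascend_910c)  # double the compute and memory of a single 910B die
```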

China's SMIC is manufacturing some of the main components of the GPUs using its N+2 7nm process technology, although its chip yield rates are low, a source has previously said. At least some of Huawei's 910C GPUs use semiconductors that were made by TSMC for China-based Sophgo, according to one of the sources and a fourth person. SMIC is itself subject to U.S. export controls, which prevent it from accessing some of the world's most advanced chipmaking equipment.
Huawei reiterated that it has not used chips TSMC made for Sophgo [1]. TSMC said it complies with regulatory requirements and has not supplied Huawei since mid-September 2020. Of course, the 910C GPU will be used in Huawei's cloud service offering, which has ~20% of China's cloud market.
………………………………………………………………………………………………………………………………………………………………………………………………………………..
Note 1. The U.S. Commerce Department has been investigating work done by the Taiwanese contract chip manufacturing giant for Sophgo after a TSMC-made Sophgo chip was found in a Huawei 910B processor.
TSMC has made nearly three million chips in recent years that matched the design ordered by Sophgo, according to Lennart Heim, a researcher at RAND's Technology and Security Policy Center in Arlington, Virginia, who is tracking Chinese developments in AI.
………………………………………………………………………………………………………………………………………………………………………………………………………………..

Conclusions:

The U.S. Commerce Department’s latest export curbs on Nvidia’s H20 “will mean that Huawei’s Ascend 910C GPU will now become the hardware of choice for (Chinese) AI model developers and for deploying inference capacity,” said Paul Triolo, a partner at consulting firm Albright Stonebridge Group.

The markets, the tech world, and the global economy urgently need U.S.-China trade negotiations in some form to start as soon as possible, Wedbush analysts say in a research note today. The analysts expect minimal or no guidance from tech companies during this earnings season as they are "playing darts blindfolded."

References:

https://www.reuters.com/world/china/huawei-readies-new-ai-chip-mass-shipment-china-seeks-nvidia-alternatives-sources-2025-04-21/

https://qz.com/china-six-tigers-ai-startup-zhipu-moonshot-minimax-01ai-1851768509#

https://www.cnbc.com/2025/04/21/us-chip-controls-boon-for-china-nvidia-rivals-like-huawei-analysts-.html

https://www.huaweicloud.com/intl/en-us/

Goldman Sachs: Big 3 China telecom operators are the biggest beneficiaries of China’s AI boom via DeepSeek models; China Mobile’s ‘AI+NETWORK’ strategy

Telecom sessions at Nvidia’s 2025 AI developers GTC: March 17–21 in San Jose, CA

Nvidia AI-RAN survey results; AI inferencing as a reinvention of edge computing?

FT: Nvidia invested $1bn in AI start-ups in 2024

Omdia: Huawei increases global RAN market share due to China hegemony

Huawei’s “FOUR NEW strategy” for carriers to be successful in AI era

Reuters & Bloomberg: OpenAI to design “inference AI” chip with Broadcom and TSMC

Bloomberg reports that OpenAI, the fast-growing company behind ChatGPT, is working with Broadcom Inc. to develop a new artificial intelligence chip specifically focused on running AI models after they've been trained, according to two people familiar with the matter. The two companies are also consulting with Taiwan Semiconductor Manufacturing Company (TSMC), the world's largest contract chip manufacturer. OpenAI has been planning a custom chip and working on potential uses for the technology for around a year, the people said, but the discussions are still at an early stage. The company has assembled a chip design team of about 20 people, led by top engineers who previously built Tensor Processing Units (TPUs) at Google, including Thomas Norrie and Richard Ho (head of hardware engineering).

Reuters reported on OpenAI's ongoing talks with Broadcom and TSMC on Tuesday. OpenAI has been working for months with Broadcom to build its first AI chip, focused on inference (i.e., responding to user requests), according to sources. Demand right now is greater for training chips, but analysts have predicted that demand for inference chips could surpass it as more AI applications are deployed.
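The training-versus-inference split the article describes maps directly onto how a model is exercised in code. Below is a minimal, illustrative PyTorch sketch (the tiny model and random data are placeholders): training runs forward, backward, and weight-update passes, while inference is a single gradient-free forward pass per user request.

```python
# Minimal, illustrative PyTorch sketch: the same model is exercised very
# differently by training (gradient updates) and inference (forward pass only).
import torch
import torch.nn as nn

model = nn.Linear(16, 4)                      # toy stand-in for a real AI model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# --- Training: compute loss, backpropagate, update weights (compute-heavy) ---
x = torch.randn(32, 16)                       # a batch of training examples
y = torch.randint(0, 4, (32,))                # labels
optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()                               # gradients cost extra memory and FLOPs
optimizer.step()

# --- Inference: one gradient-free forward pass per user request ---
model.eval()
with torch.no_grad():
    request = torch.randn(1, 16)              # one incoming user request
    prediction = model(request).argmax(dim=-1)
```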

OpenAI has examined a range of options to diversify its chip supply and reduce costs. It considered building everything in-house and raising capital for an expensive plan to build a network of chip manufacturing factories known as "foundries."


OpenAI may continue to research setting up its own network of foundries, or chip factories, one of the people said, but the startup has realized that working with partners on custom chips is a quicker, attainable path for now. Reuters earlier reported that OpenAI was pulling back from the effort of establishing its own chip manufacturing capacity.  The company has dropped the ambitious foundry plans for now due to the costs and time needed to build a network, and plans instead to focus on in-house chip design efforts, according to sources.

OpenAI, which helped commercialize generative AI that produces human-like responses to queries, relies on substantial computing power to train and run its systems. As one of the largest purchasers of Nvidia's graphics processing units (GPUs), OpenAI uses AI chips both for training, where the model learns from data, and for inference, where the model is applied to make predictions or decisions based on new information. Reuters previously reported on OpenAI's chip design endeavors. The Information reported on talks with Broadcom and others.

The Information reported in June that Broadcom had discussed making an AI chip for OpenAI. OpenAI is one of the largest buyers of chips, so its decision to source from a diverse array of chipmakers while developing its own custom chip could have broader tech sector implications.

Broadcom is the largest designer of application-specific integrated circuits (ASICs) — chips designed to fit a single purpose specified by the customer. The company’s biggest customer in this area is Alphabet Inc.’s Google. Broadcom also works with Meta Platforms Inc. and TikTok owner ByteDance Ltd.

When asked last month whether he has new customers for the business, given the huge demand for AI training, Broadcom Chief Executive Officer Hock Tan said that he will only add to his short list of customers when projects hit volume shipments.  “It’s not an easy product to deploy for any customer, and so we do not consider proof of concepts as production volume,” he said during an earnings conference call.

OpenAI’s services require massive amounts of computing power to develop and run — with much of that coming from Nvidia chips. To meet the demand, the industry has been scrambling to find alternatives to Nvidia. That’s included embracing processors from Advanced Micro Devices Inc. and developing in-house versions.

OpenAI is also actively planning investments and partnerships in data centers, the eventual home for such AI chips. The startup’s leadership has pitched the U.S. government on the need for more massive data centers and CEO Sam Altman has sounded out global investors, including some in the Middle East, to finance the effort.

“It’s definitely a stretch,” OpenAI Chief Financial Officer Sarah Friar told Bloomberg Television on Monday. “Stretch from a capital perspective but also my own learning. Frankly we are all learning in this space: Infrastructure is destiny.”

Currently, Nvidia's GPUs hold over 80% of the AI chip market. But shortages and rising costs have led major customers like Microsoft, Meta, and now OpenAI to explore in-house or external alternatives.

Training AI models and operating services like ChatGPT are expensive. OpenAI has projected a $5 billion loss this year on $3.7 billion in revenue, according to sources. Compute costs, or expenses for hardware, electricity and cloud services needed to process large datasets and develop models, are the company’s largest expense, prompting efforts to optimize utilization and diversify suppliers.
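To make concrete why utilization matters for compute costs, here is a small illustrative calculation; the cluster size, hourly rate, and utilization figures are assumptions for the sketch, not OpenAI's actual numbers.

```python
# Illustrative compute-cost model: higher utilization lowers cost per useful GPU-hour.
# All inputs are hypothetical placeholders.
gpu_count = 10_000            # assumed cluster size
hourly_rate_usd = 2.50        # assumed blended cost per GPU-hour (hardware, power, cloud)
hours_per_year = 24 * 365

def cost_per_useful_gpu_hour(utilization: float) -> float:
    """Total yearly spend divided by the GPU-hours that actually did useful work."""
    total_cost = gpu_count * hourly_rate_usd * hours_per_year
    useful_hours = gpu_count * hours_per_year * utilization
    return total_cost / useful_hours

for util in (0.4, 0.6, 0.8):
    print(f"{util:.0%} utilization -> ${cost_per_useful_gpu_hour(util):.2f} per useful GPU-hour")
```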
OpenAI has been cautious about poaching talent from Nvidia because it wants to maintain a good rapport with the chip maker it remains committed to working with, especially for accessing its new generation of Blackwell chips, sources added.

References:

https://www.bloomberg.com/news/articles/2024-10-29/openai-broadcom-working-to-develop-ai-chip-focused-on-inference?embedded-checkout=true

https://www.reuters.com/technology/artificial-intelligence/openai-builds-first-chip-with-broadcom-tsmc-scales-back-foundry-ambition-2024-10-29/

AI Echo Chamber: “Upstream AI” companies huge spending fuels profit growth for “Downstream AI” firms

AI Frenzy Backgrounder; Review of AI Products and Services from Nvidia, Microsoft, Amazon, Google and Meta; Conclusions

AI sparks huge increase in U.S. energy consumption and is straining the power grid; transmission/distribution as a major problem

Generative AI Unicorns Rule the Startup Roost; OpenAI in the Spotlight