
Who is the biggest buyer of Nvidia's AI chips? This tech giant tops the list, far ahead of its peers.

cls.cn · 16:17

① Microsoft became the largest buyer of Nvidia's Hopper chips, purchasing 485,000 units during the year, far exceeding Meta's 224,000; ② Jensen Huang said demand for Hopper chips remains strong; the company's revenue for the third fiscal quarter rose 94% year-on-year, and net profit rose 109%.

Cailian Press, December 18 (Editor: Zhao Hao) — According to the latest data from market research and consulting firm Omdia, Microsoft has become the largest buyer of Nvidia's flagship Hopper chips, with purchase volumes far ahead of its technology rivals.

Omdia analysts estimate that Microsoft purchased 485,000 Hopper chips this year. By comparison, Meta Platforms, the second-largest US customer, bought 224,000 chips, less than half of Microsoft's total.

Omdia says its estimates are derived from data such as companies' disclosed capital expenditures, server shipments, and supply chain intelligence. According to Omdia, ByteDance and Tencent each ordered about 230,000 Nvidia chips this year, slightly more than Meta.

Although Amazon and Google are working to deploy their own custom alternatives, the two companies still purchased 196,000 and 169,000 Hopper chips respectively. The data also shows that Tesla and xAI, both run by Musk, together bought slightly more chips than Amazon.

Last month, Nvidia CEO Jensen Huang said on an earnings call that although the next-generation Blackwell chips are scheduled to begin shipping this quarter, the current Hopper chips remain in high demand, driven by foundation model developers' work in pre-training, post-training, and inference.

Since the chatbot ChatGPT debuted two years ago, major technology companies have poured tens of billions of dollars into AI infrastructure, setting off an unprecedented investment boom. This has made Nvidia's AI chips one of the hottest products in Silicon Valley.

Jensen Huang has said repeatedly, "Demand [for Nvidia products] is very strong. Everyone wants to be the first to receive the product, and everyone wants to receive the most products." Last month, Nvidia reported a 94% year-on-year increase in revenue for the third fiscal quarter and a 109% increase in net profit.

Compared to other technology companies, Microsoft is arguably the most aggressive builder of infrastructure: it not only needs data centers to run its own AI services (such as Copilot), but also leases computing power to cloud customers through its Azure division.

According to Omdia, the number of Nvidia chips Microsoft purchased in 2024 is three times its 2023 total. Microsoft Azure executive Alistair Speirs told the media, "Good data center infrastructure is very complex. It is a capital-intensive project that requires years of planning."

Speirs added, "So it's important to anticipate our growth and leave some room." Omdia estimates that tech companies worldwide will spend about $229 billion on servers in 2024, with Microsoft accounting for $31 billion and Amazon $26 billion.

Vlad Galabov, head of cloud computing and data center research at Omdia, said that about 43% of server spending in 2024 went to Nvidia. "Nvidia GPUs account for a very high share, but we expect this may be close to the peak."

For one thing, AMD, Nvidia's main competitor in the GPU market, is making progress. According to Omdia, Meta bought 173,000 AMD MI300 chips this year, and Microsoft bought 96,000.

At the same time, large technology companies are increasing their use of in-house chips. Google has been developing its "tensor processing unit" (TPU) for a decade, Meta has launched its self-developed "MTIA" chip, and the two companies have each deployed about 1.5 million units.

Amazon is also investing in its Trainium and Inferentia processors, deploying around 1.3 million such chips this year. Earlier this month, Amazon said it plans to build a new cluster for its partner Anthropic using its latest Trainium chips.

By comparison, Microsoft has been conservative with its first self-developed chip, "Maia," deploying only about 200,000 units this year.

Disclaimer: This content is for informational and educational purposes only and does not constitute a recommendation or endorsement of any specific investment or investment strategy.