Microsoft is in the midst of a chip buying spree...
On December 17, the Financial Times reported that Microsoft has purchased far more Nvidia AI chips than any competitor this year to accelerate investment in artificial intelligence infrastructure. This year, Microsoft ordered more than three times as many Nvidia AI processors of the same generation as it purchased in 2023.
Analysts at the research firm Omdia estimate that Microsoft purchased 485,000 Nvidia Hopper chips this year, far ahead of companies such as Meta (224,000), ByteDance (230,000), Tencent (230,000), Amazon (196,000), and Google (169,000).
Analysts note that Nvidia's GPUs have been in short supply for the past two years, and that Microsoft's chip stockpile has given it a leading position in the race to build the next generation of artificial intelligence systems.
This year, tech giants have spent tens of billions of dollars on data centers. Omdia estimates that global technology companies will spend about $229 billion on servers in 2024, with Microsoft's capital expenditure at $31 billion and Amazon's at $26 billion. The top ten global data center infrastructure buyers, a group that now includes xAI and CoreWeave, will account for 60% of global computing power investment.
As the largest investor in OpenAI, Microsoft has been the most aggressive builder of data center infrastructure, using it not only to run its own AI services (such as the Copilot assistant) but also to lease capacity to customers through its Azure division.
Currently, Microsoft's Azure cloud infrastructure is being used to train OpenAI's latest models, and Microsoft is competing with Google, with startups such as xAI and Anthropic, and with rivals outside the US for dominance in next-generation computing.
Alistair Speirs, senior director of global infrastructure at Microsoft Azure, said in an interview with the Financial Times:
“Good data center infrastructure is a complex and capital-intensive project that requires years of planning, so it's important to anticipate our growth and leave room for it.”
Vlad Galabov, Omdia's director of cloud and data center research, also pointed out that in 2024 about 43% of server spending went to Nvidia chips, but that tech giants have stepped up deployment of their own AI chips this year to reduce their dependence on Nvidia. Google and Meta, for example, each deployed about 1.5 million in-house chips this year.
Microsoft's in-house chip effort, however, is still in its infancy: the company installed only about 200,000 of its self-developed MAIA chips this year.
Speirs said that Microsoft currently relies mainly on Nvidia chips, but that the company needs to invest heavily in its own technology to provide customers with “unique” services:
“When building AI infrastructure, in our experience, it's not just about having the best chip, but also having the right storage components, the right infrastructure, the right software layer, the right host management layer, bug fixes, and all other components to build this system.”