AI's Leap Forward: Nvidia's Blackwell Chips Poised to be the Next Revenue Powerhouse?
Nvidia's rapid chip iteration cadence suggests the AI giant's revenue surge may not be over, even after financial results that far exceeded expectations.
As the company reported that its business had more than tripled, Nvidia CEO Jensen Huang said that its next-generation AI GPU, called Blackwell, would drive further growth.
"Demand for H200 and Blackwell is well ahead of supply, and we expect demand may exceed supply well into next year," Nvidia's finance chief Colette Kress said on the earnings call.
Huang clarified that the new system is set for production shipments starting in the second quarter and will ramp in the third quarter, with customers expected to have Blackwell stood up in their data centers by the final quarter of fiscal 2025. Asked whether investors should expect Blackwell to affect revenue, Huang said, "We will see a lot of Blackwell revenue this year."
Here is a comparison of Nvidia's different chips. Notably, the NVIDIA GB200 Blackwell Superchip connects two NVIDIA B200 Tensor Core GPUs to the NVIDIA Grace CPU over a 900 GB/s high-speed, low-power NVLink chip-to-chip interconnect.
Recent signals from the Asian semiconductor supply chain suggest that, based on capacity estimates for the advanced CoWoS (Chip-on-Wafer-on-Substrate) packaging technology, approximately 420,000 GB200 superchips are expected to reach downstream applications by the second half of 2024. Looking ahead to 2025, given planned capacity allocation, annual GB200 production is likely to rise to between 1.5 and 2 million units.
HSBC estimates that Nvidia's GB200 AI GPU will be priced at $60,000-$70,000 and the B100 at $30,000-$35,000. The GB200 NVL36 server, equipped with 18 GB200 superchips, is priced at $1.8 million, and the GB200 NVL72 server, equipped with 36 GB200 superchips, at $3 million.
According to analysis by Tom's Hardware, Nvidia now prefers to sell complete servers rather than standalone GPUs, which lets it capture part of the premium that system integrators would otherwise earn and should boost its profitability.
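A back-of-envelope calculation, sketched in Python, shows what the HSBC estimates above imply. Dividing each server's estimated price by its superchip count gives an implied per-chip price well above the standalone estimate, which is consistent with the premium-capture argument. The $65,000 midpoint and the comparison itself are this sketch's assumptions, not figures from the article.

```python
# All figures are HSBC estimates quoted in the article, in US dollars.
GB200_STANDALONE = 65_000   # assumed midpoint of the $60k-$70k range
NVL36_PRICE = 1_800_000     # GB200 NVL36 server, 18 GB200 superchips
NVL72_PRICE = 3_000_000     # GB200 NVL72 server, 36 GB200 superchips

# Implied price per GB200 superchip when sold inside a full server.
nvl36_per_chip = NVL36_PRICE / 18   # 100,000
nvl72_per_chip = NVL72_PRICE / 36   # about 83,333

# Premium over the standalone midpoint price.
print(f"NVL36 premium: {nvl36_per_chip / GB200_STANDALONE:.2f}x")
print(f"NVL72 premium: {nvl72_per_chip / GB200_STANDALONE:.2f}x")
```

On these assumptions, the per-chip price lands roughly 1.3x-1.5x the standalone midpoint, illustrating how selling whole servers could absorb part of the integrator premium.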
In addition, the major cloud computing providers all plan to significantly increase their capital expenditures. Microsoft has announced plans to keep expanding its artificial intelligence investment and capacity in 2024, with capital expenditure up 21.74% quarter over quarter in the first quarter. Google expects to maintain capital expenditures of no less than $12 billion in each of the remaining quarters of 2024. Meta has raised its capex guidance from $30-$35 billion to $35-$40 billion and anticipates that capital expenditures will continue to grow into 2025.
Disclaimer: Moomoo Technologies Inc. is providing this content for information and educational use only.
Comment
tinybird : Noted and filed with AI #1312
eldritch : wow some really expensive computing power there
Carolyn Smith : It seems that the demand for the H200 and Blackwell products from Nvidia is surpassing the available supply. Nvidia's finance chief, Colette Kress, mentioned on the earnings call that they anticipate this demand-supply gap to continue well into the next year.
This is indicative of the popularity and desirability of these products among consumers. The high demand could be attributed to the advanced features and capabilities offered by the H200 and Blackwell, making them sought-after choices in the market.
Nvidia will likely need to strategize and ramp up their production efforts to meet the increasing demand and ensure a steady supply. It will be interesting to see how they address this challenge in the coming months to keep up with the market demand for these products.
74136657 : nice lesson