High-Bandwidth Memory, a Key Component of AI Chips, Could Attain a $30 Billion Market. How to Seize Investment Opportunities?

Analysts Notebook wrote a column · Jun 3 03:14
Goldman Sachs analyst Guini Lee released a report on May 27 stating that the team expects the High-Bandwidth Memory (HBM) market to grow at a compound annual growth rate (CAGR) of close to 100% from 2023 to 2026 and to reach US$30 billion by 2026. HBM is one of the most critical components in AI chips, especially as NVIDIA's latest chips have increased their memory content.
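As a rough illustration of the growth path those figures imply, the sketch below back-calculates the 2023 base from the ~100% CAGR and the US$30 billion 2026 estimate; the per-year values are illustrative, not figures from the report.

# Rough, illustrative check of the compound-growth claim.
# Assumption: growth compounds over the three years from 2023 to 2026.
end_2026 = 30.0                              # projected 2026 HBM market size, US$bn
cagr = 1.00                                  # ~100% annual growth
years = 3                                    # 2023 -> 2026
implied_2023 = end_2026 / (1 + cagr) ** years
for n in range(years + 1):
    size = implied_2023 * (1 + cagr) ** n
    print(2023 + n, f"~US${size:.1f}bn")     # ~3.8, ~7.5, ~15.0, ~30.0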
The analyst reaffirms the prediction of HBM undersupply over the next few years, emphasizing that the projected increase in HBM demand is expected to outpace even the slightly raised supply estimates. The revised projections for 2024, 2025, and 2026 indicate an undersupply of 2.7%, 1.9%, and 0.9%, respectively, a tighter supply/demand balance than the earlier projections of 2.0%, 1.0%, and 0.7% for the same years. That tightness may push HBM unit prices up by 6% in 2024, according to the report.
■ Why is High-Bandwidth Memory promising?
1) Strong AI-related revenue outlook across the HBM supply chain
Among High-Bandwidth Memory (HBM) customers, Nvidia reported Data Center revenue for the April quarter that surpassed expectations, along with revenue guidance for the July quarter above consensus estimates. The Goldman Sachs team anticipates continued growth in Data Center compute revenue, supported by a series of upcoming product launches (including the H200, H20, B100, and GB200) planned between now and the end of the year. Consequently, the team has raised its Data Center compute revenue growth projections for 2024, 2025, and 2026 from the previously stated 112%, 31%, and 8% (as updated on March 19) to 142%, 39%, and 18%, respectively. Similarly, AMD has raised its 2024 Data Center GPU revenue forecast from US$3.5 billion, given on the December 2023 quarter earnings call, to US$4.0 billion.
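To put the revision in perspective, a quick sketch of how those year-over-year growth rates compound is shown below; the 2023 level is normalized to 1.0 for illustration and is not a revenue figure from the report.

# Illustrative only: compound the cited year-over-year growth rates to compare
# the revised Data Center compute trajectory with the March 19 estimates.
prior   = [1.12, 0.31, 0.08]   # 2024/2025/2026 growth, March 19 estimates
revised = [1.42, 0.39, 0.18]   # 2024/2025/2026 growth, revised estimates

def cumulative(growth_rates, base=1.0):
    level = base
    for g in growth_rates:
        level *= 1 + g
    return level

print(f"Prior path:   ~{cumulative(prior):.1f}x the 2023 level by 2026")    # ~3.0x
print(f"Revised path: ~{cumulative(revised):.1f}x the 2023 level by 2026")  # ~4.0x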
2) Faster HBM roadmap
During its most recent earnings call, Samsung Electronics Co. (SEC) announced that it initiated mass production of 8-Hi HBM3E in April and has plans to begin mass production of 12-Hi HBM3E within the second quarter of 2024 (2Q24). It's worth noting that the timeline for producing 12-Hi HBM3E is ahead of the original market expectation, which was the third quarter of 2024 (3Q24). However, recent reports suggest that SEC is encountering challenges in getting HBM3E qualified for use with Nvidia products.
Shortly after SEC's earnings announcement, SK Hynix declared that it would start the mass production of 12-Hi HBM3E by the third quarter of 2024 (3Q24) and has moved up the schedule for mass producing HBM4 from 2026 to 2025.
Since the upcoming HBM products will feature higher memory content than their predecessors, owing either to higher per-die density or to a greater number of stacked DRAM dies, the accelerated HBM roadmaps from these suppliers will raise the average amount of HBM used per GPU.
Source: Goldman Sachs
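To see why taller stacks and denser dies translate into more HBM per GPU, the sketch below multiplies stack height by per-die density; the 24Gb (3GB) die density and the eight-stack GPU configuration are assumptions chosen for illustration, not figures from the report.

# Illustrative only: HBM capacity scales with stack height and per-die density.
# Assumptions: 24Gb (3GB) DRAM dies and 8 HBM stacks per GPU package.
die_density_gb = 3                      # GB per stacked DRAM die (assumed)
stacks_per_gpu = 8                      # assumed GPU configuration
for stack_height in (8, 12):            # 8-Hi vs 12-Hi HBM3E
    per_stack = stack_height * die_density_gb
    per_gpu = per_stack * stacks_per_gpu
    print(f"{stack_height}-Hi: {per_stack} GB per stack, {per_gpu} GB per GPU")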
3) Higher global cloud capex forecast
All major cloud providers intend to boost their capital spending substantially. Microsoft aims to grow its AI investments, having increased capex by 21.74% in Q1 2024. Google plans to sustain at least $12 billion in capex for the rest of 2024. Meta has raised its capex outlook from $30-$35 billion to $35-$40 billion and expects further growth into 2025.
Overall, Goldman Sachs' team raised its 2024E/2025E/2026E HBM total market size estimates by 16%/24%/31%, from US$13bn/19bn/23bn to US$15bn/23bn/30bn.
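Re-deriving those revisions from the rounded dollar figures gives a similar picture; the percentages below differ slightly from the stated 16%/24%/31% only because the billion-dollar bases are rounded.

# Illustrative cross-check of the stated estimate revisions (rounded bases).
old = {2024: 13, 2025: 19, 2026: 23}    # prior HBM market size estimates, US$bn
new = {2024: 15, 2025: 23, 2026: 30}    # revised estimates, US$bn
for year in old:
    change = (new[year] - old[year]) / old[year] * 100
    print(f"{year}E: US${old[year]}bn -> US${new[year]}bn (+{change:.0f}%)")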
■ What does the competitive landscape look like?
The operating dynamics of each company show that Hynix will likely maintain its position as the leading supplier in the HBM3 and HBM3E segments, holding over 50% of the overall HBM market share for the next two to three years. Samsung Electronics is projected to continue dominating the market for legacy products (HBM2E and earlier versions). Starting from 2025E, Micron's overall HBM revenue is forecasted to grow faster than its peers, showing the most significant market share increase from its currently lower base.
For the current year, the market share estimate for Hynix has been raised and that for Samsung Electronics lowered, reflecting Hynix's smooth yield improvements and its success in supplying the latest-generation HBM to key clients, whereas Samsung Electronics reportedly struggles to obtain qualification from Nvidia. Micron is expected to climb steadily in market share from a low starting point, with projections for it to reach a low-teens percentage by 2026.
Source: Goldman Sachs
■ Goldman Sachs is still bullish on Micron after the stock's 54% rise this year
Goldman Sachs has assigned a "Buy" rating to Micron with a target price of $122. The price target is derived from applying an 18x multiple to the forecasted EPS of $6.80.
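The target price follows directly from the stated inputs, as the quick check below shows; the rounding to $122 is the report's.

# Quick arithmetic check of the stated price target: P/E multiple x forecast EPS.
multiple = 18          # forward earnings multiple applied in the report
eps = 6.80             # forecast EPS, US$
print(f"Implied target: ~US${multiple * eps:.0f}")   # ~US$122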
Guini Lee noted that positive signals are evident regarding Micron's HBM allocation trajectory, with pricing and volumes (through Long-Term Agreements) established for 2024 and indications that both pricing and volumes for 2025 are nearing finalization.
On the product side, Micron has confirmed that its 8-Hi iteration of HBM3E offers a 30% power-efficiency advantage over competitors and began shipping for use in Nvidia's H200 GPUs in the second quarter of 2024.
He also noted that key factors that could negatively impact these projections are: 1) demand for servers, smartphones, and personal computers falling short of expectations, 2) Micron or any of its rivals in the DRAM and NAND sectors failing to maintain disciplined supply management, and 3) challenges in executing node transitions and achieving cost reductions.
Source: SK Hynix, Micron, Goldman Sachs
Disclaimer: Moomoo Technologies Inc. is providing this content for information and educational use only.