Nvidia: Why skeptics keep missing the point

Despite skepticism about the ROI of AI infrastructure investment and competition from AMD's MI300X chip, NVIDIA's stock price has skyrocketed, driven by strong AI demand.
Cloud providers such as AWS, Azure, and Google Cloud have shown rising profitability, contradicting the claim that low AI ROI is weighing on Nvidia's chip demand.
The transformative impact of AI extends beyond consumer applications, and fast computing is becoming essential for traditional data center tasks.
Nvidia's valuation reflects the company's leading position in the AI revolution, and sales are expected to increase significantly, justifying its premium price.
Investing Essay
I remember the first time I wrote about Nvidia Corporation (NASDAQ: NVDA). That article carried a "buy" rating. NVDA had risen 240% year over year, and questions were emerging about whether the stock could sustain its sharp rise. The main concern for many was intensifying competition, as AMD was preparing to launch its MI300X chip as an answer to NVDA's H100 at the time.
Things have moved rapidly since then. By May 2024, my second article was essentially a rebuttal of the voices exaggerating AMD's MI300X market position, citing the chip's underwhelming debut in the first quarter of 2024. By that point, NVDA had risen another 300%, and a new breed of skeptic had emerged, questioning whether AI demand would continue. I did my best to argue that NVDA is in the early stages of a secular trend spanning years, if not decades, and pointed to Meta's enormous computational needs as an example of the market's size.
Since the May article, NVDA has risen 55%. Today, investors face a different kind of skeptic: those who argue that technology companies will eventually rein in capital spending given the low return on investment (ROI) of AI infrastructure. This article addresses those concerns.
Erroneous assumptions
For trained analysts, dissecting arguments becomes second nature. Let's break this one down: demand for NVDA chips will decline as technology companies become aware of the imbalance between AI capital investment and ROI.
Premise 1: The ROI of AI projects is low.
Premise 2: Businesses make AI capital investment decisions based on ROI.
Conclusion: Given these premises, demand for NVDA chips will decrease.
The premises underlying this argument are false. Granted, NVDA has a diverse client base, and some customers may well see low ROI, but let's start at the beginning.
Half of NVDA's data center sales, or 42% of total sales, come from public cloud companies: Microsoft (MSFT) with Azure, Amazon (AMZN) with AWS, and Alphabet (GOOG) with Google Cloud. If the return on AI investment were as weak as claimed, we should see declining profit margins, because under GAAP accounting the depreciation of this equipment is recorded in cost of goods sold (COGS). Once a server is up and running, a portion of its value is expensed through COGS regardless of the revenue it generates.
What we're actually seeing is rising profitability. AWS's operating margin reached 38% last quarter, up from 30% in the same period a year earlier. In the second quarter of 2022, before the generative AI boom, AWS's margin was 26%, lower than it is now that AMZN has accelerated its AI capex. Azure's margins have been stable for the past three years: 43.6%, 44.5%, and 44.1% for Q3 2024, Q3 2023, and Q3 2022, respectively. Google doesn't break out profit by segment, but its gross margins show no sign of shrinking; they're moving in the right direction.
Since consumer AI applications aren't growing as fast as expected, I'm sure some people are skeptical about AI's ROI. People are still waiting for Apple's (AAPL) AI-powered Siri. Behind the scenes, however, AI is delivering significant cost benefits. JPMorgan (JPM) recently introduced a chatbot that helps employees summarize documents and brainstorm ideas. GitHub (part of Microsoft) now lets programmers generate computer code from a few natural-language prompts, improving productivity. Seeking Alpha recently launched a virtual analyst feature that summarizes analysts' articles. Consumer applications are growing too, driving higher productivity, higher sales, and lower costs.
Don't get me wrong. I was among the first to warn investors that, while generative AI is a growth opportunity for cloud providers, the pace of that growth wasn't worth the hype. I was right. But that is entirely different from saying AI isn't profitable.
The delayed AI upgrades for popular apps such as Apple's Siri and Amazon's Alexa may look like evidence of low ROI. In reality, the delays have more to do with those apps' business models than with the capabilities of NVDA's chips or AI technology. The good news is that NVDA has lowered its price per unit of performance with each product upgrade. For example, the A100, launched in 2020, delivers 19.5 teraflops and is leased by DataCrunch for $1.89 per hour. The H100, launched in 2022, delivers 67 teraflops and rents for $3.17 per hour, a higher price reflecting cloud providers' higher equipment costs for the H100. Even so, performance per dollar is far better on the H100 (21 teraflops per dollar) than on the older A100 (10 teraflops per dollar).
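The performance-per-dollar comparison above can be sketched as a quick calculation. This is an illustrative snippet, not anything NVDA or DataCrunch publishes; the TFLOPS figures and hourly rates are simply the numbers quoted in the text.

```python
# Illustrative sketch of the performance-per-dollar comparison in the text.
# Specs and hourly rental rates are the figures quoted above (assumptions,
# not official pricing data).
gpus = {
    "A100 (2020)": {"tflops": 19.5, "usd_per_hour": 1.89},
    "H100 (2022)": {"tflops": 67.0, "usd_per_hour": 3.17},
}

def tflops_per_dollar(tflops: float, usd_per_hour: float) -> float:
    """Peak teraflops obtained per dollar of hourly rental cost."""
    return tflops / usd_per_hour

for name, spec in gpus.items():
    ratio = tflops_per_dollar(spec["tflops"], spec["usd_per_hour"])
    print(f"{name}: {ratio:.1f} TFLOPS per dollar")
# A100 works out to roughly 10 TFLOPS per dollar, the H100 to roughly 21,
# matching the article's claim that the newer chip is about twice as economical.
```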
As costs continue to fall, it will become more economical to add AI features to freemium products such as Siri and Alexa. We also can't ignore that generative AI is a new and imperfect technology, which likewise slows the pace of consumer AI application development.
More than large language models
I have also covered Intel (INTC), and if you listen to management's comments, you can see they hold a curious view of their market position. They distinguish spending on traditional data centers from spending on AI data centers, and they believe the latter is a temporary surge that will eventually balance out.
In particular, major cloud customers are putting a lot of effort into building out high-end AI training environments, so much of the budget is focused on, or prioritized toward, the AI build-out. That said, this is a short-term surge, and we expect it to balance out over time. - Intel Q2 2023 earnings call
I don't think the budget shift from traditional racks to AI racks in the data center market is temporary. Many workloads that used to run on traditional data center CPUs, such as search engines and recommendation algorithms, have moved to accelerated computing on GPUs. Wherever big data exists, it is likely more efficient to process it on an accelerated platform than on traditional data center hardware. I believe Intel's loss of data center market share is more permanent than management is telling shareholders.
Valuation
I believe we are now in the early stages of a long-term AI tailwind that will span years, if not decades. This tailwind will transform the world as we know it. Emotionally intelligent digital assistants could well become a reality. Neuralink, which maps our thoughts, isn't as far-fetched as it was 15 years ago. General-purpose humanoid robots are in development. All of these innovations require enormous computational power.
Current computing power is clearly insufficient. Even mundane tasks such as generating AI images are too slow. Queries in the latest version of ChatGPT, which supports basic reasoning, are throttled by compute limits. All of this suggests today's AI infrastructure needs many more chips. This year, Wall Street expects NVDA sales to increase 115% to $129 billion, and estimates 2025 sales will grow another 47% to $192 billion. I think NVDA has the potential to surprise to the upside: the pace of AI innovation has accelerated dramatically, and I expect NVDA's sales to grow alongside those advances.
The company trades at 48 times its 2024 earnings forecast. With next year's EPS forecast at $4.37 per share, the forward P/E drops to 32. NVDA still isn't the cheapest chip company, but the multiple is within industry range. Equally important, given its dominant position in the AI data center market, NVDA should command a premium for the foreseeable future.
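The two multiples above imply a consistency check worth making explicit. The sketch below backs out the share price implied by the stated forward multiple and EPS; the price and the 2024 EPS it produces are inferences from the article's figures, not quoted market data.

```python
# Sketch of the forward P/E arithmetic implied by the article's figures.
# The share price here is backed out from the stated multiple and EPS;
# it is an inference for illustration, not a quoted market price.
eps_2025 = 4.37          # next year's EPS forecast, per the article
forward_pe_2025 = 32     # forward multiple stated in the article
trailing_pe_2024 = 48    # multiple on 2024 earnings, per the article

implied_price = eps_2025 * forward_pe_2025        # price consistent with 32x $4.37
implied_eps_2024 = implied_price / trailing_pe_2024  # EPS consistent with 48x

print(f"Implied share price: ${implied_price:.2f}")   # $139.84
print(f"Implied 2024 EPS:    ${implied_eps_2024:.2f}")  # $2.91
```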
Third quarter results
Data centers now account for the majority of NVDA's revenue, and half of that revenue comes from a handful of public cloud providers that lease AI computing capacity to the public. From the fourth quarter of fiscal 2023 through the third quarter of fiscal 2024, data center sales increased nearly tenfold. Over 24 months, the gap between data center and gaming revenue has grown from $1.6 billion to around $27 billion, reflecting the "infrastructure build-out" phase we're currently in.
Also notable is that NVDA's gaming revenue has nearly doubled. This reflects advances in its GeForce-branded GPUs, but also a halo effect from NVDA's leading position in AI.
The Professional Visualization segment, which houses NVDA's virtual and augmented reality business, generated $486 million in the third quarter, up 17% year over year. A respectable result, but one that still doesn't fully reflect the market opportunity.
The automotive segment grew 71% in the third quarter and, if the trend continues, will surpass professional visualization by the fourth quarter of 2024. NVDA GPUs are used in autonomous vehicles for real-time image analysis. Roughly 75 million cars were sold last year. As Tesla (TSLA), Cruise (GM), Waymo (GOOG) (GOOGL), and others continue to advance autonomous driving and as regulatory frameworks for it develop, NVDA's sales in this area could scale rapidly, much as data center sales did two years ago.
Final thoughts and why I might be wrong
Volatility is to be expected for any company opening up new markets. But I believe we have entered a phase of rapid AI advances that will transform every industry. Think of self-driving cars, high-resolution virtual reality, and brain-computer interfaces such as Neuralink. Growth won't be linear, but the rewards could be substantial for those willing to ride the waves.
Beyond potential volatility and nonlinearity in demand, competition risk is likely to become more pronounced over the medium and long term. In business, winner-take-all situations are rare. Different customers will favor different AI solutions, whether Intel's Xeon CPUs and Gaudi accelerators or AMD's EPYC and Instinct product lines.
Today, however, NVDA has the best GPUs, and its competitive edge is reinforced by a software ecosystem of frameworks such as CUDA, TensorRT, and Omniverse. These frameworks let developers optimize NVDA hardware for specific needs; equally important, they make it harder for data centers to switch to different hardware.
Disclaimer: Community is offered by Moomoo Technologies Inc. and is for educational purposes only.