Nvidia: demonstrating strength in modernizing computing and building AI factories.

Demand for Nvidia Corporation's AI hardware surged after ChatGPT, leading to record revenue and operating profit and demonstrating strength in modernizing computing and building AI factories despite an architecture transition.

Nvidia's advantage is reinforced by the widespread adoption of its CUDA software; despite potential price advantages, competitors' hardware remains less attractive.

CEO Huang emphasizes Nvidia's role in the AI revolution, modernization of data centers, and establishment of AI factories, forecasting tremendous growth in the AI-driven industry.

Despite the risk of overvaluation similar to the dot-com era, Nvidia's conservative guidance, robust finances, and founder-led management make it a compelling long-term investment.
Introduction

In my August article, I noted that Nvidia Corporation (NASDAQ: NVDA) keeps setting sales records while waiting for the Blackwell architecture. Since then, we have had the September 11 Goldman Sachs (GS) Communacopia Tech Conference and Nvidia's fiscal 3Q25 results. My thesis is that Nvidia is showing strength by modernizing computing at a rapid pace and building AI factories, even in the middle of an architecture transition.

Nvidia's fiscal quarters differ from calendar quarters: they run about a month behind, and the fiscal year is labeled about a year ahead. For example, Nvidia's fiscal 3Q25 ended on October 27, 2024.

The Numbers

In January 2023, ChatGPT reaching 100 million users marked a turning point for the world. Demand for AI hardware began to surge for several reasons. One is that the old ways became suboptimal: instead of having programmers write code by hand and execute it on CPUs, companies chose to modernize data centers with accelerated computing. Another is that new AI companies building new tools need AI factories. ChatGPT was not replacing anything; it was a new type of company. Many other new AI companies are forming, leading to tremendous demand for AI hardware. Nvidia is constructing the AI factories these new companies need.

We are still in the early stages of AI, and I expect Nvidia to demonstrate strength over the next few years. Back in March 2024, on Lex Fridman's podcast, Yann LeCun, Chief AI Scientist at Meta (META), said that large language models (LLMs) lack four critical capabilities of intelligence: understanding the physical world, persistent memory, reasoning, and planning. During the fiscal 3Q25 earnings call, Nvidia CEO Jensen Huang fielded a question from Goldman Sachs (GS) analyst Toshiya Hari about a recent article in The Information discussing overheating in Blackwell chips. Huang appeared unconcerned, citing the many engineering processes involved. Particularly important, he and CFO Colette Kress said Nvidia would deliver more Blackwells in the quarter ending in January than previously expected. Huang also referenced those intelligence capabilities, saying foundation models are advancing to the next level (emphasis added):

And we are at the beginning of a new generation of foundation models capable of reasoning and long-term thinking. One truly exciting area is physical AI, which is AI that understands the structure of the physical world.

Among the large companies in the S&P 500, the ChatGPT turning point brought more profit to Nvidia than to anyone else, but many other large companies that use Nvidia as a supplier also started to thrive. I like to look at Nvidia's operating income and revenue alongside the same figures for some of the large companies that depend on it, like the hyperscale cloud providers. Reports indicate that Nvidia's sales to Azure are the highest, but Microsoft (MSFT) does not break out Azure's numbers the way Alphabet (GOOG, GOOGL) does for Google Cloud and Amazon (AMZN) does for AWS. Nvidia's operating income rose from just $2.14 billion in the quarter ending April 30, 2023, to $21.87 billion in the quarter ending October 27, 2024. Over the same span, Nvidia's revenue surged from $7.19 billion to $35.08 billion. In other words, Nvidia has shown incredible strength over the past six quarters.
*As a reminder, Nvidia's fiscal quarters run about a month behind calendar quarters, and the fiscal year label also differs. I map its reported numbers to the nearest preceding calendar quarter, so fiscal 3Q25, ending October 27, falls under calendar 3Q24, ending September 30.

The July 2024 Google Cloud update showcases advances in TPU AI hardware. Google's TPU has been around since 2015, well before ChatGPT's turning point, and I believe it is growing alongside Nvidia's hardware. The TPU has taken a slightly different path, but it is on a similar journey to Nvidia's GPUs.
Until the end of 2022, Google Cloud was operating at a loss every quarter. Concurrent with ChatGPT's initial rise in popularity, Google Cloud achieved operating income of $191 million on revenue of $7.454 billion in calendar 1Q23. These figures rose to $1.947 billion and $11.353 billion, respectively, by calendar 3Q24. One reason Google Cloud's operating income increased is that it provides Nvidia's hardware to customers at a good profit. Another is that Google Cloud is also making solid profits with its in-house designed TPU. The point is that while Google's TPU is thriving and will continue to do so, it is not a major threat to Nvidia.

The FT's November 13 article describes AWS's Annapurna Labs chips as an alternative to Nvidia's hardware. While this is true on the surface, both hardware lines, Nvidia's and AWS's, are growing. Amazon (AMZN) first mentioned AWS's custom-designed Inferentia AI chip in its fourth-quarter 2018 earnings release, saying it helps customers improve performance and reduce costs for inference workloads. Talk of the Trainium chip began in the fourth-quarter 2020 earnings release, which said it would appear in Amazon EC2 and Amazon SageMaker in the second half of 2021. Like Google's TPU, Amazon's custom silicon seems to be doing well, poised to grow alongside Nvidia.

An AWS blog post from April 2023 speaks to the ChatGPT turning point. Following the surge in ChatGPT's popularity, AWS saw a sharp increase in operating income: $5.1 billion on revenue of $21.4 billion in calendar 1Q23, rising to $10.5 billion and $27.5 billion, respectively, by calendar 3Q24. The reasons behind AWS's silicon growth are similar to what we saw with Google Cloud above. Like Google's in-house TPU, AWS's in-house silicon continues to perform well without posing a major threat to Nvidia.

One reason Nvidia is so powerful is that it is not easily replaced. Even if other companies like Amazon, Google (GOOG), and AMD (AMD) can build hardware that outperforms on specific tasks, engineers are reluctant to switch because they are accustomed to Nvidia's CUDA software. When AMD began disrupting Intel's data center CPU business, at Azure in late 2017 and AWS in 2018, there were no comparable compatibility issues with switching silicon. Another way to gauge Nvidia's strength is to look at data center sales against Intel (INTC) and AMD. A September 2023 Seeking Alpha article shows the trio's shares of the data center oligopoly.
Graphs of this type were useful when the data center trio was growing slowly. However, since the ChatGPT turning point in January 2023, the market has been expanding rapidly, so I now prefer to look at dollars rather than percentages. In AMD's case, a thin slice of a big pizza is better than a wide slice of a small pizza. In other words, revenue dollars now matter more than share of the trio's revenue. AMD grew revenue from $1.3 billion in calendar 1Q23 to $3.5 billion in calendar 3Q24, even as its share of the trio's revenue fell from about 14% to about 9%. AMD got more pizza even as the slice narrowed.

Like AMD, only more so, Nvidia is remarkable, going from $4.3 billion in data center revenue in calendar 1Q23 to $30.8 billion in calendar 3Q24 (for clarity, Nvidia's fiscal quarters are matched to the nearest calendar quarters below). Nvidia continued to gain market share through calendar 1Q24 and has seen rising revenue dollars since then, even though its share of the trio has held relatively stable at 81% to 82%. The data center market is growing very rapidly, and Nvidia's revenue dollars keep climbing even with a stable share. AMD looks steady, but Nvidia is showing strength.
Nvidia's CEO Jensen Huang continues to explain the reasons why these strong numbers are seen every quarter. It's because they are modernizing computing for existing tasks and building AI factories for new companies aiming to become the next ChatGPT. He elaborated on the concept of modernization at the Goldman Sachs Communacopia Tech Conference on September 11 (emphasis added):

If you're processing SQL, speed it up. If you're doing any data processing, speed it up. If you're building an internet company and have a recommendation system, absolutely speed it up, they are fully accelerated now. A few years ago, all of this was running on CPUs. But now, the world's largest data processing engine—the recommender system—is fully accelerated. So, if you have recommender systems, search systems, massive data processing, you need to speed it up. And the first thing that happens is that the $1 trillion general-purpose data centers in the world are modernized to accelerated computing.

Nvidia is not alone in saying we are in the early stages of the AI revolution, including the modernization of data centers and the construction of AI factories. Following the September 11 Goldman Sachs Communacopia Tech Conference, Forbes cited many sources in a September article arguing that AI investment still has a long runway. Matt Garman, CEO of AWS, said current AI use cases are just scratching the surface. Bill McDermott, CEO of ServiceNow (NOW), called AI a source of opportunity for the economy. The article also quotes Mike Scarpelli, CFO of Snowflake (SNOW), and Kevin Scott, CTO of Microsoft, on future AI investment.

Mike Scarpelli, CFO of Snowflake, considers AI to be in the very early innings, explaining that "very few people are using it heavily today in reality." According to Kevin Scott, CTO of Microsoft, when AI adoption becomes ubiquitous, the industry will unlock things that were previously considered impossible or too expensive. Scott believes it may take 5 to 10 years to see what developers can do and what applications they can create.

CEO Huang roughly reiterated this message in response to UBS analyst Timothy Arcuri, describing the two fundamental changes in computing that the world is at the beginning of (emphasis added):

The first is the transition from code running on CPUs to machine learning on GPUs. That fundamental transition from coding to machine learning is widespread at this point; there are no companies not doing machine learning. Machine learning, in turn, enables generative AI. So first, the world's $1 trillion worth of computing systems and data centers is being modernized for machine learning. Second, it is creating a new kind of capability called AI. When we talk about generative AI, we are essentially saying that these data centers are really AI factories. They are creating something. Just as we generate electricity, we are now going to generate AI.

At the end of the 3Q25 call, CEO Huang summed up Nvidia's strength, pointing to exponential growth in both pre-training and post-training compute (emphasis added):

The phenomenal growth of our business is driven by two fundamental trends promoting the global adoption of NVIDIA computing. First, the computing stack is being reinvented, a platform shift from coding to machine learning, from running code on CPUs to processing neural networks on GPUs. The conventional $1 trillion data center infrastructure is being rebuilt for Software 2.0, which applies machine learning to generate AI. Second, the age of AI is in full steam. Generative AI is not just a new software capability but a new industrial revolution that can create a multi-trillion dollar AI industry, with industries manufacturing digital intelligence in AI factories.
All this power is visible in the midst of the hardware transition. Companies of merely moderate strength cannot broadcast much information about their latest and greatest upcoming products during such a shift. Tesla (TSLA), for example, is rumored to be working on a new Model Y but has not officially announced it, to avoid damaging sales of the existing Model Y. With an impressive display of power, Nvidia can announce new products in advance without significantly harming existing sales. In May, Nvidia revealed that its new Blackwell architecture is up to 25 times more cost- and energy-efficient at running LLM generative AI than its current Hopper architecture. Through fiscal 3Q25, customers could not yet purchase Blackwell, but Nvidia's revenue continued to grow. Often, when companies announce new products, sales of old products decrease, but this has not happened with Nvidia's Hopper architecture. That is a show of immense strength.

During Nvidia's 3Q25 call, CFO Kress stated that the demand for the new Blackwell architecture is phenomenal! She also mentioned that Microsoft will be the first provider to offer Blackwell-based cloud instances. Furthermore, she stated that 64 Blackwell GPUs could perform the same work as 256 H100 Hoppers (emphasis added):

Compared to 256 H100s, running the GPT-3 benchmark requires only 64 Blackwell GPUs, a 4x cost reduction. The NVIDIA Blackwell architecture with NVLink Switch enables up to 30x faster inference performance and a new level of inference scaling in throughput and response time, optimal for running new reasoning applications like OpenAI's o1 model. With each new platform shift, a wave of startups emerges. Hundreds of AI-native companies are already offering AI services with great success. While Google, Meta, Microsoft, and OpenAI lead the way, companies like Anthropic, Perplexity, Mistral, Adobe Firefly, Runway, Midjourney, Lightricks, Harvey, Codeium, Cursor, and Abridge are also seeing great success, with thousands of AI-native startups building new services.

Valuation

There are many risks in valuing Nvidia's stock. According to a November 21 FT article, Cisco's (CSCO) stock reached 130 times earnings before the dot-com crash. Investors who bought Cisco then correctly understood that the internet would be important in the coming decades, but they were mistaken about the valuation. Many agree that AI will dramatically change the world; however, just as internet stock valuations were unrealistic before the dot-com crash, AI stock valuations may occasionally get ahead of themselves. Nvidia's overall gross margin is expected to decline next quarter during the early ramp of Blackwell. Hardware from AWS's Annapurna Labs, Google Cloud, AMD, and Intel could potentially take sales from Nvidia. Despite these risks, I remain very optimistic about Nvidia's business and its stock.

Nvidia is worth the amount of cash that can be extracted from the business between now and judgment day. These are some of the numbers I consider when valuing Nvidia.
These historical numbers arose at a time when the world's data centers were worth approximately $1 trillion. CEO Huang detailed the modernization of computing and the building of AI factories during the 3Q25 call when asked about Blackwell by BofA analyst Vivek Arya. Huang said ChatGPT has not replaced anything and believes this moment resembles the era when the world first received the iPhone. Just as mobile-first companies emerged with the iPhone's launch, AI-native companies are now prevalent. Huang notes the need for AI factories in this new industry, giving us an idea of what this market could look like (emphasis added):

Let's assume that over the next four years, as IT grows, the world's data centers get modernized. As you know, IT continues to grow at an annual rate of about 20% to 30%. By 2030, the world's data centers for computing are said to be worth a couple of trillion dollars, and we need to grow into that. We need to modernize data centers from coding to machine learning. That's number one. The second part is generative AI, and we are creating new capabilities the world did not know and new market segments the world did not have. OpenAI did not replace anything. It is completely new. In many ways, it's much like when the iPhone arrived.

In addition to the quantitative considerations in the table above, there are qualitative factors. I think founder-led companies like Nvidia are special. Clearly, Nvidia is not a software-as-a-service (SaaS) company, but it is not a pure hardware company either, since it uses software to design hardware, and fabricators like TSMC (TSM) actually manufacture most of the hardware.

According to the Rule of 40 for SaaS companies, year-over-year revenue growth plus a profit margin should exceed 40%. For the margin side of the rule, FCF margin, operating margin, and EBITDA margin have all been used. In the most recent quarter, Nvidia's revenue growth was 94%, its FCF margin was 48%, and its operating margin was 62%.
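As a minimal sketch in Python, using the figures just quoted, the Rule of 40 check looks like this:

```python
# Rule of 40 sketch: year-over-year revenue growth plus a profit margin
# should sum to at least 40. Inputs are Nvidia's fiscal 3Q25 figures
# quoted in the text above.
def rule_of_40(revenue_growth_pct: float, margin_pct: float) -> bool:
    """Return True if growth plus margin clears the 40% bar."""
    return revenue_growth_pct + margin_pct >= 40.0

growth = 94.0      # YoY revenue growth, %
fcf_margin = 48.0  # free cash flow margin, %
op_margin = 62.0   # operating margin, %

print(rule_of_40(growth, fcf_margin))  # True: 94 + 48 = 142
print(rule_of_40(growth, op_margin))   # True: 94 + 62 = 156
```

Whichever margin is used, Nvidia clears the bar by a wide margin: 142 or 156 against a threshold of 40.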

The management team has a history of conservative guidance. The 2Q25 release guided 3Q25 revenue to $32.5 billion, plus or minus 2%. Per the 3Q25 release, actual revenue was $35.1 billion, which is $2.6 billion, or 8%, above the guide. The 3Q25 release guides 4Q25 revenue to $37.5 billion, plus or minus 2%. It would not be surprising if actual 4Q25 revenue approaches $40 billion.
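The size of that beat is simple arithmetic; a quick Python sketch using the guidance figures above:

```python
# Guidance beat for fiscal 3Q25 (figures in $ billions, from the
# 2Q25 and 3Q25 releases quoted above).
guided = 32.5  # 3Q25 revenue guidance from the 2Q25 release
actual = 35.1  # actual 3Q25 revenue from the 3Q25 release

beat_dollars = actual - guided
beat_pct = beat_dollars / guided * 100

print(f"Beat: ${beat_dollars:.1f}B ({beat_pct:.0f}% above guidance)")
# Beat: $2.6B (8% above guidance)
```

A similar 8% beat on the $37.5 billion 4Q25 guide would put revenue near $40.5 billion, consistent with the author's expectation of roughly $40 billion.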

Nvidia's fiscal 3Q25 operating income was $21,869 million on revenue of $35,082 million. Annualized, operating income is running at about $87.5 billion, and I believe a valuation of 40 to 50 times this amount is not unreasonable. That implies an optimistic range of $3,500 billion to $4,375 billion.

As of November 15, there were 24.49 billion shares outstanding per the 3Q25 filing, and at the November 27 stock price of $135.34, the market capitalization was $3,314 billion. The market cap is below the optimistic valuation range, making this stock a good buy for long-term investors.
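Putting the valuation arithmetic above into a short Python sketch (all inputs are the figures quoted in the text):

```python
# Valuation sketch from the figures above ($ billions unless noted).
quarterly_op_income = 21.869                   # fiscal 3Q25 operating income
annualized = round(4 * quarterly_op_income, 1) # ~87.5 annual run rate

low, high = 40 * annualized, 50 * annualized   # 40x-50x multiple range

shares = 24.49  # shares outstanding, billions (as of November 15)
price = 135.34  # stock price on November 27, $
market_cap = shares * price

print(f"Annualized operating income: ${annualized:.1f}B")
print(f"Optimistic range: ${low:,.0f}B to ${high:,.0f}B")
print(f"Market cap: ${market_cap:,.0f}B (below range: {market_cap < low})")
```

With the market cap of about $3,314 billion sitting below the $3,500 billion bottom of the range, the arithmetic supports the long-term buy thesis, under the stated 40x-50x multiple assumption.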

Future investors need to pay attention to the latest information on Blackwell and other developments. Events mentioned by CFO Kress during the 3Q25 call must not be missed. These include the UBS Global Tech and AI Conference on December 3, CES keynote on January 6, and CES Q&A on January 7.

Disclaimer: None of the materials in this article should be relied upon as formal investment advice. Please do not buy stocks without conducting thorough research.