Training surpasses inference in AI infrastructure spending: Bernstein
According to Bernstein Societe Generale Group, as artificial intelligence companies race to build ever-larger large language models, Microsoft (NASDAQ: MSFT), Amazon (NASDAQ: AMZN), Google (NASDAQ: GOOG, GOOGL), and Meta Platforms (NASDAQ: META) are expected to spend a combined $160 billion on AI infrastructure in 2024.
Investors had generally expected spending on inference to become the dominant cost, but Bernstein now argues that training costs have grown substantially faster, in part because inference has become increasingly efficient. According to the firm's latest data, inference accounts for only about 5% of AI infrastructure spending.
Bernstein found that each new generation of LLMs has required roughly 10 times more infrastructure spending than the one before it. For example, OpenAI's GPT-2 was trained on a chip cluster costing about $3 million, while training GPT-3 required roughly $30 million worth of hardware. GPT-4 was then trained on 25,000 A100 GPUs, at a cost of about $300 million.
With that history in mind, Bernstein estimates that training a GPT-5-class model would require about 100,000 Nvidia H100 GPUs, at a cost of roughly $3 billion. By comparison, OpenAI is projected to generate about $3.7 billion in revenue in 2024.
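As a rough sanity check of the figures above, the sketch below walks through the ~10x-per-generation progression and the GPT-5 extrapolation. The $3M/$30M/$300M figures, the 100,000-H100 count, and the $3.7B revenue projection come from the article; the ~$30,000 per-H100 unit price used to reproduce the $3 billion estimate is an assumption for illustration, not a figure from Bernstein.

```python
# Back-of-the-envelope check of Bernstein's ~10x-per-generation training-cost claim.

training_hardware_cost = {
    "GPT-2": 3e6,     # ~$3 million chip cluster
    "GPT-3": 30e6,    # ~$30 million of hardware
    "GPT-4": 300e6,   # ~25,000 A100s, roughly $300 million
}

# Each generation costs roughly 10x the previous one.
models = list(training_hardware_cost)
for prev, curr in zip(models, models[1:]):
    ratio = training_hardware_cost[curr] / training_hardware_cost[prev]
    print(f"{prev} -> {curr}: ~{ratio:.0f}x")

# Extrapolate to GPT-5: 100,000 H100s at an assumed ~$30,000 per GPU.
h100_units = 100_000
assumed_h100_price = 30_000                      # assumption, not from the article
gpt5_estimate = h100_units * assumed_h100_price
print(f"GPT-5 hardware estimate: ${gpt5_estimate / 1e9:.1f}B")   # ~$3.0B

# Compare against OpenAI's projected 2024 revenue of ~$3.7B (from the article).
projected_2024_revenue = 3.7e9
print(f"Estimated training hardware vs. projected revenue: "
      f"{gpt5_estimate / projected_2024_revenue:.0%}")
```

On those assumptions, the hardware for a single training run would amount to roughly 80% of OpenAI's projected 2024 revenue, which is the tension Bernstein's note highlights.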