Weak demand prevents Amazon from deploying AMD's Instinct AI accelerators in the cloud — the company plans to strengthen its portfolio with Nvidia Blackwell GPUs
NOIR(のあ): This topic is important. Stem Inc has adopted low-cost AMD GPUs in AWS's AI cloud infrastructure, but because they do not fit Nvidia's development ecosystem, there is little demand for them. In other words, it is Nvidia's Blackwell GPUs that actually drive demand. Demand for Nvidia is increasingly excessive, keeping supply tight. The financial results from the fourth quarter onward should not disappoint.
103730757 → NOIR(のあ): Fully agree with Noir that this topic is very important: it highlights the weak demand for AMD's AI GPUs compared with the huge excess demand for Nvidia's AI GPUs. Basically, it boils down to Nvidia's popular and widely used CUDA software stack and the tools in Nvidia's ecosystem.
103730757: Thanks Kimihiko for sharing this important article in our community.
NOIR(のあ) → 103730757: Thank you for your insightful comment. I truly appreciate your perspective on this important topic and the clarity you provided regarding Nvidia's ecosystem and its impact on demand.
Kimihiko (OP) → NOIR(のあ): Thank you for your comment♪
103730757: The key question is how long Nvidia can maintain its AI leadership. Another year or two, perhaps? All the other tech giants are developing their own AI chips to reduce their over-dependence on Nvidia.
103730757: Several tech giants are actively developing their own AI chips to power their AI systems and reduce reliance on third-party providers. Some of the notable companies include:
* Google: Google has developed its own Tensor Processing Units (TPUs) specifically designed for machine learning tasks. TPUs are used in Google's data centers to train and run large-scale AI models.
* Amazon: Amazon has developed its own Inferentia chips designed for high-performance inference tasks, which involve running trained AI models on new data. Inferentia chips are used in Amazon's cloud services to power AI applications.
* Apple: Apple is rumored to be developing its own AI chips for use in its devices and data centers. While details are limited, it's expected that these chips will be optimized for machine learning tasks.
These are just a few examples of tech giants developing their own AI chips. Other companies, such as Baidu, Alibaba, and Huawei, are also investing heavily in AI chip development. The trend of developing custom AI chips is driven by the increasing demand for AI processing power and the desire for greater control over hardware and software.