
Who won the AI semiconductor match race between AMD and NVIDIA?

moomoo News (US Stocks) wrote a column · Nov 14, 2023 00:53
This article uses automatic translation for some of its parts
$NVIDIA(NVDA.US)$ announced a next-generation AI supercomputer chip that is likely to play a major role in future breakthroughs in deep learning and large language models (LLMs) such as OpenAI's GPT-4.
The key product is the HGX H200 GPU, which is based on NVIDIA's “Hopper” architecture and succeeds the popular H100 GPU. It is the company's first chip to use faster, higher-capacity HBM3e memory, making it well suited to large language models.
“With HBM3e, the NVIDIA H200 delivers 141 GB of memory at 4.8 terabytes per second, nearly double the capacity and 2.4 times the bandwidth of its predecessor, the NVIDIA A100,” the company stated.
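As a rough sanity check of that claim, the ratios line up if you use the A100's commonly cited 80 GB of HBM2e at roughly 2.0 TB/s; those A100 figures are an assumption taken from public spec sheets, not from this article.

    # Rough check of "nearly double the capacity, 2.4x the bandwidth" (H200 vs. A100).
    # A100 figures (80 GB, ~2.0 TB/s) are assumed from public spec sheets.
    h200_capacity_gb, h200_bw_tbs = 141, 4.8
    a100_capacity_gb, a100_bw_tbs = 80, 2.0
    print(f"capacity:  {h200_capacity_gb / a100_capacity_gb:.2f}x")  # ~1.76x, "nearly double"
    print(f"bandwidth: {h200_bw_tbs / a100_bw_tbs:.2f}x")            # 2.40x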
AI performance
On the AI side, NVIDIA says the HGX H200 roughly doubles inference speed on Llama 2, a 70-billion-parameter LLM, compared with the H100. The HGX H200 is available in 4- and 8-way configurations that are compatible with both H100 system software and hardware. It can also be deployed in every type of data center (on-premises, cloud, hybrid cloud, and edge), including Amazon Web Services, Google Cloud, Microsoft Azure, and Oracle Cloud Infrastructure. Availability is scheduled to begin in the second quarter of 2024.
GH200 Grace Hopper
Another key NVIDIA product is the GH200 Grace Hopper “superchip,” which combines an H200 GPU with an Arm-based NVIDIA Grace CPU using the company's NVLink-C2C interconnect.
It is designed for supercomputers, and NVIDIA stated that it lets “scientists and researchers tackle the world's most difficult problems by accelerating complex AI and HPC applications that process terabytes of data.” The company has said the GH200 will be used in “over 40 AI supercomputers from research centers, system manufacturers, and cloud providers around the world,” including Dell, Eviden, Hewlett Packard Enterprise (HPE), Lenovo, QCT, and Supermicro.
Notably, HPE's Cray EX2500 supercomputer, which uses quad GH200 configurations, can scale up to tens of thousands of Grace Hopper Superchip nodes.
AMD Instinct MI300X
On December 6, $Advanced Micro Devices(AMD.US)$ will hold its “Advancing AI” event, where it will reveal the full details of the next-generation Instinct accelerator family codenamed “MI300.” The AMD Instinct MI300X clearly targets NVIDIA's Hopper and Intel's Gaudi accelerators in the AI field, so it is the chip drawing the most attention.
AMD adopted a chiplet design for the MI300X and MI300A accelerators, using $Taiwan Semiconductor(TSM.US)$'s advanced packaging technology to compete with the monolithic H100 and H200 Hopper AI GPUs. The new Instinct MI300X combines 5 nm and 6 nm IP and packs an enormous 153 billion transistors.
Memory is where the biggest gains appear: the MI300X carries 192 GB of HBM3, 50% more than the previous-generation MI250X (128 GB). To build the 192 GB memory pool, AMD equips the MI300X with eight HBM3 stacks, each 12-Hi and made up of 16 Gb ICs.
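As a quick back-of-the-envelope sketch, here is how that stack configuration adds up to 192 GB; it simply restates the figures above.

    # How eight 12-Hi HBM3 stacks built from 16 Gb ICs reach 192 GB.
    gb_per_ic = 16 / 8             # a 16 Gb DRAM die holds 2 GB
    gb_per_stack = 12 * gb_per_ic  # 12-Hi stack -> 24 GB
    total_gb = 8 * gb_per_stack    # 8 stacks -> 192 GB
    print(gb_per_stack, total_gb)  # 24.0 192.0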
This 192 GB of HBM3 memory provides up to 5.2 TB/s of bandwidth and 896 GB/s of Infinity Fabric bandwidth. For comparison, the recently announced $NVIDIA(NVDA.US)$ H200 AI accelerator offers 141 GB of capacity, and $Intel(INTC.US)$'s Gaudi 3 offers 144 GB. Large memory pools are extremely important for LLMs, which are largely memory-bound, so by leading on memory AMD can clearly demonstrate its strength in AI.
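To illustrate why capacity matters for memory-bound LLMs, here is a rough, weights-only estimate for a 70-billion-parameter model in 16-bit precision; it ignores KV cache, activations, and framework overhead, so treat it as an assumption-laden sketch rather than a sizing guide.

    # Weights-only memory estimate for a 70B-parameter LLM in FP16/BF16 (2 bytes/param).
    # Ignores KV cache, activations, and framework overhead.
    params = 70e9
    weights_gb = params * 2 / 1e9  # ~140 GB
    for name, capacity_gb in [("H200", 141), ("Gaudi 3", 144), ("MI300X", 192)]:
        print(f"{name}: {capacity_gb - weights_gb:.0f} GB headroom over ~{weights_gb:.0f} GB of weights")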
Sources: Official websites of each company, Bloomberg, Engadget, Wccftech, Tweaktown, CNBC
moomoo news - Zeber
Disclaimer: Moomoo Technologies Inc. is providing this content for information and educational use only.