

Why does Apple consistently refuse to use NVIDIA?

wallstreetcn ·  Dec 25 15:10

In the 2000s the two companies enjoyed a brief "honeymoon period," but over time their business and technological conflicts intensified. Apple has always aimed to build a complete ecosystem, and large-scale procurement of NVIDIA's GPUs would undoubtedly weaken its dominance in the AI field.

In the era of the AI explosion, NVIDIA has all but monopolized the AI chip market with its powerful GPUs, becoming a partner sought after by many technology giants.

However, Apple has kept a conspicuous distance from NVIDIA; it could even be said to be deliberately avoiding the company. The two did share a brief 'honeymoon period' in the 2000s, but over time their conflicts have intensified.

This inevitably raises the question: why has Apple consistently refused to use NVIDIA? What hidden 'grudges and resentments' and strategic considerations lie behind this?

Apple has always sought to create a complete ecosystem, and large-scale procurement of NVIDIA's GPUs would undoubtedly weaken its dominance in the AI field. To break free of this dependence, Apple has adopted a variety of strategies.

However, as competition in AI deepens, Apple is under pressure to train larger and better models, which will require more high-end GPUs. In the short term, this mix of competition and cooperation between the two is likely to persist.

Historical Grievances: From 'Honeymoon Period' to 'Ice Age'

The cooperation between Apple and NVIDIA was not always marked by hostility. As early as 2001, Apple used NVIDIA chips in its Mac computers to strengthen their graphics processing capabilities. At the time, the relationship between the two was good and could even be described as a 'honeymoon period.'

However, this honeymoon period did not last long.

The first significant event that caused a rift in their relationship occurred in the mid-2000s. At that time, Steve Jobs publicly accused NVIDIA of stealing technology from Pixar Animation Studios (of which Jobs was a major shareholder), undoubtedly casting a shadow over their relationship.

In 2008, the tension between the two parties further escalated. At that time, a batch of defective GPUs produced by NVIDIA was used in several laptops, including the Apple MacBook Pro, leading to widespread quality issues known as the 'bumpgate' incident.

NVIDIA initially refused to take full responsibility and compensate, which enraged Apple and directly led to the breakdown of their cooperative relationship. Apple had to extend the warranty period for the affected MacBooks and endured significant economic and reputational losses.

According to The Information, citing Apple insiders, NVIDIA executives had long viewed Apple as a 'demanding' and 'low-margin' customer and were unwilling to devote many resources to it. After the success of the iPod, Apple also grew more assertive and found working with NVIDIA difficult. NVIDIA's attempt to charge licensing fees for the graphics chips used in Apple's mobile devices exacerbated the conflict further.

The Game of Business and Technology Strategy

In addition to historical grievances, Apple's refusal to use NVIDIA is closely related to its consistent business strategy.

Apple has always emphasized comprehensive control over its product hardware and software, striving to create a complete ecosystem. To achieve this goal, Apple continuously strengthens its independent research and development capabilities, reducing its reliance on external suppliers.

In the chip field, Apple sits at the forefront of the industry. From the iPhone's A-series chips to the Mac's M-series chips, Apple has kept launching high-performance in-house silicon, gradually cutting its dependence on traditional chip giants such as Intel. Against this backdrop, Apple is naturally unwilling to be constrained by NVIDIA in the field of AI chips.

Apple hopes to have complete control over key technologies to ensure product performance optimization and differentiated competitive advantages. Relying heavily on NVIDIA's GPUs would undoubtedly weaken Apple's dominance in the AI field, limiting its product innovation and technological roadmap.

In addition, although NVIDIA's GPUs are powerful, they also run hot and draw considerable power, a significant problem for Apple's pursuit of light, portable products. Apple has always strived to make its products thinner, lighter, and more efficient, and NVIDIA's GPUs run somewhat counter to that design philosophy.

Apple repeatedly asked NVIDIA to customize low-power, low-heat GPU chips for its MacBooks, without success. This prompted Apple to turn to AMD and work with it to develop custom graphics chips. Although AMD's chips trail NVIDIA's slightly in performance, their power consumption and heat output better match Apple's needs.

New Challenges in the AI Wave

In recent years, the explosive development of AI technology has posed new challenges for Apple. To remain competitive in the AI field, Apple needs to train larger and more complex AI models, which undoubtedly requires more powerful computing capabilities and additional GPU resources.

To break free from reliance on NVIDIA, Apple has adopted a multi-pronged strategy.

Firstly, Apple primarily rents NVIDIA's GPUs through cloud service providers like Amazon and Microsoft rather than making large purchases. This approach can avoid significant financial investments and long-term reliance.

Secondly, Apple has previously used AMD's graphics chips and collaborated with Google to use its TPU (Tensor Processing Unit) for AI model training.

In addition, Apple is working with Broadcom to develop its own AI server chip, codenamed 'Baltra', which is expected to enter mass production by 2026. The chip is intended not only for inference but possibly also for training AI models.

Although Apple has been striving to reduce its reliance on NVIDIA, the relationship of competition and cooperation between the two may well persist in the short term. For Apple, mastering core technologies is the key to staying competitive in a fiercely contested market.

Disclaimer: This content is for informational and educational purposes only and does not constitute a recommendation or endorsement of any specific investment or investment strategy.