
Apple confirms: the models behind Apple Intelligence were trained on Google's custom chips

wallstreetcn ·  19:35

Apple's paper reveals that the server AFM, a large server-side language model, was trained on 8,192 Google TPUv4 chips over 6.3 trillion tokens, while the on-device AFM was trained on 2,048 TPUv5p chips; both models were trained on "Cloud TPU clusters."

Author of this article: Li Dan.


Publicly released documents show that Apple Intelligence, the company's own artificial intelligence (AI) system, was developed with the support of Google's custom chips.

On Monday, July 29, Eastern Time, Apple published a technical paper on its official website detailing the foundation language models that power Apple Intelligence, its personal intelligence system: an on-device Apple Foundation Model (AFM) with about 3 billion parameters, designed to run efficiently on devices, and a large server language model, the server AFM, built for Private Cloud Compute, Apple's cloud-based AI architecture.

In the paper, Apple describes the on-device AFM and the server AFM as members of its family of generative models built to serve users and developers, and discloses that the models were trained on Google's fourth-generation AI ASIC, the TPUv4, and its newer-generation TPUv5 chips. The paper states:

"We trained Server AFM from scratch on 8,192 TPUv4 chips using a sequence length of 4,096 and a batch size of 4,096, resulting in 6.3 trillion token training."

"End-Side AFM was trained on 2,048 TPUv5p chips."

Nowhere in the 47-page paper does Apple mention Google or Nvidia by name, but it says that the AFM models were trained on "Cloud TPU clusters," meaning Apple rents computing capacity from a cloud provider to run the training.

In fact, during Apple's Worldwide Developers Conference in June this year, media reports already spotted in a technical document released by Apple that Google had become another winner in Apple's AI push: when building the foundation models, Apple's engineers used the company's own framework software together with a range of hardware, including tensor processing units (TPUs) available only on Google Cloud. Apple has not disclosed, however, how heavily it relies on Google's chips and software compared with other AI hardware suppliers such as Nvidia.
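For readers unfamiliar with what training on a cloud TPU cluster looks like in practice, the sketch below shows the generic data-parallel pattern in JAX: a global batch is split across TPU devices and gradients are averaged with a collective operation. It is purely illustrative; Apple's paper credits the company's own framework, and the toy model and sizes below are not from Apple.

    # Illustrative only: a generic data-parallel training step in JAX on TPU devices.
    # This is not Apple's training code; the toy model and all sizes are made up.
    import functools
    import jax
    import jax.numpy as jnp

    def loss_fn(params, batch):
        # Toy quadratic loss standing in for a language-model loss.
        preds = batch @ params
        return jnp.mean(preds ** 2)

    @functools.partial(jax.pmap, axis_name="data")
    def train_step(params, batch):
        grads = jax.grad(loss_fn)(params, batch)
        grads = jax.lax.pmean(grads, axis_name="data")  # all-reduce gradients across devices
        return params - 1e-3 * grads                    # plain SGD update for illustration

    n_dev = jax.local_device_count()                    # TPU cores visible to this host
    params = jax.device_put_replicated(jnp.zeros((128, 1)), jax.local_devices())
    global_batch = jnp.ones((n_dev, 8, 128))            # (devices, per-device batch, features)
    params = train_step(params, global_batch)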

Commenters on the social platform X pointed out on Monday that Apple's use of Google chips had already come to light in June, and that the new paper provides more detail about the training stack.

Some commenters argued that Apple does not dislike Nvidia; TPUs are simply fast. Others said that since TPUs are faster, it makes sense for Apple to use them, and they may also be cheaper than Nvidia's chips.

Media commentary on Monday noted that Google's TPU was originally created for internal workloads and is now being used more widely. Apple's decision to train its models on Google chips suggests that some tech giants are looking for, and may have found, alternatives to Nvidia's chips for AI training.

Wallstreetcn has previously noted that last week, Meta CEO Mark Zuckerberg and Alphabet and Google CEO Sundar Pichai both hinted in their remarks that their companies and other tech firms may be spending too much on AI infrastructure, "perhaps investing too much in AI," but both acknowledged that the commercial risk of not doing so would be too high.

Zuckerberg said:

"The consequence of lagging behind is that you will be at a disadvantage in the most important technology over the next 10 to 15 years."

Pichai said:

"AI is expensive, but the risk of underinvestment is greater. Google may have invested too much in AI infrastructure, including purchasing Nvidia GPUs, but even if the AI boom slows down, the data centers and computer chips the company has purchased can be used for other purposes. For us, the risk of underinvestment is far greater than the risk of overinvestment."

Disclaimer: This content is for informational and educational purposes only and does not constitute a recommendation or endorsement of any specific investment or investment strategy.