Artificial intelligence

Apple Trained Its AI on Alphabet Chips Instead of Expensive Nvidia

The iPhone maker has published technical documentation for its Apple Intelligence AI, revealing that its neural networks were trained on Google's cloud infrastructure, running on Tensor Processing Units (TPUs), the chips designed by Google's parent company Alphabet. Apple has thus found a cheaper alternative to Nvidia chips.

Apple does not directly name the developer of the processors used to train its Apple Foundation Model language models, but the note refers to "cloud clusters based on TPUs" — the abbreviation Google uses for its tensor processors. This wording also reveals that Apple leased cloud computing resources from Google rather than building its own training infrastructure, an entirely reasonable approach at this stage of the company's AI development.

The large language model that will run on Apple's devices was trained on a cluster of 2,048 of Google's TPU v5p chips, considered its most advanced. The server-side model was trained on a cluster of 8,192 TPU v4 chips. Google rents out these clusters at $2 per hour per processor used by a client.
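From the chip counts and rental rate above, one can sketch a rough cost estimate. A minimal Python example follows; the chip counts and the $2-per-chip-hour rate come from the article, while the training duration is a purely illustrative assumption (Apple has not disclosed how long training took):

```python
# Rough rental-cost estimate for TPU clusters at the article's quoted rate.
# Chip counts and the $2/chip-hour rate are from the article; the 24-hour
# duration below is a hypothetical assumption for illustration only.

RATE_PER_CHIP_HOUR = 2.0  # USD per processor per hour, as quoted


def cluster_cost(num_chips: int, hours: float,
                 rate: float = RATE_PER_CHIP_HOUR) -> float:
    """Total rental cost for a cluster of num_chips running for hours."""
    return num_chips * hours * rate


# On-device model: 2,048 TPU v5p chips; server model: 8,192 TPU v4 chips.
on_device = cluster_cost(2_048, hours=24)   # 2,048 * 24 * $2 = $98,304
server = cluster_cost(8_192, hours=24)      # 8,192 * 24 * $2 = $393,216
print(f"On-device cluster, 24 h: ${on_device:,.0f}")
print(f"Server cluster, 24 h:    ${server:,.0f}")
```

Even this toy calculation shows why leasing is attractive at this stage: costs scale linearly with chip-hours, with no upfront hardware purchase.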

Google itself is one of Nvidia’s biggest customers, using Nvidia GPUs and its own TPUs to train AI systems, and selling access to Nvidia technology in its cloud.
