Apple Introduces OpenELM Language Models
Apple has released a family of open-source large language models (LLMs).
Their distinguishing feature is that they run directly on the user's device without relying on cloud servers. The project, called OpenELM (Open-source Efficient Language Models), is available on the Hugging Face Hub, a platform for sharing models and code in the field of artificial intelligence.
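Because the checkpoints are hosted on the Hugging Face Hub, they can in principle be loaded with the standard `transformers` API. The snippet below is a minimal sketch, not an official recipe: the repository name `apple/OpenELM-270M`, the use of a Llama-2 tokenizer, and the `trust_remote_code=True` flag are assumptions about how the models are published, so check the actual model card before running it.

```python
# Minimal sketch of loading an OpenELM checkpoint from the Hugging Face Hub.
# Assumptions (verify against the model card): the repo id "apple/OpenELM-270M",
# a Llama-2 tokenizer, and that trust_remote_code=True is required.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "apple/OpenELM-270M"            # assumed repository name
tokenizer_id = "meta-llama/Llama-2-7b-hf"  # assumed tokenizer (gated; may require access approval)

tokenizer = AutoTokenizer.from_pretrained(tokenizer_id)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "On-device language models are useful because"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```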
The OpenELM series comes in four sizes: 270 million, 450 million, 1.1 billion, and 3 billion parameters. Parameters are the learned internal variables a model uses to make predictions from its training data. For comparison, Microsoft's recently released Phi-3 Mini has 3.8 billion parameters, and Google's Gemma has 2.2 billion. Smaller models are cheaper to run and better suited to phones and laptops.
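A rough back-of-the-envelope calculation shows why the smaller variants fit on-device: at 16-bit precision each parameter takes two bytes, so the weights alone range from roughly half a gigabyte to about six gigabytes. The sketch below is an illustrative estimate only; it ignores activations, the KV cache, and any quantization that would shrink the footprint further.

```python
# Back-of-the-envelope weight footprint per OpenELM variant, assuming
# 16-bit (2-byte) weights; activations and KV cache are not counted.
param_counts = {"270M": 270e6, "450M": 450e6, "1.1B": 1.1e9, "3B": 3e9}
for name, n_params in param_counts.items():
    gigabytes = n_params * 2 / 1e9
    print(f"OpenELM-{name}: ~{gigabytes:.2f} GB of weights")
```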
Apple says OpenELM is being published to empower the research community by providing access to state-of-the-art language models.