Apple releases new family of Open-source Efficient Language Models as AI work progresses


by 9to5Mac

9to5Mac— Ahead of iOS 18’s debut at WWDC in June, Apple has released a family of open-source large language models. Called OpenELM, Apple describes them as “a family of Open-source Efficient Language Models.” In its testing, Apple says OpenELM offers performance similar to other open language models, but with less training data.

Geeky Gadgets—Apple releases new open-source AI models for on-device processing. In a significant move toward enhancing privacy and processing efficiency, Apple has introduced a series of open-source large language models (LLMs) known as OpenELM. These models are designed to run directly on devices, diverging from the traditional reliance on cloud-based computation. This shift promises to improve user privacy by processing data […]

MacRumors—Apple Releases Open Source AI Models That Run On-Device. Apple today released several open-source large language models (LLMs) that are designed to run on-device rather than through cloud servers. Called OpenELM (Open-source Efficient Language Models), the LLMs are available on the Hugging Face Hub, a community for sharing AI code. As outlined in a white paper [PDF], there are eight OpenELM models in total: four pre-trained using the CoreNet library, and four instruction-tuned models. Apple uses a layer-wise scaling strategy that is...
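The layer-wise scaling idea mentioned above can be illustrated in a few lines: rather than giving every transformer layer the same width, parameters are allocated non-uniformly, growing with depth. The sketch below is a minimal illustration of that allocation pattern; the function name, the linear interpolation, and the `alpha`/`beta` ranges are assumptions for demonstration, not Apple's published configuration.

```python
def layer_wise_scaling(num_layers, d_model, d_head,
                       alpha=(0.5, 1.0), beta=(0.5, 4.0)):
    """Illustrative per-layer width allocation.

    For each layer, linearly interpolate a head-count scale (alpha)
    and an FFN-width multiplier (beta) from the first layer to the
    last, so early layers are narrower and later layers wider.
    Returns a list of (attention_heads, ffn_dim) tuples.
    """
    configs = []
    for i in range(num_layers):
        t = i / (num_layers - 1) if num_layers > 1 else 0.0
        a = alpha[0] + (alpha[1] - alpha[0]) * t   # head-count scale
        b = beta[0] + (beta[1] - beta[0]) * t      # FFN-width multiplier
        heads = max(1, round(a * d_model / d_head))
        ffn_dim = round(b * d_model)
        configs.append((heads, ffn_dim))
    return configs

# Example: a toy 4-layer model with model dim 1024 and head dim 64.
# Head count and FFN width both increase with depth.
print(layer_wise_scaling(num_layers=4, d_model=1024, d_head=64))
```

Under a uniform allocation every layer would get the same `(heads, ffn_dim)` pair; the interpolation above shifts capacity toward deeper layers at the same total parameter budget, which is the intuition behind the strategy the white paper describes.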

VentureBeat—Apple releases OpenELM: small, open source AI models designed to run on-device. In terms of performance, the OpenELM results shared by Apple show that the new models perform fairly well, especially the one with 3 billion parameters.