Apple releases OpenELM: small, open source AI models designed to run on-device


by VentureBeat

VentureBeat—In terms of performance, the OpenELM results shared by Apple show that the new models perform fairly well, especially the one with 3 billion parameters.

MacRumors—Apple Releases Open Source AI Models That Run On-Device. Apple today released several open source large language models (LLMs) that are designed to run on-device rather than through cloud servers. Called OpenELM (Open-source Efficient Language Models), the LLMs are available on the Hugging Face Hub, a community for sharing AI code. As outlined in a white paper [PDF], there are eight OpenELM models in total: four pre-trained using the CoreNet library and four instruction-tuned models. Apple uses a layer-wise scaling strategy that is...
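
For readers who want to try the checkpoints themselves, the snippet below is a minimal sketch of pulling one of the OpenELM models from the Hugging Face Hub with the transformers library. The specific model ID and the choice of tokenizer are assumptions based on the Hub listing, not details confirmed by the coverage above.

```python
# Minimal sketch: loading an OpenELM checkpoint from the Hugging Face Hub.
# Assumptions: the repo ID "apple/OpenELM-270M-Instruct" (smallest instruction-tuned
# variant) and a LLaMA-style tokenizer; neither is stated in the articles above.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "apple/OpenELM-270M-Instruct"  # assumed model ID

# The OpenELM repos ship custom modeling code, so remote code must be trusted.
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Assumption: OpenELM reuses a LLaMA-2 tokenizer rather than shipping its own;
# that repo is gated, so Hub access/login may be required.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")

inputs = tokenizer("Once upon a time there was", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the models are small enough to fit in a few gigabytes of memory, the same call pattern can run on a laptop CPU, which is the on-device use case the coverage emphasizes.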

Geeky Gadgets—Apple releases new open source AI models for on-device processing. In a significant move towards enhancing privacy and processing efficiency, Apple has introduced a series of open source large language models (LLMs) known as OpenELM. These models are uniquely designed to operate directly on devices, diverging from the traditional reliance on cloud-based computations. This shift not only promises to improve user privacy by processing data […]

Ars Technica—Apple releases eight small AI language models aimed at on-device use. OpenELM mirrors efforts by Microsoft to make useful small AI language models that run locally.