Apple lets developers use offline AI models

At WWDC 2025, Apple introduced a major step forward in on-device artificial intelligence with the launch of its new Foundation Models framework. Designed to give developers access to AI capabilities without relying on the cloud, the framework lets apps tap into Apple’s AI models directly on-device, boosting privacy, speed, and offline availability.
On-Device Intelligence, Developer-Friendly
Craig Federighi, Apple’s Senior VP of Software Engineering, took the stage to demonstrate how the Foundation Models framework works hand-in-hand with Apple Intelligence, the company’s suite of AI models that powers features across iOS, iPadOS, and macOS. The focus: helping developers create smarter, more personal app experiences that don’t compromise user privacy.
“For example,” Federighi said, “if you’re getting ready for an exam, an app like Kahoot can create a personalized quiz from your notes to make studying more engaging.” Importantly, this process happens entirely on the user’s device—avoiding cloud API costs and keeping personal data local.
Minimal Code, Maximum Power
Apple highlighted how simple it is to integrate these models into apps. Thanks to native Swift support, developers can get started with as few as three lines of code, making on-device AI integration remarkably accessible.
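To give a concrete sense of what that minimal integration might look like, here is a brief sketch. It assumes the session-based API shape Apple demonstrated, with a LanguageModelSession that is prompted via respond(to:); the exact names and signatures may differ in the shipping SDK.

    import FoundationModels

    // Ask the on-device model to turn study notes into quiz questions.
    func makeQuiz(from notes: String) async throws -> String {
        // A session backed by Apple's on-device foundation model.
        let session = LanguageModelSession()

        // The prompt is processed entirely on-device; nothing is sent to a server.
        let response = try await session.respond(
            to: "Turn these study notes into three quiz questions:\n\(notes)"
        )
        return response.content
    }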
The framework also includes built-in support for features like guided generation and tool calling, giving developers robust building blocks for dynamic, responsive app experiences.
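Guided generation is the piece that lets the model fill in a developer-defined Swift type rather than returning free-form text. The following sketch assumes the @Generable and @Guide macros and the respond(to:generating:) call shown in Apple’s developer sessions; the names and details here are illustrative, not definitive.

    import FoundationModels

    // A Swift type the model is asked to populate directly (guided generation).
    @Generable
    struct QuizQuestion {
        @Guide(description: "A short multiple-choice question")
        var prompt: String

        @Guide(description: "Exactly four answer options")
        var options: [String]

        @Guide(description: "The index of the correct option")
        var correctIndex: Int
    }

    func generateQuestion(from notes: String) async throws -> QuizQuestion {
        let session = LanguageModelSession()
        // The framework constrains the model's output to the QuizQuestion shape,
        // so there is no JSON to parse by hand.
        let response = try await session.respond(
            to: "Write one quiz question based on: \(notes)",
            generating: QuizQuestion.self
        )
        return response.content
    }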
Already in Use
Apple noted that companies like Automattic and AllTrails are already using the framework. Automattic’s Day One journaling app is enhancing user reflections with personalized AI-generated content, while AllTrails is using Apple’s models to suggest customized hiking routes.
Availability
The Foundation Models framework is available for testing today through the Apple Developer Program. A public beta is expected to roll out early next month, opening the door for broader experimentation and feedback.