At WWDC 2025 earlier this year, Apple introduced the Foundation Models framework, which lets developers use the company's local AI models to power features in their applications.
The company touted that the framework gives developers access to these models without any inference cost, and that the local models come with capabilities such as guided generation and tool calling built in.
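For a sense of what the framework looks like in practice, here is a minimal Swift sketch of a basic request to the on-device model. The prompt and function name are invented examples, but LanguageModelSession, SystemLanguageModel, and respond(to:) are the framework's actual entry points.

```swift
import FoundationModels

// A minimal sketch of prompting the on-device model (iOS 26 and later).
func suggestTitle(for entry: String) async throws -> String? {
    // The model is only usable when Apple Intelligence is enabled and the
    // model assets have been downloaded, so check availability first.
    guard case .available = SystemLanguageModel.default.availability else {
        return nil
    }

    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Suggest a short title for this journal entry: \(entry)"
    )
    return response.content // Plain text generated entirely on device
}
```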
As iOS 26 rolls out to all users, developers have been updating their apps with features powered by Apple's local AI models. Because Apple's models are small compared with the leading models from OpenAI, Anthropic, Google, and Meta, these local-only features tend to be quality-of-life improvements rather than major changes to an app's workflow.
Below are some of the first apps to tap into Apple’s AI framework.
The Lil Artist app offers various interactive experiences to help kids learn skills like creativity, math, and music. With the iOS 26 update, developer Arima Jain shipped an AI story creator: users select a character and a theme, and the app generates a story. The developer said the story's text is generated by Apple's local model.

The developer of this daily planner app is working on a prototype that automatically suggests an emoji for each timeline event based on its title.
Finance tracking app MoneyCoach has two neat features powered by the local models. First, the app surfaces insights about your spending, such as whether you spent more than average on groceries in a given week. Second, it automatically suggests a category and subcategory for a spending item, making for quicker entries.
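That second feature maps neatly onto the framework's guided generation, where the model's output is constrained to a Swift type the developer defines. The sketch below is purely illustrative, with a hypothetical SpendingCategory type rather than anything from MoneyCoach's actual code.

```swift
import FoundationModels

// Hypothetical schema for illustration; not MoneyCoach's real data model.
@Generable
struct SpendingCategory {
    @Guide(description: "A top-level budget category, such as Groceries or Transport")
    var category: String

    @Guide(description: "A more specific subcategory, such as Coffee or Fuel")
    var subcategory: String
}

func suggestCategory(for item: String) async throws -> SpendingCategory {
    let session = LanguageModelSession()
    // Guided generation constrains decoding to the @Generable schema, so the
    // app gets typed values back instead of free-form text it has to parse.
    let response = try await session.respond(
        to: "Categorize this expense: \(item)",
        generating: SpendingCategory.self
    )
    return response.content
}
```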

This word learning app has added two new modes using Apple's AI models. A new learning mode uses the local model to generate example sentences for a word, then asks users to explain how the word is used in each sentence.

The developer is also using on-device models to generate a map view of a word’s origin.

Like a few other apps on this list, the Tasks app uses the local models to automatically suggest tags for an entry. It also uses them to detect recurring tasks and schedule them accordingly. And users can dictate a few thoughts and have the local model break them down into individual tasks, all without using the internet.
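Splitting dictated text into discrete to-dos is another natural fit for guided generation, since arrays of a @Generable type can be generated directly. Here is a rough sketch under assumed names; TaskItem and the instructions string are invented for illustration, not taken from the Tasks app's code.

```swift
import FoundationModels

// Hypothetical task type for illustration.
@Generable
struct TaskItem {
    @Guide(description: "A single, actionable to-do")
    var title: String
}

func extractTasks(from transcript: String) async throws -> [TaskItem] {
    // Instructions steer the session's behavior across all of its requests.
    let session = LanguageModelSession(
        instructions: "Split the user's note into separate, self-contained tasks."
    )
    // Generating an array yields however many items the text implies, and
    // everything runs on device, so no network connection is needed.
    let response = try await session.respond(
        to: transcript,
        generating: [TaskItem].self
    )
    return response.content
}
```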

Automattic-owned journaling app Day One uses Apple's models to surface highlights and suggest titles for your entries. The team has also built a feature that generates prompts nudging you to dive deeper and write more, based on what you have already written.

This recipe app uses Apple Intelligence to suggest tags for a recipe and to name timers. It also uses AI to break a block of text into easy-to-follow cooking steps.
This digital signing app uses Apple's local models to extract key insights from a contract and give users a summary of the document they are signing.
We will continue updating this list as we discover more apps using Apple’s local models.