Apple CEO Tim Cook on Monday opened the company’s annual developer conference by talking up a forthcoming racing film called “F1”.
It might seem odd for a company famous for hardware to open a major event with a chat about a movie, but SVP of software engineering Craig Federighi soon tied it to Apple’s core products. In a racing-themed sketch that nodded to his nickname – “Hair Force One” – he removed his racing helmet to reveal an exaggerated mane of silver hair.
The gag underscored the focus of the keynote – the visual redesign of Apple’s operating systems.
Before Apple execs provided a walkthrough of the redesign, Federighi called out Apple Intelligence – a suite of on-device AI tools and a developer framework for tapping large language models first unveiled at WWDC 2024. However, not everyone was impressed: One user sued the iGiant for false advertising in March.
“This year, we’re doing something new, and we think it’s gonna be pretty big,” said Federighi. “We’re opening up access for any app to tap directly into the on-device large language model at the core of Apple Intelligence, with a new Foundation Models Framework.”
For developers committed to the restraints of the Apple developer ecosystem, the Foundation Models Framework provides a way to integrate Apple Intelligence services into their apps.
The advantage of doing so is that Apple’s AI models reside on-device and work offline. For example, an iOS app could use this Swift-based framework to implement natural language search capabilities that would work on locally stored app data – no cloudy connection required.
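Based on what Apple has announced, the framework exposes a session object for prompting the on-device model from Swift. The sketch below shows roughly what an offline natural language search over local app data might look like – the `LanguageModelSession` type and its `respond(to:)` method come from the announced Foundation Models Framework, while the notes data, the `searchNotes` helper, and the prompt wording are purely illustrative assumptions, not tested code:

```swift
import FoundationModels  // Apple's announced framework; requires Apple Intelligence-capable hardware

// Hypothetical locally stored app data to search over
let notes = [
    "Pick up dry cleaning Tuesday",
    "Book flights to Austin for the F1 weekend",
    "Renew AppleCare before the end of the month",
]

// Illustrative helper: ask the on-device model to pick the best-matching note
func searchNotes(for query: String) async throws -> String {
    // A session wraps the on-device model – no network round trip, no API key
    let session = LanguageModelSession()
    let prompt = """
        From the following notes, return the one that best matches \
        the query "\(query)":
        \(notes.joined(separator: "\n"))
        """
    let response = try await session.respond(to: prompt)
    return response.content
}
```

Because inference happens locally, a call like this would work with no connectivity and no per-token bill.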
More significantly, because Apple’s models run on-device there’s no cloud inference cost, so developers who want to add AI to apps won’t have to pay for services from the likes of OpenAI and Anthropic.
Live Translation looks to be one of the more useful features coming to iThing Apple device users through Apple Intelligence. Available today to those in the Apple Developer Program and next month through the Apple Beta Software Program, it allows automatic translation of text or the captioning of speech in Messages, FaceTime, or the Phone app.
Apple has also extended visual intelligence, which lets users search for objects they photograph with an iPhone camera, to include the ability to query OpenAI’s ChatGPT about content on the iPhone screen.
Cupertino announced a few additional features for its operating systems.
iOS 26 will gain Shortcuts that work with Apple Intelligence, plus improved call screening. Spotlight autocompletions will help macOS 26 users automate various tasks.
Improved windowing and multitasking is coming to iPadOS 26. watchOS 26 will allow users to mute incoming calls, silence timers, and dismiss notifications with the Wrist Flick gesture.
Apple also announced a new convention for naming its operating systems, which currently use version numbers.
Starting later this year, the company will label its OSes by the following calendar year.
In the fall of 2025, users will therefore receive iOS 26, iPadOS 26, macOS Tahoe 26, watchOS 26, tvOS 26, and visionOS 26.
Next year’s OS releases will presumably be version 27.
UI makeover
Apple had much to say about its interface renovation, which brings a new look called “Liquid Glass”.
Alan Dye, Apple’s VP of human interface design, described it as “our broadest design update ever.”
The new look, as its name suggests, is glossy and translucent. And according to Dye, it possesses “a fluidity that only Apple can achieve.”

Apple’s new “Liquid Glass” UI offers icons “that render beautifully in light, dark, tinted, or clear looks” Image: Apple
Love it or hate it, the design rethink at least has the potential to make app interfaces more consistent across Apple products. For developers, the change means they must adapt their apps to the new design paradigm.
Even so, there’s a certain irony to talking up an interface refresh while rolling out Apple Intelligence services that reduce the need to tap, type, and manipulate interface controls. App interface details like slider sensitivity, icon size, and the arrangement of app controls become less important when AI models handle the input and output.
There are other aspects of software development that Apple could have chosen to focus on at its developer conference, like software quality, a persistent gripe in the Apple community. We can only hope the shiny, renumbered operating system updates tackle important bugs and improve overall performance. ®