At WWDC 2025, Apple shifted its tone, placing less emphasis on its AI initiative, Apple Intelligence, and instead spotlighted a suite of enhancements to its operating systems, services, and software. Central to the visual revamp was a new aesthetic Apple calls “Liquid Glass”, alongside updated naming conventions for its platforms. Yet, AI quietly remained in the picture.
Live Translation Across Calls and Messages
One of the more practical uses of Apple Intelligence is live translation, now integrated into Messages, FaceTime, and phone calls. Real-time translations — either as spoken words or on-screen captions — will make cross-language communication more fluid. During FaceTime calls, users can view translated captions live, while regular phone calls will include voice translations.
AI-Powered Call Features: Call Screening and Hold Assist
Apple also introduced AI features to improve phone call experiences:
- Call Screening allows the phone to answer unknown numbers in the background and relay the caller’s identity and intent.
- Hold Assist recognizes hold music and lets users navigate away from the call while waiting. Notifications will alert users when a real agent picks up.
Poll Suggestions in Messages
Group indecision gets a solution with AI-generated poll suggestions in the Messages app. If a conversation seems stuck — like choosing a restaurant — Apple Intelligence will propose a poll to simplify decision-making.
Smarter Shortcuts with AI
The Shortcuts app is gaining AI-powered actions. Users can fold Apple Intelligence into their custom shortcuts, for example adding an AI summarization step, and choose which AI model handles it during setup.
Visual Intelligence: Real-World Recognition with Contextual Search
Apple is expanding its on-device image analysis with Visual Intelligence, which can recognize objects, locations, clothing, and more, drawn either from your surroundings or from on-screen content. Whether it's identifying a jacket or looking up a restaurant, the tool performs searches through Google, ChatGPT, and other integrated apps. It can be activated from Control Center or via the Action button, and it launches with iOS 26 later this year.
ChatGPT Enhances Image Playground
ChatGPT is now built into Image Playground, Apple's AI image generator. Users can create visuals in stylized formats such as anime, oil painting, and watercolor, or send prompts directly to ChatGPT for customized results.
Workout Buddy: AI-Driven Motivation
The new Workout Buddy offers real-time, voice-assisted motivation using text-to-speech models. It highlights key milestones, average heart rate, and pace during workouts, and delivers personalized summaries post-exercise — simulating a digital personal trainer.
Spotlight Gets Contextually Aware
Apple’s Spotlight search on Mac now includes AI-based context awareness, enabling more relevant suggestions and quicker access to commonly used functions based on user behavior.
Foundation Models for Developers
Apple introduced its Foundation Models framework, which gives developers direct, on-device access to the models that power Apple Intelligence, even offline. This opens the door to more intelligent, privacy-respecting apps that can run advanced AI features without constant reliance on the cloud.
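To make this concrete, here is a minimal Swift sketch of how an app might ask the on-device model to summarize text. The type and method names (`SystemLanguageModel`, `LanguageModelSession`, `respond(to:)`) follow the session-based API Apple previewed; treat the exact signatures as illustrative rather than definitive.

```swift
import Foundation
import FoundationModels  // Apple's framework for the on-device foundation model

enum SummaryError: Error {
    case modelUnavailable
}

// Sketch: summarize text entirely on device, with no cloud round trip.
// Names follow the API Apple previewed; the shipping framework may differ.
func summarize(_ text: String) async throws -> String {
    // Confirm the on-device model can run on this hardware before prompting it.
    guard case .available = SystemLanguageModel.default.availability else {
        throw SummaryError.modelUnavailable
    }

    // A session carries instructions and conversation state across prompts.
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in two sentences."
    )

    // Send the prompt and return the generated text.
    let response = try await session.respond(to: text)
    return response.content
}
```

Because the model runs locally, a call like this works without a network connection, which is the privacy and latency case Apple is making for the framework.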
No Siri Overhaul Yet
Despite anticipation, major AI updates to Siri were absent. Apple SVP Craig Federighi acknowledged the delay, stating more information would arrive next year. The setback may fuel scrutiny as competitors accelerate AI assistant innovation.