PRESS RELEASE June 9, 2025

Apple Intelligence gets even more powerful with new capabilities across Apple devices

Developers can now access the Apple Intelligence on-device foundation model to power private, intelligent experiences within their apps
Apple’s latest product lineup displayed new Apple Intelligence features.
At WWDC25, Apple unveiled new Apple Intelligence features coming to iPhone, iPad, Mac, Apple Watch, and Apple Vision Pro.
CUPERTINO, CALIFORNIA Apple today announced new Apple Intelligence features that elevate the user experience across iPhone, iPad, Mac, Apple Watch, and Apple Vision Pro. Apple Intelligence unlocks new ways for users to communicate with features like Live Translation; do more with what’s on their screen with updates to visual intelligence; and express themselves with enhancements to Image Playground and Genmoji.1 Additionally, Shortcuts can now tap into Apple Intelligence directly, and developers will be able to access the on-device large language model at the core of Apple Intelligence, giving them direct access to intelligence that is powerful, fast, built with privacy, and available even when users are offline. These Apple Intelligence features are available for testing starting today, and will be available to users with supported devices set to a supported language this fall.
“Last year, we took the first steps on a journey to bring users intelligence that’s helpful, relevant, easy to use, and right where users need it, all while protecting their privacy. Now, the models that power Apple Intelligence are becoming more capable and efficient, and we’re integrating features in even more places across each of our operating systems,” said Craig Federighi, Apple’s senior vice president of Software Engineering. “We’re also taking the huge step of giving developers direct access to the on-device foundation model powering Apple Intelligence, allowing them to tap into intelligence that is powerful, fast, built with privacy, and available even when users are offline. We think this will ignite a whole new wave of intelligent experiences in the apps users rely on every day. We can’t wait to see what developers create.”
Apple Intelligence features will be coming to eight more languages by the end of the year: Danish, Dutch, Norwegian, Portuguese (Portugal), Swedish, Turkish, Chinese (traditional), and Vietnamese.

Live Translation Breaks Down Language Barriers

For those moments when a language barrier gets in the way, Live Translation can help users communicate across languages when messaging or speaking. The experience is integrated into Messages, FaceTime, and Phone, and enabled by Apple-built models that run entirely on device, so users’ personal conversations stay personal.
In Messages, Live Translation can automatically translate messages. If a user is making plans with new friends while traveling abroad, their message can be translated as they type, delivered in the recipient’s preferred language, and when they get a response, each message can be instantly translated.2 On FaceTime calls, a user can follow along with translated live captions while still hearing the speaker’s voice. And when on a phone call, the translation is spoken aloud throughout the conversation.3

New Ways to Explore Creativity with Updates to Genmoji and Image Playground

Genmoji and Image Playground provide users with even more ways to express themselves. In addition to turning a text description into a Genmoji, users can now mix together emoji and combine them with descriptions to create something new. When users make images inspired by family and friends using Genmoji and Image Playground, they can change expressions or adjust personal attributes, like hairstyle, to match their friend’s latest look.
Enhancements to Genmoji give users more ways to express themselves with the option to mix and combine emoji with descriptions to create something new.
In Image Playground, users can tap into brand-new styles with ChatGPT, like an oil painting style or vector art. For moments when users have a specific idea in mind, they can tap Any Style and describe what they want. Image Playground sends a user’s description or photo to ChatGPT and creates a unique image. Users are always in control, and nothing is shared with ChatGPT without their permission.
The new Any Style option in Image Playground with ChatGPT, displayed on MacBook Pro.
New styles come to Image Playground with ChatGPT, including an Any Style option, for moments when users have a specific idea in mind.

Visual Intelligence Helps Users Search and Take Action

Building on Apple Intelligence, visual intelligence extends to a user’s iPhone screen so they can search and take action on anything they’re viewing across their apps.
Apple Intelligence updates extend visual intelligence to the content on users’ iPhone screen, allowing users to take actions like searching for similar items using Google, Etsy, or other supported apps.
Visual intelligence already helps users learn about objects and places around them using their iPhone camera, and it now enables users to do more, faster, with the content on their iPhone screen. Users can ask ChatGPT questions about what they’re looking at on their screen to learn more, as well as search Google, Etsy, or other supported apps to find similar images and products. If there’s an object a user is especially interested in, like a lamp, they can highlight it to search for that specific item or similar objects online.
Visual intelligence enables users to ask ChatGPT for details on specific objects.
Visual intelligence also recognizes when a user is looking at an event and suggests adding it to their calendar.4 Apple Intelligence then extracts the date, time, and location to prepopulate these key details into an event.
Visual intelligence can also recognize when a user is looking at an event and suggest adding it to their Calendar, and Apple Intelligence will extract the relevant data to create an event.
Users can access visual intelligence for what’s on their screen by simply pressing the same buttons used to take a screenshot. Users will have the choice to save or share their screenshot, or explore more with visual intelligence.

Apple Intelligence Expands to Fitness on Apple Watch

Workout Buddy is a first-of-its-kind workout experience on Apple Watch with Apple Intelligence that incorporates a user’s workout data and fitness history to generate personalized, motivational insights during their session.5
To offer meaningful inspiration in real time, Workout Buddy analyzes data from a user’s current workout along with their fitness history, based on data like heart rate, pace, distance, Activity rings, personal fitness milestones, and more. A new text-to-speech model then translates insights into a dynamic generative voice built using voice data from Fitness+ trainers, so it has the right energy, style, and tone for a workout. Workout Buddy processes this data privately and securely with Apple Intelligence.
Workout Buddy will be available on Apple Watch with Bluetooth headphones, and requires an Apple Intelligence-supported iPhone nearby. It will be available starting in English, across some of the most popular workout types: Outdoor and Indoor Run, Outdoor and Indoor Walk, Outdoor Cycle, HIIT, and Functional and Traditional Strength Training.
Workout Buddy comes to Apple Watch with Apple Intelligence, a first-of-its-kind workout experience that incorporates a user’s workout data and personal fitness history to generate personalized insights and motivation.

Apple Intelligence On-Device Model Now Available to Developers

Apple is opening up access for any app to tap directly into the on-device foundation model at the core of Apple Intelligence.
The Foundation Models framework allows developers to tap directly into the on-device foundation model at the core of Apple Intelligence, giving them access to intelligence that is powerful, fast, built with privacy, and available when users are offline.
With the Foundation Models framework, app developers will be able to build on Apple Intelligence to bring users new experiences that are intelligent, available when they’re offline, and that protect their privacy, using AI inference that is free of cost. For example, an education app can use the on-device model to generate a personalized quiz from a user’s notes, without any cloud API costs, or an outdoors app can add natural language search capabilities that work even when the user is offline.
With native support for Swift, the Foundation Models framework allows app developers to easily tap into the Apple Intelligence model with as few as three lines of code.
The framework has native support for Swift, so app developers can easily access the Apple Intelligence model with as few as three lines of code. Guided generation, tool calling, and more are all built into the framework, making it easier than ever to implement generative capabilities right into a developer’s existing app.
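As an illustration of how compact that can be, the following sketch shows the shape of a minimal request to the on-device model, based on the Foundation Models framework as presented at WWDC25. The prompt text and the `notes` variable are hypothetical, and the exact API surface may differ in the shipping SDK:

```swift
import FoundationModels

// Hypothetical input: a user's study notes, as in the education-app example above.
let notes = "Photosynthesis converts light energy into chemical energy…"

// Create a session with the on-device foundation model and request a response.
// All inference runs on device, so this works offline and incurs no cloud API cost.
let session = LanguageModelSession()
let response = try await session.respond(
    to: "Generate a three-question quiz from these notes: \(notes)"
)
print(response.content)
```

Because the model runs entirely on device, the same call succeeds with no network connection, which is what enables the offline experiences described above.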

Shortcuts Get More Intelligent

Shortcuts are now more powerful and intelligent than ever. Users can tap into intelligent actions, a whole new set of shortcuts enabled by Apple Intelligence. Users will see dedicated actions for features like summarizing text with Writing Tools or creating images with Image Playground.
Now users will be able to tap directly into Apple Intelligence models, either on-device or with Private Cloud Compute, to generate responses that feed into the rest of their shortcut, maintaining the privacy of information used in the shortcut. For example, a student can build a shortcut that uses the Apple Intelligence model to compare an audio transcription of a class lecture to the notes they took, and add any key points they may have missed. Users can also choose to tap into ChatGPT to provide responses that feed into their shortcut.
The Shortcuts app displayed on MacBook Pro.
Shortcuts are supercharged with Apple Intelligence, allowing users to tap directly into Apple Intelligence models, either on-device or with Private Cloud Compute, to generate responses that feed into the rest of a shortcut.

Additional New Features

Apple Intelligence is even more deeply integrated into the apps and experiences that users rely on every day:
  • Reminders can now identify the most relevant actions in an email, website, note, or other content, and automatically categorize them.
  • Apple Wallet can now identify and summarize order tracking details from emails sent from merchants or delivery carriers. This works across all of a user’s orders, giving them the ability to see their full order details, progress notifications, and more, all in one place.
  • Users can create a poll for anything in Messages, and with Apple Intelligence, Messages can detect when a poll might come in handy and suggest one. In addition, Backgrounds in the Messages app lets a user personalize their chats with stunning designs, and they can create unique backgrounds that fit their conversation with Image Playground.
These features build on a wide range of Apple Intelligence capabilities that are already available to users:
  • Writing Tools can help users rewrite, proofread, and summarize the text they have written. And with Describe Your Change, users can describe a specific change they want to apply to their text, like making a dinner party invite read like a poem.
  • Clean Up in Photos allows users to remove distracting elements while staying true to the moment as they intended to capture it.
  • Visual intelligence builds on Apple Intelligence and helps users learn about objects and places around them instantly.
  • Genmoji allow users to create their own emoji by typing a description. And just like emoji, they can be added inline to messages, or shared as a sticker or reaction in a Tapback.
  • Image Playground gives users a way to create playful images in moments, with concepts like themes, costumes, accessories, and places. And they can add their own text descriptions, and create images in the likeness of a family member or friend using photos from their photo library.
  • Image Wand can transform a rough sketch into a polished image that complements a user’s notes.
  • Mail summaries give users a way to view key details for an email or long thread by simply tapping or clicking Summarize.
  • Smart Reply provides users with suggestions for a quick response in Mail and Messages.
  • Siri is more natural and helpful, with the option to type to Siri and tap into its product knowledge about the features and settings on Apple products; Siri can also follow along if a user stumbles over their words, and maintain context from one request to the next.
  • Access to ChatGPT is integrated in Writing Tools and Siri, giving users the option to tap into ChatGPT’s image- and document-understanding capabilities without needing to jump between tools.
  • Natural language search in Photos makes it easier for users to find a photo or video by simply describing it.
  • Users can create a memory movie in Photos by typing a description.
  • Summaries of audio transcriptions in Notes are automatically generated to surface important information at a glance.
  • Users can generate summaries of call transcriptions to highlight important details.
  • Priority Messages, a section at the top of the inbox in Mail, shows the most urgent emails, like a same-day invitation to lunch or a boarding pass.
  • Priority Notifications appear at the top of a user’s notifications, highlighting important notifications that may require immediate attention.
  • Notification summaries give users a way to scan long or stacked notifications and provide key details right on the Lock Screen.
  • Previews in Mail and Messages show users a brief summary of key information without needing to open a message.
  • The Reduce Interruptions Focus surfaces only the notifications that might need immediate attention.

A Breakthrough for Privacy in AI

Designed to protect users’ privacy at every step, Apple Intelligence uses on-device processing, meaning that many of the models that power it run entirely on device. For requests that require access to larger models, Private Cloud Compute extends the privacy and security of iPhone into the cloud to unlock even more intelligence so a user’s data is never stored or shared with Apple; it is used only to fulfill their request. Independent experts can inspect the code that runs on Apple silicon servers to continuously verify this privacy promise, and are already doing so. This is an extraordinary step forward for privacy in AI.
Availability
All of these new features are available for testing starting today through the Apple Developer Program at developer.apple.com, and a public beta will be available through the Apple Beta Software Program next month at beta.apple.com. Users who enable Apple Intelligence on supported devices set to a supported language will have access this fall, including all iPhone 16 models, iPhone 15 Pro, iPhone 15 Pro Max, iPad mini (A17 Pro), and iPad and Mac models with M1 and later, with Siri and device language set to the same supported language: English, French, German, Italian, Portuguese (Brazil), Spanish, Japanese, Korean, or Chinese (simplified). More languages will be coming by the end of this year: Danish, Dutch, Norwegian, Portuguese (Portugal), Swedish, Turkish, Chinese (traditional), and Vietnamese. Some features may not be available in all languages or regions, and availability may vary due to local laws and regulations. For more details, visit apple.com/apple-intelligence.

  1. Genmoji and Image Playground are available in English, French, German, Italian, Portuguese (Brazil), Spanish, and Japanese.
  2. Live Translation in Messages supports English (U.S., UK), French (France), German, Italian, Japanese, Korean, Portuguese (Brazil), Spanish (Spain), and Chinese (simplified).
  3. Live Translation in Phone and FaceTime is available for one-on-one calls in English (U.S., UK), French (France), German, Portuguese (Brazil), and Spanish (Spain).
  4. The ability to add an event to Calendar with visual intelligence is available in English on all iPhone 16 models, iPhone 15 Pro, and iPhone 15 Pro Max.
  5. Workout Buddy will be available on Apple Watch Series 6 or later, Apple Watch SE (2nd generation), and Apple Watch Ultra and Ultra 2 with an Apple Intelligence-supported iPhone starting in English.

Press Contact

Apple Media Helpline

media.emea@apple.com