Apple is walking the personal intelligence tightrope
by Samuel Buchmann
Siri becomes the centrepiece of Apple's "personal intelligence": AI functions that are deeply integrated into the ecosystem. Despite the collaboration with OpenAI, Apple promises that personal data will always remain protected.
Under the name "Apple Intelligence", Apple is integrating artificial intelligence (AI) into its operating systems for iPhone, iPad and Mac. At the WWDC developer conference, the Californians presented numerous use cases, some of them developed in collaboration with OpenAI. The most important AI features at a glance:
In integrating AI functions, Apple plays to its greatest strength: control over the entire ecosystem of devices and apps. Other AI models usually work in isolation as standalone apps or, at best, are woven into a single operating system, as with Microsoft.
Apple Intelligence, on the other hand, can access the entire context of all devices that are linked to an Apple ID. This allows personalised responses and actions. For example, Siri can find a restaurant reservation in a text message and create a calendar entry from it.
Apple seems to be aware that many users will frown at the thought of granting an AI access to private information. The Californians have therefore focussed on data protection. The promise: personal information always remains protected. Simple requests are processed directly on the device. If the computing power of a server is required, the data is sent exclusively to special Apple servers. Allegedly, nothing is stored in the process.
The exception is requests for which you want ChatGPT's support. If you ask Siri a question, for example, it checks whether the external chatbot could help - and asks your permission before forwarding anything. Only then does the request go to OpenAI, with your IP address obscured.
Where a prompt is processed depends on its complexity. Simple things are handled directly on the device by Apple's chips - provided the device meets the minimum requirements. Complex requests go to servers powered by more capable Apple silicon chips.
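To make this division of labour more tangible, here is a minimal sketch in Swift of how such a routing decision could look. It is purely illustrative: the type names, the complexity threshold and the consent check are assumptions for this example, not Apple's actual implementation or API.

```swift
import Foundation

/// Illustrative sketch only - these types and checks are assumptions,
/// not Apple's real API or internal logic.
enum PromptDestination {
    case onDevice            // simple requests, handled by the local model
    case privateCloudCompute // complex requests, sent to Apple's own servers
    case chatGPT             // external knowledge, only with explicit consent
}

struct PromptRouter {
    /// Hypothetical cut-off between "simple" and "complex" prompts.
    let complexityThreshold = 0.5

    func route(complexity: Double,
               needsExternalKnowledge: Bool,
               userConsentsToChatGPT: () -> Bool) -> PromptDestination {
        // Per the article: nothing goes to OpenAI without the user's permission.
        if needsExternalKnowledge && userConsentsToChatGPT() {
            return .chatGPT
        }
        // Otherwise the request stays within Apple's infrastructure,
        // split by how demanding it is.
        return complexity <= complexityThreshold ? .onDevice : .privateCloudCompute
    }
}

// Example: a demanding prompt without external knowledge ends up on Apple's servers.
let router = PromptRouter()
print(router.route(complexity: 0.8,
                   needsExternalKnowledge: false,
                   userConsentsToChatGPT: { false })) // privateCloudCompute
```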
For tasks that require detailed external knowledge, Apple is collaborating with OpenAI. The "Compose" function in writing apps can call on ChatGPT, based on the latest GPT-4o model. This amounts to a kind of integration of the well-known chatbot into Apple's own apps. If you send a request to OpenAI, Apple only passes on the necessary information.
Siri is a centrepiece of Apple's new AI. The voice assistant has not previously been known for being particularly intelligent. That is set to change: Apple promises that Siri now understands spoken instructions naturally - even if you change or correct them mid-sentence. Siri also remembers previous requests and can use them as context for follow-up commands.
Siri can be controlled not only by voice commands, but also by text input. The assistant has access to the information on the device and to the screen content. Depending on the instruction, it switches between apps independently. For example, if you need your passport number to book a flight, Siri can find it in a photo in your media library, copy it and paste it in the right place.
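Under the hood, such cross-app actions require apps to expose their data and actions to the system. Apple's existing App Intents framework is a plausible mechanism for this, though the presentation did not go into developer details. The sketch below only shows the general shape of such an intent; the travel-app scenario, the type name and the returned value are invented for illustration.

```swift
import AppIntents

// Hypothetical example: an intent a travel app could expose so that an
// assistant can fetch a stored passport number on request. Names and the
// returned value are made up; only the AppIntent shape is real Apple API.
struct FetchPassportNumberIntent: AppIntent {
    static var title: LocalizedStringResource = "Fetch Passport Number"
    static var description = IntentDescription("Returns the passport number stored in the app.")

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        // A real app would read this from secure storage.
        let storedNumber = "X1234567"
        return .result(value: storedNumber)
    }
}
```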
Siri's voice doesn't seem to have changed. Unlike Google and OpenAI, Apple has decided not to "humanise" the assistant.
Generative AI functions are finding their way into numerous apps, following concepts already familiar from other models. Smart Reply suggests quick replies and recognises questions in an email to ensure that everything gets answered. "Rewrite" suggests different versions of a text in various styles. "Proofread" checks grammar, word choice and sentence structure and makes editing suggestions that you can review and then accept. The "Writing Tools" are available in the following apps:
Apple Intelligence generates not only text, but also images. In the Notes app, for example, "Image Wand" can be used to convert sketches into pretty illustrations - similar to Microsoft's new AI functions in Paint.
A feature called "Image Playground" sounds like a conventional image generator that creates an image from keywords. As everything is calculated directly on the device, you shouldn't expect miracles - it seems to be more of a gimmick for chats. The new "Genmojis" - customised emojis generated from a short text prompt or from photos in your media library - fall into the same category.
Image editing in Apple's Photos app also receives a few AI-supported functions. However, they are modest in comparison to other programmes such as Adobe Photoshop. The only example cited by Apple is the removal of distracting objects using the "Cleanup Tool".
"Summarize" can summarise texts or information. Just like Google Gemini or ChatGPT. Apple integrates this function directly into native apps. For example, the preview of an email does not simply display the first sentence, but a summarised version of the content. The same works in notifications on the lock screen - or with voice memos, which the AI can transcribe and summarise.
The search function is also set to become smarter and more powerful. You can search images for specific content using a natural-language prompt such as "Katie with stickers on her face". This works not only in photos, but also in videos.
Apple Intelligence is free of charge - including use of the ChatGPT functions. An OpenAI account is not required, but can optionally be linked. The AI features work on Macs and iPads with an M1 chip or newer, as well as on the iPhone 15 Pro and iPhone 15 Pro Max.
The beta version of Apple Intelligence will be rolled out with the new operating systems from autumn: iOS 18, iPadOS 18 and macOS Sequoia. The functions will initially only be available in English - if you set Siri and your device language to English, you can also use them in Switzerland and Germany. Support for other languages is set to follow in 2025.