Apple’s keynote was «shot on iPhone» – what that means exactly

Samuel Buchmann
1 November 2023
Translation: Patrik Stainbrook

Apple has recorded an entire event with the iPhone 15 Pro. It’s good marketing – but mainly a lesson in how unimportant the camera is in a major film production.

Yesterday, Apple presented new Macs at its Scary Fast event. A small note at the end may have surprised many: according to Apple, the entire keynote was «shot on iPhone». More precisely, with an iPhone 15 Pro Max. Now the Californians reveal what the set looked like in a behind-the-scenes video.

First of all, brilliant marketing. But what does it actually say about the iPhone’s capabilities? On closer inspection of the keynote and the behind-the-scenes video, I notice four things:

1. Smartphones have come a long way

Before all the Apple haters and video nerds start mashing keys in anger: don’t worry, I will be putting things in perspective. But first, I want to acknowledge unequivocally how good the video quality of the iPhone 15 Pro actually is. Over 90 per cent of the audience probably didn’t notice that the keynote was being filmed with a smartphone. And this despite the fact that Apple has accustomed us to absolute high-end productions in recent years.

I don’t know what other cameras Apple uses for keynotes. But I’m guessing it’s professional devices such as the Arri Alexa or Sony Venice. Among smartphones, only the iPhone 15 Pro can reproduce the look of such large cameras to some extent. In ProRes format with a log colour profile, its videos are suitable for detailed colour grading for the first time – without falling apart completely.

This is important, as the shots only look this good thanks to elaborate post-production in editing programs. If Apple had shot the videos in the iPhone’s usual standard format and not edited them, they’d have been oversharpened and oversaturated. The less compressed ProRes format also sets the iPhone apart from other smartphones. However, it does result in huge amounts of data and is really only suitable for professionals. Without colour grading, the images are practically unusable.
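The «huge amounts of data» are easy to quantify. As a rough illustration (the bitrates below are ballpark figures, not official specs), a quick calculation compares ProRes with the iPhone’s default HEVC recording:

```python
# Rough estimate of video storage needs. The bitrate figures are approximate;
# actual rates vary with resolution, frame rate and scene content.

def gigabytes_per_minute(bitrate_mbps: float) -> float:
    """Convert a video bitrate in Mbit/s to GB of storage per minute."""
    return bitrate_mbps * 1e6 * 60 / 8 / 1e9

# ProRes 422 HQ at 4K/30 is in the ballpark of 700 Mbit/s,
# versus very roughly 50 Mbit/s for a default HEVC recording.
prores = gigabytes_per_minute(700)  # ≈ 5.25 GB per minute
hevc = gigabytes_per_minute(50)     # ≈ 0.38 GB per minute

print(f"ProRes: {prores:.2f} GB/min, HEVC: {hevc:.2f} GB/min")
```

In other words, ProRes fills storage more than ten times faster than a standard recording – which is why it’s really only practical with an external SSD and a professional workflow.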

2. Apple’s keynotes are like Hollywood movies

Whether filmed with the iPhone or not, Apple’s product presentations are as elaborate and professionally produced as Hollywood films. Every location is meticulously lit. Every article of clothing is carefully selected. Every word is carefully chosen. The camera practically never stands still. Instead, it moves smoothly and subtly when someone speaks – or rapidly for dramatic effect.

When switching from one product to the next, Apple doesn’t just use a cut, but transitions that’ll make any video editor blush. The colour look of the shots appears natural yet characteristic. It isn’t only consistent within a keynote, but across several events.

The films were born out of necessity. During Covid, the usual in-person events were out of the question. Since the end of the pandemic, Apple has been inviting media professionals to Cupertino again. But even they still only get to see pre-recorded presentations. This way, Apple also eliminates unforeseen events such as the famous network overload that annoyed Steve Jobs in 2010.

3. The camera isn’t that important

Anyone who thinks they can reproduce the quality of the keynote with an iPhone in their own backyard is very much mistaken. iPhone or not, Apple brings out the big guns in production. The behind-the-scenes video shows equipment costing well over CHF 100,000, and the crew on set numbers several dozen people.

«Shot on iPhone» – on top of lots of other expensive equipment.
Source: Screenshot YouTube/Apple

The small iPhone is clamped into a gigantic rig on a swivel arm, which provides butter-smooth camera movements – or it hangs from a drone. The quality of recordings is checked on large external monitors. Professional microphones record the sound.

And then there’s the lighting: behind the scenes, video specialist Jon Carr mentions a challenging low-light scenario. He’s referring to the dark Halloween theme of the keynote. Luckily, huge LED panels illuminate Tim Cook as he steps out of the smoke in the intro. They’re perfectly balanced with the already exceptionally good lighting of the Apple Campus in the background.

The set looks like that of a Hollywood movie. An entire crew ensures optimal conditions for the iPhone.
Source: Screenshot YouTube/Apple

In this scenario, the camera isn’t as important any more. Thanks to the homogeneous brightness, the small dynamic range of the tiny smartphone sensor is hardly noticeable. Even slightly less sharpness or more image noise doesn’t matter. All the more because the videos are highly compressed anyway for streaming via YouTube or on Apple’s own website.

4. At second glance, weaknesses become apparent

Despite the remarkable video quality of the iPhone 15 Pro and all the expensive equipment, a closer look reveals differences from keynotes filmed with large cameras. The gradients from light to dark tones aren’t quite as smooth. In dark areas, fewer structures are recognisable, probably due to noise-reduction filters. The skin tones are also a little less natural – Tim Cook’s nose is sometimes quite red, for example.

Skin tones aren’t quite as harmonious as usual and the image is sharp front to back. Structures in dark areas are lost, such as the bushes at the top right, but this shouldn’t bother anyone in this scene.
Source: Screenshot YouTube/Apple

However, the iPhone’s biggest weakness is its deep depth of field. In shots from the main camera, the person in focus doesn’t stand out from the background at all – everything is sharp, front to back. Even at longer focal lengths, there’s virtually no separation. On the iPhone, a shallow depth of field is only possible digitally using Cinematic mode. However, it isn’t available in ProRes format and is rather mediocre anyway. That’s probably why Apple didn’t use it at all.
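The physics behind this can be sketched with a back-of-the-envelope calculation: depth of field scales with the full-frame «equivalent aperture», i.e. the lens’s f-number multiplied by the sensor’s crop factor. The figures below are approximate, not official Apple specs:

```python
# Why smartphone footage is sharp front to back: the small sensor's crop
# factor multiplies the effective f-number. A larger equivalent aperture
# number means a deeper depth of field. Figures are approximate.

def equivalent_aperture(f_number: float, crop_factor: float) -> float:
    """Full-frame equivalent aperture for a given lens and sensor."""
    return f_number * crop_factor

# iPhone 15 Pro main camera: f/1.78, crop factor of roughly 3.5 vs full frame
iphone = equivalent_aperture(1.78, 3.5)      # ≈ f/6.2 in full-frame terms

# A full-frame cinema camera at f/2.8 keeps its f/2.8 (crop factor 1.0)
cinema = equivalent_aperture(2.8, 1.0)       # f/2.8

print(f"iPhone: ~f/{iphone:.1f}, full-frame camera: f/{cinema:.1f}")
```

So even wide open, the iPhone behaves roughly like a full-frame camera stopped down past f/6 – enough depth of field to keep the background sharp, with no optical way to blur it.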

In other keynotes (here from the WWDC in June), Apple used depth of field specifically to separate a person from the background. This only works with large cameras.
Source: Screenshot YouTube/Apple

True, in previous keynotes Apple never relied on exaggerated bokeh effects either, even with large cameras. But the backgrounds were usually slightly blurred. This looks nicer and guides the audience’s gaze. As a fan of high production value, I hope the iPhone stunt remains a one-off. All the more so because the headline could create false expectations among customers.

Header image: Screenshot, YouTube/Apple
