Background information

Apple NeuralHash vs. privacy – Pandora’s box is opened

Apple is rolling out a system that detects images of child abuse on smartphones. Following a manual check, the authorities are supposed to be notified. While this feature sounds good, it’s a privacy nightmare.

Apple wants to take action against child pornography and child abuse. This is to be done using a system that analyses the images on your device. If it deems a photo suspicious, it’s sent to a human for review. Then, the National Center for Missing and Exploited Children (NCMEC) is alerted (in the United States). On Thursday, Apple published a white paper on this system. It’s called NeuralHash and will be rolled out in the upcoming versions of iOS 15 and iPadOS 15.

In doing so, Apple is opening up a can of worms: child safety vs. privacy. How should the two be weighed against each other?

The facts

There aren’t many facts to speak of beyond the white paper; the questions NeuralHash raises aren’t about a technology and its implementation. They’re about human rights. So, here’s an overview of all you need to know to understand the rest of the article. Please note: further in the article, I use Apple to refer to all technology companies, and NeuralHash to refer to all systems that work on similar principles.

Your iPhone already analyses images, as does your Android phone

Your smartphone already analyses images based on their content. You can easily check this yourself. Open your photo gallery app and go to the search function. If you search for «cat», pictures of your cat, as well as other cats, will pop up.

The photo gallery app loads so-called fingerprints onto your smartphone, i.e. parameters that define the appearance of a cat. Some of these parameters you can define yourself. When I pull up a photo of video producer Stephanie Tresch, Apple Photos doesn’t know who the woman in the picture is.

But I can assign a name to her. Then the data record «woman» becomes the data record «woman, Stephanie Tresch». As soon as Apple Photos knows that the «woman» is also «Stephanie Tresch», a fingerprint is created for «Stephanie Tresch». This fingerprint includes things like eye colour, cheekbones, lips, eyebrows, locations, locations with objects (for example, «Zürich Hardbrücke» + «2 p.m.» + «blue coat» + «camera» shows me pictures of Stephanie from behind, camera at the ready).
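
Under the hood, a search like this can be imagined as comparing feature vectors – the «fingerprints» – against a similarity threshold. The following sketch is purely illustrative: the vectors, photo names and the 0.8 cut-off are invented and have nothing to do with Apple’s actual implementation.

# Purely illustrative sketch of a gallery search by «fingerprint» similarity.
# All vectors, file names and the 0.8 threshold are invented.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(y * y for y in b) ** 0.5
    return dot / (norm_a * norm_b)

# Imaginary fingerprints: one per photo, produced by some image-recognition model.
photo_fingerprints = {
    "IMG_0001.jpg": [0.9, 0.1, 0.3],
    "IMG_0002.jpg": [0.2, 0.8, 0.5],
}
cat_fingerprint = [0.88, 0.15, 0.25]  # what the model thinks «cat» looks like

matches = [name for name, fp in photo_fingerprints.items()
           if cosine_similarity(fp, cat_fingerprint) > 0.8]
print(matches)  # the photos the gallery would show for the search «cat»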

Images have a number

Each image file – like any other file – can be boiled down to a number, a so-called hash value. This hash is much smaller than the image file itself, which makes it efficient to work with at the database level.

When you send a selfie to your best friend, that picture gets assigned a hash value. If she forwards it, the forwarded image has the same hash. But if the image is changed (for example, recoloured to black and white), the hash changes with it.

The hash makes it possible for a unique image to be tracked if it’s deemed «not okay» by a central authority. Especially in the case of illicit pornography, where images are constantly being shared, tracking via image hash can make finding the perpetrator a quick affair. But if the image is constantly being modified, it becomes more difficult to track through hashing.

And hashes can be manipulated by simple means.
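
Both properties are easy to demonstrate with an ordinary cryptographic hash: identical files share a hash value, while the slightest change – a single altered byte is enough – produces a completely different one. A minimal sketch using standard SHA-256 (nothing Apple-specific, and the file name is hypothetical):

import hashlib

original = open("selfie.jpg", "rb").read()   # the photo you sent
forwarded = original                         # forwarded unchanged: same bytes
modified = original[:-1] + b"\x00"           # one altered byte at the end

print(hashlib.sha256(original).hexdigest())   # hash of the original
print(hashlib.sha256(forwarded).hexdigest())  # identical – same bytes, same hash
print(hashlib.sha256(modified).hexdigest())   # completely different hash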

What is NeuralHash?

Apple has officially confirmed the NeuralHash system. It’s supposed to work similarly to the search function in your photo gallery app: the system automatically indexes images according to certain criteria. In our scenario, these are images of «child abuse». If NeuralHash finds a certain number of images that match the «child abuse» fingerprint, those images are decrypted and sent to Apple.

Apple’s partner in fighting child abuse opposes end-to-end encryption.
Source: missingkids.org

Once the individual viewing these images confirms the suspicion, the National Center for Missing and Exploited Children (NCMEC) is alerted. By its own account, this organisation works with authorities worldwide and opposes the widespread and unquestioned use of end-to-end encryption for the sake of child welfare.

According to page 5 of the white paper, NeuralHash creates a number based on the content of the image as opposed to its raw file data, as is the case with a normal hash. NeuralHash looks at the image, understands what it depicts, and generates a value based on that. This means that even if a colour image is converted to black and white, its NeuralHash doesn’t change. It doesn’t change if the file size changes, either. Or if the image is cropped. This is Apple’s way of circumventing hash manipulation by users.
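
NeuralHash itself is Apple’s own, neural-network-based take on this idea. The underlying principle of a content-based («perceptual») hash can be sketched with the much simpler, classic average-hash technique: shrink the image to a tiny greyscale thumbnail and record which pixels are brighter than average. Recolouring or resizing the photo barely changes the result, unlike a normal file hash. This is only an illustration of the principle, not Apple’s algorithm, and the file names are hypothetical.

from PIL import Image  # requires the Pillow package

def average_hash(path, size=8):
    # Shrink to an 8×8 greyscale thumbnail, then mark each pixel as above/below average brightness.
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    average = sum(pixels) / len(pixels)
    bits = "".join("1" if p > average else "0" for p in pixels)
    return hex(int(bits, 2))

# A black-and-white or resized copy yields (almost) the same value,
# whereas an ordinary file hash would change completely.
print(average_hash("photo.jpg"))
print(average_hash("photo_black_and_white.jpg"))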

Technologically speaking, it’s not necessary for an Apple employee to know your Apple ID. All they have to do is determine whether or not the offence (child abuse) has actually taken place. The link between image and user can happen in the depths of Apple’s databases; it doesn’t necessarily require the person responsible for double-checking to know that image A is from person A. They only have to answer the question «Does image A show child abuse?» if the hash of the image doesn’t already confirm that it does.
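
Sketched out, such a separation could look like the following – a purely hypothetical data model, not taken from the white paper: the reviewer only ever sees a case number and the flagged image, while the mapping to an Apple ID is kept elsewhere and only consulted once the suspicion is confirmed.

# Purely hypothetical data model – not taken from Apple's white paper.
review_queue = [
    {"case_id": 4711, "image": "flagged_image_a.jpg"},  # all the reviewer gets to see
]
case_to_account = {
    4711: "apple-id-of-person-a",  # stored separately, looked up only after confirmation
}

def review(case, reviewer_confirms_abuse):
    # The reviewer answers exactly one question: does the image show child abuse?
    if reviewer_confirms_abuse:
        return ("report to NCMEC", case_to_account[case["case_id"]])
    return ("close ticket", None)

print(review(review_queue[0], reviewer_confirms_abuse=False))  # ('close ticket', None)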

Privacy is a human right

A human right is something that everyone everywhere is, or should be, entitled to at all times. It’s inalienable, i.e. it can’t be relinquished or sold. Human rights are also indivisible: it’s not possible to advocate human right A but oppose human right B.

The United Nations clearly defines privacy as a human right in Article 12 of the Universal Declaration of Human Rights:

No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks.
Article 12, Universal Declaration of Human Rights

Apple agrees with this and has declared privacy a cornerstone of the corporation:

Privacy is a fundamental human right. At Apple, it’s also one of our core values.
apple.com/privacy, 6 August 2021

Google sings a similar song. Facebook – a social network whose core mechanism is the presentation of formerly private data to the public – also goes to great lengths to ensure that nothing bad happens with your data.

The Apple effect

Apple is the most valuable brand in the world in the year 2021. With value comes power. When Apple does something, it has a signalling effect. And the rest of the tech world needs Apple far more than Apple needs them.

Here's an example of this in action: Tile has been making Bluetooth trackers since 2013. It never managed a breakthrough because, among other things, Tile lacks a wide network of devices that aid in tracking your lost object. In 2021, Apple introduced the AirTag. Thanks to the «Find My» network, there were immediately one billion devices in the world ready to locate your AirTag. Tile is now allowed to use the «Find My» network. The fact is, Tile needs Apple far more than Apple needs Tile.

The so-called Apple effect – power coupled with a signalling effect – shouldn’t be underestimated. When Apple introduces a feature, the competition that already had it earlier suffers. It puts the competitors in a tight spot.

The problems

NeuralHash is making waves in the information security and privacy activist communities. These debates deal not just with the technology itself, but above all with the ethical and philosophical questions that arise as a consequence of it.

Encryption with a bypass

If Apple is to search your images for signs of child abuse, the encryption on your iCloud backups must be bypassed. To do this, a «child abuse» fingerprint is created in NeuralHash and integrated directly into the Photos app.

According to page 4 of the white paper, the Photos app then scans the pictures on your iPhone, elegantly bypassing any and all encryption. Apple has found a way to have its cake and eat it, too. It can both have the encryption and bypass it, because the images on your iPhone are stored in plaintext.

This is called a «backdoor». So far, Apple can still point to some level of encryption – even if complete encryption isn’t a given: the FBI and the U.S. government have put pressure on Apple, and as it stands, Apple may already be able to decrypt iCloud backups. The backdoor, as far as we know, can only be modified through Apple’s official update channel.
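
What «scanning before encryption» boils down to can be sketched in a few lines. Everything below is invented for illustration – the function names, the fingerprint database, the stand-in crypto – and is not Apple’s code:

from hashlib import sha256

def neural_hash(image_bytes):
    # Stand-in for the real content-based hash.
    return sha256(image_bytes).hexdigest()

def encrypt(image_bytes, key):
    # Stand-in for the real upload encryption (a toy XOR, not real cryptography).
    return bytes(b ^ key for b in image_bytes)

KNOWN_BAD_HASHES = set()  # the fingerprint database shipped with the Photos app

def upload_photo(image_bytes, key):
    flagged = neural_hash(image_bytes) in KNOWN_BAD_HASHES  # the scan runs here, on the unencrypted image
    ciphertext = encrypt(image_bytes, key)                  # only afterwards is the image encrypted for upload
    return ciphertext, flagged

print(upload_photo(b"\x01\x02\x03", key=42))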

NeuralHash can be arbitrarily configured

For NeuralHash to be effective against child abuse, someone must define what child abuse looks like in photos. A system threshold must also be defined; does the «child abuse» fingerprint only need «black eye», or are both «black eye» + «bloody lip» required to sound the alarm?
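
To make that configurability tangible, here’s a deliberately crude sketch. The criteria and the threshold are arbitrary inventions – which is precisely the problem:

# Entirely hypothetical – the criteria and the threshold are arbitrary choices.
FINGERPRINT = {"black eye", "bloody lip"}  # someone has to fill this set
THRESHOLD = 2                              # someone has to pick this number

def raises_alarm(detected_signs):
    return len(FINGERPRINT & set(detected_signs)) >= THRESHOLD

print(raises_alarm({"black eye"}))                # False with this configuration
print(raises_alarm({"black eye", "bloody lip"}))  # True – only because THRESHOLD was set to 2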

Who decides this? And based on what? At what point is child abuse bad enough?

To make matters worse, according to page 3 of the white paper, NeuralHash is invisible, and you have no control over it.

A stitch in time saves nine

Once NeuralHash is unleashed on humanity, there’s no stopping it. If the photo gallery app can recognise Stephanie Tresch from behind based on her blue coat and the fact that she’s holding a camera at a certain location, then the very same search function can be applied to pretty much anything and anyone.

Who can promise us that Apple won’t abuse this feature?

NeuralHash can be used for good as well as for evil. Sure, the next target after child abuse could be skin cancer. On the flip side, Apple could also create a gigantic archive of dick pics, given that NeuralHash allows for images to be decrypted and transmitted.

Information security experts such as cryptography professor Matthew Green fear that NeuralHash is an attack on end-to-end encryption.

Apple developing mass surveillance

The Financial Times reports that, according to Ross Anderson, Professor of Security Engineering at the University of Cambridge, this is a horrendous idea because it leads to mass surveillance of our laptops and smartphones.

With NeuralHash, you’re constantly being monitored by your own smartphone – a device indispensable in modern everyday life. The surveillance is centralised and determined by people over whom you have no control. Even the law has only a limited effect on mechanisms like NeuralHash. For one, because the legislators of this world are quite interested in backdoors. In addition, legislatures in a democracy are often reactive and slow. By the time the law has taken a solid, long-term look at NeuralHash, the system will not only have been unleashed on humanity, but will have evolved further.

The police are keen on NeuralHash

The executive branch in any system has a strong interest in making NeuralHash a success. Such a system would allow terrorism suspects to be caught before they commit a crime. If the system is expanded to include location data, criminals could be apprehended based on their location at the time of the crime. Given a fingerprint of «scene of the crime» + «clothing» + «time of the crime», NeuralHash could create a profile of the perpetrator and his or her movements; perhaps he or she can be seen in a selfie someone took at the café across the street from the crime scene. This would greatly simplify the work of the police and allow justice to be served more efficiently.

Technology is politics, politics is technology

It’s possible that governments could force Apple to inject NeuralHash – configured according to arbitrary factors – into all iPhones in their country. Here’s an illustrative example: the Chinese government might say, «Apple, you can’t sell iPhones in China unless NeuralHash helps us track down Uyghurs.»

If Apple were to oppose this, it would lose a large market. If Apple were to go along with it, people would suffer as a result.

Apple’s decisions have signalling power. Matthew Green, Associate Professor of Computer Science at Johns Hopkins University, told the Financial Times that this would open the door wide; governments, he warned, will demand the feature from everyone.

This really puts the Apple effect to the test and possibly sets an unpalatable precedent. If Apple goes along with it, there’s little to stop Poland from using NeuralHash to hunt down homosexuals, or Iran from hunting down women who don’t wear a hijab.

False positives

When NeuralHash looks for images of violence, it looks for signs that can come about not just through violence. A black eye, bloody lip and the like also happen in sports. UFC fighter Cristiane Justino Venâncio aka Cris Cyborg looks badly banged up after – and sometimes already during – her matches.

Cris Cyborg during her normal working day. Would NeuralHash sound the alarm here?

If she has enough pictures of herself bleeding from the nose or with a black eye, NeuralHash will strike. A number of images will be transmitted to Apple, where a human will view them. In the context of Cris Cyborg’s everyday life, these «signs of abuse» are nothing out of the ordinary. The Apple employee would recognise this and close the corresponding ticket. Regardless, the employee saw the pictures.

A false positive is not only a violation of a human right; it’s one that happens for no reason. Someone at Apple just has a look-see at some pictures from the fighter’s private gallery. For no reason.

The question at hand is whether so-called false positives are okay. And if so, how many times are false positives allowed to occur before NeuralHash is considered to be going too far? Who determines this number? Who monitors it?

On page 3 of the white paper, Apple describes the number of false positives as «extremely low». The next page talks about a 1 in 1 trillion chance per year of incorrectly flagging a given account. In addition, all reports are supposed to be reviewed by a human prior to being submitted to the NCMEC.
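
The trillion figure rests on the threshold: several images have to match before an account is flagged, so even a modest per-image error rate shrinks dramatically at the account level. A back-of-the-envelope calculation shows the principle – the numbers below are invented, since Apple doesn’t publish the real per-image error rate:

from math import comb

# Invented numbers – Apple does not publish the real per-image error rate.
p_false_match = 1e-6  # assumed chance that one harmless photo matches by accident
photos = 10_000       # photos in one library
threshold = 3         # assumed number of matches needed before a human looks

# Chance that at least `threshold` photos match purely by accident (binomial tail).
p_fewer = sum(comb(photos, k) * p_false_match**k * (1 - p_false_match)**(photos - k)
              for k in range(threshold))
print(1 - p_fewer)  # roughly 1.7e-7 with these made-up numbers – tiny, but not zero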

A question of trust

Can we trust Apple?

Apple has made a good name for itself by making privacy the foundation of the company. But Apple is under no obligation to uphold it.

Ideology in companies is always a marketing tool. Because what counts for companies – no matter how often a CEO repeats it – is not how much good the company can do for the world. The only thing that counts is the money. Apple must sell iPhones. Apple sets itself growth targets that it must meet. As soon as these figures no longer work, Apple changes course. As would any other company.

Car manufacturers are currently undergoing precisely such a change in course. Just ten years ago, Tesla was dismissed as a fun novelty. Now, entire car brands are switching from gasoline to electricity. For the environment’s sake, of course. Certainly not because lawmakers are passing more and more restrictive laws for gasoline vehicles, offering tax breaks to buyers of electric cars, and – not least – because people want to buy electric cars.

So, what’s the guarantee that Apple will forever remain the protector of privacy it is today?

The verdict

Despite all the noble goals Apple wants to pursue with NeuralHash, the feature poses too great a risk to the human right of privacy. Apple can’t be trusted to that extent, and neither can any other company. Lawmakers, least of all. And an organisation that actively opposes end-to-end encryption shouldn’t be given a mechanism to bypass encryption without oversight. No matter the motives.

Privacy must remain unaffected. There must be no exceptions. Human rights also apply to criminals.

And even if an exception did a lot of good, that very exception would set a precedent that doesn’t allow for a happy ending.
