Issue No. 108, August 2021
Apple Software Change Raises Privacy Concerns
As some of you have undoubtedly heard, Apple announced a new security measure for its popular iOS platform (the operating system that runs on iPhones) that has many privacy advocates more than a little concerned. At a fundamental level, Apple has said that it will check photos stored locally on an iPhone that has iCloud Photos enabled against a database of known child sexual abuse material (CSAM). Matches will then be verified by a human reviewer and, if confirmed, forwarded along to law enforcement.
On its face, this probably seems like a good idea. Indeed, law enforcement agencies are resoundingly behind the move. In truth, Apple is a bit late to the party in some respects: other image-hosting behemoths, like Google and Amazon, have been performing similar checks against photos uploaded to their cloud services for years. The critical difference here, and what has privacy advocacy groups concerned, is that Apple will perform those same checks on the device itself. Before we go further, it is important to explain a bit of the technical detail behind how these checks happen, and to do that, we need to understand a little about how computers deal with images.
To represent an image, a computer is given a set of instructions. These instructions tell the computer how to render the image by specifying what color to make each of the tiniest pieces of the photo, called pixels. A simple example would be having a computer display an image of a candy cane: you would tell it to draw red stripes alternating with white stripes, how many stripes there are, how wide each one is, and so on. Now, say you've taken two hundred pictures of your business' logo and you want your computer to eliminate the duplicates. This is a task that, while monotonous, is trivial for a human to accomplish but can be quite difficult for a computer. This is where something called hashing enters the picture.
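To make the pixel idea concrete, here is a minimal sketch in Python. The dimensions, colors, and stripe width are arbitrary values chosen only for illustration; a real photo is the same idea with millions of pixels.

# A tiny "candy cane" image represented as rows of RGB pixel values.
RED = (255, 0, 0)
WHITE = (255, 255, 255)

WIDTH, HEIGHT, STRIPE = 8, 4, 2  # 8 pixels wide, 4 tall, stripes 2 pixels wide

image = [
    [RED if (x // STRIPE) % 2 == 0 else WHITE for x in range(WIDTH)]
    for y in range(HEIGHT)
]

# Print a rough preview: R for a red pixel, W for a white one.
for row in image:
    print(" ".join("R" if pixel == RED else "W" for pixel in row))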
Think of a hash as a fingerprint for an image. A hash does not allow a computer to reconstruct the entire image, but by comparing the hashes derived from two images, the computer can tell whether those images are the same. To check whether an image is CSAM, image hosting providers compare the hashes of uploaded images against the hashes of known CSAM content. If a hash matches, the image is flagged for human review.
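As a rough sketch of that fingerprint-and-compare idea, the Python snippet below computes a hash for each photo and checks it against a hypothetical set of known flagged hashes. In practice, providers use perceptual hashes (for example, Microsoft's PhotoDNA or Apple's NeuralHash) that still match after an image is resized or re-encoded; a plain SHA-256 digest is used here only to keep the illustration simple, and the folder name and hash values are made up.

import hashlib
from pathlib import Path

# Hypothetical database of fingerprints of known flagged images
# (placeholder values, not real data).
KNOWN_FLAGGED_HASHES = {
    "placeholder_hash_value_1",
    "placeholder_hash_value_2",
}

def fingerprint(photo: Path) -> str:
    """Return a hex digest that acts as the photo's fingerprint."""
    return hashlib.sha256(photo.read_bytes()).hexdigest()

def is_flagged(photo: Path) -> bool:
    """Check a photo's fingerprint against the database of known hashes."""
    return fingerprint(photo) in KNOWN_FLAGGED_HASHES

# Usage sketch: queue any matches for human review.
for photo in Path("uploads").glob("*.jpg"):
    if is_flagged(photo):
        print(f"{photo} matches a known hash; queue for human review")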
You may be thinking to yourself: nothing wrong with that! On its face, that statement is correct, and most people certainly don't take umbrage at content providers using this technology to protect children from predators. Where things become concerning, however, is when you change where that hash check occurs. Apple has indicated that this scanning will take place locally on users' phones. That amounts to introducing a potentially exploitable backdoor into a user's device where none existed before. Additionally, some privacy advocates have raised the 'slippery slope' specter and cited U.K. internet service providers as their example. Some years ago, the United Kingdom rolled out a similar program across its internet service providers. While the hash check was initially scoped to look only for CSAM images, it has since been expanded to look for images of things like counterfeit watches.
In the end, Apple is a private company that manufactures both the device and the operating system that will use this new image scanning system, and it believes it is within its rights to make these types of changes to end-user devices. The modern era is one in which many, if not most, people value their privacy highly, and enabling a feature on someone's phone that can potentially give an outside party access to that device is concerning, even when it is well intentioned. It is important that consumers make informed choices, including about the types of information they willingly provide to the companies that manufacture their beloved mobile devices. Something that seems innocent on the surface can evolve into a real privacy concern.