
Support for US authorities
Paradigm shift? Apple automatically searches iPhones for child pornography


The new measures are also intended to prevent young people themselves from sending sexual content (symbol image)

© ljubaphoto / Getty Images

To help authorities fight child pornography, iPhones in the US will soon scan their storage for known images. Users' privacy remains protected, the company emphasizes. Critics fear that could change quickly.

For years, Apple has promoted its devices as offering particularly strong privacy protection, a stance that has not won it friends everywhere. Now critics fear that a new measure could mark a paradigm shift. At first, the plan sounds like something the vast majority of people would welcome: Apple wants to automatically detect images of child abuse on its devices.

The company made the announcement on Thursday (local time). The features, presented as "advanced child protection measures," will arrive as part of iOS 15 in the fall, according to the company. They will warn parents when their children send potentially problematic photos via iMessage, detect abuse images stored on the iPhone, and block searches for problematic terms, the announcement explains, accompanied by detailed technical documentation.

Matching without evaluating the images

It is the second measure in particular that has drawn attention: to detect possible abuse imagery, the iPhone will automatically match the images on the device against corresponding databases as soon as photo backup via iCloud is turned on, Apple explains in a document. In effect, the iPhone automatically checks all the pictures and videos stored on the device.

Looking at the measure in detail, however, it turns out that Apple does take its users' privacy into account. Instead of evaluating the images themselves, the system compares a so-called hash value against a database of known abuse imagery. This hash is a kind of digital fingerprint, calculated individually for each image.
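To illustrate the principle, here is a minimal Swift sketch of hash-based matching. It is not Apple's implementation: the real system uses a perceptual "NeuralHash" and a private set intersection protocol, while this sketch substitutes a plain SHA-256 digest and a hypothetical KnownImageDatabase type for clarity.

```swift
import CryptoKit
import Foundation

// Illustrative only: Apple's actual system uses a perceptual "NeuralHash" and a
// private set intersection protocol. Here a plain SHA-256 digest stands in for
// the fingerprint, and the database type is hypothetical.
struct KnownImageDatabase {
    /// Fingerprints of known abuse images, supplied by an organization such as NCMEC.
    let knownHashes: Set<String>

    /// Returns true if the image's fingerprint appears in the database.
    func matches(_ imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let fingerprint = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(fingerprint)
    }
}
```

In practice a perceptual hash is used rather than a cryptographic one, so that harmless edits such as resizing or recompression do not change an image's fingerprint.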

Protection against false suspicion

The procedure has two advantages for Apple. First, the company does not have to evaluate its users' images itself, only compare values, so the content of the images remains secret and privacy is preserved. Second, Apple itself does not have to store a collection of child sexual abuse material on its servers for comparison. Instead, the database is supplied by the National Center for Missing and Exploited Children (NCMEC).

To avoid false reports against innocent users caused by coincidentally identical hash values – Apple puts the probability at one in a trillion – the company has built in another safeguard. The system does not raise an alarm on a single hit, but only once an unspecified minimum number of matches is reached. Only then is the account flagged for manual review by employees. If the suspicion of abuse is actually confirmed, the account is suspended and a report is made to law enforcement.
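A minimal sketch of such threshold-based flagging, assuming a simple per-account counter; Apple has not disclosed the actual threshold, and the real system relies on cryptographic safety vouchers rather than a plain counter, so both the structure and the value below are illustrative.

```swift
// Threshold-based flagging sketch; the threshold value is hypothetical.
struct AccountScanState {
    var matchCount = 0
    let reviewThreshold: Int   // unspecified by Apple; illustrative only

    /// Records one database hit and reports whether the account should now be
    /// surfaced for manual review.
    mutating func recordMatch() -> Bool {
        matchCount += 1
        return matchCount >= reviewThreshold
    }
}

var state = AccountScanState(reviewThreshold: 10)   // hypothetical value
if state.recordMatch() {
    // Escalate to human review; suspension and a report to law enforcement
    // would follow only if the suspicion of abuse is confirmed.
}
```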

Fear that the dam will break

Nevertheless, privacy advocates view the measure extremely critically. "They're using it to send a (very influential) signal that it's OK to search users' phones for prohibited content," computer science professor Matthew Green of Johns Hopkins University wrote on Twitter. "They send this message to the competition, to China and to you."


The most frequently voiced fear concerns the database of hash values. At the moment, it is provided by a non-governmental organization for a clearly defined purpose. Technically, however, it would take little effort to fill the same infrastructure with other data, such as images of political protests or depictions of homosexuality, which is illegal in many countries. Once the technical capability exists, so the common argument goes, it may become harder for Apple to fend off corresponding demands from individual governments. "This will break the dam – governments will demand it of everyone," Green told the Financial Times.

“This is a back door”

There are also critical questions about the protection feature for nude images sent by minors. According to Apple, it is activated automatically when an iPhone has been set up with parental controls as part of the family feature. If a picture is sent, it is scanned for possible nudity, and the child is warned that the parents will be informed if it is sent anyway. The idea behind this is understandable: sending nude pictures of minors is illegal in the USA even if both sender and recipient are underage themselves, and many young people have already gotten into serious legal trouble as a result.
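The flow described above can be summarized in a short, purely illustrative sketch; Apple has not published an API for this feature, and every name here is hypothetical.

```swift
// Hypothetical decision flow for the iMessage nudity warning described above.
enum OutgoingImageAction {
    case sendNormally
    case warnChildBeforeSending   // parents are informed if the child sends anyway
}

func handleOutgoingImage(isLikelyNude: Bool, parentalControlsEnabled: Bool) -> OutgoingImageAction {
    // Only accounts set up with parental controls via the family feature are affected.
    guard parentalControlsEnabled, isLikelyNude else { return .sendNormally }
    return .warnChildBeforeSending
}
```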

But here, too, critics see potential for abuse. "It is impossible to build a scanning system that can only be used for sexual images of children," the renowned digital rights organization Electronic Frontier Foundation (EFF) criticized on Twitter. "Even a well-intentioned implementation undermines the basic principles of encryption and opens a backdoor to broader misuse of the system."

Not everyone judges Apple so harshly. By scanning directly on the device, the system is less invasive than if the images were first scanned on servers, computer security expert Alan Woodward of the University of Surrey told the Financial Times. Other services such as Facebook look for problematic content only when it is uploaded. Apple's approach is preferable, Woodward believes: "This decentralised approach is the best way to go in this direction."

Sources: Apple, Financial Times, Washington Post
