Nude photo detection: Apple announces new parental control features




Apple has unveiled new features aimed at better protecting children and adolescents on the company’s devices. Photo: Matthias Schrader / AP / dpa

One new feature is intended to help identify child pornography material in users’ possession. However, it will initially be introduced only for US users in the autumn.

Apple wants to improve the protection of children and adolescents on its devices with new features. Among other things, parents will in future be able to receive a warning when their child receives or sends nude photos in Apple’s iMessage chat service.

Nudity in the images is detected by software on the device itself, Apple explained; the company itself does not learn of it.

Another new feature, which will initially be introduced only for US users in the autumn, is intended to help identify child pornography material in users’ possession. Before photos on a user’s device are uploaded to Apple’s iCloud online storage service, they are compared against a database of already known child pornography images. For this comparison, a file containing so-called “hashes” of such photos is loaded onto the devices, a kind of digital fingerprint of each image. A hash makes it possible to recognize a matching photo, but the image itself cannot be reconstructed from it.
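The matching principle described above can be sketched in a few lines. This is a simplified illustration, not Apple’s actual NeuralHash algorithm: a toy “average hash” turns an image into a short fingerprint, and the on-device check is a lookup against the set of known hashes. Image loading and scaling are omitted; the sketch hashes a tiny grayscale pixel grid directly, and all function names are illustrative.

```python
# Illustrative sketch (NOT Apple's NeuralHash): a hash is a compact
# fingerprint of an image; matching is a set lookup against known hashes.

def average_hash(pixels):
    """pixels: flat list of grayscale values (e.g. an 8x8 downscaled image).
    Returns a bit string: '1' where a pixel is brighter than the mean."""
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

def matches_known_database(pixels, known_hashes):
    """On-device check: compare the image's hash against the database of
    known hashes. Only the yes/no match result exists, not the photo."""
    return average_hash(pixels) in known_hashes

known = {average_hash([10, 200, 30, 220])}
print(matches_known_database([10, 200, 30, 220], known))  # True: same image
print(matches_known_database([200, 10, 220, 30], known))  # False: different image
```

Note that the hash is one-way, as the article says: knowing the bit string “0101” tells you nothing about the original pixel values, so the image cannot be reconstructed from the database.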

In the event of a match, the suspect image is given a certificate that allows Apple, as an exception, to open it after it has been uploaded to iCloud and subject it to review. If child pornographic material is actually found, Apple reports it to the American non-governmental organization NCMEC (National Center for Missing & Exploited Children), which in turn can involve the authorities.
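The decision flow the article describes can be summarized as a short sketch. This is a hedged illustration of the reported process only; the function and label names are hypothetical and do not reflect Apple’s internal API.

```python
# Hedged sketch of the reported flow: only a hash match carries a
# certificate that permits review; only confirmed material is reported.
# All names here are illustrative, not Apple's implementation.

def handle_upload(image_hash, known_hashes, human_review):
    """Returns the action taken for one image uploaded to iCloud."""
    if image_hash not in known_hashes:
        return "stored normally"        # no match: Apple cannot open it
    # match: the image's certificate allows exceptional review
    if human_review(image_hash):        # reviewer confirms known material
        return "reported to NCMEC"      # NCMEC may involve authorities
    return "stored normally"            # false positive: nothing reported

print(handle_upload("abc", {"abc"}, lambda h: True))   # reported to NCMEC
print(handle_upload("xyz", {"abc"}, lambda h: True))   # stored normally
```

The key property is that human review and reporting are only ever reachable after a database match; unmatched photos are never opened.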

While the feature is enabled only for Apple customers with US accounts, the file containing the hashes is a fixed part of the operating system and is loaded onto all iPhones that install this version of the software. Before the function is introduced internationally, legal requirements must first be clarified.

dpa
