
Apple will soon scan all iCloud Photos images for child abuse

 

Using a new on-device technology, Apple will detect illegal images while protecting user privacy.

Update: Apple has a Child Safety page that describes this new feature and how it works.

TechCrunch has confirmed that Apple will soon roll out a new technology to scan photos uploaded to iCloud for child sexual abuse material (CSAM). The rollout will happen later this year as part of a collection of technologies meant to make its products and services safer for children to use.

Most cloud services already scan images for material that violates their terms of service or the law, including CSAM. They can do this because, while the images may be stored encrypted, the companies hold the encryption keys. Apple encrypts photos in transit and stores them encrypted, but it keeps the decryption key so it can access them when necessary, for example to turn over iCloud data under subpoena or to make your iCloud photos viewable in a web browser.
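For context, a conventional server-side check might compare a cryptographic digest of each decrypted file against a database of known hashes. The sketch below is purely illustrative, not how any particular provider implements its scanning; the KNOWN_HASHES set and exact_match function are hypothetical names, and the digest shown is a placeholder. It also shows why an exact-match approach is fragile, which is the problem the perceptual hashing described next tries to solve.

```python
import hashlib

# Hypothetical database of SHA-256 digests of known prohibited images
# (placeholder value, for illustration only).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def exact_match(image_bytes: bytes) -> bool:
    """Return True if the file's SHA-256 digest is in the known-hash set."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

# Changing even a single byte (re-encoding, cropping, resizing) yields a
# completely different digest, so edited copies of a known image would
# slip past this kind of exact-match check.
```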

To help preserve user privacy, the company is relying on a new technology called NeuralHash that will check images as they are uploaded to iCloud Photos, looking for matches against a database of known child abuse imagery. It works entirely on your iPhone, iPad, or Mac by converting photos into a unique string of letters and numbers (a “hash”). Normally, any slight change to a photo would result in a different hash, but Apple’s technology is reportedly designed so that small alterations (such as a crop) still produce the same hash.
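NeuralHash itself is not public, so as a rough illustration of the general perceptual-hashing idea (not Apple’s actual algorithm), here is a minimal “difference hash” (dHash) sketch in Python using Pillow. The image is shrunk to a tiny grayscale grid and the hash records whether each pixel is brighter than its right-hand neighbor, so re-compressed or lightly cropped copies tend to produce the same or a very similar hash. The known_hash value and flag_for_review call in the usage comment are hypothetical.

```python
from PIL import Image  # pip install Pillow

def dhash(path: str, hash_size: int = 8) -> int:
    """Perceptual 'difference hash': tolerant of small edits, unlike SHA-256."""
    # Shrink to (hash_size + 1) x hash_size pixels and drop color information.
    img = Image.open(path).convert("L").resize(
        (hash_size + 1, hash_size), Image.LANCZOS
    )
    pixels = list(img.getdata())

    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            # Record 1 if this pixel is brighter than its neighbor, else 0.
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits; a small distance suggests the same image."""
    return bin(a ^ b).count("1")

# Hypothetical usage: hashes within a few bits are treated as a match.
# if hamming_distance(dhash("upload.jpg"), known_hash) <= 5:
#     flag_for_review()
```

In a matching system built this way, the comparison is a threshold on Hamming distance rather than exact equality, which is what lets near-duplicates of a known image still register as hits.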

 

Source: MacWorld
