iPhones will scan photo libraries for child-abuse images, fans aren’t happy

The effort to fight the spread of child abuse imagery on the internet has been an uphill battle in recent years, although initiatives like the Internet Watch Foundation have helped to knock illegal content off mainstream websites. But what happens when the type of image scanning these initiatives use is built directly into something like an iPhone?

Apple iPhones will scan for CSAM

Apple recently announced that it will be incorporating methods to detect child sexual abuse material (CSAM) into iOS. Working with the National Center for Missing and Exploited Children (NCMEC), iPhones and other Apple products will scan photo libraries to detect illegal content.

Initiatives like the IWF and NCMEC hash known illegal images so that they're easily detectable on platforms like social media. Apple will use an on-device method that checks images and videos in a user's library against those known hashes. If it detects known CSAM content, NCMEC will be notified.
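
To picture how this kind of hash matching works in general, here's a minimal, purely illustrative Swift sketch. It uses an ordinary SHA-256 digest and a placeholder hash set; Apple's actual system relies on its own perceptual "NeuralHash" and a blinded on-device database, neither of which is shown or public here.

```swift
import Foundation
import CryptoKit

// Hedged illustration only: this is NOT Apple's implementation. It just
// shows the general idea of comparing an image file's hash against a set
// of known hashes.
let knownHashes: Set<String> = [] // placeholder: would come from NCMEC-derived hash lists

func sha256Hex(of fileURL: URL) throws -> String {
    // Hash the raw file bytes and render the digest as lowercase hex.
    let data = try Data(contentsOf: fileURL)
    return SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

func matchesKnownHash(_ fileURL: URL) -> Bool {
    // An image is "flagged" only if its hash already appears in the known set.
    guard let digest = try? sha256Hex(of: fileURL) else { return false }
    return knownHashes.contains(digest)
}
```

Note that a plain cryptographic hash like this only matches byte-identical files; a perceptual hash such as NeuralHash is designed to survive resizing and re-compression, which is why Apple uses one.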

This process runs any time an image is uploaded to iCloud Photos, something that happens automatically for many users. Apple claims that it isn't actually looking at anyone's images: because everything is done through hashes and metadata, nothing is actually revealed.

Apple claims that the process has proven to be incredibly accurate, with “a one in one trillion chance per year of incorrectly flagging”. However, when an account does get flagged, Apple will manually review the flagged images. If a flagged image is confirmed to be CSAM content, the user’s account is disabled and reported to NCMEC.


iPhones also get new explicit content protections

Apple isn't just building CSAM detection into iPhones and iPads. Using machine learning, a user’s iPhone will be able to scan image attachments to detect sexually explicit content. This will be backed with new tools to “warn children and their parents when receiving or sending sexually explicit photos.”

If the iPhone detects something that appears to be explicit, “the photo will be blurred and the child will be warned”. If a child views a sexually explicit image, parents will be notified. Furthermore, if a child attempts to send explicit material, they will be warned, and parents will be notified if it's sent.
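
As a rough illustration of that decision flow (not Apple's actual implementation, which isn't public), a sketch might look something like this in Swift, with the classifier, the confidence threshold and the notification hooks all stood in by hypothetical placeholders:

```swift
import Foundation

// Rough sketch of the blur-and-warn flow described above. The classifier,
// the 0.9 threshold and the notification hooks are assumptions for
// illustration only; Apple's on-device model and Messages integration are
// not public API.
struct ExplicitImageCheck {
    let explicitThreshold = 0.9 // assumed confidence cut-off

    // Stand-in for an on-device classifier returning a 0...1 confidence score.
    func explicitScore(for imageData: Data) -> Double {
        return 0.0 // placeholder: a real model would score the image here
    }

    func handleIncoming(_ imageData: Data, isChildAccount: Bool) {
        guard isChildAccount, explicitScore(for: imageData) >= explicitThreshold else {
            return // show the attachment normally
        }
        blurAttachment()        // hypothetical: render the photo blurred
        warnChild()             // hypothetical: show the on-screen warning
        notifyParentsIfViewed() // hypothetical: parental alert if the child opens it
    }

    func blurAttachment() {}
    func warnChild() {}
    func notifyParentsIfViewed() {}
}
```

The key point the sketch captures is that the check happens entirely on the device, and nothing leaves the phone unless the child goes ahead and views or sends the flagged photo.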

Image: iPhones will now tell you when there's an explicit image.


Users aren't happy

While Apple’s new method of rooting out child abuse imagery certainly sounds well-intentioned, there has been massive backlash against it. Across every platform the story has spread to, most commenters agree that Apple is overstepping, and many have compared the situation to George Orwell’s “Big Brother”.

In recent years, Apple has built its brand on privacy. In the wake of this news, many iPhone users feel that Apple is betraying that promise. While the CSAM scanning appears to have the best of intentions, the majority of Apple fans aren't sold on the idea.

