
An upcoming feature in iOS, iPadOS, watchOS, and macOS will introduce Apple’s latest system to combat child abuse. The system will scan user photos stored on iPhones and on iCloud for images depicting the exploitation of children.

The machine learning-powered system, called neuralMatch, will compare scanned photos against known images of child sexual abuse compiled by the US National Center for Missing & Exploited Children. Flagged photos will then be submitted to a team of human reviewers, who verify whether they show online sexual abuse and exploitation of children (OSAEC).
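To make the flag-then-review workflow concrete, here is a minimal sketch in Python. It is not Apple’s neuralMatch, which reportedly uses machine learning to match image content rather than raw file bytes; the SHA-256 digests, the `KNOWN_ABUSE_HASHES` set, and the `photos` directory below are all hypothetical, used only to illustrate matching a photo library against a database of known hashes and queuing matches for human review.

```python
import hashlib
from pathlib import Path

# Hypothetical database of digests for known abuse imagery. In the system
# described above, the reference data would come from the US National Center
# for Missing & Exploited Children, not a hard-coded set like this.
KNOWN_ABUSE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def file_hash(path: Path) -> str:
    """Return the SHA-256 digest of a file's raw bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def scan_library(photo_dir: Path) -> list[Path]:
    """Flag photos whose hashes match the known database, for human review."""
    flagged = []
    for photo in photo_dir.glob("*.jpg"):
        if file_hash(photo) in KNOWN_ABUSE_HASHES:
            flagged.append(photo)
    return flagged


if __name__ == "__main__":
    for match in scan_library(Path("photos")):
        print(f"Flagged for human review: {match}")
```

The key design point the sketch mirrors is that matching happens automatically against a fixed list of known material, and anything flagged goes to people rather than triggering action on its own.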

While it’s a noble effort for the sake of the children, Apple’s plan to scan user-uploaded photos has alarmed security researchers, who warn that neuralMatch could be abused for surveillance or exploited by hackers and authoritarian governments.

Here in the Philippines, the Senate recently approved the Anti-OSAEC bill amid rising cases of online child abuse.

You can learn more about it here.

