Apple recently revealed its plan to scan US iPhones for images of child sexual abuse through a new digital tool called “neuralMatch.”
While the move has been lauded by child protection advocates, it has also drawn pushback from people concerned about privacy and government overreach.
According to reports, neuralMatch will scan pictures for known child sexual abuse imagery before they are uploaded to iCloud. "If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified," per the Associated Press.
The report also says that parents taking innocent photos of their children presumably needn't be concerned, because the system only flags images that already appear in the National Center for Missing and Exploited Children's database of known material.
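To make that last point concrete, here is a minimal conceptual sketch of matching uploads against a database of known image fingerprints. It is not Apple's implementation: the company's actual "neuralMatch" hash is a perceptual hash whose details are not described in the source, and the function names and SHA-256 stand-in below are illustrative assumptions only.

```python
import hashlib
from pathlib import Path

# Hypothetical set of fingerprints for already-known abusive images, analogous
# to the hashes NCMEC maintains. Real systems use perceptual hashes that
# tolerate resizing and re-encoding; SHA-256 here is only an illustrative stand-in.
KNOWN_IMAGE_HASHES = {
    "placeholder-fingerprint-1",
    "placeholder-fingerprint-2",
}


def fingerprint(image_path: Path) -> str:
    """Compute a fingerprint for an image file (exact-hash stand-in)."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()


def should_flag_for_review(image_path: Path) -> bool:
    """Return True only if the image matches an already-known entry.

    A brand-new photo a parent takes will not appear in the database,
    so it produces no match and is never flagged for human review.
    """
    return fingerprint(image_path) in KNOWN_IMAGE_HASHES
```

Because matching happens only against pre-existing fingerprints, a novel photo can never trigger a review; only copies of images already catalogued in the database would be surfaced to a human reviewer.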