How Apple plans to monitor users

Apple plans to use its new CSAM Detection system to scan users' devices and identify those who store child sexual abuse material on them.


In early August 2021, Apple unveiled its new system for identifying photos that contain images of child abuse. Although Apple's motive — combating the spread of child sexual abuse material — seems indisputably well-intentioned, the announcement immediately came under fire.

Apple has long cultivated an image of itself as a device maker that cares about user privacy. The features anticipated for iOS 15 and iPadOS 15 have already dealt a serious blow to that reputation, but the company is not backing down. Here's what happened and how it will affect everyday iPhone and iPad users.

