Apple has postponed child protection features announced last month, including a controversial feature that would scan users’ photos for CSAM, after heavy criticism that the changes could reduce user privacy.
The changes were due to be implemented later this year. “Last month, we announced plans for features designed to help protect children from predators using communication tools to exploit them and reduce the spread of child sexual abuse material,” the company said in a statement.
“Based on feedback from customers, advocacy groups, researchers and others, we’ve decided to devote additional time over the coming months to gather input and make improvements before launching these critical child safety features.”
The announcement covered three changes. One would update Search and Siri to point users toward resources if they search for information related to CSAM.
The other two changes drew greater scrutiny. One would alert parents when their children receive or send sexually explicit photos, and would blur those images for the children.