Apple today announced that, following negative feedback, it has delayed the rollout of the Child Safety Features it unveiled last month.
The planned features include scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety, which warns children and their parents when they receive or send sexually explicit photos, and expanded CSAM guidance in Siri and Search.
Apple said that feedback about the plans from customers, non-profit and advocacy groups, researchers, and others prompted the delay, giving the company time to make improvements. Apple issued the following statement about its decision:
Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.
Following their announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, privacy whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook's former security chief, politicians, policy groups, university researchers, and even some Apple employees. Apple has since endeavored to dispel misunderstandings and reassure users by publishing detailed documents and FAQs, giving interviews with company executives, and more.
The suite of Child Safety Features was originally set to debut in the United States with updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. It is now unclear when Apple plans to roll out the "critically important" features, but the company still appears intent on releasing them.
Source: MacRumors