Apple announces controversial new child safety features coming later this year

The news: Following rumors of an imminent announcement, Apple has officially confirmed plans to scan users’ iCloud Photos and bolster child safety across its platforms.

The plan takes a three-pronged approach:

  • Messages app: When a child sends or receives sexually explicit material, new pop-ups will help them navigate the situation, with notifications sent to parents letting them know whether their child viewed the content.
  • iCloud photo scanning (US only): Apple is rolling out technology later this year that will scan users’ iCloud Photos for known Child Sexual Abuse Material (CSAM). The system matches existing photos against a database of known CSAM provided by child-safety partners (a simplified sketch of this kind of hash matching appears after this list). If illegal content is confirmed by a manual review, the user’s account will be reported and disabled. Apple says the chance of a false positive is one in one trillion, and there will be an appeals process for these cases.
  • Siri and Search guidance improvements: “Apple is also expanding guidance in Siri and Search by providing additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report. Siri and Search are also being updated to intervene when users perform searches for queries related to CSAM. These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.”

These new features will roll out later this year in updates across Apple’s platforms.
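
To make the matching idea concrete, here’s a deliberately simplified Swift sketch. It checks an image’s digest against a set of known-bad hashes using a plain SHA-256 lookup; Apple’s actual system instead uses a perceptual hash (NeuralHash) plus a private set intersection protocol, so matching survives re-encoding and the device never learns the database contents. Everything below (the hash set, the helper name) is hypothetical and only illustrates the general lookup idea.

```swift
import CryptoKit
import Foundation

// Hypothetical set of digests of known illegal images, as supplied by a
// child-safety partner. The entry below is just the SHA-256 of empty data,
// included so the example runs; it stands in for real database entries.
let knownBadHashes: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

/// Returns true when the image's SHA-256 digest appears in the known-bad set.
/// A real system would use a perceptual hash so that resized or re-encoded
/// copies of the same image still match; an exact digest would not.
func matchesKnownImage(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownBadHashes.contains(hex)
}

// Example: an empty Data blob matches the placeholder entry above.
print(matchesKnownImage(Data())) // prints "true"
```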

Our take: All of these features sound great on paper, and the first and third genuinely are. As for the second, we can all agree that CSAM is abhorrent and has no place anywhere, ever. But Apple’s method of finding it is what concerns me: specifically, the precedent this sets and the door it opens for scanning users’ iCloud Photos for other illegal content down the line.

Years from now, will we see this photo-scanning feature expanded to cover other kinds of illegal content? It’s a slippery slope.