Apple is planning updates to its iOS and iPadOS mobile operating systems that will help prevent the spread of what it describes as Child Sexual Abuse Material (CSAM). The updates, which will be introduced later this year in the USA, have three key elements:
Automatically detecting images that depict sexually explicit activities involving a child. The detection will be performed on the iPhone or iPad, using data supplied by the National Center for Missing and Exploited Children (NCMEC) in the USA. The check runs before images are uploaded to Apple’s iCloud Photos storage, and Apple is only alerted if a user’s iCloud Photos account contains CSAM (a simplified sketch of this check-before-upload flow follows the list).
Warning when sensitive material is sent via Messages. This is another on-device feature: it will alert children to potentially sensitive content whilst keeping their messages unreadable by Apple.
Updating Siri and Search to provide help if users encounter unsafe situations or try to search for CSAM.
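To make the first element concrete, here is a minimal sketch of the "check on the device before upload" ordering described above. It is not Apple's actual design, which is understood to rely on a perceptual hash and cryptographic matching so that non-matches reveal nothing; the SHA-256 digests, the `KnownImageDatabase` type and the `prepareForUpload` function below are hypothetical simplifications for illustration only.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: Apple's announced system uses a perceptual hash
// and cryptographic matching rather than the plain SHA-256 comparison shown
// here. All names and data below are hypothetical.

/// A hypothetical on-device database of digests of known images, built from
/// data such as that supplied by NCMEC.
struct KnownImageDatabase {
    let digests: Set<String>

    /// Returns true if the image's digest matches a known entry.
    func contains(_ imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return digests.contains(hex)
    }
}

/// Runs the check on the device before a photo is queued for iCloud Photos
/// upload; the returned flag indicates whether the photo matched the
/// known-image database.
func prepareForUpload(_ imageData: Data,
                      database: KnownImageDatabase) -> (payload: Data, flagged: Bool) {
    (imageData, database.contains(imageData))
}

// Example usage with made-up data.
let database = KnownImageDatabase(digests: ["<hex digest of a known image>"])
let photo = Data([0x01, 0x02, 0x03])
let result = prepareForUpload(photo, database: database)
print("Flagged before upload: \(result.flagged)")
```

The point of running the check on the device is that any flag is attached before anything leaves the phone, which is why Apple says it is only alerted about material destined for iCloud Photos.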
Some organisations, including the Electronic Frontier Foundation, have expressed concerns that the technology could be expanded to spy on users. Apple has pledged not to expand it and says it will refuse any government requests to do so; the company insists the system will only be used to detect child abuse images.
[Apple's FAQ (pdf)]