
Thursday, August 12, 2021

Apple plans photo-analysis tool to protect children

Apple is planning updates to its iOS and iPadOS mobile operating systems that will help prevent the spread of what it describes as Child Sexual Abuse Material (CSAM). The updates, which will be introduced later this year in the USA, have three key elements:

Automatically detecting images that depict sexually explicit activities involving a child. Matching is performed on the iPhone or iPad itself, using data supplied by the National Center for Missing and Exploited Children (NCMEC) in the USA, and happens before images are uploaded to Apple’s iCloud Photos storage. As a result, Apple is only alerted if a user’s iCloud Photos account contains known CSAM (a simplified sketch of this matching step appears below, after the three elements).

Warning when sensitive material is sent via Messages. This is another on-device service, which alerts children to potentially sensitive content whilst keeping their messages unreadable by Apple.

Updating Siri and Search to provide help if users encounter unsafe situations or try to search for CSAM.
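Apple has said the detection step compares hashes computed on the device against a database of hashes of known abuse images provided by NCMEC, and that photos which do not match are uploaded as normal. The Swift sketch below is only a loose illustration of that idea, not Apple’s implementation: the type and property names are invented, a plain SHA-256 digest stands in for Apple’s perceptual “NeuralHash”, and an ordinary string set stands in for the blinded, encrypted database the real system ships to devices.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of on-device matching before upload, not Apple's
// published design: the real system uses a perceptual hash and blinded
// reference data, so neither side can read the other's values directly.
struct KnownImageMatcher {
    // Hex digests of known images; in the real system the reference data
    // comes from NCMEC and is shipped in an unreadable, blinded form.
    let knownDigests: Set<String>

    // Placeholder for the on-device image hash (SHA-256 of the raw bytes).
    func digest(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // Runs on the device before an image is sent to iCloud Photos.
    // Only a match produces any signal; non-matching photos upload
    // without alerting anyone.
    func matchesKnownImage(_ imageData: Data) -> Bool {
        knownDigests.contains(digest(of: imageData))
    }
}
```

Because the comparison happens on the device and only a match produces any output, unmatched photos generate no signal at all, which is the basis for Apple’s claim that it is only alerted when known material is stored in iCloud Photos.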

Some organisations, including the Electronic Frontier Foundation, have expressed concerns that the technology could be expanded to spy on users. Apple has pledged not to broaden the system, says it will refuse any government requests to do so, and insists it will only be used to detect child abuse images.

[Apple's FAQ (pdf)]
Author: The Fonecast