
Apple delays controversial CSAM detection feature

Apple has delayed its controversial image scanning feature following negative feedback. 

The company updated its briefing page on the technology, explaining it was delaying the feature based on the response it received from customers and privacy advocates. 

“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the company said. 

The feature, announced in August, would have allowed Apple to scan photos uploaded to iCloud. Photos would be scanned on users' devices and their hashes compared against a database of hashes of known Child Sexual Abuse Material (CSAM). 

This database was originally supposed to come from the National Center for Missing and Exploited Children (NCMEC), but Apple later clarified it would only flag images whose hashes also appeared in databases from child safety organisations in multiple countries. 

Apple also said a human review would only be triggered once an account accumulated at least 30 CSAM matches under this method. 
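The match-and-threshold logic described above can be sketched as follows. This is an illustrative simplification only: Apple's actual system uses a perceptual hash ("NeuralHash") and cryptographic threshold techniques so matches stay private below the threshold, whereas this sketch uses a plain cryptographic hash and an ordinary counter as stand-ins.

```python
import hashlib

# Number of matches before human review, per the reported figure.
REVIEW_THRESHOLD = 30

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual image hash (the real system does not
    use SHA-256, which only matches byte-identical files)."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(images: list[bytes], known_hashes: set[str]) -> int:
    """Count how many uploaded images match the known-CSAM hash database."""
    return sum(1 for img in images if image_hash(img) in known_hashes)

def needs_human_review(images: list[bytes], known_hashes: set[str]) -> bool:
    """Only flag an account once the match count reaches the threshold."""
    return count_matches(images, known_hashes) >= REVIEW_THRESHOLD
```

The point of the threshold is that a single false match never surfaces an account; only a pattern of many matches does.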

The company also announced a second feature that would scan images sent to children in its iMessage app to detect nudes and notify parents. 

Both plans provoked concern from privacy groups around the world, who warned that Apple's scanning technology could be repurposed to search for other kinds of imagery, opening users up to government surveillance. The company vowed not to accede to government requests for expanded searches. 


This week, the Electronic Frontier Foundation delivered a petition protesting the technology, which was slated to be included in the next version of iOS and initially restricted to US users. 

The organization pointed out that the technology breaks the end-to-end encryption functionality Apple has touted in its operating systems. 

“Apple’s surveillance plans don’t account for abusive parents, much less authoritarian governments that will push to expand it. Don’t let Apple betray its users,” it added. 


See the original article here: ITPro
