Apple delays launch of child abuse detection tools

Apple is to delay the launch of new tools designed to detect child sexual abuse material (CSAM), saying it wants to take more time to “make improvements” after privacy concerns were raised.

The iPhone maker had announced plans for a system that would detect known child sexual abuse imagery when someone tried to upload it to iCloud, and report it to the authorities. Apple said the process would be done securely and would not involve regularly scanning a user’s camera roll. However, privacy campaigners raised concerns over the plans, with some suggesting the technology could be hijacked by authoritarian governments to look for other types of imagery – something Apple said it would not allow.

The tech giant has now confirmed it’s delaying the rollout following feedback from a number of groups. The company said in a statement:

Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material.

Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

The system works by looking for image matches based on a database of “hashes” – a type of digital fingerprint – of known CSAM images provided by child safety organisations. This process takes place securely on a device when a user attempts to upload images to their iCloud photo library.
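As a purely illustrative sketch of that matching step (not Apple’s actual implementation, which relies on its own perceptual hashing and a cryptographic matching protocol), the hypothetical Python below checks an image’s digest against a set of known fingerprints at the point of upload; the function names and the placeholder hash value are invented for the example.

```python
import hashlib

# Hypothetical database of hashes of known CSAM images, of the kind
# supplied by child safety organisations. Apple's system uses perceptual
# hashes; plain SHA-256 is used here only to keep the sketch runnable.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",  # placeholder value
}

def image_fingerprint(image_bytes: bytes) -> str:
    """Return a digest acting as the image's 'digital fingerprint'."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_csam(image_bytes: bytes) -> bool:
    """On-device check run when a user attempts an iCloud upload:
    flag the image only if its fingerprint is already in the database."""
    return image_fingerprint(image_bytes) in KNOWN_HASHES

# An ordinary photo produces no match and is not flagged.
print(matches_known_csam(b"holiday photo"))  # False
```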

It was to be joined by another new feature in the Messages app, which would warn children and their parents using linked family accounts when sexually explicit photos were sent or received, blocking the images from view and showing on-screen alerts; and by new guidance in Siri and Search that would point users to helpful resources when they performed searches related to CSAM.

Apple said the CSAM detection system and the Messages feature are separate and do not use the same technology, adding that it will “never” gain access to communications as a result of the improvements to Messages.
