Apple urged to abandon its plan to scan its devices

An international coalition of political and civil rights organizations has issued an open letter calling on Apple to drop its recently announced plans to build monitoring capabilities into the iPhone, iPad and other products.

These groups include the American Civil Liberties Union, the Electronic Frontier Foundation, Access Now, Privacy International, and The Tor Project.

Earlier this month, the company announced plans to use new technology in iOS to detect images of potential child abuse, with the goal of curbing the spread of child sexual abuse material online.

The company also announced a new safety feature for Messages. It uses on-device machine learning to identify and blur sexually explicit images that children receive in the Messages app.

If a child under 12 views or sends such an image, the feature can notify their parents.

In the letter, the groups wrote that although these capabilities are intended to protect children and reduce the spread of child sexual abuse material, they are concerned the capabilities will be used to censor speech and threaten the privacy and safety of people everywhere, with serious consequences for many children.

The company's new child safety page outlines a plan for on-device scanning of photos before they are saved to iCloud.

Photos are scanned only when they are uploaded to iCloud. The company said that each match produces an encrypted voucher uploaded along with the photo, and that only once an account's vouchers reach a threshold of known CSAM matches can it decrypt and review the matching data.


The company, like other cloud email providers, already uses a hash-matching system to find CSAM sent via email. But the new software applies a similar scanning process to photos stored in iCloud, even photos the user never shares or sends to anyone.
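The hash-matching-with-threshold mechanism described above can be sketched as follows. This is a simplified illustration, not Apple's implementation: real systems use perceptual hashes (such as Apple's NeuralHash or Microsoft's PhotoDNA) that survive resizing and re-encoding, whereas this sketch substitutes an exact SHA-256 hash to stay self-contained. The class name, threshold value, and sample data are all hypothetical.

```python
import hashlib
from typing import Iterable


class HashMatcher:
    """Illustrative hash-matching scanner with a reporting threshold.

    A real system would use a perceptual hash tolerant of image edits;
    SHA-256 is used here only to keep the sketch simple and runnable.
    """

    def __init__(self, known_hashes: Iterable[str], threshold: int = 3):
        self.known = set(known_hashes)      # database of known-CSAM hashes
        self.threshold = threshold          # matches required before flagging

    @staticmethod
    def image_hash(data: bytes) -> str:
        """Stand-in for a perceptual image hash."""
        return hashlib.sha256(data).hexdigest()

    def match_count(self, photos: Iterable[bytes]) -> int:
        """Count how many photos match the known-hash database."""
        return sum(1 for p in photos if self.image_hash(p) in self.known)

    def account_flagged(self, photos: Iterable[bytes]) -> bool:
        """An account is surfaced for review only once matches reach the threshold."""
        return self.match_count(photos) >= self.threshold


# Example: two known images in the database, threshold of 2.
known = [HashMatcher.image_hash(b"known-image-1"),
         HashMatcher.image_hash(b"known-image-2")]
matcher = HashMatcher(known, threshold=2)

photos = [b"known-image-1", b"vacation-photo", b"known-image-2"]
print(matcher.account_flagged(photos))  # True: two matches reach the threshold
```

The threshold is the detail Apple emphasized: a single match discloses nothing, and only an account that accumulates enough matching vouchers becomes reviewable.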

In response to concerns about potential misuse of the technology, the company said it will restrict the system to detecting known CSAM and will refuse government requests to expand it.

Strong opposition to the new measures has mainly focused on the on-device scanning feature. But civil rights and privacy groups said the plan to flag nudity in children's iMessage conversations could also put children at risk and undermine iMessage's end-to-end encryption.

The letter warns that once this backdoor capability is built in, governments could compel the company to extend the notifications to other accounts and to detect images that are objectionable for reasons other than nudity.
