Everything you need to know about Apple's Child Safety Program

Apple has announced a series of tools designed to protect children and prevent online exploitation, which will arrive in its operating systems later this year.

These measures address highly sensitive problems: child exploitation, detection of child sexual abuse material (CSAM), and online grooming.

The company has developed new tools to protect children from predators who abuse its communication tools.

While governments have welcomed the new tools, many privacy advocates have strongly criticized the move.

How Apple is fighting child abuse

Apple wants to protect children who use its devices from deception and exploitation, and to prevent the spread of CSAM online. To this end, it announced three new measures:

  •     CSAM detection in iCloud Photos.
  •     Communication safety in Messages.
  •     Expanded guidance in Siri and Search on topics related to child abuse.

The last two measures are the less controversial and more straightforward ones: they raise no real privacy or security concerns.

Siri and Search intervene when users enter search queries related to CSAM.

The assistants explain to the user that interest in this material is harmful and problematic, and provide links to resources where they can get help.

The communication safety measures, which apply to the Messages and iMessage apps, have also been well received by most users.

Apple is adding new tools that notify children, and their parents, when sexually explicit images are received or sent through these apps. The goal is to blur offending images inside the messaging app while warning children about the dangers of such images and discouraging them from viewing them.

Parents can also opt in to receive notifications when their children view messages that may contain sexually explicit images.
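The warning flow described above can be sketched as a simple decision function. This is an illustrative sketch only, not Apple's implementation: the policy details (which ages trigger parent notification, what the classifier decides) are assumptions drawn from the description in this article.

```python
# Illustrative sketch of the Messages warning flow described above.
# The age cutoffs and notification rules are assumptions, not Apple's code.

def handle_incoming_image(is_explicit: bool, child_age: int, parent_opted_in: bool) -> dict:
    """Decide how to present a received image on a child's account."""
    if not is_explicit:
        # Ordinary images are shown normally.
        return {"blur": False, "warn_child": False, "notify_parent": False}
    # Explicit image: blur it and warn the child in every case;
    # notify parents only for younger children whose parents opted in.
    notify = child_age < 13 and parent_opted_in
    return {"blur": True, "warn_child": True, "notify_parent": notify}
```

The key design point is that the warning to the child is unconditional for flagged images, while the parent notification is gated on both age and the parent's opt-in.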

How does the company scan photos in iCloud?

The most controversial and complicated measure is the scanning of iCloud Photos for images containing known CSAM.

Apple compares photos stored in the iCloud Photos accounts of iPhone and iPad users against databases of known CSAM supplied by the National Center for Missing & Exploited Children (NCMEC) and other child safety organizations.

Once such matches are detected and confirmed, Apple reports them to the appropriate law enforcement agencies.

The tool can flag known, verified CSAM photos without Apple viewing the photos themselves or scanning images server-side in iCloud; the matching happens on the device.

When a photo is uploaded to an iCloud Photos account, the matching process is triggered by a cryptographic technique called private set intersection, which determines whether there is a match without revealing the result.
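Conceptually, the on-device step resembles computing a fingerprint (hash) of each uploaded photo and checking it against a set of fingerprints of known images. The sketch below uses an ordinary SHA-256 hash and a plain set lookup purely for illustration: Apple's system actually uses a perceptual hash (NeuralHash) and private set intersection, so that, unlike this plain lookup, neither side learns anything about non-matching photos.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Apple uses a perceptual "NeuralHash"; SHA-256 stands in here
    # purely for illustration (it matches only byte-identical files).
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints of known flagged images.
known_hashes = {fingerprint(b"known-flagged-image-bytes")}

def matches_known(image_bytes: bytes) -> bool:
    """True if the photo's fingerprint appears in the known-hash database."""
    return fingerprint(image_bytes) in known_hashes
```

Note the limitation of this toy version: a cryptographic hash changes completely if a single pixel changes, whereas a perceptual hash is designed to survive resizing and minor edits.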

Apple says the risk of the automated system incorrectly flagging an account is one in a trillion per year. Even then, a report is only raised once a minimum number of offending photos has been matched, at which point the company manually reviews the matches for confirmation.
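The review threshold mentioned above can be illustrated as a simple gate: an account is only queued for human review once its match count crosses a minimum, which is what drives the false-report rate down. The threshold value below is an assumption for illustration; Apple had not published the real number at announcement.

```python
# Illustrative threshold gate for manual review.
# THRESHOLD is an assumed value, not Apple's published figure.
THRESHOLD = 10

def should_queue_for_review(match_count: int, threshold: int = THRESHOLD) -> bool:
    """An account is surfaced for human review only at or above the threshold."""
    return match_count >= threshold
```

Requiring several independent matches before any human ever looks at an account is what turns a small per-photo error rate into a vanishingly small per-account one.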

Should you use these features?

The new tools are not mandatory: they apply to children under the age of 13 on accounts set up through the Family Sharing service.

If the child is between 13 and 17 years old, parents will not receive notifications, but the child will still see a warning about the content.

Additionally, Apple has confirmed that it cannot read the messages under any circumstances, as the entire process runs on the device itself, using machine learning to detect these images.

Apple never learns about or gains access to the photos or the surrounding conversations; only the children and their parents are involved.

Apple devices with parental controls

Apple is bringing the new child safety features to all of the operating system versions scheduled for release in late 2021, with the exception of the CSAM detection tool, which is limited to iPhone and iPad.
