The Apple Child Safety Program consists of several phases

Craig Federighi said Apple plans to scan iCloud Photos for child sexual abuse material (CSAM), with multiple levels of verification built into the system.

In an interview with The Wall Street Journal, Federighi, Apple's senior vice president of software engineering, shared new details about the company's controversial child safety measures.

These include the claim that performing the scan on the iPhone and iPad itself allows security researchers to verify that the system is being used responsibly.

Like many cloud storage providers, Apple checks iCloud Photos against hash lists supplied by the National Center for Missing and Exploited Children (NCMEC), looking for exact matches with known CSAM images.

But unlike many services, part of the matching is performed on the device itself rather than entirely in the cloud.
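To make the matching idea concrete, here is a minimal Swift sketch of an exact-match lookup against a set of known hashes. It is only an illustration: Apple's actual system uses a perceptual hash (NeuralHash) and a blinded, encrypted database rather than a plain SHA-256 comparison, and the function and file names below are hypothetical.

```swift
import CryptoKit
import Foundation

// Illustrative sketch only: a plain SHA-256 exact-match lookup, not Apple's
// NeuralHash-based, blinded-database protocol.
func matchesKnownDatabase(imageData: Data, knownHashes: Set<String>) -> Bool {
    let digest = SHA256.hash(data: imageData)                      // hash the image bytes
    let hex = digest.map { String(format: "%02x", $0) }.joined()   // hex-encode the digest
    return knownHashes.contains(hex)                               // exact match against the list
}

// Hypothetical usage: test one photo against a (made-up) hash list.
let knownHashes: Set<String> = [
    "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26"
]
if let data = FileManager.default.contents(atPath: "photo.jpg") {
    print(matchesKnownDatabase(imageData: data, knownHashes: knownHashes))
}
```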

Federighi explained why this design provides assurance that the company will not expand the database to include material other than CSAM, a concern raised especially about countries with restrictive censorship policies.

He said the same software and the same database ship in China as in the United States and Europe, and that if anyone asked the company to scan for data other than CSAM, Apple would refuse.

He added that even users who do not take the company at its word can rely on the system's several levels of verification, which are designed so that no one has to trust any single entity, or even any single country.

Apple's plan consists of several stages

The company previously announced that it will launch the system only in the United States at first, and will consider rolling it out to other countries on a case-by-case basis.

The company confirmed that the known-CSAM hash database ships in the operating system in all countries, but it is used for scanning only in the United States.

The Wall Street Journal also reported that independent reviewers will be able to examine flagged photos.

Federighi also provided more detail about when the scanning system alerts the company's reviewers to potentially illegal content.

Apple previously said that a single match will not trigger a red flag, a measure intended to prevent false positives.

Instead, the system generates a safety voucher for each match and alerts the company only when the number of matches reaches a certain threshold.

The company declined to reveal the exact number, since doing so could help attackers evade detection, but Federighi said the threshold is on the order of 30 known CSAM images.
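As a rough illustration of that threshold behaviour, the following Swift sketch simply counts matches and flags an account only once a limit (the reported figure of about 30) is reached. The type and function names are hypothetical, and the real design relies on cryptographic safety vouchers and threshold secret sharing rather than a plain counter, so the server learns nothing about individual matches below the threshold.

```swift
// Illustrative sketch only: models the counting behaviour described above,
// not Apple's cryptographic safety-voucher protocol.
struct MatchThresholdTracker {
    let threshold: Int      // reported to be on the order of 30 matches
    var matchCount = 0

    /// Record one match; return true only once the threshold is reached.
    mutating func recordMatch() -> Bool {
        matchCount += 1
        return matchCount >= threshold
    }
}

// Hypothetical usage: each on-device match increments the counter, and review
// is triggered only at the limit.
var tracker = MatchThresholdTracker(threshold: 30)
for _ in 1...30 {
    if tracker.recordMatch() {
        print("Threshold reached: flag the account for human review")
    }
}
```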

Some security professionals have cautiously praised the company's system, acknowledging the importance of finding CSAM online. However, many criticized its sudden introduction and the lack of clarity about how it works.

In the interview, Federighi acknowledged the confusion around the announcement. "Obviously, a lot of the messaging was not clear in terms of how things were understood," he said.


