3aIT Blog

Shortly after we published our blog last month asking whether Apple's child safety plan was the thin end of the wedge, they announced that they would be delaying these plans. While we won't take full credit for this, what has Apple decided to do here, exactly?

Just to briefly outline Apple's plans here again, they were twofold. The first proposal was to compare known images of child abuse with files that a user is trying to upload to their iCloud account, and alert the authorities if necessary. The second was to use AI to determine whether a child account on a phone is sending or receiving images of an adult nature, blur them, ask for confirmation that the child wants to view them, and in some cases also send an alert to parents.
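Apple hasn't published the internals of its matching technology (known as NeuralHash), but the general idea behind this kind of system is perceptual hashing: reduce an image to a compact fingerprint that survives resizing or re-saving, then compare fingerprints rather than raw files. As a purely illustrative sketch, and emphatically not Apple's actual algorithm, here is a toy "average hash" comparison in Python; the file names and the distance threshold are invented for the example:

```python
# Toy perceptual-hash comparison -- an illustration of the general idea only,
# NOT Apple's proprietary NeuralHash. File names and threshold are invented.
from PIL import Image  # pip install Pillow

def average_hash(path, size=8):
    """Shrink to an 8x8 greyscale grid and record which cells are brighter
    than the mean -- a crude 64-bit fingerprint of how the image looks."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

def hamming_distance(a, b):
    """Number of fingerprint bits that differ between two images."""
    return sum(x != y for x, y in zip(a, b))

known_hash = average_hash("known_flagged_image.jpg")
upload_hash = average_hash("photo_being_uploaded.jpg")

# A small distance means "these look like the same picture", even if the
# files are byte-for-byte different (resized, re-compressed, and so on).
if hamming_distance(known_hash, upload_hash) <= 5:
    print("Possible match -- the point at which a real system would escalate")
```

The reason for using a perceptual fingerprint rather than an exact one is that trivial edits to a known image (re-saving, resizing) still match. The flip side, as we'll come back to below, is that unrelated images can occasionally match too.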

We won't go into the points made in our blog last month in detail, but our questions were not about the proposals in and of themselves, but about what may happen in the future if these technologies are extended to other applications.

Various rights groups and consumers made similar arguments. Enough so that Apple have decided to delay the release of these new features. They told tech blog TechCrunch: "Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

This announcement may or may not be related to the fact that various proof-of-concepts were released that were able to trick a system presumed to be similar to the one Apple intended to use into reporting that it had found one of these images when it actually hadn't. We shall see in the coming months whether Apple are refining this system, or whether it will be quietly dropped.
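For anyone curious how a false positive like that is possible at all, here is a deliberately crude sketch using the same toy "average hash" idea from earlier. It is not the system researchers actually attacked, which was presumed to be similar to Apple's, but it shows the underlying point: two pixel grids that look nothing alike can still reduce to an identical fingerprint, and the proof-of-concepts engineered exactly this kind of collision.

```python
# Why false positives can happen: two clearly different 8x8 "images"
# (flat lists of 0-255 pixel values) that produce the same average hash.
# A crude illustration only, not the published attack on Apple's system.

def average_hash(pixels):
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

# Image A: stark contrast -- left half white (255), right half black (0).
image_a = ([255] * 4 + [0] * 4) * 8
# Image B: nearly uniform grey -- left half 160, right half 140.
image_b = ([160] * 4 + [140] * 4) * 8

print(average_hash(image_a) == average_hash(image_b))  # True: a "collision"
```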