Apple recently updated its child safety web page. MacRumors first spotted the change in mid-December: the page no longer references the CSAM (child sexual abuse material) detection feature that was announced back in August.
Of the three announced protections, the two that rolled out with the iOS 15.2 update remain on the page under the heading Expanded Protection for Children. The third, the CSAM detection feature that was delayed after resistance from privacy advocates, is no longer mentioned.
Commenting on the change, Apple spokesperson Shane Bauer said the company's position has not shifted since it announced the postponement: based on feedback from customers, advocacy groups, and researchers, Apple decided to take additional time to collect input and make improvements before releasing the feature. Nothing in Bauer's statement suggests the feature has been scrapped entirely, and Apple's documentation describing how it would work is still available online.
The CSAM detection feature drew backlash because of how it was designed to work. It would derive digital fingerprints (hashes) from a user's photos and compare them against fingerprints of known abusive images; if enough matches were found, the account would be reported to the authorities. Critics worried that governments could pressure Apple to expand the system beyond CSAM, though the company insisted it would refuse any such request and keep user privacy a priority.
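The matching idea can be illustrated with a heavily simplified sketch. Note that this is only an illustration: Apple's actual system used a perceptual hash (NeuralHash) that tolerates small image edits, plus cryptographic protections, whereas the stand-in below uses a plain SHA-256 digest, and the database contents and threshold here are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in fingerprint. Apple's design used a perceptual hash
    # (NeuralHash), not a cryptographic hash like SHA-256.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints of known abusive images.
known_db = {fingerprint(b"known-image-1"), fingerprint(b"known-image-2")}

def library_exceeds_threshold(photos, db, threshold=2):
    """Count photos whose fingerprint appears in the database and flag
    the account only once the match count reaches a threshold, mirroring
    the threshold-before-reporting idea in Apple's announced design."""
    matches = sum(1 for photo in photos if fingerprint(photo) in db)
    return matches >= threshold

library = [b"vacation-photo", b"known-image-1", b"known-image-2"]
print(library_exceeds_threshold(library, known_db))  # True: two matches
```

Because only fingerprints are compared, the scan never needs to interpret the photo contents directly; the privacy debate centered on who controls the database of fingerprints being matched against.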
Apple has not given a new release date for CSAM detection. The other two features, however, have already shipped with iOS 15.2: one warns young users when someone sends them a message containing nude content, and the other surfaces additional resources when a user searches for topics related to child abuse through Siri or Safari.
Read next: The UK Government Takes Issue Against Bad Anti-Competitive Practices From Apple And Google