Apple’s effort to combat child sexual abuse material on the internet took an unexpected turn when people began questioning the feature. To address the concerns, and a whole load of misconceptions, Apple has published a Child Sexual Abuse Material (CSAM) FAQ, a detailed guide answering many of the questions raised.
The tech giant had the plan vetted by various child safety organizations, yet people still doubted the true intent of a program designed solely to shield children from abusers and to detect abuse material.
In the initial announcement, Apple outlined three main areas the system will cover. The first involves iMessage, where the aim is to warn children when sexually explicit pictures are sent or received. The second is the detection of pictures already catalogued in a known CSAM database, matched through digital fingerprints when photos are stored in iCloud. Lastly, Siri and Search requests related to this material will return a warning along with links to websites that can help.
There have been a lot of misunderstandings about these three points and the methods the tech giant will employ. To clarify, Apple explained that on-device machine learning in Messages assesses images that might contain nudity, while CSAM detection applies only to images matched against the database through the digital fingerprint process. The process runs entirely on the system’s own detection, with no interference from Apple employees. However, a person from the company may get involved when a picture needs additional confirmation against the pre-existing CSAM database, or when an account produces multiple matches. That human review is what determines whether involving law enforcement is necessary.
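For readers who want a concrete picture of what “digital fingerprint” matching with a review threshold looks like, here is a deliberately simplified Swift sketch. Apple’s actual system relies on a perceptual NeuralHash and on-device privacy techniques that this toy example does not reproduce; the SHA-256 stand-in, the type names and the threshold are all illustrative assumptions, not Apple’s implementation.

```swift
import Foundation
import CryptoKit

// A simplified, hypothetical model of the flow described above.
// Apple's real system uses a perceptual "NeuralHash", not SHA-256;
// every name and number here is an assumption for illustration only.
struct FingerprintMatcher {
    let knownFingerprints: Set<String>  // digital fingerprints of known CSAM images
    let reviewThreshold: Int            // matches required before a human reviewer looks

    // Compute a fingerprint for one image (stand-in for the perceptual hash).
    func fingerprint(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // True only when an account's uploads cross the threshold, i.e. the point
    // at which a person at Apple double-checks the matches before any report
    // that could involve law enforcement.
    func needsHumanReview(uploads: [Data]) -> Bool {
        let matchCount = uploads
            .map { fingerprint(of: $0) }
            .filter { knownFingerprints.contains($0) }
            .count
        return matchCount >= reviewThreshold
    }
}
```

The point of the threshold is that a single match never triggers anything on its own; only an account crossing it is queued for manual review.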
Another major concern is the misuse of this information by higher authorities, i.e. governments, something that even cybersecurity experts have been warning users about. To address this, the feature is launching only in the US for now, in accordance with US law. When it expands to other countries, Apple says it will follow the same country-by-country legal review.
In the FAQ, Apple first thanked those supporting its CSAM initiative, then addressed the major concerns by answering the questions raised. The major points in the CSAM FAQ include:
- CSAM detection in iCloud and protection in iMessage are two different features and are in no way related. CSAM detection applies only to users who store images in iCloud, while iMessage protection keeps children from exchanging explicit material. The content on both platforms remains private and secure.
- Messages exchanged over iMessage remain end-to-end encrypted and will never surface in front of any law enforcement institution.
- Children living with abusive parents can still use iMessage to ask for help, since the feature only analyzes images, not text.
- Parents aren’t notified immediately, only when a child aged 12 or under continues the activity despite a warning (see the sketch after this list).
- All CSAM fingerprint matches will be manually reviewed before law enforcement ever gets involved.
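To make the parental-notification rule from the list above concrete, here is a hypothetical Swift sketch of that decision logic. Only the 12-or-under cut-off and the warn-first behaviour come from the FAQ points; the type and function names are invented for illustration.

```swift
// A hypothetical sketch of the notification rule described in the FAQ points
// above; the 12-or-under cut-off comes from the list, the names do not.
enum SafetyAction {
    case deliverNormally        // image is not flagged as sexually explicit
    case warnChild              // blur the image and warn; nobody else is told
    case warnAndNotifyParents   // young child continued despite the warning
}

func safetyAction(imageFlaggedExplicit: Bool,
                  childAge: Int,
                  continuedDespiteWarning: Bool) -> SafetyAction {
    guard imageFlaggedExplicit else { return .deliverNormally }
    // Parents are never notified immediately; only a child aged 12 or under
    // who goes ahead after the warning triggers a parental notification.
    if childAge <= 12 && continuedDespiteWarning {
        return .warnAndNotifyParents
    }
    return .warnChild
}
```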
Although this is a good promise, there is no law that guarantees Apple such freedom. When launching its products in a country, it must comply with that government for users to have access to them at all. Countries like China already force the company to meet governmental demands, and in this case there is a risk of the CSAM database being loaded with pictures of protestors and critics, defeating the whole purpose of the initiative.
We now just have to wait as Apple rolls the feature out in more countries, adapting to their policies. Overall, we do think it is a positive and constructive initiative that is much needed.
SOPA Images via Getty
Read next: Somebody’s Watching You: 6 Tips to Stop Being Tracked Online