Apple has always been conscious of user safety and has consistently tried to provide the best tools to keep iOS users' data secure.
Now Apple has announced another new feature, called Child Safety, which will be made available with the iOS 15 update.
With this feature, Apple has introduced technology that can identify known child sexual abuse images on an Apple device. Any image headed for the cloud is first scanned, and only once it is cleared does the upload to iCloud proceed. If the system detects a problem with the image, the upload fails and the user receives a notification.
It is unlikely that your iOS device will flag an image incorrectly, because the system matches each scanned image against content that has already been flagged and reported in databases of known material. An image is also only reported once matches cross a particular threshold set by the company, which Apple says is high enough that the chance of a false report is one in a billion.
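The match-against-a-database-then-apply-a-threshold logic described above can be sketched conceptually. To be clear, the sketch below is an illustration only, not Apple's actual system: Apple uses an on-device perceptual hash (NeuralHash) combined with cryptographic techniques, whereas this toy version uses an ordinary SHA-256 digest and an invented threshold value.

```python
import hashlib

def image_hash(data: bytes) -> str:
    # Stand-in for a perceptual hash: a plain SHA-256 digest of the bytes.
    # Apple's real system hashes visual content, not raw bytes.
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of hashes of already-flagged images.
known_hashes = {image_hash(b"flagged-image-1"), image_hash(b"flagged-image-2")}

MATCH_THRESHOLD = 2  # illustrative only; Apple's real threshold is higher

def count_matches(images, known):
    # Count how many of the uploaded images match the known database.
    return sum(1 for data in images if image_hash(data) in known)

def should_report(images, known, threshold=MATCH_THRESHOLD):
    # An account is only surfaced for review once the match count
    # crosses the threshold, which keeps one-off false positives quiet.
    return count_matches(images, known) >= threshold

uploads = [b"vacation-photo", b"flagged-image-1", b"flagged-image-2"]
print(should_report(uploads, known_hashes))  # two matches meet the threshold
```

The key design point the threshold captures is that a single accidental match never triggers a report; only a pattern of matches does.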
The company assures users that once it receives a notification from a phone about CSAM (Child Sexual Abuse Material) in iCloud, it will review the matches to confirm them. Once confirmed, it will report the account to the NCMEC (National Center for Missing and Exploited Children) for further action and immediately disable the user's account.
This is a great step by Apple considering how high child sexual abuse cases have been in recent times, but it is not the only one.
Apple is also working on a similar protection feature in iMessage, aimed at younger users of Apple devices. The feature will blur out any image it deems inappropriate for young minds, especially images received from unknown numbers. It will then warn the user about the content, guide them to helpful resources, and reassure them that it is okay not to open the image. The child's parents will also be informed immediately.
All this should help prevent the abuse and harassment cases that have been emerging so often and make both child and adult users feel safe and looked after. Apple is taking some great steps with these features, and it is encouraging to see the top mobile giant acting for the betterment of society.
Read next: Does the world love Apple or Android more?