Technology and internet use have grown enormously in recent years, and with tech making daily life and its tasks easier, people have become increasingly dependent on it. While the wide spread of technology is surely a good thing, it comes with a serious downside: criminal activity on the internet. As the number of internet users grows, so does the amount of online crime. One major form of it that has been on the rise in recent years is the filming and sharing of child abuse material. Predators have built a market around it, filming and selling pictures and videos of children being sexually abused. This has been a major problem for a long time, and many innocent children have suffered because of it.
While governments and lawmakers in the US and around the globe continue to fight actively against child abuse, the tech giant Apple recently joined in, and with Apple's mobile devices dominating the market, this was certainly a significant step. As part of that effort, Apple developed a system to detect CSAM (child sexual abuse material). The system was aimed at all iPhones in the US: it checks photos for known abuse material before they are uploaded to iCloud. The technology is still under development, but it has already drawn considerable criticism from users who claim it is just another one of Apple's tricks to steal user data.
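At a high level, a system like this compares a fingerprint (hash) of each photo against a database of fingerprints of already-known abuse images. The sketch below is a deliberately minimal illustration of that idea, not Apple's actual design (which uses a perceptual "NeuralHash" and cryptographic blinding rather than the plain SHA-256 shown here); all names and values are hypothetical.

    import hashlib

    # Hypothetical database of fingerprints of known abuse images, supplied
    # by a child-safety organization. Apple's real database holds blinded
    # NeuralHash values; plain SHA-256 is used here only for illustration.
    KNOWN_BAD_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def hash_photo(photo_bytes: bytes) -> str:
        # A cryptographic hash only matches byte-identical files; a real
        # system uses a perceptual hash so that resized or re-encoded
        # copies of the same image still match.
        return hashlib.sha256(photo_bytes).hexdigest()

    def scan_before_upload(photo_bytes: bytes) -> bool:
        # Returns True if the photo matches the database and should be
        # flagged; the check runs on the device before the iCloud upload.
        return hash_photo(photo_bytes) in KNOWN_BAD_HASHES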
Recent claims by researchers from Princeton University added to the controversy Apple has faced over the past month. The researcher duo had worked on developing a system similar to Apple's and said that the one Apple is building is open to abuse, even calling it dangerous.
The duo behind these claims spent years developing a prototype of a system that identifies child abuse material in end-to-end encrypted applications; like Apple, they wanted to protect children who suffer at the hands of predators. They successfully built a working prototype, but ran into a problem when they realized that the technology, however helpful, can be misused.
They said the technology can easily be turned around and used for things such as surveillance and censorship. Governments and other organizations could use a system like Apple's to keep an eye on civilians' devices.
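The repurposing risk follows directly from the structure sketched above: the matching logic never inspects what the database entries mean. In terms of the earlier (hypothetical) sketch, converting the scanner into a censorship tool requires changing only the data, not a single line of code:

    # scan_before_upload() compares fingerprints; it neither knows nor cares
    # what the underlying images depict. Swapping in hashes of, say, banned
    # political imagery turns the identical mechanism into a censorship filter.
    KNOWN_BAD_HASHES = {
        "b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9",
        # ...fingerprints of whatever content an authority compels detection of
    }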
The researchers also pointed out other drawbacks. They said the process the technology follows to identify CSAM can sometimes produce wrong results, meaning innocent users could end up under scrutiny.
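Such false positives stem from the perceptual hashing that systems of this kind typically rely on: unlike a cryptographic hash, a perceptual hash is designed so that similar-looking images get similar fingerprints, and matching happens within a distance threshold. A toy illustration with invented 16-bit values shows how an unrelated image can fall inside that threshold:

    def hamming_distance(h1: int, h2: int) -> int:
        # Number of bit positions in which two perceptual hashes differ.
        return bin(h1 ^ h2).count("1")

    DATABASE_HASH = 0b1011_0110_1100_1010  # fingerprint of a known image
    INNOCENT_HASH = 0b1011_0111_1100_0010  # unrelated photo, only 2 bits away

    MATCH_THRESHOLD = 4  # hypothetical: anything within 4 bits counts as a hit
    if hamming_distance(DATABASE_HASH, INNOCENT_HASH) <= MATCH_THRESHOLD:
        print("False positive: an innocent photo is flagged for review")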
Apple, however, gave little weight to the researchers' concerns and said it does not believe the system could be used for any other purpose. Addressing the problem of wrong identification and scrutiny, the company said the system triggers a manual review only after an account uploads around 30 flagged photos to iCloud.
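Taking Apple's figure at face value, the safeguard amounts to a per-account counter: flagged uploads accumulate, and only once roughly 30 of them have matched does a human look at the material. Below is a toy model of that policy (names hypothetical; Apple's actual design reportedly uses threshold secret sharing so the server learns nothing before the threshold is crossed):

    MATCH_THRESHOLD = 30  # Apple's stated bar before any manual review

    class AccountMatchCounter:
        def __init__(self) -> None:
            self.matches = 0

        def record_upload(self, flagged: bool) -> None:
            # Each upload that matched the database increments the counter;
            # nothing is surfaced to a reviewer until the threshold is hit.
            if flagged:
                self.matches += 1
            if self.matches >= MATCH_THRESHOLD:
                self.escalate()

        def escalate(self) -> None:
            # Per Apple, a human reviews the flagged photos only at this
            # point, before any report is filed.
            print("Account escalated for manual review")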