The spread of child sexual abuse material online makes the internet a genuinely dangerous place, one capable of inflicting lasting psychological harm on countless victims. Apple is attempting to address this by building a system that can scan iCloud photos for material depicting child sexual abuse, but a lot of people, including hundreds of Apple employees, are warning that it might end up doing more harm than good.
Apple claims the new policy will not reduce privacy because it will not scan private messages and the like, but it remains entirely possible that authoritarian governments could use it to crack down on dissidents. All it would take is a claim that an individual is suspected of harboring child sexual abuse content, and that would ostensibly give them a way to spy on people without their knowledge.
While Apple maintains that this will not happen, it is a hard argument to settle. Many would say that cracking down on child sexual abuse material, or CSAM, is worth the trade-off, and that this is an adequate compromise if it helps drive such material off the internet for good. Others counter that it gives Apple too much power to infringe on people's privacy. It remains to be seen what the impact will be, but plenty of people both within the company and outside of it are raising concerns, and Apple will have to address them sooner or later.
Read next: Senators Release A Bill Ending Google And Apple’s App Stores Monopoly