Both TikTok and Bumble are the latest big tech names to join the fight against revenge porn.
The apps join Meta, which has been working to put an end to the spread of revenge porn: a term for explicit images and videos that are shared on the web without the pictured individual's consent.
The platforms have reportedly partnered with StopNCII.org, a tool created in partnership with these tech platforms to detect and block images of this sort.
The website creates unique digital fingerprints (hashes) of images and videos. The hashing happens on the user's own device to protect their privacy: the actual files are never uploaded to the website, only a string of letters and numbers.
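The on-device step described above can be sketched as follows. This is a minimal illustration using an exact cryptographic hash (SHA-256); the real StopNCII.org tool is understood to use perceptual hashing, which can also match visually similar copies of a file, not just byte-identical ones. The function name is purely illustrative.

```python
import hashlib

def fingerprint(path: str) -> str:
    """Compute a fingerprint of a media file locally, on the user's device.

    Simplified sketch: SHA-256 is used here for illustration only.
    Production systems rely on perceptual hashes so that re-encoded or
    resized copies of the same image still match.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large video files never need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    # Only this hex string of letters and numbers ever leaves the device;
    # the image or video itself is never transmitted.
    return h.hexdigest()
```

The key privacy property is that the hash is one-way: partners can check whether an upload matches, but cannot reconstruct the original picture from the fingerprint.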
The fingerprints submitted to the website are shared with its partner platforms. If an image or video uploaded to TikTok, Facebook, Bumble, or Instagram matches one of the submitted hashes, the file is flagged to that app's moderation team.
If a moderator finds that the image violates the app's policy, it is removed, and the other partner platforms block the picture so it cannot be shared any further.
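The partner-side workflow just described can be summarized in a short sketch. All names here are hypothetical; this only illustrates the logic of checking an upload's fingerprint against the shared hash list and routing matches to human review rather than removing them automatically.

```python
def review_upload(upload_hash: str, known_hashes: set[str]) -> str:
    """Decide what happens to an upload on a partner platform.

    A hash match does not delete the file outright; it escalates the
    file to the platform's human moderation team, which then checks
    it against that app's own policy.
    """
    if upload_hash in known_hashes:
        return "escalate"  # routed to the moderation team for review
    return "allow"         # no match against submitted fingerprints

# Hypothetical set of fingerprints shared via StopNCII.org partners.
shared_hashes = {"a3f1c9", "77be02"}
```

A match only triggers escalation, so a false positive in the hash database still passes through a human moderator before anything is removed.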
Since the tool went live, nearly 12,000 people have opened cases to stop such pictures from being shared without their consent, and nearly 40,000 hashes have been created to date.
Bloomberg reports that Meta has partnered with various non-profit organizations behind the helpline to encourage more people to come forward and sign up.
The effort builds on an idea Meta first piloted in Australia, which asked users to send their own explicit images to themselves in a Messenger chat.
The company vowed to delete all the images once the trial was complete, but as one might imagine, people were very skeptical.
By uniting on this front, TikTok and Bumble hope to address this increasingly concerning problem and crack down on intimate pictures shared without consent.