Researchers Call For Stricter Laws Requiring Apps To Remove Revenge Porn As X Accused Of Ignoring Requests Until DMCA Used

Researchers are calling for new reforms that would force social media apps to take revenge porn seriously.

According to a recently published study, X ignored takedown requests for non-consensual intimate imagery (NCII). The researchers found that images were removed promptly only when victims flagged them through a DMCA copyright notice rather than through the app's dedicated NCII reporting mechanism.

The findings are alarming because X and Elon Musk have repeatedly stated that non-consensual nudity is not tolerated on the platform. Yet the study, covered by 404 Media, shows a high tolerance for the problem in practice, which raises further questions.

Experts argue that victims should not have to wait long periods to get explicit images of themselves removed once they discover them on the app. Many victims describe the experience as daunting, since getting content removed online is becoming increasingly difficult.

Coping with the psychological impact, on top of navigating takedown requests, can put victims in a very difficult position. People who appeared to be in their mid-20s to mid-30s were targeted the most, the study found. Victims are now seeking justice and want reforms before the problem gets out of hand.

In the study, the researchers tested how quickly and effectively X handled takedown requests. They gave the company a three-week window to remove the reported images and, shockingly, had no luck. Only the images reported through the DMCA were removed; the rest were either left in place or only temporarily suspended.

The huge gap between the two reporting routes is alarming. Remember, not every victim can use the DMCA for takedown requests: the process can be costly, and it rests on a copyright claim. A victim who did not create the image themselves, as is the case with AI-generated deepfakes, may have no copyright to assert and is left with no recourse.

Another question on people's minds is why the study targeted X in the first place. The answer is simple: viewing the AI-generated nude images used in the test takes less of a toll on the paid moderators who have to review them, whereas other social media apps, whose moderation teams are more efficient overall, would not present the same situation.

According to X's latest transparency report, most non-consensual images are actioned by human moderators. That claim is at odds with what the study found, clearly showing that X needs to step up its game and be more vigilant.

Image: DIW-Aigen
