UK’s Online Safety Regulator Says Social Media Apps Still Have A Lot Of Work To Do

Britain’s communications regulator says social media platforms are not doing enough to protect children and adults from dangerous content.

Under the UK Online Safety Act, Ofcom says platforms should already know they have a great deal of work ahead of them and must roll out more measures to keep all users safe.

Ofcom made the announcement on Monday, publishing a code of practice that tech giants must follow. Breaches carry the threat of major fines and, in serious cases, having a site shut down.

The regulator also noted that many of the measures it has recommended are not currently followed even by the largest and highest-risk platforms, a finding it described as concerning.

Platforms ranging from OnlyFans to Facebook, X, Google, and Reddit have been given three months to assess the risks their services pose, including the likelihood of harmful content being published on them.

Once the March 17, 2025 deadline passes, sites must be prepared to roll out further safety measures to address those risks, following the approaches set out in Ofcom’s code of practice and guidance. For now, the regulator is monitoring their progress.

The law applies to platforms and websites that host user-generated content, as well as large search engines, covering more than 100,000 online services. It lists 130 priority offenses spanning a wide array of content areas, including child abuse, scams, and terrorism.

Tech giants will now need to tackle these offenses proactively and improve their moderation systems. The country’s tech secretary described the move as the biggest change to online safety policy to date.

The steps outlined include removing all kinds of illegal material, including content promoting self-harm; properly funding and staffing moderation teams so issues are dealt with quickly; and testing and monitoring the algorithms that curate what users see in their feeds. Combined, these measures and others should make it harder for illegal material to spread.

Meanwhile, large-scale apps must also give users more power to block or mute accounts, as well as to disable comments when needed. If you ask us, it’s about time someone scrutinized social media apps before it is too late.

Image: DIW-Aigen

Read next: Apple Shares 2024’s Most Downloaded Apps on iPhones and iPads