One of the UK's top media watchdogs is shedding light on how it has spent a year conducting thorough surveillance of various apps where video has become the norm.
From Snapchat and TikTok to Twitch and OnlyFans, this is the first time Ofcom has detailed its regulatory role and its findings so far. The news comes after it set out content-handling rules aimed at better protecting youngsters and other users of these apps from harmful content.
Beyond shrinking the risk of minors watching age-sensitive, inappropriate content, the video-sharing platform (VSP) rules require each platform to take steps that better protect users from content inciting hate and malice against certain groups. This includes all forms of racist, sexually themed, explicit, and violent content.
The broader framework, the Online Safety Bill, has been a long time coming and has taken years to produce. It continues to dominate the conversation after the current UK Prime Minister came into power, with her aides taking on digital matters. They claim to be keeping a closer eye on such issues while weighing concerns about people's freedom of expression and thought.
Under the current Online Safety Bill, Ofcom is the country's leading regulator of internet content. But the bill has so far been the subject of plenty of controversy, largely because it has been loaded with so many add-ons.
With new bills come new powers, and Ofcom is working hard to regulate all the video content that comes its way from leading apps. It is issuing notices on things it deems unacceptable and in need of work. Moreover, plenty of platforms are being hit with financial penalties and notices requiring them to take immediate action.
However, experts have pointed out one major fault: not all leading platforms are being watched, as Instagram and Twitter were never added to the list in the first place.
For now, it is largely up to the platforms themselves to critique their own performance. They need to determine where the VSP regulation applies and act on it there and then. And while Ofcom's working criteria may appear stringent, they follow one theme.
That theme is providers double-checking their content before it goes live on the app. If they feel it is not suitable for the general public, it should be removed. Whether content is appropriate for the general public or only a selected subset is another question that needs to be taken into consideration.