Regulators in the United Kingdom have finally spoken up about popular social media apps and the harmful way their algorithms target young kids.
The new draft calls on leading tech giants to tame so-called 'toxic' algorithms that push viral content involving themes such as adult material, self-harm, and eating disorders.
Such content would need to be restricted because it can fuel bullying, the spread of hate, and material that promotes dangerous acts. Ofcom, which is leading the effort, says it is about time action was taken to protect underage users.
The draft sets out close to 40 practical measures, including robust age checks and stronger moderation of content published online.
The top telecom regulator has repeatedly argued that while social media is designed for fun and learning, it also has an ugly side that cannot be denied.
Kids do not always realize what they are getting into, and it is about time leading social media platforms took accountability and behaved more responsibly.
According to reports from the BBC, the rules are expected to come into force in the latter part of next year, and companies that break them would be publicly named and shamed.
As per the latest draft, the regulations apply to services used by large numbers of kids, or whose audience has long been made up mainly of them.
Top tech giants that fall into this category will be required to carry out children's risk assessments that set out the harms their services pose and the changes they plan to make to offer safer services online.
The news follows serious criticism sparked by leaks such as the Facebook Files, which pressured top tech giants into making serious changes to how younger audiences are kept safe. In America, those below the age of 13 were technically not allowed on several platforms such as Instagram and even Facebook, but such restrictions were easy to get around.
As the top UK regulator puts it, it is about time action was taken in the best interest of kids against social media apps and algorithms that have been allowed to operate for too long without any regulation in place.
Image: DIW-Aigen
Read next: Report Shows that 64% of the Technical SEOs Do Not Feel that AI is Threatening Their Jobs