Meta is on a mission to enhance the online security and protection of young users. The news comes in the wake of the Youth Safety and Well-Being Summit, recently held for the first time in Washington DC.
Facebook’s parent company said that global cooperation among governments is needed to create new requirements covering key elements of online safety for kids, such as better age detection and clearer rules on what content is and isn’t acceptable.
The summit brought together mental health experts, educators, and researchers for a series of sessions on the biggest threats to children’s safety online and how they can be addressed.
Numerous reports have already documented how serious the problem is, from low self-esteem and harmful self-comparison to children dying while attempting dangerous activities, all tied to what kids see on platforms like Instagram and TikTok. It’s high time the matter was addressed.
Most social media apps have age requirements, along with a variety of tools designed to detect underage users and restrict them from accessing inappropriate content. But such safeguards are often circumvented, leaving many kids caught up in trends they consider savvy while their parents remain unaware of what’s going on.
More advanced systems are now taking center stage, including facial recognition and other age-estimation software that can gauge the actual ages of account holders from a range of signals.
Instagram is already working alongside third-party providers of such tools, and Meta has put forward new measures to detect underage users and prevent them from gaining access to its platforms.
Meta sees this as a major issue, which is why its head of Global Affairs recently wrote in a post that the EU and the US are working on a shared strategy that other countries can join so the matter can be tackled as a whole. An effort like this only works as a team, and that’s the case here.
Meta is taking a similar approach to content regulation, having set up an Oversight Board that scrutinizes its internal decisions, and it has begun calling on governments to create definitive rules that would apply to everyone providing services to users online.
Read next: Meta Ensures Fair Distribution Of Ads Across Platforms By Adopting Machine Learning-Powered Technology