Meta has long been in the spotlight for not doing enough to safeguard minors on its platforms.
Now, Meta is shifting the blame to parents, arguing they are failing to protect their kids because they refuse to use the tools already in place. Nick Clegg highlighted how adults are not embracing the roughly 50 child safety tools the company offers. Meta says it introduced them over the past few years, but uptake hasn't been what it may have expected.
Meta's president of Global Affairs also described what he sees as a behavioral problem surrounding these tools. Acknowledging that parents have largely ignored them, he explained that there is not a lot the tech giant can do if parents don't engage.
Meanwhile, regulatory pressure is increasingly taking center stage as governments push to protect kids from harm. Australia's government, for instance, is working to ban social media for young teens.
Meta, however, does not appear to be in favor of that approach. It feels that properly used parental controls are a better way to monitor kids and stay aware of their activity. Parents already have the option to supervise their children and restrict how long they spend online, but those tools only help if they are actually used.
Clegg also pointed to what he framed as good news: Meta's apps generally give users a positive experience, and a large number of young people report being happy with their time online and with the safeguards the company has in place. This is despite a whistleblower previously claiming that Meta was not doing enough to curb the growing problems of abuse, harmful content, and related deaths on its apps.
One teenager is believed to have taken her own life after harmful content on social media took a severe toll on her mental health.
Cases like this are part of the reason the UK rolled out its Online Safety Act, which places requirements on companies to guard kids from harmful material.
Image: Chatham House / YT