In today's technology-driven era, nearly everyone is on social media, young and old alike, including kids under the age of 13, despite the age restriction guidelines of many platforms. The fact that such minors are on social media and interacting with strangers poses a serious risk to their safety.
A report by Thorn, a nonprofit organization that builds technology to defend children from sexual abuse, identifies a disturbing gap in efforts by Snap, Facebook, YouTube, TikTok, and others to keep children safe. The report found that although children under the age of 13 are not supposed to be on these platforms, they are using them in large numbers, and as a result minors in the US often receive abuse, harassment, or sexual solicitation from adults online. When faced with this bullying and harassment, children do not turn to their parents or another trusted adult for help; instead they rely on the tech platforms themselves, whose limited blocking and reporting tools have failed to address the threats they face. The report's survey also found that children who block a harasser or bully online are often recontacted by that same person, either from a different account or on a different platform.
Other key findings of the report include that children are using social media long before they turn 13, that they report having online sexual interactions at high rates, both with their peers and with people they believe to be adults, and that children who identify as LGBTQ+ experience all of these harms at higher rates than their non-LGBTQ+ peers.
Many lawmakers have recently criticized Facebook and Google for the potential effects their apps could have on children, and just last week attorneys general from 44 states called on Mark Zuckerberg to abandon his plans for a version of Instagram for kids under the age of 13. Against this backdrop, the Thorn report arrives at just the right time: it highlights how young children actually use social platforms and how the tech giants have failed to offer them proper support in times of need. The report could also push the industry toward the cross-platform collaboration between Facebook, Google, Snap, and others that Thorn says is needed to fully address the threats children face.
Thorn argues that since kids are now exposed to the internet from elementary school onward, it is up to platforms, parents, and governments to work hand in hand to understand how kids are using technology and to develop better ways to protect them from the harms they face online.
In an interview, Thorn's CEO Julie Cordua said that these three groups can only work hand in hand if each does its own part and then they combine their efforts: platforms have to get better at designing experiences, adults need to create safe spaces for kids to have conversations, and on the government and policy side, lawmakers need to understand kids' online experiences at this level of detail.
Given how times have changed, they now have to look closely at what experiences young people seek out online and what the impacts of those experiences are.
Doing this is not easy, however, because collecting data on the subject is hard. There are a variety of reasons for this, but the main ones are that the topics involved are extremely sensitive and that much of the best information is held by the private companies running the platforms.
To gather data on the topic, Thorn conducted its own survey from October 25 to November 11, in which 1,000 children aged 9 to 17 filled out a questionnaire with their parents' consent.
The survey produced many observations, but the central one was that because children under 13 are not supposed to be on the platforms, the tech giants build their safety tools without them in mind. On this point, Cordua argued that despite the age requirements, plenty of kids are still on social platforms, and the tech giants know it; it is therefore their responsibility to build support mechanisms that everyone can use and to make their applications a safe space for all.
Thorn found that 22 percent of minors who attempted to report that their nudes had been leaked didn't feel that the platform's reporting system allowed them to report it, which suggests tech giants should rewrite the language of their reporting flows so that it more accurately reflects the harm a user is trying to report. Moreover, while many minors said they were perfectly capable of using the reporting tools, they felt the safety and reporting menus did not offer enough options, and they could not find one that fit their situation.
Thorn also suggested several things companies can do to provide a safer environment on their platforms.
According to Thorn, tech giants can invest more heavily in age verification; even if they believe kids will always find a way online, stronger verification could at least reduce their numbers. Platforms could also integrate crisis support phone numbers into messaging apps to help kids find resources after experiencing abuse. They could share block lists with one another to help identify predators, though this would raise privacy and civil rights concerns. And they could invest more heavily in combating ban evasion so that predators and bullies can't easily create alternate accounts after they are blocked.
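Thorn's report does not prescribe how cross-platform block-list sharing would work. One privacy-conscious possibility is for platforms to exchange keyed hashes of identifiers rather than raw account data, so a leaked list alone reveals nothing. The minimal Python sketch below is purely illustrative; the shared key, the use of a phone number as the cross-platform identifier, and all names are assumptions, not anything described in the report.

```python
import hmac
import hashlib

# Hypothetical secret negotiated between participating platforms.
SHARED_KEY = b"secret-negotiated-between-platforms"

def blocklist_token(identifier: str) -> str:
    """Derive a shareable token from a normalized cross-platform
    identifier (e.g. a signup phone number or email), so platforms
    never exchange the raw identifier itself."""
    normalized = identifier.strip().lower().encode()
    return hmac.new(SHARED_KEY, normalized, hashlib.sha256).hexdigest()

# Platform A contributes a token for an account it has blocked.
shared_blocklist = {blocklist_token("+15551234567")}

def is_flagged(identifier: str) -> bool:
    """Platform B checks a new signup against the shared list; it can
    only match entries for identifiers it has itself verified."""
    return blocklist_token(identifier) in shared_blocklist

print(is_flagged("+15551234567"))         # True: same person, new platform
print(is_flagged("someone@example.com"))  # False: no match
```

Keying the hash means an attacker who obtains the list cannot brute-force identifiers without the shared key, though any scheme like this still concentrates sensitive signals about individuals, which is exactly the privacy and civil rights concern Thorn flags.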
Tech giants should also make their reporting services and responses more efficient: the report found that people who report an account or an issue often receive a response only after weeks, and in some cases never.
We sincerely hope the tech giants take action and make their services a safer space for the children who are part of the online community.