Meta, the parent company of WhatsApp, has lowered the minimum age for users in the UK and EU from 16 to 13 years old. The decision has angered many child safety groups, who argue that Meta is prioritising profits over the safety of children.
The group Smartphone Free Childhood said this move goes against a growing demand for more child protection from big tech companies. They feel that allowing younger children to use the platform suggests it is safe, although many teachers, parents, and experts disagree.
WhatsApp says the new age limit matches the one already in place in most other countries and insists it has safety measures in place. Meanwhile, Ofcom, the UK's communications regulator, is preparing to enforce stricter online safety rules. Mark Bunting, who leads Ofcom's online safety strategy, said the regulator is drafting new codes of practice for online platforms. Once these rules take effect next year, Ofcom will assess whether companies are adequately protecting users, and those that fall short could face heavy fines.
Meta has also introduced new safety features on Facebook and Instagram to address issues like sextortion and the sharing of private images. One new tool is a nudity protection filter, which automatically blurs images containing nudity for users under 18 years old.
The tool also lets users block the sender and report the chat if they receive inappropriate images. It is intended to help users feel safe and resist pressure to respond to such images. Meta says these steps are part of a broader effort to make its platforms safer for all users, especially young people.