States Accuse Meta of Ignoring Safety, Hooking Kids on Social Media

Attorneys general from 45 states and D.C. have filed multiple lawsuits against Meta, the parent company of Instagram and Facebook. The suits claim Meta knowingly used design features intended to hook kids and teens on its platforms while downplaying the risks.

According to a report in The New York Times, the Tennessee filings alone run to 1,400 pages. The documents show that senior Meta executives, including CEO Mark Zuckerberg, publicly advocated for safety features while internally dismissing pleas to hire more staff to protect young users.

The lawsuits raise concerns about children being exposed to online sexual exploitation and about the role of algorithms in keeping them hooked. Surgeon General Vivek Murthy has called for warning labels on social media, arguing that it poses a public health risk to kids. Congress is also considering the Kids Online Safety Act, which would force social media companies to turn off certain features for minors, such as notifications.

Meta spokesperson Liza Crenshaw said the company is committed to keeping its platforms safe for teens. Crenshaw pointed to more than 50 safety tools the company has built, including features that prevent minors under 16 from receiving messages from people they don’t know and that restrict access to inappropriate content. She called the lawsuits “misleading” and said they “cherry pick” quotes and documents.

Despite Meta’s assurances, some parents remain skeptical. Mary Rodee of Canton, NY, sued Meta after her 15-year-old son died following a sexual-extortion scheme on Facebook run by someone posing as a teenager. Rodee says Meta never responded to the reports she filed about her son’s situation.

Meta’s internal documents show an ongoing struggle to keep teens engaged. A 2016 survey found that teens were shifting to Snapchat, so Zuckerberg launched initiatives to boost engagement on Instagram, including features modeled on Snapchat’s disappearing posts.

Instagram’s policy requires users to be at least 13, but the lawsuits allege Meta knows millions of its users are younger.

Ethicists and child psychologists say the lawsuits highlight a profound ethical failure: prioritizing engagement and profit over user safety, especially for vulnerable children and teens. The ad-driven business model rewards maximizing user engagement at the expense of mental health, with algorithms designed to keep users hooked.

Addressing these issues will require stricter age verification systems, better parental control tools, and educational resources. Meta could also invest in more human content moderation, mental health resources, and features that promote healthy usage habits, such as screen time reminders. Regulatory oversight and legislation, such as the Kids Online Safety Act, should enforce stricter standards for user safety and data privacy.

Greater corporate accountability and transparency in handling user safety are crucial, and examples from other tech companies show that balancing profit with ethical responsibilities is possible.

Image: DIW-Aigen

Read next: 

• X to Restrict Live Streaming to Premium Subscribers

• Photographers Criticize Meta for Errors in 'Made with AI' Photo Tagging