Social media giants face the heat in child safety lawsuits


Child online safety has become a major topic of debate over the past six months, and tech giants like ByteDance, Meta, Snap, and Alphabet are now facing the heat. Their platforms are alleged to be addictive and unsafe for children, and concerns over their adverse effects on young users' mental health have brought these companies before the courts multiple times.

Forty-two US states, along with various school districts across the country, filed suit claiming that social media platforms cause emotional and psychological harm to minors. The lawsuit addresses the profound impact that Instagram, Facebook, and other sites have on young Americans' social and psychological well-being, consolidating more than 140 separate actions and individual cases brought against the platforms.

Shocking discoveries are on the way 

Recently, US District Judge Yvonne Gonzalez Rogers rejected the companies' requests to dismiss the lawsuits, which allege that their platforms are addictive to children. Most of these suits were filed by schools and various US states.

The lawsuit combines over 100 cases filed in early 2022, after Facebook whistleblower Frances Haugen revealed Instagram’s negative influence on teen mental health.

Another Meta whistleblower, Arturo Béjar, criticized the company’s policies, saying that Meta is fully aware of the harm its platforms cause children but has failed to act. According to Béjar, Meta gives users “placebo” tools that do not address the issues affecting teenagers, and the company misrepresents how frequently users, particularly children, experience harm.


Lawsuit focuses on design changes

The lawsuit mainly focuses on applying product liability law to online platforms, requiring improved warnings and design changes. The ruling held that Instagram, Facebook, YouTube, Snapchat, and TikTok can face liability despite Section 230 of the Communications Decency Act and the protections of the First Amendment.

Section 230 states that online platforms shouldn’t be treated as the publishers of content posted by third parties. This means that social media platforms generally cannot be held liable if a user posts something illegal or disturbing on them. The big tech companies were hoping to secure immunity under this very section.

However, Judge Rogers refused to let Section 230 shield the companies entirely. The court held that the platforms are responsible for their own design choices, including their failure to provide adequate parental controls that parents could use to limit their children’s screen time.

In her ruling, Judge Rogers noted that the plaintiffs' complaints do not fall under the category of free speech or expression. Rather, they concern problems like the lack of robust age verification, inadequate parental controls, and the difficulty of deleting accounts.

The plaintiffs argue that it is these design features, not the content itself, that are to blame for the mental health issues. “Addressing these defects would not require that defendants change how or what speech they disseminate,” Judge Rogers wrote.

Why does the case matter? 

Helpful tools

The best parental control apps for Android and iOS and the best VPN services give you tools to monitor and understand what your children do online, helping you limit their screen time and restrict access to inappropriate sites.

It’s rare for this many states to collaborate in suing tech giants over consumer harm. These joint efforts show that states are taking harm to children seriously and pooling their resources to take on social media platforms, much as they once took on Big Pharma and Big Tobacco.

Lawmakers worldwide are also pushing to regulate how children use Instagram, Facebook, and other platforms.

In the last few years, Utah, California, and Britain have passed laws to improve privacy and safety protections for youngsters. Utah's law, for example, automatically turns off social media notifications overnight for children to reduce interruptions to their sleep. However, child safety lawsuits in the US are moving slowly, as the tech giants work hard to get them dismissed.

Recently, a court document revealed that Meta CEO Mark Zuckerberg rejected various initiatives to make the platform safer for children and teens. Meanwhile, Google spokesperson José Castañeda argued that the claims against his company are false, saying that YouTube offers age-appropriate content for families and kids along with robust parental control features.

The other platforms haven't responded yet. There have been many lawsuits in recent years claiming that social media platforms harm kids, but most of them, including one over harassment on Grindr, were dismissed before getting a full hearing in court.

Recent studies have shown the many ways online platforms can damage mental health, and lawmakers are under pressure to pass protections for children, such as age verification requirements. While it's not yet clear whether online platforms are legally responsible for these harms, this lawsuit may open the door to stronger safety claims in the future.

Krishi Chowdhary
Contributor

Krishi is a VPN writer covering buying guides, how-tos, and other cybersecurity content here at Tom's Guide. His expertise lies in reviewing products and software, from VPNs, online browsers, and antivirus solutions to smartphones and laptops. As a tech fanatic, Krishi also loves writing about the latest happenings in the world of cybersecurity, AI, and software.