Beautiful Virgin Islands

Wednesday, May 13, 2026

Mother sues Meta and Snap over tween daughter’s suicide

An American mother alleges the firms are to blame for the suicide of her 11-year-old daughter, who had an "extreme addiction" to social media.
Meta Platforms Inc. and Snap Inc. are to blame for the suicide of an 11-year-old who was addicted to Instagram and Snapchat, the girl’s mother alleged in a lawsuit.

The woman claims her daughter Selena Rodriguez struggled for two years with an “extreme addiction” to Meta’s photo-sharing platform and Snap’s messaging app before taking her life last year.

The complaint in San Francisco federal court isn’t the first lawsuit to blame a youth’s suicide on social media, but it comes at a sensitive time for platforms that engage millions of young people worldwide.

In November, a group of U.S. state attorneys general announced an investigation of Instagram over its efforts to draw children and young adults, taking aim at the risks the social network may pose to their mental health and well-being. The states’ probe was launched after a former Facebook employee turned whistle-blower testified in Congress that the company knew about, but didn’t disclose, harmful impacts of its services like Instagram.

The backlash against social media isn’t limited to the U.S. The father of a 14-year-old in the U.K. touched off a firestorm when he blamed her 2017 suicide partly on Instagram. The company told the BBC that it doesn’t allow content that promotes self-harm.

“We are devastated to hear of Selena’s passing and our hearts go out to her family,” a Snap spokesperson said Friday in an emailed statement. “While we can’t comment on the specifics of active litigation, nothing is more important to us than the wellbeing of our community.”

Meta and Snap knew or should have known that “their social media products were harmful to a significant percentage of their minor users,” according to Thursday’s lawsuit. “In other words, defendants intentionally created an attractive nuisance to young children, but failed to provide adequate safeguards from the harmful effects they knew were occurring on their wholly owned and controlled digital premises.”

Meta representatives didn’t respond to an email seeking comment.

A Meta spokesperson said in November that allegations the company puts profit over safety are false and that “we continue to build new features to help people who might be dealing with negative social comparisons or body image issues.”

Snap said in May it was suspending projects with two app makers “out of an abundance of caution for the safety of the Snapchat community” in light of a wrongful-death and class-action suit filed in California that accused the companies of failing to enforce their own policies against cyber-bullying.

Tammy Rodriguez, who lives in Connecticut, said when she tried to limit her daughter’s access to the platforms, the girl ran away from home. She took her daughter to a therapist who said “she had never seen a patient as addicted to social media as Selena,” according to the suit.

The lawsuit levels its harshest criticism at Snapchat, saying the platform rewards users in “excessive and dangerous ways” for engagement. The mother alleges claims of product defect, negligence and violations of California’s consumer protection law. One of the lawyers on the case is from Social Media Victims Law Center, a Seattle-based legal advocacy group.

“Snapchat helps people communicate with their real friends, without some of the public pressure and social comparison features of traditional social media platforms, and intentionally makes it hard for strangers to contact young people,” the Snap spokesperson said. “We work closely with many mental health organizations to provide in-app tools and resources for Snapchatters as part of our ongoing work to keep our community safe.”

Social media companies have been largely successful fending off lawsuits blaming them for personal injuries thanks to a 1996 federal law that shields internet platforms from liability for what users post online.

The case is Rodriguez v. Meta Platforms Inc. f/k/a Facebook Inc. 3:22-cv-00401, U.S. District Court, Northern District of California (San Francisco).