Jury Rules Instagram and YouTube Addictive, Negligent in Protecting Children
A jury in Los Angeles has issued a significant verdict against two of the world’s leading digital platforms, Instagram and YouTube. The jury found that these applications are addictive by design and that their owners were negligent in safeguarding the children who use them.
This ruling marks a serious moment for Silicon Valley with global implications. Meta and Google, the parent companies of Instagram and YouTube respectively, have been ordered to pay $6 million (£4.5 million) in damages to a young woman known as Kaley, the plaintiff in this case. Kaley alleged that the platforms contributed to her developing body dysmorphia, depression, and suicidal ideation.
Both Meta and Google have announced their intention to appeal the decision. Meta argues that no single app can be held solely responsible for the mental health crisis among teenagers, while Google maintains that YouTube should not be classified as a social network.
For now, the verdict signals that, in the words of Dr Mary Franks, a law professor at George Washington University, "the era of impunity is over."
It is difficult to overstate the significance of this court ruling for social media. Regardless of the appeals and ongoing legal processes expected, this decision is set to reshape the digital landscape and could mark the beginning of a fundamental change in how social media operates.

A 'Big Tobacco' Moment for Tech Giants?
While many users accustomed to endless scrolling might not be surprised by the verdict, it appears to have caught the tech companies off guard. Meta and Google incurred substantial legal expenses in defending the case, underscoring its importance to them.
Two other companies involved in the trial, TikTok and Snap (owner of Snapchat), settled before the case reached court, reportedly due to concerns about the costs of litigation.
Despite the platforms promoting various tools aimed at protecting children—often targeted at parents—the court concluded that these measures were insufficient.
Arturo Bejar, a former Instagram employee, revealed that he warned Mark Zuckerberg years ago about the platform’s risks to children. He stated on BBC Radio 4's Today programme,
"It changed from a product you used to a product that uses you."
Meta has denied these claims.
Some experts have likened the verdict to big tech’s "big tobacco" moment, drawing parallels to the tobacco industry’s legal battles, which, while impactful, did not eliminate smoking entirely.
This raises questions about potential future regulations such as health warnings on screens or restrictions on advertising and sponsorships.
Currently, tech companies in the US benefit from Section 230, a legal provision shielding them from liability for user-generated content. Other media companies do not enjoy this protection. Although the tech industry argues that Section 230 is essential for its survival, skepticism is growing. The US Senate Commerce Committee recently held a hearing on the matter.
Despite a generally close relationship between tech leaders and US President Donald Trump, who has supported the sector, Trump has not publicly defended the companies in this case.
Another possible outcome is that platforms may be compelled to remove features designed to maximize user engagement. However, engagement is central to big tech’s business model.
Removing features such as infinite scrolling, algorithmic recommendations, and autoplay would fundamentally alter the social media experience, potentially limiting its appeal.
The success of major platforms depends on attracting and retaining large numbers of users for extended periods to maximize advertising revenue.
In several regions, including the UK, regulatory interventions have stopped platforms from monetising children through this advertising revenue. Even so, the ideal scenario for tech companies is that children become established users by the time they reach adulthood.
Facebook, Meta’s original social network, is often referred to as the "boomer platform," yet 2025 data indicates that nearly half of its global users are aged 18 to 35.
Further Legal Challenges Expected
Kaley’s court victory represents big tech’s second loss in a series of similar cases scheduled for trial in the US this year, with more anticipated.
Dr Rob Nicholls of the University of Sydney commented,
"This landmark verdict, along with many other similar lawsuits against social media companies, signals a shift in how courts view platform design as a set of choices that can carry real legal and social consequences."
"It opens the door to wider challenges against social media and other technology systems engineered to maximise engagement at the expense of user wellbeing."
Australia has already taken decisive action, having banned users under 16 from the largest social platforms since December.
The UK and other countries are considering similar measures, and this verdict strengthens the case for such policies.
For parents who have struggled with the impact of social media on children, banning these platforms for minors is a clear solution.
British mother Ellen Roome, who campaigns for social media reform following the death of her 14-year-old son Jools Sweeney in 2022—believed to have been caused by a dangerous online challenge—urged,
"Just do it now."
However, the UK Parliament remains divided. The House of Lords and Commons are engaged in a legislative process known as "ping pong" over an amendment to the Children's Wellbeing and Schools Bill. This amendment would require ministers to decide within a year which platforms should be banned for under-16s.
Perhaps this new verdict will unify lawmakers not only in the UK but internationally. It may prompt future generations to question why children were ever allowed unrestricted access to social media.
