What an LA County court case means for the future of social media
In summer 2023, a plaintiff identified only by the initials “K.G.M.” brought a case against Meta, Google, TikTok and Snap (the owner of Snapchat). The suit alleged that the social media companies had engineered their sites to encourage users to engage compulsively with the platforms, causing the plaintiff to suffer from anxiety, body dysmorphia and depression as a child.
Meta and Google asked for the case to be dismissed, but the judge declined in November 2025. A month later, Snap and TikTok settled out of court. Despite testimony from Mark Zuckerberg (the founder of Meta) and Adam Mosseri (the head of the Meta-owned Instagram), the jury found in favour of the plaintiff.
As a result of the jury’s decision, Meta and Google were found negligent in the design of their applications. The judge ordered the companies to pay the plaintiff US$3m (more than £2.2m) in damages and a further US$3m in punitive damages, with Meta ordered to pay two-thirds of the total.
From moderating content to regulating design
It’s worth noting that the plaintiff focused their argument on the harm they suffered as a result of the design of the social media platforms, not on the content hosted on them. As a result, Section 230 of America’s 1996 Communications Decency Act, which usually provides social media companies with immunity from prosecution, was not a viable defence in this case.
“Section 230 of the Communications Decency Act shields online platforms from liability for content created by third parties that is hosted on their platforms,” says Hellen Mukiri-Smith, a lecturer in law at Loughborough University. “The decision was not a complete surprise because, in recent years, US courts have found that Section 230 does not grant social media companies full immunity from negligence claims.”
K.G.M. is the first of 1,600 plaintiffs suing Meta, Google, TikTok and Snap for harms caused by social media to children, as part of a consolidated action (separate lawsuits combined into a unified case to improve efficiency and reduce costs). The verdict in the trial could be used to determine a global settlement, although Meta and Google are appealing against it.
The K.G.M. case has come to be viewed as a bellwether case. “A bellwether case is an indicator of future trends in litigation, or a test case intended to gauge how juries will react to key evidence and legal arguments made in a case. It can help to predict how similar cases will be decided in the future,” says Mukiri-Smith. “The K.G.M. case is a bellwether case that is part of multi-district litigation suits filed across the United States on behalf of children and young people against the largest big tech companies.”
This is not the only court case affecting social media companies. In New Mexico, a court recently ruled that Meta violated New Mexico’s Unfair Practices Act by failing to safeguard young users from child predators on its platform, and it was fined US$375m (almost £280m).
In November 2025, Spain’s Mercantile Court No. 15 ruled that Facebook had gained an unfair market advantage by extracting personal data in violation of European law, and ordered it to pay €481m (almost £420m) in damages to Spanish media outlets. A year earlier, the European Commission fined Meta €797.72m (almost £700m) for breaching EU antitrust rules.
Although the aforementioned cases operate within the existing legal framework, they are being viewed as a shift in the legislative stance on social media regulation by governments around the world.
All of these legal actions highlight the viability of similar negligence claims against social media companies for harms suffered by their users. Meta and Google are appealing against the latest outcome, but if the decision is upheld, it could set a legal precedent for future cases challenging harmful platform designs.
In the US, it will therefore become harder for platform providers to rely on Section 230 to avoid liability for building addictive platforms.
From legal challenges to legislative compliance
Some countries have now banned social media platforms for children. Australia recently imposed a social media ban for anyone under the age of 16, while the UK is running a pilot programme to explore the practicalities of banning social media for people under 16.
More than anything else, the LA County court case shows that the focus is shifting from regulating harmful content online (as in the UK’s Online Safety Act 2023) to regulating harmful design practices that create addictive or toxic platforms. It is no longer so much about what is on the platform, but about how the platform operates and engages with its users.
For a long time, social media platforms have typically offered a free service that generates revenue through advertising. With the new legislative focus on addictive design practices, some social media companies may find themselves struggling to maintain their competitive edge. It will therefore become a balancing act between developing alternative policies that do not create toxic platforms for their users and maintaining a fiscally viable business model.
Engaging with regulators may enable social media providers to develop platforms that offer a healthier online experience while still sustaining revenue.
“Social media companies owe a duty of care to the public to design products and operate services in ways that promote human wellbeing and fundamental rights,” says Mukiri-Smith. “Companies should stop employing addictive design features that are primarily aimed at increasing time spent on their platforms rather than enabling the operation of platform services – features that exploit and amplify the vulnerabilities of children, young people and other groups with structural disadvantages, who are more likely to be drawn into addictive engagement cycles.”
Should certain platform design practices be deemed harmful, it will be incumbent upon social media companies to find alternative ways of creating engaging platforms that are not addictive or toxic. They will have to look at options for providing safeguards for users, such as setting daily usage limits, minimising unnecessary notifications or adopting ethical age verification systems.
Because K.G.M.’s legal action focused on addictive design features, such as algorithmic amplification, beauty filters, constant notifications and endless scrolling, social media companies are no longer exempt from prosecution under the Section 230 shield.
Although no new social media legislation has been announced, the recent legal actions imply that it is a case of when, not if, further laws are enacted around the world. Social media companies should therefore prepare by engaging with regulators and developing improved, ethical practices when building their platforms, in order to maintain a competitive edge.
Meta and Google were approached for comment, but did not respond.