Social media is now a massive liability for Meta, Google and the rest of Big Tech



By Jim Steyer

Landmark verdicts shatter the Section 230 shield, turning 'addictive' product design into a legal thicket for Meta, Alphabet and others

Meta Platforms CEO Mark Zuckerberg leaves a Los Angeles courthouse in February after testifying in a lawsuit alleging that Meta's social-media platforms harm children. A jury ruled this week against Meta and co-defendant Alphabet.


For the first time ever, courts have meaningfully held social-media companies accountable for harming our kids.

This sea change was a long time coming. Social-media companies have had a green light to use our children as guinea pigs in a massive, uncontrolled experiment. That experiment fostered an unprecedented youth-mental-health crisis that's still playing out today.

The consequences have been dire. A raft of research shows that excessive use of addictive-by-design platform features is linked to a rise in depression, anxiety and other mental-health issues in kids and teens. Children have taken their own lives after being sucked into dark social-media rabbit holes. Yet, until now, tech companies have been able to sidestep responsibility for the pain they have wrought on families across the country.

At Common Sense Media, we have long warned that the issue is not the material users post but the very design of tech companies' platforms. Many of our children's favorite apps and social-media sites were built from the ground up to keep young users hooked and helpless to look away, consequences to their health and well-being be damned.

A series of lawsuits has now broken through the public-relations and lobbying front to bring this reality home. Rulings in cases from New Mexico and California, both of which claimed that social-media companies knew their products harmed children, are giant leaps toward a safer, healthier digital future.

In the California case, a jury awarded a young woman $6 million after finding that Meta Platforms (META) and Alphabet's (GOOG) (GOOGL) Google were negligent in designing platforms that exploited her as a child, contributing to her anxiety, body dysmorphia and suicidal ideation.

Read: Meta's Mark Zuckerberg sticks to talking points in testimony on kids' Instagram use

The plaintiff had settled similar claims out of court with TikTok and Snap (SNAP). In New Mexico, a jury hit Meta with $375 million, the culmination of state Attorney General Raúl Torrez's case that documented how the company's platforms funneled sexual predators toward minors with chilling efficiency.

In both cases, the central finding was the same: that the harm was a feature, not a bug.

Read: American teens spend an average of 5 hours a day on social media. Here's what it's costing them.

Nine-figure verdicts won't break Meta, which is valued at well over a trillion dollars. But their legal and cultural impact is far greater than any dollar amount, and the total monetary damages will almost certainly rise: more than a thousand similar cases are waiting to be heard.

For decades, the tech industry has hidden behind Section 230 of the Communications Decency Act of 1996, a federal law written when the internet was a novelty and its architects could not have imagined algorithmic feeds engineered to maximize screen time, much less those feeds' effect on developing adolescent brains.

Section 230 was designed to protect a bulletin board from being held responsible for what a stranger pins to it. It was never intended to shield a corporation from the consequences of its own engineering decisions. That is why Common Sense Media has long called for reforming the law to better protect our kids.

From the archives (October 2020): Jim Steyer: Only a breakup of Facebook and controls on social media can reduce disinformation and lies on the internet


Plaintiffs' attorneys in the California case understood this distinction clearly and built their strategy around it. They did not argue that Instagram or YouTube should be liable for what users posted, but rather that the platforms themselves, which weaponize infinite scroll, autoplay and addictive feeds, are dangerous products.

That argument, and the jury's acceptance of it, is a legal watershed. It means Section 230, long treated as a near-absolute shield, now has a meaningful boundary. Design choices are not user-generated content. Harm engineered into a product's architecture is the manufacturer's responsibility.

Meta's own internal research showed that Instagram was harmful to teenagers, and the company knew it - yet it chose the relentless pursuit of engagement over kids' safety. At long last, two separate juries have reached the same conclusion.

Read: Kids as young as 13 can now trade stocks without a parent's approval - but don't ask them, 'How much did you make today?'


The verdicts are significant not only for confirming what we and others have long known, but for the changes in product features that the courts might end up imposing. And the cases will put renewed energy behind pending legislation to force product-design changes and stronger privacy protections.

State legislatures are already making progress. New York, California and other states, for instance, have passed age-appropriate design codes, laws requiring warning labels on social media, and restrictions on algorithmic feeds and notifications.

Other states, both red and blue politically, are considering proposals to rein in unsafe social-media design. Our new research shows that these actions have come just in time: families across the U.S. and across the political spectrum are sounding the alarm about the urgent need for safeguards for kids online.

Despite this state-level progress, Congress has been slow to act. The Kids Online Safety Act has stalled despite bipartisan support in the U.S. Senate, leaving families with no federal framework to protect their children online. Further, some federal policymakers are pushing to block states from enforcing their own laws on tech safety.

Legislative and legal wins, though, have tilted the momentum in favor of kids and families. In addition to the state-level court cases, a federal trial is scheduled for later this year against Meta, Google, TikTok and Snap. The plaintiffs filing these cases deserve our enormous appreciation and respect. They are the hammer cracking the tech companies' legal shield, letting the world know what families have known for years: that social-media companies build their products to keep kids addicted at all costs. The juries said exactly that.

Congress should wake up and act. This is Big Tech's Big Tobacco moment. Every lawmaker must now ask themselves on which side of history they will choose to stand.

Jim Steyer is the founder and CEO of Common Sense Media.

More: Jury finds Instagram and YouTube liable in landmark social-media addiction trial

Also read: New Mexico jury says Meta harmed children's mental health and safety, must pay $375 million


This content was created by MarketWatch, which is operated by Dow Jones & Co. MarketWatch is published independently from Dow Jones Newswires and The Wall Street Journal.

 

(END) Dow Jones Newswires

March 28, 2026 13:04 ET (17:04 GMT)

Copyright (c) 2026 Dow Jones & Company, Inc.
