
What YouTube and Meta’s loss in the “social media addiction trial” could mean for your feed.
Guest: Ryan Mac, business and technology reporter for the New York Times.
Want more What Next TBD? Subscribe to Slate Plus to access ad-free listening to the whole What Next family and all your favorite Slate podcasts. Subscribe today on Apple Podcasts by clicking “Try Free” at the top of our show page. Sign up now at slate.com/whatnextplus to get access wherever you listen.
Podcast production by Evan Campbell, and Patrick Fort.
Hosted on Acast. See acast.com/privacy for more information.
That new thing?
Yeah, we've got it.
The Drop by G&C, bringing you all the newness that matters, handpicked by the pros,
who actually know what's up, and what's proven to work.
We keep you on top of the trends and dialed into what's next, whether you're crushing
it at the gym, leveling up your game, or thriving every day.
The Drop by G&C is where the latest solutions in health and wellness land first, non-stop
innovation and fresh finds daily.
Explore what's new, and what's next on The Drop by G&C.
Reddit is where people go for real human advice.
When you advertise on Reddit, your brand shows up alongside the most trusted recommendations
on the internet.
It's authentic, brand-safe, and highly effective across your entire funnel, ready to show
up where people trust what they read. New advertisers spend $500 and get $500 in
ad credit. Terms apply.
Start your test today at business.reddit.com slash podcasts.
Valid for new Reddit ad accounts only, only valid once per business, additional terms
apply.
See business.reddit.com for more information.
These are live images outside the courthouse in Los Angeles, where a verdict has been reached
in a potentially landmark social media addiction trial. Meta and YouTube were essentially on trial,
and based on a jury's finding on Wednesday, liable for damages.
The jury in Los Angeles Superior Court said that Meta and YouTube's negligence was a substantial
factor in causing harm to the plaintiff, who went by her initials, KGM, in court.
The jury also said the companies failed to adequately warn users of the dangers of Instagram
and YouTube.
New York Times tech reporter Ryan Mac was at the courthouse.
I asked him to describe the scene for me when the verdict came down after nine days
of deliberation.
There were camera crews outside who were also very tired waiting for this verdict.
Three sets of lawyers and, you know, these families of social media users who
had died, or died by suicide, after having bad experiences on social media.
So those families were also around the courthouse.
But everyone was just kind of waiting for this thing to come in.
Nine days of deliberation is a pretty long time.
Were you expecting this verdict?
You know, after a certain point, I think I was expecting something to happen.
At least a verdict against these companies. I think it would have been easier for the
jury to go into deliberation and be like, oh, there's no big deal here,
let's get out of here, no one's liable.
But they really took their time going through the evidence and going through everything that
happened in the trial.
I think they understood the gravitas of this case, which, you know, is a bellwether case,
one of nine cases to be heard in LA regarding social media.
And they, they took that responsibility pretty seriously.
What was the reaction like, certainly among the families when the jury returned its verdict?
The judge had issued some warnings about, you know, not displaying too much emotion,
or you'd get kicked out of the courtroom.
But, you know, there were smiles, kind of stifled cheers in a way, stifled outbursts.
These people had been waiting for a long time.
And while these families weren't necessarily involved with the plaintiffs per se, you
know, they weren't the family members of the plaintiff.
They viewed this as kind of a proxy justice for their own loved ones.
And, you know, these loved ones had used social media.
They had had negative experiences and maybe died after using social media according to them.
And so they saw this as justice against these, like, you know, massive
companies that they had no way of holding to account.
This is actually the second court case that Meta lost this week.
On Tuesday, a New Mexico jury found that Meta had failed to protect kids on its apps from
child sexual exploitation.
Meta has lost in court twice this week.
And not even a year ago, the idea of losing big cases, let alone this many this quickly,
I think might have seemed unfathomable.
Same thing goes for YouTube, that same idea.
So I wonder what that tells you about how we think about online platforms or maybe how
juries think about online platforms in 2026.
It's strange.
I've reported on Meta for almost a decade now.
And, you know, the thought of this happening seven or eight years ago, when I was really
covering the company seriously, is unfathomable. And it's sort of like London buses,
these two things coming at the same time.
The New Mexico verdict that you're talking about, $375 million in this state case about
child exploitation.
You know, it's a bit apples and oranges to the verdict that we got in L.A., which was
about social media addiction and whether or not these companies design their platforms
to be addictive, but it does represent something of a shift here or ground shift in, you know,
holding these companies again to account.
Today on the show, big tech loses in court.
Are juries ready to hold companies accountable when Congress won't?
I'm Lizzie O'Leary and you're listening to What Next: TBD, a show about technology, power,
and how the future will be determined.
Stick around.
This episode is brought to you by Duck.ai, a new product from DuckDuckGo.
It's 2026 and the news is full of AI.
It can be a little overwhelming, especially because it's easy to wonder if what you are
asking a chatbot is really secure.
That's why DuckDuckGo built Duck.ai.
Duck.ai is designed to let you chat privately with the same AIs you might already be using.
You go to Duck.ai and they promise you can chat privately with the same AIs you might
already be using and your data stays yours, not theirs.
There are no signups, it's completely free, no subscriptions required.
And it's from DuckDuckGo, the company known for protecting your privacy, not collecting
your data, for nearly two decades.
If you want to use AI, but you're concerned about privacy, visit duck.ai/tbd today.
That's duck.ai/tbd, from DuckDuckGo, where AI is always optional and designed to be private.
This episode is brought to you by Pebble.
Hiring is hard to do.
I've been there, from hundreds of applications and dozens of interviews, to the actual back
and forth of onboarding.
It turns out hiring new help is actually a lot of work, but that's where Pebble can help.
Pebble is an AI-powered global human resources platform built for founders, HR leaders and
operators who are hiring and supporting teams around the world.
Pebble is designed to help you hire, pay, and manage talent in over 185 countries, with
fast onboarding that can be done in minutes.
You can send offers to anyone in the world and get them onboarded fast.
Instead of juggling separate tools for contracts, payroll, benefits, and compliance, Pebble
brings everything together, with built-in guidance and local expertise to support you.
This is especially helpful if you're managing teams internationally or planning to grow
your business.
Some of the fastest growing companies in the world use Pebble to stay organized and reduce
risk, because you can use it to scale faster without feeling like you need to become an HR
or compliance expert.
Overall, Pebble simplifies global people operations, so you can spend more time growing the business
and supporting your team.
Pebble's new standard discounted pricing, at $399 per month per employee, helps you contain
costs.
Go to highpebble.ai to get a free estimate. That's h-i-p-e-b-l dot ai for a free estimate.
Hi, this is Pablo Torre from Pablo Torre Finds Out, and today I want to talk to you about
Boost Mobile, because we spent a lot of time analyzing inefficiencies in sports, overvalued
contracts, money tied up in the wrong places, and so on, but those inefficiencies aren't
just on a roster.
Sometimes, they're in your own monthly expenses.
Boost Mobile says switching to their $25 unlimited forever plan can unlock up to $600 in
savings a year.
That's $25 a month for unlimited data, talk, and text when you bring your own phone.
If that money is trapped in a pricey phone bill, it might be worth a second look.
Visit BoostMobile.com to learn more.
After 30 gigabytes, customers may experience lower speeds. Customers pay $25 per month
as long as they remain active on the Boost Mobile Unlimited Plan.
Savings claim based on a January 2026 Boost Mobile survey of 1,000 Americans with
single-line unlimited plans, comparing average annual payments of major carriers to 12 months
on the Boost Mobile Unlimited Plan.
For full offer details, visit BoostMobile.com.
Let's step back a bit for people who haven't been following this as closely as you have.
Can you tell me the story of this bellwether case in LA and the story of this plaintiff?
We'll start with the plaintiff.
The plaintiff is a 20-year-old woman who goes by KGM, and her case was filed, I believe,
about four years ago, when she was still a minor.
Throughout the proceedings, and even after the case, she has remained anonymous.
She also goes by Kaylee.
From what we know about her, she was in the court.
She testified.
She's from Northern California.
She has used YouTube since she was about eight years old, I believe, and Instagram since
she was nine, and has a lot of mental health issues that she says are attributable to the
use of those social media platforms, things like body dysmorphia, for example, and anxiety,
that she says would not have been as bad, or even have happened, without the social media.
In terms of this case itself, it's called the bellwether case because it's kind of the
first of many.
It sets the tone.
It's one of many, many cases that have been filed against these companies.
It just happened to be that this one was the first selected to go to trial.
In Los Angeles, there are eight other cases that are set to go to trial.
There are other cases in the Northern District of California as well.
We're about to see a kind of period where these companies are constantly under trial.
I'm talking about Meta and YouTube, you know, Alphabet, I guess, is the parent company.
This lawsuit also originally included TikTok and Snap, although those two companies settled
before the trial began, but it could be this kind of, I guess, golden age of litigation
against these companies.
What is so key about this case is that it was about addiction and design of the platform,
not the content itself.
I think that is an incredibly important distinction.
I wonder if you could explain why that's so key and why that legal strategy was so important.
The legal strategy is very important because it completely avoids Section 230 of the
Communications Decency Act.
Correct.
Section 230, which is a decades old law, says that companies are not liable for the content
that their users post.
So if a user, for example, posts a death threat against another user, Instagram or Reddit
or wherever, is not responsible for that piece of content, even if it is in violation of
other laws.
The plaintiff's attorneys and the plaintiffs took a look at this and said, we don't want
to screw with that.
We don't want to go up against that because the precedent there is so strong that we'll
just get our case thrown out on those grounds.
What they went after was the design of these platforms, and whether the executives of these
companies were aware that the design of these platforms could lead to negative outcomes or
social media addiction.
This is the big tobacco philosophy, basically.
And based on the verdict, it was a winning strategy.
Were there any moments in court that really stood out to you, particularly around these
sort of internal communications?
They had Mark Zuckerberg and Adam Mosseri.
Head of Instagram.
Yeah, head of Instagram.
And it's interesting, because you watch or you read about this testimony, and they're trying
to thread the needle here, where they feel that they're doing the best they can for kids,
but they also don't want to admit wrongdoing or fault.
And it's just this kind of weird dance that you get with these executives.
Mark Zuckerberg, especially. Anytime he testifies, it's not very common, so it was always
a spectacle when that happened.
I was particularly interested in, I mean, I was there for closing, and something
that really intrigued me is this larger idea of how do you hold these companies accountable.
There were two phases of this trial: one was trying to determine liability.
And the second, much quicker phase was punitive damages.
Once they established liability, what was going to be punitive to these companies? What
could actually punish them?
And you know, as someone who has reported on these companies for years now, I understand
that these are the wealthiest, most powerful companies in the world.
And how do you punish a company that has a one and a half trillion dollar market cap in
Meta or three and a half trillion dollar market cap in Google?
And ultimately, the jury didn't really grapple with that, I think.
But I think this is something that will come up in later cases as well as is like, what
will prevent these companies from doing this again?
And what is the kind of appropriate financial penalty to assess to them?
Both companies say they will appeal.
Meta sent us a statement saying, "Teen mental health is profoundly complex and cannot be
linked to a single app. We will continue to defend ourselves vigorously as every case is
different and we remain confident in our record of protecting teens online."
In terms of damages, the jury awarded the plaintiff $3 million in compensatory damages
and another $3 million in punitive damages.
The jury said Meta should pay 70% of that and YouTube 30%.
For companies with market caps north of a trillion dollars, well, it's a drop of a drop in
the bucket for these companies.
It's not even a slap on the wrist.
It's a tap on the wrist. $3 million, or $6 million, or whatever you want to call it,
however many millions. I feel like they've probably spent more on lawyers and legal fees
in this case than they will have to pay to the plaintiff.
I think it was very interesting watching this trial, in that the verdict came down initially
against Meta and YouTube, and you could kind of see the faces of the lawyers drop.
Especially with YouTube, I don't think they were necessarily expecting this.
Meta, you know, I think there's a stronger case against them, and there was a lot more
focus on Meta. But after that first phase of trial on Wednesday, I think the
companies thought, dang, we're going to be in for one here. And then you get to the second
phase, where the jury goes and determines punitive damages, what's punitive to these companies.
And they come out and they say, you know, just another $3 million.
And, you know, I think that's in some ways a win for these companies.
That's such a small amount.
The plaintiff's attorney, Mark Lanier, was asking for potentially hundreds of millions, if
not billions, of dollars in punitive damages.
He had this very interesting prop where he brought a jar of M&Ms and each M&M represented
a billion dollars in net worth to these companies.
And so he said, you know, what is it if I take out one M&M or two M&Ms or a handful
of M&Ms?
It's not going to matter.
These companies will not notice.
That's why you as a jury need to consider that when you punish them.
That's kind of what takes me to my next point, which is the question about reputation and
public opinion, because obviously, from an economic standpoint, right,
as you said, this is not even a tap on the wrist.
There is this large number of other similar cases.
It does make me wonder like, are we now in this new era of juries and plaintiffs attorneys
and state agencies and just people suing social media companies in a way that we've never
seen before?
And I think that's where the lasting impact of this trial will take us.
You know, if there had been a verdict where both companies were not liable, that would
have shut the door for many of these other cases.
But now we have this precedent where a plaintiff came in, sued these companies based on the
design of their apps and came away with a win.
You know, maybe financially not the win that some would have expected, but a win nonetheless.
And so you take that and you apply that now down the road. That was just one person.
What happens if you have eight other cases, and what if everyone in those cases
is awarded $6 million, and then other people see that and they file their own cases?
There could be this kind of domino effect here.
And I think if you're a plaintiff in those cases waiting in the wings, you are a little
heartened by that.
That there is a path to victory and there is a path to holding this company accountable.
And I think that's kind of what we'll see as the lasting kind of legacy of this.
After the break, is section 230 really going to change?
Hi, this is Pablo Torre from Pablo Torre Finds Out, and today I want to talk to you about
Boost Mobile, because we spent a lot of time analyzing inefficiencies in sports,
overvalued contracts, money tied up in the wrong places and so on.
But those inefficiencies aren't just on a roster.
Sometimes they're in your own monthly expenses.
Boost mobile says switching to their $25 unlimited forever plan can unlock up to $600 in savings
a year.
That's $25 a month for unlimited data, talk and text when you bring your own phone.
If that money is trapped in a pricey phone bill, it might be worth a second look.
Visit boostmobile.com to learn more.
After 30 gigabytes, customers may experience lower speeds, customers pay $25 per month
as long as they remain active on the boost mobile unlimited plan.
Savings claim based on a January 2026 boost mobile survey of 1,000 Americans with single
line unlimited plans, comparing average annual payments of major carriers to 12 months
on the boost mobile unlimited plan.
For full offer details, visit boostmobile.com.
When you manage procurement for multiple facilities, every order matters, but when it's
for a hospital system, they matter even more.
Grainger gets it, and knows there's no time for managing multiple suppliers and no room
for shipping delays.
That's why Grainger offers millions of products and fast, dependable delivery, so you can keep
your facilities stocked, safe, and running smoothly.
Call 1-800-GRAINGER, click grainger.com, or just stop by.
Grainger.
For the ones who get it done.
Well, there's this other verdict.
This jury in New Mexico found that Meta didn't protect kids from online harm.
It is very different from the case where you were in the courtroom, but it is similar in
that the platforms are being held liable, even though in both cases, the platforms are
going to appeal.
It just seems like this kind of multi-dimensional legal attack where juries, regular people,
are doing a thing that a lot of lawmakers have not really done over the past 10, 15 years
as we've been talking about all this stuff.
Right.
I think, going back to that New Mexico case real quick, that one will have
a much tougher time holding, given that there is content involved there. And so you get
Section 230, which will kick in potentially on appeal and maybe erase some, if
not all, of that $375 million verdict.
But to your point on the lawmakers and regulators here, we've had so little regulation with
regards to these companies in the U.S. Around the world, we've seen other countries try to
enact laws, whether those are age restrictions or identity verification or that kind of thing.
We can argue the merits of those laws or the intent of them, but at least some of these
countries have seen that they need to do something, I guess, and in the U.S., despite the
changing Democrat-Republican leadership over the years, there has been no really meaningful
federal regulation when it comes to social media.
Of course, there's been plenty of stuff that has been proposed, so I think some people
have simply had it, and they've gone this legal route and this very novel legal route
now that we've seen, and this potentially is the path to hold these companies accountable
in their view.
A thing that I find interesting, I talked to the Nebraska Attorney General a couple of
weeks ago about his suit against Roblox, and it's a different suit.
It's a bit more like the New Mexico one, where this is about predators and child sexual
abuse material, but it is striking to me.
He is a Republican, he is a Federalist Society guy.
It is striking to me that we are now seeing bipartisan legal action against Big Tech in
a way that feels a little different to me than we have seen before.
Potentially, but I don't know.
Yeah.
Yeah.
I mean, am I overselling it?
We've also seen lawsuits against Big Tech in the past, and there was this techlash, and
then nothing happened.
I don't know.
I feel like I've seen so many of these waves of it come and go.
I mean, there is this, I guess, greater focus on quote unquote, child safety that has played
out.
And I think that is where a lot of the energy is.
Of course, in this case, this was a case of a 20-year-old woman, but it was filed when
she was still underage.
And I think that is kind of uniting in a way for Democrats and Republicans.
I think we've seen senators from both parties kind of push on that.
I think the outcomes and maybe the intent is a little different, but this idea of rallying
around child safety.
But I guess I would caution a little bit against asking if this is finally
the moment where people, you know, hold hands and come together on social media regulation.
I guess I'll believe it when I see it.
Okay.
So let's talk about Section 230.
We have just had hearings in Congress about Section 230 of the CDA and what could happen,
what should happen.
It is a very thorny issue and you have people kind of from all sides of the political spectrum
taking a whole variety of positions.
Right now CDA 230 protects platforms from liability for what third parties post on their platform.
Do you think there is real momentum to repeal or change 230?
Oh, man, I'm going to get, however I answer this, people are just going to yell at me.
So I do believe there are people out there that want Section 230 to be reformed or at least
ended.
I think Donald Trump has said that, you know, at some point, you know, who knows what
he believes.
But yeah, I don't know.
It's hard to gauge, I guess.
There's always been a very vocal contingent that has been against 230.
And I guess I'd be remiss if I didn't say, you know, Section 230 is largely why we have
the modern internet.
It's why the internet has flourished in the way that it has.
But at the same time, it's now become this shield for these multi-trillion dollar companies,
right?
And I think that's why people are concerned, those that have concerns.
This law that has existed for a few decades now is protecting the most powerful companies
in the world.
Yeah.
The original CDA is from 1996.
So, like, we've been doing this for a while.
Thirty years.
Yeah.
And, you know, back then, there wasn't Facebook.
There wasn't, I mean, the thought of these companies being worth trillions of dollars was
not on anyone's radar.
And now that they are, and they're protected from a lot of litigation around their content
because of this law, I think that concerns people.
And I think a lot of people are wondering if that was truly the spirit of that law to protect
the Mark Zuckerbergs of the world or the Googles and the YouTubes and TikToks.
Where does this put Meta and Alphabet today?
Because on the one hand, they are facing legal consequences, consequences in public opinion.
On the other, they're ubiquitous.
I mean, this conversation is going on YouTube.
Like, they are knit into so many aspects of our society.
The CEOs are closely allied with the Trump administration.
Does this mean anything from an accountability perspective?
I mean, that's the big question.
Yeah.
I think so.
I mean, in their world, they would have loved to have just won that verdict and not have
to deal with this.
But now there is this, I guess, strategy on how the little guy, how the user, can go after
a Meta, for example.
I think the irony of this is this happened on a day where I think Zuckerberg was added
to a White House Technology Advisory Council.
So I'm sure there is some discussion there as to what can be done.
It's no secret, the lobbying effort of these companies with the Trump administration.
But the thing here is that that can only do so much. This is the legal system.
And I think these cases will at least continue to wind their way through the courts.
And as you pointed out, this is just a different form of accountability.
This isn't regulation.
This isn't lawmakers, senators or congresspeople in two parties, deciding. It's regular
juries here.
And I stress that because I've spent time with these folks, or I guess been in the same
courtroom as these folks, and talked to two of them yesterday.
And these are the people that are going to be deciding the fates in some ways of these
massive companies.
Did they feel satisfied?
The juries or the jury members that we spoke to?
Yeah.
I think they were satisfied with their work.
Certainly, I think the length of the trial spoke to how seriously they took it.
I thought it was interesting because in asking them questions, they played down the idea
of how significant this case was.
They weren't taking into account the eight other cases that would follow; they weren't
told to consider those.
I think they were doing their job as jurors on a case.
I think it would have been wrong to be like, oh, we're this group of 12 and we're about
to set the precedent for years to come.
So that was our job as reporters, to kind of link that significance to this.
So I think it was interesting. There are two different perspectives.
These jurors were so tunnel-visioned on this one verdict, but in doing so, they have,
again, pushed the first domino over.
Ryan Mac, thank you so much for talking with me.
Thanks again.
Ryan Mac is a reporter for The New York Times.
All right, that is it for our show today.
What Next TBD is produced by Evan Campbell and Patrick Fort.
Our show is edited by Paige Osburn, who is the senior supervising producer for What Next
and What Next TBD.
Mia Lowe Bell is the executive producer here at Slate, and TBD is part of the larger
What Next family.
We will be back on Sunday.
I'm Lizzie O'Leary.
Thanks so much for listening.
Waiting days for key security answers you need straight away? Not an option.
Start using Solar, the AI platform for cybersecurity that turns weeks of work into minutes.
Simply connect your tech stack to Solar, cloud, SaaS, identity provider, anything, and ask
in plain language.
Get answers with your own context and with prioritized action items so you know exactly what to fix
first to get security done.
Start for free at solar.security.
Busy work weeks can leave you feeling drained.
Prolon's five-day fasting mimicking diet works at the cellular level to rejuvenate you
from the inside out, providing real results that include fat-focused, sustainable weight
loss with no injection needed.
Next Gen builds on the original Prolon with 100% organic soups and teas, richer taste,
and ready-to-eat meals.
Developed at USC's Longevity Institute and backed by top medical centers, Prolon supports
biological age reduction, metabolic health, skin appearance, fat loss, and energy.
Get 15% off plus a $40 bonus gift when you subscribe at prolonlife.com slash start.
That's prolonlife.com slash start.



