
Veteran trial lawyers Mark Lanier and Rahul Ravipudi — the legal team that just won a landmark bellwether verdict against Meta and YouTube — join the Chuck Toddcast to explain how civil litigation is doing more to rein in big tech than the federal government has managed in a decade. They walk through how they persuaded a jury that these platforms engaged in negligent and punitive conduct toward children, systematically dismantling the "it's on the parents" defense by showing that parents simply aren't equipped to manage what amounts to engineered addiction — and that when that addiction takes hold in children, it causes irreparable harm by literally rewiring developing brains. They reveal that Meta's own internal research documents were devastating at trial, that former tech employees took the stand to call out the companies' safety practices, and that these platforms behaved exactly like Big Tobacco did — knowing the harm was real and burying the evidence. They break down how they proved addiction by design: endless scroll, autoplay, slot-machine psychology, and deliberately hidden safety features all created to maximize "time spent," a corporate metric fundamentally at odds with user wellbeing.
The conversation gets into the nuts and bolts of the legal strategy and what comes next. Lanier and Ravipudi describe cross-examining Mark Zuckerberg, who they say couldn't handle basic questions about protecting kids, and explain why YouTube's defense — that it's a streaming service like Netflix rather than social media — collapsed once its own internal documents consistently referred to the platform as "social media." They explain that this is a bellwether case, meaning the judge used nine representative cases to establish facts and conditions that will now apply to roughly 3,000 other pending cases, with eight more trials coming and a settlement fund likely in the companies' future. The attorneys discuss whether tech companies are simply pricing these verdicts in as a cost of doing business (they argue settling would actually be a PR boon for the platforms), draw parallels and distinctions between big tech and tobacco, and offer concrete policy recommendations: a meaningful minimum age requirement, scrapping Section 230, nighttime curfews for minors, and removing the endless scroll. Their bottom line: tech companies won't do the right thing unless they're forced to, and the legal system is finally catching up to what regulators refused to address.
Thank you Wildgrain for sponsoring. Visit http://wildgrain.com/TODDCAST and use the code "TODDCAST" at checkout to receive $30 off your first box PLUS free Croissants for life!
Protect your family with life insurance from Ethos. Get up to $3 million in coverage in as little as 10 minutes at https://ethos.com/chuck. Application times may vary. Rates may vary.
Link in bio or go to https://getsoul.com & enter code TODDCAST for 30% off your first order.
Timeline:
(Timestamps may vary based on advertisements)
00:00 Mark Lanier & Rahul Ravipudi join the Chuck ToddCast
02:30 Civil litigation is doing more to rein in big tech than government
03:00 You can’t fight big tech without an army of lawyers
04:00 Meta & YouTube found liable by jury for negligence & punitive conduct
05:30 How did you push back on the narrative of “parental challenges”?
06:30 Parents aren’t equipped to control kids’ social media addiction/use
07:15 Addiction in children is an irreparable harm, brain is rewired
08:15 Meta’s own internal research documents were damning
09:30 Without guardrails, tech companies race to the bottom for engagement
10:30 Tech companies behaved just like big tobacco, knew harm was real
12:00 Former tech employees called out safety practices at trial
13:00 How did you prove addiction at trial?
14:15 Proved the companies deliberately made products more addictive
15:00 Endless scroll, autoplay and slot machine science used to trap you
16:30 Platforms make it hard to access or find safety features
17:30 Goal of “increasing time spent” is at odds with users’ well-being
19:45 Architect of YouTube algorithm was forced to take the stand
20:30 Architect proposed changing algorithm for kids, didn’t happen
21:30 TikTok & Snapchat settled, did that clear the way to win in court?
23:30 Plaintiffs had finished discovery before any settlements
24:30 YouTube’s lawyer argued it’s a streaming platform and not social media
26:15 Despite their protests, YouTube is not like Netflix because of features
28:00 Exhaustive internal documents refer to YouTube as “social media”
29:30 How was the experience of cross-examining Mark Zuckerberg?
31:00 Zuckerberg couldn’t handle some very basic questions about kids
33:00 What makes this case a “bellwether case”?
34:30 Judge used 9 cases to determine facts & conditions for other 3,000
36:30 8 more trials are upcoming
38:00 Companies will likely need to create a settlement fund
38:30 Similarities and differences between big tech & tobacco companies
40:30 Companies achieved a critical mass of kids using the product
42:00 Are companies pricing in penalties/settlements as “cost of doing business”?
43:15 Settling these cases would be a PR boon for these companies
44:45 Preview of the upcoming trials against the tech companies
47:00 What are some good guardrails congress can put on the tech companies?
48:45 An age limit would do good, as would scrapping Section 230
50:30 A nighttime curfew and removing the endless scroll also have benefits
51:45 There’s no law mandating age 25 to rent a car, the industry imposed it
52:30 Companies might self-regulate after losing lawsuits
53:45 These companies won’t do the right thing unless forced to do so
54:15 Expectations for the appeals process?
56:00 What year do you expect all of these cases to be fully resolved?
57:30 A recommendation algorithm should make a platform a publisher
See omnystudio.com/listener for privacy information.
This is an iHeart podcast, Guaranteed Human.
This episode of the Chuck podcast is brought to you by Ethos.
And what Ethos does is it helps you find life insurance.
Let me tell you why life insurance bailed me out.
My father died when I was 16.
We didn't have a lot of money when it happened.
And we were in a pretty tough financial spot after it happened.
I'm an only child. Suddenly my mother's single.
Single mother. She had a job, suddenly lost that job, and was looking for
another job. And we were struggling.
You know, my dad didn't leave us in the best financial situation.
But he did make one purchase. He bought a life insurance policy.
Wasn't a huge chunk of change.
But it was enough when we found it.
And when I say we found it, I mean we went through his desk and we found,
oh, look at this. Let's see if there's anything here.
And it helped us out at a time that we really needed it.
And that's why I think it's important that if you can,
especially if you have young children,
if you can get life insurance, you should get it.
Life is full of unexpected emergencies, unexpected things happen.
It is an extra bit of security.
Something you can leave and help out at a time when you're not there to help.
So what does Ethos do?
Well, Ethos makes getting life insurance fast and easy.
And it's 100% online.
You can get a quote in seconds.
You can apply in minutes and you can get same day coverage.
They will find you the right fit.
There's no medical exam.
You just answer a few simple health questions online.
You can get up to $3 million in coverage.
And some policies are as low as $30 a month.
And you'll get your lowest rate from their network of trusted carriers.
So take 10 minutes to get covered today with life insurance through Ethos.
Get your free quote at ethos.com slash chuck.
That is ETHOS.com slash chuck.
Application times may vary and rates may vary.
So in March, as I talked about at the time,
we got what could be landmark verdicts
against the big tech companies.
One was in New Mexico and another was in Los Angeles.
And the one in Los Angeles got a lot of attention
because this was the first one
where the lawsuit was about the product itself,
not the content on social media platforms,
but the actual social media platform itself.
So there was no hiding behind Section 230,
which is essentially all about having no liability
for what's published on their platforms.
But this was a lawsuit where Meta lost,
because a jury found them liable, essentially,
for creating an addictive platform.
And many people have described this really
as the big tobacco moment for social media.
This will be the first of many.
This was officially considered a bellwether case.
So what this means is that this case, along with about seven or eight other cases,
once they're all decided,
will be used to help decide thousands of other lawsuits
that are potentially brought against, whether it's Meta,
Snap, TikTok, you name it.
I'm sure I've covered some of this before,
but to get into some more details on this,
I've got the two lead attorneys from that groundbreaking decision
in Los Angeles, Mark Lanier and Rahul Ravipudi,
joining me now to talk about the case,
and to talk about the cases that are coming
in the next few months, maybe years.
And to find out how long this is going to take, right?
If you go back into the world of big tobacco,
these are lawsuits that started in the 80s.
I guess the first one maybe even started before that.
But we really didn't get the big settlement
and the big regulation that came to be
when government finally responded
until the 90s.
This stuff takes years, possibly decades.
And so with that, let me bring on my guests.
Rahul, Mark, good to see you.
Good to see you too, Chuck.
And I'll let Mark tee this off,
because Mark was lead trial counsel on this.
And he's on the next one.
Well, Mark, go ahead and look, you heard my little mini intro there.
I just was trying to set the table a little bit.
And I guess the question is, look, I've got my own
view on this, you know:
why do we have civil litigation?
Because government can't always tackle everything.
And sometimes government needs help identifying
things they should be regulating.
And this is where I think civil litigation comes in.
And we've all been demanding
some better accountability on big tech.
And sometimes you don't get it from government.
You get it from lawsuits.
Yeah, and I got to tell you, you do not fight the gargantuan
that is Meta and Google without a substantial army.
And so it's nice of Rahul to call me lead counsel.
That's a misnomer.
That's maybe court language.
But the bottom line is,
Rahul has been chasing this for forever and a day.
He let me do some of the more fun stuff,
while he kind of helped tell me how to do it.
We had other lawyers there, two of my daughters.
We had probably 30 or 40 people in this fight.
Because this, I think, is one of the most important cases
that America will see in the first 50 years
of the 2000s that we're in.
This is the tip of a spear.
And Rahul can say it better than me
because he's been in it much longer than I have.
But it was a brutal war we were in, Chuck.
Let me describe this one case.
Because I'll tell you why I've connected with it so much.
My daughter's 22.
And in this case of yours, Kaylee,
which is the name she's going to be referred to as,
she's 20. And I just want to outline some things here
because I'm going to be honest with you on the other side of
outlining it.
So Kaylee begins using YouTube at age six,
begins using Instagram at age nine.
She had 284 videos on YouTube that she had posted
by the end of her time in elementary school.
I assume that means fifth grade here.
She testified that she stopped engaging with family.
She began suffering anxiety and depression at approximately
age 10, was later clinically diagnosed with both,
and developed body dysmorphia,
which she attributed to constant exposure to beauty filters
and social comparisons on Instagram.
She reported suicidal tendencies as her struggles deepened.
Look,
this goes for all of us who were parents of young girls
during sort of the height of the Instagram,
the initial Instagram boom, and Snap.
And I'd watch it with my own daughter.
And look, we would intervene every once in a while,
get more involved, make sure she was distracted and all these things.
So I've looked at this because I lived this, right?
I thankfully didn't have a child
this addicted in either sense.
But I lived this. Because you know where I'm going to go here next,
Mark, which is this debate.
And I'm sure Meta in their pushback said,
hey, you know,
there should have been more parental involvement here.
So let me start there.
I just described what it was.
How did you push back on the idea that this was parental
challenges, not about Meta and Google?
I've decided there's not a parent on the planet
who has the wherewithal to stand up against these algorithms
and these powerful social media addiction platforms.
There's just not.
So you take her mother, for example.
Her mother would not let her download Instagram on her phone.
But Kaylee figured out a way to do it anyway,
because all of her classmates were on it and she wanted to fit in.
Her mother put time limits on her phone.
But Kaylee figured out a way around it.
Her mother locked her phone up in the kitchen at night.
But Kaylee would wait for her mother to go to sleep
and she would go get the phone and she would get back on it.
There's not a parent in the world that can handle this and do it right.
And the sad part is Meta not only knew that, but they took advantage of it.
Because Meta did this study that showed that children,
especially children who come from homes that are broken, where the money is not there,
where the resources aren't there, where there's one parent, not two,
a single parent trying to work a job and raise three kids, which was Kaylee's situation,
the company, Meta, knew they were more prone and likely
to get addicted and to get hurt by it.
The other thing about it, though, to add to what Mark said, is
when you're dealing with addiction, and addiction in children,
it is an irreparable harm.
And so there's a rewiring of the brain that happens.
And it happens so much faster and with so much potency in a child's
brain than an adult's brain.
And the internal research that Meta did, they did studies and they found, hey,
once a kid's addicted, if a parent gets involved and tries to intervene,
it actually leads to more social media use for that child.
So there's no safety measures that can be done once the child's addicted.
There's no intervention by the parents that can be done.
The damage is already done and it's too late.
And that's some of the stuff where
light was shined on the internal documents in these cases
to show what's really going on inside there.
Let's talk about those, because, I mean, these internal Meta documents, to me,
this was the gold as far as you're concerned, being able to say, hey,
don't just trust our experts on this.
This is not like a case where you had to rely on your own expert witnesses and your own
expert analysis. You just had to say, hey, here's what you did.
Is this from all those leaks over the last few years?
Or was this due to discovery that you were able to force them to show this stuff?
This was discovery.
The leaks helped us conduct more discovery, got it,
and kind of pinpoint some issues, but all of this was through countless,
countless documents.
And this was true discovery in the technology era.
You talk about finding needles in haystacks.
It's exponential with this.
But so this is like finding out the tobacco companies, you know,
had studies noting that nicotine was addictive. Exactly, except it's not just one study.
It's dozens of studies, it's hundreds of studies.
And so when you talk about the number of smoking guns that exist in this case
against meta, YouTube, TikTok and Snap, it's unbelievable.
And what it really showed was that these companies, without any guardrails,
it's a race to the bottom.
Whatever it is, they can do to maximize the attention economy and take the largest market share.
If a competitor is doing something that's harmful to kids, they'll do it and they'll do it with
more potency just so that they can maximize their market share.
Chuck, if I could second what Rahul said, he said the smoking guns.
He's making a really astute point there, because these companies actually did have smoking guns in
this sense. The company said, you know, we're saying publicly, there's no addiction.
We're saying publicly, there's no problem. We're saying publicly, there's no harm.
Just like tobacco did, when internally our studies show that there are, and then these documents
say, one day, people are going to see this and they're going to know, we are big tobacco.
We are the tobacco companies. So they are truly smoking guns that we found.
When you had those leaks, you know, how many times did you have to go back
at Meta to say, no, I think you have this, surface this? I mean, how did you have to pull it out of
them? Did they eventually just sort of realize that it's all going to leak, and it's worse if
it looks like they're hiding, or what was your sense? Well, everything's under probably the most
onerous protective order. And so these hundreds and thousands of documents that we have,
we've only been able to scratch the surface of showing them in trials because of the limited
time we have for trial presentation. So there's so many more that are going to see the light
in these next sets of bellwether trials, which hopefully will have a large impact on how
people act and how the social media companies act. The whistleblowers are so unbelievably important
because what ends up happening is documents say one thing and they're crystal clear on what they
say. Witnesses come in, they try their best to whitewash it. But when you have a former company
employee, a high-level person, somebody like Arturo Béjar, who was, you know, probably the
most knowledgeable safety expert within Meta, or within the tech industry as a whole,
coming out and calling out all of the internal practices and really the mindset at Meta
of profits at the expense of child safety, that's something that has real impact.
You know, it's interesting. I think the hardest thing to prove is addiction, especially if it's not
an addiction that we're familiar with. Okay, a drug addiction or an alcohol addiction,
we kind of get that, and we're sort of accepting of the premise that gambling can
create an addiction. And we're going to have that debate, it's starting, and I think, you know,
the question's going to be the same thing, right? There are predatory tactics that
these mobile sports gambling apps are using, et cetera.
What is the definition of addiction you guys used in this trial?
Well, we put on the stand the doctor that I consider the greatest
expert in the land on addiction. She heads up the clinic, the section on addiction, at
Stanford Medical School. Her name's Dr. Anna Lembke. She wrote Dopamine Nation, a bestseller
on addiction. And she uses what she calls the four Cs. But what it really boils down to is this,
is someone compulsively using something where they cannot stop using it even when it's to the
detriment of their health or their relationships. You've got to have those elements in some way
that you define them. Someone cannot stop using something even when it's destroying them,
their health or their relationships. And that's an addiction.
What could you point me to on this? Were you able to prove
that Instagram at times actually made things more addictive, not less addictive?
Without a doubt. Without a doubt. And Rahul may have better insight on this than me,
but the things that I thought really stuck out to the jurors were the fact that they developed these
features like endless scroll. And they have features here in the US that they don't have overseas.
So they'll ignore the age of the user. They will sometimes not even ask for the age.
YouTube, you can get on YouTube Main and never be asked what your age is.
You can do those Shorts until you've gone through them all, which we showed would take something
on the order of 368 years at 24 hours a day. And still you wouldn't get through it all because
they're adding millions a day. So they put an endless scroll. They put auto play,
where it plays automatically. They put a finger swipe because it's like a slot machine,
where you do something physical and you get a change. They use slot machine science.
They sculpted the algorithm to trap you, to be sticky. They set any of the features that might
stop addiction into a mode where no one can reasonably access them. So let me give you an example of this.
Okay, so you've got a chance to turn off the endless scroll feature, or a feature where after 30 minutes
it'll say, you've been doing this for 30 minutes, are you sure you want to keep going? Nobody even knows
that feature exists. You've got to go navigate all of this stuff to even find it
because they don't want you to put it into place. Remember, they only make money the longer time
you spend on it. The more time you spend, the more money they make. So they want you.
There's an early YouTube document that said, quote, our goal is not viewership.
It's addiction. Close quote.
Think about this, Chuck. What Mark said is so on point, because why do you have to search
for a safety feature? And why do you have to search hard for it? And this was, I think,
a pivotal moment in the trial. Mark asked a witness, can we just get real for a second?
When is an addict going to look for a safety feature so that they can use the product less?
So they get you, they get these kids addicted. And then they put this, you know,
safety feature out there that's unfindable, so they can say, look, we're a safe company. And it really
kind of shines a light on who these folks are. But going back to some of the things and the proof,
let me just read you a document from Meta's Well-being Department, which doesn't exist anymore.
Product features that are designed to exploit insecurity or provide a dopamine rush,
likes, notifications, the pull-down to see the infinite scroll, etc., to increase time spent,
are inherently at odds with well-being, and take away from people's ability to consciously focus on
activities that add value to their lives. That's their internal document talking about all of these
design features. And they describe them in two different ways. They call them addictive-like
designs and dark patterns. Dark patterns are those things that just manipulate you into doing
something you don't even want to do. Like if a kid steps off of Meta and Instagram for a bit,
it's nighttime. And then all of a sudden their phone lights up with a notification
that's meaningless. Hey, somebody wants to be your friend. Why do you need that notification at one
in the morning? But once that notification comes, a user then goes back onto the app.
And then they start using it, not for like 10 seconds but for another 30 minutes to an hour,
and the sleep deprivation comes. So an internal document that was used in the trial showed
a Venn diagram. And the dark patterns and the addictive-like designs include autoplay and
endless scroll and all of these notifications. These are their documents talking about it.
This episode of the Chuck ToddCast is brought to you by Soul. So if you love that end-of-the-day
unwind, but hate the hangover, Soul's Out of Office is for you. These sparkling THC drinks and
gummies give you the same relaxed social feeling without the alcohol, without the calories and without
the crash. Soul is a wellness brand that believes feeling good should be fun and easy. Soul specializes
in delicious hemp-derived THC and CBD products designed to boost your mood and help you unwind.
Their best-selling Out of Office gummies were designed to provide a mild, relaxing buzz, boost
your mood and enhance creativity and relaxation. With five different strengths, you can tailor the
dose to fit your vibe. Soul also has a variety of products specifically designed to help you get a
better night's rest, including their top-selling Sleepy gummies, a fan favorite for deep,
restorative sleep. Look, I have made no secret, I am a believer in this stuff, cannabis and hemp-derived.
I do not partake in alcohol for the most part. I think this stuff is a lot safer than alcohol,
a lot less addictive than alcohol. And yes, I have to say I love these drinks. It is the glass of wine
after work. That's what this is. No hangover and no excess calories.
I'm a big fan. So bring on the good vibes and treat yourself to Soul today. Right now,
Soul is offering my audience 30% off your entire order. Go to getsoul.com and use the code
TODDCAST. That's getsoul.com, promo code TODDCAST, for 30% off. And yes, I too am a customer.
Well, I was at a neighborhood party, it's a crawfish boil my neighbor
throws all the time. And I was talking to his father and mom, whom I've become friendly with
over the years. And he said, you know, I've got to stop watching YouTube at night. He goes, I end up,
and this is, he's in his late 70s, early 80s, and he goes, all of a sudden, I'll, like, about
9:30, start using YouTube. I look up and it's one in the morning. And he goes, and suddenly I'm
not getting enough sleep. And he goes, and I don't feel right. And this is somebody, right,
who's self-aware. And yet he said it, he can't bring himself to turn it off. And like I said,
a fully formed human being here. So Chuck, that is such a great example.
Because we had on the stand a fellow named Cristos Goodrow, who is one of the principal architects
behind that algorithm for YouTube that sucks people in like this. And if it's an adult,
it's an adult. But when children who don't have their brain fully developed to understand
consequences, get onto this, it's even worse. Because those children who get robbed of sleep
now have deficient school performance. They don't learn. As Raul has said, it totally changes
their life trajectory. It rewires their brain. Now here's the sad part. Cristos Goodrow, who
authored that algorithm, in a sense, was asked, can't we change some of this for kids? We could get
an extra 9,000 hours of sleep for kids just in California alone. If we would just make this little
alteration. And his answer was that we're not going to do that. We're not going to do that. I mean,
he's faced with vulnerable children can get 9,000 hours more sleep if you'll make this small tweak.
And he refuses to do it. And then, just on the mindset of Cristos Goodrow in particular,
he was quoted as saying, it would be great if my kids were on YouTube five to seven hours
a day. And so when you have the lead architect of YouTube with that mindset,
you know what's going to happen. Bad things are going to happen.
Two things happened in your case, guys, in this first one. TikTok and Snap settled.
Now, the jury knows this. I mean, in some ways, did that settlement get you the verdict
against Meta and YouTube, Meta and Google? I mean, if you're Meta and Google, you're probably like,
well, thanks for settling, because now you've made it easier. I mean, am I just, am I being too
simple minded about this? I don't think you're being too simple minded. And I'd love to get Rahul's
insight into this. But I will say it's a multifaceted diamond, because I think it also allows the
jury to say, well, we don't need to give all of the damages because they've already gotten some
money. It also allows the jury to say, wait a minute, maybe this is Snapchat and TikTok's fault,
because we do see she spent a lot of time on those. How on earth are you going to say it's the
fault of Meta? Or, you think it could have cut both ways? Yeah, it scares us. We'd rather
them not know that, but it was such big news that they knew it. I guess I just think juries aren't going
to distinguish between those four entities the way you both think. But maybe not. Chuck,
I don't disagree with you, in the sense that if the jury got to see the evidence as
to all four defendants, it's indistinguishable. But one of the things in the trial, and we don't know
how it would have played out differently. But Meta and YouTube, they were not
blaming Snap or TikTok. And I think if they did, that would be an admission of guilt on their
part, right? So they took the calculated decision of saying, we're just going to try and get people
to believe that these platforms are safe as designed, which is a very hard lift on their part.
How much discovery did you get out of TikTok and Snap before they settled? Was it part of their
motivation to settle not to have to produce some documents? The litigation has been going for over three
and a half years. Yeah. So at the time of the first bellwether trial, all of the discovery
had been completed against all four defendants. So you had all of it. And they just settled
in this case, or are they settling all cases? Just in this case. Okay.
Could you use any of the TikTok or Snap discovery in this case? Or, because they settled,
did you have to keep it just about Meta and Google?
That's correct. It's just about Meta and Google.
I want to get to something that Google's attorney said after the fact, which is interesting.
A Google spokesperson stated post-verdict: this case misunderstands YouTube, which is
a responsibly built streaming platform, not a social media site. So, well, hey, look,
this is the old divide and conquer strategy, potentially. But Google's basically, and this is
how I think of Google: a friend of mine was active and involved with the Sierra Club for decades,
and he used to describe BP as sort of the oil company that apologizes for being an oil company.
Like, they always want to be the ones saying, we care about climate change and all this
stuff. Google strikes me as that. They don't like to be the bad guy. They're always looking.
And if you told me Google was going to try to find a way to say, hey, we don't want to be lumped
in with Instagram and TikTok and Snap, maybe they end up pulling away and changing some
of their behavior sooner than the others. Are you getting a hint of that with these next trials that
are coming up or not? I don't think so. And I'll say this. That reminds me of an Italian restaurant
saying, hey, we serve pizza, you misunderstood us, we're a pizzeria. Well, but for the fact,
you also have salads. You also have pasta. You also have tiramisu. You also have, you know,
a complete Italian menu. So, no, we're not going to just focus on the pizza you serve. Do they
have a video streaming service? Yes. But they also have shorts. They also have all sorts of
recommendations. They have likes. They have comments. They have dislikes. They send out notifications.
They have endless scroll. They have autoplay. They're not Netflix. Interesting. And
Netflix is not a social media site in your view? They don't qualify for this? Right. And the thing
that drives me crazy about YouTube in particular is what that representative said, their forward-facing
misrepresentation. And it's what I believe got them into the schools. They injected
themselves into the schools, and now kids at really young ages are getting exposed to screens and YouTube,
which then leads to exposure to the other social media apps. And what they started as,
and what they've become with the algorithm and infinite scroll, is they've become everybody else.
So they wedged themselves in early on with kids under a false premise. And now they're candy
cigarettes, huh? Except they are. And then the candy cigarettes got the nicotine in there too.
Right. Right. Well, look, I hear where you guys are coming from and I don't disagree
with how you're describing it. But going forward, what does it tell you that they're going out
and they're like, hey, hey, hey, don't call us a social media site? It's almost like, hey, man,
we don't want to be labeled that, like that's terrible. Yeah, it is
sort of saying, we're not guilty, but maybe they are, and you can go after Meta, and you can go,
please, go for it, be my guest. Yeah. Yeah. The side joke that was running with the lawyer for
YouTube in this case is he was trying so hard to say that they were not social media. He was
calling himself who to instead of you to and that did not work. And the reason it did not work
is because there are exhausted documents from Google and YouTube files that call them social media
that refer to them as social media that speak of their social media platform their app.
They are social media. You know, I can tell you that I'm not a human being, that I'm actually a duck.
But you're looking at me, and I've got a birth certificate, and, you know, I'm a mammal.
I mean, they can say all they want to say, but the truth is the truth. Yeah. And I'm just reading from
another YouTube document: "This is the vision. We aspire to create an app that is addictive."
And so, you know, you look at these documents and you look at what their true aspiration is,
and they can't hide from it, and none of these social media companies can.
And then it just begs the question, what's in a name anyways? It's obviously
a bad moniker to have to be called a social media company, but you can't hide from it by calling
yourself something else. Mark, how was your experience cross-examining
Mark Zuckerberg? Were you nervous? Were you chomping at the bit? Like,
what was your mindset going into it? Yeah, I found him rather wooden. You know, he'd had
hundreds of hours of media training before he testified, and probably hundreds of hours with
his lawyers getting him ready, which really sets up the cross-examination question:
how many hours of media training do you need to tell the truth? I mean,
do you get 50, 75 hours of media training before you tell the truth to your wife? Such a fun question.
Such a great "when did you stop beating your wife" question. Yeah, I mean,
how many hours of media training does it take to tell the truth? There's no answer to that
question. Thanks a lot. Yeah, yeah. So, you know, you don't need media training to
tell your kid, don't forget your backpack on the way to school. All we need is the truth.
And I found that he was prepared for certain questions.
What were they? What did you feel like he was most prepared for, and what were you surprised he
wasn't prepared for? Or do you not want to say? No, no, no, I'm fine saying it. He was clearly prepared
for how wealthy he is, because he volunteered an answer that said, you know, we've set up
a corporation to give money to so that it can cure cancer. So we're just really here to cure cancer.
But what he couldn't do, you know, for a guy who's supposed to be one of the
smartest in the world, and certainly in his mind maybe the smartest in the courtroom, is handle
the unusual questions. Like this: Mr. Zuckerberg, would you agree with me there are vulnerable
children in the world, who don't come from the best homes, whose parents aren't on top of their
game, who don't have all the money, who may have a learning disability, who may be social misfits?
Those vulnerable children are in the world, right? Right. They're in America, right? Right. They're
in California, right? Right. They're in our neighborhoods, right? Right. And then I said, it seems
to me there are three ways we can approach those vulnerable children. One, ignore them.
Out of sight, out of mind, not our problem. Fair? Fair. Two, we can help them. We can volunteer at
the temples, the churches, the synagogues, the school groups, Big Brothers Big Sisters,
library reading programs. We can go out of our way to help the vulnerable children. Fair? Fair.
I said, but there's a third option. We can look at those vulnerable children and see a dollar sign
on their back. We can trample on them, we can enrich ourselves upon them, we can prey upon them
to feed our bank accounts. Now, why did Meta choose option three? And his reaction
was, oh, I don't think we did. And I said, well, that's why we've got these documents. Now,
let's look at them and see if you didn't do that. And so we started with the documents.
So, bellwether case. Not everybody understands what this means, so, you know, let me just
paint the picture. There are a ton of people who filed lawsuits against big tech companies on some form
of this, right? Is it some 2,300 cases? A little over 3,000. We're over 3,000 cases now.
So what has happened that put this into a situation where there's a, quote, bellwether? So,
there are 3,000 cases. They're not going to hear all 3,000 cases, correct? Well, I would like to believe
that, but the way the court system is set up here in California, thousands of cases
are filed and they all end up in one courtroom, in front of one judge. And that judge is
Carolyn Kuhl, here in the Los Angeles Superior Court. And she needs to manage these thousands of
cases and try to get them in a position of efficiently getting tried, resolved, whatever, right?
And the way that she wanted to do that is through a bellwether trial process. So, while all the
discovery, the depositions and documents, is being produced, on the other side of it there are hundreds,
if not thousands, of plaintiffs with data. These are called plaintiff fact sheets, and they're provided
so that the judge and the parties can look and ask, what are the similarities between the claims,
the usage, the injuries, et cetera? And then decide what would be representative cases, as to
what's a very powerful case, a little less powerful case, and everything in between. And so
Judge Kuhl went through that exercise, the parties then talked about the cases, and then
nine cases were selected that she thought in total would have characteristics representing the other
2,991 cases. And how did she decide on the nine? In consultation with defense and plaintiffs here?
Correct. Correct. And it's not necessarily by agreement. So it's like selection of a jury:
you get a no, no, yes, or, you know... Exactly. Jury "selection" is a misnomer; it's really
deselection. You're left with what's left. Yeah, you don't get to pick who you want,
but you can eliminate who you really don't want. Exactly. Exactly. And so it's really the judge's
call on these cases, and that's how she picked them. And these are all in Los Angeles because the
companies are based in California, correct? Correct. So, I believe Kaylee... she was
a Californian, but she was from Chico, up north. Yeah. Okay. So she is a Californian, but
there are cases where it won't just be California, right? Like the next trial, for Russell:
he's a Florida resident, but it's going to be here in LA Superior as well. Yep. This case from start
to finish took how long? Oh, January 27th to March 27th, or somewhere right around there.
So about two months. Well, and that was because you got a trial date. When did this all get
formalized, with discovery and all that? Oh, three and a half years. So three, almost four years.
But you don't expect to need three years for each subsequent case? No, all of the cases got
worked up together for these bellwether cases. So eight cases, and they could go back to back
to back to back, if need be.
So we're going to show up in court tomorrow, and Judge Kuhl is going to set the trial date for
Russell's case. By the time this airs it will probably be later in the month, but we're taping
on 4/20, on April 20th, just to give people a date stamp on this. So, eight more trials.
Does the judge need to get the verdicts in all nine before she starts characterizing the rest
of the 3,000? I don't know. I think that's a moving target as to how she sees things and whether
there's enough information to start getting the cases out. Her mindset is, what she wants to do
is get enough rulings on all of the unique issues, so that there are essentially no unique issues
left that require the knowledge base of the entire Section 230 workup and everything else.
And then cases can just get sent to other courtrooms in LA Superior, with multiple trials going
at the same time. Do you expect jury trials for all 3,000? I don't. Yeah. I think once there's
a pattern established of who's going to win and who's going to lose, and which ones you win
and which ones you lose, realistically the companies don't need this media publicity.
They need to figure out how to get back to making their money, not spending it.
This episode of the Chuck Todd cast is brought to you by Wildgrain. Wildgrain is the first bake
from frozen subscription box for sourdough breads, artisanal pastries and fresh pastas.
Plus, all the items conveniently bake in 25 minutes or less. Unlike many store-bought options,
Wildgrain uses simple ingredients you can pronounce and a slow fermentation process
that can be a lot easier on your belly, little gut health there, right? And richer in nutrients
and antioxidants. There are also no preservatives and no shortcuts. The Wildgrain boxes are
fully customizable. In addition to their variety box, they have a gluten-free box, a vegan box,
and a new protein box. I will tell you I have done the gluten free box. I have done it a second time.
I have also used the code, the Toddcast code. If you use the promo code Toddcast,
you get $30 off. I've already used it as a gift to somebody else who loves this bread. It is
hard to find good gluten free bread. It is fantastic. They give you step-by-step instructions.
I really dig this. There is nothing like having an artisan bakery in your freezer to chase away
the winter chill. Now is the best time to stay in and enjoy some comforting homemade meals with
Wildgrain. I obviously highly recommend it. It is worth giving Wildgrain a try. Right now,
Wildgrain is offering my listeners $30 off your first box plus free croissants for life. Come on.
When you go to Wildgrain.com slash Toddcast to start your subscription today, that's $30 off your
first box and free croissants for life when you visit Wildgrain.com slash Toddcast or simply use
the promo code Toddcast at checkout. This is a sponsor. I absolutely embrace, so use that code.
I guess the question is, and we're all just spitballing here, but if you win the first four,
one would assume they start to think about, do we need to create a settlement fund and just
settle them all? I certainly agree with that. What's the model? Where is this
like tobacco, and where is this not like tobacco? Well, it's like tobacco in the sense that they
created an addictive product by design. The intentionality. It was intentionally addictive, and
they knew it. It's not something they made and then found out was addictive after the fact, and
that would be a slight difference here. You guys are contending they wanted to build it
to be addictive, and they accomplished their task. I wouldn't just say this is addiction by design.
Some of it is irresponsible deployment of different features as well. There may be some
circumstances where they just blindly launched certain features and then said, hey,
we don't know if it's safe or not, but let's just try it out, and our test sample will be the
entire child population. That's negligence. Isn't it? Yes, exactly. But then, getting to the
next step, for all of the features that are part of this case, they knew for a very long time. They
either intentionally launched it knowing it's addictive, or launched it negligently and then found
out through their internal research that it's addictive and did nothing about it. Either way gets
to the same point, which is intentionality in the way that they did it. Then you've got concealment,
and this, to me, is the destruction of an entire generation of kids. I've got three boys,
20, 17, and 14, and I've looked at these things, looked at what they were doing and the harm that
they could inflict. It's created irreparable harm. What ends up happening is, through the concealment,
they're allowed to get a critical mass of kids using social media. I think almost 90% of
teens are using it now. What do you do next? You can't just turn a battleship around. Turning it
is so hard to do and takes so many layers. Now they've got market share
and are profiting off of it, creating this situation where it's very hard to undo the damage
and to change the narrative so that kids stop using it, so that they start using safe practices
and are forced to use safe practices. Chuck, you asked about the
four trials and whether they finally see the writing on the wall. I don't know, because they're
pretty happy lining their pockets with every passing day, and the cost of this litigation
is less than the profits that they make. You're describing the Ford Pinto, the infamous Ford Pinto.
It really is. The Ford Pinto, for some of my younger folks, was this economy car that had
a design flaw: if it got rear-ended the wrong way, it basically blew up the gas tank in an
explosion. Ford knew it. They made a calculation of what it would cost to fix it after the fact
with a recall versus settling the claims that were filed, and it was cheaper to settle the
filed claims.
Are you sure we're not in a situation like that already with these social media companies?
I don't think we are, and I'll give you my reasons. And I think we are. So the two of you
disagree on this? We may. If so, I think he's probably right and I'm wrong; at least historically,
I'll put my chips on him. But so much of these companies depends upon PR.
And you take Google, for example, Google wants YouTube in every classroom.
And Google is going to have trouble if they repeatedly go to trial and get hammered with
juries saying they've done this maliciously. They've done this intentionally. They're destroying
children. And then Google goes and says to the school district, hey, let us be on all your school
computers to enhance learning. That's the left hand and the right hand working against each other.
And with the kind of money these companies make, settling this would be a boon to them. It would
send their stock through the roof. It would give them PR pluses. And they care deeply about the PR.
I think they care more about the PR than about the problem itself. What does that tell you?
You think a settlement's coming? I think it will at some point. I like your try-four-cases idea.
Maybe it's eight. You know, maybe it's 15. But somewhere down the road, the settlement will come.
And, you guys, this happened on back-to-back days, I mean: we had this trial,
and the New Mexico case was brought by the state government. And you're going to have states, and
that was a separate track with tobacco, right? You had private litigation. I remember,
the first one on tobacco that really got everybody's attention,
I think, was the flight attendant lawsuit, right? It was sort of, hey, they
had no way of avoiding secondhand smoke. They couldn't do it, and look what happened.
And then I think blackjack dealers were next. Those were the two; you couldn't
dispute that they were trapped with secondhand smoke. So is there a
corollary here, private versus public? And then the state AGs came in and sued over the harm
to their states. What is the distinction between private litigation
for a class that's harmed, in this case teens, versus what the state AGs are doing?
Well, you ought to answer this because you've got your hands in both pies pretty deep.
Yeah. So there are kind of multiple buckets going here. There's going to be a school district
trial that's happening very soon. And then you've got the New Mexico AG, who brought that case,
and it was really for statutory damages for violation of a state statute.
Right. And it had a lot to do with grooming, exploitative material, and exposure to children. And
that led to $5,000 per event, and it ultimately got to $375 million. And so, speaking not as a
lawyer but as a human being, I think it's fantastic that all of these verdicts are coming with
different lenses. So you're looking at the conduct related to harming children through
child exploitative material, sexual content, grooming, pedophiles. Awful, awful stuff.
And then you've got verdicts that are coming through the impact of children directly
through addiction, awful, awful stuff. And then you're going to have public nuisance by way of
the school districts and how it's negatively impacting education because it's just permeated
the school district. That's terrible as well. And I think all of those things, to Mark's point,
are going to create significant negative PR for these companies, and at the same time create
awareness, where things are just going to get shut down. And there are school districts now,
and there are parents in every community, Chuck, I think you've seen it too, right,
in your own local community: we're starting to see homemade signs that say,
tell our public schools to get iPads out of the classroom. Yeah. Yeah. And that's
a grassroots movement that's now expanding across the country. And these verdicts
have led to that. After this litigation started, three and a half plus years ago,
the US Surgeon General came out with his report on how harmful social media is. So it's coming
from all angles, which I think is a great thing. Mark and Rahul, play regulator for me here.
Okay. Let's assume I have a few members of Congress listening to this interview; I'd like to
think I do. And they want to do something; we've all been dying for something.
What is some good structure that could be put around these tech companies? You know,
I can get extreme here. I think that all of this stuff should be banned until 25.
I believe in that for sports gambling. I believe in that for marijuana usage. I believe in that
for alcohol usage. And I believe all of the science. It says the brain,
especially among young men versus young women, even, isn't there yet; 25 is when you know
you've got a sort of fully formed brain. And I will note, rental car companies and
insurance companies have figured this out; there's a reason they don't let anybody under 25 rent a car
without paying through the nose. But let me set that aside.
Actual regulatory ideas: what are the specifics that lawmakers should target
to limit social media addiction?
Well, I really like the idea of limiting it. The problem is, I think in America that may be dead
on arrival, because of the power of these companies to influence and lobby. But look, we have a
drinking age of 21. We have tobacco rules. You know, I'm not saying it's perfect, but we have put age
requirements in place. We have done some things. We usually wait until something has done all sorts
of destructive damage before we act, but we do eventually do it. And I guess the question is what
could be done without getting it thrown out in court due to the First Amendment?
Well, I like your idea of putting age limits on it. Australia has done it, at 13,
and that made a big deal out of everything. But I'm thinking, 13? Mercy. Go for 21, go for 25.
You add to it. You know, if I were a legislator, I would get rid of Section 230.
Right now these companies can put porn on there for anybody at any age, and you can't hold them
accountable. I would get rid of 230 protection for these companies in a heartbeat. The sad part
is these companies have written their own legislation, the protect-the-kids acts, the kids
online safety stuff. Yeah, and all it does is give them free get-out-of-jail cards. And so
you don't want to hand over policing to the industry. You've got to do something, Rahul. I don't know.
So, just some real simple things. I agree 100% with Mark: they need to be accountable and responsible,
and repealing Section 230 is part of forcing responsible behavior on the part of these social media
companies. But look at, as crazy as this sounds, TikTok in China. It's called Douyin.
And here's what they have. They have age modeling, so there's age verification to make sure
the kids are treated like kids. And as part of that, if you use it for just several minutes,
you're forced to rest. You have to get off of it; it won't work anymore. They have a 40-minute
daily limit: after 40 minutes, you're done. A nighttime curfew: you can't use it at night, because
you need to get your rest at night. And even on the endless scroll,
you know, the slot machine effect of swipe, swipe, swipe, it has a lag in between
each swipe so that your mind doesn't get into that slot machine,
intermittent reward system. There are already things that exist in other countries
that people know to be safe practices and that could be implemented here. But I really think, with
technology in particular, there has to be a responsibility shifted onto the companies
to continue to be vigilant. And their failure to do so, to follow best practices, has to
hit them where it hurts the hardest, which is in their share price
and in their bankbook. No, I mean, this is sort of why I keep using the
rental car companies as an interesting example, because there's no law that says you have to be
25 to rent a car. You don't, right? Actually, you can be younger. It's the rental car companies
that won't insure you unless you're 25 and won't allow it to carry over. And I just wonder,
you know, this is why I always say that in a true free-market system, you have to have
private civil litigation or you'll never be able to have
full accountability, right? Because you have regulatory capture with the government,
which Mark was just describing: how they sort of get to write the legislation,
the Protect Kids Act, things like that. Is there an equivalent here? I mean, do you guys
think that if you extract enough financial pain, they're just going to change
because they're tired of losing these lawsuits? I do. I think that's what we'll see.
You used the Ford Pinto as an example. Very clearly, the financial analysis of how much we lose
in a lawsuit was done by Ford. It's done by everybody: what's our exposure? Insurance companies
ask the same question. Car rental companies ask the same question. If car rental companies could
make more by renting a car to a 20-year-old or an 18-year-old or a 16-year-old, they'd do it in a
heartbeat. But they know financially they're going to lose money. Right. And that's what you've
got to do with these companies. That's their language. And the other part of the problem,
candidly, is that I think it's going to require the Congress and the Senate to do something, because
what you're going to see is that if somebody makes a move to engage in a safer practice for
children, nobody else is. So let's just say TikTok does something because it's the right thing to do,
in this hypothetical world I'm offering. Okay. You think Instagram, Meta, YouTube, and Snap are
going to follow along? Or are they going to just reap the market share at the expense of TikTok?
And so people are not better off for it. And so none of these companies are going to consistently
do the right thing unless there's somebody forcing them to do it.
And you're right, it's the exact opposite. The appeals: they're going to appeal. Tell me
about the appeal process. Is this federal court, so it's going to go to a circuit, and then you
expect it to have to go all the way up? This is actually a state court trial verdict. So this
will go through state court, even though it's a federal statute, Section 230, that they're
asserting as a defense in the case. Okay, so it's a state court case. So you expect this to go
through the state Supreme Court? That's their intention. And then if it ends in the state Supreme
Court being affirmed, then I expect they're going to try to get it to the United States Supreme Court.
What would be their appeal? Is it just that the verdict was excessive, or what? Or have they not
filed? Why do they say the verdict was wrongly given? Yeah. So the process starts with a motion
for a new trial, which has not yet
been filed. And then after that's ruled upon, then the appellate process starts through a notice
of appeal. And then once the notice is filed, then I think it's either 60 or 90 days, then the
appellate briefing gets filed. And then only after it's fully briefed, which could take upwards
of a year, then it gets calendared for argument in the court of appeal. So you're
probably about a year and a half in before it's actually heard. And then after the court of
appeal issues its ruling, which it has a period of time to do, they can seek review from
the California Supreme Court. If the Supreme Court hears it, that takes
a period of time. And then after that, there may be an opportunity
for them to seek relief from the United States Supreme Court. So I'm doing the math
in my head: 2030, 2032? Mark, there's some resolution in 10 years? Okay.
And look, tobacco took that long; that's why I was making the comparison. I think the first
tobacco cases were in the 70s, I don't remember, or maybe the early 80s. Like I said, I remember
the early flight attendant one, when they just got rid of smoking sections on airplanes. Any of us
old enough to remember those: they were just the most ridiculous thing, as if you weren't going
to have any smoke in rows one through 18 when smoking was rows 19 and back. Yeah, look at that.
All right. Well, by then we'll be getting Instagram beamed into our brains, right? Yeah, I know.
I mean, technology is something that, as you know, with AI coming on the heels of this,
only highlights how much we need to be ahead of it, as opposed to waiting on the regulators,
who never are. And by the way, on Section 230, the minute they
create an algorithm, they've become a publisher. I don't think Section 230 applies to any of
this stuff anymore. They've created way too many things. If they have created an accelerant for me to
see somebody's content, they're a publisher. I don't even think you need the Supreme Court for that.
I mean, look, when you decide whether something goes above the fold or below the
fold, you're a publisher. Well, an algorithm is deciding what's above the fold or below the fold.
So as much as 230 is a good bogeyman, I think there are actually still arguments to be made
that Section 230 is null and void the minute you create an algorithm.
I like the way you think. Go for it. Yeah. Well, if it works, then can I get a free law degree
somewhere? If it works, if it works, we will come on your podcast and tell everyone that this
came from the Chuck Toddcast. There it is. There it is. That we created new legal doctrine here.
Mark and Rahul, this was great. I think, you know, here's the thing: we like these
products. We just don't want them to be predatory. There are good things about them. Just stop trying
to make them take up 24 hours of our lives. Exactly. Exactly. I mean, I like a good steak. I don't want
it every hour of the day, or I'm going to have a heart attack. Yeah. And that was our position
with the jury. We were very honest. We said, hey, these can be useful tools. We can enjoy
these tools, but it doesn't mean that they're allowed to addict our vulnerable children on purpose.
Well, the good news is this will appear on YouTube, so let's see how well the algorithm does
in making sure we all get to see this. Gentlemen, I learned a lot. Thank you so much. Good luck
on these cases.
This is an iHeart Podcast. Guaranteed human.
The Chuck ToddCast



