
A jury in a Los Angeles court has found Meta and Google liable for $6m in damages in a civil suit alleging their social media products are designed to be addictive to children. It could pave the way for thousands of similar suits. Today on the show, technology reporter Hannah Murphy joins Robert Armstrong and Katie Martin to discuss what the verdict means. Also, they go long cold brews and long a Netflix documentary about America's Next Top Model.
Check out the FT News Briefing podcast on FT.com or on Acast, Apple Podcasts, Pocket Casts or Spotify.
For a free 30-day trial to the Unhedged newsletter go to: https://www.ft.com/unhedgedoffer.
You can email Robert Armstrong and Katie Martin at [email protected].
Read a transcript of this episode on FT.com
Hosted on Acast. See acast.com/privacy for more information.
It's been a pretty wild few weeks and it's hard to keep on top of everything you need
to know in markets and finance, but don't worry listeners, we have got your back.
One big story that's not about Iran came last week and it's all about social media.
A court found that both Meta, which owns Facebook and Instagram, and Google, which owns YouTube,
are liable for content that's harmful to children and teenagers, and they must pay a few million
dollars in damages. Meta shares are down by about 7% since that ruling, adding to earlier declines,
while Google is down about 5%. Now how much of that hit to shares is down to the ruling and how
much is down to the general hit to markets because of Iran, we don't know. But these are
huge companies with a massive weight in every investor's portfolio, so this stuff matters.
Today on the show, do these rulings open the floodgates for social media companies?
Is this social media's tobacco moment? This is Unhedged, the markets and finance podcast
from the Financial Times and Pushkin. I'm Katie Martin, a markets columnist here at the FT in London
where we have some much-needed holiday days coming up. And I'm joined, yes, by the big fella,
Rob Armstrong, my partner in crime on the Unhedged newsletter in New York. But also, big love
to Hannah Murphy, one of our super team in faraway San Francisco. Listeners, it's so early
in San Francisco when we recorded this. Hannah, are you awake?
I'm slowly rising and I've got a cold brew here to get me through a very San Francisco morning.
Surely you've already been to a yoga class, done a cold plunge, had some centering meditation.
So while you're coming up on your cold brew, tell us what's happened.
So last week was the verdict in Los Angeles. The trial was one of a series of nine
test cases that are going to set the direction for a larger group of similar claims. That's
thousands of individuals, school districts and state attorneys-general who are going after
Meta, Google and other social platforms at the moment, seeking damages and design changes,
arguing that these platforms by design have negligently harmed children and young people
and are deliberately addictive. There will be a second set of cases heard in Oakland over the summer.
But really, the verdict tells us what a jury thinks of
Mark Zuckerberg and of YouTube when it comes to whether they were in fact negligent in this case.
So this is using product liability laws rather than looking at whether the content itself is harmful.
I see. Hannah, who got hurt here, and what harm did they suffer, that they brought Meta and Google
to court? So in this case, the plaintiff was known as KGM, or Kaley, a 20-year-old woman arguing that
using social media over the course of her life contributed to her mental health problems,
anxiety and depression. It didn't have to be the only cause they could show. In turn, the platforms
argued, well, there are other mitigating factors. She had suffered familial abuse, for example.
This wasn't on us. But really what the jury was looking at here was whether features like
infinite scroll, for example, where you can just go on and on and on, made her addicted to the
platform and in turn caused her harm. So it wasn't whether the content was particularly bad,
but it's whether by design these platforms are meant to get you coming back just like a,
I guess, a slot machine in a casino. And like this stuff all matters to markets,
doesn't it, Rob? Because these companies are almost unbelievably enormous. So Meta is a 1.36 trillion
dollar company. And Google, or Alphabet, that's like 3.3 trillion dollars' worth of market capitalisation.
So you put them together and that is 8% of the entire S&P 500 US stock index.
And you might think about it this way. Think about how much internet advertising you see every
day, week, month, year of your life. Well, these two companies along with Amazon, which has a huge
ad business, those three companies get half the internet advertising dollars that get spent
anywhere. So they're absolutely dominant in this form of advertising that really colors the kind
of texture of our whole existence. So if this changes the way they have to do business,
it's going to be big news for everyone in a certain way, market, non-market, whatever.
So shares have come off pretty hard. Like I say, it's a little bit difficult to disentangle how much
of that decline in share prices is down to just other stuff and how much is specifically down to
this. But they're only having to pay out a few million dollars in damages. What is the market
extrapolating from here? So I think to date, the platforms have been protected by legislation
in the US called Section 230 of the Communications Decency Act. And that says that platforms cannot
be liable for the user-generated content that they host. And that is broad, that is sort of
enshrined in the current discussion in the US around First Amendment rights and free speech.
The plaintiffs here have gone a different route by saying, we're not even looking at Section 230.
We're going to tap into the product liability space and look at negligence there.
And the fear is that now, as this is the first of a series of test cases, there will be more.
But you see the direction: this could open the floodgates to anyone who says, oh well,
I'm feeling a little bit depressed and I just spent 10 days doomscrolling on social media,
I think it was this platform that did that, and going after the platforms to try and claim some money.
I have two things to say about it. One thing I have to say is that I have two 16-year-olds in my
house. And boy, is there a lot of infinite scrolling going on in the Armstrong household. So I'm getting
ready for my payday. That's point number one. Point number two is the point that Hannah, you just
made, is the point that has Wall Street really excited. This concept that product design is not
speech really has people thinking about the whole business model. And you notice, in responding
to the verdict, it's pretty clear that Meta is still going to fight on the speech line here.
They're going to say that they are protected by things like the First Amendment and Section 230,
but this move of separating product design from freedom of expression or responsibility for
expression is just a huge deal for the whole industry. It feels like Meta is kind of
seizing the political moment when there's a lot of debate about this, but I will say that I've
also seen reaction from academics and safety experts, who typically don't love Meta and are very
critical, saying this sets a worrying precedent. They don't agree with it and it could have
ramifications down the line. Actually, both Meta and some of these more cautious
experts argue that speech and the design factors are deeply connected. Nobody will doomscroll if there
isn't compelling content on the platform in the first place. If it was just videos of paint
drying, you would just put it down immediately. But on the other hand, Hannah, let me take the other
side of that argument. Suppose in 1972 or whatever it was, the car industry had said if you make
us put seatbelts in these cars, you're encroaching on our freedom of expression. We're expressing a
view about a lifestyle, an American way of living freedom and comfort. Of course, this is an
exaggeration, but you see my point. Freedom of speech, broadly construed, is very broad,
and you have to draw lines somewhere around what it allows you to do.
Where does this fit into the broader tech lash that we're seeing? You're seeing lots of countries
looking to restrict the use of social media for children, Australia has already gone down that
route. Is this of a piece with that? Is it connected? I think so. It's a bit of a moment
particularly, as you note, for child safety. Yeah. Obviously, across the political spectrum, there are
debates about where people stand on what speech should be allowed, harmful versus totally
illegal speech. What should we be allowed to say on platforms? But in terms of where children
should be protected, there's a sort of bipartisan agreement that this is an area that will get
politicians on both sides riled up and saying, we need to think of the kids. So it seems
to be part of that bigger moment where we're kind of having a conversation about what age should
children start using social media? Are we now a generation into social media seeing the repercussions
on young formative minds of constant social media use? And if so, can we correct that? So this
sort of ties into that. My hobby horse on this whole issue is that certainly I know a good few
Gen Xers and boomers who have really lost the plot online, and you see lots of people who are not
children who are radicalised online. I'm not aware of very much that's being done about that.
That's a whole other topic. But, Rob, say you assume that this does change the rules of the road
for social media companies. These companies are so ginormous. They're such an important part
of markets, you know, like it or not. They're a huge part of debt markets. They're a huge part of
stock markets. They're everywhere. But your view, which you were talking about in your newsletter
just the other day, was that this sell-off is overdone. Why are you confident about this, please?
I have become so professionally deformed by covering tech industry lawsuits for like 20 years.
And my basic default position on all of them is this stuff never turns out to matter. You know,
you're suing Microsoft. You're having suits over Java. You're having suits over Android.
You know, it's this. It's that and tech companies in general and internet companies in particular
just keep rolling. And I think there's a couple of reasons for that. One is the technology changes
quickly enough that by the time the legal issues get sorted out, you're on to the next phase of
the game anyway. This is what happened with the fight over
browsers and antitrust. By the time that lawsuit got worked out, browsers weren't important
anymore. You know, they were a commoditized product. And also in the internet services business in
particular, these products are given away for free. And demand for them is tremendous.
Here is a free product that we all use and get value out of where the harms
are almost certainly real, but probably vague and counterbalanced by pretty significant goods.
All of this stuff I just mentioned, mix it together, and you get my very cynical attitude that
these kinds of lawsuits just don't tend to amount to very much. I mean, I agree that the
academic literature around whether social media is harmful to children is mixed. There is a pretty
direct correlation, if you're a young woman, between using a platform a lot and body image
issues and dysmorphia. But then, as you say, Rob, on the other side, there are people who might not
find communities, who are not particularly sociable out in the wild, who online find their people
and derive great benefit from that. And it's very hard to see what's causation versus
correlation. The platforms will say there's no link. Some academic studies will find there is,
and some will find there's not, but it's all murky, which I think plays to the platforms' benefit.
I think what worked against them in this trial is that it was a jury trial. I was there
when Mark Zuckerberg testified, and he is not a sympathetic character.
I tell you what, Hannah, if I'm a lawyer suing one of these companies, and Meta in particular,
boy do I want that Zuckerberg on the stand. Let's just have a week of him talking, you know.
You've got it. He came off at times as kind of a bit arrogant, pushing back on
the questions. And so it was a bit like, what are you doing? This is about
a child's mental health. There was also this line that, oh, we weren't juicing for
engagement or time spent, we really care about creating a valued community. Then they would show
emails from a few years back saying, our main goal is increasing time spent. And he'd say,
oh, actually, that's how it used to be, but we've changed that now.
So another, I think, important piece of context here is that we're already seeing some lawsuits
come up from people who say they were harmed or their relatives were harmed by AI chatbots.
Now, where do they fit into all this? Right. Because at the moment, Google and Meta are two of
a handful of companies really throwing billions of dollars into this. That's what's really
preoccupying these companies: winning this AI race and developing frontier models. What I think
is interesting when it comes to the safety piece is that we are yet to know whether Section 230
will protect these companies when it comes to content generated not by users but by
chatbots that they have built, which are therefore products that they have created. They
may therefore have more liability. The platforms are deliberately not talking about this. I've
heard that even internally no one's really supposed to bring it up too loudly, because there's a
general fear that this could come into play later down the line. So that's kind of lurking
in the longer term. I mean, I'd be interested to hear from you, Hannah. What do you think the next
steps are on this road? What is the next big point we should be looking forward to?
Absolutely. The appeal moment. Meta certainly are going to throw everything at this, and they're
already, as you noted, using rhetoric around free speech and First Amendment rights:
this will erode our God-given First Amendment rights in the US. I was speaking to a policy person
who'd worked for a rival company before the verdict. So TikTok and
Snap both settled the case, and we were sort of gaming it out. OK, well, what position is Meta in now?
Meta and YouTube, if they settle, they open the floodgates to many more cases. So that's a lose.
If they go to trial and they win, all good and done, although there might be an appeal process there,
but it's a good sign. If they go and lose, what happens next? They appeal. And they appeal during
a Trump administration, where, if it made its way all the way up, it would face
a sort of conservative Supreme Court. And this person was like, actually, you know, if I'm them,
I'm going to take the gamble and I'm going to go and ride it out and hope that the wider political
moment and discussion around censorship versus free speech helps me when it comes to the appeals
process essentially. Ladies and gentlemen, that is American capitalism in
glorious technicolor right there. This is the American dream. We're going to have to break
and come back in just one sec with Long Short.
Okey-dokey, it is time for Long Short, that part of the show where we go long a thing we love
or short a thing we hate. Hannah, it can be anything, so what have you got?
Today I am powered by this special cold brew. And inexplicably it is Wicked-branded. I don't know why,
some weird collab. I bulk-ordered them and they all came with the Wicked branding on. So it's like
witchy, witchy coffee for witches. Witchy coffee. It says it has your daily collagen boost. I don't
really know what collagen is, but they love it here in SF. Oh, I bet they do. I'm getting it boosted
as I struggle to stay awake during this early morning podcast. Collagen in your coffee.
Rob, what have you got? I am long the FT News Briefing podcast for totally non-self-interested
reasons, including the fact that I'm going to be on the show later this week talking about the
history of the Taco trade on the one-year anniversary of Trump's disastrous tariff announcements
and subsequent chickening out. So I'm long that. Everybody listen in. Shameless
self-promotion. Totally mercenary. I am long the Netflix documentary about America's Next Top
Model. First of all, it's a bit of a trip down memory lane, and let's look at how
dreadfully thin and weirdly dressed everyone was in the early noughties. Were you a fan of that
show back then, Hannah? I loved it. Yes. I did love it, but it has not aged well.
And yes, some of the behind-the-scenes drama that they talk about in this documentary
is pretty hair-raising. So that was some high-quality entertainment. So speaking of high-quality
entertainment, we're going to be back with more for you on Thursday. So listen up then.
Unhedged is produced by Jake Harper and edited by Bryant Urstadt. Our executive producer is
Jacob Goldstein. We had additional help from Topher Forhecz. Cheryl Brumley is the FT's global
head of audio. Special thanks to Laura Clarke, Alastair Mackie, Gretta Cohn and Natalie Sadler.
FT Premium subscribers can get the Unhedged newsletter for free and a 30 day free trial is
available to everyone else. Just go to ft.com slash Unhedged Offer. I'm Katie Martin. Thanks for listening.
