
Today, Meta and Google lose a landmark social media addiction trial in the US.
Jurors found that Meta, which owns Instagram, Facebook and WhatsApp, and Google, owner of YouTube, intentionally built addictive social media platforms that harmed a 20-year-old's mental health.
The woman, known as Kaley, was awarded $6m (£4.5m) in damages, a result likely to have implications for hundreds of similar cases now winding their way through US courts. Meta and Google said they disagreed with the verdict and intended to appeal. Adam is joined by BBC technology editor Zoe Kleinman and economics editor Faisal Islam.
Apply for tickets to Castfest here https://www.bbc.co.uk/showsandtours/shows/castfest-2026
You can now listen to Newscast on a smart speaker. If you want to listen, just say "Ask BBC Sounds to play Newscast". It works on most smart speakers. You can join our Newscast online community here: https://bbc.in/newscastdiscord
Get in touch with Newscast by emailing [email protected] or send us a WhatsApp on +44 0330 123 9480.
New episodes released every day. If you're in the UK, for more News and Current Affairs podcasts from the BBC, listen on BBC Sounds: https://bbc.in/4guXgXd Newscast brings you daily analysis of the latest political news stories from the BBC. The presenter was Adam Fleming. It was made by Jack Maclaren with Shiler Mahmoudi. The social producer was Beth Pritchard. The technical producer was Mike Regaard. The assistant editor is Chris Gray. The senior news editor is Sam Bonham.
This BBC podcast is supported by ads outside the UK.
Hello. Just after we published the previous episode of Newscast on Wednesday, a massive news story broke in California, on the west coast of the USA, where a jury found that Meta, the parent company of Instagram, WhatsApp and Facebook, and Google, or at least the part of Google that's responsible for YouTube, had created social media platforms that were addictive, and had harmed the mental health of one American female user who's now in her 20s. And this has got people asking if this is a watershed moment for the regulation and the business model of social media, but of course there's the big caveat that actually the regulation of these platforms in the UK and in Europe is different from America. So we thought we would spend the whole of this episode of Newscast chewing over the implications of this court case with the BBC's technology editor Zoe Kleinman and Faisal Islam, our economics editor. So that's what you'll hear on this episode of Newscast. Newscast. Newscast from the BBC.
Hello, it's Adam in the Newscast studio, and joining us here in person, I think actually for the first time ever, is our technology editor Zoe Kleinman. Hello, Zoe.
Thank you for having me. Welcome to the fake living room.
I've never been here before. It's lovely.
Although you've been on Newscast hundreds of times.
I have, but never in the same room.
And perhaps never with such a landmark court case to discuss.
Indeed. We'll see. And Faisal is here too. Hello Faisal.
Right. So Zoe, this verdict was in a court in California.
It happened on Wednesday.
Take us back to sort of the start of this case and who brought it and why.
And why was it in court in the first place?
At the heart of this case is a woman who is now 20 years old,
who is being called Kaley. We don't know her full name.
And she says that she was excessively using social media when she was younger.
She said she discovered YouTube at six and Instagram at nine.
And at some points she was online for up to 16 hours a day.
So a lot of time on social platforms.
And she says she had a mental health crisis.
She was diagnosed with depression with body dysmorphia and with suicidal thoughts.
And she blamed her social media use for those conditions.
There were originally four companies in this court case.
They were Meta, Google, TikTok and Snap, which owns Snapchat.
TikTok and Snap chose to settle; Meta and Google chose to fight.
And their line was no single app can be responsible for a teen mental health crisis.
It's a very complicated thing.
This young lady had a lot going on in her life.
She'd already had a difficult childhood by the time she discovered social media.
And that she was coming to the platforms to get away from her problems.
And they were not the cause of them.
However, the jury found that the social networks had been, in their words, negligent
in failing to protect her from harm on their platforms.
And, interestingly, that the platforms themselves were,
because of the design features within them, addictive.
And what sort of features are we talking about then?
So you'll know them really well.
So this is auto scrolling, right?
The fact that you never run out of content.
It's auto play.
So a video lands and it just starts playing straight away to sort of suck you in.
It's the algorithmic recommendations that are so good, aren't they?
Most of the time it's serving up that thing that you didn't know you wanted to watch next.
Those are the kind of design features that are very deliberately there
to keep you engaged, because what these platforms need are lots of people there
for as long as possible, so that, while they're there,
they see as many adverts as possible, and that's how the companies make money.
And was the court just looking at the features that we can all see
when we use these platforms ourselves?
Or was there other evidence that was brought forward
that the judge and the jury were looking at?
So I think those particular features were specifically brought up
because the heart of the trial was really: was she addicted?
And it's a term that we use lightly, isn't it, you know?
Oh, I'm just addicted to Instagram or whatever.
People say it all the time, but addiction is actually a clinical term,
and so for a court to investigate that and decide whether or not,
you know, it truly was an addictive platform because of the way it was built
is a new era, I suppose.
It moves it on from that casual "Oh, I'm addicted" to actually "This is addictive."
Yeah, and what sort of remedies did the court find?
Because I know they've awarded her some damages.
Is it just money?
Well, they haven't finished yet.
So initially there are damages totalling $6 million,
which is about £4.5 million, and they have said that
Meta has to pay 70% and Google has to pay 30% of those damages.
But what we don't know is whether there will be further remedies to come.
And also, you've got to bear in mind that this is only the second case
of its type in the US.
There are loads of these coming down the track.
We've got a whole year of these coming through the courts,
unless, of course, the companies decide to settle.
So we're going to see a lot more of it.
And already, in the two that we've had, Meta has lost.
And Faisal, does this seem like a new era for these businesses?
Well, what Zoe's just described is the core business model,
is it not, of many of these kinds of offerings, of these forums,
of these ways in which to connect people, and people with businesses,
and increasingly young people. And the absolute heart of them
was engagement and keeping people stuck on those platforms.
And that is what made it good for advertisers.
So, you know, you kind of wonder where this goes.
And it's one person, right?
And we know the numbers.
And it's a very human example of some of the studies that have been done
across populations of 15-year-olds.
There's kind of quite famous charts showing, you know,
around the time when social media kind of burst onto the scene,
levels of happiness amongst young people going down.
And, you know, we have as journalists said,
well, is it cause, is it correlation?
Is it accident?
And I think intuitively, many of us, many of us who are parents,
are just like, you know, well, it seems obvious, really.
But now we're getting a kind of legal sort of pathway.
And clearly, their share prices have been hit.
Like, where does the business model go
if they're being held accountable in this way?
You know, it's a fascinating...
Yeah. Well, Zoe, let's brainstorm that.
Because I can see a pathway, as Faisal puts it,
where, okay, this has cost the companies a little bit of money,
as far as they're concerned, a little bit of reputational damage,
but they can maybe withstand that.
Or there's a different pathway,
where there's actually a re-engineering of these products
as a result of this court case.
And there was a different court case in New Mexico
the day before as well, wasn't there?
Yeah. So some people are describing this
as big tech's big tobacco moment.
And if we look at that path,
what might we see?
Are we going to get health warnings on screens?
Are we going to see restrictions on advertising
and on the sponsorship and partnerships
that these companies are able to do?
Are we going to see these addictive features disappear?
But as Faisal says, that's kind of what makes them fun, right?
If you go on a social network and there isn't really anything to do,
you're not going to be there for very long.
Will that be the nail in the coffin when it just fizzles out?
Well, because there was an era of Facebook
where you could get to the end of your news feed.
Yeah. And that was it.
You're like, oh, well, I'll just shut it down
and go and do something else.
Exactly. And because I'm really old,
I remember Friends Reunited.
And how exciting that was going,
oh, here are all my school friends.
And then I found my school friends and said, hello.
And then I was like, well, I'm done now.
You know, what else do I do here?
It just got boring very quickly.
And without those engagement features, arguably,
you're not really left with a lot to do, are you?
And I think though, it may be the case
that what will seem to be the aberration
is the fact that we allowed a wild west of this type
in particular to be foisted upon children.
You know, so this is where we take one case.
We look at it through a business angle.
By the way, because you'll want to know:
Snap's share price is down nine percent,
Meta's is down six percent, Google's is down two percent,
because actually this part of Google's business, YouTube,
is not the hugest part of it.
You know, it's a massive part of their business,
but they do a whole lot of other stuff too.
So this shows you, you know, people are questioning
these stars, these tech stars of the US economy.
How much are they actually worth?
How can we actually price what they're worth
if they can't do this into the future?
So yeah, you know, as we look back on this
and you think, wow, if they were addictive,
then people start to draw analogies
with other addictive products.
So the parallels are with big tobacco.
Yeah, well, tobacco,
but then alcohol, drugs, whatever.
Is that now how we treat this?
If it causes illness, actually, what I don't know
from the court case is,
was there any sense of internal documents
that showed that they knew that this was happening?
You know, one hears from whistleblowers;
there have been various books about this.
Well, I was covering this on Five Live right here
this morning with Rachel.
And we spoke to one of the lawyers who was involved in it
and he was talking about this document
that had come from Meta,
so the people that own Facebook and Instagram
that was about beauty filters.
And basically, Mark Zuckerberg personally
had been presented with this sort of dilemma
about beauty filters:
our internal research at Meta suggests
this could be bad for people's body image,
but equally, introducing this feature
and keeping it will be good for our business.
And Mark Zuckerberg took the business side
of that equation rather than the mental health side.
So there were documents, weren't there?
The whole discovery process meant
that they had to deliver internal stuff.
And sometimes I think, you know,
we are kind of surprised when these multi-billion dollar companies
act like multi-billion dollar companies.
But that's what it is.
You know, Meta might present itself as,
oh, we just want to unite the world
and get everybody to be friends and love each other.
But, you know, it is a business.
And at the heart of that, business is a bottom line.
It's the jurisdictional thing here
to use the very pretentious word
because actually regulation of companies
and social media and the media generally
is very different in America than it is in Europe.
I just wonder if actually we're already
in a different regulatory environment here.
Well, I mean, there have been a lot of complaints
from the White House that Europe,
the European Union and the United Kingdom
have been treating these companies badly.
And I think the other way where this goes
is it does clearly open the floodgates
if they were holding back in any way, shape, or form
to kind of give these companies a massive kicking.
And so this was done through what
tends to happen in America,
which is the court process.
And what tends to happen in Europe,
including the UK, is it's kind of done by governments
and regulators and occasionally some bureaucrats.
And we've already heard
from the Prime Minister, sort of piling in on the back of this,
about regulatory solutions.
I'm not sure about the extent of any court cases here,
but there has been an effort to regulate tech
across a whole stream of areas, copyright, things like that.
The US goes down the legal kind of route
and the Europe goes down the sort of governmental sort of route.
And so they're quite different.
This will greatly encourage those people in Europe
that want to give a bloody nose to the US tech companies.
There's no doubt about it.
The US companies have this incredibly powerful shield
in America, which is this thing called Section 230.
And basically what it does is protect them from liability
for what is published on their platforms.
So that means if you stick something on Instagram
that is illegal or whatever,
Instagram is not liable for it.
Whereas if you did that here on the BBC,
the BBC would very much be responsible.
Media companies do not have this protection.
And the argument in favour of it is
that the tech companies couldn't survive without it
because they've got billions of people
uploading billions of things in real time,
all the time around the world.
And obviously we're in an AI era now,
but we weren't then.
And it was just impossible to manage that level of content.
But there are questions being asked about
whether they deserve that protection, right?
Yeah, and that bit of law comes from the Bill Clinton era,
doesn't it? The internet was very new
and we didn't really know where it was going to go.
So, just in terms of what the companies have said.
So Snapchat and TikTok, they settled in advance.
Do we know anything about their settlement?
No, they have not.
Super secret.
American court case style.
And then you hinted at what Meta had said,
saying that they thought there's lots of other factors
that you've got to take into account
when you're looking at somebody's mental health.
What did Google say?
Well, Google also disagrees with the verdict.
And actually it says that it does not consider YouTube
to be a social network.
It feels that the platform has been sort of misunderstood,
I suppose, in this case.
And the lawyer was saying this morning on Five Live
that this is the tip of the iceberg
and that there will be many, many more cases.
Is that what's going to happen?
Well, there are.
There are lots of cases already in the pipeline in the US.
And I suspect, as Faisal says,
everyone is watching this and thinking,
well, okay, I'm going to do it now
because, you know, there's a chink in the armour here.
I mean, these companies are so mighty
that the fees that they've spent defending
this one case are eye-watering,
totally beyond the scope of most people,
especially, you know, an ordinary family
or indeed an individual.
Nobody has that kind of money.
But they've lost, even with, you know,
that mighty defence behind them
and the seemingly bottomless resources.
I mean, a tech contact of mine
was saying to me quietly
that they think the reason TikTok and Snap settled
was because they couldn't afford to fight it.
Right, because the legal fees were going to be so huge.
And obviously, there's a social movement now.
And we know about it in the UK.
It's, you know, it's pretty clear that schools
are now saying, at secondary school,
you've got to have a sort of dumb phone, I guess,
or like an old-school burner phone.
No, that's probably wrong as well.
That's got the wrong connotation.
I can't explain.
I mean, you do have fun.
We don't want them to do that.
A phone that doesn't have access to social media,
you know what I'm talking about.
And so that's becoming the norm now.
And this is about social norms as well.
Like it's extremely hard for any parent
to try and wean their kids off this stuff
if it's given to them at any point.
So what you're seeing is the school system education system.
That's not mandated by the government.
I think it is in some other areas
and we can discuss in a minute
about whether the balance is going to come.
But you know, at school level, it's happening.
And then you set a social norm
and then you find out that kids find other ways
to communicate one-on-one instead of, like, en masse,
instead of, you know, in this way,
sort of showing pictures of themselves.
You know, I think parents in general
are becoming a little bit more savvy
to what you want to expose a kid
going through the travails of, you know, early teenhood.
I mean, you know, how much do you want them to have to like,
like, how much is too much information, connection, contact,
you know, what is real about all this stuff?
And then we haven't gotten to like,
there's all this stuff going on with social media.
That's a little bit rear-view mirror now, isn't it?
They're all sort of becoming friends with their AI's now
and who knows where that goes?
But also, look at their role models,
their examples, look at us.
You know, I am...
Again.
Oh, yeah.
Look at the rest of us in their lives, right?
I was on the Eurostar this morning coming back
from Brussels; every person in the carriage
was on their phone, almost the whole time.
All the adults are on their phones the whole time.
You know, whether they're chatting to a chatbot
or whatever they're doing, I don't know,
but nobody had a book or a paper.
We're all doing it.
So I think, on the other hand,
it's quite difficult to give this message to children
when you're sitting there on your phone yourself, you know,
and I am as guilty of that as anybody else.
Very honest stuff, you know.
Right, and I'm glad you got a trip to Brussels, I'm jealous.
Let's talk about what is happening in the UK then.
So the Online Safety Act finally was passed a little while ago
after years and years and years of going through Parliament.
But then it's had various phases of implementation,
and it's only sort of in the last six months,
the last year, that Ofcom
has got the full powers to actually use it.
But now the government's doing a consultation
about doing even more.
What are the options that they're looking at?
Just remind us.
The public consultation that's underway at the moment
is looking officially at various ways of safeguarding children.
And I can't remember the technical phrase at the beginning,
but something like children's wellbeing online.
And one of those options,
the one that, of course, is dominating the conversation
is should we consider a ban on social media
for under 16s like they've done in Australia?
And that is the one that's generating all the headlines.
But there are others and actually interestingly,
one of them is should these design features,
the auto scrolling, the, you know, the auto play,
should all of that be banned for children?
And another one is about, I mean, there are various ones.
There's one about VPNs, should children have access to VPNs?
Now these are things that mask where in the world you are.
They're useful tools for lots of people
and they're very sort of vital privacy tools.
Why do kids need them?
Or do they need them?
You know, it's one of the things that's up for discussion.
And I saw that this week,
the government's now launched some pilots
with actual families,
where they're going to try out different options
that are contained in the consultation.
So they've got a bit of data about what happens
when you implement these things.
Yeah, I bet those kids are delighted.
Yeah, who gets to be in the no-change group?
Can I just ask one thing then?
In theory, if this has shown damage,
it's not necessarily limited to kids, is it?
I mean, the lady here, she's now an adult.
There's no theoretical reason why this may not have had
a similar impact on other people,
on many millions more people, I guess.
I mean, I'm just throwing that out there.
But is this, you know, in this conversation,
we're focusing on children.
I mean, it's obvious why.
You know, it was the former Twitter exec,
Bruce Daisley, who said on air to the BBC today,
something like, if this actually was applied
around the world, some of these social media companies
would all be worthless.
And so since I've told you that Snap was 8% down,
it's now 9.5% down.
Right.
And actually, you can have no views about social media,
but if you've got a pension,
probably a big chunk of your pension fund
is invested in these firms.
Yeah, yeah.
And probably Meta, which is one of the Magnificent Seven,
less magnificent, as it happens, just now.
But yeah, you know, I do wonder, though,
if in this case the courts are just catching up
with where society has moved.
Just while we're on the UK pilots and consultations point.
Again, from being on five live this week,
I've had a few conversations with people about this.
And it's interesting because today,
when Qistarma was on his way to Finland
for this defense summit with his Nordic allies,
and he was asked about the court case
that we started off the episode talking about.
And he then said, oh, we're doing this consultation,
which includes looking at a potential ban
for under 16 year olds,
sort of sending out the signal of like,
the government already knows what they want the answer
to be from this consultation.
But then yesterday I was chatting to the AI
and safety minister, who was on air,
and he was being much more open-minded
about what the options are.
So I wonder if actually is the government truly open-minded
about where they end up in the summer,
once they've looked at everyone's responses
and they've done these pilot studies,
or actually are they getting on
and thinking about just doing a ban?
So what's your take?
It's kind of weird, isn't it?
Because on the one hand,
it's definitely as Faisal said,
if you read the room,
this seems to be what people want.
But they can't really be seen to be jumping the gun.
They've committed to this consultation.
I don't even think we're a month into it yet.
They can't start saying, right,
well, we think we know what you're going to say.
And the other thing is,
there's a ping-pong situation going on
at the moment between the House of Lords and the Commons.
And this is over an amendment to a different bill.
This is the children's schools and wellbeing bill
that is primarily focused on education,
but also includes safeguarding children.
And the Lords wants to add an amendment,
which basically gives ministers a year
to decide which platforms to ban for under 16s.
And the Commons has kicked it out.
They've rejected it.
They went back to the Lords yesterday.
They've pushed it again.
So it's going back to the Commons.
So actually, currently,
as things stand,
Parliament's divided over it.
Well, and this consultation that they launched
a few weeks ago was the initial response
to the first time the Lords did that.
So...
Yeah.
Yeah, the Lords has strong feelings about it, I think.
And the government is trying to be open-minded.
I think you could probably separate out this dilemma
of the UK trying to be kind of home
to as many tech companies as possible,
and be an attractive place,
from this issue of social media and children.
Indeed, what you hear from these companies
is that they want clear regulation, right?
It's the direction of travel.
I don't think the UK is massively...
the future of the UK isn't to be massively specialist
in social media, you know what I mean?
Yeah.
Probably more like AI and quantum computing
and things like that.
So I think that where in other areas
there's a more straightforward dilemma.
If you want to host all the AI companies
and the data centers want them to invest
in their best possible kit in the UK,
you may have a conversation about copyright,
you may have a conversation about energy policy.
Yeah.
I don't think too many of them are going to sort of run
for the hills if there's an under-16 ban.
Yeah, because this was a court case involving YouTube,
which is part of Google's parent company.
It wasn't a court case involving DeepMind,
which is the AI bit of Google's parent company
that's based in the UK.
So, yeah, that's a pretty clear way of thinking about it.
Can we talk about a few other sidebar tech stories?
So Zoe, one that really intrigued me yesterday
was OpenAI, who run ChatGPT.
They had this tool that was available
in North America called Sora,
which is their video-generating tool.
And if you're on social media,
you will have seen hundreds and hundreds of Sora videos
of, like, cats playing violins and things like that,
which are not real.
It sounds like they're turning it off.
Yeah, so they launched an app
for people to do their own AI generated videos via Sora.
It had an enormous fanfare.
They also did a big partnership thing with Disney,
with the idea of creating content for Disney.
And yeah, they've shut it.
They've said, oh, we're closing it down.
I read somewhere that it didn't take off as much as they thought.
And the other thing you've got to think is,
oh, it must be costing them an absolute fortune.
Sam Altman has said,
stop saying, please, and thank you to CHATGPT
because it costs money to process those extra words
and we don't need it.
The amount of water that's used in a data centre.
Exactly.
Do you say, please?
Yeah, please stop.
But imagine how much of the earth you're scorching
to create your clip of your cat playing the violin.
But in terms of the money that it's costing this company,
OpenAI is one of these,
well, you'll probably have a view on this, Faisal.
I think it's one of these companies that's incredibly big
and incredibly powerful,
but isn't actually making any money,
wants to IPO, we believe,
and really needs to start generating a bit of cash.
It's burned through astonishing amounts of money.
It's not making any.
And now, you know, it feels a bit like a company
that's starting to tighten its belt a bit, doesn't it?
You know, it's like it's cutting the sort of the nice to have
and perhaps the AI-generated video app is one of those.
The economics are getting real.
Yeah, it's focusing on enterprise.
It's said that, hasn't it?
It's focusing on its business offering or something.
Well, and boring stuff like coding,
which is actually quite useful.
But I think, very specifically,
the constraint on the growth of these companies
is on compute power now,
which, in turn, is about energy.
So if you look at something like Sora,
I'm going to imagine here that, as amusing
as some of those videos were,
it isn't making any money.
I don't know how it was making any money at all.
I'm not sure how they, or Disney,
were going to make money out of it.
Well, they were going to take a share stake worth a billion,
which would have been worth 10 billion
in a few years' time, I imagine.
And you can see why they wanted to do that.
But like, this time last year,
every company was trying to do anything
that involved OpenAI or ChatGPT.
And every time they made these announcements,
their share price went up.
Now, we're a world away from that.
Almost like reality hits.
Since, I'd say, November, December,
ChatGPT has also been facing, like, serious competition,
for example, from Google, with Gemini,
but also from Anthropic, with Claude,
for certain use cases,
whereas ChatGPT is a bit more generalist.
So I think it's about
the rationing of compute power
and, in turn, about energy.
And therefore, you deploy exactly as Zoe says
to where actually you're making revenues
in time for the IPO.
So it's incredibly interesting.
And so that means it's going to be tougher
for some of the fun, but like vaguely pointless use cases
of a generative AI.
It's going to be tougher,
because you'll be charged for it.
Amazing.
I mean, I don't know what the numbers are.
Sometimes, you know, how many trees were burnt
in order for you to make that video
of the elephant dancing on Strictly, right?
I don't know, I just made that up.
I think I probably should make that video.
Well, you can't now, that's just it.
I don't know if you can.
Anyway, I'm going to use another provider's platform.
And then the last one, and this is going to sound
maybe a bit niche to a lot of listeners,
but it's a big deal in business and tech.
Faisal, you were very excited about Arm Holdings.
Oh, yes.
Just explain who Arm Holdings are.
Well, they are the granddaddy of the UK tech dotcom era.
So this is very interesting, right?
They're rooted out of Cambridge, out of Acorn.
Some 350 billion devices,
iPhones most famously, cars, have used their designs.
So they designed the chip, the chip core,
but they never made it, right?
And now they've started creating
their own chips for the first time.
The manufacturing will be done in Taiwan, I think.
But there's a BBC angle here.
So Arm is fabulously successful, born in Cambridge,
and part of its success story
was something called the BBC Micro.
And so it is now a global company.
Their roots are a bit more mixed now.
They were sold to the Japanese around Brexit time.
To SoftBank, specifically.
Yeah, yeah, yeah, and now they're floated,
but they're still headquartered out of Cambridge.
So there are interesting questions here about,
you know, could they have built this sort of chip factory
in the UK?
And the answer is that we probably missed that boat.
Yeah, well, because it's interesting, isn't it, Zoe?
Because you feel a sort of bit of national pride
when you talk about it.
Because like, oh, it was British people that invented it,
but it was also British people sort of let it go,
and it's now in the hands of people who are not British.
So it's kind of a nostalgia story.
Yeah.
It's a heritage story rather than a, like a thrusting British business
story.
They won't like you saying that, but I kind of agree.
And I think that this is just symbolic of what
happens in this country.
We're good at ideas, and we're good at startups,
and we're terrible at scaling up.
You know, whenever I meet small businesses,
medium sized businesses, I always say to them,
like, what's the game plan here?
You know, what do you want to get out of this?
And they almost always say, I want to be bought by Microsoft
or Google, or I'm the, you know, they never say,
I want to be Google or Microsoft.
That ambition just isn't there.
And I remember speaking to Mustafa Suleyman,
who deliberately decided to go to the US to set up his company.
He's a British AI guy.
And I asked him why.
And he said, well, he said, firstly, there's
just not the money here.
You know, we think we're doing well,
if we get 100 grand, right?
And in the States, you're talking about $100 million, you know,
it's just a completely different set of values
in terms of how much money you can get.
And he also said, and I think this is really interesting,
that we just, we have such a different culture here.
We don't tolerate failure.
He said, in the US, if you meet a business professional
who hasn't ever failed, then you kind of don't trust them
because you think, well, how are you going to know
what to do when, you know, it hits the fan,
because you've never lived it.
Whereas here, it's toxic, isn't it?
And you don't want to be near the person that's failed
because they might make you fail.
We have a totally different mindset.
And he argued, and people argue, that that just, you know,
means that we don't scale up.
But this is why cabinet ministers are, I mean, privately,
kind of almost thankful to Demis Hassabis,
Sir Demis, sorry, at DeepMind, for having stayed,
because Google kept on trying to get him to go over to America.
So the fact that he kept things going on in the UK,
and he's quite open about this in the documentary
about him that went out over Christmas.
I have a cameo in it.
Yes, a right name should do, yeah, yeah, yeah.
Oh, you don't have to watch it.
Did you ask him a really, like, probing question?
No, I think I'd nod.
What a close and a poor boy.
Yeah, yeah, yeah, yeah.
He got us down somewhere.
But it's been viewed like 400 million times on YouTube.
It's massive.
There's a love affair with Britain there,
and I think a lot of people don't really realise
the centrality of the UK.
But like, although DeepMind was sold to Google
for half a billion, which seems like the bargain
of the century now, like you could argue that it's worth
within the Google Empire, I don't know, half a trillion now.
A trillion, you could argue, just to park that there.
The fact that it's in the UK and then the ecosystem
that's now been built around this,
we've talked about this before.
I don't know if it's real or not,
but it's London Maxing thing, right?
And then on top of that,
when I spoke to the Chancellor a couple of weeks ago
at the Mais Lecture, they were trying to make this case,
and I'd be interested to know what Zoe
thinks about how realistic this is,
that for the next generation of SuperTech,
the quantum computers, we can persuade them to stay
in the UK, because we again,
we are killing it in terms of the tech,
but can we create the billion- and trillion-dollar quantum companies?
So you can have the last word,
but it has to be quite a short word,
because you've got lots of other things to get on with.
I'm going to.
Can we be the quantum champions?
Well, we're, again, we're good at it.
We've got the talent, we've got the ideas,
we're kind of getting the infrastructure.
I mean, it would be nice, but already,
the US companies are fishing here,
they're buying up our successful companies,
and they're waving big chequebooks,
and we're all human; wave a big enough cheque,
of course you're going to say yes,
after five minutes.
The BBC Micro, quantum computer, it wouldn't work.
I mean, I think it was based on BBC Micro technology,
from my memories of playing Chuckie Egg
in the computer lab in Glasgow in 1988.
I did a lot of Pac-Man on a BBC Micro.
Yeah, I loved my BBC Micro.
Right, so you've got to go.
Thank you very much, and Faisal, thanks to you too.
Thank you.
Have you ever received a call from a stranger,
regarding student loans you don't owe,
an unpaid parking ticket for a car that you don't even own?
If so, you might have been the target
of a scam orchestrated by criminals
thousands of miles away.
I'm Tristan Redman, one of the hosts of The Global Story podcast,
and we're taking an inside look
at the highly lucrative
scam factories of South East Asia.
Listen to The Global Story on BBC.com,
or wherever you get your podcasts.
And that's all for this episode of Newscast.
But a reminder:
if you would like to see this podcast being made,
along with lots of other podcasts,
at the BBC's legendary Maida Vale Studios,
you can find the link
in the description box below.
Hello. Podcasts are being made
at the BBC's legendary Maida Vale Studios in West London,
and on the 25th of April you can be there too,
because we're launching CastFest,
our big podcasting jamboree,
and we need you to come and be there.
And if you'd like to apply for tickets,
the deadline is approaching.
So get your skates on,
which seems like a very sort of old-tech phrase.
Actually, that's a bit BBC Micro, isn't it?
Get your skates on.
The link is in the description of this episode of Newscast,
and we will be back with another episode very soon.
Bye-bye.
Newscast.
Newscast from the BBC.
From one newscaster to another,
thank you so much for making it to the end of this episode.
You clearly do,
in the words of Chris Mason,
ooze stamina.
Can I also gently encourage you
to subscribe to us on BBC Sounds?
Tell everyone you know,
and don't forget,
you can email us anytime
at newscast at BBC.co.uk
or, if you're that way inclined,
send us a WhatsApp on +44 0330 123 9480.
Be assured, I promise.
We listen to everyone.



