
After a California jury found Meta and YouTube liable in a landmark social media addiction case, we explain why the platforms you use may soon be unrecognisable. Plus - an exclusive interview with an expert witness from the trial, the world-leading addiction psychiatrist Anna Lembke, about how tech companies used persuasive design techniques found in gambling to keep us hooked online.
Also on this episode, after Instagram and OnlyFans removed the accounts of the so-called 'Maga dream girl', Matt and Marianna discuss the world of AI influencers. As an increasing number of virtual creators amass millions of followers, is this becoming a medium for political propaganda? And do regular users care if they are real or not?
Top Comment is hosted by Marianna Spring and Matt Shea. The series producer is Laurie Kalus. The social producer is Sophie Millward. The technical producer was Mike Regaard. The editor is Justine Lang. The senior news editor is Sam Bonham.
Apply for tickets to Castfest here: https://www.bbc.co.uk/showsandtours/shows/castfest-2026
If you want to get in touch about something that's popped up on your feed, our email address is [email protected] or you can send us a WhatsApp on +44 330 123 9480.
This BBC podcast is supported by ads outside the UK.
Matt, you've had a little bit of drama this week. Can we call it drama?
Well, yeah, so I mean, the first thing is we're both very excited that we're doing Castfest,
which is a podcast festival run by the BBC. All the best podcasts, like Americast,
you name it, and obviously Top Comment. Actual journalists like Laura Kuenssberg are going to be
there as well. And us. And us. You can come along to it. It's in May,
it's in London. And if you go to the link in the description, you can sign up for
tickets to come. But the drama you're referring to is that my name was at the bottom of the line up
and very small on the poster. It was like Matt was the supporting act. Yeah, I still,
I feel like maybe people at the BBC don't understand how big of a deal I am yet.
Because I have the tiny chair as well, tiny name, tiny chair, like I'm getting it. But I understand
they've now changed the lineup and my name is now quite near the top and much bigger. So I don't
know if my agents had a word with them or what's going on there. Yeah, or we've done a bit of,
we've done a bit of whispering. You probably assumed that it bruises my ego and
preempted that. Okay, other than Castfest, we are going to be talking about loads of stuff on
the episode today. We're starting with this verdict that's come in, this landmark case that
was happening in California. I know landmark is kind of overused, but it actually kind of has been
all about social media addiction and the companies and what they're accountable for and not.
Yeah, this is probably the biggest story in social media to happen so far since we've been doing
this podcast. Yeah, we're also going to talk about the rise and rise of AI influencers.
As in ones that are actually AI generated. Yes, influencers that don't exist.
Okay, so this verdict that's come in in California, it's a really big deal because
Meta, which owns Instagram and Facebook, and then YouTube, which is owned by Google,
have been found liable in this case, which is essentially about social media addiction. So it's
about a 20-year-old who goes by the name of Kaylee, who sued these companies, several of the companies
actually, some settled, some ended up in court, saying that being addicted to social media
as a kid has massively impacted her and her mental health. And TikTok and Snap actually settled
before the case began, which is why we're focusing on Meta and YouTube. They argued there's no evidence
that the issues that this young woman experienced were because of our platforms. Obviously she argued
the opposite, her lawyers argued the opposite, and the jury found them liable, which means that
she's getting awarded something like $6 million. So three million in compensation and three million
that's like punitive damages. And the reason this is such a big deal is because this is not just one
person. This is what's called a bellwether case. There are over 10,000 minors or young adults who
allege harm or addiction caused by social media companies. There are over 800 school districts who
allege this. And this case was picked because it's representative of a large number of these cases
of children and young people being allegedly harmed by social media. She's won, this now changes
everything. But what is so interesting about this case and this litigation is the clever way that
the prosecution have made this about how the platforms are designed, not the content on the
platform. So until now, loads of people have tried to sue social media companies, but they've always
been protected by a law called Section 230, which has also been called the 26 words that created
the internet. It's basically a US law that means that social media companies and internet-based
platforms cannot be held liable for the content people post on those platforms. They're not
publishers, that kind of argument. Exactly. This one law has arguably allowed all these social media
companies to become the force that shapes our world. But this litigation is so clever because
it's not saying the specific content being published is the issue. Therefore, free speech is not
the question here. It's alleging that these platforms are intentionally designed in a way that makes
young people addicted, not only that, but these companies knew that and did it anyways. And that's
what's so striking about this. What this jury has come back and said is this is deliberate. They make
these design choices and these design choices have an impact on people. It's not happening sort of
by accident or as a by-product. It is a kind of conscious, intentional decision. And that feels
really different from anything that's come before. Absolutely. And it's because one pro-anorexia video,
for example, in itself is arguably defensible by free speech and isn't that harmful. But if a
platform is designed, in theory, to send 10,000 pro-anorexia videos to one person, the way that platform
is designed, they allege, is intentionally harmful. And the companies obviously disagree with this verdict.
They've made that very clear. Meta said, quote, teen mental health is profoundly complex and cannot
be linked to a single app. We will continue to defend ourselves rigorously as every case is different.
And we remain confident in our record of protecting teens online. And then the spokesperson for Google
said, this case misunderstands YouTube, which is a responsibly built streaming platform, not a
social media site. And both of them are going to appeal this case. And obviously, you imagine that
they've got one eye on the other cases that might come, that could end up the same way or not.
I mean, it really reminds me of that thing we were talking about before, which is how the companies
have often argued, we are a mirror. We're holding up a mirror to society. We don't actually affect
people. It's just we're reflecting it back to you. And essentially, what this case says is,
that's not true. These are proactive systems. They are features that you've designed and implemented
and they're having an impact. And this comes at around the same time that Meta was
ordered to pay $375 million in damages in a New Mexico case for allegedly misleading users
about the safety of its platform. This is possibly the worst week for social media companies ever.
The Zuck is not happy this week. I actually did an exclusive interview with Anna Lembke,
author of Dopamine Nation, who was an expert witness in the California trial. And she's also a
world-leading addiction psychiatrist. And this is what she said about the verdict.
I didn't know that the public was ready to identify just how addictive social media can be.
And my worry was that so many people themselves use social media that they would want to normalize
it and not look at the problematic piece of it. Adults are not obviously immune to this problem.
And I think that probably did impact the jury verdict. I suspect that in describing these
phenomena, the jury saw themselves in this behavior. And it resonated. They said, yeah, this is true.
I've had this experience on these platforms. Early on, people were likening it to a big
tobacco litigation. I think that's a fair analogy. But I had this sense that maybe we would have
to lose a bunch of cases before we would win them. But it's clear to me that we have reached a
tipping point that even with these very beginning trials, we're getting guilty verdicts, we're getting
large punitive damages. This is going to change things for these corporations.
To me, it's really interesting thinking about how a jury would respond in this case. Because I
guess the thing about social media is that it is just so universal. And yet, it increasingly feels
like, and maybe this is just the state of my DMs and why I feel like this, that people are much more
aware, much more vigilant as to what they're being pushed, why, and how the
platforms actually work. And so it strikes me that someone like Anna who knows loads about this
was kind of like, I'm not sure whether people are going to realize that this is the impact that the
sites are having. And what seems to have happened is actually they've heard the evidence and thought,
oh, I kind of recognize this. Yeah. Because although these cases are about young people and children,
this applies to adults too. This applies to all of us. And when Anna Lembke was explaining how these
platforms are designed with something called persuasive design to keep us on them as long as possible,
it resonated so much with me. I began to realize that I was addicted too. So they have neuroscientists,
the top neuroscientists in the world working on this. This stuff comes from the world of
gambling, of slot machines. These are features like infinite scroll, autoplay. There's also
recommendation algorithms, follower counts, which give you dopamine because the perception that
you are being liked by a social group becomes addictive. The sense of FOMO. Push notifications.
These are all intentionally built in because they know exactly how to keep you hooked and how to
control your dopamine reward cycle in your brain. But where it gets quite neuroscience-y, and I think
quite sinister, is in this concept of novelty, uncertainty and variable rewards. So the way our
brain works, the way dopamine in our brain works is a constant dopamine stream, a constant reward
actually keeps us less hooked than variable. Here's a bit. Now we take it away. Here's a bit. Now we
take it away. So they use this novelty uncertainty principle where they intentionally do not let the
algorithm give you exactly what you want all the time because that actually keeps you going back
for more, rather than getting what you want all the time, where you kind of feel sated and you're
tolerant to the dopamine. So they give you a bit of dopamine, then they stop. Bit of dopamine,
stop. Because they don't want you to become almost desensitized to that level of dopamine.
The idea that you are always about to get the thing that you want is apparently one of
the most potent ways to keep us hooked. So when you think of a slot machine, if you need three lemons,
it'll give you one lemon, then another, so you nearly get it. Then a cherry. Yeah. Yeah. That is
how these algorithms are designed. We know that intermittent rewards are actually more addictive
than reliable rewards. So when those rewards come at an unpredictable interval or we're not
quite sure when we're going to get the reward, we're more likely to stay engaged in anticipation
of that unknown reward. You know, these companies have incorporated that kind of slot machine
design knowingly into their product such that you have this pull to refresh phenomenon or the
quote unquote, mystery of the algorithm, their language, such that part of what keeps us engaged
is this kind of treasure seeking function, not knowing, you know, what's around the next algorithmic
bend. If you become tolerant to the same kind of reward, they then switch it up and give you a
different version of that. She compared this to if you're addicted to cookies and someone's like
shoving a chocolate chip cookie in your face constantly. And then you're like, okay, I
don't want any more chocolate chip cookies. That's when they would switch it, and now it's a
chocolate chip cookie with almonds. Yeah. So it's different. Yeah. You're mixing it up a bit. You're
like, oh, okay, this is a variation on a theme that I quite like. And the novelty
uncertainty thing would be like, actually, I'm not going to give you any more chocolate chip cookies.
I'm just going to give you oranges and apples. And you're like, oh, and then you get a cookie again
and you're like, yes, exactly. And when I talk to people who work within the social media companies,
especially engineers, so those are the people basically in charge of designing and
controlling these algorithmic systems, they talk about how ultimately they are
optimizing for engagement, whatever the cost. And I guess the best way of optimizing for engagement is
to operate this kind of reward-withdrawal type system. You brought up earlier this argument
that they've often made that we're just holding a mirror up to you. You are the ones making the
content. This is what you are creating and what you want to see. We're just, you know, we're just
posting it. But this is so different from what we've heard in this trial. Not only are they not holding
a mirror up to us, but I mean, things like FOMO, we've all felt FOMO when looking at social media,
but to know that they arguably are aware of FOMO and are intentionally putting it in the product
to give that to you. There's like a neuroscientist sat there being like more FOMO. I mean, it's terrifying.
I imagine that's live scenes from any social media company you can think of.
But this is the point. And actually, I think this is where, when we think about the
consequences of this case and the other cases, what they could be, people within the companies
tell me the biggest worry is that they are asked, not necessarily in the United States, because
of the current sort of political environment and the approach to the companies, but certainly in
other places, to basically overhaul their models. Like they're asked to overhaul their
entire algorithmic structures and how they operate, which is what makes them all the money. Like,
that's the thing. If they then got to have more proactive moderation, slower algorithms,
they've got to, I don't know, separate out the platforms that kids and teenagers are
on from the kind of main ones. All of that is incredibly expensive to execute. And actually,
it's the FOMO and the keeping you scrolling that is hugely valuable to them, which is why they've
continued to build on that rather than the opposite. If you're working inside a social media
company this week and you support what they're doing, you'd be thinking, whoa, are we, do we risk
being regulated in a much harsher way that's much more difficult for us to deal with in some places?
Do we risk getting banned in much more places? And also the public perception point, which might
explain why some of the companies settled and others didn't, because now it's Meta and YouTube,
in this case, in big sort of bold lights, not the other ones. And if you use social media,
you might be looking and thinking, oh, maybe I don't want to use those platforms then, if that's the case.
I think there is a tipping point. I think people are waking up to this and the perception
of social media as a bad thing is growing really, really fast. I think people are comparing
this to Big Tobacco. Now, another aspect of Big Tobacco is they can't advertise to adults either.
So, you know, we're focusing on children now for good reason, but there are huge questions to be
raised about. Should they be allowed to do this to us? Digital media is a legal drug. I mean,
it's a Wild West, but it's also a legal drug. And we, as a society, have determined that we can't
market addictive drugs to kids. So, if we extend the analogy, then we have a perfect argument for
saying, well, maybe kids shouldn't be allowed to go on these addictive platforms. One of the
things that really came out in this litigation is that there are lots of tweens and even younger
children on defendants' platforms. And the platforms know that they're there and are mostly doing
nothing about it. I don't think there's ever going to be a day when we make digital drugs
illegal, but I do think that there needs to be much more transparency, more warning.
People need to really appreciate that, you know, that the digital media is a powerful tool,
but also potentially a potent drug. It is worth saying, obviously, that, you know, there's a
chance that the other cases don't go in the same direction as this one, that it is very much case
by case. It's a test case, not class action here. And so, the companies, and you'd really notice
this in Meta's response to this case, they are like, it's case by case, we've got to see what happens.
And, you know, as someone who investigates social media, I find, I don't know if you find this
much. There are real examples where people properly evidence the impact that social media has had
on them. There are also examples of where people point to harm, and it's not quite what it seems.
Obviously, you'd imagine that, you know, when a case gets as far as being in court like this,
there's pretty significant evidence to back it up. But that's not to say that it's all going to
completely go in this direction. It might end up being not quite so straightforward as that.
Yeah, and I think one can imagine that Zuck and the others are horrified by the precedent that it
sets, because imagine six million dollars in damages, you know, times 10,000. And it's also
interesting to note that Meta, in this case, owes 70% of the damages. More than YouTube,
yeah. YouTube owes the other 30%. So YouTube perhaps was considered less addictive and problematic in this case.
Yeah, and it's kind of interesting, the idea that the social media platforms vary in terms of
the possible harm that they might be able to cause to you. I mean, YouTube, in particular,
when I was doing the investigation about algorithms, I looked into YouTube as I did all the big
companies, but in terms of the evidence to back up what's happening right now, of them
making decisions that exacerbate harm or exacerbate the problem, there was lots of evidence
when it came to several of the other companies. But actually, YouTube in some ways, and they
highlight this in their response, have tried to move from being kind of old rabbit hole YouTube,
you know, where people would end up, like goodness knows where, into we're like a streamer,
we're like, you know, one of the, we're like a video platform. That's not to say there's not
problems, of course there are. But you also have to remember that a lot of these cases are about
the experiences that teenagers had growing up. Yeah, I mean, I think this is clearly going to have
huge ramifications for how social media companies who are mainly based in the US, where this
litigation is happening, deal with children. Age verification will potentially have to become
more stringent. Algorithms will potentially have to be rewritten. Or you just operate completely
separate platforms altogether, like they literally might have to have separate platforms, potentially. Different
spaces. I mean, it could mean social media in the way that we have known it for kind of our
whole lives is a really different thing, certainly for people in like five, 10 years time,
that does raise the question of like, do they have to entirely rebuild their platforms? Like,
do they have to start from scratch? I mean, lots of people talk about safety by design.
And the problem is that the designs don't feel inherently safe. And a lot of the engineers
I speak to say it's kind of a black box. It doesn't feel like we can fix the algorithms
very easily. And that's sort of why we're here. That excuse may no longer hold water,
now this litigation has happened. This also has ramifications for AI companies, and whether
their content is harmful for children. And ultimately, it sets a precedent in which tech companies
in general can be held liable for the way their products are designed. And whether those designs
are harmful and addictive, not just for very young people but for all of us. So this could change everything.
So AI influencers have very much now become A Thing, like capital A, capital T.
You might have seen them popping up on your feed. They've just become kind of much more widespread.
And perhaps that's because of the advancements in terms of the AI tech and how people can make
them. And they're increasingly convincing. They look like real people, real influencers.
And it's not just that they look like influencers. They're also kind of treated like real people,
like influencers. There have literally been these AI influencer awards launched
this week, possibly as part of a bid to kind of incentivise people to create these kinds of
characters and this kind of content. But the main reason that we're talking about this isn't the
awards. It's actually because there is one particular AI influencer who has been going super, super
viral. She is the so-called MAGA Dream Girl. So who is she? So she is called, well, it makes me think
of the AI agents rebelling when we call them clankers. Yeah, freaks me out. I don't know.
By the way, just a quick aside on clankers. Clankers is a slur for robots. That's never been said
in any BBC podcast ever, by the way. Was that a quick aside on clankers?
Well, like on Bluesky, which is like the super, super woke version, to
the point where it's almost comical, there's a whole debate over whether using the term clankers to
refer to AI and robots is actually offensive. A slur. AI is them. Yeah, exactly.
So we're going to be mindful of clankers. Yeah, let's be mindful. We don't want to get like.
Okay, so she's a clanker. So yeah, the MAGA Dream Girl is called Jessica Foster.
You might have seen her popping up. I mean, the reason that she's been popping up recently is
obviously because there is this war going on in the Middle East, and lots of the images
that Jessica Foster has popped up in are ones where she's wearing like an army uniform.
She's blonde. I mean, she's absolutely beautiful as all these AI influencers, which is kind
of the point. And she's really promoting a kind of certain conservative image or ideology.
But she's more of an OnlyFans model. Her whole thing
is basically being gorgeous, and like posing next to Donald Trump and looking gorgeous.
Yeah, yeah, yeah. And you'll notice that her feet prominently feature in a lot of her posts,
because for reasons unknown to mankind, one of the largest corners of the internet is the
foot fetishist corner. And that's like a guaranteed way to grow an audience and make money.
Yeah. And now Jessica Foster's account has actually been taken down from Instagram, and also from
OnlyFans, for violating their policies, presumably on synthetic material, because it's not always
clear that she's not real. And I think the motive here is so interesting, because
it reinforces certain political ideas. But also, she had over a million followers in four months
on Instagram. I mean, that is a huge tool if you want to make money, you want to
monetize your content, you want to, I mean, particularly, direct people to your OnlyFans where you
can make lots of money. You can see why someone would do it. It's like, it's both a business, and
then I guess it's more than a business because it's political. But she's got to be not just a hot
girl, she has to fit a certain archetype. So yeah, this is MAGA Dream Girl. There's also a
trucker version. There's a trucker girl. There's hot cop girls as well. But not all of these AI
influencers are just for money. So I've noticed, I don't know if you've seen this, loads of AI-generated
attractive Iranian female soldiers. Oh, I had seen some of this popping up.
Now, do you know what the biggest giveaway that these aren't real Iranian female soldiers is?
Tell me. Iran doesn't allow women to serve in the army. Oh, yeah, of course.
What's interesting about these Iranian AI influencer female soldiers is they are speaking
in English. They're not wearing the hijab. So that kind of indicates that they are aimed at
English-speaking audiences, which I think is interesting, because a lot of the
war slop that Iran is coming out with right now, and it is winning the war as far as war slop is
concerned, is aimed at English-speaking people. So that is an example of how these influencers
aren't just for money. They're also for political reasons too. Danny Bones. Oh, yeah.
So Danny Bones is a kind of working-class British rapper with a fast-growing online following.
He has millions of views. He raps about immigration. In one of his rap videos, he kind of
is leading men carrying St George's Crosses. He's also wearing a jersey that says mass deportation
unit. He's not real. He's an AI musician. And the Bureau of Investigative Journalism
discovered that he is a front for an anonymous influencer collective called the Node Project.
The Node Project was paid by the Advance UK party, which is a really quite fringe
far-right party. Yeah, Advance UK paid the Node Project to make its main campaign video and
repurposed some of his videos for the Gorton and Denton by-election. I think the really crucial
thing here is do people care that these people aren't real? And in fact, most of the time,
they probably know that they're not real. But isn't this the point that when it comes to influencers
in general, I mean, the word influencer is almost synonymous to some extent with the word fake.
And that's not to have a dig at influencers. But it's just this perception that you are trying to
influence people, therefore you've come in with an agenda, you're often paid to promote something,
you're an ad, whatever. And whether you're a real person or whether you're an AI-generated
person, I mean, the effect is kind of the same. It's sort of like fake on steroids.
So you can see why people are actually just like, well, I don't really mind if this person
isn't real because they fulfill this image in my mind or something that I agree with politically
or that I like. And actually, that's what's so powerful about AI is that it can kind of leave an
impression, quite a strong impression of something. And it doesn't really matter whether it's real or
not. It starts to kind of shape how you view reality. I do, I do think that most people would care
that it isn't a real person. For whatever reason, I feel like we need to feel like it's a real human
being that we are like connecting with. Yeah, yeah, maybe not so much when it's like an OnlyFans
creator and people are just, like, attracted to them. Yeah, exactly. But when it's someone like
Danny Bones or like a musician, you know, we want to feel like there's a real human being
that we're connecting with. And when it comes to politics, I mean, we've spoken quite a lot
on various episodes about authenticity and politicians and actually how the content that
performs best when it comes to politicians and leaders is the stuff that seems genuine,
seems authentic. It's kind of funny in some ways that this like political content
and these AI influencers are very much the opposite of authentic, and yet they kind of succeed
in pushing this like hyper-stylized message. It's the too-good-to-be-true thing, right? Like,
you see the stuff and you think, oh, I really want this to be true. I'd like the world to be like
this. I like this person. I find this person attractive, whatever it is. And it's like, well, it is
too good to be true. But I just feel like it's becoming, it's being really normalized. And so
actually everyone's kind of getting to the point where it becomes a sort of fair game thing to do,
whether it's to make money through it or to actually use it to further an agenda. I mean, you
can get an AI girlfriend or boyfriend now. I love how you said that like I was going to. You could,
okay. Are you okay? Now, okay, what are the rules on this? So TikTok, yes, AI influencers are
allowed on TikTok, but they must adhere to strict transparency guidelines, including clearly
labeling their content as AI generated. Yeah, so mandatory disclosure, they cannot impersonate
other people with AI. Meta not only allows, but actually supports AI influencers and creators
on Facebook through its AI Studio program. So this stuff gets so much engagement, right? Like,
it's good for the creators. It's good for the platforms in that sense. But there is a darker side
to this as well. I mean, aside from the whole like tampering-with-reality point. So particularly,
for example, the BBC did this investigation about AI-generated black female
influencers. And it was like these edited images of influencers, essentially making them black,
sexualized images of black women, and using them to promote certain products or ideas. And
it raised all kinds of questions about the use of AI, the intersection with race, the way
that images were being distorted in order to push products and appeal to different
people. And I think that's where you start to get into this kind of murky, is-that-all-right
territory. And actually, these accounts got removed, because TikTok decided it wasn't.
There's one specific one that's incredibly famous. I saw a lot of my male friends follow her,
which is kind of embarrassing. But one of the reasons that her and the other accounts you mentioned
kind of arguably creep over into fetishization of black women is their features are so exaggerated.
But also their skin, it's not like a normal black person's skin. It's like
supernaturally dark, almost like a Kerry James Marshall painting. It's like very, very dark. So it feels
racist and it feels uncomfortable for people. And the woman I'm talking about, that my friend follows,
that is one of the most famous of these accounts you referenced. It turned out, the BBC
uncovered, that this was another real influencer's face and body that was being
mapped with AI. So her movements, her face, it was all exactly turned into this black woman. Really
weird. She was very uncomfortable with it. Also, they're stealing her movements and her body
shape, you could argue, I guess. And I think AI influencers are not going anywhere. In fact,
if anything, things like these awards incentivize people to get involved. It's quite funny now to
think of influencers who five years ago would be called like fake by people now being like the bastions
of like reality. Yeah, yeah. When actually again, there's all sorts of problems around how
influencers work and how they sell you stuff and how that whole culture operates. And you can
imagine if you're an influencer, you wouldn't be super keen on the AI influencers. Although to your
point, we do hope that we still prefer human beings to clankers, maybe. We've been forbidden by
our producer from talking about the AI fruits affairs, which are far too passé for this pod.
We've spoken about the AI fruits too much. No, we go way deeper. If you're still talking
about the AI fruits and their affairs, you're behind, guys. Yeah, catch up, scroll up. And on that note,
if you want to scroll up or down and send us what you're seeing, or anything else, I also still
repeat my call-out that if you're a general adult, we want to hear from you. Listen to the last
episode. You can get in touch with us. You can send us a WhatsApp on plus four four, three three zero,
one two three, nine four eight zero. And you can email top comment at bbc.co.uk.
Happy scrolling. You didn't feel confident about that at all, did you?
