Support us on patreon and get an ad-free RSS feed with early episodes sometimes

Discord delays their age-gating rollout but legislators are pushing for operating systems including Linux to verify ages, LLM licence laundering might mean the end of copyleft, and how and why you might want to detect Meta’s spy camera glasses.
News
Getting Global Age Assurance Right: What We Got Wrong and What’s Changing
US state laws push age checks into the operating system
I have actually read the text of California law CA AB1043 and, honestly, I don’t hate it
Relicensing with AI-assisted rewrite – Tuan-Anh Tran
Chardet dispute shows how AI will kill software licensing, argues Bruce Perens
No right to relicense this project
Hide from Meta’s spyglasses with this new Android app
Dear Meta Smart Glasses Wearers: You’re Being Watched, Too
Automox Turnkey Results
Endpoint management tailored to your specific environment. Know the plan. Trust the result. Learn more at www.automox.com
This late-night Linux family podcast is made possible by our patrons.
Go to latenightlinux.com slash support for details of how you can join them.
Support us on Patreon for access to ad-free episodes and early releases.
That's latenightlinux.com slash support.
Hello, welcome to episode 376 of Late Night Linux, recorded on the 9th of March 2026.
I'm Joe, and with me, Félim.
Good evening.
Graham.
Hello.
And Will.
Hello.
Let's see some news.
We talked about Discord and their age-gating stuff.
They're delaying it to the second half of the year now and they've written a big, long blog post about how honestly it's fine.
It's fine, honestly.
Their account deletion system got overloaded within 24 hours of the announcement, mind.
I think they did a good job of explaining how they're going to go about it and how they are doing their best to preserve your privacy
by splitting the age verification from their systems and making sure that the only information that flows one way or the other is the age group, not the specific age.
So the people who are verifying your age do not know your Discord username and Discord don't ever see that data that is used to prove your age one way or the other.
So I think they've done a good job of explaining how it's going to work.
I just fundamentally don't believe that those systems that are used to do it are secure, worthwhile, not bypassable, and they're just going to keep your data safe.
I agree.
I think their intentions are good, but that doesn't mean that they're not going to fuck it up.
Yeah, and good luck getting on an airplane if your user details got leaked and then used in some sort of terrorist incident.
You'd never be able to fly or travel or anything again.
But this leads on to a big story in the last week or so, which is age verification checks in operating systems.
You've got California and some other states in the US, there's Brazil.
There's a lot of different legislators going after this idea of age verification in the operating system.
And that is going to include Linux supposedly.
The California one is interesting because it's just self-reported age.
You just basically, when you create your account on the local machine, you just tell it.
You're in the over 18 age, and that's it.
And then that reports to websites and stuff and puts the onus on websites.
So really, it's just aimed at parents being able to set up an account for a child, say they're a child, and then websites will know, and services and all the rest of it.
But then some of the other proposed laws aren't quite as not bad as that, shall we say.
So what do we think?
Is this realistically going to come to Linux?
I doubt it.
And maybe the likes of Valve with Steam will have to, if they're selling a product with Linux on it.
But if someone tells me that I have to put that in my distro, I'll just tell them that they can just not use my distro.
Or use it illegally in the state of California or wherever it happens to be.
That's a good point. Like how are they going to enforce this around the world?
Because presumably you could go to another place and install Linux completely legally and then take it with you, right?
I suppose so, yeah.
Are you saying that they haven't thought of everything and they haven't even maybe even fully fleshed the whole thing out?
They haven't thought it through, no way.
They were too busy thinking of the children.
Oh yeah, yeah.
I don't know the way this is going to go.
I am uncomfortable with it.
But also I feel as usual quite naive, is this the way that the internet is going to go?
And there's a whole generation that will just think this is normal.
If it does go this way, then Linux and open source is the only way that I will trust it.
Because at least then you can see the code and exactly what is happening with that age information,
whether it is just semi-ignored or used to lock down accounts for younger people
or if it's passed on to other places.
I mean, that's the only place I could see myself trusting it if I had to trust it,
but I'd much rather not have to go that far.
The thing is, you just know that they're going to try and tie it to something else.
I know they have the tech, and the Intel Management Engine and all that type of nonsense,
but I don't think any of these places should have any specific info on any user of the computer.
If you want to label a user as above 18 or below 18, then maybe a couple of categories in the below-18 range,
yeah, I can see how that could be useful, but they do, I'm sure,
want to fully identify users or use timing based on requests for authentication
as to who you are when you use that stuff like that.
I think these things always start out of like,
ah, that seems like that really does protect that particular group we're trying to protect right now
and then spirals from there.
But I feel so fatalistic these days about the fact that I don't think there's anything we can do
to stop the momentum behind this kind of thing. But at least with Linux,
like we have done so many times in the past decades, we have a choice to build things ourselves,
a choice to comment out the source code.
So we can be inventive in ways that other OSes can't be.
I think one of the ways to resist this type of stuff is like the likes of,
well, leaking the browsing habits of those in power, for instance,
or, you know, the various dating apps that skyrocket every time there's a particular party's conference in town,
you know, show them the actual value of how dangerous, like not even specific information can be,
but in the fact that, well, we know only 40 or 50 of you went there,
and then all of a sudden this happened.
So somebody in that group is guilty, at least?
Yeah.
Why is it when Republicans go to a certain town,
Grindr usage goes through the roof?
Making friends again, Joe.
Fuck them.
But we'll link to a post on Mastodon from Schrodinger's Catgirl, which is quite a funny name.
And she quotes a friend who says people saying they can't force Linux to do age verification
because it's open source are being really naive.
The next step is locking bootloaders/BIOS at the hardware level,
so every computer is only capable of running approved OSs with age verification.
That doesn't seem like complete hyperbole and fearmongering to me.
That seems reasonably realistic, especially with ARM.
So you can kind of picture a future where you can't install non-approved, non-age-gated operating systems on brand new hardware.
I mean, my stack of old ThinkPads will be fine.
But what happens when they break and I need to buy something new?
Yeah, much like the banking software on your rooted Android phone won't install either.
I think that's the sort of dream of where they want this to go.
I'd imagine.
Yeah.
But then I kind of have enough faith in the Linux and open source community to find a way around these things.
There'll be none of us left.
We'll be dead by that point.
In around 10 to 15 years is what I'm talking about here.
All right.
Not in a few weeks when it all escalates.
All right, yeah, well, we'll really care about age verification then.
Yeah, don't mention the war.
I mentioned it once.
I think I got away with it.
But look at Asahi, for example.
We genuinely thought that was impossible.
Or at least I did.
I thought there was no way they're going to get Linux running natively on an ARM Mac.
And here we are.
Okay, it's not 100% working.
But on my M1 MacBook Air, it's pretty good.
And if someone was determined enough, they could do it on these new M5 ones.
It'd take a hell of a lot of work.
And I'm sure that the Asahi project is working towards that.
But the point being that there are some very, very smart and capable people in this wider community.
So it just about works on your six year old computer.
Great.
Yeah, but the point being that there are very clever people who,
if they are willing to put their effort in, can make it happen.
Can almost make it happen for a really old version, yes.
The chain of security is only as strong as the weakest link.
And so let's assume that the hardware cannot be hacked or bypassed.
Somebody will leak the key.
Somebody will check something into a public repo with the master signing key in it.
And then we'll all be fine.
That will happen.
Yeah.
Or someone will reverse engineer it.
They'll plug it into all sorts of equipment that you know about, Will.
I think you're being way too sort of hopeful for a disaster.
I don't think we have full access to a lot of stuff that we might wish to.
I'm, you know, hoping that somebody is incompetent at just the right time.
I know it's a bit of a poor future strategy, I think.
In another one of these articles that was linked to,
I read a bit that really concerned me,
which was that the operating system through the browser will be responsible
for reporting your age group directly to websites.
That seems like another, I don't know,
disaster waiting to happen.
Is it that bad though, if it's just reporting
you're either under 13, between 13 and 18, or over 18?
I'd rather have a file in, like, /etc somewhere that just says,
yes, this user is over 18,
than having to fuck around with a VPN when I want to look at a subreddit
that has been deemed to be adult.
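(An editorial aside, for illustration only: what Félim describes would amount to something like the sketch below. The path, file format, and group names are all invented for the sake of the example; no such file or standard actually exists.)

```python
# Hypothetical sketch only: a coarse, identity-free age-group file.
# The path /etc/age-group, its format, and the group names are invented.
from pathlib import Path

VALID_GROUPS = {"under13", "13to17", "over18"}

def read_age_group(path: str = "/etc/age-group") -> str:
    """Return the machine's coarse age group, failing closed."""
    try:
        value = Path(path).read_text(encoding="utf-8").strip()
    except OSError:
        # No file or unreadable: treat as the most restrictive group.
        return "under13"
    return value if value in VALID_GROUPS else "under13"
```

The point of the sketch is that only the group label would ever cross the boundary to a browser or website, never an identity or a date of birth.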
You raise a good point.
Yeah.
But I think that is a slippery slope though, isn't it?
Once the whole system is in place,
they can tweak it up, tweak it up, tweak it up until it becomes something
that you then can't just do that.
Because, you know, it'll be,
oh, this is even more secure, or this is a much better system,
it takes all the hassle out of it.
And again, you know, people just go along with it.
People always go along with it.
Yeah, I think that slippery slope argument is very much valid.
But what can you do?
If the law says you've got to put this into your distro,
you've got to do it.
And hopefully, there'll be ways to remove it.
Seasteading, that's what it is.
That's what we need.
Once the oil and gas shuts down in the North Sea,
we can all buy our own oil rig and then set up as an independent country.
Right. Yeah. Good idea.
All right.
Something that has really blown up recently
is this chardet thing.
I think that's how you say it; it's spelled c-h-a-r-d-e-t.
Or is it char-day? Anyway.
It detects character encodings.
So it's chardet or whatever.
Anyway, this is a Python project.
And it was previously licensed with the LGPL,
which is a copy left license.
Well, recently, the maintainers decided to rewrite it
from scratch, in quotes, using, you guessed it,
Claude Code.
And are claiming that it is a clean room implementation.
So it's fine for them to re-license it as MIT,
which of course is a permissive license.
Well, the original author has piped up saying,
no, you can't do that.
And controversy has ensued.
Now, I'm going to guess, Félim, that you think
that the original author is correct.
Well, I haven't said whether I think that or not.
It's very presumptuous of you.
Obviously, a thousand percent, I think Mark Pilgrim
is 100 million percent correct in his assumption.
And he should give them two fingers and tell them to fuck off.
Yeah, he's the original author.
He is the original author, yes.
How can it be a clean room implementation
if it's using an LLM that was trained on the fucking internet?
Obviously, that LLM has seen the code to this.
So it's just impossible for it to be clean room.
That's not how clean room works.
Didn't we predict this would happen?
I'm pretty sure this came up before something like this
did happen, we discussed it, and we thought
this is something that we think LLMs will do.
And what we're going to do about it when it happens.
Fundamentally, I do agree with Félim.
But I don't think that's going to change anything here
because the laws around LLM-generated content
are being tested in court, and so far
that argument just isn't working.
So I don't think it's going to work.
And what are we going to do then?
Well, they did say that a machine can't
hold a copyright,
from that AI art case.
So if a machine can't copyright something,
I don't think it should be able to relicense
something either.
But isn't the end artifact ultimately
a re-engineering of the API,
such as when Google did the same thing with Java,
where we've been on the other side of this argument, saying simply
because the API is the same
doesn't mean that it can be copyrighted?
I think the problem with the way that they did it,
if they want to do it like this,
is the fact that they said, rewrite this code.
What they should do is say,
write me an application that can detect
the character encoding of something,
and just never mention his code.
Maybe even explicitly say,
do not look at the code of chardet,
even though that's absolute bollocks,
and it will, of course, have learned from that,
so it can't do that.
But I think with them trying to
re-implement his software,
they have invariably used his software
to influence how they've written it.
They should just take the task
and then go with that.
That would be the only real way.
Well, yeah, my understanding of a clean room implementation
is that you have to have two separate teams.
Like, if the source code is available,
there has to be one team who looks at the code
and then writes a detailed spec
based on that code
and then a separate team that has never looked at the code
that then has to write the new code from that spec.
And so if you gave an LLM the spec,
then you could at least pretend to argue
that it's not laundering of the code
and laundering of the license and everything.
But if you point it at the fucking repo
and tell it to rewrite this,
it's not clean room.
Well, maybe you point it at the repo
and say, write a spec of this
and then use the spec to write the code for this.
But the model is still likely to have been trained on that code.
And that's the problem is the proving one way or the other
if it has or hasn't been trained on that code.
And I don't think,
oh, well, I'm sure they could tell you
but they certainly won't tell you.
I think that's what it all boils down to: the fact
that they took a load of software outside of its licence
and used it to allow you to write software
that breaches that licence,
whether explicitly or implicitly,
without anybody's express permission to do anything.
And we can't do anything about it.
I think that's the key, isn't it, yeah?
Yeah.
We can argue until we're blue in the face.
We can try and do lawsuits, you know,
try and get, you know, conservancy involved, let's say.
But are you going to win? Probably not.
And, you know, there are those who are arguing
that this is effectively the end of copyleft,
because if they get away with this,
which they probably will,
then basically anyone could do the same
for any piece of copyleft software,
and thus copyleft is dead.
And we are left with just permissive licences
for open source software.
I would imagine the vast majority of code
that is in the training models of LLMs
is, in fact, FOSS.
But let's not forget that there's proprietary software
in there too that was taken
without those people's express permission.
So, I think the concept of ownership in general
has just been shot to shit.
And I think there's a lot of people
who are going to be very pissed off
when they start seeing clones of their stuff pop up
which possibly has even their own code in there.
Okay, this episode is sponsored by Automox.
If you're evaluating endpoint management solutions,
the real risk isn't which tool you choose.
It's choosing one without a proven blueprint for success
because while most endpoint tools promise automation,
next to none promise outcomes.
That's why Automox introduced turnkey results.
Turnkey removes the uncertainty
that holds endpoint automation back.
It delivers a personalized results blueprint
so Automox is configured and operated
to achieve real outcomes from day one.
No trial and error, just clarity.
Each blueprint is validated across millions
of endpoints managed by Automox
and tailored to your specific environment,
your risk tolerance and your operational goals.
Instead of guessing how aggressive to be with automation
or how to configure policies safely,
start with the plan you can trust.
And you can choose to implement the plan yourself
or have Automox do it for you.
Either way, you get faster time to value, lower risk
and predictable endpoint outcomes.
Automox turnkey results.
Know the plan, trust the result.
Learn more at www.automox.com.
Hide from Meta's Spy Glasses
with this new Android app.
So this is an app called Nearby Glasses.
Now, if you don't know about the Meta Ray-Bans,
whatever they're called,
it's really quite creepy.
It's kind of like Google Glass,
but instead of looking like some sort of cool sci-fi thing,
you just look like an 80s wanker
with the Ray-Bans on.
And they've got a camera built in
and yes, there is an LED that lights up
when you're recording,
but you can hack it to turn that off
or defeat it with a sticker.
Yes, but now there is an Android app
that is open source, which is called Nearby Glasses,
that's our excuse to talk about this,
that will detect the Bluetooth signature, basically,
of these Ray-Ban Meta AI glasses.
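(An editorial aside: we haven't audited the app's source, but detection of this sort generally means scanning Bluetooth LE advertisements and matching the advertised names or manufacturer data against known signatures. The matching side might look something like the sketch below; the name patterns here are illustrative assumptions, not the app's actual criteria.)

```python
import re

# Illustrative patterns only; the app's real criteria (device names,
# manufacturer IDs, MAC OUI prefixes) would come from its source code.
GLASSES_NAME_PATTERNS = [
    re.compile(r"ray[\s-]?ban", re.IGNORECASE),
    re.compile(r"meta\s+glasses", re.IGNORECASE),
]

def looks_like_glasses(advertised_name: str) -> bool:
    """True if a scanned BLE device name matches a glasses pattern."""
    return any(p.search(advertised_name) for p in GLASSES_NAME_PATTERNS)
```

On Android the scan itself would come from `BluetoothLeScanner`, feeding each result's advertised name through a check like this.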
Now, I know a young person
about 20-ish
and this young person is popular.
I mean, it's shocking
that someone could be related to me
and popular and fashionable,
but there we are.
My niece has got a pair of these
and she sent me the video
and I'm like, yeah, cool.
And then a couple of days later,
I found an article
about how everything you record
with these Meta Glasses gets sent to
a team in Kenya,
who then painstakingly go through
all of your footage
and label everything you see
to train the AI bullshit.
Jesus Christ.
Yeah, yeah.
And so I told her about that,
she said, I don't care.
All they're going to see is me
taking the piss out of my brother
or going to raves
and, you know, dancing
and taking a shit.
Well, and there's been reports of people
who've accidentally left them on the nightstand,
as the Americans would call it,
and then people come in
and start getting undressed and whatnot,
and they accidentally
record someone having a shit
or whatever, you know.
Which is worse than the glasses.
Accidentally.
Yeah.
Accidentally on purpose.
Imagine all the things they'd see in a day
if they were accidentally recording:
picking your nose
and examining the treasures
that you got out of there.
Exactly.
The mind boggles
with what they might see.
I just cannot believe that this is a thing.
Like we lived through that whole Google Glass era,
only for it to just come back,
and this time they just made them fashionable.
That was the mistake that Google made.
They made them too nerdy
and sci-fi looking.
If they'd just made them
with a company like Ray-Ban,
which, Ray-Ban's kind of one of those timeless brands,
innit, like Converse All Stars,
whatever, it never seems to go out of fashion.
So that was genius from
Zuck to do that
to convince a bunch of young,
hip-trendy people
to walk around filming everything
to train his fucking AI.
There's a bit of hope here
because, you know, I've got an 18-year-old daughter
and she sent me a link to this Android detector app
and she asked me whether I thought this worked
because she was actually concerned
that people were around
or when she's out and about,
people would be wearing these glasses.
And she's not particularly privacy-focused,
you know, she's a typical older teenager.
But that's the first time I've kind of seen her care
and she genuinely was looking for something
that would help her detect people
with these kind of glasses
and she was uncomfortable
about being around those people.
So maybe, maybe there's a bit of a movement
against this kind of thing.
That's quite good.
That tells me that you're a better parent
than my brother Graham.
No, because she's ignored everything else we've ever said.
Does she have an Android phone then?
She doesn't have an Android phone, no.
She's got an iPhone.
Okay.
So she is relatively normal then.
Yeah, yeah.
That's interesting, right?
Because, you know, I sent this app as well to my niece
and she said,
oh, that's so weird.
Like, why would you do that?
And I said, no.
Walking around secretly filming people
is the weird thing.
Yeah, I'm surprised.
So that's why I remember it.
And so I don't know.
It's the first time I've seen her
really care for something like that.
I thought it was strange
because I've seen a few videos
crop up on YouTube
of people going through airport security
wearing these things, clearly.
Because I just couldn't understand
how they were getting footage inside
the metal detector area,
because you're clearly told not to film in there.
Yeah, you get your phone out.
You could end up in cuffs.
Yeah.
Yeah.
Yeah.
So this must be why.
And I never linked the two together.
And yeah, interesting that people
have, like, worn them in places where, you know,
technically they shouldn't have.
What a shocker.
One bit of credit I will give to Zuck,
right is that although they film vertically,
of course they film vertically,
it's four by three, not 16 by nine.
Why would you give them credit for that?
Have you got a stack of Sony tellies
from the 90s that you have there?
No.
Well, what might I have that suits
a four by three ratio, Félim?
Nothing.
That functions.
That doesn't have crease marks on it.
That's all I can think of.
Unfold it.
Boom.
It just looks perfect.
Oh, dear.
Well, anyway, everyone should
download this open source app.
And it does specifically say
don't hassle people.
If you do, you might get in more trouble
than them for subtly filming you
when they shouldn't be.
It's not on F-Droid, by the way.
I don't know why, because it does have a
GitHub repo.
So hopefully it's just a matter of time.
But I couldn't find it.
Yeah, I looked.
It's available on the Play Store.
And I think another store that I can't remember.
And if you look on the GitHub,
there is a releases page with APK.
So you can directly download that.
Hopefully, yeah, it will be going to F-Droid.
But F-Droid, everything's manually reviewed, isn't it?
So it takes a while generally to get in there.
So if it has been submitted,
hopefully it'll be there soon.
While we are on the topic of glasses,
don't forget, everybody, to buy eclipse glasses
if you live in the Western Atlantic archipelago
or Spain or Portugal,
in which case there is an actual solar eclipse coming up this year.
The fuckin' Western Atlantic.
You mean the British Isles?
I do not mean that, because that is a racist statement,
you imperialist bastards.
The British Isles.
Just like it's the English Channel,
and not the Sleeve Channel.
Western European Atlantic archipelago
is what I think you will find it's called.
I have fuckin' heard it all now, Félim.
You don't half talk out of your arse.
Sorry, I don't understand you in that foreign tongue
you're speaking to me.
When's this eclipse then?
I think it's August or September.
I can't remember exactly,
but yeah, in the summertime.
So, I mean, even with the shitey weather over here,
we should still catch about 90% of it.
So, that should be like a proper dim day,
all of a sudden, with birds going to sleep
and then waking back up afterwards.
So, yeah, I mean, especially with that,
if you don't have a total eclipse,
Jesus Christ, don't look at the sun.
I remember the last one that was in,
I think it was like, 2001 or 2002.
And I had a summer job I was working at the time,
and I got a piece of wood in my eye
and had to go to the eye hospital with it.
And there was like at least 20% of the people there
had previously that day looked at the eclipse
and thought they were going blind.
And I was like, Jesus Christ.
And the warehouse I was working at was just beside this place
that did, like, photographic printing.
And there's a whole lot of lads looking through photographic film
at the sun.
And that was like the explicit thing they said:
don't look through photographic film at the sun.
I said, did you not hear that bit?
They went, oh, yes.
Meanwhile, fucking spa over here
had a sheet of cardboard with a hole in it,
making a pinhole camera.
Look, everybody, you can see the shadow.
And everyone looking at me going, fuck off,
I'm looking through the film.
And you're right, get the glasses now
before they are expensive.
Or sold out.
Yeah, more likely.
Maybe swap some RAM for some by then.
Yeah.
Right, well, we better get out of here then.
We'll be back next week when who knows what it'll be.
Until then, I've been Joe.
I've been Félim.
I've been Graham.
And I've been Will.
See you later.
Bye.

