
There's nothing like your first Mac.
Here's what people online are sharing.
@DoctorRain says,
Everything is just so smooth and fast
I still can't get over it.
Syncing stuff between my phone and this is just chef's kiss.
@MrIncredible488 says,
Apple Silicon basically cures low battery trauma.
That's how they felt with their first Mac.
How will you?
Introducing the all-new MacBook Neo,
an amazing Mac at a surprising price.
Find out more at apple.com/mac.
You've dedicated your life to preserving privacy.
Let's just start with the big picture.
What is privacy and why is it important?
I believe that privacy is core to freedom at the end of the day.
I would even go as far as saying that it is synonymous with freedom.
It is protecting you, protecting your inner core essentially,
protecting your identity as a human being from forces that don't want you to be an individual
and a human being at the end of the day.
So nicely put.
I think what it really boils down to is...
In that regard, I think privacy is relatively similar to what was originally intended also
with the Second Amendment in the United States.
It is a tool for you as a human being to protect yourself against coercive force,
against your very soul, your inner core.
So there are forces and this has always been true at every time in history
that seek to make people less human to turn human beings into slaves or animals or objects
and privacy is the thing that prevents that.
So the crazy principle that exists within this universe is that there's this asymmetry
baked right into the very fabric that we exist in.
There are certain mathematical problems where the effort required to undo them
doesn't just scale linearly or even exponentially, but scales so violently
that the universe itself prohibits anyone who doesn't have access,
who doesn't have permission to undo that mathematical problem; they literally cannot do it.
So what that means is that with a very little amount of energy,
a minuscule amount of energy, a laptop, a battery, and a few milliseconds of computation,
you can create a secret that not even the strongest imaginable superpower on Earth
is able to recover without your explicit grant of access.
That is the fundamental principle on top of which encryption, cryptography and privacy
in the modern age are built.
And it's so fascinating that the universe itself allows for this computational asymmetry
where I can create a secret, I can encrypt something, I can make something hidden
and you, with the most powerful imaginable coercive force and violence,
with continent-sized computers running for the entire lifespan of the universe,
would not be able to apply that force to my secret, because I have encrypted it,
and the universe inherently sort of smiles upon encryption and appreciates that.
So I always found that so intoxicating this concept that this is inherently baked into the universe.
It is an interaction between mathematics and physics sort of
and is a fundamental property just like you could say nuclear weapons
are a fundamental property of reality, right?
And so encryption and privacy exist in this reality
and before that was figured out, it wasn't necessarily clear, right?
It could also be that you can never hide something, encrypt something, keep something to yourself
but it turns out you actually can.
And so that is fascinating, I think.
And what it conceptually allows you to do is to take something and move it into a different realm
and then encrypt that realm, right?
And if someone else wants to go into that realm, follow you there,
they would need unlimited resources to do so.
And I would say that's what really got me into cryptography and privacy.
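As a back-of-the-envelope illustration of the asymmetry he's describing (the adversary's guessing rate below is an assumed, deliberately generous figure), creating a 128-bit secret takes microseconds, while exhaustively searching for it outlasts the universe:

```python
# Creating a secret is nearly free; guessing it is cosmically expensive.
import secrets

key = secrets.token_bytes(16)   # a 128-bit secret, generated in microseconds

keyspace = 2 ** 128             # number of possible 128-bit keys
guesses_per_second = 10 ** 18   # assumed exa-scale adversary (very generous)
seconds_per_year = 60 * 60 * 24 * 365

# Expected time to find the key by brute force (half the keyspace).
years = keyspace / 2 / guesses_per_second / seconds_per_year
print(f"{years:.1e} years")     # about 5.4e12 years, roughly 400x the universe's age
```

The imbalance is the point: the defender spends microseconds, the attacker spends trillions of years.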
Okay, I'm having all kinds of realizations simultaneously.
That you're an extraordinary person, for sure.
I think that's the first one.
I've listened to you for three minutes, okay.
Who are you, where are you from?
And are you ready to suffer for your ideas?
Because what you've just articulated is the most subtle
but direct possible challenge to global authority
anyone could ever articulate.
But first, how did you come to this?
Where are you from?
Tell us about yourself for just a moment.
So I was born in Germany.
I'm 25 years old.
And I originally actually in my life I studied law
and then later I studied mathematics and computer science.
And then at some point I met a few people who also had these kinds of ideas about privacy,
technology, distributed technology, decentralization.
And we then decided to found a company that builds this kind of technology.
And that's how I ended up here, I guess.
So you're German, you're a product of Europe and European culture,
which, for all of its wonderful qualities, is not a privacy culture.
It built the world.
I love Europe and the culture.
But it's not a privacy culture.
It just isn't.
No.
So why did you, especially as a German, come to this conclusion when all of your neighbors didn't?
So I think it's interesting, right?
If you see privacy as this inherent political thing that protects you as a human being,
there is data protection laws, GDPR, right?
There's fines against surveillance, capitalist tech giants in Europe.
But as you said, I feel like most of that stuff is a charade.
It's not really about protecting your privacy.
And we're seeing that in the UK, in the European Union,
I mean, there's so many cases that already have made some significant movements already this year.
So I would say for me personally,
it has really been this technological and mathematical understanding
of the power of this technology.
So realizing this, realizing that the universe allows you
to do these things.
And the universe has this built right into it.
It got me so fascinated that I really thought deeply about this.
And what I realized sort of is that what humans have done in the past
is that they've allowed information, right?
Any type of information that we now share with our mobile surveillance devices.
They've allowed that information to be encrypted and put at rest somewhere securely, right?
That is how encryption has mainly been used.
Or to do things like Signal is doing, where we do end-to-end encrypted messaging, right?
Where we're able to send some message from one human to another human being,
via something, some untrusted channel, right?
Where there can be interceptors that try to get those messages.
But thanks to mathematics, we're able to send this message across the whole universe
and it arrives at the end point with no intermediary being able to take a look at the message
because of this inherent property of the universe.
What I realized sort of has been that there's a missing piece, which is,
whenever we are accessing this information, whenever we are interacting with this information,
whenever we want to utilize it basically, we have to decrypt it again,
which then makes it accessible to whoever takes a look at it, right?
Whoever runs the machine that you decide to put that data on,
which can be AWS, which can be cloud providers, big data, big AI.
Exactly.
Whoever, right?
And so this idea that I had was what if we can take this asymmetry that is a fact of reality
and move that to computation itself to enable that all of those computations can be executed in private as well.
And then we can do some amazing things.
Then the two of us can decide to compute something together, not just exchange information,
we have some secure communication channel, but actually perform some mathematical function over something,
produce an output from some inputs, but we can keep those inputs to ourselves.
So Tucker has a secret, Janik has a secret, and with this technology,
we can produce some values, some information.
While you don't have to share your secret, I don't have to share my secret.
And we can scale that to enormous sizes where the entirety of humanity can do those things,
where countries can do those things.
But importantly, at its core, what we are doing is we are implementing this asymmetry
that exists within the universe and bringing that to the next level,
to the final form sort of.
And that's how I ended up founding Arcium.
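The "Tucker has a secret, Janik has a secret" scenario he describes can be sketched with additive secret sharing, one of the simplest building blocks of secure multiparty computation. This toy Python example is an illustrative simplification, not Arcium's actual protocol; the modulus and the two-party sum are assumptions for the demo:

```python
# Toy additive secret sharing: two parties compute a sum of their
# secrets without revealing the secrets themselves.
import secrets

P = 2**61 - 1  # a Mersenne prime as the working modulus (an assumed toy choice)

def share(secret, n=2):
    """Split `secret` into n random shares that sum to it mod P."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

# Tucker and Janik each split their secret and exchange one share.
tucker_secret, janik_secret = 41, 1
t1, t2 = share(tucker_secret)
j1, j2 = share(janik_secret)

# Each side adds only the shares it holds; a single share is
# uniformly random, so neither side learns the other's input.
partial_1 = (t1 + j1) % P
partial_2 = (t2 + j2) % P

# Recombining the partial sums reveals only the result.
result = (partial_1 + partial_2) % P
print(result)  # 42
```

Real systems layer multiplication, authentication, and many parties on top of this, but the core trick is the same: the inputs stay hidden, only the agreed output is revealed.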
Getting older can make you realize you don't actually want all the things you have.
That's why mini storage is so big.
Consumerism kind of loses its appeal.
What you really want is peace, peace of mind.
And what's the best way to get that? Keeping your home and family protected would be at the top of the list.
Enter SimpliSafe.
This month you get a 50% discount on your first SimpliSafe system.
SimpliSafe takes a much better approach to home security.
The idea is, how about we stop home invaders before they come into the house?
Not just trying to scare them once they're already inside your house.
And they do that with cameras.
Backed by live agents who keep watch over your property.
If someone's lurking outside, they will tell the person, get out of here.
And then they'll call the police if they don't.
60-day satisfaction guarantee, your money back.
So there really is no risk here.
Not surprisingly, SimpliSafe has been named America's best home security system for five years running.
Protect your home today.
Enjoy 50% off a new SimpliSafe system with professional monitoring at SimpliSafe.com/Tucker.
That's S-I-M-P-L-I-Safe.com/Tucker.
There's no safe like SimpliSafe.
FanDuel is dropping bonus bets into everyone's account for the tournament.
All you have to do is opt in to claim your bonus.
But don't wait.
These bonus bets are only available for a limited time.
FanDuel, play your game.
21+ and present in select states.
Bonus issued as non-withdrawable bonus bets which expire seven days after receipt.
Max bonus $500 unless otherwise specified.
Restrictions apply.
See terms at sportsbook.fanduel.com.
Gambling problem? Call 1-800-GAMBLER.
During Lowe's Pro Savings Days,
save more on what goes into the job.
Add power to your lineup
with a free DeWalt 20-volt MAX 5-amp-hour battery
when you buy a select DeWalt 20-volt MAX tool.
Plus get up to 35% off
select major appliances from Whirlpool, Maytag, and more.
Get the job done right.
Keep more in your pocket.
That's Pro Savings Days.
Our best lineup is here at Lowe's.
Valid through 3/27.
Selection varies by location.
While supplies last.
I can't think of a more virtuous project.
And you said it in the first minute.
The point of the project is to preserve humanity.
To keep human beings human,
so that they're not just objects controlled by larger forces,
but human beings with souls.
And again, I don't think there's any more important thing
that you could be doing with your life.
So thank you for that.
Can you be more specific about our current system
and how it doesn't protect privacy?
Yes.
So I would say there's...
So I think there's a lot of things to unravel.
If we take a look at the systems
that we are interacting with every single day,
all those tools and applications,
those social media networks,
basically everything that we do in our digital lives
and all of our lives have basically shifted from physical reality
to the digital world.
So everything we basically do, everything we do in this room,
everything we do when we are out on the street,
because all of the technology has become part of physical reality,
has been consumed sort of.
And so all of this has been built on top of
what the former Harvard professor Shoshana Zuboff
has called surveillance capitalism, right?
And I think that really lies at the core.
And it's relatively straightforward to understand
what those companies are doing.
If you ask yourself,
hey, why is this application that I'm using actually free, right?
Why is nobody charging me to ask this super intelligent
chatbot questions every day?
Why are they building data centers for trillions of dollars?
Well, I don't have to pay anything for it, right?
So that's the question that you need to ask yourself, right?
And what you end up realizing is that all of those systems
are basically built as rent extraction mechanisms
where from you as a user,
you're not really a user,
you're sort of a subject of those platforms.
Value is being extracted from you
without your noticing.
And they are able to extract value from you
because all of your behavior,
all of your interactions with those systems
are being taken and they perform mass surveillance,
bulk surveillance, and it's those companies, right?
We're just talking about companies.
We're not even talking about intelligence
or governments or anything.
We're just talking about those companies that exist
within our economy.
And so they record everything they can
because every single bit of information
that I can take from your behavior
allows me to predict your behavior.
And where I can predict your behavior,
I can utilize that to,
in the most simple case,
do something like serving you ads, right?
But in more complex cases,
I can do things like I can steer your behavior.
I can literally control you.
I can turn you into a puppet that does whatever I want.
And so those are the systems
that we are faced with right now.
And the internet has sort of been this amazing emancipator
for humanity, right?
This show is only possible because of the internet.
Otherwise, with traditional media,
we wouldn't be able to speak about those topics.
I feel like.
That's right.
But at the same time,
sort of nowadays,
it has transformed into one of the biggest threats
to human civilization.
At the user level,
at my level,
the level of the iPhone owner,
is it possible to communicate privately,
with assured privacy, with another person?
That's an interesting question.
So we start with this concept
of insecure communication.
Yes.
And since every communication channel is insecure,
what we employ is end-to-end encryption.
And end-to-end encryption allows us
to take this information,
take a message,
and lock it securely,
so that only Tucker and Janik
are able to unlock them
and see what's going on.
And that is a fact.
So there have been,
there have been many cases
where there are big players
with big interests, I guess,
have attempted to undermine cryptography,
have attempted to get rid of end-to-end encryption
to install backdoors.
There has been what is commonly called the crypto wars
in the 1990s, right,
where the cypherpunks
fought for the right to publish open-source encryption
and cryptography.
And many, many more cases, I guess.
But at the end of the day,
I would say as a realistic assessment,
this kind of cryptography is secure,
and it works.
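The principle underneath what he's describing can be sketched as a Diffie-Hellman key agreement: both sides derive the same secret while an interceptor sees only the public values. This toy Python version is only a sketch; the modulus and generator are illustrative assumptions, and real messengers like Signal use vetted elliptic-curve groups plus authentication and ratcheting on top:

```python
# Toy Diffie-Hellman key agreement over an insecure channel.
# An eavesdropper sees p, g, A, and B, but not the private values.
import secrets

p = 2**255 - 19  # a well-known prime, used here only as a toy modulus
g = 2

a = secrets.randbelow(p - 2) + 1  # Janik's private value, never sent
b = secrets.randbelow(p - 2) + 1  # Tucker's private value, never sent

A = pow(g, a, p)  # public value, safe to send over the untrusted channel
B = pow(g, b, p)  # public value, safe to send over the untrusted channel

# Each side combines its own private value with the other's public one.
shared_janik = pow(B, a, p)    # (g^b)^a
shared_tucker = pow(A, b, p)   # (g^a)^b

assert shared_janik == shared_tucker  # identical secret, never transmitted
```

Recovering the shared secret from A and B alone is the discrete logarithm problem, exactly the kind of "violently scaling" one-way problem described earlier in the conversation.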
Now, that unfortunately
is not the whole answer,
because what we have to think about is,
now, what happens
with those end devices, right?
Fair.
I mean, the message,
the message that's being sent
from Janik to Tucker,
might be secure.
But now, if I cannot undermine
and apply force to this message,
to understand what's inside,
well, I'm just going to apply force to your phone.
And that's sort of what's happening.
So, when we look at different applications,
for sure, there is a whole variety of applications,
messaging applications, right,
that do not employ encryption
and security standards
and might collect all of your messages and images
and utilize them for those machines,
that extract as much value as possible from you.
But there's applications like Signal
that don't do that,
that are actual open-source cryptography,
technology that anyone can verify themselves
and take this code
and turn it into an actual application
and start it on your phone.
All of those things are possible, right?
So, that's not the issue.
The underlying issue really is that
you have this device in your hand
that is sort of closed hardware.
You don't know how that thing works, right?
It is impossible to understand how that thing works.
It is impossible to understand how the operating system
on that thing works.
And there's flaws in those systems, right?
Those are closed systems.
There's flaws in those systems.
For some reason, because people don't always
have the best interests of others in mind.
But also not always.
Not always.
But also because people make mistakes, right?
Honest mistakes that are not malicious.
And so, I think that in general also speaks
for the importance of freely accessible hardware,
which people with technical skills can play around with
and find issues in.
But at its core, what you're being subjected to right now,
I would say is technical surveillance.
And what it means is that there's some actor,
can be some state actor, can be someone else,
that decides that Tucker Carlson is worth surveilling.
I think that has been decided here.
I think I do.
I think I'm getting that sense.
So, so technical surveillance.
That means that you specifically are being targeted.
And that is in contrast to strategic surveillance,
which is this idea of everyone is being surveilled.
Let's just surveil everyone, collect every single bit
of information and store that for the entirety
of human history.
And then someday maybe we'll be able to use that, right?
So, those are those two concepts.
And what we've seen over the last few years
is sort of a shift away from technical surveillance,
towards strategic surveillance.
And surveillance capitalism has really helped this concept,
because there's so much data that is being logged,
that can be stored, there's so many new devices
and applications that can be employed.
And so, we see pushes like, for example,
chat control within the European Union,
that is sort of a push to implement backdoors
within all of the messenger applications,
to be able to scan your applications,
to scan your messages, to take your messages somewhere else,
and decide whether or not those people like
what you're saying within your private messages.
So, I would say in general, as a normal human being,
with your iPhone, you are still able to privately communicate,
that is still something that exists.
However, this ability has greatly been limited.
If there is someone who wants to see your message,
I would say they can, unfortunately.
How difficult is it for a determined, say, state actor,
an intelligence agency to say,
I want to read this man's communications.
Listen to his calls, watch his videos, read his texts.
How hard is it for them to do that?
So, I think that, and we can look at different court cases
that have publicly emerged in regards to Apple, for example,
where Apple has refused intelligence agencies' demands
to give them backdoor access to their devices.
And what's so important about this discussion
that we are having here is that every time you're building a system
where you add backdoor access,
so that someone in the future can decide to get access,
and take a look at what you're writing,
what that invites is for everyone to do that,
because a backdoor inherently is a security flaw in our system.
And it's not just some specific intelligence agency
that decides to read your messages.
It's every intelligence agency on the other side of that door.
And so, that's why, as a nation,
you cannot weaken security by getting rid of privacy
without weakening your entire economy, cyber security,
and also social fabric at the end of the day, right?
And the whole strategic positioning of the US as a nation.
How difficult it is, I would say, also,
from a practical operational security standpoint,
depends on what are you doing with your phone, right?
Is your phone this strict device that is only used for messaging
or is your phone also using different types of media?
Are you sending images? Are you receiving messages?
So, I think two years ago,
there was this case where there was a zero-day exploit
being used across Apple devices,
because when I sent you an image and your messenger
had auto download on,
I could get full access to your phone by sending you a message.
And you're not my contact, even, probably, right?
I just figure out what your phone number is.
I send you an image.
The image gets automatically downloaded.
Some malicious code that I have injected gets executed.
And now I own your phone and I can do whatever I want.
And then end-to-end encryption doesn't help you, right?
Because I have literal access to the end device
that decrypts this information.
And so, that was very dangerous; that has been fixed.
But I think what it highlights really is that complexity
is the issue here.
So, complexity in the kinds of applications
that you're running, complexity in the underlying operating system
that this device has,
all of that complexity invites mistakes
and also malicious security flaws
to be installed in those systems.
Yeah.
Human organizations are the same way.
The more complex they are, the easier they are to subvert.
Yes, of course.
February is the perfect month to get cozy,
because it's chilly outside.
Our partners at Cozy Earth understand this
and they're helping Americans everywhere
stay toasty throughout the frigid winter.
I hope you're seated,
because this detail may shock you.
Cozy Earth offers bamboo pajamas.
Light weight, shockingly soft.
These pajamas are a true upgrade.
They sleep cooler than cotton,
but they're made out of bamboo.
That is just wild.
And awesome.
From pajamas and blankets to towels and sheets,
Cozy Earth is something unusual and great for everybody
and it's entirely risk-free.
You get a hundred night sleep trial,
ten-year warranty.
There is no downside that we can see.
So share the love this February.
Wrap yourself or someone you care for
in comfort that feels special.
Bamboo pajamas!
Visit CozyEarth.com.
Use the code Tucker for 20% off.
That's code Tucker for up to 20% off.
And if you get a post-purchase survey,
make certain to mention
that you heard about Cozy Earth from us.
Ever feel like your brain just won't click?
Onnit Alpha Brain is a daily supplement engineered
to support memory, focus, and mental speed.
Made with science-backed ingredients,
Onnit Alpha Brain helps you lock in,
tune out distractions, and stay sharp.
See what your brain can really do.
Visit Onnit.com and shop Alpha Brain
to unlock your next level.
That's o-n-n-i-t dot com.
The sun is shining, birds are singing,
and all feels right in the world.
Until the season changes,
and suddenly you lose your motivation to get out of bed.
In fact, one in five people experience some form of depression
no matter the season or time of year.
At the American Psychiatric Association Foundation,
our vision is to build a mentally healthy nation for all
because we want you to live your best life
and be your best you all year round.
Please visit mentallyhealthynation.org to learn more.
So that's very, I mean, that's a very simple thing, man.
To send someone, you know, to text him an image
and all of a sudden you have control of his phone.
I think we can be fairly confident
that people who have adversaries are being surveilled, right?
Yes, I think so.
I would say that technical surveillance
really is something that exists.
I would say in this battle for privacy,
it is actually not the most important thing to focus on, right?
Because this kind of technical surveillance,
sort of, I feel like to a certain degree
we need to accept, unfortunately,
not the technical surveillance that says,
Tucker Carlson is a journalist.
I don't like that.
Let me surveil him, right?
That's not the kind of technical surveillance I'm speaking of.
But if we have legal procedures and actual
court warrants in place, right?
I feel like as a society we could accept that
as a compromise.
As long as we can trust the system.
Right, we could definitely accept that, of course.
But the fundamental issue really is,
and that's sort of so ironic, right?
That all of the surveillance sort of needs to operate under secrecy
in order to function, right?
You should not know that you're being surveilled.
Nobody sort of has oversight.
Not even the democratic processes are able to have oversight
because it's all wrapped in secrecy.
So that really brings us to the fundamental issue here,
also with strategic surveillance, surveilling everyone,
just deciding, well, I'll take a look at everyone's phone,
store everything, and maybe I don't like someone in the future
then I have this backlog of information.
So the important question to consider here
is thinking about, is there even a future
where from a legal standpoint it is possible
to implement procedures that guarantee
there is no secret surveillance in place,
which I think the answer is pretty clear to that question.
And it is.
I think it is not.
So I think it is important to have these laws in place, right?
Laws that prohibit surveillance and that enable different kinds
of processes with warrants, right?
Literally the Fourth Amendment, right?
To allow for that to be implemented in the 21st century.
But what we've seen sort of is that the tools
that governments have access to are so powerful
that it is impossible to make a law
that prohibits use of that,
because whoever within a centralized architecture
that's always the case has access to this technology
basically becomes a single point of failure.
And that single point of failure will necessarily be corrupted
by the power that exists.
Just a couple of obvious, low-brow technical questions:
is the iPhone safer than Android, or less safe?
That's a good question.
So I would say a huge advantage that Android devices
bring to the table, right?
This is the nature of, I guess, a subset of those devices, right?
Not speaking for the entirety, but the operating system, for example,
being publicly viewable by anyone, right?
Yes.
You can understand it.
Yes.
And I think that is so important not just for security,
but also for technological innovation.
And so I would say that is a huge advantage.
Now, the devices are manufactured by some manufacturer
who you need to trust at the end of the day,
based on how the hardware is built
and how the firmware is compiled
and then put on your device.
So there have been interesting operating systems.
I think there's one called GrapheneOS,
which is a secure, open-source operating system as far as I know;
I haven't looked too deeply into that.
But you could, on an Android device, theoretically say,
I'm going to run my own operating system on that,
which I think is a very strong proposition.
Now, I myself, I'm also an Apple user.
There is also an element of institutional trust involved here,
where you say, okay, I trust the manufacturing
and software process that this company has.
But in general, if I'm being honest,
if I weren't lazy, right,
what I'd be doing is I would actually be looking for
a minimalistic, secure, open source operating system
for my mobile phone, and I would build that myself
and get some hardware and put that on there.
So I would say that would be the smartest thing to do.
If you are technically versatile.
I read that you use an iPad, not a Mac.
Is there an advantage?
That's what I did back in the day.
That's where it started, yeah.
Is there an advantage to the iPad over the Mac
from a privacy standpoint?
I think what it boils down to there
is what kind of applications could be installed on your system.
I would say in general, devices like the iPhone or the iPad,
operate in a more sandboxed way,
where applications are actually isolated, right,
rather than how it works on operating systems like macOS
or Windows, where you could compromise the entire system
way more easily, right?
So on the iPhone, you just have an App Store with applications
and the level of compromise that such an application can have,
theoretically at least, is limited
to just this single application, right?
It doesn't have access to your messenger
if you're installing an app.
Although it could, I guess, if there's some flaw in the system,
which always is the case.
So you never have this absolute security.
I think what it really boils down to is this idea
that really emerged in the 1990s of decentralization, right?
Moving away from central single points of failures
towards decentralization, where we can mitigate a lot of these risks
by not depending, I guess, on one single type of computer
and not even depending on one single computer,
but having many computers, which introduces redundancy, resilience,
and, I guess, risk reduction and distribution to computer systems.
So speaking more broadly about how the internet
in our free society should be built, I guess, yeah.
So most people don't wake up in the morning
expecting to feel horrible, exhausted, foggy, disconnected
from themselves, but it does happen, and it happens slowly.
You're working hard, you're showing up,
and your energy disappears by midday, your focus is dull,
your weight won't move, a lot of people are told,
that's just getting old, that's what it is,
but that's not actually true.
For many men and women, these are not personal failings,
they are signals tied to your metabolism, your hormones,
and nutrient imbalances that go undetected for years;
you don't even know you're deficient.
And that's why we're happy to partner with join blocs,
a company that was built for people who were all done guessing
and ready to figure out what exactly is going on.
And that starts with comprehensive lab work,
and a one-on-one consultation with a licensed clinician.
An actual human being explains what's happening inside you
and builds a personalized plan,
which includes hormone optimization,
peptide therapy, targeted supplements.
So don't settle. Go to joinblocks.com/Tucker,
use the code Tucker for 50% off your lab work
and 20% off all supplements.
That's joinblocks.com/Tucker, use the code Tucker.
50% off labs, 20% off supplements.
Join Blocs, get your age back.
America leads the world in medicine development.
It matters.
We get new medicines first, nearly three years faster.
Five million Americans go to work
because we make medicines here at home,
and not relying on other countries keeps us safe.
But China is racing to overtake us.
Will we let them, or will we choose to stay ahead?
When America leads, America cures.
Let's go to Washington to keep us in the lead.
Learn how at AmericaCures.com,
paid for by Pharma.
It's not just something you made,
it's the privilege that you get to work with your hands.
It's building something that serves a purpose,
proof that you have the grit to keep going.
At Timberland, we understand you take your craft seriously,
and we do too, which is why our products are built to the highest quality.
We put in the work so you can perfect yours,
with purpose in every detail and crafted with intention.
Timberland, built on craft,
visit Timberland.com to shop.
You've said a couple of times that the problem is the hardware.
It's not the software.
It's the device, right?
It's a union of the hardware and the software.
What's the option?
Is there an option at this point?
If I am intent on sending a private message to someone else electronically,
is there a way to do it as of right now?
That's private, guaranteed private.
I would say the way that I, myself,
at least handled it really is to have a dedicated phone for that specific use case.
And then just have encrypted messenger there.
That you can trust because maybe you don't even install it via the App Store,
but you have built it yourself.
And there's no other interactions taking place with that phone.
I would say from an operational security standpoint,
that is as good as it can get.
Otherwise, you would really have to look at,
I don't know, you can do creative things always, right?
You could write your message and hand-encrypt it,
then type it into the phone, right?
So the phone itself doesn't matter at that point.
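The hand-encryption idea he sketches is essentially a one-time pad: if the pad is truly random, used once, and exchanged offline, the ciphertext the compromised phone sees reveals nothing. A minimal Python sketch, where the message and pad are made-up illustrations:

```python
# One-time pad: XOR the message with a random pad of equal length.
# If the pad is random, single-use, and shared offline in advance,
# the ciphertext alone carries no information about the plaintext.
import secrets

message = b"meet at noon"                    # made-up example plaintext
pad = secrets.token_bytes(len(message))      # shared in advance, offline

ciphertext = bytes(m ^ k for m, k in zip(message, pad))    # what the phone sees
recovered = bytes(c ^ k for c, k in zip(ciphertext, pad))  # decrypted by hand

assert recovered == message
```

The catches are exactly why it's a last resort: the pad must be as long as all traffic, never reused, and delivered in person.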
So maybe we need to get away from the devices altogether, right?
What's interesting, what we're doing with Arcium is that we never have a single point of failure.
Everything is encrypted.
Everything sits within a distributed network,
where as long as you're not able to basically get access to the entire globally distributed network
to every single participant, you have security.
And it's difficult to do that with your own phone.
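The "no single point of failure" idea can be sketched with threshold secret sharing: split a secret across a network so that no small group of participants can recover it. This is a toy 3-of-5 Shamir scheme in Python, an illustrative simplification and not Arcium's actual design; the modulus and parameters are assumptions for the demo:

```python
# Toy (3-of-5) Shamir secret sharing over a prime field: any 3 shares
# reconstruct the secret; 2 or fewer reveal nothing about it.
import secrets

P = 2**61 - 1  # prime modulus (an assumed toy parameter)

def make_shares(secret, threshold, n):
    """Evaluate a random degree-(threshold-1) polynomial at x = 1..n."""
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(threshold - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):   # Horner's rule
            acc = (acc * x + c) % P
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % P
                den = (den * (xi - xj)) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(1234, threshold=3, n=5)
print(reconstruct(shares[:3]))  # prints 1234; any three shares suffice
```

An attacker would have to compromise three of the five participants at once, which is the redundancy-and-resilience argument made here applied to secrets rather than uptime.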
But at the end of the day, I think over time,
those systems get more secure.
However, what is important is to be certain that there are no backdoors explicitly installed, right,
from those manufacturing processes.
I think there's some countries where
if you're buying a phone from there, you could be certain, okay, there might be something installed
because the company itself is owned by the government.
And we need legal frameworks for that.
And what we also sort of require is that the manufacturing process itself mirrors distributed, decentralized systems,
where there again is not a supply chain of single points of failure,
where one single worker who gets paid off could decide to install some backdoor,
but where instead there is oversight.
And I think that Apple runs on that model already.
So I would be relatively comfortable with these kinds of systems,
but there's also other interesting technologies.
So for example, Solana, which is an American blockchain network, right?
And they actually have their own phone company, offering phones.
They have a very small manufacturer and they manufacture those phones because they say,
well, those phones need to be very secure because you literally stored your money on there now
because your money is digital and on top of a blockchain network.
And so I think those are very interesting approaches,
where I'm really looking forward to seeing more phones like this,
where there's then again a competitive market emerging for who's building the most secure phone.
I actually think a friend of Julian Assange from Germany, I don't remember his name,
had a company manufacturing secure phones.
The issue with explicitly built secure phones, however, always is that I would say many of these companies are honeypots.
I noticed.
Yeah, with Anom chat, or whatever it was called, there was this large-scale,
yeah, police operation to stop drug cartels, which worked out nicely, I guess, in the end.
But the company itself was just a facade to sell backdoored phones.
Yeah.
Right.
I mean, it's the perfect honeypot.
And so, by the way, Signal, which I'm not saying is a honeypot, of course,
and I use it, as the authorities know.
But it was created with CIA money, so it doesn't mean it's a CIA operation,
but why wouldn't it be?
I mean, honestly, I'm not accusing anybody because I have no knowledge.
Yeah.
I mean, pretty obvious move, right?
It would be.
I think what's important when we look at Signal actually is that we look at what Signal is.
Signal is open source software.
Yes, that anyone can verify for themselves.
And what it means is that we have this global community of mathematicians and cryptographers
that have invented those protocols independently, without getting funding
from the CIA or whomever.
Yes.
They're fond of mathematical problems that they want to solve, that they are passionate about,
and all of those people look at those open source lines of code and mathematical formulas,
and they find the flaws in those systems.
And so that makes me confident in the design of Signal itself.
Do you use it?
I use Signal, yes.
I got my entire family to use Signal.
Okay, good.
And I have to say, I know a lot of Intel people use Signal.
Yeah.
A lot of the ones I know.
And so that tells you something right there.
Yes.
So I think it would be highly unlikely that Signal itself would actually turn out to not be secure.
There has been this interesting case, in the early 2000s,
where there was this attempt to actually undermine strong encryption, with a very exotic name:
the Dual Elliptic Curve Deterministic Random Bit Generator, Dual_EC_DRBG, right?
Nobody understands, no non-technical person understands what that means, right?
And what you actually need to understand, in order to comprehend what happened there,
is that when we encrypt information, as I said earlier,
we take something and move it into this different realm, where you cannot follow this information,
because that would require you to have literally infinite resources, more energy than the sun will emit over its lifespan.
Isn't that crazy, right?
So you, you cannot, you cannot follow there.
Well, how this asymmetry is fundamentally achieved in cryptography is that the universe runs on energy and uncertainty,
right?
Particles jitter, stars burst, and so there's this randomness in the universe.
If you look at the sky, or if you just look at how things are made up,
there's random noise everywhere.
And so when we encrypt something, we make use of that chaos,
and we inject it into a message that we are sending, for example,
and it's only impossible to decrypt that message in an unauthorized way
if the randomness that has been injected into this message is actually unpredictable.
Now, the randomness, if we think about it, has to be truly random.
Yeah, I cannot, I cannot figure out how you arrived at the random number.
No, no pattern.
No pattern.
Exactly.
True randomness, true entropy, right?
Yes.
That's what cryptographers, I would say, spend most of their time thinking about:
how can we achieve true randomness?
Because then, if we are able to inject that using mathematics,
it becomes impossible for you to distinguish this message from randomness.
You can't find a pattern.
Hence, you're not able to apply any optimized algorithm to undermine it. Exactly.
So if we, if you think about it practically, what that means is,
let's say we have a deck of cards, 52 playing cards, right?
And I randomly shuffle the stack of poker cards.
We have 52 cards.
What that means is that there's so many possible ways that the deck could be stacked,
that it is very unlikely that for truly randomly shuffled decks,
there have ever been two identical decks in the history of humanity,
which is hard to believe in general, but that's how statistics and mathematics work, right?
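The shuffled-deck claim can be checked directly: the number of orderings of a 52-card deck is 52 factorial, a 68-digit number, vastly more than the number of shuffles humanity could ever perform. A minimal sketch:

```python
import math

# Number of distinct orderings of a 52-card deck.
orderings = math.factorial(52)

# Roughly 8 * 10^67 -- so two independently, truly random shuffles
# producing the same deck is astronomically unlikely.
print(orderings)
print(len(str(orderings)))  # 68 digits
```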
So we take this deck and we use that as the randomness.
Now, if I play with a magician, the magician can pretend to shuffle the deck,
but actually they have not shuffled the deck.
They know what the cards look like.
What we're doing with all of this randomness that we are injecting into information
is basically determining what key is being used to lock it.
And if I don't know what the randomness looks like,
if I don't know what the next playing card in the stack is,
I have to try every single possible key and try to unlock it with this message.
So you could think of it as, I have this message.
Now I want to apply violence to this message in order to recover it.
What I'm doing is I take key number one, I try to unlock it, doesn't work.
Then let's try key number two.
And you do that for an inconceivably large amount of numbers.
So that's why you basically, practically speaking, cannot brute force these kind of mechanisms.
Although you can, if you know where to start looking for the keys,
if you know that you need to start looking at the, yeah, millionth key,
then you can recover it.
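The "try key number one, key number two" attack above can be sketched with a deliberately toy cipher (a 1-byte XOR, purely illustrative, not a real cipher): a tiny keyspace falls instantly, while a real 256-bit keyspace cannot be searched.

```python
# Toy sketch of brute-force key search. Everything here is invented
# for illustration; real ciphers use 128- or 256-bit keys.
def xor_encrypt(message: bytes, key: int) -> bytes:
    # XOR every byte with the key; applying it twice recovers the input.
    return bytes(b ^ key for b in message)

secret_key = 173  # an arbitrary 1-byte "key" for the demonstration
ciphertext = xor_encrypt(b"meet at noon", secret_key)

# The attacker's "violence": try every possible key in turn.
recovered = None
for candidate in range(256):
    if xor_encrypt(ciphertext, candidate) == b"meet at noon":
        recovered = candidate
        break

print(recovered)  # 173 -- found only because the keyspace is tiny
print(2 ** 256)   # number of keys to try for a real 256-bit key
```

With 2^256 keys, the same loop would outlast the universe, which is the asymmetry the speaker describes.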
And so if the deck is being manipulated, the randomness is being manipulated,
then you can undermine encryption while the process of encrypting it itself remains sound, right?
You don't notice it.
You actually do what you mathematically need to do to securely send your message,
but the value that you use to do so, this randomness, is actually not random.
And that's what had been attempted with this specific algorithm,
Dual_EC_DRBG.
What they did was they created this concept of kleptography,
where they actually have randomness derived in a way that is deterministic,
and they actually have some secret value.
And then from that secret value, they derive fake randomness.
It looks random, but it's not actually random.
And the NSA proposed this algorithm to NIST,
the National Institute of Standards and Technology, in the early 2000s,
as the, yeah, state-of-the-art randomness derivation function, I guess, right?
And that got accepted.
It got accepted as an official standard.
And then there were companies like RSA, actually a highly sophisticated
and respected cryptography company, right?
With the founders being some of the fathers of modern cryptography, right?
They then built products and distributed them to industry, and people used this technology.
Nobody knew about it. But that's not actually true.
Not nobody knew about it: there were a lot of cryptographers that raised questions,
a couple of years later, where they were like, I don't think this is actually random.
It looks suspicious to me.
They showed that if someone theoretically had access to some secret key S,
they could create some mathematical formulas, and they actually mathematically proved
that there was insecurity there, that it was not random.
Because they noticed a pattern in it.
They sort of realized that.
So basically what they realized is that there's just those numbers.
So they wrote this proposal.
Hey, let's use this algorithm.
And this algorithm contains some constant numbers.
So there's those numbers written there.
And then they were like, are those numbers random?
Because we're literally deriving our randomness from those numbers.
And the answer was, yeah, those are random.
We randomly generated them.
And it turns out there was some other key that is being used
to then mathematically be able to recover whatever randomness you used.
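A toy analogue of that kleptographic trick, with every name and value invented for illustration (this is not the actual Dual_EC_DRBG construction, which uses elliptic-curve points): the outputs pass as randomness to users, but anyone holding the designer's secret can regenerate the entire stream.

```python
import hashlib

# Toy kleptographic "random" generator. DESIGNER_SECRET and the
# construction are made up; the point is only the predictability.
DESIGNER_SECRET = b"only-the-designer-knows-this"

def backdoored_random(counter: int) -> bytes:
    # To a user this looks like a fresh 32-byte random value...
    return hashlib.sha256(DESIGNER_SECRET + counter.to_bytes(8, "big")).digest()

# A user draws three "random" values for their keys:
user_values = [backdoored_random(i) for i in range(3)]

# ...but the designer, holding the secret, predicts them all exactly:
designer_values = [backdoored_random(i) for i in range(3)]
print(user_values == designer_values)  # True -- the randomness was never random
```

The encryption built on top of such values is mathematically sound, exactly as the speaker says; it is the input randomness that betrays it.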
So that was the secret attempt to undermine cryptography
by the US government.
Yes.
And I think what was striking about this is that
you're not just undermining privacy.
You're undermining the entire security of your economy, your country.
And banking, banking, missile codes, everything.
Yes, everything.
So the thing that then happened was, in 2013,
Snowden revealed a few papers, I guess.
And one of those was Project BULLRUN.
And within Project BULLRUN, they allocated funding
to that specific project where they tried to undermine cryptography.
And so once that got published,
the corresponding companies and standardization institutes...
And it's so striking that you go after standardization,
because once something is defined as a standard,
you and industry need to implement it to get certification.
So it's literally impossible to then use
some other, secure alternative,
because certification only gets provided
for this specific technology.
But once it got uncovered, thanks to Snowden,
then people stopped using it.
Did he celebrate it?
Did he win the Presidential Medal of Freedom for this?
Yes.
And then it turned into reality.
In the different realm, I guess.
One of the great patriots of our time.
Relentless, I mean, they'd murder him in a second.
He's still an exile, not by choice.
But it's, yeah.
But it was also uncovered that they actually paid
this company that built those products
10 million U.S. dollars, the NSA did,
to use that as a standard.
So yeah, that's why you cannot trust anyone.
As you point out, it's not simply,
I mean, so this is an intel agency trying to spy on its own people,
the ones who pay for it to exist.
And that's immoral and, you know,
something that we should fight against.
But they were also sabotaging the U.S. economy
and U.S. national security.
And because if your cryptography is fake,
then that means you're exposed
on every level throughout your system.
You are, yes.
Yeah.
And it's so interesting, because it is their task,
that's why it was possible for them to do that,
to increase national security, right?
At that point, they were the leading cryptography research organization
in the world, sort of, right?
And so that really is striking to me,
that you're willing to undermine
the entire security of your nation.
And at the end of the day,
puts you in a worse strategic position.
I think many people don't realize that.
I've never thought about it until you mentioned it.
But it just highlights, I mean, I love Ed Snowden
and I'm not embarrassed by that, I'm proud.
But it just highlights, you know,
the suffering that he's been through
in order to help his own country,
and he's still slandered constantly for it.
It drives me crazy.
But this is yet another example
of why he did something more than almost anyone else
to help this country.
But so, it sounds like you're convinced that
the current state of the art in open cryptography
is actually secure.
Yes, yeah, 100%.
I think, as I said,
I think this is a great example to look at
where even with those backdoors
that had been implemented,
there were cryptographers
within this global open source,
mathematics, cryptography, community,
that rang the bell,
but nobody was listening to them.
But they actually identified the issue,
years in advance, and rang the bell
and said, this is not secure or not random,
even within those companies
and standardization institutes,
but nobody took it seriously.
Or I guess took it seriously,
but it doesn't matter if the law is,
you have to use this algorithm, right?
So that makes me very confident
that this system works,
the system of mathematicians.
Is cryptography global,
which is to say like,
is Chinese cryptography different
or stronger than European-American?
Of course.
It is interesting.
So you have actually specific encryption standards
used by militaries of the world, right?
So the Chinese use different cryptography
than the Russians,
than the Americans.
It is, at the end of the day,
the same thing, right,
from a mathematical standpoint.
But there are some deviations
in the level of security
and the kind of numbers used, right?
So everyone builds their own standards
because they mutually distrust each other.
But at the end of the day,
the underlying mathematics
are the same.
The cryptographic standards,
the way that cryptography works,
that is the same.
So there's no reason to think
that the Chinese and the Russians
have stronger cryptography
than the Europeans and the Americans?
No, no.
And I think,
I mean, it's interesting to think about,
is there cryptography
that is being developed in-house,
within militaries
or whatever proprietary human organization, right?
It's not publicly known
whether there is something incredibly powerful.
I mean, what I've been doing
with my team,
and I'm so glad that I have those,
incredible cryptographers in my team
that actually understand all of those things
on a way more detailed level than I do,
is build this protocol
that allows us to literally
take everyone's data
so you could imagine
the entirety of the United States, right?
We take everyone's health care data,
something like that, right?
And then we say,
well, we need to do something with that data.
Let's say we need to research a disease
or whatever.
instead of taking that data
and passing it to some company
that will inevitably expose it,
lose it,
it will get leaked,
or it will be used against those people,
we encrypt it,
nobody ever has to share any information,
and we just run whatever computation
that we collectively set,
we are going to do that with this data,
we do that,
we get the result,
we figure out a cure to cancer,
whatever,
but at no point in time,
you ever had to share your data,
your data never left your ownership,
and I think that's really core.
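One simple building block behind computing on data that nobody ever shares is additive secret sharing. A minimal Python sketch, with made-up inputs; real multiparty-computation protocols are far more involved:

```python
import random

# Toy additive secret sharing: each private value is split into random
# shares; individual shares are statistically pure noise, yet all
# shares together still sum to the aggregate. Values are invented.
MOD = 2**61 - 1  # arithmetic modulo a large prime

def share(value: int, n_parties: int = 3) -> list:
    # Split a private value into shares that sum back to it mod MOD.
    shares = [random.randrange(MOD) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MOD)
    return shares

private_inputs = [42, 7, 100]  # e.g. three people's private health metrics
all_shares = [share(v) for v in private_inputs]

# Each party sums the one share it received from every person;
# individually these partial sums reveal nothing.
partial_sums = [sum(column) % MOD for column in zip(*all_shares)]

# Only the combination of all partial sums yields the aggregate.
total = sum(partial_sums) % MOD
print(total)  # 149 -- the sum is computed, yet no input was ever disclosed
```

The aggregate statistic comes out, while each raw input stays with its owner, which is the property the speaker is describing.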
And it's sort of
the holy grail of cryptography, I would say,
being able to do these kinds of things,
because you can now run
any type of computer program,
instead of in the public,
in private,
and you can restructure the way that
your entire economy
and country can work, right?
And that goes beyond
just economic human interactions;
that also touches upon things like
rethinking how we can actually
improve democratic processes,
because what those computations
inherently have as a property
is so-called verifiability.
So the status quo, sort of,
in the current internet is:
you task some cloud provider
to run a computer program for you, right?
Because you have limited resources,
you want them to run that computer program for you.
So you pass them some information,
an algorithm,
and you get an output back.
But how do you know
that this output is actually correct, right?
Could be that there was an error,
could be that
they maliciously tried to undermine
the output that they have sent you.
So this technology that we've built
actually solves this, right?
Verifiability for computations.
You can mathematically verify
that a computation has been correctly executed.
And that itself
is an amazing property.
An amazing property that you want to see
within every system, right?
But you don't get that amazing property
without implementing privacy for those systems.
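The verifiability asymmetry can be illustrated with a deliberately simple stand-in, sorting, where checking a claimed answer is far cheaper than recomputing it. Real verifiable computation relies on cryptographic proofs, not this toy check; the function names are invented:

```python
from collections import Counter

def untrusted_server_sort(data):
    # Stand-in for a cloud provider; it could return anything,
    # by error or by malice.
    return sorted(data)

def verify_sort(original, claimed):
    # Cheap linear-time check: output is ordered and is a
    # permutation of the input. No trust in the server required.
    in_order = all(a <= b for a, b in zip(claimed, claimed[1:]))
    return in_order and Counter(original) == Counter(claimed)

data = [5, 3, 8, 1, 9, 2]
claimed = untrusted_server_sort(data)
print(verify_sort(data, claimed))             # True  -- result accepted
print(verify_sort(data, [1, 2, 3, 4, 5, 6]))  # False -- tampered result rejected
```

The client never re-sorts anything; it only verifies, which mirrors the "mathematically verify the computation was correctly executed" property described above.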
Isn't that amazing?
It is amazing.
It is.
How did you all create this?
So I'm very lucky that within my company,
I have very experienced cryptographers
who've literally worked more years
on these specific issues
than I have been in cryptography.
And so I'm sort of building
on the shoulders of giants, of course, right?
And there has, for a very long time,
been research in those areas
being able to run those encrypted computations.
But it has never been practical enough,
where it is fast enough, cheap enough, right?
And versatile enough,
where you can actually do all of those things.
And so I think what really guided us
is to,
and what really guided me
in the way that I designed this system
is to think about, okay,
how can I actually build this system
so that people are going to use it
and are going to build applications
and are going to integrate that into systems, right?
Because I think with privacy technology
in general in the past,
what has been done is that
it sort of has been created
in an echo chamber
in a vacuum almost,
where you're a smart cryptographer,
it will be an amazing technology,
but you maybe don't understand
how markets work
and how to get product markets,
and how to actually get those users, right?
And so we've tried to build in a different way,
and that's how we ended up here.
But to be honest,
it was an evolutionary process for us.
So we originally started with a different kind of cryptography,
I would say,
that was more limited,
that didn't allow for all of those interactions,
and then, at some point,
we sort of decided, okay,
and we realized that
that was not good enough,
that was not enough,
and at that point,
basically everyone was still
building with that technology,
and we were like,
let's do something different instead,
let's think about what the future
will look like,
how sort of computation and privacy
can converge
into something bigger
for the entirety of humanity,
and that's then,
how we build that.
In very, very quick time, actually.
How did you fund it?
So we got investor funding,
and I'm incredibly thankful
for all of the investors that I've gotten,
Coinbase, for example.
So big names in the space of
blockchain distributed systems,
all of those networks like Bitcoin,
all of those networks are distributed
in nature, decentralized,
and yeah, there's a lot of players
within that space that truly believe
in the value of privacy,
and that privacy is a human right,
and that privacy as a technology is inevitable,
and that like to support it,
but not just support it
because it is something they believe in,
but invest in it because they sort of
have realized that this is one
of the most powerful technologies
that can exist for humanity,
right, being able to take information,
move it into this realm,
and then it can stay in this realm,
and it can be processed,
and everyone can do that.
That is incredibly powerful.
It is emancipating,
and it is powerful for businesses,
but also nation states.
At the end of the day,
it is a neutral technology,
and so we have investors
that believe in that.
So one of the applications
for this technology,
and one of the big ones,
is the movement of money
in a way that's private.
How exactly does that work?
And let me just add one editorial comment:
the greatest disappointment of the last 10 years for me
is that crypto transactions
don't seem to be as private
or beyond government control,
as I thought they would be.
I hope they are someday,
but watching the Canadian truckers,
you know, have their crypto frozen
was just such a shock.
I've never gotten over it.
Will this technology change that?
Yes, so think about Bitcoin
as the state-of-the-art model,
or I guess not the state-of-the-art model,
the original kind of blockchain network, right?
What it is at the end of the day,
is a way for distributed people
to find consensus over some unit of money,
which is actually more like a commodity
than actually a financial instrument.
That's right.
And they find consensus
and they create this currency.
And that's why people think
that it's fake, non-existent, right?
Although it's a way more real process
of creating a currency than fiat currency,
they mine it by taking energy
and solving a mathematical problem
and once they've correctly solved
that mathematical problem,
they get rewarded
in that newly mined currency, right?
So it's a very, very elegant design.
Most people think that these kinds of networks
are anonymous and are dangerous, right?
Because I feel like it has actually been a narrative
that media and different actors
want the people to believe.
I just have to add,
I would like them to be anonymous and dangerous.
Oh, yes.
Oh, yes.
Yes, yes.
I was hoping, yes.
So people believe that,
which attracts people, right?
And also keeps other people from using them
and trying to outlaw them.
In actuality, they're not anonymous.
What you have in Bitcoin specifically
is pseudonymity.
So you don't see on the blockchain
Tucker Carlson has 10 Bitcoin
or whatever, and, say, Yannik one Bitcoin.
You instead see ABCDEFG, blah, blah, blah, whatever, right?
A random string of numbers and letters
has sent something to another random string
of letters and numbers.
However, they're linked to this identity that you have.
So for every single transaction
that you've performed in history
on top of this distributed ledger,
you will see all of those transactions.
So if you, later after the show,
send me one Bitcoin, I guess, right?
They're cheaper today
than they were yesterday.
You noticed.
So when you sent me something,
what I'll be able to see is
all of the other transfers
that you've performed in the past, right?
That's unfortunately how Bitcoin works.
And so it has this inherent full transparency.
There's no privacy, because it's so easy to then see,
via, I guess, the on- and off-ramps,
how you actually moved money in there, right?
Because you most likely don't actually
get this currency through work
by applying energy.
You buy it for a different currency,
fiat money, right?
So your identity is linked, everything is public.
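The pseudonymity problem above can be sketched with a toy public ledger; the addresses and amounts are made up for illustration:

```python
# Toy public ledger: addresses are random-looking strings, but every
# transfer is permanently visible, so one identity link (e.g. an
# exchange's KYC record) exposes a pseudonym's entire history.
ledger = [
    ("a9f3...", "7c1e...", 1.0),  # (sender, receiver, amount)
    ("7c1e...", "b2d8...", 0.4),
    ("7c1e...", "e401...", 0.3),
]

def history(address: str):
    # Anyone in the world can query the full history of any pseudonym.
    return [tx for tx in ledger if address in (tx[0], tx[1])]

# The moment an off-ramp links "7c1e..." to a real person, all of that
# person's past and future transactions are deanonymized at once.
print(len(history("7c1e...")))  # 3 -- every transaction, forever public
```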
And so that's a fundamental issue.
That is actually a dystopian scenario
where we could end up.
If this is adopted as the technology
where all of your money now sits
and you're sending transactions,
where you have this big upside
of having cash-like properties,
which is amazing,
but you have this tremendous downside
of literally everything being recorded
for the conceivable future of humanity, right?
And you have no privacy.
And that inherently limits your freedom
to use this technology.
And so that is an issue
that exists, not just within Bitcoin,
but also other blockchain networks.
And Bitcoin is this pure form.
That's why within the crypto industry,
there's a lot of competition also
between different players that say Bitcoin is this pure form
that only allows transfers of money, right?
And other networks allow execution as well.
And that has led to
what is commonly called smart contracts.
So this concept of computer programs
that simply exist in the ether, basically,
a computer program that can execute something
that you tell it to do,
and it is guaranteed to do so.
And this amazing property
that all of the founding fathers
of those networks basically identified
as important is so-called censorship resistance,
which I think is also very important in real life.
And so those networks provide censorship resistance.
It doesn't matter if one computer decides,
well, I'm not going to accept Tucker's transaction
because I don't like Tucker.
Well, there's going to be another computer that says,
I will accept it.
So that is censorship resistance
that is inherently baked into those systems.
And what that means is,
if you interact with this as this invisible machine, right,
you get guaranteed execution for whatever you tell it to do,
either send someone money
or perform some other computational logic
that is baked into the system.
And so there have been,
have been different kinds of pioneers
on the front of performing,
yeah, adding cryptographic privacy to those systems.
There has, for example, emerged
a network called Zcash,
which is basically Bitcoin
with cryptographic privacy.
And there have also been pioneers
like the inventors of Tornado Cash,
who have built a smart contract
that exists within this ether and is unstoppable.
Once you've uploaded it,
you cannot stop it anymore.
So they did that.
And the kind of code
that they implemented there
gave you privacy on top of this public network,
which was,
or is, the Ethereum Virtual Machine.
So they did that,
and that is Tornado Cash.
Well, did they win the Nobel Prize?
Did they get the Presidential Medal of Freedom?
What happened next, when they offered privacy?
So there were,
I think it was three founders:
Roman Storm, who's an American citizen,
Roman Semenov,
who is a Russian national,
and Alexey Pertsev,
who is a Russian national as well,
who lives in the Netherlands.
He has been convicted
of assisting in money laundering,
for five years.
And...
Five years in prison?
Five years in prison.
And Roman Storm has been convicted,
in the United States,
of conspiring to run a money transmitter
without a license.
Now, why has this happened?
Why did they suffer such grave consequences?
They were arrested.
They were arrested.
And brought to trial.
I mean, it's actually,
if you look at what Roman Storm has faced,
it was 40 years in prison for this.
In the United States.
In the United States of America.
And why has that happened?
They built a privacy tool.
Well, it was an illicit actor
that used their privacy tool.
And that is a shame
because it was an illicit actor
that a lot of people agree on.
It is an illicit actor.
I think the two of us also agreed
that North Korea laundering stolen hacked funds
is an illicit actor misusing a tool, right?
So there's no question about this.
The underlying question reveals...
Are we sure that actually happened?
We are sure that happened, yes.
For sure that has happened.
And so they stole funds
because they were able to hack
different systems.
And then they were able to utilize
this platform to gain privacy,
to then move those funds somewhere else.
Did Roman Storm participate
in the North Korean theft of hacked funds?
He did not, no.
So if I rob a bank
and then jump into my Chevrolet
and speed away,
does the president of General Motors get arrested?
Usually he doesn't, no.
Okay.
Which is interesting because
he provided clearly this tool
for you to get away.
And he knows that people
get away with cars, right?
He does.
So it's kind of weird how he dodged those obvious charges.
Is that really what happened?
That's really what happened, yeah.
And has faced 40 years in jail.
But the jury
could not find a unanimous decision
on the main charges,
I guess, circumventing sanctions
and helping with money laundering.
Now the interesting thing is
before they got arrested,
what has happened?
OFAC,
the Office of Foreign Assets Control
in the United States,
took the software
that those developers had written
and uploaded to the ether,
where it had become
out of anyone's control,
unstoppable by nature.
Anyone can use it.
They essentially wrote
code for a software tool
for anyone to get privacy.
That software tool got sanctioned.
It got put on the SDN list,
the list of Specially Designated Nationals,
where you put the names of terrorists,
and they put the address,
in this ether thing, right,
of the software.
So the source code itself
became illegal.
It was deleted from the internet.
All of the companies closed
their developer accounts.
The software they wrote,
the free speech that they performed
by coming up with those ideas
and publishing it to the world got censored
because they were added to a list
which they don't even belong on
because it is not...
Without any vote in Congress,
by the way,
or this is just...
Yeah.
Part of the...
I think it's under the State Department now,
but I could be wrong.
They have enormous power.
They've destroyed the lives of many
thousands of people without any
Democratic oversight at all.
And it's pretty shocking.
Yeah, and so...
So it got added onto this list.
And I think last year,
a court in the State of Texas
actually ruled that OFAC
does not have the statutory authority
to do any of that.
And they then silently removed
Tornado Cash from the SDN list.
However, nobody is really able to use it anyway,
because every company,
for compliance reasons,
casts you out of its user base
if you have ever touched
anything related to it.
And Roman Storm is...
He was convicted, you said.
There was a hung jury
on the strongest charges,
but on other charges,
he was convicted.
He was convicted on one charge.
On...
I think they...
It is called...
Yeah, conspiracy to...
to run a...
money processor,
financial institution, right?
A bank...
Without a banking license.
So, so...
So, so...
So they put him in jail.
Actually.
So, so he...
It is one year jail sentence
that's on the charge, right?
But he's currently in the process
of appealing that.
So, Roman Storm...
didn't run a bank.
He didn't create a bank.
He created software, right?
He made use of his
inherent right for freedom of speech
to build something that enables others
to make use of their right
for freedom of speech, right?
Because that is at the end of the day
the freedom of economic interaction, right?
Yes.
But he helped others
protect themselves.
He never processed a transaction
for anyone, right?
He's not an intermediary.
He specifically built technology
that is disintermediated
where you yourself use that software.
Yeah.
And so...
The remarkable thing is,
I pay some attention, obviously, not enough.
I was not aware of this story
until I was reading up on you.
Where's all the coverage on Roman Storm?
He doesn't even have a Wikipedia page.
I've noticed.
So, there is...
I think incredible institutions
like the Electronic Frontier Foundation,
the EFF,
and DeFi Education Fund,
but also companies like Coinbase,
who actually have invested
substantial amount of money
into defending Roman Storm.
And, yeah, I like to say,
as well,
I think Alexey Pertsev
also doesn't get enough attention.
He's...
I mean, he's now under house arrest
in the Netherlands
and preparing to appeal his decision,
I think, something like...
Why are so many of the players
in this Russian?
I think it really boils down
to them having a deep understanding
about...
I think historically,
maybe culturally,
they have an understanding
about the importance of privacy
in a society, to uphold freedom,
which is a shame.
Well, they've suffered for that knowledge.
Yes.
For 70 years.
More than.
So, yeah, it's just...
It's very striking.
It's 140 million people.
It's a tiny country,
relatively speaking,
and yet they are way over-represented
from Pavel Durov on down.
For sure, yeah.
That is true.
So, I think...
I think it's interesting
how we also,
all of us,
take it for granted
that these kinds of people
step out of their everyday lives
and put a target on their heads
by shipping this technology
to enable you to gain privacy,
and simply the knowledge
about the existence
of bad actors in the world
has made them victims
and put them in jail,
which is insane.
Well, I mean,
it's something the rest of us
should push back against, I think.
But the hurdle for me is not knowing.
Again, I didn't even know this was happening.
I should have guessed, I suppose.
If you could be more precise
about what you think the real motive was
behind going after Tornado Cash
and Roman Storm.
Like, why was the US government
not prosecuting drug cartels
in order to prosecute Roman Storm?
I think...
So, that has taken place
under the previous administration.
So, I think
President Trump,
with his administration,
has done tremendous work
in regards to pushing the adoption
of decentralized technology
of really allowing us,
all of the people in that space,
to try to rethink
the financial system
and build this technology
because they've sort of realized that
technological innovation
runs at a faster pace
than legislative processes,
and under the previous administration
that looked very different.
So, I think
that has helped this technology spread a lot.
And it is, however, important
to consider privacy.
And when the executive order banning
CBDCs was signed,
central bank digital currencies,
an explicit reason why
CBDCs should never be adopted
in the United States
was the privacy concern.
Because if we look at all of those
shiny new digital currencies
being built in Europe
and all around the world, I guess,
besides the US,
which is great,
which actually
is amazing, I think,
what we see is that all of them
are surveillance machines
to an even higher degree
than the current financial system already is, right?
It is already a surveillance system.
But what's important
about this next generation
of money is
we are sort of at a crossroads
do we want our money
to enable us freedom,
freedom of economic interaction,
freedom of thought at the end of the day
because whatever we think,
we do, right?
We want to put our money
where our mouth is
or do we want
a monetary system
that enables automatic
subsequent action
based on whatever activity
you perform in your digital life,
which can mean things like
now all of your money is frozen
and you don't have any access to it anymore
because whatever you just did
was deemed as undesirable
by big brother, I guess, right?
So this is literally
the two possible futures that we have.
It's two extremes.
There's no possible future
in between.
And what if
the architects of the...
You're assuming cash is over?
Cash is already
being heavily surveilled, too.
Your bank note has a serial number.
So if you actually think about
something like Tornado Cash,
or, I mean,
there's a lot of applications
that, for example, utilize
Arcium to also bring
this level of privacy, right?
If you think about all of these
systems,
they are, in my mind personally,
I mean, as long as you have an internet connection,
if you don't have an internet connection,
maybe you cannot
spend your money right now,
but as long as that exists,
they are even superior to cash,
because you don't have any serial numbers anymore, right?
With cash, you're surveilled.
Sure. I mean, when I go to the ATM
and withdraw money,
the serial numbers are recorded in some database
and when the merchant
at Walmart, I guess,
or wherever,
puts that into their cash register,
they can also record the serial number.
Is that true?
Yeah, there has been, I read an article
a few months ago
about a tracking system like that within Europe.
So that is very practical.
Yeah.
I'm going to take a magic marker, a pen
and distort the serial numbers on all my cash now.
Yeah, right.
So I mean, it should be,
should still be legal tender, right?
I would think so. I'd never heard of that.
I mean, there could be other tracking mechanisms.
But I've read about this technology
which clearly exists
and is being used
to even turn the cash system
into a surveillance system
and
and it's not even, again,
it's not even just
someone with governmental authority
deciding to surveil people, right?
It is also companies,
companies seeing economic value in surveilling you,
and then utilizing
this new technology, utilizing the internet
to do that
and it boils down to power,
I would say, control, right?
If you have access to as much information as possible,
you can better prepare for the future
and you can predict behaviors
of your users or different actors
and so that's why those systems
get implemented.
So we are at this fork
in the path.
What's the future?
And what the people
that are architecting
those central bank digital currency systems
have realized and that's
so interesting to me
is this old concept
that the cypherpunks in the 1990s
came up with, which is
code is law
which expresses what
has happened with Tornado Cash, I think, nicely,
where
it is the ultimate law, sort of,
when you have this network
that nobody controls and there's some piece of software
and it just executes.
Whatever is written within the software code
executes, there's no way of stopping it
there's no way of doing anything about it
and so that's what they mean when they say
code is law
and the architects of those
alternative systems have realized that
there's so much power
in being able to
let's say take your chat messages
and see that you have said something
against big brother
and big brother doesn't appreciate that, right?
And so automatically now
your money
is frozen and that is
code is law, right?
In the dystopian sense,
it automatically can lock you
out of all of those systems,
and I would much rather
have a utopian future,
but at the end of the day,
from a technological standpoint,
those things are
similar. The only difference
really is
cryptography.
Privacy.
Privacy.
Because you're offering that
on a scale even larger than
that,
whether or not you have prominent investors,
like, you face some risk.
Sure, so I think
what I'm doing with Arcium
at the end of the day
is I'm providing
the most versatile and superior form
in which you can execute a computer program,
right?
Within encryption,
you can execute a computer program,
and you can have
all sorts of things,
you can do things
starting with
financial transfers, right?
You can add privacy to financial systems
but that doesn't just mean
we're adding privacy to
me and you Tucker
interacting with each other
we can also add privacy to
entire markets, right?
Which, again, can also have
downsides. I'm not arguing that
there's only upsides.
There's unethical activity, right?
In the way that people may
interact.
And at its core, it is
neutral technology
but the use cases that
I'm really focused on enabling
also are use cases like
enabling within the health care system
to actually utilize data
that currently
is being stored
but it is being stored in a very
inefficient way where it's isolated, right?
So with my technology, we can take this data
and put it to use without ever
taking ownership of your data,
because you're the patient, you're the human, right?
I have no right to take ownership
over that and I don't need with that technology
because you can consent and say
let's improve health care
or whatever with my data
but you're not getting my data
because it's encrypted, right?
It's a crazy concept
to wrap your head around.
I get it but it enables so much
also on a national security level
and it really is a superior technology.
And I think this example that I
told you earlier about
verifiability, right?
Mathematically
being able to be convinced
that a computer
program, a computation that has been executed
in privacy, right?
Has been executed correctly
is such an amazing concept
and the way
I think about it really is
opening up a new design space
and allowing companies
to do actual innovation
instead of innovating only on the front of
how can I extract as much
as possible from my user by surveilling them?
So I don't really
think about it the way that you framed it,
that I'm building this generalized
computing platform that
can be used by anyone
because I don't have any control over it, right?
I'm not building a controlled
infrastructure.
I'm building open
software that is used
for good.
And I'm grateful that you are,
and I don't at all mean to make you pessimistic
or paranoid, but in so doing
you're threatening
current stakeholders.
Sure but I think
that's always the case
with new technology, right?
Of course, yeah.
I mean when cars first came
along, right? There were unions
of
horse carriage
taxi ride providers that did not
want to see cars on the road.
Of course. So there's always
interests that
try to utilize
both technology and
law to
prevent others from
getting into the system.
Exactly. Of course. Yeah.
The stakes depend entirely on how disruptive the new technology is.
Ask Nikola Tesla.
Yeah. Right.
Sorry. Dark.
But so it's not a concern.
It is not a concern for me.
No.
I wonder if that's just a quirk of your
personality where you're just not afraid of stuff.
That's actually an issue.
I would say I sort of
suffer from sometimes
not being afraid of things, but...
Good. I think you need that
in order to proceed.
So from the perspective of the average American consumer
who's not following this carefully, when does your life begin to look different
as a result of this kind of technology? When will you see this sort of thing in action? How you experience it?
That's actually a brilliant question.
I think
I'm just trying to run numbers in my head and trying to
do something I've never done.
By the way, I've never paused mid-conversation because I've got to run some numbers in my head.
I do this all the time.
I never have.
So I think it will affect your
everyday life positively.
Once I guess there's an
inflection point reached on multiple fronts, right?
I was talking about health care and national security.
Also financial system, right?
But it also, I mean,
so that's a criticism I actually have of Signal,
which is that
there exists one single point of failure within Signal's
technological stack
that I've been vocal about
and I dislike which is that
what they call private contact discovery
where I have a set of contacts
in my contacts on my phone, right?
You do the same thing.
And if there is an intersection
between the two sets that we have,
where I have you as a contact
and you have me as a contact,
I get Tucker suggested on Signal, right?
Only in that case.
How does that work, right?
How does Signal ensure that
those contacts stay encrypted and secure, right?
They use trusted hardware for that.
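The contact-discovery intersection he describes can be sketched in a few lines. This is an illustrative toy only, not Signal's actual protocol: Signal runs the comparison inside trusted hardware precisely because naive hashing of low-entropy phone numbers, as below, can be reversed by brute force.

```python
import hashlib

def hash_contact(phone: str) -> str:
    # Hash each number so raw identifiers aren't compared directly.
    # NOTE: a plain hash of a phone number is NOT private -- the number
    # space is small enough to brute-force, which is why this stays a toy.
    return hashlib.sha256(phone.encode()).hexdigest()

def discover(my_contacts: set[str], their_registered: set[str]) -> set[str]:
    """Return the hashed identifiers that appear in both sets."""
    mine = {hash_contact(c) for c in my_contacts}
    theirs = {hash_contact(c) for c in their_registered}
    return mine & theirs  # the intersection is all either side learns

# Tucker and Yannik each hold a contact list; only the overlap is suggested.
tucker = {"+1-555-0100", "+1-555-0199"}
yannik = {"+1-555-0100", "+49-555-0123"}
assert len(discover(tucker, yannik)) == 1  # the one shared contact
```

The phone numbers here are hypothetical placeholders; the point is only that each party should learn nothing beyond the intersection itself.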
It is a critical flaw within their infrastructure.
So there's technology,
trusted execution environments,
what they're called,
manufactured by Intel, for example.
And this technology comes with this promise of being secure,
of being able to basically do what we're doing with
mathematics, but with trust instead.
So they say, we built a secure machine.
You think we shouldn't trust Intel?
I think so, yes.
I think it's weird to trust Intel.
Yeah, I think it's an insane idea to begin with.
Funnily enough, just last year,
well, not just last year, but over the last ten years,
there have been a myriad
of exploits of the technology.
So in the past,
it has always been sold sort of as,
here's this technology.
And it does verifiability and privacy
and just put your data in that.
There's no backdoor, right?
Of course not.
Why would there be a backdoor?
Why would Intel cooperate with anyone?
Sure.
And you would do that.
And then last year,
there were those researchers that said,
well, if you have physical access to this computer,
you can just read out all of the data.
And you cannot just read out the data,
you can even fake keys.
And then you can perform fake computations
on behalf of other people.
So if you're building a financial system
with a computer like this,
I can just change numbers, right?
And I know what your numbers are,
and I can change those numbers.
And that's not even the core issue I have with that
in the case of signal, right?
So Signal is,
I think, still relying on those TEEs,
or, I think, they run this hardware.
I mean, I hope they run their own hardware,
because at least then
the remaining trust assumption is that,
okay, they will not try to hack those machines,
which is relatively straightforward.
You just connect a few cables at the end of the day,
and then you can exfiltrate the information,
which is the interactions, right?
Is Tucker my contact?
Is Yannik Tucker's contact?
That's very sensitive information.
And so that is a single point of failure,
where they could access the information,
or whoever gets access to that information.
And we're not even thinking about potential backdoors
at that point, right?
Within that hardware.
So within the manufacturing process,
I mean, I think it would be very naive to assume
that there's no backdoor similar to what we talked earlier about
with Dual EC, right?
Or something like the Clipper Chip thing, right?
That was attempted in the 90s.
So it's
very likely, I would say, that there's some randomness tampering,
let's call it that, that could be in place,
because you are literally also getting keys
right from the manufacturing process, right?
So it's this proprietary supply chain
and then they ship that computer to you
and it comes with random keys
that have been generated in that proprietary production line.
So there's many single points of failure
and that's what I don't like about Signal,
because I don't want this information out there, right?
What does my address book look like?
So they can fix that.
They can fix that with technology that we've built, right?
They can use our technology.
I'm more than happy to just give the technology.
I mean, it's open source, right?
And then they can just build this thing
without a single point of failure,
without a way, because this is sort of a reasonable way
for a state also to say,
well, you actually have this data.
Give us that data, all right?
Well, they cannot really argue that they don't have that data
because they could connect a few cables to that computer
and then get that data.
So it's not the secure device that people claimed in the past
that it was.
So I think that is important to resolve.
I actually don't recall how this came to my attention.
I wonder if any big hardware manufacturer
will begin to offer truly secure devices for sale.
It's not worth it, probably, right?
So, I think it is worth it, right?
The military wants to have secure devices, right?
Everyone, I think everyone would rather compute
on a secure device than an insecure device.
But the manufacturers aren't making their money
from the devices.
I mean, they're making money.
I don't know what it costs to make an iPhone less than 900 bucks.
But I mean, it's an annuity like the long, you know,
the second you buy an iPhone,
you're making money for the company every day you use it, right?
Sure, sure.
So I think it is impossible to build secure hardware in that regard,
where those claims of full privacy and security are factually true.
There have been so many techniques,
where you just use so many different tools
to play around with those devices, that it is literally impossible
to implement secure and verifiable systems,
because even to verify them,
you need to take them apart, sort of destroying them in the process.
So that does not exist.
What does exist, however, sort of, is this concept
of decentralization, and that's why it's so powerful,
because it doesn't really matter if this manufacturer here creates a backdoor.
As long as I have 10 different computers or 100 computers
from different manufacturers and there's one that does not have
a full system level backdoor installed,
I am secure under this trust model that we've developed
in our company, right?
So I think that's why decentralization is so important.
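The 1-of-N trust model he's gesturing at, secure as long as a single machine stays honest, is commonly realized with secret sharing. A minimal additive-secret-sharing sketch follows; it is an illustration of the general idea under simplifying assumptions, not Arcium's actual protocol, and the prime and share counts are arbitrary choices.

```python
import secrets

PRIME = 2**61 - 1  # all arithmetic is done modulo a fixed prime

def share(secret: int, n: int) -> list[int]:
    """Split `secret` into n additive shares; any n-1 shares reveal nothing."""
    shares = [secrets.randbelow(PRIME) for _ in range(n - 1)]
    # Final share is chosen so that all shares sum back to the secret mod PRIME.
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares: list[int]) -> int:
    """Only the full set of shares recombines into the secret."""
    return sum(shares) % PRIME

# Hand one share each to machines from different manufacturers.
# Even if all but one machine are backdoored, the colluders' view is
# uniformly random without the last share, so the secret stays hidden.
s = share(42, n=5)
assert reconstruct(s) == 42
```

The design choice mirrors what he says about manufacturers: compromising any subset smaller than all N machines yields only random-looking numbers.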
That was the basis of our political system. It was created
on the same concept.
Power is dangerous and so it has to be spread among
different holders, different entities.
So it doesn't concentrate and kill everybody and enslave them.
That's obviously going away.
But that was the concept of the American Republic.
Yeah, exactly.
And I think it is sort of important to look at surveillance
in the same way where if you have access to surveillance,
you basically have access to unlimited power.
So whatever surveillance system we implement,
be it chat control in the European Union
which I've been very vocally opposed to on X.
And I actually just learned last week that the UK implemented
their version of chat control on the 8th of January,
which is a censorship machine.
And surveillance backdoor, right, installed within all
of your messaging applications.
And it comes with this claim of, well, we're implementing this
because we need to find child exploitation, right?
It's always child exploitation.
They care about the children.
Yeah, I strongly believe that.
So there are basically
four reasons to implement surveillance.
So there's child exploitation.
Yes.
There's terrorism.
Yes.
There's money laundering.
Yes.
And there is the war on drugs.
Oh, the war on drugs.
Those are the four reasons, right?
And they always rotate.
The people engaged in importing drugs into our country,
laundering the money, exploiting the children,
and committing serial acts of terror against their own population.
They're all very concerned.
Oh, man, I really think we now need surveillance.
Not as you say.
Not of us.
Yeah.
But what's so funny is that in 1999,
from a policing working group of the European Commission,
there was a transcript of their discussions.
And literally within the transcript,
when they were talking about implementing digital surveillance systems,
they were like, I think we should switch our arguments
over to child exploitation because that is more emotionally charged, right?
It convinces people.
And so it's not just that for us, it is obvious that that's not what's going on, right?
When the people who covered up the grooming gangs are making that case,
it's like, I don't think it's sincere at this point.
Exactly, right?
So there is a reason why we don't believe that that's the actual reason.
But what I'm arguing for is that that doesn't even matter.
Even if the politicians are convinced that it's about protecting the children,
and that this is the most effective measure to do that, right, to read all of the chats,
what's going to happen, thanks to this being implemented as infrastructure
that exists everywhere,
and there being a small circle of people that have access to this technology,
is that it will get abused.
It is very easy to abuse those systems because the abuse itself happens within secrecy.
So there's no oversight.
And instantaneously, because of the rise in computational power,
it's not like someone has to go to the Stasi archives to read all the files.
Exactly.
And Sam Altman would gladly help you sift through all of it.
Oh, he's a good guy.
By the way, a lot of these businesses draw the worst,
like the most unethical people have the most power in case you haven't noticed.
It's wild.
It is wild, yeah.
I mean, there's an economic function, sort of, to reward this, right,
because if I build an application and you build an application,
and we just provide some value to our user,
and the user pays for that.
Basically, capitalism, right?
All of that works out nicely.
But then you decide what if I take all of this information from my user
and I use that to extract additional value from him, right?
You're way more profitable through that.
So the incentives.
And so then those incentives shift towards that setup,
and these kinds of applications are the ones that receive investment, right?
And so that trend increases.
And so unethical behavior gets rewarded in the system.
Just to be clear about what you're saying.
Are you saying that all texts sent within the UK are now monitored by the UK government?
I'm not 100% familiar with all of the intricacies of it,
the Online Safety Act, I think it's called, in the UK.
What is happening there is that there is censorship being applied to the messages.
So you receive whatever unsolicited image, right?
And then that's being censored.
So what's happening there is,
I think what's important to understand is that censorship is a byproduct
of surveillance generally speaking.
Yes.
So you need to take a look at all messages
in order to censor something, right?
Yes.
And so that's what's happening there.
And even if we assume only the best of intentions,
you have this infrastructure in place that tomorrow can just be abused by someone.
We should test it.
I'm in the UK all the time.
I have family there.
And I'm going to do a double blind study with my wife.
I'm going to text every person in my contact list, overthrow Keir Starmer.
Okay.
Yeah.
To thousands of people, exclamation point.
And she won't.
And we'll see who gets arrested.
Yeah.
That's a great experiment.
Actually, I need to attend a conference in the UK this year.
And it's so funny because a month ago there was this, I think it's also some proposal
that basically specifies that people that work on encryption are sort of persona non grata in the UK.
Something like that.
I think it's not yet implemented, but I saw that on X.
I mean, you can't get in the country if you're for privacy.
Something like that.
Yeah.
Where are we going to, like, big picture?
Where's everyone going to end up?
Do you think?
If the control grid snaps into place and it is snapping into place, where do people go?
Is the US the only place?
So all of those, I mean, we're basically, I would say, not just sliding in that direction, but galloping.
And the infrastructure, it has been quite a while since they started trying to implement those in-your-face
things, right, where it's literally called chat control.
I mean, imagine how crazy that is.
It's literally stating every single messaging platform, email, whatever we need to scan for this made up reason.
But trust us, we will only do that for this made up reason and no other reason.
And it happens on your device, right?
So that's why, they say, end-to-end encryption is not undermined, because it is being scanned on your device, right?
And that's very different from putting microphones in your bedroom.
Trust is very, very different.
Yes.
I mean, I think people don't realize the extent to how surveillance is possible nowadays.
So with Wi-Fi routers, you can determine movements within your apartment, right?
And so there was this one company.
I mean, there wasn't a big scandal. It was literally just, I don't know if you're familiar,
I think he's called Louis Rossmann, who's a YouTuber from New York who was fighting for the right to repair devices and stuff, right?
He's always been very much leading and educating those efforts.
And so he just made this video where he went through the privacy policy of some internet service provider.
And the privacy policy explicitly stated that they're allowed to monetize the movement data that they get from those devices that you put in your home.
And the funny thing about this case that he was highlighting is that for you as a person that lives in this building,
you didn't even have an option to choose a different internet service provider because with, I guess, bulk agreements between a landlord and the internet service provider,
you are forced to have those routers, and those routers aren't even within your apartment; they're in the walls or somewhere.
And so you're just being scanned within your most intimate area of life, your home, by your internet service provider.
And what about phones listening to people, the microphone on the phone or the camera on the phone, taping you?
So there's an interesting concept of ultrasound listening on those phones, where basically you have a TV advertisement.
And we don't hear ultrasound, right? But your phone, with its microphone, could hear it.
I don't know if it's ultrasound or whatever frequency exactly, right?
So within that advertisement, they play that sound, so your phone can pick it up.
And then when you go to their fast food restaurant on the same day, they know that this advertisement has worked, because your phone previously registered it.
So there have been a lot of attempts like this. I think that surfaced a couple of years ago.
In this case, I don't recall the exact name of the technology, but there were actually court cases against that, where they required the company that offered the technology to make users aware that this was happening, because a lot of apps had this technology installed, and they had microphone permissions, and they just installed this library.
Because maybe that library pays the app developer some money, right? And at the end, it is tracking you.
So what I'm just trying to say is there's an infinite amount of ways you can be tracked.
I mean, funnily enough, last year in the US, there were those cases surfacing surrounding city surveillance cameras, around 40,000 of which I think exist in the US.
And those cameras, or also license plate readers, right, all of that, are incredibly smart, equipped with artificial intelligence to directly track the faces of humans.
And there was this one YouTuber, Benn Jordan, who actually exposed that, and funnily enough, after exposing that, got private investigators from said company sent to his home to, I guess, fully destroy his privacy.
So I think he helped expose that none of these cameras were encrypted. They were recording whole cities across the US permanently, 24/7, storing everything en masse.
Meanwhile, anyone could, via a Google search and some specific query, get access to the camera feed and see what's going on, and he showed videos of playgrounds where children were playing, right.
And so that's what I mean when I say that surveillance does not bring us safety or security it is in most cases doing the opposite.
It's also all networked. It's digital and networked, so that means that companies can pull up CCTV cameras from around the world.
Oh, yeah, anyone. Facial recognition.
Yeah, anyone can. I mean, what I really found so striking about the story is him outlining how he was able to follow people around, right.
He was able to say, oh yeah, they went to church here on Sunday, and then they went there for shopping, right.
And, I don't know, you as a human being, there was this one video of an adult man just going onto a completely empty playground and just hopping onto the swing and swinging there, right.
If this person knew that he was being watched, he would never have done that, right.
And so this idea of escapism is entirely impossible in a world like this, because there is no escape.
Yeah, also with license plate readers, which aren't license plate readers. They are surveillance cameras that pretend to only do a specific function.
What other functions do they do?
I mean, record everything, and be able to track cars even if they don't have a license plate.
So it cannot be a trusted license plate reader if one of its capabilities is to also identify cars that don't have a license plate, right.
Fair. So, I just recall one case where there was a police officer who then used this access to the technology to stalk his ex-girlfriend, right, which is inevitable with this kind of technology, if you put that power into the hands of individuals who can use this technology in secrecy, right.
It's not like dropping a nuclear bomb on a country. People will notice that, right.
Mass surveillance nobody notices.
If people made it two hours into this interview, they're obviously interested in you. First, can you pronounce and spell your name?
Yannik Schrade. Y-A-N-N-I-K, S-C-H-R-A-D-E.
The name of your company, and its spelling?
Arcium. A-R-C-I-U-M.
How do you speak English as fluently as you do, since it's your second language?
It's funny, because as a child, when I was in high school, there were phases where, because I was consuming so much English content on the Internet, I was consciously thinking in English, right, as a child.
Yeah, I would say that.
You're on Twitter. Where else can people go to read your views on technology and privacy?
Mainly on my Twitter, and I also have a small website, just my personal website.
I don't have a blog there. I write all of my articles basically on Twitter. Sometimes I get the chance to publish my views on some very niche news outlets in Germany.
But most news outlets don't really care about privacy, so I stick with X, and I really like talking on X, sharing my thoughts on X, writing articles there.
Right, when I talked about chat control specifically on X... And it's so funny, we haven't even touched on the fact of how chat control is aimed to be implemented in the European Union with the current proposal.
So what happened is that there was this proposal where they said all providers need to have chat control, which is so-called client-side scanning, right.
Tucker's phone is going to check the message that Tucker is sending right now, whether that message is illicit under some definition, and if so, then it's going to send a message to the police. That is what client-side scanning is.
In its most, I guess, innocent form, it would just be, we're going to censor the message, because, I don't know, child exploitation or whatever made-up reason, right, so we're going to censor that message.
The worst case would be, we're going to forward that message. And that's what they had, and that received a lot of backlash, also thanks to Elon Musk, and didn't pass.
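Mechanically, the client-side scanning he describes is just a check that runs on the device before the message is encrypted. A toy sketch under simplifying assumptions: the blocklist contents are hypothetical, and real proposals use perceptual hashing of images (not the exact byte-hash matching shown here).

```python
import hashlib

# Hypothetical blocklist of hashes of known "illicit" content (illustrative
# only; real systems use fuzzy perceptual hashes, not exact SHA-256).
BLOCKLIST = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}

def client_side_scan(attachment: bytes) -> str:
    """Runs ON the sender's device, before encryption -- so the scan sits
    outside the end-to-end-encrypted channel entirely."""
    digest = hashlib.sha256(attachment).hexdigest()
    if digest in BLOCKLIST:
        # "Innocent" form: silently censor the message.
        # Worst case instead: also report sender and plaintext to authorities.
        return "censor"
    return "send"

assert client_side_scan(b"known-bad-image-bytes") == "censor"
assert client_side_scan(b"holiday-photo") == "send"
```

The sketch shows why "end-to-end encryption is preserved" is a misleading defense: whoever controls the blocklist decides what the device inspects before encryption ever happens.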
And then, as you would expect, shortly after, I think it was less than a month, they came back with a new proposal, and the new proposal made it voluntary.
So the new proposal basically states, hey, Mark Zuckerberg, do you want to voluntarily add a surveillance mechanism to your applications? Which is insane, right, because of course companies will voluntarily implement those surveillance mechanisms. But if you go down the different paragraphs in that proposal, what you will realize is that it is in fact not voluntary.
What you realize is that in order to combat child exploitation in Europe, terrorism, money laundering... Yes, yes.
So in order to do that, they're going to introduce a new bureaucratic agency that is tasked with risk-assessing different platforms, right. So we're going to look at Signal, we're going to look at WhatsApp, we're going to look at Gmail, every single platform, we're going to do a risk assessment.
We're going to risk-assess, and then we're going to be like, how risky is the platform? If it's risky, then we apply coercive measures, and they need to implement, I guess, all measures to combat whatever illicit activity is targeted, which in the case of child exploitation explicitly means scanning messages, because that's the only thing you can do, right.
And so it is not voluntary after all, because it explicitly says that if you don't want to land in the high-risk category, just voluntarily scan, and then you're not in that category. In the US, that's called extortion. Yeah. You don't have to give me your money, but I'll shoot you.
Yeah, but it's voluntary, so it's your choice, yeah.
Last question. You're 25 years old, which is remarkable. Where do you imagine you'll be at 45?
At 45? You mean what will I be doing?
What will the world look like?
What the world will look like.
I'm a very optimistic person. So while there are those two trajectories, right, that I think not just the United States but humanity in general will take, one of those, I strongly believe that we will be able to move in the utopian direction instead of the dystopian direction. And so what that means for what I need to achieve is I need to
not just tell people about the importance of this, right. People sort of know that privacy is important, right. I think most of your audience realizes that, otherwise I feel like they wouldn't be listening to you. So it is, of course, about education and stuff, but more importantly, and that's this core realization that I had, is that privacy is only going to get adopted if it enables strictly superior technology.
And so that's what I'm doing. That's the mission. That's what I'm doing with Arcium: to enable a situation in which you have to adopt it, sort of, because it would be foolish not to do so.
And so that's what I'm trying to do, and I think we can end up in a world like this, because that's what it needs. You're exactly right. It's not enough to say we're not fully human without it.
Yeah, the board of directors is going to say, well, yeah, but look at the returns. Exactly, right, yeah.
I can't thank you enough. If our viewers knew how this interview came about... I'm not even going to say how this interview came about, but it was through a series of
chance encounters that just really felt like the hand of God. So thank you very much for doing this.
Thanks for having me, Tucker. I appreciate it.
The Tucker Carlson Show



