
Let's be honest, buying cannabis shouldn't be complicated, sketchy, or low quality.
That's why I want to tell you about mood.com. That's M-O-O-D.com.
Mood ships federally legal cannabis straight to your door.
No medical card, no hassle, and here's the kicker.
The quality is better than anything you'll find at your local dispensary.
Yeah, I said it. Whether you're into edibles, concentrates, flower, or just looking to explore,
you'll find it all at mood.
And it's not just the variety that makes them stand out.
Every product is sourced from small American-owned family farms
that care deeply about what they grow.
It's cannabis you can trust, delivered discreetly, and ready to elevate your mood.
And because you're a listener, you get 20% off your first order.
Just head to mood.com. That's M-O-O-D.com to get started.
Tyler Reddick here from 23XI Racing. Victory Lane? Yeah, it's even better with
Chumba by my side. Race to ChumbaCasino.com. Let's Chumba.
No purchase necessary. VGW Group. Void where prohibited by law. T&Cs apply. 21+. Sponsored by Chumba Casino.
What if, right? What if the most analyzed, the most debated tech magnate in the world
isn't actually competing in the industries we think he is?
Right. I mean, think about it. Well, everyone else, you know,
Microsoft, Google, Amazon, they are all aggressively playing chess on this incredibly crowded board,
fighting over the exact same scarce resources.
The same chips, the same power grid. Exactly. And meanwhile, Elon Musk is just
off in the corner building an entirely new board from scratch.
It really fundamentally changes how you look at the whole tech landscape.
I mean, he isn't trying to outmaneuver the titans by, you know,
writing slightly more efficient code.
Right. We're getting a fractionally better enterprise contract.
Yeah, exactly. Yeah.
He is creating a completely separate game where the existing rules of scarcity,
software integration, all of that, it just simply does not apply.
Well, welcome to Thrilling Threads. And if you are joining us today,
we know exactly who you are. We do.
You're the kind of listener who doesn't just want the morning headlines, right?
You want to understand the actual underlying architecture of where the future is heading.
So whether you are catching up on the absolute latest tech trends for, you know,
a massive strategy meeting, or you're just insanely curious about how all these seemingly
disconnected pieces of the modern world actually fit together, consider this your shortcut
to a massive aha moment.
And the analysis we're unpacking today is, honestly, truly sweeping.
It connects dots that most financial analysts and even tech reporters
don't even realize are on the same piece of paper.
Yeah, they're looking at the trees and missing the forest.
So our mission today is to thoroughly unpack this fascinating,
comprehensive framework we've received.
And the core thesis is wild.
It really is. The thesis is that Elon Musk's ventures, right?
Tesla, SpaceX, xAI, Optimus.
They are not a portfolio of separate siloed companies, which is how everyone treats them.
Right. But they are actually a single, unified, vertically integrated,
physical AI system.
Yeah.
And we are going to explore every single layer of this sprawling ecosystem today,
from orbiting solar-powered data centers in space to, you know,
cars parked in your driveway that moonlight as office workers.
It sounds like sci-fi, but it's happening.
We want to see exactly how each piece feeds all the other pieces.
So where do we even begin with an architecture that is this massive?
Well, I think we have to start with the fundamental paradigm shift
in how artificial intelligence actually interacts with the world.
Okay.
The overarching idea here is changing the playing field entirely, right?
Yeah.
And the most vivid illustration of this is something that was announced fairly recently,
back on March 11th.
Oh, right. The joint project.
Yeah, between Tesla and XAI.
Internally, they refer to it as Macrohard.
Or sometimes Digital Optimus.
Macrohard.
I mean, I do love that.
It's a very cheeky, literal inversion of Microsoft.
Very on-brand.
Yeah, totally.
So what exactly is this macro-hard initiative attempting to do?
So it's an AI agent designed to essentially do the work of an entire software company.
Wow.
But to understand why this is a completely different game,
we have to look at how the rest of the industry,
you know, the Microsofts, the Googles, Anthropic,
how they are currently building their AI agents.
When you use, say, Microsoft Copilot or Claude to automate a task,
how is that AI actually talking to the software?
Well, it's communicating through the back end.
It uses APIs, application programming interfaces.
Exactly.
It's basically reading the underlying code of the software
to figure out what data is there.
And then it injects its own code to make actions happen.
Right.
So the developers of the AI and the developers of the software
have to explicitly build those bridges
so the two systems can even talk to each other.
And building those bridges is incredibly labor intensive.
I mean, you have to write custom integration code
for every single application you want the AI to interact with.
It's exhausting.
Yeah.
And if an app updates its UI or changes its API structure,
the bridge breaks.
The developers have to scramble to fix it.
It's just brittle.
Very fragile.
Musk's approach with Digital Optimus
throws all of that out the window.
Wow.
Instead of building an AI that reads back end code,
he is building an AI that functions exactly
like a human being does.
OK.
So it's the difference between, say,
hiring a team of interpreters to translate
every single local dialect in the world for you
versus just taking a magic pill
that lets you universally understand human intent.
That's a great way to put it.
If an API-based AI is a train,
it's stuck on pre-determined tracks, right?
Yep.
It can only go where the custom integrations
and the specific code have already been laid down.
Fast on its specific route, sure,
but completely useless if you want to go off course.
Exactly.
So how does Digital Optimus go off-road?
By using pure vision.
Digitally speaking, it sees what you see.
OK.
It continuously processes the last five seconds
of screen video, taking in the visual layout
of a desktop, a website, or a proprietary enterprise app.
Right.
And it executes actions using a digital mouse
and a digital keyboard, exactly like a human operator would.
No custom integrations.
No back-end APIs.
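That "last five seconds of screen video" idea is easy to sketch as a rolling frame buffer. A minimal Python sketch, assuming a 10 fps sampling rate (the episode doesn't state one; both numbers below are illustrative):

```python
from collections import deque

FPS = 10          # assumed sampling rate, not a stated spec
WINDOW_SEC = 5    # "the last five seconds of screen video"

class FrameBuffer:
    """Keeps only the most recent WINDOW_SEC seconds of screen frames."""
    def __init__(self, fps=FPS, window_sec=WINDOW_SEC):
        self.frames = deque(maxlen=fps * window_sec)

    def push(self, frame):
        self.frames.append(frame)  # oldest frame drops off automatically

    def context(self):
        return list(self.frames)   # what the vision model would see

buf = FrameBuffer()
for i in range(120):               # 12 seconds of frames at 10 fps
    buf.push(f"frame-{i}")

print(len(buf.context()))          # 50 -> only the last 5 s survive
print(buf.context()[0])            # frame-70
```

The point of the sketch: the agent never stores the whole session, just a sliding visual window it can reason over.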
I have to admit, I struggle with this a bit.
How so?
Well, I understand the elegance of the idea.
But navigating a chaotic computer desktop
purely by looking at it seems incredibly difficult for a machine.
It is.
You know, Ashok Elluswamy, Tesla's VP of AI.
He's quoted saying, it's so obvious you can solve this
with cameras.
It's an AI problem.
But isn't analyzing live, high-resolution video
of a computer screen vastly more compute-intensive?
Oh, absolutely.
And frankly, more prone to hallucination
than just reading a clean text feed from a structured API.
Like, why take the harder road?
Because if you solve the harder road,
you achieve absolute universality.
Meaning what, exactly?
Think about it.
The digital world, our computer interfaces, our software,
websites, it was all built for human eyes and human hands.
Right.
We didn't build graphical user interfaces
for machines to read in the background.
We built them with buttons, dropdown menus, visual cues
that are meant for biological processing.
That makes sense.
So if you teach the AI to look at a screen and understand,
ah, that red rectangle is a submit button,
and that blinking line is the text field,
then that AI can suddenly use every single piece of software
ever created, past, present, and future.
Without writing a single line of custom integration code.
So if a company updates their app tomorrow
and moves the button to the left side of the screen,
the AI just sees it moved and clicks it anyway,
just like a human would.
Precisely.
You bypass the entire API bottleneck.
Now, regarding your point of a compute intensity,
you are entirely correct.
Yeah, video is heavy.
Analyzing live video is immensely heavy,
and this is where the vertical integration really
starts to show its teeth.
OK, lay it on me.
Digital Optimus isn't running on some generic cloud server.
It runs on Tesla's in-house AI4 hardware, which they manufacture
themselves for only about $650 per unit.
Wow, that cheap.
Yeah, it processes the visual input locally,
right there on the chip, and only calls out
to xAI's massive Grok cloud when it needs heavy strategic reasoning.
What they call system 2 thinking.
Exactly.
So the local chip handles the reflexes,
like where the mouse is, what the icon looks like,
and Grok handles the logic of what the task actually is.
You got it.
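That split, reflexes on the local chip and strategy in the cloud, amounts to a simple routing decision. A toy sketch; the task names and routing rule are illustrative, not from Tesla or xAI:

```python
# Hypothetical two-tier dispatch. Only the local-vs-cloud split itself
# comes from the episode; everything else here is made up to illustrate it.
LOCAL_TASKS = {"locate_cursor", "find_icon", "click", "type_text"}

def dispatch(task: str) -> str:
    """Send perception/motor 'reflexes' to the on-board chip,
    heavy 'System 2' reasoning to the cloud model."""
    return "local:AI4" if task in LOCAL_TASKS else "cloud:Grok"

print(dispatch("find_icon"))              # local:AI4
print(dispatch("draft_quarterly_plan"))   # cloud:Grok
```

The design choice being modeled: cheap, latency-sensitive work stays on hardware you already paid for, and only the rare, expensive reasoning calls leave the device.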
But the hardware aspect leads to an extrapolation
that, honestly, it sounds like science fiction.
There is a stated vision that your personal Tesla,
while it is just sitting parked,
can actually do this office work for you.
Right.
If you follow the logic of the hardware,
it makes perfect sense.
How so?
Every single Tesla equipped with this AI4 computer
is essentially a highly capable, localized inference node.
OK.
When you aren't actively driving it,
that computer is just sitting idle in your garage
or parking lot.
Doing nothing.
Doing absolutely nothing.
So the vision is to turn those millions of parked cars
into a massive distributed compute network.
So wait, while I'm sleeping,
the computer inside my car boots up Digital Optimus
and just starts churning through digital office tasks.
Yes.
And think about the contrast here.
Everyone else in the tech industry right now
is fighting tooth and nail to build
massive centralized server farms for cloud compute.
Right.
Data centers.
They are begging local governments for power permits.
Meanwhile, Tesla already has millions
of these highly capable compute nodes
deployed all over the world.
Yeah.
And the best part, they were paid for by the consumer.
Oh, wow.
That is wild.
And to power them for heavy compute tasks,
they are deploying millions of dedicated units
at supercharger stations.
Those stations represent roughly seven gigawatts
of available deployed power.
Seven gigawatts.
Yeah.
To put that in perspective for you listening,
one gigawatt can power a mid-size city.
That is a staggering amount of energy infrastructure
just sitting there.
It really is.
So instead of building a multi-billion dollar data center
from scratch in the middle of a desert somewhere,
he just taps into the computers
that people have already bought,
which are plugged into a power network he already owns.
It's a completely decentralized supercomputer.
Yeah.
But we have to ask the critical question here.
Which is?
How does a car company get so absurdly good at computer vision
that they can confidently apply it to complex office software?
Right.
Because navigating a cluttered spreadsheet is hard,
but navigating a cluttered intersection in Mumbai
is a nightmare.
Exactly.
And that's the secret, isn't it?
They've been training this visual intelligence
on the chaotic physical streets for years.
Which brings us to the foundation
of this entire ecosystem.
Hello, it is Ryan.
And we could all use an extra bright spot in our day, couldn't we?
Just to make up for things like sitting in traffic,
doing the dishes, counting your steps, you know,
all the mundane stuff.
That is why I'm such a big fan of Chumba Casino.
Chumba Casino has all your favorite social casino style
games that you can play for free anytime, anywhere with daily bonuses.
So sign up now at ChumbaCasino.com.
That's ChumbaCasino.com.
No purchase necessary. VGW Group. Void where prohibited by law.
21+. Terms and conditions apply.
We have to completely reframe how we view Tesla as a company.
Yeah, I think people still get this wrong.
Most financial analysts still value Tesla primarily
as an automotive company that happens
to have some cool AI features tacked onto the dashboard.
Hardware company.
Right.
But the reality is the exact inverse.
Tesla is an AI company that happens to sell cars to fund its data collection.
So if they are an AI company, what is their proprietary data?
Video.
Video, right.
Every single Tesla on the road is equipped with eight cameras
capturing continuous high-definition video of the real world.
Every single one.
Every time someone drives to the grocery store,
those cameras are feeding the largest real-world AI training data set on Earth.
And the scale of this data set is, frankly, difficult to comprehend.
The full self-driving or FSD fleet has
logged over 8.4 billion cumulative miles.
8.4 billion miles.
But that's a gift.
I mean, the Earth is roughly 24,000 miles in circumference.
We're talking about driving around the entire planet hundreds of thousands of times.
It's not just highway driving in sunny California, either.
It's snowstorms in Minnesota, deer jumping out in rural Texas,
chaotic pedestrian traffic in Manhattan.
And the growth rate is purely exponential.
They went from 6 million miles in 2021 to 4.25 billion by the start of 2025.
OK.
Then, in just the first 50 days of 2026 alone,
they added another 1 billion miles.
A billion miles in 50 days?
Yeah.
That means they are capturing every conceivable edge case out there.
Think about the weirdest things you've seen while driving.
Oh, there's always something.
Like a person walking a dog while riding a unicycle in a chicken suit.
A Tesla camera has almost certainly seen it, categorized it,
and trained the neural network on how to react to it.
That volume of edge case data is the ultimate mode.
Achieving safe, unsupervised self-driving at scale requires an estimated 10 billion
mile benchmark just to iron out those statistical anomalies.
Right.
At their current pace, they will likely cross that threshold this year.
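That pace claim can be checked against the episode's own numbers:

```python
miles_logged = 8.4e9       # cumulative FSD miles stated
benchmark = 10e9           # stated bar for safe unsupervised driving
daily_rate = 1e9 / 50      # "another 1 billion miles" in the first 50 days of 2026

print(daily_rate)                                # 20000000.0 miles per day
print((benchmark - miles_logged) / daily_rate)   # 80.0 days to the threshold
```

So at 20 million miles a day, the remaining 1.6 billion miles take about 80 days, which is consistent with "cross that threshold this year."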
Now, Waymo, Cruise, Zoox, these companies are also trying to build self-driving cars.
They are.
But they are playing a fundamentally different game, right?
They aren't trying to solve the vision problem the way Tesla is.
Not at all.
They are relying on a completely different architecture.
Yeah.
The competitors use expensive LiDAR sensors,
which shoot thousands of lasers out to create exact 3D maps of the environment.
So they use lasers, not just cameras.
Right.
And they combine those lasers with heavily pre-mapped environments and rigid geofenced zones.
They are solving autonomous driving the city by city, map by map.
Which goes back to the train analogy from earlier.
Exactly.
A Waymo car is a train on highly specific tracks.
As long as the car stays within the perfectly mapped, perfectly laser-scan zone of downtown
Phoenix or San Francisco, it operates beautifully.
Flawlessly, usually.
But if a construction crew alters an intersection overnight, or if you ask it to drive on a dirt
road in Montana that hasn't been pre-mapped, the system encounters the scenario it doesn't
understand and it just stops.
It's a brittle system.
Tesla conversely relies on pure vision.
No expensive laser sensors, no pre-mapped geofences, just cameras and a neural network.
So Waymo is trying to solve the specific task of driving.
Yes.
But Tesla is trying to solve the universal challenge of vision and general spatial intelligence.
Exactly.
Because if you solve real-world navigation using only cameras, you haven't just built
a chauffeur.
What have you built?
You've built an artificial intelligence that inherently understands physics, depth, object
permanence and human behavior.
Wow.
It knows that a ball rolling into the street means a child might be chasing it.
But the industry debate around this is fierce.
Very fierce.
And I'll play a devil's advocate here, because the criticism of pure vision is heavy.
Go for it.
Later provides exact mathematical distance.
It shoots a laser, it bounces back.
The computer calculates the time of flight and it knows exactly to the millimeter how far
away that pedestrian is.
That's true.
The vision, however, relies on inference.
The computer has to guess the distance based on flat 2D pixels.
It has to use parallax and temporal memory to infer 3D depth.
When human lives are at stake at 70 miles per hour, isn't relying on inference incredibly
risky compared to the certainty of a laser?
It's the central tension of the autonomous vehicle race.
The defense of pure vision anchors back to biological equivalents.
Biological equivalents.
Yes.
The whole world was built for humans, and humans navigate it without lasers shooting out
of our foreheads.
That's a good point.
We drive using two optical sensors, our eyes, and a biological neural network behind them,
our brain.
We infer depth, speed and trajectory purely from visual data and we do it remarkably well.
We do most of the time anyway.
Right.
So the argument is that if a biological neural network can do it safely, a digital neural
network trained on 10 billion miles of diverse global data can do it even safer.
And without the crazy hardware cost.
Exactly.
Without the crushing hardware costs and geofencing limitations of LiDAR.
It's the ultimate bet on software over hardware.
And if that bet on pure vision pays off, the economic implications are just wild.
They're staggering.
Because if you have millions of data collecting AI robots on wheels rolling around the world
and suddenly you prove they don't need a human driver at all, the entire economic model
of car ownership flips completely upside down.
Which brings us to the concept of transportation as a service.
Yeah.
And what is widely misunderstood about the robot taxi model.
Yeah, let's talk about robot taxi.
Whenever the media covers it, the framing is almost always Tesla versus Uber.
Always.
It's portrayed as a battle of ride hailing apps competing for market share.
But that framing completely misses the forest for the trees, doesn't it?
It really does.
It ignores the fundamental inefficiency of the modern automobile.
Cars as they exist today are arguably the most inefficient assets in our society.
Inefficient how?
Well, the average car sits idle for roughly 22 hours a day.
22 hours.
Yeah, it is a rapidly depreciating piece of heavy machinery that just sits in a driveway
or a parking garage, bleeding money in the form of insurance, maintenance, and loan payments
every single month.
It's a very expensive two-ton metal brick taking up valuable real estate 90% of the time.
Precisely.
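Those two figures are consistent with each other, as a one-line check shows:

```python
idle_hours = 22  # stated average idle time per day

print(round(idle_hours / 24 * 100, 1))   # 91.7 -> "90% of the time" parked
```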
The robotaxi plan aims to take the seven million Teslas already on the road and transform
them from depreciating liabilities into revenue generating assets.
OK, how?
When you aren't using your vehicle, you tap a button on your phone and the car wakes up,
unplugs itself, and drives off to provide rides to other people on the network.
The proposed economics of this are fascinating.
They are.
The owner of the car keeps 75% of the revenue generated.
While the network, Tesla takes 25% to cover the routing software, network management,
and insurance.
The estimate suggests an owner could earn anywhere from $10,000 to $50,000 a year depending
on their location and how often they let the car work.
Consider the financial relief that offers the average consumer.
If your car is generating even $10,000 a year while you are asleep or sitting at your
office desk, the vehicle is literally paying for its own financing and insurance.
It transitions from an expense to an income stream.
Exactly.
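The stated 75/25 split implies how much gross fare revenue the car has to generate to hit those earnings numbers:

```python
OWNER_SHARE = 0.75   # owner keeps 75%, network takes 25% (stated)

# To net the stated low-end $10,000/year, gross fares must reach:
gross_needed = 10_000 / OWNER_SHARE
print(round(gross_needed, 2))           # 13333.33

# Back-check the split: the network's 25% cut of that gross is
print(round(gross_needed * 0.25, 2))    # 3333.33
```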
And from a corporate perspective, the scale is an absolute cheat code, a cheat code.
Think about it.
Even if only 10% of current Tesla owners opt into this network, just a 10% adoption rate,
that immediately yields a fleet of 700,000 autonomous vehicles operating globally.
And what is the capital expenditure to acquire those 700,000 vehicles?
Zero.
Zero.
The consumer already bought the hardware.
Uber and Waymo have to go out, raise capital, purchase expensive, specialized vehicles,
outfit them with tens of thousands of dollars of lidar equipment, and deploy them city by
city in a highly capital intensive rollout.
It's painfully slow.
But Tesla simply pushes an over-the-air software update to cars that are already sitting in
people's driveways.
It is an instantaneous distributed deployment.
The Airbnb model taken to its absolute extreme.
Oh, totally.
Airbnb let you monetize your spare bedroom, but you still had to wash the sheets and, you
know, interact with the guests.
Right.
With this, your spare bedroom unplugs itself, leaves your house, generates cash flow all night,
and parks itself back in your driveway before you even wake up.
This completely alters the trajectory of personal transportation.
It's not just about undercutting Uber's pricing, though the forthcoming CyberCab is designed
to do exactly that.
Oh, the CyberCab.
Yeah, the CyberCab, which is a purpose-built Robotaxi entering production next month.
It will cost under 30,000 dollars to manufacture.
It won't even have a steering wheel or pedals.
Right.
No pedals at all.
None.
And operating costs are projected at under 20 cents per mile.
Under 20 cents a mile.
That is up to 90% cheaper than a current Uber or Lyft ride.
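The "up to 90% cheaper" figure implies the comparison rate. We assume a typical ride-hail cost of about $2.00 per mile here, since the episode doesn't state Uber's exact price:

```python
cybercab_per_mile = 0.20   # stated CyberCab operating cost
ridehail_per_mile = 2.00   # assumed typical Uber/Lyft rate, for illustration only

savings = 1 - cybercab_per_mile / ridehail_per_mile
print(savings)   # 0.9 -> a 90% reduction
```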
Yeah.
If point-to-point transportation becomes that radically cheap and instantly accessible, why would
anyone living in a suburb or a city ever buy a personal car again?
That is the ultimate disruption.
Transportation as a service makes personal car ownership optional, even financially irrational
for many people.
Absolutely.
It's the ability to eliminate the need for personal auto insurance, routine maintenance,
and home parking spaces.
The ripple effects through urban planning alone, like turning massive concrete parking lots
into housing or green spaces, are staggering.
It's a total re-architecting of modern society.
It is.
And yet, incredibly, solving the autonomous car is only the beginning of the vision.
Just the beginning.
Because if you have successfully developed an AI brain that is sophisticated enough to
navigate the unpredictable chaos of a busy city street, and versatile enough to manage
a digital computer desktop, what happens when you take that exact same generalized intelligence
and give it arms and legs?
You get Optimus.
And the true manifestation of a unified physical AI ecosystem.
When most people think of humanoid robots, right, their minds immediately go to Boston
Dynamics.
We've all seen the viral videos of the Atlas robot doing backflips, parkour, and dancing
to pop songs.
It's brilliant engineering.
Very impressive hardware.
But comparing Optimus to Boston Dynamics is fundamentally flawed, isn't it?
Entirely.
Boston Dynamics builds standalone, highly specialized robotic products.
They are incredible pieces of mechatronics, but they are bespoke machines built for
specific physical tasks.
Right.
Optimus is not a separate product in the traditional sense.
It is quite literally the physical manifestation of the car's AI.
So the car and the robot are sharing a brain.
They were running the exact same end-to-end neural network.
They use a single neural world simulator for training.
Oh, wow.
The underlying architecture that takes visual input from a camera on a car's bumper and
translates it into steering and braking commands is simply adapted to take visual input from
the cameras in the robot's head and translate it into arm, hand, and leg movements.
That is a staggering concept.
That means the 8.4 billion miles of driving data we discussed earlier.
That massive mode of data isn't just making the cars better at driving, it's actively
making the humanoid robots smarter.
Yes, because the fundamental challenge of both driving and robotics isn't the specific
task itself.
The challenge is understanding the physical world.
By navigating billions of miles of roads, the AI has learned the universal rules of physics.
It understands gravity, momentum, depth perception, how solid objects interact, and
how humans move unpredictably through space.
I understand the theory, but help me bridge the practical gap here.
If the AI learns how to successfully identify and stop for a pedestrian at a busy crosswalk
in Chicago, how does that specific automotive knowledge translate to helping an Optimus
robot fold laundry in my living room or assemble a battery pack in a factory in Texas?
It comes down to generalized spatial awareness.
When the car learns to stop for a pedestrian, it isn't just learning a rigid rule that
says stop at white lines.
It is learning complex spatial reasoning.
It is learning to identify a 3D object moving through a 3D environment, predict its physical
trajectory, and manipulate its own physical form, the car, to interact safely with that
object.
That underlying spatial reasoning is identical to the reasoning required for a robot hand
to reach out, grasp a delicate piece of fabric, understand the fabric's physical properties,
and fold it without tearing it.
It's all just manipulating physical matter in 3D space based on visual input.
And because they are sharing this unified neural brain, it creates an unstoppable flywheel
effect.
The flywheel is the core driver of the valuation model here.
Let's trace the loop.
Let's do it.
You have millions of cars on the road collecting edge case data.
More cars equals more varied data.
More data trains a vastly smarter generalized AI.
Because they share a brain, that smarter AI instantly creates smarter, more capable
Optimus robots.
Those smarter robots are then deployed directly into Tesla's own factories to manufacture things
cheaper, safer, and faster.
And what are those robots manufacturing there, building more cars and more robots?
Exactly.
Which then deploy into the world to collect even more data, spinning the flywheel faster
and faster, driving costs down at every revolution.
The scale of this robotic deployment is already underway.
Over 1,000 Gen 3 Optimus Robots have already been deployed internally.
Wow.
Furthermore, they are currently converting the Fremont factory's Model S and X lines to
produce 1 million robots per year.
One million.
And with the massive Giga Texas facility, they are targeting a staggering 10 million robots
per year with the upcoming Gen 4.
10 million humanoid robots a year.
It's almost impossible to visualize that level of manufacturing.
It is.
But there's a hard physics limit to this robotic flywheel, isn't there?
Hey, it's Cole Swindell.
After I give everything I've got to land a perfect vocal, I usually take five before
jumping into the next track.
And I've learned exactly how to recharge in that time.
Some folks grab coffee.
I hit a quick Chumba spin.
Next thing you know, the break is just as fun as laying down the track.
A better break makes for a better take.
Need a break?
Let's Chumba.
No purchase necessary. VGW Group. Void where prohibited by law. 21+. T&Cs apply. Sponsored
by Chumba Casino.
You can build 10 million humanoids, you can deploy millions of autonomous cars, and you
can run Digital Optimus on millions of parked computers.
But all of that artificial intelligence requires a terrifying, almost incomprehensible amount
of computational power.
Yes.
And you can't just plug all of that into the wall.
The electrical grid is already tapped out.
This brings us to a massive bottleneck that threatens this entire vision, the global energy
crisis.
If you look at the macro trends in technology right now, energy is the only true currency.
Everyone is building colossal AI data centers to train their models.
And they are already straining regional power grids.
How bad is it?
US data centers are currently drawing 41 gigawatts of power.
That represents 150% increase in just the last five years.
To put 41 gigawatts in perspective, that rivals the output of all the nuclear power plants
in the United States combined.
Just for data centers.
It's for data centers.
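The "150% increase in five years" figure lets you back out where consumption stood at the start of that window:

```python
current_gw = 41    # stated current US data-center draw
increase = 1.50    # "150% increase in just the last five years"

baseline_gw = current_gw / (1 + increase)
print(round(baseline_gw, 1))   # 16.4 GW five years ago
```

Note that a 150% increase means the current figure is 2.5x the baseline, not 1.5x, which is why we divide by 2.5.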
And the trajectory is completely unsustainable.
PJM, which is the largest grid operator in the country serving over 65 million people
across multiple states, projects they will be six gigawatts short of basic reliability
requirements by 2027.
That is next year.
That is next year.
Raising the very real potential for rolling blackouts, simply because server farms are consuming
too much electricity.
What's fascinating about the grid crisis is how it completely crosses traditional political
divides.
It really does.
You have politicians from completely different sides of the aisle, like Senator Bernie
Sanders and Governor Ron DeSantis, both pointing out the exact same issue.
And we are just looking at the math here, right, not endorsing any political stance.
But when both sides are publicly raising alarms about the sheer strain these data centers
are putting on local power grids, you know it's serious.
Absolutely.
When massive tech companies siphon off gigawatts of power, it restricts supply, which drives
up utility bills for everyday consumers.
The math doesn't care about politics.
A gigawatt shortfall is a gigawatt shortfall, and constituents are getting angry.
With grid power becoming this incredibly scarce, politically volatile resource, the major
tech titans are panicking.
They really are.
Google, Microsoft, and Amazon are desperately fighting each other for whatever grid
power is left.
They are locking up long term nuclear contracts.
Microsoft is famously involved in a massive effort to restart the decommissioned Three Mile
Island nuclear facility.
They are completely reliant on effectively begging traditional utility companies for permission
to grow.
But Musk is playing a different game here too.
He has a three-pronged strategy to completely bypass this earthly grid bottleneck.
Prong number one: Tesla Energy?
Let's talk about it.
Microsoft and Google are begging for power.
Tesla is actually selling the power infrastructure.
Last year alone, they deployed 47 gigawatt hours of commercial energy storage, generating
over $12 billion in revenue.
And they rely on their own hardware.
XAI's massive Memphis supercomputer, named Colossus, isn't just subject to the whims of
the local Tennessee grid.
What do they use?
It is powered and stabilized by 168 Tesla Megapax.
Creating hundreds of millions of dollars of proprietary energy storage, acting as a massive
buffer.
So they insulate themselves from grid instability?
Yes.
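For a rough sense of scale on that buffer: Tesla's public spec puts a Megapack at roughly 3.9 MWh of storage. That per-unit figure is our assumption; the episode only gives the unit count.

```python
units = 168        # Megapacks at the Colossus site (stated)
mwh_each = 3.9     # approximate storage per Megapack (our assumption)

print(round(units * mwh_each, 1))   # 655.2 MWh of on-site buffer
```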
But storing power still requires generating power, which brings us to move number two, solar
generation.
Right.
The stated target is to build out 100 gigawatts per year of US solar manufacturing capacity.
100 gigawatts a year.
That is more than double what the entire US data center industry is currently consuming.
The goal is complete energy autonomy.
By pairing that massive solar generation with the Megapack storage, they can create self-sufficient
mini power plants for their data centers and factories, cutting energy costs by up to 40%
compared to traditional fossil fuel reliance.
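The "more than double" comparison is easy to sketch out. Note that the current US data-center load figure below is an assumed ballpark (published estimates vary widely, roughly 25 to 45 GW); the episode only gives the 100 GW target:

```python
# Illustrative check of the "more than double" claim above.
# The ~45 GW US data-center load is an assumed ballpark figure,
# not a number from the episode.
solar_capacity_target_gw_per_year = 100
assumed_us_datacenter_load_gw = 45

ratio = solar_capacity_target_gw_per_year / assumed_us_datacenter_load_gw
print(f"Target annual solar build vs. assumed current load: {ratio:.1f}x")
```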
It's like everyone in the tech industry is crammed into a tiny kitchen, viciously fighting
over the very last slice of pie, and Elon just walks outside and builds a massive solar
powered bakery in the backyard.
Great analogy.
But eventually, even the backyard isn't big enough.
His move number three is where the architecture leaves the atmosphere entirely.
Let's talk about the space play.
This is perhaps the most audacious paradigm shift in the entire thesis.
SpaceX has filed with the FCC for a constellation of up to 1 million low earth orbit satellites.
A million.
But these are not for providing Starlink internet to remote areas.
They're specifically designed to be solar powered orbital AI data centers.
They want to put the heavy compute in space.
I hear that and my initial reaction is why go through the immense difficulty of launching
servers into the vacuum of space.
Because in low earth orbit, there are no grid constraints.
There are no local zoning boards to fight, no land use battles with municipalities, and
no politicians complaining about consumer electricity prices.
That's true.
Most importantly, if you position them in specific sun-synchronous orbits, the sun never
sets.
Continuous unfiltered solar energy bombarding the solar panels 24 hours a day, seven days
a week.
You never need a battery backup.
The proposed math is staggering.
SpaceX estimates that launching 1 million tons of these satellites annually, with each
ton generating 100 kilowatts of compute power, could add 100 gigawatts of AI compute capacity
every single year.
Every single year.
Free energy beamed directly from the sun into silicon processors floating in zero gravity.
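That "staggering math" is just a multiplication, using only the figures quoted above:

```python
# Back-of-the-envelope check of the orbital compute claim above:
# 1 million tons launched per year, 100 kW of compute per ton.
tons_per_year = 1_000_000
kw_per_ton = 100

total_kw = tons_per_year * kw_per_ton    # 100,000,000 kW
total_gw = total_kw / 1e6                # 1 GW = 1,000,000 kW
print(f"Added AI compute capacity: {total_gw:.0f} GW per year")
```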
I'm sorry, I have to push back on this.
It sounds like a billionaire's sci-fi vanity project.
The sheer cost of rocketry aside, what about the laws of physics?
What about latency?
Oh, latency.
Yeah.
If I am sitting in my parked car and a Digital Optimus agent is trying to click a button on a spreadsheet,
and it has to beam that request up to a satellite, process it, and beam it back down, the lag
would make the system completely unusable for real-time applications.
If they were relying on orbit for real-time inference, you'd be absolutely right.
The round-trip latency would be a deal breaker for reflexes.
But that's exactly why the vertical integration saves the concept.
The local AI for hardware inside the cars and the robots handles the split-second real-time
decisions, the fast reflexes.
The orbital compute layer would be strictly reserved for the heavy asynchronous training
of the massive foundation models.
Oh, I get it.
Right?
When you are training a neural network on billions of miles of video data over weeks or months,
it doesn't matter if it takes a few extra milliseconds for the data packet to make the
round trip.
Ah, so the heavy lifting is done in the sky and the quick reflexes happen on the ground.
Exactly.
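The "few extra milliseconds" figure is consistent with first principles. Assuming a roughly Starlink-like altitude of about 550 km (an assumption; the episode doesn't specify an orbit), the speed-of-light round trip to a satellite directly overhead is under 4 ms:

```python
# Minimum physical round-trip latency to a satellite directly overhead,
# assuming a ~550 km low-Earth-orbit altitude (illustrative assumption).
# Real latency would be higher: processing, queuing, and slant angles.
SPEED_OF_LIGHT_KM_S = 299_792
altitude_km = 550

one_way_ms = altitude_km / SPEED_OF_LIGHT_KM_S * 1000
round_trip_ms = 2 * one_way_ms
print(f"Best-case round trip: {round_trip_ms:.1f} ms")
```

Negligible for training jobs measured in weeks; meaningful for split-second vehicle reflexes, which is exactly why the reflexes stay on local hardware.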
And regarding your point on the sheer cost of rocketry, that is exactly why SpaceX spent
the last decade relentlessly developing fully reusable, massive payload rockets like Starship.
Right.
The entire economic purpose of Starship is to drive the cost to orbit down to a level
where launching a million tons of server racks becomes financially viable.
It all connects.
It's a closed loop.
SpaceX exists to launch the data centers, which provide the infinite compute to train
the AI, which runs the robots, which build the cars and the rockets.
It is dizzying.
It really is.
But to pull off orbital compute, millions of robots and autonomous cars at this global
scale, you can't just rely on outside suppliers for your basic building blocks.
You have to build the machine that builds the machine, which brings us to the crucial
concept of owning the stack.
Vertical integration is the invisible connective tissue holding this entire ecosystem together.
Apple, Microsoft, Google, they're all tech titans, but none of them own the entire stack.
That's true.
Apple doesn't own its own power grid.
Microsoft doesn't manufacture its own vehicles.
Google doesn't fabricate its own silicon or launch its own space data centers.
SpaceX architecture seeks to control the entire vertical, the energy, the chips, the factories,
the hardware, and the software models.
Let's start with chips, because right now, silicon is the single biggest bottleneck
in AI outside of energy.
It is.
If you are a tech company and you want to train an AI, you are desperately waiting in line
at one specific company, TSMC in Taiwan.
You wait in line, you pay whatever Nvidia charges for their GPUs, and you hope geopolitical
tensions don't disrupt the supply chain.
Musk is bypassing that line entirely.
They are building TerraFab, a massive 20 to 25 billion dollar chip fabrication facility
in Texas, targeting a formal launch around March 21st.
Okay.
They are aiming for a cutting edge, two nanometer process technology, projecting a manufacturing
capacity of 1 million wafers per month by 2030.
Let's break down what a two-nanometer process actually means, because it sounds like
jargon to a lot of people.
Sure.
Imagine the transistor as a microscopic drawbridge for electricity.
The smaller you make those drawbridges, down to just a few atoms wide, the more of them
you can pack onto a single piece of silicon.
Exactly.
More transistors mean exponentially more computing power.
But crucially, because the electrical current has less distance to travel, smaller transistors
require far less electricity and generate significantly less heat.
Which is key.
Absolutely.
When you're trying to solve a global energy bottleneck and put computers inside humanoid
robots, highly efficient, microscopic chips are just as vital as massive solar farms.
And the hardware they are building in Texas is custom designed for their exact neural network
architecture.
The upcoming AI5 chip is projected to have 40 to 50 times more compute power and 9 times
more memory than the current AI4 generation.
Huge leap.
By designing and fabricating their own custom silicon, they completely eliminate their
dependency on Nvidia's pricing and TSMC's geographic vulnerabilities.
It guarantees total hardware sovereignty.
Speaking of hardware, we have to look at how they actually assemble these physical products.
The manufacturing process itself is a proprietary advantage.
Right.
Unboxed manufacturing.
With the upcoming CyberCab, they're introducing a manufacturing revolution they call the
Unboxed process.
To really appreciate unboxed manufacturing, you have to look at the physics of a traditional
auto assembly line.
For over a century since Henry Ford, automakers have built cars sequentially.
You start with a metal frame and it moves slowly down a massive mile long conveyor belt.
A worker puts on a door, the line moves, another worker drops in an engine, the line moves.
It's entirely linear.
Like stringing pearls on a necklace, one piece at a time, in a straight line.
It's slow.
It requires massive amounts of factory floor space.
And if one station breaks down, the entire line stops.
It's a bottleneck.
The unbox process shatters that linear constraint.
It is modular, parallel assembly.
Think of it more like a group of people building a complex Lego set.
Instead of passing one single block around, different teams are building different sections
simultaneously.
Exactly.
One group builds the front end, another builds the rear casting, another builds the interior
seats.
These complex sub-assemblies are completed independently, and then they are brought together
at the very end and snapped together in one fluid automated motion.
The efficiency gains are incredible.
This parallel modular approach cuts the necessary factory footprint by 40%, it reduces labor
cost by 30%, and it cuts total assembly time by 50%.
Wow.
Their target cycle time is under 10 seconds per vehicle.
That isn't traditional automotive manufacturing anymore.
That is the speed and precision of consumer electronics.
That is how you assemble iPhones, applied to two-ton autonomous vehicles.
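A sub-10-second cycle time is worth putting in context. If a single line actually sustained one vehicle every 10 seconds around the clock (a deliberately idealized, zero-downtime assumption, not a claim from the episode), the theoretical annual throughput would be over three million vehicles:

```python
# Theoretical annual output of one assembly line at a 10-second
# cycle time, assuming continuous 24/7 operation with no downtime --
# an idealized upper bound, not a production forecast.
cycle_seconds = 10
seconds_per_year = 365 * 24 * 60 * 60    # 31,536,000

vehicles_per_year = seconds_per_year // cycle_seconds
print(f"Upper bound: {vehicles_per_year:,} vehicles per line per year")
```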
And this manufacturing process is applied universally across the ecosystem.
The factories themselves are just nodes in the network.
The Fremont factory converts to building robots, Texas builds the custom chips in the cybercabs,
Houston builds the megapacks to power the supercomputers.
They own the factories that build the machines.
And finally, governing all of this physical hardware is the top layer of the stack.
The software mind.
We touched on xAI's Grok earlier, but it is much more than just a consumer chatbot generating
text.
Right.
The consumer side of Grok is projected to generate roughly $1.2 billion in subscription
revenue, which is nice.
But the real strategic value isn't a few dollars a month from internet users.
No.
The real value is having an in-house foundation model.
Grok is the System 2 reasoning layer.
Okay.
Unpack that.
While the local AI on the car or robot handles the immediate real-time spatial execution,
Grok acts as the deep strategic thinker.
It provides the contextual, logical understanding required for a Digital Optimus agent to navigate a complex
enterprise software suite, or for a humanoid robot to plan out a multi-step physical manufacturing
task.
So it's the brains of the operation.
Exactly.
Having an in-house competitor to OpenAI's GPT or Google's Gemini that is natively seamlessly
integrated into your own proprietary silicon and your own physical robots is the ultimate
masterstroke of vertical integration.
Every single layer from the solar panel to the silicon chip to the software model makes
all the other layers cheaper, faster, and smarter.
That's the ecosystem.
Okay.
We have just painted a picture of an absolute technological juggernaut.
It sounds invincible.
It sounds like a closed loop empire that cannot possibly fail.
But what happens when this grand vision collides with reality?
We need to ground this.
Let's look at the immense risks in what financial analysts call the unpriced optionality.
We absolutely must address the execution risks, because a plan of this magnitude carries
existential threats.
First and foremost are the timelines.
Oh, yeah.
Elon Musk is notoriously, perhaps pathologically, optimistic about when these bleeding edge technologies
will actually be finished.
To put it mildly, the Robotaxi network was supposed to be driving us around years ago.
Optimus was supposed to be a household helper by now.
Right.
TerraFab is a $25 billion construction project that will take years to fully ramp up, assuming
there are no supply chain disasters.
And the Orbital Data Center.
Still theoretical.
Yeah.
The technology to process heavy AI compute in the radiation of space barely even exists
yet, and launching a million tons of hardware certainly doesn't have regulatory or FCC approval.
And then there is the massive capital expenditure required to keep the flywheel spinning while
these technologies mature.
Tesla's projected capex in 2026 alone exceeds 20 billion dollars.
That's a lot of money.
That is an astronomical amount of cash to burn if these long-term bets hit unexpected roadblocks.
If the Robotaxi network gets bogged down in endless municipal regulatory hearings, or
if the Optimus Robot proves too clumsy to work safely outside of a highly controlled
factory environment, tens of billions of dollars in capital could be completely stranded.
Not to mention, the existing tech titans aren't just taking a nap.
Waymo is actively expanding their operational footprint right now, mapping new cities.
Google and Anthropic are releasing vastly smarter AI models every few months.
Nvidia is entrenching its GPU ecosystem deeper into every enterprise on Earth.
Just because a new custom stack is being built doesn't automatically mean the existing
massive industry stack just vanishes.
Those are very real existential risks.
But this provides a vital framework for you, the listener, as you evaluate this ecosystem.
You have to step back and look at the underlying macroeconomic trends.
Okay.
Let's walk through them.
Over the next decade, are we going to need more AI compute globally or less?
Definitely more.
Is energy generation and grid capacity going to be the ultimate bottleneck for that compute?
The data strongly suggests yes.
Is pure vertical integration fundamentally more defensible and agile than being totally
dependent on third-party international suppliers for your chips and software?
Almost always, yes.
And is real world messy, chaotic visual data more valuable for training physical AI than
clean simulated laboratory data?
Demonstrably, yes.
So the final synthesis of this framework is this.
The risk is that this totally different game fails because the timelines are simply too
long or the capital requirements are too heavy or the physics are just too hard to overcome.
Right.
But the opportunity is that no one else is even playing this specific game.
If the timeline for the robotaxi or optimus slips by two years or even five years, the
underlying vertically integrated architecture being built remains entirely without peer.
It's like judging an Olympic decathlete only by their score in the pole vault.
If you look at Tesla and say, well, their automotive profit margins dipped by 2% this quarter,
you are entirely missing the big picture of what they are actually training for.
Completely missing it.
And that brings us to the concept of unpriced optionality.
This is crucial for anyone trying to understand the financial reality of this system.
Currently, the broader market still largely prices Tesla strictly as a traditional car
company.
Yeah, they treat it like Ford or GM.
Exactly.
Analysts look at traditional automotive multiples, like how many metal boxes they sold, what
the margin on the metal was, and they price the stock accordingly, perhaps sprinkling
in a slight premium for the AI hype.
But if this vertically integrated thesis holds true, that pricing model is completely
disconnected from reality, entirely disconnected.
Because the scaled Robotaxi network wiping out personal car ownership? That's not priced
in.
No.
Enterprise-grade Digital Optimus agents functionally doing the back office work of entire
software companies? Not priced in.
A $25 billion TerraFab churning out proprietary, hyper-efficient AI chips? Not priced in.
Right.
And orbital compute data centers floating in space, harvesting infinite solar energy?
That's not even on the market's radar yet.
So for the learners out there who want to watch this play out in real time over the next
few years, what are the key metrics we should track to see if this grand vision is actually
coming together or if it's falling apart?
There are three main indicators to watch closely.
First, watch the CyberCab production ramp starting next month.
If they can truly utilize that modular unboxed manufacturing process at scale to turn out
cheap vehicles, it validates their entire hardware and factory thesis.
Okay, that's one.
Closely watch the commercial energy storage deployment numbers.
Every megapack they sell or install is another brick in their independent decentralized
power infrastructure.
Got it.
And the third.
And third, we have to watch that 10 billion mile FSD benchmark.
When they cross that massive data line, which is imminent, we need to see if it actually
unlocks the generalized unsupervised autonomy they've been promising, proving that pure
vision can conquer the physical world.
If those three metrics hold strong, the flywheel is spinning exactly as designed.
Exactly.
This has been an absolutely staggering journey through a truly mind-bending framework.
But before we wrap up today, I want to leave you with one final, deeply provocative thought
to mull over, something that builds on all this but takes it one step further into the
future.
Okay, I'm ready.
We've talked extensively today about leveraging vertical integration to drive the cost
of compute, transportation, and manufacturing into the ground.
But if you extrapolate this ecosystem to its logical conclusion, if energy eventually
trends towards zero cost because of infinite solar arrays and orbit, and if physical labor
trends towards zero cost because of millions of tireless humanoid robots, what actually
happens to the fundamental nature of human economics?
That is the million dollar question, or I guess the trillion dollar question.
Our entire global economic system, capitalism itself, is based entirely on the management
of scarcity.
Exactly.
The scarcity of human labor, the scarcity of physical resources, the scarcity of energy.
If this unified physical AI system successfully replaces scarcity with absolute abundance,
the rules of the game don't just change for tech companies, they change for humanity.
If energy and labor are essentially free, what becomes the new most valuable resource
in our society?
Is it pure human creativity?
Is it authentic emotional connection?
You might be watching the infrastructure of a post-scarcity world being built right
in front of us, cleverly disguised as a car company.
That is a thought that is going to keep me up tonight.
So we've laid out the pieces of this massive complex puzzle.
Now we want to know where you stand.
Do you think this grand, vertically integrated, everything engine will actually succeed and
fundamentally reshape our economic reality?
Or do you think it is just a multi-billion dollar house of cards waiting for a stiff
breeze of regulatory or engineering reality to knock it down?
Drop a comment, share your thoughts, and let's keep this conversation going.
Thank you so much for joining us as we unpack this incredible vision.
It's always a pleasure to explore these paradigm shifts with you.
Absolutely.
Until next time, keep questioning the board you're playing on, and thanks for joining us
on this edition of Thrilling Threads.

Thrilling Threads - Conspiracy Theories, Strange Phenomena, Unsolved Mysteries, etc!

