
In this episode, we discuss Meta's new Manus desktop AI agent and its implications for AI as an operating system layer. We also cover startup Niv AI addressing AI data center energy consumption, Memories AI's visual memory layer for robotics, and Anthropic's dominance in new enterprise AI spending over OpenAI, which is now strengthening its government footprint with an AWS deal.
Chapters
00:00 Introduction and AIbox Update
01:26 Meta's Manus Desktop AI Agent
02:55 Niv AI Tackles Data Center Energy
04:32 Memories AI for Visual Memory
06:54 Anthropic Leads Enterprise AI Spend
08:36 OpenAI's AWS Government Deal
Are you really buying a car online on Auto Trader right now?
Really?
At a playground?
Yeah, really.
Look at these listings from dealers.
Wow.
Your search can really get that specific.
Really?
And you just put in your info and boom.
Cars in your budget.
Mom needs a second, honey.
You can really have it delivered?
Really.
Or I can pick it up with the dealership.
One sec, sweetie.
Mommy's buying a car.
Mommy's working.
I think the kid is walking up the slide.
Kyle, again, really?
Auto Trader.
Buy your car online.
Really?
The Toyota Tundra and Tacoma are built to keep going.
Blending rugged muscle with precision engineering.
All supported by Toyota's time-tested legacy of dependability.
Step into a Tundra and feel the unyielding capability
with the available iForce Max engine.
Tundra puts out impressive power, torque, and towing performance.
And the roomy high-tech cabin keeps you connected on the go.
Or take a look at Tacoma made for drivers who push past the path.
Agile, tough and relentless with available features like crawl control,
portable JBL speaker, a power lift gate,
so gear goes in fast and the adventure keeps moving.
The Tacoma and Tundra are engineered to endure season after season,
mile after mile.
So drive one home today.
Visit toyota.com or stop by your local Toyota dealer to find out more Toyota.
Let's go places.
Welcome to the podcast.
I'm your host, Jayden Schaefer.
Today on the podcast, we have a number of interesting stories.
Meta has just launched Manus for desktop, this kind of AI agent on your computer.
We have Anthropic, which has now officially flipped the switch
and is beating OpenAI in enterprise spending.
A new startup called Memories AI is building a visual memory layer for robotics.
OpenAI is expanding their government footprint with a brand new AWS contract.
We're going to dive into all of those today.
I think we're watching these three different layers collide: infrastructure,
enterprise spend, and agents.
I'm also super excited to announce that AI box, my own startup,
has officially added video to our platform.
You can now create AI tools with video.
And you can chat with over eight of the top AI video generation models.
We have ByteDance's Seedance.
We have Google's Veo models.
We have OpenAI's Sora models.
And we have PixVerse models on there.
This is super exciting for us.
It's been a big push.
And we hope to see what incredible tools you guys build with video on the platform.
If you don't already have a subscription, it's $8.99 a month.
And you get access to over 70 of the top AI models,
all in one platform for less than 20 bucks a month.
I hope this saves you a ton of money.
And you get access to all of these interesting new models to test out, try, and talk with.
You can try out the platform with the link in the description or typing in AIbox.ai.
All right, let's get into the episode today.
The first thing I wanted to cover is just the fact that we have a huge story from Meta's Manus.
This is a company they recently acquired.
And what's interesting is Manus was sort of going viral.
It was a Chinese firm that kind of had to pull itself out of China before this acquisition.
But they've just launched a desktop app that is basically bringing their AI agent directly onto your computer.
I think they've seen all of the hype around OpenClaw and realized that, you know,
they probably already had a lot of these AI capabilities.
But bringing it beyond just a website and onto your computer,
I think they've seen the value of that.
It's a really big shift.
I think these agents were living in the cloud before.
Now they're going to be able to access your files, run apps, organize data,
even build software locally.
And I think this is a lot closer to how people actually work, which is why OpenClaw went so viral.
I think it's basically the beginning of AI agents becoming our operating system layer.
I think we're going to see a huge shift here.
This isn't just answering our questions, but they're actually going to be doing the work inside of our machines.
I think the trade-off is obvious.
I think more power means more risk, security-wise.
You can now give AI access to your local environment, which is going to raise a lot of privacy concerns as well.
I think for some people, right?
Meta doesn't have the greatest track record on privacy and on your data.
And so some people might be a little concerned about giving it access to all of their computer files.
But at the end of the day, these are really powerful tools.
So I'm interested to see what sort of uptake Manus has.
This is already a product that's been doing quite well.
And I think this makes it a lot more useful.
There's also a new startup called NIV AI.
They just raised funding to solve a problem that I think a lot of people are not talking about enough, which is power.
AI data centers right now are using tons of power, tons of electricity,
because GPU workloads spike really unpredictably.
And I think that forces operators to throttle usage or they have to overpay for backup capacity.
So what NIV is doing is they're building a system right now to monitor and optimize power usage in real time.
Essentially, they're acting as a co-pilot for data center energy.
I think the reason why this matters is because AI isn't just a software problem right now.
It's an energy problem.
Companies that figure out how to squeeze more output from the same hardware and power constraints are going to have a massive advantage.
I think especially when you look at the state of the world today with everything happening in Iran and the energy shock that we've seen over the last few weeks,
I think energy is more important now than ever.
A lot of people are talking about the fact that AI companies are going to be severely negatively impacted if these energy shocks,
these high oil prices, continue, because a lot of this is what's powering the data centers.
A lot of this is what's powering the grid.
And AI is literally just a direct pipeline from energy to what we're all using.
Like all of this stuff has to be run.
It takes insane amounts of energy.
For a lot of the data centers and AI training facilities we're building,
people say they should basically be building power plants attached to them because they use so much energy.
And a lot of people are seeing their local energy bills increase due to these kind of data center projects.
So I think this is a really fantastic startup and I'm excited to follow along with them.
There's a new startup I've been looking into called Memories.ai.
They're building what might become kind of this like foundational layer for physical AI.
And that is visual memory.
So instead of just remembering text like ChatGPT does, they're basically building a system that's going to help AI remember what it actually sees.
So that means wearable devices, robotics and real world AI systems that can recall visual experiences over time.
Because right now, if I have a conversation with ChatGPT and a month later I'm like, hey, you know, for this project, can you help me write some sort of new document or some sort of new file?
It can go and look at my history and my context and remember everything about that specific project from two months ago.
Now, it's a completely different situation when we have robots running around in the real world with cameras on them, learning and figuring out how to do things in warehouses and eventually in all of our homes, with something like the Optimus robots or the Figure robot. These things are going to cost like $20,000 or $30,000, and they'll be in our homes to do things.
And I mean, there's a whole other conversation about whether people want that or trust that.
And I think inevitably they will once these things have improved, just like self-driving cars.
But I think right now most of these AI systems are living in really a digital world.
And if AI is going to operate in the physical world, right, once we start moving from AI on our phone that we talk to, to AI in a robot that's walking around our home or our warehouse, it needs to have memory the same way humans do.
So I think this is very early, but it points to where things are going in the future.
AI that doesn't just respond, but actually remembers, learns, and builds context from real-world experiences just like a human would. And something I would actually expect to see from an Optimus robot or a Figure robot is that they would have this kind of memory built in.
So it's learning and understanding, and let's say your robot breaks: I would expect that you can transfer memory from one robot model to the next one you upgrade to.
So let's say you've had a robot in your family for, you know, 10 years that's been helping out and it understands how to do everything inside of your home.
I think you'd be able to transfer those memories to the next iteration of the robot, which is really fascinating and, I think, a really strong moat against switching between robot companies in the future.
Now I know this sounds crazy in the future, but I think these are the problems that people are starting to solve now.
So I'm excited to see what Memories.ai does with this.
Okay, Anthropic is now capturing over 70% of new enterprise AI spend, according to Ramp data.
And I actually love these types of reports; Mercury or Ramp, or even a lot of different banks, will put out these types of reports.
But I mean, basically they have access to what companies are actually spending because they have, you know, insights into their financials.
And so, you know, this is really solid data coming out of Ramp. Just a couple of months ago it was a really tight race between OpenAI and Anthropic.
But now Anthropic is pulling way ahead, and they're moving ahead fast.
If you look at the charts, it's phenomenal.
Their business customers are basically paying all this money for their AI coding tools, I think primarily,
but a lot of people are also just paying for Claude as a regular chat tool.
At the same time, OpenAI is reportedly rethinking their strategy, shifting more focus toward enterprise after they've been heavily investing in consumer products.
I think right now the AI race isn't just about who has the coolest demos, it's about really who makes money.
Enterprise adoption is the real scoreboard. And I think right now everyone's like, wow, OpenAI has this massive user base, which is true; last I checked it was 900 million weekly active users, almost a billion.
I think Sam Altman, I mean, I know he was just throwing some shade, but he said something recently like there are more free ChatGPT users in Texas than all of Anthropic's users in the US combined, or something like that, which is sort of crazy.
But if you look at the revenue numbers, they're not that far apart. OpenAI said that they're on pace to generate about $25 billion in revenue this year.
And Anthropic is on pace to generate about $19 billion. So these companies are much closer than you think when it comes to revenue.
Okay, there's a massive story unfolding with OpenAI right now: they're expanding their government footprint through a new deal with AWS.
I think on the surface it just looks like another partnership, but I don't think that's actually the case. So OpenAI has signed a deal to distribute AI products to the US government through AWS.
And that includes access inside highly secure environments like GovCloud and even classified regions handling really sensitive workloads. Now, we've seen that Anthropic and the federal government, particularly the Pentagon, have had a huge falling out.
And OpenAI kind of stepped in and took a lot of those contracts. And so I think what is actually happening is that OpenAI is trying to insert themselves directly into the most important distribution channel for government AI, which is AWS, because AWS already has really deep relationships across all the federal agencies.
They're already compliant, and their infrastructure is already trusted. And so by plugging directly into AWS through this new partnership,
OpenAI is not just selling their models; they're becoming part of the default procurement pipeline for government AI.
I think what gets really interesting for me is that AWS is already heavily tied to Anthropic, right? Amazon has invested billions into them; many of the early investment rounds of Anthropic were led by Amazon and AWS.
You know, famously, when AWS was trying to compete with ChatGPT in the early days, Amazon put in $4 billion, and since then they've invested many multiples of that.
Claude is deeply ingrained into AWS's AI platform because of that. And so this was supposed to be Anthropic's home turf, you know. But now OpenAI is stepping directly into that system. I don't think it's just competition; it's really a platform-level battle happening inside that same infrastructure.
Government adoption right now, I think, is acting as a bit of a signal to the entire market, because if your model is trusted for classified and sensitive workloads,
then that credibility is really going to spill over into a lot of enterprise deals. For regular users, I don't think it matters; users aren't like, oh my gosh, the government uses OpenAI, so that should be my default model.
But for enterprise, I think that's 100% the signal they see, right? If the government, the most classified, quote unquote, organization in the world, is using this, it's probably something that's great for enterprise.
So I think this really is expanding OpenAI's reach into federal agencies right now, and it's also strengthening their position with enterprise customers who care about security, compliance, and long-term stability.
Right, let's be honest: I think a lot of people saw the Anthropic deal falling through and felt like the company perhaps was a little bit less stable.
Now, is that true or not? I'm not, you know, speculating on that; I just think that's what a lot of enterprises heard. At the same time, OpenAI is keeping control: they decide which models get deployed, they coordinate directly with customers, and they can enforce additional safeguards.
So this isn't them handing over the keys to AWS and saying, you know, you're the distribution layer, you let people do whatever they want with the product.
I think right now the winners are not just the companies with the best models; it's the companies that control distribution, infrastructure, and trust at scale.
And so if OpenAI is making this deal with AWS, I think that's kind of the first step forward here.
Guys, thank you so much for tuning into the podcast today. If you enjoyed the episode, make sure to leave a rating or review wherever you get your podcasts, and as always, make sure to go check out AIbox.ai; we've just launched eight new video models into the platform.
So if you want to try those all out for only $8.99 a month, you can check out all of the latest from OpenAI's Sora, Google's Veo, Seedance, and a ton of other incredible models, all in the one platform.
There's a link in the description I'll catch you guys all in the next episode.
One iced coffee, 99 cents, please. For real? No way.
What a deal. Your new morning groove iced coffee from McDonald's, any size for just 99 cents until 11 a.m.
Price and participation may vary. Cannot be combined with any other offer.



