
This is our weekly, ad-free compilation of science news.
00:00 - Surprise! Milky Way Might Not Have a Black Hole After All
05:04 - The Simulation Hypothesis Gets Scientific Backing
10:01 - Breakthrough in Data Storage Could Store Your Photos for 10,000 Years

We've got a black hole at the center of our galaxy, or so we all thought.
A group of astrophysicists now says that might be wrong. Let's have a look.
The black hole that we thought sits at the center of the Milky Way is called
Sagittarius A*, pronounced "Sagittarius A star", because it's in the constellation Sagittarius.
It has an estimated mass of about 4 million suns squeezed into a
region smaller than our solar system. It's what astrophysicists call a supermassive black hole.
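For scale, here's a quick estimate of my own, not from the paper: a black hole of 4 million solar masses has a Schwarzschild radius of

$$r_s = \frac{2GM}{c^2} \approx \frac{2 \times (6.67\times 10^{-11}\,\mathrm{m^3\,kg^{-1}\,s^{-2}}) \times (8\times 10^{36}\,\mathrm{kg})}{(3\times 10^{8}\,\mathrm{m/s})^2} \approx 1.2\times 10^{10}\,\mathrm{m} \approx 0.08\,\mathrm{AU},$$

which would fit comfortably inside the orbit of Mercury.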
The main reason we believe Sagittarius A* is a black hole comes from
watching stars orbit around it. Some of these stars move incredibly fast.
The most famous one, called S2, completes an orbit in about 16 years and gets extremely close
to the central object. From its motion and the motion of other stars we can infer both the mass
and how compact that mass must be. Whatever sits there has to be very massive, very small and very
dark. In general relativity the simplest explanation for that is a black hole.
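If you want to check that mass estimate yourself, here's a minimal sketch; the orbital numbers for S2 are rounded approximations of the published values:

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30  # solar mass, kg
AU = 1.496e11     # astronomical unit, m
YEAR = 3.156e7    # year, s

T = 16.0 * YEAR   # S2's orbital period, roughly 16 years
a = 1000 * AU     # S2's semi-major axis, roughly 1000 AU

# Kepler's third law: the mass enclosed by the orbit follows
# from the period and the orbit's size.
M = 4 * math.pi**2 * a**3 / (G * T**2)

print(f"Enclosed mass: {M / M_SUN:.1e} solar masses")  # ~4e6
```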
If it is a black hole, then the object has an event horizon. That's a surface from within
which nothing, not even light, can escape. It's the defining feature of a black hole.
It is what makes the black hole black. But strictly speaking, we don't know that Sagittarius A*
does have a horizon. It could very well be some very massive and dark object whose nature we don't
understand. And if you think of dark stuff we don't understand, what comes to your mind?
Right, dark matter. Dark matter is by far the most successful product: it's everywhere,
explains everything, but nobody can tell you what it is. Incredible marketing. And this is what
the authors of the new paper claim: we may have mistaken a blob of dark matter for a black hole.
If dark matter is made of particles, it's not entirely evenly distributed.
It clumps under the pull of its own gravity and these clumps are indeed the seeds for galaxies.
These clumps still have fairly low density, much like clouds. These clouds of dark matter, in which
galaxies grow, are called halos. The dark matter halo of the Milky Way is believed to have more than
10 times the Milky Way's mass and to stretch to multiple times its size before tapering out to
the cosmological average. In these halos, though, the density of dark matter isn't the same everywhere.
It has what astrophysicists call subhalos of higher density. You expect subhalos for
pretty much all types of dark matter particles. They're interesting because they could be
near us and they could be observable. In fact, our solar system might float through a dark matter
subhalo at some point, which would make it easier to detect dark matter particles if they exist.
The authors of the new paper now consider a very specific dark matter particle, a type of fermion.
Fermions are particles like electrons, and they have the peculiar property that they can't
all sit in the same quantum state. Because of that, fermions have a built-in pressure called
degeneracy pressure, which makes it difficult to squeeze fermions together. This is what keeps
white dwarfs and neutron stars stable, and the authors say that dark matter could form
clumps of high density stabilized this way too. In that case, the dark matter in our Milky Way
would have a dense central core surrounded by the much less dense halo. The halo
fits the observations we have of our Milky Way, and the amazing thing is that the dark
matter core fits the gravitational pull near Sagittarius A* just as well as the
black hole hypothesis does. They write that "no conclusive preference emerges between models."
What are we to make of this? One caveat is that the authors only use some of the observational data that we
have of Sagittarius A*. That's okay, one has to start somewhere, but it's possible that if one
looks at all the data, a preference for one or the other explanation emerges. The other issue
is that they use a very specific dark matter model with properties that they tune to fit the
observations. Doing this for one galaxy is one thing, doing it for all galaxies is another thing
entirely. That said, this is not a crazy idea, because astrophysicists have always had trouble
explaining why supermassive black holes form so early and grow so fast. The theory never quite
added up, so maybe the answer is that some supermassive black holes just aren't black holes. I give
this a seven out of ten on the bullshit meter. It's fairly high because the model is so specifically
tuned. However, I do think that the idea has potential, and maybe it actually ends up solving some
of the problems with dark matter. It also finally explains the asterisk in Sagittarius A*.
It's a footnote and means "may or may not be what you think it is."

Do we live in a computer simulation?
Philosophers have been going on about this simulation hypothesis for decades. Physicists mostly
rolled their eyes because it's too vague an idea to even sensibly talk about. But in the past year
there's been a noticeable shift in this debate. The simulation hypothesis is slowly becoming
more scientific. I find this a very interesting development. Let's have a look. The interest in
the simulation hypothesis is in no small part driven by computer games, and now AI becoming increasingly
better at, well, everything. It's particularly apparent if you look at the early world models like
DeepMind's Genie, which basically creates universes that can then be explored by other artificial
intelligences. It's not a big step to imagine that we are the population of a world model run by
some advanced civilization. Okay so maybe everything around us is a simulation run by some advanced
civilization on a giant computer. But what does it mean? We know that the laws of nature are
mathematical. If you want to call that a simulation, we can all agree on that, and we can also agree that it's
just a weird way to talk about differential equations. For the simulation hypothesis to be meaningful,
we need to add details about what we mean by a computation and how that computation is happening.
And that is where the recent paper by David Wolpert, a computer scientist, comes in. He looks at
the simulation hypothesis using, you'll love this, the multiverse. His point is that, well, if we are
being simulated, then it's like we are one universe being simulated by another universe, which
naturally makes it a multiverse question. He then asks what properties the laws of nature in each
universe must have so that one universe can simulate the other. Loosely speaking, they need to
fulfill some sort of compatibility properties. And as a kind of corollary he also says it's possible
for one universe to simulate itself if the laws of nature are sufficiently reducible so that basically
you can compress the information you need for an entire universe. The author is a computer scientist
and he explicitly says in the paper that he isn't going to look at just which physical laws might
fulfill this property but I think it's a good starting point. My issue with the simulation hypothesis
has always been that you run into a problem with computational complexity, because the laws of nature
in our universe are not scale invariant. For example, we think that the Planck scale is kind
of a limit to structures. So if you took part of our universe to simulate a similar universe it
can't be the same. You either have to make the simulation smaller than the original or omit detail,
and if you have simulations within simulations within simulations, eventually you run out of physical
things to compute with. However, the situation is far less clear if you say, well, we don't know
what the laws of nature are in the universe that might simulate ours. Are there any laws of nature
in the upper level that can simulate this? This is the question that the new paper now raises.
I give the paper a 5 out of 10 on the bullshit meter. Fine as far as the computer science is concerned,
but rather empty as far as the physics is concerned. I'm telling you about this anyway because I
think that this formalism is promising and who knows maybe this is where a theory of everything
will come from. Maybe we've been looking at it the wrong way when we've been asking what's going
on at smaller structures. Maybe we should have been asking all along what's going on in the
embedding space, so to speak: the universe in which a programmer may be running this simulation.
But wait, you might say, wasn't there this group of physicists who recently proved that we
can't live in a computer simulation, using Gödel's incompleteness theorem and related
mathematical theorems? Yes, I talked about this some months ago. Their point was basically that if
the laws of nature are computable then there should be bounds to the complexity of the observations
we can make and since our observations don't have such bounds we can't be in a simulation.
My take on this was, in a nutshell, that we've never in fact made an observation that conflicts
with any such bounds, so doesn't this kind of support the idea that everything is simulated?
I've now seen that a comment appeared on the arXiv making a very similar point, arguing that the
authors conflate maths with reality, and another preprint by a philosopher likewise saying it's a
category error. I take those to mean that the simulation hypothesis is officially back from the
dead, but it's still unclear what it does for a living.

We used to store files on floppy disks,
then on compact disks, then on USB sticks, but none of those lasts a lifetime. In my family, we're
constantly copying family photos from one computer to the next and from one external hard drive to
a newer one, while also storing them on three different cloud services of companies that might no
longer exist by the time we have grandchildren. But here comes Microsoft. Their research team just
announced they've managed to read and write data in glass, where it could last for more than 10,000
years. 10,000 years! That's long enough for archaeologists to write PhD theses about your bathroom
selfies. How big is this news really and what other data storage upgrades are coming to consumer
electronics? Let's have a look. The new Microsoft paper is about what they call Project Silica.
The idea is to store data in glass. They don't do this by etching the surface but instead they write
with a laser inside the bulk of the glass in all three dimensions. They do this by using the laser
to subtly change the atomic configuration, so that either the refractive index of the glass is
somewhat different or the propagation of light depends on the polarization, which is
called birefringence. To read the information, they just illuminate the glass with LED light in
the visible range and scan it with a microscope. They identify particular positions by focusing
the light in different layers from multiple different directions. This gives them a full 3D
reconstruction of the data. The major reason they are working on this is that this medium is
particularly durable but of course we also want to know the other numbers. The potential data density
of this storage medium is, going by volume, a factor of 2 to 4 or so higher than for typical hard drives.
For the writing speed, they report about four megabytes per second. That's a factor of 50 or
so slower than a state-of-the-art hard drive. So if you want to back up your photos, that's a good
opportunity to reread War and Peace; the quick estimate below shows just how good. Then again,
this was a lab demo, not finished technology. It seems realistic to me that this could be improved.
They don't say how fast the readout works, but it's clear that when it comes to durable storage,
this is pretty good. It's much more practical than DNA storage, which we talked about a few months
ago. That said, this is not a product that will hit the consumer market any time soon. That's
because the writing and reading mechanisms are rather complicated and, at least for the time being,
require specialized and rather expensive equipment.
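Here's that quick estimate, a minimal sketch; the one-terabyte photo library is my assumed example, and 4 MB/s is the write speed reported above:

```python
# How long would a backup take at Project Silica's reported write speed?
WRITE_SPEED = 4e6    # bytes per second, ~4 MB/s as reported
LIBRARY_SIZE = 1e12  # bytes, a hypothetical 1 TB photo collection

seconds = LIBRARY_SIZE / WRITE_SPEED
print(f"{seconds / 3600:.0f} hours, about {seconds / 86400:.1f} days")
# -> 69 hours, about 2.9 days: plenty of time for War and Peace
```

For an archival medium that's written once and then left alone, that's tolerable.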
More likely it'll become available as a cloud service for archiving purposes. But there are
other developments in data storage which are about to hit the consumer market. Today's hard drives
use a tiny electromagnet to flip the magnetization of a tiny region on the disk. If you want
to store more data in the same volume, you must make those regions smaller, but smaller regions
are more susceptible to random thermal noise. So they are more likely to accidentally flip, which
brings in errors. To avoid this, you'd have to use a material that's less susceptible to noise,
but that means that the magnetization is harder to flip. You need a stronger current for the tiny
electromagnet to write which creates more heat which creates its own noise. That's quite a conundrum.
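To make the conundrum a bit more precise, here's the standard back-of-the-envelope argument; the numbers are the commonly quoted rule of thumb, not from any of the product announcements. A magnetic grain spontaneously flips on a timescale given by the Néel-Arrhenius law,

$$\tau \approx \tau_0 \exp\!\left(\frac{K_u V}{k_B T}\right), \qquad \tau_0 \sim 10^{-9}\,\mathrm{s},$$

where $K_u$ is the material's magnetic anisotropy, $V$ the grain volume, and $k_B T$ the thermal energy. Shrink $V$ and the retention time $\tau$ drops exponentially; compensate with a larger $K_u$ and the bits become harder to write. For data to survive about ten years, one typically wants $K_u V / k_B T \gtrsim 60$.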
A clever idea to avoid the problem is to use a material whose magnetization is hard to flip,
but to briefly heat the spot you want to write on with a tiny laser, which makes
it easier to flip the magnetization. That way, one can pack data more densely onto the disk. This is
called heat-assisted magnetic recording. The first of these hard drives come from the company
Seagate, which began selling them a few months ago. Western Digital is working on a similar product,
and another development that's coming up is MRAM, magnetoresistive random-access memory, which
stores information in electron spins. Random-access memory is the working memory of your devices.
Its data is currently lost when you power down. For MRAM this isn't the case. It can hold the
data even when you switch off power. Better still, it's much faster. The current standard memory
has access times typically in the range of 50 or so nanoseconds. With MRAM, you can do it in the
range of a few nanoseconds. What this means is that it's likely that in the coming years, data
storage will continue to get smaller. Phones will continue to become faster and will waste our time
much more efficiently. It'll be great, so don't forget to subscribe. Thanks for watching. See you
