
Welcome to Bedtime Astronomy. Explore the wonders of the cosmos with our soothing bedtime astronomy podcast. Each episode offers a gentle journey through the stars, planets, and beyond, perfect for unwinding after a long day. Let's travel through the mysteries of the universe as you drift off into a peaceful slumber under the night sky.

Have you ever considered the inherent bias in how we observe the universe?
It's a massive bias, right? I mean, when you look up at the night sky, you are experiencing a severely filtered version of reality, just a tiny fraction of what's out there.

Exactly. You see the stars, the planets, maybe the Andromeda galaxy if you're in a dark enough location.

If you're very lucky.

Yeah, if you're lucky. But fundamentally, human eyes, and by extension the vast majority of our optical telescopes, are drawn exclusively to the brightest sources of light.

It's a foundational paradox in astrophysics, really.

It is. We build our entire models of the cosmos on these blazing beacons in the dark, you know, the massive, highly luminous galaxies and quasars. But relying only on those bright spots leaves a monumental blind spot.

A huge one, because what about the vast spaces between those objects, the seemingly empty voids?

They are not empty at all. Far from it. So today we are doing a deep dive into a groundbreaking publication from the Astrophysical Journal, released very recently, on March 3, 2026, and based on some incredible data.
Yes, data from the Hobby-Eberly Telescope Dark Energy Experiment, which we'll refer to as HETDEX. Our mission today, for you the listener, is to understand how astronomers are finally looking past those bright stars and finding a hidden ocean, a vast hidden sea of light residing right between galaxies. We'll explore what mapping this invisible cosmic web actually means for our fundamental understanding of the universe.

This research really represents a complete paradigm shift in observational cosmology. We are looking at a highly critical epoch in the universe's history, specifically the period between 9 and 11 billion years ago.

Okay, let's unpack this. Why are astronomers so heavily focused on this particular window?
What makes the universe of 10 billion years ago so crucial for understanding the galaxies we see around us today?

That time frame corresponds to what astrophysicists often call cosmic noon.

Cosmic noon. I love that term. It's very descriptive.

If you look at the cosmic star formation rate history, it isn't a flat line.

Right, not at all. The universe did not produce stars at a constant rate. About 9 to 11 billion years ago, the universe was experiencing its absolute peak of star formation.

So it's just churning out stars.

It was an incredibly dynamic, violently active epoch. The galaxies of this period weren't the settled, beautiful spiral galaxies we see today, like the Milky Way.

Right, they weren't like the Milky Way.

They were chaotic. They were actively pulling in, or accreting, massive amounts of primordial gas from the intergalactic medium.

Just feeding on this gas.

Exactly, and igniting stars at rates hundreds or sometimes thousands of times higher than what our galaxy manages today.

So it's essentially the peak of cosmic construction. But observing that construction comes with significant physical limitations, doesn't it?
Very significant. The primary limitation is surface brightness.

Because it's so far away, right?

Because we're looking at objects 9 to 11 billion light years away, the inverse square law dictates that the light reaching us is exceptionally faint.
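For listeners who want the scaling spelled out (these are standard textbook relations, not something derived in the episode): the flux from a source of luminosity $L$ at luminosity distance $d_L$ falls off as the inverse square of distance, and for extended emission, cosmological expansion adds the even steeper Tolman surface-brightness dimming:

$$F = \frac{L}{4\pi d_L^2}, \qquad SB_{\mathrm{obs}} = \frac{SB_{\mathrm{emit}}}{(1+z)^4}$$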
Okay, but we can see some galaxies from back then.

We can resolve the most massive, hyper-luminous galaxies from that era, yes, because their localized starburst activity is so incredibly intense. But the smaller stuff, the fainter dwarf galaxies and, more importantly, the sprawling filaments of diffuse hydrogen gas that form the cosmic web...

The fuel lines, basically.

Right, the fuel lines. They fall well below the detection limits of standard optical imaging. The light from that diffuse gas has spread out over billions of light years; by the time it hits a telescope mirror on Earth, it's effectively indistinguishable from the ambient background noise of the night sky, or even the thermal noise of the camera instruments themselves.
The time machine aspect of this is something that always strikes me. We use that term colloquially, but mathematically that is exactly what telescope data from this era represents.

It's literally a time machine, right, because the speed of light is a hard limit. When we pull data from HETDEX that originated 10 billion years ago, we are capturing photons that have been traveling through the vacuum of space since before our Sun even existed.

Long before. The Earth is roughly four and a half billion years old, so this light had already been traveling for over five billion years before our planet even coalesced from a cloud of dust.
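Spelling out the subtraction behind that claim: $10\ \mathrm{Gyr} - 4.5\ \mathrm{Gyr} \approx 5.5\ \mathrm{Gyr}$, so the photons were already more than five billion years into their journey when the Earth formed.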
It's staggering to think about.

It really is. And capturing that specific ancient light to reconstruct those early fuel lines requires an entirely different approach than just taking a long-exposure photograph.

Right, you can't just take a picture.

Taking a standard optical image of this epoch to find diffuse gas is largely futile, because the background washes it out.

Exactly. The broadband filters used in standard photography, or photometry, simply let in too much background light. It washes out any faint structures.

So what's the alternative?

To map the intergalactic gas of 10 billion years ago, we have to abandon images completely. We rely almost entirely on spectroscopy.

We're not looking for the physical shape of a galaxy anymore.

No, we are looking for the highly specific physical signatures hidden within the electromagnetic radiation itself. We isolate distinct wavelengths of light that prove the presence of specific matter.

That requires transitioning our focus to the actual language of light: the spectrum.
Now, for you listening: breaking light down into a spectrum reveals the emission and absorption lines of chemical elements.

Like a cosmic barcode.

Right, a barcode. But I want to zero in on the specific wavelength that makes this entire HETDEX map possible: the Lyman-alpha emission line.

The holy grail for this era. Why is this specific quantum transition the ultimate tool for looking at cosmic noon?

Well, the Lyman-alpha line is the cornerstone of high-redshift observational astronomy. To understand why, we just need a quick look at the quantum mechanics of the hydrogen atom, which is the most abundant element out there by far.

Yeah.

So when a hydrogen atom sits near a region of intense star formation, like those massive chaotic galaxies we talked about, it gets bombarded by extreme ultraviolet radiation from the young, hot stars.

Yes, specifically massive O- and B-type stars. This radiation is so energetic that it ionizes the hydrogen. It physically strips the electron away from the proton.

And the signal we're looking for is created when that electron finds its way back, right?

Correct. When the proton eventually recombines with an electron, that electron cascades down through the atom's specific quantized energy levels.

Stepping down a ladder.

Exactly, like stepping down a ladder. And when it drops from the first excited state down to the ground state, the bottom rung of the ladder, it releases a photon with a very specific, unchangeable wavelength. And that wavelength is 121.6 nanometers. That is the Lyman-alpha line.
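As a quick sanity check (standard hydrogen-atom numbers, not worked out in the episode), the $n = 2 \to n = 1$ transition energy and the usual photon-energy-to-wavelength conversion reproduce that value:

$$E_{2\to1} = 13.6\ \mathrm{eV}\left(\frac{1}{1^2}-\frac{1}{2^2}\right) \approx 10.2\ \mathrm{eV}, \qquad \lambda = \frac{hc}{E} \approx \frac{1240\ \mathrm{eV\,nm}}{10.2\ \mathrm{eV}} \approx 121.6\ \mathrm{nm}$$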
121.6, okay.

Because the early universe was absolutely dominated by hydrogen, and the star formation rates were so extreme, these galaxies act as colossal Lyman-alpha factories. They pump out an astonishing number of these specific photons.

But wait, 121.6 nanometers is deep in the ultraviolet spectrum. That's invisible to the human eye.

It is, and it doesn't stay at 121.6 nanometers either.

Because of the expansion of the universe, right? The redshift.

As those photons travel through space for 10 billion years, space itself expands. It physically stretches the light waves. So by the time they reach the HETDEX instruments in Texas, that ultraviolet light has been redshifted straight into the visible optical bands, right around 350 to 550 nanometers.

Greenish-blue light.

Exactly.
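The stretch factor follows directly from the definition of redshift (a standard relation, using the round numbers from the episode):

$$\lambda_{\mathrm{obs}} = (1+z)\,\lambda_{\mathrm{rest}} \;\Rightarrow\; z = \frac{\lambda_{\mathrm{obs}}}{\lambda_{\mathrm{rest}}} - 1,$$

so $350\ \mathrm{nm}$ corresponds to $z \approx 350/121.6 - 1 \approx 1.9$, and $550\ \mathrm{nm}$ to $z \approx 3.5$.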
The source material notes that it shows up as a dramatic peak in the data. If you picture the spectrographic feed, you have this relatively flat continuum of background emission and then, boom, a violent, unmistakable spike at that specific redshifted wavelength.

It's the undeniable fingerprint of excited hydrogen at that exact distance in the universe. We call a source with that dramatic peak a Lyman-alpha emitter, or an LAE. Finding those massive spikes is the traditional methodology for locating high-redshift galaxies. If you see that peak, you have definitively located a bright, active galaxy from that specific epoch.

But the galaxies aren't the whole story.

No. Theoretical models of the cosmic web have always suggested that Lyman-alpha emission shouldn't be restricted just to the massive galaxies. The gas in between them should be glowing too. The vast filaments of intergalactic gas drifting between the galaxies should also be emitting these photons, either through recombination driven by the background radiation or from gas falling into dark matter halos and heating up.

But the emission from those gas filaments would be orders of magnitude weaker than the galaxies themselves.

Vastly weaker. I mean, the peaks from the galaxies are highly localized; they're relatively easy to extract from the data.

They stand out.

They do. But the emission from the intergalactic gas is exceptionally diffuse. It's just a subtle, incredibly faint glow spread across massive cosmic volumes, and for decades, finding that faint signal was thought to be impossible.

Virtually impossible on a large scale. The instrumental noise, the foreground light from our own solar system, it all drowns it out.
Which brings us to the sheer scale of the instrument required to even attempt this. You cannot just point a standard observatory telescope at the sky and hope to map this stuff.

No, you need a behemoth. You need the Hobby-Eberly Telescope Dark Energy Experiment at the McDonald Observatory in West Texas.

Let's discuss the volume of this survey, because the engineering reality of HETDEX is staggering.

The Hobby-Eberly Telescope, the HET, is a totally unique piece of engineering. Most big telescopes move on dual axes, right? They tilt up and down and spin around to track the sky.

Right, but the HET has a fixed elevation angle. It sits permanently at 55 degrees.

Okay.

It simply rotates in azimuth, around in a circle, while a highly complex tracker moves across the focal plane at the top to follow the astronomical targets as the Earth turns.

That's incredibly clever.
It's very efficient. And for the dark energy experiment, the telescope was upgraded with a massive array of spectrographs.

They call it VIRUS, the Visible Integral-field Replicable Unit Spectrograph.

That's the one. And VIRUS isn't just one single instrument.

No, it's a massive replication strategy. Instead of building one giant spectrograph, they built dozens of identical units and fed them with thousands of optical fibers. Over 30,000 optical fibers.

30,000.

This allows HETDEX to perform integral field spectroscopy on an industrial scale. They're taking spectra of thousands of discrete points on the sky simultaneously.

And their primary mission, as the name implies, is mapping the expansion history of the universe to constrain dark energy.

Yes. Their stated goal was to chart the 3D positions of over one million bright Lyman-alpha-emitting galaxies. To get that catalog of a million galaxies, they had to cover a massive area of sky, an area measuring over 2,000 full moons.

It's a huge swath of the celestial sphere.

Let's help you visualize that. The angular diameter of the full moon is about half a degree, so 2,000 full moons adds up to a massive, sweeping expanse of sky.
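To put a rough number on that (our back-of-envelope, using the half-degree figure from the episode): one full moon covers about

$$\Omega_{\mathrm{moon}} \approx \pi\left(\frac{0.5^\circ}{2}\right)^2 \approx 0.2\ \mathrm{deg}^2,$$

so 2,000 full moons is on the order of $2000 \times 0.2 \approx 400$ square degrees of sky.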
They're essentially blind-pointing these 30,000 fibers at the sky, pulling in light, separating it into its wavelengths, and generating an unbelievable 600 million individual spectra.

And here's where it gets really interesting. Karl Gebhardt, the principal investigator for HETDEX, revealed a metric about this data collection that fundamentally alters how we view these surveys.

It really does. Despite gathering 600 million spectra, the primary dark energy mission, the effort to map the one million bright galaxies, utilizes only roughly 5% of the collected data.

5%. That is the crucial pivot point of this entire deep dive.

It's what the primary pipeline for HETDEX is designed for: point source extraction. It scans all 600 million spectra looking for high signal-to-noise peaks.

The bright galaxies.

Exactly. Once it identifies and catalogs a galaxy, the rest of the data surrounding that peak, which makes up 95% of the total data set, is mathematically categorized as background noise.
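To make that point-source philosophy concrete, here is a minimal toy sketch of threshold-based peak extraction. It is illustrative only: the simulated spectrum, the 5-sigma cut, and the use of scipy.signal.find_peaks are our assumptions, not the actual HETDEX pipeline.

```python
# Toy point-source extraction: flag only high-SNR spikes in a spectrum
# and treat everything else as "background". Thresholds are illustrative.
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(7)
wavelength = np.linspace(350.0, 550.0, 2000)      # nm, HETDEX-like optical band
spectrum = rng.normal(0.0, 1.0, wavelength.size)  # instrumental noise, sigma = 1

# One bright LAE-like emission spike plus a faint diffuse pedestal.
spectrum += 8.0 * np.exp(-((wavelength - 472.0) ** 2) / (2 * 0.3 ** 2))
spectrum += 0.1                                   # diffuse glow, far below noise

# Keep only peaks above a hard 5-sigma cut. The bright spike is cataloged;
# the 0.1-level glow never triggers, so it is written off as background.
peaks, _ = find_peaks(spectrum, height=5.0)
print(wavelength[peaks])                          # peak(s) near 472 nm
```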
They threw out 95% of the data?

I mean, not literally deleted it from the hard drives, but scientifically it was sidelined. It was ignored. You build this incredibly complex array of 30,000 optical fibers, you survey 2,000 full moons of sky, and 95% of the photons you catch are deemed irrelevant just because they don't cross a specific brightness threshold.

Well, from a traditional survey perspective, that is standard operating procedure.

Really?

Yes. If your objective is a highly pure catalog of discrete, individual objects, anything that cannot be confidently resolved as an object is an impediment.

It's just getting in the way.

Exactly. It's foreground light, it's atmospheric airglow, it's thermal noise in the CCD detectors. But Maja Lujan Niemeyer and the team behind this new publication recognized a profound philosophical flaw in that approach.

They realized that the 95% is not empty noise.

No. Among the instrument artifacts and the airglow is the literal sea of light from the cosmic web. It contains the aggregate Lyman-alpha emission of all the dwarf galaxies that were too faint to trigger the detection algorithms, plus the glowing filaments of intergalactic gas.

It's the difference between mapping the peaks of a mountain range and mapping the entire tectonic plate underneath.

It's a great way to put it. The bright galaxies are just the most luminous nodes of a much larger, interconnected structure. But the challenge wasn't getting the data; HETDEX had already banked half a petabyte of it. The challenge was statistical.
Highly statistical. How do you extract an incredibly faint, highly diffuse signal from a data set where the noise is orders of magnitude louder than the signal?

You have to completely abandon the concept of object resolution.

Stop looking for individual things.

Exactly. You can no longer ask the data pipeline to find a specific galaxy. This requires transitioning to a technique known as line intensity mapping.

Line intensity mapping, or LIM. LIM fundamentally redefines the objective of the survey. Instead of searching for the spatial coordinates of bright peaks, LIM measures the integrated surface brightness of a specific spectral line across large cosmic volumes.

To make that concrete for you listening, Julián Muñoz, a co-author on the paper, offered an excellent analogy regarding how we view the spatial distribution of light.

The airplane analogy.

Yes. He compared the traditional cataloging method to flying in an airplane at night and trying to map a country's population by looking only at the brightest city centers.

It's highly illustrative. If the optical sensor on your aircraft is calibrated to register only the intense light output of major metropolitan areas, like New York, Chicago, Los Angeles, then your resulting map implies a binary distribution.

Exactly. It implies there are intense points of existence surrounded by totally empty voids. But we know demographically that the population is continuous. There are sprawling suburbs, rural highway corridors, and small towns connecting those major hubs.

And the traditional point source extraction of HETDEX was mapping the cosmic cities but was completely blind to the cosmic suburbs and the interstate highways of gas connecting them.

Because the light from the suburbs isn't concentrated enough to trigger the sensor.
So to map the entire landscape, Muñoz suggests keeping the airplane at the exact same altitude, looking at the exact same landscape, but changing the optical properties of the sensor. You look through a deliberately smudged window.

A smudged window.

The smudged window represents the spatial and spectral smoothing inherent in line intensity mapping. When you apply a smoothing kernel to the data, you intentionally degrade the resolution.

You make it blurry.

You make it very blurry. You can no longer distinguish the sharp boundaries of the major cities; the points of light blur and expand.

But there's a mathematical advantage to that, right?

A critical one. While resolution is lost, the total photon count is conserved. It's aggregated. The faint, sub-threshold light from the cosmic suburbs is integrated into larger volumetric pixels, which we call voxels.

Voxels. 3D pixels.

Right. And by integrating over larger volumes, the faint signal of the diffuse Lyman-alpha emission adds together while the random, uncorrelated instrumental noise averages down, so the signal naturally rises above the noise floor.
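Here is a minimal numerical sketch of why the smudged window works. Everything in it, the toy volume, the filament geometry, and the Gaussian kernel width, is our invention for illustration; it is the statistical idea, not the team's pipeline.

```python
# Smoothing in line intensity mapping: averaging over many voxels
# suppresses uncorrelated noise roughly as 1/sqrt(N), while a spatially
# extended signal, correlated across neighboring voxels, survives.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(42)

# Toy survey cube: a faint extended "filament" sheet buried in
# uncorrelated instrumental noise ten times louder than the signal.
shape = (64, 64, 64)
signal = np.zeros(shape)
signal[28:36, :, 30:34] = 0.1                 # diffuse sheet, amplitude 0.1
cube = signal + rng.normal(0.0, 1.0, shape)   # noise sigma = 1.0

# The "smudged window": blur into larger effective voxels.
smoothed = gaussian_filter(cube, sigma=3.0)

def contrast(data):
    """Mean in the filament region over the scatter of an empty corner."""
    return data[28:36, :, 30:34].mean() / data[:16, :, :16].std()

print(f"raw contrast:      {contrast(cube):.2f}")      # ~0.1, invisible
print(f"smoothed contrast: {contrast(smoothed):.2f}")  # ~1, detectable
```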
It's a brilliant statistical maneuver. You sacrifice the ability to say "there is a distinct dwarf galaxy at coordinate X"...

Right.

...but you gain the ability to say "this entire region of space is radiating a faint Lyman-alpha glow." The blurry picture actually contains more cosmological information about the distribution of matter than the sharp, highly filtered picture did.

It's a more complete truth. Now, the source material does clarify that line intensity mapping itself is not a newly invented concept.

Right, radio astronomers have used it.

They've used it for years to map the 21-centimeter line of neutral hydrogen. But applying this technique to the redshifted ultraviolet Lyman-alpha emission in the optical band, over a survey area of 2,000 full moons, is entirely unprecedented. And the application at this scale introduces formidable computational complexities.

Massive complexities. In 21-centimeter mapping, the foregrounds are intense, but the spectral line itself is relatively straightforward. Lyman-alpha is messy. It's a resonant line; the photons scatter repeatedly off neutral hydrogen atoms before they ever escape the galactic halo, which makes the radiative transfer extremely complex to model. Plus, the data set itself is gargantuan. Applying line intensity mapping to the discarded 95 percent of the HETDEX data required processing roughly half a petabyte of raw spectroscopic files.

Half a petabyte. Let's give that some scale. If you consider a high-definition movie to be roughly five gigabytes, half a petabyte is equivalent to 100,000 high-definition movies.
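For the record, the movie arithmetic checks out:

$$\frac{0.5\ \mathrm{PB}}{5\ \mathrm{GB/movie}} = \frac{500{,}000\ \mathrm{GB}}{5\ \mathrm{GB/movie}} = 100{,}000\ \mathrm{movies}.$$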
You're not doing that on a laptop.

No. Processing that volume of data isn't something you do on a workstation in a university lab. The team had to rely on the Texas Advanced Computing Center, or TACC, using supercomputers like Frontera and Stampede and running completely custom pipelines. They had to mathematically strip away the atmospheric emission lines, the foreground zodiacal light from our own solar system, the galactic cirrus from the Milky Way, and all the instrumental artifacts from 600 million spectra, all without accidentally erasing the incredibly fragile, ultra-faint Lyman-alpha signal hidden underneath.

The data reduction pipeline is an engineering marvel in itself, but raw computational power is meaningless without a rigorous physical framework to guide it.
Right. If we connect this to the bigger picture, the actual methodology used to reveal the cosmic web relies on a foundational property of cosmology: gravitational clustering.

Exactly. The universe is not a uniform soup of matter. It is heavily structured by the gravitational potential wells of dark matter halos. And this is where Eiichiro Komatsu's signpost technique comes into play.

Komatsu is a highly respected cosmologist at the Max Planck Institute for Astrophysics, and his contribution here is brilliant. It bridges the gap between the 5% catalog and the 95% noise. He utilizes the concept of cross-correlation. Since gravity dictates that matter will pile up inside these dark matter halos, we know that massive, bright galaxies do not exist in isolation.

They sit at the densest nodes of the cosmic web.

Exactly. The one million bright Lyman-alpha emitters, the 5% of the data already cataloged by HETDEX, aren't discarded in this new map. Far from it. They're the anchors.

The anchors.

In cosmology, we use a metric called the two-point correlation function.

Which means?

It essentially quantifies the excess probability, relative to random chance, of finding a signal at a given distance from a known reference point.
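In its standard textbook form (our notation, not necessarily the paper's), the two-point correlation function $\xi(r)$ is defined through the excess probability of finding a pair of tracers in volume elements $dV_1$ and $dV_2$ separated by a distance $r$:

$$dP = \bar{n}^2\left[1+\xi(r)\right]\,dV_1\,dV_2,$$

where $\bar{n}$ is the mean tracer density; the signpost method uses the analogous cross-correlation between cataloged galaxy positions and the smoothed intensity map.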
Okay. Komatsu's signpost method leverages this by cross-correlating the known 3D positions of the bright galaxies, the cities, with the heavily smoothed, faint intensity map built from the remaining 95% of the data, the suburbs.

So the supercomputer takes the coordinates of a known bright galaxy and says: based on the laws of gravity, there should be a localized overdensity of gas and dwarf galaxies right around this coordinate.

Right, it knows where to look. It then examines the intensity map in that specific region and extracts the faint signal that statistically correlates with the presence of that bright anchor. The bright galaxies act as gravitational signposts saying, "Look here, the cosmic web is thickest right around me." By stacking the signals around hundreds of thousands of these signposts, something amazing happens with the map. The random instrumental noise, which does not correlate with the physical positions of the galaxies, averages out toward zero, but the real astrophysical signal from the intergalactic gas adds up. It stacks.
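A toy sketch of that stacking logic, again with every number invented for illustration (this is not the HETDEX analysis code): plant a glow around known positions, far below the noise, and watch the average recover it.

```python
# "Signpost" stacking: cut out the intensity map around each cataloged
# bright-galaxy position and average. Noise uncorrelated with the
# positions averages toward zero; the correlated glow adds up.
import numpy as np

rng = np.random.default_rng(0)

map_size, n_gal, cut = 512, 1000, 8
intensity = rng.normal(0.0, 1.0, (map_size, map_size))  # noise-only map

# Plant a faint glow (amplitude 0.1, ten times below the noise)
# around each "cataloged galaxy" position.
xs = rng.integers(cut, map_size - cut, n_gal)
ys = rng.integers(cut, map_size - cut, n_gal)
yy, xx = np.mgrid[-cut:cut, -cut:cut]
glow = 0.1 * np.exp(-(xx**2 + yy**2) / (2 * 3.0**2))
for x, y in zip(xs, ys):
    intensity[y - cut:y + cut, x - cut:x + cut] += glow

# Stack: average the cutout around every known position. Noise shrinks
# like 1/sqrt(n_gal) ~ 0.03, so the 0.1 glow, invisible in any single
# cutout, dominates the stacked image.
stack = np.mean(
    [intensity[y - cut:y + cut, x - cut:x + cut] for x, y in zip(xs, ys)],
    axis=0,
)
print(f"stack center: {stack[cut, cut]:.3f}")  # ~0.1 (the recovered glow)
print(f"stack corner: {stack[0, 0]:.3f}")      # ~0.0 +/- 0.03
```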
This cross-correlation technique allowed the supercomputers to mathematically triangulate and reveal the three-dimensional morphology of the diffuse gas.

It's incredible to visualize the output. You begin with an empty void punctuated by one million isolated, brilliant points of light.

A standard scatter plot.

Exactly, a scatter plot. But as the cross-correlation algorithm runs across the half petabyte of data, the spaces between those points begin to glow. The massive filaments of hydrogen gas emerge from the background noise, stretching across millions of light years and linking the bright nodes together.

The cosmic suburbs are illuminated.

It transforms our view of the 9-to-11-billion-year-old universe from a collection of isolated islands into a massive, contiguous structure.

So what does this all mean? The empirical observation of the structure is a monumental achievement, obviously, but why does having this empirical map matter if we already had theoretical models?

That's a great question. I mean, we have massive supercomputer simulations like IllustrisTNG or the EAGLE project. These simulations take the initial conditions of the Big Bang, apply the laws of fluid dynamics, dark matter, and gravity, and run it all forward to see how the cosmic web forms.
Aren't those simulations accurate enough?

Well, simulations are inherently limited by their resolution and by the assumptions encoded in their subgrid physics.

Subgrid physics?

In a cosmological simulation spanning hundreds of millions of light years, it is computationally impossible to model the physics of individual stars or individual supermassive black holes. The scale is just too vast.

Right.

Therefore, simulators use recipes to approximate the effects of supernova feedback or active galactic nucleus, AGN, feedback, the processes that violently eject gas out of galaxies and back into the intergalactic medium.

Precisely. They have to estimate how much energy a black hole dumps into the surrounding gas, because they can't simulate every photon.
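To give one concrete flavor of such a recipe (a common parametrization in the simulation literature, not a detail taken from this paper): AGN feedback is often injected as a fixed fraction of the rest-mass energy of the gas the black hole accretes,

$$\dot{E}_{\mathrm{AGN}} = \varepsilon_f\,\varepsilon_r\,\dot{M}_{\mathrm{BH}}\,c^2,$$

with a radiative efficiency $\varepsilon_r \sim 0.1$ and a coupling efficiency $\varepsilon_f$ of order a few percent; both are tuned parameters, not derived quantities.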
And if the assumptions in those feedback recipes are slightly inaccurate, then the resulting distribution of gas in the simulation will diverge from reality.

Prior to the HETDEX line intensity map, theorists could simulate the flow of gas into and out of galaxies at cosmic noon, but they lacked the comprehensive observational data to verify whether their feedback models were actually correct. The HETDEX map provides the ground truth. It is the real universe.

So cosmologists can now take the mock Lyman-alpha emission catalogs generated by their simulations and directly cross-correlate them with the actual spatial distribution observed by HETDEX.

It's the ultimate reality check for theoretical physics. If your simulation says the gas should be blown 500,000 light years away from the galaxy by a quasar, but the HETDEX map shows the gas is tightly bound within 100,000 light years, you know your subgrid physics recipe for AGN feedback is wrong.

It forces the theoretical models to conform to empirical reality.

Exactly. And honestly, this dynamic, testing sophisticated models against raw, messy, real-world data, is a critical philosophical anchor, not just in astrophysics but in any data-driven field today.
Absolutely. We live in an era heavily dependent on predictive models and algorithms. We simulate climate impacts, economic shifts, epidemiological spread. It is incredibly easy to trust the output of a model simply because its internal logic is sound. But the HETDEX project reminds us that the model is merely a hypothesis.

Right, until it is aggressively tested against the totality of the available data, not just the 5 percent that is bright, clean, and easy to measure.

That is a highly pertinent observation. The reliance on heavily filtered data to inform generalized models is a systemic vulnerability across all the sciences. The HETDEX map unequivocally demonstrates the value of mining the discarded data. And within the context of astrophysics, this initial map is really merely the vanguard.

Just the beginning.

The successful application of line intensity mapping to Lyman-alpha emission establishes a rigorous methodological foundation for the entire future of cosmic cartography.
Because the research team is already pivoting toward applying this technique to different spectral lines, right?

Yes, to map entirely different components of the galactic ecosystem, because the Lyman-alpha line only tells us part of the story. It traces the ionized and excited neutral hydrogen; it shows us where the massive, hot stars are irradiating the surrounding gas. It maps the active, violent regions of the cosmic web. But to understand the complete baryon cycle, how gas flows in from the voids, cools, condenses, and eventually forms stars, we need to map the cold gas as well.

Which raises an important question: how do we observe the reservoirs of fuel that haven't ignited yet? The dark stuff.

Right. The immediate next step outlined by the researchers involves targeting the emission lines of carbon monoxide, specifically the rotational transitions of the CO molecule. While Lyman-alpha traces gas at temperatures of roughly 10,000 Kelvin, carbon monoxide is an excellent proxy for locating giant molecular clouds, where the gas temperature drops to just tens of degrees above absolute zero.
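For reference, the standard numbers here (well-established values, not specific to this paper): the lowest rotational transition of CO sits at

$$\mathrm{CO}(J = 1 \to 0): \quad \nu \approx 115.27\ \mathrm{GHz}, \qquad \lambda \approx 2.6\ \mathrm{mm},$$

which is exactly why millimeter-wavelength facilities are the natural instruments for this follow-up.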
Freezing cold.

And those freezing, incredibly dense molecular clouds are the actual stellar nurseries. You need cold gas because thermal pressure fights against gravity. Heat pushes out; gravity pulls in.

Right. Only when the gas cools down sufficiently can gravity take over, causing the cloud to collapse and ignite the nuclear fusion that births new stars.

Exactly. So by conducting line intensity mapping surveys for carbon monoxide using millimeter-wavelength arrays like ALMA or future facilities, and cross-correlating that data with the HETDEX Lyman-alpha map, astronomers can effectively trace the entire thermodynamic life cycle of galactic evolution. You map the cold, infalling molecular gas with CO. You map the regions of active, massive star formation with the bright Lyman-alpha peaks. And you map the diffuse, outflowing, or heavily irradiated gas in the circumgalactic medium with the faint Lyman-alpha intensity map. Layering these maps over the exact same cosmological volume will give us a complete, multi-phase view of the universe at its most active epoch.

We will literally watch the respiratory system of the cosmos: galaxies inhaling cold molecular gas and exhaling hot ionized plasma.

The technological capacity to do this is rapidly expanding too, with pioneering instruments like the Hobby-Eberly Telescope paving the way and upcoming line intensity mapping missions like SPHEREx and EXCLAIM coming online soon. We are transitioning from an era of cataloging isolated objects to mapping the continuous fluid dynamics of the universe. Muñoz aptly characterized this as the start of a golden age for cosmic mapping.

It truly is. The HETDEX publication is a proof of concept that fundamentally validates the intensity mapping technique for optical and ultraviolet wavelengths on cosmological scales. It proves that the faint, unresolved emission is not a barrier to observation but rather a profound source of physical information.
Let's summarize the sheer scope of what we've discussed today. We've traveled 10 billion years into the past, to cosmic noon, an era defined by extreme galactic assembly. We examined how the quantum transition of the hydrogen atom produces the 121.6-nanometer Lyman-alpha line, and how cosmological redshift stretches that signal all the way into the optical. We detailed the massive engineering of the HETDEX project, the 30,000-plus fibers, and the brilliant statistical pivot from point source cataloging to line intensity mapping. By using supercomputers to cross-correlate the known coordinates of one million bright galaxies with half a petabyte of heavily smoothed, previously discarded data, the team successfully illuminated the faint, diffuse filaments of the cosmic web.

They transformed 95 percent "background noise" into the most robust empirical test of cosmological simulations ever created.

The transition from viewing the universe as a collection of discrete luminous points to a continuous, interconnected topological field is just an extraordinary scientific advancement.

It underscores a fundamental principle, really: the limits of our understanding are often dictated not by the absence of information but by our methodological filters.

Wow. We structured our most advanced cosmological models while ignoring 95 percent of the observational data, because it failed to cross an arbitrary threshold of clarity.

Which demands that we ask a pretty profound question.

It does. If the literal connective tissue of the universe was hiding in the noise we systematically discarded, what other foundational truths, in physics, in the complex systems of our own planet, or even in the structure of our societies, are we completely missing, simply because we have not yet developed the framework to find meaning in the blur? The HETDEX map proves the information is there. The challenge is having the vision to read it.
