
AI is transforming everything—but what’s powering it? In this episode of The Deep, Erika Ahern uncovers the hidden costs of AI data centers: skyrocketing energy use, water consumption, rising utility bills, and growing local backlash. The cloud isn’t weightless—it’s industrial. And it might be coming to your backyard next.
Timestamps:
0:00 Intro: The AI data center explosion
1:59 How data centers have changed
4:07 Mapping U.S. data centers
5:40 Energy consumption
7:52 Residents pay the price
9:18 Water usage
11:11 Land grabs and noise
12:23 Tax incentives
13:23 Do data centers create jobs?
14:55 Encroachment on residential areas
15:37 What are we doing about it?
17:24 Conclusion: Is there any stopping it?
Watch The Deep on Zeale: https://zeale.co/podcasts/the-deep
The age of artificial intelligence is upon us, an industrial revolution times ten.
The coming changes carry profound spiritual and cultural implications,
but I recently looked into how that revolution is also transforming our physical environment.
All that data we use doesn't just live in digital reality. It lives in data centers.
These centers are springing up right where we live next to our homes, our schools, and our parks.
And it turns out we hate it. Across the country proposals for new AI data centers
are beginning to trigger zoning battles, public hearings, and community protests.
To date, a growing wave of local and often bipartisan opposition to data centers
has blocked over $64 billion in planned projects.
93% of Americans support data center construction,
but only 35% say they would vote yes to a center in their hometown if a vote were to be held.
Everyone agrees: not in my backyard. So why the unified resistance?
I wanted to understand the anger and controversy that AI data centers have stirred up in Americans
from Arizona to Iowa to Virginia.
What I found was a vast physical footprint of an inhuman scale set to expand exponentially
in the next decade and consume vast quantities of energy, land, and water as it grows.
The AI revolution isn't just coming for our minds.
It's already eating up our land and resources.
The question is, is your backyard next?
Before the 1990s, every office in America had a data center.
It was called a computer closet. But as the dot-com boom took off,
companies increasingly outsourced data colocation to third-party data centers.
In the cloud computing era, platforms shifted the market toward massive, scalable facilities.
Today's hyper-scale data centers house rows of computer servers,
networking equipment, and data storage systems,
along with the support systems like power and cooling that keep everything running.
If you use any recommendation engines on your Amazon app,
that's only possible because Amazon runs thousands of servers in hyper-scale data centers worldwide.
Traditional computing workloads are handled primarily by CPUs.
CPUs are excellent for running operating systems, spreadsheets, databases, and web browsers.
But AI training and generation require something different.
The AI industry needs hardware that can perform millions or billions of calculations
simultaneously. That's where GPUs come in.
Originally designed to render complex images in video games,
GPUs contain thousands of smaller processing cores that perform calculations in parallel.
That makes them extremely effective at the kind of matrix math
required by neural networks. But it also dramatically increases electricity consumption.
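To make that concrete, here is a toy Python sketch of the matrix math a neural-network layer boils down to. Every output cell is an independent little sum, which is exactly the kind of work a GPU's thousands of cores can split up and run at the same time. (Illustrative only; real AI workloads run this through GPU libraries at vastly larger scale, and the numbers here are made up.)

```python
def matmul(a, b):
    """Multiply matrix a (m x n) by matrix b (n x p), cell by cell."""
    m, n, p = len(a), len(b), len(b[0])
    # Each out[i][j] depends only on row i of a and column j of b --
    # no cell waits on any other, which is what makes the work parallel.
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(p)]
            for i in range(m)]

weights = [[1, 2], [3, 4]]      # a toy 2x2 "layer"
inputs = [[5], [6]]             # a single input column
print(matmul(weights, inputs))  # [[17], [39]]
```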
A traditional enterprise server rack might consume between 5 and 10 kilowatts of electricity.
Modern AI racks can consume between 40 and 120 kilowatts.
Next generation designs are expected to exceed 150 kilowatts per rack.
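Those per-rack figures can be sanity-checked with quick arithmetic. A back-of-the-envelope Python sketch (the midpoints are a rough choice of mine, not from the episode):

```python
# Per-rack power draw figures cited above, in kilowatts.
legacy_mid = (5 + 10) / 2     # typical traditional enterprise rack: 7.5 kW
ai_mid = (40 + 120) / 2       # typical modern AI rack: 80 kW
next_gen = 150                # next-generation AI rack

print(round(ai_mid / legacy_mid, 1))    # roughly 10x a legacy rack
print(round(next_gen / legacy_mid, 1))  # roughly 20x a legacy rack
```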
In other words, a single rack of AI hardware may consume 10 to 20 times more electricity
than older computing equipment. So where are these centers today?
The USA Data Center map currently lists 4,069 centers in existence in the 50 states,
with a third of those centers operating in just three states, Virginia, Texas, and California.
Virginia alone, particularly Loudoun County in the northern part of the state,
is home to what's referred to as Data Center Alley. It is the densest cluster of data centers in the world,
and handles about 70% of global internet traffic.
Drawn by the plentiful water supply for cooling systems, a world-class fiber optic network,
a reliable power grid, and proximity to the United States federal government,
data center operators like Equinix, NTT, AWS, and Stack came for the location
and the generous tax breaks, and stayed to morph into hyper-sized facilities.
Northern Virginia is also one of the most densely populated regions on the very populous East Coast,
and the massive centers are now bumping up against long-established neighborhoods.
And the collision between man and his cloud is getting ugly.
Throughout the United States, the controversy breaks out into full-throated community rebellion
over four distinct but related issues: power, water, jobs, and taxes.
Right now, data centers consume roughly 4% of all electricity generated in the United States.
That may sound like no big deal, but the number is rising quickly as companies race
to build the infrastructure needed to train and operate large AI models.
In 2024, data centers consumed approximately 183 terawatt hours of electricity
in the United States. To put that in perspective, a single terawatt hour
represents 1 billion kilowatt hours of electricity, enough to power tens of thousands of homes
for a full year. By 2030, analysts project that data center electricity use could reach 426
terawatt hours annually. That represents a 133 percent increase in just six years.
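The growth math checks out. A quick Python check of the figures (the roughly 11,000 kWh per home per year used below is my own assumed average, not a number from the episode):

```python
twh_2024 = 183   # US data center consumption, 2024 (TWh)
twh_2030 = 426   # projected consumption, 2030 (TWh)

growth = (twh_2030 - twh_2024) / twh_2024
print(round(growth * 100))   # 133 percent increase over six years

# 1 TWh = 1 billion kWh. At an assumed ~11,000 kWh per US home per year,
# a single terawatt hour powers on the order of tens of thousands of homes.
homes_per_twh = 1_000_000_000 / 11_000
print(round(homes_per_twh))  # roughly 90,000 homes for a full year
```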
Some projections are even higher. Depending on how rapidly artificial intelligence expands,
data centers could account for between 9 and 17 percent of all US electricity demand
by the end of the decade. For context, 10 percent of total electricity consumption in the United
States is roughly equivalent to all residential lighting nationwide. In other words,
the energy needed to power AI data centers could soon exceed the electricity used to light
every home in America. And the US Energy Information Administration has already warned
that nationwide electricity demand is expected to hit record highs in both 2026 and 2027,
driven in part by new data center construction. For example, Northern Virginia is planning to add
an additional 70 data mega centers to its existing 600 in the next few years. To meet that demand,
the state's main utility company, Dominion, has contracted to triple its power production
by adding 40 gigawatts of new energy capacity. So then the question becomes, who pays for it?
Supporting that kind of demand often requires new substations, new transmission lines,
and in some cases entirely new power plants. Utility companies pass the cost of those upgrades
onto customers, not just the data center customers, but also residents. While local communities face
rising electricity bills, AI data centers are often shielded from those costs through bulk power
purchasing agreements or PPAs with investor-owned utilities. These deals allow massive facilities
to secure discounted long-term power, even as their energy demand skyrockets.
In August 2024, Meta signed two major PPAs with German power producer RWE,
locking in 374 megawatts of electricity for its projects in Illinois and Louisiana.
Meanwhile, residents saw the opposite trend. Electricity prices have climbed 20.7%
in Clark County, Illinois and 39% in Lafitte, Louisiana, even as solar PPA prices remained steady.
Data centers and utilities often work together to push rate increases and deflect scrutiny
in state and local government, but it's not just electricity we're talking about here.
We're also talking about the Earth's most precious resource: water.
In the past, data centers relied primarily on giant air conditioning systems
to remove heat from server rooms, but the skyrocketing thermal output of AI hardware
has pushed many operators toward new cooling technologies. One of the most common approaches
is direct-to-chip liquid cooling, where coolant circulates through small plates attached
directly to processors. Another method is immersion cooling, where entire server components are
submerged in specialized non-conductive liquids that absorb heat efficiently. Many large data centers
rely on evaporative cooling systems, which dissipate heat by evaporating water. While innovation
can reduce the electricity needed to cool the servers, these new techniques all rely on water,
and the scale can be enormous. Large facilities use one to five million gallons of water per day
during peak cooling periods. The thirstiest center in the United States, in Council Bluffs, Iowa,
used 980 million gallons in 2023, about 2.7 million gallons per day.
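Those water figures are easy to verify with the numbers cited in this segment (plain Python arithmetic, nothing more):

```python
annual_gallons = 980_000_000          # Council Bluffs center, 2023
daily = annual_gallons / 365
print(round(daily / 1e6, 1))          # about 2.7 million gallons per day

per_person = 82                       # average American's daily use, gallons
population = 67_000                   # Council Bluffs residents
town_daily = per_person * population  # about 5.5 million gallons per day
print(round(daily / town_daily * 100))  # roughly a 49 percent increase
```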
For perspective, an individual American uses on average 82 gallons of water per day.
With its population of 67,000, Council Bluffs residents use about 5.5 million gallons of water
daily. The data center, run by Google, brought about 900 jobs to town, but also a 49 percent
increase in the town's water usage. Two other touch points for local rage against the machine:
land grabs and noise. A typical hyperscale center hosts over 5,000 servers, and they have footprints
in the hundreds of thousands of square feet. Cooling systems and generators can produce a continuous
hum, sometimes exceeding 90 decibels, which is above the level considered harmful to hearing over
prolonged exposure. The ambient drone of the massive fans, chillers, and backup generators from
these enormous facilities, has been blamed for night terrors, depression, anxiety, and even
the disruption of bird migration. But in spite of all these concerns, the prospects of economic
growth are tantalizing and hard to resist. After all, the centers do bring jobs, money, and tax
revenue to the localities where they spring up. Loudoun County claims that its data center
taxable real property grew to 42.5 billion dollars in 2025, accounting for 75 percent of the
value of commercial real estate in the county. To attract investment at that scale, however, states
and small towns offer enormous incentives, property tax abatements, sales tax exemptions on
equipment, and infrastructure subsidies. Virginia alone provides about 1.6 billion dollars a year
in tax breaks for data centers. In less industrialized regions, town councils are rushing into
development deals that leave their own constituents in the dust. In Independence, Missouri, citizens
are fighting the council's intention to give a 90 percent tax abatement to Nebius, a developer
that wants to invest $150.6 billion in a new 2.1 million square foot center. The council is
excited about a possible $651 million in new annual revenue. The residents are pointing to
the $6.26 billion left on the table. The Independence, Missouri, council also points to job creation as one of
the can't-say-no factors in its deal with Nebius. But residents are pointing out that the majority of jobs
that a data center creates are temporary. For example, at peak construction, the Meta Richland
Parish Center in Louisiana employed 5,000 workers a day. Once complete, the center will provide about
500 operational jobs. Analysts at the Hamm Institute estimate that the 10 billion dollar center
will also generate another 1,000 jobs in the local supply chain. According to the Nevada
Independent, business records from 2024 for a data center located outside Reno estimated the 1.1
million square foot facility would create 73 permanent jobs in the next decade and more than 4,000
temporary construction jobs. These figures make the local economic benefits of a data center
more comparable to the economic impact of bridge or highway construction, a comparison that makes
sense when you see the cloud as a new economic locus. They also reinforce the sense that local
governments who court these centers are in a way investing in the hollowing out of their own
communities, creating vast populations of transient workers destined to move on as soon as the job
is done. It's not just the tax breaks and the transient workforce. It's the zoning laws.
In some regions, loose rules have allowed the centers to be built within a few hundred feet of
residential neighborhoods. Homeowners in northern Virginia might enjoy the nation's highest median
income, but millions of dollars cannot shield them from constant industrial noise, diesel fumes
and construction blasting near their homes. In Independence, Missouri, residents are looking at 398
acres of open country they will never get back. Residents are miserable and angry and politicians
are taking notice, at least at the national level. In the halls of the United States Congress,
politicians are scrambling to find a balance between hatred of the data centers, love of the cloud,
the power of big tech and the fear of falling behind in the global AI arms race.
And we seem to be stumbling, with a lot of trial and error, toward new ways of walking that
tightrope. A few examples: In February 2026, Senators Josh Hawley, a Republican, and Richard Blumenthal,
a Connecticut Democrat, introduced a bill that would require data centers to build their own
energy sources. Other proposals floating around include charging data centers more than households
for electricity and banning construction altogether. The Trump administration has released statements
demanding that transmission organizations force tech companies to pay for any new power plants
added to the grid on their behalf. And the big companies don't like the local pushback either.
They want to build centers and to do that, they have to build trust. Both Microsoft and Anthropic
recently announced plans to pay their way and cover costs for grid upgrades at their centers.
But the accounting is tricky and ironing out the details of a satisfactory bargain for everyone
will take time. While these remedies could start to address the economic and some environmental
concerns brought on by data centers, nothing yet addresses the questions of disappearing land,
the effects of ambient noise on residents, and the hollowing out of communities and the
transformation of the workforce. So where does that leave us? We no longer stand at the edge of
another technological revolution. We are fully engaged. Artificial intelligence promises
extraordinary gains, speed, efficiency, information processed at a vast scale. The history of human
innovation suggests that those gains will be real. They will also be painful. And as individuals
and communities, we have to decide if they cost us something we should never surrender.
It's fashionable to dismiss critics of technological change as Luddites. But the original Luddites
were not irrational enemies of progress. They were skilled textile workers in early industrial
England whose livelihoods were wiped out almost overnight as factory owners mechanized production.
The Luddites smashed the looms not because they hated all innovation. They hated a system that
distributed the benefits upward while sending all the disruption downward. The tension is not new.
Every major technological leap forces societies to answer the same questions. Who benefits?
Who bears the costs? And who controls the course of events? The answer isn't to stop innovation.
It's something harder. We have to learn to innovate better. We have to weigh the benefits
clearly and honestly with the harm. Perhaps we can't turn this train around. But we can put on the
brakes and demand more efficient ways to power AI. The question is, will we, in the name of profit,
allow big tech companies to rape and pillage the land and our communities in the meantime?
We can't pretend to ourselves that the infrastructure powering the AI economy,
the tasks that we do every day on our devices, exists in some wispy digital realm. As humans,
we like to act like the digital economy is something frictionless, something that floats above the
physical world. But AI data centers are forcing us to rediscover that the opposite is true.
The internet has an industrial footprint. There's likely no turning back from this transformation.
The benefits are too powerful and the global competition too intense. So we need a more honest
accounting of costs, of trade-offs, and of who pays for the infrastructure that powers it.
In other words, the machine that lives in the cloud still runs on ancient, precious inputs:
electricity, water, land, and human lives. If you liked this episode, be sure to subscribe to
the Loopcast. We bring you a new episode of The Deep every Thursday at 4 p.m. Eastern.

The LOOPcast
