
Oil prices are rising, flights are being canceled, and businesses around the world are feeling the pressure. But another major conflict is also affecting business: a legal fight between AI company Anthropic and the US government. In this episode, we look at how this case raises important questions about business ethics, government power, and the future of AI.
In this episode of Down to Business English, Skip Montreux and Samantha Vega get Down to Business with a major legal dispute between Anthropic, the company behind Claude AI, and the US government.
Skip and Samantha's report gives listeners a clear and practical look at an important business and technology story, while helping you build your Business English. In this episode, you will learn:
Become a D2B Member today to access our NEW interactive audio scripts, PDF Audio Script Library, Bonus Vocabulary episodes, and D2B Member-only episodes.
Visit d2benglish.com/membership for more information.
Follow Down to Business English on Apple Podcasts, rate the show, and leave a comment.
Contact Skip, Dez, and Samantha at
Follow Skip & Dez
From Tokyo, Japan, and New Plymouth, New Zealand, this is Down to Business English with your hosts, Skip Montreux and Samantha Vega.
Wow, I just paid $200 to fill up my car.
$200.
Talk about painful.
Wow, what did it use to cost you?
A lot less than that. I think I'm just noticing it more now than anything else.
Of course, yeah, the pain in the pocket at the fuel pumps is just incredible at the moment.
And this all has to do with what is happening in the Middle East at the moment, with this war, this U.S.-Israeli war against Iran.
Yes, the Strait of Hormuz, a choke point for 20% of the global oil supply.
And with that oil not being able to get distributed internationally,
it's having a huge impact on global business.
Yes, there's lots of different businesses that are thinking of different ways to combat the spike in oil prices.
The New Zealand government is considering work-from-home initiatives for its employees, and for businesses, to help combat the high price of petrol.
I know in other countries, some employers are looking at moving their employees to a four-day work week so they don't have to drive to work five days a week.
They only have to deal with the fuel prices on four days of the week.
Save money.
Yeah, I know here in Japan on March 9th, right after the initial attack on Iran, the Nikkei 225 crashed.
Something like 5% in one day, and it still hasn't really recovered.
It's been a very volatile recovery.
And almost every industry, from automotive to steel,
all the major companies are making some kind of announcement about how these high oil prices are impacting their production.
I even heard of a potato chip company announcing they're going to, I don't know, reduce or cancel production of their potato chips because they couldn't secure enough cooking oil.
Oh my god.
Or oil for their factory.
I'm not quite sure.
Anyway, it's having a widespread impact in every sector of business.
Oh gosh, they're hitting us in the snacks.
Air New Zealand, for example, has canceled 1,100 flights.
I've had to book a flight.
I've got a business trip in a couple of months, and I had to book it now so that I know that I'm flying.
And hopefully the flight will still exist, because they've canceled those 1,100 flights due to high fuel prices, at least until the end of May.
And then I think they're going to re-evaluate those cancellations, whether they can bring the flights back or whether they'll have to continue them into the future.
All of this turmoil in the Persian Gulf, this military conflict, or let's just call it what it is, a war.
It's not our story today, but it is related to our topic.
That's right, it is related.
Today, we will be reporting on an unprecedented and high-stakes legal battle between Anthropic, the company behind Claude AI, and the US Armed Forces, or Department of War.
What is with the Department of War? I thought it was the Department of Defense.
There's an interesting story behind that.
What's that?
Originally, all the way back in 1789, the very first president of the US, President George Washington, established the Department of War.
And it remained that way throughout American history, all the way up until after World War II.
In 1947, they brought all the branches of the Armed Forces, the Air Force, the Navy, the Marines, and the Army, under one roof, and Congress passed a law calling it the Department of Defense.
Ah, that makes sense.
However, when Donald Trump came back into office for the second time, he wanted to change it, so he signed an executive order changing it to the Department of War.
According to a statement from the White House, the name conveyed a stronger message of readiness and resolve compared to the somewhat softer title, Department of Defense.
So that's why they're calling it the Department of War these days.
So legally, it's still known as the Department of Defense, but they have changed the name to the Department of War because they think it sounds more robust or tenacious.
That's right, but let's not get bogged down on that.
The focus of our report is not the department's name, but rather how it does business with private companies.
This story is going to illustrate a collision between a cutting-edge technology company, corporate ethics, and a $200 million government contract.
And this story also remains fluid because other tech industry giants are piping up and joining the conversation, and the issue is evolving.
The business community is reeling because it challenges whether a company can maintain ethical red lines while serving the state at the same time.
So let's do it.
Let's get D2B, down to business, with Anthropic versus the United States.
The dispute between the government and Anthropic started in September of 2025, when US Secretary of War Pete Hegseth issued a memo
demanding that Anthropic, along with all other AI contractors, permit the Department of Defense all lawful uses of their technologies.
Fundamentally, Anthropic was fine with 98% of what the military wanted to use Claude for, but there were two use cases that they would not accept.
So what would Anthropic not do?
The red lines Anthropic refused to allow Claude to be used for were, number one, using Claude for mass domestic surveillance.
And number two, using their AI in fully autonomous lethal weapons.
To allow Claude to be used for these purposes, Anthropic would have to go against their stated company beliefs and ethical policies and remove the self-imposed guardrails on their AI model, Claude.
And Anthropic stuck to their guns. They wouldn't let the US government change their minds about granting access to those two capabilities:
the ability to surveil the US public en masse, and the ability to control fully autonomous lethal weapons, weapons that can be used without human oversight.
It's important to note that CEO Dario Amodei argued that it's not that he's 100% against using AI for those two use cases; rather, his argument is that current AI is just not reliable enough, especially for autonomous warfare.
It could easily misidentify targets.
In other words, Anthropic said their technology isn't up to par yet.
That's precisely what they're saying.
And the government retaliated by designating Anthropic a supply chain risk to national security, and ordered all federal agencies, as well as contractors to federal agencies, to cease using its technology.
Being labeled a supply chain risk is going to have such a negative impact on Anthropic's business operations.
It will. The label supply chain risk to national security is typically reserved for foreign adversaries like China or Russia.
So thousands of contractors and government agencies, like NASA and the Department of Health and Human Services, now must certify, must prove, in other words, by filling out a load of paperwork, that they aren't using Claude.
So this is creating an operational nightmare for software workflows.
And that's not the only issue. The military has problems to deal with as well.
Because despite the ban, Claude was already deeply embedded in their operations and their military systems.
Can you give us some examples of that?
Claude was used just recently, back in January 2026, when the US raided Venezuela and captured President Nicolás Maduro.
And in Operation Epic Fury in Iran, Claude helped identify and strike over 1,000 targets in Iran within the first 24 hours.
And the irony is staggering. That supply chain risk to national security label, the military calling the tech a risk, is absurd, because they've already used it for their most sensitive lethal operations.
Because it was so deeply embedded already that to remove it on short notice was almost impossible.
Here's a fact for you.
AI tech has compressed planning that once took days into hours.
In Iran, 20 troops did the same work that 2,000 people did in Iraq back in 2003.
That's incredible.
In any event, Anthropic filed lawsuits in California and Washington, DC, alleging First Amendment retaliation for their views on AI safety.
Basically, they're saying that the government is infringing on their First Amendment right to set their own policies.
They argue that this supply chain risk label, or designation, was an ideological power play used to punish a private company for its policy preferences.
And Anthropic's not the only tech giant that is weighing in on this case.
Microsoft waded into the waters when they filed an amicus brief supporting Anthropic, warning that the Pentagon's move sets a dangerous precedent for all government suppliers.
It does. The government is basically forcing anyone who wants to do business with them not to do business with Anthropic.
It's quite unfair.
In a sharp twist, OpenAI's ChatGPT signed a new deal with the Pentagon on the same day Anthropic was blacklisted, agreeing to all lawful uses, and the public backlash followed.
Oh boy, did it ever. After that ban, uninstalls of ChatGPT surged by 295%, while at the same time downloads of Claude jumped to number one on the US App Store.
Yes, consumers sent a powerful message as to where they stood on this issue.
They do not like the idea of the government dictating who private companies do business with.
So this case asks the ultimate question: who sets the ethical boundaries for artificial intelligence?
The engineers who build it, or the government who buys it? As this heads to federal court, every federal contractor is watching.
The outcome will decide whether the US government has the power to order a private company to make its products less safe if it benefits the government.
It's a battle over corporate autonomy versus state compulsion.
Can the government use its market power to force a private company to give up its core values?
For, like you said, the national interest. And of interest to us now, it's time for us to get D2V, down to vocabulary.
The first word on today's D2V list is the verb combat.
In everyday English, combat means to fight.
But in business, and when you hear this word in the news, it is often used figuratively.
In that context, it doesn't mean to physically fight something.
Not physically, no. It means to take action against a problem.
Reduce its effects or stop it from getting worse.
Right. It has the same sense of actively dealing with something difficult.
You're fighting the situation, not just passively accepting it.
And that is exactly how we used it in today's introduction.
Samantha said that lots of businesses were thinking of different ways to combat the spike in oil prices.
In simpler terms, I was saying many companies were trying to deal with or fight rising fuel costs.
That's exactly it. And it is worth noting that combat is quite common in formal or professional contexts.
You will often hear it with nouns like inflation, rising costs, fraud, risk, or climate change.
It sounds a little more formal and strategic than just saying fight or deal with.
Can you give us a business-related example, Samantha?
Sure.
To combat rising logistics costs, the company increased their delivery fees for their customers.
Oh, dear. That is bad news for consumers.
I'm afraid we will be getting a lot of bad news like that in the coming days.
Next up on today's D2V list is the noun turmoil.
Turmoil is a state of confusion or instability.
We use turmoil when there is a lot of uncertainty, a lot of disruption, or change happening.
In the introduction to today's report, Skip said that the turmoil in the Persian Gulf was related to our topic today.
I was saying that the war had created a lot of instability in the Persian Gulf,
and that this unstable situation was connected to our main story.
Turmoil is often used in the news and in business when talking about major changes or crisis situations.
It usually suggests the situation is hard to control.
Let me give you a business example.
Because of the turmoil in the shipping market, the company delayed its product launch by two weeks.
That works well. Nice and clear.
Thank you.
Moving on.
The next item on our list is the expression, bring under one roof.
That's a useful expression. I use it all the time.
When you bring things under one roof, you put different groups or activities together in one place,
or under one organization.
It does not always mean the same physical building.
Very often, it means bringing different teams, functions, or operations together,
so they are managed as one unit.
Right.
So the idea is not just physical closeness, but organizational unity as well.
Precisely.
And that is how I used it in today's report.
When I said that in 1947, the U.S. government brought all the branches of the armed forces under one roof and adopted the name, the Department of Defense.
In other words, the U.S. government put the different military branches together under one system instead of operating as separate organizations.
In business, when a company merges departments, centralizes operations, or just combines services, they are bringing them under one roof.
This is especially useful when a company wants things to run more smoothly.
Here is a business-related example.
After the merger, the new management decided to bring the sales, marketing, and customer support teams under one roof.
That is exactly what happens in an M&A.
What's our next word?
The final item on today's D2V list is the noun red line.
A red line is a clear limit that you will not cross.
When something is a red line, it is a firm boundary.
It is the point where you say, no, this is not acceptable.
Just think of the red line on your car's tachometer.
You know, the instrument panel that measures the engine's RPM.
There is an actual red line that shows when your engine is running too fast.
That's a good image to help remember this word.
In today's report, Skip said that Anthropic's two red lines were allowing Claude to be used for mass surveillance and autonomous lethal weapons.
In other words, those were the two use cases Anthropic would not accept under any circumstances.
They were the company's clear ethical limits.
You will often hear red lines in politics, law, and business when people are talking about non-negotiable rules or boundaries.
Can you give us a business example, Skip?
Sure.
Late delivery is a red line for most buyers.
So any supplier who misses the deadline will lose the contract.
It's always best to know what your customers' red lines are and not cross them.
Would you like to help Down to Business English reach more people wanting to improve their Business English skills?
Follow Down to Business English on Apple Podcasts, YouTube Music, Spotify, or any place podcasts are found.
Leave a rating and a review, and tell everyone how much you enjoy the show.
And that is our report on Anthropic versus the US government,
and all the background on their legal battle over the ethical uses of AI.
It's a very important business story and one we will have to keep an eye on to see how it turns out.
That we will.
D2B members and Apple Podcasts subscribers,
the bonus Down to Vocabulary episode for today's report will be released over the next few days.
In that bonus episode, we will break down five more vocabulary items that Samantha and I used in today's report.
Those words and phrases are convey, self-imposed, en masse, oversight, and the expression to wade into the waters.
All useful business vocabulary.
If you are a D2B member, make sure you have copied your members only podcast feed URL from your account page on the D2B website and pasted it into the podcast app of your choice.
That way you will not miss that bonus D2B episode when it drops.
And Apple Podcasts subscribers, you don't need to do anything.
The bonus D2B episode will show up automatically in your feed as soon as it's released.
And if you are not yet a D2B member or Apple Podcasts subscriber, but you benefit from what we are doing here on Down to Business English, do consider supporting the show.
To become a D2B member, just visit D2BEnglish.com forward slash membership and sign up today.
And to become an Apple Podcasts subscriber, just tap subscribe on the Down to Business English show page in the Apple Podcasts app.
We thank you for your support.
Thanks for listening everyone. See you next time.
Take care.
Down to Business English, business news to improve your Business English.
