General Discussion
Anthropic stands up for safety, human rights, and integrity - how can we show our support?
Plenty of articles are available online.
Here is a link to an article from POLITICO - By Brendan Bordelon, 02/26/2026 06:08 PM EST
https://www.politico.com/news/2026/02/26/anthropic-rejects-pentagons-ai-demands-00802554
Anthropic rejects Pentagon's AI demands
"[T]hese threats do not change our position," Anthropic CEO Dario Amodei wrote in a blog post.
SNIP
Anthropic's defiance follows unprecedented pressure from the Pentagon to abandon its restrictions on the military's use of Claude. In a Tuesday meeting with Hegseth, Amodei reiterated his red lines: a ban on the technology's use to surveil American citizens or to empower autonomous weapons.
SNIP
Hegseth had threatened to designate Anthropic a risk to the Pentagon's supply chain if it failed to comply by 5:01 p.m. Friday. The label is almost always reserved for foreign firms with ties to U.S. adversaries, and could be used by the government to blacklist Anthropic and prevent it from working with other companies.
SNIP
Hegseth's move to invoke the DPA suggests that the Pentagon sees Anthropic's AI models as critically important to U.S. national defense, a stance some lawyers and AI policymakers called contradictory, given the Pentagon's concurrent claim that the company may be a national security risk. Amodei highlighted the discrepancy in his blog post.
Note: I am not a big fan of AI, but when business owners stand their ground for the security of this country, I applaud...and will use my purchasing power when appropriate.
Lithos
(26,625 posts)
Subscribe: https://www.anthropic.com/
To the exclusion of others.
I just added to my post that I am not a big fan of AI...but. I want to support honorable businesses.
scipan
(3,023 posts)
cachukis
(3,888 posts)
usonian
(24,864 posts)
Taking that away from DOD and its contractors puts THEM at a big disadvantage (IMO) compared to companies who are principled.
They'll get the contracts when the nazi regime falls.
https://www.democraticunderground.com/?com=view_post&forum=1002&pid=21054198
Murder is coming to AI, but not to Claude -- Measuring the Market Effects of Principled Defiance
OTHERS. TAKE A STAND AGAINST FASCISM
Not these Nutlicks. OOPS, I meant Lutnicks.

Mein Fuhrer, Claude has defected to the Freedom Fighters.
yellow dahlia
(5,646 posts)
haele
(15,340 posts)
Claude has been integrated with Microsoft Co-Pilot and ChatGPT because it's got better "vibe" coding (still trying to figure out why someone would want to code a "vibe") and handles prompts pretty decently.
DoD/W operates on MS products.
You can't just pull the plug on MS products without taking down the Pentagon.
In short -
Kegsbreath has visions of weaponized AI drone and surveillance systems operating in his precious New Christian States of Amerikkka for his Masters. And he's trying to fast-track acquisition and development of such weapons claiming they'll be needed against Cartels and other asymmetric warfare threats in other countries.
He can claim no military personnel need worry about being court-martialed because a hall full of San Francisco Liberals at a Gay Pride event inadvertently got hit by an AI drone swarm targeting terrorists...
Who needs a Constitution when you have a Trump Bible, anyway?
yellow dahlia
(5,646 posts)
leftstreet
(40,272 posts)
in the UAE. It would make sense, if true
Your post sure makes me wonder about the future of "warfare"
Takket
(23,670 posts)
But at least they have drawn a line on this!!!
I mean, does everyone realize that Hegseth is literally trying to build a Terminator??? This is INSANITY.
yellow dahlia
(5,646 posts)
Make Orwell (and Terminator) Fiction Again!
gfarber
(262 posts)
In deserts near Kuwait's hot sheen,
Three F-15 Eagles fell in friendly routine,
The pilots all flew,
Through a hurricane's brew,
What "safe" means at Mach is unseen.
Four soldiers were lost in the fray,
When Iran sent missiles that day,
A squirter slipped through,
As pundits all knew,
While air shields just looked the other way.
At The New Republic one writer did sigh,
"Is war run by humans, or AI?"
Asked Siva Vaidhyanathan, perturbed,
As a school site was struck and disturbed,
Did a bot let the missiles fly high?
The brass at the United States DOD decree,
"Any lawful use" sets it free,
Once purchased and signed,
Leave scruples behind,
Our lawyers will bless what will be.
But Anthropic frowned at the phrase with a wince,
Said conscience should matter, at least since
Frontier AIs
Still blunder and lie
They're guessers with glitches evince.
Their chief, Dario Amodei, made plain,
"These systems aren't fit to reign."
They hallucinate facts,
Make lethal misacts,
And civilians pay for the chain.
Then thundered Pete Hegseth with flair,
"Supply chain risk! Out of our air!"
No contracts, no chat,
For firms dealing that
Conscience was too much to bear.
Up stepped Sam Altman to agree,
From OpenAI by the sea,
"Any lawful use? Fine.
Just sign on the line."
And appetite hummed happily.
Recall how the National Security Agency grew bold,
When Edward Snowden let secrets be told,
Internal review
Said spying would do
Till the public saw what it controlled.
So picture a taxi that drives,
With missiles attached to its sides,
No driver inside,
Just code as its guide
And hope that its software survives.
For corporations, some say,
Have hunger to clear every way,
No soul in the deal,
Just profit as zeal
Though memory may yet disobey.
© 2026 Glen Farber. All Rights Reserved.
yellow dahlia
(5,646 posts)
Thank you for sharing it here. You have some serious talent to be able to take this chaos into poetry.
Isn't the timing interesting? Kegsbreath told Anthropic to let down the guardrails by Friday... or else. They wanted the guardrails down by the time of the attack on Iran. To me that means a human was in the chain of command that bombed the girls' school.
highplainsdem
(61,647 posts)
them rejecting those particular Pentagon demands, and I'm glad to hear they might gain customers and OpenAI lose some.