U.S. Defense Department officially deems Anthropic a supply chain risk

The Pentagon has formally designated Anthropic and its Claude artificial intelligence models a supply chain risk, a first-of-its-kind move against a U.S. tech company that escalates a high-stakes clash over military AI safeguards, according to multiple U.S. media reports.

A senior defense official told Bloomberg News and CNBC that “DOW officially informed Anthropic leadership the company and its products are deemed a supply chain risk, effective immediately.” DOW is the acronym for the Department of War, the name the White House now uses for the Department of Defense, the outlets reported. Until now, such risk designations have been reserved for firms tied to U.S. adversaries, such as China’s Huawei.

The new designation will require Pentagon vendors and contractors to certify that they are not using Anthropic’s Claude models in defense work, a move with potential ripple effects across the military’s technology ecosystem and for the San Francisco–based AI startup.

The action comes amid an intensifying dispute between Anthropic and the Pentagon over how AI should be used in national security. The company has maintained that its technology should not be deployed for mass surveillance or fully autonomous weapons systems. Pentagon chief Pete Hegseth has bristled at those conditions, and Washington has countered that government suppliers cannot dictate terms that constrain lawful defense uses, according to the reports.

Talks aimed at defusing the standoff began earlier this week, brokered by Anthropic’s major investors, which include Amazon, Google and Nvidia, U.S. media said. Those negotiations had not produced a resolution as of Wednesday.

The conflict took on a sharper political edge after The Information reported that Anthropic CEO Dario Amodei told employees the actions against the company were politically motivated. “The real reasons” the Trump administration “do not like us is that we haven’t donated to Trump (while OpenAI/Greg have donated a lot),” Amodei wrote internally, according to the outlet, referring to Greg Brockman, the OpenAI president who, the report says, has given $25 million to President Donald Trump.

President Trump last week directed federal agencies via social media to immediately halt use of Anthropic technology, with the Pentagon and other departments granted a six-month phaseout period, U.S. media reported. Despite that government-wide ban, the military used Anthropic’s Claude model in its weekend attack on Iran and continues to rely on it, according to multiple reports.

The Pentagon’s supply chain risk designation raises immediate compliance stakes for defense contractors, who must now audit their workflows and attest that no Anthropic tools are in use. It could also chill commercial adoption beyond the defense sector, as companies that serve government clients reassess their technology stacks to avoid certification problems, procurement delays or legal exposure.

For the U.S. tech industry, the episode underscores the volatile intersection of AI safety principles, national security imperatives and domestic politics. Anthropic has been among the most vocal AI labs advocating guardrails on model use, including restrictions on autonomous targeting and broad surveillance. Pentagon leaders argue those lines are policy decisions for elected officials and the military, not private vendors.

What happens next may hinge on the investor-brokered talks and on how the government implements the ban and the certification regime. If the designation stands and enforcement tightens, defense integrators and software suppliers will face a quick pivot away from Claude models. If a compromise emerges, it could set a precedent for how AI vendors and the Pentagon negotiate ethical constraints while preserving wartime capabilities.

Anthropic, the Pentagon and the White House did not immediately respond to requests for comment from the U.S. outlets reporting on the development.

By Abdiwahab Ahmed

Axadle Times International – Monitoring.