But OpenAI seems to have sought to enshrine these limits in the agreement in a different way than Anthropic did. Whereas Anthropic tried to have the limits spelled out explicitly in the contract, OpenAI agreed that the Pentagon could use its tech for “any lawful purpose,” even as Altman says of the limitations that OpenAI “put them into our agreement.”
It is unclear exactly how both of these things could be true, or how the limitations are stated in the agreement. But it may simply be that the contract language highlights that current U.S. law prohibits the Pentagon from deploying AI for mass surveillance of Americans, and that current U.S. military policy requires “appropriate levels of human judgment” over the use of lethal force.
OpenAI also said the Pentagon agreed that the company could build technical safeguards into its AI models to prevent them from being used either for mass surveillance of U.S. citizens or in lethal autonomous weapons.
“We are asking the [Department of War] to offer these same terms to all AI companies, which in our opinion we think everyone should be willing to accept,” Altman said.
Some commentators interpreted Altman’s remark as a veiled criticism of Anthropic, which had not agreed to these terms previously and instead insisted on explicit contractual restrictions on how its models could be used.
“In our agreement, we protect our redlines through a more expansive, multi-layered approach,” OpenAI added. “We retain full discretion over our safety stack, we deploy via cloud, cleared OpenAI personnel are in the loop, and we have strong contractual protections. This is all in addition to the strong existing protections in U.S. law.”
How much damage the “supply chain risk” designation will do to Anthropic’s business remained unclear over the weekend. Anthropic had a $200 million contract with the Pentagon that has now been canceled. But that is not a huge blow to a company reportedly on track to generate at least $18 billion in revenue this year.