The U.S. government is tightening its grip on artificial intelligence through new procurement rules.
At the heart of this change is a high-stakes disagreement between the Pentagon and the AI company Anthropic. The dispute centered on a single, powerful phrase: 'all lawful uses.' The military insisted on the right to use AI for any purpose not explicitly illegal, which could include controversial applications like domestic mass surveillance or fully autonomous weapons. Anthropic, citing ethical concerns, refused to grant this unrestricted access.
This impasse set off a chain reaction. First, the Pentagon, unwilling to accept limitations, declared Anthropic a 'supply chain risk,' effectively blacklisting the company from defense contracts. Second, the standoff drew a clear dividing line in the AI industry: other major players, like OpenAI, agreed to the Pentagon's terms, demonstrating that the government had alternative suppliers willing to comply.
Now, the General Services Administration (GSA) is taking this military-grade standard and applying it to the entire civilian federal government. The new draft guidance requires AI vendors to grant the government an irrevocable license for 'any lawful' use. This isn't just a technical change; it transforms a specific military dispute into a default policy for all federal agencies, from the Department of Education to NASA.
This policy also gives concrete power to a July 2025 executive order aimed at preventing 'Woke AI.' By demanding politically neutral systems and coupling that demand with unrestricted-use licensing, the government is embedding ideological tests directly into its purchasing decisions. Furthermore, this 'America First' approach to AI governance could clash with international rules, particularly the EU's Digital Services Act, creating compliance headaches for global tech companies.
The market has already started picking winners and losers. Defense-aligned tech companies like Palantir saw their stocks jump, while others faced uncertainty. This policy shift is fundamentally re-pricing risk and opportunity in the massive federal AI market.
- General Services Administration (GSA): The U.S. federal government's procurement and property management agency. It acts as the central purchasing department for the government.
- Irrevocable License: A permanent permission to use a product or technology that cannot be withdrawn once granted. In this context, it means the government would retain the right to use an AI model indefinitely, for any lawful purpose, even if the vendor later objects.
- Supply Chain Risk: A designation indicating that a company or its products pose a potential threat to national security, often leading to restrictions on its use by government agencies.
