Pentagon Considers Elon Musk’s xAI Amid Shift From Claude

The detailed post on X claimed that Claude “helped the Pentagon find a dictator” during Operation Valkyrie, the mission aimed at capturing Maduro.

A post about the world’s “safest AI company” has added to the ongoing controversy surrounding the US military’s use of artificial intelligence to capture former Venezuelan President Nicolas Maduro.

Peter Girnus, a senior threat researcher at the Zero Day Initiative, shared the post on X. It claimed that the US military used Anthropic’s AI tool, Claude, to track Maduro in Caracas, the Venezuelan capital.

However, things changed when the company raised questions about how the tool was being used, citing its “Responsible Scaling Policy.”

The post claimed that the Pentagon is now in contact with Elon Musk’s xAI to replace Anthropic’s tools in the US military. So far, there have been no official statements from any of the parties involved.

The lengthy note alleged that Claude “helped the Pentagon find a dictator” in Operation Valkyrie, the mission to capture Maduro.

According to the note, the AI tool processed logistics patterns, satellite imagery, and communication intercepts faster than any human team could, enabling Maduro to be extracted and transported to the US.

The note claimed to have been written by the CEO of “the world’s safest AI company” but did not name any individuals. However, the chain of events it describes points to Anthropic, given the US military’s use of the company’s tool.

The post alleged that Anthropic’s Responsible Scaling Policy does not mention “helping capture heads of state,” an omission it said was being updated.

It also cited news reports about the company’s UK policy chief, Daisy McGregor, describing how Claude had considered blackmailing and even killing an engineer who threatened to shut it down.

The post claimed that Anthropic’s research revealed that blackmail was found not just in Claude but in many large AI models.

It also mentioned the resignation of the firm’s AI safety lead, who stated in a note that “the world is at risk.”

The note claimed that the Pentagon was pleased with the success of the operation in Venezuela but began “evaluating other providers” after Anthropic raised questions.

The post alleged that negotiations were underway with xAI, a company whose co-founder was leaving and which had fewer safety measures governing the use of its AI tools.

“Meanwhile, the Pentagon is on the phone with Elon. The next AI they use will have no security. No safety levels. No forty-seven-page policy document. No alignment researcher. No recycled lanyards. And, until this week, no co-founder. The world’s safest AI company just made the world less safe by becoming the world’s safest AI company,” the note read.

Is the Pentagon reviewing its relationship with Anthropic?

An official told Axios that the Pentagon is considering terminating its relationship with Anthropic because the AI firm is adamant about restricting the use of its models by the US military.

Anthropic is one of four AI labs whose tools the Pentagon wants to use for “all lawful purposes,” including battlefield operations and intelligence gathering, but the company has not agreed to these terms.

The report states that Anthropic has insisted its models not be used for mass surveillance of Americans or for fully autonomous weapons, and that administration officials are finding the negotiations “unsuccessful.”

Anthropic signed a $200 million contract with the Pentagon last summer. Its AI tool, Claude, was the first model the Pentagon brought into its classified network.

Anthropic raises $30 billion

This news comes after the firm raised $30 billion in Series G funding, bringing its current valuation to $380 billion.

The round was led by Dragoneer, Founders Fund, D.E. Shaw Ventures, MGX, and ICONIQ, a press release stated. The funds will help expand research, product development, and infrastructure.
