AI firm Anthropic seeks weapons expert to stop users from 'misuse'

  • 12 hours ago
  • 1 min read


The US artificial intelligence (AI) firm Anthropic is looking to hire a chemical weapons and high-yield explosives expert to try to prevent "catastrophic misuse" of its software.


In other words, it fears that its AI tools might tell someone how to make chemical or radioactive weapons, and wants an expert to ensure its guardrails are sufficiently robust.


In the LinkedIn recruitment post, the firm says applicants should have a minimum of five years' experience in "chemical weapons and/or explosives defence" as well as knowledge of "radiological dispersal devices" – also known as dirty bombs.

