If you hear the phrase “killer robots” or “machine warriors” and think of The Terminator franchise’s cyborg assassin, you are not alone. But in the US today, we have moved from dystopian fiction to a darker reality. In the absence of a global ban on autonomous weapons powered by machine learning, and with the rise of Trump’s “might makes right” approach to warfare, we are heading for a worst-case scenario. Is it too late to put on the brakes? No. But religious communities advocating for peace through justice are understandably overwhelmed amid threats to immigrant neighbors, backpedaling on climate justice, ongoing genocide in Gaza, and recent military operations (in Nigeria, Venezuela, and Iran, with more threatened in Greenland and elsewhere). In light of so many other pressing issues, you may have missed the Pentagon announcement from Defense Secretary Pete Hegseth that the AI chatbot Grok will be given access to Pentagon intelligence to train it for undisclosed purposes. As NPR/AP reported yesterday:
Hegseth’s aggressive push to embrace the still-developing technology stands in contrast to the Biden administration, which, while pushing federal agencies to come up with policies and uses for AI, was also wary of misuse. Officials said rules were needed to ensure that the technology, which could be harnessed for mass surveillance, cyberattacks or even lethal autonomous devices, was being used responsibly.
The Biden administration enacted a framework in late 2024 that directed national security agencies to expand their use of the most advanced AI systems but prohibited certain uses, such as applications that would violate constitutionally protected civil rights or any system that would automate the deployment of nuclear weapons. It is unclear if those prohibitions are still in place under the Trump administration.
https://www.npr.org/2026/01/13/nx-s1-5675781/pentagon-musks-grok-ai-chatbot-global-outcry
In the War & Peace class that I teach, the module on machine learning and weapons systems is one of the most challenging (and chilling). The benefits of machine warriors are often framed in contrast to the normal limits of human warriors–what Kara N. Slade describes as a problem of “theological anthropology.” Human soldiers are embodied; have rights; can die; need time to think, understand, and process; can make errors of judgment; are capable of empathy, even with the ‘enemy’; are capable of moral intelligence; can be held responsible for their actions; have to be paid a fair wage; can feel pain; require medical care and other expensive benefits; and so forth. In contrast, machine warriors are faster, can operate at high altitudes and in cold temperatures, can process complex data more quickly, don’t have the limits of embodiment, are cheaper to deploy, can’t be held liable, have difficulty distinguishing between combatants and civilians, and cannot experience human suffering. Regarding cost alone: previous DoD estimates indicated that each soldier in Afghanistan cost the Pentagon $850,000 per year, while a TALON robot can be outfitted with weapons for under $230,000. Slade argues that the American military-industrial bureaucracy is influenced more by national anxiety and fear than by ethics. She cites the Unmanned Systems Integrated Roadmap as an example.
DoD and industry are working to advance operational concepts with unmanned systems to achieve the capabilities and desired effects on missions and operations worldwide. In building a common vision, DoD’s goals for unmanned systems are to enhance mission effectiveness, improve operational speed and efficiency, and affordably close warfighting gaps… By prudently developing, procuring, integrating, and fielding unmanned systems, DoD and industry will ensure skillful use of limited resources and access to emerging warfighting capabilities. Pursuing this approach with unmanned systems will help DoD sustain its dominant global military power and provide the tools required by national decision-makers to influence foreign and domestic activities while adapting to an ever-changing global environment.
https://jmt.scholasticahq.com/article/11278-unmanned-autonomous-drones-as-a-problem-of-theological-anthropology
Notice that in this framing, the removal of human agency through the deployment of “unmanned systems” is supposed to increase mission effectiveness and efficiency–claims also made by Musk’s dominant ideology of “government efficiency” through the use of artificial intelligence.
Most students don’t realize that lethal autonomous weapons systems are not limited to science fiction. In 2021, NPR reported that in March 2020 a lethal autonomous weapons system (Kargu-2) was used during fighting between the UN-recognized Government of National Accord (Libya) and forces aligned with Gen. Khalifa Haftar. The attack drone, made by the Turkish company STM, can be programmed to attack targets without requiring data connectivity between the operator and the munition: what the UN calls a “fire, forget, and find” capability. At the time, the Vatican permanent observer mission expressed alarm, and the Holy See’s concerns were repeated in 2024 with a call for a permanent ban on lethal autonomous weapons systems. That statement expressed Catholic teaching that technological progress should be used to improve human life, and that “no machine should ever choose to take the life of a human being.”
Today, my tax dollars are being spent to prepare for exactly this scenario: machines deciding who lives, who dies, and how. The US voted against a UN General Assembly resolution on autonomous weapons in November 2025. And the latest announcement from Hegseth signals full-speed-ahead integration of Grok, without attention to critics’ assessments of Grok’s flaws or to long-standing principles of military ethics and the just war tradition.
It is time to ask whether Catholics can licitly participate in the US military-industrial complex at any level. Trump’s second term has seen a shift toward “war-making” over “self-defense,” the most obvious example being the renaming of the Department of War. Trump does not seem to feel beholden to the just war tradition and its emphases on last resort, proportionality, international law, and humanitarianism. Nothing about the priorities of Stephen Miller aligns with Catholic values. Yet one-fifth of active duty military identify as Catholic. The persistent moral injury of working within the objectively evil operational framework of the military under a second Trump administration should be getting more attention, including from US bishops. If US objectives are to amass greater power and influence, to bully, and to use force without justification under law or ethical principles, and we are now training artificial intelligence to achieve these ends more “efficiently,” then we have lost our way entirely. Without significant course correction–minimally, a rejection of unilateral warfare and a commitment to the telos of peace–it is unclear how Catholics could licitly cooperate with present US leadership.