VATICAN CITY (CNS) -- Allowing the development and
use of fully automated lethal weapons systems would make warfare even more inhumane
and undermine efforts to achieve peace through dialogue rather than an arms race, a
Vatican representative said.
"A world in which autonomous systems are left to
manage, rigidly or randomly, fundamental questions related to the lives of
human beings and nations, would lead us imperceptibly to dehumanization and to
a weakening of the bonds of a true and lasting fraternity of the human family,"
Archbishop Ivan Jurkovic told a group of experts at the United Nations in Geneva.
The archbishop, who is the Vatican observer to U.N.
agencies in Geneva, spoke April 9 at a session for the "Group of Governmental
Experts" on Lethal Autonomous Weapons Systems (LAWS). States that are party
to the Convention on Certain Conventional Weapons agreed in 2016 to establish the
group to address the legal and ethical implications concerning such autonomous
technologies, which are also referred to as robotic weapons or "killer robots."
The International Committee of the Red Cross has defined
LAWS as being "any weapon system with autonomy in its critical functions.
That is, a weapon system that can select -- i.e. search for or detect,
identify, track, select -- and attack -- i.e. use force against, neutralize,
damage or destroy -- targets without human intervention."
The first such autonomous weapon was the landmine, but
rapid advances in artificial intelligence and machine learning have broadened
the potential for weapons with extensive autonomy from human decision-making.
Archbishop Jurkovic told the group, which was meeting April
9-13, that "the development of LAWS will provide the capacity of altering
irreversibly the nature of warfare, becoming even more inhumane, putting in
question the humanity of our societies."
"Any armed intervention must be carefully weighed
and must at all times verify its legitimacy, legality and conformity with its
purposes, which must also be both ethically and legally legitimate," he said.
"Confronted with today's challenges, these tasks are growing ever more
complex and too nuanced to be entrusted to a machine, which, for example, would
be ineffective when facing moral dilemmas or questions raised by the
application of the so-called principle of 'double effect,'" he said. That Catholic moral principle teaches
that it is morally acceptable to pursue a good goal that could have an unintended evil
effect if and when there is a proportionate or adequate reason for allowing the evil effect.
The archbishop said the robotization and dehumanization
of warfare present several serious ethical
and legal problems.
For example, increased automation will blur or erase
accountability and the
"traceability of the use of force with an accurate identification of those
responsible," he said.
"Such loss or dilution of responsibility induces a
total lack of accountability for violations of both international humanitarian
law and international human rights law and could progressively incite to
war," he added.
Autonomous weapons systems, he said, also lack the "unique
human capacity for moral judgment and ethical decision-making," which involves input much more
complex than a "collection of algorithms."
The needed capacity to understand a situation or context
and apply the appropriate rule or principles can never be replaced by or
programmed into a machine, he said, since such discernment or judgment "entails
going well beyond the potentialities of algorithms."
And finally, he said, "the idea of a war waged by
non-conscious and non-responsible autonomous weapons systems appears to hide a
lure for dominance that conceals desperation and a dangerous lack of confidence
in the human person."
"International security and peace are best achieved
through the promotion of a culture of dialogue and cooperation, not through an
arms race," Archbishop Jurkovic said.