Photo/Illustration: Representatives discuss at a meeting of government experts on the future of lethal autonomous weapons systems, also known as “killer robots,” in August 2019 in Geneva. (Asahi Shimbun file photo)

Inhumane weapons that can operate autonomously without human judgment and kill enemies without feeling hesitation or fear are fast becoming a reality due to progress in artificial intelligence.

A rapidly growing number of autonomous and unmanned weapons, such as AI-enabled drones, are being used on battlefields around the world.

International rules to regulate these weapons are urgently needed since it would be too late once they are widely deployed.

Weapon systems that use AI to identify, select and kill targets without human intervention are often referred to as lethal autonomous weapons systems (LAWS), or more simply "killer robots."

The use of AI on the battlefield is described as “the third military revolution” following the discovery of gunpowder (the first revolution) and the invention of nuclear weapons (the second).

Since AI-enabled autonomous weapons allow attacks without the risk of massive human losses on the attacking side, they could make countries less hesitant to start a war. There are also concerns about AI making misjudgments, malfunctioning or going out of control.

The international community is becoming aware of the need to regulate autonomous weapons, and experts and policymakers are discussing ideas to prevent their proliferation under existing treaties to ban and restrict certain inhumane conventional weapons.

With disagreements among countries still wide and deep, however, no concrete progress has been made so far. But some notable moves have been made this year.

One is a joint proposal to regulate LAWS made by the United States, Britain, Australia, Canada, South Korea and Japan.

The proposal calls for a ban on lethal autonomous weapons that are “incapable of being used in accordance with international humanitarian law.”

More specifically, it says, “the autonomous functions in weapons systems must not be designed to be used to conduct attacks that would not be the responsibility of the human command under which the weapon system would be used.”

This principle is based on the notion that human responsibility must be ensured for any use of weapons.

Another important and more radical joint proposal has been made by 10 countries and regions, including Argentina and the Philippines. It calls for a comprehensive ban on LAWS, prohibiting not only the use but also the development, production, possession, acquisition, deployment and transfer of such weapons.

In contrast to the proposal from Japan, the United States and others, which leaves regulation of LAWS to the domestic laws of individual countries, the latter proposal envisions legally binding international measures.

The Japanese government has clearly disavowed any intention to develop LAWS, but sees no short-term prospects for a comprehensive ban or legally binding international regulations on such weapons. Tokyo intends to seek an agreement based on the proposal by six nations.

Russia has argued that existing international law is sufficient to govern LAWS and that there is no need for new regulations. China has supported the idea of creating a legally binding regulatory framework for LAWS but narrowly defines the weapons that would be subject to a ban.

Any international agreement on the matter must be unanimously approved, and it will be a formidable challenge to build a consensus among nations. Russia’s aggression against Ukraine will also pose a major obstacle to consensus building.

Even if the realistic course is to pursue gradual progress by focusing on areas where most major powers can agree, the international community should keep working on systems to detect and punish violations, without losing sight of the ultimate goal of a legally binding ban.

--The Asahi Shimbun, Aug. 20