A REVIEW ON STATE RESPONSIBILITY ARISING FROM AUTONOMOUS WEAPON SYSTEMS
Artificial intelligence has become part of our lives with the development of technology. In recent years, states have often benefited from artificial intelligence by developing autonomous weapon systems in the defense industry. It is inevitable that the law deals with artificial intelligence and its manifestations in different fields. However, since autonomous weapon systems are still rudimentary in state practice, it has not been determined to whom, and under which conditions, the wrongful acts arising from the functioning of these systems will be attributed. For instance, many civilians have lost their lives in air strikes carried out by armed drones in recent years. In such a case, can the state escape responsibility by invoking the autonomous nature of the attacking system? If the answer is negative, under what conditions will attribution and responsibility arise? These questions should be answered first.
In this review, the nature of autonomous weapon systems will be examined and, where their acts constitute an internationally wrongful act, the scope of state responsibility will be determined in light of doctrine, state practice and the reactions of international organizations.
1. What is an autonomous weapon system?
An autonomous weapon system is an attacking weapon system that independently determines its target, based on information gathered and results derived from pre-programmed constraints. The International Committee of the Red Cross defines autonomous weapon systems as systems that seek their targets independently and attack and destroy the targets they have selected on their own. The term “autonomy” here means that the lethal system functions without human intervention.
According to the definition of the US Department of Defense, these weapon systems, once activated, are capable of selecting and engaging targets without further intervention by a human operator. The definition also covers human-supervised systems: a human operator can override the weapon system, but, absent such intervention, the system can select and engage targets on its own after activation.
Fully autonomous weapon systems, as mentioned above, act completely independently of humans, whereas a system that operates under human control is called a semi-autonomous weapon system. For example, remotely controlled drones are semi-autonomous, while drones that select and attack their targets on their own are fully autonomous weapon systems.
To summarize, although the definitions diverge on certain points, autonomous weapon systems, known as killer robots, are instruments of war that perform a military task entirely on their own or with limited human contribution.
Killer robots, no longer confined to science fiction films, are produced by developed countries. Examples include the SGR-A1, deployed by South Korea in the demilitarized zone on the border with North Korea; the Harpy loitering weapon, developed by Israel to attack radar systems; the encapsulated torpedo mines named PMK-2, produced by Russia and China; and, finally, various US air, naval and land weapon systems.
2. International Public Response to Killer Robots
a. The attitude of states
As the arms industry develops faster and produces more efficient systems building on the existing autonomous weapon infrastructure, the lethal risks faced by soldiers in warfare are minimized. The armies of many countries expect this trend to continue, while campaigns against killer robots also grow.
States’ opinions on the use of killer robots are divided into three camps. While 26 countries around the world oppose autonomous weapons, countries such as the United States, Russia and China strongly support their use. Israel, which has recently used semi-autonomous drones to fire tear gas at protesters in Gaza, is also among the countries that support autonomous weapon systems.
Countries such as Germany and France prefer a middle way. They do not oppose the production or use of killer robots, provided that the conditions of their usage and the rules of responsibility are determined.
b. The attitude of other international actors
The greatest reaction to the use of artificial intelligence-supported autonomous weapon systems comes from international organizations: 230 organizations around the world oppose the use of killer robots.
The US-based Future of Life Institute, which runs an effective campaign against killer robots, states that the decision to end a human life should not be left to a machine.
On December 18, 2018, the High-Level Expert Group on Artificial Intelligence set up by the European Commission published the draft Ethics Guidelines for Trustworthy AI, which aims to determine the ethical principles that artificial intelligence systems must follow in order to serve the benefit of the individual and society.
Further, the European Parliament adopted its Resolution of 12 September 2018 on Autonomous Weapon Systems.
The resolution defined killer robots as weapon systems lacking satisfactory human control during the selection and destruction of individual targets. Considering the ethical and legal problems, it stated that such systems and robots are not capable of taking human-specific decisions involving proportionality, distinction and precaution. The Parliament’s similar resolutions on autonomous weapons likewise emphasize the importance of human control.
Another noteworthy document is the Civil Law Rules on Robotics, submitted to the European Commission, which proposes that robots be granted an “electronic legal personality” so that they can be held accountable for their wrongful acts. The Commission stated that it would be appropriate to establish a legal status for robots with respect to liability for the damage they cause, but that such a status cannot yet be granted, as robots are not yet able to interact independently with third parties.
3. Responsibility for Autonomous Weapon Systems in International Law
In the international arena, the fact that robots, which are regarded merely as “a tool”, have not obtained legal status makes it impossible to determine who will be held responsible for a wrongful act, and this constitutes the “Achilles’ heel” of autonomous weapon systems.
a. Responsibility of non-state actors
There are different views in the doctrine on determining the responsibility of non-state actors. Some support the idea of objective responsibility for those who have played an active role in the production phase. Others oppose this idea, asserting that holding such an extended list of persons responsible (all individuals involved in the production process, commanders, political leaders and so on) would create a vacuum in the implementation of this responsibility regime.
On the other hand, an analogy is discussed in the doctrine in terms of attribution. It is argued that responsibility for the wrongful act of an autonomous weapon system should be determined by analogy with the responsibility of the military commander. In that case, the commander would be responsible for the wrongful acts of the system to the extent that he knew, or should have known, of them, just as for the acts of his subordinates. However, the unpredictability of actions stemming from the full autonomy of the weapon makes it difficult to attribute individual responsibility.
b. State responsibility
For state responsibility to arise in international law, there must be an act carried out by or attributable to the state; damage must emerge from this act; there must be a causal link between the act and the damage; and, finally, there must be no circumstance rendering the act in conformity with the law. The state is therefore responsible for the wrongful act when these conditions are cumulatively met.
Even though the system is autonomous, the state is objectively responsible for wrongful acts arising from the use of these systems; however, it is appropriate to examine the scope of this responsibility separately in terms of the two dimensions of the law of war: jus ad bellum and jus in bello.
Jus ad bellum comprises the criteria for determining whether resort to war is justified in a given situation. In simple terms, it refers to the period before the war.
The use of an autonomous weapon system in the jus ad bellum period does not make the use of force a wrongful act per se. For the use of force to be legitimate, regardless of the type of weapon used, there must be authorization by the UN Security Council, the valid consent of the target state, or fulfillment of the conditions of self-defense.
Jus in bello, by contrast, regulates how force may be used in war. Its most important principles are civilian immunity, proportionality and distinction. The rules of international humanitarian law apply in the jus in bello period.
International humanitarian law envisages a number of regulations on the nature and production of weapons, determining the qualities they must possess in order to be used in conformity with humanitarian law. According to Article 36 of Additional Protocol I of 1977 to the Geneva Conventions of 1949, in the development and production of a weapon, the state concerned must itself investigate the weapon’s compliance with humanitarian law and other rules of international law. This provision imposes an obligation on the state. Similarly, for a weapon to be so-called “legal”, it must be capable of distinguishing between combatants and civilians. In its Nuclear Weapons Advisory Opinion, the ICJ stated that this distinction is one of the cardinal principles of humanitarian law. Article 51, paragraph 4 (b) and (c) of the same Protocol prohibits the use of weapons that cannot make this distinction; under the same article, the use of any weapon is unlawful if it cannot be directed at a specific target or if the damage it causes cannot be controlled. Whether autonomous weapon systems can distinguish between belligerents and civilians raises serious concerns. Indeed, the deaths of dozens of civilians in drone attacks in recent years confirm these concerns.
Therefore, it is clear that both in jus ad bellum and in jus in bello the state bears responsibility for wrongful acts involving the use of force through autonomous weapon systems. However, whether the use of a particular weapon is unlawful per se is a question that can only be resolved in state practice, by evaluation in accordance with the aforementioned criteria.
HUKUKCULAR DERNEGİ
Commission of International Law