Killer Robots: The Legality of the Future of Warfare
Whether we accept it or not, they are here, and they raise serious moral and legal concerns. With no human in control, we give them the power to choose and eliminate their targets. Are we ready to give machines the power to kill other humans? Are we ready to let them replicate human judgement? Are we ready for that arms race?
Fully autonomous weapons will be capable of selecting targets and using force without any human input. This would be a step beyond current remote-controlled drones and weapons, which always have a “human in the loop”. Fully autonomous weapons will lack the human qualities of judgement, compassion, and intentionality. While such weapons would offer a significant military advantage, the risks involved outweigh the benefits. Lacking compassion, they would be unable to distinguish between potential enemies and civilians. They would also place in the hands of irresponsible states and non-state armed groups machines that could be programmed to kill civilians indiscriminately.

With no human in control, these weapons will have no notion of the human right to life or the principle of human dignity. The use of force is lawful only when it is employed to protect human life, as a last resort, and in proportion to the threat. It would be impossible to programme a robot to weigh all these principles in combat, and robots cannot be programmed for every situation: war produces unforeseen circumstances that cannot be dealt with without proper judgement and compassion towards enemy forces. As inanimate machines, they will place no value on human life and attach no significance to its loss. We respect the rights of other persons not merely for legal or economic reasons but for moral ones; if we breach that moral restraint by giving machines the power to kill other humans, we will not be far from our downfall as a human civilization.
The next big question is who will be liable for the acts of fully autonomous weapons. No person who commits a wrongful act should be allowed to escape liability for it; that would offend the most basic principle of law. Each person should be personally accountable for his own wrongful acts. Similarly, each state should be held accountable for wrongful acts or war crimes committed in the course of a war, and should be punished for them to provide retribution to victims. Though state responsibility may be comparatively easy to establish, it is equally important to assign personal responsibility for these acts.
[Image: TALON sentry robot]
It is important for individuals to be held criminally responsible for unlawful or immoral acts, not only to provide retribution to victims but also to deter such behaviour in the future. Punishment also shows that victims’ rights have been recognized and that the wrongdoer has answered for the harm they suffered.
Humans, not machines, are held responsible for crimes. For a crime to be established, two basic elements must be present: a wrongful act and a guilty mental state, i.e., mens rea. It would be impossible to prove the mens rea of a robot, as it lacks the intentionality to commit a crime. Nor can commanders be held liable for the acts of a robot, since it would be practically impossible for them to have knowledge of acts committed by fully autonomous weapons in the course of a war. Such knowledge could arise only if the weapon sent some communication before selecting a target and taking action; but a target requiring a final human decision would fall outside the scope of fully autonomous weapons altogether. It would likewise be impossible to punish the robots themselves, as they are not human, and commanders could escape liability by pleading that they had neither control over nor prior knowledge of the robot’s acts.
Under civil liability, victims bring an action for compensation as a remedy for the damage they have suffered. It would be difficult for them to bring and maintain a suit against manufacturers, and impossible to sue military personnel, most of whom are immune under civil law. Manufacturers could plead manufacturing defects and software-based deficiencies to escape liability. The last resort would be a strict-liability compensation scheme, which would require victims to prove only the harm, not how it occurred. Not all legal systems will be ready to adopt such a scheme, and in any case it would only compensate victims, not deter such actions in the future. The risks involved, the changes required, and the conflicts that would arise far outweigh the advantages of fully autonomous weapons.
On these grounds, it is recommended that the development of such weapons be banned before they become a reality.
About the Author:
Aatif Salar is currently a second year BBA-LLB(Hons.) student at Faculty of Law, IFHE, Hyderabad.