Earlier this month, a Dallas police SWAT team used a Remotec Andros robot to deliver half a kilogram of C-4 explosive and eliminate a crazed sniper. The standoff was resolved without exposing more police lives to danger. The age of the Terminator is here. The Andros robot is also designed to be fitted with accessories such as a 12-gauge semi-automatic shotgun. The incident raises questions about the use of killer robots.
Remote-controlled weapons are nothing new in the military: the MQ-9 Reaper has already made its presence felt in various theatres of combat. The step beyond remote human control is autonomy, where software rather than a human makes all the key decisions. Autonomous technology is relatively new, yet it is already clear that autonomously driven vehicles will be commonplace, and will make our roads safer, before long. Autonomous weapons make both target selection and payload delivery decisions (i.e. when to pull the trigger) without human intervention. They are known as LAWS: lethal autonomous weapon systems.
Countries with high-tech military capability are moving increasingly toward autonomous machines of war. Robotics has become the new arms race. Sea Hunter, the US Navy's autonomous warship, is used for tracking submarines but can be fitted with a range of weapons. The AutoNaut is an unarmed, wave-powered sea drone used for surveillance.
Computer-controlled weapons are already in use by the US Navy (such as the Phalanx) and US Army (C-RAM). These weapons are deployed where humans cannot react fast enough: to fire at and intercept incoming missiles. Israel's Iron Dome rocket defence system is similar, designed to neutralise incoming missiles before they strike their targets.
The X-47B is an unmanned combat aircraft which can operate from an aircraft carrier and is capable of refuelling in the air. It will also be autonomous, able to identify targets and launch its weapons without any human intervention.
The Samsung Techwin security guard robot is designed as an autonomous sentry with the capability to identify targets and shoot. It is intended to replace human guards on national borders and at critical infrastructure sites. The current design keeps a human making the shoot decision, but the human could easily be taken out of the loop.
Russia is developing the Uran-9 robot tank to assist with surveillance and combat. Kurdish forces currently fighting ISIS in Rojava in northern Syria could certainly use the assistance of either remote-controlled or autonomous weapons.
There is no problem with using robots in applications such as IED disposal, recovery, and disaster relief. Issues arise when the robots are armed with lethal weapons.
The decision-making of robots is only as good as their computer code. Being autonomous means LAWS lack important aspects of human judgement: conscience, compassion and an understanding of complex context. There are also issues of accountability when things go wrong. Who is to blame: the commander, the manufacturer, the software developer, or the robot itself? In the US, most weapons manufacturers have immunity. A report from Human Rights Watch indicates that in the case of LAWS, no party may be held accountable.
The International Committee for Robot Arms Control (ICRAC) is working toward the regulation of robotic weapons, and momentum is growing to restrict the use of LAWS. However, at the UN only 14 countries have called for a ban, most of which lack strong technical capability. As the issue becomes more prevalent, pressure will grow from more and more quarters and this debate will heat up.
While state actors may reach agreement to restrict killer robots, what will stop well-funded rogue states, terrorist organisations, or criminal gangs from utilising the technology? Bans may tip the balance in favour of these groups and make the world a far more unstable and dangerous place.
Where there is software there is potential for malware attack. The more complex the software, the greater the vulnerabilities, and remote-controlled and autonomous weapon software is hugely complex. Mankind has never yet managed to build complex software without vulnerabilities. Two fields of battle will emerge: one fought with remote-controlled or autonomous lethal machines, the other fought in the software realm. Software security will become a vital arm of the military, focussing on both defensive and offensive opportunities. On the physical battlefield, techniques such as mirrors or holograms could confuse robots. The asymmetry of software security also comes into play here: the defender has to secure every vulnerability, whereas an attacker only needs to exploit one.
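That asymmetry can be made concrete with a little arithmetic. The sketch below is a simplified illustration (the function name and the independence assumption are mine, not a real security model): even if the defender independently closes each vulnerability with 99% reliability, a system with many vulnerabilities is more likely than not to leave the attacker at least one way in.

```python
def breach_probability(num_vulns: int, patch_rate: float) -> float:
    """Probability that at least one vulnerability remains exploitable.

    Simplified model: each of num_vulns flaws is successfully secured,
    independently, with probability patch_rate. The attacker wins if
    any single flaw slips through.
    """
    return 1 - patch_rate ** num_vulns

# A defender who is 99% effective per flaw still loses badly at scale:
print(f"{breach_probability(10, 0.99):.2f}")    # ~0.10
print(f"{breach_probability(100, 0.99):.2f}")   # ~0.63
print(f"{breach_probability(1000, 0.99):.2f}")  # ~1.00
```

The model is crude, but it captures why "secure everything" is a fundamentally harder job than "find one hole", and why the complexity of autonomous weapon software tilts the field toward the attacker.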
In his excellent book "There Will Be Cyberwar", Richard Stiennon points out that cyber attack can completely neutralise even conventional armed forces. The fictional novel "Ghost Fleet" explores a similar theme. However, simply neutralising weapons is not the greatest threat: turning them against their masters is even more dangerous.
Scary as the scenario facing the Dallas SWAT team earlier this month was, it would have been even more terrifying had they faced a more sophisticated adversary capable of turning the Andros robot around and directing its C-4 payload back at its handlers.
Hasta la vista.