
The future of drone warfare

5 Nov 2015 16:15

The US State Department has approved a sale of armed drones to Italy, the first such transaction allowed under a new policy. Previously, the UK was the only country with which the US would share this technology.

US service personnel train in loading munitions on an MQ-9

Drones are becoming big business and this sale seems to signal the start of an export strategy for unmanned aerial systems (UAS). In February the State Department noted: 

'As the nascent commercial UAS market emerges, the United States has a responsibility to ensure that sales, transfers, and subsequent use of all US origin UAS are responsible and consistent with US national security and foreign policy interests, including economic security.'

The estimated contribution of the proposed sale to economic security is $126 million. This amount may pale in comparison to the long-term security costs of international proliferation.

According to the Defense Security Cooperation Agency's media release, the proposed sale is for MQ-9 'Reaper' weaponisation kits to Italy, along with 156 Hellfire II missiles and other equipment. While payloads such as missiles and bombs are classified as MDE (major defense equipment), the weaponisation kits for the MQ-9s are not. Rather, they are categorised as 'non-MDE items' and thus subject to less stringent regulations under the US Arms Export Control Act.

Under such guidelines, will the sale, as the announcement claims, 'contribute to the foreign policy and national security of the United States by improving the capability of a ... key democratic partner of the United States in ensuring peace and stability around the world'? The answer may depend on whether you take a short-term or long-term view.

The report noted further that 'the proposed sale of this equipment and support will not alter the basic military balance in the region'. But scholars argue that a long-term global perspective should take precedence over any short-term regional focus.

An important consideration here is the moral hazard some observers believe armed drones introduce to decision-making. Just as countries are less hesitant to shoot down drones than manned aircraft, decision-makers can deploy the technology with no risk to pilots' lives or ground troops. The reduced cost in blood and treasure is thought to lower the threshold for the use of force.

In a 2014 Foreign Affairs article, Sarah Kreps and Micah Zenko suggested that while drones do not possess the transformative power of nuclear weapons, the moral hazard around their use meant they could still be highly destabilising to international order. Specifically, the authors wrote, armed drones could increase the possibility of 'military conflicts in disputed areas where the slightest provocation could lead to strife'.

Armed drones have been on the front line of the War on Terror for years. They are so common as to be the subject of presidential humour. At the White House Correspondents' Association Dinner in 2010, President Barack Obama cautioned any potential suitors for his daughters with the joke: 'I have two words for you: Predator Drone. You will never see it coming'.

Armed drones have been praised for a targeting precision that keeps civilian casualties and economic costs low. As part surveillance machine, they are responsive to changing conditions, such as the entrance of non-combatants into a blast zone. Because drone operations pose less risk to civilians and cause less collateral damage overall, they tend not to threaten diplomatic relations with neighbouring states. This matters because the use of drones typically requires flyover and basing rights in neighbouring countries.

To date, only the US, the UK and Israel are known with certainty to have used armed drones, although other states such as China, Pakistan, India, Turkey and Iran also claim to have the technology either ready to deploy, or in advanced development. In Australia, Japan and Singapore, the focus has been on surveillance drones, which the NSW Police began trialling late last year.

Both surveillance and armed drones have potential international security implications. Recently, Japanese officials debated shooting down Chinese surveillance drones over the disputed Senkaku/Diaoyu Islands. A former PLA commander responded that this would be considered an act of war.

So far the US has conducted the vast majority of drone strikes (over 1000 in Afghanistan alone since 2008). Even as the annual number of strikes has steadily increased, the use of armed drones has maintained broad political support. According to a 2015 Pew survey, the majority of Americans polled support the use of drone strikes against terrorist targets (except where the target is a US citizen). This is not surprising in an age of intervention fatigue. Armed drones are a much easier sell than 'boots on the ground'.

Less clear are both Americans' views on international sales of armed drone technology, and the long term security implications of proliferation enabled by such transactions.

Image courtesy of New York National Guard


10 Nov 2015 08:52

Last Thursday on The Interpreter, Jennifer Hunt outlined the considerable issues that arise in a world in which armed drones proliferate. But the proliferation of armed drones is already more advanced than her article might suggest. By some counts, up to 70 nations are pursuing them.

The announcement that the US has given approval for the weaponisation of Italian Reaper drones is significant, but not entirely unexpected. Italy is a critical partner for US drone operations, particularly since it acquired the Reaper, but also because of the importance its Sicilian NATO airbase Sigonella assumed during the 2011 Libyan campaign. From this location Italy has flown Reaper missions in support of Operation Mare Nostrum, the mission to rescue migrants sailing north from Africa. Italy has also hosted Global Hawk operations from Sigonella. By transferring Hellfire missile capability to Italy, the US builds capacity in a trusted NATO partner with the geographic access necessary to contribute to operations in both the Middle East and North Africa.

Perhaps as part of a broader tilt to federated defence, the US is on a trajectory to build armed drone capacity with a number of trusted partners (here's a US Air Force view of drone use in the next 25 years). It also sees the use of long range drones such as Triton and Global Hawk in global terms (see the Broad Area Maritime Surveillance concept here). Australia is a critical partner in this. The Royal Australian Air Force will soon acquire the MQ-4 Triton drone for maritime operations, and earlier this year the Chief of Air Force declared an intent to pursue the acquisition of up to eight armed Reaper drones in the Defence White Paper currently before government.

In Senate Estimates testimony in April, defence officials addressed the implications of wider and more active use of weaponised remotely piloted aircraft. Jennifer Hunt is right to draw attention to Micah Zenko's excellent work on the problems associated with the use of drones; his 2010 book on discrete military operations remains the best analysis of the moral hazards of the use of remote weaponry. One of his surprising findings is that civilian national security officials are more entranced by this technology than the uniformed military, and have a greater appetite for drone strikes.

To the countries already using armed drones that Jennifer listed, add another name: Iraq.

Only a few weeks ago, footage emerged of Iraq's newest acquisition, a Chinese-manufactured armed drone, ready to join operations against ISIS. My former colleague David Schafer has written extensively on the advanced state of Chinese global drone sales; the main takeaway is that they are substantial, growing, and occurring outside existing arms control regimes.

All of this means increased proliferation of armed drones is not merely a risk, it's already happening. So embedding ethics into organisations that use them is a pressing concern, one made even more pressing given that military forces are already migrating weapons systems from remotely controlled to lethally autonomous. This talk by science fiction author Daniel Suarez (embedded above) traverses some of the issues that lethal autonomy creates. And for a chilling vision of where things might eventually end up, you won't find a better read than Suarez's Kill Decision.


11 Nov 2015 11:50

OK, so that headline is a mildly offensive way to enter the discussion started so respectfully by Jennifer Hunt and James Brown on The Interpreter. I'm sure they both love that puppy, even if they have their doubts about drones. But now that I have your attention, let me try to make a few serious points, including one about puppies.

I'll start not with drones but with 'lethally autonomous robots'. Combat drones such as those the US is using so frequently in its war on terror have a human decision-maker 'in the loop', someone (or frequently more than one person) who makes the final decision to fire on a target. Killer robots would do away with such human intervention, and the video James Brown recommends by sci-fi novelist Daniel Suarez paints a dark picture of where this technology leads, not only for warfare but for democracy.

But I found that video rather overwrought. I'm not convinced that such technology can 'reverse a five-century trend toward democracy'.

Suarez is absolutely right that military technology shapes our political institutions (see Philip Bobbitt's The Shield of Achilles on this topic), but don't forget that lethal autonomy is in a sense not new. Suarez himself points out that such weapons are already a feature of the Korean Peninsula standoff, and he doesn't even mention the humble landmine, sea mine or booby trap. And for several decades, navies have fielded fully automated anti-aircraft defence systems because the aerial threats they face just move too fast to allow for human intervention.

Moreover, it's important to recognise that automation will not allow robots to make life and death decisions, because robots can't really make decisions at all. They are merely programmed, by humans, and if we program them to fire a missile at a target at some future time, that simply means we have moved the human decision-point forward. The current generation of drones moves the human decision maker away from the battlefield geographically; the next generation will also take them away from the battlefield chronologically. But either way, it is still a human decision, and if war crimes are committed, those who operate and even those who program these killer robots ought to be liable, because they are the ultimate decision-makers.

Speaking of war and decision-making, Jennifer Hunt writes in her piece that 'An important consideration here is the moral hazard some observers believe armed drones introduce to decision-making...decision-makers can deploy the technology with no risk to pilots' lives or ground troops. The reduced cost in blood and treasure is thought to lower the threshold for the use of force.'

I think that's true, but it is also completely commonplace and unavoidable. In fact, for as long as humans have engaged in conflict with others, we have sought a battlefield advantage through technology by making the enemy more vulnerable to our weapons and us less vulnerable to theirs. It results in an offence-defence cycle, with new technologies constantly being developed to overturn or undermine the advantages wrought by the previous generation of weapons. To put it somewhat crudely, the sword came along to give one side the advantage in hand-to-hand combat, so the spear was invented to overcome the advantages of swords. And so on.

A drone, therefore, is just a tool to reduce the risk of aerial combat and gain an advantage over an enemy. Asking nations and military commanders to forego that potential battlefield advantage would be like asking them to not buy tanks or frigates. True, the world has managed to largely or wholly ban entire classes of weapons, such as chemical and biological arms, and landmines, but these are rare exceptions. The practical barriers against drone proliferation are therefore extremely high.

Moreover, for drone operators such as the US, drones don't change the risk calculation very much. The last war in which significant numbers of US combat aircraft were shot down was Vietnam. Since then, the US has conducted every one of its many combat operations around the world with overwhelming air superiority. Very few aircrew have been lost to enemy action. So the switch to drones is not a moral leap that suddenly makes aerial warfare low-risk for the US, because that has been the case for some decades.

By this point in the article, you're probably wondering about those puppies. Let me explain. The 'moral hazard' case against drones does not just fall down on the practical grounds sketched above, but on moral grounds too. By arguing that low-risk warfare makes war more likely, you are effectively saying that, in order to reduce the likelihood of war, war ought to be much riskier. But if that's your argument, why stop at drones? Don't soldiers' helmets also make the battlefield less risky for them? Doesn't the availability of advanced field hospitals make it more likely that commanders will risk the lives of their troops, knowing that they have a higher chance of survival?

The 'moral hazard' argument effectively says that nations ought to make themselves as vulnerable as possible because this encourages them to tread so carefully on the world stage that they will not provoke wars. It's the equivalent of asking drivers to strap puppies to their bumper-bars in order to discourage reckless driving.

There. I did it. I found a way to work puppies into an Interpreter debate. May Jessie, my dearly departed old Ridgeback-Red Heeler-cross, forgive me.

Photo by Flickr user philhearing.


17 Nov 2015 12:15

There is no doubt that remote weapons pose significant challenges in the evolving character of warfare.

Like other disruptive technologies emerging in the civilian realm, remote weapons such as Unmanned Air Vehicles (UAVs) have had considerable impact not only on how advanced militaries fight wars, but also on how they understand themselves.

However, as the initial ethical concerns about remote weapons become less challenging, two key aspects of remote weapons point to future concerns about their roll-out and use.

In regard to existing ethical concerns, discussion of remote weapons has centred on three main worries: first, that the distance between pilot and target makes remote weapons especially morally problematic; second, that their use, specifically by the US military, is secretive and thus of concern; and last, that they are somehow dishonourable.

The first sort of argument, that the distance between pilot and target is so great as to make them uniquely problematic, is initially tempting.

It soon fails, however, when we look at the history of almost all weapons. From the bow and arrow, to the gun, to rockets, to fighter jets and intercontinental ballistic missiles, the operator or pilot or commander has been ever more distant from their target, and that distance alone has not been of special moral concern. In this respect at least, remote weapons like UAVs are morally equivalent to many other weapons of war that we take for granted.

The second argument has some truth but is largely the product of confusion between the different branches of US offensive operations.

The US military does use UAVs primarily for reconnaissance and surveillance, but it has also increasingly used them in targeted strikes and close air support of ground troops. The doctrine for their use is public; it is even available for purchase on Amazon. Hardly a secret. The CIA is also known to use UAVs in regions where there are no official military operations occurring. However, the doctrine for CIA uses of UAVs is not public. The important point here is that the moral concern seems to be primarily whether the non-military CIA should be carrying out lethal operations, a point highlighted by UAVs but not particular to them.

The third concern is how UAVs change the actual ways that warriors are perceived. For instance, given their remoteness from conflict, are the operators of these drones pilots, actual warriors? Moreover, will the enemy see these drone strikes as cowardly, lacking honour? And if so, will this perceived lack of honour embolden the enemy, prolonging war?

The first question is largely answered – drone pilots are now receiving medals and, though it may take time, it is likely that they will be incorporated into the military as have many specialists in the past. The second question is more speculative. But it is hard to see how an enemy will consider a drone pilot to be any more or less honourable than a jet pilot kilometres above the battle or a naval officer, perhaps hundreds of kilometres from the battle.

What is perhaps of greater interest are the near-term developments we can anticipate.

The first of these is the proliferation of cheap UAVs. We can buy drones from the local toy store or online. Given their increased accessibility through penetration into the commercial realm, and their decreased costs, do they have the capacity to become widespread weapons, a remote version of small arms?

One thing that is often overlooked about UAVs is the high number of support and maintenance staff required to set up, send out and maintain them. Further, the current crop of commercially available drones is highly limited in geographic range by battery life and communications.

So, for the near term at least, it would not seem that they will be used in battle by any but the most developed militaries. The real concern lies in the capacity for people to use them as an airborne explosive device. Perhaps it will not be long before we see UAV IEDs as part of a hostile insurgent campaign.

The other future scenario is that remote weapons will become even more advanced and complex, and display increasingly autonomous capacities.

Current UAVs have the capacity for limited autonomy; they can, for instance, be set to auto-pilot when their mission is done. However, as the battery life of these remote weapons increases, and their scope of use expands, we are likely to see them display increasing levels of autonomy.

A recent report by the United Nations Institute for Disarmament Research (UNIDIR) looks at this issue in the communications-denied maritime environment. What this report demonstrates is that the idea of autonomous weapons is not mere scientific fantasy. Moreover, it brings up a host of challenges about just what autonomy is, and what it means for the future of weapons.

The overall point is that, as far as the ethics of remote weapons is concerned, we have largely left the initial concerns about the remoteness behind. In some senses we are moving into a new phase of assessment, where contrasting ideas of cheapness and complexity highlight a new set of areas that require further consideration and reflection.

Image courtesy of Flickr user Justin Ennis


18 Nov 2015 12:45

The recent series of posts on armed drones from James Brown, Sam Roggeveen and Jennifer Hunt each make a compelling case for the need to consider the ethics of these weapons. These authors are right, but what they may not be aware of is that such consideration is underway, at least in the Australian Army.

On 21-22 June 2016 the Army's Strategic Plans Branch will co-host with UNSW-Canberra a major conference on military ethics. The ethics of armed drones will be on the program. This conference is open to the public and its proceedings will be published, as Army encourages an open discussion of the ethics of contemporary war.

Moreover, through its Research Scheme the Australian Army has already commissioned papers on the ethics of other emerging technologies such as soldier enhancement, and intends to commission additional studies.

The ethics of armed drones are only one of numerous issues regarding their possible use that must be considered by Australia and other states considering their acquisition. Armed drones have demonstrated a stunning ability to kill people from great distances with virtually no risk to their operators. But there has been little consideration by the world's military organisations of their contribution to the ultimate objective of all war, the ability to compel an adversary to accept your will (in the classic Clausewitzian sense). The effectiveness of military technology has been a theme of Williamson Murray's work, to which he recently returned, and he finds fault with the wisdom of the recourse to drones and precision strike.

For me, the question that needs to be answered is: if a target is subjected to unexpected and instant death, has there been any effect on will?

The contribution to success in war (to victory) is an important aspect of any evaluation of the ethical utility of a weapon. I would argue that a weapon that doesn't meet ethical standards is unlikely to make a positive contribution to forcing your enemy to accept your will. Rather, it is likely to have the opposite effect.

Armed drones have dazzled many military and political minds with their ruthless efficiency. But efficiency and effectiveness in war are not the same thing. Efficiency in killing won't translate into effectiveness in war unless the ethics are right.

Otherwise, armed drones are nothing more than instruments of murder.

Photo courtesy of Flickr user Airman Magazine.


26 Nov 2015 14:04

Is the drone pilot a warrior? It's a crucial question surrounding the place of the drone pilot within the military ethos – and one Adam Henschke points to in a recent entry in this series of posts on the future of drones on The Interpreter.

It's a good and important question, not only for the reasons Henschke identifies. Whether or not drones are seen as cowardly and therefore offend or embolden the enemy, the lack of clarity regarding the position of the drone pilot is concerning. 

Shannon E. French argues that being identified as a warrior situates a person within the 'warrior ethos', including an informal code of conduct. 'The warrior code', as French calls it, is an honour system that regulates behaviour based on an agreed-upon sense of 'what it means to be a warrior'.

But unless drone pilots are actually able to live up to the normative demands the warrior code represents, they risk being seen as dishonourable not by enemies but by fellow military personnel. Worse, some may come to see themselves as shameful.

There are good reasons for thinking drone pilots are not warriors. Drone pilots experience no real risk in carrying out their wars, and are thus distanced in several ways from the realities of combat. Some, like Mark Coeckelbergh, suggest 'there seems to be something cowardly and unfair about remote killing'.

Others, like Christian Enemark, are more circumspect. Enemark argues drone pilots 'challenge traditional notions… of what it means to be a combatant or "warrior" within the military profession'.

Enemark describes drone pilots as 'disembodied warriors'. Disembodiment means drone pilots face no fear for their personal safety. Thus, there is an inability to practise what Enemark describes as 'physical courage' (courage when one's life is at risk). Arguably, such a virtue is part of what defines someone as a warrior — and therefore as worthy of honour by their warrior peers.

Warriors make life-and-death decisions on the battlefield. Disembodied warriors usually don't. Given their targets are vetted in advance and their superior officers able to directly monitor missions, there is very little opportunity for drone pilots to exercise any autonomy at all. They are, to return to St Augustine of Hippo's fourth century notion, 'an instrument, a sword in the user's hand'. 

Although they may be treated as such, drone pilots are not merely instruments in the hands of their superiors. They are people. As such, the moral gravity of killing bears on their consciences; they feel acutely the seriousness of what they are doing.

Here, however, the problem of risk-free warfare returns. Drone pilots can't justify the killing they do in the same way other warriors can.

Regardless of the justice of the mission or war, warriors who are physically on the battlefield can justify their killing through the framework of self-defence. Drone pilots are not defending themselves; there is no 'me or them' logic to fall back on.

Enemark says, 'war necessarily involves some kind of contest... opposing combatants' equal right to kill in war is founded on the assumption of mutual risk'. In this sense, drone pilots will not feel like warriors — their killing is no contest at all. 

Without a coherent moral framework for justifying their killing, drone operation is morally fraught. It is unsurprising then that, despite undertaking no risk, drone pilots report the same rates of PTSD as pilots of manned aircraft. This is even less surprising when one considers the growing literature on moral injury: trauma that emerges from transgressing against deeply held moral beliefs.

Drone pilots not only kill their targets, but they observe them for weeks beforehand, coming to know their habits, families and communities. That is, they are able to see their targets as persons. As Coeckelbergh notes, 'pilots may recall images of the people they killed... of the person who first played with his children and was then killed'.

Based on the trauma they experience, many drone pilots appear to consider themselves in some sense morally responsible for those who they kill. Despite effectively being an instrument in the hands of superiors, it is the pilot who does the killing.

If drone-based killing is to be justified, drone pilots need to be made aware that the justifications for it are manifestly different to those available to front-line soldiers. Just because drone pilots serve the military does not make them warriors, and does not avail them of the kind of justifications for killing that soldiers possess.

A new moral framework is necessary to explain how (if at all) unmanned, risk-free killing can be justifiable, lest more drone pilots become wracked with the guilt of what the warrior code holds to be unjustified killings.

Better would be the emergence of a new honour code available to 'disembodied warriors' (like drone pilots and cyberwarriors) which emphasises moral virtues other than courage. It should also explain how their killings can be justified. If this cannot be done, perhaps the practice of armed drones should either be made fully autonomous (which is itself, as James Brown argues, likely to be unethical) or abandoned altogether.

Photo courtesy of Flickr user Airman Magazine.
