Michael D. Moberly | April 5, 2016 | 'A blog where attention span really matters!'
Lethal autonomous weapons systems (LAWS) represent, in my judgment, an inevitable but as-yet-incomplete class of weapons embedded with capabilities to independently select and engage targets (adversaries) without human (operator) assessment or interventional oversight.
LAWS are unlike existing (conventional) pilotless drone 'aircraft' in the sense that they are, or will be, largely if not wholly autonomous. In other words, as I have come to understand LAWS, once deployed in various manifestations they can surveil, assess, and execute in a wholly independent manner, presumably with internal assessment and decisional guidance wrapped in AI (artificial intelligence) software.
The development and introduction of remotely piloted and controlled drones for operation in theaters of combat, counter-insurgency, and counter-terrorism, and for surveillance and intelligence gathering, serve as real-time hedges favoring the expansion of risk-averse strategies, particularly with respect to human life. Obviously, drones deployed in war-fighting circumstances can deliver devastating munitions to specified adversaries and targets with the aid of satellite and global positioning systems, but only at the direction of their human operators and overseers, thus mitigating the risk to, and the requisite for, 'boots on the ground'.
Presumably LAWS, on the other hand, will be designed and programmed with capabilities to identify, assess, and self-authorize target engagement, i.e., to seek, find, distinguish, select, and engage targets absent human intervention or oversight, à la the simultaneous introduction of countless 'Jason Bournes' to a conflict theater. LAWS could presumably also function as 'defensive' weapons, i.e., as theater interceptors and destroyers of an adversary's incoming munitions, to supplant human reaction times.
Aside from the autonomy and independence of such weapons systems, their development and use is presumably intended to mitigate, or favorably affect, humans' and societies' intangible senses and perceptions of risk, fear, and safety, while simultaneously serving as formidable strategic deterrents, each of which is an intangible. To be sure, adversaries and allies alike are aggressively pursuing comparable and competing LAWS war-fighting capabilities, the theater functionality of which may prove more or less effective, at which time the aforementioned intangible (asset) senses will likely change accordingly.
This post was inspired by the writings of Heather M. Roff, particularly an article published in Slate Magazine (online) dated April 7, 2016, titled 'Killer Robots on the Battlefield: The Danger of Using a War of Attrition Strategy with Autonomous Weapons', in advance of her testimony at the U.N. Convention on Certain Conventional Weapons meeting in Geneva on April 11, 2016.