Giving machines a mind of their own
With the rapidly increasing use of intelligent machines such as unmanned aircraft in modern warfare, the definition of the term 'human' seems to demand a revision, Manmohan Bahadur writes.
God has been making man from the same mould since time immemorial. Let's call his version Man Mk 1.0. But the time is soon coming (if it has not already come) when we may have to reclassify a newer version - Man Mk 2.0.
That would be a gift of science and of the progress being made in the field of artificial intelligence. However, its utilisation in war raises serious ethical and moral issues that need to be addressed soon.
The Libyan crisis and the Western intervention in that conflict offered a foretaste of why the term 'human' may need redefining.
UN Security Council Resolution 1973 authorised military action 'to protect civilians and civilian populated areas under threat of attack in the Libyan Arab Jamahiriya, including Benghazi, while excluding a foreign occupation force of any form on any part of Libyan territory'.
Apache helicopters closed to within 8 km to fire missiles and other armament. Fighter aircraft flew high but were intimately involved with the ground action, while unmanned aerial vehicles (UAVs) hovering in the Libyan skies kept a persistent vigil on the goings-on.
Was the UN resolution complied with merely because no Western soldier physically touched Libyan soil? Is there, then, a need to redefine a 'human' in warfare so that 'human presence' can in turn be re-qualified?
UAVs are soon going to start sharing airspace with passenger aircraft.
In the US, the process of integrating Unmanned Aircraft Systems (UAS) into national airspace is to commence in 2015, while the International Civil Aviation Organisation will have complete integration in place by 2028, allowing UAS to fly autonomously on international air routes.
In less than a decade, Man Mk 1.0 would be confident enough to hand over major decision-making authority involving his own safety to a machine capable of taking such decisions. This would still not be Man Mk 2.0 but an intermediate step, say Man Mk 1.1, in which a human would remain in the loop to take over and guide his 'ward'.
This is the benign part of the problem, for come the decade of the 2030s and beyond, when highly advanced artificial intelligence would be available, the authority to distinguish friend from foe and to take offensive action against the foe could be transferred to the machine.
The US Air Force has a document titled 'Unmanned Aircraft Systems Flight Plan 2009-2047', which lays out in considerable detail the road map for the development and employment of UAS by the middle of this century.
The document states that, 'The end result would be a revolution in the roles of humans in air warfare.' It sure would be, for the US Air Force already has more pilots training to be UAV operators than to fly manned aircraft.
As the technologically advanced countries avoid putting their personnel in harm's way, the pressure to outsource the dangerous tasks to automated machines would be overpowering.
And in UN operations, where political sensitivity debars the presence of foreign troops, these 'non-human' lethal killing machines would be the ideal means of deflecting criticism.
The recent Human Rights Watch and Amnesty International reports are critical of American drone attacks in Yemen and Pakistan. That is the present situation, with a human still mandatorily ensconced in the killing loop; imagine the scenario when Man Mk 2.0 makes its appearance.
It would be autonomous and brutish, and its employment would have to be controlled through suitably drafted international law. We, Man Mk 1.0, had better get our act together to safeguard the humanity in our society.
Manmohan Bahadur is an Air Vice-Marshal of the Indian Air Force
The views expressed by the author are personal