Fears over robots in warfare
Tuesday 24 June 2014 21.54
Wars in the very near future could be fought by robots operating completely autonomously of any human input, according to one Irish scientist.
Noel Sharkey, Professor of Artificial Intelligence and Robotics at the University of Sheffield, has said an international ban must be put in place now, before one country starts using such systems in conflict and opens the door for others to follow suit.
Currently military forces around the world use unmanned aerial vehicles or drones, and other remotely operated military systems in combat.
However, it is thought that so far no nation has openly used entirely autonomous aircraft, submarines, surface vessels or tanks capable of tracking, selecting, targeting and deploying weapons entirely by themselves, based on algorithms.
Speaking at the Euroscience Open Forum in Copenhagen, Professor Sharkey warned such systems were already being developed by a number of countries, including the US, China, Israel, Taiwan and Russia.
However, he said that because such systems have not yet been deployed, an opportunity still exists to ban them before nations race to develop and use them.
Professor Sharkey is chairman of the International Committee for Robot Arms Control, a coalition of human rights and other organisations seeking a UN ban on the use of such war machines.
The committee believes such machines are incapable of compassion, consideration or mercy, and are therefore more likely to breach human rights and the rules of war, and to kill more civilians.
However, not everyone agrees.
Some scientists believe such systems will never be possible and therefore will never be used, while others think autonomous killing machines could make wars more humane, because human emotions, including hate and fear, would be taken out of the equation.
Also speaking at ESOF, Professor Ronald Arkin of the Georgia Institute of Technology said technology can, should and must be put to better use to save lives in war zones.
One way to do this, he said, is to use robots that can be more selective and accurate.
He said further research is needed to see if this is possible, and to see what level of autonomy is acceptable.
Professor Arkin said constraints would be necessary to ensure compliance with international law.