That’s a good question, rs98, and it’s something some roboticists – including me – worry about a great deal. The first thing to say is that it depends on what you mean by a robot. Some missiles already have a lot of ‘intelligence’ built in; cruise missiles, for instance, can find their own way to their target. So, in a sense, cruise missiles are already semi-autonomous robots fighting in wars.
However, I guess you probably mean robots fighting instead of, or alongside, soldiers. Here again there are already remotely controlled robots in use to help soldiers with bomb disposal, or to go into a building and let soldiers see inside (through the robot’s cameras) before they enter it themselves. However, these robots can’t learn, are not intelligent at all, and are remotely controlled by the human soldiers.
But – and here’s the worrying part – there are some people who think that it’s ok to put a gun on a robot and use robots for fighting. I (and a number of other roboticists) think that’s a very bad idea indeed. There are two main reasons I think it’s a bad idea (apart from weapons generally being not-a-good-thing for humanity). The first is that if the robot with a gun is remotely controlled by a human, then the human has to make decisions about aiming and firing the gun based only on what he or she sees through the robot’s cameras. The problem with this is that it’s very hard to really see and understand what’s going on in a battle when you’re not there and are just looking through a robot’s cameras – so I think you would be more likely to make mistakes and shoot either innocent civilians or soldiers on your own side.
The second reason I think it’s a bad idea is if the robot-with-a-gun is not remotely controlled by a human but ‘autonomous’. In other words, the robot decides, on its own, where to aim its gun and when to fire. Of course there are serious ethical and legal problems with this, like who is responsible if the robot makes a mistake and shoots the wrong person. But I won’t go into those here. Instead I’ll explain the basic technical problem, which is – in a nutshell – that robots are way too stupid to be given the autonomy to make the decision about what to shoot and when. Even the smartest autonomous mobile robots around today are not much smarter than an insect. Would you trust a robot with the intelligence of an ant with a gun? I know I wouldn’t. I’m not sure I would even trust a robot with the intelligence of a chimpanzee (probably the next smartest animal to humans) with a gun. And we are a long, long way from making robots as clever as chimpanzees. Robots with human-level intelligence are, I think, hundreds of years into the future; I wrote about that here http://ias.im/35.1633.
Personally I would like to see international laws passed that prohibit the use of robots with guns (a robot arms limitation treaty). What do you think?