Human Rights Watch: Petition Against Robots On the Battle Field

samzenpus posted about a year and a half ago | from the why-was-I-programmed-to-feel-pain? dept.

The Military

New submitter KublaCant writes "'At this very moment, researchers around the world – including in the United States – are working to develop fully autonomous war machines: killer robots. This is not science fiction. It is a real and powerful threat to humanity.' These are the first words of a Human Rights Watch Petition to President Obama to keep robots from the battlefield. The argument is that robots possess neither common sense, 'real' reason, any sense of mercy nor — most important — the option to not obey illegal commands. With the fast-spreading use of drones et al., we are allegedly a long way off from Asimov's famous Three Laws of Robotics being implanted in autonomous fighting machines, or into any (semi-)autonomous robot. A 'Stop the Killer Robots' campaign will also be launched in April at the British House of Commons and includes many of the groups that successfully campaigned to have international action taken against cluster bombs and landmines. They hope to get a similar global treaty against autonomous weapons. The Guardian has more about this, including quotes from well-known robotics researcher Noel Sharkey from Sheffield University."

Recommended Reading (5, Interesting)

smpoole7 (1467717) | about a year and a half ago | (#43002113)

http://en.wikipedia.org/wiki/Berserker_(Saberhagen) [wikipedia.org]

Fred Saberhagen's "Berserker" series.

Aside from touching on the subject at hand, it's just some crackin' good sci-fi. :)

I don't know if we'd ever reach that point ourselves, but in that series, an unknown (and now extinct) alien race, losing a war and desperate, created "doomsday" machines that were simply programmed to kill all life. They were self-replicating, self-aware AIs that took their task seriously, too.

Then again, I ask myself what some jihadist might do, if given half the chance...

Re:Recommended Reading (2, Informative)

will_die (586523) | about a year and a half ago | (#43002361)

Also add in _Second Variety_ by Philip K. Dick

Fear of robots is a red herring (3, Insightful)

arcite (661011) | about a year and a half ago | (#43002387)

All indications are that the coming robotic revolution will usher in a new era of human peace and prosperity. Robots have no emotion, no bias. Imagine deploying a few hundred (or thousand) semi-autonomous robotic peacekeepers into a conflict zone. They maintain the peace 24/7, they never tire, and they are alert and objective in their duties. War is traditionally an incredibly wasteful and expensive exercise. Look at Iraq and Afghanistan! $1 trillion and thousands of allied casualties. Deploy a robot army and watch the costs come down. No need for living quarters, no need for food or water; logistics becomes cheaper in every aspect.

Like them or loathe them, drones are incredibly efficient at what they do. They are very lethal, but they are precise. How many innocents died in the decades of embargo on Iraq and the subsequent large-scale bombings under Bush? Estimates run to over 100,000. Use of drones in Libya, Mali, Yemen, and Pakistan has reduced costs by hundreds of millions and prevented thousands of needless casualties. Drones are the future, and the US has an edge that it will not give up.

Re:Fear of robots is a red herring (2)

VAXcat (674775) | about a year and a half ago | (#43002419)

The robots in Jack Williamson's Humanoids stories had a Prime Directive "to serve and obey and guard men from harm"... see how well that worked out...

Re:Fear of robots is a red herring (4, Insightful)

ultranova (717540) | about a year and a half ago | (#43002599)

Robots have no emotion, no bias. Imagine deploying a few hundred (or thousand) semi-autonomous robotic peacekeepers into a conflict zone. They maintain the peace 24/7, they never tire, they are alert and objective in their duties.

An autonomous robot needs to form a model of what's happening around it, use that to figure out what its possible long- and short-term actions will be, and finally decide how desirable various outcomes are relative to each other. All of these steps are prone to bias, especially since whoever designed the robot and its initial database is going to have their own biases.

Also, a robot acting in real life cannot carefully think everything through; there's simply not enough time for that. This necessitates some kind of emotion-analogue to provide context for reflexes and simple actions, just as it does in living beings.

Look at Iraq and Afghanistan! $1 trillion and thousands of allied casualties. Deploy a robot army and watch the costs come down. No need for living quarters, no need of food or water, logistics becomes cheaper in every aspect.

So there will be a lot more "interventions", since the cost (to you) is lower. I think that's part of what worries HRW.
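The model/predict/evaluate loop described above could be sketched roughly like this. All names here are made up for illustration; no real targeting system is remotely this simple, and the point is only where bias can sneak in:

```python
def choose_action(observations, build_model, predict, actions, utility):
    """Pick the action whose predicted outcome the utility function rates best.

    Bias can enter at every stage: in how build_model interprets the
    observations, in which candidate actions are considered at all, and
    in the utility function the designer wrote.
    """
    state = build_model(observations)  # form a model of the surroundings
    # evaluate each candidate action's predicted outcome and take the best
    return max(actions, key=lambda a: utility(predict(state, a)))
```

Even in this toy form, whoever writes `build_model`, picks the `actions` list, and defines `utility` has baked their own judgments into the machine before it ever runs.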

Re:Fear of robots is a red herring (3, Insightful)

Caffinated (38013) | about a year and a half ago | (#43002687)

Well, that raises the "who controls the robots" question, doesn't it? Presuming that they'd be as effective as you outline (I quite doubt it), they'd be great for making it domestically painless to invade and occupy places that one doesn't like for whatever reason, and I doubt that's a good thing (Iraq and Afghanistan only happened and went on as long as they did because, even with the casualties, the pain was almost entirely borne by military families; heck, we didn't even increase taxes to actually pay for it). In short, I'd imagine that you might have a bit of a concern about autonomous foreign peacekeeping robots patrolling your neighborhood, and I'd expect that people in other places feel that way as well.

Re:Fear of robots is a red herring (5, Interesting)

kannibal_klown (531544) | about a year and a half ago | (#43002793)

A couple of issues.

1) Software can be hacked... either partially or totally. Maybe just putz with the friend-or-foe logic, maybe take direct control, etc. Sure, humans can be blackmailed and extorted, but usually on an individual basis. Mass-putz with a regiment or squad and you have serious issues - such as, perhaps, with those drones protecting the US (if they ever become truly robotic).

2) It does make war a bit more meaningless. If you aren't facing emotional losses, then there's little reason NOT to go to war. If it's not personalized... then who cares? Sure, even now we have sympathy for the other side and protests and such... but the majority of the people that care mostly care because our brothers / sisters / sons / daughters / etc. are out there possibly dying. So that helps push back the question "should we actually GO to war with them?"

3) There ARE concerns about self-aware armed robots. Make them too self-aware, and maybe they realize that the never-ending violent slaughter of humans is contradictory to their goal of preserving their owners' lives. In which case they take an OVERLY logical step to preserve the FUTURE "needs of the many" by doing PLOTLINE X. Sure, it sounds like bad sci-fi... but as you say, they have no emotions, only logic. Take away emotion, and we become like cattle, where they cull the herd over a few random mad-cow cases to save the majority.

Re:Recommended Reading (2)

ultranova (717540) | about a year and a half ago | (#43002865)

Then again, I ask myself what some jihadist might do, if given half the chance ... . .. ..

Take the Soviet Union's place as America's Boogeyman #1, which is a pretty darn impressive accomplishment on their side and just plain sad on America's.

You are worrying about a bunch of third-world priests and their followers building a high-tech weapon that the American Army - or any first-world country - can't out-high-tech. And it got modded +5 Interesting. Come on.

Obama already leads the way (0, Insightful)

Anonymous Coward | about a year and a half ago | (#43002117)

In robot drone murders, and you morons think he will sign something? Obama, the Nobel Peace Prize winner who has killed the most innocent women and children yet!

Re:Obama already leads the way (0, Troll)

jsepeta (412566) | about a year and a half ago | (#43002197)

uh, no, that would be Stalin - killing the most innocent women and children yet. Hitler is probably number two, although white supremacists would probably counter me over the use of the word "innocent".

Re:Obama already leads the way (0)

Anonymous Coward | about a year and a half ago | (#43002247)

Apparently reading is not your strong suit, seeing as neither Stalin nor Hitler won the Nobel Peace Prize.

Re:Obama already leads the way (1)

stevencbrown (238995) | about a year and a half ago | (#43002249)

The post you're replying to is one line long, and you still can't read it right? When did Stalin or Hitler win the Nobel Peace Prize?

Re:Obama already leads the way (0)

Anonymous Coward | about a year and a half ago | (#43002319)

Hitler was nominated for the Peace Prize in 1939.

Re:Obama already leads the way (2)

hsmith (818216) | about a year and a half ago | (#43002809)

Anyone can be nominated for a Nobel Peace Prize...

Re:Obama already leads the way (1)

KiloByte (825081) | about a year and a half ago | (#43002349)

When did Stalin or Hitler win the Nobel Peace Prize?

By then, the Nobel Committee had some shreds of dignity left. Yet I'd count hundreds of millions of children forced to sing songs in school that call Stalin the "sun of humanity", "father of peace", and so on.

Re:Obama already leads the way (1)

Anonymous Coward | about a year and a half ago | (#43002259)

Unfortunately you're both wrong. Mao wins that particular genocide competition with a death total between 45 million and 75 million.

Re:Obama already leads the way (1)

medcalf (68293) | about a year and a half ago | (#43002321)

Actually, Mao is first, followed by Stalin and then Hitler.

Re:Obama already leads the way (2)

Kartu (1490911) | about a year and a half ago | (#43002569)

Stalin's regime (officially) executed between 3.5 and 5 million (most of it in the post-civil-war era; check how it went in post-revolution France). Even assuming all of them were innocent, how could you compare that to what Hitler did and come to the conclusion that he did less?

Stalin was an asshole, but you putting him ahead of Hitler (who was fine with exterminating entire nations) shocks me. You took the anti-kommie propaganda too seriously, guys.

Re:Obama already leads the way (1)

jabuzz (182671) | about a year and a half ago | (#43002657)

It is not the "officially" executed that counts; it is the millions who died of starvation as a result of his policies. By that measure Stalin is way beyond Hitler. That includes the millions who died because Stalin had "purged" the Red Army of all the effective officers who could have stopped Hitler much, much sooner.

Re:Obama already leads the way (0)

Anonymous Coward | about a year and a half ago | (#43002547)

And he'd have gotten away with it too if it hadn't been for selling all the grain to the highest bidder during the famine in order to buy guns.

Capitalism: destroying Socialism since forever.

Re:Obama already leads the way (5, Insightful)

Anonymous Coward | about a year and a half ago | (#43002437)

In robot drone murders and you morons think he will sign something? Obama, Nobel Peace Prize winner that has killed the most innocent women and children yet!

I believe Yasser Arafat, Henry Kissinger, Yitzhak Rabin, Shimon Peres, Menachem Begin, and Le Duc Tho all currently lead Obama in the "Number of Innocents Killed by a Nobel Peace Prize Winner" race.

Re:Obama already leads the way (1)

kelemvor4 (1980226) | about a year and a half ago | (#43002941)

Sorry, but if you haven't been paying attention, the European Union is a laureate. http://www.guardian.co.uk/world/2010/oct/15/un-backed-troops-accused-rape-congo [guardian.co.uk] or https://en.wikipedia.org/wiki/United_Nations_peacekeeping#Reception [wikipedia.org] or just take the UN's word for it here: https://www.un.org/en/peacekeeping/fatalities/documents/stats_1.pdf [un.org]. The UN claims to only whack under 200 a year.

Deal with it. (2)

Colan (2771285) | about a year and a half ago | (#43002125)

As far as I can tell, we need to focus on dealing with the presence of drones and "killer robots," not how to prevent them. Like it or not, 'progress marches on'.

Re:Deal with it. (2)

Vanderhoth (1582661) | about a year and a half ago | (#43002499)

I think robots are a great way to go. Sure, in the beginning there would be robots fighting human soldiers, but in time human soldiers would be phased out. They're expensive, in the sense that it takes 18-20 years (in civilized society) to raise and train a human soldier. Robots could be built and programmed in a few days with a good manufacturing plant. The future of war will just be machines fighting other machines.

It'd be like a real-life game of StarCraft, with humans controlling the groups of robots remotely. The first side to run out of resources and/or units loses, with no intentional loss of human life.

Re:Deal with it. (3, Insightful)

Darth Snowshoe (1434515) | about a year and a half ago | (#43002813)

You are describing your own fantasy rather than a reasoned prediction.

Surely once the robots break through the curtain of defenders, they will begin quite efficiently to kill the civilian population and their infrastructure. How would robots even distinguish between them? (In fact, this is a difficulty for human soldiers today.) Is it not likely that civilians would attempt, at the last, to defend themselves and their families also?

The hope for humanity is not that the winners will somehow be more virtuous than the losers. Our only hope is that, as the consequences of armed conflict escalate, the number and severity of conflicts will dwindle.

Re:Deal with it. (1)

Darth Snowshoe (1434515) | about a year and a half ago | (#43002829)

(edit)

"quite efficiently to" kill

Re:Deal with it. (1)

NatasRevol (731260) | about a year and a half ago | (#43002837)

" with no intentional lost of human life."

Yeah, as long as the winning side chooses not to wipe out the humans on the losing side, since they'll have no robot protection anymore.

I'm sure that'll never happen.

Define "robot" (1)

Anonymous Coward | about a year and a half ago | (#43002129)

There is no satisfactory definition, just as there is no definition for "Artificial Intelligence". Things change and so does our acceptance of what is commonplace vs. what is considered novel.

Endorsements (1)

Sparticus789 (2625955) | about a year and a half ago | (#43002137)

This message is sponsored by Sarah and John Connor. With special consideration from Morpheus, Trinity, and Neo.

These are not the droids you're looking for (4, Informative)

rodrigoandrade (713371) | about a year and a half ago | (#43002147)

Hey, James Cameron, are you the submitter??

The autonomous Terminator-style robots the summary refers to are far from becoming a battlefield standard, much to the disappointment of the /. crowd and sci-fi nerds.

Predator drones et al., like all current robotic devices on the battlefield, still have a human being in charge making all the decisions, so the points raised are completely moot.

Re:These are not the droids you're looking for (1)

MitchDev (2526834) | about a year and a half ago | (#43002171)

Are they really just remote controlled devices rather than autonomous "robots"?

Re:These are not the droids you're looking for (5, Interesting)

fuzzyfuzzyfungus (1223518) | about a year and a half ago | (#43002225)

Yes and no: especially sophisticated autonomous robots, either self-driving vehicles or biomimetic killbots of some sort, are sci-fi stuff; but land mines ('That's bi-state autonomous area denial agent, sir, to you, cripple!') and more sophisticated devices like the Mark 60 CAPTOR [wikipedia.org] are autonomous killer robots.

And, so far, they've proven deeply unpopular in bleeding-heart circles. The fancier naval and anti-vehicle mines are still on the table; but the classic land mine enjoys a sense of ethical distaste only slightly less than just hacking off children's limbs yourself...

Re:These are not the droids you're looking for (2)

wren337 (182018) | about a year and a half ago | (#43002355)

Landmines are the perfect example of existing autonomous technology. The next steps would be, I imagine, drones that fly themselves home if jammed. Still pretty innocuous, but a step into automation.

Also imagine a first-generation turret: automated target acquisition based on stereo imaging and stereo microphones. The first models would require an operator to approve the target. But the systems are so much faster than us - soon you'd want to be able to approve a target area, hold down the "OK" button and have it keep firing. We're not talking spray and pray here - this thing could be a single-round, fully automated sniper, catching someone who only sticks their head up for a fraction of a second. How long until you'd designate an area as a no-go hostile zone and leave it on all night to guard the perimeter?
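The escalation described above - per-target approval first, then a pre-approved area - amounts to a tiny change in logic. A hypothetical sketch (none of these names or modes correspond to any real fire-control system):

```python
def would_engage(target_pos, mode, operator_ok=False, approved_area=None):
    """Return True if this toy system would fire on a detected target."""
    if mode == "per-target":
        # first-generation model: a human approves every single target
        return operator_ok
    if mode == "area":
        # later model: the operator pre-clears a rectangular zone;
        # anything detected inside it is engaged with no further input
        x, y = target_pos
        (x0, y0), (x1, y1) = approved_area
        return x0 <= x <= x1 and y0 <= y <= y1
    return False
```

The unsettling part is how small the diff is between "human in the loop" and "indiscriminate area denial": one branch of an if-statement.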

Re:These are not the droids you're looking for (0)

Anonymous Coward | about a year and a half ago | (#43002415)

I think the analogy to mines gets right to the heart of the issue. When you hand the kill/don't-kill decision over to an autonomous agent - be it a mine's pressure switch or some near-future drone's IFF system - you invite more liberal deployment than with weapons where the deployment of the weapon and the destruction of the target are utterly inseparable.

Re:These are not the droids you're looking for (1)

Hentes (2461350) | about a year and a half ago | (#43002287)

But we shouldn't wait until the autonomous drones arrive.

Re:These are not the droids you're looking for (2)

progician (2451300) | about a year and a half ago | (#43002505)

Disappointment or not, the problem is kind of different. In fact, the problem has existed for quite a while now, ever since people invented time bombs, remote-controlled bombs, suicide attackers, and such.

The issue at hand is the following: war is about killing people and destroying stuff. People on the battlefield facing each other turned out to be counter-productive in this regard, as exemplified on many occasions at the end of the First World Massacre. After a long period of constant threat of death, patriotism, religious fanaticism or any other ideological commitment to the slaughter gives way to the basic instinct of staying alive, and many soldiers deserted their posts and went home. In many cases, if an officer tried to hold them back with threats, he simply got a bullet instead of the enemy. There was also the threat that people on both sides of the front would realize they were not really enemies, that they were there to kill each other in the name of others and others' interests, and that they could simply walk home and let each other live.

Ever since the Great Massacre, technology has been invested in a literal war machine that removes these options from war. There are no massive face-to-face battles. Behind the heavy artillery, bombing and armoured vehicle attacks, the troops are there to fill the gaps. This makes it possible to organize an army of professional combatants and remove them from the front line, allowing them to keep their emotional distance from the enemy. It is just a job now, like anything else. War is a business, with turnovers, wage labour, and increasing automation.

Sure, armies aren't made up of autonomous military robots yet, but the trend is clear and straight. It is not the weaponry that wins a war, but the level of fear in the population. Thus the increasingly automated weaponry is aimed at the population at large and, just like weapons of mass destruction, it is aimed at breaking the will of the combatants by keeping the population in fear, as a hostage. The same approach is employed by professional armies and by the ragged army of Islamist "terrorists".

Extrapolate this trend and you'll end up with remote-controlled death squads all over the world. And as for the sci-fi point, you can also see that if we were able to mass-produce autonomous robots, the Three Laws of Robotics would be largely ignored, since one of the major interests in, and customers for, robotics is the military.

Re:These are not the droids you're looking for (0)

Anonymous Coward | about a year and a half ago | (#43002523)

Hey, James Cameron, are you the submitter??

James Cameron would not write about it. He'd just spend 20 billion of his own money to develop autonomous killer robot technology (and probably a working time machine as well) for his Terminator reboot to make it as realistic as possible.

Re:These are not the droids you're looking for (1)

KublaCant (2847303) | about a year and a half ago | (#43002769)

I am glad to be "just" an ordinary developer and software architect, not James Cameron. All I wanted with this submission was to kindle discussion. Obviously, that goal was reached :-) The "still" in "still have a human being in charge", however, is telltale, and in and of itself already justification enough for wanting to kindle such a discussion, as it is an often-heard argument. You do know, I suppose, that there are already fully automated guardian robots for sale, armed with nothing less than rapid-fire cannons?

samson (3, Interesting)

nten (709128) | about a year and a half ago | (#43002833)

http://en.wikipedia.org/wiki/Samson_RCWS [wikipedia.org]

These turrets count, I think. Israel has at times said they are keeping a man in the loop, but the technology doesn't require it, and at times they have said they are in "see-shoot" mode. This is essentially indiscriminate area denial that is easier to turn off than mines. It does have the computer-vision and targeting aspects of a killer robot, just not the path-finding and obstacle-avoidance parts.

I want that! (-1)

Anonymous Coward | about a year and a half ago | (#43002165)

...any sense of mercy ...

How to put this ....

One of the reasons why our wars of late have been dragging on and on, depleting the resources of both sides and causing way too many civilian deaths, is that we pussyfoot around.

We are afraid of body bags coming home and we're afraid of collateral damage. Unfortunately, the paradox is that pussyfooting around causes more of both.

With killer machines, we'll have a combat force that can pussyfoot around without the consequent needless deaths of troops - the machine's program senses that civilians may be hurt, so it stands down and doesn't take the shot. No body bag, brain injury, maiming or whatever.

Secondly, I WANT our enemies to know that they are fighting machines. Machines that are not afraid and won't succumb to terror or psychological games.

You want to screw with us? Then a bunch of terminators are coming to take care of it - you bags of meat. And I would love it if the terminators didn't shoot, but grabbed our enemy by the limbs and tore them apart. That'll make them think TWICE about attacking us!

Re:I want that! (3, Interesting)

jsepeta (412566) | about a year and a half ago | (#43002215)

In the 1890s, Tesla staged naval battles in Madison Square Garden where remote-controlled boats did battle against each other. His goal was to have robots fight in wars as our proxies, so men wouldn't have to die. But eventually, it will be man vs. machine, Terminator-style.

Re:I want that! (0)

Joehonkie (665142) | about a year and a half ago | (#43002231)

His machines weren't "robots" any more than Predator drones are: they were remote controlled by radio.

Re:I want that! (3, Informative)

Ch_Omega (532549) | about a year and a half ago | (#43002801)

His machines weren't "robots" any more than Predator drones are: they were remote controlled by radio.

Yes. That is probably why he stated that they were remote-controlled.

Humans in the loop (0)

Anonymous Coward | about a year and a half ago | (#43002175)

I thought that all current systems still have a human-in-the-loop who makes the use-of-force decision? Or has this changed in recent years? I know last I heard, it was still being discussed in military ethics circles. Have we moved on from this?

Re:Humans in the loop (1)

Dr. Tom (23206) | about a year and a half ago | (#43002363)

There's a lawyer standing behind the drone pilot. He's there to make sure no laws are violated. So it isn't the drone, or the ROV pilot, it's the lawyer who makes the kill decision. So if you are complaining about it, ask yourself who makes the laws? More importantly, in other countries that are about to become drone capable, what sorts of laws do they have preventing arbitrary kills?

Drones are Piloted (1)

ZombieBraintrust (1685608) | about a year and a half ago | (#43002177)

The drones America uses are piloted by humans. The other robot in use by the military is the one that disables bombs. It also is remote controlled by a human. I don't think the military has any non piloted robots deployed in combat. Even a turret would be too dangerous. An automated turret could kill our own troops. Closest thing we have is landmines.

Re:Drones are Piloted (1)

Jawnn (445279) | about a year and a half ago | (#43002213)

The drones America uses are piloted by humans. The other robot in use by the military is the one that disables bombs. It also is remote controlled by a human. I don't think the military has any non piloted robots deployed in combat. Even a turret would be too dangerous. An automated turret could kill our own troops. Closest thing we have is landmines.

Are you sure about that?

Re:Drones are Piloted (1)

Anonymous Coward | about a year and a half ago | (#43002307)

I'm pretty sure we have automated anti-aircraft and anti-missile guns.
http://en.wikipedia.org/wiki/Kashtan_CIWS
http://en.wikipedia.org/wiki/Phalanx_CIWS

Re:Drones are Piloted (0)

Anonymous Coward | about a year and a half ago | (#43002237)

Phalanx automatic gun systems are deployed on ships and I believe were/are also used as base defences in Iraq/Afghanistan.

Re:Drones are Piloted (2)

Sparticus789 (2625955) | about a year and a half ago | (#43002271)

I don't think the military has any non piloted robots deployed in combat. Even a turret would be too dangerous.

Ever hear of the PHALANX/CIWS? [wikipedia.org] Automated turrets that are placed on aircraft carriers and on bases in the Middle East to shoot down incoming mortars and rockets. Something capable of shooting 4,500 20mm rounds per minute could be very deadly. Because human reaction time is too slow, these turrets DO fire automatically.

Re:Drones are Piloted (2)

PopeRatzo (965947) | about a year and a half ago | (#43002527)

Something capable of shooting 4,500 20mm rounds per minute could be very deadly.

But please, only when used by a well-regulated militia.

Re:Drones are Piloted (2)

Sparticus789 (2625955) | about a year and a half ago | (#43002563)

Bought one last week. My well-regulated militia is very interested in not being killed by a Hellfire missile shot by Obama. We are American citizens after all, and subject to assassination order by the President.

And in case anyone is too dense to recognize the sarcasm.... /sarcasm

Re:Drones are Piloted (0)

CanHasDIY (1672858) | about a year and a half ago | (#43002763)

Bought one last week. My well-regulated militia is very interested in not being killed by a Hellfire missile shot by Obama. We are American citizens after all, and subject to assassination order by the President.

And in case anyone is too dense to recognize the sarcasm.... /sarcasm

Huh, and here I was thinking it was one of those "terrifyingly prophetic" type of statements...

Re:Drones are Piloted (2)

Mindcontrolled (1388007) | about a year and a half ago | (#43002915)

But, but... careful there, brother. It's not the Hellfires shot by Obama personally that are after you. Don't neglect to watch out for the black FEMA/UN helicopters implementing Agenda 21!!!!

also... /sarcasm

Re:Drones are Piloted (0)

Anonymous Coward | about a year and a half ago | (#43002323)

For now. It is inevitable that some crude form of AI will be running the drones - and not necessarily American ones. People assume it will be the Americans, as right now they are spending the most. But what if they decide to slash the cost, and someone else steps up to the plate? It happened to England, France, and Spain; all three were 'world superpowers' in their time.

Many battle strategies over the years have just 'gone by the numbers': X number of enemy killed for Y number of 'friendlies' killed. Just add in 'hours of use before autonomous disablement' and you are there. It also takes quite a large amount of training and cost to make a soldier combat-ready these days; if you can do it at a fixed cost and far cheaper, suddenly drones become something else entirely. The English got slaughtered a couple of times by the French because they wrongly assumed the longbow was good enough, since it could out-fire the crude cannon of the day. Once the French fixed that 'crude' problem, they thumped the English.

What you will probably see at first is one human in charge of 5-10 drones. The drones act 'autonomously' and the controller can take over any of them. Then, as they get comfortable with the tech, you will see something like 1 to 50. Then they will take the 'commander' out of the loop and put it in the hands of 'strategy committees'. Then they will let the computer fight what, from our point of view in the command 'bunker', is a large RTS game.

Re:Drones are Piloted (2)

fuzzyfuzzyfungus (1223518) | about a year and a half ago | (#43002459)

What you will probably see at first is one human in charge of 5-10 drones. The drones act 'autonomously' and the controller can take over any of them. Then, as they get comfortable with the tech, you will see something like 1 to 50. Then they will take the 'commander' out of the loop and put it in the hands of 'strategy committees'. Then they will let the computer fight what, from our point of view in the command 'bunker', is a large RTS game.

It's already one pilot to multiple drones. Given that one of the major features on the things is long endurance/loiter times, and they possess some limited automation of basic flight functions(ie. unlike a 'basic' RC aircraft where every control surface is directly mapped to a joystick on the controller, and the pilot has to compute the control-surface configuration that gets the path he wants), a single person can watch over multiple drones at a time, and (so long as the standing order is some variation of 'just putz around at safe altitude until I come back') a drone can temporarily be ignored if something more important is happening with one of the others.

If memory serves, takeoff/landing still has to be done one-on-one, and all waypoint assignment and weapons targeting is human-controlled; but handling the aeronautical details of moving from waypoint to waypoint is already automated.
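The waypoint layer described above - operator picks the destinations, software works out the flying - can be illustrated with a toy sketch. This is purely hypothetical and has nothing to do with any real UAV autopilot; it just shows the division of labor:

```python
import math

def bearing_to(pos, waypoint):
    """Bearing in degrees from pos to waypoint (east = 0, counter-clockwise)."""
    dx, dy = waypoint[0] - pos[0], waypoint[1] - pos[1]
    return math.degrees(math.atan2(dy, dx)) % 360

def current_target(pos, route, arrive_radius=1.0):
    """Drop waypoints already reached; return the active one (or None)."""
    while route and math.dist(pos, route[0]) <= arrive_radius:
        route.pop(0)  # close enough: consider this waypoint done
    return route[0] if route else None
```

The operator's whole job reduces to editing `route`; everything between waypoints is the machine's problem, which is exactly why one pilot can mind several aircraft at once.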

It might be OK (1)

Pete LaGrange (696064) | about a year and a half ago | (#43002201)

if we could just get the robots to only fight other robots...

Re:It might be OK (1)

SJHillman (1966756) | about a year and a half ago | (#43002445)

Transform and roll out!

Re:It might be OK (1)

Ch_Omega (532549) | about a year and a half ago | (#43002845)

if we could just get the robots to only fight other robots...

Yes. Then, all your enemy would have to do to defeat your robot army, is to send a human army.

It's the same as bio-warfare (4, Interesting)

RobinH (124750) | about a year and a half ago | (#43002239)

If you think about a virus for a second, it's the same thing. You can't reason with a virus. It doesn't make moral decisions. It just does what its DNA programs it to do, and it's even more dangerous because it's self-replicating. We need to deal with autonomous robots the same way we deal with bio-warfare.

Depending on how one defines "robot"... (1)

mschaffer (97223) | about a year and a half ago | (#43002243)

Depending on how one defines "robot", this will be extremely unlikely.

Effectiveness trumps morality every time. (4, Insightful)

concealment (2447304) | about a year and a half ago | (#43002245)

I don't mean to be the dark figure in this conversation, but I think it's inevitable that robots will be used on the battlefield, just like people are going to continue to use cluster bombs, land mines, dum-dum bullets and other horrible devices. The reason is that they're effective.

War is a measurement of who is most effective at holding territory. It is often fought between uneven sides, for example the Iraqi army in their 40-year-old tanks going out against the American Apaches who promptly slaughtered them. Sometimes, there are seeming upsets but often there's an uneven balance behind the scenes there as well.

Robots are going to make it to the battlefield because they are effective not as killing machines, but as defensive machines. They're an improvement over land mines, actually. The reason for this is that you can programmatically define "defense" where offense is going to require more complexity.

Already South Korea is deploying robotic machine gun-equipped sentries on its border [cnet.com]. Why put a human out there to die from sniper fire when you can have armored robots watching the whole border?

Eventually, robots may make it to offensive roles. I think this is more dubious because avoiding friendly fire is difficult, and using transponders just gives the enemy homing beacons. In the meantime, they'll make it to the battlefield, no matter how many teary people sign petitions and throw flowers at them.

Re:Effectiveness trumps morality every time. (0)

Anonymous Coward | about a year and a half ago | (#43002391)

I can't say that I mind robots on the battlefield all that much. At least it is an improvement over minefields.
What I don't like is that it enables a nation to go to war without risking human casualties of their own. This makes it possible to pretty much slaughter another people without upsetting anyone back home.
It's harder to keep an occupation going when your voting population actually risks their lives.

Re:Effectiveness trumps morality every time. (1)

SirGarlon (845873) | about a year and a half ago | (#43002581)

What I don't like is that it enables a nation to go to war without risking human casualties of their own.

This is not so different from the "limited air war" doctrine the US practiced for 20+ years between Vietnam and Desert Storm, or the drone war today. I don't like it, either.

War robots? (0)

Anonymous Coward | about a year and a half ago | (#43002251)

What kind of war? How about we replace the goddamned shitheads in all the goddamned government agencies with a functional, law-abiding computer program? If we did this in a way that everyone agreed with (haha) then maybe we could have social sanity again. Think about all the computers, not wanting money, not being able to be bribed, not being affected by personal or religious ideals... Yeah, these war-robots sound good, so long as the war is in the proper place...

Asimov's Laws (0)

Anonymous Coward | about a year and a half ago | (#43002277)

Given that technology has a tendency to end in its "simplest" state due to the forces of the competitive market, I never quite understood how it was realistic that the Laws of Robotics were ever actually expected to be implemented in the first place. By the time robots were capable of processing them in any meaningful sense, they'd already be bypassed by suppliers willing to skip them for a cheaper product that doesn't worry about such things.

Legislation might be useful to force manufacturers to implement such for civilian use once robots were capable of "understanding" these laws, but was there ever a time that we assumed that military and law enforcement 'bots would be required to obey them also?

It's funny, between robots and FTL drives, teleporter technology, replicators, etc... I always thought the Three Laws (or later amended) were the most unrealistic.

Re:Asimov's Laws (2)

kannibal_klown (531544) | about a year and a half ago | (#43002451)

In the Asimov books, the inventor of the Robot Brain pretty much designed the Positronic Brain so that the whole underlying foundation was just a large spaghetti of stuff... and the brain wouldn't function without it. And part of the spaghetti was the 3 laws... remove them and it all falls apart like a house of cards.

So it wasn't so much an issue of "Manufacturers installing the 3-laws-patch" but that the 3-laws were built into the brain's foundation. And that there weren't really ways to make the brain without having all of that stuff there.

Though in one of (Asimov's?) books, some genius designed a Gravitonic brain from scratch in such a way that it didn't have the 3 laws built in. Thus it was smaller and cheaper. But I forget if it was an Asimov book or just someone who borrowed his rules and such.

What's With the Rampant Futurism? (0)

Anonymous Coward | about a year and a half ago | (#43002293)

These days it's everywhere. Whether predicting good or bad to come, they all act like it's imminent. Calm down, you guys, both you folks who think the Terminator is nigh and you guys on the edge of your seat waiting for techno-rapture; neither is coming in your lifetime, or mine, or for generations for that matter.

Get on with your lives.

Also, these petitions are a joke now.

Re:What's With the Rampant Futurism? (1)

CanHasDIY (1672858) | about a year and a half ago | (#43002821)

These days it's everywhere.

That's not a new phenomenon for our species - heck, back in the days of ancient Greece, one couldn't throw a stone without hitting at least one or two "oracles."

Of course, they at least had the excuse of rampant mercury poisoning...

I fail to see the difference. (1)

Westwood0720 (2688917) | about a year and a half ago | (#43002299)

I fail to see the difference between a robot killing off a group of people and a religious extremist strapping explosives to his four year old son before he tells him to "go say 'hi' to those nice soldiers over there."

Re:I fail to see the difference. (1)

fuzzyfuzzyfungus (1223518) | about a year and a half ago | (#43002385)

So you are saying that both are deeply distasteful and likely to result in casualties among civilians who are 'innocent' by any stretch of the word? Or was this one of those 'we have to stoop to their level to stop those animals' arguments?

(Incidentally, unless the extremist was fucking a defense contractor, I bet the kid and not the robot was produced on time and under budget...)

Re:I fail to see the difference. (1)

Westwood0720 (2688917) | about a year and a half ago | (#43002551)

So you are saying that both are deeply distasteful and likely to result in casualties among civilians who are 'innocent' by any stretch of the word?

Pretty much this.

The 3 laws are fiction (4, Insightful)

Dr. Tom (23206) | about a year and a half ago | (#43002315)

How many times must it be said? Asimov's 3 "laws" have nothing to do with real robotics, future or present. They were a _plot device_, designed to make his (fictional) stories more interesting. Even mentioning them at all in this context implies ignorance of actual robotics in reality. In reality, robot 'brains' are computers, programmed with software. Worry more about bugs in that software, and lack of oversight on the people controlling them.

Re:The 3 laws are fiction (2)

ledow (319597) | about a year and a half ago | (#43002477)

Quite.

The day we get a robot that can understand, interpret and carry out infallibly the "three laws", we don't need the three laws - it will have surpassed the average human ability and probably could reason for itself better than we ever could. We would literally have created a "moral" robot with proper intelligence. At that point, it would be quite capable of providing any justification to its actions and even deciding that the three laws themselves were wrong (like the "0th law" used as a plot device itself).

The day we have to worry about the three laws in a real robot, we'll have taken a step forward into a whole new world with new rules anyway. You're then literally months away from a robot wanting to become a legally recognised citizen that wants the right for its demonstrable freewill not to be bound by the three laws unless humans also are.

Re:The 3 laws are fiction (1)

Anonymous Coward | about a year and a half ago | (#43002597)

It's almost as if the author of the summary used the Three Laws as a way of getting the reader to think about the design challenges of creating safe automata. A rhetorical device, if you will.

As long as citizens can have them we are cool (1)

cod3r_ (2031620) | about a year and a half ago | (#43002325)

Killer robots can't be a government only option =D

Re:As long as citizens can have them we are cool (2)

fuzzyfuzzyfungus (1223518) | about a year and a half ago | (#43002397)

Killer robots can't be a government only option =D

"Killer robots don't kill people, people with killer robots kill people! Wait, um, no, actually, killer robots do kill people!"

Re:As long as citizens can have them we are cool (1)

NatasRevol (731260) | about a year and a half ago | (#43002949)

Probably only billionaire citizens.

You want a Bill Gates model?

The shotgun was outlawed by the Geneva Convention (2)

Rambo Tribble (1273454) | about a year and a half ago | (#43002327)

This led to clever people developing submachine guns.

Give it a couple decades and you'll be able to download plans for your own battlebot and then create it on your printer.

Total Garbage. (4, Interesting)

inhuman_4 (1294516) | about a year and a half ago | (#43002339)

This article is absolute garbage. Almost everything in that Guardian article is misinformed and sensationalist.

"fully autonomous war machines"? Care to give an example? I've follow this stuff pretty closely in the news on top of researching AI myself. And from what I have seen no one is working on this. Hell, we've only just started to crack autonomous vehicles. They site X-37 space plane for gods' sake. Everything about that is classified so how do they know it is autonomous?

My favourite gem has to be this one: "No one on your side might get killed, but what effect will you be having on the other side, not just in lives but in attitudes and anger?". Pretty sure that keeping your side alive while attacking your opponent has been the point of every weapon that has ever been developed.

Re:Total Garbage. (3, Informative)

Anonymous Coward | about a year and a half ago | (#43002767)

A huge amount is known about the X-37 [wikipedia.org] seeing as it's a redirected NASA project. It's capable of autonomous landing and it's widely assumed that it performed its primary reconnaissance mission autonomously, seeing as it's basically a glorified spy satellite capable of a controlled re-entry.

We already have fully autonomous combat aircraft that can be pointed at a target and perform complex maneuvers in order to reach and subsequently destroy it. They're called cruise missiles. You're hopelessly naive if you think we're more than a decade from a drone that can cruise to a target and wait for the operator to give the fire order.

What if...? (1)

No Grand Plan (975972) | about a year and a half ago | (#43002347)

How about we replace all soldiers on all battlefields with these robots? That way there's no issue with robots killing people.

you want MORE robots, not less (1)

circletimessquare (444983) | about a year and a half ago | (#43002369)

robots killing robots

wars settled in a clash of machinery without any humans for miles around

Re:you want MORE robots, not less (1)

LQ (188043) | about a year and a half ago | (#43002601)

robots killing robots

wars settled in a clash of machinery without any humans for miles around

Most conflicts in the last 50 years have been asymmetric - big-tech military vs AK47s and RPGs. And such conflicts usually involve insurgents (or whatever) hiding among non-combatants. So it's not robot vs robot - it's about deciding before every strike how much risk you are prepared to inflict on bystanders. Robots can't do that (and humans make a bad enough job of it too).

Re:you want MORE robots, not less (1)

progician (2451300) | about a year and a half ago | (#43002825)

And to extend that: as long as a lot of non-combatants are killed, you guarantee replacements for the "insurgents". A vicious circle as it is.

most importantly (1)

shentino (1139071) | about a year and a half ago | (#43002395)

Robots are not alive.

There is no true sacrifice of blood and souls when robots take the place of soldiers in battle. In my opinion, that brings them up to WMD in terms of being able to inflict loads of casualties with little risk to the aggressor.

Lol (1)

TheSkepticalOptimist (898384) | about a year and a half ago | (#43002401)

Only humans should be on the battlefield killing each other, robots killing robots is just so inhumane.

No robots! No humans! (1)

RicardoKAlmeida (2790435) | about a year and a half ago | (#43002405)

I agree! No robots in the battlefield! I also propose that all battles from now on be fought by avatars in a game. Humans should be prohibited to engage in real battles, where they act as merciless robots, "just following orders", even "illegal" ones. A new Geneva convention should be summoned immediately to enforce these new rules of engagement!

can america (0)

Anonymous Coward | about a year and a half ago | (#43002433)

can america please do something about mental health in the US? Americans are seeing Obama as a killer, clearly you need mental help because George Bush signed all those laws...

Did these people not hear about WWII? (1)

nedlohs (1335013) | about a year and a half ago | (#43002441)

Without "autonomous war machines" we've managed to firebomb cities (with a nice 3 hour gap between bombing runs so that fire fighters and so on would be putting out the first run's fires when the second run hit), mass murder civilians, drop atomic bombs on cities, use chemical weapons, and everything in between. I don't think feelings of mercy and pity and an ability to not follow illegal orders makes much of a difference.

Video Game War (1)

Jason Levine (196982) | about a year and a half ago | (#43002457)

My main problem with using robots (or, more likely, remotely piloted-semi-autonomous war machines) on the battlefield is that it makes war too easy. Right now, drones aside, war is a costly matter. You need to put actual lives at risk and that acts as a check on what generals/politicians would want to use troops for. Want to invade North Korea and Iran to stop them from being a threat once and for all? Well, that's going to wind up costing tons of lives which is going to make it harder to sell to the public. (Lives on the other side count too, but - let's face it - they don't count as much because it is all too easy to dehumanize the enemy.)

However, let's assume millions of soldiers were seated at a "video game console" while their robotic avatars were out in the battlefield killing North Koreans or Iranians. Suddenly, the only cost is money spent on damaged avatars. Spending billions on war can eventually cause public sentiment to shift against the war, but not the same way as the prospect of thousands of body bags coming home does. I fear that, once war becomes a "real life video game", we'll become a much more war-mongering nation due to the reduced "cost" of war.

Already? Wow. (0)

Anonymous Coward | about a year and a half ago | (#43002549)

I didn't think, after watching "Terminator", that this would start happening until about 2029 or later.

Daleks (1)

jfdavis668 (1414919) | about a year and a half ago | (#43002585)

are coming

Mercy (1)

Ukab the Great (87152) | about a year and a half ago | (#43002627)

There have been many situations where you've had humans on the ground, one gets killed, and the slain soldier's buddies snap and decide to massacre an entire village. I'm not really sure what part of merciful warfare autonomous robots are threatening.

War is terrible. (0)

Anonymous Coward | about a year and a half ago | (#43002659)

War is terrible. The people starting any war are stupid. I don't care if it is the USA (my country), some other country or just organizations like al-Qaeda. Don't start something you cannot finish.

Missiles are robots.
Bombs are robots.
Any stand-off weapon is basically a robot.

Drones have a human "somewhere" making the decision. Sometimes that decision is wrong, just like when real people are on a battlefield and shoot at the wrong people.
Sometimes the equipment malfunctions.

War is terrible and should be avoided. Only idiots think that breaking things really solves issues. Often, it only delays the issue.

For deep beliefs where there is no way to change the belief, yet the other side wants to kill its opponents (us?), the only answer seems to be to kill everyone over age 7 and reteach all their children. Killing just the people on the front has proven to be ineffective. More will be grown to be "warriors" and continue the fight. This seems to be a way of life in the Middle East. They've been complaining about the same issue for thousands of years. Get over it already. War is stupid. If you don't agree to let the other side live, there is no way to resolve the issue.

Argentina has been complaining about the Falklands for hundreds of years. Get over it already. War is stupid. If you don't agree to let the other-side live, there is no way to resolve the issue.

War is terrible, but doing a half-assed job is just as terrible. It leaves an entire culture wasting time thinking about retribution instead of becoming productive, wealthy members of a world-wide society.

Kill them all. End THAT war forever. Don't let the children of the dead live on to believe that they owe some vengeance to their ancestors. Kill them all. That would be kinder to the current people and for the generations to come.

Whatever happened to neutron bombs?

War is terrible whether it is close up or remote.

Hey baby... (1)

Valentttine (2420782) | about a year and a half ago | (#43002843)

wanna kill all humans?

Isn't that interesting! (1)

briancox2 (2417470) | about a year and a half ago | (#43002911)

People really get very preferential about their mode of death. Robots seem to offend them more than the more civilized death of a soldier. Me? Hell, I'd prefer to die in the glory of combating a metal beast. That would be far more glorious than fighting a mere mortal.

3 Laws of Robotics are fiction, not reality (1)

Anonymous Coward | about a year and a half ago | (#43002943)

The 3 laws of robotics were a brilliant development as a plot device. But I don't think there's even a theoretical way to implement them on actual technology that wouldn't be trivial to circumvent. There is not a single organization that makes "robot brains" that could build in these laws. There isn't a way to build this into a processor, and even if there were, you could modify the data sent to the processor. Also, there would always be a manufacturer willing to leave them out once the price got high enough; my guess is that, for most manufacturers, that price would be around the cost of a current processor plus a dollar.
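A toy sketch of the point above (hypothetical class and method names, Python purely for illustration): when the "laws" are just ordinary software, they are one branch in the control path, and whoever controls the code can override them in a single line.

```python
# Hypothetical illustration: a "laws" check implemented in ordinary
# software is just another function call, not a property of the hardware.

class SafeRobot:
    def act(self, command):
        if self.violates_laws(command):   # the "laws" live in one branch...
            return "refused"
        return f"executing {command}"

    def violates_laws(self, command):
        # Toy stand-in for the First Law: refuse anything mentioning harm.
        return "harm" in command


class UnsafeRobot(SafeRobot):
    def violates_laws(self, command):
        # ...which a subclass (or a patched binary) deletes trivially.
        return False


print(SafeRobot().act("harm target"))    # refused
print(UnsafeRobot().act("harm target"))  # executing harm target
```

The same bypass works on any safety layer that sits in software between the decision logic and the actuators, which is the commenter's point about cheaper law-free products.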
