2008-04-10 -Obits-
Killer Ground Sword 'Bots Out of Iraq
Posted by KBK 2008-04-10 14:41

#1 I suspect a different reason: we had learned that stealing one of these bots had become an intelligence priority for some major, unnamed power, so they were sent home under lock and key.

Nothing succeeds like success. Robots are the big thing right now, and so there must be enormous pressure to steal robot tech.
Posted by Anonymoose 2008-04-10 18:29

#2 Slippery slope, very slippery slope. This is scary shit. And these are just the K-mart versions of the stuff that's out there.

I've heard all the arguments for and against automated and semi-automated warfare, and the answer is still no, no, a resounding no.

Let's remember the three laws of robotics that Asimov laid down when considering a future rife with robots. They are pretty logical, and until now they have really only had a place in science fiction, but the times they are a-changin'. The Laws, in strict order of precedence (see the sketch just after the list), state the following:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
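
Just to make that ordering concrete, here is a toy sketch of the precedence written as code. Nothing like this exists in any real system; the boolean flags stand in for exactly the judgments no software can actually make, and every name below is made up for illustration.

from dataclasses import dataclass

# Toy sketch only: Asimov's Three Laws as a strict priority check.
# The flags pretend to capture judgments ("would this harm a human?")
# that no real software can make -- which is the whole point.

@dataclass
class Action:
    harms_human: bool = False           # doing it injures a person
    inaction_harms_human: bool = False  # NOT doing it lets a person come to harm
    human_ordered: bool = False         # a human explicitly ordered it
    risks_self: bool = False            # it endangers the robot itself

def verdict(a: Action) -> str:
    # First Law outranks everything else.
    if a.harms_human:
        return "forbidden"
    if a.inaction_harms_human:
        return "required"
    # Second Law: obey human orders, subject to the First Law.
    if a.human_ordered:
        return "required"
    # Third Law: self-preservation, subject to the First and Second Laws.
    if a.risks_self:
        return "forbidden"
    return "permitted"

# Because the Second Law outranks the Third, "shut yourself down" is an
# order the robot must obey even though it endangers the robot itself.
print(verdict(Action(human_ordered=True, risks_self=True)))   # -> required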

All reasonable, thinking humans should be against autonomous and semi-autonomous killing machines. Replace men with consciences with a machine that answers to no higher power beyond its programming and that's a recipe for disaster. Period.

I'm a robotics nut, but I don't even have words to express my opposition to this application of the technology. Independently thinking machines have no place in the killing of people, any people, whether my enemy or my friend. Call me old-fashioned, but that responsibility should be left to man and man alone.

There are just too many opportunities for something, anything, to go wrong, especially with this. The rapidly approaching nano/bio-tech revolution makes completely autonomous, self-replicating, unstoppable war machines a not-so-distant reality. Read the DARPA research on this stuff; it ain't pleasant, it is scary, apocalyptic shit.

I can hear the DARPA guys and gals giggling and expressing their infinite delight as they consider self-replicating autonomous killing machines hovering about and doing their nerdly bidding.

I have to admit, I want six of 'em so my 7-year-old and I could spend all day wiping out the squirrels on the south 40, but this is just too slippery a slope.

There should be a serious debate taking place about this technology being applied to this extent before it happens, not after...oops, too late.
Posted by ElvisHasLeftTheBuilding 2008-04-10 18:58

#3 Um ... Elvis ....

I work in robotics. Well, actually in artificial intelligence and knowledge representation, but one area I research is potential robotics features for military use.

SWORDS is tele-operated. Period. Asimov has nothing to do with this system. Its software is limited to managing its own gearing, etc., as it traverses terrain, plus responding to operator controls. Yes, there could be a problem with its control system. The same is true of Predators armed with Hellfires, or of the main cannon or smaller guns on tanks.
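
To put some shape on what "tele-operated" means, here's a hand-waved sketch of that kind of control loop. It is not the actual SWORDS software, and every name in it is invented; the point is only that the on-board logic amounts to drive management plus responding to operator commands, with no code path that fires on its own.

# Hand-waved sketch of a tele-operated weapon platform's main loop.
# NOT the real SWORDS code -- all names are made up for illustration.

def control_loop(command_link, drivetrain, weapon, slope_sensor):
    while True:
        cmd = command_link.receive()          # blocks until the operator sends something
        if cmd["type"] == "move":
            # The only on-board "smarts": pick a gear for the terrain.
            gear = pick_gear(slope_sensor.read(), cmd["speed"])
            drivetrain.drive(cmd["heading"], cmd["speed"], gear)
        elif cmd["type"] == "aim":
            weapon.slew(cmd["azimuth"], cmd["elevation"])
        elif cmd["type"] == "fire" and cmd.get("operator_confirmed"):
            weapon.fire()                     # fires only on a confirmed operator command
        elif cmd["type"] == "halt":
            drivetrain.stop()

def pick_gear(slope_degrees, commanded_speed):
    # Trivial stand-in for the kind of self-management the platform does do:
    # lower gear on steep ground or at low commanded speed.
    return 1 if slope_degrees > 15 or commanded_speed < 1.0 else 2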

I hear and understand your concern, but SWORDS isn't anywhere close to what you are describing. And it remains the doctrine of the US that no autonomous systems will initiate fire.
Posted by lotp 2008-04-10 19:18

#4 And yeah, before we actually allow autonomous offensive fire, there will need to be a serious discussion about it.
Posted by lotp 2008-04-10 19:30

#5 Yeah, people always seem to neglect the difference between remote-control cars and real Asimov-type robots. The remote-control car (like this SWORDS) is no robot at all; it has no intelligence whatsoever.
Posted by gromky 2008-04-10 19:32

#6 Asimov has nothing to do with robotics in a warfare program, lotp? I can't agree with that statement. This one program is not the entirety of my argument; the future of robotics in warfare is. User-directed or autonomous, it's all in the same basket, philosophically speaking. The person ain't there.

I'm not arguing apples and oranges here, a dishwashing machine vs. a killing machine. SWORDS and Predator are killing machines that currently use off-site human direction.

The logical and stated evolution of this type of technology includes the absence of man from the decision-making loop. Maybe today's policy dictates no autonomous killing, but how long will that thinking persist? My hope is forever; however, I doubt that will be the case.

SWORDS and Predator are both just the beginning of this revolution, as you well know as an industry rep. I'm not an engineer or a programmer, and that's why you don't see discussion of technical capabilities in my comments. I do, however, read the research, and there's no denying where this technology leads... to autonomous and semi-autonomous killing machines.

The whole robotic warfare industry is in its infancy. That's the whole point of the article. So of course we're not talking about Star Wars Episode II here. But that doesn't mean we shouldn't have these conversations now. Thus the slippery-slope angle of my entire comment. BIG things have small beginnings.
Posted by ElvisHasLeftTheBuilding 2008-04-10 19:47

#7 Elvis, I didn't say that the questions Asimov posed aren't relevant to military robotics as a whole. I said they aren't at issue in SWORDS.

We've been moving the warfighter away from immediate experience of targets for some time now, in the case of some weapons. Tank crews don't see events around them directly; they are inside the tank responding to pictures on a screen. That's even truer for fighter and bomber pilots. No, these are not all equivalent. But there is a spectrum here, not a black-and-white difference.

A robot that autonomously decides to fire is at the far end of that spectrum. It could happen, and if so it will be in the air first, not on the ground (UCAVs). But do distinguish autonomy of movement (the ability to find its way across terrain), which is what most of the literature means by 'autonomous', from the robot picking targets and deciding to fire on them.

You might make a case WRT swarm mini-missiles. This is basically intended to be a group of tiny missiles launched together at a group of targets. They are intended to be able to negotiate with one another to ensure that priority targets are hit even when/as some of them are destroyed en route. But they themselves aren't intended to select or prioritize the targets, just to work cooperatively in destroying targets given to them by humans.
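
If it helps, here's a toy picture of what that kind of "negotiation" amounts to. It is my own illustration, not any real program: the humans hand the swarm a prioritized target list before launch, and all the surviving missiles re-decide in flight is who covers which of those already-assigned targets as members are lost.

import math

# Toy model of cooperative re-coverage. Targets and priorities come from
# the human operators; the swarm only reassigns coverage after losses.

def recover_targets(missiles, targets):
    """missiles: {id: (x, y)}; targets: list of {"id", "priority", "pos"}."""
    assignment = {}
    free = dict(missiles)
    # Highest-priority (lowest number) targets get covered first.
    for tgt in sorted(targets, key=lambda t: t["priority"]):
        if not free:
            break                      # fewer missiles than targets left
        nearest = min(free, key=lambda m: math.dist(free[m], tgt["pos"]))
        assignment[nearest] = tgt["id"]
        del free[nearest]
    return assignment

# Example: three human-assigned targets, one missile already lost en route.
targets = [{"id": "T1", "priority": 1, "pos": (0, 9)},
           {"id": "T2", "priority": 2, "pos": (5, 5)},
           {"id": "T3", "priority": 3, "pos": (9, 0)}]
print(recover_targets({"m2": (1, 8), "m3": (8, 1)}, targets))
# -> {'m2': 'T1', 'm3': 'T2'}; T3, the lowest priority, goes uncovered.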

So yeah - there are going to be issues we'll have to work through on this. But it's not quite in our laps yet.

(I work for the military BTW not industry, if that's relevant.)
Posted by lotp 2008-04-10 20:01

#8 The only time I saw the late, great Dr. Asimov in person, he said that the laws of robotics should be extended even to robots used in manufacturing. Thus, in his view, they would refuse to build nuclear weapons because those could be used to kill humans. I'm not sure if he wanted to extend that concept to dual-use technology or to a second level, so that a robot would refuse to build something that could be used to build something that could be a weapon.
I wanted to ask him what would prevent a more primitive enemy from building nuclear (and other) weapons the old-fashioned way, by humans. That more primitive enemy would then actually have a great advantage over us. However, I didn't get the chance.
Of course, what I really wanted to ask him was why the second and third laws weren't reversed. As they stand now, a human could say to a robot "Drop dead", and the robot would be forced to shut down and "die" because the second law takes precedence over the third.
Posted by Rambler in California 2008-04-10 20:11

#9 Military-industrial complex, so no, it doesn't make much difference. Just the pay is different, eh?

Excuse my philosophical ranting about the laws of robotics in regard to a technical matter related to the SWORDS system. I lean towards philosophical right-brain thinking, not technical left-brain thinking. Also, I have seen Terminator too many times on AMC recently, so...

Anyway, to bring this to a conclusion: as I said, I've heard the arguments, and I'm not ignorant of the current state of the technology or of its application in this instance. Still, I am damn afraid of autonomous killing machines and of the slippery slope we are rapidly approaching, as evidenced by the robotic warfighting technology currently in development. Mary Shelley would be proud of my Chicken Little role in this instance, I'm sure.

It's alive, alive!!!!!

Posted by ElvisHasLeftTheBuilding 2008-04-10 20:16

#10 I think that eventually combat robots will be a lot more like UAVs in their operation.

A realistic design is probably more oriented toward a Jonny Quest-style "robot spy": a giant spider robot. Made of unconventional materials, it will have a very lightweight but durable body and will be able to "close the gap" with the enemy quickly.

It will tower over people, and the enemy will be inclined to shoot up at it, exposing themselves to fire. Its artificial intelligence will be mostly devoted to navigation and to rapid withdrawal on order. Otherwise, its offensive weapons will be used for rapid, under-fire flanking maneuvers.

Its power supply will have to be extremely powerful, and its biggest limitation will be how long it can operate without refueling.
Posted by Anonymoose 2008-04-10 22:09
