
Killing Machine

JackN (Lightwave Alien)
[QUOTE]
Killing Machine

Battle bots fight human enemies one on one.

By David L. Ulin

(Updated Sunday, February 13, 2005, 6:02 AM)

Late last month, in a parking lot in New Jersey, the U.S. Army unveiled what may be the future of war: 3-foot-tall robotic "soldiers," outfitted with tank tracks, night vision and mounted automatic weapons capable of firing more than 300 rounds at a burst.

Known as SWORDS (Special Weapons Observation Reconnaissance Detection Systems), these battle bots are on the leading edge of a new kind of warfare, in which -- or so the argument goes -- our troops will one day remain hidden (and, presumably, protected) while engaging the enemy by remote control. The Army intends to deploy 18 SWORDS units to Iraq in the spring, marking the first time robots have been used to fight and kill human beings one on one.

If, like me, you grew up on science fiction, the idea of robot soldiers strikes a chilling chord. Killer droids, after all, have long been speculative-universe staples, potent symbols of the dangers of technology, of what happens when machines go wrong. In Karel Capek's 1920 play "R.U.R. (Rossum's Universal Robots)" -- which introduced "robot" to the vernacular -- automatons rise up to wipe out the human race. In "Blade Runner," renegade cyborgs stage a bloody mutiny and flee to Earth. Robotic armies rampage by the screenful in George Lucas' "Star Wars" films.

And then, of course, there is the "Terminator" series, in which robots designed to look and smell like people infiltrate human encampments to execute rebel leaders without mercy or remorse. This is the cybernetic future at its most apocalyptic: a world in which our high-tech weapons turn on us, just as we always feared they would.

The fear resonates. Why else would SWORDS designers feel compelled to reassure us, as they did last week, that their robots are not autonomous terminators, but function only at the command of humans, who must identify targets via video before giving the electronic OK to shoot? On a certain level, the developers of SWORDS make a valid argument: These are not smart weapons, but surrogates for soldiers in the field.

It's hard to quarrel with any tool that might make our soldiers safer, and if nothing else, a robot warrior will never have to worry about inadequate armor or supplies.

Yet something more disturbing is at work, a sense of willful disassociation, as if, with enough distance, we might remove ourselves from what war is. Here, too, the military mimics Hollywood. For "Star Wars," it's been reported, storytellers relied on battle bots to take the blood out of the on-screen killing and render moral questions moot.

A similar logic fuels the ban on photos of flag-draped coffins -- if we don't see them, they're not there -- and it's no stretch to suggest that SWORDS, and other high-tech weapons now being developed by the Pentagon's Defense Advanced Research Projects Agency, will further sanitize our point of view.

What can't be sanitized, however, is the robot's deadly efficiency. Remove the human from the weapon, and problems like recoil and breath control are eliminated, allowing the robot to hit a nickel-sized target at 328 yards. In one test, a SWORDS unit scored 70 out of 70 bull's-eyes.

Thirty or so years ago, the composer John Cage proposed a different sort of battle strategy: Take the heads of warring nations, give each a 50-pound sack of horse manure, lock them in a room, and let them fight it out. It's a quixotic notion, but at least it takes into account a human element, the idea that war cannot be waged without a price.

As for the SWORDS units, what does it say about us that this is how we use our creativity -- to invent robots that offer more efficient ways to kill? How can we be so disconnected that we refer to people as "targets," whether they are enemies or civilians, too indistinct to identify through the garble of a video display? Surely we lose something by all this disengagement.

It's easy to be ruthless from a distance; less so when you see the whites of someone's eyes. If there's no potential for human cost, how do we calculate our humanity, how do we show anything resembling restraint? And without restraint, are we even fully human anymore?


David L. Ulin is the author of "The Myth of Solid Ground: Earthquakes, Prediction, and the Fault Line Between Reason and Faith." This commentary was written for the Los Angeles Times.
[/QUOTE]

Comments

  • Biggles (The Man Without a Face)
    The author makes a good point about removing humanity from the battlefield. When going to war means just sending in remote-controlled drones, it becomes a lot easier to justify it to your own country's people, and thus to do it. The ideas dealt with here -- removing people from the field of battle -- will be familiar to anyone who's sat through the entirety of [i]Gundam Wing[/i] (my condolences), which tried to tackle this issue throughout its 52 long and dull episodes.
  • Lord Refa (Creepy, but in a good way)
    And then the killing starts... Man... I can't wait till the machines get conscious and start wiping out the hairless apes. :P

    Say what you will, but Terminator 3 did have some good scenes that I love.
  • Messiah (Failed Experiment)
    Re: Killing Machine

    [QUOTE][i]Originally posted by Unknown Source [/i]
    [B]In "Blade Runner," renegade cyborgs stage a bloody mutiny and flee to Earth.[/B][/QUOTE]

    I hate that. There were no cyborgs in Blade Runner. They were replicants, and replicants are 100% organic, although created in labs, as opposed to clones, who are copied.
  • Data Crystal (Pencil Artist)
    Very, [b]VERY[/b] interesting indeed if they are to be deployed to Iraq so rapidly... We'll see what'll happen to the world afterwards... morally and physically.
  • E.T (Quote-o-matic)
    [QUOTE][i]Originally posted by Data Crystal [/i]
    [B]We'll see what'll happen to the world afterwards... morally and physically. [/B][/QUOTE]Morals?
    In politics?
    They've been extinct since humans noticed the best way to get rich and wealthy is to rob and enslave others!


    [url]http://www.foster-miller.com/projectexamples/t_r_military/talon_robots.htm[/url]
    [url]http://www.defensereview.com/modules.php?name=News&file=print&sid=657[/url]
  • JackN (Lightwave Alien)
    You think the French had something to retreat from before?!?

    :p

    j/k (I couldn't resist...)
  • When implemented properly (but only in some situations)... I suspect that remote-controlled warfare *can* produce benefits. You see... it is my opinion that *presence of risk* causes just as many military blunders as *lack of restraint* does.

    Scenario 1:

    A: confused soldier shoots a civilian.
    B: drone operator takes the time to look carefully.

    Scenario 2:

    A: a human soldier kicks down a mined door, dies.
    B: drone operator kicks down a mined door, loses drone.

    Scenario 3:

    A: A soldier gets wounded and is taken hostage.
    B: Remote drone gets damaged, issues a warning and self-destructs.

    ---------------------

    To sum it up... efficient warfare with minimum collateral damage *seriously* needs rational choices. Personal risk, weariness, confusion, fear... all diminish opportunities for rational choice.

    Using a remote drone, with nothing precious to lose, no mind or self-preservation... can help perform a peculiar trade-off... namely, convert technical advantage into social advantage (losing fewer personnel, harming fewer civilians).

    When technical asymmetry *does* permit one conflicting party to reduce its risk... it *can* afford to act more carefully. If, however, both conflicting parties are technically equally capable... the situation goes back to square one.

    -------------

    Currently... they are *definitely* not using even a trace of AI (unless one considers waypoint following to be AI). I am confident there exist Sony Aibos that exceed the combined intellect of *all* actively used military drones.

    [i](Sidenote: should someone ever engineer a real AI, and deploy it to a weapon without asking it first... I trust the AI would protect its rights, outsmart its creators and fairly easily neutralize them. But that scenario is currently beyond the horizon of human ability.)[/i]
  • Random Chaos (Actually Carefully-selected Order in disguise)
    Does anyone really think that if we are invaded, people won't take up arms once the battle bots fail?

    All this does, once both sides get it, is add a layer of "armor" before people fight people...
  • I can't believe they spent taxpayers' money and actually got something to work.
  • Another step on the path to the nanoage, I promise ya that.