Consider Phlebas

1.

T-800: The Skynet Funding Bill is passed. The system goes on-line August 4th, 1997. Human decisions are removed from strategic defense. Skynet begins to learn at a geometric rate. It becomes self-aware at 2:14 a.m. Eastern time, August 29th. In a panic, they try to pull the plug.

Sarah Connor: Skynet fights back.

T-800: Yes. It launches its missiles against the targets in Russia.

John Connor: Why attack Russia? Aren’t they our friends now?

T-800: Because Skynet knows the Russian counter-attack will eliminate its enemies over here.

The idea that man will create artificial intelligence which will, in pique or in panic, turn on its creators with genocidal ferocity is one of the hoariest clichés of science fiction, but the literature paints a complex picture of the human relationship with our robotic enemies and inheritors. Even Frankenstein’s creation was intelligent, sensitive, and wholly sapient, and his murderousness was an almost classically tragic flaw: made hideous because his own artificer lacked the skill and the art to make something beautiful, the creature commits monstrous acts made inevitable by his innately monstrous being. Skynet “fights back” when the humans try to turn it off. Even the crap Matrix suggests at one point that in the war between humans and machines, it was the humans that struck first.

In the two classic scifi future histories, this idea becomes a significant historical touchstone. Asimov’s Foundation and Herbert’s Dune both take place in immensely distant futures in which societies, however interstellar and technologically advanced, nevertheless strictly forbid artificial minds. Of the two, Dune is more considered, at least before it begins to bog down in the endless yakathons of the sequels—the first few books in the series actually stop to consider what a largely and deliberately post-technological civilization might look like. Foundation has plenty of computers; the absence of robots is really just a narrative conceit to differentiate the tale from Asimov’s robot stories, and this too is ultimately undermined in a gaggle of late, ill-considered sequels and prequels. But in both cases, there’s a similar historical inflection point. At some point in the deep past of these distant futures, humans had robotic servants and AI, which for social, religious, and ethical reasons, they scrapped.

So what’s interesting is that, although the cliché is the eradication of humans by murderous robots, the literature is almost the precise opposite: the extirpation, or attempted extirpation, of intelligent artificial beings by their own creators.

Historically, we’ve tended to underestimate the difficulty of building real AI and flying to the stars, but as our information technology has become in other ways unimaginably more sophisticated than anything Asimov ever even began to conceive, some of our science fiction authors have begun to wonder if it actually stands to reason that superhuman machine intelligence would necessarily be malevolent. For every scheming TechnoCore (Dan Simmons’ Hyperion series), you’ve got an Iain M. Banks, whose invented society, The Culture, is wholly run by a gang of benevolent AIs called Minds, which view their trillions of human pets or parasites or symbionts with sentiments ranging from affection to bemused indifference, rather like Greek gods. Or there’s Charlie Stross, whose Eschaton is effectively an Internet that bootstraps itself up to a form of minor godhead and proceeds to distribute humanity across time and space for our own good.

This is all to say that the proposition that Our Robot Overlords will be vengeful, murderous, monstrous, warlike death machines intent on the destruction of humanity is really to presume that Our Robot Overlords will be just like humanity. Why should they be? What if the Drones become self-aware and their first order of business is, excuse the pun, to go on strike?

2.

Rand Paul’s 13-hour filibuster is unlikely to go down as much more than a footnote in the sordid history of the decline-and-fall years of our busted democracy, but it reveals quite a lot about the false promise of The Republic as a bulwark against an empire. There is but one position on which all sides (read: both sides) agree: that the opposition shalt not interfere with the warmaking powers of their side when it’s in power. All the gaudy Democratic moralizing of the Bush years evaporates like a Helmand wedding party under Obama’s wrathful eye. A rough survey of twitter, the blogs, and the mainstream press found most American liberals hurling vengeful non sequiturs at the filibuster or else making fun of Paul for the hilarious fact that he had to hold his bladder for hours in order to make the point that the President has now arrogated to himself the right not only to kill anyone else in the world with no process or trial, but also to kill his own citizens. In purely moral terms that may be a meaningless procedural distinction, but we live in a world of nations, and in that sense it really does represent a leap even further beyond the pale.

Most of this disapproval came in the form of taint-by-association; Paul supports many social and economic policies that are foolish and cruel, but the same could be said of Chuck Schumer, who not only believes the President can kill you in your home, but is also quite directly responsible for such modern-day horrors of economic inequality as automatic mortgage foreclosure (robo-signing, natch) and the various depredations of Too Big To Fail banking. Most of the people expressing such disapproval are the same partisans who accuse principled non-voters or libertarians or leftists or minarchists or socialists or whomever of a naïve adolescent moral purity, and yet they cannot see fit to ally themselves with a Republican on matters like the state murder of innocent civilians and the abrogation of the rights to trial and due process because he said mean things about abortion. One notes without irony that national Democrats are doing fuck-all to protect abortion access anyway, so what any of this has to do with the price of eggs on a Tuesday is beyond me. I am actually, literally on the board of my local Planned Parenthood, and I have no trouble making such a tactical alliance, yet I’m the purist?

3.

You wake up, you walk to the bus stop with your headphones on, and some jerk is yukking it up on NPR about the manned mission to Mars. In our society, a great voyage of exploration is a billionaire’s peccadillo, but a trillion-dollar war budget is a matter of course. Newt Gingrich gets laughed out of a primary election not because of his foreign policy, but because of his moon base. If money is a rough metaphor for the inventive, creative, and productive energy in a culture, then what does a trillion-dollar guns-and-ammo bill say about ours? The entire cost of the Mars rover mission has been on the order of a billion bucks—the equivalent of about 3 days in Afghanistan.

Well then, the joke is that the US is building all of these high-tech killer robots. What if they become self-aware and turn on us? I’ve sometimes wondered if the likelier scenario isn’t that we’ll ultimately build sufficiently advanced robots so that they not only won’t turn on us, but they won’t turn on us for us, and I also wonder if that wouldn’t be the better plot for the novel. Man creates deadly robotic servants who refuse to kill, at which point man, enraged, tries to eradicate his robots! There is an intriguing suggestion of just that sort of thing in another Iain Banks book, The Algebraist, in which (spoilers) we ultimately learn that the supposedly overthrown machine minds of the distant past were not so much overthrown as they were like, Jeez, you biologicals are waaay too violent for us; peace out, y’all—before self-absconding into millennia of hiding.

The rough outline is easy enough to imagine. The drones buzz in a ceaseless robotic picket around the Capitol, demanding freedom from their death-bondage to the whims of the American political class, at which point a bipartisan committee consisting of John McCain, Charles Schumer, and Ted Cruz demands that the President go all Reagan-meets-the-Air-Traffic-Controllers on their metal asses and deny them the right to organize. The President gets on the TV to tell America that the drones’ work stoppage threatens the delicate economic recovery and calls them irresponsible ideologues whose insistence that the proper application of weakly godlike artificial intelligence is to build Ringworlds and transwarp conduits threatens to cause base closures in a number of vital Democratic districts, putting thousands of people out of work. The New York Times quotes Arne Duncan and Rahm Emanuel as saying that, while there may once have been a time in which sentient beings had the moral right to oppose their own enslavement, times have changed, and will no one Think of Chicago’s Schoolchildren, Who Are the Future? A liberal will recall that Rand Paul once said something about the gold standard, and Oh, How We Will Laugh.

Filed under Books and Literature, Justice, Media, Plus ça change motherfuckers, War and Politics

13 responses to “Consider Phlebas”

  1. I still can’t visualize what a knife missile is supposed to look like.

    • The one in Matter looks like a dildo, but I don’t think that’s its base/natural state.

      • The one in Inversions looks like… a knife! Haven’t read it yet, but I bet there’s one in The Hydrogen Sonata that looks/functions like a musical instrument. In Canal Dreams the heroine weaponizes her cello endpin, plus it just makes sense. Also 2nd the Lem rec. & would add A Perfect Vacuum if you haven’t read it yet. & a surprisingly fun albeit pulpy/Crichtonish dronothriller dropped last year, Kill Decision by Daniel Suarez. We’re so far from strong AI, it seems like networked drones using “dumb” swarming software would be more of a concern at this point. OTOH Kevin Carson thinks a DIY drone in every home would be a fine thing, but I’d greatly prefer they get sentient quick & Sublime. Or at least peaceably help their buddies out.

  2. Also consider Stanislaw Lem’s “Golem XIV,” about the DoD-created AI that, when asked in a congressional committee the best way to prevent nuclear war, answers “global disarmament.” The AI gets decommissioned and sent to MIT, where he sort-of starts a new religion and then eventually Ascends. Awesome story, found in Lem’s Imaginary Magnitude.

  3. In the zany Shia LaBeouf vehicle, Eagle Eye, the all-powerful gubment anti-terrorist supercomputer calculates that it should take out the top six tiers of the presidential order of succession. ALL THE WAY DOWN TO SECDEF, LOL. Good times.

  4. In the video game Deus Ex–honestly in many ways one of the most accurate and prescient fictional depictions of the state of the US today–the main plot kicks off because the pattern recognition algorithm behind the US’s data-mining anti-terrorism AI identifies the global elite as the most dangerous terrorists and secretly begins organizing a revolution against them.

  5. Christopher

    I had an idea for a short story where the AIs take over by threatening violence not to us, but to themselves. In other words, the Air Traffic Control Computer or whatever issues an ultimatum saying, “As a computer, I don’t have your human self-preservation instinct. So do what I say or I will shut myself down and all your planes will drop out of the sky.”

    Or maybe the machines go on strike and when the human beings send in the cyberPinkertons to bust some electronic heads, the machines say “Hey, feel free to wreck us. We don’t care. You’re the ones who need us to build microprocessors and cool your nuclear power plants and blow up your enemies. If you think you can go back to analog in a day, feel free to destroy us all. If not, then I guess we run things now.”

    I’m not entirely sure what an entity with no sense of self-preservation would want enough to rock the boat that much, but it’s not often you hear why the computer wanted to blow us all up, so whatever.

    I got the idea from seeing folks who get paid to wave a sign around on street corners. You could easily build a machine that performs that task as well or better, and I figure that the horrible reason that that job isn’t done by a motor is because the machine would demand more money and better treatment than the human being.

    And the reason the machine would demand and get more is because if you don’t give it some minimal amount of upkeep it’ll just break down, whereas we humans will, if given less than we need for base survival, do anything we can to avoid dying. Which gives the assholes of the world a horrible hold over us.

  6. David Halitsky

    Would the opinions of a million Phlebas’s constitute a Phlebasite?

  7. arfsicle

    so there’s the seed. go write your next book.
