Every story about robots is in some way a story about death. Sometimes it’s the human race that gets it. Sometimes it’s a tiny plastic dog. Last summer, Sony discontinued the last remaining traces of customer support for its AIBO series. Remember the AIBO? It’s the line of artificially intelligent robotic pets that, assuming you don’t own one, you can maybe picture cutely waddling on four hinged legs in a promo video from 2003 or so. Tilting their little heads quizzically. So much personality! Sony hadn’t made a new AIBO since 2006, but it provided some basic service for the existing ones. Warranty repairs, that sort of thing. Over the years, the company phased even this commitment out, and now, with the spare-parts supply dwindling, every existing AIBO is on a countdown to the tomb. The future, man — it’s a hard place to be adorable.
What’s amazing about the coming extinction of the AIBO, though, isn’t this product-circle-of-life stuff. Toys break; it’s what toys do. What’s amazing is that AIBO owners are experiencing actual grief. From the outside, owning an AIBO kind of looks like the pet version of vaping. But these are people who have owned, and cared for, their robo-pups for years. Longer than the lifespans of most actual dogs, in some cases. Their AIBOs are part of their lives. The New York Times did a video about this in June, and I’m sorry, it’s moving. They’re lighting candles. They’re saying goodbye. I have no idea what this means. Like, are they having an aesthetic experience? Is it absurd and depressing to feel deep feelings for a consumer product? Are the feelings themselves the product? Does it say something optimistic about human nature? Does it say something terrifying? Life is very large. Under the right circumstances, you, too, could fall in love with a toaster.
I’ve been thinking about the desolation of the AIBO lately because — well, partly because I think about the desolation of the AIBO all the time, because I am a person with a soul, but also because robots are having a moment. Robots have been having a moment pretty much nonstop for the last hundred years or so, but this one is particularly intriguing. Here are a few of the robot-related takeaways served up in the past couple of weeks by the personality-reading algorithm that directs my phone’s news app. An ethicist in England has started the Campaign Against Sex Robots, a new movement to combat the (so far largely theoretical) development of consumer sexbot technology. The Marines are running combat drills with a robotic scout dog partly developed by Google. A sports league that wants to feature giant fighting robots passed its funding goal on Kickstarter. The year’s hottest Christmas present is the BB-8 droid from the new Star Wars movie (commercial tagline: “This is the droid you’re looking for”). The Russians have a new spy robot that looks like a cockroach. And Target has plans to replace many human employees with robots, plans that by some bizarre coincidence came to light the day after a group of its biological employees voted to form a union.
In other words: We are standing, as always, on the threshold of the future. Only just now the threshold is looking a little more real than usual.
For a century, the future has been full of robots. The future has been the place where robots were going to have sex with us and slaughter us and fight titanic wars against each other and spy on us and serve us and make us obsolete. The fears and desires that robots arouse are deep and weird and they don’t really change over time. The play that coined the word “robot,” R.U.R., by the Czech writer Karel Čapek, premiered in 1921. It includes almost all of this stuff. It depicts the dystopian future in which robots do most of our work. It tiptoes around the alluring and disturbing possibilities of robot genitalia and human-robot sex. It climaxes with the robot uprising that annihilates the human race. That’s how far back this goes — the word “robot” comes into being simultaneously with the fear that robots will destroy us. Someone’s extinction is always part of the story.
Did I say a century, though? The truth is, that’s not nearly long enough. Imagined automatons have been producing the same terrors and yearnings since before science fiction, before capitalism. Before electricity. Think about the golem in Jewish legend. Or look at Greek myth. There’s a sculptor named Pygmalion. He carves a statue of a beautiful woman, then falls in love with it; when he kisses it on the lips, Aphrodite, the goddess of love — sort of the Tinder of marble-besotted Greek guys of her day — makes it come to life. There’s a warrior called Talos. He’s a mechanical giant, made of bronze. He patrols the shores of Crete, and when a pirate ship or a warship comes too near, he hurls giant boulders in its direction.
Statues that kiss you, statues that kill for you, slaves that rise up against you. Extrapolate a little from these stories, and how far off are you, really, from the Replicants in Blade Runner? From the Cylons in Battlestar Galactica? From Skynet? From the lust/enslavement/revolt psychodrama of Ex Machina? What we want from robots and what we think robots will do to us for taking it — we were dreaming these things for a long time before they were a real possibility. It’s the same temptation and the same warning, unspooling over thousands of years.
Should we talk about robot sex? I think it’s time. Actually, first, let’s talk about Commander Riker playing the trombone, not because anyone wants to picture that horror, but because it points us toward a subversive idea that we might as well stash in the background here. The subversive idea is that human love can exist in a meaningful way between a person and a computer program. Maybe that person is the first officer of a starship. Maybe he cuts a figure of rugged paunchiness. Maybe he likes his jazz pre-bebop and trombone-driven. Maybe one day he strolls into the Holodeck to blow a few bars in an archetypal midcentury club, and maybe while he’s there he meets a woman called Minuet. She’s a blonde, but he doesn’t think blondes are jazzy, so he asks the Holodeck for a brunette. Now she’s a brunette. He asks the Holodeck to make her sultrier. Now she’s very sultry. She’s just his type — smooth, sophisticated, trombone-friendly — only she happens to be a holographic projection of a piece of software. Where do the feelings go?
This all happens in “11001001,” the first-season Star Trek: The Next Generation episode that may have quietly been one of the most provocative things to air on mainstream TV in the 1980s. There’s nothing at all provocative about the romance; she’s created specifically to please him, and she’s a slinky dame in a cocktail dress, not exactly a thrown gauntlet vis-à-vis hetero gender norms. Plus, she turns out to be part of a nefarious alien plot to distract Riker while blah blah dilithium etc. But there’s this weird space in the middle of the episode. Riker is going around wondering whether he can have a real relationship with an unreal woman, and he talks to Picard about it, and they’re both amazingly open to the idea. They are at least willing to entertain the idea that the one-sided emotion a person feels for a computer could qualify as love, and not merely as a kind of reformulated violent escapism ending in mass destruction. This is probably still as close as television has ever come to endorsing the idea that a person should be able to marry a body pillow.
Anyway, robot sex. Who’s doing it? Hardly anyone at the moment, since the only commercially available sexbots are this doll called Roxxxy and her male brand cohort Rocky, which — well, they cost thousands of dollars and here’s their video. “Sex” seems like a bigger stretch than “robot” here. Still, with all due respect to the Russian espionage-roach, the story about the anti-sexbot movement is the most complex and fascinating of the recent robot-themed headlines, and only partly because it’s a strike against a product category that doesn’t really exist yet.
What’s interesting is the Campaign Against Sex Robots’ specific argument. You can see, right away, that any discussion of the possible social effects of fuckable android tech is going to cluster around one of two poles. The first will see sex robots as harmless or beneficial, because it will assume that what happens in a robot bondage dungeon stays in a robot bondage dungeon — will assume, in other words, that sexbots might ameliorate loneliness, enable people to explore their own kinks without fear of humiliation or danger, or work out erotic longings that might be destructive with a human partner, without any of this necessarily bleeding through into their actual human relationships. What could be wrong with increasing the sum total of pleasure in the universe?
That’s the first argument. The Campaign Against Sex Robots follows the second. The second argument assumes that robots can’t be classified uncomplicatedly as objects, because anything people get up to with something that seems like a person will inevitably condition their relations with actual people. If I have sex with a human-like robot that I’m free to dominate and objectify, then my brain is not going to be able to compartmentalize that behavior as robot-specific; I’m going to see other humans, including but not limited to human sex workers, as more object-like, more subordinate to my own will. Moreover, going to bed with a physical body that is both compliant and lacking consciousness will make me more likely to want to dominate and objectify other people, because it will reinforce my own deep-down, infantile sense that my subjectivity matters more than anyone else’s — that other people’s minds may be real, but not quite as real as mine. The founder of the Campaign Against Sex Robots, the English ethicist Kathleen Richardson, not unreasonably makes this an issue of gender and power, arguing that giving heterosexual male desire free rein among anatomically female sex robots will only worsen men’s objectification of women and children generally, and thus reinforce the violent inequalities already present in society.
Think about the dismay that followed the decapitation of hitchBOT, the hitchhiking robot, in Philadelphia over the summer. The sadness wasn’t just about the destruction of a cute science project. It was also a sadness about human nature, the feeling being that someone who would be cruel to a vulnerable little droid would be cruel to vulnerable humans, given the chance.
Which, I mean — spend 15 minutes in a comments section; this is far from an outlandish position. It does assume, though, that only a man could want to fuck a robot. Ladies, is that true? As far as I can tell, the bulk of the nascent sexbot-industrial complex also takes for granted that this is going to be a dude-centric phenomenon. It’s plausible. But as with pornography, it’s hard to tease out the problem of production/distribution from the more obscure problem of desire; maybe it’s just that the straight men who control sexbot R&D are building the products they themselves want to get with. (They’re Pygmalioning!) And there are at least a few cultural indicators that the question might be somewhat more nuanced than the Campaign Against Sex Robots’ position statement allows. Margaret Atwood’s new book about a sex-robot-strewn dystopian future includes a bit about an android gigolo who’s “like a super-dildo, only with a body attached.” Brookstone’s thriving personal-massager-based business model at least vaguely gestures toward a world in which gender is not the barrier to wanting machines to get us off.
In any case, the Campaign Against Sex Robots’ position here goes right to the core problem of humans and human-created automatons. We want our individual consciousnesses to be more valid than anyone else’s. We want, deeply, the power that comes from having the only legitimate outlook on any given situation. We want to be able to look at other people without their looking back. We want to be able to use other people without their being able to judge us for it. Robots, assuming they aren’t sci-fi sentient, offer the possibility of simulating this sort of power in a realm free of ethical implications. We don’t have to consider their feelings, because they don’t have any, even if we build them to act like they do. But it’s a slippery possibility, because we know that any number of pseudo-ethical justifications have been devised over the millennia to allow us to enforce the same sort of power over actual humans. Lacking robots, humanity has historically been pretty happy to settle for slaves.
Treating real people as if they were artificial is a constant temptation. And so treating artificial people as if they were artificial feels dangerous. Even acknowledging the desire feels dangerous, because the whole point of most ethics — you could even say the whole point of civilization — is that other people are as real as you are and have to be treated as such. (Of course, if the desire didn’t exist, we wouldn’t need ethics to restrain it.) This is why, even when the robots are nominally villains, the androids-annihilate-humanity plot in sci-fi is always, in some sense, a drama of justice. It’s a way to imagine the punishment that should follow from the slaveholder’s denial of the humanity of slaves.
But the problem is even more complicated than this, because objectification and its dangers also form the basis of so much art. Novels, TV shows, painting, sculpture — even if they don’t give us power over other people, the appeal of these forms is often precisely that they give us a way to look without being seen. They’re inherently objectifying. They encourage us to form emotional bonds that are one-sided, non-reciprocal, and in some sense false. And yet we don’t consider them detrimental to ethical life; we consider them the opposite, in fact, because it turns out that when the objectified person is rendered engagingly enough, we come out on the other side of objectification. We look at a sculpture and feel more connected to other people, not less. We cry when a character dies. We identify. We sympathize. We perceive imaginary humanity in a way that makes us more alive to the real humanity of others.
I’m not suggesting that fucking a sexbot is the same thing as reading Little Dorrit. But the heart is strange and wild. Give people intimate access to anatomically correct simulated humans and of course all sorts of twisted shit will go down, because people are twisted. People will also form intense attachments to these things, will feel feelings we don’t now have words for, because people are hopelessly loving and vulnerable. People can weep for an AIBO. People can serenade a hologram with a trombone. Whether this is a reason for hope or a reason for deeper despair depends on how you look at it, I guess. The future may be a dystopian hell of lonesome orgasms and drone strikes. It may be a paradise of warm-lipped statues who share our interest in kayaking. It may never happen at all. It may always float a few feet out in front of us, the way fear sometimes seems to do. Or desire.
Marie Kondo, the decluttering guru, advocates thanking objects for their service before you throw them away.