I caught the tail end of the segment on Sherry Turkle ("the poster child for the tech revolution, literally") in which host-creature Guy Raz anxiously warned Turkle about "some of your colleagues at MIT who, you know, make you seem like an outlier." I mean, you wouldn't want to be a weirdo, would you, Sherry? Don't you want to be, like, a normal poster child for the tech revolution? Raz was much more comfortable with his next guest, Cynthia Breazeal, who reminisced obligingly for him. When she was a kid, she saw Star Wars and was delighted with the robots C3PO and R2D2:
I mean, they had emotions, and they were loyal sidekicks and they helped people ... And they were amazing in their - in their humanity almost, right? ... I was all about those robots.
RAZ: And so, those robots from Star Wars?
BREAZEAL: They definitely led me, ultimately, to MIT ...
RAZ: And that's where, for the last 20 years, Cynthia Breazeal's been building and studying social robots.

Okay, Breazeal can be forgiven for thinking, as a kid, that C3PO and R2D2 "had emotions and were loyal sidekicks and ... were amazing in their ... humanity almost." But of course C3PO and R2D2 were not really robots. They were characters in a story, played by human actors in costumes, and they were not evidence of what robots were like in 1977, or might be in a galaxy far away. In this they were like Isaac Asimov's robots, who also were animated by magic ("positronic brains"), not by advanced technology: that's why they seemed so humanlike. They were not robots but allegorical stand-ins for human slaves, and they didn't even bear as much resemblance to "real" robots as African-American slave characters in Hollywood films like Gone with the Wind bore to real antebellum slaves. They were meant to delight the audience, not represent reality.
Guy Raz was the ideal shill for those who exaggerate the achievements of actual technology. He badgered Sherry Turkle for asking questions about what it means that technocrats see it as progress when a lonely old woman in a nursing home bonds with an animatronic doll called Paro.
RAZ: Isn't there a part of you that looked at that interaction and thought it was amazing? I mean, here was this woman who had suffered a profound loss, at some point in her life - she had lost a child - and she was responding to this robot. And it might not have been real, but it was real to her.
TURKLE: Well, no, I don't think it's amazing. And what was extraordinary, to me, is that although she was surrounded by people who were in a position to understand her story, we were kind of applauding and stepping back, and cheering on her connection to a machine that understood nothing. And I really felt - what are we doing? Why are we, essentially, outsourcing the thing that defines us as people? You know, we care for our young; we care for our old; and outsourcing it to something that we knew was essentially, deceiving her by tricks into thinking that it cared for her at all ...
RAZ: OK, Sherry, can I just pause here for a sec? ... What if someone is so lonely, and their loneliness can be resolved by having a robot or, you know, some device to talk to - I mean, shouldn't we let them have that?

I agree with Turkle: it wasn't amazing at all. It's not much of an achievement to get people to bond with inanimate objects or pets or characters on TV or recorded singers or imaginary friends. That's why I don't take the Turing test very seriously. The computer scientist Joseph Weizenbaum found that people who used his ELIZA program in the 1970s, which reflected people's typed words back at them, reacted to it as if they were having a real private conversation with someone. Because it is so easy to get people to anthropomorphize the world, some computer geeks have tried to raise the bar, so that "interrogators are specifically trained to avoid" anthropomorphizing. In that case, I suspect, many human beings couldn't pass the Turing test. As I listened to the TED Radio Hour, I had my doubts whether Guy Raz was human or a software construct. But no, he was way too easily fooled by the robots he met to be anything but human.
"We" aren't going to need people anymore, because robots are quickly being developed -- by whom, I don't know, certainly not by humans -- that will fulfill our needs. But when the Kingdom of the Robot comes, will "we" be worthy of our new neural-net BFFs? That question lurked everywhere in the program, but was never addressed, let alone answered. Notice too that Raz assumed -- rather like the tech people at the scene -- that the woman's loneliness was "resolved" by talking to a robot, and that this represented some kind of "amazing" advance. But there have been people whose loneliness or misery was assuaged somewhat by stuffed toys, or plastic dolls: high-tech electronics are not really necessary as a stopgap, but there's no reason to suppose they are any better a substitute for real human interaction than a Betsy-Wetsy. Jesus once asked his audience rhetorically if, when a man's child needs bread, he'll give it a stone; for Guy Raz, the answer is evidently "Why not?" But nowadays we can give a hungry child a full-color hologram of food, and shouldn't we let them have that rather than give them nothing?
The blurb on the web page continues: "We've been promised a future where robots will be our friends, and technology will make life's daily chores as easy as flipping a switch." Really? When were "we" -- presumably human this time, but who knows -- "promised a future where robots will be our friends"? Who made that promise, and what drug were they on? One cognitive scientist, Gregory J. E. Rawlins, speculated eagerly about electronic pets and "feral cars" in his 1997 tract Slaves of the Machine (MIT Press, p. 121): "One day our artificial creations might even grow so complex and apparently purposeful that some of us will care whether they live or die. When Timmy cries because his babysitter is broken, ... then they'll be truly alive." Haven't children already cried when toys broke?
Listening to the TED Radio Hour right after I'd finished reading David F. Noble's Forces of Production helped me recognize what was going on: it was the technocratic mind at work, trying to get rid of messy, selfish, lazy, irrational human beings in favor of clean, ego-less, hard-working rational machines. The advertising always runs ahead of reality, promising emancipation from human workers that the products can't deliver. It doesn't matter whether those machines are able to do the work, let alone whether they are cheaper or more efficient -- they aren't -- what matters is control by those who fondly suppose themselves to be superior to the rabble, and who forget that they are human themselves. They fantasize about communing with robots -- or at least, as NPR's Guy Raz did, about letting other people make do with robots when real people fail them. Many fantasize about becoming cyborgs themselves. This isn't a rational aspiration: it's just a newer version of the ancient religious wish to jettison the corruptible flesh and become pure, incorruptible spirit. Like the second coming, the glorious day of that transformation keeps having to be postponed, but the true believer never gives up hope.