Avedon at Sideshow links to an excellent post on Obama by Ian Welsh, and adds:
Obama had the power to do many, many good things, and he refused every opportunity to do them. He refused to even attempt the most basic steps of negotiation with the opposition, asking not for a higher goal than what we really needed, but a lower goal as a pre-compromise, thus lowering the bar further still. He alleged (when he was trying to get elected) that he believed single-payer was the best way to go, but then he started babbling about the public option before he'd even started making a case for single-payer, having simply declared that passing single-payer wasn't politically feasible. (Oh, yeah? Start pounding it into the general public that everyone in America can get effectively free health care without raising taxes, and see how far Congress gets trying to resist it past the next election.) He even telegraphed to the press that he wasn't even really trying to get his so-called "compromise" of the public option, but instead was hoping the threat of the public option would frighten the insurance companies into slightly softening their viciously predatory and fraudulent practices - which it didn't. If he'd really wanted single-payer, he could of course have spent a lot of time explaining how real socialized medicine actually works in Britain and used it as the scare image of what "the left" was demanding, forcing the not-so-left to welcome single-payer as a longed-for compromise. And that has been his pattern with everything.
And more.
I read through the comments on Welsh's posts, and it seems to me that the Obamabots, while present, are fewer in number and somewhat less vitriolic than they used to be. Maybe they have bigger fish to fry elsewhere, like showing their cojones by jeering at Glenn Beck in comments at alicublog, but maybe they're losing the faith just a bit. But where will they go? The wilderness is cold and scary, and Robert Gibbs will call them names, and they can't stand that. (It's okay for them to call other Democrats names, of course, to say nothing of The Terminader.)
This is cute, but it's not really a good antithesis. How about "Science lets you bomb civilians from the sky without any danger to you"? Or "Science lets you shred unarmed civilians with machine-gun fire via video in your attack helicopter"? Or "Science lets you kill children with remote-control drones from thousands of miles away"? Or "Science lets you cull the human race of inferior losers through eugenics"?
The September 11 terrorists didn't fly into the World Trade Towers by the power of religion. (Whoa, dude, you mean they totally flew through the air? Praise God!) Like any other good soldiers, they used high technology which enabled them to do much more damage than they could have done by their own muscle power. Scientists have been remarkably willing to put enormous destructive power into the hands of politicians and generals, but that's okay because it's not for scientists to decide these things -- or so they claim. The responsibility isn't theirs alone, of course, but they must shoulder their share.
What's wrong with this bumper sticker is that it deliberately confuses what is at stake. It's not a confusion limited to atheistic scientists, of course. At the beginning of Patience With God, for example, Frank Schaeffer draws the reader this picture of the trouble we're in here in River City:
At a time when Islamist extremists strap bombs to themselves and blow up women and children; when America has just come staggering out of the searing thirty-year-plus embrace of the reactionary, dumb-as-mud Religious Right; and when some people are bullying, harassing, and persecuting gay men and women in the name of religion, it's understandable that the sort of decent people most of us would want for neighbors run from religion. There is a problem, though, for those who flee religion expecting to find sanity in unbelief. The madness never was about religion, let alone caused by faith in God. It was and is about how we evolved and what we evolved into [3].
Now, I'm the kind of atheist who agrees that "the madness never was ... caused by faith in God", since human beings create their gods. "Faith in God" is an effect, not a cause. (Schaeffer, though, still thinks there's a nice God who protects babies.) But look again at those opening clauses: he mentions "extremist" Muslim suicide bombers, but not "extremist" Christian American soldiers killing hajis as payback for 9-11, though American soldiers have killed far more people in the past few decades than Muslim suicide bombers have. America is rich, after all, and can equip its soldiers to kill more effectively and on a much larger scale. I think Schaeffer can now blame Ronald Reagan and George W. Bush for their murderous religiosity, though it will be interesting to see just how far he goes as the book proceeds. But my point is that it's easy and safe to invoke Islamist suicide bombers as a symbol of religious fanaticism, and not so safe or easy to recognize the same fanaticism on our side -- whether "our side" is America, Christianity, or science.
So the bumper sticker that inspired this post really is meant to paper over Western science-driven state terror. Possibly the person who designed it didn't realize that, but if so, he or she is less rational than he or she clearly wants to believe.
Got your attention, did I? When that title occurred to me today, I didn't know when I'd find a use for it. And then I logged onto Facebook and found this Wall Street Journal op-ed piece, linked by a right-wing acquaintance of mine. The subhead crows, "In 2008 liberals proclaimed the collapse of Reaganism. Two years later the idea of limited government is back in vogue." Of course, I'm not a liberal, so, just to show off, let me quote myself from a year ago:
Driftglass, which has some nice discussion here and here, also shares this video clip and Washington Post article from 1994, when the Republicans, having taken control of Congress, thanked Limbaugh for his services and leadership. In the video you can see Limbaugh pronouncing the final defeat of liberalism: every college should have one liberal and one communist professor, he declares, living fossils who will remind students of what they tried to do to America.
This Nostradamus-like prediction should, I think, be borne in mind now as liberal pundits celebrate the end of the conservative movement. Just on general principles, this is never a good idea. The arch-reactionaries who now define conservatism in the U.S. were supposedly finished for good when Barry Goldwater lost to Lyndon Johnson in the 1964 presidential election -- but sixteen years later Ronald Reagan proved those claims premature. Yes, the Republicans are in serious disarray now, which suits me just fine, but I don't think that Obama is in office, or that the Democrats now control Congress, because his center-right policies, or the Democrats', are superior to the Republicans' -- it's because the Republicans screwed up the economy so spectacularly. (I say "the economy" because in foreign policy Obama clearly intends to continue screwing things up as Bush did.) The Dems now have a chance to do better, and I hope their timid approach is enough to right the damage that the Republicans inflicted (with Democratic collaboration), because it doesn't take long for the voters to become disgusted with Democratic ineptitude. In particular, Obama is going to have to stand up to the banking system, but so far he only seems to be interested in throwing more money at it. Clinton was elected in 1992 because of Americans' disillusionment with Reaganomics, and he lost his mandate in just two years.
As I said, liberals have jeered at the right's paltry showing in their tea parties, and it's been entertaining to watch the indignant right-wing response to the jeers: Big trees from little acorns grow! There may be just a few of us now, but there will be more! One, two, many Pinochets! ... And after all, the Civil Rights movement, the opposition to the Vietnam war, the women's movement, the gay movement all started with little bands of nutty extremists. The same is true of the conservative movement William Buckley Jr. built, which ultimately took over this country for three decades. So, for that matter, did Christianity, which is a reminder that, contrary to what one commenter argued at the Village Voice, it is not necessary to have realistic or coherent ideas to build a frighteningly successful movement. Irrationality can be a strength. And forty-seven percent of the electorate voted against Obama last November; they lost the election, but they are not a negligible part of the population.
I said it was premature to celebrate the death of the Right. But did anybody listen to me? Noooooooo!
Anyway, Berkowitz's analysis is wrong from start to finish. American voters in 2008 pretty decisively rejected what he calls "limited government" -- though that's a misnomer for Republican policy since Reagan. "[R]eining in spending, cutting the deficit and spurring economic growth" describes neither the Reagan nor the George W. Bush administration. Bill Clinton cut the deficit by cutting social spending, but without benefiting most Americans economically -- most of the money created by the Internet bubble went to the rich, and the bubble burst just in time for Dubya to take office. (See Robert Pollin's Contours of Descent [Verso, 2003] for more information.) Obama has mostly continued their policies, which is why most Americans are still in bad economic shape. On the other hand, corporate profits -- the only measure that interests Republicans and their Democratic collaborators -- have bounced back since Obama took office, so from Berkowitz's point of view Obama has been doing the right thing, or at least not doing wrong.
The far-right Republican revanchists of the Tea Party never complained about big government, runaway spending, and ballooning deficits while Bush was in office, at least not until he bailed out Wall Street at the end of his second term. (With the cooperation of McCain and Obama, remember.) They only became restive when a brown-skinned Democrat moved into the White House. Unfortunately, given our zero-sum two-party system, those voters who don't want either Republican or Democratic pro-corporate pillaging have few options this November. A defeat for the Democrats won't be a vindication of Republican policies -- if anything, the reverse -- any more than it was in 1994, but the corporate media, the Republicans, and the Democratic Party apparatus will spin it that way.
"In the aftermath of the global economic crisis of 2008, Western liberal democracies have been increasingly forced to come to grips with their propensity to live beyond their means," Berkowitz intones. This is typical right-wing hackery, but it's important to get warmed up for this November, whether the Republicans make electoral gains or not. The global economic crisis was the result of neoliberal economic policies championed by the Republicans in the US and their allies elsewhere. They never had any interest in making governments live within their means -- rather the opposite: the aim was to cut back government revenue (by tax cuts for the richest) in order to rationalize and justify cutbacks in social spending, while continuing to raise deficits, subsidize corporations, and wage war. This would starve the mass of people while enriching the top one percent or so of the population. (See Naomi Klein's The Shock Doctrine [Holt, 2008] for a fuller account.)
To describe Obama's record as "aggressive progressivism" is absurd, but I'd expect no less from a Wall Street Journal op-ed. For even more fun, read the comments, like this one: "Re-reading this article a day later, I have come to suspect that Peter Berkowitz is a closet progressive." And why? Because Berkowitz allowed that our government's "responsibilities include putting people to work and reigniting the economy—and devising alternatives to ObamaCare that will enable the federal government to cooperate with state governments and the private sector to provide affordable and decent health care", and that
A thoughtful conservatism in America—a prerequisite of a sustainable conservatism—must also recognize that the liberty, democracy and free markets that it seeks to conserve have destabilizing effects. For all their blessings, they breed distrust of order, virtue and tradition, all of which must be cultivated if liberty is to be well-used.
To observe this is not, as some clever progressives think, to have discovered a fatal contradiction at the heart of modern conservatism. It is, rather, to begin to recognize the complexity of the conservative task in a free society.
Working at the Hoover Institution must be a cushy gig, with no thought required, just the ability to shuffle and deal out gaseous cliches. Numerous commenters took vehement exception to his platitudes, however, and to
The Gingrich revolution fizzled, in part because congressional Republicans mistook a popular mandate for moderation as a license to undertake radical change, and in part because they grew complacent and corrupt in the corridors of power.
The "Gingrich revolution fizzled" because most Americans didn't want it -- Gingrich never had a mandate, since voter turnout in the off-year elections of 1994 was typically low. The Republicans capitalized on voters' anger over one of their own programs, the North American Free Trade Agreement, just as they're capitalizing now on the consequences of their policies under Bush and Obama.
My ambivalent Obama-supporter friend just posted this joke to Facebook, from Fark:
Estimates show that there would be fewer jobs and larger deficits under the Republicans' plan. Republicans say that's impossible, they don't have a plan.
WARNING: Here There Be Spoylers. If you haven't seen this movie but want your first viewing to take place in blissful ignorance of its content, read no further. Georgy Girl isn't really plot-driven, and you can probably enjoy it just fine if you know the story in advance, but if you care, I'm interested in raising questions about it that assume the reader has also seen it (and maybe read the book as well), to compare notes rather than excite the curiosity of those who haven't seen it.
A couple of weeks ago I finally decided to check out the DVD of Georgy Girl from the public library. I'd been listening to songs of the Australian easy-listening group The Seekers, who had a big hit with the highly annoying theme song for the movie, and I was curious to see just what sort of story Georgy Girl had to tell. I expected something like a Richard Lester film, an upbeat fable of an ugly duckling who becomes a bird.
The DVD case encouraged that impression: the top half of the cover has heavily retouched headshots of the four stars, with the women in flip hairstyles and Alan Bates looking like a young and goofy Leonard Cohen. The lower half shows Bates, Lynn Redgrave, and Charlotte Rampling dancing hand-in-hand in colorful outfits. It's not typical of the movie, and in context the dancing scene is ironical: the characters are hoping for a happy outcome (in this case, of Bates's marriage to Rampling) they aren't going to get. And, of course, the film itself is in black-and-white, not pink-tinged color.
The opening credits were a pleasant surprise: Lynn Redgrave saunters (almost swaggers) down a London street, her hair tied back in a loose ponytail, without makeup, wearing a baggy sweater, a leather coat, and midlength culottes over tights. Though not conventionally pretty, she looks wonderful. Some reviewers describe her as "chubby," which she's not, though she's no Twiggy either. The sequence takes a silly turn (though, as I discovered, it's straight out of the book) when she's tempted by the window dressing of an upscale hair salon. She enters, then emerges with her hair in a bouffant beehive, which immediately makes her uncomfortable, so she ducks into a ladies' loo and soaks it down in the sink. A moment later her ponytail is back and she looks happy. To my eyes she looks (and mostly acts) like a real person, not a Hollywood movie character, but then this is not a Hollywood movie -- it's English, from a period when the English film industry was playing with more realist themes and styles, exploring sexual subjects that the US wouldn't deal with for another decade or more. But even so, Redgrave projects a naturalness that almost no one else in the film (except maybe Charlotte Rampling) does: the other main characters are stagy Brit film actors, which isn't necessarily a bad thing (except maybe Alan Bates, whose overacting gets tiresome pretty quickly) but makes Redgrave stand out even more.
Georgy -- short for Georgina Caroline -- is the daughter of Ted and Doris, valet and housekeeper to the wealthy James Leamington and his hypochondriac wife Ellen. Because his own marriage is childless, James (James Mason) has always treated George as his daughter, even sending her to a posh finishing school, and encouraging her headstrong ways. Ted and Doris are embarrassed by her casual assumption of privilege and James's favor, and try to keep her in her place by reminding her that she's too big, fat and ugly for any man to want. In her mid-twenties as the film begins, she lives in a flat of her own, using James's house for the dancing class she teaches to neighborhood children. James, who's always been a womanizer, abruptly asks Georgy to become his mistress. She puts off giving an answer either way.
Georgy shares her flat with the chic, gorgeous Meredith (Charlotte Rampling), a first violinist who sleeps around a great deal. (For those of us who remember the period, all you need to know about Meredith is that the ultra-hip Mary Quant designed her costumes.) Georgy, who has no love or sex life, plays the same role with respect to Meredith that her father plays to James: adoring servant. Among Meredith's many admirers, Jos (Alan Bates), a banker and part-time musician with mod aspirations, has become a regular and befriended Georgy somewhat. When Meredith gets pregnant, presumably by Jos, she realizes she wants a rest from her wild life, so she decides to keep the baby and marry Jos. Georgy is delighted, and expects that the three of them will raise the baby together. Meredith tires of motherhood and Jos even before she gives birth, and in a series of scenes that shocked 1960s audiences and critics, refuses to take care of the baby and insists on putting it up for adoption. Meanwhile, James's wife has died suddenly and unexpectedly, so James upgrades his proposal to Georgy to marriage.
Jos and Georgy, meanwhile, have fallen into an affair, relieving her of her virginity. She wants to keep the baby herself, and manages to block the adoption. Meredith runs off, leaving the baby with Jos and Georgy, but Family Services takes a dim view of cohabitation, and takes the baby away. Jos says he loves Georgy but is too irresponsible to have a relationship with her or his daughter, so he dumps her. This relieved me. I don't see how anyone could have wanted them to stay together, with the characters as they were: Bates's manic overacting, which he would refine to apotheosis in Simon Gray's Butley a few years later, makes Jos a person no one could live with for long. Georgy then agrees to marry James if he'll adopt Meredith's baby, and the film ends with their wedding and Georgy in a white wedding gown in the back seat of James's car, looking as uncertain but determined as Dustin Hoffman and Katharine Ross would look in the final scene of The Graduate a year later.
So. Even by twenty-first century standards, Georgy Girl is rather strong stuff, flouting all the standards of Christian civilization. That's what makes it interesting. British cinema had explored some of these byroads already in such works as A Taste of Honey (1961), in which an illegitimate working-class girl is impregnated and abandoned by her black sailor boyfriend, then befriended by an effeminate gay boy who wants to take care of her and the baby.
Georgy Girl has aged well. Cohabitation and single parenthood aren't as scandalous now as they were then. Seen from today's perspective, Meredith doesn't seem quite as monstrous as she appeared in the Sixties, though she's still not particularly sympathetic. In a way, she's a throwforward: an independent career woman with no interest in marriage or motherhood, but she lets herself succumb to both in a brief moment of weariness. Such a character couldn't have appeared in an American film of the period, or even later. Georgy may be a great lumpy thing, but to some extent -- not nearly enough -- Second Wave feminism made it easier to appreciate the beauty of women who aren't Charlotte Rampling or Marilyn Monroe, who aren't fragile or femme, who don't wear makeup or spend hours on their hair, and who take up space. Of the contemporary reviewers I've found so far, only Eleanor Perry, writing for Life of all places, really appreciated the character. The New York Times reviewer generously allowed that Redgrave couldn't "be quite as homely as she makes herself in this film. Slimmed down, cosseted in a couture salon ... she could become a comedienne every bit as good as the late Kay Kendall." Pauline Kael, complaining that movies were "out of control," mistook Georgy Girl for a failed conventional comedy. Redgrave herself was a tall woman, "Nearly 6 feet tall, bluntly outspoken and something of a kook," according to this profile from Life.
It's partly because I'm queer myself, of course, but I thought that what Georgy needed was a girlfriend. That's not to say that the filmmakers or the novelist Margaret Forster, who wrote Georgy Girl and cowrote the screenplay, saw her as lesbian or even bisexual. But George's adoration of the beautiful and icy Meredith, her readiness to wait on her, cook for her, and clean up after her, certainly allows a queer reading of the film. (So does her name, which if only in retrospect conjures up Frank Marcus's scandalous 1964 lesbian play The Killing of Sister George, filmed in 1968, and Van Morrison's possibly transvestite "Madame George" from his 1968 album Astral Weeks. What is it with the Brits and these names?) Yes, I know, a big-boned woman shouldn't be stereotyped as a dyke; Georgy wanted to be Meredith, not have her; and so on. But gay and gay-friendly viewers will know that there's not a sharp line between wanting to be and wanting to have, and since movies especially convey a lot of information about a character through appearance, the character as written and as played by Redgrave permits her, though it doesn't require her, to be read as lesbian. True, she has an affair with Jos, but many young dykes experiment with (or are pushed into) heterosexuality; I wouldn't say that Georgy was even closeted -- probably she was still too naive to think of sex with a woman as a possibility; and she ends up married to a much older man for convenience. It's not at all outrageous (though too bad if it is) to see Georgy a few years down the line, raising her daughter with a female partner, perhaps after James dies.
After watching Georgy Girl I read the novel (Secker & Warburg, 1965) on which it's based, and I was startled at how faithful the film is to the book, both in spirit and in text. Much of the movie dialogue comes verbatim from the book, and even scenes like the one where Georgy submits herself to a hairstylist, or the one where she puts on a gown and makeup and, looking like a drag queen, sings a bawdy torch song to James's party guests, are in the original. (In the book, though, everyone thinks Georgy's performance is amusing; in the film, they're embarrassed. That's the only place where the film really fails the book's spirit.)
I was also struck by this bit of backstory in the novel (pages 8-9), describing how Georgy's father Ted had come to work for James in the first place.
James picked Ted up at a music hall. He came to see a little blonde juggler and happened to sit next to Ted, who hadn't come to see anyone in particular. He was what he called "on spec". Ted noticed James, which wasn't surprising as James was very imposing looking, and James noticed Ted, which was surprising because Ted was very ordinary. He was small, seedy and at that time thin because it was 1935 and he was out of work. James wore a beautiful fur coat and a rakish hat tilted over one eye. Ted didn't have a coat of any sort and he had momentarily removed his hat because he had a fixation about not being able to hear properly without it on. They didn't speak during the performance. Ted was too busy thinking what a wicked waste of money his seat was, and James was all keyed up waiting for the blonde juggler.
But at the end they happened to go out together and Ted sort of followed the toff he'd been sitting beside partly because he'd nothing better to do and partly to see what sort of car he had. He looked rich enough and dashing enough to be a car man. The toff went round to the stage door and hung around a bit until he appeared to get fed up and handing a card to the doorkeeper walked briskly off. Ted followed. The car was round the side of the theatre, a big Rover with shiny red upholstery. Ted wanted the car so much that all the saliva rushed into his mouth with desire and he had to spit to get rid of it. It was a very noisy spit. The toff turned round and said "Are you spitting at me?" and Ted almost said "Yes, what you gonna do abaht it" but luckily changed his mind and said "No". Instead he said what a lovely car it was and the toff was all agreeable and offered him a ride which Ted accepted without any ill feeling whatever.
His wife Doris, James's cook and housekeeper, disapproved.
Every now and then they would have a real row and she would scream at him that he'd never dirtied his hands since she married him. Well, he hadn't. His job was a clean job. As James's valet he looked after clothes and he had to be clean. He couldn't make out why she wanted him to go and get some filthy job, as though there was some sort of virtue in dirt. He knew when he was lucky, which was more than she did. He had a soft job, free accommodation, good wages, life-long security and, above all, constant access to James [11].
Again, I think all this supports (though it doesn't require) a gay reading. Not that Ted and James are 'really gay' -- the popular concept of 'fluidity' in sexuality always gets forgotten in cases like this. But what, exactly, happened on the ride they took that night they first met? Add James's sexual frustration at not getting to the blonde juggler he was after to a few drinks and Ted's decision to be acquiescent rather than uppity to this "toff," and a brief sexual connection between the two is quite plausible. The bond was sustained by other factors once Ted married Doris and went to work for James, and I don't suppose that it was sexual after the beginning. But it's remarkable (partly because it's never stated explicitly, as far as I remember) that Ted's subservience to James is mirrored by Georgy's subservience to the glamorous Meredith.
If the idea of even a transient sexual connection between these characters bothers you, fine: put it firmly from your mind. It's not explicit, it's at most a subtext, it probably wasn't in the minds of the writer or the filmmakers, but I think it's a legitimate reading of the story that adds to and thickens the emotional interactions of the characters. It also added to the already considerable pleasure I took in watching the film and reading the novel. Numerous reviewers called Georgy Girl an "ugly duckling" story, but I was pleasantly surprised to discover that it wasn't. Rather than becoming a normal girly-girl, Georgy remains her ungainly, ugly-duckling self even in her wedding gown. Both the book and film are open-ended, with an unfinished protagonist who still has some growing to do, but will probably turn out all right. That suits me much better than turning her into a swan.
Katherine Jackson -- who has custody of Michael's three children, Prince, Paris and Blanket, 8 -- has been working hard to provide the kids with a more normal upbringing since their father's tragic passing. In June, Jackson's kids were spotted going door-to-door as Jehovah's Witnesses.
I'm the last person to stump for normality, but I'm experiencing some cognitive dissonance here.
And it continues. The two oldest Jackson children have been homeschooled, but now they're going to be exposed to "traditional schooling."
The Buckley School is an exclusive private school with scads of famous alumni: Matthew Perry, Alyssa Milano, Nicole Richie, Nicollette Sheridan, Laura Dern, Paris Hilton and Kim Kardashian all attended.
I guess the meaning of "normal" here is to prep them for a career of reality TV and cocaine busts. Well, I'm sure Grandma knows best.
So I went back to the beginning of Frank Schaeffer's Patience With God. He says that when he plays with his five-month-old granddaughter Lucy,
I find myself praying, "Lord, may none but loving arms ever hold her." This prayer has nothing to do with theology. I'd pray it whether I believed there is a God or not, for the same reason that when I'm looking at the view of the river that flows past her home I sometimes exclaim, "That's beautiful!" out loud, even when I'm alone [ix].
Maybe it's the same reason, but what is the reason? Schaeffer doesn't like the New Atheists any more than he likes fundamentalists, but New Atheist Daniel Dennett has written that, after a sudden health crisis that nearly killed him,
I saw with greater clarity than ever before in my life that when I say "Thank goodness!" this is not merely a euphemism for "Thank God!" (We atheists don't believe that there is any God to thank.) I really do mean thank goodness! There is a lot of goodness in this world, and more goodness every day, and this fantastic human-made fabric of excellence is genuinely responsible for the fact that I am alive today. It is a worthy recipient of the gratitude I feel today, and I want to celebrate that fact here and now.
To whom, then, do I owe a debt of gratitude? To the cardiologist who has kept me alive and ticking for years, and who swiftly and confidently rejected the original diagnosis of nothing worse than pneumonia. To the surgeons, neurologists, anesthesiologists, and the perfusionist, who kept my systems going for many hours under daunting circumstances. ...
The best thing about saying thank goodness in place of thank God is that there really are lots of ways of repaying your debt to goodness—by setting out to create more of it, for the benefit of those to come. Goodness comes in many forms, not just medicine and science. Thank goodness for the music of, say, Randy Newman, which could not exist without all those wonderful pianos and recording studios, to say nothing of the musical contributions of every great composer from Bach through Wagner to Scott Joplin and the Beatles. Thank goodness for fresh drinking water in the tap, and food on our table. Thank goodness for fair elections and truthful journalism. If you want to express your gratitude to goodness, you can plant a tree, feed an orphan, buy books for schoolgirls in the Islamic world, or contribute in thousands of other ways to the manifest improvement of life on this planet now and in the near future.
The trouble is, "goodness" is no more a real thing than, say, God. It's an impersonal abstraction, and you can neither thank it, nor owe it a debt. All those people who kept Dennett alive are not "goodness." "Goodness" didn't give us Randy Newman's music, nor fresh water nor fair elections (do those exist?), any more than God did. By doing good things, I am not trying to "repay my debt to goodness." At most I'm repaying other people who've done good things, but if I create goodness, I do it for its own sake, because creating goodness is both subjectively pleasant to do and objectively useful.
More seriously, though, there's widespread disagreement as to what "goodness" is. Dennett appears to be blissfully (and remarkably, for a professional philosopher) unaware of this. Even among atheists there's no consensus. Dennett thinks it would be good to "cage" and "quarantine" theists who get in the way of his scientific triumphalism. Christopher Hitchens and Sam Harris think it's good to kill ragheads. Ayn Rand thought that selfishness (as she very tendentiously defined it) is the paramount human good. Once theists get beyond generalities like loving and serving their gods, they too get bogged down in details. "Kill them all, let God sort them out" is a good for many theists. Divorce? Homosexuality? Wealth? Family? Believers prefer to sweep these issues under the rug, but they shouldn't be allowed to get away with it. But neither should atheists.
It's natural for human beings to personify the impersonal, and there's no harm in saying "Thank goodness" (though I don't think I agree with Dennett that the phrase is not merely a euphemism for "Thank God!", any more than "God" is a euphemism for "goodness," though some believers use it that way). It seems reasonable to me to want to thank someone for good or beautiful or pleasurable things in our lives that don't come from specific, identifiable persons -- but there's no one and nothing to thank for the beauty of a sunset or the river that flows past your house. "I prefer real good to symbolic good," Dennett declares, but "goodness" as he's using the word is symbolic good. Ah, my fellow atheists are such an embarrassment to me sometimes.
So back to Schaeffer. I'm not a parent or a grandparent, but I don't need to be to feel the same way about babies and young children, about human happiness in all its fragility. I hope too that such happiness and beauty will last, though I also know it won't. (In the long run, John Maynard Keynes said, we are all dead.) But the last person I'd ask to preserve these things is the god Schaeffer prays to, a being who kills babies and children and adults and old people without mercy, often stretching out their suffering to a horrifying extent. (Schaeffer, who survived childhood polio, knows this very personally.) It's easy to feel gratitude toward a god when you're dandling a healthy, happy baby; not so easy when you're burying that baby after it dies of one of the diseases to which your god made babies so vulnerable. It's easy to be dazzled by Nature when you see a picturesque landscape; not so easy when Nature rears up and drowns you and your whole family. I've mentioned before my agreement with the Peter De Vries character who'd rather deal with an empty universe than with a universe under the sway of an omnipotent, omniscient Someone who sees horrible things happening and does nothing about them.
Schaeffer declares at the outset of Patience With God that he's writing for people who believe in God already, but are put off by nasty fundamentalists, bible-thumpers, preachers of hate. Preachers like Jesus, if he was paying attention. It seems that he's going for a feel-good religion that won't make many demands on him -- sort of like Philip Kitcher, an equally moderate, middle-of-the-road atheist. That's fine with me to a point: I too want to feel good, and I want everyone else to feel good too. But in the world Frank Schaeffer and I share, life isn't always like that. Whether you believe in gods or not, it seems to me that your engagement with life has to address the bad things as well as the good, the pain as well as the pleasure, the misery as well as the joy. I'm now curious to see if Schaeffer will grapple with the wholeness of life in this book, but it doesn't look promising. His main approach so far is to blame other people (and the occasional marginal biblical book) for everything that goes wrong, but any half-trained philosopher could tell him that won't work.
Behaviorism is a 20th-century scientific fad whose biggest self-promoter, the late B. F. Skinner, made a career of playing a caricature of the 19th-century materialist crank who declares dogmatically that if it can't be measured, weighed, etc., it's just superstitious nonsense. Behavioral psychology lives on, exploiting the principle beloved of evangelists, generals, and scientists everywhere: if we haven't delivered the Second Coming / total victory / a Grand Unified Theory, it's only because you haven't given us enough money and personnel! Dig till it hurts! Send in your contribution NOW!
Like some other Artificial Intelligence researchers, Rawlins subscribes to the behaviorist principle that "Ultimately it's behavior that matters, not form" (17). If it quacks like a duck, it is a duck, no matter how many superstitious, animistic mystics claim that it's a CD player.
But AI apologists have made one vital modification to the principle: where behaviorists refused to anthropomorphize anything, especially human beings, some AI folk are ready to recognize the full humanity of computers, yesterday if not sooner. Slaves of the Machine churns up a weird goulash of behaviorist rhetoric spiced with echoes from Abolitionists, the Civil Rights movement, even the anti-abortion movement. (Hence the pun on "quickening" in the book's subtitle: If a computer scientist was gestating a computer Beethoven, would you abort his research grant?) "Most computer hardware today isn't working all the time anyway; most computers are asleep most of the time, snoring away while they wait for us to give them something to do" (81). "So today's computer spends most of its time fighting back tears of boredom" (29). Dig deep into your pockets, friends, and give, so today's computer will never have to cry again.
"So what will our future be? Wonderful and terrifying," Rawlins crows (20). "Are we ready for a world of feral cars?... Some of these future machines will be so complex that they'll eventually become more like cats than cars.... Perhaps all of them will be insane to some extent. Perhaps we are too. Perhaps when your future toaster breaks down and refuses to toast your bread because it's having a bad day, you won't call an engineer or a mechanic, you'll call a therapist" (121).
"One day our artificial creations might even grow so complex and apparently purposeful that some of us will care whether they live or die. When Timmy cries because his babysitter is broken, ... then they'll be truly alive" (121)." Like the Velveteen Rabbit? Or Pinocchio? Rawlins knows, and says, that human beings are prone to personify inanimate objects. Children already become attached to inanimate objects like stuffed toys or security blankets; adults are not much better.
"As the children who accept them as alive grow up and gain power in the world, they'll pass laws to protect the new entities." And he sees this as progress? More like a regression into animistic superstition. But it's also not very likely. People don't pass laws to protect the civil rights of the security blankets or stuffed toys, or imaginary playmates, they loved as children either.
I'm sure there are people who would buy a robot "pet," a thing coated in washable plastic fur that eats electricity, doesn't need a litter box, never goes into heat, and can be neatly switched off when you go on vacation. It would alleviate another common problem: the cute little puppies that are abandoned when they grow up and aren't cute anymore. Rawlins would like us to have robots that are too lifelike to be dumped in the landfill, but why bother? We already have real pets.
Rawlins likes to think that robot "pets" and robot "people" will present new ethical problems. This is typical Jetsons talk. The ethical problems involved are quite old, and we've mostly preferred not to think them through. Since we've never done very well with them before, I see no reason to suppose that we'll do better when computers are involved. Other animal species and other human "races" have raised the very same ethical problems.
The easiest way not to deal with those problems has been displacement: the faith that on the other side of the next mountain there lives a "race" of natural slaves who are waiting to serve us, their natural masters. They will welcome our lash, set our boots gratefully on their necks, and interpose their bodies between us and danger, knowing that their lives are worth less than ours. Since there is no such natural slave race, the obvious solution is to build one. The question is not whether we can, as scientistic fundamentalists like to think, but whether we should. Rawlins is a bit smarter: he not only recognizes, he stresses that true machine intelligence would not be controllable. But he confidently declares that it's inevitable, which he needs to prove, not assert.
In fact he's so obviously wrong that I wonder if he's using the word in some arcane metaphorical sense. "Inevitable" suggests a downhill slide that requires all your strength to retard, let alone stop. On Rawlins's own showing, such machines will be built only if governments and/or businesses pour vast amounts of money and intellectual labor into reinventing the art of computer programming from the ground up, and probably also reinventing computer hardware from the ground up, in order to produce a totally new kind of machine that no one really wants anyhow: a "feral" computer that is unpredictable, uncontrollable, and "insane to some extent." I don't call that inevitable. It's an uphill climb if it's anything at all.
Like many scientistic evangelists, Rawlins accuses anyone who doesn't want to climb that hill of superstitious pride: "Some of us may resist the idea of ourselves as complex machines -- or of complex machines one day becoming as smart as we are -- for the same reason that we resisted the idea that the earth revolves around the sun or that chimpanzees are our genetic cousins" (120).
Most people who talk about this kind of sinful pride forget that "Man" was never at the top of the Great Chain of Being: that spot was reserved for the gods and other supernatural beings, like angels. Humility in our human status has always been an official value in the West, even if it has been honored more in the breach than the observance. In practice its main function has been to rationalize and reinforce human hierarchies (obey the King, as he obeys God). This alone should give us pause before we try to push machines to a rung above our own. James Cameron seems not to have considered in his Terminator movies that there would be human Jeremiahs urging their fellows to worship Skynet, the intelligent computer network that was trying to wipe out the human race, or at least to submit to it as the instrument of god's wrath, much as the original Jeremiah urged Israel to surrender to Babylon.
A kinder (if two-edged) analogy is the biblical tower of Babel. Yahweh smote its builders with confusion of languages, not because he was annoyed by their pride, but because he really feared they would build a tower that could reach Heaven. Just as there were still naifs in the Fifties and Sixties who opposed the space program because they thought the astronauts would crash-land at the Pearly Gates and injure St. Peter, there are probably those who oppose AI because they think these are mysteries in which "Man" should not meddle. But it's more accurate -- and despite its triumphalist rhetoric, Slaves of the Machine bears me out -- to point out that the Tower of AI will never reach Heaven. Inventing smaller faster microprocessors with extra gigabytes of memory won't produce machine intelligence -- not even if perfectly coordinated teams of programmers manage to write flawless million-million-line programs. Rawlins and his colleagues are not clear-eyed rationalists -- they're the builders of Babel who argued that with just a few more research grants, a few thousand more slaves, Babylonian newlyweds could honeymoon in Paradise.
GARBAGE IN, GOSPEL OUT
On Rawlins's own showing we're reaching the limits of computers as we know them, because programming for every contingency is impossible in any but the simplest tasks. He says we could program current hardware to be more flexible, if only programmers would stop living in the past, but he gives no reason to think so and numerous reasons not to think so. It seems more likely to me that we'd need new kinds of hardware too, kinds that aren't based on on-off logic switches -- and that hardware doesn't exist yet. Maybe someday it will, but Rawlins is already bowing down before it as if it did.
The most likely scenario Rawlins presents is that technogeeks will get funding for new hardware developments the way they always have: by promising governments, business, and the military faster, more compact, more flexible, but still absolutely docile electronic slaves. Instead they would produce (in the unlikely event they succeeded) machines with minds of their own. It isn't inevitable that they will get the money they demand, though. There's no reason why the public must pay the AI cult to build whatever its members want while promising something else. We can say No.
And on that somber note, our time is just about up. Let's take one last look at our computer flatworm experiment.... No, it appears that the sundered parts of our BFD supercomputer have not regenerated themselves. BFD promises us, however, that this problem will be corrected in the next version of the operating system, which will be released very soon. So thanks for tuning in, and join us next week for the next installment of "This Old Computer"!
(I wrote this back in the 90s, and while Artificial Intelligence no longer seems as trendy as it was then, I think a lot of the issues I talk about here are still very much with us.)
In the 1990s, or so we were assured when I was a lad, everyone would have his own personal helicopter, powered by its own clean, safe, and inexhaustible nuclear power pack. (Automobiles would be obsolete!) The skies would be full of commuters, all of them white males as all Americans were presumed to be in the 1950s, returning to the suburbs from their 20-hour-a-week jobs. (The work week would shrink, giving us all more leisure for home barbecues cooked on clean, safe, and inexhaustible nuclear-powered backyard grills!) Their wives would preside languidly over homes staffed by robot butlers and maids, each powered by a clean, safe, and inexhaustible nuclear power pack, whose excess heat could be used to distill the alcoholic beverages needed by 1950s housewives to dull the boredom and isolation of their suburban days.
Well, you get the picture: it's the Jetsons. But the Jetsons affectionately parodied visions of the near future that were discussed seriously in other media. Nowadays those failed predictions are a bit embarrassing: nuclear power has lost its glamour, few people like the idea of filling our skies with individual commuter aircraft, and economists lecture us that the relative affluence of the 1950s was a historical fluke, that we'll be lucky if our standard of living doesn't decrease. As I read computer scientist Gregory J. E. Rawlins's book Slaves of the Machine: The Quickening of Computer Technology (MIT Press, 1997) I often had the feeling that I had been transported into the past, the past of the 1939 World's Fair, the past of the Jetsons.
Rawlins plays the prophet with such gee-whiz gusto, in fact, that I'm still not sure he isn't kidding. The future is here, by golly, and you ain't seen nothin' yet! "From ovens to spaceships, from mousetraps to nuclear power stations, we're surrounded by millions of automatic devices each containing small gobs of congealed wisdom" (2). Today's portable computer is "half the size of Gutenberg's first Bible -- and perhaps as important" (21). Computers "twenty years from now will be vastly different from today's primitive mechanisms, just as a tiny computer chip is different from a barnful of Tinkertoys [5]... and in 20 years they'll be as cheap as paper clips" (28). "Because in the far future -- which in the computer world means one or two decades -- we may make computers from light beams. Sounds like science fiction, doesn't it? Well, in 1990 AT&T Bell Laboratories built one" (31). "In forty years, something bee-sized may carry more memory and computing power than all today's computers put together" (34). "But one day, what we call a computer may be something grown in a vat that will clamber out clothed in flesh" (33). "And, for the same money, by 2045 you could have 8500 million 1997 dollars worth of machine -- four times the power of the entire world's supply of supercomputers in 1997 -- all in your designer sunglasses" (34).
I suppose there's no harm in this sort of thing, but is it worth $25 for 135 clothbound pages? You can find equally solid predictions for a lot less in the checkout lane at your supermarket. The dust jacket copy of Slaves of the Machine touts it as an introduction to the wonderful world of computers, written in simple, colorful language that any cybernetically-challenged doofus can understand. I doubt that Slaves will reach its intended audience, though, not just because of its cost and lack of graphics, but because of its patronizing tone. As the examples I've quoted above indicate, Rawlins is more interested in dazzling than in explaining. Slaves of the Machine is more like an evangelical tract, full of promises and threats, than science writing. (But I may be drawing a nonexistent distinction there.)
One of Rawlins's pronouncements, though, caught my fancy: "Today, our fastest most complex computer, armed with our most sophisticated software, is about as complex as a flatworm" (19). In that spirit, we have in our studio a state-of-the-art supercomputer, courtesy of BFD Technologies, which I shall proceed to split down the middle with this fire axe. In the course of our program today we'll return to see if this complex marvel of human ingenuity can, like a flatworm, regenerate itself into two complete supercomputers, each with the memory and software of the original computer already installed! Meanwhile, we'll take a closer look at Prof. Rawlins's vision of the future of computing.
THAT'S NOT A BUG, THAT'S A CREATURE, I MEAN FEATURE!
Having begun by promising us unlimited horizons of computer power, Rawlins proceeds to explain How We Talk to Computers. Like a guide warning Sahib about the innate inferiority of the natives he's about to encounter, Rawlins warns the reader, "It's no use appealing to the computer's native intelligence. It has none" (42). (Remember that line.) I use the guide/Bwana/native image because Rawlins himself uses it, not just in his guiding metaphor of computer languages as "pidgins" but in his anecdotes.
Using a computer, he says, is like being driven around France in the 1940s by a surly driver named Pierre, who "obviously thought he was competing in the Grand Prix" and needed precise instructions, which were futile because "he was particularly dense. He understood only two words -- yes and no; or rather, since he refused to understand anything but French, oui and non" (44). But then, "around 1955, ... we got Jacqueline, a slightly more advanced translator. Of course, Jacqueline didn't speak English either, but she knew Pierre really well" (44f). Oo la la! Lucky Pierre! Gradually it became clear to me that Rawlins meant this conte as an allegory of the development of computer "languages," but even though I already knew this stuff, I found it hopelessly confusing. I can't imagine what a real computer virgin would make of it.
Rawlins chides us for wondering, "How could a mere machine understand a language?" Since he has just told us that French computers, at least, have no native intelligence, this seems a reasonable question to me, but Rawlins breezily assures us: "Still, without getting into philosophical subtleties, it's okay to say that a computer can understand lots of simple things, if we first explain them carefully enough using a small number of well-chosen commands" (45). With that, he's off with a basic course on computer programming: the IF-bone is connected to the THEN-bone, the THEN-bone's connected to the NEXT-bone, and so on. And I hope you were paying attention, class, because Rawlins moves quickly on to what really interests him: the cult of the Silicon God, and the care and feeding of its priesthood.
In ancient times (about thirty to forty years ago), "computers were costly, cranky, and above all, customized" (48). "In those days, even the vibration of a passing car or plane could crash the machine" (51). "No one could dispute the technical elite because no one -- often including the elite themselves -- understood the machines. What they did, how they did it, what they could do, these were all mysteries to almost everyone, including the biggest companies and the biggest names in computing today" (49). As late as 1972, when I took a FORTRAN class at IU, The Computer sat in its Holy of Holies, its sacred meals of punched cards fed to it by graduate students in Computer Science, and spewed forth the printouts of its oracles (which, in my case, consisted mostly of error messages).
Remember that even in those days, computers were advertised as "electronic brains." Not knowing what they did or could do didn't keep their advocates from making grandiose claims about them, which ought to make us suspicious about the no less grandiose promises Rawlins was making about the future of computing just a few pages back.
But ah, then "came the pidgin designers", who "made it possible to talk to the machine in a sensible way." But wait! "Today's computer pidgins are hard to learn and hard to use. All are cryptic, precise, and dense" (53). Contrasting crocheting instructions with computer languages, Rawlins concedes that "even the rankest beginner can learn to crochet a doily or scarf in a few hours, whereas it often takes computer beginners days or weeks before they can program anything interesting" (54).
Now, a pidgin can be used by ordinary people, by traders and the "natives" with whom they trade. Computer languages vary in their difficulty, but for the most part they still require an interpreter - the programmer - between the user and the "native." A closer analogue to a real "pidgin" is the mouse-and-icons interface of the Amiga, the Mac, and Windows, which enables people without extensive technical training to "tell" their computers what to "do."
But there's a more basic problem: Computers are not trying to communicate with us, any more than a thermometer is. Despite the common use of anthropomorphic language such as "think" and "feel" by their devotees, they do not think, or feel, or see, or communicate. Perhaps someday they will, but for now and the foreseeable future they don't, and any computer enthusiast who pretends otherwise is either lying or fooling himself. Computer "language" is also a metaphor, not a literal fact. In a very loose sense, it could be said that you "communicate" with a light bulb when you turn on its switch, and that is the only sense in which computer "languages" enable us to "communicate" with computers. Rawlins's extended metaphors are entertaining, but they are also misleading. I suspect that when he says that "computers" think, he is using "computers" metaphorically too: to mean, not today's computers, but imaginary computers which might someday clamber out of hydroponic vats but for now are only a gleam in Gregory J. E. Rawlins's eye.
Rawlins blames the impenetrability of computer "pidgins" on programmers themselves, "slow-witted, tiny-brained bunglers" (74) who are "lost in the past" (81), "like lawyers who are concerned only with law, not justice" (80). "While hardware keeps leapfrogging ahead every eighteen months, software is still lost in the dark ages of the 1960s. ...What we lack today isn't more hardware, but more imagination" (81). "[W]e don't let [computers] revise their procedure in the light of new evidence or insight. We never let anything new occur to them, and we deny them access to the history of earlier attempts" (97).
All this may be true, but it's still unfair to the programmers, who must write instructions for machinery that may be twice as fast and cost half as much, but still has no native intelligence. Programmers don't "deny them access to the history of earlier attempts"; computers aren't seeking such access. "The change to adaptive computer systems will come, as often these things do, by the actions of a few forward-looking software companies with the courage to take a risk and produce a new kind of software -- adaptive software" (81). In other words, Rawlins has no idea how to do it either, but he will accept funding from "forward-looking software companies."
What surprises me is that at the same time he's impugning the intelligence and integrity of programmers, Rawlins anticipates every objection I could think of. "Of course, there are financial and physical limits on how much we can improve today's computers" (35). "Year by year we're rapidly gaining power and, just as rapidly, losing control" (37). Faster, more powerful machines still must be programmed by human beings. "The more people we put on a software project, the longer it takes .... At current rates of progress they'll never produce flawless thousand-million-line programs" (69, 70).
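That last point is the one long associated with Fred Brooks's The Mythical Man-Month, and the arithmetic behind it is easy to sketch (my gloss, not Rawlins's): among n programmers there are n(n-1)/2 pairs who may need to coordinate, so the communication overhead grows far faster than the headcount.

    def channels(n):
        # Pairwise communication channels among n programmers: n choose 2.
        return n * (n - 1) // 2

    for n in (2, 5, 10, 50, 100):
        print(n, "programmers:", channels(n), "channels")
    # 2 programmers share 1 channel; 100 programmers share 4,950.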
On a deeper level, "We simply can't track all the myriad details involved in solving a very complex problem." This is not because we're tiny-brained bunglers, but because "no one knows how to design a sequence of steps that represents what we would call thinking" (105).
"The trouble is that just because something is hard for us it isn't necessarily hard for computers, and vice versa. We invented them to do something we're bad at: they can do arithmetic with unparalleled speed, reliability, and accuracy. So it's only fair that they're bad at things we're good at. The surprise is that they're bad at such apparently easy things" (111). Rawlins has evidently read, and taken to heart, such serious critics of Artificial Intelligence research as Joseph Weizenbaum, Noam Chomsky, John Searle, and the arch-heretic Dreyfus brothers, though he prudently names none of them, lest he be pronounced anathema by Douglas Hofstadter and Daniel Dennett.
Still, the fault lies not in our stars, Horatio, but in our wicked refusal to recognize that a computer's a man for a' that. But let's stop here for a moment to see how our split computer is coming along... hmmmm. No regeneration so far. Well, we mustn't be impatient. As Thomas Edison said, genius is 1% inspiration and 99% perspiration. We'll be right back with more about Slaves of the Machine, right after this message from Microsoft.
It begins with a video game, an entertainment I allow myself from time to time, slaying animated pixels with aging fingers dancing in keystrokes. We lose a match, and the avatar of an anonymous nobody types, all caps and exclamation point, "FAGS!!!!!!!faggotsFAgs!" Which prompts me to reply, "Do you mean to imply that the other seven of us are queer?" "FAGGOT!!1!," he responds.
When challenged and teased, the "anonymous nobody" explains:
"I'm not gay, dude. I don't mean your [sic] gay. I mean you m8s wer [sic]..." "...fucking weak ass players. I hate fucking loosing [sic]." And there it was, one of the new and very modern (perhaps post-PoMo?) meanings of the term. An analog to "bitch," which on top of meaning "breeding whore" and "woman as chattel," has also come to mean "weak ass" guy. Faggot, as Loser. Someone who accepts a lower rank. One who kneels and accepts the dominance of others.
I'm not quite clear about a few things here. Was Nobody one who lost? If so, then he's a faggot too. (Which may be why he attacked the others: to divert attention from his own failure.) More important, how does losing an online game -- or anything -- equal "accept[ing] a lower rank" or "accept[ing] the dominance of others"? These games are set up so that someone has to lose, so maybe anyone who chooses to play such a game (meaning normative, masculist males) is by this definition a loser and a faggot; which doesn't really compute. And in my observation, "bitch" means not "woman as chattel", but an uppity woman: a whore is a woman who'll have sex with you, a bitch is a woman who won't. "Bitch" used between males, of course, means "faggot," which brings me full circle: it's a status imposed by some males on others, not one "accepted" by the subjects. It's no fun otherwise.
I've long thought that one reason why mainstream boy culture reacted with such fury to the emergence of openly gay people in the 60s and 70s was not just that we'd hijacked the innocent word "gay," but that we'd taken ourselves out of their control. How can you keep queers in line by threatening to reveal our dirty secret if it's not a secret any more? How can you establish your superiority as a Normal Person by calling someone a queer if he or she laughs in your face? (One reason I object to Totalistic Safe Space is that it maintains the idea that I should be devastated by being called a faggot, and that I need an authorized person to protect me from devastation. The radical gay movement meant, as far as I'm concerned, that I could put bigots on the run myself with the help of other faggots -- I didn't need the assistance of Compassionate, Healing, Helping Services Professionals.)
What got my attention here, though, was Jack Crow's characterization of this usage as something new, even "Po-Mo." Um, no. Like the Western hemisphere, it was there all along, with people living in it. Richard C. Trexler's Sex and Conquest: Gendered Violence, Political Order, and the European Conquest of the Americas (Cambridge: Polity Press, 1995) is a gold mine of lore. For example:
A third clue to understanding this profound contradiction between cultural approval and political condemnation of penetration is the recognition that, even though activity was most seriously condemned by the law, ridiculing a subject or enemy as passive was a standard means of exerting power. That is, the sovereign not only suffered the timeless defamation that he was feminine because he could not defend the public against enemies. He exploited the cultural order’s privileging of activity by denouncing all others’ alleged passivity. An early Christian case is as explicit as one could wish, making use of the fabled effeminacy of clergy. In Gregory of Tours’ sixth-century History of the Franks, Count Palladius, in the presence of the king, hurls this insult against the ‘passive and effeminate’ Parthenius, bishop of Gevaudan. ‘Where are your husbands, with whom you cohabit in lechery and turpitude?’ This was as much as to claim that the celibate bishop had patrons in the saeculum who had active homosexual rights over him.
Anyone in a hierarchy (except maybe the person at the top) has others above him as well as below him. I suppose Count Palladius was trying to climb a rung or two higher by verbally putting Parthenius below him. It's unimportant whether Parthenius was actually "passive and effeminate"; calling him so is a move in the endless, tedious dominance games boys play. That it's "merely" verbal doesn't mean it isn't effective, of course: where actual penetration or violence can't be used, verbal insults are one way to work the pecking order. (And faggots play this game too, the more shame to us.)
In Dude, You're a Fag: Sexuality and Masculinity in High School (California, 2007), C. J. Pascoe reports how
In my fieldwork I was amazed by the way the word seemed to pop uncontrollably out of boys’ mouths in all kinds of situations. To quote just one of many instances from my field notes: two boys walked out of the PE locker room, and one yelled, “Fucking faggot!” at no one in particular. None of the other students paid them any mind, since this sort of thing happened so frequently. Similar spontaneous yelling of some variation of the word fag, seemingly apropos of nothing, happened repeatedly among boys throughout the school. This and repeated imitations of fags constitute what I refer to as “fag discourse” [59].
I shouldn't belittle the anxieties of adolescent males, struggling to construct selves in an uncaring, un-understanding world, but hell, why not? There's something seriously wrong here. If such boys just yelled "faggot!" at no one in particular, it wouldn't matter so much, but sooner or later they go after living targets, and it doesn't stop with words. And it doesn't help to claim that kids who are harassed or beaten up as fags aren't always actually gay, that they're just being assaulted as part of boy-culture dominance games -- that's a distraction from the fact that boys and men are being assaulted in the name of Manhood, because of the personal anxieties of their assailants. Those anxieties are expressed in the 'merely' verbal use of "faggot" no less than in the use of fists and feet. As Pascoe also noted:
Contrary to the protestations of boys earlier in the chapter that they would never call someone who was gay a fag, Ricky experienced this harassment on a regular basis, probably because he couldn’t draw on identifiably masculine markers such as athletic ability or other forms of dominance to bolster some sort of claim on masculinity [67].
When I was visiting Korea a couple of years ago, I wrote in email to a friend in the States:
I'm in an internet cafe (called a PC Room here) in my host's neighborhood in a suburb of Seoul. There are half-a-dozen Korean-American adolescents in here, making a lot of noise as they play video games. "You fucking faggot!" one yelled as he lost a round. That's how I know they're American. Very annoying. The place is full of Korean kids the same age, rapt in their gaming, but they aren't making noise. I suspect these kids have been sent over for summer vacation to visit relatives. ... Oops, another just yelled, "Move, nigger!" at a character in his game. Maybe I should go over, slap his head, and curse him in Korean. I can see boys like this growing up to become post-colonial academics, whining about how they've suffered from American racism.
Crow informs us at length that he's very liberal in sexual matters:
I don't care one bit about sexuality, so long as all the participants do their thing willingly. You can suck cock, or play model housewife, or wrap yourself up in bondage and submission, for all I care. You can do the auto-erotic, or the anerotic. You can partner up in a loving and monogamous gay marriage, or trip the vampy halls of amphetimated orgy parlors, seeking pleasure in bi- and tri and poly- hetero anonymity, and it matters naught to me.
I cannot wrap my head around wondering or worrying how other people copulate. I really can't. I cannot honestly imagine a more tedious and ridiculous waste of my time.
I have no damned fondness for kneelers. For people who take a knee. Who serve a master willingly. Who ply sycophantic crafts and the courtier's homage to power and might. For cops and prosecutors and mafia under bosses. For shop floor tyrants and people who vote for Democrats or Republicans, for Labor or Tory, for Christian Democrat or Revised Media Tinpot Fascist. Or, the people who work for them. Or the Democrats and Republicans, themselves.
For the bourgeoisie. For everyone and anyone who does obey, willingly. For people who choose to lose, because the alternative bears too heavy a cost. Because their dignity fails them, or they never had enough of it.
Stirring stuff. But girlene! What does this have to do with calling people "faggot" because you lost a round of an online video game? Jack explains. He is referring to:
Servants of power.
You know - faggots...
Well, no, I don't think so. For one thing, people in a hierarchy willingly embrace their position, because even if they're not at the top, they get some privilege (and reflected glory) from their patrons, and they're still above the losers below them. For another, Jack is still buying into the boy-culture fag-discourse dominance game, which also assumes that a man (or a woman) who is willingly penetrated is a 'servant of power', a punk, a whore, a bitch, a faggot. Which at best is a distraction: sexual penetrability has nothing to do with what Jack Crow is talking about, and the assumption that it does is not benign. What Jack is doing here is a more articulate, more sophisticated version of the online game-player's "faggot!" outburst -- working out his own anxieties about manhood and status by accusing others of what he fears. You gotta serve somebody, Bob Dylan sang, but you also gotta have somebody to sneer at because they're lower on the food chain than you.
Someone else commented on the post. (No permalink, but there are only six comments as I write this. Look for Charles F. Oxtrot.) First, it's all the fault of "political correctness."
Somehow America ran away from personal responsibility in the 2d half of the 20th Century --probably because personal injury litigation is a nice "driver" of profiteering for those who might not have a profitable work niche otherwise-- and the result is that we now have a babying culture, where political correctness has more evocative and persuasive power than real harm (or real threat thereof) for a great number of Americans.
Well, we all have our little agendas. As I pointed out, these games are not new; they have nothing to do with political correctness. The commenter went on:
It's a rare adult in America who hasn't had to suffer verbal insults and ego injuries at the words of another. Some of us may have known it only on rare occasions at schools or on playgrounds; others may have had a hard time escaping such attacks even in their own homes, from their own parent(s). Probably most of us fall between those ends.
Indeed, and I probably suffered less as a kid than most from fag discourse. But I was lucky.
Oxtrot went on to say:
Some subcultures in America have a group relational game that depends heavily on one's ability to creatively insult another. Lots of people call this game "The Dozens" but I've heard it called "ragging" and "jagging" as well. I'd imagine that most anyone who grew up as a part of that subculture knows how to see words as just words.
This is not a bad point; there are indeed games of ritual insults. Sometimes fag discourse is just such a game, but it is more than that: it also routinely extends to assaults on non-players. Some of us have suffered more than "ego injuries" from boy culture. But even where the abuse is only verbal, I want little and big faggots to learn self-defense so that they don't have to be helpless victims of other kids and adults. It's fun to watch the guys who just called me "faggot!" whine and complain of my meanness when I call them on their stupidity and viciousness; I think more people should have that experience.
To the extent that "faggot" and "whore" are used as metaphors for "kneelers," "those who obey," it's even more important to call out the people who use these words. The confusion of sexual receptivity with toadying and collaboration with the more powerful is itself part of the problem. I don't believe we will do anything about injustice and inequality by calling their perpetrators and collaborators by sexual insults. That just keeps the values of the system in place. Above all we must refuse to play by the fag-discourse rules, and we may hold those who do in contempt -- as long as we don't call them faggots ourselves.
P.S. Trexler also wrote, p. 109:
Various cultures have used sexual signs and gestures of subordination to express reverence toward their gods and lords. Indeed, only those ready to avoid the topic will be surprised that some corporal expressions of religious reverence, such as kneeling, bowing and prostration, remain formally close to certain sexual postures.
I'm about 60 pages into Emma Donoghue's new book Inseparable: Desire Between Women in Literature (Knopf, 2010), and enjoying it a great deal. Donoghue is best known as a novelist, but she's a scholar too, and a good one. Her earlier Passions Between Women (HarperCollins, 1995) not only taught me a lot but encouraged my own take on the history of sexuality.
In her chapter on inseparable romantic friends, Donoghue says that "opposites attract ... is one of our culture's most beloved truisms" (60).
The fact is, in Renaissance literature, love was very often thought to be based not on contrast but on similarity; the classical model was the romantic bond between two men. Like calls to like; birds of a feather flock together. There is an intriguing exchange in Honoré d'Urfé's L'Astrée (1607-1627), when a jealous man called Hylas asks sneeringly if "Alexis" (a man disguised as a priestess) finds shepherdesses more appealing than shepherds, and "she" answers proudly, "Have no doubt about it, and blame no one but nature, who wants everyone to love his own kind." When Leonard Willan dramatized d'Urfé's saga as Astraea (1651), he included a debate between a woman and a man, to be judged by their mutual object of desire, Diana. Phillis argues that same-sex passion is strongest because love grows from "Equality and Sympathy"; Sylvander counters that mating requires difference.
You plead th'advantage of your Sexe, as bent
To love sembable were natures Intent;
In Beasts see where her motives simple be,
Their preservations shall t'each contrarie [60-61].
What this shows, of course, is that both conceptions of love co-existed then, as they do now. (With good reason: both conceptions play a role in human loving.) It brought to mind some arguments I've seen about the value of same-sex erotic relationships, apart from the question of marriage. Some religious bigots have been arguing -- without any real basis for their arguments, of course -- that two men or two women can't form a couple as rich and rewarding as a mixed-sex couple, partly because men and women are opposites, or because mixed-sex coupling is an animalistic fact of nature. Orson Scott Card also argued for the superiority of heterosexual marriage because it was unnatural, because it's “very, very hard -- to combine the lives of a male and female, with all their physical and personality differences, into a stable relationship that persists across time.”
But even within the Christian tradition these are shaky positions. Neither Jesus nor Paul thought much of heterosexuality, not even marriage. True, Paul used marriage as a metaphor for the relationship between Christ the Bridegroom and the Church the Bride -- but not between Christ and the individual believer, who he thought would do better to remain single if possible. The individual believer was the slave of Christ, not his bride. And I don't see oppositeness as the guiding conception of the sexes in the Genesis creation myths. Whether you go with Genesis 1, which has us created male and female in Yahweh's image, or Genesis 2, where Eve was created from Adam's rib to be his helper, you won't find a meeting of opposites where men and women are concerned. The conception of sex/gender in the Hebrew Bible, at least, is more like the "one sex" model described by Thomas Laqueur in Making Sex: Body and Gender from the Greeks to Freud (Harvard, 1990).
Donoghue offers the story of Ruth and Naomi as an example of inseparable female friends. (The story of David and Jonathan is similar: "the soul of Jonathan was knit to the soul of David, and Jonathan loved him as himself" [1 Samuel 18:1-4].) Whether or not either relationship had erotic elements, it's significant that Ruth's famous words to Naomi have often been used in heterosexual weddings. If same-sex couples are so different from mixed-sex couples, and indeed inferior to them, why do heterosexuals keep using same-sex couples as their models?
The other day I noticed a copy of Monica Sone's Nisei Daughter on display at the public library. Originally published by Little, Brown & Company in 1953, it was reissued in 1979 by the University of Washington Press and is still in print. It turned out to be a loving re-creation of the author's Seattle childhood and early adulthood, including her family's internment in an Idaho concentration camp for Japanese Americans during World War II. Sone, who was born in Seattle in 1919, grew up in the hotel her parents ran on Skid Row, attended public schools plus Japanese school, and eventually became a clinical psychologist. I can't believe she remembered all the details she includes -- for example, her mother's tea set:
The tea set was stunningly beautiful with the uneven surface of the gray clay dusted with black and gold flecks. There was a wisp of soft green around the rim of the tiny cups, as if someone had plucked off grass from the clay and the green stain had remained there. At the bottom of each teacup was the figure of a galloping, golden horse. When the cup was filled with tea, the golden horse seemed to rise to the surface and become animated. But the tea set was only for special occasions and holidays, and most of the time we used a set of dinnerware Americana purchased at the local hardware store and a drawerful of silver-plated tableware [12].
Or one of her classmates at Japanese school:
Genji was a handsome boy with huge, lustrous dark eyes, a noble patrician nose, jet crew-cut setting off a flawless, fair complexion, looking every bit the son of a samurai. He sat aloof at his desk and paid strict attention to sensei. He was the top student scholastically. He read fluently and perfectly. His handwriting was a beautiful picture of bold, masculine strokes and curves [25].
But however she may have embroidered her story, it feels right, and much of it is wonderful, like her account of the time her mother, attending a gala Mickey Mouse Club party at which her children were performing, was mistaken by the reception committee for the wife of the Japanese consul. Or the crash-landing of her first romance, with the handsome and athletic Haruo. He gave her a photograph of himself when they graduated from grammar school and he went on to Franklin High. When they met again a year later in Japanese school, they were delighted until they stood up to bow to the teacher:
Before sitting down, I turned to smile at Haruo, but the smile froze on my face. I felt as if I were on the pinnacle of a mountain, looking down into Haruo's perplexed eyes. I was a good head taller than he. Haruo had not changed his appearance nor his height, but I had grown like Jack's beanstalk. I resumed my seat in a daze, my little world crashing down all around me [130].
One of the best things about Nisei Daughter, to my mind, is that it makes clear that Japanese families, even in Japan, were not all alike -- that a monolithic East/West divide just doesn't work. Nisei (second-generation, the offspring of Japanese immigrants) kids varied enormously in their conformity to traditional norms. Kazuko (Sone's Japanese name) and her siblings were noisy, rambunctious, and playful, while other kids at least put on a better face of standard Japanese manners. By depicting not only her own parents but other Japanese parents, Sone gives a varied picture of what it meant to be Japanese, and Japanese-American, in the first third of the twentieth century.
But there's also her account of her mother's search for a summer cottage near the ocean, which was blocked because Seattle whites wouldn't rent or sell to "Orientals." Or her grandfather's inability to come and live with them in the US because "In 1924 my country had passed an Immigration Law which kept all Orientals from migrating to America since that year" (107). Or:
One beautiful Sunday afternoon a carload of us drove out into the country to swim at the Antler's Lodge. But the manager with a wooden face blocked our entrance, "Sorry, we don't want any Japs around here."
We said, "We're not Japs. We're American citizens." But we piled into the car and sped away trying to ignore the bruise on our pride [119].
This is one reason why I become really angry when people idealize the America of this period as a time of happiness, safety, and harmony. Sone's account of the evacuation and internment is quietly furious and drips sarcasm like acid:
Once more I felt like a despised, pathetic two-legged freak, a Japanese and an American, neither of which seemed to be doing me any good. The Nisei leaders in the community rose above their personal feelings and stated that they would co-operate and comply with the decision of the government as their sacrifice in keeping with the country's war effort, thus proving themselves loyal American citizens. I was too jealous of my recently acquired voting privilege to be gracious about giving in, and I felt most unco-operative. I noticed wryly that the feelings about the Japanese on the Hawaiian Islands were quite different from those on the West Coast. In Hawaii, a strategic military outpost, the Japanese were regarded as essential to the economy of the island and powerful economic forces fought against their removal. General Delos Emmons, in command of Hawaii at the time, lent his authoritative voice to calm the fears of the people on the island and to prevent chaos and upheaval. General Emmons established martial law, but he did not consider evacuation essential for the security of the island.
On the West Coast, General J. L. DeWitt of the Western Defense Command did not think martial law necessary, but he favored mass evacuation of the Japanese and Nisei. We suspected that pressures from economic and political interests who would profit from such a wholesale evacuation influenced this decision.
Events moved rapidly. General DeWitt marked off Western Washington, Oregon, and all of California, and the southern half of Arizona as Military Area No. 1, hallowed ground from which we must remove ourselves as rapidly as possible [158-9].
That last sentence presages current Christian hysteria over the 'hallowed ground' of Ground Zero in New York.
Despite passages like this, Nisei Daughter has been seen by many critics as less than fully militant, even assimilationist. For example, in her pioneering Asian American Literature (Temple UP, 1982), Elaine H. Kim wrote that Sone's "testimony is made lightly, with good-natured humor and plenty of self-mockery, and it ends hopefully with what sounds like a high school civics speech" (75), though she added, "But the reader is left with a terrible uneasiness." I suspect that "the reader" will be left with different feelings depending on her situation: a white American who read the book on its first publication, for example, would probably feel differently than an American of Japanese extraction. Finally, though, Kim conceded:
The book is much more than it appears. It is not a chronicle of self-contempt. It is not an outraged protest book. But neither is it a cheerfully ingratiating document of Japanese American success through ability to "endure." It is rather the subtly documented story of the sacrifices demanded of the nisei by the racial exclusivity of American society, of a soul's journey from rage to shame, from self-assurance to uncertainty. At the end, Sone writes that "the Japanese and American parts of me were not blended into one," but somehow that statement remains unconvincing because the blend seems externally imposed, and everything, including the answers to Kazuko's unspoken questions, is left in limbo [80].
Fair enough, I guess, though I'm not sure I wouldn't add "and back again" to Kim's description of the story as "a soul's journey from rage to shame." And I've read enough immigrant literature to know that Sone's ambivalence is not unique to Asian-Americans. The "externally imposed" blending of the Melting Pot myth seems to have affected everybody since the Irish. But even so, it seems to me that Kim can't see the seething anger in the book simply because it doesn't have the words "WHITE DEVILS!" on every page.
I'd read so much about that "high school civics speech" that I was expecting a long patriotic lecture to close Nisei Daughter, but it's only a page or two. I suspect that the publisher required an upbeat, patriotic conclusion -- the tone of the concluding passages is off, and for all I know it wasn't written by Sone herself. (Has anyone ever asked her about it? She was still alive a year ago, as the video clip above shows.) 1953 was the height of the McCarthy Era, after all; I'm surprised that Nisei Daughter was published as it was, with so much of Sone's anger intact. In the preface she wrote for the 1979 reprint, she makes it clear that her anger and refusal to accept injustice remain undimmed.
A character in Marge Piercy's great Woman on the Edge of Time says emphatically, "Every piece of art can't contain everything everybody would like to say! ... Our culture as a whole must speak the whole truth. But every object can't! That's the slogan mentality at work, as if there were certain holy words that must always be named." Nisei Daughter isn't the whole story of the Japanese-American experience or even of the Internment, but neither is Joy Kogawa's Obasan, John Okada's No-No Boy, nor any other single work. You can't reduce the experience of thousands of people to one book or one narrator, and who would want to? Only those who don't really want to know, and would like to take what they perceive as bitter medicine in one noxious dose. As Edmund White says, canons are for people who don't like to read, and so want a minimal list of what they must struggle through. But Nisei Daughter is no bitter pill; it's a welcoming book, full of humor and humanity. I'm glad I read it, and very glad that it exists.