Monday, March 31, 2008

Faggot!

Every so often I hear straight men claiming that epithets like “faggot” don’t really refer to homosexuals. Rather, they say, such epithets refer to ineffectual men who can’t take care of themselves or anyone else, who can’t give a woman what she needs, men who are cowardly and despicable, men who aren’t Real Men. (I don’t have any links at the moment, though I’ll try to add some the next time I encounter the claim online. I think I’ve seen Eminem and some other rappers saying such things, and I've read similar rationalizations about maricón in Mexican culture. Someone is playing a racist variation on the game here. To see how homophobic epithets are actually used by normal red-blooded American males -- haw haw haw! I can't believe I wrote that with a straight face! -- read some of the comments to this video. I'm still trying to figure out why it generated such hysteria.)

Since “gay” became a schoolyard epithet, soon after we queers mainstreamed it as a more-or-less neutral, non-clinical term for ourselves, I’ve heard the same thing about it as well. It’s true, some of the people who say “that’s so gay” are gay-friendly at other times, have gay friends, and pay liberal lip service to gay issues. And since we did claim the right to use “gay” for ourselves over the protests of our generation of genteel homophobes, I suppose we can’t really say that it has only one fixed meaning, or that we can stop linguistic change from happening in this one area forevermore.

That might even be the best response to “that’s so gay”: to recognize and, as necessary, point out that in that context, it has nothing to do with either the pre-1970 “gay” (“Don we now our gay apparel, fa-la-la fa-la-la la-la-la”) or the post-1970 homosexual “gay” (Gay Pride Now!).

Still, I don’t think any gay man who’s ever been called a fucking faggot (which means pretty much all of us) will take this claim seriously. “Faggot” refers not only to despicable, ineffectual men of any sexual orientation, but to men who have sex with other men, because in masculist culture men who have sex with other men are assumed to be despicable, ineffectual, etc. -- and fucked, in various senses of the word. There’s nothing more horrible in the masculine imagination than being penetrated anally: it takes away a man’s manhood as effectively as castration. For a man to enjoy being penetrated, to seek out the experience, is not thinkable (even if it’s not unknown to the men who deploy homophobic epithets). Gay liberationists were correct that shouting one’s fagitude to the world was a powerful challenge to the male supremacist order; that’s why gay liberation is now history, and today’s gay movement ambivalently calls for gender conformity, except for its reliably successful drag fundraisers.

“Faggot” and its synonyms are the equivalents for males of “whore” and its synonyms for women. What the Faggot and the Slut (as mythic figures) have in common is that they have been penetrated, and are therefore polluted, unclean. In both cases, the target of the epithet may not literally have been penetrated: boys may be targeted because they don’t fit in with other boys, regardless of their sexuality, and girls ditto – a girl may be called a Slut simply because she’s begun to develop breasts earlier than her age mates. But the words are (I think this is the right use of the term) performative: by calling you a faggot or a whore, I symbolically penetrate you, establish my manhood, earn and reinforce my membership in the men’s house. (Girls call each other “slut” too.) There are some interesting books on the words for women, starting with Leora Tanenbaum’s Slut! Growing Up Female with a Bad Reputation (Seven Stories, 1999) and Emily White’s Fast Girls: Teenage Tribes and the Myth of the Slut (Scribner, 2002), but they don’t go deeply enough – I could sense the authors drawing back from the abyss. I don’t know of any books (or any significant writings at all) which deal with the words for men, though Richard Trexler’s Sex and Conquest: Gendered Violence, Political Order, and the European Conquest of the Americas (Polity Press, 1995) has some useful discussion, as does Geng Song’s The Fragile Scholar: Power and Masculinity in Chinese Culture (Hong Kong UP, 2004). There’s a lot more thinking to be done about this; I’m just trying out some ideas now.

Meanwhile, what about the males who say that “faggot” refers to somebody else, the cowardly, ineffectual, effeminate guys – and not to their Homo-American buddies? It’s tempting to point out that effeminate men, the sissies who got harassed and beaten up by the Real Men all their lives, are fundamentally tougher than any macho man – but that would be a mistake, partly because it plays into their ritual of competitive toughness and partly because at best it can only send the bullies off in search of someone they can still feel entitled to degrade as a not-man. That’s probably the core point right there: “faggot” does not say anything about the man who’s called one – it does say volumes about the fears and inadequacies of the men who use it as a token in their pathetic dominance games.

Sunday, March 30, 2008

Becalmed Among The Great Unwashed

You know, I don’t think I’m going to finish reading Susan Jacoby’s The Age of American Unreason. I imagine Jacoby feels better for having written it, vented her bile, and talked to the press about it. But I don’t feel better for having read the first eighty pages, so I’m gonna vent my bile right here.

As I expected, The Age of American Unreason is an extended and not very skillful game of “Ain’t It Awful.” In a way, it’s frustrating to read, because I do dislike most of the things she dislikes, but then I don’t need her to tell me about them. On the other hand, I don’t share her fury over the use of “folks”:
a plague spread by the President of the United States, television anchors, radio talk show hosts, preachers in megachurches, self-help gurus, and anyone else attempting to demonstrate his or her identification with ordinary, presumably wholesome American values. Only a few decades ago, Americans were addressed as people or, in the more distant past, ladies and gentlemen. Now we are all folks.
A plague? Darling, get a grip. Reading this, one wants to deliver a Hollywood-style hysterics-stopping slap upside Jacoby’s head, and wipe the flecks of foam from her quivering lips. Someone who gets as worked up over “folks” as over creationism, infotainment, and Larry Summers’s slighting remarks about lady academics – and, as far as I can tell, more upset than she gets about the US war in Iraq – needs to work on her priorities. (Three hundred years ago, Jonathan Swift threw a similar hissyfit over the word “mob”, which would never take the place of “rabble” in his heart. I agree with the writer Jay Quinn that it's a shame Swift didn’t win that battle, so we could talk today about rock stars being “rabbled” by their fans.) If she opposes the war in Iraq, it seems to be because of Bush’s belief that he is Yahweh’s instrument, not because innocent people are getting, like, hurt and killed there. There’s an odd lack of ordinary humanity in Jacoby’s jeremiad. (How can you worry about dying children when Americans are misusing apostrophes?)

Nor has she convinced me that things are that much different from the way they used to be, especially since her own evidence has a way of refuting her. She thinks that reactionary Christian religion is more influential in American life than it was in the 1800s, though she documents plenty of anti-intellectualism and Christian square-headedness from that era, which managed to flourish without the aid of today’s mass media. She brushes aside the Second Great Awakening with a sniff, to focus on an oration by Ralph Waldo Emerson to Harvard College’s Class of 1837. Emerson told his audience that “The mind of this country, taught to aim at low objects, eats upon itself” – making basically the same complaint then that Jacoby’s making now, only without videogames and Oprah. Ironically, the burden of Emerson’s oration was that it was time for American culture to stand on its own two feet, rather than leaning on Europe; Jacoby regards Europe today as a comparatively enlightened place where Christians don’t keep Darwin out of the schools. She also admits that “American freethought” was “never a majority movement,” which is probably putting it mildly; either way, the admission undermines her thesis that things used to be better.

Oh yeah – I asked a dozen or so undergraduates around the dorm where I work if they knew what Pearl Harbor was, since Jacoby told the New York Times that her book was inspired by overhearing, on the night of September 11, 2001, two yuppies in a bar who seemed to have no idea about it. Everyone I asked knew that the bombing of Pearl Harbor by the Japanese led to the US entry into World War II. Jacoby will be relieved to know that the coming generation of college students knows its history pretty well, even if that fact takes some wind out of her book’s sails.

Saturday, March 29, 2008

But Enough About You ...


This article – well, really it’s only a squib – by one Megan McArdle has been linked around the web, notably by IOZ (in a strong, eloquent post). It’s interesting to watch Ms. McArdle squirm:

Obviously, there are people who were right about the war for the right reasons, and we should examine what their thought process was--not merely the conclusions they came to, but how they got there. Other peoples’ opposition was animated by principles that may be right, but aren’t really very helpful: the pacifists, the isolationists, the reflexive opponents of Republicans or the US military. Within the limits on foreign policy in a hegemonic power, these just aren’t particularly useful, again, regardless of whether you are metaphysically correct.

“It won't work” is the easiest prediction to get right; almost nothing does. The thought process that tells you something probably won't work is not always a good way to figure out what will, even if you were right for the right reasons, as I agree lots of people were. That’s why libertarians have a great track record at predicting which government programs will fail (almost all of them) and a lousy track record at designing ones that do work.

On the other hand, “I thought it would work for X reason”, when it didn’t work, is, I think, a lesson you can carry into both decisions about what to do, and what not to do. On a deeper level, understanding the unconscious cognitive biases that lead smart and well meaning people to believe that things which will not work, will work, is a very good way to prevent yourself from making the same mistake.

It’s a repulsive performance, and while I’m tempted to say that it’s surprising to find it on the site of a liberal magazine like The Atlantic, I have to recall that The Atlantic also spotlighted Dinesh D’Souza’s right-wing tract Illiberal Education, running an excerpt before the book was published. Of the first few dozen commenters, most fault McArdle for thinking that the invasion of Iraq hasn’t worked, or insist that it would have worked if not for the Iraqis – which is probably the best refutation of her position one could ask for.

Notice, in the first paragraph I’ve quoted, how blithely she dismisses the “pacifists”, the “isolationists”, not to mention those who are “reflexively” opposed to the Republican party. I wonder who she has in mind. It’s so easy, and such a popular tactic, not to name names, so no one can quibble over the accuracy of the characterizations. But if someone argues nowadays that the Japanese should not have tried to take over Asia in the 1930s, is that “isolationism”? Does only a “pacifist” say that the Japanese should not have killed Our Boys at Pearl Harbor, or that al-Qaeda was wrong to destroy the World Trade Towers? American pundits and politicians never hesitate to make moral judgments on the actions of our certified enemies; it’s only the US whose motives are beyond question.

Next McArdle moves to the Realpolitik so beloved of mainstream liberals and conservatives alike: well, we live in a world of hegemony, so we have to work within those parameters, don’t we, and not be afraid to get our hands a little dirty. So, the question becomes something like: how can we effectively achieve our aims – never mind whether those aims are good ones? How could Hitler have gone about establishing hegemony over Europe, for instance, in a way that would work? When the Soviets crushed democracy in Czechoslovakia in 1968, is the only permissible question whether their hegemony worked? And how about China’s hegemony over Tibet? A Chinese Megan McArdle could explain that only an isolationist or a pacifist, surely, would deny China’s right to run that country as it wishes. The only question is whether Chinese methods will work, and if not, how to make them work.

As I remember it, American liberals who opposed the invasion of Tibet -- I mean Iraq, sorry! – mostly expressed the fear that “we” would get into another “quagmire” there, like we did in Vietnam. Gloria Steinem, for one, expressed that fear in a speech here at Indiana University. What about the Iraqis who might be killed by our bombs and artillery and white phosphorus, you ask? Who cares? No one’s going to accuse Steinem of pacifism or isolationism! There was debate in The Nation, too, about how comparable Iraq was to Vietnam, though a few knee-jerk anti-Republicans were allowed to express their reflexive rejection of hegemony in its pages.

One commenter at IOZ asked, “But did anyone opposed to the war intelligently warn what would happen if the US went in without a governance plan? I don't recall that being their message.” Gracious, so many demands here, demands that would never be made of supporters of the war – intelligence, for one. But leaving aside those who warned of a quagmire, there’s this article by Noam Chomsky, and all you have to do is browse around Counterpunch in the months leading up to the invasion to find numerous warnings that it would not be the cakewalk promised by the Bushites. Those predictions have mostly been borne out by events, too. But for the likes of Megan McArdle, the deaths of hundreds of thousands of Iraqis and the flight of millions more are of no account in themselves, only as signs of our doing our hegemony wrong.

But then there’s Pete Seeger, the granddaddy of privileged white kids learning folk music, blacklisted from American TV as a Red for many years until he appeared on The Smothers Brothers Comedy Hour in 1968. Seeger wrote a song called “Waist Deep in the Big Muddy” about the American experience in Vietnam. The Smothers Brothers bucked CBS censors so Seeger could perform this radical, cutting-edge political song on their show. The key offense, much as in a Stalinist state, was the song’s reference to “the big fool [who] says to push on,” widely taken to mean President Lyndon Baines Johnson. The song is about American soldiers “on maneuvers in Louisiana,” training for the Big One, WWII, who are nearly sucked down into quicksand because of the incompetence of their captain. If we take this song as it was meant to be, as an allegory of America in Vietnam, it’s notable that what menaces Our Boys is a force of nature – opposing human beings are conspicuously absent, to say nothing of napalmed children and slaughtered villagers. Seeger knew better, I hope. But that this pretentious song could have seemed extreme (or daring, depending on your point of view) tells me a lot about American hegemony, even among opponents of the US invasion of Vietnam. … A few years ago I happened on a Pete Seeger songbook at the library and began working through it, learning songs I hadn’t heard in years. I started to learn “Big Muddy,” but as I listened to the words I was singing I couldn’t go on.

I’m also reminded of a joke, which I first encountered in Leo Rosten’s The Joys of Yiddish but found again in Paul Breines’s very serious and important book Tough Jews. Some rabbinic students were drafted into the Tsar’s army more than a century ago, and much to their trainers’ surprise they turned out to be excellent sharpshooters. On the target range they never missed. But when they were put into battle, they refused to fire their guns. Their officers screamed at them, “What’s the matter? Why don’t you shoot?” They replied, “But those are real men out there, sir – if we shoot, we might hurt them.” Crazy pacifists!

Friday, March 28, 2008

Age Is Not Just A Number

I saw it again today on the Web, “Age Is Just a Number.” I guess this cliché makes some sense as a corrective to the idea that at each age you’re permitted to act a certain way and do certain things: dress like this but not like that, look like that but not like this, do this but not that, and so on. But beyond that, I think it’s dead wrong.

Aging has not, so far, been a big deal for me. My health, at 57, remains good. I’m now the oldest person in my department at work, but I think I’m virtually the only full-time worker there who isn’t on some kind of medication for physical or other ailments. I’m just now beginning to get enough gray in my hair to be noticeable, and I’m often told I look younger than my age – a trait that runs in my family, I think. When I met the mother of some of my Korean friends, she asked me how it is that I look so young. “I have no children,” I told her, and she nodded in agreement. But my parents, who did have children (four of us, heaven help them), aged gracefully too, and kept reasonably good health well into their seventies.

But even so, my body has changed and slowed down. It takes longer for cuts and other small injuries to heal. I can’t walk onto a track after a hiatus and run two miles in fourteen minutes, as I could do till I was in my 30s; I might finish one mile, but it would probably take me fourteen minutes by itself. I’ve put on weight gradually over the decades, despite moderate but obviously insufficient exercise and a non-sedentary job, and it won’t come off. I’m definitely less flexible than I used to be, and it takes more work now to try to fix that even a little. When I was in my twenties I frequently masturbated twice a day without having to strain; now it’s, erm, rather less often. (I make sure I have a few orgasms each week, partly so I won’t forget how, and partly because they’re good for prostate health.) I still have a powerful visceral reaction to the sight of attractive people, but that has little to do with the body; sexuality is primarily in the head.

Just on these points, it’s obvious to me that my age is not just a number – it’s written in my body. I try to attend to my flesh and let it, not the number of birthdays, guide me, but I don’t try to ignore its changes.

Now suppose that my brain, all memories intact, could be transplanted into a much younger body – what then? Well, those memories are important too. For me, the assassination of John F. Kennedy in 1963 is a memory, not something I read about in a book or saw on the History Channel. Ditto the Vietnam War, the assassination of Martin Luther King Jr., the impeachment hearings and resignation of Richard Nixon, the assassination of John Lennon, the election of Bill Clinton. That last is worth stressing just a tad: a college senior graduating this year would have been about six years old when Clinton was elected – roughly the age I was when Dwight Eisenhower was re-elected in 1956. I remember the fact of the election, but nothing of the campaign or the issues. The 1960 elections, between Kennedy and Nixon, were the first I paid much attention to. And just think – the September 11th attacks happened over six years ago. There are children in their first year of school who were not born at the time, and children just a few years older for whom they are at most blurry memories. Soon they too will be history.

Similarly, Beatlemania, the Summer of Love, Woodstock, disco, punk – all these are memories for me, and I still have most of the records I’ve bought since the 1960s. The Beatles, the Stones, Bob Dylan, the Supremes, the Four Tops, and so on are not what I grew up hearing on my parents’ scratched vinyl or my older brother’s CD player. I can remember when all of it didn’t yet exist. To say nothing of the fact that I was eighteen, freshly graduated from high school, when I read about the Stonewall riots in the Village Voice, a week or so after they happened.

These cultural and historical events and changes – and much more – are written in my body too. I carry them around with me, in an invisible balloon in my head. They are the sea in which my mind swims, the block of ice or amber in which I am encased. I’d be a different person if I’d been born ten years earlier, or ten years later, in ways I can’t even imagine. None of this is in any way a complaint, or a claim that I’m hermetically sealed off from people older or younger than I am. It is merely to say that age is far more than a number. In many ways it’s a quality rather than a quantity, but however you look at it, the difference it makes is real and not an illusion.

Thursday, March 27, 2008

Prepare To Be Boarded!

I'm mighty tard (Old Hoosier for "tired") tonight, so I'm not going to say much. I was thinking of writing something about Alexei Panshin's 1968 sf novel Rite of Passage, which I just reread, but I quickly found there was more to say than I felt like saying tonight, so I'll just show you the picture above, which I suddenly remembered when I read a reference to a slide rule in Panshin's book. It's by the popular sf artist Kelly Freas, and it makes me want to try to track down the Murray Leinster story it illustrates, because I feel sure it isn't totally serious.

I think I first saw this cover reproduced around twenty years ago in a book called something like The Science in Science Fiction, which pointed out the most amusing detail in it: the slide rule, once as essential an accessory for technogeeks as a pocket protector or a ham radio license, now totally replaced by the pocket calculator and the microcomputer. Yet it never occurred to most sf writers that computers would be miniaturized. Even the Noble Engineer Heinlein, famed for his technological prophecies, had his far-future starship crews swearing by their trusty slipsticks. The obsolescence of the slide rule clenched in his teeth like a cutlass makes Freas' space pirate even campier.

Wednesday, March 26, 2008

Wilde In The Streets


Here's another review for Gay Community News, published in the January 15-21, 1989 issue. The caricature above is by Max Beerbohm, a younger contemporary and friend of Wilde's who outlived him by more than half a century.

Oscar Wilde

by Richard Ellmann
New York: Alfred A. Knopf, 1988
680 pp.

Oscar Wilde's London: A Scrapbook of Vices and Virtues, 1880-1900
by Wolf Von Eckardt, Sander L. Gilman, and J. Edward Chamberlin
Garden City: Anchor Press/Doubleday, 1987
285 pp.

The Oscar industry grinds on, and its two latest offerings demonstrate the range of its products’ quality.

The idea behind Oscar Wilde’s London is a good one. “This book is not about Oscar Wilde,” the authors assert in the Introduction. “It is about the city that made Oscar Wilde.” If, like me, you’re a bit vague on the actual conditions of late Victorian Britain, a social history sounds like just the thing to help understand how Wilde perceived himself and was perceived in his day. Biographers fill in quite a bit of this background, but there are many details -- such as the fact that when Wilde arrived in London in 1879, electric street lights were just beginning to be installed there -- which don’t belong to biography proper but help to understand its subject.

The best thing about Oscar Wilde’s London is its illustrations, particularly the many photographs, most of which are so clear and sharp they might have been taken yesterday. Not just of the famous, they include some fascinating pictures of daily life by one Paul Martin (see pp. 19-20, 94), whose work I’d like to know better. The text is less impressive. The chapters on London’s growth, on the poor, and on sports and popular entertainment are pretty good. But the book seems rather poorly organized. It offers no information on how the three authors divided up the writing among themselves, and at times I had the feeling that it had been pasted together too quickly. Topics are sometimes dropped almost in the middle, with the outcome of one or another controversy omitted as though everyone knew it. There are also some odd errors which suggest a lack of care in fact-checking. The message on the infamous visiting card left for Wilde by the Marquess of Queensberry, which led to Wilde’s downfall, is quoted here as “To Oscar Wilde posing as a sodemite (sic)” (73). Queensberry did indeed misspell the key word, but I’ve always seen it rendered “Somdomite”, and had thought the error was almost as well-known as some of Wilde's epigrams. (According to Richard Ellmann’s new biography, the actual message was “To Oscar Wilde posing Somdomite”.) I felt that the connection with Oscar Wilde was too tenuous, more of a marketing hook than a unifying principle for the book. Still, Oscar Wilde’s London is worth a look, and it includes a long reading list which should be useful to anyone who wants to explore the subject more thoroughly. See if your library has it.

The late Richard Ellmann completed Oscar Wilde just before his death in 1987, and while it is neither as exhaustive nor as definitive as his famous biography of James Joyce, this new biography is notable for its warmth, good judgment, and good writing. It is the least homophobic of any book on Wilde by a straight author that I’ve seen: not just free of amateur psychoanalysis but a bit disdainful of that popular biographical perversion, and downright scornful of the hypocrisy which destroyed Wilde's life and career. Nowadays we ought to be able to take such an attitude for granted, but unfortunately it’s still rare enough that Ellmann deserves notice for it.

Ellmann, in fact, writes as an unabashed fan of Wilde, and this makes his book even more refreshing. He has many touching stories to tell about Wilde’s generosity and kindness (see especially pp. 412-13), even in areas where other biographers turn up their noses: “What seems to characterize all Wilde’s affairs is that he got to know the boys as individuals, treated them handsomely, allowed them to refuse his attentions without becoming rancorous, and did not corrupt them” (390). He praises Wilde’s defense of ‘Greek love’ at his trial: “For once Wilde spoke not wittily but well.” Ellmann also credits those courageous souls who helped Wilde when he needed it most. Frank Harris, who is often portrayed (not entirely without reason) as a major buffoon in books about Wilde, has a shining moment of humanity that makes up for a lot of silliness. Believing that Wilde had not committed the acts of which he was convicted, Harris arranged to borrow a yacht to smuggle him to the Continent. When Harris told Wilde of the plan, “...Wilde broke out and said, ‘You talk with passion and conviction, as if I were innocent.’ ‘But you are innocent,’ said Harris, ‘aren’t you?’ ‘No,’ said Wilde. ‘I thought you knew that all along.’ Harris said, ‘I did not believe it for a moment.’ ‘This will make a great difference to you?’ asked Wilde, but Harris assured him it would not” (468). There are people today who couldn't rise to so much humanity. By way of contrast, the painter Sir Edward Coley Burne-Jones “hoped that Wilde would shoot himself and was disappointed when he did not” (479).

There is one area where Wilde’s generosity failed, however, and since no one ever seems to comment on it, I'd like to. Ellmann seems not much bothered by the clear indications that Wilde married because he needed money and public proof of heterosexual normality; though Wilde was charmed and attracted by Constance Lloyd, he doesn’t seem ever to have taken her seriously. He evidently began to neglect her almost at once, first for his rounds of socializing and travel, then for the young men who occupied his real sexual and romantic interest. After Wilde’s downfall, “Paul Adam, in La Revue blanche of 15 May 1895, argued that Greek love was less harmful than adultery” (482). But Wilde’s love for Alfred Douglas was adulterous, to say nothing of all those hardened little hustlers to whom he was apparently rather kinder than he was to his wife and children. While he was in prison, a reconciliation was arranged which Ellmann seems to think could have succeeded, but it was forestalled by the return of Douglas and by Constance’s death in 1898. I don't doubt that Wilde was so grateful for his wife’s willingness to forgive him that he really believed he loved her, and would change his ways forever. But I also don’t doubt that once he’d regained his freedom, he would have allowed boredom to set in. Despite this, Wilde doesn't come off badly compared to his heterosexual contemporaries -- how many of them went to prison for marrying money or neglecting their wives? -- or to many gay men and lesbians before and since who’ve made the mistake of marrying heterosexually to get a hostile society off their backs. The more so if Ellmann is correct that Wilde had no overt sexual experience with men before his marriage, and some experience with women; that’s a classic formula for disastrous self-deception.

It’s unfortunate that Wilde was unable to pick up the pieces of his life and career after his imprisonment. He had a social conscience, encouraged by his Irish nationalist mother, and had done some interesting political writing; he wasn’t quite the mindless butterfly he sometimes pretended to be. As we watch around us the ominous rise of the same forces that destroyed him, he no longer seems as quaint as he did in the 1970s, and his life has much to teach us. Ellmann’s biography is probably the one to read, and now that it’s out in paperback it’s the one to own: humane, learned, affectionate and smoothly written, Oscar Wilde is a model of the biographer’s art.

Tuesday, March 25, 2008

Atheists Say The Darnedest Things!

Strange. I’ve been encountering an unusual number of misinformed remarks about religion by atheists recently. Maybe I’ve just been in a meaner, crankier mood than usual?

Try this one, from the late Arthur C. Clarke: “Science can destroy a religion by ignoring it as well as by disproving its tenets. No one ever demonstrated, so far as I am aware, the nonexistence of Zeus or Thor, but they have few followers now.” (The site attributes it to Childhood’s End, one of Clarke’s novels, so I presume it’s spoken by a character, not directly by Clarke. Ordinarily it’s not wise to assume that characters speak for their authors, but the blogger who posted the quotation to his own site evidently did, so I’ll go along with him.)

It’s true, Zeus and Thor have few followers now, but the credit (or blame) doesn’t go to science; it goes to a certain rival cult, which achieved its supremacy not by ignoring rival gods but by imposing itself by force, up to and including violence.

Actually, the first thing that popped into my head when I read Clarke’s remark was drapetomania. Discovered in 1851 by a white American doctor named Cartwright, drapetomania was a disease that caused African-American slaves to run away from their masters. As far as I know, no one ever demonstrated scientifically that runaway slaves were not sick, but few would claim now that they were. I’m not saying that Clarke would have accepted the existence of drapetomania, only drawing the parallel to show that proofs and demonstrations are not necessarily relevant.

Clarke was never one of my favorite sf writers anyway, but he finally annoyed me terminally with a remark in the afterword to 3001. After patronizingly expressing affection for his religious friends (some of his best friends are Buddhists and Jews and Christians and Hindus! isn’t he liberal?), he purrs: “Perhaps it is better to be un-sane and happy, than it is to be sane and un-happy. But it is best of all to be sane and happy.”

Maybe Clarke would have accepted the existence of drapetomania after all. There is no reason I know of to believe that most religious believers are “un-sane.” The complacent assumption that most people are crazy while the assumer is the only sane person is not exactly a sign of perfect sanity, however. It’s certainly not a sign of rationality to try to discredit another person’s beliefs by questioning their sanity. I’ve observed that tactic used by Christian apologists against agnostics and atheists, and of course as a gay man I’m well acquainted with the secular medicalization of unpopular life choices.

Try this comment by another atheist: “Religion starts from the assumption that an ancient text or tradition is true, and seeks to reconcile observed reality with the text.” Well, no it doesn’t. We don’t really know how religion started, but most religions are not based on sacred texts -- Greek and Roman paganism, for instance. Judaism was a novelty in that respect (though its texts were a relatively late development compared to the sacrificial practices, purity rules, and festivals that were its core – and these also changed over time), followed by Christianity and Islam. Christianity started from current events – Jesus’ career as a miracle-worker and preacher, culminating in his death by crucifixion and the claims by his followers that he’d been raised from the dead – not from ancient texts or traditions. Early Christians appealed to the Jewish scriptures to justify their new cult, but they neither took them literally nor based their claims on the texts: rather they interpreted the texts with amazing elasticity to force them to conform to the sacred events. (Nothing in the Hebrew Bible, for example, predicts that the Messiah would be crucified and rise from the dead.)

At around the same time as the Christian New Testament was coalescing, rabbinic Judaism codified its legal rulings into a new compilation, the Mishnah, again partly as a result of historic events: the destruction of the Jerusalem Temple, which ended the sacrificial cult. This forced the reinterpretation of text and tradition to conform to reality, not vice versa.

Ancient texts and traditions weren’t always ancient. Once they become authoritative, they are used by their adherents in a complex way, both influencing believers and being influenced by them. The reinterpretation of texts reconciles the texts with observed reality, trying to make them fit present-day needs.

Finally, the blogger at whose site I found these quotations (except the one from 3001) wrote, as an example of the way that religion is a “fixed system,” unlike science: “The Catholic Church isn't going to ‘adjust’ or ‘self-correct’ their version of God based on conflicting ‘evidence’, whatever that might be; for them he is and will always be the omniscient creator of everything in the universe, and the ultimate answer to every question.” I can’t imagine Western science ever adjusting its basic approach to understanding the workings of the universe, namely trying to explain those workings without appealing to divine or other supernatural agency; that’s a given, though it was arrived at fairly gradually over the past 350 years or so. But even within the Roman Catholic tradition, the understanding of God has changed over the past two millennia. Augustine, for example, used Platonic ideas; Aquinas used Aristotle and other philosophical authorities.

The Church would probably claim that its understanding is indeed “self-correcting” (a popular, if dubious buzzword among scientific apologists these days). On less central issues, like slavery or Christendom’s relation to competing sects, Christian positions have changed quite a bit over the centuries. From the New Testament we know that widely divergent understandings of Christ coexisted and were in conflict from the earliest days of the churches. Outsiders had little or no input into these internal controversies, so I suppose their progress could be described as self-correcting.

It simply isn’t true that religion is a fixed system. As individuals, people change their religious beliefs in ways ranging from wrestling with personal fears and conflicts by interacting with tradition, to joining a new denomination or converting to a different religion – or abandoning religion altogether. Such changes may be affected by thinking about Copernican or Darwinian theory, but they may also take place entirely within a framework of religious thought. Believers sometimes want you to think their beliefs are fixed and solid, but it’s odd to find atheists taking them at their word. Nothing human is fixed and solid, and a look at the history of religious belief and practice will show that religion is no exception to the rule.

Monday, March 24, 2008

Wishin’ and Hopin’ and Thinkin’ and Prayin’



(The wedding imagery here reminds us that the Church is the Bride of Christ, and if you imagine the third-person pronouns with initial capitals [“Wear your hair just for Him … You won’t get Him thinkin’ and a-prayin’ …”] you have quite a kinky little hymn on your hands.)

My text today, dearly beloved, is from H. Allen Orr’s review of Philip Kitcher’s Living with Darwin, on marketing strategies for atheists:
Too often, the New Atheism forgets to make its humanism humane.
Wow. Where did Orr (or Kitcher) get the idea that religion is humane? One objection I have to Dawkins and the other “New Atheists” (as I’ve said before, I’m always suspicious of talk about “New” anything) is that they are basically secular avatars of the old-fashioned hellfire and brimstone preachers. Orr quotes Kitcher from Living with Darwin:
Often, the voices of reason I hear in contemporary discussions of religion are hectoring, almost exultant that comfort is being stripped away and faith undermined; frequently, they are without charity. And they are always without hope.
Orr agrees with Kitcher in criticizing the New Atheists for the lack of hope in their message, for their apparent glee in, as they imagine, stripping away other people’s illusions; but such has always been the method of revivalist religion, and these boys are revivalists. Theoretically there is hope of salvation for those who heed the Christian message, but the congregation members who reportedly fainted on hearing Jonathan Edwards’s 1741 sermon “Sinners in the Hands of an Angry God” don’t seem to have been reassured. The gospels’ Jesus taught (Matthew 7:13-14) that only a few would find the narrow path to salvation, so the overwhelming majority of humanity would be damned, and there is no hope for those who die unsaved. The parable of the Rich Man and Lazarus (Luke 16:19-31) is striking in its callousness toward the damned. I think it was the historian of religion Jonathan Z. Smith who wrote somewhere that the Good News of Christ was, and is, Bad News for most people.
Christianity, the religion I know the most about, has never been concerned with the tender feelings of believers in competing sects. Whether the competition was Jews, pagans, or (ever since the beginnings of the cult, as the New Testament shows) other Christians, the rhetoric has been vitriolic and unforgiving. Hatefulness is endemic to Christianity, though not specific to it, and the New Atheists tend to come across as reincarnations of the ancient Christian heresy-hunters, seeking out and denouncing those who wickedly stray from their version of truth, only with the sectarian elements (and learning) stripped away. I see little to choose between Jonathan Edwards and a sodden, bleary-eyed Christopher Hitchens.
Kitcher is evidently trying to play the Good Atheist Cop to Dawkins’s, Harris’s, Dennett’s, and Hitchens’s Bad Atheist Cop. I doubt it will work, since despite his Christian upbringing Kitcher doesn’t seem to understand or really empathize with religious believers any more than the Bad Cops do, nor does he really offer any hope. Maybe he thinks that someday, someone will come up with some from somewhere. Maybe they’ll cook it up in a lab, a newer genetically-modified Enlightenment™ hope that will enable people to get over the death of a loved one or the diagnosis of a painful terminal disease without the troublesome, addictive side effects of the old pre-scientific religious hope.

I’m not saying that I understand religious folk either. I’d think it would be easier for atheists like Kitcher, who were raised in religious families and only later broke away. I had no religious upbringing, and realized fairly early in life that I felt no need to believe in gods. I suspect that some of my attitude is temperamental (meaning that I have no idea where it comes from). Somewhere I read about a movie, Pete ’n’ Tillie, based on a Peter De Vries novel, in which a married couple (played by Carol Burnett and Walter Matthau) suffer through their child’s death from leukemia. One of the parents says something to the effect that it’s less painful to believe that there is no god, that no one is watching Up There, than to believe that Someone is watching but does nothing. That's exactly what I think, but I know that many other people, perhaps most, would rather believe that Someone is up there, weeping great salt tears over our pain and feeling it with us. (And no, I'm not talking about this guy.)

Yet religion is not synonymous with hope. Many atheists have died without fear, and many theists have died in terror of what might await them. Nor does every religion even offer the hope of an afterlife. Judaism, for one, has never been much concerned with the idea; reincarnation, while it promises some kind of survival, doesn’t offer the happy dream of reunion with one’s loved ones in an eternal Sunday afternoon. The philosopher Ludwig Wittgenstein wrote in his Tractatus Logico-Philosophicus (6.4312), and again I agree:
Not only is there no guarantee of the temporal immortality of the human soul, that is to say of its eternal survival after death; but, in any case, this assumption completely fails to accomplish the purpose for which it has always been intended. Or is some riddle solved by my surviving for ever? Is not this eternal life itself as much of a riddle as our present life?

Sunday, March 23, 2008

It's Literally Turtles All The Way Down

(Cartoon from Baldo, via Literally, A Web Log)

But back to literalism. I’ve finished reading Philip Kitcher’s little book Living with Darwin, and he has it all wrong. The Book of Genesis has him in a tizzy, but fundamentalists and creationists don’t even take Genesis literally.

I mentioned before that there are two different creation stories in the first two chapters of Genesis. In chapter one, the sequence of creation goes roughly like this: on the first day God creates light and darkness; on the second day, he creates a dome called the sky; on the third day he creates the dry land by separating the waters into the sea, then creates vegetation; on the fourth day, he creates lights in the sky, the sun and the moon. (How did he create light and day without the sun, I hear you ask? Don’t ask.*) On the fifth day he creates the animal kingdom: birds, fishes, and sea monsters; on the sixth day, he creates land animals, and finally human beings, both male and female, in his own image, and gives them dominion over all other living things. On the seventh day, famously, he rested.

Chapter two recounts that in the day when Yahweh created the heavens and the earth, before he created the plants, he formed Adam from the dust of the earth and breathed life into his nostrils. Then Yahweh created a garden in Eden, with the trees of Life and of the Knowledge of Good and Evil in it and a river flowing out of it. Then Yahweh, deciding that Adam should not be alone, created the animals, and brought them to Adam to see if any was a suitable companion for him. None was, so he put Adam to sleep and, famously, created Eve from his rib. They were both naked, and were not ashamed. No passage of time is specified here, and since it’s clearly a fable (notice the allegorical Trees), it’s possible that the writer supposed that everything here happened in one day.

Now, these two stories are completely different in sequence. Serious fundamentalists know full well that a literal reading won’t work, so they have to work pretty hard to harmonize them. Non-fundamentalist scholars have concluded, on the basis of this and other evidence in the Biblical text, that Genesis is actually the product of pasting together different source materials into a literary patchwork. (Richard E. Friedman’s Who Wrote the Bible? is a good introduction to this approach.) Kitcher never addresses this matter; perhaps he doesn’t consider it important, though I think that if you want to address fundamentalist positions, it’s more relevant than throwing Darwin at them. At one point Kitcher complains that the creationist and Intelligent Design positions consist more of trying to find flaws in evolutionary theory than of presenting an alternative; so why not play their own game, and show the problems in their own reading of the Bible? But that would be, like, hard. In any case, this example alone is evidence that “literalism” is not really the issue.

But wait -- there’s more in Genesis 3. The usual Christian interpretation, common to fundamentalists and less conservative Christians, of the events leading up to Adam and Eve’s expulsion from Eden is that Satan, in the form of a serpent, teased Adam and Eve into eating the fruit of the Tree of the Knowledge of Good and Evil. Yahweh had warned them that if they did so, they would die. Satan told Adam and Eve that they would not die but become as gods, knowing good and evil – absurd, of course: how could mere humans become like God? But they ate the fruit, and immediately felt the shame of their nakedness. For their disobedience, Yahweh threw them out of Eden and cursed them to labor and suffering, and sentenced Satan the serpent to crawl on his belly forever.

This is not what Genesis 3 actually says, however. A literal reading is instructive. There’s no hint that the crafty serpent is Satan, a figure who doesn’t show up until much later in the Hebrew Bible (or Old Testament, as Christians call it); it takes a nonliteral reading to find Satan here. Then, when Adam and Eve have eaten the forbidden fruit and are cowering in shame, and after Yahweh has cursed them to dirt and suffering and enmity with the serpent’s tribe, Yahweh reflects: “See, the man has become like one of us, knowing good and evil; and now, he might reach out his hand and take also from the tree of life, and live forever” – and only then does he drive Adam and Eve from Eden. The text is clear that Yahweh had lied, for they didn’t die when they ate from the forbidden tree; and it is explicit that by doing so, they had become as gods. For some reason Yahweh feared this, so he cut them off from the tree of life. This is more like, say, Greek mythology, where the gods try to keep knowledge from human beings; Yahweh is not a moral figure in this tale. Again, non-fundamentalist biblical scholars generally recognize this, but anyone reading the text can see it. A literal reading produces unacceptable results, so most Christians (apart from scholars) ignore the literal meaning of the story and understand it differently. Conservative Christians seem not to have minded, or even noticed, the departures from Scripture of Mel Gibson’s The Passion of The Christ; the letter of the text is not all that important to them, except when it becomes an excuse to draw a line in the sand over some other issue.

The model of the universe in the Hebrew Bible, with its domed sky hung with lights, is, from what I’ve read, not “supernatural” at all but based on ancient, mostly Babylonian, astronomy. The New Testament borrows from the astronomy of its own time, with the planets moving in concentric spheres or heavens. (“Seventh” heaven is the highest of these.) To insist on thinking of the universe in this way is not so much believing in the supernatural as in outdated science, like believing in humors or phlogiston. But then, popular astronomy books today contain pictures of the planets strung like beads on circular lines around the sun, called “orbits.”

Notice that the importance of a literal reading of Genesis 1-3 has nothing to do with the “supernatural” (another of Kitcher’s buzzwords, along with “spiritual religion” and “the enlightenment case”), though there are supernatural elements here and elsewhere in the Bible. The inconsistency of Genesis 1 and 2, or of the different accounts of Jesus’ birth in the gospels of Matthew and Luke (which Kitcher does discuss), is not due to the supernatural; it’s due to different, irreconcilable story lines.

Even when ordinary believers believe in the supernatural, it doesn’t mean they don’t also believe in an everyday material world where the supernatural doesn’t usually intrude. They know that if you jump off a tall building you’ll probably die. They know that virgins do not usually become pregnant, and if one of their single daughters were to try to claim the Holy Spirit as the father of her unborn child, they would become extremely hard-headed skeptics. Stanley Tambiah says that the anthropologist Meyer Fortes “once invited a rainmaker to perform the ceremony for him for an attractive fee, and the officiant in question replied ‘Don't be a fool, whoever makes a rain-making ceremony in the dry season?’” (Magic, Science, Religion, and the Scope of Rationality, Cambridge, 1990, p. 54). It seems that often it’s the modern, scientific rationalist who takes such things literally, while the believers think about them differently. The actual function of the supernatural in human belief is still being studied, and while it may be comforting to dismiss the “superstitions” of those who believe in different foolish things than we do, it’s not a sign of wisdom.

Things are not so different in the present-day secular world. For example, a lot of American women believe in the existence of “snuff films,” in which a woman is actually killed on camera during or after performing sexual acts, and believe that such films are available under the counter at adult video stores everywhere. There is no evidence that such films have ever existed, but the arguments to show that they do exist (as well as the vehemence with which the arguments are made) are reminiscent of arguments made by fundamentalists to preserve the inerrancy of the Bible.

What I find chilling is that the people who make this case would rather believe that thousands of women are being killed every year to make snuff porn than that they aren’t. I’d say the same about other people who want to believe that thousands of children are kidnapped each year in shopping malls and sold into sex slavery. There are many other similar legends/fantasies in circulation today, from the "Paul Is Dead" scare to the claims that the Bush administration executed the 9/11 attacks, to the belief that Saddam Hussein was behind the 9/11 attacks, to “blood libel” legends which accuse Jews of sacrificing Gentile children and using their blood to make Passover matzos. These have nothing to do with the supernatural, nor with literalism. As with magical rituals, it’s interesting to speculate about the personal and social functions of such beliefs, but they shouldn’t be taken literally. (There’s an interesting discussion of this issue in Pamela Donovan, No Way of Knowing: Crime, Urban Legends, and the Internet, Routledge, 2004.) I suspect that the “supernatural” is a secondary issue here, not the real crux of the problem.

All of this swoops under Kitcher’s radar. I think that if we secularists don’t stop imposing our own misunderstandings on the religious and if we believe that by rejecting religion we are somehow immune to the lure of the legendary, we’re going to get nowhere in a hurry.

More later on Kitcher’s notion of “spiritual” religion as a superior alternative to “supernatural” religion.

*There's an old joke about the seeker who asks someone (St. Augustine, maybe?) what God did in the eons before he created the heavens and the earth. The old teacher snaps: He made Hell for people who ask questions like that!

Saturday, March 22, 2008

¡Me No English!

Now I understand why so many professional writers insist on writing every day, no matter what: it’s so easy, if you take one day off, to let it slide for another, and then another. So I’m back at the keyboard now, before I slip further. There’s so much to write about anyway. I’ve finished reading Living with Darwin, and I have a lot to say about it, as well as about Obama’s big speech on race and what people have said about that, and more.
But today I stumbled on this bit, from AOL news: a Philadelphia restaurant was vindicated by the city Commission on Human Relations over its display of two signs ordering patrons to “SPEAK ENGLISH” when ordering, because “THIS IS AMERICA.” (The article reports, inaccurately, that the signs say “PLEASE SPEAK ENGLISH.” A “please” would be better business practice, don’t you think?) The Commission ruled that the signs do not violate the city’s Fair Practices Ordinance.
The restaurant’s owner claims “he never refused service to anyone because they couldn't speak English”, and that “he posted the signs in October 2005 because of concerns over immigration reform and an increasing number of people in the area who could not order in English.” Whose “concerns,” exactly? And if he never refused service to anyone, what is the point of the signs, except to establish that he’s a bigot?
There have always been people in urban America who couldn’t speak English, like Italian or Polish grandmas brought over by their relatives. Or they might be spouses, following a husband who came here to study. The offending immigrants in the restaurant’s neighborhood are presumably Asian or Latin American, and I suspect I’m looking at a venerable American tradition where an older wave of immigrants despises the wave after them for allegedly refusing to assimilate. (A commenter at The “Blog” of “Unnecessary” Quotation Marks says, “Apparently this same restaurant has been there for many decades, but back in the 1960's they would only serve you if you spoke Italian! Same intolerant attitude, just a different language.” I don’t know if it’s true, but I wouldn’t be surprised.)

I also wonder what constitutes “speaking English” in this context. I once had a Japanese boyfriend who spoke excellent English, and who switched credit-card companies because the customer service people claimed that they couldn’t understand him. It’s been shown that if people see a “foreign”-looking person, they’ll hear an accent in her speech even if she speaks perfect Standard English. (This was tested by playing the same audiotape, but showing the subjects two different photographs of the person supposedly speaking on it. One photo showed a blonde Caucasian, the other an “Asian,” and the subjects were much more likely to hear an accent if they were shown the latter photo.) I imagine that a brain which hears an accent where there isn’t one will simply shut down and refuse to parse an actual accent.
So it’s quite likely, I believe, that the immigrants who so offended the owner did speak some English, but spoke it too slowly or badly to suit him. I’ve known quite a number of Latin American immigrants, and they all speak some English and want to learn more. None has ever expressed the view that learning English was not important. But if you work ten to twelve hours a day in the kitchen of a restaurant, it’s not easy to find time to take classes or study. (And I live in a nasty ol’ PC liberal college town, whose university offers some free English classes, mainly for students’ wives but open to anyone who wants to take them.) There’s a myth often retailed by the “This Is America” frothers, that their ancestors came here and learned English, so why can’t these newcomers? But most of their ancestors did not learn English. Their American-born children did, but learning a new language, no matter how motivated you are, is hard work, and it’s harder the older you are. In any case, the arrival of several thousand non-English-speaking immigrants (we don’t really let in that many, you know, and the larger numbers of illegals are here because American business wants them here) is not a threat to our civilization, such as it is.
I do agree that if you’re going to visit a foreign country, let alone move there, you should try to pick up as much of the language as you can. (Just saying “hello” and “thank you” in the local language will generally win a tourist points.) But I’ve talked to too many Americans who think that learning foreign languages is for foreigners. I remember in particular a man with an advanced degree in business who declared that foreigners coming to the US should know English, and if he went to other countries, the people there should know English to speak to him, because English is the dominant language in the world today. (Especially in business.) That he should befoul his tongue with foreign jibber-jabber was, to his mind, outrageous Political Correctness. I hope he lives long enough to see a day when he loses business because he doesn’t speak at least a little Chinese. The simple, brutal rudeness of his attitude still boggles my mind. The international reputation we Americans have for arrogance is not, alas, completely unearned.
As for Geno’s Steaks in Philly, I agree that they should be allowed to post their signs. But I also think that no decent human being should eat there. According to the AOL article, there’s a rival steakhouse across the street, and if they don’t have a sign demanding English in their window, it should be easy enough to get a good sandwich there. I also wonder if Geno’s, like most American restaurants, has any employees back out of sight in the kitchen who don’t speak more than minimal English…
In the university food service where I work, there are a fair number of Latino/a workers, all of them legal, who speak English with varying degrees of fluency. They’re dispersed among the dining halls during the school year, but during the summer you’ll find several of them in the same building. I enjoy this, if only for the chance to practice my Spanish. One summer day during break time there were several of them speaking Spanish in the dishwashing area, along with me and one other Hoosier, a well-meaning but still often obnoxious young man who has been known to make fun of the halting English of his coworkers. (In addition to the Latinos, we also have a number of Albanians and West Africans, among other nationalities.) After watching our coworkers talk for a minute, he blurted out, “Hey! This is America! Speak English!” and looked at me for support. I grinned back at him and said, “Lo siento, Señor, no hablo inglés” -- “Sorry, sir, I don’t speak English.” He looked crestfallen, as well he should have.
I’ve used the same move a couple of times since, on other people in other places around town. Americans are just going to have to grow up. I cut no slack even to the relatively uneducated, like the fellow I just mentioned, since in most of the world it’s not uncommon for uneducated people to be able to get around in two or more languages. Americans have just been spoiled by our relative isolation, though the successive waves of non-English-speaking immigrants we’ve enjoyed for the past couple of centuries render even that excuse non-operative. I favor a more aggressive stance by Americans of good will towards our fellow citizens who freak out at the sound of a language that isn’t English, let alone English with an accent. ¡Abajo Geno’s! (Down with Geno’s!) Run ‘em out of business.

Thursday, March 20, 2008

Living With Literalism

Philip Kitcher’s Living with Darwin: Evolution, Design, and the Future of Faith (Oxford, 2007) is a nice little book. Kitcher, a philosopher who has written well-known critiques of both creationism and sociobiology, now takes on Intelligent Design. He does a good job of explaining why natural selection is the best theory we have for the origin and extinction of species, and he’s even decently modest in the claims he makes for science as a mode of knowing. Despite the help he acknowledges from various philosophers and other scholars, though, he gets into trouble on the religious issues. My main beef is his reliance on the straw man of “biblical literalism.”

He tries to hedge just a little: “For many of those who want an alternative to Darwinism, however, novelty creationism is not enough. They would remain shocked by a science curriculum that implied that any (nonpoetic) part of the Bible cannot be taken as literal truth” (page 20). “Nonpoetic” won’t quite cut it, especially since Kitcher doesn’t explain which parts of the Bible are poetic and which are non. Is the Sermon on the Mount poetic? The Nativity Stories? The killing of Goliath by David? The book of Acts? The letters of Paul? Only 20 pages in, and poor Professor Kitcher is already in over his head and sinking fast.

He backs himself up with an endnote (page 170, note 19):
As I have discovered, some well-educated people find this statement incredible. They suppose that nobody takes all the (nonpoetic) parts of the Bible as literal truth. Their reaction is surely based on the fact that all the religious people they know adopt nonliteralist strategies of reading the scriptures. In fact, as any survey of evangelical Christian literature reveals, literalism is extremely important to many Christians. This is apparent not only in the books written in support of “scientific creationism” … but also in the King James Study Bible (Nashville, TN: Nelson, 1983). The Study Bible begins its section on interpretation by reminding the reader that “the Bible is God’s infallible, inerrantly inspired Word” (p. xxiii), and concludes a note on the opening of Genesis with the declaration that “the biblical account of Creation clearly indicates that God created the world in six literal days” (p. 6).
Kitcher seems to think that “infallible, inerrantly inspired” means “literally true.” He’s wrong. Yes, the King James Study Bible declares its belief that Genesis describes “six literal days” of creation -- but that is not anywhere close to taking all “the (nonpoetic) parts of the Bible as literal truth.” I think it’s revealing that this is the best -- at any rate, it’s the only -- evidence Kitcher provides to support his claim.
Biblical inerrancy is quite another doctrine. It’s a fairly mainstream belief, which conservative evangelicals share with the Roman Catholic Church. And in order to preserve the Bible from error, it’s necessary to interpret the Bible quite non-literally – in one famous example, by interpreting the six “days” of creation as epochs running to thousands or millions of years.
Some basic points:

1. I’ve never heard of a Christian denomination that claimed to take the entire Bible literally. I did once encounter an individual Christian who claimed she did, but when I asked her what she did with passages like Matthew 19:12 (become a eunuch for the kingdom of heaven), Matthew 5:29 (if your eye leads you to sin, pluck it out), or Mark 10:21 (sell all you have and give to the poor), she backtracked immediately: well, of course you can’t take the whole Bible literally! I didn’t mean you should take those verses literally!

2. At the other end of the spectrum, the most loudly non-literalist Christians known to me believe that Jesus literally lived in Galilee in the first century, roamed around teaching and gathering disciples, and finally died on a cross in Jerusalem. A few maverick scholars have argued that Jesus was really a mythical figure with no literal existence; Kitcher should look at the scholarly reaction to their work to see how important “literalism” is to non-evangelical, even quite liberal Christians.

Where Christians differ is in which parts of the Bible to take literally and which to interpret figuratively. Once again we see someone distorting a difference of degree into a difference of kind, opening up a gulf between people who are not really that different from each other. A philosopher should know better than this.

3. Non-literal interpretations are not necessarily correct. Often they’re used to get around passages that are false or otherwise embarrassing. For example, three of the gospels report that Jesus predicted he would return with power before the generation then living had died. Since this is false, it can’t mean what it says, so Christians have found various ways to interpret it so that it is no longer false. Albert Schweitzer, who was not only a humanitarian doctor (the Mother Teresa of his day) and a Bach specialist but a New Testament scholar, argued in The Quest of the Historical Jesus (first edition 1906; first English translation 1910) that Jesus expected the Kingdom to come right away, and so was mistaken. In the mid-20th century, a Congregationalist scholar, C. H. Dodd, developed a theory of “realized eschatology” in The Parables of the Kingdom (1935; revised edition 1961), which reinterpreted Jesus’ teachings to say that the Kingdom of God had already arrived, so no Second Coming was necessary and Jesus was right. Understandably, a number of respectable theologians who didn’t like to think of Jesus as a wild-eyed apocalyptic preacher embraced Dodd’s interpretation, but it doesn’t seem to have held up well. Schweitzer’s general argument remains strong, but it’s a stumbling block for many Christians who want Jesus to be inerrant, so scholars continue to try to find ways around it.

Or consider Jesus’ teachings about the family. Though he opposed divorce, he didn’t mind if his followers abandoned their families to follow him, and the gospels show him at odds with his own family. When his mother and brothers came to see him in Mark 3, they couldn’t get through the crowds around him, and Jesus brushed them off. Who are my mother and brothers? he asked rhetorically. These (meaning his disciples and other followers) are my mother and brothers; whoever does the will of God is my mother and brothers. On another occasion (Matthew 8:21f) Jesus forbade a disciple to return home for his father’s funeral, ordering him to leave the dead to bury the dead. He also said (Matthew 10:34ff) that he had come not to bring peace, but to bring division, to set people against their families and their families against them, and that anyone who came to him and did not hate his family was no disciple of his (Luke 14:26).

This was reasonable (if not particularly attractive) behavior for an apocalyptic cult, or for any new cult that needs to lure converts away from established religions to build up its own numbers. So some interpreters borrow Schweitzer’s arguments just long enough to get rid of (or at least explain away) Jesus’ anti-family teaching: well, Jesus thought the world was about to end, so of course he had a sense of eschatological urgency and thought people had to get rid of anything that might interfere with their salvation. Of course Jesus was mistaken in thinking that the world was about to end, but these sayings express Jesus’ sublime sense of eschatological urgency and trust in God – but that was then and this is now, so of course you shouldn’t hate your family! These, again, are non-fundamentalist Christian arguments, the kind of thing I found in the work of distinguished theologians like Rudolf Bultmann.

I suspect that the reason Kitcher wants to brush aside the importance of Biblical inerrancy among fundamentalists is that he wants to draw a sharp line between bad, low-class literalist Bible-thumpers and good (or at least not-so-bad) decent non-literalists. (He even thinks the Gospel of Thomas is the neatest thing since sliced bread. Very trendy!) So far (I’m about halfway through Living with Darwin), Kitcher doesn’t mention that there are two different creation stories in Genesis, which, if read literally, contradict each other thoroughly. (There are also different versions of Noah and the Flood, which differ on many points.) Fundamentalists nowadays try to harmonize the two creation stories by putting the verbs in Genesis 2 into the past perfect tense, so that it describes God creating Adam and then, in a flashback, the other things he had created before. It doesn’t work very well, and the only English translation I’ve ever seen that supports this interpretation is the New International Version, a fundamentalist-friendly version that often distorts the text for theological-apologetic reasons.

I’m not sure that Kitcher’s distortion of this basic issue hurts much of his argument. Does it really matter if someone takes all the Bible literally, or just some of it? Maybe not for the purposes of a discussion of evolutionary theory, but still, it grates on me every time Kitcher talks about “literalism,” which he does fairly often. It may matter more when I read the rest of the book, in which he’s going to address the role of religion in a scientific world. He’s trying to be nice, to distance himself from angry cranks like Richard Dawkins, but if he can so fundamentally misunderstand the people he’s trying to talk to, he’s not going to reach them. But I suspect too that this book is, in this respect, like early Christian defenses of the faith which were ostensibly addressed to Roman rulers: it’s highly unlikely that anyone but Christians ever read them at the time, so they mainly made Christians feel good, but didn’t have much effect on non-Christians who were skeptical about the new cult. Similarly, Living with Darwin seems unlikely to reach conservative evangelicals. Judging from the customer reviews on Amazon, it’s being read mainly by non-believers who want to feel good about rejecting Intelligent Design, and can then say, Take that, you superstitious literalists, you!

More on this, I hope, after I finish the book. I'll try to add some links later, too, with more information for those who want it.

Wednesday, March 19, 2008

The World As I Found It

Another book review for Gay Community News, published in the January 15-21, 1989 issue.

No one seems to dispute it now, but as far as I recall, there was controversy over the fact of Wittgenstein's homosexuality into the 1980s -- maybe even as late as the publication of Duffy's novel. And it was philosophers who were throwing hissy fits over it, though I've read enough philosophy that I should know better than to assume that philosophers are rational. Generally philosophers like Bertrand Russell, while hostile to religious anti-sex teachings, thought that homosexuality was the result of religious repression, and that buggery would wither away along with the church as the Enlightenment advanced. I must try to find again a book I once stumbled on, published by Pelican Books in the 1960s, which claimed to develop an atheistic philosophical view of sex. It was antigay, though of course in a compassionate way: no jail for us, maybe even no electroshock or chemical castration, just an enlightened form of pitying contempt.

The World as I Found It
by Bruce Duffy
Ticknor & Fields, New York, 1987
$19.95 cloth, 546pp.

Well, I'm afraid that The World as I Found It is a bit of a disappointment. Till recently there was no full biography of Ludwig Wittgenstein (1889-1951), the most influential philosopher of this century, so the prospect of even a novel about him excited me a little. Aside from his professional importance, Wittgenstein was one of the more interesting eccentrics of our time. Born in Vienna to a wealthy Catholic family (converted from Judaism), haunted by the suicide of an older gay brother, Wittgenstein was a wanderer all his life. He won the interest of the great mathematician Frege, studied with Bertrand Russell at Cambridge, then dropped philosophy to join the Austrian army in World War I. During the war he wrote his brilliant and mystifying Tractatus Logico-Philosophicus, then abandoned philosophy as soon as it was published to teach schoolchildren in an Austrian village. But friction with the villagers forced him to abandon that project, so he returned home long enough to design and build a house for his sister, then went back to Cambridge to teach philosophy for the rest of his life. The only other work he intended for publication, the Philosophical Investigations, appeared posthumously, but recent years have seen a flood of publications culled from his notebooks and from his students’ lecture notes.

It's certain that he was gay, though his love life was intensely problematical; as far as I can gather, he had heavy Platonic crushes on his students, but whether any of them ever reciprocated I don't know. He also had many endearing quirks, such as a fondness for Mickey Mouse cartoons and detective stories; and his former student Norman Malcolm recalls how Wittgenstein, while visiting, insisted on helping Malcolm's wife with the dishes. So Wittgenstein certainly seems a suitable subject for a novel, and everything from its cover blurb to the Library of Congress Cataloguing Data announces that The World as I Found It is about Wittgenstein.

And quite a bit of it is, but there is a frustrating amount of space -- seemingly about half the book -- devoted to Wittgenstein's Cambridge colleagues Bertrand Russell and G.E. Moore. Even worse, while The World as I Found It sheds little light on Wittgenstein's sexuality, it tells far more than I wanted to know about Russell's, including many tediously lengthy accounts of heterosexual copulation. Don't get me wrong; some of my best friends are heterosexual, but in a novel about Wittgenstein these interludes seem rather pointless digressions. Duffy writes well, a neat journeyman's prose, and The World as I Found It is very much worth reading, but I'd be happier if the author had dared to go deeper into the mind and heart of his alleged star, and spent less time on the supporting cast.

Monday, March 17, 2008

Wilt thou not once vouchsafe to hide my Will in thine?

I’m most of the way through the second season of Will and Grace on DVD, thanks to my public library. (There was an essay by Edward Rothstein in the New York Times today, lamenting the “democratization” of public libraries -- nothing new there! -- because the New York City library system is having an event at the Fifth Avenue branch, built around video games. “Nothing in this event embodies the slightest hint of cultural aspiration, except the library’s own aspiration for a wider public.” Much “Ain’t It Awful?” about the decline of something or other. I do expect my public library to have Dickens and Austen and Shakespeare and Woolf, and it does. But it also has embroidery patterns, cookbooks, graphic novels, and videos, as it should have. The whiner in the Times, by the way, gave no evidence that the New York libraries were going to throw out their literary classics and stock only Mortal Kombat and Grand Theft Auto. Can’t something be done about these alarmist cretins?)

Until I began watching it on DVD, I’d never seen Will and Grace. I really don’t watch television. But I’d been hearing about the show for years, especially in connection with such vital issues as Jack’s “stereotypicality” and Will’s lack of a love life, so when I found the first season DVDs on the shelf at the library last year, I decided to see for myself. And I was surprised: it was reasonably well-written and very well-acted, and it made me laugh a lot. That might be because it comes from what I suppose I must call a Gay Sensibility: even though it must, by the nature of broadcast TV, be accessible to a straight audience, it still assumes that audience to be gay-friendly and willing to identify with gay characters. When Jack and Will make jokes built on gay stereotypes – as gay men do, frequently – the show assumes that the audience will laugh with them, as well as at them. Apparently TV audiences did so, for years, and that is no small achievement. Yet I had to make almost no adjustments while watching: there is very little earnest preaching about how We’re just like You, sort of, except for our adoration of Cher and Britney and our tendency to frame our faces à la Judy at least once per episode. I found Will and Grace easy to watch, such appealing brain candy that I’ve been watching each disc of six episodes in sequence, like bonbons. Just one more episode tonight won’t hurt, I keep telling myself as I proceed. (If only I had a divan and a peignoir.)

What I particularly wanted to address here was Will’s love life. Contrary to what I heard from everyone, he does have one, more so in Season Two than in Season One as I recall. He goes out on dates about as often as Grace does, and we see the men he’s dating, and the implication is clear that he has sex with them. So far I don’t recall having seen him in bed with one, even for post-coital conversation (though Grace has been seen so with her SNAG boyfriend Josh), and that is a problem, though not a serious one for me in itself. Ditto for kisses, and that is a minor problem. One episode in Season Two has Will lock lips with Jack as they protest the failure of a TV sitcom in their world to show two men kissing. It’s a highly self-referential kiss (look! we’re protesting the absence of kisses in our sitcom by showing a kiss!), and there’s no romance in it at all. I still have one disc and several seasons to watch – did things improve? No, don’t tell me, I’ll find out for myself.

Maybe most bothersome in this connection is the episode where Will is having a recurring nightmare of Grace coming into his bedroom and the two of them making love. It’s actually more than we’ve seen Grace do with any of her straight dates, and it would be a lovely scene – if only it were not in a gay sitcom. I love the way the two of them come together so intimately, but it only highlights the absence of such scenes between men on the show. There have been male-male love scenes of equivalent beauty in films, mostly foreign or small-budget independent, and since I don’t watch TV anyhow, I’m not complaining too much. And now, if you’ll excuse me, I'd like to munch on a few more Will and Grace bonbons tonight.

Sunday, March 16, 2008

A Touch Of Class

I’m still reading through the work of Raymond Williams, which is going to take a while. He’s always readable, but I keep coming across ideas and arguments which are unfamiliar to me, which slows me down. And that’s fine: I take a lot of pleasure, and also relief, in finding that I’m still learning new things as I get older. (And also, I confess, frustration that there will not be time to learn everything I need and want to learn.) I’m getting through Politics and Letters slowly, slowly. But here’s something I found in the posthumous collection What I Came to Say (London: Hutchinson Radius, 1989). It comes from an autobiographical essay in which Williams describes his arrival at Cambridge University as a scholarship boy from Wales in 1939.

In this and other ways, over the first week, I found out what is now obvious: that I was arriving, more or less isolated, within what was generally the arrival of a whole formation, an age-group, which already had behind it years of shared acquaintance, and shared training and expectations, from its boarding schools. I was reminded of a conversation my father had reported to me, from his advance visit. The porter had asked him, rather haughtily, whether my name was already down. ‘Yes, since last autumn.’ ‘Last autumn? Many of them, you know, are put down at birth.’ I try to be charitable, and find it easier now. But I remember sitting on the benches in hall, surrounded by these people, and wishing they had been put down at birth. There was little personal difficulty or dislike, but the formation was easy to hate – is still easy to hate – and I have to record that I responded aggressively. The myth of the working-class boy arriving in Cambridge – it has happened more since the war, though the proportion is still quite unreasonably low – is that he is an awkward misfit and has to learn new manners. It may depend on where you come from. Out of rural Wales it didn’t feel like that. The class which has dominated Cambridge is given to describing itself as well-mannered, polite, sensitive. It continually contrasts itself favourably with the rougher and coarser others. When it turns to the arts, it congratulates itself, overtly, on its taste and its sensibility; speaks of its poise and tone. If I then say that what I found was an extraordinarily coarse, pushing, name-ridden group, I shall be told that I am showing class-feeling, class-envy, class-resentment. That I showed class-feeling is not in any doubt. All I would insist on is that nobody fortunate enough to grow up in a good home, in a genuinely well-mannered and sensitive community, could for a moment envy these loud, competitive and deprived people. All I did not know then was how cold that class is. That comes with experience.

The first thing this passage makes me think of is the Harry Potter books. Harry also arrives at Hogwarts alone, though he’s quickly adopted by the Weasleys, and is preceded by his reputation as the boy whom Voldemort tried and failed to kill. Unlike Harry, who was raised by his hostile human relatives in isolation from other wizards, Williams grew up in a very self-aware working-class community. His father (unlike Harry’s) was alive and able to instruct him, having participated in the General Strike of 1926, and labor organizing and activism were in the air Williams breathed as a child and adolescent. It was Williams’s excellence as a scholar, rather than his parentage (the counterpart of Harry’s birth as a Wizard rather than a Muggle), that won him his place at Cambridge; and thanks to his background Williams was much less conflicted than Harry about his outsider status: he never seems to have felt pressured to become an insider.

Williams’s description of the upper-class students at Cambridge will be familiar to Potter readers; he could easily be describing the Malfoys and the others who felt it was their right to rule. I recognize them too, as a subset of the students I encountered at the large state university I attended, and where I still work. Though my family was working-class, like most Americans we thought of ourselves as middle-class, so I didn’t have Williams’s identification with a working-class community and its traditions of organizing and struggle; my parents were pretty apolitical, and since we weren’t churchgoers either, I grew up with little sense of community outside my family. (Considering my background, I find it interesting that when I came out, getting involved in gay politics and activism seemed like the natural course for me, though in the end I never did that much, and I turned out more like my parents in this respect than I like to acknowledge.) So it took me a long time to realize that a large part of my feelings of alienation came from class differences. Unfortunately I didn’t have a background of “class-feeling,” as Williams did, to make sense of my situation.

Eventually I came to recognize myself as a familiar enough type: the intellectual from a working-class background who ends up feeling out of place both in his class of origin and in the class he’s supposed to aspire to. That’s one reason I find Williams so interesting: despite that alienation, he found a way to build a career of important work that was true to his principles without being imprisoned by his background (which actually seems to have supported and liberated him) or by the structure of the great English universities.