Monday, December 31, 2012

My Inner Child

One of my friends on Facebook wished me a happy birthday, and hoped I'd find something enjoyable to do on the day.  When I thanked her, adding that I'd better find something enjoyable to do, since all the snow we've been getting would keep me from getting out much, she replied, "You're never too old to go sledding!"

I said that I became too old for sledding when I was ten or so, and she chirped "2013 Resolution: Reconnect with your inner child."  She has some New-Agey culture-of-therapy tendencies, and I debated with myself whether to answer that, but in the end I gave in when I thought of the right response.

"I never disconnected," I told her.  "My inner child is a sissy who doesn't like sports and would rather be indoors on a winter day, reading a good book.  Which is probably what I'll be doing."  I added a smiley (which, yes, is probably almost as bad as "LOL") to try not to seem too mean.

And if I do say so myself, I think that was a good answer.  In fact I often did enjoy playing in the snow as a kid, but more often I preferred reading.  I realized, though, that my friend had a reductive and stereotypical idea of what a child is -- and what an adult is.  Which led me to wonder where the whole notion of an "inner child" came from; according to Wikipedia, which is probably adequate for my purposes here, it comes from analytic psychology and popular psychology.  It seems to assume that children and adults are utterly distinct from each other, and I'm not so sure about that either.  I don't perceive "childish" and "adult" wishes, thoughts, and behavior in myself in clearly separate compartments -- they all seem mixed together, and I perceive my childhood through all the memories and experiences I've had since then.

I don't know how many children feel this way, but it seems to me that as a child I often resisted being made to do certain things by adults who pushed them on me because they were kid things: dressing up in costume for Halloween, for example, or wearing short pants.  I was more interested in getting in touch with my inner adult.  I've often thought that the story of Peter Pan contains a paradox: children mostly want to grow up, and adults want to be children again, so by refusing to grow up, Peter signals that he has already done so and is no longer a child.  At best he's an adult's idea of a child.  (This paradox lies at the heart of Spielberg's Hook, the only movie of his that I can stand; I've meant to write about that for a long time.  Maybe in 2013.)

I didn't really feel like an adult until I was in my fifties, and I don't think I've often refused to do something because it was childish.  Mostly I wondered when I'd start to feel like a grownup, which on a childish level I assumed went with adulthood: suddenly you wake up and you know what to do, how to take charge, how not to be afraid, how not to get lost, how to fit into the adult world.  I know from talking to other people that this is not uncommon.  If you're lucky, you wake up and realize that you've arrived, and it's time to act as though you knew what you were doing.  Often you were ready before you were an adult, but adults wouldn't let you.

No, I never lost touch with my inner child, who was there all along.  He's often what annoys other adults about me, in fact, but by now he's too big to spank, and unlike an outer child he can talk back.  So reconnecting with him isn't something I need to do.  For the past five or ten years, though, I've been getting in touch with my inner old person, which is easier than I thought it would be.  But then, I've had much more time to get to know him.  A child gets tossed into childhood, as it were, with no warning and no preparation.  No wonder childhood isn't easy.

"Tribalism" and the End of the Year Blahs ...

... to be followed, I suspect, by New Year blahs in short order.  I don't entirely know why I've slowed down in this last week of the year, though it's partly because of the heavy snow we got on the day after Christmas, and refreshed on Saturday and today.  No matter; it will pass.

I see that of the three big projects I listed in my agenda for this year, I didn't get to even one.  That's not terrible, because two of them had been on the back burner for several years anyway.  Maybe this year.

But one thing I can do, and that's to post a small supplement to my post on my unease with the word "tribalism," as used by left and progressive bloggers.  After I wrote it, I came across a book called Mistaking Africa: Curiosities and Inventions of the American Mind by Curtis Keim, published by Westview Press in 1999.  It's a reasonably short and accessible account of the ways Americans, white and to a lesser extent black, invent an Africa they want to believe in, rather than the Africa that is there. There's an entire chapter devoted to the word and concept of "tribe," and it confirms my sense that "tribal" and "tribalism," as used by white progressives, have racist overtones.

Keim writes that words like "nation," "people," and "tribe"
began to diverge in meaning in the late eighteenth century.  Europeans, who increasingly thought of themselves as more advanced than other peoples, needed words to distinguish themselves from others.  The word people retained its general usage, but nations came to be thought of as larger groupings with more complex social structures and technologies.  The word tribes was reserved for groups that were smaller and, supposedly, simpler and less evolved.  Our modern ideas about primitives and the Dark Continent emerged in the same era.  By the mid-nineteenth century, the word tribe had assumed a negative meaning that implied political organizations that were primordial, backward, irrational, and static.  A person didn't join a tribe; one was born into it.  People in civilized societies could actively select from among different, creative courses of action, but tribal people followed tribal customs without thinking.  It was indeed fortunate for tribes that they had such customs to guide their actions, because members were so limited intellectually.  Of course, "tribalism" was expected of such people.  In other words, to be tribal was to be genetically incapable of more advanced thought or political organizations.

In the twentieth century, the meaning of the word tribe, as it applied to Africa, developed in two directions.  The first, favored by white politicians and colonial administrators, was a variation of the nineteenth-century definition of tribes as having closed boundaries and unchanging customs.  This was administratively useful because it allowed colonialists to make sense of and create order out of the bewildering variety of African political organizations ...

The second direction in the development of the word tribe was favored by anthropologists.  In the 1920s, anthropologists began to live with Africans and to take their day-to-day lives more seriously.  Their experiences revealed that the nineteenth-century definition of tribe was deeply flawed.  They found that tribal peoples were neither unthinking nor less evolved than Westerners.  And they learned that tribes were constantly changing and adapting, just as were their own societies.  Anthropologists have sometimes been called servants of colonialism, because they provided the information and categories necessary to organize African people.  Although this negative label has some validity, it is also true that anthropologists were among the first to recognize that African complexity was creative rather than irrational and chaotic [99-101].
This history explains why the use of "tribal" as a putdown bothers me: it draws on nineteenth-century racist notions of "natives" as less "evolved," less intelligent and rational, and unable to think for themselves, but also as a different kind ("race") of person.  It also implies that non-tribalists are more evolved, smarter, and bold independent thinkers, though the leftish writers who use "tribal" this way would be properly contemptuous of the Europeans who thought they were a better breed than the dusky-hued people they tried to subjugate.  (And these same leftish writers are anticolonialist and anti-imperialist to a man.)  But this only shrinks the circle of evolved, intelligent, independent thinkers; it doesn't abolish the circle altogether.  There's still a "tribe" of Real Smart People in there, smug in their disdain for the lesser tribes.

(There's another side to this: I've noticed some intelligent American Indian writers and scholars who seem to accept the notion of tribes as timeless, ahistorical, uniform, and unchanging in their customs because the Elders know best.  None of them, I think, would actually want to live in such a society, but it's a comforting fantasy.)

I admit that I've been susceptible to this same tendency, especially as a lonely young bookworm and sissy.  It was very comforting to think of myself as part of an elite corps of Homo Superior, just waiting for our moment to emerge from the shadows and manifest our glory.  It's a tendency I've worked to get free of, though I don't think it's possible for a human being to be completely free of it.  We're a social species, and there will always be Usses and Thems in our lives.  The question is how we treat the Thems.

I also admit that the use of "tribal" by these writers probably hurts no one.  It disturbs me mainly because it looks to me like a faultline in their politics and their thinking.  It disturbed me, but entertained me even more, when one of the writers in question and his Posse defended his use of "tribal" in the same way overt racists defend their practice: by accusing me of petty Political Correctness and Thought-Police tendencies, plus of course they couldn't have said anything racist, it's all in my imagination.  The writer's other line of defense was almost as funny: he cited the writer from whom he'd picked up this sense of "tribalism" as Authority.  (It's on the Internet, so it must be true!)  I consider this to be further evidence of how such groupthink erodes one's ability to think.

There are other words that could be used instead of "tribalism."  "Groupthink," for example.  Or more specific ones, like "party loyalty."  There are surely others.  But it's so much easier to have a handy catchphrase, not too vulgar, for flogging one's ideological enemies.

Saturday, December 29, 2012

Forward Into the Past!


Batocchio at Vagabond Scholar has carried on the late Jon Swift's custom of an annual roundup of the best blog posts of the year, chosen by the bloggers. I'm in there, of course, but there's plenty of material worth your attention; I've already read several good ones. I'm going to put together my own retrospective, as I did last year, but for now I wanted to pass this along.

And this might be as good a time as any to mention that my 1500th post went up earlier this week.  That's 1500 in five and a half years; I enjoy feeling productive.

Thursday, December 27, 2012

The Ghost of Strawberries Past



The Onion AV Club has a new review of Ingmar Bergman's 1957 film Wild Strawberries.  Reading the review and the comments it inspired made me think again about what I think of Bergman as a filmmaker.

I can't remember now which was the first Bergman film I ever saw, because it was over 40 years ago, but it was probably The Seventh Seal.  It was my introduction to art film, except maybe for a late-night broadcast of Rashomon dubbed in English on the local NBC affiliate (I'm still trying to figure out how that happened), and so I was impressed by the contrast with Hollywood movies, but I still saw that it was ponderous and self-consciously arty.  I liked its literariness, being a bookworm, but it didn't excite me all that much.  Some years later, after I'd seen it a couple more times, I happened on Stanley Kauffmann's remark that it was Bergman's most pretentious film, and I was relieved to learn I wasn't the only person who thought so.

I saw Wild Strawberries not very long afterward, and disliked it intensely: as far as I was concerned, it was nothing but a Bergmanesque A Christmas Carol, and annoying from start to finish.  I don't think I've watched it again.  I did see every Bergman film that showed on campus after that (this was in the days before home video), so I saw quite a bit of his early work, and each new one as it was released.  Perversely I liked The Devil's Eye the best of his early work, and Scenes from a Marriage best of all, followed by The Magic Flute.  But Face to Face and The Serpent's Egg and Autumn Sonata were awful, and I gradually tuned him out.  (Despite this, I still intend to see films of his that I haven't gotten around to, like Fanny and Alexander.  The Criterion DVD set is on my table, waiting...)

I suspect that Bergman's reputation comes partly from the fact that he is so many people's introduction to art film, and he opens up new horizons for people who grew up on Hollywood product (as I did).  At the same time, though, one of the connotations of 'art film' in the 1950s and 1960s, when I was growing up, was sex, specifically a bluntness and matter-of-factness about sex that was impossible for American films in the days of the Hollywood code.  That interested me too, but I'd spent my high school years exploring taboo-breaking books, which exceeded anything European art films offered in their explicitness, so I never had the sense that these films were groundbreaking except perhaps as films.  For that matter, I noticed fairly soon that the themes of a lot of European art film had mostly been done to death in European and American art literature, and the films didn't have anything to add that I could see.  Of course I mostly came to these Fifties and Sixties films late, a decade or more after their original release, but more recent European films indicated to me that they had hardened into schtick by the 1970s.

By the time I was in my late 20s, Bergman's Protestant (Calvinist?) guilt/angst made me impatient.  I had no religious upbringing to speak of, and as an atheist I never felt the need to shake my fist at Heaven as he did.  I've never been one of those atheists who feel that they lost or lack something because they don't believe.  I respect and admire Bergman's craft -- his direction and his actors, and Sven Nykvist's cinematography -- but I think he's overrated by many people.

Wednesday, December 26, 2012

What We Got Here Is a Failure to Communicate

I guess I'd better get back to work.  I recently read an intriguing little book called Why Tolerate Religion? (Princeton, 2013) by Brian Leiter.  The title is a bit misleading, since it suggests that religion shouldn't be tolerated at all.  But Leiter's main question is whether religion should be singled out (as it is in the US) for special protection from state interference, compared to other claims of conscience.  He begins by citing a recent Canadian legal case involving a Sikh teenager who demanded that he be allowed to carry his kirpan -- the ceremonial dagger that every adult Sikh male is required to have on his person at all times -- in school.  In the end the youth won his case.
But now suppose [Leiter continues] that our fourteen-year-old boy is not a Sikh but a boy from a rural family whose life "on the land" goes back many generations.  As in almost all cultures, this boy's community has rituals marking the arrival of maturity for males in that community.  A central one is the passing of a dagger or knife from father to son, across the generations.  To be a "man" at the age of thirteen or fourteen is to receive that dagger from one's father, just as he received it from his, and so on, stretching back for decades, perhaps centuries.  A boy's identity as a man in his community turns on his always carrying the family knife, for it marks his maturity and his bond with the past.  There can be no doubt in this case about the conscientious obligation every boy of knife-bearing age feels to carry his knife with him, even in school.  And there can be no doubt that were his ability to carry his knife abridged, his identity as a man devoted to his community would be destroyed.

There is no Western democracy at present in which the boy in our second scenario has prevailed or would prevail in a challenge to a general prohibition on the carrying of weapons in the school [2-3].
Leiter seems to overlook something here: the hypothetical second boy wouldn't be the only one wearing his family knife in school, and so he probably wouldn't have trouble as long as he lived in his ancestral community.  He'd only be prohibited from carrying it if his family moved to another community with different manhood rituals, much as the Sikh boy's family had done.

But the basic point is strong, especially since many religions -- at one time, all of them -- are about communities and their norms and rituals.  Israelite religion and Hinduism are well-known examples, but the same would be true of Greek and Roman paganism.  The rituals are often older than the myths that rationalize them.   At some point there emerged religions like Christianity, Buddhism, and Islam, missionary religions that sought adherents in a variety of cultures.  But once transplanted, they often grafted on rituals of the old gods, like "Christmas" trees.  Leiter's second boy's knife is by older standards every bit as "religious" as a kirpan.

That doesn't undermine Leiter's argument, though, because in the modern West there is a distinction drawn between religion and other principles that guide conduct and demand adherence.  Why, Leiter asks, should this be so?  What makes religion so special, so privileged?  He never quite answers the question, but I think he'd agree that the burden of argument lies on the person who advocates special privilege for religious claims of conscience.  Along the way he grapples with the definition of religion, and mainly succeeds in showing that it's impossible to draw a sharp line between religion and non-religion, though I think he's satisfied with the distinction he draws.  (He puts a heavy emphasis on faith, construed as attachment to beliefs and demands of conscience that "do not answer ultimately (or at the limit) to evidence and reason, as these are understood in other domains concerned with knowledge of the world" [34].  He admits that this isn't specific to religion -- for example, "We are, all of us, in the grips of a multitude of false beliefs" [76], though I'm not sure he fully recognizes how universal it is, especially in matters of morality.)

One of his most interesting discussions, to my mind, is on respect.  Philosophers recognize two main kinds of respect: the less demanding can be called "recognition respect," as when I respect your right to hold your beliefs even if I disagree with the beliefs themselves, and I recognize that they're important to you; the more demanding is "affirmation respect," as when I'm expected to agree that your beliefs are totally cool.  He quotes an article by the philosopher Simon Blackburn, who
tells the story of being invited to dinner at a colleague's home and then being asked to participate in a religious observance prior to dinner.  He declined, though his colleague said participating was merely a matter of showing "respect."  His host seems to have viewed this as a matter of simple recognition respect, but Blackburn interprets it (perhaps rightly) as something more:

I would not be expected to respect the beliefs of flat earthers or those of the people who believed that the Hale-Bopp comet was a recycling facility for dead Californians who killed themselves in order to join it.  Had my host stood up and asked me to toast the Hale-Bopp hopefuls, or to break bread or some such in token of fellowship with them, I would have been just as embarrassed and indeed angry.  I lament and regret the holding of such beliefs, and I despise the features of humanity that make them so common.  I wish people were different [74].
I must read Blackburn's paper, because it sounds to me as though he's confused about the distinction between respecting people's rights to hold their beliefs and respecting the beliefs themselves; but his host appears to have been just as confused in the same way.  As Leiter later remarks, it's not clear that a Jewish shabbos prayer is on the same level as the Hale-Bopp cultists, but even so, if I'd been in Blackburn's shoes I wouldn't have been "embarrassed and indeed angry" when asked to participate, though I'd still have declined to do so.  Like a good many of my fellow atheists, Blackburn seems to overreact to other people's beliefs and practices even when they practice in their own homes.  I've been invited to meals where my hosts prayed before eating, and I simply sat quietly until they were done praying.  It is disrespectful of someone's unbelief to expect him or her to participate in one's ritual, though many spiritual tourists delight in doing so.  Politeness and respect -- recognition respect -- need to go in both directions.  Both Judaism and Christianity have traditions of accepting martyrdom rather than participating in non-Yahwist rituals, like burning a pinch of incense before an image of Caesar.  C'mon, would it have killed them to show a little respect?  Blackburn complains about what he calls "respect creep," where "the request for thin toleration turns into a demand for more substantial respect, such as fellow-feeling, or esteem, and finally deference and reverence" (quoted by Leiter, 75), and that's a valid complaint, but I get the feeling he wants some respect creep in his own direction.  As I say, I'll have to read his paper.

But this anecdote shows just why even minimal recognition respect, let alone maximal affirmation respect, is problematical.  Blackburn's host refused to extend to him, as an unbeliever, the recognition respect necessary not to demand that he participate in prayer, and Blackburn seems to have overreacted to finding himself seated at a table with believers who, reasonably enough, expected to pray.  Did he expect them to go further, and forego the shabbos prayer because an apikoros was present?  I'd call that a failure in even "thin tolerance."

There's more to Why Tolerate Religion?, and I'll get back to it in due time.

Sunday, December 23, 2012

Taxation With Representation

This is just a thought: one of the big issues in the fiscal bluff ("a neat pun and probably one we should all start using," as Avedon Carol said) negotiations is whether George W. Bush's tax holiday should end.  Obama says he wants to extend it for the "middle class" while letting it end for the wealthy, and he's indulged in some alarmist rhetoric to make it seem like the Gummint will suddenly take $2000 out of your middle-class wallet on January 2.  (The corporate media, friend to the people that they are, increase the amount to $3,500.)

I'm not middle-class myself, but I'm probably better off than a lot of people who are living a lot closer to financial trouble than I am: I don't have any debt, I don't have any children to put through college, and my health is good.  With those caveats in mind, though, I myself don't necessarily object to having my taxes go back up a bit.  Ever since Obama lowered them, I've felt strange about having lower taxes when the economy was in trouble.  True, until I retired I spent most of the difference, which is good for the economy: we of the 99% spend our money instead of hoarding it.  And I know that the deficit is not as big a problem as the ongoing shortage of jobs.

But I've also been worried about the long-term effects of lowering taxes while increasing spending, especially on the military.  I remember that Lyndon Johnson preferred to pay for the War in Vietnam on credit, as it were, instead of raising taxes; a prudent politician knows how much sacrifice he can ask from the citizenry for a war that most people don't care about very much.  Ditto for Bush, who drove the deficit through the roof to pay for the Afghan and Iraq wars.  (Instead, as we old folks remember, he asked Americans to help defeat the terrorists by going shopping.)

A popular slogan on the Right is that if a family goes into debt, it should cut back on spending.  Leave aside for a moment the fact that a nation is not a family.  Less spending is only one way to cope with a shortage of funds.  Another is to increase income.  That was one reason the top marginal rates went up so much during and after World War II: to pay for the war, both the Good War and the Cold War.  Probably most Americans would have had second thoughts about Bush's wars a lot sooner if they knew they were going to have to pay for them upfront.  So, a return to pre-Bush income tax rates doesn't seem like a bad idea to me on its face.

But only on its face.  Obama, like the Republicans, wouldn't put extra income to good use by paying for important social programs.  He wants to cut them.  It's not enough to have more revenue; you also have to put that revenue to good use, and Obama isn't interested in doing that.  Anyway, he's changed his mind about prolonging the middle-class tax holiday as well as Social Security cuts, so get ready for the fourth Bush term.

Saturday, December 22, 2012

If This Be Socialism, Make the Most of It

Bless his heart, President Obama is still doing standup comedy.  He shouldn't give up his day job.  Well, maybe he should.  Take it away:
During an interview with Noticias Univision 23, the network's Miami affiliate newscast, Obama pushed back against the accusation made in some corners of south Florida's Cuban-American and Venezuelan communities that he wants to instill a socialist economic system in the U.S. The president said he believes few actually believe that.

"I don't know that there are a lot of Cubans or Venezuelans, Americans who believe that," Obama said. "The truth of the matter is that my policies are so mainstream that if I had set the same policies that I had back in the 1980s, I would be considered a moderate Republican."
Those "corners of south Florida's Cuban-American and Venezuelan communities" would be considered neo-Nazis in genuinely reality-based parts of the world.  (Oh, all right, I'm exaggerating a bit there.  They're really paleo-Nazis.)  Only rich right-wingers who think Pinochet got a bum rap 1) can afford to leave their Latin American countries when they go socialist and 2) would be welcomed as legal immigrants by the US government from their countries.

Remember that when Obama talks about moderate Republicans, he means Ronald Reagan.  Reagan was a far right-wing Republican, though political realities forced him toward the center once he was in office.  Not that Obama is wrong about the similarities between them.  As Mark Green and Gail McColl wrote in 1983, when Reagan was President:
The misinformation is on the subject of just how much the President wants to cut.  He has acted to eliminate Social Security benefits for students who are children of deceased and disabled workers; proposed cutting the minimum benefit for Social Security recipients; and allowed his Health and Human Services Secretary to propose cutting benefits at 62 years of age -- a proposal rejected by the Senate, 95-0 [There He Goes Again, pp. 87-88].
Of course, we've moved so far to the left since then that the Senate wouldn't reject a proposal to cut Social Security benefits by 95-0; nowadays they might accept it by that margin.  Obama wants them to, though that alone would be enough to make the Republicans reject it.

Pandering to right-wing Cuban exiles was second nature to Reagan too.  As were austerity measures for everybody but the rich, coddling scandal-ridden bankers and financiers, multiple wars (including the first War on Terror), slashing civil liberties -- though in many ways Obama is to the right of Reagan.

Notice that the ABC story on Obama's moderate politics claimed:
Some on the left have long argued that the president's policy beliefs closely resemble moderate Republican views from the 1980s and 1990s. Ezra Klein made the argument in a 2011 column, citing his adoption of the individual health insurance mandate, an idea developed in conservative think tanks.
Klein actually wrote:
Rather, it appears that as Democrats moved to the right to pick up Republican votes, Republicans moved to the right to oppose Democratic proposals. As Gingrich’s quote suggests, cap and trade didn’t just have Republican support in the 1990s. John McCain included a cap-and-trade plan in his 2008 platform. The same goes for an individual mandate, which Grassley endorsed in June 2009 — mere months before he began calling the policy “unconstitutional.”
But Klein did seem to think that the policies the Democrats picked up from the Republicans on their Long March to the right were "moderate."  The individual health-insurance mandate came from the right-wing Heritage Foundation, which boasts of its ideological compatibility with Rush Limbaugh.  Limbaugh is certainly a mainstream Republican, but he's no moderate.  It isn't "moderate Republican views from the 1980s and 1990s" that Obama's policies resemble, but views of the right-wing Republican fringe.

Obama really should stop trying to win over Republican paleo-Nazis, who are never going to vote for him anyway.  Even appeasing them is a waste of time, and a tactical error.  Why not try to please his actual base?  Maybe he should consider that the people who voted for him want him to "instill a socialist economic system in the U.S."  As they do: they want to expand the reach of Social Security and Medicare, raise taxes on the rich, and reduce the influence the wealthy have on American politics.  If he did that, the Democrats might be able to hold onto the Senate, and maybe even make gains in the House, in 2014.

A popular theme in upper-strata circles is that America has moved to the right since, say, the 1960s.  This is at best debatable.  True, the Democratic Party leadership has been chasing the Republican party leadership rightward since the 1980s, as the Republican leadership chases the horizon even further rightward.  But this only refers, as usual, to a tiny fraction of the American population: the richest one percent plus their hangers-on and toadies.  The majority of Americans have stayed pretty much where they always were, favoring socialist programs and the downward redistribution of wealth.  (As opposed to the elites, who want to redistribute wealth upward, and have largely gotten their way for the past thirty years, aided by both political parties.)  That doesn't necessarily mean we're right, of course: the majority can and often has been spectacularly wrong.  But the top one percent, if anything, has been wrong more often.

Thursday, December 20, 2012

Linguistic Polyamory on a Dark and Snowy Night

It's been a busy day, so this will be a quickie, two quotations from Sarah Schulman's new book Israel / Palestine and the Queer International (Duke, 2012).  One, from page 153, refers to Haneen, a queer Palestinian activist who visited the US on a solidarity tour Schulman organized:
Similarly, she addressed the American LGBT obsession with determining words to call ourselves. "I find it ["queer"] useful for the time being but I am not attached to it or any other term. I am happy to move along with language. I am not looking for a term to marry. When it comes to language, I believe in short affairs."
The other, from page 166, is Schulman's own opinion:
I have always hated the "safety" discourse. Feeling "safe" is not the paramount goal of life; it's being able to move forward constructively even if one is afraid. Total safety is impossible unless other people's rights are entirely infringed, and it's not a desirable state for an adult. How "safety" of the most dominant became the agenda of the LGBT community center is beyond me.
Israel / Palestine and the Queer International is, in my opinion, Schulman's best book since 1998's Stagestruck.  I recommend it highly.

Wednesday, December 19, 2012

You Will Be Assimilated

Robert Bork has died, at the age of 85.  Whatever It Is I'm Against It has posted perhaps the best tribute to his memory, a (probably partial) list of things that Robert Bork thought were not unconstitutional, among them banning the sale of contraceptives to married couples and the eugenic sterilization of prisoners.  To that I can only add this opinion piece I wrote for the Indiana University newspaper in 1996.
ROBERT BORK AND THE HEARTBREAK OF CONSERVATISM

Speaking of victims' culture, Robert Bork has a new book out.  Bork has made a career out of whining that an unholy cabal of politically correct liberals deprived him of his God-given right to sit on the Supreme Court.  Slouching Toward Gomorrah, his new tome, would be your typical conservative tract on America's moral decline, if not for its author's explicit self-pity.

Old folks like your Junkyard Intellectual remember Bork as the Nixon toady who presided over the Saturday Night Massacre, the firing of Watergate Special Prosecutor Archibald Cox on October 20, 1973.  Nixon's men Elliot Richardson and William Ruckelshaus resigned rather than obstruct justice, but Bork was a boy who couldn't say no.  He even declined to resign after doing the deed, for his President needed him.

For his collaboration with the Nixon gang, Bork suffered the appropriate penalty: Ronald Reagan appointed him to the Federal judiciary, where he built a reputation as a hardline Constitutional reactionary.  Sniffing the possibility of a Supreme Court nomination, however, he began to moderate his position.  That is, he swerved from right to left like an American driver who has suddenly materialized on an English highway.  (Overturn Roe v. Wade?  No, Senator, the thought never crossed my mind!)  After Bork's rejection by the Senate, Reagan and Bush preferred nominees with minimal judicial experience, to avoid those embarrassing "paper trails."  Since then Robert Bork has had only his media access, speaking tours, and lucrative Olin Foundation fellowship to console him.

Given Bork's personal history, it should come as no surprise that Slouching Toward Gomorrah is a complaint about the decline of other people's public morality.  Its index has one entry under "Watergate scandal" (Bork is truly honked off by the "thorough indoctrination in Watergate" schoolchildren supposedly receive), but a couple dozen under "White heterosexual males," including multiple references to "Feminist harassment of," "Religious attacks on," and so forth.  As Clarence Thomas, another notable right-wing whiner, once said, "Bitch, bitch, bitch, moan, moan, moan."  I'm also reminded of James Carville's book We're Right, They're Wrong, a celebration and defense of liberalism in general and of the Clinton administration in particular, which managed to avoid all mention of the North American Free Trade Agreement.

Bork blames the nihilistic Sixties for the breakdown of traditional values.  In the real world, political radicals of the Sixties were criticized not for a lack of morality, but for being too moralistic and idealistic; it was conservatives and liberals who congratulated themselves for their tough-minded acceptance of the bigotry, corruption, and violence that make the world go round.  More recently, it was not Baby Boomers but the Reagan administration which plumbed new depths in cynical expediency.  Those Boomers who have distinguished themselves for nihilistic amorality, such as Newt Gingrich, aligned themselves with the Right.

Bill Clinton's sex life, or rather its failure to cost Clinton the 1992 election, especially galls Bork.  Thirty years ago, he laments, you couldn't have had a political career in America if people knew such things about you.  Of course, thirty years ago John Kennedy's extramarital escapades, like those of Franklin Roosevelt and Warren G. Harding before him, were well known in Washington, but in those days the press corps preferred to shield the American public from such information.

But it's true that Americans have become more tolerant of sexual license in their public figures.  Ronald Reagan's divorce and remarriage were well known from the start of his political career, and it was no secret that Nancy was pregnant when they married, yet the Religious Right celebrated Reagan's election to the Presidency as a vindication of "family values."  By 1988 Pat Robertson's unrepentant acknowledgment that his wife was pregnant when they married was hardly news at all.

Ironically, then, Bork has a defensible point.  It's just that American political morality was shoddy long before Clinton, as Bork knows first-hand but wishes we'd forget.  The conservative whine that children aren't learning their history goes hand in hand with the conservative insistence that American history be taught as feel-good propaganda.  The social movements of the Sixties were really attempts to raise the standards of American political morality, and they must be vilified by the likes of Robert Bork because they nearly succeeded.  The barbarians at the gates who trouble our rulers' dreams are the American people, seeking justice.
WIIIA recalled having written in 2005 that "Bork is living proof that one can also be driven mad by lack of power."  Power he may have lacked, but he had plenty of influence.  Look at that partial list of his positions, and you'll see that Bork was the kind of right-winger our Constitutional scholar of a President can use to make himself look moderate by comparison.

Small Changes, Very Small

I'm almost done rereading Marge Piercy's third novel, Small Changes, originally published in 1973.  I first read it in the mid-1970s.  It's a long complex book, built around two main characters but with many other secondary ones.  (Two important male characters are Vietnam veterans, interestingly.)  It's typical now to dismiss it as one of the feminist classics of the 1970s, and many contemporary readers claim that it's dated.  I don't think so.  Relations between women and men have changed somewhat since the book was written, and middle-class women are more likely to have careers outside the home than they were previously.  But men still expect emotional and other service from women, and women still find it difficult to refuse it, let alone demand emotional and other service from men in return.  And there's still reaction, a powerful movement to return us all to the Fifties, though the Fifties were an aberration, a diversion from tradition rather than Tradition itself.  We're not likely to go back to the days when many women could stay at home as housewives, because most men can't earn enough to support a household by themselves.  It wasn't feminism that destroyed the "traditional" family, it was capitalism and the Reagan administration.

But although feminism is one of the core themes of Small Changes, it's about other things too.  It's partly an account of reaction against the Sixties, including state repression that most Americans paid no attention to at the time.  One of the characters must testify before a grand jury empaneled to destroy the resistant social movements that had scared our rulers so badly.  This was officially justified, insofar as anyone bothered, because of the violent segments of those movements: the Black Panthers, the Weathermen.  (Of course, the far greater State violence, against American citizens as well as foreigners, was of no great concern to most citizens, even after it was exposed.)  As one character explains:
It isn't like a regular jury.  They're older, richer white people always, and they even run a credit check on them.  They're to bring in indictments that the government wants.  Ninety-five per cent of the time that's what they do -- agree to prosecute.  But Anita says that mainly they're a device for investigating.  They're a way of forcing people to testify against each other.  The F.B.I. can't force you to talk, the police can't, though they try.  But a grand jury can send you to jail if you don't answer their questions [505].*
And Wanda, the character called before the grand jury, explains:
"This is a fishing expedition, Beth.  The questions they want to ask me aren't just about deserters.  They're asking about the whole web of connections in the movement.  They want to know who's friendly with who, who lives where, in what commune, who sleeps with who, who gives money.  They want to fill in all the missing connections so that people can't disappear underground any more."

"But they must know all that anyhow, they have all the phones in the country tapped!"

"Don't exaggerate, Beth.  They don't know everything.  Even when they 'know,' half of what they know is garbage and they want confirmation.  Look, the questions they're asking me include every address I've lived at since the year zero.  Who else lived there?  What vehicles have I owned and who borrowed them?  Who was at the women's conference last year in Boston and where did they stay?  How did I travel from Boston to New York for the health care conference that summer?  What meetings did I attend in that fall that plotted riots, demonstrations, or street actions -- all lumped as one! ... [507].
Yes, the novel is often didactic, but no more than any Robert Heinlein book.  No wonder that I, who grew up loving Heinlein, also love Piercy.  Heinlein's advocates have often pointed to his highly intelligent and competent characters as a feature of his work that sets it above most (especially mundane) fiction.  Piercy also focuses on intelligent competent characters, with the difference that she doesn't set them against straw opponents they can easily beat.

Things haven't changed all that much in American politics; we now have the Patriot Act and the surveillance state, and probably all our phones and e-mail are tapped now.

Another key topic is computers.  Miriam, one of the two principal characters, moves from mathematics to computer science.  This is in the late 1960s, when computers were mainframes that filled whole rooms, and programs and data were input on punched cards, before desktop computers, much less tablets, before the Internet and WiFi.  Miriam gets a job with a small research firm on Route 128 near Boston, the only female programmer in the company.  (Come to think of it, Miriam is a lot like Heinlein's smart, sexy, male-identified heroines, though again, she lives in a different universe.  But then Heinlein's men are also fantasy figures.)  Their big contract is writing software for the anti-ballistic missile program that the Nixon administration backed at the time.  Miriam gets shuttled onto that team, which she hates.  There's a scene where she meets her old MIT professor Wilhelm Graben, who scoffs when she complains about being "hired by death."
"What nonsense.  It's scuttlebutt that the system will never operate.  For instance, assuming that the computer technology works out perfectly -- for the first time in history, perhaps -- but assuming that, the crucial interface is that combination of radar equipment and computers.  Now there cannot possibly in any real war situation be enough time to discriminate the raw data in the radar inputs about what would be real missiles and what simply extra junk floating about.  All weapons are programmed to fill the heavens with large quantities of objects that induce noise in the radar.  Chaff for instsance -- aluminum foil -- reflects radar beams quite nicely and produces responses that would indicate serious objects approaching.  The electronic countermeasures will be jamming, with strong transmission at the same frequency that the radar operates on ...

"But forget military questions.  Think about the most amusing aspect of the ABM.  Now you are exploding, say, a five-megaton warhead to knock down a missile no bigger than a barn.  Obviously, this does not require a warhead of five hundred million tons of TNT -- Hiroshima was destroyed by twenty thousand tons' equivalent.  Thus you can deduce that accuracy is simply not in it.  They're figuring on exploding a five-megaton bomb to knock down a missile because they are not counting on being in the same state with it -- states imagined to be lines superimposed upon the air ... "  His voice was calm and mocking.  She grew colder and colder ...

"Defense, it's called.  The rhetoric of defense, of course, is that human beings are being defended.  But the type of weaponry we are discussing is absolutely useless in defending human beings.  It would make little difference, I imagine, to someone on the ground whether he was fried alive because an enemy missile exploded twenty miles to his right hand, or because one of 'his' missiles exploded as far overhead.  The temperature on the ground would instantly rise several hundred degrees, in either case.  Defense is defense of missile sites, not of people or the landscape, which would be eliminated ... But you must admit, the rhetoric with which American politicians address their constituencies about defense spending is amusing" [379-380].
I'd forgotten this passage when, sometime in the early 90s, I leafed through the book looking for another passage I remembered.  The same weapons systems, with super space lasers added, were floated again in the 1980s as the Reagan gang's Strategic Defense Initiative or SDI, commonly derided as "Star Wars."  There have been occasional attempts to resurrect the program under Clinton and perhaps under Bush, even though the same criticisms Piercy put into Graben's mouth still apply.  What counts is pouring millions and billions of dollars into high-tech industry, not "defense ... of people or the landscape."

There's a lot of quotable, still-relevant material in Small Changes -- I haven't come close to exhausting it here.  On one hand, while there's nothing wrong with books that focus on stereotypical concerns of women like domesticity and romance, concerns that still dominate what is derisively called "chick lit," it should be clear that Piercy's interests and ambitions include those concerns but aren't restricted to them.  Much as I love, for instance, Jennifer Crusie and her smart heroines, none of her books are as large in this sense as Small Changes or most of Marge Piercy's novels.  But then, the same is true of most male writers today.  I think that the reason many people try to dismiss Piercy's work as dated feminist cliches is that her left politics disturb them at least as much as her critical take on men in relation to women.

*Quoted from a Fawcett-Columbine trade paperback reprint from (I think) 1997.

Tuesday, December 18, 2012

The Hound of God


There was a mind-boggling article on the Guardian's website this past weekend.  I'm still trying to make sense of it.  Here's the core, um, argument:
It seems obviously fair and right that if straight people can get married, why not gay people? But we must resist the easy seduction of the obvious. It once seemed obvious that the sun revolved around the Earth, and that women were inferior to men. Society only evolves when we have the mental liberty to challenge what seems to be common sense.

Many Christians oppose gay marriage not because we are homophobic or reject the equal dignity of gay people, but because "gay marriage" ultimately, we believe, demeans gay people by forcing them to conform to the straight world ...

Cardinal Basil Hume taught that God is present in every love, including the mutual love of gay people. This is to be respected and cherished and protected, as it is by civil unions. But to open up marriage to gay people, however admirable the intention, is ultimately to deny "the dignity of difference" in the phrase of the chief rabbi, Jonathan Sacks. It is not discriminatory, merely a recognition that marriage is an institution that is founded on a union that embraces sexual difference. It is not a denial of the equality of the love between two gay people, for all love is of infinite value.
According to his Guardian profile, Timothy Radcliffe, the author of this piece, "is an English Dominican. He was Master of the Order of Preachers from 1992 to 2001. His latest book is Take the Plunge: Living baptism and confirmation."  As a piece of writing it's standard theological word salad; a friend once said of a similar work that you could rearrange the sentences at random and they'd make about as much sense.  Come to think of it, Radcliffe's piece reads as if someone already did rearrange the sentences at random.  (I love his repudiation of common sense, by the way.  As usual with such people, though, he has it backwards: common sense tells us that marriage is supposed to be between Adam and Eve, not Adam and Steve.)

I have my own reservations about same-sex marriage, civil or religious.  But I think it is up to gay people, individually and collectively, to decide whether they want to demean themselves by "conforming to the straight world."  It's no news that many of us can't wait to leap into that sinkhole of degradation.  Radcliffe wants you to believe that he's liberal, by prating about the beauty and equality of the love between two gay people, but I am pretty sure I've heard this kind of argument before: Women don't really want to go to college and pursue professions; that would be a denial of their essential femininity.  Blacks don't really want to be equal to whites; deep down inside they know that they were created to be hewers of wood and drawers of water, servants of servants.  God created us different for a reason, and the Fremen have a saying: One cannot go against the word of God.  It's only common sense.

For all that the majority of marriages in history have been between men and women (often between one of the former and several of the latter, but Radcliffe discreetly avoids that issue), seeing the union of two men or two women as a marriage comes quite easily to most people.  If anything, that's about what one would expect: if heterosexual marriage is a central institution in society, then it will be imposed as metaphor even on same-sex couples, even if they aren't erotically involved.  So marital language and symbolism turn up in the friendship between the ancient Gilgamesh and Enkidu, and Herman Melville's Ishmael compares his pillow talk with the Pacific Islander Queequeg to that of man and wife:
How it is I know not; but there is no place like a bed for confidential disclosures between friends. Man and wife, they say, there open the very bottom of their souls to each other; and some old couples often lie and chat over old times till nearly morning. Thus, then, in our hearts’ honeymoon, lay I and Queequeg—a cosy, loving pair.
Later in the nineteenth century, female couples who lived together were referred to as "Boston marriages," even though they were assumed (not necessarily accurately) not to be Sapphists.  Did all this marital imagery degrade the persons involved?  Did it deny difference?  Not that I know of -- at least not until gay people began demanding that they be allowed to literalize the metaphor by applying for marriage licenses.  Reports of same-sex lovers having wedding ceremonies go back hundreds of years, to the great indignation of the hostile reporters who passed their accounts down to us.

As I've said, I have doubts about all this.  To some extent I even agree with Fr. Timothy that marriage might not be the best metaphor for same-sex couples, but then I feel the same way about heterosexual marriage.  There's also a tradition in gayish circles of referring to same-sex partners as friends, and this hasn't always been a euphemism.  As Paul Monette wrote in a passage of his AIDS memoir Borrowed Time that has stayed with me for decades now:
I always hesitate over the marriage word.  It's inexact and exactly right at the same time, but of course I don't have a legal leg to stand on.  The deed to the house on Kings Road says a single man after each of our names.  So much for the lies of the law.  There used to be gay marriages in the ancient world, even in the early Christian church, before the Paulist hate began to spew [sic] the boundaries of love.  And yet I never felt quite comfortable calling Rog my lover.  To me it smacked too much of the ephemeral, with a beaded sixties topspin.  Friend always seemed more intimate to me, more flush with feeling.  Ten years after we met, there would still be occasions when we'd find ourselves among strangers of the straight persuasion, and one of us would say, "This is my friend."  It never failed to quicken my heart, to say it or overhear it.  Little friend was the diminutive form we used in private, the phrase that is fired in bronze beneath his name on the hill [24-25].
I recognize that for many gay people, marriage is the word that fits and quickens their hearts.  So be it, but let it be a reminder that not all of us see our relationships in the same way.  (I've noticed a certain reluctance among some marriage-equality advocates to accept the other terms that go with marriage, like husband and wife: perhaps it's the gendered implications of those words that make them uncomfortable.  I don't think they should be allowed to evade them, though two husbands or two wives are fine with me.  And I've seen less of this reluctance in the past couple of years.) 
An easygoing tolerance, rubbing along beside each other without much curiosity, is not enough. We need to recover a confidence in intelligent engagement with those who are unlike us, a profound mutual attention, otherwise we shall crush a life-giving pluralism. It will not only be gay people who will suffer. We shall all be the poorer.
It is still not clear to me what this has to do with gay people, let alone gay marriage.  Again I'm touched, if not convinced, by Radcliffe's concern trolling about how gay people will suffer if they're allowed to marry.  Things may be different in England, but in the US there's nothing non-homophobes and non-bigots like Radcliffe can do to prevent same-sex marriage, even if they manage to protect gay people against themselves by blocking civil marriage between males or between females.  In England, the legality of marriage is handled differently, plus there's an established Church that could conceivably prevent laypeople from staging their own religious weddings.  In the US, the First Amendment and the absence of an established church mean that religious non-bigots have no way to impose their definition of religious marriage on anyone, not even other heterosexuals.  Radcliffe's blather about "life-giving pluralism" is disingenuous, to put it gently.

Monday, December 17, 2012

Thoreau-Going Twenty-Somethings


The reason I was poking around in Ellen Willis's Don't Think, Smile! was that I've begun reading a new book, Twenty Something: Why Do Young Adults Seem Stuck? (Hudson Street Press, 2012) by Robin Marantz Henig and Samantha Henig, mother and daughter respectively.  The book grew out of a New York Times Magazine article by Henig Senior, as an attempt to deal with its subject in more depth and with the added perspective a twenty-something co-author could add.  I noticed the book because I'm interested in the perennial problem of adults who complain about the coming generation's fecklessness, irresponsibility, and dependence.  The Henigs promised to survey empirical research as well as interviews of real people, so I let myself form some expectations.

I'm about ninety pages in as I write this, and the book is due at the library today, with a hold on it so I can't renew it.  I may return to it later.  But for now I'm a bit disappointed.  The authors do shoot down stereotypes of the Millennials, as they call them, showing that many of the same complaints were leveled at the Baby Boomers: inability or refusal to commit to a career or a spouse, for example.  And I was amused when it was pointed out that Henry David Thoreau, the bold and free individualist who lived alone in the Walden Woods, nevertheless took his laundry home every week (only a few minutes' walk) for his mother to wash.  But not until just a page ago was it acknowledged that the Boomers, the generation Mother Henig and I share, are the anomaly, not the norm against which to measure twentysomethings now.  For example, the average marriage age in the Fifties and Sixties was low, historically speaking, compared to the Boomers' parents and grandparents.  The number of kids who went to college was also higher than ever before.

Among the most important differences, and one that the Henigs have so far ignored even though it's relevant to their topic, is that the economic boom after World War II raised expectations for economic mobility and security that were to be dashed in the 1980s and after.  My parents, and I presume Robin Henig's, grew up during the Great Depression and came of age during the War.  (The Depression is not mentioned in the book's index, nor so far have I seen it mentioned in the text.)  The US had long been the richest country in the world, but after the war we became politically dominant too, and unlike most of our competitors we had not been devastated as Europe, Russia, and England were.  As they rebuilt, we flourished, and we did so in a New-Deal environment where the federal government subsidized technology and education, with high marginal Federal income tax rates and regulation of banks and finance to prevent any more economic crises.  The GI Bill sent unprecedented numbers of Americans to college, which meant that the universities no longer were the exclusive preserve of old-family WASPs, and new civil-rights laws opened new opportunities to white women and to people of color.

What this meant was that my parents believed and hoped that opportunities were there for anyone who wanted them enough to work for them, and that their children could be anything they wanted.  It was no longer necessary to watch every penny, or to grab any job available to keep the wolf from the door.  They wanted their children to have easier and richer and freer lives than they had.  The Henigs are aware of this to some extent, but so far they haven't explored the political and social ramifications of these changes.

Which boil down to this: the Henigs agree with the pundit consensus that twenty-somethings tend to believe there's no hurry to settle down, that they have time to decide on a career and a spouse, that the world won't collapse if they take a while to find their course.  This isn't some random fantasy: it's what they learned from their parents, and from the postwar social consensus.  The trouble is that the postwar consensus was falling apart just as the Millennials were being born.  American economic elites had never been happy with the constraints imposed on them after the Great Depression (let alone those of the war, when their ability to make obscene profits was kept under pretty tight rein).  They didn't like workers having job security, or the power to make demands on the owners and managers.  They hated the idea that workers could retire with a government pension, though of course the money for the pension came from the workers' own labor.  They preferred that working people should live in constant insecurity, never far from being thrown out on the street.  They couldn't see why the kind of speculation that nearly destroyed the world economy in 1929 should be regulated or forbidden -- on the contrary, there should be more of it.  And why should the lower orders be sent to public universities, where they learned sedition and class-hatred from the Italians and Jews who'd gotten in because of the GI Bill?  They began plotting reaction, and started implementing it during the 1970s.  (See David M. Gordon's Fat and Mean: The Corporate Squeeze of Working Americans and the Myth of "Managerial Downsizing" [Free Press, 1996] for more about this.)

At any rate, by the time today's twentysomethings were coming of age, deregulation had set America's moneymakers free, offshoring (subsidized by the taxpayer) had depleted the American manufacturing base, and kids who wanted a college education were finding it necessary to go into debt to get one.  Thanks to Reagan's policies, unemployment shot up to levels unseen since the Great Depression and never really came back down: the "new" jobs that replaced those lost under Reagan were largely part-time, with fewer or no benefits, and lower wages.  (Again, the Henigs haven't paid attention to this particular point as they consider twentysomethings' difficulties in finding good jobs.)  If twentysomethings were ill-prepared for these changes, it was largely because their parents didn't see them coming either.  They had been raised to believe that America was getting better and richer, that the barriers to equal opportunity had come down, and that everybody could be whatever they wanted.  To the extent that they'd seen these claims borne out in their own lives, why shouldn't they pass them on to their children?  But by the 1990s the conventional wisdom had changed: the post-WWII boom was not the flowering of the American free-enterprise system but a fleeting illusion; Americans were just going to have to lower their expectations, stop whining, and accept that a diminished life was inevitable.  (Only most of us needed to lower our expectations, of course: for the rich, no such adjustments were necessary, or tolerable.)

The Henigs do have an interesting quotation about the rise of student debt that reveals how narrow their focus is:
"It's not like we had a great national debate about whether we wanted to impose huge debts on young people," Patrick Callan, president of the National Center for Public Policy and Higher Education, said recently.  "We just woke up and it had happened" [30].
That's right, it fell from the sky like a meteor, without human effort or thought!  True, the changes happened without a "great national debate," but people made them happen.  The Henigs aren't interested in asking who, or why, though both questions seem appropriate.

Ellen Willis is relevant to this because, as an older contemporary of Robin Henig and me, she observed the same changes and wrote about them with an eye to politics and history.  Don't Think, Smile! has a chapter called "Intellectual Work in the Culture of Austerity," which addresses the changes in the economy after the end of the Vietnam War and the rise of Reaganism.  Willis stresses that in this period it was possible to bypass the usual career paths (a big deal in the Fifties, with its Organization Man and Lonely Crowd, not to mention its Rebels Without a Cause), not merely to pasture one's soul but to engage, fully consciously, in intellectual and political work on a shoestring.  You could hardly expect the big organizations of the corporate state to subsidize work that undermines them, but young left intellectuals of Willis's generation didn't expect them to.  Willis's essay describes how she herself moved from freelance writing to regular journalism to the fringes of academia and ultimately to full-time teaching, always with an eye to supporting the kind of work she wanted to do, which included political activism as well as writing.  She wasn't representative of her generation, but neither, on her own account, was Robin Henig, who married young, started a family, and settled down while pursuing a career.  (Not to put her down: she got her BA at 19 and her Master's a couple of years later.)

The Henigs have so far ignored this political and historical context.  Like student debt, the economy and the job market are just out there, somehow, and their empirical resources have been resolutely apolitical.  They're also often ahistorical: most of the research the Henigs cite, while colorful, studies twentysomethings without a control group.  As I noted before, the Henigs' own basis for comparison is mostly limited to their own generations, aside from an occasional anecdote from Henig Senior's parents.  Twentysomethings are mostly compared to Boomers, not to the generations that came before the Boomers.  Maybe this will change later in the book, which is an engaging read.  I'll try to pick it up again early next year when it returns to the library shelves.

Always Vulgar and Often Convincing

A couple of years ago I alluded to an old article by Ellen Willis about the time CBS commentator Andy Rooney got into trouble for offensive statements.  I couldn't quote it directly because it wasn't online and I didn't have a copy of the original, but today I found basically the same material recycled in Willis's book Don't Think, Smile! (Beacon, 1999), on pages 148-149.
Consider the infamous Andy Rooney affair.  The 60 Minutes commentator was attacked by the Gay and Lesbian Alliance Against Defamation (GLAAD) for publishing a crude, antigay letter in the gay newspaper the Advocate and allegedly making racist remarks to a reporter.  As a result he was suspended from CBS, and elicited reproving editorial noises about "civil public discourse" from the New York Times.  This was seen in some quarters as a victory for the left.  Yet the real reason Rooney got into trouble was that he violated the media establishment's bland, centrist criteria for acceptable speech.  In demanding Rooney's removal, lesbian and gay activists appealed to precisely those standards of "civility" -- that is, niceness -- regularly used to marginalize their own speech.  While Rooney was slapped down for expressing bluntly illiberal views, it's hard to imagine anyone comparably left of the mainstream -- particularly in a libertarian direction -- ever having his job in the first place.  And suppose such a person did slip through and then wrote a letter to the editor defending illegal drug use or attacking organized religion as tyrannical -- can anyone doubt that he or she would have been not suspended but fired, and with little public protest at that?

The expression of a homophobic opinion is not an act of domination.  Where the real issue of inequality arises is in the consistent denial to radical dissidents of equal access to the mass media and other public forums.  Rather than pressuring CBS to throw Andy Rooney off the air, GLAAD should have demanded time on 60 Minutes to rebut him.  In choosing instead to define his speech as an intolerable threat, they merely reinforced the basic assumption of the dominant culture that we can't afford freedom, that all hell will break loose if we relax controls.  In effect, campaigns against offensive speech displace the fight for equality onto battles against freedom.  This is a tempting maneuver, particularly at a time when the left is weak and on the defensive, for a simple reason: fighting for equality is a difficult, long-term, exhausting process that meets bitter resistance every step of the way, while attacks on freedom often get immediate results and -- odd, isn't it? -- sympathy or even outright support from the very people in power who are supposed to be the enemy.
Some things have changed since this was first published.  A few high-profile atheists, of whom the late Christopher Hitchens is probably the most notorious, have managed to publish attacks on religion in corporate media.  Hitchens was allowed to do this largely because he'd established himself as a vocal advocate of American aggression in the Middle East, but still, he did clear a space for vigorous anti-religious statements in the mainstream.  He also benefited from the access right-wing frothers won to corporate media: Rooney looked moderate, reasonable, and civil compared to Michael Savage, Ann Coulter, Rush Limbaugh, and the rest of the Right's attack clowns, while Hitchens could be pointed to as their left-wing equivalent.

But some things haven't changed, especially the dominant culture's assumptions about the importance of "bland, centrist criteria for acceptable speech" and its conviction that "we can't afford freedom, that all hell will break loose if we relax controls."  "In effect, campaigns against offensive speech displace the fight for equality onto battles against freedom," which are always in season, since we can't afford freedom; it was a nice experiment in the 1790s, but we now know that it's not relevant in a civilized modern society.

Rebuttal and debate are my preferred modes of response to offensive speech.  Way back in 1971 or 1972, late-night TV star Jack Paar made some antigay remarks on the air, and to his credit he allowed some activists from the Gay Activists Alliance or the National Gay Task Force to appear on his show and have a debate.  (Bruce Voeller is the only one whose name I can remember.)  The liberal newspaper columnist Nicholas von Hoffman wrote an attack on the gays; as I remember, he wasn't all that happy that Paar had let these weirdos speak on national television.  I wasn't pleased with the activists' performance; I remember thinking that we had several people in Bloomington who could have done a better job.  But the fact that they were allowed on national TV, and chose debate rather than suppression as their response, pleased me a great deal.  I foolishly thought that it would happen again.

I've noticed that a lot of people hate debate.  (Compare the popular meme "Debating on the Internet is like competing in the Special Olympics.")  It certainly has its limits, but I don't know of any better way to handle controversial topics without violence or repression -- though of course many (most?) people prefer violence or repression as a response to opinions they dislike.  I first encountered real debate in connection with the Vietnam War.  I wasn't a participant, but I found it interesting and exciting to watch other people confront and answer each other's claims.  But this debate was largely virtual, even then.  First I absorbed the official US line on the war through the mass media, not all of which was corporate in those days.  Then I heard Tom Hayden describe the history of US involvement in Vietnam in a speech at the University of Notre Dame.  What he said was so different from anything I'd heard before that I went to the library to check other sources, and I found out that he was right.  I also read Howard Zinn's book on the "logic of withdrawal" from Vietnam, which explicitly answered the official US arguments for continuing the war.  I argued about some of this with my mother, though of course our exchanges were warped by parent-child dynamics.  And so on.  You'll see that I constructed my own debate, but I chose debaters who stuck to the issues instead of just yelling "Commie" or "Surrender monkey" or "Warmonger" at each other.  I mean, what kind of weirdo would do that, instead of just listening obediently to Uncle Walt every night and hearing the sane, responsible point of view?  There was no need for me to question; the truth was out there for the taking.

I suspect that many people don't know how to follow a debate.  They think it's like a football game, where you figure out which team is your team, cheer every time it makes a good play, and then go out and overturn cars when it's over.  Some debates, I admit, are like that.  In a good debate, though, the kind that interests me, you listen to both sides and see how well they make their case.  The goal is not to convince the other debater that he's wrong, much less to score points; it's to convince the audience, or at least to shake their positions.  (National political debates, of course, are like a football game.)

But my original concern here was with "civility."  Two memories keep cropping up for me in that regard.  One is an exchange I had in comments on another blog after Gerald Ford died.  Some of us began discussing whether Ford should have pardoned Nixon, and at some point one of the other commenters said that she didn't like debate, because it involved disagreeing with another person, and she was afraid she'd hurt their feelings if she did that.  Some others agreed with her, and that brought us to something of a stalemate.  A few months later, someone posted anti-choice, pro-forced birth opinions in comments at the same blog, and my goodness but you should have seen the claws unsheathed!  I believe the same person who didn't want to hurt anybody's feelings joined the others in hurling abuse at the anti-choice commenter.  But on reflection I realize she wasn't really contradicting herself: it was only debate she didn't like, not abuse.  (And some people's feelings just don't count, apparently.)

The other memory involves RWA1 and his concern about civil discourse after the Arizona shootings last year.  Disturbed by the dreadfully rude rhetoric of the Far Left (i.e., liberal Democrats), he declared: "It is time to retire analogies to Nazis and fascists once and for all."  What he meant, of course, was for Democrats to retire those analogies; he himself was tossing around Nazi and fascist analogies within a month or two, and he never chided other Republicans for using them.  In that respect he was no different from liberal Democrats who called for civility: they were complaining about Republicans, not themselves.  Attacks on Republicans, in the crudest schoolyard terms, were just fine with them.  Truth be told, they're fine with me, too -- up to a point.  It feels good to mock and deride your opponents.  But at some point you have to start thinking.  Well, come to think of it, I guess you don't.  Who has time for thinking?  Most people just don't have time to inform themselves, so they find a team to cheer for and another team to jeer at.  You have to know your priorities.

Sunday, December 16, 2012

Put Your Hand Inside the Puppet Head

I scooped this one up from Facebook today.  Arthur C. Clarke knew a fair amount about science, but he got into trouble when he strayed beyond his field.  I've had occasion to make fun of him before, so today I'll settle for making fun of this statement.

Two questions: When did Religion "hijack" morality, and more important, from whom did it "hijack" it?  On its face the statement (from his 1991 "Credo") makes very little sense.  As I've pointed out before, a good many atheists talk about Religion as though it were an autonomous entity, virtually a person, that keeps us from being sane and rational.  But religion is a complex of practices, beliefs, and ideas that human beings invented. Saying that religion hijacked morality is like saying that a puppet turns around and punches its puppeteer in the nose, all by itself.  In fact the puppet can only do what a human being makes it do.  To add to the fun, religion (or politics, or science, or philosophy, or art) is like a puppet with billions of puppeteers who disagree with each other, often vehemently, about what the puppet should look like, or do, or say.  But it can't do anything without its puppeteers.

Historically, it's most likely that morality and religion were originally inseparable -- that people worked out moral systems within the context of religious belief and practice -- just as religion and science used to be, or religion and art, or religion and philosophy.  Clarke was disingenuously reversing who really tried to snatch morality from whom: it was scientists who tried to claim morality for Science and Rationality.  It's possible to doubt how much of an advance this was.  Scientists tended to accept a lot of religious morality uncritically; they just wanted to be in charge of enforcement.  So, instead of executing homosexuals or putting them in jail, as the irrational churchmen often wanted to do, scientists favored institutionalization with "treatment," ranging from lobotomies to electroshock to doses of hormones.  Scientists tended to agree that women should not go to college or enter the professions, since it was scientific fact that higher education drove women insane or made them sterile; the history of women in the sciences makes for depressing reading, and reveals the religious roots of science all too clearly.  Scientists continue to embarrass themselves on the subject of rape. The masturbation hysteria of the nineteenth century (and extending well into the twentieth) was the work, not of theologians, but of medical doctors.  Scientific racism is still with us, as is the readiness of scientists to provide politicians with ever more destructive weaponry.

Of course, scientists are not united on these issues, but neither are religious believers.  The puppeteers are divided against themselves on just about everything.  That's not bad in itself; I consider it reassuring.  The trouble is that the puppeteers believe that the puppets have lives and minds of their own, which is exactly the kind of irrational magical thinking that people like Clarke lament, even while sharing it.

Clarke did say one thing I can agree with: "It is amazing how childishly gullible humans are."  It's confirmed by the people who made his remark about religion hijacking morality into a meme, and by those who are spreading it around the Internet.

Saturday, December 15, 2012

A Petition to Put a Hot Tub in the Cafeteria

I'm not going to say much about the mass shooting in Newtown, Connecticut, because I don't care to add to the flood of reactive gibberish that is out there already.

My liberal law professor friend called for the repeal of the Second Amendment, saying that it is "not relevant in today's society. A civilized society doesn't need armed citizens. We don't need a militia...."  I had already been thinking along those lines myself -- that is, that the only way to get around the Second Amendment and the huge body of legal precedents that obstruct gun-control laws is to get rid of the Second Amendment itself, either by repealing it or by amending it.  But phraseology like "relevant in today's society" and "civilized society" always sets off alarms in my mind.  The bipartisan standby "We've got to do something!!!" is seldom far behind.  (I strongly urge that it replace IN GOD WE TRUST as the national motto.)

Then I got e-mail from Daily Kos, imploring me to sign their petition "asking President Obama to help start a national conversation about gun control."  "If we don't start talking now, when will we ever? And if the president doesn't lead the discussion, who will?"  We're talking, remember, about a president who has killed many innocent people during his time in office, and has joked about it; and who has yet to say anything remotely intelligent about any important issue, since his public persona is driven completely by marketing and PR considerations.  You'd have to be pretty slow to ask such a man for leadership in anything, except maybe killing more people, which he will do anyway; no need to ask.

Of course there's all the prattle about hugging kids, and praying; the picture above was up on Facebook last night.  It might, just maybe, help people far from the incident feel better.  As a thoughtful response, though, it mainly reminds me why Life of Pi annoyed me so much: it demolishes any claim that religion deals with catastrophe and suffering better than alternative worldviews do.  If religion is the opiate of the people, it's not an effective one.

Friday, December 14, 2012

Afterlife of Pi

Again, spoiler spoiler spoilers will probably appear in what follows.

After writing the previous post I allowed myself to read some reviews of Life of Pi.  One of the best I found was Tasha Robinson's at the Onion AV Club.
A central plot point in Life Of Pi—the film adaptation of Yann Martel’s bestselling book—centers on the philosophical question of whether animals have souls. The title character, a self-possessed Indian boy, believes they do, and that people can tell by looking deep into their eyes. His zookeeper father feels differently; in his opinion, any depth in an animal’s eyes is just human emotion reflected back at the viewer. This conundrum—essentially, the question of whether to interpret the world spiritually or cynically—becomes the backbone of the plot. But it also works into a choice that the characters present directly to the viewers, about whether they want to take that plot literally or metaphorically, whether to focus on the film’s body, or accept its soul.
The choice is harder than it was in Martel’s book, because here, the body is more compelling by far.
By "the body" Robinson means the glorious visuals of the movie's main story, Pi's voyage across the Pacific in a lifeboat accompanied only by an adult Bengal tiger.  The second story, the one Pi tells from his hospital bed to the Japanese investigators, doesn't get the full CGI treatment but simply Pi's talking head against a white background.  As Robinson argues in Spoiler Space, "the real story (if that’s what it is) only gets a flat verbal retelling ... It seems to be a conscious distancing effect, with Lee strongly stacking the deck in favor of the fantasy by making it so much more cinematically compelling."  Many people believe that the audience is expected to take this second story as the "true" one -- what actually happened after the freighter sank -- and that the story we've been watching for nearly two hours is a fantasy or hallucination.  Maybe so.  It's often a toss-up when one is expected to decide which of two fictional alternatives is "real," since neither of them is, yet both are.  In this case, there's a revealing bias involved in the assumption that the less edifying story is the "real" one, and the more colorful one the myth, except that it's supposed to be the true real one because reality is a drag.

Robinson's framing of the movie's central question shows just how inadequate that question is.  Is the second story, which involves a sociopathic ship's cook who kills and kills again before guiltily surrendering to Pi's justice, "cynical"?  It could be interpreted in "spiritual" terms just as easily.  Think of great religious stories like Crime and Punishment or The Brothers Karamazov, which take a dark view of human beings yet still find meaning in them.  "Spiritual" doesn't mean only cotton-candy primary-colors happy endings, but I suspect that's what many people who made Life of Pi a bestseller think it should mean.  As with the book of Job, being deprived of everything and then physically tortured isn't given meaning by having one's belongings and health restored -- even if Job had more children, that doesn't mean his first batch of children wasn't slaughtered unjustly with Yahweh's permission.  (The reviewer of Life of Pi for the Guardian does a terrible job, one example being his reference to "Pi, howling like Job into stormy skies."  The guy ought to try reading what Job said; he didn't howl, he eloquently read his god the Riot Act.  Pi howls, but unlike Job's, his god doesn't talk back.)

One commenter, I think on the remarks by Samuel Delany that I quoted in the previous post, lamented that we don't have a good spiritual writer like C. S. Lewis around.  Lewis was an interesting writer in many ways, but he nearly always let his thinking be hobbled by dogma.  Still, this might be a good time to recall some of his comments on suffering, from his 1940 book The Problem of Pain:
We are perplexed to see misfortune falling on decent, inoffensive, worthy people -- on capable, hard-working mothers of families or diligent, thrifty little trades-people, on those who have worked so hard, and so honestly, for their modest stock of happiness and now seem to be entering on the enjoyment of it with the fullest right. How can I say with sufficient tenderness what here needs to be said? ... Let me implore the reader to try to believe, if only for a moment, that God, who made these deserving people, may really be right when He says that their modest prosperity has not made them blessed; that all this must fall from them in the end, and that if they have not learned to know Him they will be wretched. And therefore He troubles them, warning them in advance of an insufficiency that one day they will have to discover. The life to themselves and their families stands between them and the recognition of their need; He makes that life less sweet to them. ... The creature's illusion of self-sufficiency must, for the creature's sake, be shattered. ... And this illusion ... may be at its strongest in some very honest, kindly, and temperate people, and on such people, therefore, misfortune must fall [96-98].
Those who think only of the Narnia books when they hear Lewis's name must ignore his specifically Christian writings, which it must be remembered were intended as defenses of Christianity.  Lewis's justification of torture of the creature by its creator should be borne in mind when contemplating Life of Pi.  At least Lewis engaged the problem, however inadequately, instead of smiling benignly at it as the movie does.

Most modern Christians seem determined to ignore the less cutesy parts of the New Testament; many brush aside the Hebrew Bible as the domain of an "Old Testament God of wrath," but that's evasion, since both Testaments are about the same god, and there's no shortage of wrath in the New Testament in any case.  (Those who don't ignore those parts, like the people who made and liked The Passion of the Christ, with its CGI-enhanced torture scenes, are another can of worms.)  Many of the positive interpretations of Life of Pi try to turn it into a feel-good fable.  To its limited credit, the movie never anthropomorphizes the tiger.  If it has a soul, it's not a human soul but a tiger's: not Milne / Disney's Tigger nor Bill Watterson's Hobbes, but a tiger's.  What exactly is answered by claiming that animals have souls?  What is a soul, anyway?
 
This conundrum trips up a number of earnest reviewers.  Roger Ebert, for instance, writes that "wild animals are indeed wild and indeed animals," but human beings are also animals, and religions don't necessarily treat us as all nice inside, even if we do have "souls."  The Bible depicts bloodthirsty humans who will be justified by their bloodthirsty lord of armies.  The film shows young Pi grappling with the doctrine that Yahweh killed his own son for the sins of the world, but Yahweh only had to do it because his own "justice" required blood sacrifice to atone for sin.  If Pi had been stuck in the lifeboat with a herbivore, his voyage would have been a lot less fraught -- and let's not forget the importance of cattle in Hinduism.  Yet a few sentences later Ebert writes:
The writer W.G. Sebold once wrote, "Men and animals regard each other across a gulf of mutual incomprehension." This is the case here, but during the course of 227 days, they come to a form of recognition. The tiger, in particular, becomes aware that he sees the boy not merely as victim or prey, or even as master, but as another being.
That's highly debatable even within the world of the movie.  Once safely onshore, the tiger disappears into the forest without looking back, and Pi cries disconsolately in the arms of his rescuers because the great cat did not grant him any "recognition."  Why should it have done?  I don't see any basis for Ebert's fine rhetoric about the tiger seeing the boy "not merely as victim or prey, or even as master, but as another being."  In the end the movie accepts the warning of Pi's father, that you may think you're seeing recognition in a tiger's eyes, but you're really only seeing your own reflection.  But it looks like many viewers will ignore this, trying to turn it into a parable of interspecies communication and love.  If so, the love is unrequited, but I don't consider that tragic, let alone "spiritual."

I agree with A. O. Scott, the New York Times reviewer:
The novelist and the older Pi are eager to impose interpretations on the tale of the boy and the beast, but also committed to keeping those interpretations as vague and general as possible. And also, more disturbingly, to repress the darker implications of the story, as if the presence of cruelty and senseless death might be too much for anyone to handle. Perhaps they are, but insisting on the benevolence of the universe in the way that “Life of Pi” does can feel more like a result of delusion or deceit than of earnest devotion.
But I repeat: there's nothing in "spirituality" itself -- which can be quite morbid, misanthropic, and even "cynical" -- that entails vagueness or repression of the darker implications of life.  It's this underlying assumption of the movie that made it hard to watch despite its visual beauty and great performances.  And as an atheist, I believe that any worldview which claims to make sense of the world has to grapple with questions like those of suffering and evil.  Here again I part company somewhat with Samuel Delany, who wrote that the filmmakers "couldn't have come up with a better script promoting atheism if they had gotten Christopher Hitchens to write it."  Atheism isn't itself an answer or a solution to the problem of suffering.  I prefer the witch Granny Weatherwax in Terry Pratchett's Lords and Ladies:
"I don't hold with paddlin' with the occult," said Granny firmly. "Once you start paddlin' with the occult you start believing in spirits, and when you start believing in spirits you start believing in demons, and then before you know where you are you're believing in gods. And then you’re in trouble."

"But all them things exist,” said Nanny Ogg.


"That's no call to go believing in them. It only encourages 'em."