Thursday, January 31, 2013

The Importance of Being an Earner

Today on Democracy Now! I heard again this snippet from President Obama's recent speech on immigration reform:
PRESIDENT BARACK OBAMA: We have to deal with the 11 million individuals who are here illegally. Now, we all agree that these men and women should have to earn their way to citizenship.
Oh, really?  And how did you "earn" your way to citizenship, Mr. President?  Right, you were born.  Me too.

These remarks reminded me of rhetoric I used to hear in the Sixties (though it's undoubtedly older) that The Negro Must Earn His Rights, or show that He deserves them.  Even as a teenager I noticed that all that whites had done to earn our rights was to be born white.

In the story and discussion that followed, I noticed that the Republican proposals for "reform" (a word that should always be put in quotes, it seems to me) include requiring an English proficiency test and a civics test to get a green card -- not citizenship, mind you, just legal residency.  That's not a blatantly unfair requirement, I suppose, just quietly and firmly unfair -- as co-host Juan Gonzalez said, those are presently requirements for citizenship -- but I'd want to know how rigorous the test would be.  Could most native-born Americans pass it?  I've seen the written test for citizenship, because some of my foreign-born coworkers studied for it, and while I could probably pass it, I feel sure that most of the white native-born anti-immigrant racists I know could not.  (I base this on conversations I've had with such people over the years, but also from what I see on Facebook: if they ever took a high-school civics class, they didn't learn anything.)

On the other hand, many of the undocumented foreigners in the US aren't looking to become citizens: they are here to work and earn money to send back home.  Most of them want to go back to their countries eventually, and they do.  Despite the fantasy that many Americans have, not everyone in the world wants to become an American, even, or especially, after they've lived here.  Even those who become citizens and master English have to cope with white racism, and increasing numbers of legal immigrants are going back home.  The problem will be how to deal with the temporary immigrants -- call them migrants -- who aren't interested in earning US citizenship, just in earning money.  And they do earn it, working as most of them do in unrewarding service jobs for long hours.  Those on "guest visas," whose jobs may be more interesting, still have problems.  As Mae Ngai, one of the guests, remarked,
We have a lot of experience in this country with guest worker programs, and I think that it should really give us pause. The problem with temporary labor visas is that if the employer holds the visa, as in the case of the H1s, then the worker really has no rights at all. If you say, "You didn’t pay me" — and this is what happens a lot in the lower end of the H2 program — "I didn’t get paid. I was forced to do overtime, all these things," you’re just sent home. You have no rights, and you can’t quit, you know. And we all understand in this country that the quintessential thing of being a free labor—of free labor, is the right to quit, as well as the right to organize. And those are things that you can’t get with a temporary labor visa.
Ngai also said:
I’m reminded a lot of the difference between immigration at the turn of the last century and immigration at the turn of this century. In many ways, they’re similar: It’s a mass migration, it’s a labor migration, it contributed to a dynamic growth of the country’s economy and culture. The main difference, though, was, a hundred years ago, there were no numerical restrictions. So when people say, "My ancestors came legally; they didn’t break the law; they didn’t cut to the front of the line," well, there wasn’t any line. Ninety-eight percent of the people who showed up at Ellis Island got in. And that’s a big difference.
At first I was taken aback by Ngai's claim.  What about the anti-immigrant nativist movements of the American past, with their hostility even to "white" European immigrants?  The 1790 Naturalization Act limited the possibility of naturalized citizenship to "free white persons," extended to persons of African descent in 1870 but still excluding Asians -- a limitation that remained in force until well into the twentieth century.  What about immigration quotas?  I did some looking, and found that the first US immigration quotas were passed in 1921.  The Page Act of 1875 blocked immigration by "undesirables," which in practice kept Asian women out of the country.  The Chinese Exclusion Act of 1882 blocked all immigration by Chinese.  The Immigration Act of 1924 was intended to limit the numbers of Eastern and Southern Europeans while excluding Asians and Middle Easterners altogether.  It's significant that Ngai refers to Ellis Island, which processed immigrants entering the US from Europe, but then immigrants weren't supposed to be coming from Asia anyhow.  The number of "illegal" immigrants from Europe or Asia was limited by the fact that few people were able to swim the Atlantic or Pacific Oceans, but an unknown number of Chinese entered the US with forged documents, as "paper sons."  So, while Ngai was technically correct, I think she gave an oversimplified and unrealistically positive picture of immigration in the past.

I'm not saying that immigration should be unrestricted, though I think there are better ways to address the issue than we've seen so far.  As this timeline shows, the amount of immigration is influenced if not determined by social conditions in the home countries and elsewhere: Jewish immigration in the late 1800s, for instance, was driven by pogroms and other persecution in Eastern Europe.  Mexican immigration now is affected by the state of the Mexican economy as well as by US businesses' demands for cheap, vulnerable labor.  (As Ngai points out, even legal immigrants with worker visas are vulnerable to the whims of their employers.)  Changes in policy are similarly affected: the Chinese Exclusion Act was repealed in 1943, during World War II, as a gesture to our Chinese allies against the Japanese enemy; the status of Filipinos ebbed and flowed.  White American racism remains a constant, though.

Which reminds me of something else: when Ngai mentioned "mass migration," I thought for a moment she was talking about internal migration, such as the movement of southern African-Americans to the northern states just before and during World War II, in search of better jobs and less Jim Crow.  Even migration within the US, by American citizens (albeit of the 'wrong' color), has been controversial in some circles.

Wednesday, January 30, 2013

Do As I Say, Not As They Did

One thing I didn't expect when I started blogging was that I'd receive offers of material from publicists whose clients had ideas to promote, or from artists inviting me to review their work.  It makes sense, and  I'm interested in hearing from these people, but until now I hadn't received anything I wanted to write about.

Yesterday I received a message from a publicist I've heard from before, on behalf of the work of Richard E. Kelly, "a self-described 'survivor' of Jehovah’s Witnesses" and the author of Growing Up in Mama’s Club and The Ghosts from Mama’s Club.  Kelly wants to warn against the danger of "cults," fringe religious groups with tight authoritarian structures.  This doesn't seem to be as hot an issue as it was thirty years ago, when such groups got a lot of hostile media attention, but Kelly is working the same field.  What caught my attention was Kelly's list of the signs that indicate a group might be a cult.  As I noticed during the last big wave of cult alarmism, these traits describe early Christianity very well.  The similarities help to explain why the early churches were regarded with suspicion and distrust, even hostility.
The following beliefs should be considered cult constructs, he says.
• Certainty that the world will end in one’s lifetime: This is a crucial pill to swallow for a subsequent list of cult beliefs, which keep followers in a perpetual state of fear. If only one holds true enough to a strict set of rules – like avoiding pledges of allegiance at school, for example – then they may be spared at Armageddon.
This one, especially, is a "construct" of early Christianity.  All the New Testament writers took for granted that the world would end within the lifetime of Jesus' first followers.  Jesus himself said so, according to the gospels, and the same belief turns up in almost all the New Testament writings: where it's not put at center stage, it's assumed.  The Apostle Paul often refers to the nearness of Jesus' return.

The early Christians also got into trouble for their refusal to participate in expected demonstrations of loyalty to the Roman Emperor, "like avoiding pledges of allegiance at school."  They refused to show reverence to the divine ruler of the empire; the US simply requires reverence to its flag, though the flag too is accorded divine significance ("under God").
• Social manipulation: For Jehovah’s Witnesses who are not observant of all rules, ostracism and shunning is used. How to handle someone who questions policy? Make sure their family ignores them!
This is also a trait of early Christianity.  Both Jesus and Paul enjoined believers to shun or expel their fellows who were "not observant of all rules."  Of course, family pressure was a normal way to enforce social conformity in antiquity, as it is to this day.  (In Pray the Gay Away Bernadette Barton shows not only contemporary fundamentalist Protestants disowning their gay children, but Roman Catholics as well.)  The early Christians reacted as modern cults do: by declaring themselves the new family of converts, and requiring them to break with their blood relations.  One follower asked Jesus' permission to go home for his father's funeral, for example, and Jesus told him curtly to leave the dead to bury the dead.  When Jesus' mother and brothers came to remonstrate with him, he simply refused to see them, designating his obedient followers as his true mother and brothers. 
• Cripple half of the members (women): For Jehovah’s Witnesses, women are seen as creatures trapped somewhere between men and animals in God’s hierarchy. No woman can have a position of authority, which means it's men only for preaching, teaching and praying. If there’s an official meeting and a woman prays she must cover her head out of respect for the angels who might be there.
This is true not only of much of early Christianity, but of modern churches.  The apostle Paul forbade his churches to allow women to preach or occupy any position over men, which is used by the Roman Catholic church today to rationalize its refusal to ordain women as priests.  This wasn't true of all early churches -- there was at least one woman apostle in the first Christian generation, mentioned by Paul in his letter to the Romans -- but eventually the male supremacists won out.  Outsiders in the second and third centuries accused the Christians of letting their women run wild, a charge they answered by insisting that they kept their women as subservient as any good Roman did.  
• Scorning education: Who needs advanced learning when the world is sure to end in a few short years? Kelly’s sister, Marilyn, had very little education, so when she was finally able to leave home, she had few coping skills. She ultimately met an abusive third husband, who later murdered her.
The early churches were ambivalent about book learning.  Jesus rejoiced that his heavenly father had revealed the deep secrets of the universe to the poor and unlettered, and Paul exulted that the cult of Jesus, built around his crucifixion, was offensive to Jew and Gentile alike.  Later on, some early Christians found it expedient to acquire good Greek educations, and to work as tutors to children, and some early Christians distinguished themselves as scholars, but in general they were dismissive of such pursuits.  Only after Christianity took over the running of the empire did learning gain more status.
• Sexually repressive: Jehovah’s Witnesses are thoroughly indoctrinated in how to harness the power of the sex drive to please God. It’s obsessive compulsive when it comes to creating rules about sexual do's and don’ts, from masturbation to the role of women; from conception to sexual pleasure. Sex before marriage is an onerous crime, punishable by shunning and death at Armageddon.
The whole Bible is sexually repressive, but the New Testament reflects trends toward asceticism that were current when it was written.  Jesus warned his followers that even feeling sexual desire was worthy of damnation, and held up those who became eunuchs for the kingdom of heaven as examples to all.  Paul permitted but discouraged marriage, since the end was near and marriage was a distraction from devotion to the Lord.  In the book of Revelation, a group of 144,000 "who were redeemed from the earth" sang a new praise song in front of the throne of Christ: "These are they which were not defiled with women, for they are virgins" (14:3-4; King James Version).  Though some branches were less strict than others, mainstream Christianity kept this strain of hostility to women and the flesh throughout its history; and why not?  It is thoroughly biblical and supported by Jesus' own teaching.

Of course I'm not a supporter of the Jehovah's Witnesses, any more than I am of any other sect.  What I want to draw attention to here is the ignorance of many right-thinking people today, who condemn modern sects for behaving and teaching pretty much as the biblical Jesus and the early Christians did.  That's not surprising, because Jesus and the early Christians were religious fanatics whose extreme teachings and conduct would have gotten them in trouble in just about any time or place.  But most people today -- even many atheists, to my ongoing surprise -- want to see Jesus and the early church as good guys, even as exemplars for our troubled times.  I think it's much more likely that such people would have despised Jesus and his followers, even if they didn't condone overt persecution, for the same reasons they despise and attack modern groups which follow in the early Christians' footsteps.  It's clear, at least, that they have no idea what early Christianity was like.  Most of what Kelly (and most other anti-cult writers) sees as characteristic of certain small cranky sects is really characteristic of mainstream Christianity, historically and sometimes down to the present.

Something similar: my liberal law-professor friend recently shared on Facebook an announcement of an upcoming event showcasing Islam at her university, probably run by campus Muslim groups.  The poster touted the Koran as providing "guidance" for humanity.  I'm all for religious education, though I expect Muslims to be as honest and accurate about their faith as Christians are -- which isn't very.  I don't agree that the Koran is a guide for humanity, any more than the Bible is.  I can oppose and criticize the widespread demonization of Islam in the US without endorsing Islam or any other religion.

Tuesday, January 29, 2013

The Power of Critical Thinking

One reason I'm so skeptical when people claim that the Internet has somehow sapped our precious critical fluids, and that people are now more credulous than they used to be, is that I'm old enough to remember the days before the Internet.  One famous case is that of the petition allegedly filed with the Federal Communications Commission by the atheist Madalyn Murray O'Hair to ban all religious broadcasting from the airwaves, so all faithful Christians should write the FCC right now!!!  As Snopes.com explains if you want the details, this hoax goes back to at least 1975, long before the Internet became accessible to ordinary mortals.  Though the hoax was debunked many times, the FCC received more than 30 million pieces of mail decrying the nonexistent attempt to ban religious broadcasting.  That's thirty million, and all without e-mail or Facebook.  It may well be that the Internet has made it easier for such rumors to spread, but word-of-mouth, photocopying machines, and radio evangelists did quite well without microprocessors and fiber optics, thank you very much.  In the absence of evidence that people are more gullible than they used to be -- which I haven't seen, and I don't know how you'd try to prove the claim -- I think it's fair to presume that people have always been willing to believe absurd things.  The history of religion alone gives plenty of evidence in favor of my opinion, and I'm inclined to cite the frequent and unsubstantiated claim that people are more gullible or more stupid than they used to be as more evidence in itself.

I've just begun reading Helen Merrick's The Secret Feminist Cabal: A Cultural History of Science Fiction Feminisms (Aqueduct Press, 2009), which attempts to trace women's participation in the world of science fiction not only through the women who wrote it, but through the women who participated in fandom.  I came across a reference to the book while I was rereading Joanna Russ's The Female Man last month, and so far I'm enjoying it.  Among much else, Merrick recounts the genesis of probably the first women's fanzine, Femizine, in England in the summer of 1954.  Also pre-Internet, you'll notice.  There were three editors, but the editorial manifesto in the first issue "was signed by the main editor, Joan W. Carr 'on behalf of all femme fans' (and co-editors Frances Evans and Ethel Lindsay)" (90).
Reading this zine for evidence of the interests and passions of female fans of the 1950s is, however, complicated by the fact that Joan W. Carr was not, in fact a woman, but a hoax, a fictional persona created by male fan Sandy Sanderson.  While Frances Evans was aware of Carr's real identity, co-editor Lindsay initially was not, and the hoax was not revealed to UK fandom and the readers of Femizine until May 1956 [91].
Femizine turned out to be very popular, drawing up to 100 fan letters for each issue.  (The letters pages of sf magazines and fanzines back then were the forerunners of the computer bulletin board and Usenet decades later, both in terms of their style and of their demographic.)  The editors exploded the hoax themselves in the summer 1956 issue, and to their relief, "fannish reaction to the hoax was not the catastrophe Evans and Lindsay feared ... That the response of women fans was not as bad as the editors feared may have been due to the fact that many had secretly not found "Joan" as enticing as the men [did]."  One female fan commented that "Not everyone liked Joan wholeheartedly.  My own reaction was that she was another of those masculine sergeant-type women -- horribly competent and out to prove they are as good as any man by acting like a man!" (92-3).  Femizine stopped publication for a while, then restarted in September 1958 and ran for two more years.

This is a relatively benign and minor deception, but it shows that the written word, whether printed or mimeographed, can provide the same kind of anonymity that the Internet does now.  And people tend to believe what they read, and what they hear.  (Most of the time, anyway: the biblical scholar James Barr pointed out that though fundamentalist Christians deplore the materialist, secularist skepticism that doubts the literal reality of the biblical miracles, they are harshly and scornfully skeptical of miracle stories outside the Bible, whether in Catholic lives of the saints, "heretical" Christian writings, or "pagan" mythology.  Richard Dawkins and Daniel Dennett have nothing on them.)  In a way that's neither surprising nor foolish: we are social animals, and trust is the glue that lets us communicate with each other.  But it is also surprising and foolish.  One of the primary uses of language is to lie, and children figure out how to do it themselves before they learn to distrust their parents on Santa Claus or other official lies.  We couldn't survive if we decided to disbelieve automatically everything anyone says to us; but we'll get into trouble if we believe everything automatically.  We have to learn to use judgment, which means learning to use reason and evidence but doesn't stop there.  It takes experience and intuition, which must be cultivated throughout one's life.  But the right kind of skeptical, critical judgment will be discouraged forcefully by some or most of the people around us, starting with our parents, so it's hardly surprising that most people never really learn it.

Monday, January 28, 2013

They Don't Make Sheeples Like They Used To

Another day, another fake Buddha quote.  An old friend shared this on Facebook today:

Knowing his audience, he added a comment: "Someone will say 'Buddha didn't say that', I dont care."

In reply, someone posted, "'The trouble with the Internet is, you can't tell which quotes are authentic' - Buddha."

While trying (not very hard) to track this one down, I came upon Fake Buddha Quotes, a blog by a Buddhist writer posting as Bodhipaksa who also knows something about tracking down sources.  He covers this quotation here.  He thinks it came from a 1994 book by an American Buddhist teacher called Jack Kornfield, whose title (Buddha’s Little Instruction Book) makes it sound as if the teachings it contains come from the Buddha himself, rather than through the teacher.  He concludes that he's "not actually sure what it's saying," and suggests that it's a "deepity", a word "coined by the teenage daughter of one of [philosopher Daniel Dennett's] friends. The term refers to a statement that is apparently profound but actually asserts a triviality on one level and something meaningless on another."  I'd have thought that Dennett would consider everything from outside of science and positivist philosophy to fit this definition; he's coughed up a few of them himself, for that matter.  I would certainly classify a lot of canonical religious teaching as deepities.  (I think I'll stick with "platitude," though.  Dennett, like Dawkins, seems to be overfond of coming up with new words for old ideas, like "Bright," that always seem to be a bit off.)

A good many people get all spitty when it's pointed out that they've misattributed a quotation, either first-hand or by passing along a misattribution.  Others benignly brush it aside, because what harm is done?  I don't believe they'd be as blithe about misattribution if the same empty verbiage were attributed to someone they didn't like.  (I must get around to making a meme of my own, based on a feel-good platitude that was big in the 60s.  It occurred to me that it would show just how meaningless it is if I put it over pictures of some other less appealing people.  I'll get to this Real Soon Now, I promise.)

If it doesn't matter who said it as long as it makes sense -- though "making sense" is an odd criterion for religious types to invoke -- then why attribute it to the Buddha, or to anyone? It seems to me that someone tried to give their saying prestige by putting the Buddha's name on it. If it's valid no matter who said it, then why attribute it to anyone in particular -- especially to someone who didn't say it?

Whether it's valid is another matter. It's another one of those nicely vacuous platitudes that could also be true if it's reversed: "The problem is, you think you have no time." OMFG, how profound! We're always in a hurry, but time is an illusion and we exist in eternity, so high five! Since it's meaningless, it can also be interpreted differently if you attribute it to someone else. Auric Goldfinger, for example: Your problem is, Mr. Bond, you think you have time.  As usual, the people who love this quotation take for granted that their understanding of the platitude is the true and only one, though given its ambiguity it could be interpreted in numerous other ways.


I browsed a little through FBQ and found this post, which quotes from and links to another blogger's piece on the problem of falsehoods being spread over the Internet.  An excerpt from the other blogger's post:
Why do I care, you ask? Because it’s a waste of time. Because I want to believe that the people around me aren’t knee-jerk emotional reactionists willing to dispense with logic because the internet is such a shining bastion of quality information. Because it takes no time at all to stop, consider, and question. Because truth is better than bullshit. Because right is better than wrong…
Now, I agree wholeheartedly with the basic point, but I do have some objections to some of what the other blogger, John Shanahan, says.  Some of the trouble is apparent in the paragraph I just quoted above: he wants to believe that the people around him aren't knee-jerk emotional reactionists, etc.?  That's faith, and the worst kind of faith at that: believing what you know isn't so.  The evidence is clear that the people around him are knee-jerk emotional reactionists willing to dispense with logic and so on.  And so, judging by his refusal to believe the evidence of his own observation, is he.  You're mad too, Mr. Shanahan, or you wouldn't be here.

Shanahan begins his post thusly:
At some point in the earliest days of e-mail, the internet began to take away our power of critical thought. It moved in slowly, taking hold like a cancer and then spreading. It preyed on our trust. It made us think it wanted to protect us from things or tell us about interesting things of which we were unaware.
Oh my goodness, you don't think this suddenly began with the advent of the Internet, do you?  It's much older than that, probably as old as humanity.  When did people conscientiously use their power of critical thought?  And it wasn't the Internet that did this.  The Internet doesn't prey on your trust: people prey on your trust.  The Internet didn't make you think anything; people do that, and in fact they can't make you think much of anything.  They can cajole, persuade, harass, seduce, and try to manipulate you, but it's you who decide whether to go along with them or not.

A bit later, after dismantling a couple of popular bogosities, he wails:
I know that there are some anti-Snopes people out there, or those who would challenge my use of it as a source. (It’s been done.) But you can’t refute a reference that–hang onto your hats, outright sheepish believers!–cites its sources. Its legitimate sources. Holy research! Sweet mother of web-based-journalism, how can this be?
Now, as a matter of fact, you can refute a reference that cites its sources, even legitimate sources.   The person citing those sources might misquote or misrepresent them.  The sources might be inaccurate, or mistaken.  For example, the economist Amartya Sen wrote in his Development as Freedom (Knopf, 1999, p. 168) that
the authoritative Encyclopedia Britannica, in its vintage eleventh edition, refers to the Indian famine of 1344-1345 as one in which even "the Moghul emperor was unable to obtain the necessaries for his household." But that story runs into some problems. It is sad to have to report that the Moghul empire in India was not established until 1526. Perhaps more important, the Tughlak emperor in power in 1344-1345 -- Mohammed Bin Tughlak -- not only had no great difficulty in securing necessaries for his household, but also had enough means to organize one of the more illustrious programs of famine relief in history. The anecdotes of unified starvation do not tally with the reality of divided fortunes.
I admit that I haven't checked Sen's sources here myself.  But that's just it: I could conceivably refute the "reference" of a Nobel-Prize winning economist who cites his very reputable sources, if it turned out he had misrepresented them.  If he's right, then Sen refuted such a "reference" himself.  What you learn by checking sources is whether they've been accurately represented.

Bodhipaksa, to his credit, recognizes this himself.  He points out that all he establishes by tracking down a Buddha quotation to the canonical sources is whether it appears there.  Even the older Pali canon was preserved orally for several centuries before it was written down, and it is possible (even likely) that some of the teachings attributed to the Buddha there weren't actually spoken by him.  Anyone trying to sort out the authentic teachings of Jesus faces the same problem: we can say with fair confidence what the canonical gospels report that Jesus said, but we can't be sure whether he actually said those things.  But if someone prints the text of the Gettysburg Address and tacks Jesus' name onto it, you can be fairly sure that it's inauthentic.  If a saying like "The trouble is, you think you have time" appears nowhere in the canonical Buddhist scriptures and only appears for the first time in a book published in 1994, it might be a very fine saying and true, but there's no reason to claim that the Buddha said it.

At the same time, the resistance many people give to admitting this is evidence of something: it suggests how sayings never uttered by this sage or that came to be attributed to them.  It sounded nice, and such a wise man could very well have said it because he was wise and the saying is wise, so why not put his name on it?  This must be a trait of human psychology: it's ancient and widespread and difficult to dislodge.  Therefore it must be taken into account whenever someone does serious historical study.  It's a reminder of how tenuous and tentative most human knowledge is.

It must also be a product of the evolutionary process: we survived and flourished because a lack of scrupulosity about who said what is deeply rooted in human nature.  Oddly, many rationalists resist drawing this inevitable conclusion, just as John Shanahan resists recognizing that most people (including himself) are not particularly rational.  A commenter on his post wrote:
Every where you go on the internet, you are constantly beckoned to “follow” something…your blog, for example, or twitter…so why be surprised that we’ve become a race of sheeples?
"Become"?  Nope, we always have been "a race of sheeples," thanks to thousands and millions of years of evolution.  When I pointed this out, the blogger urged me to "throttle down the righteous indignation there, sport."  Well, that's only natural.

Saturday, January 26, 2013

A Fine Mess

I'm just about exactly halfway through Wuthering Heights today.  It's the third time I've read it, mainly because I'll soon be seeing Andrea Arnold's 2011 movie version.  I have never liked the book as much as I'm supposed to, and while it's a rattling good read I still don't get it.  I don't sympathize with any of the main characters, though judging by the various other film versions I'm supposed to see Heathcliff as a romantic, even Byronic figure, and Cathy likewise.  A lot of classic love stories look to me like two people tied together with a lead weight hung from their feet, thrown into the sea, struggling as they sink into the watery depths, and too busy blaming each other for their predicament to even try to untie themselves.  And all through it runs the refrain of one of Eric Berne's Games People Play, See What You Made Me Do: because you don't love me as I wish to be loved, because I went and married the wrong person, because you made the weather bad today, I'm going to tear my hair out, bang my head against the wall, and hold my breath until my face turns blue.  And when I'm dead, you'll be sorry, and I'll be laughing!  You just wait!

Ironically, Emily Bronte herself shot down any notion of Heathcliff as a romantic hero.  Aside from his wanton mistreatment of a dog, he tells Nelly, one of the narrators, after he has married Isabella Linton:
"She abandoned [her family] under a delusion," he answered, "picturing in me a hero of romance, and expecting unlimited indulgences from my chivalrous devotion. I can hardly regard her in the light of a rational creature, so obstinately has she persisted in forming a fabulous notion of my character, and acting on the false impressions she cherished. But at last I think she begins to know me. I don't perceive the silly smiles and grimaces that provoked me at first, and the senseless incapability of discerning that I was in earnest when I gave her my opinion of her infatuation and herself. It was a marvellous effort of perspicacity to discover that I did not love her. I believed, at one time, no lessons could teach her that. And yet it is poorly learned, for this morning she announced, as a piece of appalling intelligence, that I had actually succeeded in making her hate me -- a positive labour of Hercules, I assure you!"
This shouldn't be taken quite literally, since we've also been shown that Heathcliff enticed and encouraged the girl's infatuation with him.  But no one is responsible for anything in this book: it's all somebody else's fault.  That gets on my nerves, but my main question is how anyone could have mistaken the book for a love story at all.  Even the core relationship between Cathy and Heathcliff is an addiction, not anything I recognize as love.

Since I began this post I've finished the book, and I'm still baffled.  The happy ending feels forced, and I'm of two minds about Wuthering Heights.  It feels on the one hand as if Bronte knew exactly what she was doing when she wrote it, and I'm just failing to follow the structure; but it also feels as if she got distracted, forgot where she was going, and tied the threads together carelessly.  But again, she wrote with too much confidence and authority for me to be able to believe that.  A riddle of a book, and I can't solve it.  Maybe next time around. 

Friday, January 25, 2013

The Banality of Banality


This meme turned up in my feed at Facebook yesterday, and when I checked I found, to my surprise, that it was authentic: Wilde did write these words, in his essay The Soul of Man Under Socialism.  No doubt he had in mind the gospel saying "Lay not up for yourselves treasures upon earth, where moth and rust doth corrupt, and where thieves break through and steal: But lay up for yourselves treasures in heaven, where neither moth nor rust doth corrupt, and where thieves do not break through nor steal" (Matthew 6:19-20).

But then I began to wonder, and not for the first time, why people choose a platitude like this from the writings of someone who produced a huge store of genuinely witty, intelligent, thought-provoking and meaningful sayings.  (One example out of many: "A thing is not necessarily true because a man dies for it.")  Almost anyone could have said the words in that meme, and many have.  Jesus himself was hardly original: the transience of material possessions was a cliche in his own time.  (Helpful hint: if a religious or philosophical teacher preaches against accumulating material possessions, that is a sign that he lives in a culture where most people accumulate material possessions.)  Many of the quotations I see on Facebook consist of putting banalities into the mouths of famous people who didn't say them, which is why this one almost fooled me.

Potatoe

I've been rereading Harvey A. Daniels's Famous Last Words: The American Language Crisis Reconsidered (Southern Illinois UP, 1983), which I've quoted before.  This time I want to pass along another depressing bit, where Daniels quotes an article by "a Chicago City College teacher" for the Chicago Tribune, "bracketing testimony about his own nobility with dozens of student errors" (232).  Here's one of the examples of student stupidity offered up:
I was born in the state of Mississippi, where I started school in the south the teacher didn't teach much about writing the little I know I learn in high school where I attented here which isn't very much, you see.
Daniels remarks:
One wonders what Mr. de Zutter's students thought when they saw their own writing held up as examples of stupidity (and hilarity, according to the Professor's accompanying commentary) for the amusement of the million or so readers of the Tribune.  What did these students say to Professor de Zutter the next time they came to class?  What did he say to them?  How, exactly, was the work of the class advanced by the public ridicule of the students' efforts?  What about the student who had already confessed that what he knew about writing "isn't much, you see"?  What was the purpose of humiliating someone already so humble?

Professor de Zutter, and all of his colleagues around the country who have written or abetted articles in this vein -- exposés that depend on reproducing the worst sentences from the clumsiest essays of the weakest student -- have demonstrated something worse than student illiteracy: they have confirmed their own incompetence.  Teaching anyone anything requires a modicum of trust -- teaching writing requires perhaps more than teaching any other subject.  The writing instructor who is filled with a perpetual sense of outrage over his students' inadequacies, who is obsessed with the shortcomings of their past training, who loathes their attitudes and tastes, who actively expects to despise both the form and content of everything they say or write, who feels that such work is beneath his dignity, who wants to get out of teaching writing, who is so contemptuous of his students' morale -- such a person will never teach anyone to write.  His students may learn to hate their teacher, which might be appropriate.  But they will probably also learn to detest writing, which is a sad and unnecessary waste [232-3].
Daniels goes on to point out that this contempt for students is "so far as I can tell almost unheard of among elementary teachers" (233).  Not always, though: I've long been haunted by this anecdote from Colette Dowling's The Frailty Myth: Women Approaching Physical Equality (Random House, 2000).
The lack of encouragement girls perceive often come in the form of criticism and even verbal abuse from coaches and parents. In tee ball, a baseball-type game for five- and six-year-old boys and girls, coaches and parents make it clear that they’re not overly interested in the girls who play. And it isn't because they can’t throw. Girls who display strength, power, or physicality when interacting with boys run the risk of being marginalized. “I know the boys don't like it that I run faster than they do,” said Amanda, the fastest runner in seventh grade. “But I do.”

... Worse, the coaches didn’t take their young charges seriously, not even the female coaches. “None of the girls want to be there,” one coach told Landers.* “Not one. If I put a coloring station in the corner, every girl would be there. ... Dads take their sons out to throw with. Girls stay inside and play dolls.”

... In the Georgia tee ball scenario, the coaches reprimanded the girls more frequently and more harshly than they did the boys – and often for the same behavior that was accepted in boys. When they weren’t criticizing individual girls, they were projecting global images of girls as incompetent. When one of the male coaches dropped a ball he was trying to throw, he said, “Look, I throw like a girl.” Apparently he felt so embarrassed, he was compelled to turn the scene into farce: “He contorted his arms and awkwardly threw the ball toward Helen, [a child] who was standing in front of him.” This grown man deflected attention from his own gaffes by bringing one of the girls onto the scene as “a prop for depicting her own incompetence,” Landers wrote in her field notes.

This male coach might have done better to sharpen his own skills, but instead he used the girls to gloss over his clumsiness. After dropping the ball another time, he said to one of the girls, “I’m a little girl. I can’t catch the ball.” Such mockery obviously sends a powerful message to girls -- not only about their abilities, but about their very worth. This coach was telling his girls they were inferior, weak, and not to be confused with strong, powerful boys. The brainwashing worked. The coach's treatment of them and the unwelcoming attitude of the boys on the team contributed to the waning interest of girls during the season [90-2, emphasis added].
When I read this, I imagine a scenario in which the parents present surrounded the coach, and when they dispersed, he was gone but for a little greasy splotch on the grass.  But it seems that the parents didn't object: they wanted the girls out of the program as much as the coach did.

It also occurred to me that the teaching of sports to children provides evidence of how social construction works.  If a little boy can't throw or catch a ball, adults will expend quite a bit of energy to teach him how.  It will take no little determination on the boy's part to get the adults to leave him alone, if he isn't interested in learning this skill.  If a little girl can't throw or catch, adults will throw up their hands and say, "What can you expect?  Girls just naturally can't do this, and they don't want to."  If a girl persists anyway, more pressure will be applied to drive her away, and it will probably succeed if she doesn't have adult allies.  There will be general agreement that it is nature, not nurture, at work in this sorting process.

This may relate to what I wrote yesterday about moral absolutism and "grounds" for criticizing it: if a society doesn't want little girls learning to throw a ball, I may indeed have difficulty answering the claim that they shouldn't.  But absolutists often try to justify and rationalize their dogmas: not only shouldn't little girls throw a ball, they naturally don't want to and naturally can't.  This sort of claim can be rebutted and refuted.  (Remember too that the coaches who tried to drive girls out of tee-ball league didn't invoke religion; religion isn't the only vector for moral absolutism.  Whatever people want to believe will, however, sooner or later find its way into religion: It's not I who say this, it's God, and you cannot go against the word of God.  This move is only effective if you want to be fooled by it, but many people do.)

It then becomes evident that the desire to humiliate students who don't measure up to the instructor's standards is not just a personal neurotic quirk: it functions, often with social support, to weed out undesirables, students who grew up in areas with inadequate schools and whose parents were unable for whatever reason to take up the slack.  (The title of this post comes from an incident that is probably mostly forgotten now: in 1992, then Vice-President Dan Quayle attended an elementary-school spelling bee.  When one of the students spelled "potato" correctly, Quayle intervened to get him to add an "e."  If one comes from a well-to-do white Republican family, poor spelling and grammar skills need not interfere with one's ability to rise in society.)

In one respect, I'm all for people who flaunt their punctuation, spelling, and grammar intolerance openly: it makes it easy to identify and pick on them.

*Dowling is referring here to Melissa A. Landers and Gary Alan Fine, “Learning Life’s Lessons in Tee Ball: The Reinforcement of Gender and Status in Kindergarten Sport,” Sociology of Sport Journal 13 (1996): 87-93.

Thursday, January 24, 2013

Relative Absolutism


I knew I was skating carelessly on thin ice over some important problems in yesterday's post, but I allowed myself to feel rushed.  Besides, they weren't central to the topic of the post, and I knew I could address them separately when I was ready.

Today I got e-mail from a regular reader who questioned me about some of the matters I'd slighted, so I'll try to answer him.  I don't think I have any final answers on these issues, though: like most of my posts, this is a first draft, an exploration of ideas rather than a final disposition of them.

My correspondent wrote:
I fear how far this can be taken.  In Saudi culture, the oppression of (“protection of”, they would say) women is MORAL.  Murdering gays is MORAL. By your non-system, outsiders, westerners would have no grounds to critique this.
First, I don't think I was positing a non-system in the post; or a system either.  I hadn't gotten that far.  I was criticizing Penn Jillette's apparent moral subjectivism -- if it feels good to Penn Jillette, do it -- not offering a better way to settle moral questions.

Second, Saudi culture has no grounds for oppressing women or murdering gays either.  When I said that there is no absolute morality, I meant it, so I don't recognize Saudi culture's absolutism and I don't have to be uncritical about it.

A word about "grounds" in this context.  I think it was in one of Mary Midgley's books that I first encountered the idea that having grounds or a foundation for an argument is a metaphor based in pre-Copernican cosmology: the earth is the unmoving, immovable center of the universe, and if you are anchored in the earth, your system is securely founded and will stand against the Devil and his relativist minions.  This is one reason why even a sun-centric cosmology, let alone the vast uncentered relativistic universe that succeeded it, was so unnerving to many people, including scientists: it induced a feeling of vertigo, literal and metaphorical.  (Which is why some scientists are trying to find absolute directions in the universe to this day.)  There is no absolute up or down in the universe either, any more than there is an absolute right or wrong.  When I mention this to some other non-theistic types, they tend to get angry.  They declare that "grounds" and "foundations" are metaphors, you're not supposed to take them literally.  On one hand they're factually mistaken -- "grounds" and "foundations" used to be meant quite literally -- and on the other hand they're evading the problem, because if they are metaphors, what do they mean, and what are they metaphors for? I suspect the metaphor allows many people to feel more sure of themselves than they have warrant (another metaphor) to be.

So there's a hidden assumption in my reader's objection: if you don't have some absolute grounds for your beliefs, you can't critique someone else's beliefs: the only way to oppose one absolutism is with another absolutism.  But of course you can.  One way is to compare their principles to their practice, or to try to show how their principles contradict each other.  Or you can argue that their principles are wrong, by trying to find principles that you agree on and going from there.  Your critique, and your attempt at persuasion, might fail, but so might an attempt to convert your opponent to another moral system.

I suspect there is another hidden assumption at work here: that change in moral and ethical values over time has come because human beings abandoned "false" morality as they came to discern "true," absolute morality.  To believe this, I'd have to see reason to believe that there was such a "true" morality, and I'd want to know how one tells the true from the false.  To many people it's just as apparent that people have progressively fallen away from "true" morality as they understand it, after all.  But I see this mainly as the imposition of a linear progression on history, rather than as anything that follows from it.

I'm not the first person to notice that most of history's worst atrocities (as they appear to me) were committed by absolutists, not by relativists.  That makes sense, since absolutists are more likely to have the moral confidence necessary to sacrifice other people's lives.  Relativists, it seems to me, should be more tentative about imposing final solutions. I have noticed that absolutists quickly become relativists when they encounter pressure from outside: they want other people to compromise their morality, but refuse to do it themselves.  At the very least they demand "respect" for their inhumane values and practices -- a respect they feel equally entitled to withhold from others.

In the case of the hijab, for instance, I have seen plenty of Islamic criticism of non-Muslim women's immodesty and indeed libertinism for not covering their heads.  That head-covering has not been a universal Muslim custom in all times and places escapes these critics.  But they certainly feel entitled to attack non-Muslim customs without feeling that they are interfering in another culture's values.  There's a popular derisive quip in multiculturalist circles about "white women protecting brown women from brown men"; it cuts both ways, but no one seems to notice that.

I don't see that the only alternative to moral absolutism is total nonjudgmental "relativism."  I suspect that "relativist" is in practice an epithet, not an argument -- much like "pacifist" or "isolationist."  It's a common tactic to accuse one's opponents of a total lack of values or norms when they challenge one's particular stance, and this is a diversion, not an argument.  I'm not arguing a middle ground, I'm arguing a third or fourth or other alternative.

While it's true that there is no absolute up or down in the universe, we can and do speak meaningfully of "up" or "down" on earth.  It wouldn't be immoral if the sun went nova and blasted all life from the earth, but that doesn't mean it wouldn't matter to us.  I don't need to believe that my life matters to the universe to resist death for as long as I can.  But many people disagree: they want to believe that a robin redbreast in a cage puts all heaven in a rage, and that it's impossible for them to have morality if they don't believe that the universe cares as much about their every hangnail and tummyache as they do.  If they were right, then maybe earthquakes are caused by sodomy after all.  And if you run overtime on your parking meter, Nature kills a kitten.

I wrote in the previous post:
Luckily, we don't need an absolute morality.  A lack of absolutes apparently scares many atheists as much as it scares many theists, however, and "relativism" is an equally harsh epithet in both camps.  I think that it doesn't matter whether the universe cares what we do: what matters is that we, as human beings, care.  Even if there was a god, its interests would not be ours.  (Theists have always been aware of that -- Quod licet Jovi, non licet bovi; His ways are not our ways -- but they try to evade its significance.  Scientific non-theists know the same about Nature and Evolution, but still figure Nature is up there in the stands rooting for us if we show the right spirit.)  Human morality has to be decided at the human level.  There's no certain way to decide what is right and what is wrong.  Human beings construct right and wrong over time, and since human beings are fallible, some skepticism about even the most important values is necessary.  There's no impartial outsider to adjudicate the conflicts; we have to do it ourselves; the buck stops here.
I don't think I posited an absolute relativism there.  Yet people like my reader -- and he's far from alone -- tend to assume that if someone denies the validity of absolute morality, the only alternative is no morality at all: "I agree with you…but I think it leads to a nihilism that is challenging to address."  That simply doesn't follow, any more than it follows that without an absolute up or down we are going to fall up when we trip on the sidewalk.  (You don't know, it could happen!)  I don't see how the denial of moral absolutes "leads to a nihilism" of any kind.  "Nihilism" isn't rooted in the universe any more than absolute morality is.

Nietzsche derided what he called the "slave morality" of Christianity, and spoke of a transvaluation of values.  But for all that, he was a four-eyed sissy with a nervous tummy who lived safely in a highly regular European society, commonly considered the height of civilization.  I doubt he'd have lasted five minutes in, say, the slums of New York.  I don't think a genuinely "nihilistic" society made up of bold warrior individuals would last for long either.

It might be that for many, even most people, there's a powerful psychological need to try to connect their values to the universe.  I see this not only in morality and ethics, but in standards of personal beauty and artistic merit.  People want to believe not only that they think that a given movie star is gorgeous, but that the person possesses an objective, absolute gorgeousness independent of individual taste; or they want to believe that Shakespeare and Beethoven are objectively excellent, independent of taste and cultural difference.  The great poet will occupy a favored place in Heaven.  The deification of heroes, artists, and beauties in Greek antiquity might be a manifestation of this belief.  There's no reason I know of to agree with either belief, and none to think that it matters.  That people still cling fiercely to the metaphor of foundations of knowledge indicates that the metaphor possesses a strong psychological appeal, even though it is factually false.  If we really can't think or judge without these illusions, so be it, but we should at least acknowledge them as a limitation in our ability to think clearly, and not as a merit of the illusions themselves.

My reader also argued:
Plus, your non-system assumes monolithic CULTURES.  Rebels within Saudi Arabia, for example, would have little grounds on which to oppose oppression, would they?
Not only didn't I posit a non-system, I didn't say or assume anything about "monolithic CULTURES."  But those rebels wouldn't need "grounds," since the dominant culture has no grounds either.  I consider the fact that cultures aren't monolithic to be favorable to my position.  If all Saudis, male and female, agreed that women need to be "protected," change would be impossible.  But no culture has that kind of unanimity.  And no system is completely free of contradiction -- certainly not a system of values.

How did the West abolish slavery, for example?  The abolitionists couldn't point to the Bible, which regulated slavery but accepted it as part of the social landscape.  Apologists for slavery could appeal to Biblical authority, so abolitionists had to come up with new arguments.  Many of them weren't very good, and they were hampered by many white abolitionists' inability to recognize blacks as fully human beings.  Yet slavery was abolished, in most cases nonviolently.  In the US, which used a horrifying Civil War to do the job, blacks were pushed back into servitude as soon as formal slavery was abolished, and racism persists to this day.  No one really had solid ground on which to stand, except the slaveowners who could point to Scripture.  Yet change happened, unevenly, very painfully, and it's still in process.

I keep thinking, in this connection, of Bob Jones University's final abolition of its policy against interracial dating among its students.  For decades the administration insisted that they had what they (if not the worldly) considered good Biblical reasons against the practice.  But when the change came, they admitted that they didn't really know why they had clung to the policy for so long.  They couldn't find a Biblical reason for it, not anymore.

The same applies to women's status, or to gays.  Many of the arguments on all sides are groundless.  I don't know of any grounded reasons why homosexuality should have equal status with heterosexuality, but then I don't need any.  I don't consider the Bible an authority; while I don't lightly dismiss longstanding public attitudes, I also don't consider them above criticism.  I can't overthrow them in one move, but I can chip away at them.  Arguments against homosexuality are mostly pretty bad; the best I've seen start from assumptions about maleness and femaleness, and the proper role of sex in human life, but they are assumptions; start from other assumptions, and you get different results.  But many of the arguments for the acceptance of homosexuality are no better.  A popular one in my youth was that homosexuality was so common, according to Kinsey; but it made assumptions about what "homosexuality" was, and even worse, assumed that what is common, even prevalent, is therefore good.  Racism is at least as common as sex between males, but racists don't get much traction if they try to argue that racism is good because most whites are racist -- especially as they realize that whites are no longer as overwhelming a majority in America as they used to be.  Gays have had to disregard the prevalence of antigay prejudice, so numbers alone aren't an argument.  Why have antigay bigotry and homophobia diminished in the US and some other countries in the past fifty years?  No one knows, but I doubt it was because of a careful, reasoned debate.

I've argued that the debate on same-sex marriage has been incoherent and badly reasoned on both sides.  Again, I don't think grounds have much to do with it.  But the advocates of same-sex marriage aren't arguing for moral nihilism, despite the claims of some of their opponents.  The debate isn't rationally grounded, but it's not free-floating either.  The answer lies not somewhere in between, but somewhere else.

Wednesday, January 23, 2013

Do You Have a Personal Relationship with the Null Set?

A friend passed this one along on Facebook today.  I couldn't believe it: though he was surely unaware, Penn Jillette was using a well-worn evangelical trope.  I figure it's well-worn because I first saw it in a book by the great revivalist Billy Graham,* and I doubt he invented it -- it probably had grey whiskers when he used it:
Recently, a friend of ours was converted to Christ.  He had previously led a wild life.  One of his old friends said to him, "I feel sorry for you.  You now go to church, pray, and read the Bible all the time.  You no longer go to the nightclubs, get drunk, or enjoy your beautiful women."  Our friend gave a strange reply.  He said, "I do get drunk every time I want to.  I do go to nightclubs every time I want to.  I do go with the girls every time I want to."  His worldly friend looked puzzled.  Our friend laughed and said, "Jim, you see, the Lord took the want out when I was converted and He made me a new person in Christ Jesus" [128].
That certainly seems clear enough, doesn't it?  Become a Christian and the Lord will take the want out.  But on the very next page Graham admits that "the want" is still there:
Conscious of my own weakness, sometimes on rising I have said, "Lord, I'm not going to allow this or that thing to assert itself in my life today."
Well, there's his problem right there: it's sinful pride for the kind of Christian Graham is to think that he can control his old Adam.  Only the Holy Spirit can keep this or that thing from asserting itself in his life.
Then the devil sends something unexpected to tempt me, or God allows me to be tested at that exact point.  Many times in my life [what] I never meant to do in my mind I did in the flesh.  I have wept many a tear of confession and asked God the Spirit to give me strength at that point.  But this lets me know that I am engaged in a spiritual warfare every day.  I must never let down my guard -- I must keep armed.

Many of the young people I meet are living defeated, disillusioned, and disappointed lives even after coming to Christ.  They are walking after the flesh because they have not had proper teaching at this precise point.  The old man, the old self, the old principle, the old force, is not yet dead or wholly renewed: it is still there.  It fights every inch of the way against the new man, the new force, that God made us when we received Christ.  Only as we yield and obey the new principle in Christ do we win the victory [129f].
It's easy to see how these young people were led into error: they listened when Billy Graham (or someone like him) promised them that their old selves would be killed with Christ and washed clean by the Holy Spirit, and that the Lord would take the want out.  When the want turns out still to be in them, they can then be blamed for it, because they haven't prayed enough or really surrendered their will to the Lord or something of that sort.

But what in the name of Nobodaddy is Jillette doing with this bit of nonsense?  It's a clever evasion of the question it pretends to answer, and will surely snow the rubes, as it's meant to.  He couldn't possibly mean to imply that if you abandon theism, no-god will take the want out.  (Could he?)  He must know that even if he is virtuous enough never to want to do anything bad or harmful to others, many people aren't cut from the same sublime cloth.  It says something about Jillette that he turned it into a question about himself, when it's about other people.

There is no easy answer to that question.  More thoughtful people than Jillette have been arguing about it for a long time.  A better, more serious rebuttal would be to point out that despite their belief in God, Christ, and a personal Hell, most Christians have continued to sin.  Christians know that they shouldn't rape or murder or steal, yet many of them do so anyway.  American prisons are full of Christians; atheists are seriously underrepresented in our prison system.  It could be that being assured of forgiveness leads people to assume that they can misbehave and get away with it if they confess and repent.  In any case, religious belief, including belief in judgment for one's sins, doesn't seem to deter people very much.

But even this begs some questions.  For an atheist as pure in heart as Penn Jillette, there's no problem: he's not even tempted to do bad things.  But what about those who are tempted?  Why should they resist?  A conventional atheist answer -- at least, it's conventional enough that the American Humanist Association selected it for their ad campaign -- is along the lines of "Be good for goodness' sake."  Aside from the tautological irrationality of this line, what is good?  How do you decide what people should or shouldn't do?  It would be nice if there were universal human agreement about morality, but of course there isn't.  Even such widely agreed-on principles as "Don't murder" or "Don't rape" come with gaping loopholes, whether the principles are articulated within religion or without it.  (We have to wipe out the Islamofascists to defend ourselves! and The slut was asking for it, the way she was dressed -- or more subtly, evolutionary necessity requires me to penetrate any pretty girl I see before my colleague gets to her.)  Atheists haven't, from what I've seen, distinguished themselves by their superior rationality where morality is concerned, even or especially when they do so in the name of Science, Biology, Evolution.

I agree with Michael Ruse that there is no absolute morality: the universe doesn't care what we do.  Luckily, we don't need an absolute morality.  A lack of absolutes apparently scares many atheists as much as it scares many theists, however, and "relativism" is an equally harsh epithet in both camps.  I think that it doesn't matter whether the universe cares what we do: what matters is that we, as human beings, care.  Even if there was a god, its interests would not be ours.  (Theists have always been aware of that -- Quod licet Jovi, non licet bovi; His ways are not our ways -- but they try to evade its significance.  Scientific non-theists know the same about Nature and Evolution, but still figure Nature is up there in the stands rooting for us if we show the right spirit.)  Human morality has to be decided at the human level.  There's no certain way to decide what is right and what is wrong.  Human beings construct right and wrong over time, and since human beings are fallible, some skepticism about even the most important values is necessary.  There's no impartial outsider to adjudicate the conflicts; we have to do it ourselves; the buck stops here.

Penn Jillette, I understand, is a Libertarian as well as an atheist, and that may be part of the trouble here, though I don't know enough about his version of libertarianism to evaluate it.  But what he articulates here is consistent with a Libertarian / Randite assumption that morality is founded in the individual in isolation.  (Which comes from social-contract theory, a useful but limited heuristic.)  But morality for a social species only comes into play when individuals are in conflict with each other.  If no individuals were ever tempted to kill, lie, rape, or steal, as Jillette claims he is not, there'd be no problem.  If individuals existed in isolation, they wouldn't come into conflict.  But we don't exist in isolation, and it often happens that human beings come into conflict, so we have had to develop structures for trying to resolve these conflicts.  Maybe Jillette isn't as stupid as this meme makes him seem.  I hope not; but he's clearly not as smart as he thinks he is.**

*The Holy Spirit.  Waco, TX: Word Books.  As reprinted, New York: Warner Books, 1978.

** I'll freely concede that I'm not as smart as I think I am either, but I don't know how smart I think I am.

Tuesday, January 22, 2013

The Decline of Elizabethan English


Survivor's Guilt; or Nature Thinks She's So Smart

(Grammar obsessive digression: I can't quite make up my mind whether the apostrophe in the title should be before the final s or after it -- singular or plural?)

But anyway, here's another meme that's going around, like the flu but slightly less virulent:

"Faith is the reorganization of observations so that they fit neatly into a particular belief system." Well, no, that's science: the fitting of observations into a particular belief system, aka theory. This is on my mind today because I'm reading Walter Alvarez' T. Rex and the Crater of Doom (Princeton, 1997).  (Also, a more technical point: observations have no meaning except as they are framed within a theory or other organizing framework. Facts don't speak for themselves. This isn't exactly rocket science, though a lot of scientists talk as though they didn't know it.)

T. Rex and the Crater of Doom is Alvarez' account of how he and many other scientists filled out a speculation with facts that, they concluded, validated and confirmed the speculation.  The speculation, which has become fairly well-known by now, was that the mass extinction which killed off the dinosaurs and many other species of animals and plants 65 million years ago was caused by a large object, probably an asteroid, hitting the earth.  The mass extinction, formerly known as the Cretaceous-Tertiary Extinction (also KT, as Alvarez refers to it throughout the book), has been renamed the Cretaceous-Paleogene Extinction Event by Politically Correct zealots at the International Commission on Stratigraphy.  (The same kind of people who decided Pluto isn't a planet.)  There have been numerous mass extinctions, and no one knew why they happened; KT is one of the best-known because of the public fascination with dinosaurs.

It's a fascinating story, and it undermines the faith-science dichotomy in the meme above.  At the beginning Alvarez didn't have any evidence for believing that KT was caused by something big hitting the earth; he had a hunch.  Then he had to think of what could constitute evidence, then he had to look for it.  Occasionally he found what looked like disconfirming evidence, but he didn't lose faith: he kept digging and thinking until he found new facts or a new way to think about the facts he had.  In the end he came up with a lot of evidence that confirmed his hunch, and I'm not denying that it's good evidence or that the KT extinction happened, or happened that way.  What I'm saying is that it shows that a simple science/faith dichotomy doesn't work.

The general acceptance of Alvarez' theory struck a blow against the uniformitarianism that had dominated geology for a couple of centuries, which again had been a matter of faith: scientists simply asserted that no catastrophes had any important effects on the earth.  The doggedness with which they clung to this belief is interesting.  There was nothing illogical, let alone supernatural, about the idea of an asteroid or comet striking the earth, but it was resisted fiercely for a long time.  Not only that, but some scientists continued to reject Alvarez' theory long after most of their colleagues accepted it, arguing that his evidence didn't prove what he thought it did.  That's a reminder that rejecting a scientific theory is not in itself anti-science, which many scientists forget in times of controversy.

T. Rex and the Crater of Doom also quietly rejects the heroic fantasy of the solitary scientist battering away at Nature's privacy until she surrenders her secrets: from the beginning Alvarez worked with other scientists, and a worldwide network of colleagues contributed data and suggestions to help.

Scientists have to have faith in their theories, or they would abandon them at the first disconfirming evidence, and science would fail.  They also have faith in science itself, since science has as many embarrassing failures in its history as it has successes.  It's an old joke among historians and philosophers of science that the scientific method is like a besieging army: if it fails to capture one city, it moves on to another that looks more vulnerable, in hopes of better success.  Where faith comes in is the belief that eventually every problem will be conquered.  Declaring that science will never be able to solve its unsolved problems is analogous to declaring that gods definitely don't exist: it's notoriously hard to prove a negative, but asserting the positive requires faith.

There's one thing about Alvarez' presentation that bothers me, though: his personification of Nature.  This might be partly the all-too-common scientists' assumption that you have to dumb down a popular account to snow the rubes.  Mary Midgley wrote that when she "complained of this sort of thing to scientists, I have sometimes met a surprising defence, namely, that these remarks appear in the opening or closing chapters of books, and that everybody knows that what is found there is not to be taken literally; it is just flannel for the buying public" (Evolution as a Religion, Methuen, 1985, 67).  Whatever the reason, Alvarez writes of Nature as a conscious agent, even an adversary, throughout his book.  Some examples (there are many more):
As we struggled to understand what had happened, it almost seemed as if Nature had cleverly constructed a maze of alibis, misleading clues, and false trails [82].

As scientists, we are engaged in a conversation with Nature. We ask questions – like “Where is the crater?” – by making observations or performing experiments. And Nature answers, with the results of the observation or the experiment [85].

… to understand the real meaning of Nature’s answers, or how many ways there are to make mistakes and get fooled [86].
This last one reminded me of the immortal beginning of Robert Treese's 1973 paper on homosexuality and the Bible: "What is God trying to tell us about homosexuality?"  You'd think science was engaged in a game of charades.
But we had all been fooled!… How did Nature fool us? Only years later, after the Yucatán crater was finally found, did we come to understand how we had been misled [94].

Nature misled us by mixing sedimentary rocks rich in calcium and magnesium together with the underlying continental crust, which was rich in silicon [95].

At last, it seemed, Nature’s trick had been figured out … It was satisfying to have finally understood Nature’s ploy, but the satisfaction was premature. Nature was about to have another laugh at our expense [105].
And so on, and on.  I remember a passage in Norbert Wiener's Cybernetics (1948) to the effect that the physical scientist has an advantage over the social scientist in that Nature doesn't cheat.  That's because Nature is not a person: Nature is an abstraction, though it has long been personified as an alternative to God, a way of avoiding theological reference in books while still anthropomorphizing the inanimate, impersonal world.  It seems to be very difficult to avoid personifying the impersonal, even for scientists, but resisting that temptation is supposed to be a major part of what science is about.  And how flattering to cast oneself as one who outsmarted crafty, dissimulating Nature!  It's like beating Death at a game of chess, almost as good as stealing fire from the gods.

Monday, January 21, 2013

Too Big and Too Close


I've given Bernadette Barton a hard time for most of Pray the Gay Away, so I want to praise her for making a good point.
Instead of receiving the support they need to weather tough times, same-sex couples who are open about their relationships are frequently censured by heterosexuals as well as other gay people for "flaunting" their homosexuality.  In my role as a teacher and public speaker on such issues, I have listened as heterosexuals have explained that they don't have an issue with homosexuality, they just don't understand why so many gay people need to flaunt it because no one goes around announcing their heterosexuality.  This is a problematic statement for three reasons.  First of all, it's wrong.  Heterosexuals constantly flaunt their heterosexuality.  Every time a heterosexual wears a wedding ring, discusses his children, and vacations, and all the routine activities he did last weekend with his spouse, he announces heterosexuality [107].
... And so on. She devotes most of three pages to the topic, and does a very good job of it.  Credit where credit's due, with extra points for noticing and acknowledging that some gay people play this game too.

There are other worthwhile parts to Pray the Gay Away, such as Barton's account of her ethnographic field observations of an Exodus International convention.  Exodus is probably the best-known of the ex-gay ministries.  It's a pity that Barton has nothing to say about Exodus' repudiation of change therapy last summer; it took place years after her observations, probably while the book was in press.  In any case, Barton found the conference disturbingly comfortable, even alluring.  Despite Exodus' embrace (at the time) of the idea that homosexuality comes from disturbance in the formation of "gender identity" -- boys acting girly and vice versa -- the conference had plenty of room for sissies and bulldaggers: Barton "considered approximately 40% of the people I encountered during the conference to be gender non-conforming, that is, their gender presentation did not conform to heterosexist standards of butch men and feminine women, including leaders of the organization, Christine Sneeringer and Alan Chambers" (125).  That, I'd say, compares favorably with mainstream gay organizations, which have also traditionally been concerned with gender presentation for defensive PR purposes, but also, depressingly often, from self-hatred.

Barton tells how several of her Bible Belt gay interviewees reported "meeting partners, flirting with, having sex, and developing crushes on people they met in ex-gay ministries, particularly at Exodus conferences" (127).  I've always thought that ex-gay groups would be a good place to meet guys, though dating people so mired in self-hatred is not a good idea.

But then Barton quotes the president of Exodus, Alan Chambers, on gay protesters outside the conference:
As I think of them, the image that comes to my mind is of those hungry, starving, balloon-bellied babies in Africa, and your eyes well up in tears seeing them living in a dust bowl and you think, "Sure, I'll give 18 cents a day" [129].
Meeoww!  Barton finds Chambers's remarks disturbing, however.
Chambers's analogy comparing the demonstrators to starving babies in Africa disturbed me as it framed LBGT activists not only as severely lacking spiritual nourishment but also drew on racist and colonialist imagery [129]. 
Of course Barton and her Bible Belt gay informants feel the same way about antigay Christians, to say nothing of those struggling with ex-homosexuality.  These tactics can be and are used by both sides.  Later she mentions a speaker at the conference who worked with a quotation from Ephesians, and it struck me that "spiritual seekers" like Barton are fond of quotations too.  This is what Walter Kaufmann called exegetical thinking: reading one's own beliefs into a text and getting them back endowed with authority.  Spiritual seekers draw on a wider range of sources, not limiting themselves to the Bible, and they interpret them as freely as any fundamentalist uses the Bible, respecting neither literary nor historical context.  Many of those quotations are bogus, as anyone can see on Facebook, which is just one more outlet for them.  Again and again I noticed how sure she is about what a god should do, and how she assumes -- in the face of so much evidence to the contrary -- that a god would want us to be happy and sexually fulfilled.  Maybe so, maybe not, but I think it needs to be argued, not assumed.  Barton believes "that living one’s whole life without sexual intimacy is not 'God’s miracle of celibacy' but rather unsustainable and unhealthy" (141).  Fine, as long as she knows -- and I don't think she does -- that other "spiritual seekers," including Jesus and the Buddha, have disagreed with her.

Also interesting is her account of a field trip she made with a group of her students to the Creation Museum nearby in Petersburg, Kentucky.  It proved to be a stressful experience from the time of arrival:
Museum signage alerts guests that improper [or “inappropriate”?] actions and/or statements are grounds for dismissal from the facility. In other words, dissenting thoughts and action are evidence of sin that might have otherworldly (that is, keeping one out of heaven) and material (that is, barring entrance to the museum) consequences.
Could be, but it sounds to me like the Museum just wants to be a Safe Space.  A safe space for me might not be a safe space for thee, however.  Barton and her students thought that they should be allowed to set the terms of safety on other people's premises.  (Sort of like Americans who think that foreigners should learn English before coming to visit the US, because this is America and we speak English here -- but also that foreigners should learn English to deal with Americans even in their home countries, because the US is a dominant economic force in the world and foreigners need to adjust to that reality.)  One complained of "several disappointing remarks re: gay marriage and homosexuality" (164), but come now: they'd have been disappointed if those remarks hadn't been there.  In the Creationist mindset, Barton intones, "Evolutionists are of Satan; Creationists are of God" (166).  Silly Creationists!  Barton could have told them that it's the other way around.

Sunday, January 20, 2013

No One Could Have Foreseen This Situation

(This photo appeared on Reuters (I think), which suggests it may be genuine.  I haven't been able to find it anywhere else, so it's suspicious.  But it sure fits.)

Today Avedon Carol linked to a recent post at The Nation by Rick Perlstein:
We have on our hands a President Groundhog Day ... [R]egularly, and regularly and regularly, Obama initiates a negotiation; finds his negotiating partner maneuvering him into an absurd impasse; then “negotiates” his way out of a crisis with a settlement deferring reckoning (in the form of further negotiation) to some specified time in the future, at which point he somehow imagines negotiation will finally, at long last, work—at which point the next precipice arrives, and he lets his negotiating partners defer the reckoning once more.
Perlstein thinks this pattern comes from Obama's personal psychology, which he promises to explore in a future post.  Avedon thinks it's conscious and deliberate: "Or, at least, that's the story we're meant to believe...."

Liberals and many leftists tend to agree with the far Right that Obama is really a diabolically clever rope-a-dope Eleven-Dimensional Chess master, manipulating his opponents to get what he really wants -- though they disagree as to what he really wants.  I think both groups are giving him too much credit.  I also think it's irrelevant.  If I could know that Obama's totally sincere, I'd still criticize his policies and his ability to negotiate.  If he's a canny secret corporatist (I'd agree that he is, except it's not secret) or a wily secret anti-colonialist socialist, the question still arises of what to do about it: elect more Democrats to Congress?  Or more Tea Party Republicans, to defeat the Kenyan Usurper?  It is to laugh.  The epithet "conspiracy theory" is thrown around by right-thinking people of both parties to dismiss explanations they dislike, whether the theory is supportable or not -- but only for other people's conspiracy theories, never their own.

Perlstein also writes that in 2011
The president reportedly thought he and Boehner were working together—'to freeze out their respective extremists and make the kind of historic deal that no one really thought possible anymore—bigger than when Reagan and Tip O’Neill overhauled the tax code in 1986 or when Bill Clinton and Newt Gingrich passed welfare reform a decade later.'
It's the bit about "their respective extremists" that's important here. Boehner's "extremists" are those Republicans who want to dismantle the Federal Government except for the military and the surveillance machinery, which is arguably an "extreme" position. Obama's "extremists" are those Democrats who want to preserve the New Deal and the Great Society, which is a conservative position in the strict sense of the word, and anything but extremist.  If anything is really "centrist" in the US today, it's opposition to cuts in social programs and support for higher taxes on the wealthy.  Which, among other things, goes to show how meaningless the word "extremist" is.  Insofar as the word applies to anyone, it applies to President Obama, Speaker Boehner, and their loyal supporters.