Thursday, August 23, 2018

You $@#!&% Kids Get the #%&! Off My Mother#@%&* Lawn!

I keep worrying that I'm getting old, partly because I am, but you know what I mean.  These two tweets this morning from someone I follow, who will remain anonymous for now:
If y’all can name me five poor or working people that give half a fuck what [first name of prominent journalist] fucking [surname of prominent journalist] has to say about anything I’ll kiss your ass.

Got grown motherfuckers on the left "[first name of prominent journalist] you're being dishonest!!" Like the fuck did y'all expect? Quit pining for the approval of these limp dicks and damn sure quit assuming best intentions in their coverage.
These were in response to a Bernie Sanders tweet chiding the journalist for spreading misinformation about Medicare for All.  Now, my first impulse was to ask what I would have to do for them not to kiss my ass.  (First prize, one week in Philadelphia; second prize, two weeks in Philadelphia.)  My second reaction was that while they had a point -- one should have a decent skepticism about the corporate media and their works -- at the same time it is perfectly legit to point out when corporate media figures promulgate falsehoods about important issues.  I'm a working person, and I give at least a quarter of a fuck what anchorcritters with vast platforms have to say about such things, because thanks to their elite positions they influence what most people believe.

Imagine asking, say, who cares what Donald Trump has to say about anything?  No sensible person would take Trump's word on anything, or pine for his approval or assume best intentions in his ravings.  But I don't think Sanders was doing any of these things.  He was trying to correct misinformation that might, either directly or through trickle-down, affect poor or working people's opinions of a good system for providing health care in the United States.

My third reaction was, as noted, to fret that I'm getting old in a bad way (oh no, I'm sounding like my mother!), because it bothers me when people think that putting a bunch of fucks, motherfuckers, and shit into their discourse makes it somehow more persuasive -- or makes me pine for their approval, assume their best intentions, or believe that they're commentators I should take seriously.

This person doesn't always resort to naughty words in their Twitter output, so I've been trying to figure out why they did it here.  I listened to one of their podcasts once: it was heavily peppered with fucks. The participants were mostly male; the one (?) female joined in, but it was, like Chapo Trap House, basically a boys' club in manner and content.  So my first guess is that they think they sound Street, which incidentally is cultural appropriation: white kids trying to sound like black kids.

At one time saying "fuck" a lot could have been defended as breaking a taboo.  I remember how thrilling it was when Jefferson Airplane sang "Up against the wall motherfuckers," on a major-label album, but that was almost fifty years ago.  And it's a lousy song.  Ditto for the Sex Pistols, forty years ago, though they did it better.  Certainly many people would still regard "fuck" as taboo, but not the audiences this person is addressing.  If anything, it's conformist, safe, boring, yet irritating.  The two can go together: think of a mosquito buzzing around your head when you're trying to sleep.

I could probably overlook the fucks on the grounds that it's a generational thing, if not for all the British rock stars older than I am who also season their speech with fucks.  As the examples of Pete Townshend, Jefferson Airplane, Johnny Rotten and others suggest, this kind of talk is now old people talk: your grandma talking salty.  When certain people misuse "literally," I wonder what word they use when "literally" is the right word to use.  And when wannabe Internet celebrities talk nasty for street cred or fitting in with the cool hipster guys, I wonder what they'll do when "fuck" loses what is left of its obscenity.  It still has it in boy culture, of course, when somebody tries to be macho by saying "Fuck the Republicans," and that's not a sign of wokeness either.  It's the opposite of being edgy, bold, independent.  It's a way of showing you belong.  And much of the time it's a substitute for substance, as in the tweets I quoted here.

Saturday, August 18, 2018

An Army of Non-Conformists Cannot Lose!

I've been reading numerous fascinating books about the history of "religion" as a concept and social phenomenon, which I should have written about here before.  Currently I'm in the middle of Stereotyping Religion: Critiquing Clichés (Bloomsbury Academic, 2017), edited by professors Brad Stoddard and Craig Martin.  It's intended as readings for college-level courses, and draws on the same scholarship that has taught and clarified so much for me.  The writing is meant to be accessible to undergraduates, so it's probably a good introduction for anyone who's interested in sorting out what religion is.  That question interests me as an atheist, and ought to interest other atheists as well as theists because so much discussion of atheism vs. religion makes assumptions about what religion is or isn't.  If you're a champion of rationalism and critical thinking, you should be concerned that your own assumptions are correct, no less than your opponents'.

Briefly, the scholarship I've been reading shows that there's a specifically modern, historically and culturally contingent concept (or definition) of religion that, if it didn't originate in the Christian "West," became normative here at around the time Europe was coming into contact with other cultures it hoped to conquer and colonize.  There was considerable debate as to whether these cultures had "religions" that should be respected, or mere "superstitions" that could be replaced (by force if necessary) with True Religion, viz. Christianity.  The case of Islam, which had achieved enough temporal power that it couldn't be so lightly dismissed, had also provided fodder for debate as to its nature and status.  At around the same time, the rise of Protestantism raised questions about the status of religious dissent within Europe.  Up till then, "religion" was an inseparable part of one's culture, not a freely chosen lifestyle from a smorgasbord of possible "faiths."  By the time the US forced its way into Japan in the mid-1800s, demanding "religious freedom" (including the freedom to missionize) for its nationals there, the question had to be addressed because "religion" and "religious freedom" had to be translated into Japanese for inclusion in treaties, and the Japanese had no equivalents for those terms.  Conflict and confusion over the meaning of "religion" persist down to the present day.

This summary oversimplifies, of course, and I refer interested readers to such works as Timothy Fitzgerald's Discourse on Civility and Barbarity (Oxford, 2007) and Jason Ananda Josephson's The Invention of Religion in Japan (Chicago, 2012).  I'm glad I read them before I got to Stereotyping Religion, because it seems to me that most of the contributions, while drawing on their scholarship, also get some things wrong.  One recurring motif is a tendency to blame the formation of "religion" as a concept on Protestantism and its associated "individualism."  I think this is mistaken, because it tends to reify Protestantism in much the way that "religion" reifies religion.  For one thing, the nature of religion would have had to be sorted out anyway, just because of European imperialism and the need to deal with different beliefs and practices in the places Europe sought to dominate.  For another, there had always been "religious" dissent in Christian Europe; Protestantism got a foothold and survived by aligning itself with political trends that weakened, undermined, and eventually broke down Roman Catholic hegemony.  But Protestants didn't originally see theirs as a separate "religion" -- they accepted the prevailing conceptions.

After all, Christianity originally emerged as a cult of individual salvation, hostile to established institutions of Family and State, and there are precursors of arguments from individual conscience in early apologetics; it only became Religion after its competitors had been effectively eliminated.  Something similar was true of, for example, Buddhism, which required individuals to split off from their families and from received conceptions of holy thought and practice.  Remember that when Buddhism got a foothold in China, traditionalists attacked it for its rejection of traditional values.
Judged by these standards, the ideal monk presented a disturbingly flawed picture of aberrant manliness.  He abjured marriage, renounced fatherhood, was ill positioned to care for parents, did not own property, declined public office, deprecated secular learning, mutilated his body (a gift from his parents) by shaving his head, and rejected orthodox manners and rituals for an alien set of rites.  According to the masculine standards of the time, how could such a person even be called a man?*
Here you can see how new, imported "religious" practices were characterized as attacks on culture, assumed to be natural, the mandate of Heaven.  In time, Buddhism became part of the Chinese landscape, and what had been outrageous violations of common sense became acceptable variations.  As with Christianity, one asserted "individual" rebellion by appeal to higher authority, that of the Buddha or of Christ.  This tactic was also used by the early Christians against Judaism -- in the gospels, for example, Jesus defends his innovations by asserting that they are the law of God rather than the law of men, by which he meant the supposedly God-dictated Torah -- and by religious dissenters within Christianity centuries later.  (I'm not a rebel, you're a rebel!)  But a standard approach by traditionalists trying to refute rebels appealing to a higher authority is to define them as selfish individualists.

In the chapter of Stereotyping Religion I'm now reading, the authors, Andie R. Alexander and Russell T. McCutcheon, discuss the popular "I'm Spiritual But Not Religious" (SBNR) position adopted by many people in the US today.  Alexander and McCutcheon see SBNR as a modern, individualist stance, though their argument is that those who adopt it are mistaken in seeing it as such: no individualist, they argue, really stands alone:
[W]hat do we make of someone who comes along and says: “I’m spiritual but not religious”? For, as already suggested, this claim (at least as understood by some who make it) seems to signify that there exists something pre-social, something this person (and not that—the one who simply identifies as being religious, we guess) possesses or experiences that is more deeply significant because it is outside of (i.e., preexisting) all institutional constraints. While our commonsense way of understanding ourselves might suggest that such a claim is sensible—lots of people seem to think it sensible to say it—the excursion we just took into an alternative way of thinking about meaning now suggests that such assumptions are rather problematic, inasmuch as they seem to take the social work and thus institution-specific setting of all meaning-making as invisible, as if it wasn’t even there [loc 1879 of the Kindle version].
They then ask rhetorically (and don't you love rhetorical questions?):
What is it about our age that prompts some members of our society to understand themselves as existing apart from it, despite using the same language, economic system, and so forth as those from whom they feel alienated? [loc 1889]
As I've already suggested, it's not really about "our age" or "our society" but about the felt necessity of defending a minority position in any society.  The apostle Paul, for example, like other early Christians, characterized his sect as in but not of the world, despite their reliance on the same language, culture, economic system, and so forth as those from whom they felt alienated.  Alienation too is not "natural" but constructed and maintained.  Rhetorical details vary in emphasis, but the overall picture hasn't changed much.  I also think that this question caricatures the SBNR, leaving out some important features that I'll return to in a moment.

Alexander and McCutcheon conclude:
For if we instead start from the standpoint that it’s, well ..., standpoint all the way down and that there is nowhere to stand that isn’t situated, that isn’t invested, that isn’t implicated, that isn’t part of a prior conversation that we didn’t start ourselves, and that isn’t therefore part of a social and thus institutional world, then those who talk as if their private, true, or authentic self somehow trumps the so-called derivative forms that other people’s lives take will be seen by us as fascinating players in an ongoing contest, working with what’s at hand, to give their position a competitive edge [loc 1898].
This is true enough as far as it goes.  As a whole, this part of the book makes many important and valid points.  But here, as elsewhere, the writers fail to recognize that the proponents and defenders of the "social and thus institutional world" against which the SBNR define themselves are also individuals: they pretend that they stand on solid ground, that it isn't "standpoint all the way down," and that the absolutes they espouse weren't created and perpetuated by people like themselves.

The thing that occurred to me while I was reading this material was that people who present themselves as SBNR do not always claim that their "private, true, or authentic" selves trump the forms that other people adhere to. They generally draw on a buffet of ascended masters, spiritual teachers, cultural icons and other sources as authority for their personal versions of spirituality -- even though many of those figures are associated with Organized Religion themselves. They also frequently find community in study groups, classes, shops of spiritual paraphernalia and accessories, and the like; they rarely stand completely alone.  As Alexander and McCutcheon point out, standing alone is a tactical claim, not a reality. We are all both individuals and members of collectivities; these are aspects of ourselves, and neither one represents us completely.

And then I remembered that in the previous chapter of Stereotyping Religion, Steven W. Ramey had argued against the belief that "religions are mutually exclusive," that people can and should be classified as belonging to one and only one religion.
This cliché, though, is far from universal, as people in other parts of the world often have different conceptions. Many people in Japan, for example, participate across a lifetime in practices associated with both Buddhism and Shinto, seeing them as addressing different aspects of human existence. Some people understand all religions to be doing the same thing, allowing people to employ whatever practices or beliefs that they find beneficial. Within the context of South Asia, praying at the shrines and temples associated with different religions provides opportunities to access supernatural power or wisdom, without undermining a person’s identification with one religion. For example, Qutb Ali Shah, whose followers in British India identified him as a Muslim Sufi, did not require his followers who identified as Hindu to convert to Islam. In fact, he incorporated deities and practices commonly seen as Hindu in his own activities (Gajwani 2000, 39–41), and Hindus and Christians who claimed a high social status often participated in each other’s festivals as an expression of their higher status while excluding others who identified with the same religion from participating (Bayly 1989, 253, 289–90). Many people who identify as Chinese do not identify as a follower of any particular religion but follow practices that we commonly label Buddhist, Daoist, Confucian, and folk traditions. In fact, it is common for temples in Chinese communities to incorporate a range of figures that we commonly identify with different religions [loc 1428].
This is true and important, though Ramey overlooks similar attitudes and practices within European and American Christianity: think of the welter of "pagan" elements in European and North American celebrations of Christmas.  Rabbinic Judaism has been trying to root out magic and "superstition" among lay Jews for millennia.  (But also think of New Atheist Sam Harris, who practices his own version of Buddhism while trying to convince himself that there's no conflict.)  Christianity itself is a syncretistic mix of Judaism and Greco-Roman religion and philosophy.  None of which stopped the early Christians from refusing to conform to Jewish or Roman demands they found objectionable (burning incense to an image of the Emperor, for example), or from representing their sect as mutually exclusive vis-a-vis Judaism and "paganism."

Ramey even acknowledges that
One trait of what people sometimes call New Age religion is the adoption of practices associated with different religious traditions, which becomes a point of critique for some people opposed to New Age practices. A similar issue arises in the language of “spiritual but not religious,” which rejects institutional forms of practice for an individualized selection of practices ... For example, in the British colonies that became the United States, those identified as Christian incorporated astrology and similar practices despite the common Puritan teachings that such practices were not acceptable for people who identified as Christians [locs 1584, 1594].
So I think there's a contradiction here: "Spiritual But Not Religious" is not a modern, Euro-American, Protestant, individualist deviation (even if some of its adherents may defensively present themselves as such); rather, it fits comfortably into almost universal, traditional cross-cultural practices of people around the world and throughout history.  Attempts to purify a particular tradition are not inherent in religion, but represent minority, usually elite efforts to construct regularized systems that appeal to them aesthetically and intellectually.  Most people construct the constellations of meaning that they use to structure their lives ad hoc, opportunistically; the "Cafeteria Christian," as objectionable as he or she may be in many ways, is simply being religious in a normal, traditional manner, the way most believers have been religious.

---------------
*Bret Hinsch, Masculinities in Chinese History (Rowman & Littlefield, 2013), p. 50.

Wednesday, August 15, 2018

I Don't Care What Yahweh Don't Allow

According to this article, the resident artiste of the Masterpiece Cakeshop is suing his state government, alleging harassment and persecution for his deeply held religious beliefs.  The suit was prompted by another complaint accusing him of discrimination for refusing to bake a cake celebrating a transgender person's transition.  There aren't enough details in the article for me to discuss that case; anyhow, what I'm interested in now are the remarks of the person, Brent Sirota, who linked to the article in a tweet.  He's an academic, a "Historian of sacred and profane things" according to his profile, with a focus on "Disenchantment operations, mostly."  I follow him because he often shares useful information, including recommendations of several books that I have found very useful.  But his complaints today made little sense to me.

First:
I simply don't see the bottom to this. Any number of prejudices can be and have been swathed in theological garb--many quite recently, historically speaking: against interracial marriage and integration, antisemitism, anticommunism.
This is true as far as it goes.  What's missing is an acknowledgment that it's not only "prejudices" that have been "swathed in theological garb."  His use of "prejudices" is tendentious and disingenuous.  Just about any position at all can be and has been tarted up in theological drag.  How is anyone supposed to tell which positions are legitimate and which are merely prejudices?

One example of this came up a couple of Christmases ago when I criticized a liberal/progressive Christian reading of the Nativity stories that cast the Holy Family as "refugees."  I pointed out that there were other ways to read the texts, based on the narrative and indeed theological framework of the New Testament itself.  I was advised to study some theology by someone who was unaware that I've spent many years doing that.  What my reading had taught me was that the meanings of biblical stories and the doctrines Christians constructed were manifold, and largely determined by what the interpreter wanted to find: theologians work backward from their conclusions to get the texts to mean what they want them to mean.  (This is not true only of theologians, of course.)  The irony was that my critic assumed what he accused me of believing, that each story has only one meaning.  You couldn't prove that from a reading of theology; rather the opposite.

Sirota continued: 
Eventually, this will make the state the arbiter of orthodoxy. Courts and legislatures will have to determine that transphobia is a legitimate application of Protestant doctrine, but opposition to "race-mixing" or "popery" is not.
I can't see anything in the article that supports this overwrought claim.  Perhaps the courts and legislatures will take it on themselves to decide Christian doctrine, in defiance of the First Amendment, but there's nothing in the article or the case that obliges them to.  If anything, Sirota seems to want the courts to determine that transphobia is an illegitimate application of Protestant doctrine, which isn't acceptable either.  (I don't quite understand why he specifies "Protestant" here, since Catholic doctrine is also anti-trans.  I suspect he's alluding to -- and misusing, in my judgment -- scholarship which traces religious pluralism and toleration to the rise of Protestantism; but of that, more some other time.)

Now, it's true that probably most Americans (including their elected representatives) don't understand the First Amendment, largely because they don't see why it should prevent them from imposing their beliefs on other people, at their expense.  I've mentioned before some gay Christians I know who, not content with a mere civil ceremony, wanted the government to force their churches to provide them with a church wedding.  They didn't care that this would violate the First Amendment: they wanted it, and thought they were entitled.  (As white middle-class Christian-American males, of course they were!)

There are other ways of understanding this issue, of course, but it seems to me that even if a practice is a legitimate application of Protestant doctrine, it can still be regulated or forbidden by the state.  Slavery was held to be theologically legitimate for centuries by Catholic and Protestant divines, and though most people don't realize it, the American Civil War and Emancipation did not change that.  Even if your church still considers slavery to be in conformance with the will of God, it is still illegal for you to own other human beings.  The same can be said of polygamy: there is nothing in the Bible to forbid it, and it appears that Christianity abandoned the practice not for theological reasons but because Roman society disapproved of it.  (Oh ye of little faith, letting the World determine doctrine for you!)  If Protestants want to burn papists, or vice versa, because of their sincerely held theological doctrines, tough luck.  Nor did the sexual abuse of children by Roman Catholic clergy become acceptable because the Church hierarchy refused to do anything about it.  Because the United States does not, thanks to the Bill of Rights, have a state religion, we are not at the mercy of theologians in deciding public policy.

Sirota concluded:
And that was precisely Madison's complaint in the Memorial and Remonstrance in 1785, that such policies imply "that the Civil Magistrate is a competent judge of Religious Truth . . . an arrogant pretension falsified by the contradictory opinions of Rulers in all ages."
James Madison certainly did not mean that churches should be given free rein, however.  Remember that he opposed tax exemptions for churches and a chaplain in the Congress.  He was correct that the Civil Magistrate is not a competent judge of religious truth; I would only add that neither is the theologian, as shown by their contradictory opinions in all ages.  No one is.  Happily, as I've already said, the magistrate need not judge religious truth; he or she only needs to keep the arrogant pretensions of churchmen and believers from disturbing the public peace.  "Only" is probably not the right word here, because it's no small task to balance the competing claims of religious freedom, which usually involve one religion's freedom versus another's.  Even if believers could agree on what their gods require, their gods have no authority in this country.

Tuesday, August 14, 2018

Still Gotta Get Out of This Place


I've been listening to numerous versions of this song today, led there by the imminent reissue of this one.  It's one of my absolute favorites from the mid-60s: Brill Building pop given a rough edge by some scruffy Brits, a great vocal by Eric Burdon (a major crush of mine at the time).

I had no idea it had been covered by so many people, though I'm not really surprised.  Turns out it was popular among US invaders in Vietnam, but it's also simply a great song.  I also found at least three "original" Animals versions on Youtube, but I'm faithful to the US single, to which Burdon listlessly and probably drunkenly lip-syncs in this clip.  I bought the sheet music myself and tried to work out an acoustic version on guitar, just for the satisfaction of singing it, but couldn't make it work.  But Richard Thompson could.

Sunday, August 12, 2018

People Keep Using This Word, Etc.

David Sirota is a fine journalist, so I'm not picking on him specifically here.  I've seen a lot of other people do the same thing:
Modest proposal: if you took over a political party & then oversaw that party losing Congress & most statehouses, and then you additionally oversaw it losing the presidency to a reality TV star, you’re no longer in a position to lecture anyone about electability or effectiveness.
I agree completely with the substance of this remark.  It's the "Modest proposal" part that bugged me.  I take it to be an allusion to Jonathan Swift's satirical tract of that title, published in 1729, in which he proposed to fatten Irish babies for English tables. If you're following Swift's example, a "modest proposal" is sarcastic in the first instance -- you know it's outrageous -- and in the second, you do not actually mean that your recommendation should be followed.

I presume that Sirota, by contrast, is quite serious in proposing that the Democratic Party leadership STFU. So there's no sarcasm here, no satire.  It's as tone-deaf as Hillary Clinton's use of George Orwell's 1984 in her apologia pro snafu sua What Happened.  Yet Sirota is not a stupid man.  As I've said, I've seen other intelligent people announce modest proposals that they mean unironically.  How does this happen?

Wednesday, August 8, 2018

First We Take Tahiti

I just reread Jack Douglas's The Adventures of Huckleberry Hashimoto (Dutton, 1964), for the first time in at least forty years.  Douglas (1908-1989) was a radio and TV comedy writer, a buddy of Jack Paar (who makes a cameo appearance in this book), and the author of numerous books that permanently warped my sense of humor when I was in middle school: such classics as My Brother Was an Only Child (1959), Never Trust a Naked Bus Driver (1960), and A Funny Thing Happened to Me on My Way to the Grave (1962).  All of which were in my small-town public library; they weren't obscure under-the-counter samizdat, they were published by a respectable house and went into mass-market paperback.

I don't remember at this remove in what order I read them.  It's possible I read Huckleberry Hashimoto first and then found the earlier books.  The first two were basically comedy routines that couldn't have been performed on TV; Huckleberry Hashimoto was an account of Douglas's 1963 trip to Japan with his (third) wife, Reiko, and their rambunctious sixteen-month-old son Bobby.  They went by way of Tahiti and Hawaii, largely by ocean liner.  Bobby was, Douglas informs us,
sixteen months old when we started out.  Reiko was twenty-seven and I was forty-eight.  When we got back home he was nineteen months, Reiko was still twenty-seven, and I was a hundred and three [9-10].
In fact, Douglas seems to have told a little white lie here.  According to his Wikipedia entry, he was born in 1908, which would mean he was 55 in 1963.  (According to her obituary in the New York Times, Reiko was born in 1936, so she was 27 when they went on that fateful trip.)  But the marriage seems to have been successful -- they remained together for the rest of Douglas's life.

I enjoyed the book this time around, and was surprised by how many of the situations and gags had stayed with me.  What I noticed on this reading was the number of references to homosexuals.  It's been a frequent complaint of many homophobes that although we are a small minority, we still insist on talking about ourselves all the time -- you know, the "love that dared not speak its name" routine.  There is some truth to this, though we're not any more obsessed with our sex lives than heterosexuals are with theirs.  But what I find interesting is how obsessed with homosexuals many heterosexuals are.  Douglas is not terribly bad, really -- he doesn't fuss and fume hysterically, he sticks to the neutral term "homosexual" for the most part -- but he does keep mentioning us, more than our minuscule numbers would seem to justify.

For example, early in the voyage Douglas catalogues some of the characters he saw on board:
... I saw Miss Ethel Murdock and Mr. Peter Corbin enjoying a glass of beer (their eighteenth) under a rear table in the Outrigger Bar.  (Miss Murdock is a forty-five-year-old third-grade teacher, with more than just a suggestion of a moustache, in search of "new experiences."  Mr. Corbin is a homosexual who didn't realize that Miss Murdock wasn't Pancho Villa.) [27].
Then, on the way to Tahiti:
This particular deck steward also told me that if I was going to write anything about Tahiti in my new book, to check up on the stories concerning a certain movie actor, who had made a picture there.  The stories all pertained to the fact that he was a homosexual.  I told him that that's what they said about everybody in show business, that they were either homosexuals or communists.

He said, "Jesus -- I didn't know he was a communist, too!"

I said, "I didn't say that -- besides, what proof has anyone got that this actor is a homo?"

He said, because all the Tahitians called him a "mahu" which is Tahitian for "faggot" and also this actor in making this movie (which was about Tahiti) insisted that all of his "boys" be in it, and when he didn't get his way he stamped his sandaled foot and swished off the set into his thatched hut where he sat around in his muu-muu and sulked and pouted and moued for three days.  Something had to give, so finally the producer agreed that his "boys" could appear in the trailer for the film, if not in the film itself.  This seemed to satisfy the movie star and he emerged from his hut, his lip still a little Jackie Cooperish, but more or less ready to continue making the movie.  In all fairness to this movie actor, I did check up on him when I got to Tahiti and the consensus was unanimous -- he's a fag [47-48].
It's hard to sort out Douglas's commentary from the steward's in this story, so I'm not going to try.  I think Douglas was a bit more sophisticated than he let on, for once he got to Tahiti he reported
the penchant the Tahitians have for sensational gossip, one facet being that according to everybody on the island everybody else is queer.  Also, everybody that comes to the island is either queer or double-gaited -- or both.  (For those of you who are not familiar with the term "double-gaited," it means a homosexual who is happily married and the father of six children, but who is madly in love with the boy next door.)  [90]
I found this pretty funny, actually.  It reminded me of the stories Mark Padilla told in his book on male hustlers in the Dominican Republic, who would mention "to me that one or another of their peers was known to 'dar el culo' (give their ass) on occasion, which often produced much hilarity on the part of the storyteller."  Padilla didn't draw the connection himself, but he also reported that some of the same young men who told him that other guys would steal from their clients, turned out to be thieves themselves.  I suspect, then, that at least some of the Tahitians who gossiped -- no doubt with "much hilarity" -- about the predilections of their fellows and of tourists were sounding him out to see if he might be available and interested.  (The same might have been true of the steward.)  Douglas may even have realized it; who knows?

There were a couple of passing references to homosexuals later in the book (see page 122 if you ever read it), but you get the idea.  Douglas wasn't complaining that homosexuals were everywhere, trying to take over, which puts him ahead of many of his more highbrow peers in those days; he was just showing how worldly he was.  And to give him credit, he was also genuinely interested in Japan, got along with Reiko's family, and picked up more Japanese than most gaijin who married Japanese women seem to have troubled to do.  His account of staying with Reiko's family -- her father was a Buddhist priest, by the way -- in Hiroshima is pretty sensitive for the era.  He gets comedy out of their stay in Japan, but no more than he does of everything else, and I liked him overall.  There were other books by other writers that I read in those days that disturbed and frightened me by their handling of homosexuality; but Douglas's treatment made no impression on me, and I went on reading his books happily for years afterward.

So let me conclude with a story that I found rather charming:
The last thing I remembered as I drifted off that first night was the incident of the dignified little old Japanese gentleman and the electric-eye door at the hotel in Kyoto.  Every time he left the hotel or entered it, the electric eye would fling open the door for him.  And every time he would stop and bow to it [166].

Monday, August 6, 2018

The Price of Liberty



I watched this movie last night on DVD, and it was very good -- better than I expected, given Korean movies' increasing Hollywoodization. Which could mean that American audiences might like it! It's streaming on Amazon and Youtube and probably elsewhere, so chances are you can find it if you're interested. Korea has been in the US news more than usual lately, so Americans should take the opportunity to inform themselves. Besides, this is a good movie -- entertaining, well-made, and fairly accurate historically.

1987 is about the events that led to the fall of the military dictatorship in South Korea, set off by the murder by waterboarding of a student activist, Park Jong-chul.  A prosecutor blocked the coverup, the press ran with the story, and after huge protests all over the country, the government allowed open elections to take place. Still, it took several more years for something like democracy to be established, and as we saw with the recent plans by the military to save ex-President Park's regime by declaring martial law, it's still not safe. It never is, anywhere.

The film was directed by Jang Joon-hwan, whose Save the Green Planet! (2003) was a sort of science-fiction allegory of the persistence of South Korea's repressive past.  1987 is only his third feature. (I haven't yet seen his second, Hwayi: A Monster Boy [2013].)  It's a much more mainstream work than Green Planet, though still tricky in its structure -- not in ways that would confuse an audience, necessarily, but in the way the narrative keeps adding on characters who turn out to be surprisingly significant.  (According to Darcy Paquet of Koreanfilm.org, only one of the characters is entirely fictional.)  It's like watching a juggler keep adding objects to keep in the air.  For Korean audiences, it probably helped that numerous well-known actors played cameos; this was a prestige project, after all.  The screenwriter, Kim Kyung-chan, has one previous credit, Cart (2014), about a strike at a Korean big-box chain store, so he has experience handling political content involving a big cast of characters; I should watch Cart again to see how it compares.

The narrative circles back on itself, rhyming Park Jong-chul's death with the killing of another student activist, whose death galvanized the democracy movement even more, and culminates in a recreation of the huge protest marches that took place all over the country.  This must have been a big-budget production, given the thousands of extras who participated and the need to get costumes and hairstyles right and to find locations that looked like the 1980s in the 2010s.  1987 is a spectacular logistical achievement as well as good entertainment and a powerful history lesson.


My favorite scene, by the way, is when a young woman, played by Kim Tae-ri, gets caught up in a demonstration near her university, though she herself wasn't one of the activists. The police grab her, punch and club her, and start to drag her away -- but a handsome (of course) young activist rescues her. They run through the alleys trying to escape, and manage to frustrate the cops, one of whom is knocked out. The young woman starts to run, but then turns around to kick him in the head before she escapes with her new friend. (The cop she kicks is one of those who hit her earlier, by the way: he deserves it.)

Once again I'm impressed by the ability of South Korean filmmakers to make brave political movies about the recent past.  I can't think of any US films that come close, though I admit I've probably missed some, and I hear Rob Reiner's Shock and Awe, about the propaganda campaign legitimizing the US invasion of Iraq, is good.  The most famous US example might be Salt of the Earth, the 1954 independent film about striking miners that was suppressed by a McCarthyist campaign.  But could Hollywood make such a movie?  The usual Hollywood approach is to reduce political struggle to One Man (sometimes One Woman) who Makes a Stand, while ignoring the many people who actually make a movement.  It's a very American blind spot.  One reason I love South Korean cinema is that it shows that you can make exciting, entertaining movies and TV dramas that don't scant the importance of solidarity and collective action.  But you don't have to think about that when you watch 1987.  Just be shocked (by the brutality of the repression), awed (by the courage of the people who resisted it), and moved.

Saturday, August 4, 2018

Sometimes the Truth Is Somewhere In Between

Alfie Kohn linked to this opinion piece on TED Talks, and while I'm sympathetic, I have some objections.

"Picture this," Julie Bindel begins. "A darkened auditorium, an attentive, cult-like audience staring ahead expectantly, hardly daring to breathe; a huge screen on which there is an image no one can decipher."  Excuse me -- "cult-like"?  The rest of the article is written in the same lazy style, which is unfortunate for Bindel's complaint that TED "style appears to be given a hundred times more thought than content" and that "the style puts me off devouring the content."  Style is pretty much all there is to Bindel's piece; there's not much thought in it at all.  And I'm basically sympathetic to her position, which is why I clicked through in the first place.

She builds up to her smashing conclusion thusly:
I often give talks to both small and large audiences, and always feel nervous beforehand. This used to bother me, after decades of public speaking, but I then realised that being nervous is respectful of those who are there to hear me. Why would anyone wish to listen to some overconfident, over-rehearsed guru? Why would I want to subject them to a performance?
Um, well, maybe because giving a talk is a performance.  I wonder if Bindel has defined "performance" tendentiously to mean "something other than what I do" -- she doesn't specify.  If that's what she's doing, it's disingenuous and therefore disrespectful to her readers; if not, then she's merely wrong.

I've done a fair amount of public speaking over the years, though rarely on preset topics: usually, as with the GLB Speakers Bureau, I'm up there to answer questions, so what I say depends partly on the questions I'm asked.  But most of the questions we're asked are the same, on issues and topics that I've been thinking about for decades.  When I'm formulating an answer, I'm also conscious that I'm communicating with an audience, and I hope to be memorable, maybe funny, to say my say so as to make an impression that will stay with them.  That's a performance.  Am I nervous?  Not in the panel context.  But I don't think I'm "over-rehearsed" either.

"Over-rehearsed" might be disrespectful to the audience, but then so would under-rehearsed be.  I doubt that Bindel simply stands up and wings it with no preparation at all.  Nor, I hope, does she stand up and draw a blank, gaping open-mouthed at her audience.  That would be spontaneous, natural, "real."  It would also probably generate some complaint from the people who'd shown up hoping to hear her say something substantial.  Spontaneity is, in this context, an illusion.  A performer plans and prepares so as to seem natural and spontaneous; it takes a lot of work and talent to create that illusion.  So for once, it seems to me, the truth lies somewhere between "over-rehearsed" and "under-rehearsed."  The sweet spot can't be specified in advance.

I agree with some of Bindel's objections to TED Talks, but she hasn't thought her position through.  I find myself wondering, for example, about that standard TED Talk style, which I imagine is indeed the result not only of rehearsal but of guidance by the organization.  I find it annoying too, but evidently many people don't.  It reminds me of the conventions of stage acting and indeed of public speaking a century ago, when hopefuls were taught the proper gestures to convey each emotion.  The survival of elements of this style is what makes it difficult for me to watch early silent movies.  But that didn't bother most of their original audiences, apparently.  I suspect that the same applies to TED talks, and it's as absurd to speak of their audiences as "cult-like" as it would be to apply that epithet to early 20th century movie or stage or Chautauqua audiences.

Yeah, the TED style annoys me too.  But when I do listen to or watch a TED Talk, it's usually the content that bothers me.  Good content is always hard to find.  In Julie Bindel's case, it's both style and content that bother me.  First pluck the beam out of thine own eye, Ms. Bindel.

Thursday, July 26, 2018

This Is Fine; or, Mnyeh - It's a Living



Roy Edroso's alicublog is still a useful resource for keeping up with the antics of the American Right, and I admire Roy's fortitude in following their media, performers, and apologists so that I don't have to.  I don't condemn him for avoiding whataboutism with respect to what is laughingly known (or should be) as the center.  He's got his mission statement, it keeps him busy, gives him an audience; nobody can cover everything.  Whataboutism with respect to the center, the right, and the left is, or should be, part of my mission statement.  But now and then he drifts close to the precipice of Demon Doubt, as he did yesterday:
Conservatives have been asked to believe nonsense for a long time -- that tax cuts for the rich trickle down and pay for themselves, that we'll be greeted as liberators, that there's no more racism, etc. These were tough lifts, but they had the help of intellectuals, or magazine writers who passed for them, to give them a line of gab that made these things sound reasonable, at least to themselves.

Recent years have not been kind to these beliefs, and Trumpism kind of blew the whole scene -- not just by being so mind-bendingly, outlandishly at variance with observable reality, but also by  presenting them with unitary Republican control of the government, thus making American politics a perfect playground for their fantasies.
True enough; I don't dispute his analysis.  But that little Duncan with horns, pitchfork, and red suit on my shoulder whispered in my ear: Not meaning to make trouble, you know, but suppose you substituted a few terms for Roy's in there?
Liberals have been asked to believe nonsense for a long time -- that Barack ended the wars, that Hillary was the most qualified, progressive candidate in human history, that when Bill Clinton lied nobody died, that Barack's election dealt a death blow to racism, that voting will bring about change, etc. These were tough lifts, but they had the help of intellectuals, or magazine writers who passed for them, to give them a line of gab that made these things sound reasonable, at least to themselves.

Recent years have not been kind to these beliefs, and Obamamania kind of blew the whole scene -- not just by being so mind-bendingly, outlandishly at variance with observable reality, but also by presenting them with unitary Democratic control of the government, thus making American politics a perfect playground for their fantasies.
Make some substitutions of your own; it's fun.

The liberal Democratic fantasies that are most current today are Russia Russia Russia, all Trump all the time, Trump is Putin's boyfriend, and so on and on.  I long ago recognized the absurdity of the liberal claim to be reality-based, but it used to be a little funnier.

Wednesday, July 25, 2018

But What About the Whataboutism, Huh?

Here's my position at the moment.  (If you don't like it, I have others.)

I take it for granted that Russia (the whole country, every dang one of 'em!) meddled in the 2016 US elections.  Probably all states meddle in other states' affairs, and the US is no exception.  It's like spying: every country does it.  When they catch another country's spies, they expel or punish them, which is fair enough; it's the pretense of violated innocence that I find insufferable.

What effect, if any, Russian meddling had on the outcome of the election is another question, which most commentators, professional and amateur alike, seem not to recognize; they evidently assume that if not for Russian interference, Clinton would have beaten Trump. On that question I'm agnostic.  Given the myriad of factors that affect electoral results, I don't think it's possible to isolate just one -- if you're really interested in understanding why Clinton lost and Trump won, that is, or in preventing such interference in the future.  I don't believe that most of those who are presently obsessed with Russia and Putin are interested.  As Seth Ackerman wrote recently at Jacobin:
Despite ubiquitous demands to “take the threat seriously,” none of these voices has plausibly explained what the US ought to do about it. Democratic senator Chuck Schumer demanded that Trump “cancel his meeting with Vladimir Putin until Russia takes demonstrable and transparent steps to prove that they won’t interfere in future elections.” And how exactly would that work? ...

It seems clear that the current Beltway panic isn’t really a reflection of the magnitude of the perceived threat from Moscow. It reflects panic that someone like Trump could win an election in the United States. If Russia’s actions did, in fact, shape the outcome — and I doubt they did — it was by changing a tiny, marginal number of votes in what would have been a close election anyway. Russian meddling is not the reason Trump was a viable presidential candidate.
Ackerman also wrote:
Think of it as “the expressive function of the Russia freakout.” Just as there is what Cass Sunstein called “the expressive function of law” — “the function of law in ‘making statements’ as opposed to controlling behavior” — there’s a purpose served by the constant keening over Putin. It conveys liberals’ sense of bewilderment and disorientation at a country they no longer recognize — a feeling not so different from that which motivated the Right’s manifold freakouts in the Obama era.

On both sides there’s a sense of loss about a bygone America that no longer exists: for the Right, the white, middle-class utopia of the Eisenhower years. For liberals, the upright decency of the Jed Bartlet administration. The problem with these fantasies is neither of them ever existed.
So it's easy enough to see why the most popular riposte to criticism of the Russia Freakout is an accusation of "whataboutism."  I've been seeing it used more lately -- which does not mean that it's absolutely more common, of course, only that I am seeing it more.  It's bipartisan, too: my Never-Trump Right Wing Acquaintance uses it just as Democratic loyalists do.  Of course, he's engaged in whataboutism himself many times, just as they have; indeed, many accusations of whataboutism are accompanied by more whataboutism.  Whataboutism is only bad when the wrong people use it.

The first time I remember encountering whataboutism online was in the late 1980s, when a College Republican, defending the Reagan/Bush support for death squads in Central America, asked me: Whatabout the Sandinistas' human rights violations in Nicaragua?  (Which were, though real, much less severe than those in El Salvador or Guatemala.)  I don't think I missed a beat; I replied that the US should cut off all military aid to the Sandinistas.  I was being snotty, of course, because the US wasn't giving military aid to the Sandinistas: we were waging a vicious proxy war against Nicaragua, killing and wounding thousands of civilians.  I don't recall that my interlocutor had an answer to that; the home office hadn't provided one.

Whataboutism is much older than that, for reasons I'll get to presently.  It was a common response to US propaganda against the Soviet Union, both by the USSR and by apologists in the West. But the US propaganda was itself whataboutist: sure, things aren't perfect in the US or in our client states, but whatabout the terrible conditions behind the Iron Curtain?  (It's related to the popular parental response to kids who won't eat their spinach: Whatabout the starving children in India who'd love to have that spinach?)

But once you start to pay attention, you'll find whataboutism everywhere.  When American racists threw tantrums because the Kenyan Usurper was putting his big dirty feet on the sacred Oval Office Desk (a gift to America from Queen Victoria!), the usual and entirely proper rebuttal was to point out that his predecessors had done the same thing, and more.  When I pointed this out in a whataboutism thread on Twitter, one person indignantly replied that those rebuttals were just stating facts, they weren't meant to insult and mock the people they were rebutting (as the case we were debating allegedly did).  It must take quite a bit of determined effort to miss the scorn and mockery in the article I'd linked, but a partisan will make that effort.

Another classic example of whataboutism I've been citing comes from Martin Luther King Jr.'s 1967 "Beyond Vietnam" speech (emphasis added):
As I have walked among the desperate, rejected and angry young men I have told them that Molotov cocktails and rifles would not solve their problems. I have tried to offer them my deepest compassion while maintaining my conviction that social change comes most meaningfully through non-violent action. But they asked, and rightly so, what about Vietnam? They asked if our own nation wasn't using massive doses of violence to solve its problems, to bring about the changes it wanted. Their questions hit home, and I knew that I could never again raise my voice against the violence of the oppressed in the ghettos without having first spoken clearly to the greatest purveyor of violence in the world today, my own government. For the sake of those boys, for the sake of this government, for the sake of the hundreds of thousands trembling under our violence, I cannot be silent.
When someone accuses me (usually accurately) of whataboutism, I reply demurely that if whataboutism was good enough for Dr. King, it's good enough for me.

As this example indicates, whataboutism is really a basic component of critical thinking.  That emerges in Walter Kaufmann's formula for his canon, quoted here:
Confronted with a proposition, view, belief, hypothesis, conviction -- one's own or another person's -- those with high standards of honesty apply the canon, which commands us to ask seven questions: (1) What does this mean? (2) What speaks for it and (3) against it?  (4) What alternatives are available?  (5) What speaks for and (6) against each?  And (7) what alternatives are most plausible in the light of these considerations?

Now it may be objected that doing all this is rather difficult.  But has it ever been a condition of virtue that it required no great exertion?  On the contrary.

Patricia Roberts-Miller's discussion of demagoguery and debate is also helpful, if only to show that Kaufmann's formulation of critical thinking isn't unique to him.

Another entertaining example of whataboutism went viral recently when a young Anglo-Asian journalist yelled "I'm a communist, you idiot!" at Piers Morgan on a British talk show.  Morgan had been hectoring her about the protests greeting Donald Trump's visit to the UK, asking why there hadn't been any protests when Barack Obama came to call.  Now, in fact, there were protests during Obama's visits to the UK; it's hard to tell under the cross-yelling, but I think Ash Sarkar tried to say so.  Morgan finally referred to Obama as Sarkar's "hero," which prompted her now-famous reply.

Morgan did have a point, one that Obama's critics on the left made repeatedly during his time in office: people who objected to George W. Bush's policies and actions suddenly tolerated or celebrated them when they became Obama's policies.  Antiwar activism diminished during Obama's presidency, for example.  But not all activism disappeared: protests against Obama's immigration policies were big and troublesome enough to push him to try to disarm or appease his critics.  It doesn't matter what Morgan's intentions were: he raised a valid question, and Sarkar tried to answer it.

Whataboutism criers claim that whataboutism is intended to stop debate.  It can be, but so is the accusation of whataboutism.  (More whataboutism -- whatever shall we do, wherever shall we go?)  One reply to this objection would be that whataboutism can also be intended to start debate.  The trouble seems to be, as I've noticed before, that most people have no idea how debate works, what to do once their opponent makes a first rebuttal.  They know their talking point, which they got from a meme on Facebook, but it stops there.  They've seen that this or that liberal hero/ine EVISCERATED or DESTROYED the Rethugs with a single well-chosen word, and they think they can do it too.  But those Rethug targets, far from being destroyed or eviscerated, are still in good health and at large.

The proper use of whataboutism is to find out how consistent someone's position is.  If they are outraged by Obama's feet on his desk, did they object to (or even notice) Bush's feet on the desk?  If Bush's surveillance of American citizens was outrageous, did it stop being outrageous when Obama did it?  Remember, though: a responsible critical thinker will ask him- or herself such questions before an opponent raises them: not to win debating points, but to test whether he or she has a sound position.  As Nietzsche said, "A very popular error: having the courage of one's convictions; rather it is a matter of having the courage for an attack on one's convictions!!!"

When I point out the US' record of interfering in other countries' elections, I'm not trying to stop debate: I want to start it.  I have some relevant and important questions for the Russiagaters.  For example, should other countries respond to US interference as they want the US to respond to Russian interference?  One difficulty is that it's not at all clear, as Seth Ackerman and Lyle Jeremy Rubin pointed out, how they want the US to respond.  As with liberals' vaunted concern over Syrian atrocities, they are upset and want everybody to know they're upset, but they don't know and don't much care what should be done.  Concrete suggestions, on the rare occasions when they were offered, consisted mainly of a US war against Syria.  As Ackerman pointed out, there's similar vagueness about what we should do about Russia.  There's a lot of posturing and xenophobic grandstanding, but little substance. Some measures for better cybersecurity have been enacted, but they are mostly not being implemented; Ackerman linked to a Politico story which reported that although millions of dollars had been appropriated to help states improve voting security, little of that money has been used.  But, I suppose, it's whataboutism to bring that up.

Sunday, July 22, 2018

For There Is More Joy Over One Sinner Who Repents

Someone recently recommended Janina Bauman's A Dream of Belonging: My Years in Postwar Poland (Virago, 1988) in a way that piqued my interest, so I checked it out of the library.  Bauman was a Polish Jew who survived the Holocaust; she and her husband, the philosopher Zygmunt Bauman, moved to England from Poland in 1971.  A Dream of Belonging is a sequel to a previous memoir, Winter in the Morning: A Young Girl's Life in the Warsaw Ghetto and Beyond; as the subtitle indicates, the newer book recounts her life in Poland after the war ended.

Bauman worked for several years in the Polish film industry.  She tells of one of her coworkers, who, although he was (like Bauman) a member of the Communist Party, was not trusted by his superiors.  Eventually he told her why. During the war he had risen in the Polish army through distinguished action:
After two years or so, he was suddenly demoted, deprived of his rank, and discharged from the army.  The reason for such a severe punishment seemed unbelievably trivial.  Before the war Marek had as a child belonged to the Polish Scouts.  After the war this organization was blacklisted by the new regime.  Marek had never thought of mentioning his youthful involvement to his superiors.  So, when they somehow dug it out, he was accused of concealing his political past.  And this was a very serious sin.  This was the skeleton in Marek's cupboard for which he was to be punished till the end of his life, it seemed [112].
This mirrors the purges of premature anti-Fascists and other leftists in the West after the war ended.  But the last sentence I quoted reminded me of some defenses I've seen of George W. Bush by liberals, such as "So I guess we have to hate him for the rest of our lives. No one gets a break at the Intercept right?"  This refers to criticism of Bush by people who aren't impressed by his recitation of anti-Trump boilerplate, which, accompanied by his appearance on Ellen DeGeneres's TV show and a hug from Michelle Obama, signaled Dubya's rehabilitation in the eyes of the Party.  (The Democratic Party, you understand, which puts a whole other color on the phenomenon.)

I shouldn't make too much of this (just try to stop me, though), but it seems to me that at least some Democrats are unconsciously echoing stories like Marek's, the lament that it's just not fair that some poor guy should be punished and hated for ever just for a little mistake in his youth.  In Marek's case, I agree, but I lived through eight years of Bush's presidency, and Bush is no Marek.  Marek's "offense" was buried in his personal past, and more than balanced by his heroism during the war.  Bush's offenses were committed in the public eye, with worldwide publicity.  Marek belonged to an organization that was not proscribed at the time he belonged to it.  Bush committed terrible crimes (aggressive war, torture) that were known to be crimes, which he celebrated publicly.

Nor has Bush ever expressed any contrition for his crimes -- let alone been held accountable for them.  He didn't even suffer the minimal accountability of being voted out of office: he served two terms and retired to Texas to paint pictures in comfort.  Just attacking Trump for doing much the same things he himself did (while never acknowledging the continuity) is not contrition.  I might be too strict, but I wouldn't take him seriously unless he presented himself to the International Criminal Court as a war criminal, ready to stand trial; but needless to say, he hasn't even begun to acknowledge his crimes, let alone accept responsibility for them.  And I think it's significant that so many liberal Democrats are willing to forgive and forget, even though Bush has never bothered to ask for forgiveness -- and really, it's not for them to forgive him anyway.

(Notice, by the way, that in the interview I linked to above, Nancy Pelosi theatrically begged Bush's pardon for confusing him and Trump.  I'm sure a Christian man like Dubya would graciously grant her pardon for the blasphemy.)

It's as if they don't realize just how serious his crimes are.  His crimes -- starting two wars on false pretenses, which resulted in the deaths of hundreds of thousands of innocent people, the injuring of many more, and the turning of millions into refugees, and that's only a partial list -- are not petty traffic offenses to be wiped from his record lightly after he undergoes probation and counseling.  But then most Americans never consider such peccadilloes to be serious crimes, as long as American presidents commit them.

Wednesday, July 18, 2018

Where Do Credentials Come From, Mommy?

On to other matters. David Sirota posted this a couple of days ago:

Sirota's a smart journalist, and I take him seriously.  But in this case -- no, I don't think so.

First, I thought polls showed that most people hate the media because they see them as too adversarial, too hostile to wealth and power.  Judging from what I hear from people I know and those I read on social media, that belief probably divides along partisan lines, as when Republicans believe that the media just make up stuff about their President.  I noticed that many Democrats believed that Hillary Clinton lost because the Lying Media misled the public about her.  So it seems to me that many people want the media to be loyal to wealth and power as represented by their own party.

Second, I question the word "ideologically."  Most people don't know what it means anyhow, and I don't think that loyalty to wealth and power is usually ideological.  Rather it's opportunistic ("I've got mine and I want to hang on to it"), personal (because a person likes and identifies with the wealthy and powerful, hoping that some of the goodies will trickle down), and self-interested.

Third, a 2016 poll
found respondents valued accuracy above all else, with 85 percent of people saying it was extremely important to avoid errors in coverage. Timeliness and clarity followed closely, with 76 percent and 72 percent respectively saying those attributes were imperative among media sources.
“Over the last two decades, research shows the public has grown increasingly skeptical of the news industry,” the report reads. “The study reaffirms that consumers do value broad concepts of trust like fairness, balance, accuracy, and completeness. At least two-thirds of Americans cite each of these four general principles as very important to them.”
That's moving and inspiring, only -- how do respondents know whether a news source is accurate?  From what I see, for many people "accuracy" means that the news tells them what they want to hear.  How often do most people pay any attention to the corrections department of their newspaper?  The meanings of terms like fairness, balance, and the like are disputed in the press, and I don't see any reason to believe that news consumers are any clearer about them than the professionals.  I see many people who dismiss news sources lightly -- ah, they'll print anything if it'll make them money, they're just trying to divide us, etc.  Again, all very well, but they still seem to trust some sources, usually Fox News or Breitbart or Limbaugh or Michael Savage or Rachel Maddow.  If they trust them, it isn't because they've carefully examined the facts (wherever they would get facts in the first place) and subjected them to rigorous scrutiny, and decided that their preferred network or commentator is reliable.

Interestingly, though, those who are cynical about "the media" in general tend to trust local news sources, which isn't a bad idea except that local news sources are increasingly non-local in ownership and control.  It's like public education: many people are sure that Our Schools across the nation are failing miserably, but they like and trust their own local schools.

Which makes this column from last December, by the Washington Post's media columnist Margaret Sullivan, interesting.  She found that when they were allowed to discuss the question at length, instead of replying to narrow poll questions, people (she talked to several dozen, outside the Beltway) had a much more nuanced understanding of the news.

But not that much more nuanced.  She heard a lot of complaint about "bickering" in the media, for example from this "young Trump voter":
"I wish the bickering would stop," he said, referring to commentary by pundits. He listens to Fox News Radio on SiriusXM, and he compares what he hears there with what's offered on CNN and other news outlets, like NBC News, which he watches some evenings: "There's a pretty stark contrast between what they report." So he weighs them against each other. "At this point, I'm pretty tired of it all," he explained, saying he wants reporting presented straight — just the facts, with less opinion attached. "It's called the news — it's not supposed to be about their agendas." Journalists, especially cable pundits, he said, "need to grow up."
I doubt very much that the bickering will stop, at least not until people stop listening to it.  I've run into numerous people with the same complaint, and I've begun asking them why, if they hate the bickering so much, they continue listening to it.  At best they stop for a while, and then they tune back in.  But panels of pundits aren't all there is to the news media, as this guy seems to realize.  I almost never listen to such roundtables, having quit decades ago when I realized how useless they mostly were -- and in those days the emotional temperature they exhibited was a lot lower.  It happens that I recently watched the now-notorious "I'm a communist, you idiot" segment from Good Morning Britain on YouTube; after watching Piers Morgan's antics, I can understand why people would object to this sort of thing -- so why do they watch it at all?  Many clearly do; what do they get from it?  It can't be for the information.  I was amazed at the utter lack of professionalism Morgan displayed; if he gets away with it regularly, someone at ITV must like him.  I prefer reading text, largely because it filters out grandstanding like Morgan's.

Wanting the news to report "just facts" seems no less quixotic to me.  It's true, the line between "news" and "commentary" is blurred, but it always has been.  When I see old (Sixties or Seventies vintage, say) news stories from the New York Times, I'm constantly struck by the amount of commentary that found its way into them.  But even leaving that aside, which facts an outlet chooses to report, which stories it finds significant, which stories it ignores, what facts it leaves out, who it talks to -- all these factors put a slant on the "news."

Sullivan puts heavy stress on the claim, which I don't dispute, that reporters almost never make up stories.  I bet she'd be surprised to find out that a harsh press critic like Noam Chomsky not only agrees with her, he regularly commends the professionalism and competence of most mainstream reporters.  He directs most of his criticism at the institutions: the companies, the corporations, the publishers, the editors.  The Herman-Chomsky Propaganda Model, contrary to the fantasies of many of its critics, is not a conspiracy theory but a theory about institutions and their interests.  I'm less bothered by the fact that many media/press people mistake it for a conspiracy theory -- that's a natural human defense mechanism -- than that many of Chomsky's fans make the same misreading.

Like many people, Sullivan is concerned that so many Americans get their news from Facebook or other social media.  She reports that a Bernie Sanders supporter is "aware that some of what's on social media has been fabricated."  The thing is that neither Facebook nor Twitter nor any other social media outlet I know of produces its own news: I don't "get news from Facebook or Twitter," I get references and links and recommendations, which point me to reports and stories and essays I may decide to read at length.  I know not to believe everything I see there; what I find fascinating is that so many of the same people who are vocally cynical about the media are also extremely credulous, shocked to learn that The Onion just makes stuff up and you aren't supposed to take their stories as fact.  They have even more difficulty understanding that there are people who really do make stuff up and post it in order to fool people -- the piteous photos, for example, of little children in hospital beds: if you give them an amen and a like, then Facebook will donate a dollar to their medical fund.  Cynicism and credulity are highly selective.

Sullivan continues:
Much worse was my conversation with Jason Carr of Green Bay, Wis., a middle-aged member of the Oneida Nation who was visiting his girlfriend in western New York. Wearing a "Born to Chill" T-shirt and sitting behind the wheel of his Ford F-150 pickup truck in a KeyBank parking lot, Carr told me that media reports strike him as nothing but "a puppet show" that is "filtered and censored" by big business. He buys into the conspiracy theories that the United States government was responsible for the 9/11 attacks and that the 2012 massacre of Connecticut schoolchildren at Sandy Hook Elementary School was staged. Carr didn't vote in the presidential election and said there's nothing the news media could do to earn his trust. "I don't believe anything they say," he said. "They get paid to be wrong." I left the conversation shaking my head, knowing that, as is clear from the huge following of sites like the conspiracy-promoting Infowars, he's far from alone in his beliefs.
My question for Jason Carr, if I could ask one, would be which sources -- media, that is -- he gets his information from, and why he trusts them over other media.  He didn't invent the "conspiracy theories" he believes; he got them from somewhere, and he trusts those sources.  Why them and not others?  Sullivan, I suspect, would regard Carr's sources, whatever they are, as untrustworthy because they're uncredentialed.  But where do credentials come from?  Even in the good old days before the Internet, anyone could start a newspaper, and most of the newspapers in the US used to be independently owned and run.  It was not uncommon for even small communities to have two newspapers, a Republican one and a Democratic one.  So which one was real news, which one was trustworthy?  How did you decide which source to trust?  The same way you do nowadays on the Internet: by exercising your critical faculties.  It's good to bear in mind that corporate media tend to report the news from the point of view of the stockholder class; but there is no source that has no point of view.  There is no magic crystal ball that will bring you the Truth, allowing you to shut off your mind.

This doesn't mean assuming that the media are out to lie to you; it means assuming that the news is produced by fallible human beings with biases, and with the best will and intentions they won't always be right.  Their stories will be partial, both in the sense of incomplete and that of biased.  So I (and you) need to be aware of that, as well as of our own limitations and biases.  It's not always easy, though it gets somewhat easier with practice -- and then one day you find that your bullshit detector failed spectacularly and you fell for an amazingly bogus story.  Whereupon you pick yourself up and go on, chastened and ready to continue learning.

Tuesday, July 17, 2018

Tea for Two, and Two for Tea

A final note on Toni A. H. McNaron's Poisoned Ivy, which I finished reading over the weekend.  Overall, it's a valuable, useful book, thanks to the breadth of McNaron's experience and the material she received from the respondents to her questionnaires and interviews.  It's somewhat out of date by now, twenty years after it was published, but it provides a snapshot of that point in time; I might look to see if anyone has done more recent work of the same kind that would indicate how much has changed since the late 1990s.

What I'm commenting on here, then, is not the book as a whole, but the occasional moments that made me sit up and wonder, "Now, where did that come from?"  Like this one, quoting one of her respondents.  Parenthetical remarks (in brackets) are McNaron's.
A second instance [in which the administration tried to force a faculty member to leave] was [around] 1962-63.  The man this time was in the discipline of art.  He also joined the faculty the same year as did I.  He was arrested with several others at a local "tea room."  [In English gay parlance, "tea room" became the name for a place where one could meet other men.]  [155]
Whoa!  A "tearoom" -- I've usually seen it written as one word -- is not "a place where one could meet other men," which could mean a bar, a party, a social circle in someone's home.  A tearoom is camp slang for a public restroom where men go to seek quick anonymous sex.  The term is American, not "English" (in the sense of British) as far as I know; in England such sites are known as "cottages," and gay men go "cottaging" when they fancy a quick one.  It passed from in-group code to academic awareness after the sociologist Laud Humphreys published his research on tearoom trade in 1970.

Considering the notoriety of Humphreys' work, I find it hard to understand how McNaron got this wrong.  Since she's never cruised a tearoom herself, I presume she got the information from someone else, who misled her, perhaps through euphemism.  And none of her advance readers, no one at Temple University Press, caught the error.  Again, this doesn't mean that Poisoned Ivy is worthless.  I'm just concerned, because strange errors creep into academic publications, which may confuse or mislead others who read them, especially students.

Saturday, July 14, 2018

The Usual Riffraff

As I hoped, Poisoned Ivy got better once I was past the preliminaries.  I like the way Toni McNaron used her own experience, the experience of people she knew, and the experiences reported by respondents to her questionnaires.  I'm still not happy with her writing, but I can overlook that.

In the chapter on university administrators, she quotes a 1978 article on research similar to hers, by gay academic Louie Crew* (he's still alive, bless him):
The following comments come not from the minds of the usual riffraff, but from the graffitic imaginations of persons distinguished by being chairpersons of departments of English in colleges and universities across the United States, writing with anonymity in the margins of a questionnaire.

"Gay Persons" -- do you mean queers?
This is the damndest thing I have ever seen!
Returned with DISGUST!
God forbid!
Tell me, Louie, are you a daisy?
Your questionnaire has been posted on our department bulletin board and has been treated as a joke.
That was forty years ago; Poisoned Ivy was published twenty years later, twenty years ago.  There has been some progress since those days, and few senior academics in the US would write or say such things aloud anymore.  And do you know why?

Because of "political correctness," that's why.  They know that if they said such things aloud, intolerant leftist heresy hunters would start screaming postmodernist, relativist abuse at them.  So they've learned to repress and hide their traditional values, for fear that they'll be attacked as "bigots," even monsters.  America has so far abandoned civility (which, as you can see, is fully compatible with the kind of schoolyard abuse those highly educated men scrawled, while leaving off their names) that these respectable and basically decent academics must cower in fear that the fundamentalist wing of gay advocacy will call the drones on them.

Of course I'm (half) joking; but only half.  I'm exaggerating in my parody of civility fetishists, but not by much.  They'll tout the inarguable progress we've made over the past half century, while conveniently leaving out that that progress was made, not by civility (by which they basically mean hunkering down, keeping quiet, and hoping that bigots will change all by themselves), but by rocking the boat, making waves, complaining, agitating, and demanding that bigots and bigots' enablers change their behavior.

As the examples I linked to show, mainstream sympathy for the most vicious homophobic frothers isn't a product of the Trump era.  Excuses will always be made -- bad excuses, but that I suppose is better than no excuses at all.  What is inexcusable is getting in these guys' faces and telling them that they're bigots.  It doesn't really matter how humble, how civil, how incremental you are: any criticism at all is too much.  Remember the amazing candlelight vigils that, week after week for months without violence, helped bring down South Korean President Park Geun-hye?  It turns out that the South Korean
Defense Security Command (DSC) drew up plans last year to mobilize hundreds of tanks and thousands of troops to quell candlelight protests against then impeached President Park Geun-hye, a civic group claimed Friday.

The Military Human Rights Center for Korea disclosed what it claimed was a DSC document drawn up in March last year to outline ways to impose wartime martial law in case the Constitutional Court rejected the National Assembly's impeachment of Park and kept her in office.

According to the civic group, the DSC suggested responding to candlelight protests by declaring garrison decree first in light of the negative connotations of martial law. If the situation deteriorated further, martial law should be considered, it said.
It's not clear why these plans were abandoned, as they fortunately were.  But you see, even the most civil, best-behaved protests are unacceptable to the powerful.  So we mustn't allow ourselves to be gaslit by apologists for the powerful and the bigoted who try to explain that they wouldn't mind our criticism if we'd just be nicer about it.  They regret the changes that have occurred, and would like to turn back the clock to the Olden Days, when the lowly who weren't meek enough could be thrown out on the street, jailed, or killed.

------------------
*"Before Emancipation: Gay Persons As Viewed by Chairpersons in English," in The Gay Academic, ed. Louie Crew (Palm Springs CA: ETC Publications, 1978), p. 3.

Thursday, July 12, 2018

Dude, I'm a Gay and Lesbian Academic

It's been a busy couple of days, and I'm already feeling swamped by topics I should write about.  So I'll be sneaky and do what I hope will be a quick and easy one.

I found a copy of Poisoned Ivy: Lesbian and Gay Academics Confronting Homophobia (Temple UP, 1997) by Toni A. H. McNaron at the library book sale the other day, and it looked interesting, so I bought it.  McNaron, who began teaching in 1964 at the University of Minnesota, surveyed a generational sample of LGBTQ academics with questionnaires, and I'm always interested in seeing what people have to say about their experiences.

But once I sat down and started to read the book, I was frustrated by McNaron's writing.  Like so many academic writers (though not only academics, I concede) she thanks various friends and colleagues and editors for assiduously going over and improving her prose.  I can only wonder what it looked like before they worked on it, and with that in mind I too must thank them for their efforts.

More important, though, I keep stumbling over strange errors that apparently no one caught despite the numerous hoops that academic writing must jump through.  For example:
In 1973, the American Psychological Association (APA) removed homosexuality from its catalogue of diseases, reducing its classification from psychosis to neurosis [17].
This is a mess.  First, it was the American Psychiatric Association (APA) that in 1973 removed homosexuality from its Diagnostic and Statistical Manual.  It's easy to confuse them with the American Psychological Association, since they have the same initials; many people do, I've done it myself, and so does Google, which brought up this New York Times article on the American Psychiatric Association when I searched for the American Psychological Association.  Even the organizations themselves get confused: this American Psychological Association page says that the APA has opposed stigmatization of homosexuals since 1974, while this one says 1975.

Second, while homosexuality was removed from the DSM-II, there was enough dissension among psychiatrists that a new category replaced it: sexual orientation disturbance, which meant that if you felt bad about being gay, a practitioner could take your money to make you feel better about it.  Given the poor results of most psychotherapy, I wonder how effective such treatment actually was.

Third, I can't find that homosexuality was ever classified as a psychosis by either APA, though some individual practitioners may have done so, if only as a term of abuse.  As far as I can tell, then, McNaron's claim that homosexuality went from psychosis to neurosis is false.  It also conflicts with her own statement that homosexuality was "removed" from the DSM; if it was simply reclassified as a neurosis, it was still a "disease" and so was not "removed."  Since both of us are old enough to remember that period, I wonder where she got this interesting misconception.

Next, McNaron writes:
Since much queer theory argues against identity politics as being too solipsistic and narrow to be helpful in understanding a post-modern world, it has become possible for a faculty member to conduct and publish research about gayness or lesbianism without necessarily being gay or lesbian.  To the extent that this new field of inquiry provides a protective umbrella for some faculty who might otherwise refrain from integrating their sexual orientation into their work, it can only benefit students and faculty alike.  To the extent that it runs counter to the ideas of an older generation or academic era, those who continue to advocate for greater visibility in asserting the existence of intimate and unavoidable connections between the personal and the intellectual, queer theory runs the risk of diluting gains made at great risk to individual faculty members [18].
My objections to McNaron's analysis here are perhaps less factual than interpretive, but there are still facts she leaves out.  (However: "solipsistic"?  It's a much-abused word, but ...)  First, before the rise of openly gay and lesbian scholarship in the 1970s, academics took for granted that only heterosexuals could be impartial and objective about homosexuality, so gay and lesbian academics who wrote about the topic didn't reveal their personal connection to their material because to do so would have discredited them in their profession.  An example that comes to my mind is Laud Humphreys, the sociologist whose controversial observations at sites of gay men's anonymous sexual encounters, published as Tearoom Trade (Duckworth Overlook, 1970), nowhere revealed that Humphreys (who was heterosexually married) was himself gay, though he did acknowledge it later.  Two decades before Humphreys, Alfred Kinsey presented his research team as married heterosexual males, though he and some of his team weren't exclusively heterosexual; but the reason was the same, to comply with professional and cultural norms of objectivity.  One of the motives of openly gay and lesbian scholars was to demolish the notion of objectivity; it's not just a "post-modern" concern.

When openly gay and lesbian scholars began to emerge and publish in greater numbers in the 1970s and afterward, they took different approaches to this problem, though this was, again, controversial, flouting professional norms of impersonality.  Some, influenced by Second Wave feminism, wrote more personally, but most continued to produce professional work that left the observer out of the discussion.  Often personal revelations were confined to prefaces and introductions.  I've seen some disagreement about the extent of this greater personalization, but this is how I perceived it as an interested observer during that period.

As for queer theory, the distinction between it and "gay and lesbian studies" was never well-defined or -maintained.  The textbook The Lesbian and Gay Studies Reader (Routledge, 1993), for example, contains many contributions which, properly speaking, are queer theory, including an important excerpt from Eve Kosofsky Sedgwick's ovarian queer-theoretical Epistemology of the Closet (California, 1990).  Sedgwick was also controversial because though she at times would accept the label "queer," she was heterosexually married, and both gay-and-lesbian-studies and queer-theory types disputed whether she really was queer and whether she should be doing queer theory if she wasn't.  Identity politics has been disavowed by queer theorists, but they have their own identities and their own politics about them.  The younger queer scholars I've read or met don't seem interested in excluding their queerness from their work; they have other fish to fry.

So, whatever effects queer theory may have had on academics, McNaron's claims seem dubious to me.  One effect of greater gay visibility was, in my opinion, that it made it harder for scholars to do work on homosexuality while dodging questions about their own sexual orientation.  Older scholars, who'd grown up in a time when homosexuals were expected (under great coercion) to pretend, as much as possible, that they were not One of Those People, even when everyone around them knew otherwise, no doubt found it difficult to adjust.

What I've read of Poisoned Ivy so far confirms this.  One of McNaron's informants describes how a closeted colleague torpedoed his appointment to a choice position by tattling about his erotic past to the college president.  "He was obviously afraid I would expose him as a closeted gay," the informant writes (16).  Really?  There are other ways to read the incident.  One is that the informer exposed him partly to divert attention from himself: by fingering someone else, he could prove his own normality. Another is that while he was aware of his own vulnerability to exposure, he disapproved of anyone but himself being queer: he was different, a respectable academic, and this young upstart a disreputable perv.  It's impossible to say for sure in this case, at this distance in time, but I have known people with this attitude, and the trashier their own private lives were by their own standards, the more outraged they were by others.

I've peeked ahead in the book, and there is more to come.  Still, I hope to learn something by reading on, so I will.