Friday, May 27, 2016

First They Came for the Politically Correct ...

Now, this is interesting; an exchange between Atlantic blogger Conor Friedersdorf and an anonymous 22-year-old Donald Trump supporter from San Francisco.  But this bit from Friedersdorf's correspondent seems to say a lot:
In my first job, I mentioned that I enjoyed Hulk Hogan to a colleague who also liked the WWE. I was not aware at the time, but Hogan had recently made news for his use of some racial or homophobic slur. I was met with a horrified stare. By simply saying I liked his showmanship, I was lumped into saying I too was racist or homophobic.

I feel like I have to hide my beliefs.
I've noticed that PC criers, whether they're Republican or Democrat or something else, are very fragile snowflakes. Notice that it's not clear he got anything more than a "horrified stare," yet he knows that he was "lumped into saying [he] too was racist or homophobic." Maybe I'm wrong, but I can only go by what he wrote. So even a "horrified stare" (also his interpretation) makes him "feel like I have to hide my beliefs."  Wow. Even if his interpretation was correct, I've always gotten such lumpings from the Right.  (Oh, you don't think we should invade Iraq? I bet you want Osama Bin Laden for President!)  Now, think of the good old days (which are still with us in most of the United States) when you could lose your job for being gay, and you couldn't even get certain jobs or go to certain schools if you weren't white. Think of the physical attacks that targeted civil-rights demonstrators, or the school kids who had to be guarded by armed soldiers to enter a previously whites-only public school. And even then they weren't safe.  But this guy is too tender to cope with political disagreement. He talks as though he shouldn't encounter any expression of disagreement at all.

What's bitterly funny is that this twerp believes that Trump is more tolerant than the people he calls "PC," and he likes the idea that if Trump became President he'd be safer from PC (which is white-person politically-correct jargon for disagreement). He's talking about a guy who encourages vigilante violence in his rallies, remember. "... I think Trump would likely do what he can to protect free speech" is merely delusional, much like the fantasies many Democrats had about Obama in 2008. I think this whiner might find that he won't be as safe as he likes to think in a Trump America.  He has an Asian-American girlfriend, for example, and Asian-Americans have been targets of racist violence by nativist thugs in the recent past; they could be again.  I think things would likely get worse, but he doesn't mind as long as someone else is the target: "Admittedly," he writes, "I do not focus on the human cost either."  I do, and I'd rather not find out what it will be. Things are bad enough already.

This doesn't mean I'm not hostile to the authoritarian tendencies among many liberals and "progressives," including, alas, the gay movement. I've often written and spoken out about it. The typical "Oh how can you say such awful things!?" liberal reaction to offensive statements deserves scorn, and I give it. We need more and better debate and discussion, and people need to inform themselves, and we all need to speak out more without trying to punish those we disagree with, even when we think they're badly wrong.

Tuesday, May 24, 2016

The Bengal Famine Commemorative Tea Towel
Keep Calm and Click On It, you know you want to.  But maybe not; I suppose and hope this meme has passed its sell-by date by now.

I'm reading The Ministry of Nostalgia by Owen Hatherley, published earlier this year by Verso, and so far it's very good.  (An excerpt here; you can order the book directly from the publisher, including the e-book at about half what it would cost from Amazon.)  Among other things, I've learned where the original Keep Calm poster came from (the British propaganda ministry in 1939, but it was never used officially, and only surfaced at the end of the twentieth century).  Hatherley also discusses "hauntology," an ambivalent alt-music subgenre mixing nostalgia with anxiety that sounds interesting.  And to give credit where it's due, I stole the title of this post from page 72.

But what I want to write about today is a thought that occurred to me soon after I began reading The Ministry of Nostalgia.  There is a lot of nostalgia (or "amnesia turned around," as the poet Adrienne Rich called it) for the World War II period, in the US no less than in the UK.  It occurred to me how odd this is, when the nostalgia is expressed (as it often is) by people who see government as a danger to freedom.  The war years were a time of Big Government on joy juice: enforced conformism on a scale that you only see in that kind of war, with censorship, rationing of food and other goods, wage and price controls, limitations on union activity, and a general disregard for individual freedom in the name of a greater collectivist good. Winston Churchill had to cope with looting by civilians and by officials during the Blitz.  Of course it was also a time of resistance, with black markets in consumer goods, complaints (however muffled and cautious) about restrictions on travel, and profiteering by business and individuals when they could get away with it.

Which reminds me that the genuine threat of external danger didn't entirely stop people or organizations from looking for someone they could bully and attack at home.  For example, Manning Marable wrote in his biography of Malcolm X (Viking Press, 2011),
In response to blacks' modest gains in employment [in the 1930s and 40s], thousands of white workers participated in "hate strikes" during the war years, especially in skilled positions. In July 1943, for example, white racists briefly paralyzed part of Baltimore's Bethlehem Shipyards. In August the following year, white streetcar drivers in Philadelphia, outraged at the assignment of eight black motormen, staged a six-day strike. In response, Roosevelt dispatched five thousand troops and issued an executive order placing the streetcar company under army control [56].
To say nothing of the Detroit race riots, also in 1943, when white thugs ran wild for three days, killing 25 African-Americans and doing millions of dollars' worth of damage. (Seventeen of the blacks were killed by police; none of the nine whites who died were killed by police.)  Time magazine's retrospective account is interesting:
As a matter of fact, the Axis propaganda machine predictably jumped all over the news of America’s 1943 race riots, citing them as evidence of a corrupt, weak and fatally divided culture. (A few years later, of course, that corrupt, weak, fatally divided culture emerged from the war victorious and more powerful than any other single nation on the planet.)
That victorious, powerful culture remained fiercely racist and imperialist, both in policy and in popular attitudes.  But Time had to wave the flag after criticizing the rabble in Detroit (and in Texas, Alabama, and California) for giving aid and comfort to the enemy; they wouldn't want to go too far.

My point here is not so much the unreliability of nostalgia, which isn't exactly news, but that people who indulge in it are longing for a time they'd have hated to live in, and to some extent they're aware of it.  The culprits in England include the supposedly anti-government Thatcherites (though Thatcher was all for repression on her terms).  The Bush administration, like the Johnson administration planning its war in Vietnam, knew full well that it couldn't prosecute the War on Terror by asking Americans to make economic sacrifices -- indeed, Dubya urged Americans to do their part by shopping.  As Hatherley shows, the austerity of the war years is very different from the austerity we are being asked to accept today.

More to my point, Hatherley spends a lot of time on austerity nostalgia among the Left, though it's not uncontested there.  Hatherley quotes Perry Anderson's critique of the leftist historian E. P. Thompson:
Anderson claimed that when looking at the working class of the present, Thompson could see only that of the past: 'the divorce between his intimacy and concord with the late 18th and early 19th centuries, and his distance and lack of touch with the second half of the 20th century, is baffling.  It is a divorce that is evidently rooted deep in the sensibility'.  The composition of the working class -- who they are, what they do for a living, where the political fault lines might lie -- is increasingly ignored, in favor of a vague, windy, imprecise notion of 'The People' -- and 'who the "common people" are is never said.  They exist only as figments in this moralistic rhetoric.  The fact that the majority of the population in England in this period voted consistently for Conservative governments is brushed aside' [49-50].
We have the same problem in the US, where the phrase "We the People," used all across the political spectrum as if it were a single word, always makes me wary.  The cult of Woody Guthrie is, I think, an example of leftish austerity nostalgia here, epitomized by Bruce Springsteen in my generation.  (In England, the singer Billy Bragg recorded a couple of albums of Guthrie's lyrics with tunes Bragg supplied.)  Hatherley mentions Bragg as an "early adopter of austerity nostalgia in the 1980s," who in his memoir The Progressive Patriot "did not flirt with racism in the way that many of these writers have done; the 'patriotism' that he refers to was that of tolerance and multiculturalism" (37).  Compare the attempts of many American liberals, progressives and others to reclaim patriotism for the Good Guys "without [as Hatherley says of their British counterparts] partaking in any of the 'sad passions' that actually makes much of the right's politics so powerful -- resentment, hatred, bitterness" (38).

So, halfway through The Ministry of Nostalgia, I'm enjoying it and very pleased.  It's polemical and very quotable but still properly sensitive to ambiguity.  Someone should do a similar book about the same syndrome here in the US. 

Monday, May 23, 2016

Neo-pro, Neo-con

Daniel Larison recommended this article that attacks a claim made by a pundit for a respectable media outlet "that neoconservatives have been part of a broad foreign policy consensus dating back to the ’50s."

You know, I'm not entirely sure what neoconservatives are.  The pundit, Eliot Cohen, makes some risible statements, for example that "the two-generation-old American foreign policy consensus ... held that American interests were ineluctably intertwined with American values, and that when possible, each should reinforce the other, as when the promotion of liberty and human rights helped to weaken the Soviet Union."  Oh, yes, we all know how "American values" promoted liberty and human rights around the world, and continue to do so.  But Paul Pillar, Cohen's critic, has his own blind spots.
Dwight Eisenhower's presidency was one of foreign policy restraint. Ike didn't dive into Southeast Asia when the French were losing, he didn't attempt rollback in Eastern Europe, and he came down hard on the British, French, and Israelis during their Suez escapade. Richard Nixon's foreign policy was characterized by realism, balance of power, and extraction from a major war rather than starting one. Ronald Reagan, despite the image of standing up to the Evil Empire, didn't try to wage Cold War forever like some in his administration did. He saw the value of negotiation with adversaries, and when faced with high costs from overseas military deployments (think Lebanon in 1983-84), his response was retrenchment rather than doubling down. George H.W. Bush had one of the most successful foreign policies of all, thanks to not trying to accomplish too much with overseas military expeditions, and to his administration being broad-thinking and forward-looking victors of the Cold War.
I suppose most of these statements could be explicated in ways that would make them less absurd than they are at first glance, but that's because Pillar is overlooking, deliberately or through ignorance, facts that would complicate them, and perhaps undermine his argument.

Take his account of Eisenhower, who continued Truman's policy of massive support for the French war in Indochina, analogous to Obama's support for the current Saudi blitzkrieg in Yemen. True, when the French gave up Eisenhower didn't "dive in," if that's supposed to mean a direct invasion by US forces.  Instead Eisenhower undermined the political settlement that followed, by bringing in and supporting a viciously repressive client, which soon led to resistance by the Vietnamese and ultimately (less than a decade later) a direct US invasion by Eisenhower's successor.  Eisenhower also used covert action to overthrow governments that he considered insufficiently cooperative with US "interests."  Guatemala and Iran were what he considered successful interventions, both involving the installation of singularly brutal dictatorships that the US supported for decades; Indonesia was such a failure that his administration did their best to ensure it would be forgotten, with considerable success.  It's currently fashionable to whitewash Ike, but his main success was minimizing US losses, while maximizing casualties in the countries he chose as targets.

As for Nixon, his "extraction from a major war" didn't happen.  He extended and escalated the war in Vietnam while starting a new one in Cambodia, again with minimal US losses and maximal losses among Cambodians.  I suppose Pillar has in mind Nixon's "Vietnamization" program, which was supposed to turn the work of waging the war over to South Vietnamese forces, but the US remained involved in Vietnam throughout Nixon's tenure, and only got out under his appointed successor Gerald Ford.

Reagan, like Eisenhower, preferred "covert" (meaning, not publicized in the US but well-known elsewhere in the world) and proxy activity, but his first impulse was different.  (Why were US troops in Lebanon to begin with, for example?)  Bush the Elder invaded Panama and Iraq, again with minimal US casualties but maximal Panamanian and Iraqi losses.  His disinclination "to accomplish too much with overseas military expeditions" presumably refers to Bush's initial promise to support the Iraqi uprising against Saddam Hussein immediately after the Gulf War, and his subsequent inaction when that uprising was put down with harshness typical of US clients defending their turf. (It would not have required a military expedition to support the uprising, by the way; but letting Iraqi rebels use "captured Iraqi equipment" against Saddam wasn't acceptable to Bush.)  Bush's supposed aversion to overseas military expeditions is also belied by the unseemly haste with which he reacted with military force to the Iraqi invasion of Kuwait in the first place.

So Pillar's critique seems to overlook important contrary evidence and considerations about the post-WWII US foreign policy consensus.  Whoever the neoconservatives are, US policy has mostly involved state terror, violence, direct military intervention when possible and covert intervention by repressive American proxies when discretion required it.  Whatever the neocons wrought, it seems to have differed from the consensus mainly in degree, not in kind.

Saturday, May 21, 2016

This Way to the Regress

I just bought a digital copy of Conversations about Psychology and Sexual Orientation (NYU Press, 1999), by Janis S. Bohan, Glenda M. Russell, and several other contributors.  I've read it before, and found it useful enough (if only to criticize) that getting the ebook felt worthwhile once the price dropped to a reasonable level.

This time I went first to Leonore Tiefer's contribution, "Don't Look for Perfects: A Commentary on Clinical Work and Social Constructionism."  Tiefer is a psychologist, sex therapist, and professor of psychiatry, and the author of Sex Is Not a Natural Act (2nd edn, Westview Press, 2004), which I liked.  (It's about time to reread it, I guess.)  I was disappointed by the opening, under the header "Sexual Orientation: Oppression or Identity?"  (Don't you love false antitheses?)
Writing this commentary raises a great irony for me.  As a deep social constructionist, I see sexual orientation as an idea that emerged near the end of the nineteenth century as part of the new profession of psychiatry's effort to busy itself segmenting the behavioral and intrapsychic world into neat little boxes of normal and abnormal.  In my mind, the categories of heterosexual and homosexual cannot be separated from their historical origins -- everything else is rationalization and a more or less disguised fulfillment of that original psychiatric phase.

Fast-forward to 1998.  I am writing this commentary as a clinician, that is, a person who must and does think in terms of normal and abnormal (or else be a total hypocrite) in her or his work.  People consult me and listen to me because they have confidence that I can offer insight and advice based on some understanding of normal and abnormal.  The social changes of the past third of a century have erased the normal/abnormal dichotomy from sophisticated discussions of "sexual orientation."  Now, the term is merely descriptive -- whom does one love and desire, a person of the same sex or a person of the other sex (or both)?  The reality of the categories is taken for granted, and the big controversies are about etiology (which some might argue is a sign that the normal/abnormal dichotomy has not really been erased from sophisticated clinical discussions!) [77].
Tiefer does attempt in the succeeding pages to think of ways people might deal with their problems without the normal/abnormal dichotomy or a belief in an individual's "true sexual nature," ways which turn out to be fairly simple, intuitive and non-paradoxical.  They involve careful listening (a client-centered approach that is hardly new) and critical thinking.  I approve of these ideas wholeheartedly, but they don't have a lot to do with social constructionism.

Being a social constructionist, deep or shallow, doesn't by itself commit you to any particular historical narrative or to any specific construction of a category.  Tiefer's opening reminds me of a biologist I debated in the 90s, who said that his training caused him to seek biological explanations for every aspect of human behavior.  I responded that his training was, on his account, unscientific if science is supposed to seek knowledge without preconceptions; ruling out non-biological explanations in advance of the evidence is as invalid as ruling out biological ones.  It seems to me that Tiefer is taking the latter tack here, motivated by the same kind of binary she's trying to reject, and by an essentialist notion of "psychiatry." 

I think her account is also inaccurate historically, for the same reason.  "Sexual orientation" as a concept didn't originate in the nineteenth century, so Tiefer appears to be assuming that the concept has a nature that persists through changes of theory and terminology.  "Homosexuality" has always been an incoherent concept, and the advent of the term "sexual orientation" hasn't made it any more coherent.  As I've pointed out before, "sexual orientation" is formally defined as the direction of one's sexual desires, but in use it refers to gender inversion, and overlaps confusingly with "gender identity" and other incoherent ideas.

Nor do categories and classifications necessarily involve any assumptions about "the realities of the categories," though it's true that clinicians, like most people, tend to forget this.  Consider a hypothetical categorical division, "people shorter than six feet tall" and "people six feet or taller."  The differences between the classes are "real" for some understanding of "real," but they aren't absolute. This classification might be useful for some purpose or other, and it would be perfectly valid to use it -- until the clinician or researcher began thinking of the two groups as mutually exclusive and different from each other by nature.  In practice that doesn't seem to take very long.  The same is true of commonly used dichotomies like "masculine/feminine," "Catholic/Protestant," "theist/atheist," or "bisexual/monosexual."  Alfred Kinsey tried to use "homosexual" and "heterosexual" in this neutral way, and encountered fierce resistance not only from clinicians attached to the essentialist use but from later sex researchers who saw themselves as working in his tradition but moving "beyond Kinsey."

"Normal/abnormal" is a prime example of this confusion.  If "normal" simply means something like "what most people do" and "abnormal" means "what most people don't do," there's nothing invidious or harmful in the binary.  You can be part of a minority, even a small one, and that's just fine.  But most people, "sophisticated" or not, have trouble sticking with that construction.  Sooner or later, they figure that if you're not jumping off the bridge with everyone else, you're a loser, a freak, uncool, sick.  If there weren't something wrong with you, you'd be hanging with or at least admitting the superiority of the cool kids.  It's worth remembering of course that the normal-as-normative is often a trait of the few.  The cool kids are always a minority, often a small one.  People are ambivalent about this, and their attitude is driven by factors other than numbers.

Anyone who wants to assume "the realities of the categories" needs to remember that in the real world, categories tend to be porous, often with very wide variation among the members.  Kinsey, who can be classified as an "Aristotelian," cut his professional teeth by studying gall wasps.  This brought him face to face with the problems of classification, and he brought that approach to human sexuality, intending to map the variation in sexual behavior rather than locate essences in it.  The latter approach, which can be classified as "Platonic," dominated the sciences in his day as it still does.  Kinsey's critics argued that he should have looked for essences of human sexuality (the "normal") rather than being distracted by range and variety, which they saw as surface distractions from the Real.  (See Peter Hegarty's Gentlemen's Disagreement: Alfred Kinsey, Lewis Terman, and the Sexual Politics of Smart Men [Chicago, 2013] for an intelligent discussion of aspects of this controversy.)  But from what I know of him, I don't think Kinsey himself, any more than Aristotle, was a social constructionist.  Recognizing and studying variety is perfectly compatible with believing that the variety is a feature of the "real" world.

Consider a case Tiefer mentions later, "a patient who recently came to me when he was increasingly preoccupied with sexual fantasies about (as well as a budding sexual relationship with) his secretary." Tiefer notes that "sexual identity was not an issue for him," though there is an identity available for such a person, namely "adulterer."  "[But] the idea of 'true sexual nature' was ..." (81).  As with the transsexual patients Tiefer discusses, it's possible for a clinician to be useful to a patient without invoking "normal" or "abnormal," which really aren't useful categories most of the time anyway.  If your patient is, say, hearing voices that urge her to kill herself, normality and abnormality are beside the point.  But ditto for a benign or neutral situation, like a person who doesn't conform to official gender norms.  It was surely "abnormal" in various senses for men to grow their hair long in 1960s America (and remember that "long" was highly subjective when a military buzz cut was the standard), but there was nothing wrong with it, nor did it really say anything about their "gender identities."

There probably is no correct answer to a question like "Should I have an affair with my secretary?" or "Should I leave my husband?" or "Should I seek sex/gender reassignment surgery?"  ("Sex reassignment" and "gender reassignment," by the way, express different conceptual understandings, though the procedure involved is the same.)   Patients asking these questions may be looking for an authority figure to make their choices for them.  According to some constructions of authority, that may be what such figures are for, but that shows just how uneasy many people are about the idea of choice.  They want to believe that outcomes are predictable -- if you do this, you'll be happy; if you do that, you'll be unhappy -- and that someone (God, a priest, a doctor) knows which one to seek.  But there are no guarantees, and this, again, has nothing to do with social constructionism in itself, even if you want to believe, as many of its proponents clearly do despite their disavowals, that social construction gives a true picture of reality.

Wednesday, May 18, 2016

Make the World Go Away

One of my Facebook friends from high school posted today:
I'm thinking of a break from Facebook. I can't take the political shit that's going down. Bathrooms, Donald Trump bashing, people getting beat up. Can't take the stupidity I see on here. And the hate. How did we get so unforgiving to other people and their way of life. What happened to peace and love?
Well, I certainly sympathize; I've had the same idea myself.  But two things stop me. One is that many of the people who complain about negativity and hate on Facebook or elsewhere are complaining because they posted something hateful, dishonest, and negative on Facebook, and someone called them on it. (One good example, but there are many others, is the person who told me she'd like to shoot Mexicans for target practice.  She was very upset when I called her racist: suddenly I was being mean and political, and she was tired of all this political stuff.) Such people never take responsibility for what they themselves say.  Responsibility is for other people -- they can just exult in their Trump-like, Reagan-like talent for blurting out whatever pops into their heads, because they're telling it like it is, and they don't care who they offend.  But you'd better not offend them, because that's being negative, and they're very special snowflakes, too good for this world.

It also seems to me that one reason the world is in the trouble it is, is that so many people react to the problems they see by running away from them, hiding their heads in the sand, retreating into "mindfulness" and other egocentric practices. Apparently they believe that if they don't see it, it isn't happening, and if they ignore it, it'll go away by itself.  I decided as far back as high school that I wasn't going to do that. And it's not easy. But do you want to know why Trump, for example, is so successful? It's because people didn't want to engage with the people around them.

And it also seems to me that this is a First World luxury. I noticed this in high school too, when I heard white people saying that they were tired of hearing about black people's problems.  Not as tired as black people were!  (And are.)  Maybe my friend can make the world go away, but poor people can't. Black people can't stop police shootings of unarmed kids by giving up Facebook. I can't escape antigay bigotry by giving up Facebook. Women who need abortions can't get them by giving up Facebook. Children killed by Obama's drones can't come back to life by giving up Facebook.

But, as I say, I sympathize. Everyone has to set their own limits and do what they can.  My limits include not remaining silent when confronted by bigotry and injustice, which of course is considered negative and hateful by many.  I can live with that.  Or, to put it another way:

Except that I don't accept these Western either/or binaries: you have to do both.

Tuesday, May 17, 2016

Already It Was Impossible to Say Which Was Which

The Intercept reports that Donald Trump appears to be changing his stance on entitlements.  Up till now Trump has insisted that he would protect Social Security, Medicare, and Medicaid against attempts to cut benefits, promising to "focus on economic growth so that we’d get 'so rich you don’t have to do that.'"

But now he's backing down, or at least reconsidering.
Trump policy adviser and co-chairman Sam Clovis said last week that the real estate mogul would look at changes to all federal programs, “including entitlement programs like Social Security and Medicare,” as part of a deficit reduction effort.

Clovis made the comments at the 2016 Fiscal Summit of the Pete Peterson Foundation, an organization whose founder has spent almost half a billion dollars to hype the U.S. debt and persuade people that the Medicare and Social Security programs are unsustainable. Trump also met privately last week with House Speaker Paul Ryan, R-Wis., an outspoken Medicare privatization advocate.
All pretty predictable, no?  My first reaction was that this sounded familiar.  Didn't Barack Obama follow a similar trajectory?  Yes, he did.

When he was campaigning for his first term as President, he told an AARP convention on September 6, 2008 (via), "But his [McCain’s] campaign has gone even further, suggesting that the best answer to the growing pressures on Social Security might be to cut cost-of-living adjustments or raise the retirement age. I will not do either."

Of course, he did both.  In 2010 he appointed a commission to make recommendations on cutting the national deficit, packing it with deficit hawks (including Paul Ryan).  Despite the way Obama had rigged it, the Simpson-Bowles commission was unable to agree on conclusions, so Alan Simpson and Erskine Bowles wrote up their own recommendations, which President Obama and most of the media treated as if they represented the commission as a whole.  These recommendations included phasing in a raise in the "retirement age" (meaning the age at which a retiree can receive full benefits) to 69 and changing the index for cost-of-living adjustments so as to lessen those adjustments, resulting in a benefits cut of 3 percent according to the economist Dean Baker.  Obama also announced his willingness to use cuts in Social Security and Medicare to bargain with the Republican Congress on the debt ceiling.  The Business Roundtable, a gang of corporate CEOs, would prefer raising "full retirement age" to 70, and of course the usual Republican suspects in Congress were already on board for that.

So there's concern that "A Trump presidency would threaten programs like Social Security."  It probably would, but so would any other Republican candidate's presidency.  So, most likely, would a Clinton presidency.  But an Obama presidency already has.

Sunday, May 15, 2016

All Critics Are Equal, But Some Critics Are More Equal Than Others

I hope I can wiggle out of having to read this book, and even more I hope I can wiggle out of having to buy it.  I suppose I can just request that the library get a copy...

What I'm talking about is something called This Thing We Call Literature, by one Arthur Krystal, published earlier this year by Oxford University Press.   It got a laudatory review from Micah Mattix at The American Conservative, and you know, slogans like "In Defense of Great Books" are a red flag for me.  So for the moment, I'm just reviewing the review.  If Mattix misrepresented Krystal, I beg pardon, but the position sketched out here is one I've encountered before.

According to Mattix,
... Krystal’s main concern is not to chop individual writers down to size and extol others—though he does do some cutting—as much as it is to defend the value of hierarchical thinking with respect to literature. “The prevailing mood,” he writes, “regards hierarchies with suspicion: Who’s to say who is worth reading and who isn’t?” While a willingness to include “formerly disenfranchised artists and writers” in the canon is a good thing, “the fact that writers are all entitled to a fair hearing doesn’t mean that they are equal.”
I wonder if Krystal actually backs up the insinuation there, that "the prevailing mood" is that all writers are equal.  It sounds like he's buying into the widespread if not prevailing notion that equality means sameness.  The quotations indicate a fondness on his part for the gaseous cliché and le mot injuste, but then you don't have to be a good writer yourself to appreciate good writing by others.

Maybe I got it wrong, but I always had the impression that the call for reconsidering "formerly disenfranchised [not really the right word, Arthur] artists and writers" was motivated not by the belief that they were all equally good, but by the very reasonable belief that some good writers had been ignored because they had chosen to have the wrong skin color or genitals.  Granted, in the heat of the politicking, some people probably overstated the virtues of the artists they championed, but it's not as if that didn't happen with the usual white male suspects as well.  And the racism and sexism of the literary establishment was never really a secret; in the good old days, white men could and did openly dismiss work by women and Negroes and homosexuals and the Irish, just because they were women and homosexuals and Negroes and Irish.

I admit, I "regard hierarchies with suspicion," and I think that "Who's to say who is worth reading and who isn't?" is a fair question.  From Mattix's review I am not sure Arthur Krystal is one to say it.
Unfortunately, Krystal doesn’t help his case—which I think is almost entirely right, by the way—by often failing to demonstrate in detail how canonical writers are actually better than the minor writers who were forgotten ... But in these first two essays, he mostly lists writers and critics or turns to ex cathedra pronouncements—“War and Peace is objectively greater than The War of Words”—which are rarely very satisfying, however correct they may be.

When he does get more specific, things can get a little thorny. Is it true, for example, that great novels “rely more on accuracy of characterization than on the events that their characters react to”? I suppose it depends on what “rely more” and “accuracy” mean. Without further explanation, questions abound: Is Raskolnikov in Crime and Punishment accurate? Does the novel rely more on characterization than events? How about in classical drama? The Iliad and The Odyssey? Pamela and Jane Eyre?

... Krystal notes in passing that some fiction that goes by the name “literary” isn’t, but he doesn’t explain why, give examples, or offer even a brief analysis of the failures of literary fiction.
I feel about the claim that some works just are great rather the way I feel about claims about objective reality: yes, I can go along with that, but how do you tell what's really great, and what's really real?  In the case of art, we're talking about value judgments rather than measurements, and those judgments must function within traditions.  They also change over time.  The canon, which both Mattix and Krystal admit has "sociological roots," changes.  Krystal says that those roots don't mean the aesthetic judgments are invalid, but how can you tell?  He invokes "a credible, if not monolithic, consensus among informed readers," which is plausible, except that informed readers disagree with each other and change their minds over time.  Numerous authors who used to be canonical aren't anymore.

Mattix's questions might well be supplemented by others about different artistic traditions, such as Chinese poetry.  Much if not most poetry doesn't translate well, and I still regret not having stuck with Russian long enough to read Pushkin in the original.  Mattix quotes Krystal's lament that today's poetry lacks "music."  Music is generally the first thing you lose in translation.  Which is true of prose too, if less so.

An interviewer asked Gore Vidal about his claim that, "when it comes to matters of prose and of fiction at this time and in this place, I am authority."  Whence, she inquired, comes this authority?
It is earned, mostly, but it is also a matter of temperament. The critic must know more than either writer or academic. He must also value experience and have a truth-telling nature. I think I have that. In their youth most people worry whether or not other people will like them. Not me. I had the choice of going under or surviving, and I survived by understanding (after the iron- if not the silver- had entered my soul) that it is I who am keeping score. What matters is what I think, not what others think of me; and I am willing to say what I think. That is the critical temperament. Edmund Wilson had it, but almost no one else now does, except for a few elderly Englishmen. 
I still agree with Vidal, but I'd add that he wasn't the only authority.  Others would disagree with him about the value of different writers, and I disagreed with him about some of those he recommended -- Italo Calvino, for example.  But still, the job of a critic is to persuade, by explaining why and how he or she thinks this particular work functions and succeeds (or fails).

One thing that seems increasingly important to me is that a critic, who reads widely and deeply, may forget that most people don't.  As Edmund White has said, "A canon is for people who don't like to read, people who want to know the bare minimum of titles they must consume in order to be considered polished, well rounded, civilized. Any real reader seeks the names of more and more books, not fewer and fewer."  And, I'd add, a real reader in this sense isn't all that concerned with greatness, though he or she won't be undiscriminating or non-judgmental either.  But most people aren't, and never will be, "real readers" in this sense.  It's up to teachers to try to guide as many of their students as they can toward a sense of what reading, or Literature with a capital L, has to offer.  (I wonder, though, how many teachers really know that themselves?)  If greatness is a thing, then it should be detectable by any reader, not declared ex cathedra.

It sounds as if Arthur Krystal lacks the capacity to do the critic's job, at least in Vidal's league.  Maybe I'll read some of his other work.  This Thing We Call Literature is only a little over a hundred pages long.

Friday, May 13, 2016

I Want a Boy Just Like the Boy That Married Dear Old Mom

The gay men's discussion group I sometimes attend will be taking up the question "Are Gay Men Different from Straight Men?  Why?"  The trouble with any question like this is that it assumes that all gay men are alike, and all straight men are alike.  But in most groups that aren't defined very narrowly by sameness, the differences within the group are greater than the average differences between it and whatever group is being compared to it.  When a group is stigmatized as gay men have been, it is all the harder to build a representative sample; if you limit your sample to openly gay men, then you're not talking about all gay men, and you can't make any assumptions about the closeted.  Some people would distinguish between men who "identify as" gay and men who are homosexually active (the famous Men Who Have Sex with Men) but even the gay-identified vary widely.

In any case, it depends on what traits you're comparing.  Most gay men have penises, for example, probably the same proportion as straight men.  Most of us want to copulate with other males, and we tend to downplay the extent to which many of us want to copulate with females as well.  (In that respect, of course, we're the mirror image of straight men.)  Many of us construct multiple conceptions of what gay men are: sanitized for dealing with heterosexuals, eroticized when we're in gay environments; gender conformist for heterosexual audiences, gender nonconformist for each other.  I expect that stereotypes will rule at the discussion group, as they so often do.

I'm currently working through another book I happened on at the library: Salaam, Love: American Muslim Men on Love, Sex, and Intimacy, edited by Ayesha Mattu and Nura Maznavi, published by Beacon Press in 2014.  Some of the contributors are gay, and one of these writes of his first gay friend, who approached him at a gym (page 23):
I certainly wasn't like the other guys there.  I was brown, hairy, and chubby.  I was twice the size of any of the other boys.  Was it my Arabness, my otherness, that intrigued Corey?  Was he fetishizing me, as so many did after 9/11?
September 11, 2001 was several years in the future, so unless he is a prophet I doubt it was on Ramy Eletreby's mind when he met Corey.  But I laughed aloud when I read this passage, because on the previous page he'd written:
Staring at boys at the gym was my treat.  I convinced myself that Allah had provided me with eye candy so I could satisfy my urges without crossing any lines.  The furtive glances I stole at those surfer boys -- white, hairless, perfect -- changing in the locker room was Allah granting me pleasure.  In secret, of course.  I never spoke to anyone.  I felt like a creeper.  And I was.
This is typical of people, including non-white gay men, who complain that they are or might be fetishized by others: they fetishize others quite freely, but you'd better not fetishize them, and they assume that any interest shown in them must be fetishizing, ill-intentioned, oppressive, dehumanizing and degrading.  (A similar common pattern is that you should overlook my lack of conventional hotness and be attracted by my personality, but I have the right to desire you based solely on your surfer/model good looks -- I'm not interested in your personality in the slightest.)  It might be that Eletreby has gotten over himself since then, but nothing in his essay indicates that he has.  I can't even tell whether he is aware of the irony in his complaint.  Self-pity rules his contribution.  Is that what gay men are like?

Incidentally, the editors of Salaam, Love are both female.  Previously they compiled Love, Inshallah: The Secret Love Lives of American Muslim Women (Soft Skull Press, 2012), and in the introduction to their newer book they write that Love, Inshallah "resonated with readers around the world.  Including men.  They started to ask 'Where are our stories?'  We dismissed the inquiries with a laugh. 'Please!  Guys don't talk about this stuff.'" (vii).  I was annoyed by this stereotype, which ought to be obviously false -- men have written vast amounts of love poetry, and most of literature is about their feelings and love lives.

But then I wondered why the editors didn't make what seems to me a much more pertinent response: "You want a book?  Fine, do it yourself."  Wanting someone else to bell the cat is of course a common human impulse.  But men have as much access to publication as women do (indeed, several of the contributors to Salaam, Love are already published authors), and one of the primary anti-feminist reflexes is men expecting women to do the emotional and other service work that they ought to do themselves.  (When women comply by telling them how they need to change, another popular reflex is to complain that all women do is focus on the negatives.)  It would have been a salutary project if men had taken it on, for their own instruction and edification, but being good women, Mattu and Maznavi took on the burden.  The results are mostly predictable, with several writers falling back on millennial snark and irony to avoid dealing with their feelings.  The main exception to my mind is Ibrahim Al-Marashi's "The Other Iran-Iraq War," which addresses not only emotions but politics on a level that his fellows miss.  Check out Salaam, Love from the library, and read Al-Marashi's piece; the rest, as far as I'm concerned, can be skipped.

Friday, May 6, 2016

Justice, You've Been Served!

On Monday Democracy Now! rebroadcast a 2006 interview with the late Daniel Berrigan.  It includes his account of an exchange with former Secretary of Defense Robert McNamara in 1965 about the Vietnam war, with the unintentionally (I presume) funny aside that Berrigan had to ask a secretary at the magazine he was publishing to transcribe McNamara's response "in shorthand" -- he couldn't write it down himself?  Isn't a Christian supposed to serve rather than be served?

The part that prompted me to write, though, was this, at the end of the interview:
AMY GOODMAN: You’ve continued to get arrested. Do you think these arrests, what you have engaged in, protest, even when people are not being arrested or jailed, have an effect? I mean, you have gone through a number of wars now. Do you think things are getting better, or do you think they’re getting worse?
FATHER DANIEL BERRIGAN: No. No. This is the worst time of my long life, really. I’ve never seen such a base and cowardly violation of any kind of human bond that I can respect. These people appear on television, and the unwritten, unspoken motto seems to be something about "We despise you. We despise your law. We despise your order. We despise your Bible. We despise your conscience. And if necessary, we will kill you to say so." I’ve never really felt that deep contempt before for any kind of canon or tradition of the human.

AMY GOODMAN: What do you mean, "We despise your Bible"? It is often said it’s done in the name of the Bible.
FATHER DANIEL BERRIGAN: Well, yes, these people are—they’re making a scrapbook out of the Bible in their own favor. And they’re omitting all the passages that have to do with compassion and love of others, especially love of enemies, or the injunction to Peter, "Put up your sword. Those who live by the sword will perish by the sword"—all of that. All of that gets cut out in favor of, well, a god of vindictiveness, the god of the empire, the god who is a projection of our will to dominate.
I respect and honor Berrigan's tireless activism over many decades, but I could never really trust anyone so intellectually and morally dishonest as to say something like this.  Berrigan himself was "making a scrapbook out of the Bible in [his] own favor."  He omitted all the passages that have to do with the killing of those who worship the wrong gods or are living in the wrong territory or simply failed to meet the deity's high standards.  The "god of vindictiveness, the god of the empire, the god who is a projection of our will to dominate" is Yahweh, the god of the Bible, and his son and viceroy Jesus. While Berrigan was correct about others' selective use of the Bible, he himself was constructing a god who was a projection of his own will.  It was perhaps bad luck on Berrigan's part that he grew up in a tradition defined by the Christian Bible, a book full of violence and hatred as well as professions of love and compassion and peace, since in order to oppose war and empire he had to engage in the same cut-and-paste job he condemned (rather self-righteously, I must say) in others.

But at the same time it must be remembered that Berrigan chose to remain in the Christian, and specifically the Roman Catholic tradition, and hamstrung himself by using the fundamentalist assumption of biblical inerrancy: if you interpret the Bible correctly -- that is, if you agree with his interpretation -- it will be true, free from all error, and by remarkable coincidence will agree with Father Dan!  In order to sustain this belief he had to lie about his opponents, by implying that because they conveniently ignored the parts of the Bible that were inexpedient for them, they rejected it entirely.  But the same accusation could be made of him: If cherry-picking the Bible means that you "despise" it, then he despised the Bible no less than the warmakers did.  The difficult thing to do, with the Bible or any other scripture or authority, is to say forthrightly that it is not free from error, that it is wrong and you reject the parts that are wrong, while recognizing that they are there.  If I weren't prone to the same temptation, I'd be amazed that so many people find this non-fundamentalist approach so difficult.  It is difficult, but it can be done.  Once you admit the possibility, it becomes easier.  Instead Berrigan chose to present his own projection as if it were truth.

Even if Christians leave the Tanakh, with its divine commands to exterminate whole populations, out of the picture, they are still stuck with the Jesus of the gospels, whose vindictiveness deserves more attention than it usually gets.  Jesus not only threatened people with eternal torture, he was preaching in the apocalyptic tradition which expects Yahweh to establish his Kingdom on earth (as it is in Heaven) through a cataclysmic war between Good and Evil, reaching a climax as "the wine press was trodden outside the city, and blood came out from the wine press, up to the horses' bridles, for a distance of two hundred miles" (Revelation 14.20; the metaphors get tangled up there, but the meaning in context is clear).  Whether he liked it or not, whether he thought it through or not, this is the Jesus Dan Berrigan followed.

Some Christians I've talked to try to get around this problem by arguing that while God is of course a loving god, he also is a god of justice.  The answer to that is simple enough: punishment is not justice.  As the philosopher Antony Flew declared:
Now, if anything at all can be known to be wrong, it seems to me to be unshakably certain that it would be wrong to make any sentient being suffer eternally for any offense whatever.  Thus a religious commitment which must involve the glorification of such behavior as if it were a manifestation of perfect justice and goodness would be repugnant to ordinary decency and humanity; even if the facts were such that in prudence we had to trample down our generous impulses in a rat-race for salvation [The Presumption of Atheism, 1976, 64].
The fantasy of Hell can't have been invented out of a desire to teach people they've been doing wrong, since part of the fun is that if they do learn anything from their punishment, it's too late, because they're in Hell!  Forever!  Hahahaha!  The apologist C. S. Lewis tried to justify it in The Problem of Pain in 1940.  Imagine, he proposed, 
a man who has risen to wealth or power by a continued course of treachery and cruelty, by exploiting for purely selfish ends the noble motions of his victims, laughing the while at their simplicity ... Suppose, further, that he does all this, not (as we like to imagine) tormented by remorse or even misgiving, but eating like a schoolboy and sleeping like a healthy infant ....

We must be careful at this point.  The least indulgence of the passion for revenge is very deadly sin.  Christian charity counsels us to make every effort for the conversion of such a man ... But that is not the point.  Supposing he will not be converted, what destiny in the eternal world can you regard as proper for him?  Can you really desire that such a man, remaining what he is (and he must be able to do that if he has free will) should be confirmed forever in his present happiness ...?  And if you cannot regard this as tolerable, is it only your wickedness -- only spite -- that prevents you from doing so? ... You are moved, not by a desire for the wretched creature's pain as such, but by a truly ethical demand that, soon or late, the right should be asserted, the flag planted in this horribly rebellious soul, even if no fuller and better conquest is to follow.  In a sense, it is better for the creature to know itself, even if it never becomes good, that it should know itself a failure, a mistake.  Even mercy can hardly wish to such a man his eternal, contented continuance in such ghastly illusion [122].
Of course Lewis made things too easy for himself by supposing that there were only two possible options: Heaven or Hell.  An infinitely wise and omnipotent Creator could do better than that.  But even accepting Lewis's terms, I would send his wicked man to Heaven.  I assume that Heaven is a place without suffering, so that he will be able to make no one else suffer.  If this will curtail his free will, everyone's free will must be curtailed in Heaven if there's to be no suffering there.  In Heaven the man's "continued course of treachery and cruelty" would give him no advantage, as it did on earth.  (Which raises the question why Lewis's god arranged his creation so that bad people can flourish.  In what sense, given that arrangement, is the wicked man's bad behavior "a failure, a mistake"?)  Since Lewis represents the wicked man's complacency as based on his worldly success and happiness, which would mean nothing in Heaven, is it accurate to say that he could 'remain what he is'?  But even if he could, so what?  The important thing would be that he couldn't hurt anyone else.  Lewis assumed that the only way that "soon or late, the right should be asserted" is by stomping on the rebellious soul for all eternity; it never seems to have occurred to him that rebellion might be quelled by mercy as well as by punishment.  One thing we know is that punishment is not an effective way of changing people's behavior; if Yahweh has such a thing for punishment, why did he create us so that it would be ineffective?

I like to quote Michael Neumann's remark that "Where ‘respect’ means not beating people or putting them in jail or driving them from their homes, it is a fine idea. But you shouldn’t do those things even to people you hold in contempt."  I think this should be extended: you shouldn't do those things even to bad people.  I think that many people would disagree, even Christians who, according to orthodox teaching, believe that we are all bad people.  But then it's also orthodox teaching that God is entitled to beat or jail or torture all of humanity, and only tempers "justice" with mercy by letting some of us off; again, I deny that punishment is just in the first place.  This is a matter of judgment, of course, not of fact or even of logical demonstration.  But the same is true of the mindset that demands infinite retaliation for "rebellion," and that indeed sees harm done to others as primarily an offense against a deity rather than against the people actually harmed.  And I ask what good is achieved (other than the "passion for revenge" Lewis rejected while still clinging to it) by punishment at all.

Going back to Daniel Berrigan: I don't object to his rejection of the wrathful aspects of Christianity.  What I do object to is his pretense that those aspects aren't a core part of the religion, and of Jesus' teaching in particular.  It's not only dishonest to denounce those who believe in a "god who is a projection of our will to dominate" (though that is reason enough to reject his projection), but it will be ineffective as long as Christians refer to the New Testament and the teachings of Jesus.  Jesus believed in, and taught, a god of vindictiveness and empire and a projection of our will to dominate.  The only way to correct that teaching (assuming it is incorrect) is to confront it head-on, and reject it directly rather than by projecting it onto the bad Other, as Berrigan did, and so many other Christians do.

Tuesday, May 3, 2016

I Get Up Each Morning and Dust Off My Wits

Now that I'm officially an old person, I often wonder if I'm becoming more paranoid, or just more cautious.  I suspect it's a little of both.

Luckily, thanks to Facebook I have a basis for comparison in other people my age.  From what I see, they are becoming both more paranoid and less cautious.  Many of them are certain that Obama is coming personally to take away their guns and rip down the flags flying from their porches, but they will give their credit card numbers to any scammer who calls them on the telephone.  They know that the Media will print or broadcast anything if they think they can make money from it, but they implicitly trust Fox News and any meme that feeds their prejudices and fears.

I can't be too smug about this, because although I am comparatively careful about what I see on the Internet, I've fallen for a number of bogus stories passed along by people I knew.  So I understand the principle involved.  On the other hand, when I learn that I've been fooled, I don't attack the person who corrects me for picking on my fantasies.  So far so good, then.  Or at least not too bad.

Sunday, May 1, 2016

Take a Picture of the World, It'll Last Longer

I picked up Social Construction: Entering the Dialogue (Taos Institute, 2004) by Kenneth J. Gergen and Mary Gergen, which appeared to be a brief, handy introduction to the concept. The Gergens are a husband and wife team, both psychologists and academics, both dedicated to social constructionist thought, so they should be qualified to explicate it, but they didn't manage very well. The book turned out to be something of a trainwreck, but maybe that's not the Gergens' fault; maybe it's the fault of the concept itself.  But the confusion they foster could be useful for someone who wants to understand what "social construction" means and how it's used (and misused) by some of its proponents.

The Gergens take a false step right away:
The foundational idea of social construction seems simple enough, but it is also profound.  Everything we consider real is socially constructed.  Or, more dramatically, Nothing is real unless people agree that it is [10; bold type in original].
This is the kind of rhetoric that confuses people, and enables critics of social construction to misrepresent the concept.  I think the Gergens realize this, so on the next page they try to clarify:
We must be clear on this point.  Social constructionists do not say, "There is nothing," or "There is no reality."  The important point is that whenever people define what "reality" is, they are always speaking from a cultural tradition [11].
The thing is, the Gergens themselves came very close to declaring that "there is no reality," and they did so more from an apparent wish to sound dramatic than from a need to explain their concept clearly.  Since the book is intended to introduce social constructionism to people who don't know much if anything about it, and who can't be assumed to have much experience with philosophical discourse, saying "Nothing is real unless people agree that it is" might just be the worst formulation the authors could have chosen. It might have been better to put "real" in quotes, but not by much.  Most readers, from undergraduates to psychologists, will remember "Nothing is real" as the core (or "foundational idea," another revealing choice of words) of social construction theory, which it isn't.

Ironically, the Gergens undercut themselves, because they are "defin[ing] what 'reality' is" here -- their formulation is itself a social construction from a specific cultural tradition (namely the American Culture of Therapy), but they state it as the nature of the real world.  (They're aware of this problem, but that awareness doesn't inform their writing.)  To clarify my own criticism, consider the well-known case of "homosexuality" as a social construction.  Many proponents of that position say that there were no homosexuals before, say, 1868.  What they mean is that the word "homosexual" wasn't invented until then, and different concepts of men who have sex with men and women who have sex with women were used in a specific cultural context -- in this case, western Europe in and before the nineteenth century, and especially by lawyers and doctors in that era and place.  As a historical claim, this is dubious anyhow, but again, it might lessen confusion if such writers were careful to put "homosexual" in quotes -- "There were no 'homosexuals' before 1868" -- but they seldom do.  True, scholars may choose which definition of a term they wish to use, but in general they don't use their chosen definition consistently.  Often they even retroject "homosexual subcultures" into the period before 1868, and to other regions.  Most seriously, like the Gergens, they clearly consider these claims and statements to be the reality of homosexuality, and they forget that their discourse is a social construction in a particular cultural tradition, namely Western academic philosophy and critical theory.  I'm not being pedantic here, since one of the assumptions of social-constructionist discourse is that terminology is important, that it shapes how we view the world, and words create worlds -- so it's vital to examine the words we and others use, and to use them accurately and consistently.
Could all that we construct as "problems" be reconstructed as "opportunities"?  By the same token, as we speak together, we could also bring new worlds into being. We could construct a world in which there are three genders or a world where the "mentally ill" are "heroes," or where "the power in all organizations lies not with individual leaders but in relationships" [12].
This is another good example of the careless use of concepts.  It's not surprising that many people understand social construction to mean that people freely and consciously choose how they see the world, when social constructionists talk as if it were so. Consider the popular accusation that social constructionists claim that gay people "choose" to be gay.  It mistakes social construction for an account of the nature and origin of sexual orientation, an antagonist to "born this way," when (according to my understanding, my construction if you will) the whole project of social construction involves explaining how people come to think of customs as "natural."  I have the impression that many people who talk about social construction think that they are exposing illusion and getting at the reality of whatever topic they're discussing, be it sexual orientation, sex/gender, race, language, or whatever.  (If it's not what they really think, it's what they say.)  It's precisely what we assume to be "real" and "natural" that we must question, and it's not as easy as "speak[ing] together" to "bring new worlds into being."  ("Worlds," by the way, is another much-misused word, a sign that someone is socially constructing carelessly.)
Take the simple process of naming.  There is Frank, Sally, Ben and Shawn.  Now these individuals were scarcely born into the world with nametags.  Their parents assigned their names. In this sense, they are arbitrary.  Except perhaps for family traditions, for example, Frank could have been named Ben, Robert, Donald, or something else.  But why were they given names in the first place?  The most important reason is practicality.  For example, parents want to talk about Sally's welfare; is she eating properly, does her diaper need changing, is her brother Frank jealous of her? ... More broadly, the words we use -- just like the names we give to each other -- are used to carry out relationships.  They are not pictures of the world, but practical actions in the world [14-15].
In that last sentence the Gergens seem to suppose that "pictures of the world" are real, or at least "real."  At the very least, they're constructing a false dichotomy. Language is used both to make "pictures of the world" and to make actions in the world.  (I suspect that a sloppy understanding of the performative use of language is lurking in there somewhere.)  Neither of those uses, which don't exhaust the options, expresses the "real nature" of language, which no one knows anyway.  This section is headed "Language: From the Picture to the Practice," but it would be as accurate and useful to reverse the order, since "pictures" are often constructed to explain and justify practices.

On naming, the Gergens oversimplify drastically, evidently because they're thinking in a middle-class American cultural tradition.  Names are seldom "arbitrary": like language in general, they are often believed to invoke the nature of the world.  Even when someone, say, names her baby "Britney" because she's a big Britney Spears fan, it's not an arbitrary choice.  It seems to me that "arbitrary" is as serious an error as "real" or "natural": the Gergens suppose that naming just happens, without any social or cultural meaning.  Naming is another social construction.  One custom is "remaking" ancestors, which Richard Trexler discusses in Naked Before the Father in the context of medieval Italy, and which persists in Orthodox Jewish tradition down to the present.  A child's name isn't picked out of the air, but chosen so that a dead relative can live again through his or her name -- and in this construction, a child is never named after a living relative.  In other constructions, a child may be named for a living relative, and it would be interesting to know why it's acceptable in some cultures but not in others.  Names generally have meanings (mine means "brown warrior," for example), even when those meanings have been forgotten.  Often, though, they haven't been.  In China, ancient Israel, and in at least some Native American cultures, the meanings of names are known by the parents who give them and by those who use them.  In traditional Korea, males' names consist of three Chinese characters, and they are not chosen arbitrarily but with some thought and care, sometimes by divination or other systems.  Girl children were often not given names, however, which is a reminder that it's not necessary to use a name in practical, day-to-day parent-child relations anyway: in Western and non-Western cultures alike, parents can and do address children not by name but by status ("son," "daughter"), just as their children address their parents by status ("mom" and "dad").
My Mexican friends address each other this way all the time: Uncle, Aunt, Cousin, Nephew/Niece, etc.  In Korea, people almost lose their names when they become parents, and are not only referred to but addressed by others as "Young-sook's Mother" and "Jung-min's Father"; there are titles for status in family constellations that are used more often than an individual's name ("Elder brother," "Youngest child," etc.).   Invoking parents' concern for children's welfare as a reason for naming them is a middle-American social construction, yet the Gergens appear to think their explanation describes reality.

Examining the customs and understandings of other cultures often informs social construction theorizing, but the Gergens seem to have only the barest -- and not very well-informed or thoughtful -- awareness of other cultures.
What is more obvious than the fact that our social world is made up of separate individuals, each normally endowed with the capacity for conscious decision making?  It is because of this that we favor a democracy in which each adult citizen has the right to cast a vote, courts of law in which we hold individual actors responsible for their deeds, schools in which we evaluate each student's individual work, and organizations in which we subject individual workers to performance evaluations.  It is largely for these reasons that we characterize Western culture as individualist.

Yet, for a constructionist, the obvious fact of "the individual as a conscious decision maker" is not so obvious.  Rather, we see it as only one way of constructing the world.  In fact, the individualistic orientation to social life is not so old historically (possibly three centuries), and it is not shared by the majority of people on earth.  This does not make it wrong, but it does allow us to step out of the box and ask about its pros and cons (30).
This is odd, for voting, courts of law, schools, and organizations are much older and more widespread than the "individualistic orientation."  "We" certainly don't have these institutions because we see persons as atomized individuals; they're compatible with a non-individualist worldview, and they are found in non-individualistic cultures.

The individualist / collectivist dichotomy is itself a social construction, a narrative about cultural difference, though the Gergens write as though it were a real difference between societies.  It's also inadequate as a description of a society except in very broad terms.  The difference between an "individualist" society like the US and a "collectivist" society like Japan is one of degree, not of kind, since every category is made up of individuals, and every individual human being belongs to various collectivities.  But speaking of relative differences as if they were absolute differences is a very popular social construction.

Even within the same society, individualist and collectivist aspects are stressed more or less according to the situation or problem involved.  (One reason I love Koreeda Hirokazu's very Japanese film After Life is that it tells stories about individuals with love and respect, with no suggestion that doing so is un-Japanese.)  It may be that the US has moved farther toward the individualist pole than most other countries, but we have certainly not eliminated categories or collectivities, as the use of terms like "America" and "Americans" alone is enough to show.

I'm skeptical of most of the Gergens' sweeping generalizations, or constructions, as they should be called.  For example: "most human conflict can be traced to the processes of meaning-making" (66).  Conflict also occurs between non-human animals, who have no language but still manage to have conflicts.  The same goes for the experience of loss.  "For you to 'lose' something (a job, a close friend, the love of others) means that you carry around a story of yourself as a major character, embarked on a course of progress or fulfillment (end-points of a good story), and have suffered a setback" (49).  Non-human animals experience loss without having language, and therefore without having narratives.  And, of course, among humans, narratives of loss are not limited to individualistic cultures.  These experiences go along with having/being bodies.

The Gergens, consistent with their status as helping professionals, proceed to describe what they present as non-individualist, "relationship-oriented" ways of dealing with human problems.  It never seems to have occurred to them that it would be instructive to examine how supposedly non-individualist societies deal with such problems.  After all, we have not only vast amounts of sociological and anthropological data about such societies, but plenty of literature and traditions from the history of Western culture, in the days before we became individualists.  Why re-invent the wheel?

Well, for one thing, it's hard to justify your status and salary as a science-based academic helping professional, working with "solid data" (90), if you're just going to appeal to tradition.  But as the Gergens are aware, hardly anyone wants to go back to the old days, even those who claim that they do.  Getting rid of tradition hasn't eliminated the abuses that were common in non-individualistic societies, but there were good reasons why people tried.  Relationship-oriented ways of dealing with conflict don't automatically lead to good outcomes, and it's arguable that even as the Gergens proclaim their rejection of the individualistic tradition, they are still working within it.  "Dialogue," for example, as the Gergens practice it, is an activity performed between individuals.  Social constructions are not something you can change, let alone escape, just by saying so, or even by setting your mind to it.

The Gergens, like many other social constructionists, skirt close to a blank-slate social construction of human beings and culture.  I think it was the philosopher Edward Stein who argued that the verb underlying "construction" ought to be "construe," not "construct."  I like that idea, but I think "construct" is a good metaphor, as long as it's remembered that construction works with materials, and materials (animate or inanimate) are not infinitely malleable.  Our bodies cannot always follow where our ideas go.  Perhaps more often they can't.

Social Construction: Entering the Dialogue is not a good introduction to the concept for beginners; it will probably just confuse and misinform them.  I must look for more basic texts on social construction and see if there are better ones out there.  It seems to me that while the idea of social construction is a valuable and useful one, as the Gergens and many other people use it, it is not an idea or a concept but a brand -- something like I Can't Believe It's Not Essentialism!