Showing posts with label elitism. Show all posts

Monday, August 23, 2021

Standards of Beauty

At around the same time I heard that NPR segment about chic headscarves, someone posted this on Facebook:

There's so much wrong here that it's difficult to know where to begin.  I can't argue with "don't be a white supremacist," but the rest is garbage.

Are non-Eurocentric standards of beauty any better than Eurocentric ones?  I don't know any reason to think so.  What would they be, anyway?  The footbinding of Chinese women, ended by the Eurocentric Chinese Communists, was one such, but I hope this writer doesn't want to bring it back.  As for "cisheteropatriarchal beauty standards," the transgender beauties I hear about adhere to them absolutely, and they are celebrated by transgender allies. 

Not that it matters much, because standards of beauty are inherently harmful.  Their only function is to set a bar that most people in any culture, of any gender, will not be able to reach.  As a result people will put a lot of energy into trying to reach them anyway, and when they fail they'll feel bad about themselves.  At best such standards aren't totally unrealistic in that no one could possibly meet them, but most people can't, and there's no reason why they should.

One of the stumbling blocks is the confusion of "beauty" with "sexual desirability," though there is no valid standard of sexual desirability either.  I think I was in junior high school when one of the photographic newsweeklies did a story on the politician Barry Goldwater, who was also a skilled amateur photographer.  The article included a full-page photograph of an elderly Native American woman, with a face as wrinkled as W. H. Auden's or Mick Jagger's.  The caption quoted Goldwater's opinion that she was totally beautiful.  

I don't think he meant that he wanted to copulate with her, though who knows?  But the remark made an impression on me: beauty doesn't equal sexual desirability.  People use "beautiful" for everything from sunsets to flowers to babies to old ladies, so that insight shouldn't be surprising, but it seems to surprise many.

I don't remember when I began -- it might have been a result of Goldwater's comment on his photograph -- and I don't believe I made a conscious choice, but when I'm looking at people I try to see what beauty they have on their own terms, rather than measuring them against a standard that is designed to exclude them in advance.  I fail more often than I succeed, but that's the goal.  Erotic desire is only part of it, though it's certainly part of it.

Contrariwise, sexual desirability doesn't equal beauty, though given the elasticity of "beauty," you could argue otherwise.  I do: the men I'm most attracted to aren't conventionally good-looking, but they inspire in me the deep thrill that means "beauty" to me.  Most people who don't conform to white-supremacist, cisheteropatriarchal standards of beauty still find sexual partners who want them, and who themselves may not conform to those standards.  Given that reality, why bother with standards at all?

The gay photographer Tom Bianchi thinks otherwise, and has belabored the point for decades, notably in a small book called In Defense of Beauty (Crown, 1995).  Bianchi has often been criticized for the narrow range of men he photographs.  Among the authorities he cites in his defense are Oscar Wilde, Edmund White, Stephen R. Covey (think The Seven Habits of Highly Effective People) and Deepak Chopra.  Bianchi sets out his position early on:

I am no longer surprised when I hear the charge that the people in my pictures are "too beautiful" or "only the most perfect bodies," for I have come to see the mistake in perception from which these comments come. The implication is that I am elitist, or as one friend suggested, the new word is lookist.  But people who find fault with beauty, who trivialize it by assuming a negative quality in it, diminish themselves.  The ability to appreciate beauty in others is a prerequisite to express it in oneself [8].

I might concede that Bianchi's critics are wrong about his work, but Bianchi has his own "mistake in perception."  He assumes that the kind of men who populate his photographs -- gym queens, in a nutshell -- are beautiful, with "the most perfect bodies," members of an elite.  He defines "beauty" to mean such men, and only such men.

Now, I disagree that his models are beautiful, let alone "too beautiful."  I don't think that these overmuscled bodies are beautiful or perfect, and their faces (which to me are at least as important as the body below the neck) are quite unattractive, either grimly serious or with tight, anxious grins. This is of course a matter of taste, but that's the point: there is no universal standard of male (or female) beauty.  Bianchi relies, I believe, on the ancient Hellenic model, which is fine, but other cultures had very different ideas about the beauty of men.  In East Asia, for example, sculpted muscles were of no interest, though the advent of European imperialism changed that to a great extent.

Bianchi would probably charge me with "find[ing] fault with beauty," with "trivializ[ing] it by assuming a negative quality in it."  I would deny it, because physical beauty is very important to me, though it's not the only human quality that matters.  I just find beauty in people whose beauty Bianchi would deny, because he lacks the ability to appreciate them.  We could agree to disagree, but Bianchi's stance leaves him no room for greater inclusivity.  Beauty is what he says it is, and nothing else; it doesn't seem to occur to him that it could be otherwise.  He's entitled to his taste, of course, but it seems impoverished to me.

I'm reminded here of the far-right Christian pundit Rod Dreher, who has complained that modern Americans, especially the young, "have more generally lost our receptive capabilities to things numinous."  It would be more accurate to say that Dreher is unreceptive to things numinous from any tradition other than the one he has chosen. Likewise, I'm not hostile to beauty, only to a narrow conception of beauty.  But I have to admit that I'm not receptive to the beauty of Bianchi's models either; the difference is that I'll recognize that he and other people find them so.  The eye of the beholder, anyone?

Wednesday, January 3, 2018

If You Don't Know, I'm Certainly Not Going to Tell You!

Two passages from Morality and Expediency: The Folklore of Academic Politics (Blackwell, 1977) by the anthropologist F. G. Bailey.  I've read, enjoyed, and learned from Bailey's work before, but I picked up this particular book because I've found myself reading some famous satirical works on academia lately, such as Kingsley Amis's Lucky Jim, Robert Grudin's Book, and Frederick C. Crews's The Pooh Perplex.  Morality and Expediency is based on the English academic Bailey's fieldwork in some American universities, and I felt sure he'd have some useful things to say.  He did, though for me their usefulness goes beyond the Ivory Tower.

First:
The contempt in which a 'popularizer' is held, particularly when he is himself a member of the academic community, comes about for several reasons.  Firstly, he is using the discoveries of other people to make money or reputation for himself: the fact that special talents are needed to market the stuff (so that in fact he does add something), is usually ignored.  Secondly, in the process of popularizing he is likely to dilute and distort: the fact that dilution may be a necessary price for the dissemination of knowledge is ignored. Thirdly, at the back of all this, lies a dominating myth among academics about their own superiority.  Knowledge, for whatever reason accessible only to the few, is by that very fact superior to knowledge accessible to anyone.  It is all strangely economic: knowledge is valuable in proportion to its scarcity.
In fact this argument is never taken to its logical conclusion, at least by scholars: for the conclusion must be that any sharing of knowledge dilutes it.  But that will not do, since, by definition, knowledge (as distinct from mystical experience or revelation) exists only to the extent that it is disseminated, that is, shared with other people [21].
I don't think I agree with the line Bailey draws here between "knowledge" and "mystical experience or revelation," because revelation, at least, tends to be shared and disseminated, though with the same ambivalence about the process.  On the one hand, the revelation will be polluted by the unclean ears of the many, so must be reserved for the few who have shown themselves worthy; hence the commands in books of revelation to seal up the material until the time is fulfilled, or Jesus' secrecy about his status, to the point of teaching in parables in order to keep "those outside" from understanding and being saved (Mark 4:10-12). On the other hand, the revelation often includes a command to spread the word like seed cast by a sower (also Mark 4), and in the New Testament book of Revelation, the order not to seal the book, for the time is near (Revelation 22:10).

That popularizers are also distrusted, even despised, by scientists no less than other academics, is among other things a sign of the common origins of science, religion, and magic.  On one hand, the rabble are despised for not being willing (or able, depending on the presuppositions of the elitist) to learn the Truth; on the other hand, to attempt to teach them, to let knowledge out of its pen to wander freely in the world, is inevitably to dilute and distort the Truth that only the elect can know.

But this ambivalence also turns up in the arts.  I've mentioned before the composer who despised laypeople for loving the wrong music for the wrong reasons, and speculated that those who make art will inevitably understand it differently than those who consume it.  I first began thinking seriously about this problem, though, when Nirvana's Nevermind became a platinum-selling hit in the early 1990s and I saw people fuming about it online.  They'd been fans before the band signed with a major label, when they could think of Nirvana as esoteric knowledge reserved for the wise few, and they were furious that the masses were going to pollute Art once again with their unclean ears.  I realized that those who see themselves as elites may lament the fact that Artists are despised and rejected (again the language comes from biblical precedents) by the ignorant rabble, but if the rabble suddenly embrace an Artist's work it isn't because their taste has miraculously improved but because the Artist has sold out, gone over to the Dark Side, prostituted himself.  (Myself, I never could hear much difference between Nevermind and Nirvana's earlier work, but then I too am a man of unclean ears and lips, though I have heard the word of Kurt.)

And yet I don't think that the people who were so upset by Nirvana's sudden popularity thought of themselves as elitists; they probably saw themselves as marginalized outsiders, anarchists, the common people trampled on by big business, and they'd thought Cobain and the guys were just guys like them.  The same would be true of the early Christians.  Jesus, after all, had taught that only a few would pass through the narrow gate that leads to salvation, so it couldn't have been only the rich (a small minority in any society) who were going to be damned.

There's a similar confusion among the right-wing Republican base of the Tea Party and of Donald Trump's presidency: on the one hand they are a pitiful minority persecuted by godless brown and black people, transgenders, and extreme liberal media; on the other, they are America, We the People, hear them roar, and the government should govern as they demand.  I also detect echoes of the ambivalence police (also a Trump constituency) have toward the public: on the one hand, a sentimental stance of service and protection; on the other, a paranoid sense that the public misunderstands them, won't support them, blames them first for everything that goes wrong.

Bailey goes on to discuss the conflicting attitudes academics have toward the outside world that supports them.  The University produces and stores Knowledge and Wisdom; it is utterly distinct from the World and must maintain a wall of separation between itself and the World (the religious precedent again).  On the one hand, then, the public should feel honored to support it; on the other, the public are stupid and can never understand the Truth, so the wise elites of the University are entitled to extract "resources from the outside world without giving anything in return" (40).  These attitudes are also echoed in the arts and sciences.  They are somewhat caricatured, but like any caricature they are recognizable.

Next:
The arena [as opposed to the 'elite'] committee tends towards the public model.  The members of the committee are representative of bodies outside, to which they are accountable and to which they must report back, and the awareness of this potential audience will push members towards posturing and the language of principle and policy, and away from a gossip-like exchange about persons.  Furthermore, since altercation has to be contained if anything is to be done, there may be a tendency to develop rules of etiquette, and with that would appear the suspicion that the committee's work is becoming ritual and ceremonial, leaving the real decisions to be taken elsewhere.  In practice, this descent and fall is usually arrested, because the contestants begin to see the necessity for collusion and for concealing from their followers some of the deals they make with the opposition.  The Planning Sub-committee is an example: the crude antagonisms of its earlier days have been softened a little by increased formality but more by a growing camaraderie and spirit of give-and-take among the members [72].
This made me think of government, especially above the local level.  It's a description of an arena committee like the US Congress, for example, no less than of a university faculty Senate.  The public face of the legislature allows for a lot of grandstanding and posturing, and much work must (therefore?) be done out of the public view.  Members of Congress are representative not only of the voters but of non-voters, and of their donors.  I've often noticed that many citizens talk, at least, as though they believe that their legislators should know what they want or need without being told, and should produce laws cut to order when just one citizen (themselves, of course) confronts them and tells them what he or she wants.  This is impossible even in smaller bodies, like a university senate, because as Bailey says, there is no objectively correct way to divide up limited resources: everybody thinks their wishes and interests are most important and should get attention.  So,
in those small committees which are designed to take or recommend action, just because they are nearer to reality than the larger assemblies, the unprincipled business of compromise behind the scenes -- one of the main indicators of the community [as opposed to the organizational] style -- takes place.  This in turn reinforces the need for secrecy, because there are no public principles -- other than 'reasonableness', which means refusal to stand on principle -- by which the decisions can be defended [66].
I also found useful the distinction Bailey draws between "organizations," based on principles and accountability, and "communities," based on interpersonal relationships, where to "ask for accountability is at best a misunderstanding and at worst a wicked perversion of the true nature of the institution" (12).  Of course every institution is at the same time a community and an organization, and while it must be decided which mode is proper for dealing with a problem, there is no objective (or public) way to decide it.  Reading a liberal Democrat's account of her interaction with Elizabeth Warren through these filters is revealing: on the one hand, the writer sees herself as a member of a community shared with Warren, whom she evaluates as a person, but also as a member of an organization unfortunately dragged down by the proles, because "we [the wise elites, the Party insiders] are always talking policy but the voters are always choosing on personality."  As I've argued, it isn't true that the voters don't care about policy, and certainly this writer chose on personality; some mixture of the two will probably be present in every individual.

Reading Morality and Expediency, then, reminds me how much I need to learn about real-world politics.  It also fits with what I've been learning about the impossibility of distinguishing science from religion, or the arts from politics, or any number of human institutions from each other: what I, and others, tend to see as specific traits of each turn up in all the others.

Monday, May 5, 2014

This Smart, But No Smarter

When I wrote Friday's post, I hadn't yet started reading Elvin T. Lim's The Anti-intellectual Presidency (Oxford, 2008), though I had a copy in my reading pile and the subject was clearly on my mind.  It turned out to be the right book to read.  Lim's thesis is that presidential public language has been progressively simplified and evacuated of most content in an effort to make it seem that the President is just a regular Joe talking to John Q. Public, man-to-man, in a commonsense way, and not a long-haired, pointy-headed intellectual out of touch with the real America.  His argument is based on analysis of presidential communication since George Washington, plus interviews with those former presidential speechwriters still living and a study of the writings of earlier ones.  He found that Flesch readability scores of presidential speeches have dropped (becoming more easily "readable") ever since Washington, and that presidential rhetoric has become more oriented toward sloganeering and emotive cheerleading than substantial content.  The evidence of the speechwriters indicates that this trend was a conscious and deliberate project throughout the twentieth century, as speechwriting became assigned to specialists rather than the President himself and his advisors, and as the speechwriters lost access to the President, so that their speeches had little connection to actual policy.

Unfortunately, The Anti-intellectual Presidency was published in 2008, just as that year's presidential campaign was heating up, so Lim's analysis ends with George W. Bush; but some remarks in the concluding chapter indicate that Lim saw both McCain and Obama continuing the tradition he describes.  When I have a chance I'll look at Lim's blog and see what he's had to say during Obama's presidency.  (Hm... this post indicates that Lim fell under Obama's spell: "this is not a president willing to mince his words any more."  No, Elvin, when any president uses that trope, it's another example of the plain-speaking man-of-the-people smokescreen you've analyzed so well -- and Obama's no exception.)

Lim touches on issues that have interested me for some time.  For example, he is not saying that the presidents themselves have become less intelligent: rather their rhetoric has become more anti-intellectual, which is another matter.  He distinguishes between "intelligence" and "intellectuality," arguing that the former is generally respected or at least paid lip service, while the latter is everybody's favorite whipping boy.  (He also points out that "rhetoric" originally referred to discourse in general, and to the study of how to use language to communicate and persuade.  Usually nowadays the word is used pejoratively, to refer to empty verbiage and propaganda, but as Lim indicates, that tactic probably goes back to Plato at least.  I am eloquent and thoughtful, a humble artisan of language; you deploy sophistical rhetoric, hiding your elitist emptiness behind flowery, fancy-pants words.)  There was a spike in the trend during the Nixon administration, after marketing and media consultants were brought in to help Nixon overcome his poor media skills, and except perhaps for Carter, presidents since then have followed Nixon's example.  Still, this is a difference of degree rather than one of kind, since orators have always been accused of manipulating their audiences with rhetoric.

Lim describes how presidents would demand that their speechwriters simplify the rhetoric in their productions, so that "speechwriters in turn have observed a Janus-like quality in their bosses, who are articulate, formal, and sophisticated in private, but decidedly casual and simplistic in public" (42).  This applies even to Eisenhower, who before Reagan was probably the paradigm of "casual and simplistic," i.e. dumb, presidential self-presentation.  It takes a lot of work, through endless drafts and revision, to remove all the intellectual content from a speech.

What I want to talk about today is something else, though, namely how intelligent Americans want their presidents to be.  As Lim says, citing Richard Hofstadter's classic book, anti-intellectualism is a hallowed American tradition.  It's pretty clear to me that the right wants to be led by people who aren't too bright.  In 2008 I heard a right-wing co-worker, annoyed by students' derision of Sarah Palin, splutter, "At least she's normal!"  Palin was probably the closest to a "normal" American to run for such high office in my lifetime, but as a state governor she was still hardly as regular a gal as she pretended.  When she was tapped as John McCain's running mate, moreover, she went on a Party-funded spending spree on clothing to spiff up her image; she knew, no less than the Beltway city slickers, that looking too normal would hurt her with the voters she hoped to win over.  It's worth recalling that her attackers included not only snobby liberals but the same conservatives who at other times would praise the common people and denounce liberals for supposedly looking down on them.

I have the impression that American rightists are willing, even eager, to believe that their heroes are "normal" even when, like Reagan, they are wealthy former movie stars in the pay of big corporations, or like Bush, they are spoiled brats from rich and corrupt elite families, sporting post-graduate Ivy League degrees.  At the same time, a Tea Party favorite like Paul Ryan peddles himself as a "wonk," a technocrat if not necessarily an intellectual, and right-wingers like my RWA1 waver between being elitists and populists.  (Randy Newman captured this right-wing ambivalence perfectly in his song "Rednecks": College men from LSU / Went in dumb, come out dumb too.)  Bill Clinton worked this side of the street too, and Lim found his speeches to be high on the anti-intellectual scale despite his Rhodes Scholar / Oxford accomplishments.  I was brought up short, though, when Lim contrasted Bill with Hillary in that department:
According to one observer, "Bill Clinton sounds intimate and conversational when he’s discussing energy policy.  Hillary Clinton sounds like a policy wonk when she talks about her mother’s childhood struggles" [76].
Wait a minute -- I distinctly remember Bill being attacked by the Right (and sometimes celebrated by his fans) as a "policy wonk" himself as well as for being trailer trash, of course.  True, he was good at fake compassion, but his policies were something else.

Democrats, by contrast, now largely identify themselves as the smart party -- smart, at least, compared to the Republicans, which doesn't set the bar very high.  They make much of Obama's intelligence, signaled by his Ivy League education and his status as a Constitutional scholar, but they also embrace his vacuous sloganeering rhetoric and object strenuously to any application of critical thinking to his speeches or policies.  Their interest in intelligence, or in the intellect, is largely virtual, a partisan token.  Like evangelical Christians who've pointed to C. S. Lewis as evidence that you can be Christian and smart, Democratic loyalists use Obama's vaunted intelligence vicariously: he's smart so they don't have to be.

Lim does a good job countering the claim that it's "elitist" to object to anti-intellectualism in politics:
My objection to presidential anti-intellectualism is not a knee-jerk moral panic provoked by an elite suspicion of mass involvement in politics, but it emerges from the assessment that the theories of the anti-intellectual presidency are, at multiple levels, impoverished.  Americans need to be politically educated so that they develop the intellectual and moral capacities that are necessary for competent citizenship, among them, a capacity to look beyond individual interests toward collective interests, and an ability to think through and adjudicate the various policy options that their leaders propose.  While we do not expect democratic citizens to be policy experts, there is a threshold level of political knowledge below which their ability to make informed and competent civic judgments is impaired [113].
Paradoxically, anti-intellectualism is largely an elitist tactic.  Those who defend it will claim that ordinary folks can't understand complex foreign-policy or economic issues, and should just be left to live their little lives, taken care of by the benign folk at the top of the pyramid.  In practice, however, those at the top and their toadies don't necessarily know what they're doing, but even more, they hold the ordinary voter in contempt, especially when she doesn't vote for their candidate. It's not that they want to translate complex issues into simple language, it's that they want to tell the public what they believe we want to hear:
The manmade teleology of presidential anti-intellectualism stems from the perceived benefit of going anti-intellectual, which is nearly universally felt, as my interviews have shown.  I say “perceived,” because there is no reason to think that these calculations are objectively true; we know only that presidents and speechwriters appear to believe them true.  As each president and his team of speechwriters seek to simplify his public rhetoric even further, the effect of such efforts is cumulatively felt even if each administration does not feel individually responsible [48].
Lim points to those who claim that anti-intellectualism increases public "participation" in politics, but "participation" here evidently means that the public cheers when the Leader gives a speech.  If that were so, Hitler and Mussolini would have been great democratic leaders.  He quotes the Reagan speechwriter and hagiographer Peggy Noonan on
the groupthink behind contemporary speech craft: “It is simplicity that gives the speech its power. … And we pick the signal up because we have gained a sense in our lives that true things are usually said straight and plain and direct” (original emphasis).  But simplicity does not guarantee the truth, only the semblance of sincerity.  Paradoxically, in heeding Noonan’s advice, presidents have to be untruthful or duplicitous – altering their innate speech patterns – in order to appear truthful [47].
But this fits with the myth of American meritocracy, the belief that those at the top earned their status through innate superiority expressed in grit, hard work and street smarts.  In practice, again, we may doubt the superiority of elites, either in work capacity or in intelligence.  The best and the brightest, every time they've been handed the reins of power, have screwed up repeatedly, from John Dewey's campaign to get the US into World War I through the bipartisan technocrats who engineered the US invasion of Vietnam to Barack Obama playing eleven-dimensional chess with his opponents.  Because of this I'm skeptical of Lim's assertion that "Americans need to be politically educated" by their leaders.  We need to educate ourselves, recognizing that our leaders are people just like us, as given to wishful thinking, irrational hopes and fears, and incomplete information, as the rest of the citizenry.

The hard but crucial question, I believe, is: How intellectual do we need to be?  If there is a valid distinction between being intelligent and being an intellectual, as I believe there is, it's largely a difference of degree rather than kind.  My personal definition of an intellectual is someone who works with ideas, the same way a mechanic works with machines.  But everyone does that to some extent; we just don't all do it equally well, and it's not surprising that some people have more talent for the job and more interest in it.  As Kath Weston said of "theory," everyone -- not just professional theorists --  has theories about human nature and how society works.  "The question then becomes: What kind of theory do 'we' want to do?  And who occupies themselves with which sort of theory?"

I've observed that the same people who despise intellectuals and "theory" often love arcane and convoluted systems, like Gnosticism (which maybe I should call neo-Gnosticism, since its present-day adherents have no real connection to the original movement) or Tibetan Buddhism, with their multiplication of heavens and deities and angels and devils. They may admire imaginary worlds like J. R. R. Tolkien's Middle Earth or the Star Trek universe, with their many peoples and cultures and invented languages, and may aspire to world-creation themselves.  They can be intensely attentive to and critical of weaknesses or contradictions in those systems.  But when it comes to the real world, they're not interested.  My third right-wing acquaintance -- a former schoolteacher with at least a bachelor's degree -- has told me that she simply chooses to believe the news slant she likes; she's not interested in questioning why she likes it, and certainly not in examining it for logic or factual accuracy.  So she sees herself as a bold skeptic, just as most of my liberal friends do, because she disbelieves reports and analyses she doesn't like, but is utterly credulous about those she likes.  In this I believe she differs in degree, not in kind, from many more "intellectual" people.  I don't ask how she (or they) should be "educated" by others, by their leaders or by me; that won't work.  Besides, she is if anything too willing to be educated by her leaders.  I think that's the real problem with public political discourse.  What to do about it, I have no idea.

Democracy is sometimes defended (or criticized) as the belief that the People, given a chance, will make better decisions than elites will.  I disagree, since I don't think the People are any smarter, or dumber, than the elites.  I can't remember who (Paul Goodman, maybe?) said that the real reason for democracy is that people are affected by events, and by government policies, so they (we) have a right to a voice in the making of them -- and to accountability when they turn out to be wrong.  Accountability, of course, is not very popular, except as something to demand for the other guy, and then what is usually meant is punishment, not accountability.  I believe that most people are capable of critically examining their own beliefs and ideas as well as those of others, to varying degrees.  Whether enough of us can learn to do it well enough (and what is well enough), I don't know, but I don't think anyone does know.  It's always a good time to begin.

Saturday, September 28, 2013

There Are Elitists, and Then There Are Elitists



My Right Wing Acquaintance was playing the populist on Facebook again today, which was entertaining as always because he's almost as blatantly unconvincing in the part as George W. Bush: "the pathetic defense of Western 'ideals' expounded by the intellectual elite and the pitiful symbolic acts they take to assert them" and blah blah blah.  (Elsewhere he said that Greenpeace "talks out of its arse"; let him who is without a talking anus cast the first stone, RWA1.)  Remember, on alternate days this salt-of-the-earth common-clay-of-the-New-West Man of the People quotes sages who warn that "the turbulence of the mob is always close to insanity."

I had this in mind when I started rereading Lawrence W. Levine's The Opening of the American Mind (Beacon Press, 1996), something I've been meaning to do for some time now, and noticed this nugget:
After reading Plato's Symposium, a student came to Allan Bloom "with deep melancholy and said it was impossible to imagine that magic Athenian atmosphere reproduced."  Bloom assured him that such experiences "are always accessible ... right under our noses, improbable but always present."  But only for a small elite [12].
This is why cultural conservatives are so confused.  On one hand, works like the Symposium are canonical, the benchmarks of civilization (I was going to write "Western civilization," but that would be redundant to this mindset), signposts to the "community of those who seek the truth, of the potential knowers ... the true friends, as Plato was to Aristotle" (ibid.) in Bloom's words; but on the other hand, these materials must be used with care, lest the unsophisticated young be led astray.  Consider the Symposium from the point of view of one of today's Cultural Right: a drinking party -- indeed, an orgy -- composed of a pack of child-molesting homosexuals, trying to disguise their unnatural lusts in high-flown philosophical gasbaggery and Sophism, right down to a Queer-Theoretical myth in which heterosexuality and homosexuality are put on an equal footing.

And all this in honor of a dirty old man who eventually had to be executed by respectable citizens (after a trial by his peers) for impiety and corrupting Athenian youth!  (Bear in mind that the Symposium, like all of Plato's dialogues, was written after the execution of Socrates, and was meant to rehabilitate him and carry on his legacy.)  One of the most debased and corrupted of his minions boasted of how he had offered his body to the old lecher, only to be turned down -- so he said, but they spent the night together under the same cloak.  Even many cultural liberals prefer not to think about the circumstances of the Symposium, I think, sweeping its pederastic context under the rug.  If such a gathering were discovered today, it would surely provoke a scandal, and Socrates would have to drink the hemlock again.

P.S. Some readers might be wondering, "What about the good old days, when college students studied Greek and Latin and would have read Plato in the original?"  The short answer is that by and large they didn't.  Levine wrote:
Fortunately, we can turn directly to the students he [James Atlas] envies who, while they did indeed read the "classics" in the original Greek and Latin, read them not as works of literature but as examples of grammar, the rules of which they studied endlessly and by rote. James Freeman Clarke, who received his Harvard A.B. in 1829, complained, "No attempt was made to interest us in our studies. We were expected to wade through Homer as though the Iliad were a bog ... Nothing was said of the glory and grandeur of this immortal epic. The melody of the hexameter was never suggested to us." Henry Adams proclaimed his years at Harvard from 1854 to 1858 "wasted" and exclaimed in his autobiography: "It taught little, and that little ill ... Beyond two or three Greek plays, the student got nothing from the ancient languages" [16].
In E. M. Forster's early 20th century novel Maurice there's a scene in a Cambridge University Greek class.  (It's in the Merchant-Ivory movie adaptation too.)  The students are orally translating some ancient text into English, and the Don instructs them to "Omit: a reference to the unspeakable vice of the Greeks."  Students with a personal interest in that "vice" worked out their understanding of such things on their own.  The educational establishment walked a narrow line between reverence for the classics and hostility to their contents.

Saturday, July 6, 2013

The Smart-Aleck's Tale

What does it say about me that among my favorite songs are three which express the point of view of someone who offers his love to someone smarter, despite his own lack of book-smarts? And I identify with him, not the person he's offering his love to.

I've been having some online conversations that have left me feeling a bit uneasy, with people I know from high school. As I've said before, I was an isolate in those days, partly because I was so closeted, but also because I seemed to have so little in common with my peers. Playing guitar was the first real connection I was able to make with them. I was also an intellectual who did well in school, and while I didn't look down on others, I didn't know what to talk to them about. This was partly temperamental on my part; the tendency to be a loner seems to run on my father's side of the family. It's more than shyness, but shyness is certainly part of it. Since I got onto Facebook and some of my old schoolmates contacted me, I've learned that there were more readers and thinkers (and even some atheists) in my small town than I realized.

But what about everyone else? A few of my classmates have told me that they thought I believed I was better than others, too good to be friends with them because I was smart. This is difficult to answer, because I didn't think I was better than they were, and I don't feel I'm that smart. I wrote "feel" there rather than "think" or "believe," because I know objectively that I am smart, probably smarter than most people, but most of the time I'm more conscious of what I don't know than of what I do know. If anything, in high school I felt inferior to most people: fearful of what they'd think if they knew I was homosexual; ashamed of my family for various private reasons I'm not going to write about, but suffice it to say that I thought we were abnormal; ashamed of my own shyness and penchant for solitude. Books and music were a refuge for me, as they are for many awkward adolescents. It wasn't my own thoughts I communed with, but the thoughts of other people encountered in the books I read.

It was this shyness and insecurity that other people misinterpreted as arrogance. They still do sometimes, so when I have the chance, when someone remarks on someone else's perceived arrogance, I suggest that maybe that other person is shy and insecure instead. I suspect that many people like accusing others of arrogance, no doubt because of their own insecurities, which probably come from being taught by various adults to think of themselves as stupid. Among the writers I learned from in those days were Jonathan Kozol and John Holt, who taught me what a lot of other kids went through while I was coasting through school. If I had ever thought of my schoolmates as dumb, I got over that decisively. But by then I was out of school, and far from my home town.

It's also true that in my teens I exposed myself to a lot of writing, especially in science fiction, which espoused technocratic elitism, the idea that the more intelligent few should rule the ignorant many. This flattered my alienated adolescent self, of course, and it took me quite a while to learn better and get over it.

Two things have been helpful to me in figuring out where I stand. Several people have told me that I'm good at explaining things, clearly and without condescending to them. This is probably why so many people have used me as a resource over the years, a walking, talking encyclopedia that shares what it knows but doesn't pretend to know what it doesn't. One of my high school classmates told me that I helped him with algebra, so this willingness to share knowledge goes back at least that far. But there are subjects that can't be covered in five or ten minutes; some involve worldview, and that's harder to get across. For example, you can talk about specific cases of American foreign policy (Vietnam, the Iraq War, Central America, and so on), but a lot of people reflexively resist the idea that the United States has ever done anything really, seriously wrong. It's too disturbing, so they balk. This includes not just people with high-school educations, of course, but people with college degrees.

The other thing, probably more important, was learning to listen. In the Seventies I joined a local crisis telephone line and was trained in what they called "listening skills." It really just codified and highlighted things I already knew; I'd noticed almost as soon as I arrived in Bloomington that a surprising number of people found me easy to talk to about their lives. Reading is a kind of listening, after all, and books for me have always been a way to learn about the lives of people who were different from me.

[This post has been in my drafts folder for a year or more; time to get it out of there while I work on something else.]

Friday, March 15, 2013

Move 'Em Up, Head 'Em Out

I finally finished Chris Hayes's Twilight of the Elites: America After Meritocracy, which turned out to be even less meritorious than I expected.  Hayes's ongoing misuse of the word "meritocracy" annoys me like a stone in my shoe: "institutions that create meritocratic elites" (64), "the energetic force of meritocratic competition" (75), "American political economy during the meritocratic age" (142), "several decades of failed meritocratic production" (155), "the class of meritocratic overachievers" (201), "the meritocratic era" (203), and the like.  What meritocratic era?  He can't seem to make up his mind whether meritocracy has been an aspiration, or a brief shining moment that has alas faded, or some combination of the two. 

He even quotes someone else's misdefinition:
Adam Michaelson, a former advertising executive who worked in Countrywide [Financial]'s marketing division, recalls the atmosphere of the business this way: "Countrywide fashioned itself a meritocracy, that is, whoever generated the most value or profit for the firm would be granted the greatest rewards, growth and prestige."  As happened in so many places over the past decade, the institutional definition of merit inside Countrywide became thoroughly perverse [97-8].
But the "institutional definition" didn't become "perverse," it was perverse in itself.  Those who generated value and profit for the firm weren't running it (the -cracy part of meritocracy again): the upper ranks of executive officers did that.  Those upper echelons skimmed off immense wealth for themselves from the "value and profit" their underlings generated:
Between 2001 and 2006, [cofounder Angelo] Mozilo managed to arrange for himself a staggering $470 million in total executive compensation.  The most cynical interpretation of these actions, though also the most plausible, is that Mozilo was looting the company he'd built as fast as he could before the markets or regulators caught up with him [97].
Earlier in the book Hayes wrote, "In a meritocracy, people are judged not on the color of their skin, but on the content of their character" (51).  Leaving aside the fact that we have no way of judging the content of people's character directly, before they are granted access to money and power, it seems fair to say that Mozilo showed his character all too well as he squirreled away vast amounts of money.  In the end the Securities and Exchange Commission caught up with him, but he managed to avoid trial or jail by reaching a settlement in which he "agreed to pay $67.5 million in fines and accepted a lifetime ban from serving as an officer or director of any public company ... The terms of the settlement allow Mr. Mozilo to avoid acknowledging any wrongdoing.  In February 2011, the U.S. dropped its criminal investigation into the facts behind that civil settlement."

This isn't, contrary to Hayes's repeated lament, a new crisis of authority and public trust in the US.  What bothers me is that he really thinks it's important that we believe authorities, and that they used to deserve, or even theoretically could deserve, unquestioning trust.  He seems to think, for example, that the "mainstream" corporate media used to be reliable, or ought to be treated as if they were.  It "wasn't just the rise of technology that produced the explosion of the blogosphere; it was the perceived failings of the mainstream media" (118).  That word "perceived" bothers me.  It could mean that those failings became manifest and so were perceived; but in context it seems to mean that people only thought they discerned those failings.  If Hayes has ever heard of I. F. Stone, he doesn't let on.  As he concedes,
It turned out that ... someone sitting in a basement in New Jersey, using the Internet, reading from a diverse set of sources about WMD intelligence, could actually get closer to the truth than the beat reporter with the inside sources at Langley [118].
That's essentially what Stone did.  Expelled from the Washington press corps in the late 1940s, he started his own newsletter and worked by close reading of corporate media and government publications.  He didn't even need the Internet.  Access to government insiders, that consummation devoutly to be wished among most American journalists, has more to do with wanting to rub elbows with the powerful than with doing serious journalism.

Critical thinking makes Hayes uneasy, for some reason.  He admits that authority figures have made lethal mistakes, and that professional "consensus" can be and has been completely wrong.  But he still seems to think that somewhere there's a meritocratic elite who will be right and can be trusted, if we can just find them.  That, it seems to me, is the root of the problem.  No one individual can know everything, so we have to delegate learning to other people.  But if we expect to find someone who knows everything and never errs, we're bound to be disappointed when our elites turn out to have feet of clay, and then go looking for the next true authority.  It's not necessary to be cynical about it, just realistic, but that option seems not to have occurred to Hayes.

Hayes assumes that competition selects for merit, and that is probably his biggest mistake.  In some cases it may do, where merit can be narrowly and specifically defined: as the fastest runner or the highest jumper.  But that applies only to a very small part of worthwhile human endeavor.  When merit is defined as making as much money as possible for your hedge fund, you get into trouble.  Or consider this example:
[Economist Sherwin] Rosen argued that certain technological trends had radically expanded the demand for services for those who were the best in their field: in 1950, a top basketball player could only monetize his talent with an endorsement deal that would sell sneakers to Americans; today, LeBron James is featured on billboards from Florida to Turkey to China [142-3].
I think "market" would be more accurate than "demand," but the real fault here lies in the assumptions that there is one indisputable "best in their field," and that no one else can "monetize his talent."  Many elite athletes have their fans, who'll buy products endorsed by their favorite stars, and those fans disagree about who is the best.  You don't have to be the top basketball player in the world to become rich, or even financially comfortable.  And again, sport is a field where excellence is relatively easy to define and detect, compared to the arts, let alone politics.  Hayes goes on:
The same goes in a whole host of domains: the best opera soprano can, with the advent of MP3s and the Internet, sell to anyone in the world with an iPod, which spells trouble for the fifth best soprano. If you can buy the best, why settle? [143]
This assumes that people seek out the best, or the world's best, singers, and that they agree who is the best.  Neither is true.  A fan may have his or her favorite diva, and argue that she's the best against the fans of rivals, but there is room (and a need) for more than five great sopranos in the opera world, just as there's no reason why New York City can't offer excellent educations to more than 185 new students each year.  Those who aren't the top 185 still have a claim on learning, and something to contribute.  (P.S. I think that Hayes's remarks here could only be made by someone who doesn't much like opera, and knows next to nothing about it.)

I realize that I'm probably not a typical music listener, but I don't think in terms of seeking out the best singer, or songwriter, or guitarist, or pianist, or band in the world.  I look for music or other art that gives me pleasure, or moves me emotionally, and I'm drawn by a range of artists.  Sure, there are fans who root for their band or their singer against all others -- I remember with distaste the divide between Beatles and Stones fans in the 60s, for example -- but it isn't necessary to denigrate all other singers than your personal fave.  The mindset that does so is fed by the competitive ethos of capitalism, but that's what is wrong with it.  Few of even the most fanatical fans have records by only one performer in their collections.  And none of this has much if anything to do with merit, let alone meritocracy.

Hayes structures his book around a few hot-button stories that many writers and pundits can agree are important -- the Roman Catholic abuse scandals, steroid use in baseball, corruption in business and finance and politics -- but they don't have much to do with "meritocracy."  That's not too surprising, since he doesn't have a very clear understanding of what meritocracy is or might be, and Twilight of the Elites is a frustratingly insubstantial discussion of some serious problems.

Saturday, February 9, 2013

The System That Dare Not Speak Its Name

I'm not sure I'm going to make it all the way through Chris Hayes's Twilight of the Elites: America After Meritocracy (Crown, 2012).  I know I said I was going to read it, but I got sixty pages into it yesterday and got so dispirited that I didn't want to go on.  I think he's mapped out his assumptions by the point I've reached, and they are what's wrong with the book.  Put simply -- and Hayes is not a complex thinker -- he believes that the United States is a meritocracy: that our rulers are people who showed that they were qualified to run a complex society and its institutions.  He keeps referring to our elites as "the meritocracy," repeatedly begging the question of whether they actually have merit.

This emerges very clearly in the second chapter, "Meritocracy and Its Discontents."  Hayes begins:
Whether we think about it much or not, we all believe in meritocracy. It is embedded in our very language: to call an organization, a business, or an institution “meritocratic” is to pay it a high compliment.  On the portion of its website devoted to recruiting talent, Goldman Sachs tells potential recruits that “Goldman Sachs is a meritocracy.” It’s the first sentence [31].
I presume that by that first sentence Hayes means that "we" all believe a meritocracy is an ideal to be worked toward.  If so, he may be right, as when he wrote a few pages earlier that meritocracy "is an ideal with roots that reach back to the early years of the Republic" (21), that people should achieve wealth and status through disciplined hard work, not have them assigned by accident of birth.  Or, as he also puts it, "In a meritocracy, people are judged not on the color of their skin, but on the content of their character. This crucial distinction between the contingent and essential features – between skin color and intelligence – appeals to some of our most profound moral intuitions about justice and desert" (51).

He goes on to offer what he considers a "compelling" argument, that meritocracy is not "necessarily fair, but rather that it is efficient. By identifying the best and brightest and putting them through resource-intensive schooling and training, we produce a set of elites well equipped to dispatch their duties with aplomb. By conferring the most power on those equipped to wield it, the meritocracy produces a better society for us all. In rewarding effort and smarts, we incentivize both."  He goes on to concede that "meritocracy in practice is something different.  The most fundamental problem with meritocracy is how difficult it is to maintain it in its pure and noble form" (52-3).  Before you can "maintain" a meritocracy, however, you have to achieve it.  Hayes only assumes that it has been achieved (for a few bright shining years in the Sixties, I gather), and this is not something that can be assumed; you need to provide an argument and evidence, which he doesn't do.

Instead he begins with a case study, the Hunter College High School in New York City, of which he is an alumnus.  (Along with many other well-known people, including Florence Howe, who wrote about her experience there in her memoir A Life in Motion.)  Rather flustered, Hayes describes how Justin Hudson, a graduating senior at Hunter, delivered a commencement address in 2010 that challenged many of the school's pretensions.

Hunter, Hayes explains,
embodies the meritocratic ideal as much as any institution in the country.  It is public and open to students from all five boroughs of New York City, but highly selective.  Each year, between 3,000 and 4,000 students citywide score high enough on their fifth-grade standardized tests to even qualify to take Hunter's entrance exam in the sixth grade.  Only 185 are offered admission ...

Hunter is routinely ranked one of the best high schools in the nation.  In 2003, Worth named it the highest-ranking public feeder school, and Newsweek, in a 2008 evaluation of high schools, named it one of eighteen "public elites."  In 2007, the Wall Street Journal identified Hunter as sending a higher percentage of its graduates to the nation's top colleges than all but one of its public peers.  That year, nearly 15 percent of the graduating class received admission to one of eight elite universities that the Journal used for its analysis.  The school boasts an illustrious group of alumni, from famed Civil Rights activist and actress Ruby Dee, to writers Cynthia Ozick and Audre Lorde, to Tony Award winners Robert Lopez (composer and lyricist, The Book of Mormon and Avenue Q) and Lin-Manuel Miranda (composer and lyricist, In the Heights), to West Wing writer and producer Eli Attie, to Supreme Court justice Elena Kagan [32].
Where do I begin?  Well, first and I hope most obviously, Hayes takes standardized tests very seriously: he seems to think that they actually measure merit.  In reality they measure ability to choose correct answers from multiple-choice options under intense time pressure in a competitive environment.  Marking in bubbles on an answer sheet -- neatly, neatly, don't get outside the bubble! -- while the clock ticks is indeed a skill of some kind (I did quite well at it myself in the day), but I'm not sure it is equivalent to "merit."  Hayes is sure it is.  He considers the SAT, for example, "an objective measure of the 'merit' of the applicant" so that "the nebulous subjectivity of the admissions procedure would be eliminated in favor of an equal, accessible, and objective metric" (37).  There is quite a large literature which explains why this isn't so, but Hayes seems not to be aware of it.  Well, nobody's perfect.  (For now I'll refer the reader to two articles by Alfie Kohn, available online, and to his book The Case Against Standardized Testing [Heinemann, 2000].)

Notice that the famous alumni Hayes names are not, in fact, "crats," however much merit they have, except for Justice Kagan.  Poets, composers, writers: they contribute to our society, no doubt, but they don't rule it.  This is a problem that runs throughout his discussion.  Not only is he confused about what merit is, he consistently ignores the "-cracy" in meritocracy.  He's far from alone in doing so, of course.

Hayes also seems to want to impress the reader with how selective Hunter is.  Thousands apply, but only a few (the best!) get in.  The best!  If only a few are let in, they must be really good, right?  And it proves that the school is good, right?*  He's so attached to this idea that he misunderstands Justin Hudson's critique of the Hunter system -- or rather, of the system that produced Hunter, the system Chris Hayes mistakes for meritocracy.  Hudson's entire speech is available online; I'll mainly stick to the portions Hayes quotes.  "I feel guilty," Hudson said,
... because I don't deserve any of this.  And neither do any of you.  We received an outstanding education at no charge based solely on a test we took when we were eleven-year-olds, or four-year-olds.  We received superior teachers and additional resources based on our status as "gifted," while kids who naturally needed those resources much more than us wallowed in the mire of a broken system.  And now, we stand on the precipice of our lives, in control of our lives, based purely and simply on luck and circumstance.  If you truly believe that the demographics of Hunter represent the distribution of intelligence in this city, then you must believe that the Upper West Side, Bayside, and Flushing, are intrinsically more intelligent than the South Bronx, Bedford-Stuyvesant, and Washington Heights, and I refuse to accept that [quoted by Hayes, 33].
Hayes reports that the "parents in the crowd were, not surprisingly, a bit taken aback.  The teachers offered a standing ovation.  Jennifer J. Raab, the president of Hunter College and herself a graduate of Hunter College, stayed seated" (34).  Hayes himself is indignant: "Unlike elite colleges ... entrance to Hunter rests on a single 'objective' measure: one three-hour test.  If you clear the bar, you're in; if not, you're not.  There are no legacy admissions, and there are no strings to pull for the well connected.  If [NYC Mayor] Michael Bloomberg's daughter took the test and didn't pass, she wouldn't get in.  There are only a handful of institutions left in the country about which that can be said" (34).

But Hayes is missing the point, partly I think because Hudson didn't make it clearly enough.  I think he'd have preferred to miss it anyhow.  In one sense, I disagree with Hudson: I think he and his fellow students do deserve the "outstanding education at no charge" that they received at Hunter.  But every other child in New York City, and indeed in the US and around the world, deserves an outstanding education at no charge.  Not the same education, mind you, but an outstanding education that would bring out and nurture the capacities each child has.  Most children in New York don't get anything like that unless their capabilities match those valued by elite universities, which Hayes, begging the question, assumes to be merit.  This is the source of Hudson's survivor's guilt, and Hayes misses it entirely.

Only 185 children are offered admission to Hunter each year.  That seems an arbitrary number, but I suppose it's the capacity of the school.  There is no reason why there couldn't be other good schools in the city, each admitting another 185 of those who've scored high on the entrance exam; even those who take the test in the first place have already been selected from the mass of students by the preliminary test they took in fifth grade.  There are other elite schools in New York, of course, like the Bronx High School of Science attended by Samuel R. Delany.  I imagine there could be even more, if there were the will to spend the money to establish them.

It's good that such schools exist, but there also need to be good schools for all the other children.  There aren't, and that's the root of Justin Hudson's guilt and anguish.  This is especially true because we can't know in advance what any given child is capable of, though the testing industry and many educational "professionals" are built on the premise that we can.  Michael Young satirized this belief in the 1950s in his The Rise of the Meritocracy, which Chris Hayes has read but whose satire he can't quite grasp.  Young mocked the pretense of psychometricians to be able to identify children's potential at ever earlier ages.  Though it's well established that there is no measure, "objective" or otherwise, that can really do this, Hayes, like other believers in meritocracy, wants to believe that there is, or could be.  And one of the core features of meritocracy as an ideology is that those who rise in the hierarchy must be willing to accept the values of the rulers and leave behind the lesser, lower orders from which they came.  The values of the rulers must be meritorious, because they are the rulers' values and the rulers wouldn't be rulers if they didn't have merit, right?

"Hunter's approach to education rests on two fundamental premises," Hayes declares in his apologia.
First, kids are not created equal.  Some are much smarter than others.  And second, the hierarchy of brains is entirely distinct from the social hierarchies of race, wealth, and privilege.  That was the idea, anyway.  But over the last decade, as inequality in New York has skyrocketed and competition for elite education has reached a fever pitch, Hunter's single entrance exam has proven to be a flimsy barrier to protect the school's meritocratic garden from the incursions of the world outside its walls [35-6].
Hayes makes several blunders here.  First, different is not the same as unequal.  That people are not equally intelligent does not mean they are or should be unequal in terms of rights; Hayes's allusion to the opening of the Declaration of Independence shows that he either doesn't understand the difference (more likely, I suppose) or that he's deliberately blurring it (in which case he isn't as smart as he likes to think).  Second, there is not really a "hierarchy of brains" distinct from the "social hierarchies of race, wealth, and privilege," especially since (as Hayes uncomfortably admits) the two are intimately connected, even intertwined where Hunter is concerned.  The percentage of black students entering Hunter dropped from 12 percent in 1995 to 3 percent in 2009, he reports on page 36.  Have black New Yorkers gotten 75% dumber in 14 years?  I doubt it very much, and even Hayes can't quite bring himself to say so.  (No doubt because of the "open-minded, self-assured cosmopolitanism that is the guiding ethos of the current American ruling class," which he says he "absorbed" at Hunter [35].)  Hunter isn't responsible for the growing economic inequality, with all its destructive consequences, of the world outside its walls -- or is it?  After all, its graduates are disproportionately represented among the elites who are responsible for the increase in inequality.  It appears likely that something important wasn't covered in that outstanding education they received.

I don't find that surprising.  Hunter isn't an alternative approach to education, it's a portal that admitted a few people who wouldn't have gotten past the old aristocratic sorting methods of the elite schools.  Hunter was founded at a time when the prevailing sorting methods were frankly racist and sexist; basing admission solely on test results was a good start, but it was only a start.  Its aim is to assimilate its students into the values of the ruling elites.  Remember that it was very intelligent, highly-educated products of the system represented by Hunter -- the "best and brightest" -- who invaded Vietnam and devastated Southeast Asia in the name of anti-Communism.  Just as George W. Bush's staff 'fixed the intelligence' to justify the US invasion of Iraq, the Pentagon Papers revealed that the Eisenhower administration fixed the intelligence, despite the absence of supporting evidence, to prove that Ho Chi Minh was a Kremlin agent; Kennedy's men bought this fabrication and ran with it as if it were a game of touch football.  Of course, very few of them died as a result of their brilliance.  (Most of the casualties, American or Vietnamese, didn't merit an elite education.)

Ursula K. Le Guin put it neatly in words she invented for the philosopher Odo in her novel The Dispossessed, originally published in 1974 (reprinted by Perennial, 2003, p. 358):
For we each of us deserve everything, every luxury that was ever piled in the tombs of dead kings, and we each of us deserve nothing, not a mouthful of bread in hunger.  Have we not eaten while another starved?  Will you punish us for that?  Will you reward us for the virtue of starving while others ate?  No man earns punishment, no man earns reward. Free your mind of the idea of deserving, the idea of earning, and you will begin to be able to think.
These words leapt out at me when I first read them decades ago, because I'd recently encountered the same idea in Walter Kaufmann's Without Guilt and Justice, where he attacked "the idea of deserving" at length.  I don't expect Chris Hayes to have read either of these books, of course.  But he needn't have read them to have encountered some enlightening alternative views of ability, education, and social justice.

Hayes alludes a couple of times to the metaphor of the cream rising to the top.  So does the scum, but he prefers not to think about that.  In either case it is a metaphor which has dubious relevance to education or the running of a society.  He eventually claims that at Hunter, "over time the mechanisms of meritocracy have broken down. With the rise of a sophisticated and expensive test-preparation industry, the means of selecting entrants to Hunter has grown less independent of the social and economic hierarchies in New York City at large. The pyramid of merit has come to mirror the pyramid of wealth and cultural capital" (54).  This is all wrong.  There is no pyramid of merit, not even figuratively.  The "mechanisms of meritocracy" were never in place.  Perhaps Hayes believes that Hunter's entrance exam could be revised so as to weed out the kids who've been prepped for it, leaving only the truly, innately excellent -- but there is no reason to think so.  Hunter isn't to blame.  The fault lies with the system Hayes mistakenly buys into, no doubt because he has benefited from it: the hierarchy of wealth and power that justifies itself by declaring itself a meritocracy.  The merit involved can only be instrumental, the ability to carry out efficiently tasks whose value isn't open to question.  Should we invade this or that country?  Don't ask: the only question is whether we can do it efficiently.  I think Hayes is obscurely and uneasily aware of this, but isn't ready to challenge the values of the society that raised him up and gave him a platform to write books like this one.

Speaking of which, it looks like the next chapter will focus largely on professional baseball's doping scandals.  The ability to play baseball well is a merit easier to measure than the ability to govern a country well and wisely, but baseball players don't rule the country either.  I have to remind myself that what concerns Hayes is that our elites have let us down.  Say it ain't so, Joe!  I'm going to have to steel myself to keep reading.

* P.S.  On rereading this it occurred to me that while it was obvious to me that Chris Hayes was wrong to assume that Hunter's extreme selectivity was evidence of its merit, and of its students', it might not be obvious to everyone else.  Sarcastic rhetorical questions, while fun, aren't an argument.

So: suppose that Hunter still only admitted 185 new students each year, but did so purely on the basis of a lottery, a random drawing.  It would still be very selective, but no one would claim that this proved the "merit" of the 185.  Or suppose that admission was based on the ranking of the students' parents among New York City's wealthiest families.  (Or its poorest, if you like.)  There are probably people who'd argue that the richest have demonstrated their merit and their children have demonstrated theirs by choosing good parents to be born to, but I don't think Hayes would be one of them.  Selectivity by itself doesn't prove much; consider the negative associations of the word "exclusive," as in "exclusive country club."

Hayes claims (or assumes -- after all, he passed it, so it must be good!) that Hunter's entrance examination is an "objective measure" of the students' aptitude for the Hunter experience, but I doubt it: could it be any better than IQ tests or the College Board examinations, both of which are seriously flawed as measures of intelligence, intellectual ability, or scholastic aptitude?  If so, Hunter should share it with the world.  But here I'm only really concerned with his evident belief that selectivity is a good in itself.

Thursday, January 17, 2013

Meritocracy in Action, Episode 5487

Yesterday I wrote that "people who make large amounts of money in business or finance often think, and are thought by some, to know how the government should be run, or even just the economy.  Aside from the huge business and fiscal disasters over which such people have presided, their public statements since the re-election of Barack Obama should have disabused most people of any notion that they're even qualified to decide their own salaries: they routinely think they're worth a lot more than they are."

New examples keep turning up.  The other day, Whole Foods CEO John Mackey made headlines when he claimed that the Affordable Care Act was "like fascism," marking himself out as a blustering fool.  So of course he quickly attempted to rewrite the record, digging himself in even deeper, as public figures from Todd Akin to Barack Obama usually do when they try to make it all better.

Mackey conceded that the word "fascist" wasn't the most felicitous choice: "it's got so much baggage attached to it ... Of course, I was just using the standard dictionary definition."  It's hard to see how the ACA fits the dictionary definition of fascism; maybe Mackey took "Obamacare" literally and thought that Obama just wrote the law himself and signed it without going through Congress?  Rational people have raised all kinds of objections to the ACA, but it is not an example of a strong autocratic government led by a dictatorial leader.
"We no longer have free-enterprise capitalism in health care," he said. "The government is directing it. So we need a new word for it."

Mackey defined it later on HuffPost Live. "I think I'm going to use the phrase government-controlled health care. That's where we're evolving to right now," he said.
Of course this is raving, pure and simple.  Arguably we haven't had "free-enterprise capitalism" in health care since the insurance companies took it over a few decades ago.  Almost no doctors are independent small-business owners anymore, as many were when I was a kid; most now work for the HMOs.  Corporate capitalism is not free-enterprise capitalism, and the American health care system is not 'directed' by the government; it's directed by giant corporate entities.  Which, as has been pointed out at tiresome length in the past few years, is why our health care system is so wasteful, inefficient, and overpriced, with poorer outcomes than those of other industrialized nations.

The very limited regulation of the insurance companies mandated by the ACA isn't "government-controlled healthcare", any more than food inspection is "government-controlled food production."  If you're going to talk about government 'direction' of health care in the US, you could talk about state licensing of medical practitioners, or federal oversight of drug safety, or Medicare.  I think most Americans would agree that if Medicare is fascism, fascism isn't so bad after all, and we could use a good deal more of it.

Of course our neighbor to the north has what Mackey would probably also call "government-controlled healthcare", but few would call Canada fascist because of it.  England's National Health Service goes even further in that direction, but again, it isn't fascist.  (Mackey previously compared the ACA to socialism in a 2009 Wall Street Journal op-ed, according to the HuffPost article I've linked here.  But you can't blame a man for evolving; he was so much younger and less mature then.)  Most developed countries have government involvement in the health care system to keep costs down and manage outcomes, for that matter.  Our corporate-controlled system is the main reason why Americans' health is so much worse than that of citizens of other First World countries.  And don't forget, President Obama and his core supporters find it hilarious or outrageous that anyone should think Canada's system might be a good example for us.

Richard Dawkins has railed against clergy and theologians being allowed to participate in public discussions.  But they couldn't be any worse than our corporate elites, and often aren't.  Which doesn't mean I want business types to be silenced; better they should be allowed to remind us, as often as necessary, how dumb they really are.

Wednesday, January 16, 2013

When I Hear the Word "Meritocracy" I Reach for My Critical Thinking

I'm not sure why it seems important to write something about Chris Hayes's recent book Twilight of the Elites: America After Meritocracy (Crown, 2012).  For one thing, Glenn Greenwald recommended it highly, and I have a lot of respect for Greenwald.  (On the other hand, he also thinks more highly of Rachel Maddow than I do.)  For another, Hayes has done some good work on his TV program.  But the most important reason is that I find the whole question of elites and meritocracy provocative.

Before reading Hayes's book, though, I read a classic of sociological satire, The Rise of the Meritocracy 1870-2033, by Michael D. Young, originally published in 1958. Young invented the word meritocracy, and it's remarkable how quickly what he intended as snark was adopted by the very people he was mocking.  Understandably, this annoyed him, and in 2001 he complained in print that the Labour party had borrowed the word for its own purposes, along with the abuses Young had warned about.

"Meritocracy" is a hybrid word: like "homosexuality," it consists of a Latin and a Greek component glued together.  The -ocracy was an obvious choice, given its already widespread use.  Why didn't Young use a Greek prefix?  Maybe because such a compound already existed: aristocracy, which means rule by the best or most excellent.  For several hundred years, though, aristocracy had taken on connotations of rule by the privileged, the "best-born," and one could refer to an aristocrat without meaning that he or she was an excellent person -- rather the opposite.  Even those who mainly supported the social order, like Jane Austen, were aware that birth and breeding didn't guarantee excellence, virtue, or even mediocrity: an aristocrat could be crass, stupid, dishonest, and corrupt.  I don't think I'm the only person for whom aristocracy suggests an inbred class of people, probably related to each other, who assume their excellence, especially when they have none.

Merit has similar ambiguities, though, like any abstraction.  It used to mean an earned reward or punishment; I guess demerit eventually took on the sense of an earned punishment.  Two of Merriam-Webster's definitions, "the qualities or actions that constitute the basis of one's deserts" and "character or conduct deserving reward, honor, or esteem", are probably what Young was thinking of.  They're basically circular, but then Young didn't mean his new word to be a sensible one.  Still, in 2001 he could write, "It is good sense to appoint individual people to jobs on their merit."  The difficulty is working out exactly what "merit" is pertinent to a job.  One common problem that seems to be overlooked, probably because it's intractable: Even after a person has occupied a position for some time, he or she can be excellent in some respects and terrible in others; so what do you do?  Young's fictional narrator could claim that by the late 1900s, testing had been developed to the point where a candidate's aptitude for a given spot could be measured precisely; in the real 2010s we're still far from such discernment.

There appears to be a widespread notion in the US and in the UK that we already live in a meritocracy.  Or sort of one.  Or we'd like to.  Or we should.  Or we used to, but it's turning into an oligarchy.  I did a web search for "we live in a meritocracy," and most of the results were questions rather than declarations.  Not too surprisingly, the pundit David Brooks came up with his own use for the word: "The hunger for recognition is a great motivator for the meritocrat ... Each person responds to signals from those around him, working hard at activities that win praise and abandoning those that don't", as if simply wanting to rise in a hierarchy made a person a meritocrat, or constituted merit.  (The article is a tiny masterpiece of free association.)

Most people who use the word "meritocracy," even critically, seem to take for granted that people in high places actually have proven their merit.  I don't.  Generally their success within the system is assumed to be evidence that they have merit, which is conveniently circular.  In the absence of better evidence, I think Noam Chomsky had a good point:
One might suppose that some mixture of avarice, selfishness, lack of concern for others, aggressiveness, and similar characteristics play a part in getting ahead and "making it" in a competitive society based on capitalist principles. Others may counter with their own prejudices. Whatever the correct collection of attributes may be, we may ask what follows from the fact, if it is a fact, that some partially inherited combination of attributes tends to lead to material success?
People tend to overlook the suffix of "meritocracy," the idea that those with "merit" should rule. The first violinist of the New York Philharmonic may be one of the best violinists in the world, but the main benefit accruing to his or her merit is ... playing in the New York Philharmonic.  Some prestige too, no doubt, but the achievement is an end in itself, as achievement mostly should be.  The same goes for star athletes: they may get paid a lot of money, but no one thinks that their achievement entitles them or qualifies them to do anything but play their sport.  Yet people who make large amounts of money in business or finance often think, and are thought by some, to know how the government should be run, or even just the economy.  Aside from the huge business and fiscal disasters over which such people have presided, their public statements since the re-election of Barack Obama should have disabused most people of any notion that they're even qualified to decide their own salaries: they routinely think they're worth a lot more than they are.

Lately at the library book sale I stumbled on The Reality Club (Lynx Books, 1988), a collection of papers from (according to the blurb) "New York's most vibrant intellectual salon", which offered "every reader a journey to the cutting edge of ideas and knowledge in our culture."  I decided to buy it because one paper was "In Defense of Elitism," by Gerald Feinberg, a Professor of Physics at Columbia.  Strictly speaking, "elitism" isn't the same thing as "meritocracy," but since neither word is well-defined, they work out in practice to be roughly the same thing.  (Feinberg helpfully places a definition as an epigraph to his article: "the leadership by the choice part or segment" [275], though the definition is neither helpful nor very grammatical: "the leadership"?)  Like "aristocracy," "elitism" has acquired some negative baggage, but there's always somebody game to defend it.

"In what passes for social commentary in present-day America, it is hard to find someone who has anything favorable to say about elitism," Feinberg bravely begins.  Yet "No voices have been raised to condemn the Los Angeles Lakers basketball team or the Chicago Symphony Orchestra, even though those institutions are elitist according to any plausible definition of that term" (275).  Really?  I'm not sure they fit even Feinberg's definition.  What do they "lead"?  Who chose them?  Audiences attend events featuring either "segment" not to be led but to see what they hope will be excellent performances; they also come out of local chauvinism, loyalty to the brand, and the desire to see and be seen by other fans.  Or again: who "leads"?  It's not the players but their coaches who lead, and the course of the orchestra will be influenced if not determined by wealthy donors who possess no musical excellence themselves.

Feinberg is quick to distance himself from political elitism:
I am not arguing that elitism is a desirable approach to determining who should govern society.  I agree with the view that was adopted early in our society that participation in the process of government by a large part of society is a more important matter than how effective the governors are [276].
Cute.  And disingenuous; surely Feinberg is aware what a question-begging word "effective" is in this context.  Does he mean, say, making the trains run on time?  He also seems to overestimate the actual "participation in the process of government by a large part of society" in the American system, even in theory.
My defense of the proposition that elitism is a desirable attitude, at least in certain situations, is based on two simple ideas.  One is that some activities are accepted as so worthwhile, both by those that do them and by society at large, that they should be done as well as they can be done.  The other proposition is that some individuals are much better at doing these activities than others, or can become by any methods now known [276].
Even if I grant his ideas, what does either one have to do with leadership?  Why should acceptance by the ignorant canaille ("society at large") have any more weight in determining what is "worthwhile"?  True, Americans do invest a large amount of interest, energy, and money in elite sports, but in what way does that make the NBA or the NFL more worthwhile than pro wrestling?  Classical orchestras are having increasing financial trouble, so does that mean that they are less worthwhile than they used to be, and would Feinberg admit the worth of more profitable music with less old-people prestige, such as Lady Gaga or hip-hop?  The idea that only two-hundred-year-old European art music is "worthwhile" has less to do with its intrinsic excellence (which is often real, I'm not disputing that) and more to do with class stratification.

Excellence in sport or musical performance has a certain advantage over other kinds of excellence: it's a lot simpler, and easier to evaluate, than excellence (or effectiveness) in politics or artistic invention or many other areas.  Feinberg glosses over this problem with handwaving, and it soon becomes clear that his article, far from being cutting edge, is an old-fashioned attack on the Trash They're Letting Into College These Days, and the Barbaric Garbage That Is Being Taught/Sold as Art.  He's deliberately vague about the details, but then he could probably count on his intended audience agreeing.

Ironically enough, Feinberg turns out to be part of the problem he's complaining about, when he dismisses "what passes for social commentary in present-day America."  He may not know much about society, politics, or art, but he knows what he likes.  On top of that he's a particle physicist and a professor at an elite institution of higher learning, so his uninformed and half-thought-out ramblings on social policy and the arts are supposed to count for more than the opinions of someone who has actually studied such subjects and knows what he or she is talking about.  As a physicist Feinberg was elite -- he shared a Nobel Prize in Physics for work on neutrinos -- but expertise in physics doesn't translate into expertise on social issues.  It doesn't even follow that a Nobel Laureate in the sciences is qualified to lead his own academic department; if he is, it's coincidence.

That's the trouble with elitism: ironically, it mirrors the attitude Feinberg swipes at in popular participation in government, the notion that one man is as good as another and a damn sight better too.  Elitism as an ideology encourages people who do well in one area to believe they can automatically do well in others.
----------------------------------
My next post on "meritocracy."

Saturday, November 17, 2012

Dumb Humans Think Humans Getting Dumber

Wait, Wait Don't Tell Me referred today to a study that claims human beings aren't as intelligent as they used to be.  I'd seen it mentioned once last week, but forgot about it; after an online search I realized that it was getting a fair amount of attention, so I read some of the articles and decided it was something I wanted to write about.

First, though, go back to this unrelated (or is it?) story from last March.
Work by Cornell University psychologist David Dunning and then-colleague Justin Kruger found that “incompetent people are inherently unable to judge the competence of other people, or the quality of those people’s ideas,” according to a report by Life’s Little Mysteries on the blog LiveScience.
“Very smart ideas are going to be hard for people to adopt, because most people don’t have the sophistication to recognize how good an idea is,” Dunning told Life’s Little Mysteries.
What’s worse is that with incompetence comes the illusion of superiority.
The irony in that last sentence sails over the researchers' heads, of course.  Hold that thought as I proceed.

Back to the articles (two so far) published by Stanford geneticist Gerald Crabtree on diminishing human intelligence.  I did some looking around for more information and tried to find Crabtree's articles, but though I found the journal online through the university, I couldn't find the articles themselves.  I'll keep looking.

But for now, none of the reports indicate that Crabtree presented any evidence that human intelligence is in fact decreasing. Here's a summary of his argument from the Daily Mail:
Based on calculations of the frequency with which deleterious mutations appear in the human genome and the assumption that 2,000 to 5,000 genes are required for intellectual ability, Dr Crabtree estimates that within 3,000 years, about 120 generations, we have all sustained two or more mutations harmful to our intellectual or emotional stability.

Also, recent findings from neuroscience suggest that genes involved in brain function are uniquely susceptible to mutations.

Dr Crabtree argues that the combination of less selective pressure and the large number of easily affected genes is eroding our intellectual and emotional capabilities.
There is, I admit, no evidence Crabtree could present on human intelligence in history, because we have no good measure of intelligence for humans today (or any definition of intelligence that would enable us to measure it), and even if we did, we are unable to apply those measures to people who lived hundreds or thousands of years ago.  So what Crabtree has here is at best a hypothesis that he can't test, nor has he any prospect of being able to test it.  (Readers who take IQ tests seriously might want to recall the Flynn Effect, a documented rise in IQ scores that has been observed since the beginning of IQ testing.  But again, there's no way to administer IQ tests to our cavedwelling forerunners.)  According to a New York Daily News story, "Crabtree estimated that within 3,000 years, humans will endure two or more mutations harmful to our intellectual and emotional stability." Now, there's a testable prediction -- all we have to do is wait three thousand years, and then Dr Crabtree can collect his Nobel Prize!
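To see how little is underneath the headline number, notice that it's just arithmetic stacked on assumptions.  Here's a back-of-envelope sketch in Python; every input below is my own illustrative guess, not a number taken from Crabtree's paper, yet it lands in the same "two or more mutations" neighborhood:

```python
# Back-of-envelope version of a Crabtree-style estimate.
# Every constant here is an illustrative assumption, not a figure from the paper.

GENERATIONS = 120           # ~3,000 years at ~25 years per generation
TARGET_GENES = 3500         # midpoint of the claimed 2,000-5,000 "intelligence" genes
CODING_BP_PER_GENE = 1500   # rough average coding length per gene (assumption)
MUTATION_RATE = 1.2e-8      # new mutations per base pair per generation (commonly cited)
FRACTION_HARMFUL = 0.3      # fraction of new coding mutations that are deleterious (assumption)

# Total mutational "target" in base pairs, then expected harmful hits per lineage.
target_bp = TARGET_GENES * CODING_BP_PER_GENE
per_generation = target_bp * MUTATION_RATE * FRACTION_HARMFUL
expected = per_generation * GENERATIONS

print(f"expected harmful mutations per lineage over {GENERATIONS} generations: {expected:.1f}")
```

Vary any of those guesses by a factor of two and the "result" moves right along with it, which is the point: the conclusion is baked into the inputs, and nothing in the calculation tells us whether such mutations have actually made anyone dumber.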

So how does Crabtree argue for a decline in human intelligence?  He uses a well-worn canard, that human beings have gone soft over the millennia because we don't have to dodge saber-tooth tigers anymore.  As this writer quotes him:
"Needless to say a hunter gatherer that did not correctly conceive a solution to providing food or shelter probably died along with their progeny, while a modern Wall Street executive that made a similar conceptual mistake would receive a substantial bonus," Crabtree explains.
Crabtree gets a point or two for mocking the people who think they're the smart ones, but that may be why this blogger -- Web Editor for the San Francisco Business Times -- isn't all that impressed.  Still, he does a good job on a much-quoted passage from the paper:
"I would be willing to wager that if an average citizen of Athens of 1000 BC were to appear suddenly among us, he or she would be among the brightest and most intellectually alive of our colleagues and companions," writes Crabtree (whose knowledge of Athenian history may not be quite as good as his obvious expertise in genetics -- he's chosen a date from the Dark Age in Greece, when writing was forgotten and "citizen" was a bit of a stretch, centuries before democracy, Pericles and his ilk -- ah, but I digress; read on, perhaps that's his point after all).

"We would be surprised by our time-visitor's memory, broad range of ideas and clear-sighted view of important issues. I would also guess that he or she would be among the most emotionally stable of our friends and colleagues," he goes on.
He adds, "I would also make this wager for the ancient inhabitants of Africa, Asia, India or the Americas, of perhaps 2,000 to 6,000 years ago."  So that's all right then.

I really must track down a copy of the paper, because I find it hard to believe that this drivel was published in a professional, peer-reviewed journal of genetics.  There's no science here, just speculation and fabulation: "I would be willing to wager ... We would be surprised ... I would also guess ..."  That and $2.50 will get you on the Metro.  It might fly on an Op-Ed page somewhere, but a scientist is supposed to give support for speculations, not just toss them out and treat them as fact.

It appears that Crabtree and his colleagues got their chronology mixed up in more serious ways. They place the peak of human intelligence before humans emerged from Africa, about 2 million years ago, with the long downhill slide following.  By two to six thousand years ago, most of those damaging mutations would have done their work.  There's no reason to believe that people who lived no more than six thousand years ago would be that different from people today -- but they would, on Crabtree's assumptions, be much more like us than they'd be like our shared African ancestors on the savannah.

The Daily Mail continued:
But the loss is quite slow, and judging by society's rapid pace of discovery and advancement, future technologies are bound to reveal solutions to the problem, Dr Crabtree believes.

He said: 'I think we will know each of the millions of human mutations that can compromise our intellectual function and how each of these mutations interact with each other and other processes as well as environmental influences.

'At that time, we may be able to magically correct any mutation that has occurred in all cells of any organism at any developmental stage.

'Thus, the brutish process of natural selection will be unnecessary.'
"Magically"?

There is a lot of question-begging going on here: one assumption is that "intelligence" was a crucial factor in human survival.  It doesn't take a lot of intelligence to escape from saber-toothed tigers; many non-human species have done at least as well as we have in that area.  According to the San Francisco Business Times writer, Crabtree considers "building a house, washing the dishes and putting them away (yes, that's one of his examples), or surviving in the jungle" to be examples of high human intelligence in action.  He has an odd concept of intelligence.  Did our ancestors two million years ago wash dishes?  But again, surviving in the jungle and building shelter are not specifically human abilities.

The only criterion that really matters in natural selection is reproductive success, and human beings have done quite well at that -- too well, in many people's view.  Maybe "intelligence" isn't as vital to human evolution as we like to think.  The canard on which Crabtree builds his case, that Homo sapiens spread all over the planet, in all kinds of hostile environments, by somehow escaping selective pressure, is an absurd misunderstanding of the theory of natural selection.  But it's a popular one.  Someone posted this, linking to the Daily Mail article: "There's no longer survival of the fittest. Intelligence isn't necessary to simply survive."  Like many people this person misunderstands "survival of the fittest."  It doesn't mean fitness according to an abstract conception of superiority; it means fitness in a given environment, and has no meaning outside that environment.  In an environment where intelligence hindered reproductive success, less intelligence would be fitter and the environment would select for it.  If Crabtree were right, that would be exactly what has happened: as human beings became less intelligent, we became more successful.  (It wouldn't necessarily follow that lower intelligence was being selected for, of course.)  But whatever role intelligence played in human evolution -- and we don't really know what it was -- intelligence of the same kind and level wasn't necessary for reproductive (and therefore evolutionary) success in most species.  This is so basic that I feel foolish spelling it out like this, but there it is.  Insofar as human activity has changed the environment, we have affected natural selection -- but we haven't bypassed it, let alone eliminated it or triumphed over it.  (For example, if our invention and use of antibiotics has led to the emergence of resistant strains of microbes, that's natural selection in action.  Scientists weren't trying to produce resistant strains; the strains were an unintended and unwelcome outcome of their work.)

The Independent quoted a grumpy geneticist on Crabtree's papers:
“At first sight this is a classic case of Arts Faculty science. Never mind the hypothesis, give me the data, and there aren’t any,” said Professor Steve Jones, a geneticist at University College London.

“I could just as well argue that mutations have reduced our aggression, our depression and our penis length but no journal would publish that. Why do they publish this?” Professor Jones said.
Notice that, contrary to Professor Jones, Crabtree isn't "Arts Faculty."  Like Jones, he's a geneticist, the head of a laboratory at Stanford Medical School that studies Developmental Genetics, Chemical Biology, and Chromatin Regulation. In fact it's usually faculty in the humanities who criticize this kind of biological reductionism.  As Mary Midgley wrote in Evolution as a Religion (Methuen, 1985): 

The effect [of specialization] is to leave many of today’s physical scientists rather unpracticed in general thinking, and therefore somewhat naïve and undefended against superstitions which dress themselves up as science.  Creationism, for instance, cuts no ice at all with humanists and social scientists.  Nobody trained to think historically is in any danger of taking it seriously, least of all theologians.  It makes its academic converts among chemists and physicists – sometimes, alarmingly enough, even among biologists [24].
But Jones is right that Crabtree doesn't seem to have any data aside from some irrelevant (at least, their relevance isn't evident) calculations of the frequency of malign but unknown mutations that might affect human intelligence.  There's nothing necessarily wrong with putting out untestable speculations, but they don't constitute confirmation or proof of anything.

So why do "they" publish this?  I wonder that myself.  But it's easy to see why it got so much attention.  The thesis is a popular one among social Darwinists, who like to think that the race has gone soft due to luxurious living, and the Stupid are inheriting the earth, instead of their own superior selves. Which takes me back to the quotation above: With incompetence comes the illusion of superiority.