
Wednesday, June 4, 2014

Against Nature

A reader wrote in to correct something I wrote in this post, that "this cultivated nostalgia for a carefully modified-and-tamed-by-humans Nature is an artifact of modernity.  It's a luxury we moderns can indulge because we can keep Nature at bay.  (Most of the time, anyway.)"  My reader pointed out "the Roman poets and their longing for simple country life," and I sit corrected.

I should also have remembered Socrates, who, according to Plato, reacted against a similar romanticizing of nature even before the Romans.  In the Phaedrus, Socrates says that "the country places and the trees won't teach me anything, but the people in the city do."  The trick to getting him out of the city, he continues, is to dangle a book in front of him, as people dangle carrots in front of hungry draft animals to get them moving.

I noticed this too when I read The Tale of Genji about a decade ago.  It's a vast, thousand-year-old novel about Japanese court life.  The title character likes to send flowers on branches, wrapped in poems of his own composition, to the ladies he pursues.  (And rapes, as often as not.)  But when Genji goes out in the rain or snow to collect these romantic gifts, he is wrapped in oilcloth against the elements.  It's his servant, less well covered, who does the work of breaking the branches off their trees.  Japanese culture is famous for its aestheticization of nature, but I noticed that "nature" in Genji's day was something to be cut up and gift-wrapped.  Just like my co-worker, taking her backlit e-reader with her when she goes camping to commune with nature.  So I admit, this fetish isn't a product of modernity.

This might also be the place to mention a couple of related things I've read lately.  Today I noticed a collection of C. L. Moore's Jirel of Joiry sword-and-sorcery stories.  The stories themselves date back to the 1930s, but the collection was published in 2007, with an introduction by the science-fiction writer Suzy McKee Charnas.  I liked some of what Charnas had to say, but much of it baffles me.

For example, she discusses the popularity of "two-fisted action" in pulp writing of the early twentieth century, and contrasts it with literary fiction of the period:
Meanwhile back at the library, the stuff called “literature” in the United States was dominated by people like Ernest Hemingway and F. Scott Fitzgerald, stylists of a terse, “masculine” mode touted as the truest voice of serious American writing.  This hard, stripped-down style was clearly intended to challenge the more ornate, emotional, and melodramatic style of novel that had been popular (especially with middle-class women) for generations – think Dickens, and you’re definitely in the ballpark.  World War One had a powerful effect in concentrating the cultural mind on consensual reality.  After all that killing and dying in reality, mere “fancy” had come to be considered childish and insignificant in literary quarters.  Or, worse, decadent (code for – gasp! – homosexual) [12].
First, Hemingway I can see (though he learned his "terse, 'masculine' mode" from the much butcher Gertrude Stein), but Fitzgerald?  His writing is only "terse" compared to, say, Dickens.  I just took another look at the opening pages of This Side of Paradise, thanks to Project Gutenberg:
When Amory was five he was already a delightful companion for her. He was an auburn-haired boy, with great, handsome eyes which he would grow up to in time, a facile imaginative mind and a taste for fancy dress. From his fourth to his tenth year he did the country with his mother in her father's private car, from Coronado, where his mother became so bored that she had a nervous breakdown in a fashionable hotel, down to Mexico City, where she took a mild, almost epidemic consumption. This trouble pleased her, and later she made use of it as an intrinsic part of her atmosphere—especially after several astounding bracers.
Young Amory sounds more like Truman Capote than Conan the Barbarian.  I also know from reading Ann Douglas's The Feminization of American Culture (Knopf, 1977) that nineteenth-century American male writers were obsessed with writing what Julian Hawthorne called "Man-books."  Herman Melville, says Douglas, "regarded the reception of his books as a test which would ascertain what genuine masculinity, or, as he tacitly defined it, what health and independence of mind, remained in American culture" (296).  "Over and over, Melville assures us that he will 'set forth things as they actually exist'; he writes to correct 'high-raised romantic notions' about life at sea; no 'sentimental' illusions motivate him; he will give us 'facts'" (301).  Self-conscious striving after "literature" by American males seems to go along quite often with masculine anxiety and homosexual panic; it's not specific to any particular period.

About Jirel of Joiry, whose adventures take place in a version of medieval Europe, Charnas goes on to say that of course both her parents are dead by the time she's an adult, because "In such a violent age if you reached maturity your parents were likely dead, leaving you to replace them as you were meant to" (17).  True, the feudal period was violent, but violence wasn't the only thing that killed people then.  Plague swept through Europe periodically, decimating the population, and women often died in childbirth.  Also, says Charnas, "The feudal, rural France of 'Joiry' is nothing like the Renaissance world of reason, light, and beauty we’re familiar with from history" (15).  Renaissance Europe was still a violent, dirty, stinking, plague-ridden place, with "reason, light, and beauty" kept within strict limits.  Charnas seems to be relying on some outdated histories here, which depicted the medieval period as much 'darker' than it was, and the Renaissance as much 'lighter' and more rational.

I've just begun reading George Sturt's The Wheelwright's Shop (Cambridge UP, 1923), several months after I learned about it from David Ellis's Memoirs of a Leavisite.  Ellis accuses Sturt of romanticizing the old ways of working and living, but this seems at odds with this lovely passage:
The shop was still but half opened when the two front doors had been unfastened.  On either hand was a window, shuttered at night with two shutters put up from within and then fixed with a wooden bar.  When the shutters had been taken down from the windows there was nothing to take their place.  Snow, freezing wind, had a clear run.  With so much chopping to do one could keep fairly warm; but I have stood all aglow yet resenting the open windows, feeling my feet cold as ice though covered with chips.  To supply some glass shutters for day-time was one of the first changes I made in the shop.  Nowadays, when all the heavy work is done by machinery, men would not and probably could not work at all in such a place; yet it must have sufficed for several generations.  My grandfather and my father had put up with it, and so did I until the winter came round again and the men began to ask me for sundry small indulgences, of which this was one.

Six o’clock in the morning was well enough in the summer; none the less I liked the dark winter mornings better.  Truly they were dark!  At that time the Farnham Local Board, caring nothing for working-class convenience and caring much to save money, had all the street lamps in the town put out at midnight.  The result was that, in the depth of winter, every man who went to work at six in the morning, and most artisans did, had to find his way without any light.  To be sure, there were moonlight mornings.  Sometimes, too, snowy roofs showed clear enough under glittering starlight.  But, on the other hand, there was freezing fog, there was the blackness of dense rain.  One foggy morning I lost my whereabouts in the familiar street; no building could be seen nor any sky distinguished; nothing but a slight difference in the feel of the pavement under my feet told me that I was passing So and So’s shop.  Another time a little glimmering light that met and passed me proved to be a lighted candle-end between the fingers of a chimney sweep, against whom one might otherwise have uncomfortably blundered.  And one black morning I walked through and was conscious of what I took to be the aura of a man on the pavement whom I never saw – probably a motionless policeman [13-14].
Lately I've been thinking often about what life was like in the days before electric light, about town streets -- let alone country roads and paths -- at night; or what it would be like to work in a place like the shop Sturt describes.  Sturt gives a striking picture of that time, and he doesn't seem unduly nostalgic about it.  Well, I've only read the first chapter or so, but I suspect I'll find more nuance here than Ellis allowed.

Monday, April 22, 2013

Not Much of a Difference Between a Bridge and a Wall

I was out of town all weekend, from Friday morning to late Sunday afternoon, and I had no time to write.  I'm still getting caught up, but I'm also reading I Am Your Sister, a posthumous collection of Audre Lorde's essays and speeches.  Lorde was one of the great teachers, and as I read these pieces I'm inspired all over again.  That's especially valuable now, when I've been wading through acres of sloppy sludge on Facebook -- I did have time to scan my news feed over the weekend -- that is equally offensive intellectually whether it comes from liberal, radical, or reactionary sources.  By contrast, Lorde's voice on the page -- poetic, exact, and often angry without being panicky -- is like a satisfying meal after an exhausting day.  I don't always agree with her, just most of the time.

For example, from "The Transformation of Silence into Language and Action":
And where the words of women are crying to be heard, we must each of us recognize our responsibility to seek those words out, to read them and share them and examine them in their pertinence to our lives.  That we not hide behind the mockeries of separations that have been imposed upon us and which so often we accept as our own.  For instance, "I can't possibly teach Black women's writings -- their experience is so different from mine."  Yet how many years have you spent teaching Plato and Shakespeare and Proust?  Or another, "She's a white woman and what could she possibly have to say to me?"  Or, "She's a lesbian, and what would my husband say, or my chairman?"  Or again, "This woman writes of her sons and I have no children."  And all the other endless ways in which we rob ourselves of ourselves and each other [42].
I think I've always known this, since as a child I sought out the stories of people different from me as well as those like me.  Rather than hoping to find myself in one person, my twin or clone, I learned to treasure the similarities I found piecemeal in many characters: this interest here, that trait there. I like to think Lorde would have been as put off as I was by the gay male filmmaker who talked as if he believed that we can only respond to characters exactly like ourselves.  That belief is kin to the idea that men can get nothing from women's writing, that straights can't identify with gay characters, that whites can't identify with blacks, and so on.  To me, the stories of people unlike myself have always been invitations to come in and visit and learn.  I'm always baffled when other readers see such stories as forbidding, unwelcoming.  I suspect they're projecting: because they don't want to read these stories, they fantasize that the stories don't want to be read by them.

Not that Lorde accepted all differences.  (Nor does anyone else, really.)  She was sharply critical of those who made excuses for denying the humanity of others, whether they were white feminists who didn't want to have to "deal with the harshness of Black women" (Lorde, Sister Outsider, 126) or black men who didn't want to have to deal with the harshness of black women.  In a scathing response to black sociologist Robert Staples' 1979 attack on black feminists, Lorde wrote:
The lack of a reasonable and articulate Black male viewpoint on these questions is not the responsibility of Black women.  We have too often been expected to be all things to all people and speak everyone else's position but our very own.  Black men are not so passive that they must have Black women speak for them.  Even my fourteen-year-old son knows that.  Black men themselves must examine and articulate their own desires and positions and stand by the conclusions thereof.  No point is served by a Black male professional who merely whines at the absence of his viewpoint in Black women's work.  Oppressors always expect the oppressed to extend to them the understanding so lacking in themselves [46].
The last sentence of that paragraph is familiar to me too.

Among her most thrilling insights were those about difference.  As she told students at Hunter College,
It is within our differences that we are both most powerful and most vulnerable, and some of the most difficult tasks of our lives are the claiming of differences and learning to use those differences for bridges rather than as barriers between us [201].
That's not easy to do, of course.  There's a fascinating scene in the documentary Audre Lorde - The Berlin Years 1984 to 1992 in which Lorde tries to talk to a couple of Afro-German men in her audience.  But one of the things she worked at during her too-short life was learning how to use differences as a bridge rather than a barrier, and this more than anything else is something I need to learn from her.

Friday, January 25, 2013

Potatoe

I've been rereading Harvey A. Daniels's Famous Last Words: The American Language Crisis Reconsidered (Southern Illinois UP, 1983), which I've quoted before.  This time I want to pass along another depressing bit, where Daniels quotes an article by "a Chicago City College teacher" for the Chicago Tribune, "bracketing testimony about his own nobility with dozens of student errors" (232).  Here's one of the examples of student stupidity offered up:
I was born in the state of Mississippi, where I started school in the south the teacher didn't teach much about writing the little I know I learn in high school where I attented here which isn't very much, you see.
Daniels remarks:
One wonders what Mr. de Zutter's students thought when they saw their own writing held up as examples of stupidity (and hilarity, according to the Professor's accompanying commentary) for the amusement of the million or so readers of the Tribune.  What did these students say to Professor de Zutter the next time they came to class?  What did he say to them?  How, exactly, was the work of the class advanced by the public ridicule of the students' efforts?  What about the student who had already confessed that what he knew about writing "isn't much, you see"?  What was the purpose of humiliating someone already so humble?

Professor de Zutter, and all of his colleagues around the country who have written or abetted articles in this vein -- exposés that depend on reproducing the worst sentences from the clumsiest essays of the weakest student -- have demonstrated something worse than student illiteracy: they have confirmed their own incompetence.  Teaching anyone anything requires a modicum of trust -- teaching writing requires perhaps more than teaching any other subject.  The writing instructor who is filled with a perpetual sense of outrage over his students' inadequacies, who is obsessed with the shortcomings of their past training, who loathes their attitudes and tastes, who actively expects to despise both the form and content of everything they say or write, who feels that such work is beneath his dignity, who wants to get out of teaching writing, who is so contemptuous of his students' morale -- such a person will never teach anyone to write.  His students may learn to hate their teacher, which might be appropriate.  But they will probably also learn to detest writing, which is a sad and unnecessary waste [232-3].
Daniels goes on to point out that this contempt for students is, "so far as I can tell almost unheard of among elementary teachers" (233).  Not always, though: I've long been haunted by this anecdote from Colette Dowling's The Frailty Myth: Women Approaching Physical Equality (Random House, 2000).
The lack of encouragement girls perceive often comes in the form of criticism and even verbal abuse from coaches and parents. In tee ball, a baseball-type game for five- and six-year-old boys and girls, coaches and parents make it clear that they’re not overly interested in the girls who play. And it isn't because they can’t throw. Girls who display strength, power, or physicality when interacting with boys run the risk of being marginalized. “I know the boys don't like it that I run faster than they do,” said Amanda, the fastest runner in seventh grade. “But I do.”

... Worse, the coaches didn’t take their young charges seriously, not even the female coaches. “None of the girls want to be there,” one coach told Landers.* “Not one. If I put a coloring station in the corner, every girl would be there. ... Dads take their sons out to throw with. Girls stay inside and play dolls.”

... In the Georgia tee ball scenario, the coaches reprimanded the girls more frequently and more harshly than they did the boys – and often for the same behavior that was accepted in boys. When they weren’t criticizing individual girls, they were projecting global images of girls as incompetent. When one of the male coaches dropped a ball he was trying to throw, he said, “Look, I throw like a girl.” Apparently he felt so embarrassed, he was compelled to turn the scene into farce: “He contorted his arms and awkwardly threw the ball toward Helen, [a child] who was standing in front of him.” This grown man deflected attention from his own gaffes by bringing one of the girls onto the scene as “a prop for depicting her own incompetence,” Landers wrote in her field notes.

This male coach might have done better to sharpen his own skills, but instead he used the girls to gloss over his clumsiness. After dropping the ball another time, he said to one of the girls, “I’m a little girl. I can’t catch the ball.” Such mockery obviously sends a powerful message to girls -- not only about their abilities, but about their very worth. This coach was telling his girls they were inferior, weak, and not to be confused with strong, powerful boys. The brainwashing worked. The coach's treatment of them and the unwelcoming attitude of the boys on the team contributed to the waning interest of girls during the season [90-2, emphasis added].
When I read this, I imagine a scenario in which the parents present surrounded the coach, and when they dispersed, he was gone but for a little greasy splotch on the grass.  But it seems that the parents didn't object: they wanted the girls out of the program as much as the coach did.

It also occurred to me that the teaching of sports to children provides evidence of how social construction works.  If a little boy can't throw or catch a ball, adults will expend quite a bit of energy to teach him how.  It will take no little determination on the boy's part to get the adults to leave him alone, if he isn't interested in learning this skill.  If a little girl can't throw or catch, adults will throw up their hands and say, "What can you expect?  Girls just naturally can't do this, and they don't want to."  If a girl persists anyway, more pressure will be applied to drive her away, and it will probably succeed if she doesn't have adult allies.  There will be general agreement that it is nature, not nurture, at work in this sorting process.

This may relate to what I wrote yesterday about moral absolutism and "grounds" for criticizing it: if a society doesn't want little girls learning to throw a ball, I may indeed have difficulty answering the claim that they shouldn't.  But absolutists often try to justify and rationalize their dogmas: not only shouldn't little girls throw a ball, they naturally don't want to and naturally can't.  This sort of claim can be rebutted and refuted.  (Remember too that the coaches who tried to drive girls out of tee-ball league didn't invoke religion; religion isn't the only vector for moral absolutism.  Whatever people want to believe will, however, sooner or later find its way into religion: It's not I who say this, it's God, and you cannot go against the word of God.  This move is only effective if you want to be fooled by it, but many people do.)

It then becomes evident that the desire to humiliate students who don't measure up to the instructor's standards is not just a personal neurotic quirk: it functions, often with social support, to weed out undesirables, students who grew up in areas with inadequate schools and parents who were unable for whatever reason to take up the slack.  (The title of this post comes from an incident that is probably mostly forgotten now: in 1992, then Vice-President Dan Quayle attended an elementary-school spelling bee.  When one of the students spelled "potato" correctly, Quayle intervened to get him to add an "e."  If one comes from a well-to-do white Republican family, poor spelling and grammar skills need not interfere with one's ability to rise in society.)

In one respect, I'm all for people who flaunt their punctuation, spelling, and grammar intolerance openly: it makes it easy to identify and pick on them.

*Dowling is referring here to Melissa A. Landers and Gary Alan Fine, “Learning Life’s Lessons in Tee Ball: The Reinforcement of Gender and Status in Kindergarten Sport,” Sociology of Sport Journal 13 (1996): 87-93.

Wednesday, January 9, 2013

Groping the Truth

Then my friend the ambivalent Obama supporter posted this quotation from Thomas Paine:
Reason and Ignorance, the opposites of each other, influence the great bulk of mankind. If either of these can be rendered sufficiently extensive in a country, the machinery of Government goes easily on. Reason obeys itself; and Ignorance submits to whatever is dictated to it.
Reason and Ignorance are not opposites. Everybody is ignorant of far more than he or she knows.  And as someone else said, the trouble isn't that people are ignorant, it's that they know so much that isn't so.  Anyone who fancies him- or herself rational and free of ignorance will take a tumble in no time. As Jean-Paul Sartre wrote,
The rational man seeks the truth gropingly, he knows that his reasoning is only probable, that other considerations will arise to make it doubtful; he never knows too well where he's going, he is "open," he may even appear hesitant. But there are people who are attracted by the durability of stone. They want to be massive and impenetrable, they do not want to change: where would change lead them? This is an original fear of oneself and a fear of truth.  And what frightens them is not the content of truth which they do not suspect but the very form of the true -- that hinge of indefinite approximation.  It is as if their very existence were perpetually in suspension. They want to exist all at once and right away.  They do not want acquired opinions, they want them to be innate; since they are afraid of reasoning, they want to adopt a mode of life in which reasoning and research play but a subordinate role, in which one never seeks but that which one has already found, in which one never becomes other than what one originally was ...
The honest person knows that she's ignorant. That is the rationale for freedom of speech and debate: the fact that no one is free of ignorance, and so no one can decide infallibly in advance what opinion or belief should be suppressed.

Monday, January 7, 2013

Intellectual Culture

I'm reading Noam Chomsky's latest book of interviews with David Barsamian, Power Systems, just published this month by Metropolitan Books.  As usual, there's plenty of good stuff in it.  For example, Barsamian asks Chomsky how he manages to read so much: "We're sitting in your office, surrounded by piles and piles of books.  How do you get through all this stuff?"
Unfortunately, I don't.  This is the urgent pile.  There are many more stacks elsewhere.  But one of the painful experiences which I try to avoid as much as possible is to calculate how much time it would take, if I read constantly, to go through them.  And reading a book doesn't just mean turning the pages.  It means thinking about it, identifying parts that you want to go back to, asking how to place it in a broader context, pursuing the ideas.  There's no point in reading a book if you let it pass before your eyes and then forget about it ten minutes later.  Reading a book is an intellectual exercise, which stimulates thought, questions, imagination [103].
That gave me a small thrill of recognition, from the piles and piles of books to the despair that follows from thinking about how long it will take to read them to the process of reading itself.  Chomsky goes on:
I suspect that will disappear.  You see various signs of it.  There has been a shift in my own classes over the past ten or twenty years.  Whereas I once could make casual literary references and people more or less knew what I was talking about, this is less and less true.  I can see from correspondence that people are constantly asking questions about something they saw on YouTube but not about an article or a book.  They very often rightly ask, "You said so-and-so.  What's the evidence for it?"  In fact, in an article I wrote the same week as that talk, there might have been footnotes and discussion, but it doesn't occur to them to look for that [103-4].
Oh dear.  I can't say for certain, of course, but I bet one reason why young people don't recognize Chomsky's "casual literary references" is that they come from works that were current when Chomsky was growing up, but which aren't current anymore.  The common culture that makes such allusions work is constantly changing over time, and always has been.  The only halfway stable body of literature that could have endured for more than a generation was the Greek and Roman canon, which was stable because it was closed, with no additions for over a thousand years.  But even in the days when a university education was built around that core, students were reading and writing new works in the vernacular, and they were as likely to be quoting soon-forgotten minor poets as the newer literary giants.  I know that today's undergraduates are still reading, but the works they know and quote are not any that Chomsky has had time to read.  I'm in the same boat: though I probably read more contemporary fiction than Chomsky does, I've read very little by the writers who are popular among twenty-somethings today.

"What does that mean for an intellectual culture, then?" Barsamian asks.
It's going to degrade the intellectual culture.  It can't help but do so.  It's a mixed story.  Take, say, electronic books.  They have advantages.  You have half a dozen books you can read on an airplane trip.  On the other hand, when I read a book I care about, I want to make comments in the margins, I want to underline things, I want to make notes on the flyleaf.  Otherwise I don't even know what to go back to.  You can't do that the same way with an electronic book.  Words just pass into your eyes.  Maybe they don't even stay in your brain [104].
And so on.  But wait -- write in a book?  I was brainwashed out of that from my earliest years as a reader, and I've never shaken the conditioning as an adult.  So I started by keeping notebooks, and eventually began writing notes on the computer as I read.  This has advantages, especially since it means I can search for "what to go back to" much more easily, and I can copy and paste what I've typed at will.  But I can't always write my notes when I'm reading, on the bus for example.  I used to see students in the university library making notes on index cards; now they're usually using their laptops.  From what I hear, e-books have made some advances, making it possible to make notes and bookmark passages.  People for whom that matters will adjust.

People for whom that matters, though, like Chomsky and me, are not typical readers.  We're atypical, probably even among people who've been to college.  The "intellectual culture," as I said, will adjust, even though I don't know exactly how.  Academics already have electronic tools for organizing material, which will filter down to non-academics who have use for them; ironically, the next group after academics will most likely be devout Christians who will use them for Bible study.  Older people, like Chomsky and me, will stick to the methods we know when we can't be bothered to adopt the newer ones.  (But look at Samuel Delany's remarks on hypertext, quoted here.)  While the number of people who read seriously, and the "intellectual culture" that includes them, may wax and wane, it would take an asteroid impact to wipe it out altogether.  Our rulers are ambivalent about literacy and intellectual work, but they need people who can do it to sustain themselves and their system.  This problem, I think, just constitutes one of Chomsky's blind spots.

Monday, December 31, 2012

"Tribalism" and the End of the Year Blahs ...

... to be followed, I suspect, by New Year blahs in short order.  I don't entirely know why I've slowed down in this last week of the year, though it's partly because of the heavy snow we got on the day after Christmas, and refreshed on Saturday and today.  No matter; it will pass.

I see that of the three big projects I listed in my agenda for this year, I didn't get to even one.  That's not terrible, because two of them had been on the back burner for several years anyway.  Maybe this year.

But one thing I can do, and that's to post a small supplement to my post on my unease with the word "tribalism," as used by left and progressive bloggers.  After I wrote it, I came across a book called Mistaking Africa: Curiosities and Inventions of the American Mind by Curtis Keim, published by Westview Press in 1999.  It's a reasonably short and accessible account of the ways Americans, white and to a lesser extent black, invent an Africa they want to believe in, rather than the Africa that is there. There's an entire chapter devoted to the word and concept of "tribe," and it confirms my sense that "tribal" and "tribalism," as used by white progressives, have racist overtones.

Keim writes that words like "nation," "people," and "tribe"
began to diverge in meaning in the late eighteenth century.  Europeans, who increasingly thought of themselves as more advanced than other peoples, needed words to distinguish themselves from others.  The word people retained its general usage, but nations came to be thought of as larger groupings with more complex social structures and technologies.  The word tribes was reserved for groups that were smaller and, supposedly, simpler and less evolved.  Our modern ideas about primitives and the Dark Continent emerged in the same era.  By the mid-nineteenth century, the word tribe had assumed a negative meaning that implied political organizations that were primordial, backward, irrational, and static.  A person didn't join a tribe; one was born into it.  People in civilized societies could actively select from among different, creative courses of action, but tribal people followed tribal customs without thinking.  It was indeed fortunate for tribes that they had such customs to guide their actions, because members were so limited intellectually.  Of course, "tribalism" was expected of such people.  In other words, to be tribal was to be genetically incapable of more advanced thought or political organizations.

In the twentieth century, the meaning of the word tribe, as it applied to Africa, developed in two directions.  The first, favored by white politicians and colonial administrators, was a variation of the nineteenth-century definition of tribes as having closed boundaries and unchanging customs.  This was administratively useful because it allowed colonialists to make sense of and create order out of the bewildering variety of African political organizations ...

The second direction in the development of the word tribe was favored by anthropologists.  In the 1920s, anthropologists began to live with Africans and to take their day-to-day lives more seriously.  Their experiences revealed that the nineteenth-century definition of tribe was deeply flawed.  They found that tribal peoples were neither unthinking nor less evolved than Westerners.  And they learned that tribes were constantly changing and adapting, just as were their own societies.  Anthropologists have sometimes been called servants of colonialism, because they provided the information and categories necessary to organize African people.  Although this negative label has some validity, it is also true that anthropologists were among the first to recognize that African complexity was creative rather than irrational and chaotic [99-101].
This history explains why the use of "tribal" as a putdown bothers me: it draws on nineteenth-century racist notions of "natives" as less "evolved," less intelligent and rational, and unable to think for themselves, but also as a different kind ("race") of person.  It also implies that non-tribalists are more evolved, smarter, and bold independent thinkers, though the leftish writers who use "tribal" this way would be properly contemptuous of the Europeans who thought they were a better breed than the dusky-hued people they tried to subjugate.  (And these same leftish writers are anticolonialist and anti-imperialist to a man.)  But this only shrinks the circle of evolved, intelligent, independent thinkers; it doesn't abolish the circle altogether.  There's still a "tribe" of Real Smart People in there, smug in their disdain for the lesser tribes.

(There's another side to this: I've noticed some intelligent American Indian writers and scholars who seem to accept the notion of tribes as timeless, ahistorical, uniform, and unchanging in their customs because the Elders know best.  None of them, I think, would actually want to live in such a society, but it's a comforting fantasy.)

I admit that I've been susceptible to this same tendency, especially as a lonely young bookworm and sissy.  It was very comforting to think of myself as part of an elite corps of Homo Superior, just waiting for our moment to emerge from the shadows and manifest our glory.  It's a tendency I've worked to get free of, though I don't think it's possible for a human being to be completely free of it.  We're a social species, and there will always be Usses and Thems in our lives.  The question is how we treat the Thems.

I also admit that the use of "tribal" by these writers probably hurts no one.  It disturbs me mainly because it looks to me like a faultline in their politics and their thinking.  It disturbed me, but entertained me even more, when one of the writers in question and his Posse defended his use of "tribal" in the same way overt racists defend their practice: by accusing me of petty Political Correctness and Thought-Police tendencies, plus of course they couldn't have said anything racist, it's all in my imagination.  The writer's other line of defense was almost as funny: he cited the writer from whom he'd picked up this sense of "tribalism" as Authority.  (It's on the Internet, so it must be true!)  I consider this to be further evidence of how such groupthink erodes one's ability to think.

There are other words that could be used instead of "tribalism."  "Groupthink," for example.  Or more specific ones, like "party loyalty."  There are surely others.  But it's so much easier to have a handy catchphrase, not too vulgar, for flogging one's ideological enemies.

Wednesday, December 19, 2012

Small Changes, Very Small

I'm almost done rereading Marge Piercy's third novel, Small Changes, originally published in 1973.  I first read it in the mid-1970s.  It's a long complex book, built around two main characters but with many other secondary ones.  (Two important male characters are Vietnam veterans, interestingly.)  It's typical now to dismiss it as one of the feminist classics of the 1970s, and many contemporary readers claim that it's dated.  I don't think so.  Relations between women and men have changed somewhat since the book was written, and middle-class women are more likely to have careers outside the home than they were previously.  But men still expect emotional and other service from women, and women still find it difficult to refuse it, let alone demand emotional and other service from men in return.  And there's still reaction, a powerful movement to return us all to the Fifties, though the Fifties were an aberration, a diversion from tradition rather than Tradition itself.  We're not likely to go back to the days when many women could stay at home as housewives because most men can't earn enough to support a household by themselves.  It wasn't feminism that destroyed the "traditional" family, it was capitalism and the Reagan administration.

But although feminism is one of the core themes of Small Changes, it's about other things too.  It's partly an account of reaction against the Sixties, including state repression that most Americans paid no attention to at the time.  One of the characters must testify before a grand jury empaneled to destroy the resistant social movements that had scared our rulers so badly.  This was officially justified, insofar as anyone bothered, because of the violent segments of those movements: the Black Panthers, the Weathermen.  (Of course, the far greater State violence, against American citizens as well as foreigners, was of no great concern to most citizens, even after it was exposed.)  As one character explains:
It isn't like a regular jury.  They're older, richer white people always, and they even run a credit check on them.  They're to bring in indictments that the government wants.  Ninety-five per cent of the time that's what they do -- agree to prosecute.  But Anita says that mainly they're a device for investigating.  They're a way of forcing people to testify against each other.  The F.B.I. can't force you to talk, the police can't, though they try.  But a grand jury can send you to jail if you don't answer their questions [505].*
And Wanda, the character called before the grand jury, explains:
"This is a fishing expedition, Beth.  The questions they want to ask me aren't just about deserters.  They're asking about the whole web of connections in the movement.  They want to know who's friendly with who, who lives where, in what commune, who sleeps with who, who gives money.  They want to fill in all the missing connections so that people can't disappear underground any more."

"But they must know all that anyhow, they have all the phones in the country tapped!"

"Don't exaggerate, Beth.  They don't know everything.  Even when they 'know,' half of what they know is garbage and they want confirmation.  Look, the questions they're asking me include every address I've lived at since the year zero.  Who else lived there?  What vehicles have I owned and who borrowed them?  Who was at the women's conference last year in Boston and where did they stay?  How did I travel from Boston to New York for the health care conference that summer?  What meetings did I attend in that fall that plotted riots, demonstrations, or street actions -- all lumped as one! ... [507].
Yes, the novel is often didactic, but no more than any Robert Heinlein book.  No wonder that I, who grew up loving Heinlein, also love Piercy.  Heinlein's advocates have often pointed to his highly intelligent and competent characters as a feature of his work that sets it above most (especially mundane) fiction.  Piercy also focuses on intelligent competent characters, with the difference that she doesn't set them against straw opponents they can easily beat.

Things haven't changed all that much in American politics; we now have the Patriot Act and the surveillance state, and probably all our phones and e-mail are tapped now.

Another key topic is computers.  Miriam, one of the two principal characters, moves from mathematics to computer science.  This is in the late 1960s, when computers were mainframes that filled whole rooms, and programs and data were input on punched cards, before desktop computers much less tablets, before the Internet and WiFi.  Miriam gets a job with a small research firm on Route 128 near Boston, the only female programmer in the company.  (Come to think of it, Miriam is a lot like Heinlein's smart, sexy, male-identified heroines, though again, she lives in a different universe.  But then Heinlein's men are also fantasy figures.)  Their big contract is writing software for the anti-ballistic missile program that the Nixon administration backed at the time.  Miriam gets shuttled onto that team, which she hates.  There's a scene where she meets her old MIT professor Wilhelm Graben, who scoffs when she complains about being "hired by death."
"What nonsense.  It's scuttlebutt that the system will never operate.  For instance, assuming that the computer technology works out perfectly -- for the first time in history, perhaps -- but assuming that, the crucial interface is that combination of radar equipment and computers.  Now there cannot possibly in any real war situation be enough time to discriminate the raw data in the radar inputs about what would be real missiles and what simply extra junk floating about.  All weapons are programmed to fill the heavens with large quantities of objects that induce noise in the radar.  Chaff for instance -- aluminum foil -- reflects radar beams quite nicely and produces responses that would indicate serious objects approaching.  The electronic countermeasures will be jamming, with strong transmission at the same frequency that the radar operates on ...

"But forget military questions.  Think about the most amusing aspect of the ABM.  Now you are exploding, say, a five-megaton warhead to knock down a missile no bigger than a barn.  Obviously, this does not require a warhead of five million tons of TNT -- Hiroshima was destroyed by twenty thousand tons' equivalent.  Thus you can deduce that accuracy is simply not in it.  They're figuring on exploding a five-megaton bomb to knock down a missile because they are not counting on being in the same state with it -- states imagined to be lines superimposed upon the air ... "  His voice was calm and mocking.  She grew colder and colder ...

"Defense, it's called.  The rhetoric of defense, of course, is that human beings are being defended.  But the type of weaponry we are discussing is absolutely useless in defending human beings.  It would make little difference, I imagine, to someone on the ground whether he was fried alive because an enemy missile exploded twenty miles to his right hand, or because one of 'his' missiles exploded as far overhead.  The temperature on the ground would instantly rise several hundred degrees, in either case.  Defense is defense of missile sites, not of people or the landscape, which would be eliminated ... But you must admit, the rhetoric with which American politicians address their constituencies about defense spending is amusing" [379-380].
I'd forgotten this passage when, sometime in the early 90s, I leafed through the book looking for another passage I remembered.  The same weapons systems, with super space lasers added, were floated again in the 1980s as the Reagan gang's Strategic Defense Initiative or SDI, commonly derided as "Star Wars."  There have been occasional attempts to resurrect the program under Clinton and perhaps under Bush, even though the same criticisms Piercy put into Graben's mouth still apply.  What counts is pouring millions and billions of dollars into high-tech industry, not "defense ... of people or the landscape."

There's a lot of quotable, still-relevant material in Small Changes -- I haven't come close to exhausting it here.  On one hand, while there's nothing wrong with books that focus on stereotypical concerns of women like domesticity and romance, concerns that still dominate what is derisively called "chick lit," it should be clear that Piercy's interests and ambitions include those concerns but aren't restricted to them.  Much as I love, for instance, Jennifer Crusie and her smart heroines, none of her books are as large in this sense as Small Changes or most of Marge Piercy's novels.  But then, the same is true of most male writers today.  I think that the reason many people try to dismiss Piercy's work as dated feminist cliches is that her left politics disturb them at least as much as her critical take on men in relation to women.

*Quoted from a Fawcett-Columbine trade paperback reprint from (I think) 1997.

Tuesday, November 13, 2012

I Me Mine

I saw the above meme today, and was struck by how obviously false both versions of the platitude are.  Good things come to those who work their asses off and never give up?  Someone is not living in these United States.  This is Social Darwinism: if you don't get any good things, obviously you haven't worked hard enough.  Those who have lots of good things deserve them, because they worked hard for them.  If Mitt Romney is worth a thousand times as much as I am, it's because he worked a thousand times harder.

The person who shared this meme is also a devotee of Soka Gakkai Buddhism; in addition to generic motivational memes she posts daily inspirational quotations from Soka Gakkai International's schismatic leader, Ikeda Daisaku, which are at best noteworthy for their vacuousness.  At times, though, something like the above sentiments peeks through.  I don't know enough about Soka Gakkai to make any broad generalizations about the movement, but I have noticed that Westerners will often embrace teachings from Eastern religions that they would scorn from Western ones.

Which reminded me of something Vijay Prashad wrote in The Karma of Brown Folk (Minnesota, 2000).  Here's my inspirational quotation for the day.
In 1967, during the Summer of Love, Maharishi Mahesh Yogi gave a revealing press conference in New York City. "The hungry of India, China, anywhere," he noted, "are lazy because of their lack of self-knowledge. We will teach them to derive from within, and then they will find food." Four months before the World Food Conference in Rome in November 1974, the CIA noted that because of grain shortages induced by the shifting cultivation patterns due to the uneven terms of trade "Washington would acquire life and death power over the fate of the multitudes of the needy." These details did not enter the worldview of the guru, who was content with the imaginary freedom for sale to the disenchanted bourgeois. Some reporters found the Maharishi's statement to be unacceptable, and one asked, "Do we have to ignore the poor to achieve inner peace?" The Yogi answered, "Like a tree in the middle of a garden, should we be liberal and allow the water to flow to other trees, or should we drink ourselves and be green?" "But isn't this selfish?" "Be absolutely selfish. That is the only way to bring peace, to be selfish, and if one does not have peace, how is one to help others attain it?" [Deepak] Chopra is not very different. One might expect his Law of Giving to contain a call for charity, but his notion of a gift is more indulgent. "Wherever I go, and whoever I encounter, I will bring them a gift. The gift may be a compliment, a flower, or a prayer." These neither feed nor clothe anyone. Further, "I will make a commitment to keep wealth circulating in my life by giving and receiving life's most precious gifts: the gifts of caring, affection, appreciation and love." He will give important emotional gifts, but he will be ready to accept "all the gifts that life has to offer me." Of obligations, a classic liberal trope, we hear nothing. Of the poor, we get an idealized picture that rivals Mother Teresa for condescension.  
"On his many travels to India, [Chopra's son] Gautama has witnessed the harsh reality of the street children who have no belongings other than their beautiful souls. In India, even amidst the immense poverty and destitute conditions, one finds in the children no trace of violence, no hostility, no rage, no anger. There is a simple, sweet innocence even among the extremely impoverished." The poor cease to be human with the capacity to struggle and to aspire; they appear as contented people willing to sacrifice their material well-being for the spiritual happiness the bourgeois tourist wants them to enjoy. If the poor are unhappy, it ruins the tour as well as the image of the spiritual East [60-62].

Sunday, November 11, 2012

Big Brother Has Always Been Watching You

I'm reading Jon Wiener's 1991 collection of articles Professors, Politics and Pop (Verso), which I happened on recently.  Wiener is a historian and journalist, and I've read some of his pieces over the years, especially at The Nation.  Most of the pieces in this book were published in the 1980s, and as someone who lived through it, it's interesting to revisit that period.

For example, in a review of a book on the CIA and academia, Wiener writes:
A quarter of Winks' book is devoted to James Angleton, Yale '41, head of CIA counterintelligence during the 1950s and 1960s.  Angleton left the agency after the 1975-6 Church committee hearings revealed that he had headed CIA operation HT/LINGUAL, under which the agency had opened American mail since 1955 [and] kept a watch list of two million citizens, in violation of its charter.  Defending himself before the committee, Angleton said, "It is inconceivable that a secret intelligence arm of the government has to comply with all the overt orders of the government."  In spite of that, Winks can hardly restrain his admiration for Angleton, describing him as "incredibly attractive," the greatest mind in the history of counterintelligence and a "theoretician of human nature."  Yet Angleton worked for years with Kim Philby, had lunch with him weekly and never suspected him of being a Soviet spy.  Indeed, the failure of the head of US counterintelligence to suspect Philby led some to conclude that Angleton himself was a KGB mole; the CIA investigated him for two years before deciding that he wasn't [121].
What follows, including the dubious claims by a defector that there was a mole in the CIA, suspected by Angleton himself to be William Colby, makes me wonder if the CIA was wrong about Angleton.  Maybe he was a KGB mole.  But then I just recently saw both the TV and movie versions of John le Carré's Tinker Tailor Soldier Spy, about a Soviet mole in British intelligence, so I have moles on the brain.  But the most interesting thing to me is CIA surveillance of millions of American citizens over a twenty-year period.  A lot of people think that the Bush and Obama surveillance programs are something new.  It seems that whatever lull occurred after the Church committee's investigations was the real aberration.

One other item that was news to me, because I didn't know that anyone had numbers on this:
The Age of Reagan seems to be coming to an end on the nation's campuses: 37 per cent of the 1990 freshman class reported that they "participated in organized demonstrations" last year, according to an authoritative poll -- more than double the percentage of the late 1960s [156].
Noam Chomsky has long said that there is more activism on college campuses today than there was in the 60s, and my impression was that he was right, but it's good to have some evidence about the matter.  When students I worked with during the 90s told me that they'd missed the interesting times of the 60s and there was nothing happening politically, I would point out all the groups that were busy afflicting the comfortable at the time; many of them seemed to want to ignore that activism, though.

Also interesting is that according to Wiener, the demonstrations in which these kids participated weren't just for the hot-button issue of the period, South African apartheid, but about issues ranging from school dress codes to civil rights issues at home:
The dismissal of a black superintendent was the focus of the recent Selma, Alabama sit-in by black high school students.  In Los Angeles, 10,000 high school students took part in sit-ins and protest marches during a two-week teachers' strike in 1989, with many of the students actively supporting the teachers' demands -- the largest wave of student-led protests in the LA schools in twenty years [156].
There has always been more activism in the US than many Americans know about, and more than many want to believe.  At the same time that many adults complain about the (supposed) apathy of Kids Today, they also seem comforted by it, and relieved at the idea that the young have given up the idealism of the Sixties for the self-absorption of later decades.  This is clearly wishful thinking, based on the propaganda of corporate media that welcome and try to foster citizen apathy, and not on reality.

Wednesday, October 24, 2012

Space Bar

I'm feeling flustrated, distracted, and out of sorts today, so I'm just going to post this quotation from an interview (via) with Emma Donoghue:
"I got into all this doing a PhD in 18th-century literature, when I became interested in revisionism," Donoghue explains. "Who was left out of history? Well, primarily, women. But look at the history of everyday life and you find that most people are left out: the women lead you to their equivalently obscure male family members, then you come to the freaks and cripples and slaves – not just downtrodden, but treated as not fully real people. If you're writing a novel about Henry VIII, you don't have to say what you've fictionalised, because it's easy to check; Henry VIII doesn't need you to speak up about your sources. But Mollie Sanger, my doughty cowgirl – if I don't put her on the record she's not on it at all. And I'm so grateful to her and all of them for the good stories."
That improves my mood all by itself.  So, should I wait for her new collection of stories to come in at the library, or should I just go ahead and buy a copy?

Friday, October 5, 2012

Un Peuple en Diaspora

I haven't read much by the late Eric Hobsbawm, who died this week at the age of 95, but this excerpt from his autobiography, quoted at Jews sans frontieres, got my attention:
What exactly could 'being Jewish' mean in the 1920's to an intelligent Anglo-Viennese boy who suffered no anti-Semitism and was so remote from the practices and beliefs of traditional Judaism that, until after puberty, he was unaware even of being circumcised? Perhaps only this: that sometime around the age of ten I acquired a simple principle from my mother on a now forgotten occasion when I must have reported, or perhaps even repeated, some negative observations of an uncle's behaviour as 'typically Jewish'. She told me very firmly: 'You must never do anything, or seem to do anything that might suggest that you are ashamed of being a Jew.'

I have tried to observe it ever since, although the strain of doing so is sometimes intolerable, in the light of the behaviour of the government of Israel. My mother's principle was sufficient for me to abstain, with regret, from declaring myself konfessionslos (without religion) as one was entitled to do in Austria at the age of thirteen. It has landed me with the lifetime burden of an unpronounceable surname which seems spontaneously to call for the convenient slide into Hobson or Osborn. It has been enough to define my Judaism ever since, and left me free to live as what my friend Isaac Deutscher called a 'non-Jewish Jew', but not what the miscellaneous regiment of religious or nationalist publicists call a 'self-hating Jew'. I have no emotional obligation to the practices of an ancestral religion and even less to the small, militarist, culturally disappointing and politically aggressive nation-state which asks for my solidarity on racial grounds. I do not even have to fit in with the most fashionable posture of the turn of the new century, that of 'the victim', the Jew who, on the strength of the Shoah (and in the era of unique and unprecedented Jewish world achievement, success and public acceptance), asserts unique claims on the world's conscience as a victim of persecution.
Right and wrong, justice and injustice, do not wear ethnic badges or wave national flags. And as a historian I observe that, if there is any justification for the claim that the 0.25 per cent of the global population in the year 2000 which constitute the tribe into which I was born are a 'chosen' or special people, it rests not on what it has done within the ghettos or special territories, self-chosen or imposed by others, past, present or future. It rests on its quite disproportionate and remarkable contribution to humanity in the wider world, mainly in the two centuries or so since the Jews were allowed to leave the ghettos, and chose to do so. We are, to quote the title of the book by my friend Richard Marienstras, Polish Jew, French Resistance fighter, defender of Yiddish culture and his country's chief expert on Shakespeare, 'un peuple en diaspora'. We shall, in all probability, remain so. And if we make the thought experiment of supposing that Herzl's dream came true and all Jews ended up in a small independent territorial state which excluded from full citizenship all who were not the sons of Jewish mothers, it would be a bad day for the rest of humanity - and for the Jews themselves.
-- Eric Hobsbawm: Interesting Times: A Twentieth Century Life (Pantheon, 2002, p.24).
Just what I need: another oeuvre of a prolific writer to add to my piles of books to be read.  I probably don't have to read all his work, and I've already read The Invention of Tradition.  I'll probably start with Interesting Times.

Sunday, September 16, 2012

The Fantasy Police

While I was puttering around this morning after a late night, my eye lighted on Samuel Delany's On Writing (Wesleyan, 2005) and I decided to reread his remarks on Toni Morrison's novel The Bluest Eye.  That novel's premise is that a young black girl, living in Ohio on the cusp of the Second World War, becomes obsessed with the idea of having blue eyes and goes mad.  Quite mad.  I'm going to reread The Bluest Eye soon, I hope (I say that every time I think of Delany's critique, but this time I mean it), but my attention was caught today by this passage:
Morrison's novel aligns itself with the Fantasy Police.  Reading it, I find myself asking: What's wrong with wanting to be different from what you are?  The assumption that wanting to be other than you are means that you hate yourself is pathological and patently absurd.  A much clearer and more articulate argument might be posed that to desire effectively to be different, actually to expend energy to bring that difference about (to become surgically a woman if you are born a man; to become surgically a man if you are born a woman; to reconstruct your foreskin if you were circumcised before you could consent to it; to straighten your hair if you don't like it kinky; to wear blue contact lenses if you have brown eyes and dark skin; to wear dreadlocks if you were born with straight blond hair; to pierce, or tattoo, or decorate your body in any way at all; to exercise or diet to contour your body toward whatever ideal you set yourself) requires much more self-confidence and a clear sense of who you are than those who never question or wish to adjust their bodily reality at all [166-7].
That is a long sentence, so don't lose sight of its beginning: that a much clearer and more articulate argument to that effect might be posed.  I'm not sure I'd agree entirely with it myself, since I'm not sure what "a clear sense of who you are" is; I'd have to see a fuller articulation of it before I could decide.  But I still like his basic complaint.  Delany is black himself, and he's fully aware that "whatever ideal you set yourself" isn't necessarily a pure spark that comes out of a pure individual core: it's affected (though not completely determined) by the environment one grows up in.

I'd also argue that at least some decorations and other changes we may make to our bodies aren't seen as recuperating or expressing our original selves before our parents conceived us.  People often try out different looks, even selves, without necessarily thinking of them as their essential being.  Whether a given modification feels like an expression of Self depends on the individual: the same look that represents a move toward the ideal for one person might just be this week's fashion statement for another.  But I think of this as an extension of Delany's point, not a refutation: why shouldn't people try on different selves, experiment with body ornament, and so on?  I don't think those experiments are rendered invalid if they don't last a lifetime.

Delany continues a couple of paragraphs later:
The accusation of self-hatred, whether racial or sexual (and notice, it is only blacks, only gays, only Jews, only women who are ever accused of hating themselves -- never straight white Protestant males ...), once we are attuned to it, always carries clear and persistent overtones of sour grapes from the accusers.  ... This has a common name in discussions of liberationist political analysis: blaming the victim [167].
Again, I go only part way with Delany here.  For one thing, it seems to me that straight white Protestant males are accused of self-hatred, usually under the name "liberal guilt" or (formerly) "radical chic."  If a man sides with feminism, a heterosexual with gays, a white person with anti-racism, a Jew with Palestinian struggle, he or she will be accused of self-hatred, and of inauthenticity at the very least.  (You can see this in many attacks on straight white Jewish Noam Chomsky: because he isn't poor, his criticisms of the privilege of the wealthy are hypocritical -- but if he were poor, he would be accused of mere envy of what he hasn't got.)  A blond, blue-eyed person who cultivates dreadlocks will be accused of cultural appropriation, if not overt racism.  Which is wack, if not racist itself.

Still, Delany's basic point is I think a good one.  (There's more to his critique of Morrison that's relevant here, but I'm not going to comment more until I've reread her book.)  I've written before about the same pattern among gay men seeking authenticity either in effeminacy or in machismo.  I think the two positions cancel each other out, but are to be opposed whenever they rear their heads. 

Thursday, June 14, 2012

Words for What We Feel

Here's an example that shows why I keep up with Ruth Vanita's writing.  From the first page of her introduction to Queering India: Same-Sex Love and Desire in Indian Culture and Society (Routledge, 2002):
At a critical moment in Deepa Mehta's film Fire, Sita remarks to her lover Radha, "There is no word in our language to describe what we are or how we feel for each other."  To which language does she refer -- Punjabi, some variant of Hindi, Urdu, or, more likely, some combination of all three?  We do not know because on screen the characters speak English.  In this metonymic moment, two things happen: English is disowned as "our language" (even though Indians have been speaking English for two hundred years) and "our language" is framed as a catch-all unnamed Indian language that lacks any word for same-sex identities or relationships.

Sita's (Mehta's) comment reflects an idea dominant in academia today -- that prior to late-nineteenth-century European sexologists' and psychologists' inventions of labeled identity categories such as invert, homosexual, lesbian, and heterosexual, inchoate sexualities and sexual behaviors existed but were not perceived or named as defining individuals, groups, or relationships.  For those who accept this formula, it follows that in parts of the world where these categories are not yet widely known, people do not perceive even long-term sexual relations as significant markers of identity or personality.  Hence the tendency of queer theorists to avoid using words like homosexual to refer to persons or relationships in earlier periods of Euro-American history or in places other than the first world today.
... Except, that is, when they don't: I've shown before how people who accept this formula seem unable to use it consistently, talking about, for example, "homosexual subcultures" in times and places where by their own criteria, homosexual subcultures could not have existed.  Besides, as Vanita continues:
This formula has been challenged by several historians of Europe who have pointed to the use of terms like Ganymede, tribade, Sapphist, and even lesbian as early as (in one case) the tenth century, and certainly from the Renaissance onward, to mark individuals habitually given to same-sex sexual relations.  In the context of South Asia, Michael Sweet and Leonard Zwilling have demonstrated the formulation of sexual categories in Hindu and Jain texts as early as the sixth century B.C.E.; it is evident that the Kama Sutra (fourth century C.E.), while mentioning casual sexual relations between "men," also classifies men who prefer men as "the third nature"; and scholars of the medieval Islamicate have written both on male-male love and on the representation of female-female love by male writers.
The usual response to such pre-modern terms is that they don't correspond to homosexuality or gayness or lesbianism "as we think of it today."  This move fails because there is no single way of thinking about homosexuality today, even if the thinking is limited to the United States, and some current ways correspond reasonably well to pre-modern or non-Western conceptions.  Since the people who accept this formula never clarify what they mean by "homosexuality as we think of it today," and tend to assume that there is only one "modern," "Western" way of thinking about homosexuality, they end up confusing themselves and their readers.

On the following pages Vanita gives numerous examples of "words for what we feel for each other" in classical Indian literature, and remarks that she's convinced "that these and other such terms found by scholars represent the tip of the iceberg, and that further research in these and other languages will uncover many more such terms" (3).  She also points out that, contrary to faux traditionalists who try to represent homosexuality as a Western importation,
The rhetoric of modern Indian homophobia (with concepts and terms like unnatural and sinful) draws directly on a Victorian version of Judeo-Christian discourse; this borrowing is indicated in Fire when Radha's husband Ashok, having seen his wife and Sita in bed together, says, "What I saw is a sin in the eyes of God and man."  This is a direct quote from the Bible (the words of the prodigal son to his father, in Christ's parable of sin and repentance, in the Book of Luke).  This instance begins to explain why Mehta found it so hard to translate such phrases into Hindi, and decided to make the film in English; it would be almost impossible to literally translate Ashok's sentence into Hindi and have it sound convincing [3].
This indicates that the words Mehta wrote for Sita reflect her own ignorance, not the reality of Indian languages or cultures.  The claim that there are no words for what we feel is a common apologetic move in gay/lesbian fiction in English, and I suppose its function is to deny that the speaker was influenced by other perverts (who presumably would have taught terms as well as practices), and to claim that his or her desires and practices sprang spontaneously from his or her inner nature.  Generally, the people I've encountered who say this are quite aware of the "words for what we feel for each other," but (understandably) recoil from them.

Anyway, that's why I like Ruth Vanita's work.  She's aware of the complexities of human sexuality, and she avoids the oversimplifications (or obfuscations) that dominate the field these days without falling into an equally oversimple essentialism or anti-anti-essentialism.  It was a great relief to me, when I first read her and Saleem Kidwai's Same-Sex Love in India (St. Martin's, 2000), to find that I wasn't the only person who had noticed these problems in the prevailing approach to queer theory.  Vanita's Love's Rite: Same-Sex Marriage in India and the West (Penguin, 2005) is also worth the attention of anyone interested in this subject.

Saturday, April 21, 2012

I'll Show You the Life of the Mind

Today I picked up a copy of A. E. Housman's 1911 inaugural lecture as Kennedy Professor of Latin at the University of Cambridge.  Housman is probably best known to most people now for his verse, especially A Shropshire Lad (1896), which is a classic of love poetry between men.  But he was also a distinguished scholar of Latin and Greek literature, famous both for the quality of his work on the ancient texts and for his scathing reviews of other men's work that didn't come up to his standards.  I first got interested in that aspect of Housman when I encountered his line "Three minutes' thought would suffice to find this out; but thought is irksome and three minutes is a long time."  Some years ago I found a selection of his prose writings, and enjoyed them a great deal even though I'm not a classicist and know no Latin or Greek at all.  What made the articles interesting was Housman's prose style, but even more, his vast learning, his clear reasoning, and his humor.  When I found the copy of the inaugural lecture and saw that it had never been completely published before, I snapped it up.

The reason the lecture had not been published during Housman's lifetime, and not in its entirety until 1969, was that he had made some statements about a poem of Percy Bysshe Shelley's that he couldn't back up: he'd criticized Algernon Swinburne for gushing over the beauty of a line that, Housman said, was "the verse, not of Shelley, but of a compositor" (33).  The poem, titled "A Lament" by Shelley's widow for posthumous publication, seems not to have been completed.  Because he couldn't prove his claim, Housman wouldn't allow the lecture to be published, and it would have been lost if his brother Laurence hadn't saved a copy.  When it was first published, the section on Shelley's poem was omitted, but eventually the editor John Carter was able to verify from manuscripts that Housman was correct, and the lecture was published unexpurgated (The Confines of Criticism: The Cambridge Inaugural 1911 [Cambridge University Press, 1969]).

It would have stood up well without the discussion of Shelley's poem.  I noticed that Housman's epigram about three minutes' thought appears to be a sharpened paraphrase of a saying of Goethe's, which he quotes: "Thinking is hard, and acting according to thought is irksome."  (Denken ist schwer, nach dem Gedanken handeln unbequem [37].)  A bit later he talks about the way people think:
Men hate to feel insecure; and a sense of security depends much less on the correctness of our opinions than on the firmness with which we hold them; so that by excluding intelligence we can often exclude discomfort.  The first thing wanted is a canon of orthodoxy, and the next thing is a pope.  The disciple resorts to the teacher, and the request he makes of him is not tell me how to get rid of error but tell me how to get rid of doubt.  In this there is nothing new: 'as knowledges are now delivered', said Bacon 300 years ago, 'there is a kind of contract of error between the deliverer and the receiver.  For he that delivereth knowledge desireth to deliver it in such form as may be best believed, and not as may be best examined; and he that receiveth knowledge desireth rather present satisfaction than expectant enquiry.'  Blind followers of rules will be blind followers of masters: a pupil who has got out of the habit of thinking will take his teacher's word for gospel, and will be delighted with a state of things in which intellectual scrutiny not only ceases to be a duty but becomes an act of insubordination [40-41].
Not much has changed in the past hundred years -- or in the three centuries before that, as Housman says.  The lecture as a whole is a splendid argument for intellectual autonomy, its necessity and its rarity.

Tuesday, March 13, 2012

A Mad Tea Party

Okay, back to other matters. I've quoted before from Michael E. Staub's Madness Is Civilization (Chicago, 2011). Staub also tells how it became known in 1967 that
... a military psychiatrist named Lloyd H. Cotter had conducted behavioral experiments on patients diagnosed with schizophrenia at the Bien Hoa mental hospital in South Vietnam. Cotter’s stated goal was to help these patients, but it was impossible for critics not to read in Cotter’s report an utter disregard for the terror and suffering he must have brought them. In brief, Cotter had become enamored with “operant conditioning,” a technique devised by psychologist B. F. Skinner that suggested that “positive” and “negative” reinforcements could encourage individuals to give up behaviors deemed undesirable. Cotter had also read how psychiatrist O. Ivar Lovaas had withheld food from autistic children to promote a reduction in their antisocial behavior, and decided he would starve his own “difficult-to-activate patients.” Some patients held out for five days without food before they succumbed to Cotter, but the doctor was in the end able to announce that every single patient had finally “volunteered for work.” Yet this was not all. Cotter further administered to recalcitrant patients unmodified electroconvulsive treatments (a technique already illegal in the United States because it meant patients did not receive anesthetics or muscle relaxants and therefore ran a risk of fractured bones and compression injuries of the spine). Undeterred by the frank chance that he might injure his patients, Cotter announced that he’d given “several thousands” of these shock treatments, and that “our objective of motivating them to work was achieved.” And what kind of work exactly were these patients asked to do? It was growing crops for a deployment of Green Berets stationed nearby [124-5].
As I understand it, operant conditioning as B. F. Skinner developed it stressed rewards rather than punishments: Skinner regarded punishment as an ineffective way to change behavior. But regardless of what Skinner thought, Cotter was clearly more interested in the punitive side. What he was doing really sounds more like human experimentation, of the kind the Nazis were notorious for, than anything therapeutic.

Perhaps you'll argue that Cotter was a lone kook, a mad scientist doing his dirty work in the Vietnamese jungle far from responsible scrutiny, and that he therefore shouldn't be seen as representative of psychiatry in the 1960s. That argument won't work, though, because Cotter published his report not in Soldier of Fortune or Spanking Tales but in the American Journal of Psychiatry, a peer-reviewed professional publication -- which means that his colleagues read the article and concluded that it was a serious contribution to scientific knowledge in the field. According to Staub it was "radical therapists" who were appalled by it, not mainstream psychiatrists.

Staub then discusses another controversial issue from the same period:
In late 1967 another case sparked indignation, this time involving three doctors at Harvard Medical School who had suggested that social and economic factors alone could not explain the violent rioting in Detroit that summer – riots that left forty-three dead, more than a thousand injured, and resulted in over seven thousand arrests. In an oft-cited essay, “Role of Brain Disease in Riots and Urban Violence,” which appeared in the Journal of the American Medical Association, psychiatrist Frank Ervin and neurosurgeons Vernon Mark and William Sweet argued that while it was “well known” that poverty, unemployment, slum housing, and inadequate education “underlie the nation’s urban riots,” the “obviousness of these causes may have blinded us to the more subtle role of other factors, including brain dysfunctions in the rioters who engaged in arson, sniping, and physical assault.” There was a strong need for clinical studies to be conducted “of the individuals committing the violence” to determine whether “brain dysfunction related to a focal lesion plays a significant role in the violent and assaultive behavior” of black ghetto dwellers. There was additionally a need, the doctors added, to develop “reliable early-warning tests” to detect and screen for potentially violent offenders. This last research agenda received major grants of more than $600,000 from the National Institute of Mental Health and the Justice Department’s Law Enforcement Assistance Administration, monies designed expressly to study “the incidence of violent disorders in a state penitentiary for men; estimate their prevalence in a non-incarcerated population; and improve, develop and test the usefulness of electrophysical disorders in routine examinations.” There also had to be established diagnostic detection centers that could help to prevent violent outbreaks.
The first such center “for the study and reduction of violence” opened at UCLA in 1972 (and received more than a million dollars in state moneys from California governor Ronald Reagan in 1973) [125-6].
(The Journal of the American Medical Association is also peer-reviewed, by the way.)

Staub doesn't mention the vogue for psychosurgery in the same period, nor the role American psychiatrists played in developing new forms of torture, but they had the same intent: to treat dissent and resistance as symptoms of brain dysfunction, and to control subjects' behavior. No psychiatric professional seems to have thought of applying the same concepts and remedies to the practitioners of state violence, not even to the police who attacked Civil Rights demonstrators with clubs, firehoses, and attack dogs, let alone to Ronald Reagan.

It's a safe bet, though, that they would have considered the possibility of brain dysfunction in the demonstrators themselves. Armchair psychoanalysis of white student dissidents was a popular pastime in those days. Commentators like "Sidney Hook, George Kennan, Jacques Barzun, Irving Kristol, Nathan Glazer, and H. Stuart Hughes each offered his own unflattering diagnosis of young radicals’ mental states, identifying all sorts of possible analytic explanations for student radicalism" (168), despite "hundreds of empirical research summaries – most finding student radicals to be well adjusted" (235, note 11) that indicated otherwise. None of those pundits was qualified to diagnose anyone's mental health, of course.

Later in Madness Is Civilization, Staub briefly mentions Marge Piercy's 1976 novel Woman on the Edge of Time. That story, you may remember, shuttles in time between the 1970s present and a utopian twenty-second century future. The protagonist, Connie, is in a mental hospital and has been drafted for an experiment involving the implantation of electrodes into patients' brains to enable control of their behavior by the doctors. Staub remarks that "in the evil 1970s, however, psychiatrists are sinister figures and the main character is involuntarily committed and diagnosed paranoid schizophrenic" (176). Given the evidence Staub himself has collected, it seems fair to say that many mainstream psychiatrists have been "sinister figures."

Which isn't to say that they all were, only to point out why many people in that period came to distrust the profession. Your psychiatrist might genuinely be trying to help you, or he* might be trying to advance his career, with you as a guinea pig. And even if he did want to help you, he might have no idea how to do so. A lot of empirical research also showed that psychiatric treatment was less effective than placebos or no treatment at all. Remember too that homosexuality was still officially a disease until 1973, and not until this century did the psychiatric profession reject attempts to change sexual orientation as the quackery they were.

Nor am I denying that there are people who are distressed and disabled by what goes on in their minds, or that they shouldn't be helped. It appears that changes in medication have helped some of them, though I retain some skepticism about that since the profession also touted its successes in the days of the talking cure, electroconvulsive therapy, and psychosurgery. I wonder what mainstream practices of today will have to be forgotten another generation down the line.
-------------------------------------
*I'm not using "he" generically; in those days most doctors and psychiatrists were male.

Saturday, March 10, 2012

The Intractable Woman

I forgot to include this in last night's post. From Kate Millett's Flying (Knopf, 1974); while Millett was in London, her friend and host the filmmaker Nell O'Rourke told her (312):
"They have taken our anger from us, it's the biggest crime perpetrated against women. Got in touch with mine through that group therapy business. Men and women behave differently in group therapy, you know. Men discover their fear. Women their hostility. Paul was physically attacked in the first group we joined. Look, learn to get angry, stop being humble and calling yourself queer. There are dead people shouting inside you. You have got to shut up your mother. There she is, quivering away inside you, screeching at you." I hear her voice on the phone, hear her voice at thirteen. "The family is a dead area inside our skulls," O'Rourke resumes. "Drop your endless capacity for punishment. Let yourself. Do. They cannot punish you, the voices inside. Stifle them."
I hadn't looked at this passage in a long time, probably more than twenty years. What I remembered was what O'Rourke said about anger, but when I first read it I was struck just as much by what she said about the punitive "voices ... screeching at you" inside your head. These are not (necessarily) the kind of voices that go along with schizophrenia; I took them metaphorically, as the echoes of everyone -- not just adults: peers too -- who does the job of keeping us within bounds. You can't think that. You can't say that. You can't do that. You can't dress like that. Boys don't cry. Nice girls don't do that. What are you, a fag? What are you, a slut? What are you, a nigger-lover? What are you, a commie?

Especially after having read Staub's book, I now detect echoes of Laingian antipsychiatry in O'Rourke's speech. At my age I'm less inclined to want to blame my family, or anyone else, for my inhibitions and guilt. Maybe I should read Laing, because I'd like to know if he took into account the fact that human beings are a social species; our vulnerability to our families comes from our prolonged infancy and childhood. We can't survive without the support and input, but also constraint, of other human beings. Those "voices" hold us up as well as crush us. From Staub's account of the antipsychiatrists, I gather they overlooked this.

I credit Dorothy Dinnerstein for at least implying this in her The Mermaid and the Minotaur (Harper, 1976), a flawed but still powerful book that had a big influence on me. She pointed out that infants and young children want everything, and with the best will in the world the most nurturing mother couldn't give them everything they want; they're too young to understand why they can't stay in the toy store forever, or run out into the street as they see grownups doing, or take home the things they see and want. This means to me that if growing up healthy means never being frustrated, no one could grow up healthy. And, of course, our parents also have their own wounds, and their own needs, they're the products of their own upbringing, so if I'm not to blame for my problems, neither are they.

Taking all this into account, O'Rourke's advice still looks good to me. For one thing, we selected the voices that screech at us from inside our heads. (Parents are often shocked to learn what their children remembered, as well as what they forgot.) This passage was a voice I kept in my head, and it helped me along through my later twenties and into my thirties. We also need to learn to ignore them, stifle them, outgrow them. Accept our anger, accept our fear, learn not to be stopped by them.

But this is also the problem with "mental illness" as a category: it can cover anything from mild discomfort to major dysfunction, not to mention dissenting opinions or personal styles that people too easily call "crazy." At the same time, though, becoming irrational and allowing oneself to rant and call for pitchforks and torches is considered proper behavior all through our society, provided we pick the right occasion and target for our craziness. (Staub discusses the way that mainstream commentators indulged in facile armchair psychoanalysis of student and other political dissenters in the 1960s, though most empirical studies showed student radicals to be well adjusted [for what that evaluation is worth].) Whether you view mental illness as a medical reality or a social construct -- which, again, are not mutually exclusive categories -- its boundaries are highly subjective, as are the possible remedies.

Which reminds me, here's a fine example that shows that the overreaching of scientists on social issues is still with us, and still good for news coverage. More about it later, if I get around to it, but for now, those researchers are obviously too dumb to be taken seriously; trust me on this, I'm an intellectual.