Thursday, September 4, 2014

D'Oh, Re, Mi ...

An old IU friend of mine (a musician, natch) posted a link to this article about a study of the effect of ongoing after-school music education on "at-risk" children's brains.  On Facebook it showed up as "Could Music Education Be the Key to Ending the Achievement Gap?", which I suppose is meant to be clickbait.  If you click through, the Huffington Post's title is "Study: Music Education Could Help Close the Achievement Gap Between Poor and Affluent Students" -- a very different claim.  But the opening words of the article are "Closing the achievement gap between low-income and affluent students could be as simple as do-re-mi."  No, it couldn't.  The article itself quotes the director of the research institute involved:
“These findings are a testament that it’s a mistake to think of music education as a quick fix, but that if it’s an ongoing part of children’s education, making music can have a profound and lifelong impact on listening and learning.”
At least this time the movement was away from an inflated claim to a more cautious one; typically it's the other way around.

Last week I got into an online squabble with a liberal acquaintance about this very issue.  He'd linked to another research project that was touted as showing that children's early drawing ability "may predict future intelligence."  We'd argued about "may" and "suggest" in these contexts just a day before, and he posted this one to try to bait me.  I didn't bite, but one of his other friends made a critical comment on it.  Again, the study doesn't actually do any such thing.  According to the post he linked to (at a site for freelance artists, like the friend who posted the link, so there's an agenda at work), the researchers found
a "moderate" association between higher scores and intelligence test results, first at the age of four, and then later at the age of 14... However, [lead author] Arden was also quick to point out that parents of children with bad drawing skills shouldn’t be worried, as there are "countless factors" that affect intelligence.
A "moderate" association means not only that a four-year-old with poor drawing skills may turn out to be "intelligent" after all, but a child with good drawing skills may turn out not to be "intelligent."  Without looking at the study itself (and I can't be arsed to do so, frankly), there's no telling how strong the correlation is, but "moderate" isn't world-shaking.  So while the study's results are not without interest (though they're also not particularly surprising), it doesn't seem that early drawing ability is a particularly good predictor of intelligence in adolescence.  I'd imagine that motor skills, which are notoriously variable in young children, are also a factor.   And then there's the question of how well "intelligence" can be measured, unless you go along with the claim that intelligence is what intelligence tests measure.
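To put a rough number on that point: the article doesn't say how strong the correlation actually was, but a "moderate" one is usually somewhere around 0.4.  This little sketch (illustrative only -- the correlation value is my stand-in, not a figure from the Arden study) shows how little predictive power that buys you:

```python
import numpy as np

# Illustrative only: r = 0.4 is a stand-in for "moderate"; the actual
# figure from the study is not given in the article.
rng = np.random.default_rng(0)
r = 0.4
n = 100_000

# Draw pairs (drawing score at 4, intelligence score at 14) with correlation r.
drawing = rng.standard_normal(n)
iq = r * drawing + np.sqrt(1 - r**2) * rng.standard_normal(n)

# Share of variance in the later score "explained" by early drawing.
print(f"r^2 = {r**2:.2f}")  # 0.16 -- i.e. 84% of the variance is unexplained

# Among children in the top half for drawing, how many end up in the
# top half for the intelligence score?
top_draw = drawing > np.median(drawing)
top_iq = iq > np.median(iq)
print(f"P(top-half score | top-half drawing) = {top_iq[top_draw].mean():.2f}")
```

With a correlation of 0.4, a child in the top half for drawing lands in the top half for the later test only about 63% of the time -- barely better than a coin flip, which is exactly why a "moderate" association cuts both ways.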

And then another friend, for whose judgment I have more respect, wrote something about a former student who wanted to "write her thesis on the 'economics of happiness,'" which I agree could be interesting, though I don't know how you'd approach it.  My friend also linked to a TED talk on "the surprising science of happiness."  It should come as no surprise that I'm skeptical of that.  One of my friend's friends commented: "the fairly new idea in law & economics movement is that laws shouldn't be trying to maximize utility measured using willingness to pay, but rather turn to measuring happiness directly and maximizing that."  I asked how one measures happiness directly.  He replied, "you measure self-reported happiness on the scale 1 to something. Turns out, it's quite a good estimate."

Perhaps so, but "quite a good estimate" is not a direct measure.  A researcher might invite a subject to self-report his or her height or weight or body temperature or, come to that, IQ.  But all those things can also be measured directly.  It might be interesting to compare the self-reports with the direct measurements, since it's known how unreliable such self-reports often are when they can be compared.  When they can't be compared, it's not a good idea to put much store by the self-reports.  (Which doesn't stop many researchers from doing so anyway, of course.)
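A toy simulation makes the distinction concrete.  The numbers below are made up (an assumed reporting model, not real data): self-reports can track the true value closely and still be systematically off, and you can only discover that by comparing them against a direct measurement -- which is precisely what you can't do with self-reported happiness.

```python
import numpy as np

# Simulated illustration, no real data: the reporting model (round up
# a couple of centimeters, plus random error) is an assumption.
rng = np.random.default_rng(1)
n = 10_000

true_height = rng.normal(170, 10, n)             # cm, measured directly
self_report = true_height + 2.0 + rng.normal(0, 3, n)

r = np.corrcoef(true_height, self_report)[0, 1]
bias = (self_report - true_height).mean()
print(f"correlation = {r:.2f}, average bias = {bias:+.1f} cm")
# High correlation -- "quite a good estimate" -- yet the reports are
# inflated on average, which only a direct measurement can reveal.
```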

The point here is that the liberals and progressives I know, who love to jeer at the scientific illiteracy of Rethugs and Bible thumpers, post a lot of this kind of thing, much of which turns up on a Facebook page called "I fucking love science."  They may love it, but they don't know much about it, and they do get pissy when their own scientific and mathematical illiteracy is pointed out.  They react exactly the same way the Christian right-wingers do when their fond fantasies are debunked: I like to believe this, it makes me feel good, so I'm going to believe it no matter what you mean old skeptics say -- and besides, who knows -- it might turn out to be true after all!

This isn't an innocent error either.  I wonder what the consequences of these studies are supposed to be.  All kids should be getting a good education anyway, and we know that a richer environment -- not just musical and artistic but literary and more narrowly intellectually stimulating -- is good for them.  The problem in the US is not that we don't know what children need and why: we know that perfectly well.  It's that many adults don't want to provide it.  Many others can't, for lack of time and other resources.  The most interesting finding of the music-education project was that its effects didn't show until the second year, meaning that there is no quick fix here; but again, we already know that.  Education, serious education of the kind that is piously talked about, takes years.  We know that.  Maybe someone should study the excuses that are made for not doing what we know needs to be done?