Monthly Archives: September 2008

Identifiable vs. Statistical Victims

John McCain wears a bracelet. So does Barack Obama. And both of them signal something vital about how the public views various social and financial policies – people tend to sympathize with identifiable victims, those given a name and a story, even when that identification provides no real information about the victim’s situation or the policies affecting it.  The Situationist has a timely post about a study that, while a few years old, definitely rings the bell of our current political situation.  Here’s the abstract of the paper (the entire document can be downloaded here).

We draw out implications of the identifiable victim effect – the greater sympathy shown toward identifiable than statistical victims – for public finance. We first review research showing (1) that people respond more strongly to identifiable than statistical victims even when identification provides absolutely no information about the victims, (2) that the identifiable victim effect is a special case of a more general tendency to react more strongly to identifiable others whether they evoke sympathy or other emotions, and (3) that identifiability influences behavior via the emotional reactions it evokes. Next, we discuss the normative status of the effect, noting that, contrary to the usual assumption that people overreact to identifiable victims, identifiability can shift people’s responses in a normatively desirable direction if people are otherwise insufficiently sympathetic toward statistical victims. Finally, we examine implications of the identifiable victim effect for public finance. We show that the identifiable victim effect can influence the popularity of different policies, for example, naturally favoring hidden taxes over those whose incidence is more easily assessed, since a hidden tax has no identifiable victims. Identifiable other effects also influence public discourse, with much of the debate about government spending and taxation being driven by vivid exemplars – iconic victims and perpetrators – rather than any rational calculation of costs and benefits.


Filed under About Perception

The Psychology of Campaign ’08

Excellent piece by neurologist Robert Burton in Salon on the psychology of politics.  Burton argues convincingly that support for one’s candidate generally says more about the supporter than the candidate. In his book, On Being Certain, Burton puts several nails in the coffin of the myth of rational certainty, and this article does a nice job of applying the argument to the current political situation.  Here’s an excerpt:

Feelings of absolute certainty and utter conviction are not rational deliberate conclusions; they are involuntary mental sensations generated by the brain. Like other powerful mental states such as love, anger and fear, they are extraordinarily difficult to dislodge through rational arguments. Just as it’s nearly impossible to reason with someone who’s enraged and combative, refuting or diminishing one’s sense of certainty is extraordinarily difficult. Certainty is neither created by nor dispelled by reason.

Similarly, without access to objective evidence, we are terrible at determining whether a candidate is telling us the truth. Most large-scale psychological studies suggest that the average person is incapable of accurately predicting whether someone is lying. In most studies, our abilities to make such predictions, based on facial expressions and body language, are no greater than by chance alone — hardly a recommendation for choosing a presidential candidate based upon a gut feeling that he or she is honest.

Worse, our ability to assess political candidates is particularly questionable when we have any strong feeling about them. An oft-quoted fMRI study by Emory psychologist Drew Westen illustrates how little conscious reason is involved in political decision-making.

Westen asked staunch party members from both sides to evaluate negative (defamatory) information about their 2004 presidential choice. Areas of the brain (prefrontal cortex) normally engaged during reasoning failed to show increased activation. Instead, the limbic system — the center for emotional processing — lit up dramatically. According to Westen, both Republicans and Democrats “reached totally biased conclusions by ignoring information that could not rationally be discounted” (cognitive dissonance).

In other words, we are as bad at judging ourselves as we are at judging others. Most cognitive scientists now believe that the majority of our thoughts originate in the areas of the brain inaccessible to conscious introspection. These beginnings of thoughts arrive in consciousness already colored with inherent bias. No two people see the world alike. Each of our perceptions is filtered through our genetic predispositions, inherent biologic differences and idiosyncratic life experiences. Your red is not my red. These differences extend to the very building blocks of thoughts; each of us will look at any given question from his own predispositions. Thinking may be as idiosyncratic as fingerprints.

Leave a comment

Filed under Uncategorized

On Real Education: Interview with Author Charles Murray

Charles Murray, bestselling author of Losing Ground and coauthor of The Bell Curve, has written a new book focused on the transformation of our educational system: Real Education: Four Simple Truths for Bringing America’s Schools Back to Reality (Crown Forum Publishers). In it, he outlines the four simple truths that he contends must be addressed to initiate real transformation of our schools: (1) ability varies, (2) half of children are below average, (3) too many people are going to college, and (4) America’s future depends on how we educate the academically gifted.  Below is a brief discussion with Mr. Murray about a few of the book’s main points.

NN: Your most recent book, Real Education, is a hard-hitting critique of our educational system, but it seems to have a larger audience than merely policy wonks. Who would you say should read the book and what can they expect to learn from it?

CM: Actually, it isn’t written for policy wonks at all. It’s written for teachers—both K-12 and college—and parents who know from their own experience that education today is out of touch with reality. They know that ability varies among children, and that education should take that into account. It’s just the politicians and the educational bureaucrats who try to pretend that children can be anything they want to be if they try hard enough.


You mention early in the book that evaluation of intelligence (or what you predominantly refer to as “ability”) has been largely left out of discussions on education. Why do you think this has been the case?

We had a concatenation of two shifts in the culture in the late 1960s. The Civil Rights revolution made it embarrassing to acknowledge academic deficits among some black children, so they were ignored, or attributed to easily solved problems with the schools (despite everything that the Coleman Report had already taught us). And if you couldn’t talk openly about intractable academic problems among some black children, you couldn’t talk about similarly intractable problems among some white children. At about the same time, the self-esteem movement took off, and it was decided that criticism and acknowledgment of academic limits were damaging to self-esteem, and high self-esteem was the be-all and end-all of child development. The net result: Straight talk about the relationship of academic ability to education became verboten.


Much has been written recently about the brain’s plasticity, yet it seems the jury is still out on whether we can permanently alter intelligence in any significant way. Given the progress that has been made in the area of plasticity, do you think it’s a worthwhile project to attempt to alter IQ?  Is this, in your opinion, even possible?

Eventually, it will happen, without question—but whether “eventually” means ten years or a century from now remains an open question. We have proved that altering IQ significantly through environmental enrichment is really, really hard. Adoption at birth produces the only consistent results, and even adoption at birth does not bring the adopted children up to the level of the biological children of adoptive parents, independently of the genetic heritage of the adopted child.


Measuring intelligence is controversial and inevitably leads to claims of cultural bias. You have said that regardless of the specific tests being used, the underlying academic ability (g) remains a fact that must be acknowledged. How do we arrive at an honest assessment of g that won’t be derailed by cultural bias arguments?

Two separate issues are involved, both of which are empirically resolved. Is g real or just a statistical phenomenon? It’s pretty hard to argue that it’s just a statistical phenomenon when neuroscience keeps identifying new physiological characteristics of the brain (e.g., the volume of very specific portions of the brain, glucose metabolism in specific portions of the brain) that are not just correlated with IQ scores, but with the measurement of g embedded in those scores. Are IQ test scores culturally biased? The serious scientific debate about cultural bias in IQ tests was over as of the mid-1980s—for practical purposes, since Arthur Jensen’s Bias in Mental Testing was published in 1980. All the major empirical issues involving bias in predictive validity are settled. The major tests, administered to the people for whom they were designed and in the way they were designed, are not culturally biased. Period. But when will the nature of the public debate catch up with the science? Not for a while. Look what happened to Larry Summers when he talked about sex differences in cognitive profiles. The science didn’t matter. He became instant PC road kill.


In your book’s chapter entitled “Too Many People are Going to College”, you make reference to distance learning technologies and their potential for eventually making brick-and-mortar colleges—as we understand them now—far less important. How do you address objections to “online literacy”, like those voiced by Mark Bauerlein in his recent book The Dumbest Generation, which argue against relying on the new technologies to enhance learning?

There’s nothing inherent in learning over the Internet that makes it less rigorous than learning in the classroom. The differences in process between the classroom experience and the Internet experience are constantly being narrowed as technology improves. Content is everything. If the online learning consists of a probing interaction about Aristotle’s Nicomachean Ethics, then there’s no problem. I grant the aspects of the Internet that are problematic in ways that Bauerlein describes, but they needn’t contaminate online learning if the course designers do their job right.


Some of your critics have said that your positions would lead to “giving up” on a certain percentage of children, and that you’re guilty of “underestimating their abilities” (I’m thinking especially of Ben Wildavsky’s book review in the Wall Street Journal, “When Learning has a Limit”). How do you respond to these criticisms?

They are educational romantics. Anyone who thinks that every child can learn, say, how to factor quadratic equations needs to spend a lot more time around children who are well into the lower half of the distribution of math ability. The odd thing is that the educational romantics who are parents of more than one child know from their own experience that there’s no way to make their own children more alike in their profiles of cognitive abilities. But they keep insisting that they can do it with other people’s children. It is worse than romanticism. It is intellectual self-indulgence.


You’re careful to point out that, whether we like it or not, America’s future depends on an academically gifted elite that has the greatest effect on the country, and our ability to choose the elite, as with elections, is limited. What is the most important thing that needs to happen in our educational system to prepare the next generation of elite leaders to positively influence the nation?

Every one of them should be viscerally aware of their own limits, and the current college curriculum of the social sciences and humanities seldom forces them to do that. We have far too many students coming out of elite colleges who have never had their feet held to the fire intellectually, and are far too in love with their own wonderful selves. Humility is a prerequisite for wisdom, and we are in urgent need of more wisdom among the people who shape the culture.

1 Comment

Filed under Interviews

Scary Mary

Here’s an excellent post at Cognitive Daily that discusses research on how music played before or after a film character is shown affects our perception of their emotion. Here’s a snippet, but the entire piece is worth reading:

Whether the music was played before or after the clip, at least in the case of happiness, sadness, or fear, it affected viewers’ perception of the actor’s intended emotion. The researchers were careful not to mention music at all during the presentation of the clips, asking the viewers only about the emotions intended by the character. But music clearly had a dramatic impact on the viewers’ perceptions of the scene. The scenes had been pre-screened without music by 31 viewers to verify that the emotion portrayed in the scene was neutral (two other potential scenes were eliminated during this pre-screening process as not being sufficiently neutral).

Additional analysis by the researchers suggests that the music played before the scene had a more powerful effect on perceived emotion than music played afterwards, but both clearly have an important effect.

And when you’re finished reading that, watch this:



and then this…

Leave a comment

Filed under About Perception

Neuron Resurrection

A new study from UCLA researchers has taken the search for where memories are made, and how they are recalled, another step further. Its significance is in confirming for the first time that neuronal activity associated with spontaneous memory is from the very same neurons that were activated when the memories were first made – thus providing evidence for the reactivation of neurons in the hippocampus that are linked directly to conscious recall. In a nutshell: mentally reliving past experience is the resurrection of neurons. Here’s an article in Science Daily on the study, which was published in Science.

Leave a comment

Filed under About Neuroscience

David Foster Wallace (1962-2008)

Very sad news this weekend about author David Foster Wallace, found dead in his home on Friday from an apparent suicide. He was only 46. While already a ‘game changing’ writer – someone who forever alters the literary landscape – I can’t help but wonder what else he might have produced in years to come. I’m quite sure that he could have been a cross-generational influence, much like Saul Bellow or Philip Roth – a writer who affects readers over the course of a long and prolific lifetime.  Aside from his published works, well worth reading is his commencement address at Kenyon College in 2005. Here’s an excerpt, which seems in some ways eerily prophetic.

Given the triumphant academic setting here, an obvious question is how much of this work of adjusting our default setting involves actual knowledge or intellect. This question gets very tricky. Probably the most dangerous thing about an academic education — at least in my own case — is that it enables my tendency to over-intellectualize stuff, to get lost in abstract argument inside my head, instead of simply paying attention to what is going on right in front of me, paying attention to what is going on inside me.

As I’m sure you guys know by now, it is extremely difficult to stay alert and attentive, instead of getting hypnotized by the constant monologue inside your own head (may be happening right now). Twenty years after my own graduation, I have come gradually to understand that the liberal arts cliché about teaching you how to think is actually shorthand for a much deeper, more serious idea: learning how to think really means learning how to exercise some control over how and what you think. It means being conscious and aware enough to choose what you pay attention to and to choose how you construct meaning from experience. Because if you cannot exercise this kind of choice in adult life, you will be totally hosed. Think of the old cliché about quote the mind being an excellent servant but a terrible master.

This, like many clichés, so lame and unexciting on the surface, actually expresses a great and terrible truth. It is not the least bit coincidental that adults who commit suicide with firearms almost always shoot themselves in: the head. They shoot the terrible master. And the truth is that most of these suicides are actually dead long before they pull the trigger.

And I submit that this is what the real, no bullshit value of your liberal arts education is supposed to be about: how to keep from going through your comfortable, prosperous, respectable adult life dead, unconscious, a slave to your head and to your natural default setting of being uniquely, completely, imperially alone day in and day out. That may sound like hyperbole, or abstract nonsense. Let’s get concrete. The plain fact is that you graduating seniors do not yet have any clue what “day in day out” really means. There happen to be whole, large parts of adult American life that nobody talks about in commencement speeches. One such part involves boredom, routine, and petty frustration. The parents and older folks here will know all too well what I’m talking about.


Filed under Tributes to Great Minds

Fan-tastic News for Rabid Fans

Well, it’s official, sports fans – we now have scientific evidence that being a fan is good for you. What’s more, it’s apparently also good for society. This Boston Globe piece outlines some recent research on the topic very nicely.  Who would have thought that being in love with your team can increase happiness and wealth, and improve one’s general outlook on life? Here’s an excerpt:

Edward Hirt, a professor of psychology at Indiana University, has found that the school’s famously ardent basketball fans saw their opinion of themselves rise and fall with the fortunes of the team. Over the years, he has conducted various versions of a study in which fans, after watching a game, are asked how well they thought they would do at a variety of tasks – throwing darts, shooting free throws, solving various word games, even rolling dice. Consistently, both male and female fans showed a sharp rise in confidence in their abilities on all the tasks after a win, and a corresponding drop after a loss. Winning even made fans feel sexier: When shown a picture of an attractive member of the opposite gender after a win, they were far more likely to say they would be able to get that person to go on a date with them.

Findings like these suggest that there might be a broadly shared psychological boost from our stubborn inability to separate our own accomplishments from those of a group of multimillionaire professional athletes who have never heard of us. Multiple studies, for example, have shown that testosterone levels in male fans rise in the wake of a victory, and drop in the wake of a defeat. A study published earlier this year of traders on the London Stock Exchange found that the ones with higher levels of testosterone made more money than their colleagues – the researchers suggested that it might be because the hormone emboldened the traders to take bigger risks.

A large body of psychological research suggests that the kind of psychological changes seen in fans after a victory could translate into positive behavior. More self-confident people tend generally to do better at life: they get better grades, make more money, have more friends, even live longer. And the self-confidence doesn’t have to be earned to make a difference. Shelley Taylor, a UCLA psychologist, has found that having outright illusions about one’s abilities, and about the amount of control one has over the events in one’s life, makes people happier, harder working, and more successful at whatever they put their minds to. When Taylor looked at AIDS patients in the late 1980s (a time when the disease was far less treatable than it is today), she found that those with an unrealistically optimistic sense of their prognosis lived an average of 9 months longer than those with a more accurate understanding of the disease.

Leave a comment

Filed under Uncategorized