Monthly Archives: January 2009

Self-perception Deception Makes for Dangerous Driving

Do as we say, and not as we do.  -Giovanni Boccaccio, 1313-1375

Forbes ran an informative article this week on The Most Dangerous Times to Drive that puts a fine point on one dimension of the self-perception paradox (how we see ourselves versus how we actually are).  Consistently, and across many areas of life, we hold our behavior in far higher regard than our actions deserve, and driving is a fantastic example.  Unfortunately, the end result of this particular example can be fatal (hence the article’s title). I’ve pulled out a few sample facts below.

According to the National Highway Traffic Safety Administration, 95% of all traffic accidents are caused by human error; but when surveyed, 75% of drivers say they are more careful than everyone else on the road.

According to the American Automobile Association, 82% of drivers say that driving while distracted is a serious problem; but more than half say they talk on a cell phone while driving, and 14% admit to reading and sending text messages while driving.

75% of drivers say that speeding is a serious problem; but 20% say that they drive more than 15 miles per hour over the speed limit on the highway, and 14% say they occasionally do so on neighborhood streets.

And the most dangerous time of the year to drive?  It’s not winter. It’s August, on a Saturday.



Filed under About Neuroscience, Books and Ideas

Musings on the Grape Loving Ape: An Interview with Dr. Frans de Waal

Dr. Frans de Waal, one of the world’s leading primatologists, is rare among scholars — both an accomplished researcher and an excellent communicator. In numerous books and articles he has vastly broadened what we know about primate societies, and this knowledge has in turn shed much needed light on human societies.  His ability to draw credible links from one to the other has made his work indispensable across multiple disciplines.

Dr. de Waal’s new book, The Age of Empathy, will come out this year from Harmony Books.  He was gracious enough to make some time to discuss primate emotions, whether nature can produce morality, and the persistent belief in the nature/culture divide with Neuronarrative.


You’ve done a great deal of work to dispel the notion of a “nature/culture” divide. What is the argument endorsing this alleged divide, and why in your view is it flawed?

The problem is that if all human societies have culture — and they do — it is a product of human nature, so it can’t be truly separated from it. We are born with a capacity for culture, a capacity to absorb all of the cultural habits and values around us. In addition, there is the issue that other primates also have cultural traditions.

There is more and more evidence that groups of the same species may act quite differently from each other, having different eating habits, communication signals, and tools, behavior they hand down from generation to generation. We now openly speak of primate cultures, but the phenomenon is not limited to our near relatives; it is found in many mammals and birds, even fish. This is another reason to question the nature/culture divide.


You’ve also been heavily involved in correcting the misconception (which dates back to T. H. Huxley) that nature is inherently “amoral” and could never result in morality. What have you discovered in your primate research that might be considered evidence of rudiments of primate morality?

Huxley believed, as many biologists still do today, that the human animal is inherently bad and nasty, and that if any good is produced by us, it is thanks to a thin veneer of culture and morality. This veneer is of our own making. Darwin didn’t see it this way, and believed, as I do, that the tendencies that underpin our moral systems derive from old social instincts that we share with other primates.

We’re not the only ones to occasionally help each other, be sensitive to others’ emotions, or be sensitive to fairness. We recently published an experiment, for example, in which primates either got rewarded while a companion next to them received nothing, or got rewarded while the partner got the same. So, the only difference was what the partner got. They systematically preferred the latter option, thus showing sensitivity to the welfare of others. In fact, I believe that giving is self-rewarding for them, as we know it is for humans.


Along the same lines, do primates show evidence of having ‘moral emotions,’ such as empathy, compassion and generosity (emotions that we typically associate only with human morality)?

Empathy is a very old mammalian characteristic. Of course, if it is defined top-down, as is often done in psychology, it involves cognitive perspective taking (adopting another’s point of view) that is probably limited to a few large-brained species, such as humans, apes, and elephants. But if we define it bottom-up, by its most basic ingredients, empathy is being in tune with the emotions of others, such as contagious fear, distress, or joy. Emotional contagion has recently been demonstrated in mice, and I believe it originally derives from maternal care, which is obligatory in mammals, since every female needs to react immediately to offspring that are hungry, cold, in danger, etcetera. This origin may also explain the empathy differences between males and females in our species.

But apes go further than this. Not only are they affected by the emotions of others, they show strong reactions. For example, they console victims of aggression. The victim sits screaming in a corner, and others approach, embrace and kiss them, or groom them for the longest time, until they have calmed down. This is the same sort of behavior that psychologists have used to study empathy in young children, and I always feel that similar behavior in closely related species deserves similar labels and explanations. The warnings that I often hear against “anthropomorphism” strike me as reflective of a pre-Darwinian mind-set, because clearly mental continuity is the simpler assumption.


I know that you were one of the principal researchers involved in the now famous “grape / cucumber” study.  Briefly, how was this study conducted and what did you discover? 

With Sarah Brosnan, we gave monkeys different rewards for the same task. If they get the same reward everything is fine and dandy, and no one complains, even if they just get cucumber. But if their companion gets grapes for the same task, all hell breaks loose and they refuse to perform the task, and often refuse the food itself.

I was reminded of this during the recent outcry in the media about the CEOs of the car industry who had flown to Washington in private jets. We humans are very sensitive to inequity, and now that we are going through some rough times, these feelings surface very easily. The CEOs were munching on grapes, whereas all of us have to content ourselves with cucumber.


I’ve been intrigued by recent research on self-awareness in animals (primates and other species).  In what ways are the results of this research challenging our preconceptions about the natural world and our relationship to other animal species? 

Self-consciousness needs to be part of every animal’s interaction with its environment. If you’re a monkey, you want to be aware of how your body will impact a lower branch that you intend to jump on, based on the thickness of the branch, the distance of the jump, and your body mass. This kind of self-awareness is everywhere, but some species go further in that they recognize themselves in a mirror, or have a full understanding of how their own behavior will affect the situation of others. We are learning more and more about these various levels of self-awareness, and this obviously brings animals closer to us than in the days when everything animals did was explained in Cartesian, mechanistic terms.


Moving forward, what do you think the next round of new (or continuing) primate research may tell us about ourselves and our relationship to primates? Are we in for a continued eye opening? 

Our field will keep eroding the human-animal distinction even though many people crave such a distinction. Every few years a new one comes up. It used to be tool-use, followed by the making of tools, then symbol-use, theory-of-mind, imitation, altruism, you name it. Every distinction has fallen by the wayside, and I am sure that soon another one will be proposed. We welcome these challenges. The basic message from our field is that differences may well exist, but they are invariably gradual.


Dr. Frans de Waal is the C. H. Candler Professor of Psychology and Director of Living Links, part of the Yerkes National Primate Research Center, Emory University, Atlanta.

Link to Living Links



Filed under Interviews

The Truth About Lie to Me

The television show Lie to Me premieres tonight; it’s about an expert at reading human faces (played by the talented actor Tim Roth) who specializes in identifying when people are lying.  There’s been a slew of these sorts of shows in the last couple of years, in which psychological expertise is used to solve crimes that are otherwise unsolvable by traditional methods.

All of these shows are, at best, only loosely based on actual behavioral psychology, and are predictably sensationalized for TV. I’m guessing this one won’t be any different.  But it’s worth noting that Roth’s character is modeled after a quite credible expert in the field, Dr. Paul Ekman, and apparently, prior to every weekly episode of the show, he will be posting a column on his web site called “Truth About Lie to Me” in which he comments on what happened in the episode.  This may not make the show any more firmly factual, but at least it’s a way for viewers to get a glimpse into the science underlying the plot lines.  Below is Ekman’s column for this week’s show (airing tonight), followed by a clip from the show.  After that is a clip of Paul Ekman himself discussing the science of lie detection.

In the first few minutes of the first episode of Lie To Me the prisoner showed what we call an emblematic slip, the equivalent in gesture of a slip of the tongue. I use the term ‘emblem’ for any gesture that has a precise meaning known to all members of a cultural group – such as the A-OK emblem in the U.S. (Watch out; emblems are specific to each culture. Someone will slug you if you make the A-OK emblem in Sicily, where it refers to what is considered a perverse sexual practice!)

Typically emblems are made in what I call the ‘presentation position’: very noticeable because they are performed right in front of the person making them, and very pronounced, with a beat. Emblematic slips are made outside of the presentation position, and usually they are only a fragment of the full emblem, performed without a beat. That is what the prisoner did: he showed just a fragment of the shrug that means ‘I don’t know’ or ‘I can’t do it’. The person showing the emblematic slip knows what he or she is thinking but doesn’t know it has leaked out. More about emblems in the third issue of my newsletter Reading Between The Lies.

Link to a Wired article about Paul Ekman


Filed under Videos

When Times are Tough, Female Brains are Tougher

It seems that the battle of the sexes may officially be over — in fact, when it comes to male and female brains, it isn’t even that close a call, at least as far as nature is concerned.  A new study funded by the National Institutes of Health indicates that when it comes to keeping brains alive, nature favors the female brain, hands down.

From the Live Science report on the study:

Past studies looking at the effect of starvation on animal bodies have been done mostly by looking at nutrient-rich tissues like muscles, fat deposits, and the liver. Robert Clark and colleagues at the University of Pittsburgh Medical Center grew neurons taken from male and female rats or mice in lab dishes, then subjected them to nutrient starvation over 72 hours.

After 24 hours, the male neurons experienced significantly more cell dysfunction. A key indicator called cell respiration decreased by more than 70 percent in male cells compared to 50 percent in female cells. Visually, male neurons showed more signs of autophagy, whereby a cell breaks down its own less vital components to use as a fuel source, while female neurons created more lipid droplets to store fat reserves.

Male neurons basically eat themselves from the inside, the scientists conclude.

The nutrient deprivation “produced cell death more profoundly in neurons from males versus females,” Clark’s team writes in the Jan. 23 issue of the Journal of Biological Chemistry. “Thus, during starvation, neurons from males more readily undergo autophagy and die, whereas neurons from females mobilize fatty acids, accumulate triglycerides, form lipid droplets, and survive longer.”

Link to a video about fat and the hungry brain


Filed under About Neuroscience

The Psychology of Plastic Couch Covers

Having grown up in the waning years of the plastic couch cover (or the plastic slip, if you prefer), I’ve always been intrigued by the psychology behind this peculiar practice.  Here’s the scenario: you go to a furniture store, you spend the requisite time to find just the right size, shape and style of furniture to grace your living room, and you lay out considerable cash to purchase and have the furniture delivered to your house.  When it gets there, you spend more time positioning it just right to ensure that it entirely fulfills your vision of the complete, well-appointed living room.  Then, as you look over your creation, flush with pride — you proceed to cover everything with plastic.

‘You’ don’t do this, I realize, but so many people have done it that this practice is a defining mark of a generation.  A pure utilitarian would have no trouble understanding it.  Clear plastic allows you to see the fabric beneath while also protecting it, thus lengthening the life of the furniture and maximizing its utility.  As to the aesthetic concern of not being able to feel the fabric, but instead hearing a crunch every time you sit down and then sliding about for a while until becoming as comfortable as someone can be cushioned in plastic, the utilitarian says “too bad – that’s the trade-off for maximizing the utility of your investment.”

This is a good example of something that in a certain light makes perfect practical sense, and yet is still a sure sign of neuroticism.  I think that’s what intrigues me about it most: it’s an example of the illusion of normalcy.  For a generation that was prone to drape its rooms in plastic, this practice was as normal as drinking coffee in the morning.  Now, to us (with the perspective of a much differently adjusted normal state), covering perfectly good furniture in plastic seems insane.  The utility argument seems equally insane, if only because it’s so ridiculously myopic (which is one of the reasons that pure utilitarianism is about as influential these days as the flat earth movement).

More recent generations, though, are prone to their own flavor of “covering” neuroticism — the car bra.  You buy the car of your dreams, so it’s only reasonable to want to protect its paint, right?  And what better way than to put on a car bra–the leather (or vinyl) slip that’s fitted to cover the front end of the car–and thus keep bugs, tar and anything else from corroding the paint.  Practically speaking, the logic of this practice seems unassailable, and car bra practitioners would argue that aesthetically it also makes the grade–so stylish and classy.

But there’s one problem with all of this, though it only becomes evident when, a couple of years later, the bra comes off and you realize that the paint beneath it is now a different shade than the rest of the car, which has been exposed to all of the elements that roughly one eighth of the car was protected from by the bra.

Back to couch covers… I’ve found nothing on the web as illustrative of the practice as this clip from “Everybody Loves Raymond”.  I especially like the “freedom” initially experienced when the cover comes off — like a collective sense of relief felt from overcoming a severe anxiety disorder–until things take a turn for the (very funny) worse. Freedom is indeed a fragile thing.


Filed under About Perception

On Procrastination and da Vinci (sometimes genius doodles)

Leonardo da Vinci, it has been said, was an inveterate procrastinator.  Fond of doodling while the hours passed by, the genius who changed the world wasn’t so expert at getting things done.  Helps to put things into perspective, no?  At the engaging site Procrastinus (where one can learn much while procrastinating) we learn this of da Vinci:

He explored almost every field available to him, in both science and art. He made significant contributions in engineering, architecture, biology, botany, anatomy, math, and physics. He sculpted, painted, both portrait and mural (e.g., The Last Supper) and made plans for ingenious machines that wouldn’t be built for centuries (e.g., planes, submarines). He also never finished a project on time.

Part of what made Leonardo such a “Renaissance Man” was that he was as distractible as he was talented. Jacob Bronowski, the scientific historian, speaks about his procrastination. His talents and energy were often wasted in doodles and unfinished projects. The Last Supper was only finished after his patron threatened to cut off all funds. The Mona Lisa took twenty years to complete.  The Adoration of the Magi, an early painting, was never finished, and his equestrian projects were never built.

His procrastination caused him much grief in later years. Despite his varied contributions, he felt he could have achieved much more. Given his talents, it is without doubt that more of his aspirations could have become a reality in his own time. So much was half-completed that he appealed to God, “Tell me if anything ever was done. Tell me if anything was done.”

Author John Perry may have had a solution for da Vinci. While spending time trying to figure out the source of his own procrastination, he came to this conclusion: Procrastination is the result of a fantasy manufactured to avoid imperfection. Here’s Perry:

Many procrastinators do not realize that they are perfectionists, for the simple reason that they have never done anything perfectly, or even nearly so. They have never been told that something they did was perfect. They have never themselves felt that anything they did was perfect. They think, quite mistakenly, that being a perfectionist implies often, or sometimes, or at least once, having completed some task to perfection. But this is a misunderstanding of the basic dynamic of perfectionism.

Perfectionism is a matter of fantasy, not reality. Here’s how it works in my case.  I am assigned some task, say, refereeing a manuscript for a publisher. I accept the task, probably because the publisher offers to pay me with a number of free books, which I wrongly suppose that, if I owned them, I would get around to reading. But for whatever reason, I accept the task.

Immediately my fantasy life kicks in. I imagine myself writing the most wonderful referee’s report. I imagine giving the manuscript an incredibly thorough read, and writing a report that helps the author to greatly improve their efforts.  I imagine the publisher getting my report and saying, “Wow, that is the best referee report I have ever read.” I imagine my report being completely accurate, completely fair, incredibly helpful to author and publisher.

His solution?  Structured procrastination.  With a bit of self-imposed psychological judo, the weakness can become a strength.

Procrastinators seldom do absolutely nothing; they do marginally useful things, like gardening or sharpening pencils or making a diagram of how they will reorganize their files when they get around to it. Why does the procrastinator do these things? Because they are a way of not doing something more important. If all the procrastinator had left to do was to sharpen some pencils, no force on earth could get him to do it. However, the procrastinator can be motivated to do difficult, timely and important tasks, as long as these tasks are a way of not doing something more important.

Structured procrastination means shaping the structure of the tasks one has to do in a way that exploits this fact. The list of tasks one has in mind will be ordered by importance. Tasks that seem most urgent and important are on top. But there are also worthwhile tasks to perform lower down on the list. Doing these tasks becomes a way of not doing the things higher up on the list. With this sort of appropriate task structure, the procrastinator becomes a useful citizen. Indeed, the procrastinator can even acquire, as I have, a reputation for getting a lot done.

Perry’s essay is very funny, and worth wasting some time to read.  And, if you’ve got time to spare after that, you can go measure your procrastination here.


Filed under Books and Ideas

Noggin Raisers Vol.6

If you want to lose weight, is it better to clear the house of sweets, or put a pile in every room?  We’re Only Human provides the paradoxical answer

Ars Psychiatrica gives us a tour of F. Scott Fitzgerald’s troubled psyche when he penned “The Crack Up”

Dr. Shock tells us why chronic stress is a critical risk factor in psychiatry

Great piece at Mind Hacks telling us about how we learn

Hack your brain and hallucinate with ping pong balls using this graphic from The Boston Globe

Trust and fraud are the topics of an insightful post at The Frontal Cortex

Is Google making us Stoopid?  Not at all says Carl Zimmer, and he’s got the argument to back it up

The Situationist points us to a piece about why we crave the smell of a loved one’s clothes

The lively debate about voodoo neuroscience goes on at The Neurocritic, and here at BPS Research

Shrink Rap helps keep us grounded with a human experiment

And finally, although this has nothing to do with the topics of this blog, I want to let you know that YOU too can have your very own Obamicon, courtesy of Paste Magazine.


Filed under Noggin Raisers