Category Archives: Books and Ideas

To Esteem Thyself, Or Not

If anyone were asked to list the top 10 topics that ignite arguments, I doubt very much that ‘self-esteem’ would make the cut. And yet this seemingly bland, bordering-on-clichéd topic is in fact the source of many battles.  Too little or too much is the question: how much self-esteem is the right amount?

Now a study from the University of Geneva (courtesy of BPS Research) suggests that low levels of self-esteem are linked to higher suicide rates around the world.  Researchers evaluated suicide rates and self-esteem levels across 55 nations, using data from the International Sexuality Description Project, and arrived at this conclusion:

Results indicate that suicide is especially common in nations with relatively low levels of self-esteem. This relation is consistent across sex lines, age of suicide and independent from several other relevant factors such as economic affluence, transition, individualism, subjective well-being, and neuroticism.

If these results strike you as uncontroversial, consider the work of psychologist Roy Baumeister, a decades-long public critic of the self-esteem movement — and, one might confidently say, of self-esteem in general.  Baumeister’s research tells a different story about high self-esteem, linking it not to successful performance in life but to tendencies toward bullying, murder, racism, and gang involvement.  Here’s a snippet from an article he wrote for the Los Angeles Times a few years ago:

It was widely believed that low self-esteem could be a cause of violence, but in reality violent individuals, groups and nations think very well of themselves. They turn violent toward others who fail to give them the inflated respect they think they deserve. Nor does high self-esteem deter people from becoming bullies, according to most of the studies that have been done; it is simply untrue that beneath the surface of every obnoxious bully is an unhappy, self-hating child in need of sympathy and praise. High self-esteem doesn’t prevent youngsters from cheating or stealing or experimenting with drugs and sex. (If anything, kids with high self-esteem may be more willing to try these things at a young age.)

He also points to research indicating that self-esteem in high doses leads to a host of more common shortcomings:

High self- esteem in schoolchildren does not produce better grades. In fact, according to a study by Donald Forsyth at Virginia Commonwealth University, college students with mediocre grades who got regular self-esteem strokes from their professors ended up doing worse on final exams than students who were told to suck it up and try harder.

Self-esteem doesn’t make adults perform better at their jobs either. Sure, people with high self-esteem rate their own performance better – even declaring themselves smarter and more attractive than their low self-esteem peers – but neither objective tests nor impartial raters can detect any difference in the quality of work.

And Baumeister isn’t alone, on either the secular or the sectarian front. Nicholas Emler, a noted social psychologist at the London School of Economics, shares this view and adds many additional caveats. And religious leaders the world over have routinely condemned the self-esteem movement (in its official and generic forms) as endorsing a view of humanity too ‘esteemed’ for its own good.  “Esteem thyself not” is the anthem of many preaching this position in the West.

Of course, if you Google ‘self esteem’, what you’ll predominantly get is glowing, unabashed praise for self esteem as a movement, and simply as an essential staple of the good life.

If the University of Geneva research is correct, the pro-self-esteem position seems to hold the final trump card in this controversy.  Maybe.  It depends on what is meant by ‘self-esteem.’ Does the term translate well across linguistic and cultural lines?  Does someone in Beijing believe self-esteem is the same thing that someone in Birmingham believes it to be?

With those questions in mind, I’m launching my first poll on this site.  We have an international audience right here, so who better to answer the question than you?




Filed under About Belief, About Research, Books and Ideas

Managing Your Mind in Elsewhere, U.S.A.

Salon has an interesting interview with Dalton Conley, chair of the sociology department at New York University and author of Elsewhere, U.S.A. Conley discusses how the texting, email, and social media culture has “turned the 9/5 into the 24/7,” and the cultural and economic factors underlying the change.

Not long ago I did an interview with Gary Small, author of iBrain: Surviving the Technological Alteration of the Modern Mind, who explores some of the issues Conley mentions but from the perspective of how these technologies are affecting our brains.  I’ll be interested to read Conley’s book to see how he takes this on from a sociological perspective.   The interview with Salon is a good place to get a taste of his approach. 

From the interview:

You seem ambivalent about some of these changes. But you end up saying you’re not telling anyone to toss their BlackBerrys or iPhones, and live in the moment. Well, why not?

If some people want to do that, great. I’m just trying not to sit on some high horse and lecture folks. I’m sure that 50 years from now, the struggles that we are going through with the lack of boundaries will look quaint and silly to folks in 2050, the way the Organization Man, and social life in the ’50s, look quaint and earnest to us now.

We can make choices about policies, such as paid family leave, which would change things for the better. But a lot of the forces, like increased individuation and technology, are going to dictate life, whether we like it or not. I do know folks who sell the business, pack up everything, move to rural Maine and build a log cabin. I think it’s interesting that for them it takes such a drastic act to regain control of their lives. The challenge for most of us is to manage these buzzing, beeping demands on us while being part of the mainstream economy. And at the same time preserve some things we value outside that sphere.

hat tip: Andrew Sullivan


Filed under Books and Ideas

Monkey See, Monkey Persuaded

It has been called one of the greatest commercials ever made — the 1971 Chief Iron Eyes Cody anti-litter advertisement created by the Marsteller ad agency for Keep America Beautiful.  As the camera pans across a littered landscape, Chief Iron Eyes Cody sheds a famous tear, filled with sadness by humanity’s cruel treatment of nature — indeed, people are littering even as he watches and weeps.   In another version, Iron Eyes canoes through a river of pollution, peering across the water to a factory-cluttered shore; people are still littering, and he’s still crying. (Video of one version is at the bottom of this post.)

Better known as the “Crying Indian” ad, it and the larger litter-prevention campaign it was part of were reportedly successful in recruiting an anti-litter workforce across the United States.  According to the campaign’s creators (and that’s important to note), by the end of the 10-year campaign local teams of volunteers had helped to reduce litter by as much as 88% in 38 states.

That’s all worth talking about (we won’t dwell on the fact that Chief Iron Eyes wasn’t really a Native American but an Italian-American; and yes, the tear is fake too) — but what’s even more interesting is the research that the Crying Indian sparked.  I was reminded of this today while reading “Supermarket Trolleys Make Us Behave Badly” in the Times Online, courtesy of The Situationist.   The article summarizes recent research suggesting that disordered, ugly environments inspire disorderly, ugly behavior.

The study picks up on the work of psychologist Robert Cialdini, author of Influence: The Psychology of Persuasion and progenitor of what’s often referred to as ‘the Cialdini Effect’ — in short, the behavior you witness others getting away with will influence you to join in.  If you see a parking lot full of abandoned shopping carts, you’re more likely to leave yours there too, according to Cialdini’s influential theory.

Why this made me think of the Crying Indian is that there’s a lesser-known side to the story of this famous commercial.  While it’s typically credited as part of a successful anti-litter campaign, there’s also the possibility that it actually encouraged littering.  Counterintuitive as it may sound, the littered landscape that made Chief Iron Eyes cry may also have influenced people to litter.

In a 1990 study (summarized in this article published in Current Directions in Psychological Science), Cialdini tested whether the Crying Indian ad contained a conflicting internal dynamic that would produce an effect opposite to the one intended.  Here’s the problem: the ad depicted an already littered environment, and then showed people tossing more litter into the mess. Cialdini wondered whether this might communicate the message that, since other people are littering in what is clearly already a polluted environment, it’s probably OK to do the same.  From the study:

We had three main predictions. First, we expected that participants would be more likely to litter into an already littered environment than into a clean one. Second, we expected that participants who saw the confederate drop trash into a fully littered environment would be most likely to litter there themselves, because they would have had their attention drawn to evidence of a pro-littering descriptive norm – that is, to the fact that people typically litter in that setting.

Conversely, we anticipated that participants who saw the confederate drop trash into a clean environment would be least likely to litter there, because they would have had their attention drawn to evidence of an anti-littering descriptive norm – that is, to the fact that (except for the confederate) people typically do not litter in that setting. This last expectation distinguished our normative account from explanations based on simple modeling processes in that we were predicting decreased littering after participants witnessed a model litter.

The results were as predicted: (1) people littered more in an already littered environment versus a clean one, (2) people littered more when they saw someone else litter in an already littered environment, and (3) people littered less when they saw someone litter in a clean environment. 

If Cialdini is correct (and subsequent research has backed him up) it’s reasonable to believe that the Crying Indian ad unintentionally depicted a favorable environment in which to litter. 

The question is, which norm depicted in the ad holds stronger sway over people’s behavior: the injunctive norm (the perception of what behavior is or is not acceptable – i.e., littering is wrong and makes Chief Cody cry) or the descriptive norm (the perception of what most people actually do – i.e., people are littering in an already well-littered environment)?  Research shows that both norms influence behavior, but when they conflict, people tend to choose what Cialdini predicts they will — the path of least resistance.

So let’s rewrite the ad…  Chief Iron Eyes Cody paddles his canoe to the shore and looks out over a pristine landscape–not even the hint of litter as far as the eye can see. Then, just as he’s tempted to smile about this, someone drives by and throws a Big Mac wrapper out of their car window.  The once unscathed greenery is now defaced by a rancid splotch of garbage. The camera pans back to Chief Cody’s face, and — wait for it — he’s crying. 

The injunctive and descriptive norms no longer conflict:  the message is conveyed that (1) littering is wrong and (2) some irresponsible miscreant just did something wrong by desecrating nature, and making an Italian-American actor who looks like a Native American cry. 

(here’s the original ad)


Filed under About Perception, Books and Ideas

Searching for Useful Metaphors of the Mind

In 1930, Freud was awarded the Goethe Prize for Literature. His masterful use of metaphor, a talent that set him apart from many of his contemporaries, was a major consideration in the awarding of the prize.  Quoting Jonathan Edelson,

Though Freud was a superb writer, as a scientist he was writing not merely to entertain but to inform. Metaphor in Freud’s work is not a mere decorative flourish; it is a necessary part of Freud’s formulation and exposition of his scientific theories.

In light of that esteemed tradition, I occasionally come across mind metaphors that strike me as especially useful — more than just “mere decorative flourish.”  As a self-acknowledged metaphor junkie, I’m always on the lookout for these, few and far between though they may be.  I’ll discuss two below.

The first comes from the book Managing Your Mind, a comprehensive yet readable tour through practical cognitive therapy.  I’d call it a self help book except I think it’s several cuts above that description (no matter how the publisher marketed it).  This metaphor comes from the section of the book on time management:

An old Renault car ran as smooth as cream once it was going, but was a devil to get started, particularly in wet weather.  Most of us are like that car. The first rule of time management is to get to the task at hand. Do not spend time in the limbo of neither getting down to work, nor enjoying your leisure.

What I really like about that metaphor is that everyone has felt the limbo the writer describes – it’s a palpable paralysis.  Take exercise, for example. Once I get myself moving on a treadmill or in the weight room, I can keep moving through the routine; but initially getting up and going is extremely hard.  The same applies to getting started on a project (at work, at home, etc.).  Simple metaphor, but effective.

The next one is exponentially more substantial, and the best way to really get the most from it is to go read Jonathan Haidt’s book, The Happiness Hypothesis, where it’s discussed at length.  Here it is:

The mind is divided into parts that sometimes conflict. Like a rider on the back of an elephant, the conscious, reasoning part of the mind has only limited control of what the elephant does.   

Modern theories about rational choice and information processing don’t adequately explain weakness of the will. The older metaphors about controlling animals work beautifully. The image that I came up with for myself, as I marveled at my weakness, was that I was a rider on the back of an elephant. I’m holding the reins in my hands, and by pulling one way or the other I can tell the elephant to turn, to stop, or to go. I can direct things, but only when the elephant doesn’t have desires of his own. When the elephant really wants to do something, I’m no match for him.

Haidt says early in the book that he began writing thinking that the metaphor would work well for his chapter on “The Divided Self” (where the quote above comes from) but quickly realized that it was central to his entire book.  It’s most applicable to what he calls the fourth division of the self, controlled (rider) vs automatic (elephant) processes of the mind.  I recommend reading the book, but here’s the particular section in Google books if you’d like a preview.

I also like what Haidt has to say about metaphor overall and will close with this quote that sums up the subject well:

Human thinking depends on metaphor. We understand new or complex things in relation to things we already know. For example, it’s hard to think about life in general, but once you apply the metaphor “life is a journey,” the metaphor guides you to some conclusions: You should learn the terrain, pick a direction, find some good traveling companions, and enjoy the trip, because there may be nothing at the end of the road. It’s also hard to think about the mind, but once you pick a metaphor it will guide your thinking.


Filed under About Neuroscience, Books and Ideas

Self-perception Deception Makes for Dangerous Driving

Do as we say, and not as we do.  -Giovanni Boccaccio, 1313-1375

Forbes ran an informative article this week on The Most Dangerous Times to Drive that puts a fine point on one dimension of the self-perception paradox (how we see ourselves versus how we actually are).  Consistently, and across many areas of life, we hold our behavior in far higher regard than our actions demonstrate is deserved, and driving is a fantastic example.  Unfortunately, the end result in this particular case can be fatal (hence the article’s title). I’ve pulled out a few sample facts below.

According to the National Highway Traffic Safety Administration, 95% of all traffic accidents are caused by human error; but when surveyed, 75% of drivers say they are more careful than everyone else on the road.

According to the American Automobile Association, 82% of drivers say that driving while distracted is a serious problem; but more than half say they talk on a cell phone while driving, and 14% admit to reading and sending text messages while driving.

75% of drivers say that speeding is a serious problem; but 20% say that they drive more than 15 miles per hour over the speed limit on the highway, and 14% say they occasionally do so on neighborhood streets.

And the most dangerous time of the year to drive?  It’s not winter. It’s August, on a Saturday.


Filed under About Neuroscience, Books and Ideas

On Procrastination and da Vinci (sometimes genius doodles)

Leonardo da Vinci, it has been said, was an inveterate procrastinator.  Fond of doodling while the hours passed by, the genius who changed the world wasn’t so expert at getting things done.  Helps to put things into perspective, no?  At the engaging site Procrastinus (where one can learn much while procrastinating) we learn this of da Vinci:

He explored almost every field available to him, in both science and art. He made significant contributions in engineering, architecture, biology, botany, anatomy, math, and physics. He sculpted, painted, both portrait and mural (e.g., The Last Supper) and made plans for ingenious machines that wouldn’t be built for centuries (e.g., planes, submarines). He also never finished a project on time.

Part of what made Leonardo such a “Renaissance Man” was that he was as distractible as he was talented. Jacob Bronowski, the scientific historian, speaks about his procrastination. His talents and energy were often wasted in doodles and unfinished projects. The Last Supper was only finished after his patron threatened to cut off all funds. Mona Lisa took twenty years to complete.  The Adoration of the Magi, an early painting, was never finished and his equestrian projects were never built.

His procrastination caused him much grief in later years. Despite his varied contributions, he felt he could have achieved much more. Given his talents, it is without doubt that more of his aspirations could have become a reality in his own time. So much was half-completed that he appealed to God, “Tell me if anything ever was done. Tell me if anything was done.”

Author John Perry may have had a solution for da Vinci. While spending time trying to figure out the source of his own procrastination, he came to this conclusion: Procrastination is the result of a fantasy manufactured to avoid imperfection. Here’s Perry:

Many procrastinators do not realize that they are perfectionists, for the simple reason that they have never done anything perfectly, or even nearly so. They have never been told that something they did was perfect. They have never themselves felt that anything they did was perfect. They think, quite mistakenly, that being a perfectionist implies often, or sometimes, or at least once, having completed some task to perfection. But this is a misunderstanding of the basic dynamic of perfectionism.

Perfectionism is a matter of fantasy, not reality. Here’s how it works in my case.  I am assigned some task, say, refereeing a manuscript for a publisher. I accept the task, probably because the publisher offers to pay me with a number of free books, which I wrongly suppose that if I owned I would get around to reading. But for whatever reason, I accept the task.

Immediately my fantasy life kicks in. I imagine myself writing the most wonderful referees report. I imagine giving the manuscript an incredibly thorough read, and writing a report that helps the author to greatly improve their efforts.  I imagine the publisher getting my report and saying, “Wow, that is the best referee report I have ever read.” I imagine my report being completely accurate, completely fair, incredibly helpful to author and publisher.

His solution?  Structured procrastination.  With a bit of self-imposed psychological judo, the weakness can become a strength.

Procrastinators seldom do absolutely nothing; they do marginally useful things, like gardening or sharpening pencils or making a diagram of how they will reorganize their files when they get around to it. Why does the procrastinator do these things? Because they are a way of not doing something more important. If all the procrastinator had left to do was to sharpen some pencils, no force on earth could get him do it. However, the procrastinator can be motivated to do difficult, timely and important tasks, as long as these tasks are a way of not doing something more important.

Structured procrastination means shaping the structure of the tasks one has to do in a way that exploits this fact. The list of tasks one has in mind will be ordered by importance. Tasks that seem most urgent and important are on top. But there are also worthwhile tasks to perform lower down on the list. Doing these tasks becomes a way of not doing the things higher up on the list. With this sort of appropriate task structure, the procrastinator becomes a useful citizen. Indeed, the procrastinator can even acquire, as I have, a reputation for getting a lot done.

Perry’s essay is very funny, and worth wasting some time to read.  And, if you’ve got time to spare after that, you can go measure your procrastination here.


Filed under Books and Ideas

The Hypnosis-Consciousness Conundrum

Scientific American recently ran an article online entitled “Is Hypnosis a Distinct Form of Consciousness?” that surveyed the last 50 or so years of research on whether hypnosis is truly an altered state of consciousness, or something else.  The ‘something else’ possibility is, according to the authors of the article, the one we’re left with.

Despite best efforts, researchers have not been able to pinpoint any specific “markers”–indicators–of hypnosis that distinguish it from other states of consciousness.  The typical terms associated with hypnosis (trance state, sleep state, posthypnotic amnesia) each fail, under research scrutiny, to describe anything substantial. 

For example, contrary to widespread belief, hypnosis does not involve inducing a “waking sleep” or really anything related to a true sleep state.  Electroencephalographic (EEG) studies confirm that someone under hypnosis is entirely awake (although some people report feeling a little drowsy).  Posthypnotic amnesia (failing to remember what happened during hypnosis) has also been largely contradicted by research, which shows that this reaction only occurs when the subject is told to expect it to occur.

The “trance state” phenomenon has also had the wind taken out of its sails by studies showing the same physiological markers of this alleged state in subjects experiencing ordinary quiet states of concentration.  One of these markers, heightened theta activity in the brain, can easily be achieved in subjects without hypnotic induction.

So if all this is true, then what exactly is hypnosis?  Anyone who has watched a stage hypnosis show, or a streetside hypnotist having fun with passersby, would, I think, agree that something is going on – not to mention hypnosis’ acceptance by many in the psychiatric community as a useful therapeutic tool.

The general conclusion of the SciAm article is that hypnosis works on a fulcrum of suggestibility.  This is not, as might be guessed, “hypnotic suggestibility” – that dynamic has also been undercut by research finding an increase in suggestibility of only 10% or less following hypnotic induction.  Rather, the suggestibility we’re talking about is the old-fashioned kind, the same kind that leads people to buy lottery tickets and get into pyramid schemes.  The more suggestible we are in this way, the more open to being hypnotized – and pliable under hypnosis – we’ll be, however you choose to define the term.

An example of this not discussed in the article, but one we can witness on television every day, is the mass suggestibility evident in things like faith healing.  If you ever get the chance, it’s worth watching one of these closely – the suggestibility manipulation is palpable, and it produces definite results (though not the sort promised by the ‘healer’).  If you spend any time in a Pentecostal church you can watch a similar dynamic play out in ‘speaking in tongues’ (here’s a taste if you need it).

But, while the explanatory value of suggestibility can probably take us quite far, I’m still not sure it gives us everything. Maybe I’ve read too much of psychiatrist and hypnosis maven Milton Erickson for my own good (and am perhaps a victim of my own suggestibility), but I’m inclined to think more is going on, even if it is largely (as Erickson himself admitted) an attention-confusion technique.

Below is an example of an Erickson-inspired instant hypnotic induction.


Filed under About Neuroscience, Books and Ideas