Category Archives: About Belief

What’s More Potent, Testosterone or the Power of Belief?

When most people think of testosterone, words like "aggression," "dominance," and "violence" usually come to mind. Those words are memetically linked with testosterone the way "expensive" is linked with diamonds, and most of us have adopted the linkage without thinking much about it. Collectively, we've adopted a "folk hypothesis" about testosterone – a generalized presupposition, grounded in folk wisdom, that is assumed to be correct.

What makes folk hypotheses noteworthy is that they’re hard to challenge–not because they are fact-based, but because they are so deeply entrenched in collective thinking.  So I was intrigued to come across a study in the journal Nature that takes on the testosterone folk hypothesis directly, and also manages to illustrate something important about the power of belief. 

A fair amount of evidence has surfaced that testosterone is a key ingredient in social relations, not by increasing conflict, but by decreasing it. With that in mind, researchers wanted to know what would happen if they gave a group of female subjects a sublingual dose of testosterone before they played the ultimatum game (the bargaining game in which one subject must negotiate with another about how to divide a sum of money; if an agreement is reached, both parties get the money as agreed, and if not, neither gets anything). The subjects were not told whether they were receiving a placebo or real testosterone, only that they were getting a dose that could be either.

The folk hypothesis about testosterone predicts that it will increase unfair bargaining by making one of the parties more conflictual and less willing to negotiate. In the ultimatum game, unfair bargaining means offering significantly less than 50% of the sum. When an offer falls significantly below 50%, the other party will typically refuse it, preferring that neither party receive the money over accepting the indignity of being treated unfairly. If someone with a testosterone boost is experiencing heightened feelings of dominance and aggression, it makes sense that fairness wouldn't be top of mind.
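For readers new to the game, here's a minimal sketch of a single ultimatum-game round in Python. It's illustrative only: the pot size and the 50% rejection threshold are simplifying assumptions of mine, not parameters from the study (real responders vary in how low an offer they'll tolerate).

```python
# Minimal sketch of one ultimatum-game round (illustrative; the 50%
# rejection threshold is a simplifying assumption, not a study figure).

POT = 10.0  # the sum of money to be divided

def propose(offer_fraction: float) -> float:
    """Proposer offers a fraction of the pot to the responder."""
    return POT * offer_fraction

def respond(offer: float, fairness_threshold: float = 0.5) -> bool:
    """Responder accepts only if the offer meets their fairness bar.
    If rejected, both players walk away with nothing."""
    return offer >= POT * fairness_threshold

def play_round(offer_fraction: float) -> tuple[float, float]:
    """Returns (proposer's share, responder's share) for one round."""
    offer = propose(offer_fraction)
    if respond(offer):
        return (POT - offer, offer)   # deal: both get their agreed shares
    return (0.0, 0.0)                 # no deal: neither gets the money

print(play_round(0.5))   # fair offer   -> (5.0, 5.0)
print(play_round(0.2))   # unfair offer -> (0.0, 0.0) under this threshold
```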

The results, however, were exactly the opposite. Overall, the subjects who actually received testosterone were not less fair but significantly more fair in their dealings. So much for the folk hypothesis.

But that result, though compelling, is not the most telling part of this study. After the games were played, researchers asked the subjects whether they believed they had received a dose of testosterone or a placebo. Subjects who believed they had received testosterone, whether they actually had or not, acted the part: thinking they were negotiating under the influence of a notorious hormone, they bargained significantly less fairly than those who thought they had received only a placebo.

So as much as this study showed that the folk hypothesis about testosterone is flawed, it also showed that belief is a powerful enough agent to induce the very effects it presumes; it's even more powerful than the infamous chemical in question. Time and time again, belief proves itself the strongest bully on the cognitive block.
Abbott, A. (2009). Testosterone link to aggression may be all in the mind. Nature. DOI: 10.1038/news.2009.1131



Filed under About Belief, About Research

When it Comes to Trusting Authority, Moral Conviction and Religiosity Part Ways

One of the consistent elements in political discussions is the influence of religious belief on attitudes toward government. And typically it's assumed that a high degree of religiosity is synonymous with a high degree of moral conviction – they're popularly thought to go hand-in-hand. So, if someone's attitude toward governmental authority is influenced by his or her religiosity, it should logically follow that this attitude is further buttressed by his or her moral conviction; the influence should be the same.

But is that true?

A new study in the journal Psychological Science sought to find out how religiosity and moral conviction influence attitudes toward authority. A survey was administered to a representative sample of 727 Americans, ages 19-90, to assess the degree of trust or mistrust people have in major decisions made by the Supreme Court (in this case, physician-assisted suicide, a.k.a. 'PAS'). The sample drew from a wide range of socioeconomic and educational backgrounds.

Measures evaluated via the survey included:

  • Support or opposition to PAS
  • Level of strength or weakness of support or opposition (to gauge attitude extremity)
  • Overall level of moral conviction
  • Trust in Supreme Court to make decisions regarding PAS
  • Length of time it takes to give an opinion on level of trust in Supreme Court (to reveal the degree of visceral emotion linked to this opinion; more emotion = less time)
  • Level of overall religiosity

Here’s what researchers found out:  First, the stronger a person’s moral conviction, the less they trust the Supreme Court to make a judgment about PAS.  Conversely, the higher the degree of a person’s religiosity, the MORE they trust the Supreme Court to make a decision on this sensitive issue. 

Just to be clear about that — the results for moral conviction were exactly the opposite of those for religiosity. 

Also, the stronger a person’s moral conviction, the faster they responded to the trust question, indicating a visceral reaction as opposed to a more considered one. Likewise, the higher the degree of someone’s religiosity, the faster they responded to the trust question. So for both moral conviction and religiosity, responses were visceral rather than deliberate.

At least two major implications can be drawn from this study. The first is that the typical assumption that religiosity and moral conviction are necessarily synonymous is false. Moral conviction in this study was strongly linked to distrust in legitimate authority, while religiosity was strongly linked to trust in legitimate authority.

The second implication is that morally convicted people don’t merely “react” to decisions with which they don’t agree. Instead, it’s clear that they don’t trust legitimate authorities to make the right decisions in the first place.  Their reaction is simply a projection of a predisposition already strongly held. 

The one crucial area this study didn’t tease out fully enough, in my opinion, is where religiosity and moral conviction overlap. Presumably, level of moral conviction would trump level of religiosity on attitudes toward authority (at least it certainly seems this way) – but it’s also possible that religiosity has a moderating effect on moral conviction’s influence in some cases.  It would have been useful to see this worked out more carefully in the study; nevertheless, the results are telling.

UPDATE:  It’s always great when an author of a study reviewed here comments on the post.  Dr. Linda Skitka, one of the authors of this study, left the comment below, which provides an important clarification.  Many thanks!

I’m one of the authors of this article. FYI: we did test whether religiosity moderated the effects of moral conviction, and it did not–in other words, the effects of moral conviction on trust in the Supreme Court did not change as a function of whether the perceiver was low or high in religiosity. We measured both general religiosity, as well as whether people’s feelings about PAS were based on religious convictions, and got the same pattern of results regardless of which way we operationalized “religiousness”. Interestingly (and counter-intuitively), about one-third of those whose attitude about PAS reflected a strong religious conviction did not report that their attitude about PAS was a strong moral conviction.

Wisneski, D., Lytle, B., & Skitka, L. (2009). Gut reactions: Moral conviction, religiosity, and trust in authority. Psychological Science, 20(9), 1059-1063. DOI: 10.1111/j.1467-9280.2009.02406.x


Filed under About Belief, About Morality, About Religion, About Research

What Enemas and Demonic Possession Have to Do with Developing False Beliefs

If there’s anything that cognitive psychology studies have made clear over the years, it’s that humans can be exceptionally gullible. With a little push, we’re prone to developing false beliefs not only about others but also about ourselves, and the results can be, well, hard to believe.

For example, a study in 2001 asked participants to rate the plausibility of having witnessed demonic possession as children and their confidence that they had actually witnessed one. Later, the same participants were given articles describing how commonly children witness demonic possessions, along with interviews with adults who claimed to have witnessed possessions as children. After reading the articles, participants not only rated demonic possession as more plausible than they’d previously said, but also became more confident that they themselves had actually witnessed demonic possession as children.

Another (less dramatic) study asked participants to rate the plausibility that they’d received barium enemas as children.  As with the other study, participants were later presented with “credible” information about the prevalence of barium enemas among children, along with step-by-step procedures for administering an enema. And again, the participants rated the plausibility of having received a barium enema as children significantly higher than they had before.

A recent study in the journal Applied Cognitive Psychology set out to determine the effect of prevalence information (information that establishes how commonly an event happens, making it seem more likely and therefore more self-relevant) on the development of false beliefs. Participants were asked to rate the plausibility of 10 events from 1 (not at all plausible) to 8 (extremely plausible), and how confident they were that they’d experienced each event from 1 (definitely did not happen) to 8 (definitely did happen). The events ranged from the highly plausible (I got lost in a shopping mall as a child) to the highly implausible (I was abducted by a UFO).

Two weeks later, participants were brought back and given information on four of the events they’d previously rated, all in the low-to-moderate plausibility range (no UFOs). The information included newspaper articles, third-person descriptions, and data from previous study subjects — all of which was designed to establish higher prevalence of the events. 

The results:  High prevalence information from all sources affected the development of false beliefs. In particular, participants given high prevalence information in false newspaper articles became more confident that they had actually experienced the events, testifying to the power of the printed word.

The takeaway here probably has a few prongs. First, we shouldn’t discount the possibility that we’re just as susceptible to developing false beliefs as anyone else walking around this planet. The brain is a superb miracle of errors and no one, except the brainless, is exempt. On the other hand, knowing that to be true is also the best preventative against chasing the make-believe rabbit into its hole. A little error adjustment can go a long, long way.


Filed under About Belief, About Research

What Might Make You Trust a Stranger?


by David DiSalvo

It comes as no surprise that people tend to prefer members of their own in-group. If you’re a diehard supporter of a political candidate and someone drives by with a bumper sticker endorsing that candidate, you feel a hint of “inness” with that person. If someone drives by with a bumper sticker for the candidate’s opponent, you feel a twinge of “otherness” about that person. If asked why, you might say that the first person probably shares many of your views and is on your team, more or less, while the second driver’s bumper sticker shows that she’s on the other team. In effect, you feel a sense of in-group trust with the first person that you don’t feel with the second.

But why, exactly, trust a stranger any more than another stranger if you don’t really know either of them?  That question was addressed in a study in the April issue of Psychological Science.  The study begins by establishing two possible bases for group-based trust. The first is stereotyping — people tend to judge in-group members as nicer, more helpful, generous, trustworthy and fair. The second is expectation — people tend to expect relatively better treatment from in-group members because they are thought to value, and want to further, other in-group members’ interests.

Study participants were offered a choice between an unknown sum of money from an in-group member or an out-group member (and were told that the in-group and out-group members controlled the amount of money to allocate as they desired).  The initial result was that participants overwhelmingly chose the in-group member option.  And, surprisingly, this result held true even when the stereotype of the in-group was more negative than that of the out-group. Good, bad or indifferent, the stereotype was ignored in favor of group identity. 

But when participants were told that the in-group money giver didn’t know they were part of the same group, the situation changed. In that case, participants resorted to making their choice on the basis of stereotype. If the in-group was portrayed negatively, then the participants were more likely to choose the out-group member option, and vice versa.

So this study suggests that when members of the in-group are mutually aware of their inness, there’s an expectation of better treatment than would be received from an out-grouper. But when that awareness is muddied, reliance on stereotypes kicks in. 
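One compact way to summarize that pattern is as a toy decision rule. The sketch below is my paraphrase of the reported findings, not a model from the paper; the function and its inputs are hypothetical.

```python
# Toy decision rule distilled from the reported pattern (my paraphrase):
# mutual awareness of shared group membership drives the choice;
# without it, stereotype content takes over.
def choose_allocator(mutually_aware: bool, ingroup_stereotype: str) -> str:
    if mutually_aware:
        return "in-group member"   # expectation of better treatment wins
    # awareness muddied: fall back on the stereotype
    return "in-group member" if ingroup_stereotype == "positive" else "out-group member"

print(choose_allocator(True, "negative"))   # -> in-group member
print(choose_allocator(False, "negative"))  # -> out-group member
```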

This analysis gets really interesting when focused on electronic communication. Online, most people are not aware of others’ inness or outness. According to the results of this study, in these cases we’d expect most people to rely on group stereotypes when deciding whom to trust (follow, read, etc.), and social networking provides fertile ground to test this hypothesis in real time.

Foddy, M., Platow, M., & Yamagishi, T. (2009). Group-based trust in strangers: The role of stereotypes and expectations. Psychological Science, 20(4), 419-422. DOI: 10.1111/j.1467-9280.2009.02312.x


Filed under About Belief, About Perception, About Research

To Esteem Thyself, Or Not

If anyone were asked to list the top 10 topics that ignite arguments, I doubt very much that ‘self-esteem’ would make the cut. And yet this seemingly bland, bordering-on-clichéd topic is in fact the source of many battles. Too little or too much is the question: how much self-esteem is the right amount?

Now a study from the University of Geneva (courtesy of BPS Research) suggests that low levels of self-esteem are linked to higher suicide rates around the world. Researchers evaluated suicide rates and self-esteem levels across 55 nations, using data from the International Sexuality Description Project, and arrived at this conclusion:

Results indicate that suicide is especially common in nations with relatively low levels of self-esteem. This relation is consistent across sex lines, age of suicide and independent from several other relevant factors such as economic affluence, transition, individualism, subjective well-being, and neuroticism.
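To make the phrase “independent from several other relevant factors” concrete: statistically, it means the self-esteem/suicide association holds up when those other variables are held constant, as in a multiple regression. Here’s a minimal sketch of that general technique using made-up data; it’s my illustration, not the study’s actual analysis, variables, or numbers.

```python
# Illustrative only: estimating the self-esteem/suicide association
# with covariates held constant. All data are random stand-ins.
import numpy as np

rng = np.random.default_rng(0)
n = 55  # number of nations, as in the study

# Hypothetical nation-level variables
self_esteem = rng.normal(0, 1, n)
affluence = rng.normal(0, 1, n)
neuroticism = rng.normal(0, 1, n)
suicide_rate = -0.5 * self_esteem + 0.2 * affluence + rng.normal(0, 1, n)

# Design matrix: intercept, predictor of interest, then covariates
X = np.column_stack([np.ones(n), self_esteem, affluence, neuroticism])
beta, *_ = np.linalg.lstsq(X, suicide_rate, rcond=None)

# beta[1] is the self-esteem coefficient with covariates controlled;
# the study reports this association as negative.
print(f"self-esteem coefficient, covariates controlled: {beta[1]:.2f}")
```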

If these results strike you as uncontroversial, consider the work of psychologist Roy Baumeister, a decades-long public critic of the self-esteem movement — and, one might confidently say, of self-esteem in general. Baumeister’s research tells a different story about high self-esteem, linking it not to successful performance in life, but to tendencies toward bullying, murder, racism and gang involvement. Here’s a snippet from an article he wrote in the Los Angeles Times a few years ago:

It was widely believed that low self-esteem could be a cause of violence, but in reality violent individuals, groups and nations think very well of themselves. They turn violent toward others who fail to give them the inflated respect they think they deserve. Nor does high self-esteem deter people from becoming bullies, according to most of the studies that have been done; it is simply untrue that beneath the surface of every obnoxious bully is an unhappy, self-hating child in need of sympathy and praise. High self-esteem doesn’t prevent youngsters from cheating or stealing or experimenting with drugs and sex. (If anything, kids with high self-esteem may be more willing to try these things at a young age.)

He also points to research indicating that self-esteem in high doses leads to a host of more common shortcomings:

High self-esteem in schoolchildren does not produce better grades. In fact, according to a study by Donald Forsyth at Virginia Commonwealth University, college students with mediocre grades who got regular self-esteem strokes from their professors ended up doing worse on final exams than students who were told to suck it up and try harder.

Self-esteem doesn’t make adults perform better at their jobs either. Sure, people with high self-esteem rate their own performance better – even declaring themselves smarter and more attractive than their low self-esteem peers – but neither objective tests nor impartial raters can detect any difference in the quality of work.

And Baumeister isn’t alone, either on the secular or the sectarian front. Nicholas Emler, a noted social psychologist at the London School of Economics, shares this view and adds many additional caveats. And religious leaders the world over have routinely condemned the self-esteem movement (in its official and generic forms) as endorsing a view of humanity too ‘esteemed’ for its own good. “Esteem thyself not” is the anthem of many preaching this position in the West.

Of course, if you Google ‘self-esteem’, what you’ll predominantly get is glowing, unabashed praise for self-esteem as a movement, and simply as an essential staple of the good life.

If the University of Geneva research is correct, the pro-self-esteem position seems to hold the trump card in this controversy. Maybe. It depends on what is meant by ‘self-esteem.’ Does the term translate well across language and cultural lines? Does someone in Beijing believe that self-esteem is the same thing that someone in Birmingham believes it to be?

With those questions in mind, I’m launching my first poll on this site.  We have an international audience right here, so who better to answer the question than you?


Filed under About Belief, About Research, Books and Ideas

Kluge on the Brain: An Interview with Author Gary Marcus

If you’ve ever wondered why your mind seems to fail at the wrong times despite every earnest attempt to get everything right, or why following the most touted self-help program to a perfect T still doesn’t yield results as advertised – it’s time you got in touch with your inner kluge. Fortunately, Gary Marcus, professor of psychology at New York University, has written the definitive book on the topic – one that could be administered as an antidote to self-help blindness far and wide.

Reading Kluge is not unlike being injected with a dose of “ah ha”. The effect of the elixir isn’t to reassure us that perfection is attainable if only we do, and think, all the right things, but rather to cogently reveal that perfection was never within reach to begin with. And yet we still clumsily get by, making do with this strange kluge of a mind that manages to work despite itself. Gary Marcus was kind enough to take a break from a grueling travel schedule to explain a bit about how this all works.

For those who haven’t yet read your book, what exactly is a kluge? 

Kluge is a word that engineers use to describe a solution that’s clumsy but surprisingly effective; think MacGyver or Rube Goldberg: duct tape, baling wire and rubber bands.


The idea that our minds operate haphazardly seems to fly in the face of our tendency to think that we’re especially well-designed creatures, standing above and apart from the chaotic animal world. What’s your reply when you hear arguments along those lines?

We do, for sure, in many ways stand apart from other animals. No other animal has a communication system that’s as sophisticated or powerful as ours, and no other animal can do as much with culture as we can. We do stand apart, but saying that our minds are (in many, though not all, respects) more powerful than those of our animal cousins is not the same as saying we are perfect, or even particularly good at what we do.

We have, for example, the capacity to reason deliberately, but evolution did a fairly poor job of installing that capacity, such that the deliberate reasoning faculties frequently stand around idle, yielding much of decision making to ancestral mechanisms that are tuned more towards short-term rewards. The thing you have to remember is that human beings have only been around 100,000 years or so, and that’s not a lot of time for evolution to iron out the bugs.


You discuss at some length the complementary biases known as confirmation bias and motivated reasoning, and include what I thought was an especially provocative statement: “The reality is that we are just not born to reason in balanced ways.” Can you elaborate a bit on how these biases work together to unbalance our ability to reason?   

Confirmation bias is the tendency to notice evidence that supports our own theories, even as we miss evidence that might contradict those theories. If you believed in astrology, for example, you might find it easy to remember the days when your horoscope came true, yet tend to overlook the days in which the horoscope seemed off target.

Motivated reasoning is kind of the opposite: it’s the tendency to work harder to debunk things that we don’t like relative to things that we do. If we like an idea, we give it a free ride; if we dislike it, we dig in. The net result is that people often insulate themselves from ideas that challenge their beliefs. Republicans watch Fox, Democrats listen to NPR, and very few people ever really change their minds about anything of significance.


Your book ends with a number of suggestions for avoiding the pitfalls of our minds’ klugery.  The one I found the most compelling in its simplicity is number 13, “Try to be rational.” Is that possible when it seems we begin from a less-than-rational, ‘klugified’ starting point? How do we get there from here?

Being rational is not something that comes naturally to us, but it is (at least to some degree) something we can do; the real trick is to remember to do it. I think of it a bit like trying to fix your golf swing; you may naturally want to bring your shoulders up, but if you work hard enough at it you can learn to keep them down. The problem isn’t so much in keeping your shoulders down for one shot, but in learning to do so routinely.

Our problem as a species is not that we can’t behave rationally, but that usually we don’t; simply being aware of that fact can help us to build better habits.


You spend some time discussing implications of the ideas addressed in the book for education. What, in your opinion, needs to happen for our educational system to really begin teaching people to think critically, rather than, as you say, rely on “revealed truths”?

I think we have to rethink what it is that we want to achieve in our schools. In my view, we spend way too much time having kids memorize trivia that can easily be looked up on the web; maybe that made sense in the 18th century, but now it’s a waste of time. The memorize-and-test framework exists because it’s easy for teachers and straightforward for students, but it doesn’t leave much that lasts. Meanwhile, so-called “critical thinking skills” are teachable, but teaching them takes time, just like teaching anything else. So it’s a matter of priorities. Do we want our kids to memorize dates and places, or teach them to think for themselves?

Link to Kluge on Amazon

Link here to see a recent Bloggingheads.tv interview Gary Marcus did with science writer Carl Zimmer


Filed under About Belief, About Neuroscience, Interviews

Jonathan Haidt on the Moral Roots of Ideology

Jonathan Haidt, professor of psychology at the University of Virginia and author of The Happiness Hypothesis, wrote a provocative article for Edge not long ago about the moral roots of ideology, which garnered some notable responses. The article and responses are well worth reading, and Haidt’s talk from TED 2008 below is also well worth the time. Haidt focuses on the five moral values that form the basis of our political choices, whether we’re left, right or center, and pinpoints the moral values that liberals and conservatives tend to honor most.


Filed under About Belief, About Morality, Videos