Category Archives: About Morality

Power Makes the Hypocrite Bolder and Smugger

We’ve all had the experience of listening to someone in a position of power rail against the moral ineptitude of others. Turn on the news on any given day and you’re likely to see someone moralizing about family values, for example. Most of us listen to these diatribes and wonder whether those doing the judging would fare well under the same judgment; we strongly suspect they would not.

A new study that will be published in a forthcoming issue of Psychological Science confirms our suspicions. Researchers investigated whether people in positions of power who hold high standards for others actually live up to those standards themselves. To do so, they set up a power simulation in which two groups of people were assigned roles of ‘high-power’ or ‘low-power’ (specifically, the roles of prime minister or civil servant). Participants were then presented with a litany of moral dilemmas related to breaking traffic laws, reporting taxes, returning stolen property, and over-reporting travel expenses, among others.

Five experiments followed in which researchers examined the impact of power on the moral hypocrisy of the participants. They found a consistent and alarming outcome:  those assigned to the ‘high-power’ group repeatedly condemned moral failures of others while committing unethical acts themselves. In one experiment, high-power participants were asked for their positions on cheating and over-reporting travel expenses, both of which they flatly condemned.  They and the low-power group were then asked to play a dice game alone, in a private cubicle, to win lottery tickets. The powerful reported significantly higher lottery winnings than the low-power group, even though both groups had the same odds of winning.


Filed under About Morality, About Research

When it Comes to Trusting Authority, Moral Conviction and Religiosity Part Ways

One of the consistent elements in political discussions is the influence of religious belief on attitudes toward government. And typically it’s assumed that a high degree of religiosity is synonymous with a high degree of moral conviction – they’re popularly thought to go hand in hand. So, if someone’s attitude toward governmental authority is influenced by his or her religiosity, it should logically follow that this attitude is further buttressed by his or her moral conviction; the influence should be the same.

But is that true?

A new study in the journal Psychological Science sought to find out how religiosity and moral conviction influence attitudes toward authority. A survey was administered to a representative sample of 727 Americans, ages 19-90, to assess the degree of trust or mistrust people have in major decisions made by the Supreme Court (in this case, physician-assisted suicide, a.k.a. ‘PAS’). The sample drew from a wide socioeconomic and educational background.

Measures evaluated via the survey included:

  • Support or opposition to PAS
  • Level of strength or weakness of support or opposition (to gauge attitude extremity)
  • Overall level of moral conviction
  • Trust in Supreme Court to make decisions regarding PAS
  • Length of time it takes to give an opinion on level of trust in Supreme Court (to reveal the degree of visceral emotion linked to this opinion; more emotion = less time)
  • Level of overall religiosity

Here’s what researchers found out:  First, the stronger a person’s moral conviction, the less they trust the Supreme Court to make a judgment about PAS.  Conversely, the higher the degree of a person’s religiosity, the MORE they trust the Supreme Court to make a decision on this sensitive issue. 

Just to be clear about that — the results for moral conviction were exactly the opposite of those for religiosity. 

Also, the stronger a person’s moral conviction, the faster they responded to the trust question, indicating a visceral reaction as opposed to a more considered one.  Likewise, the higher the degree of someone’s religiosity, the faster they responded to the trust question.  So in the case of both moral conviction and religiosity, responses were significantly visceral.

At least two major implications can be drawn from this study. The first is that the typical assumption that religiosity and moral conviction are necessarily synonymous is false. Moral conviction in this study was strongly linked to distrust in legitimate authority, while religiosity was strongly linked to trust in legitimate authority.

The second implication is that morally convicted people don’t merely “react” to decisions with which they don’t agree. Instead, it’s clear that they don’t trust legitimate authorities to make the right decisions in the first place.  Their reaction is simply a projection of a predisposition already strongly held. 

The one crucial area this study didn’t tease out fully enough, in my opinion, is where religiosity and moral conviction overlap. Presumably, level of moral conviction would trump level of religiosity on attitudes toward authority (at least it certainly seems this way) – but it’s also possible that religiosity has a moderating effect on moral conviction’s influence in some cases.  It would have been useful to see this worked out more carefully in the study; nevertheless, the results are telling.

UPDATE:  It’s always great when an author of a study reviewed here comments on the post.  Dr. Linda Skitka, one of the authors of this study, left the comment below, which provides an important clarification.  Many thanks!

I’m one of the authors of this article. FYI: we did test whether religiosity moderated the effects of moral conviction, and it did not–in other words, the effects of moral conviction on trust in the Supreme Court did not change as a function of whether the perceiver was low or high in religiosity. We measured both general religiosity, as well as whether people’s feelings about PAS were based on religious convictions, and got the same pattern of results regardless of which way we operationalized “religiousness”. Interestingly (and counter-intuitively), about one-third of those whose attitude about PAS reflected a strong religious conviction did not report that their attitude about PAS was a strong moral conviction.

Wisneski, D., Lytle, B., & Skitka, L. (2009). Gut Reactions: Moral Conviction, Religiosity, and Trust in Authority. Psychological Science, 20(9), 1059–1063. DOI: 10.1111/j.1467-9280.2009.02406.x


Filed under About Belief, About Morality, About Religion, About Research

Riding the Self-Regulation Seesaw


by David DiSalvo

The April issue of Psychological Science includes an interesting study on the paradox of moral self-regulation. Building on previous research that examines why people act altruistically even when such action is costly, researchers wanted to take a closer look at the moral back-and-forth we all engage in when deciding how to act.

In the spotlight are two polar-opposite concepts: moral cleansing and moral licensing. Moral cleansing is the tendency to engage in a moral behavior to offset a deficit in one’s sense of self-worth. For example, if someone feels bad that they don’t regularly recycle, they might be strolling through a Home Depot one day and decide to buy a boxful of energy-efficient light bulbs to switch out all the less-efficient bulbs in their house. The self-worth deficit of the initial inaction is offset by the self-worth boost of the purchase.

In the case of moral licensing, someone may be inclined to either act immorally, or (more likely) not act at all, if they already have a self-perception of being a moral person. For example, if someone just volunteered to work a charity event, they may be less likely to give blood during a blood drive the following week.  The first action produced a moral-currency surplus that boosted self-worth, and acted as a “pass” not to engage in the second action.

In the first experiment, participants were asked to write a self-relevant story using words that referred to either their positive or negative traits.  After finishing, they were told that the research lab was interested in supporting social awareness and usually asks participants if they would like to make a small donation to the charity of their choice. Participants were told they could write down the charity and the amount they wanted to donate (note, they were not aware of any link between the story they wrote and the charity donation).  The result: participants who used positive words about themselves in their stories donated one fifth as much money as those who used negative words.  A follow-up experiment verified these results.

In the final experiment, participants completed the same positive/negative-words story task as in the first, but were then asked to take part in another study focused on the best way to reduce pollution at manufacturing plants. They were given a scenario in which they were in charge of a plant under pressure from environmental lobbyists to run pollution filters on its smokestacks, at a monetary cost, and were presented with the cost of running the filters for varying percentages of time. The standard, “legally agreed upon” amount of time was 60%, at a cost of $1.2 million. Alternatively, they were told they could deviate from this agreement and run the filters for any 10% interval of time up to 100%, with each interval costing $0.2 million. In effect, participants were being asked to weigh cost to the company against benefit to the environment. They were also asked a number of follow-up questions, such as what they thought the responsibility of a plant manager should be in this situation, and how likely they thought it was that they’d be caught if they decided not to adhere to the agreement.
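To make the trade-off concrete, here is a minimal sketch of the cost schedule implied by those figures; the per-interval price comes straight from the scenario described above, while the variable names and printout are my own illustration:

    # Illustrative only: the filter-cost schedule implied by the scenario above
    # ($0.2 million per 10% interval of run time, consistent with the stated
    # $1.2 million for the legally agreed-upon 60%).
    AGREED_RATE = 60          # percent of time the filters run under the agreement
    COST_PER_INTERVAL = 0.2   # millions of dollars per 10% of run time

    for rate in range(10, 101, 10):
        cost = COST_PER_INTERVAL * (rate // 10)
        marker = "  <- agreed standard" if rate == AGREED_RATE else ""
        print(f"Run filters {rate:3d}% of the time: ${cost:.1f} million{marker}")

Seen this way, running the filters the full 100% of the time would cost $2 million, while every 10% shaved off the 60% standard saves the company $0.2 million at the environment’s expense.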

The results: participants who used positive words in their self-relevant stories chose to run the pollution filters significantly less often than those who used negative words. They were also more likely to say that plant managers should place profit ahead of environmental concerns, and less likely to believe they’d be caught if they broke the agreement.

So this study suggests that our perceived level of self-worth affects our moral decisions. More specifically, it suggests that feelings of negative self-worth can predispose us to acting morally in an effort to fill up the self-worth bank account. If the account is already full, we might be predisposed to choose not to act morally, or not to act at all.

That being said, there are many caveats that make this study less than complete. For example, the last experiment doesn’t really tell us if moral licensing was going on because there was no way of gauging how much moral activity any of the participants had engaged in prior to the study — this would have been useful to know. Also, we don’t know if some participants believed that acting firmly in the interests of the company was the more moral decision, particularly since such a high cost might affect jobs, workers’ benefits, etc.   But, there’s enough here to be thought-provoking nonetheless.

Sachdeva, S., Iliev, R., & Medin, D. (2009). Sinning Saints and Saintly Sinners: The Paradox of Moral Self-Regulation. Psychological Science, 20(4), 523–528. DOI: 10.1111/j.1467-9280.2009.02326.x


Filed under About Morality, About Research

The Banality of Evil

The New Yorker has an excellent article entitled “Beware of Pity” about Hannah Arendt, the brilliant historian and political scholar who coined the term “the banality of evil” to describe what she witnessed at the trial of Adolf Eichmann, the infamous architect of the Holocaust. She said of Eichmann, and by extension of all the Nazi functionaries, “(he) does not regard himself as a murderer because he has not done it out of inclination but in his professional capacity.” This radical compartmentalization of the public and private self to justify horrendous acts is, she said, “the fearsome, word-and-thought-defying banality of evil.”

The entire article is worth reading, if you’re inclined to learn more about Arendt and her influence, but I want to paste in the first section below because it’s such a salient example of the banality of evil.   Croatian novelist Slavenka Drakulić visited The Hague in 1999 to observe trials for war crimes committed in the former Yugoslavia.  Her discovery is described in this excerpt:

Among the defendants was Goran Jelisić, a thirty-year-old Serb from Bosnia, who struck her as “a man you can trust.” With his “clear, serene face, lively eyes, and big reassuring grin,” he reminded Drakulić of one of her daughter’s friends. Many of the witnesses at The Hague shared this view of the defendant—even many Muslims, who told the court how Jelisić helped an old Muslim neighbor repair her windows after they were shattered by a bomb, or how he helped another Muslim friend escape Bosnia with his family.

But the Bosnian Muslims who had known Jelisić seven years earlier, when he was a guard at the Luka prison camp, had different stories to tell. Over a period of eighteen days in 1992, they testified, Jelisić himself killed more than a hundred prisoners. As Drakulić writes, he chose his victims at random, by asking “a man to kneel down and place his head over a metal drainage grating. Then he would execute him with two bullets in the back of the head from his pistol, which was equipped with a silencer.” He liked to introduce himself with the words “Hitler was the first Adolf, I am the second.” He was sentenced to forty years in prison.

None of Drakulić’s experience in creating fictional characters could help her understand such a mind, which remained all the more unfathomable because of Jelisić’s apparent normality, even gentleness. “The more you realize that war criminals might be ordinary people, the more afraid you become,” she wrote.

For more on this topic, you may be interested to read my interview with Dr. Philip Zimbardo, author of  The Lucifer Effect. Dr. Zimbardo’s position regarding evil acts is similar to Arendt’s, and he has been a pioneer in this field of psychological research for the last four decades.


Filed under About Morality, Books and Ideas

This is Your Brain on Morality

In October, Sam Harris (author, provocateur, and neuroscientist) spoke at Beyond Belief: Candles in the Dark, an annual event sponsored by the Salk Institute for Biological Studies. This year participants were asked to propose a “Candle”: a potential solution to a problem they have identified in their area of expertise. Harris delivered a talk called “Can We Ever Be Right About Right and Wrong?” on the relationship between science and morality, and specifically on the erroneous idea that science is divorced from morality. Both parts of the talk are below (roughly 10 minutes each).


Filed under About Morality, Videos

You Will Be Emulated, Resistance is Futile

Take one part old-school materialism and judiciously mix it with two parts new-school neuroscience, and you get the latest release from the Future of Humanity Institute: The Whole Brain Emulation Roadmap. This is incredibly heady stuff, pardon the pun, and I admit to being fascinated by the implications of digitally emulating a human brain. But my inner techno-skeptic urges me to temper my enthusiasm.

Consider this statement from the opening section on why the research is important, under the header “Individuality”:

If emulation of particular brains is possible and affordable, and if concerns about individual identity can be met, such emulation would enable back‐up copies and “digital immortality”. 

It would seem, then, that the braintrust behind the roadmap is openly acknowledging that this is in part an immortality project. Provided concerns about “individual identity” can be met, brain emulation would allow us to live forever (though sipping your favorite Cabernet would be a little harder, unless that can be digitized as well).  I’m certain that Ernest Becker would be snickering while reading about how neuroscience can allegedly provide the ultimate ‘denial of death.’

The document is thick and rich and well worth perusing. For those inclined, you’ll learn about applications and implications of topics like “nanodisassembly” and “nondestructive scanning.” And the neural mapping discussion is fascinating all on its own. But perhaps nothing in this tome is as thought-inspiring as the statements concerning the inevitable collision of whole brain emulation and the concept of free will – which the researchers seem to indicate can be handled by inputting “sufficient noise in the simulation,” followed by this statement:

Hidden variables or indeterministic free will appear to have the same status as quantum consciousness: while not in any obvious way directly ruled out by current observations, there is no evidence that they occur or are necessary to explain observed phenomena.

I take this to mean it’s not really a concern. I can’t help but find this a curious way to treat a problem that has been a top-three topic in Western philosophy for centuries.

As a matter of working through the technical feasibility of a theoretically possible project, the Roadmap is weighty stuff. But before revision 2.0 comes out, I think the authors need to consult a broader range of qualified thinkers on topics that deserve far more deliberation and concern than they’ve given them.

 

Brain map graphic is from The Book of Life: The Spiritual and Physical Constitution of Man (1912)


Filed under About Morality, About Neuroscience

Jonathan Haidt on the Moral Roots of Ideology

Jonathan Haidt, professor of psychology at the University of Virginia and author of The Happiness Hypothesis, wrote a provocative article for Edge not long ago about the moral roots of ideology, which garnered some notable responses. The article and responses are well worth reading, and Haidt’s talk from TED 2008, below, is also worth your time. Haidt focuses on the five moral values that form the basis of our political choices, whether we’re left, right, or center, and pinpoints the values that liberals and conservatives tend to honor most.


Filed under About Belief, About Morality, Videos