Monthly Archives: February 2009

What is Literary Darwinism? An Interview with Joseph Carroll

If you’ve heard the term “Literary Darwinism,” you may have been tempted to lump it in with the list of schools of thought conjoining evolutionary thinking with, well, almost everything. In the interest of full disclosure, I’ll admit I did the same.

But the simple fact that this field has in the last couple of decades attracted a diversity of credible thinkers, from the sciences and literary studies alike, urged me to take a closer look. After all, there aren’t many fields featuring in their ranks both preeminent scientists like E. O. Wilson and renowned authors like novelist Ian McEwan. The closer look led me to the work of the field’s most respected thinkers, among them Joseph Carroll.

Joseph Carroll, professor of English at the University of Missouri-St. Louis, wrote the book on Literary Darwinism, literally. Literary Darwinism (2004) and, before it, Evolution and Literary Theory (1995) are considered the foundational texts in the field. No article is written about Literary Darwinism that doesn’t quote Carroll, and no anthology in the field is complete without his contribution (notably, The Literary Animal, 2005). He recently spent some time with Neuronarrative discussing what Literary Darwinism is all about, addressing a few of the main criticisms levied against it, and envisioning what the future may hold for evolutionary thinking and literary study.

 

You’ve been called a “founder of Literary Darwinism.” What is Literary Darwinism?

Literary Darwinists integrate literary concepts with a modern evolutionary understanding of the evolved and adapted characteristics of human nature. They aim not just at being one more “school” or movement in literary theory. They aim at fundamentally transforming the framework for all literary study. They think that all knowledge about human behavior, including the products of the human imagination, can and should be subsumed within the evolutionary perspective.

 

Over the years, Darwinians have been criticized for over-applying evolutionary explanations to social and cultural phenomena. What makes the effort you are undertaking now different from those of the past?

From my perspective, previous evolutionary studies in the human sciences have not erred in extending evolutionary explanations into social and cultural areas. If they have fallen short, it is only in not more fully integrating social and cultural levels of explanation with evolutionary levels. Virtually all evolutionists in the humanities and social sciences explicitly formulate “bio-cultural” ideas. That is, they recognize that humans are cultural animals. For the “cultural constructivists” who still dominate the humanities, “culture” operates autonomously, generating all thought and emotion, all sense of individual and collective identity, constrained by no biological dispositions any more general than, say, physical hunger. (And even hunger, like sexuality, one would be sure to be told, is “constructed.”)

Evolutionists identify a rich array of innate, genetically transmitted dispositions that strongly constrain sex roles, family relationships, social interactions, and the forms of cognition. They also recognize that none of these dispositions articulates itself in a cultural vacuum. Dispositions for transmitting information in non-genetic ways appear in various species–primates, cetaceans, and corvids, for example–but only humans produce symbolic structures that embody ethical norms, depict the world and human experience, and evoke subjective sensations. Only humans configure the elements of their experience in shared symbolic systems. All differences of social and material organization constitute divergent ways of organizing the universal dispositions of human nature, and each specific organization manifests itself in distinct artistic, religious, and philosophical traditions.

One could “over-apply” evolutionary explanations to any given culture only by falsely reducing the particular organization of that culture to some set of human universals. This would be a failure in degree of “resolution,” like not having enough pixels to get a decent picture on one’s computer screen. Such failures can and should be corrected. The formula is fairly simple: Every specific culture consists in a particular organization of genetically transmitted dispositions shared by all members of the human species. One task for biocultural critics is to describe in fine detail the particular organization of human universals that distinguishes any given culture. Another task is, so far as possible, to explain how that organization arose–to identify its source in specific ecological and social conditions and in the available materials of “imaginative” culture, that is, religious, philosophical, and artistic traditions. And yet another task, especially for humanists, is to interpret the aesthetic and affective character of any given cultural organization. Cultural critics need to know what the universal forms of imagination are, how they vary from culture to culture, how those variations constrain the range of imaginative possibility for any given author, and how each individual work produces some specific imaginative effect.

It is not possible to “over-apply” evolutionary explanations to social and cultural phenomena, though it is easy enough to apply them badly or incompletely. One can apply them badly by not combining a sufficient number of analytic elements from an evolutionary view of human nature and by not considering sufficiently the way the elements of human nature interact with environmental conditions, including cultural conditions.

Many cultural theorists under-apply evolutionary explanations. They neglect or explicitly repudiate the idea that genetically transmitted dispositions fundamentally constrain the organization of all cultures. They identify “culture,” in a circular way, as the sole cause of cultural phenomena, or they give lip-service to the “bio-cultural” idea while tacitly reducing the biological part of the interaction to a negligible “physical” aspect entirely distinct from thought, feeling, motive, and behavior. Such under-applications dominated mainstream cultural theory through the first three quarters of the twentieth century. They still dominate literary study. When critics of a bio-cultural approach charge evolutionists with taking a “reductive” view of human affairs, most of the time what they really mean is that they would like to continue taking a purely culturalist, “constructivist” approach to human affairs, leaving out the biological altogether or reducing it to a negligible minimum.

 

You’ve said in a recent article that “literacy is a very recent acquisition in human evolutionary history.” Tell us a bit about how we know this to be the case.

Evidence for written languages occurs first in state societies of the ancient Near East. All known hunter-gatherer cultures, on first contact with Europeans or Asians, have lacked written language, though of course they all have spoken language. Childhood development also provides evidence on this topic. All normally developing children spontaneously acquire spoken language. Literacy comes much later in childhood and usually has to be taught, a process extending well into adult life. If it is not actively taught, children often do not acquire the ability to read, even if the larger culture in which they live is widely literate.

 

You’ve also said that “culture–literature and the other arts–are functionally significant features of human evolution.” Some would argue that this position is too reductionistic–that the arts are too rich and complex to be categorized this way. How do you reply to this argument?

Identifying adaptive functions for the arts need not detract from the richness and complexity of the arts. My own view on the adaptive function of the arts is that they provide an imaginative universe in which we recognize the emotionally and conceptually significant features of our experience. The arts delight us with the satisfactions of understanding–not abstract, detached understanding, as in the sciences, but emotionally responsive, subjectively positioned understanding. They make us feel the weight and value of things. They give us the sense of things, absorbing the qualities of “felt life” but also composing, condensing, arranging them so as to bring out their essential features.

We quite literally live in such imaginative structures, with all their sensory and emotional qualities, and we also stand apart from them, looking at them. That dual perspective, inside an imaginative construct, seeing from within it, and outside it, looking in on it, is a peculiar quality of the “aesthetic.” Specifically aesthetic, imaginative forms of pleasure are as distinct as the pleasures we get from satisfying hunger, fulfilling sexual desire, or meeting with friends.

The old Horatian saw, “dulce et utile,” to give pleasure and instruction, points at the kind of fulfillment art provides. One can of course too easily translate the “utile,” instruction, into didacticism: “The moral of the story is.” That can easily be reductive and boring. But even that reduction has its element of truth. Who has ever been bored by Aesop? Learning lessons like those Aesop teaches is evidently part of our total subjective experience and is thus part of what art encompasses. But then, what does art not encompass? It takes in the whole world and all our experience in the world. It makes up imagined worlds. All that is adaptively functional. It helps us organize our experience and orient ourselves to the world of possibility.

The general function of the arts is to make imaginative sense of the world. As I see it, the challenge for an evolutionary understanding of the arts is to render this general proposition more analytically useful by linking specific works of art with more particular psychological and social functions. For instance, several of the theorists discussing the adaptive function of the arts have emphasized “social cohesion” as one of its functions. Now, there are many instances where social cohesion is evidently at work. Art is integral to social and religious rituals all over the world. Our rites of passage almost always involve music and pageantry (weddings, funerals, bar mitzvahs). But then there are also works of art, especially in the modern age, that seem designed to subvert and disrupt normal modes of thinking and traditional values. One has to look at specific cases and see exactly what is being accomplished, what sort of “psychological work” is being performed.

I’ll give just a couple of examples. For the past five years I’ve been working on a book project with three other researchers, a literary scholar (Jon Gottschall) and two psychologists (John Johnson and Dan Kruger). We put up an on-line questionnaire and listed about 2,000 characters from Victorian novels, asking respondents to score the characters on motives and personality; assign the characters to roles as protagonists, antagonists, or minor characters; and give a numerical rating to their own emotional responses to the characters. We found that antagonists are characterized almost exclusively by dominance behavior–seeking wealth, power, and prestige. They have no affiliative dispositions.

Protagonists, in contrast, are communitarian, keen to take care of kin, make friends, and work cooperatively with others. We argue that this pattern exactly parallels the social dynamic in hunter-gatherer cultures, as delineated by Christopher Boehm in Hierarchy in the Forest: The Evolution of Egalitarian Behavior. Individual people love dominance for themselves but hate it in others. Hunter-gatherers compromise by working collectively to suppress dominance in individuals. No one gets all the dominance he or she wants, but no one has to submit to dominance from other individuals, either. The agonistic structure of the Victorian novel, on average, strongly stigmatizes dominance behavior and valorizes communitarian dispositions. The novels evidently provide a medium through which readers participate in a collective cultural ethos that valorizes communitarian behavior. That communitarian, egalitarian ethos is part of the evolved structure of the human motivational system. The novels provide a medium through which that ethos can be activated on a large cultural scale. In that sense the novels fulfill an adaptive social function–at least one adaptive social function. They might well fulfill other social functions, and they might fulfill psychological functions that could not properly be called social.

Along with the website listing all the characters from many novels, we put up a website dedicated solely to a single novel, Hardy’s The Mayor of Casterbridge. We solicited respondents from among Hardy specialists. For that novel, we discovered patterns in the “agonistic structure” that are radically different from those in most other novels. The protagonist, Michael Henchard, has antagonistic features; the most typically protagonistic character, Elizabeth-Jane, is a minor character; and readers’ emotional responses register extraordinarily high levels of “indifference” to all the characters. This novel, then, doesn’t fall under the adaptive explanation of the other novels. It doesn’t perform the same kind of psychological work. What it does, we think, is adopt a certain stance toward the world, a stance of reflective, stoic detachment. This is a defensive stance, a coping strategy.

In the broadest sense, in my view, that’s what all novels are. They reflect a point of view, a specific way of organizing the world so that it conforms to the artist’s particular needs, the artist’s characteristic way of organizing his or her perceptions, thoughts, and feelings. When we read novels, we are participating vicariously in the novelist’s point of view–the novelist’s whole vision of the world. We learn that way, not just about what is being depicted, but about the novelist’s way of looking at things. That kind of knowledge is good to have in itself, as social information, but we might also use it in a more practical way, picking up possible strategies for coping with challenges in our own lives.


11 Comments

Filed under Interviews

Survival of the Kindest: An Interview with Dacher Keltner

I have an interview with Dacher Keltner, author of Born to Be Good, in Scientific American Mind Matters today.

Keltner is the director of the UC Berkeley Social Interaction Laboratory, leading research efforts focusing on the biological and evolutionary origins of human goodness, with a special concentration on compassion, awe, love, and beauty, as well as the study of power, status and social class, and the nature of moral intuitions. He’s also the founder of the Greater Good Science Center and co-editor of Greater Good Magazine.

Plus, he’s chummy with the Dalai Lama.

It was a pleasure interviewing him.

Link to interview

Leave a comment

Filed under About Neuroscience, Interviews

This is Not Your Grandfather’s Monkey (or pigeon)

The mind can always intend, and know when it intends, to think the Same. This sense of sameness is the keel and backbone of our thinking.

–William James

Science has just served humanity another helping of humble pie. Ed Wasserman, principal researcher at the University of Iowa, presented his findings on animal cognition at the American Association for the Advancement of Science annual meeting, and the news is that the gap between human and animal cognition is not as wide as we thought.

Wasserman and his team found that baboons and pigeons (pigeons!) can (1) judge whether two objects are the same or different, and (2) learn relations between relations.

Here’s an example of the first item: if you put two peanuts under one cup on a table, and a cashew and a walnut under another cup, you, as a human, can correctly identify that the objects under one cup are the same and those under the other cup are different. This may sound ridiculously simple, but it’s an essential rudiment of complex thinking (the “keel and backbone of our thinking,” as William James put it).

Turns out, baboons and pigeons can do this too – and the really important thing shown in this research is that they can actually learn to do it. 

Here’s an example of the second item: if two identical photos of a dog are shown to you, followed by two identical photos of another dog, you will be able to recognize that the first two are the same (A and A) and the other two are the same (B and B), and that you have just looked at two pairs of images that were the same (same equals same).  Then, if you are shown a pair of different dogs (A and B), followed by yet another pair of different dogs (C and D), you will recognize that you have viewed two sets of photos that are different (different equals different).  

But what happens if you view two identical dogs (A and A) and then two different dogs (C and D)?  We know that humans arrive at this conclusion: same does not equal different. And up until recently, it was thought that ONLY humans could arrive at this conclusion. Wasserman’s study finds this just isn’t so.  Baboons and pigeons can learn this ability too (the only difference being that the baboons pointed and the pigeons pecked).
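For readers who like to see the logic spelled out, here’s a minimal sketch in Python (purely illustrative – the item names are invented, and this is obviously not how the experiments were run) of the difference between a first-order same/different judgment and the higher-order “relations between relations” judgment the animals learned:

```python
def relation(pair):
    """First-order judgment: are the two items in this pair the same?"""
    a, b = pair
    return "same" if a == b else "different"

def relations_match(pair1, pair2):
    """Higher-order judgment: does the relation inside pair1
    match the relation inside pair2 (same equals same, etc.)?"""
    return relation(pair1) == relation(pair2)

# First-order: two peanuts are "same"; a cashew and a walnut are "different".
print(relation(("peanut", "peanut")))   # same
print(relation(("cashew", "walnut")))   # different

# Higher-order: (A, A) then (B, B) -> "same" relation both times, so they match.
print(relations_match(("dogA", "dogA"), ("dogB", "dogB")))   # True
# (A, A) then (C, D) -> "same" vs. "different", so the relations do not match.
print(relations_match(("dogA", "dogA"), ("dogC", "dogD")))   # False
```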

What this all means is that other animal species–very likely many other animal species–are capable of higher-order relational learning, and that’s a big deal. It’s one less major cognitive ability humans can claim as exclusively their own, and there will no doubt be more such findings to come. From Wasserman,

“The newsworthiness of our baboon experiment was to show that nonhuman primates are capable of higher-order relational learning. Understanding the relation between relations was previously believed to be a kind of cognition that sets humans apart from all other animals. The follow-up discovery — that pigeons too are capable of such higher-order relational learning — affirmed our suspicion that we’ve really established a finding of broad evolutionary significance.”

hat tip: Machines Like Us

1 Comment

Filed under About Research

Attention Under Siege: An Interview with Author Maggie Jackson

In his masterwork, Flow, psychologist Mihaly Csikszentmihalyi tells us that the two major components affecting our ability to control and direct our mental resources are time and attention.

On the first, time, most of our verdicts are the same: we don’t have enough of it. In the case of the second, however, the analysis is murkier. While we can all agree that there are a multitude of demands on our attention, it’s not exactly clear whether this is good, bad or neutral. Some would say, for instance, that the attention-dividing practice of multitasking is an essential skill for success, while others claim that multitasking is a widespread cultural myth: something we aren’t capable of no matter how hard we try.

Maggie Jackson has taken a position at the core of the controversy with her book, Distracted: The Erosion of Attention and the Coming Dark Age, in which she argues that our ability to focus attention faces colossal challenges; either we manage to meet them, or we risk falling into a cultural black hole. She recently spent some time with Neuronarrative discussing the science behind attention, whether we can train ourselves to be more focused, and what she believes we must do to avert an attention deficit “dark age.”

 

Tell us what your book Distracted is about and what led you to write it?

Distracted focuses on the steep costs of cultivating a split-focused, hyper-mobile and cyber-centric culture, and details how new scientific discoveries related to the nature and workings of attention can be a starting point for sparking a renaissance of attention. I argue that unless we are able to better manage our technologies and strengthen our powers of attention, we will usher in a dark age – a time of high-tech invention but cultural and social losses.

In a sense, I backed into the subject of attention as the key to creating a high-tech yet reflective and deeply connected society. I began seeking clues to how we could better navigate our own digital world by studying how inventions such as the cinema, train, car, camera and telegraph vastly changed people’s experiences of time and space long ago.

From this early research, I had two revelations. First, the roots of our virtual, split-focus, mobile world were seeded in these first high-tech revolutions. We have to look far beyond the Blackberry and iPod to understand our current culture. Second, the vast changes to human experience that have unfolded in the past two or three centuries essentially deal with attention — how and when we focus, our powers of awareness, our ability to plan and judge. Attention is the key to both understanding and shaping our world for the better.

 

We all have a subjective notion of terms like “attention,” “focus” and “distraction.”  Give us a sense of the scientific basis for discussing these terms more objectively.

Yes, we all know what it feels like to concentrate on a problem, or to walk into a garden and become deeply aware of the beauty of the flowers, their scent, the touch of a breeze on our skin, the call of birds around us. But now, scientists are beginning to understand how attention works, how it develops in children, and how deficiencies in its operations can affect us.

And if you ask a neuroscientist about attention, they immediately begin speaking in the plural. Attention is not a single entity. It’s now considered to be an organ system, like circulation or digestion, with its own anatomy, circuitry and cellular structure. There is debate and much more to learn about the workings of attention, but many, if not most, attention scientists consider that this human faculty is made up of three “networks” or types of attention: alerting, i.e. awareness, sensitivity to our surroundings; orienting, i.e. focus, or the “spotlight” of the mind; and executive attention, a package of higher-order skills related to planning and judgment. The networks often work in conjunction with one another, yet they are independent. At the moment, there is an explosion of research into the workings of attention, in part because of the brilliant pioneering work of Michael Posner.

And intriguingly, a growing body of research is showing that these attentional powers can be trained. The great philosopher William James thought that attention could not be highly trained by “drill or discipline,” but he was wrong. While we do not yet know how long-lasting the gains are, neuroscientists including Amishi Jha are discovering that computer-based exercises, meditation and even behavioral therapies can improve focus, awareness and executive attention. These findings could revolutionize parenting, education and workplace training.

 

We’re plainly awash in digital technologies, with new ones being unveiled all the time – each vying for scarce pieces of our attention.  Is it possible that the human brain is adapting to manage this onslaught? 

Yes, we are awash in digital technologies that prey on our attention – from ads on screens in public places to the beeping, pinging communications gadgetry that is crucial for today’s work. Let’s look at this as an environmental issue first.

In one sense, we are not adapting well to this new environment, nor is this new environment conducive to the kind of in-depth thought and innovation that we need badly in the “information age.” Attention is our ticket to the world, our key to staying in tune with our environment. We are born to react to the new, the different or dangerous in our surroundings. That’s why we’re interrupt-driven. But there is a tension within attention, for this crucial system also gives us the ability to plan for the long-term, pursue our goals, and understand intangibles like the passage of time. When we’re constantly jumping to answer every beep or ping, we’re off-balance, overly depending on certain attentional skills, but overlooking our human need to plan or to tackle big-picture, messy, complex problems. Today, this is one reason why so many people feel frustrated that they do little more than “put out fires” and try to keep up with email all day at work.

Second, we are superficially adapting to managing the daily onslaught, yet in reality we’re undercutting our deeper abilities to think and relate deeply, and innovate. We seem to be so productive, speedily clicking through emails and ticking items off our never-ending to-do lists. By rampant multitasking and by fragmenting work into smaller and layered chunks, we can busily and efficiently seem to keep up with the tsunamis of communications data and information pummeling us. But consider that a third of workers say they’re often too busy and interrupted to process or reflect on the work they do, according to the Families and Work Institute.

As well, the average worker now switches tasks every three minutes throughout the day, and yet high levels of interruptions are related to stress, frustration, even lowered creativity, studies show. Most multitasking is actually task-switching, which is often linked to higher rates of errors and more shallow learning. Are we adapting to the demands of a so-called knowledge economy, or are we too often just more frenetically busy than ever before?

I worry that if we don’t change our path, we may collectively nurture new forms of ignorance, born not from a dearth of information as in the past, but from an inability or an unwillingness to do the difficult work of forging knowledge from the data flooding our world. In-depth analysis, reflective judgment and critical thinking begin with discomfort, a willingness to doubt, and discipline.

Finally, I do agree in part with the philosopher Andy Clark, who believes that technologies are truly a part of the power of the human mind. In other words, “thinking” doesn’t stop with what’s inside our skull. And I also believe that we can and will create technologies that become more sensitive to our need for focused, reflective thought and uninterrupted, unmediated human connection. (For more on this, check out my article in BusinessWeek on such research here.)

But I firmly believe that we can’t adapt to a complex, overloaded, digital world if we become overly dependent on our machinery to manage the info-onslaught for us. And we certainly are mistaken if we believe that a steady diet of multitasking and split-focus will give us the cognitive booster rockets needed to progress in this new age.

 

Many, including you, have pointed out that every generation throughout history has faced challenges to its status quo modes of thinking by new technology.  What’s different about what’s happening now? Is there something inherent in digital technology that makes this challenge more disruptive?

Technologies from writing to steam engines certainly shake up the status quo. At the dawn of writing, Plato rightly predicted that this new technology would lead to the demise of memory as the great repository of oral culture. There is much public discomfort when new technologies begin to challenge old habits.

But just because such discomforts naturally and periodically arise, does this mean that we should quickly dismiss them, or denigrate thoughtful questioning of a technology’s purpose and impact? As the historian Carolyn Marvin points out in When Old Technologies Were New, technologists throughout history tend to dominate early public discussions of their inventions. They, of course, are the first to understand the mechanics of these often-mysterious and powerful new tools. Yet they often try to control critical discussions related to the social impact of technologies in part by labeling critics as “luddites” and “naysayers.” Today, I believe this is still true. I believe we urgently need to have more measured, tolerant, thoughtful discussions of technology.

In weighing the impact of technology on our lives, it’s important to ask, what are the challenges that face us today? Were humans in the past any more able to focus, to think critically? That may not matter. What matters is whether or not we have the frameworks and architectures for going deeply into a text or problem. What matters is whether or not we are satisfied with increasingly faceless, virtual, quick and brief means of relating – even to our own kin. What matters is, what future do we want?

 

What in your opinion can we begin doing now to avert the “dark age” you suggest is coming? 

To avert a dark age, we must take several steps:

Question the values that undermine attention – Helped by influential tools that are seedbeds of societal change, we’ve built a culture over generations that prizes frenetic movement, fragmented work and instant answers. Recently, my morning paper carried a front-page story about efforts “in a new age of impatience” to create a quick-boot computer. Explained one tech executive, “It’s ridiculous to ask people to wait a couple of minutes” to start up their computer. The first hand up in the classroom, the hyper-businessman who can’t sit still, much less listen – these are markers of success in American society. Instead of venerating scattershot focus, rushed detachment, knowledge built on sound bites, we need to value whole focus, full awareness and the difficult work of knowledge creation.

Rewrite the climate of distraction – We can set the stage for focus by judiciously protecting against interruptions; by dialing down the noisy, cluttered sensory environment that we’ve come to accept as a norm; and by disciplining ourselves to sharpen our powers of attention. To help, some companies are experimenting with “white space” – the creation of physical spaces or times on the calendar for uninterrupted, unwired thinking and connection. IBM’s global practice of “ThinkFridays” began three years ago when software engineers decided to limit email, conference calls and meetings one day a week in order to focus on their creative, patent work. Now, different teams and departments interpret “ThinkFridays” in varied ways. This pioneering initiative is fluid, flexible and workable – more so than the rigid, top-down policies that ban email one day a week.

Role model attention – If there’s just one action we can take to spark a “renaissance of attention,” it should be to give the gift of our attention to others. Parents and leaders, in particular, need to role model attention. As contemplative scholar Alan Wallace says, “When we give another person our attention, we don’t get it back. We’re giving our attention to what seems worthy of our life from moment to moment. Attention, the cultivation of attention, is absolutely core.”

 

What’s the next project on your radar screen? 

I’m contemplating writing a book on the fascinating scientific and other work going on now worldwide to understand and strengthen our powers of attention. There are a whole host of exciting stories to be told involving a new field that I’m calling the “applied science of attention.”

 

Link to Maggie Jackson’s website

4 Comments

Filed under Interviews

What Makes You Happier, Stuff or Experience?

According to a study conducted at San Francisco State University, the things you own can’t make you as happy as the things you do.

One reason is adaptation: we adapt to all things material in our lives in a matter of weeks, no matter how infatuated we were with the much-coveted possession the day we got it.  Another is that experience, unlike possession, generally involves other people, and fosters or strengthens relationships that are more edifying over time than owning something.

Here’s a snippet from CNN’s coverage of the study:

The study looked at 154 people enrolled at San Francisco State University, with an average age of about 25. Participants answered questions about a recent purchase — either material or experiential — they personally made in the last three months with the intention of making themselves happy.  While most people were generally happy with the purchase regardless of what it was, those who wrote about experiences tended to show a higher satisfaction at the time and after the experience had passed.

The most striking difference was in how participants said others around them reacted to either the purchased object or experience. Experiences led to more happiness in others than purchases did. A sense of relatedness to others — getting closer to friends and family — may be one of the reasons why experiences generate more happiness.

This study backs up an earlier one (To Do or to Have: That is the Question – available in full as PDF), which also found that not only are people happier experiencing than possessing, but they are also happier having the experience of thinking about possessing something than actually getting it.  

Understanding that bit of knowledge about ourselves goes a long way toward explaining how advertising works. It’s largely about making the anticipation of buying something a fantastic experience. You can see yourself driving the car and it feels great. It’s a tremendous experience simply envisioning how wonderful it will be to own it. Then, when you do own it, that experience fades and you begin the process of adapting to the new material possession the way you do with every other possession in your life. This happens in six to eight weeks–three months tops–the research tells us.

The twist on all of this is that enabling someone else to own something (commonly known as giving) is also an experience, and typically a good one — in fact, it may very well be more happiness-inspiring over time than the thing you’re giving will be to the recipient. So the old adage, ’tis better to give than to receive, may be not only good folk wisdom but a sound scientific observation as well.

3 Comments

Filed under About Research

The Monkeys of Wall Street

Scientific American is running a 60-Second Science audio clip of primatologist Frans de Waal’s address at the annual meeting of the American Association for the Advancement of Science in Chicago. He discusses a frequently cited experiment he co-conducted that tested whether monkeys are able to distinguish between the relative value of rewards: a cucumber or a grape. Grapes win every time, and the monkeys left with cucumber aren’t too happy about it.

If this sounds familiar, it might be because, as de Waal told the audience, most of us these days are monkeys left munching on bits of cucumber while we’ve been watching the Wall Street monkeys munch on grapes.

Last month, Dr. de Waal did an interview here with me in which he mentioned this study and another humorous connection to our current economic situation:

You were one of the principal researchers involved in the now famous “grape / cucumber” study.  Briefly, how was this study conducted and what did you discover? 

With Sarah Brosnan (at the Yerkes National Primate Research Center), we gave monkeys different rewards for the same task. If they get the same reward everything is fine and dandy, and no one complains, even if they just get cucumber. But if their companion gets grapes for the same task, all hell breaks loose and they refuse to perform the task, and often refuse the food itself.

I was reminded of this during the recent outcry in the media about the CEOs of the car industry who had flown to Washington in private jets. We humans are very sensitive to inequity, and now that we are going through some rough times, these feelings surface very easily. The CEOs were munching on grapes, whereas all of us have to content ourselves with cucumber.

Leave a comment

Filed under About Neuroscience, About Research

Bridging the Empathy Gap: An Interview with J. D. Trout

Renowned author and cognitive scientist Steven Pinker has said of J. D. Trout’s latest book, The Empathy Gap, that it is “important and engaging,” and on both counts I agree. But I would also add one more word: sensible.

The topics Trout addresses–bias, free will, decision making, empathy–are not prepackaged, self-explanatory bits of knowledge, and understanding them in light of larger social and policy issues is an even harder undertaking.  But Trout, with a sensible approach that never wallows in theory too long, nor jumps to practice too quickly, manages to imbue these topics with rare transparency. 

Reading this book is like having a discussion with an accomplished philosopher who is tired of philosophy being viewed as a static, insulated exercise–he sees the applications to our personal and social lives, and he wants you to see them as well. Most importantly, he sees a gap between where we stand as individuals and what the world needs from us all, and he’s writing not just to explain it, but to help us bridge it.

J. D. Trout made time to talk with Neuronarrative about his new book, the psychological biases that afflict us, and how to rebuild the human mind, among other topics.

Tell us what your new book The Empathy Gap is about.

The Empathy Gap describes the ever-expanding emotional distance between people in our 21st century megademocracy, and how we can bridge this gap efficiently and empathically by building science-based policies. When we try to make these policy changes, however, we get ambushed by unconscious cognitive biases, biases of discounting, anchoring, availability, and overconfidence, to name a few. We can’t overcome these biases by acts of the will; they are mostly design features of humans. So we also need strategies to counteract the mental influences of our Pleistocene past, like intuition, gut reaction and folk belief.

The central problem of The Empathy Gap arises from our country’s greatest strength. The U.S. is home to people of many ethnicities and religions, personal styles, skin tones, and job descriptions. We are a country of different neighborhoods, with different public schools of widely varying resources. There are enclaves of great wealth and pockets of utter destitution. Combined with the seductive American myth that free will gives us the power to overcome all circumstances, these differences make it easier for us to feel that people get what they deserve, and harder to feel our common vulnerability.

In view of these empathic frailties, decent policy-making must place social initiatives beyond the reach of our wavering personal conviction. After all, the important point is that people in need get help, not that we feel good helping them. The Empathy Gap supplies a rich sampling of these policies, from estate tax and parole to health care and traffic laws. And it also advocates for a new organ of government to vet social policy proposals, and to monitor existing policies – a House Committee on Social Science, to balance the physical and natural science responsibilities of the existing House Committee on Science & Technology.

There’s been growing interest in whether we’re genetically hard-wired to be one way or another – inclined toward altruism or selfishness, for example. Some argue that we’re endowed with ‘moral minds’ and others argue quite the opposite. What’s your position on this?  Do we start life with functioning rudiments of empathy, or does it have to be learned the hard way?

The evidence leaves little doubt that we are endowed with powerful capacities of natural sympathy – the ability to empathize with others. Consider the way our sociality is rooted in the neurophysiology of imitation. In the human brain’s anterior cingulate, just behind the frontal lobes, are “pain neurons” that fire when, for example, we are poked with a needle. But these cells have an additional and curious feature: they also fire when we watch someone else getting poked. This happens when people see pictures of other people in uncomfortable positions, and even when we are asked to imagine it. Whether we imagine ourselves or others in painful positions, the same well-travelled neural networks get activated, which include the anterior cingulate cortex, the parietal operculum, and the anterior insula.

This subtle suite of emotional reactions is hard to explain if we had no interest in the suffering of others. While there is now powerful evidence that we use the same circuitry to process our own pain and that of others, these “empathy neurons,” or “Dalai Lama neurons,” as neuroscientist V. S. Ramachandran calls them, don’t dissolve the barrier between self and others. (After all, we don’t feel the same kind of discomfort we observe.) Instead, they bridge it.

But that is only half of the story; genuine empathy is oriented toward change. When we see the suffering of others, we don’t just feel it; we have the impulse to correct it. Whether we succeed has a lot to do with how much machinery there is around you to support your empathic actions. So, while humans have an empathic capacity, it is not always exercised in equal measure by all people all of the time. This aspect of empathy is learned. In fact, I think to an impartial observer it would look like the particular brand of capitalist democracy developed in the U.S. had to be crafted to suppress most of those natural sentiments, even though they are abundantly evident in other countries.

Look at the theoretical models that attempt to justify it. Some political philosophers appeal to life in the state of nature even though there is no evidence of how we would behave as social isolates; we have always lived in communities. Most economists adopt the idealization that humans are optimally rational. They concede that real people fall short but insist that all sciences use idealizations. Behavioral economics has established that in many areas of consumption there is really nothing to the economists’ assumptions. Their idealizations are too wild to be useful.

In fact, our cognitive frailties are deep, habitual, predictable, increasingly well-understood, and at least arguably, crucial to good economic theorizing about rationality. There is perhaps no better evidence that the American culture of rugged individualism suppresses the empathic impulse than this: when we ask Americans what explains wealth or poverty, the majority says character, but when we ask citizens of the capitalist democracies of Europe, the majority say luck. These are the countries – Denmark, the Netherlands, Sweden – whose policies protect the most vulnerable. (Notice, too, that they are culturally much more homogenous than the U.S., perhaps making it easier to empathize.)

 

I know that you’ve done a lot of work to understand our psychological biases, like the ‘anchoring bias’ that leads us to latch onto false information even if we see proof that it’s false (hence the success of negative campaigning). Tell us about some of the biases you’ve studied and how they affect our thinking.

What’s interesting about anchoring is that whether the information is false is neither here nor there; instead we anchor on information that should be irrelevant to our estimates. Researchers have found the irrational influence of anchoring in such places as real estate pricing and sentencing recommendations. And anchoring has the same influence on experts as it does on novices.

I am most interested in the overconfidence and hindsight biases, because the processes that generate them are at the center of scientific theorizing and yet responsible for crippling errors in the history of science. Roughly put, the overconfidence bias is the tendency to overestimate the probability that you are correct, and the hindsight bias is the tendency to suppose about some fact that you “knew it all along” or that you could have predicted some event.

In the philosophy of science, I have also developed a line of research that traces the psychology of explanation to its cognitive, even physiological, roots. I focus on the sense of understanding that dominates scientific theorizing, and that so many scientists and laypeople require of good explanations. When people decide on whether or not to accept an explanation, they often do so by subjectively assessing the sense of understanding it conveys to us, the feeling of coherence it carries. And people often report anxiety in its absence.

William James described “that peculiar feeling of unease” that gets discharged when you offer a satisfying explanation. The problem is, our sense of understanding isn’t a very reliable cue to genuine understanding, or to good explanation for that matter; it doesn’t track the truth. In fact, just as Descartes claimed that there were no “certain signs” distinguishing waking from dreaming life, there are no certain signs distinguishing reliable and unreliable senses of understanding. Ptolemy, Copernicus, and Haldane are just three scientists who expressed great confidence in their belief that they had a sense of understanding that had high fidelity. And overconfidence can cause you to be prematurely dismissive of alternative hypotheses.

The reliance on subjective, “intuitive” appraisals of the evidence was a centerpiece of traditional epistemology, and may still have a legitimate place in our theories, at least for some isolated corners of our intellectual lives. But the theory of knowledge has really benefited from the expansion of research on cognitive biases. A few years back, Michael Bishop and I published a philosophy book (Epistemology and the Psychology of Human Judgment) on the nature of knowledge, in which we asked how philosophers who knew so little of scientific psychology could presume to make recommendations about how humans ought to reason (psychologists seemed to like this book a lot).

True, epistemologists could respond that we should try to generate true rather than false beliefs, but who would disagree with that? You may as well recommend that we buy low and sell high. The real challenge is to give good advice that is action-guiding. Against a philosophical tradition that lionized our intuitions – some of them the products of discredited heuristics – we argued for a theory of epistemic excellence: Epistemic excellence consists in the efficient allocation of cognitive resources to robustly reliable reasoning strategies, all applied to significant problems.  We had more fun writing the book than authors ever should. But in this case it was fitting, because we wanted to write a book that was fun to read.

We proposed that epistemologists take a lesson from the folks who brought us predictive linear models – handy little formulas or forecasting customs that can outperform clinical diagnosticians, college admissions committees, and parole boards, on predictive tasks right in their expert wheelhouses. These models are more accurate (and cheaper) than our existing decision-making methods, but their accuracy doesn’t depend on subjective appraisals of “fit with the evidence”, compatibility with our intuitions, or elaborate theorizing about causal mechanisms. Instead, you just plug in the numbers and go. Sometimes it doesn’t feel right to apply the rule, but that is precisely the point: our intuitions are often unreliable. Sometimes doing right doesn’t feel good.
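To make “plug in the numbers and go” concrete, here is a minimal sketch of a unit-weighted linear model of the kind Robyn Dawes famously championed. Everything in it – the applicants, the predictors, the scores – is invented for illustration; it is not the formula from any particular study.

```python
# Sketch of a unit-weighted ("improper") linear model: standardize each
# predictor, add them up with equal weights, and rank by the total.
# The applicants and predictor values below are hypothetical.

def standardize(values):
    """Convert raw scores to z-scores so predictors share a common scale."""
    mean = sum(values) / len(values)
    sd = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
    return [(v - mean) / sd if sd else 0.0 for v in values]

def unit_weighted_scores(candidates, predictors):
    """Sum each candidate's standardized predictors with equal (unit) weights."""
    columns = {p: standardize([c[p] for c in candidates]) for p in predictors}
    return [sum(columns[p][i] for p in predictors) for i in range(len(candidates))]

applicants = [
    {"name": "A", "test_score": 680, "gpa": 3.9},
    {"name": "B", "test_score": 720, "gpa": 3.2},
    {"name": "C", "test_score": 610, "gpa": 3.6},
]
scores = unit_weighted_scores(applicants, ["test_score", "gpa"])
for person, s in sorted(zip(applicants, scores), key=lambda pair: -pair[1]):
    print(person["name"], round(s, 2))
```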

Some philosophers liked the book, especially a younger breed of philosophers who also had been wondering whether these heavily-worked intuitions were really just artifacts of one’s culture, class, and personality, and not the harbingers of truth. A new movement called Experimental Philosophy was afoot, and asked some of the same questions about the status of supposedly eternal or immutable intuitions in philosophy. Among other activities, experimental philosophers looked at people from different cultures and social classes to see if they shared the professional, English-speaking philosophers’ view about the nature of knowledge. In a word, they don’t. English-speaking epistemologists, too, form a demographic. So we are left wondering just how general the lessons of traditional epistemology really are.

 

Since we’re faced with these biases on a daily basis (many times a day for most of us, and the outcomes aren’t always exactly positive), what’s a biased human mind to do?  Can we overcome ourselves? 

That’s exactly how I would put the puzzle. We can overcome our biases, but not in the way you might think. Self-control doesn’t work. Only policies that regulate our behavior from a distance and control our options will do the trick. In The Empathy Gap, I call these Outside Strategies. One psychologist who studies habitual behavior estimates that half of our actions are automatic. Great examples of systematic error come from the field of psychology in which I have worked – speech perception.

Here is how one Outside Strategy goes. Our language processing is so automatic that our failures are very predictable. We make many more mistakes – mostly confusions – on words that have lots of similar sounding neighbors, and this little piece of psychological knowledge could save thousands of people from disturbing (and sometimes fatal) prescription errors at the pharmacy. Apparently, pharmaceutical companies prefer drug names that begin with the Z sound, which is perceived by the public as pleasingly “science-y”. So that name-space has become very crowded. And naming four drugs Zocor, Zofran, Zoloft, and Zomig is like building a road that dumps out into oncoming traffic. No sooner do you complete the first syllable than all nodes with the same initial sound get activated. With that amount of competition, you are much more likely to complete the sequence incorrectly. When your pharmacist makes that confusion, you may not shake that migraine, but you’ll reduce your cholesterol. I will leave it to you to check the effects of confusing Ziac and Ziagen.

But there is a solution, and it is an outside strategy. Rather than training pharmacists more, or making them accountable, or reminding them to concentrate, we can simply space the drug names out more – make them more remote linguistic neighbors – and there are now drug naming councils that do just that. Many of the best policy strategies improve behavior not by teaching people, but by improving their options.
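To get a feel for what “more remote linguistic neighbors” might mean in practice, here is a rough sketch that scores the similarity of those drug names with Levenshtein edit distance. Real naming councils use more sophisticated phonological and orthographic similarity measures; the distance threshold here is arbitrary and purely illustrative.

```python
# Crude stand-in for a name-confusability check: pairwise edit distance
# among a few real drug names. A low distance marks "close neighbors."

def edit_distance(a, b):
    """Classic dynamic-programming Levenshtein distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

names = ["Zocor", "Zofran", "Zoloft", "Zomig"]
for i, a in enumerate(names):
    for b in names[i + 1:]:
        d = edit_distance(a.lower(), b.lower())
        note = "  <- close neighbors" if d <= 3 else ""
        print(f"{a} vs. {b}: distance {d}{note}")
```

An outside strategy built on a check like this would simply reject candidate names that land too close to existing ones, rather than asking pharmacists to be more careful.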

 

If you could redesign the human mind from the ground up, what’s one major improvement you’d be certain to include?

Whenever I imagine redesigning the mind, I am humbled by the problems facing civil engineers when they tried to gain farm and residential land by straightening streams. When you channelize a river, you can also produce upstream flooding and downstream erosion. It is hard to predict the effects of changes in a complex system like the mind.

But as long as you are allowing me a scientific fantasy… I would make our sensory modalities, and our perception of space and time, far more flexible. If we could change the scale of perception at will – as though we were turning a dial – then we could actually observe social movements even though they are spread out in space and time, casually inspect the slow evolution of learning, and peer into the fast transitions of subatomic magnitudes – without a lot of cumbersome theory. As a result, it would be much easier to isolate the causes of, and so to understand, labor struggles, speciation, demographic shifts, and so much more.

 

If you could make one significant contribution to the human mind as it stands, what would it be?

I would try to get people to thrill at numbers as much as they do at stories. Stephen Jay Gould is widely credited with the statement that humans are the primates who tell stories. But stories are flabby tools for communication, allowing us to weave together contradictory evidence and serving as a repository for all sorts of psychological distortions. Our drive to understand prompts the search for a coherent narrative about our world, and the need for coherence leads to great story-telling abilities. But these stories are more about comfort than truth.

Numbers don’t excite people nearly as much as narratives. But numbers represent dimensions of humanity that could never be accurately comprehended without them. We know what it means to have a hungry child at our dinner table. But how can a story convey the suffering of 400,000 children in the Sudan, the anxiety of fifty million people in the U.S. without health insurance, or the joy of tens of thousands of children in successful kindergarten classes? Numbers promise to make these complex problems cognitively tractable.  But in order to grab and keep an audience, we need methods that tell a story with those numbers. We need to look at a graph and see the heartbeats behind the data points. Tufte’s work on the visual display of quantitative information is brilliant in this respect, but we need to do a lot more work on how to represent quantitative information with dimensions that make its human significance easily accessible.

 

Last question: who is your favorite influential thinker from any point in history and why?

It has to be David Hume. My official philosophical attachments pit me against his empiricist doctrines, but I have always admired the way he combines sophisticated argumentation and expansive humanity. His optimism about the power of thoughtful action is authentic and contagious. Nearly three centuries ago, in the Introduction to his Treatise, he envisioned the power of social experiments: “Where experiments of this kind are judiciously collected and compared, we may hope to establish on them a science which will not be inferior in certainty, and will be much superior in utility to any other of human comprehension.” I believe modern social science is proving Hume correct.

And he seems a gem of a person, with a real sense of proportion about what matters to a good life. In the Conclusion of his Enquiry Concerning the Principles of Morals, Hume says “How little is requisite to supply the necessities of nature? And in a view to pleasure, what comparison between the unbought satisfaction of conversation, society, study, even health and the common beauties of nature, but above all the peaceful reflection on one’s own conduct: What comparison, I say, between these, and the feverish, empty amusements of luxury and expense? These natural pleasures, indeed, are really without price; both because they are below all price in their attainment, and above it in their enjoyment.”

Yeah, you have to love Hume. The same combination of logical rigor and profound humanity is no doubt at the bottom of my admiration for a number of contemporary scholars – like economists Amartya Sen and Partha Dasgupta, and psychologists Robyn Dawes, Baruch Fischhoff, and Paul Slovic, to name just a few – who apply quantitative methods in the service of human well-being. Taken together, it is enough to make you hold out hope for a new Enlightenment.

2 Comments

Filed under Interviews