Category Archives: Interviews

Everything We Knew About Human Vision is Wrong: Author Mark Changizi Tells Us Why

For theoretical neurobiologist and author Mark Changizi, “why” has always been more interesting than “how.” While many scientists focus on the mechanics of how we do what we do, his research aims to grasp the ultimate foundations underlying why we think, feel and see as we do.  Guided by this philosophy, he has made important discoveries on why we see in color, why we see illusions, why we have forward-facing eyes, why letters are shaped as they are, why the brain is organized as it is, why animals have as many limbs and fingers as they do, and why the dictionary is organized as it is.

His latest book, The Vision Revolution, is a trenchant and insightful investigation into why humans see and interact with the world as we do. His findings are challenging and often surprising, and his witty, engaging style is accessible to a broad range of readers. He was generous enough to spend a few minutes with me recently to discuss his book and other topics.

NN: What originally led you to write a book about human vision in particular, instead of any of the other human evolutionary adaptive traits?

MC: Indeed, I don’t consider myself solely a vision scientist. I call myself a theoretical neurobiologist, more generally, and I have had a number of non-vision research directions, including, for example, the shape and evolution of the brain, and why animals have as many limbs and digits as they do.  Some of these research directions were central parts of my first book, The Brain from 25,000 Feet.

I was led to a book on vision because that’s where my research led me, and so the question is, Why did I end up with quite a few research directions in vision?

As a theoretical neurobiologist, I try to find interesting phenomena that I can wrap my head around, with the hope of putting forth and testing rigorous and general explanatory hypotheses. That’s not easy, but there are a number of reasons why it’s easier for vision.

First, relative to other senses and/or behaviors, the amount of data we possess for vision is huge. There’s a century-sized pile of data, much of it not well explained, much less in a unifying manner.

Second, vision is theoretically approachable. You have a visual field, you see objects, and so on. We know how to at least begin thinking about the phenomenology. It’s more difficult for audition, and practically impossible for olfaction, where we have little idea how to even describe our perceptions. …forget about explaining anything!

And, third, for vision we have the best understanding of the underlying mechanisms.

My point is that, as a theorist struggles for phenomena he or she can crack, vision appears as a large attractive target compared with many other aspects of brain and behavior. One may end up attacking vision problems even if one isn’t excited by vision, merely because it’s juicy. (I am excited by it, though, especially to the extent that I can find exciting hypotheses.)

I was intrigued by the “mind reading” aspects of vision.  In a nutshell, how does this work, and how do humans benefit from this ability?

Our color vision fundamentally relies upon the cones in our retina, and I argue in my research that color vision evolved in us primates for the purpose of sensing the emotions and states of those around us. We primates have an unusual kind of color vision – our cones sample the visible spectrum in a peculiar fashion – and I have shown that one needs that kind of peculiar color sense in order to pick up the color modulations that occur on our skin when we blush, blanch, redden with anger, and so on. Our funny primate variety of color vision turns out to be optimized for seeing the physiological modulations in the blood in the skin that underlie our primate color signals.

So, we evolved special mechanisms designed for sensing the emotions and states of others around us. That sounds a lot like the evolution of a “mind-reading” mechanism, which is why I (only half in jest) describe it that way.

You mention in the book that reading and writing are relatively recent advances in human development, and yet we take for granted that we “see” and understand words, as if our brains were simply meant to see and understand them.  What’s really going on that allows us to make sense of symbols on a page—and why can we do this at all?

In talks I often show a drawing of a child reading a book titled “How to Somersault.” The “joke” is that most kids are able to read very early, often even before they can do stereotypical ape behaviors like somersaults and monkey bars. Sure, they comprehend speech much earlier, but they’re getting orders of magnitude more speech thrown at them than writing. Kids learn to read very early, and very well; and as adults we are ridiculously capable readers, and spend nearly all our day reading.

Aliens might be excused for thinking we evolved to read.

But the invention of writing is only thousands of years old. In addition, for most of us, our grandparents, great-grandparents or great-great-grandparents didn’t read at all. Writing is much too recent for our brains to have evolved reading mechanisms.

How does our brain do it?

Is it because our visual system can become good at reading whatever we present to it? No. Kids would surely not be capable readers by around six if they were tasked to read bar codes or fractal patterns.

The solution is that culture made writing easy on the eye, by shaping letters to be what the eye likes. The idea that culture shapes our artifacts to be good for us is not new. What’s new here is a specific hypothesis for what writing should look like in order to be good for us.

To be easy on the eye, writing needs to “look like nature,” just what our illiterate visual systems are fantastically competent at processing. The trick of that research direction was making this “writing looks like nature” idea rigorous, and coming up with ways of testing it. I show that there are certain signature visual patterns found in nearly any natural environment with opaque objects strewn about, and that these signature patterns are found in human writing. In short, writing has evolved so that written words look like visual objects.



Filed under Books and Ideas, Interviews

Changing Socrates’ Diapers: An Interview with Author Alison Gopnik

At different points in history, baby brains have been described as blank slates, balls of clay, and information sponges—and the debate about which is closer to the mark has smoldered for centuries. Today, the debate is more refined, though no less dynamic, and percolates amidst a commercial sea of products claiming to catalyze genius in junior’s noggin. Finding grains of truth in this tsunami of misdirection might be one of the most exhausting things already exhausted new parents try to do. Baby is, after all, worth the effort.

Thankfully, incisive minds are on the case, and Alison Gopnik is a pioneer in the pack. Her earlier book, The Scientist in the Crib, was a refreshing change from the deluge of speculative “how to” baby books for parents, aiming instead to tell us what science has actually uncovered about the minds of children. Her latest book, The Philosophical Baby: What Children’s Minds Tell Us About Truth, Love and the Meaning of Life, takes the discussion to a higher level, challenging long-held assumptions about the quasi-consciousness of the infant brain, and showing that a better understanding of how babies think provides richer knowledge of the goings-on in our heads. She recently spent some time discussing her new book with Neuronarrative.

NN: When I saw the title of your new book, I instantly thought of a baby playing with toys in a similar way that a philosopher plays with ideas. Is that the sort of image you had in mind when you started writing?

Gopnik: Well, partly that, but mostly the thought was that the philosopher should be paying more attention to the babies. Childhood is a significant and profound part of life for all of us, but it hardly appeared in 2,500 years of philosophy. Fortunately, the scientific discoveries of the past thirty years have started to change that.

You discuss the different sorts of intelligence that babies and adults possess. Briefly, what characterizes each and how do they differ?

The idea is that babies explore, and adults exploit. I argue that the very purpose of childhood is to give us a long protected period in which we can explore the world without having to act on it.  Babies are designed to learn as much as they can about the world. Adults are designed to take what they’ve learned and act on it swiftly and efficiently.

A lot of recent research has been devoted to plasticity in the adult brain, suggesting that the brain can change in significant ways even in life’s later stages. What distinguishes that sort of plasticity from what we see in the young brain?

The main difference is that adult plasticity seems to be much more “top-down.” It’s the result of intentional processes of attention and training. It also seems to be balanced by inhibitory processes that actually reduce plasticity in other parts of the brain. Young brains seem much more generally plastic and changes in those brains are driven more by bottom-up experience. In fact, you might say that as adults when we pay attention to something we are really intentionally regressing a small part of our brain to its infant state, while we hold the rest constant.

You said in a recent New York Times op-ed piece that “our mature brain seems to be programmed by our childhood experiences—we plan based on what we’ve learned as children.” Some will see in that statement a hint of biological determinism. Can you elaborate on what you mean by “programmed” in this sense?

The thought is anti-deterministic, I think, in that it says that what we learn as children shapes what we can do as adults – it’s not all genetic. But, of course, we continue to be able to learn as adults; we just don’t do it as generally or easily or spontaneously. In fact, much of my recent work has been informed by Bayesian statistical ideas. And from that perspective it makes a lot of sense to be less willing to give up ideas when they’ve been very strongly confirmed by lots of prior experience. But I think science shows that everything is up for revision, in principle, even as adults.
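(A quick aside, not part of the interview: here is a minimal Bayesian sketch of the point Gopnik is making. The Beta-Binomial setup and every number in it are purely illustrative assumptions, but they show why a belief backed by a large pile of childhood evidence barely moves after a handful of adult disconfirmations, while a lightly held belief shifts a lot.)

    # Beta-Binomial update -- illustrative numbers only, not data from Gopnik's work.
    def posterior_mean(prior_confirms, prior_disconfirms, new_confirms, new_disconfirms):
        """Posterior mean of 'how likely this belief is to hold' after new evidence."""
        a = prior_confirms + new_confirms
        b = prior_disconfirms + new_disconfirms
        return a / (a + b)

    # A belief heavily confirmed in childhood (90 confirmations vs. 10 misses)
    # barely budges after 8 adult disconfirmations...
    strong = posterior_mean(90, 10, 2, 8)
    # ...while the same new evidence moves a lightly held belief (9 vs. 1) much further.
    weak = posterior_mean(9, 1, 2, 8)

    print(f"heavily confirmed belief: 0.90 -> {strong:.2f}")  # about 0.84
    print(f"lightly held belief:      0.90 -> {weak:.2f}")    # 0.55

Both beliefs start at the same 0.90 credence; only the weight of prior experience differs, which is the asymmetry between child and adult learning that Gopnik gestures at.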

What’s your impression of the vast “make your baby smarter” industry that’s sprung up in the last couple of decades? Can we make our babies smarter, or are we just making the creators of these products richer?

I understand where it comes from.  It’s probably the first time in history when most people who have children haven’t had much experience with children before – and they’re understandably anxious. But I do think it’s a sad irony that we spend billions on these basically useless products, and very little to support the caregivers – parents and preschool teachers and babysitters who actually make a real difference to how children grow up.

On a related note, what’s your advice to parents of babies and young children who want their kids to intellectually and imaginatively be all they can be?

I’m afraid this is a case where the psychological wisdom really is pretty banal – talk to your children, read to them, pay attention to them—but not too much. Let them watch you. Give them a rich environment to learn in, with the understanding that from an evolutionary point of view the ideally rich learning environment for babies probably involved large amounts of mud, relatives and livestock. Unfortunately, second cousins once removed, jolly great uncles and friendly pigs are in short supply nowadays. But a sandbox, friends and babysitters, and a bean plant and a goldfish will do very well. And though it’s banal, it’s a lot more than many parents can manage.


Filed under Interviews

The State of Science Journalism: A Discussion with Science Writer Carl Zimmer

Science journalism is taking it on the chin lately. Major news outlets are curtailing their science coverage, and in some cases closing down science departments altogether. In a rough economy–when the overall future of print media is in question–it seems that science has been deemed expendable. In a sense, this is inexplicable, considering the fact that science has never failed to capture the public’s imagination. But with budgets tumbling, that fact isn’t keeping water out of the boat.

Carl Zimmer, science journalist and author, has spent a lot of time thinking about the state of science journalism. As a contributor to the New York Times, Discover, Scientific American, Science and Popular Science, he’s in an excellent position to weigh in on the current situation. I asked him if he would, and he graciously accepted. 

DiSalvo: From your vantage point, what’s the trajectory of mainstream science journalism?  Going up, going down, going elsewhere?

Zimmer: Hanging in there. There certainly have been a lot of depressing pieces of news about the science journalism business in recent months. CNN gets rid of its science unit. The Boston Globe eliminates its science section. Science magazines are still surviving, although Scientific American just lost its editor-in-chief John Rennie and other staffers. But in order for people to get excited enough about science to pick up a science magazine, somebody’s got to lure them in and show them why science is important and marvelous. It will be a shame if there’s less science in the public square. But I still think there’s a great passion for science out there. We just have to find ways to satisfy that passion in a sustainable way.

Along with those major outlets shutting down their science departments, we’re also seeing some going completely online. Is the same economic engine driving both of these trends?

Zimmer: I think so. When media outlets start suffering, they look around and try to figure out what they can live without. Unfortunately, a lot of them decide science can go. As far as I can tell, though, it doesn’t look like these stop-gap measures can stop the decline. So perhaps the people who are cutting science sections should ask whether they are actually still producing something people want to read.

If economics are trumping science, how much of a chance does science journalism stand?

Zimmer: I have hopes for science journalism in the long-term. It may evolve into some unexpected forms, but it will stay with us. Consider Stephen Colbert. He’s regularly got scientists and science writers on to talk about their work. And the people who watch that show go out and buy lots of books about science. Four years ago, could anyone have predicted that a guy doing a Bill O’Reilly impression on a comedy station would become a major new force in popularizing science?

Non-traditional outlets, like blogs, are more and more filling a role of delivering science content – but I know you’ve expressed some concerns before about blogs not having the resources to do solid investigative journalism, like a newspaper would.  Do you think that function is really in jeopardy – or, perhaps, morphing into something else?

Zimmer: I can’t think of a blogger who arrived on the scene and declared that she or he was going to replace science journalists. They had something else in mind. They wanted to write about beetles or attack medical quacks or do something else that filled them with enough passion to figure out how to use WordPress.  So I don’t think science bloggers ever claimed that they would replace science journalists. They were certainly ready to call out the bad science journalists and explain why they were peddling misinformation. But that’s different.

I could imagine science bloggers attracting some of the readers who would have turned to science journalism otherwise. But there’s some stuff that won’t be replicated if science journalism were to disappear. The plain fact is that it costs money to send people places to report. And science often happens in the remotest places on Earth, like the South Pole or Pacific islands or on top of mountains. Bloggers blog mainly for love, and love won’t get you to Mount Kilimanjaro.

I’ve heard some make the argument that if the mainstream behemoths of science news go down, we’ll be awash in disinformation coming from all sides.  Is that a fear you share?

Zimmer: I do share that fear, but I also don’t think that’s a reason for mainstream behemoths to be smug and self-satisfied. I have been appalled at the lack of fact-checking at the Washington Post when George Will decides it’s time for him to be the expert on global warming. Indeed, every week you can delve into plenty of blog posts in which some irate scientist rails against bad reporting about science in some newspaper, on some television show, or on the radio.

If people can no longer get their science fix from major newspapers or magazines, they will turn to the Internet. And some of the most popular sources of science on the Internet are nightmares of nonsense. For some reason, the Huffington Post delights in a ludicrous cocktail of anti-vaccination disinformation, New Age gibberish about swine flu being in your mind, and other anti-scientific posts.

For the average reader looking for truthful content, how can someone distinguish the good from the bad?

Zimmer: It’s an iterative process. Readers shouldn’t just passively absorb what they see in print or on the monitor. They should explore, and the web means there’s no excuse not to. You may quickly discover critics who point out real problems with something you just read, or you may find that it jibes with what the experts are saying. There are also some good clues to the stuff you can trust—it frequently cites peer-reviewed research (and does so accurately), for example, and it doesn’t rely simply on name-calling (although that can be fun in the proper dosage).

If it’s clear—and it certainly seems to be–that fact-checking ain’t what it used to be, then why do we trust the big guys so much? Should we?

Zimmer: My criticisms are focused on the op-ed pages of newspapers and the fake scientific debates sometimes organized on television. These are places where people can make factually false statements, and no one gives them a hard time about it because it’s opinion. There are still plenty of places where science journalism is carefully fact-checked. That doesn’t mean mistakes don’t get through—but if they do, corrections are published.

Right now, where is the best science content coming from?  Is it an array of pubs, or a handful? (and which might you suggest for those wanting a regular dose of quality science news?)

Zimmer: I’m totally biased here, so feel free to ignore what I have to say. But the fact is that I like to publish in places where there’s a lot of reporting I like to read–The New York Times, Scientific American, Discover, Science Magazine, National Geographic. The New Yorker may go a few weeks between science stories, but they often have very intriguing stuff.

Any new books coming out from you that we should be on the lookout for? 

Zimmer: I’m putting the last touches on The Tangled Bank: An Introduction to Evolution. It’s a non-majors textbook with lots of illustrations and some of the weirder case studies of evolution I’ve encountered, such as the sexual arms race in ducks, mind-controlling parasites, and the evolution of snake venom. It should be out in November, to coincide with the 150th anniversary of The Origin of Species.

Link to Carl Zimmer’s excellent science blog, The Loom


Filed under Interviews

Are We Born Believers or Cultural Receivers? A Discussion with Author and Psychologist Bruce Hood

Few topics in psychology are gaining more momentum than the origin of religious beliefs. Questions of whether we’re born with neural apparatus that predisposes us to belief, or whether we learn to become believers, or some combination of both, are on the minds of researchers from all quarters. Bruce Hood, experimental psychologist at Bristol University, is a groundbreaker among the curious. In his new book, SuperSense: Why We Believe in the Unbelievable, Hood argues that we are each born with an innate “SuperSense” rooted in our capacity for intuitive reasoning. Drawing on recent research and historical examples, Hood convincingly makes a case that supernatural belief arises spontaneously well before cultural influences assert themselves.

I recently spent a few minutes with Professor Hood discussing his SuperSense theory and other topics from his book.

DiSalvo: What is the “SuperSense”?

Hood: The SuperSense is a set of related beliefs that there are hidden dimensions to reality manifesting as energies, patterns and forces accounting for paranormal claims and experiences.

How would you describe these “hidden dimensions to reality”?

Hood: People infer from their experience that there are hidden forces, energies and entities operating in the world. From luck to God, people think there are hidden elements.

The main argument of the book is stated as, “Children generate knowledge through their own intuitive reasoning about the world around them, which leads to both natural and supernatural beliefs.”  Do children arrive at supernatural beliefs entirely before culture makes an imprint on their developing minds? If that’s true, then what’s the role of culture, to provide content for these beliefs?

Hood: Not entirely. Rather, children are inclined toward those beliefs from culture that resonate with what they believe could be possible. For example, pre-school children do not understand death as a final state. When they are told that something has died they want to know where it has gone, so after-life beliefs either from religion or paranormal accounts are readily accepted.

However, if there were no culture to feed children after-life beliefs, then the SuperSense theory predicts that such notions would arise spontaneously. What we need is a “Lord of the Flies” scenario to test this prediction, but the fact that after-life beliefs are present in all cultures strongly suggests that this is a universal belief.

You discuss the history of supernaturalism and mention a couple of surprising examples: one is the “lion-man” statuette that dates to 32,000 years ago, and another is evidence of ritualistic burials dating back 45,000 years.  What do these examples tell us about our species’ supernatural leanings?

Hood: The lion-man statuette found in Germany is a therianthrope – part human, part animal. Many of the early religions dealt with such mythical creatures and possibly reflect early man’s preoccupation with hunting and how to improve success by appeasing the animal gods. This is still present in a number of remaining hunting societies such as the Inuit.

I also think that it is no coincidence that therianthropy resembles young children’s early anthropomorphism which is to attribute human properties to animals, though I neglected to cover this in SuperSense. The early emergence of these beliefs, including ritualistic burials, in the dawn of civilization across differing groups strongly suggests that they are part of our mental machinery.

You say that the number one reason people believe in the supernatural is their own personal experience – and this for many is a sort of permanent inoculation against scientific explanations. In light of that, is all of the time and energy spent by the scientific community and others to dispel supernatural explanations simply going to waste?

Hood: No, education is always a good thing. But my point is that even scientists can seem to ring-fence their supernatural beliefs from their science. For example, I recently attended a conference and discussed the nature of the book with colleagues. What surprised me was that many colleagues admitted to superstitious rituals when it came to submitting grants and papers for review.

What do you think explains religious scientists? 

Hood: Some scientists seem to happily shift between the rational, analytical approach they take to their topic and the supernatural aspects of their spiritual life. Some endorse religion as a community activity whereas others are quite happy to entertain the possibility of secular supernaturalism.

I find it ironic that psychology, which is often considered a “soft” science, has more skeptics than physics, for example. Maybe psychologists are more familiar with the foibles of the human mind whereas physicists understand the rules that govern the natural world.

You spend some time in the book discussing “intuitive essentialism” and how many supernatural beliefs arise from a sense of “essentialist violation.”  Please explain a bit about this.

Hood: Intuitive essentialism is an untaught notion that children spontaneously develop that living things have an inner substance (essence) that makes something what it truly is. For example, pre-school children understand that there is a ‘doggy’ essence that makes dogs different from cats, which have a ‘catty’ essence. Well, there is. It’s called DNA, but no pre-school child is taught this. They simply infer this as part of their intuitive essentialism.

This is why apparent violations of the essence–for example, mixing up species through either weird experiments or genetic modification–clash with our notion that the essential integrity of an individual should not be violated.

How does this tie back to supernaturalism?

Hood: Well, in the book I explain how psychological essentialism helps the child to carve up the living world into different species and that this reflects an inference of essential integrity. Hence, violations (such as genetic modification) challenge the identity of the animal. Psychological essentialism also explains why individuals feel there is something inside living things that can be imbibed or transferred; this property is considered the vitalistic essence of the animal. This is why we eat certain foods for virility or animal strength, for example.

We all have, you say, “totems and sacred objects,” religious or otherwise, and that within each of us is a “sacred super sense.” When I read that, I immediately thought of the pushback you’ll likely receive for using the word “sacred” in reference to the secular. Why does “sacred” fit here for you as a defensible description? 

Hood: I borrowed the term from the economic psychologist Philip Tetlock, who talked about sacred values, a set of taboos and beliefs that members of a social group share. They are sacred because members of the group should not violate them, and hence they operate to sustain cohesion in the group.

In SuperSense, I argued how sacred values could be the totems, places, objects, rituals and all manner of items that are thought to have supernatural powers. These objects are deemed to be supernatural because of our inherent SuperSense and so they are revered as being profound rather than mundane. Otherwise, they would be just ordinary clumps of matter – which, of course, is exactly what they are, as indeed we all are.


Filed under Interviews

Diagnosis: DREAD – Talking about Epidemics, Panic and the Revenge of the Germs with Philip Alcabes

By David DiSalvo        

It’s a huge understatement to say that panic is part of human nature. We’re all wired to anticipate threats and experience nervous system overdrive when they arrive – our species wouldn’t have made it this far if we didn’t. But what happens when the anticipation itself is enough to trigger heart-pounding panic? And stranger still, why do threats as rare as they are vague cause more panic than threats that surround us every day?

Those are a couple of the questions that infectious-disease epidemiologist Philip Alcabes set out to investigate in his newly released book, Dread: How Fear and Fantasy Have Fueled Epidemics from the Black Plague to Avian Flu. What he couldn’t have known, however, is that his book would begin hitting bookstore shelves just as swine flu began consuming the public consciousness – providing a more than timely example of a dread-catalyzing threat with mass-panic potential.

Dr. Alcabes spent some time talking with me about epidemics real and imagined, how we respond to threat-inspiring messages in the media, and why our attention is riveted by remote threats while the tangible ones close to home barely register.

 

We’re right in the middle of what appears to be a full-blown epidemic just as your book is hitting the shelves. What’s your take on what we’re seeing in the news?

All epidemics are stories.  They often have a widespread disease at their core (often but not always, as the epidemics of “cyber-stalking” and school shootings attest).  But the numbers of the sick, dying, and deceased aren’t the main aspect of the story.  There have been 50-odd deaths associated with the new flu strain as we speak.  Does 50 deaths make for an epidemic?  That’s less than the death toll on American highways and roads on the average day.  It’s less than the toll taken by malaria in Africa in any one-hour period of any day.  It’s sad, and it’s a frightening reminder of the randomness of nature’s deadly bite.  But 50 deaths from accident, incident, or infection doesn’t always constitute an epidemic for us.

In fact, the numbers of cases of swine flu and the flu death rate are both quite low in comparison to the normal situation with seasonal flu, the “bug” that comes around every winter.  If this were January, we might not even have noticed this outbreak, as it would have been hidden by the far larger and more lethal outbreak of plain-vanilla flu.  In fact, if in any given winter the death rate from flu were as low as it’s been in this springtime outbreak, we’d be relieved and call it a mild flu season.

 

But would you agree that it’s truly an epidemic – the “real thing”?

Yet, I would agree that this is an epidemic — simply because that’s what people say.  In fact, as we speak, the W.H.O. has raised the “pandemic alert” to 5 on a scale of 6.  Our officials are leading the way in making sure that this small outbreak (it has affected a handful of countries, with about 2500 cases in Mexico so far, 90-odd in the U.S., and scattered clusters elsewhere) is indeed defined as an epidemic.  Possibly a pandemic.

The question I ask myself is, why is it so important to us to see this small, thus-far mild outbreak of flu as a scary situation?  Why should W.H.O. feel the need to act? 

In part, it’s because we’ve been primed for this.  Our health agencies (the W.H.O. most notably) have been telling us for years that a flu pandemic is “inevitable.”  All those agencies needed a case-in-point to justify their dire warnings, otherwise the “pandemic preparedness” campaign might have gone the way of the prior “bioterrorism preparedness” campaign (2002-2004):  simply withered away from lack of interest.

But more deeply, the preparedness rhetoric influenced our thinking.  Repeatedly gesturing toward the terrible 1918 flu outbreak, in which tens of millions of people died worldwide, authorities and flu researchers reminded us to think of 1918 when we think of flu.  The result, as we see now, is that the few facts available about the new flu serve as the basis for projections of our horror fantasies.  People (again, including W.H.O. officials) talk about the inevitability of a “pandemic,” about the likelihood that there will be more cases and more deaths. 

So, if by “real thing” you mean, is this a public health problem, I’d say yes.  People are sick with a contagious disease.  More might fall ill.  It demands attention from public health authorities.

But if you mean, is this the disaster that is being depicted, I would say not yet, and probably not ever.  The problem is that once the fantasy scenarios start being painted, the facts become scenery on the stage.  It’s the fear that drives the drama.  We’ll undoubtedly see more fear-driven pronouncements.  I hope we’ll also see good public health.

 

We’re hearing some health officials say that this flu is a harbinger of diseases to come — an evolved mutant virus combining multiple strains.

This is, simply, influenza.  What flu does is switch back and forth between species, recombining genetic elements, mutating here and there, “reinventing itself,” to use the term of art.  I suspect that calling it “swine” flu gives it a certain pernicious cachet, “swine” being associated with filth in the language.  But it seems important to us to label this virus with its own name, not just as flu but “swine flu,” as if it had some special status.  I think the naming helps us to be frightened.

Is it a harbinger of the future?  Well, I don’t have that particular crystal ball.  A lot of people who call themselves flu researchers and whom the media refers to as “experts” are fond of making predictions about pandemics, as if they could see the future.  This has gotten us into trouble at least once, in the swine flu immunization fiasco of 1976 (when hundreds of Americans were sickened by a flu vaccine and over 30 died, yet there was no serious outbreak).  And it gets us into trouble when, as with “bioterrorism,” we spend a fortune protecting ourselves from a chimeric threat. 

 

But how do we plan to protect public health unless we make some predictions about possible outcomes?

I think we have to draw a distinction between sensible planning for sound public health programs based on observable facts, and so-called predictions that are really just projections of horror fantasies.  We have to be careful with this flu outbreak, because, as I said earlier, there are a lot of fantasies afoot, and because many of them hark back to 1918.  We have to remember that the world is a very different place than it was in 1918.  We have to do good public health to ease suffering and control disease — but we don’t want to get into the business of divining the future.  We should stick to what we know, and can see, and what we know how to do about what we can see.



Filed under Interviews

Can You Outsmart Your Genes? An Interview with Author Richard Nisbett

While the debate over intelligence rages on many fronts, the battle over the importance of heredity rages loudest. It’s easy to see why. If the camp that argues intelligence is 75 to 85 percent genetically determined is correct, then we’re faced with some tough questions about the role of education. If intelligence is improved very little by schools, and if the IQ of the majority of the population will remain relatively unchanged no matter how well schools perform, then should school reform really be a priority?

More to the point, if our genes largely determine our IQ, which in turn underlies our performance throughout our lives, then what is the role of school? For some in this debate the answer to that question is simply, “to be the best you can be.”  But that seems little comfort for those who aspire to “be” more than what their IQ category predicts they will.

Those on the other side of this debate question whether heredity plays as big a role as the strong hereditarians claim.  And for the role it does play, they question whether heritability implies immutability. The heritability of height, for example, is about 90 percent, and yet average height in several populations around the world has been steadily increasing due to non-genetic influences, like nutrition. If such a strongly heritable trait can be radically altered by environmental factors–and height is but one example of this–then why is intelligence different?

It is not, argues the camp that might best be described as intelligence optimists.  For them, the pessimism that colors the strong hereditarian position isn’t only discouraging, it’s dangerous. Too much is hanging in the balance for pessimism about the potential of our children to prevail.

Richard Nisbett is a champion of the intelligence optimist camp, and with his latest book, Intelligence and How to Get It, he has emerged as the most persuasive voice marshalling evidence to disprove the heredity-is-destiny argument.  Intellectual advancement, Nisbett argues, is not the result of hardwired genetic codes, but the province of controllable factors like schools and social environments–and as such, improving these factors is crucially important.  In the thick of controversy, he was gracious enough to spend a few minutes discussing his book with Neuronarrative.

In Intelligence and How to Get It, you counter the arguments of strong intelligence hereditarianism, but in a sense you’re countering heritability dogma overall. What led you to take on this challenge?

My only complaint was with the heritability of intelligence per se. I just had the strong intuition that intelligence, and certainly IQ scores, were heavily influenced by the environment and by gene-environment interactions. My research indicates that in fact heritability, especially for adult IQ, is substantially less than frequently assumed.

One of the topics you discuss in the book is that drawing inferences based on correlations often produces misleading results. What’s an example of this in the case of intelligence?   

The correlation between identical twins reared apart gives an overestimate of heritability because the environments of identical twins reared apart are often highly similar. But the main contradiction of heritability estimates lies in the fact that adoption produces a huge effect on IQ – much bigger than could be explained if you believed the conclusion of heritability estimates based on sibling correlations.
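(An illustrative aside, not Nisbett’s own analysis: the toy simulation below assumes a true genetic share of variance of 0.5 and “separate” environments that are in fact moderately correlated, say because twins tend to be placed in similar homes. Reading the twin correlation straight off as heritability then overshoots badly. Every parameter here is an assumption chosen only to make the mechanism visible.)

    # Toy simulation: correlated "separate" environments inflate heritability
    # estimated from the IQ correlation of identical twins reared apart.
    # All parameters are illustrative assumptions, not empirical values.
    import numpy as np

    rng = np.random.default_rng(0)
    n_pairs = 100_000
    true_h2 = 0.5      # assumed true genetic share of variance
    env_corr = 0.6     # assumed similarity of the twins' "separate" environments

    g = rng.normal(size=n_pairs)            # genetic value shared by each twin pair
    shared_env = rng.normal(size=n_pairs)   # environmental overlap from similar placements
    e1 = np.sqrt(env_corr) * shared_env + np.sqrt(1 - env_corr) * rng.normal(size=n_pairs)
    e2 = np.sqrt(env_corr) * shared_env + np.sqrt(1 - env_corr) * rng.normal(size=n_pairs)

    twin1 = np.sqrt(true_h2) * g + np.sqrt(1 - true_h2) * e1
    twin2 = np.sqrt(true_h2) * g + np.sqrt(1 - true_h2) * e2

    # The classic shortcut treats the twin correlation itself as the heritability estimate.
    estimated_h2 = np.corrcoef(twin1, twin2)[0, 1]
    print(f"true h2 = {true_h2:.2f}, naive estimate = {estimated_h2:.2f}")  # ~0.50 vs ~0.80

The inflation is just (1 - true_h2) * env_corr added on top of the genetic term, which is why even modest placement similarity can push the estimate well above the truth.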

You discuss the importance of early childhood education and provide some compelling statistics on the IQ-boosting effects of preschool. Why in a nutshell is early education so essential?

This is speculative at this point, but here goes. It is beginning to look like the IQ deficits of poor minority kids begin extremely early and have to do with rearing techniques. Parents of such kids don’t talk to them much and don’t do things that would stimulate intelligence. At any rate, we know of several socialization practices that correlate substantially with IQ, and for all those practices parents of poor minority kids are on the low side.

If a child doesn’t receive quality early education, will he or she still be able to bridge the gap later on? 

We do know that interventions as late as early adulthood can have a big effect on IQ and academic achievement. College reduces the IQ gap between blacks and whites from one standard deviation (SD) to .4 SD. Just telling junior high school kids that their intelligence is under their control can produce a gain in GPA. You can put a great deal of educational effort in at middle school and junior high ages and produce marked IQ and academic achievement gains.

You mention that children with greater self-control tend to have higher intelligence.  How are these linked, and is it reasonable to conclude that increasing self-control raises intelligence? 

This is speculative. We know there is a correlation between self-control and intelligence, especially between self-control and both ACT achievement and SAT scores. What we don’t know is whether this relationship is causal. I don’t doubt that it is, but I can’t prove it.

We now know that the brain isn’t a static entity, but rather possesses remarkable plasticity – even, to a degree, well into adulthood.  In light of this, and your own research, is it possible for adults to still boost their IQs?  

We know that you can increase fluid intelligence even in adults by some kinds of computer-game-like programs. But that work is in its infancy. We know also that the hippocampi of London taxi drivers are 25 percent larger than normal – due to an increase in the spatial relations requirements of the job.

I took away the sense from reading the book that you’re a hopeful realist.  If we could begin making changes to our educational system today, what do you think are the most important things we can do to create a brighter future for our kids?

Really effective intervention with parents of low-socioeconomic-status infants to help them with socialization practices, really good pre-K, and KIPP-type elementary and middle schools.

I am hopeful, for sure. In principle you could have all these things for the bottom third of socioeconomic-status families for less per year than the bailout of AIG. But I hasten to say that we don’t really know how well any of the programs shown to be effective in demonstration projects would scale up.


Filed under About Intelligence, Interviews

Psychology for Dummies: An Interview with Authors Laura Smith and Charles Elliott

Depression for Dummies, Overcoming Anxiety for Dummies, Obsessive Compulsive Disorder for Dummies – these are just a few of the titles penned by Dr. Laura Smith and Dr. Charles Elliott, a writing duo with a library of psychology and self-help books between them.  Tackling challenging topics with an accessible style is their specialty, and has allowed millions of readers to gain a better understanding of anxiety, depression, OCD, and borderline personality disorder, among other topics.

They recently spent some time discussing the For Dummies series and a variety of psychology issues and questions with Neuronarrative. 

You’ve written several books on depression, anxiety, OCD and related topics, including some of the wildly popular For Dummies books. What led you to the Dummies format to address these topics?  

Yes we have; in fact, we’ve just finished our sixth book in the series. As clinical psychologists, we’ve read dozens of self-help books. Most of them focus on how to deal with some specific mental disorder such as depression, obsessive compulsive disorder, or generalized anxiety disorder. Some of these books ignore empirical findings and present an interesting, but highly idiosyncratic and non-data-based set of recommendations. Many of the better books in this genre are written by highly renowned researchers and do a great job of presenting the findings from a specific researcher’s approach to the disorder. However, in the past couple of decades, the mental health field has managed to develop a number of empirically based treatment strategies for most emotional disorders. We believe people can profit from knowing about a range of strategies so long as they rest on a research base.

In the For Dummies series, we saw an opportunity to provide consumers with an unusually comprehensive approach to each topic covered. Thus, in all of our books we discuss a variety of empirically supported treatment approaches, diagnostic issues and controversies, related disorders, etiology, prevalence, where and how to find professional help, and ideas for how friends and family can facilitate treatment. For example, in Obsessive Compulsive Disorder For Dummies we discussed the fact that twenty years ago, the only treatments for OCD were exposure and response prevention (ERP) and medication. We were able to review not only ERP, but new mindfulness-based approaches, cognitive therapy specifically tailored to OCD, medications, and Deep Brain Stimulation (a very preliminary, but possibly promising strategy).

At the same time, we appreciate the For Dummies series for its nontechnical, no nonsense approach to presenting information. We really enjoy taking complex subjects and presenting them in a way that enables intelligent consumers to understand a topic that may be new to them. Finally, we were thrilled that the editors also encourage the use of humor and a panache of irreverence. We believe that readers enjoy a touch of levity when reading about such serious subjects.

Some fear that the proliferation of medical information, particularly on the internet, is causing widespread self-diagnosis panic.  What’s your take on this?

We’re firm believers in the value of information. No doubt, some people panic when they discover on the Internet that they may have a couple of symptoms of some serious disorder yet later learn that they don’t really have the actual illness or disease. But we suspect that for all those who are unnecessarily rattled by what they read, many more discover that they suffer from problems that they were unaware of, but that can be successfully treated–and generally with greater success than they would have had by not starting treatment until their doctor discovered something at a physical exam months or years down the road.

News concerning the development of psychiatric disorders in children, such as OCD, is on the rise. In your opinion, are parents getting better at identifying symptoms in their kids?  And have doctors become more willing to consider the possibility that a child needs psychiatric help? 

We do believe that both parents and doctors have greater awareness about these issues than ever before. That awareness is certainly one of the reasons we see disturbing trends in the rise of various mental health issues in kids today as compared to the past. At the same time, some evidence suggests that more than increased awareness lies behind the escalating numbers we’ve seen in the past fifty years or so. Several studies have suggested that the rate of anxiety and depression in kids today greatly exceeds levels we’ve seen in the past.

We’re also concerned that there appears to be an overreliance on medications for dealing with these issues. Potent medications are increasingly being prescribed to kids for disorders that were once considered rare in children, such as bipolar disorder. We suspect some of these diagnoses are given instead of behavioral disorder diagnoses so that these medications may be employed.

We take a more conservative approach to medications in kids because of a dearth of long term safety and efficacy studies. In fact, some studies have shown that many of these medications significantly increase risks of diabetes and sometimes set off suicidal thinking. Therefore, our usual recommendation is that treatments should first target the child’s problematic behaviors or moods as well as involve parents, the family, and the school environment. Cognitive behavioral interventions have been found to be especially effective and often obviate the need for medications. When they don’t, medications can be considered, but as judiciously as possible. 

Tell us what “anxiety” really is in a clinical sense, and how it’s different than “a case of nerves” that everyone occasionally feels.

You’re correct that everyone feels stress and gets a case of nerves from time to time. You couldn’t live a meaningful life without them. Normal anxiety occurs when you’re faced with real challenges and hassles. Normal anxiety can even prepare you to deal with such challenges more effectively. Some experts call this type of anxiety facilitative anxiety. Normal anxiety dissipates when the problem is solved or diminishes.

Think of preparing for an examination. If you have no anxiety or worry at all, you’re likely to feel little motivation to prepare. If you’re moderately anxious, you’ll spend a lot more time studying. If your anxiety goes over the top, you may study a lot but be unable to concentrate, or you may deal with the anxiety through procrastination or avoidance of the task. In other words, complete absence of anxiety isn’t always such a good thing, moderate amounts can help, and excessive anxiety interferes with performance.

Clinical anxiety debilitates rather than facilitates. By definition, most anxiety disorders persist for months. They involve reactions that exceed the objective nature of whatever seems to trigger them, and in some cases no trigger is even easily identifiable. Clinical anxiety comes with strong physical symptoms such as fatigue, restlessness, interrupted sleep, poor concentration, muscle tension, and irritability. Clinical anxiety reduces quality of life.

When it comes to prescribing meds for anxiety, the voices of dissent are many. Tell us something about the controversy surrounding benzodiazepines and other anxiety meds.

Evidence suggests that many anxiety disorders are treated especially effectively with certain psychotherapies, most of which are based on cognitive behavior therapy. Thus, we would rarely suggest medications as the first-line strategy. Benzodiazepines, although frequently prescribed for anxiety by general practitioners, are especially problematic for a variety of reasons, including:

  • Benzodiazepines have a significant addictive potential which may be heightened among those with anxiety disorders.
  • Some data suggests that these drugs may actually increase the risk of relapse when combined with cognitive behavior therapies for anxiety.
  • Benzodiazepines increase the risk of falling among the elderly and one study showed they may double the risk of automobile accidents. Of course, combined with alcohol, these risks escalate considerably.
  • One study suggested that taking benzodiazepines on a prolonged basis shortly following a trauma actually increased the risk of developing Post Traumatic Stress Disorder later.

This is not to say that medications have no role to play in the amelioration of anxiety disorders. However, we generally recommend that any use of benzodiazepines be strictly limited to short-term acute stress. Other medications such as the Selective Serotonin Reuptake Inhibitors (SSRIs) and Serotonin Norepinephrine Reuptake Inhibitors (SNRIs) sometimes appear useful as adjuncts to psychotherapy, especially when appropriate therapies have failed to result in sufficient improvement.

Diagnosed cases of depression are also rising, and some believe that it’s an increasingly over-diagnosed illness.  What do you believe is behind the surge in depression cases?  

Of course, doctors and mental health workers are more aware of the symptoms of depression than ever before. In fact, increasing numbers of primary care offices provide screening instruments for depression and anxiety. The public has also become more aware of the symptoms of depression due to a bombardment of advertising, largely paid for by the pharmaceutical industry. For the most part, this increased awareness is a good thing and it has steered many people into earlier treatment of their distress.

At the same time, we have concerns about the fact that antidepressant medications are today the most prescribed drug for any condition, whether mental or physical. We worry that doctors may be prescribing medications for what used to be considered subclinical conditions. People receive prescriptions after breaking up with a boyfriend, being fired from a job, or failing to be accepted into college. Life transitions such as moving to a new city or coping with grief or loss sometimes trigger a trip to the doctor’s office for relief. When emotional reactions to such events are unusually profound or prolonged, medication may be warranted in some cases. However, there’s value in struggling with sadness, worry, and loss. From times such as these, humans develop new philosophies, literature, and creative solutions. We’d hate to see medications used to stifle struggles. Happiness is better appreciated when one has experienced sadness.

One of the continuous controversies surrounding psychiatric disorders is the influence of drug companies on the prescription–perhaps the over-prescription–of certain meds.  How large a concern do you think this really is? 

The heat has turned up on this controversy in the past few years as increasing numbers of articles have appeared indicating that pharmaceutical companies have frequently engaged in questionable practices such as holding back negative results from publication. In addition, many authors of medication studies have failed to disclose their substantial financial ties to the pharmaceutical industry. Finally, it appears that many of the authors of clinical practice guidelines have had significant financial arrangements with the pharmaceutical industry.

If you could choose any topic, which would it be for your next Dummies book?

We’d really like to answer this question. However, we’re currently discussing the next project or two with the For Dummies publisher (Wiley). We can say that one of these projects excites us more than any of the others we’ve done to date as we see a huge need for an accessible book on this topic.

Link to the authors’ website

Link to the authors’ blog


Filed under Interviews