Category Archives: About Perception

Did You See the Gorilla? An Interview with Psychologist Daniel Simons

If you’ve spent any time on YouTube over the last few years (and you know you have), you’ve likely seen the video of the invisible gorilla experiment (if you’ve somehow missed it, catch yourself up here). The researchers who conducted that study, Dan Simons and Chris Chabris, didn’t realize that they were about to create an instant classic—a psychology study mentioned alongside the greats, and well known outside the slim circle of psych wonks. Milgram taught us about our sheepish obedience to authority; Mischel used marshmallows to teach us about delayed gratification; and Simons and Chabris used a faux gorilla to teach us that we are not the masters of attention we think we are.

The duo’s new book, The Invisible Gorilla: And Other Ways Our Intuitions Deceive Us, is every bit as engaging as the original study was innovative. Using the invisible gorilla study as a jumping-off point, the authors go on to explain why so many of our intuitions are off the mark, though we’re typically convinced otherwise. I recently had a chance to chat with Dan Simons about the study, the book, and why we’re usually in the dark about how our minds really work.

DiSalvo: What gave you and Chris Chabris the idea for the invisible gorilla study?

Simons: Our study was actually based on some earlier research by Ulric Neisser conducted in the 1970s.  His studies were designed to tease apart whether people focus attention on regions of space or on objects.  He wanted to see whether, if people were focusing on one part of a scene, they would automatically notice if something unexpected passed through that “spotlight” of attention.  To do that, he made all the objects partly transparent so that they all occupied the same space and could pass through each other. He found that people often missed an unexpected event.  But the strange, ghostly appearance of the displays gave people a ready excuse for why they missed it. Oddly, no one followed up on those studies, so we thought we’d give them another look and see whether people would miss something that was fully visible and easy to see.  We ran our study as part of a project in an undergraduate class I was teaching.

Why the gorilla suit?

We were looking for something dramatic so that if people missed it, they would be surprised when we showed it to them again.  We also wanted something that would have some humor value to it.  Fortunately for us, Jerome Kagan, an eminent developmental psychologist at Harvard, happened to have one in his lab.

I remember the first time I watched the YouTube video of the study and was completely dumbfounded when the question, “Did you see the gorilla?” flashed on the screen.  I can imagine that, for you as researchers, getting that reaction from people is like hitting a home run.

It surprised us the first time we ran the study – we didn’t expect it to work as well as it did.  It’s still a thrill to present the video to an audience and have people miss it.  Our intuition that we’ll notice something as visible as a gorilla is a hard one to overcome.  It took me years before I could trust that some people in almost any audience would miss it.

What do people tell you about their reaction afterwards?

Normally people can’t believe that they missed it.  On occasion, they’ve accused us of switching the video. The intuition that we would notice makes it jarring for people to realize that they didn’t.

And that’s really the point, right, that we can’t know what we are missing until our attention is refocused on it?

That’s a big part of it.  We can easily miss what’s right in front of us, but we don’t realize that we can.  Part of the problem is that we’re aware only of the things we notice, not the things we miss.  Consequently, we often have no idea what we’re missing.

Hence the myth of multi-tasking.

It depends on what you mean by multi-tasking.  If you mean simultaneous attention shared across multiple tasks, then yes, it’s a myth. We typically cannot do two things simultaneously.  We can perform multiple tasks one after another—a sort of serial tasking.

In the case of the first meaning, simultaneous attention across multiple tasks, why do you think so many of us are convinced we can do it?

I think a lot of people confuse these two possible ways of doing multiple tasks.  Because we can do one task and then another, switching back and forth among them, we falsely believe we can do two at once.  That confusion happens in part because we don’t realize how impaired we are when doing two things at once.  We’re too distracted to notice that we’re distracted.  That has dramatic consequences.  For example, we can’t safely talk on the phone while driving, because both tasks demand attention and we’re doing them at once rather than sequentially.

Where does the intuition originate?

Our intuitions are based on our experiences. The problem is that our daily experiences frequently support incorrect intuitions about how our minds work.  We are aware only of the things we’ve noticed, not the things we’ve missed, so we assume that we always notice things.  We don’t notice when we’re distracted by multitasking, so we think we aren’t distracted.  The same sort of principle explains many of our mistaken intuitions.

But why wouldn’t we develop an intuition from our experience that we can’t divide our attention?

Our experience is tied to our awareness.  We are aware of what we notice, not of what we miss, so we develop an intuition based on noticing. The principle applies to multi-tasking: we are aware only that we accomplish multiple tasks, as daily life demands, but we aren’t aware that we’re not really doing them at the same time. As a result, we mistakenly assume that we can do two things at once.  Given that we rarely encounter evidence to contradict our awareness — normally, there’s nobody around to point out the gorilla — we don’t learn when our intuitions are wrong.

We see people all the time who know that very bad things can happen from, say, texting while driving, but they do it anyway.

That’s true, but most people could drive much of their lives without having an accident. And the longer they go without having an accident, the more they are deluded into thinking they can drive and text safely.  Fortunately, accidents are rare, but when they happen, they are catastrophic.  Knowing that we have these limits and taking them to heart can save our lives.  We learn best from our own experiences, but in this case, you shouldn’t wait to experience the consequences of distracted driving for yourself.

I can’t help but notice how so much of what we’ve been discussing runs counter to the conclusions of one of the most popular non-fiction books out there: Malcolm Gladwell’s Blink. Many people I’ve talked to who have read that book are convinced that we should trust our instincts instead of thinking things through.

The idea that intuition, gut instincts, and rapid decisions are a panacea for all of our decision-making problems is really dangerous.  Unfortunately, that’s the message some people have taken from Gladwell’s book.  Intuitions can be quite useful for some types of decisions, especially those that involve an emotional preference — whom do you find most attractive, which ice cream tastes best — but they can lead us dangerously astray when they are based on assumptions about how our minds work.  Gladwell is an incredible storyteller, but some of the conclusions he reaches in Blink are problematic.  Our work, and the work of other cognitive scientists, shows again and again that the intuitions people hold about how their minds work are often wrong.  When you dig deeper into the material he covers in Blink, you see that many of the featured examples are of expert pattern recognition, and that’s a very different thing from simply trusting intuition or instinct.

Like the example of a quarterback acting decisively without having time to think?

Yes, that’s expert pattern recognition.  Peyton Manning studies film for many hours in preparation for each game, and he has done that for years.  Then, in a game situation, he recognizes a pattern quickly, and that leads him to find the open receiver readily.  That said, even expert pattern recognition is far from perfect.  If you let Manning analyze the film at a leisurely pace, he’ll find things he missed during the game.  The same principle applies to most experts.  They can make reasonably good decisions quickly and seemingly based on intuition — they’ll outperform novices with only a glance.  But given more time, even experts would often make better decisions.

Yet the takeaway for many people is that “thinking” is a hindrance.

Thinking takes work, and the idea that we could go with our gut and do better is really appealing.  Unfortunately, it’s often not true.

What can we expect as a follow-up from you guys? Can you top the gorilla study?

It’s hard to top having people miss a gorilla.  I do have a new paper that just came out in the new open-access journal i-Perception.  It describes a new demonstration that I’ve called “The Monkey Business Illusion.”  It’s on YouTube now.  Basically, I wanted to see whether people who knew about the original gorilla video would be immune to this sort of failure of awareness.  Try it for yourself!

Link to the authors’ website.

Simons, D. (2010). Monkeying around with the gorillas in our midst: Familiarity with an inattentional-blindness task does not improve the detection of unexpected events. i-Perception, 1(1), 3–6. DOI: 10.1068/i0386

Filed under About Perception, About Research, Books and Ideas

Ask, Don’t Tell, and Get it Done

Are you the sort of person who routinely tells yourself that you probably can’t achieve whatever it is you’d like to achieve? Does the voice in your head say things like, “Be realistic, you can’t really do this”?  And perhaps, fed up with positive self-talk mumbo jumbo in the media, you think that the only self-talk worth listening to is the “realistic” kind—the kind that tells you how it is.

Well, whatever your feelings about positive psychology and its many spin-offs, there is some decent research with something to say about all of this—and your little voice should be listening. Research by University of Illinois Professor Dolores Albarracin and her team has shown that those who ask themselves whether they will perform a task generally do better than those who tell themselves that they will.

But first, a slight digression. If you have young kids or even early teens (or just have the misfortune of watching children’s TV shows), you may be familiar with the show “Bob the Builder.”  Bob is a positive little man with serious intentions about building and fixing things.  Prior to taking on any given task, he loudly asks himself and his team, “Can we fix it?”  To which his team responds, “Yes we can!”   Now, compare this approach with that of The Little Engine That Could, whose oft-repeated success phrase was, “I think I can, I think I can…”  In a nutshell, the research we’re about to discuss asked which approach works best.

The researchers first tested these two motivational approaches by telling study participants either to spend a minute wondering whether they would complete a task or to spend that minute telling themselves they would. The participants showed more success on an anagram task (rearranging the letters of words to create different words) when they asked themselves whether they would complete it than when they told themselves they would.

In another experiment, students were asked to write two seemingly unrelated sentences, starting with either “I Will” or “Will I,” and then work on the same anagram task. Participants did better when they wrote “Will I,” even though they had no idea that the writing exercise related to the anagram task.  A final experiment added the dimension of having participants complete a test designed to gauge motivation levels.  Again, the participants who asked themselves whether they would complete the task did better on the task, and they scored significantly higher on the motivation test.

In other words, by asking themselves a question, people were more likely to build their own motivation than if they simply told themselves they’d get it done.

The takeaway for us: that little voice has a point, sort of.  Telling ourselves that we can achieve a goal may not get us very far. Asking ourselves, on the other hand, can bear significant fruit. Retool your self-talk to focus on questions instead of presupposing answers, and allow your mind to build motivation around those questions.

A shortcut: just remember the battle cry of Bob the Builder.

Filed under About Perception, About Research

51 Pragmatic Suggestions: Mix, Match, Dismiss, Do!

1. Learn how to enter and exit the daily vortex (it will swallow you if you don’t)

2. Don’t give people the power to direct your life (because if you do, they will)

3. Beware those with an inflated sense of self (they’re always trying to expand their pyramid scheme)

4. Become a negotiator (it’s not a business term, it’s a life term)

5. Be nice (but not only nice)

6. Be tenacious (if you think you’re persistent enough, you probably aren’t)

7. Learn how to resolve into strategy (problem-solving must eventually rise to the top of consciousness even during the hardest times)

8. Stay grounded while you’re spiritualizing (and beware spiritual etherealites who aren’t)

9. Don’t become a sycophant, and don’t abide those who are

10. Don’t let a pursuit of your “purpose” short-circuit your passion (purpose isn’t verifiable, passion is)

11. Use mediocrity to find your edge (even doing poorly at something can be useful)

12. Know what you want (or at least try hard to figure it out)

13. Beware the mystification of entitlement (it’s a delusion that distorts reason)

14. Learn to enjoy competition (the race and the win)

15. Develop a love of “play” (it’s not kid stuff, it’s human stuff)

16. Beware the myth of predestination (and those who believe it)

17. Learn to use escapism, but don’t get lost in it

18. Learn to love culture, but don’t get drunk on it

19. Learn to appreciate business, but don’t deify it

20. Learn to manage expectations (yours and others’)

Continue reading

Filed under About Perception, Books and Ideas, Noggin Raisers

Getting Warmer, Getting Colder: The Chilly Paradox of Familiarity

People are strange when you’re a stranger.

– Jim Morrison

For most of us, familiar surroundings are comforting. Familiar places and faces offer a sense of stability in the maelstrom of everyday life. This seems especially true when we’re going through hard times; perhaps any port in the storm will suffice, but the one you know best is doubtless the one you’d rather find. 

But does familiarity hold the same value if we’re feeling on top of the world?  In other words, does the warm glow of what we know always stay strong despite our mood? 

A research report in the journal Psychological Science suggests that the warmth of familiarity intensifies or fades depending on the emotional state of mind we bring to it.

In a series of experiments, researchers compared participants’ reactions to familiarity under happy, sad, and neutral mood conditions. In the first experiment, they found that under general conditions, when mood was not manipulated one way or the other, people preferred familiarity. But subsequent experiments showed that sad participants preferred familiarity significantly more than those in the neutral condition did (indicated by both self-reports and facial electromyography, or EMG). Happiness, however, eliminated this preference.

It’s worth noting that happiness did not in any way reduce the level of familiarity – it simply reduced its value (decreased the “warmth of its glow”).

Continue reading

Filed under About Perception, About Research

Watching Too Much Crime TV Skews Views for the Worse

If you watch prime-time television, chances are you watch at least one crime drama. Most of us do. Year in and year out, the most consistently popular shows on television are about crime: CSI, Law & Order, Cold Case, The Closer, and all the other spin-offs and ad nauseam syndications.

Regrettably for the viewing audience, a recent study from Purdue University suggests that the more we feed this craving for crime drama, the more distorted are our views of the criminal justice system and crime rates overall.

Glenn Sparks, a professor of communication who studies mass media effects, and Susan Huelsing Sarapin, a doctoral student in communication, conducted 103 surveys with jury-eligible adults about their crime-television show viewing and their perceptions of crime and the judicial system. Their research was presented in October at the International Crime, Media, and Popular Culture Studies Conference: A Cross Disciplinary Exploration at Indiana State University.

“Many people die as a result of being murdered in these types of shows, and we found the heavy TV-crime viewers estimated two and a half times more real-world deaths due to murder than non-viewers,” Sarapin says. “People’s perceptions also were distorted in regards to a number of other serious crimes. Heavy TV-crime viewers consistently overestimated the frequency of crime in the real world.”

Viewers of crime shows also misjudged the number of police officers and attorneys in the total workforce. Lawyers and police officers each make up less than 1 percent of the workforce, but those surveyed estimated the figures at more than 16 percent and 18 percent, respectively.

The study also linked heavy viewership of these shows with “mean world syndrome” — the belief that the world is more dangerous than it actually is. Previous research by media scholar George Gerbner associates this syndrome with paranoia about imminent victimization. Quoting Gerbner:

Our studies have shown that growing up from infancy with this unprecedented diet of [TV] violence has three consequences, which, in combination, I call the “mean world syndrome.” What this means is that if you are growing up in a home where there is more than say three hours of television per day, for all practical purposes you live in a meaner world – and act accordingly – than your next-door neighbor who lives in the same world but watches less television. The programming reinforces the worst fears and apprehensions and paranoia of people.

The present study is especially interesting in light of recent Gallup stats on public perception of crime, as discussed on the blog Neuroworld, here. Crime decreased all through the 1990s, and for the last decade crime rates have remained steady. Yet every year since 1990, between 52% and 89% of Americans have thought that crime is on the rise.

Filed under About Perception, About Research

Thinking You’re in Control Can Lead to an Impulsive Demise

For six months you’ve worked really hard to stick to a diet, and it’s paying off.  Not only have you lost weight, but now more than ever you’re better able to restrain your impulse to eat fattening foods. Your friends are telling you how impressed they are with your resolve, and truth be told you’re feeling pretty damn good about yourself as well.

Which is why, around month seven, you decide that your impulse control is sufficiently strengthened that avoiding being around ice cream, nachos, chicken wings, soda—and all the other things you used to eat out with your friends—is no longer necessary.  You’ve spent half a year changing the way you think about food and it worked. Maintenance won’t be difficult with a new mindset. Time to live again.

I probably don’t have to end this story for you to know how it turns out. It’s a classic tragedy with which many of us are all too familiar.  Pride comes before a fall, but even more often it’s our inflated sense of self-restraint that precedes a tumble into relapse.

A new study in the journal Psychological Science investigated why we repeatedly convince ourselves that we’ve overcome impulsiveness and can stop avoiding our worst temptations.  This particular tendency toward self-deception is called restraint bias, and the study’s four experiments tested the hypothesis that it’s rampant in our bias-prone species.

In one of the experiments, people walking in and out of a cafeteria were approached with seven snacks of varying fattiness and asked to rank the snacks from least to most favorite. Once they finished ranking, participants were told to pick one snack and were further told that they could eat it at any time they liked, but that if they returned the snack to the same location in one week, they’d receive $5 and could also keep the snack.  After choosing the snack, participants indicated whether they would return it for the money, and then filled out a questionnaire that assessed their hunger level and impulse-control beliefs.

Participants who were walking into the cafeteria said they were hungry, and those leaving said they were full; so the first question was whether those leaving with full stomachs would report stronger impulse-control beliefs – and they did.  The next question was whether the not-hungry participants claiming the most impulse control would choose the most tempting (and fattiest) snacks.  They did.  Finally, would those who selected the most tempting snacks be least likely to return them a week later?  Indeed, they were.

In another experiment, heavy smokers were asked to take a test to assess their level of impulse-control.  The test was bogus, designed only to label roughly half of the participants as having a high capacity for self-control, and half as having a low capacity.  Being told which label they earned seeded participants with a self-perception in either direction.

Participants were then asked to play a game that pitted the temptation to smoke against an opportunity to win money. The goal of the game was to watch a film called “Coffee and Cigarettes” without having a cigarette.  They could select among four levels of temptation, each with a corresponding dollar value: (1) keep a cigarette in another room: $5; (2) keep a cigarette on a nearby desk: $10; (3) hold an unlit cigarette in their hand throughout the film: $15; or (4) hold an unlit cigarette in their mouth throughout the film: $20.  Participants earned the money only if they avoided smoking the cigarette for the entire movie.

As predicted, smokers told they had high self-control exposed themselves to significantly more temptation than those told they had low self-control. On average, low self-control participants opted to watch the movie with a cigarette on the table; high self-controllers opted to watch with a cig in their hand. 

The result: the failure rate for those told they had high self-control was massively higher than for the low self-control group, to the tune of 33% vs. 11%.  Those who thought themselves most able to resist temptation lit up three times as often as those who suspected they’d fail.

One way to view these results is as reinforcement of a very old cliché: we’re our own worst enemies. Restraint bias has a place high on the list of biases we trip on routinely, and tripping on it once is no guarantee of not doing so again, and again…and maybe again.  Dieters relapse, smokers relapse, anyone with anything approaching a compulsion or addiction relapses—usually more than once. This study suggests that part of this repetition is due to thinking we can handle more than we can.

Another takeaway is that an entire industry is based on bolstering impulse control.  Self-help books and motivational speakers aplenty play on a dubious concept: that there’s a “gold ring” of restraint we can all reach—just follow system X to get there.  But what this study suggests is that even if you think you’ve arrived “there,” you’ll eventually find out that “there” never existed. You were sold a mirage in the form of an inflated self-perception of restraint.  No refunds.

The reality is that psychological bias, restraint bias included, is a lot like conflict. You can’t avoid it. You just manage it.

Nordgren, L., van Harreveld, F., & van der Pligt, J. (2009). The restraint bias: How the illusion of self-restraint promotes impulsive behavior. Psychological Science. DOI: 10.1111/j.1467-9280.2009.02468.x

Filed under About Perception, About Research

Just How ‘Blind’ Are You When Talking on a Cell Phone?

Every day in the news we see stories decrying the use of cell phones while driving.  Research reports aplenty have been released estimating the percentage of one’s attention siphoned by mobile jabber and how little is left to focus on the highway.

This is great and I’m glad the discussion is happening, but it might be useful to ask whether cell phone use in other (non-driving) settings has a similar effect on attention. What better way to make the point that cell phone use is dangerous while driving than to show its effect on someone doing something not nearly as focus-intensive — like walking, for instance?

That’s exactly what the authors of a new study published in the journal Applied Cognitive Psychology wanted to do. Researchers examined the effects of divided attention when people are either (1) walking while talking on a cell phone, (2) walking and listening to an MP3 player, (3) walking without any electronics, or (4) walking in a pair. 

The measure of how much attention is diverted during any of these activities is called “inattentional blindness” — not ‘seeing’ what’s right in front of you, or around you, due to a distracting influence.  If you’ve ever watched the YouTube video of the gorilla walking through the crowd of people passing around a ball, then you’ve seen an example of inattentional blindness (here’s a great paper on the effect downloadable as a PDF). 

For the first experiment of the study, trained observers were positioned at corners of a large, well-traveled square on a university campus.  Data were collected on 317 individuals, ages 18 and older, with a roughly equal breakdown between men and women.  The breakdown among the four conditions (with MP3 player, with cell phone, etc.) was also roughly equal.  Observers recorded several outcomes for each individual, including the time it took to cross the square; whether the individual stopped while crossing; the number of direction changes the individual made; how much they weaved, tripped, or stumbled; and whether they were involved in a collision or near-collision with another walker.

The results: for people talking on cell phones, every measure except two (crossing time and stopping) was significantly higher than in the other conditions.  Cell phone users changed direction more than six times as often as someone without a cell phone (29.8% vs. 4.7%), nearly three times as often as someone with an MP3 player (vs. 11%), and weaved around others significantly more than the other conditions (though, interestingly, the MP3 users weaved the least of all conditions).

People on phones also acknowledged others only 2.1%  of the time (vs 11.6% for someone not on a phone), and collided or nearly collided with others 4.3% of the time (vs 0% for walking alone or in a pair, and 1.9% when using an MP3 player).

The slowest people, who also stopped the most, were those walking in pairs.  In fact, of the other conditions, walking in pairs was the only one that came anywhere close to cell phone use across the range of measures.

The next experiment replicated the first, but only one measure was tracked: whether or not walkers saw a clown unicycling across the square.  And this was an obnoxiously costumed clown, complete with huge red shoes, a gigantic red nose, and a bright purple and yellow outfit.  Interviewers approached people who had just walked through the square and asked them two questions: (1) Did you just see anything unusual? and (2) Did you see the clown?

The results:  When asked if they saw anything unusual, 8.3% of cell phone users said yes, compared to between 32 and 57% of those walking without electronic devices, with an MP3 player, or in pairs.  When asked if they saw the clown, 25% of cell phone users said yes compared to 51%, 60% and 71.4% of the other conditions, respectively.  In effect, 75% of the cell phone users experienced inattentional blindness.  (The discrepancy between the 8.3% and the 25% might be because the clown didn’t register as something “unusual” — this is, after all, a university campus.)

So, coming back around to the original point — if using a cell phone impairs attention as drastically as this study shows for people just walking, could it by any stretch of the imagination be a good idea to use one while driving? 

One caveat to that concluding question should be mentioned: as noted in the results, people walking in pairs, most likely talking to each other, were next in line for inattentional blindness. This jibes with research (discussed in this TIME article) indicating that talking to someone in your car while driving is significantly distracting — perhaps not quite as much as chatting on a cell phone, but in the neighborhood.  Auditory cues, whether from a phone or from the person next to you, divert attention. The problem with cell phones, however, is that the user lacks the extra set of eyes a passenger has to offer, which could very well be the difference between being in an accident and getting home safely.

Hyman, I., Boss, S., Wise, B., McKenzie, K., & Caggiano, J. (2009). Did you see the unicycling clown? Inattentional blindness while walking and talking on a cell phone. Applied Cognitive Psychology. DOI: 10.1002/acp.1638

Filed under About Perception, About Research