Monthly Archives: March 2009

Can We Really Multitask?

The latest post at PsyBlog discusses a classic study on multitasking, in which two participants were reportedly taught to read and write at the same time.  From the post:

Professor Elizabeth Spelke and colleagues at Cornell University wanted to know whether we can really divide our conscious attention between two demanding tasks, like reading and writing. To find out they recruited two participants willing to put in 29 hours of practice over a 6 week period: Diane and John were their volunteers (Spelke, Hirst & Neisser, 1976). Before the training Diane and John’s normal reading and comprehension rates were measured, so it could be compared with post-training. Then Spelke and colleagues set about their three-phase training regime.

There are a number of objections to this study, all discussed at PsyBlog (the most obvious being that two people is not a legitimate sample size). The one that’s most relevant to the current debate on this topic is this:

Diane and John were learning to switch their attention from one task to the other very quickly, not focus on both at the same time.

The multitasking vs task-switching debate is an important one because it touches on a fundamental aspect of brain functioning: whether attention can be simultaneously divided between two or more tasks–each performed with equal precision–or if attention must be switched between the tasks, like a railroad switch redirecting a train. 
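To make the distinction concrete, here's a toy sketch, entirely hypothetical and not drawn from the Spelke study: if attention has to be switched rather than divided, each switch carries a small time cost, so rapidly interleaving two tasks can only ever match doing them one after the other, and usually comes out slower. The work units, chunk size, and switch cost below are made-up numbers for illustration only.

```python
# Toy model (not from the study): interleaving two tasks with a per-switch cost
# can never beat doing the tasks back to back, because every attention shift
# adds overhead.

TASK_A_UNITS = 100   # hypothetical work units for task A
TASK_B_UNITS = 100   # hypothetical work units for task B
SWITCH_COST = 0.2    # hypothetical time penalty per attention shift
CHUNK = 5            # units completed before switching to the other task

def time_if_switching(a_units, b_units, chunk, switch_cost):
    """Total time when attention alternates between the two tasks in small chunks."""
    time, switches = 0.0, 0
    while a_units > 0 or b_units > 0:
        if a_units > 0:
            done = min(chunk, a_units)
            a_units -= done
            time += done
            switches += 1   # count an attention shift per chunk (slight overcount at the ends; fine for a toy)
        if b_units > 0:
            done = min(chunk, b_units)
            b_units -= done
            time += done
            switches += 1
    return time + switches * switch_cost

serial_time = TASK_A_UNITS + TASK_B_UNITS  # one task, then the other
switching_time = time_if_switching(TASK_A_UNITS, TASK_B_UNITS, CHUNK, SWITCH_COST)
print(f"One task after the other: {serial_time}")
print(f"Rapid switching:          {switching_time:.1f}")  # always >= serial_time
```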

The University of Michigan Brain, Cognition and Action Laboratory has done quite a lot of work on this topic, which you can review in depth here.  It's a great resource if you want to learn more about the essential aspects of the debate.

John Medina, author of Brain Rules, is an outspoken critic of multitasking. I’ll wrap this post with a snippet from his Brain Rules video series–another good resource for those wanting to pursue this topic further.

6 Comments

Filed under About Neuroscience

Noggin Raisers Vol.11

Excellent article by David Dobbs on the overdiagnosis of Post Traumatic Stress Disorder at Scientific American, and the author provides links to sources and supplemental materials here at his blog, Neuron Culture

There’s “psych” in semen, oh yes there is, and Jena Pincott describes its hypnotic ingredients on her blog

Dr. Feelgood has a wild side; Neuroskeptic gives us the “rest of the story” about Serotonin here

When I’m looking for new mind and consciousness books, I go to My Mind on Books; I always find titles there I didn’t know existed – great resource

The future of science journalism is a topic on the rise, and Carl Zimmer at The Loom does it justice in this post

I happened upon a new neuroscience site recently and I'm glad that I did: Very Evolved is one to watch

Dr. Shock tells us about the neuroscience of interpersonal space here – what a terrific, undervalued topic

The two cultures — will they ever get along? Rationally Speaking discusses a recent article that shows signs of a truce in the making (um…maybe)

Groupism? Teamism? What the F-ism is going on? Wander over to The Situationist to read about a recent study on these burgeoning biases

This is philosophy, and this is how to do it — so writes lecturer Wayne Buck in an intriguing letter to his class, published in Philosophy Now

A new edition of The New Atlantis is up with an interesting piece about why our minds are not like computers

And finally, Respectful Insolence takes The Huffington Post to the woodshed on its anti-vaccine lunacy, here

1 Comment

Filed under Noggin Raisers

What Does Expert Advice Really Do to Our Brains?

A new study in PLoS suggests that expert advice causes the brain to “offload” calculations of expected utility (loss or gain) when making a financial decision under risk.  This is an intriguing result, but we should take a closer look to see why this study really only examines one aspect of decision-making, and does not suggest, contrary to headlines, that expert advice causes the brain to “switch off rationality” or “shut down.”

Study participants were asked to make financial choices both inside and outside an fMRI scanner (choices were divided into two categories: “sure win” and “lottery”).  During the scanner session, researchers introduced a financial expert variable, and provided the expert’s credentials to enhance his influence.  The expert’s advice was presented to participants on a computer screen above their financial choice options. If the expert recommended an option, the word “Accept” was displayed above it; if he advised against the option the word “Reject” was displayed. During half the trials, the word “Unavailable” was displayed, indicating that the expert had no advice for that decision.
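To picture what a trial looks like, here's a minimal sketch. The amounts, probabilities, and decision rules are made up (the actual options and the way advice mapped onto them were more involved); the point is just the contrast the study sets up: computing the expected value of the risky option yourself versus deferring to the expert's "Accept" or "Reject" signal.

```python
# Hypothetical sketch of a trial: a sure payoff vs. a lottery, with optional
# expert advice ("Accept", "Reject", or None for "Unavailable"). The numbers
# and the simplified decision rules below are illustrative, not the study's.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Trial:
    sure_amount: float            # guaranteed payoff of the "sure win" option
    lottery_amount: float         # payoff of the lottery if it hits
    lottery_prob: float           # probability the lottery pays out
    advice: Optional[str] = None  # "Accept", "Reject", or None ("Unavailable")

def expected_value(trial: Trial) -> float:
    """Expected value of the lottery: the kind of valuation the brain is presumed
    to compute when no advice is available."""
    return trial.lottery_prob * trial.lottery_amount

def choose(trial: Trial) -> str:
    """Toy decision rule: defer to the expert when advice is shown, otherwise
    compare the lottery's expected value with the sure amount."""
    if trial.advice == "Accept":
        return "lottery"
    if trial.advice == "Reject":
        return "sure"
    return "lottery" if expected_value(trial) > trial.sure_amount else "sure"

print(choose(Trial(sure_amount=5, lottery_amount=20, lottery_prob=0.3)))                    # lottery (EV = 6)
print(choose(Trial(sure_amount=5, lottery_amount=20, lottery_prob=0.3, advice="Reject")))   # sure
```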

The results: both behavior and neural activation patterns were significantly affected by expert advice.  When given an “Accept” signal by the expert, participants tended to follow the advice.  On the neural side, activation patterns correlating with valuation were observed in the absence of expert advice, while no significant valuation-related activity was observed in its presence.

In other words, the brain appears to offload the burden of figuring out the best decision when given expert financial advice. At first glance, that’s what the study tells us. But, of course, there’s always a but.

Study participants were given a mean of 3.5 seconds to make each decision, which means they did not have time to deliberate to even a modest degree. Debriefing at the end of the study bears this out; quoting from the methods section: “participants indicated that they had not identified any way to engage in strategic behavior.”

Seldom is anyone faced with making a risky financial decision in a matter of seconds. More realistically, most of us take hours if not days to make an important risk decision – and certainly we’d consider a decision important if we were seeking expert financial advice to sort it out.  The point is that the study does not tell us anything substantial about real-world decision making.

What the study really tells us is that the brain defers to the expert when first given expert advice, much as we’d expect.  If then immediately challenged to make a decision, we’d also expect someone to go ahead with the expert’s advice. The study bears that out.  But we know this isn’t really how decision-making works. Rather, the expert’s advice gets folded into a lengthier process of figuring out the right way to go.  That process will probably include information from other sources: perhaps other experts, family members impacted by the decision, associates who have faced similar decisions, and so on.

With the backdrop of the financial crisis, this study supplies fodder for critics to lay blame on financial experts, especially those on TV, for the misguided decisions of investors. But that’s a painfully simplistic conclusion that at best applies to people who are not careful decision-makers to begin with. Anyone sitting in front of the TV with stock-happy trigger fingers caressing his or her laptop is playing a game, not making a careful decision.

As to the brain “shutting down” or “switching off,” nothing in this study indicates either. Results show an “attenuation” of neural activity correlating with valuation – that is, a tapering or reduction of activity. And again, in the context of how decisions are actually made, this attenuation would presumably reverse as soon as more factors are considered (as soon as, to use the words of the study, “strategic behavior” is engaged).  It would be interesting to study how much time it takes for valuation-linked neural activity to rebound after the expert’s advice has been digested with a stew of other variables.

To sum up – immediately upon receiving expert advice, the brain appears to offload value calculations and the words of the expert carry the day. If all risk decisions were made in an instant, this would be extremely important to know. But thankfully they are not, at least not by most of us.

ResearchBlogging.org
Engelmann, J. B., Capra, C. M., Noussair, C., & Berns, G. S. (2009). Expert financial advice neurobiologically “offloads” financial decision-making under risk. PLoS ONE.

8 Comments

Filed under About Neuroscience, About Research

Finding the Money Illusion in the Brain

One of the daggers that have pierced the heart of the long-held economic rationality assumption (that we are all rational actors on the economic stage) is the “money illusion” proposition.  Rather than only rationally considering the real value of money (the value of goods that it can buy), we actually consider a combination of the real value and the nominal value (the amount of currency) – and sometimes we ignore the real value altogether.

Using an example from the book Choices, Values and Frames by psychologist Daniel Kahneman, let’s say that you receive a 2% salary increase. Unfortunately, the rate of inflation when you receive this increase is 4%.  In real terms, you are actually in the hole by 2%, which, under the rationality assumption, we’d expect would elicit a negative reaction — the same as we’d expect if someone got a 2% pay cut.  But this isn’t how most people react. Rather, the reaction to the real loss of 2% is tempered by the reaction to the nominal gain of 2%.  In effect, the nominal evaluation interferes with the real evaluation, hence the money illusion.
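For the record, here's the arithmetic behind that example. The 2% figure is the usual back-of-envelope approximation (nominal change minus inflation); the exact real change is just under 2%.

```python
# Real vs. nominal change for a 2% raise during 4% inflation.
nominal_raise = 0.02
inflation = 0.04

approx_real = nominal_raise - inflation                  # back-of-envelope: about -2%
exact_real = (1 + nominal_raise) / (1 + inflation) - 1   # exactly: about -1.92%

print(f"Nominal change: {nominal_raise:+.1%}")   # +2.0%  (what the money illusion latches onto)
print(f"Real change:    {exact_real:+.2%}")      # -1.92% (what a fully rational actor would weigh)
```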

Now a new fMRI study in the Proceedings of the National Academy of Sciences has tested whether the brain’s reward circuitry exhibits the money illusion, and it turns out that it does.  From the study abstract:

Subjects received prizes in 2 different experimental conditions that were identical in real economic terms, but differed in nominal terms. Thus, in the absence of money illusion there should be no differences in activation in reward-related brain areas. In contrast, we found that areas of the ventromedial prefrontal cortex (vmPFC), which have been previously associated with the processing of anticipatory and experienced rewards, and the valuation of goods, exhibited money illusion. We also found that the amount of money illusion exhibited by the vmPFC was correlated with the amount of money illusion exhibited in the evaluation of economic transactions.

Kahneman often uses a perceptual illustration to show how the money illusion works.  In the image to the left, there are two ways to interpret what we see: as two dimensional figures or as three dimensional objects.  If asked to evaluate the relative size of the figures, it’s necessary to rely on a two-dimensional interpretation to arrive at the correct answer. But, the three-dimensional assessment of the objects’ size biases our perception because it is more accessible, making it difficult to see that the objects are all exactly the same size.

The same goes for how we perceive money: the real evaluation needed to arrive at the correct answer is biased by the more accessible nominal evaluation, which makes arriving at that answer difficult.  In a perfectly rational world, that wouldn’t be the case, but by now we know this ain’t a perfectly rational world — and as this study shows, we’re beginning to identify the brain dynamics underlying that fact.

Image via Very Evolved

ResearchBlogging.org
Weber, B., Rangel, A., Wibral, M., & Falk, A. (2009). The medial prefrontal cortex exhibits money illusion. Proceedings of the National Academy of Sciences. DOI: 10.1073/pnas.0901490106

10 Comments

Filed under About Neuroscience, About Perception, About Research

This is Your Brain on the Edge of Chaos

What do our brains have in common with piles of sand, earthquakes, forest fires and avalanches?  Each of those is a dynamic system in a self-organized critical state, and according to a new study in PLoS Computational Biology, so is the brain.

Systems in a critical state are on the cusp of a transition between ordered and random behavior.   Take a pile of sand for example: as grains of sand are added to the pile, they eventually form a slope. At a certain point, the sloping sand reaches a “critical state,” and at this point adding even a single grain can cause an avalanche that may be small or large. We can’t predict the moment or size of the avalanche, but we know that when the critical state is reached, there are several potential responses that may occur in the system (pile of sand). 

In effect, the system is globally stable while being locally unstable. Local instability (small avalanches in the sand pile) can create global instability (large avalanches leading to the collapse of the pile), bringing the system back to a new stable state. The pile of sand reorganizes itself.
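The sand pile in this analogy is usually modeled with the classic Bak-Tang-Wiesenfeld sandpile, and a toy version makes the "small or large avalanche" point vivid. This is the generic textbook model, not the brain model used in the paper; the grid size and number of grains below are arbitrary.

```python
# Minimal Bak-Tang-Wiesenfeld sandpile: drop grains one at a time; any cell
# holding 4+ grains topples, sending one grain to each neighbor, which can
# trigger further topplings. Avalanche sizes span many scales.
import random

SIZE = 20                              # grid is SIZE x SIZE; grains falling off the edge are lost
grid = [[0] * SIZE for _ in range(SIZE)]

def drop_grain():
    """Add one grain at a random site, relax the pile, and return the avalanche
    size (the number of topplings triggered by that single grain)."""
    r, c = random.randrange(SIZE), random.randrange(SIZE)
    grid[r][c] += 1
    unstable = [(r, c)] if grid[r][c] >= 4 else []
    topplings = 0
    while unstable:
        i, j = unstable.pop()
        if grid[i][j] < 4:             # already relaxed via a duplicate entry
            continue
        grid[i][j] -= 4                # topple: send one grain to each neighbor
        topplings += 1
        for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
            if 0 <= ni < SIZE and 0 <= nj < SIZE:
                grid[ni][nj] += 1
                if grid[ni][nj] >= 4:
                    unstable.append((ni, nj))
    return topplings

# Drive the pile toward its critical state, then watch the avalanches it produces.
sizes = [drop_grain() for _ in range(20000)]
late = sizes[10000:]                   # ignore the early build-up
print(f"Most grains cause no avalanche at all; the largest late avalanche had {max(late)} topplings.")
```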

While self-organized critical state models have been used to model brain dynamics before (in simulated neural networks), this study took the additional step of linking modeling with neuroimaging to measure dynamic changes in the synchronization of activity between different regions of the brain’s network.  After developing a profile of brain dynamics with neuroimaging, researchers compared the profile with synchronization of brain activity in critical-state computational models. They found that the computational model results exactly reflected the dynamic activity in the brain, which strongly suggests that the brain exists dynamically in a critical state.
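As a rough illustration of the kind of synchronization measure involved, here is a generic phase-locking value computed from the Hilbert transform of two signals. This is a standard way to quantify synchronization between regional time series, not a reproduction of the paper's pipeline (which works across multiple frequency bands and uses additional measures); the signals below are synthetic and the sampling rate is arbitrary.

```python
import numpy as np
from scipy.signal import hilbert

fs = 250                                   # hypothetical sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)               # 10 seconds of synthetic data

# Two "regions" sharing a 10 Hz rhythm at a fixed phase offset, plus noise.
# In practice the recorded signals would be band-pass filtered first.
region_a = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)
region_b = np.sin(2 * np.pi * 10 * t + 0.4) + 0.3 * np.random.randn(t.size)

def phase_locking_value(x, y):
    """|mean of exp(i * phase difference)|: 1 = perfectly locked phases, 0 = no consistent relation."""
    phase_x = np.angle(hilbert(x))
    phase_y = np.angle(hilbert(y))
    return float(np.abs(np.mean(np.exp(1j * (phase_x - phase_y)))))

print(f"Phase-locking value between the two regions: {phase_locking_value(region_a, region_b):.2f}")
```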

Which is to say, another door has been opened to understanding how the brain functions on the precipice of utter chaos.  Next up will be to study how the brain’s criticality is (or is not) linked to its adaptability, and to cognitive performance overall.  There isn’t much evidence yet pulling these threads together, but this study does establish the groundwork for much more research.

Another interesting question to consider: to what extent are critical state dynamics in the brain linked with psychiatric disorders?  Can better understanding how the brain teeters on the brink of randomness enable more effective treatments for certain disorders?  It’s difficult to even discuss this possibility without relying too heavily on metaphors (“neuronal avalanche” for example — and that’s a term actually used in the study), but until we have more evidential rudiments to work with, metaphor will have to fill the gaps.

ResearchBlogging.org
Kitzbichler, M. G., Smith, M. L., Christensen, S. R., & Bullmore, E. (2009). Broadband criticality of human brain network synchronization. PLoS Computational Biology.

4 Comments

Filed under About Neuroscience, About Research

An Appeal for Practical Wisdom

Aristotle said that practical wisdom is the combination of “moral will and moral skill.”  In this TED lecture, psychologist Barry Schwartz makes an engaging appeal for practical wisdom as an antidote to a society gone mad with bureaucracy (“a society at war with wisdom”). He argues powerfully that rules often fail us, incentives often backfire, and practical, everyday wisdom will help rebuild our world. Roughly 20 minutes long and well worth the time.

 

4 Comments

Filed under Videos

You Can Be Afraid To Lose, But Don’t Lose Perspective

Anyone who has ever stood to lose anything (all of us) knows that emotions play a big part in how we react to potential loss.  Sweaty palms and upper lips, fidgety fingers and bouncing knees, frantic, racing thoughts – all are signs of emotional tumult when facing the risk of loss – and all seem involuntary.  But a recent study indicates that we can influence the degree of emotional reaction, and our level of loss aversion. The solution, in short: think like a trader.

Seasoned traders are careful not to lose perspective when facing potential loss. They view loss as part of the game, but not the end of the game, and they rationally accept that taking a risk entails the possibility of losing.  Researchers wanted to investigate whether cognitive regulation strategies (like those embodied by traders) could be used to affect loss aversion and the physiological correlates of facing loss.

Subjects were given $30 and offered a choice to either gamble the money, and potentially lose it, or keep it.  They could theoretically win up to $572, or lose the $30 and be left with nothing.  The outcomes of their choices were revealed immediately after the choice was made (e.g. “you won”).  Subjects completed two full sets of choices (140 choices per set).  During the first set, subjects were told that the choice was isolated from any larger context (“as if it was the only one to consider”); during the second set, subjects were told that the choice was part of a greater context (“as if creating a portfolio”) — in other words, the introduction of “greater context” (taking a different perspective) functioned as a cognitive regulation strategy.

The researchers conducted this study twice: in the first, they observed behavior; in the second, they observed behavior and administered a skin conductance test (a measure of sympathetic nervous system activity) to measure level of emotional arousal.

The results: using the cognitive regulation strategy had the strong effect of decreasing loss aversion.  Most importantly, only individuals successful at decreasing their loss aversion by taking a different perspective had a corresponding reduction in physiological arousal response to potential loss.  So, cognitive regulation led to less loss aversion, which led to less sweat on the upper lip.
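For readers who want a feel for how loss aversion is quantified, here's a minimal sketch of the standard modeling idea: losses get weighted some multiple lambda of gains, and "thinking like a trader" amounts to pulling lambda down toward 1. The gamble and the lambda values below are made up; the study estimates a richer model from each subject's choices, and this shows only the core idea.

```python
# Toy loss-aversion model: losses are weighted lambda times as heavily as gains.
def subjective_value(amount, lam):
    """Gains count at face value; losses are weighted lam times as heavily."""
    return amount if amount >= 0 else lam * amount

def gamble_value(win, lose, p_win, lam):
    """Expected subjective value of a mixed gamble with one gain and one loss outcome."""
    return p_win * subjective_value(win, lam) + (1 - p_win) * subjective_value(lose, lam)

win, lose, p_win = 12.0, -10.0, 0.5   # hypothetical coin-flip gamble with positive expected value (+1)
for lam in (2.0, 1.5, 1.0):
    value = gamble_value(win, lose, p_win, lam)
    verdict = "accept" if value > 0 else "reject"
    print(f"lambda = {lam}: subjective value = {value:+.1f} -> {verdict}")

# A loss-averse chooser (lambda around 1.5-2) turns down a gamble worth taking on
# expected value alone; as lambda falls toward 1, the same gamble becomes acceptable.
```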

The question remains: is loss aversion a satisfactory response to anticipating discomfort and pain (emotional or physical), or is it more of a judgment error caused by a tendency to exaggerate the outcome of loss?  The results of this study support both positions.  On one hand, losses feel worse than gains feel good because the physiological response is linked to feedback about loss or gain (in other words, it’s easier to feel really bad about a potential loss than it is to feel really good about a potential gain if loss is still a possibility – we tend to dwell on the loss side because we know it hurts). 

On the other hand, the study also shows that fear of loss can be regulated, which means that it’s a changeable quantity.  Even though loss aversion serves a purpose, there’s a high likelihood we begin with too much of it for our own good.

So, it seems even if we are sensitive to the possibility of loss, we can make ourselves less so by changing our thinking. By taking a different, larger perspective, loss loses a few of its teeth and becomes a less scary beast.

(One concluding note: since this study addressed monetary loss, I’d keep the analysis to that category and to things with similar dynamics (e.g. asking someone out on a date, interviewing for a job, etc.), and not extend it to Loss (with a capital “L”) of life, or loss of loved ones; it seems to me that gets into a different area altogether and can’t be addressed as practically.)
ResearchBlogging.org
Sokol-Hessner, P., Hsu, M., Curley, N., Delgado, M., Camerer, C., & Phelps, E. (2009). Thinking like a trader selectively reduces individuals’ loss aversion. Proceedings of the National Academy of Sciences. DOI: 10.1073/pnas.0806761106

6 Comments

Filed under About Neuroscience, About Research