In his masterwork, Flow, psychologist Mihaly Csikszentmihalyi tells us that the two major components affecting our ability to control and direct our mental resources are time and attention.
On the first, time, most of our verdicts are the same: we don’t have enough of it. On the second, however, the analysis is murkier. While we can all agree that there are a multitude of demands on our attention, it’s not clear whether this is good, bad, or neutral. Some would say, for instance, that the attention-dividing practice of multitasking is an essential skill for success, while others claim that multitasking is a widespread cultural myth: something we aren’t capable of no matter how hard we try.
Maggie Jackson has staked out a position at the core of this controversy with her book, Distracted: The Erosion of Attention and the Coming Dark Age, in which she argues that our ability to focus attention faces colossal challenges that we must meet or risk falling into a cultural black hole. She recently spent some time with Neuronarrative discussing the science behind attention, whether we can train ourselves to be more focused, and what she believes we must do to avert an attention-deficit “dark age.”
Tell us what your book Distracted is about and what led you to write it?
Distracted focuses on the steep costs of cultivating a split-focused, hyper-mobile and cyber-centric culture, and details how new scientific discoveries related to the nature and workings of attention can be a starting point for sparking a renaissance of attention. I argue that unless we are able to better manage our technologies and strengthen our powers of attention, we will usher in a dark age – a time of high-tech invention but cultural and social losses.
In a sense, I backed into the subject of attention as the key to creating a high-tech yet reflective and deeply connected society. I began seeking clues to how we could better navigate our own digital world by studying how inventions such as the cinema, train, car, camera and telegraph vastly changed people’s experiences of time and space long ago.
From this early research, I had two revelations. First, the roots of our virtual, split-focus, mobile world were seeded in these first high-tech revolutions. We have to look far beyond the Blackberry and iPod to understand our current culture. Second, the vast changes to human experience that have unfolded in the past two or three centuries essentially deal with attention — how and when we focus, our powers of awareness, our ability to plan and judge. Attention is the key to both understanding and shaping our world for the better.
We all have a subjective notion of terms like “attention,” “focus” and “distraction.” Give us a sense of the scientific basis for discussing these terms more objectively.
Yes, we all know what it feels like to concentrate on a problem, or to walk into a garden and become deeply aware of the beauty of the flowers, their scent, the touch of a breeze on our skin, the call of birds around us. But now, scientists are beginning to understand how attention works, how it develops in children, and how deficiencies in its operations can affect us.
And if you ask a neuroscientist about attention, they immediately begin speaking in the plural. Attention is not a single entity. It’s now considered to be an organ system, like circulation or digestion, with its own anatomy, circuitry and cellular structure. There is debate and much more to learn about the workings of attention, but many, if not most, attention scientists consider that this human faculty is made up of three “networks” or types of attention: alerting, i.e., awareness, a sensitivity to our surroundings; orienting, i.e., focus, the “spotlight” of the mind; and executive attention, a package of higher-order skills related to planning and judgment. The networks often work in conjunction with one another, yet they are independent. At the moment, there is an explosion of research into the workings of attention, in part because of the brilliant pioneering work of Michael Posner.
And intriguingly, a growing body of research is showing that these attentional powers can be trained. The great philosopher William James thought that attention could not be highly trained by “drill or discipline,” but he was wrong. While we do not yet know how long-lasting the gains are, neuroscientists including Amishi Jha are discovering that computer-based exercises, meditation and even behavioral therapies can improve focus, awareness and executive attention. These findings could revolutionize parenting, education and workplace training.
We’re plainly awash in digital technologies, with new ones being unveiled all the time – each vying for scarce pieces of our attention. Is it possible that the human brain is adapting to manage this onslaught?
Yes, we are awash in digital technologies that prey on our attention – from ads on screens in public places to the beeping, pinging communications gadgetry that is crucial for today’s work. Let’s look at this as an environmental issue first.
In one sense, we are not adapting well to this new environment, nor is this new environment conducive to the kind of in-depth thought and innovation that we need badly in the “information age.” Attention is our ticket to the world, our key to staying in tune with our environment. We are born to react to the new, the different or dangerous in our surroundings. That’s why we’re interrupt-driven. But there is a tension within attention, for this crucial system also gives us the ability to plan for the long-term, pursue our goals, and understand intangibles like the passage of time. When we’re constantly jumping to answer every beep or ping, we’re off-balance, overly depending on certain attentional skills, but overlooking our human need to plan or to tackle big-picture, messy, complex problems. Today, this is one reason why so many people feel frustrated that they do little more than “put out fires” and try to keep up with email all day at work.
Second, we are superficially adapting to managing the daily onslaught, yet in reality we’re undercutting our abilities to think deeply, relate, and innovate. We seem to be so productive, speedily clicking through emails and ticking items off our never-ending to-do lists. By rampant multitasking and by fragmenting work into smaller and layered chunks, we can busily and efficiently seem to keep up with the tsunamis of communications data and information pummeling us. But consider that a third of workers say they’re often too busy and interrupted to process or reflect on the work they do, according to the Families and Work Institute.
As well, the average worker now switches tasks every three minutes throughout the day, and yet high levels of interruptions are related to stress, frustration, even lowered creativity, studies show. Most multitasking is actually task-switching, which is often linked to higher rates of errors and more shallow learning. Are we adapting to the demands of a so-called knowledge economy, or are we too often just more frenetically busy than ever before?
I worry that if we don’t change our path, we may collectively nurture new forms of ignorance, born not from a dearth of information as in the past, but from an inability or an unwillingness to do the difficult work of forging knowledge from the data flooding our world. In-depth analysis, reflective judgment and critical thinking begin with discomfort, a willingness to doubt, and discipline.
Finally, I do agree in part with the philosopher Andy Clark, who believes that technologies are truly a part of the power of the human mind. In other words, “thinking” doesn’t stop with what’s inside our skull. And I also believe that we can and will create technologies that become more sensitive to our need for focused, reflective thought and uninterrupted, unmediated human connection. (For more on this, see my article on such research in BusinessWeek.)
But I firmly believe that we can’t adapt to a complex, overloaded digital world if we become overly dependent on our machinery to manage the info-onslaught for us. And we certainly are mistaken if we believe that a steady diet of multitasking and split focus will give us the cognitive booster rockets needed to progress in this new age.
Many, including you, have pointed out that every generation throughout history has faced challenges to its status quo modes of thinking by new technology. What’s different about what’s happening now? Is there something inherent in digital technology that makes this challenge more disruptive?
Technologies from writing to steam engines certainly shake up the status quo. At the dawn of writing, Plato rightly predicted that this new technology would lead to the demise of memory as the great repository of oral culture. There is much public discomfort when new technologies begin to challenge old habits.
But just because such discomforts naturally and periodically arise, does this mean that we should quickly dismiss them, or denigrate thoughtful questioning of a technology’s purpose and impact? As the historian Carolyn Marvin points out in When Old Technologies Were New, technologists throughout history tend to dominate early public discussions of their inventions. They, of course, are the first to understand the mechanics of these often-mysterious and powerful new tools. Yet they often try to control critical discussions of the social impact of technologies, in part by labeling critics as “Luddites” and “naysayers.” Today, I believe this is still true, and that we urgently need more measured, tolerant, thoughtful discussions of technology.
In weighing the impact of technology on our lives, it’s important to ask, what are the challenges that face us today? Were humans in the past any more able to focus, to think critically? That may not matter. What matters is whether or not we have the frameworks and architectures for going deeply into a text or problem. What matters is whether or not we are satisfied with increasingly faceless, virtual, quick and brief means of relating – even to our own kin. What matters is, what future do we want?
What in your opinion can we begin doing now to avert the “dark age” you suggest is coming?
To avert a dark age, we must take several steps:
Question the values that undermine attention – Helped by influential tools that are seedbeds of societal change, we’ve built a culture over generations that prizes frenetic movement, fragmented work and instant answers. Recently, my morning paper carried a front-page story about efforts “in a new age of impatience” to create a quick-boot computer. Explained one tech executive, “It’s ridiculous to ask people to wait a couple of minutes” to start up their computer. The first hand up in the classroom, the hyper-businessman who can’t sit still, much less listen – these are markers of success in American society. Instead of venerating scattershot focus, rushed detachment, knowledge built on sound bites, we need to value whole focus, full awareness and the difficult work of knowledge creation.
Rewrite the climate of distraction – We can set the stage for focus by judiciously protecting against interruptions; by dialing down the noisy, cluttered sensory environment that we’ve come to accept as a norm; and by disciplining ourselves to sharpen our powers of attention. To help, some companies are experimenting with “white space” – the creation of physical spaces or times on the calendar for uninterrupted, unwired thinking and connection. IBM’s global practice of “ThinkFridays” began three years ago when software engineers decided to limit email, conference calls and meetings one day a week in order to focus on their creative, patent work. Now, different teams and departments interpret “ThinkFridays” in varied ways. This pioneering initiative is fluid, flexible and workable – more so than the rigid, top-down policies that ban email one day a week.
Role model attention – If there’s just one action we can take to spark a “renaissance of attention,” it should be to give the gift of our attention to others. Parents and leaders, in particular, need to role model attention. As contemplative scholar Alan Wallace says, “When we give another person our attention, we don’t get it back. We’re giving our attention to what seems worthy of our life from moment to moment. Attention, the cultivation of attention, is absolutely core.”
What’s the next project on your radar screen?
I’m contemplating writing a book on the fascinating scientific and other work going on now worldwide to understand and strengthen our powers of attention. There are a whole host of exciting stories to be told involving a new field that I’m calling the “applied science of attention.”
Link to Maggie Jackson’s website