Multitasking hurts brain's ability to focus, scientists say
SAN FRANCISCO — When one of the most important e-mail messages of his life landed in his in-box a few years ago, Kord Campbell overlooked it.
Not just for a day or two, but 12 days. He finally saw it while sifting through old messages: A big company wanted to buy his Internet start-up.
The message had slipped by him amid an electronic flood: two computer screens alive with e-mail, instant messages, online chats, a Web browser and the computer code he was writing.
While he managed to salvage the $1.3 million deal after apologizing to his suitor, Campbell continues to struggle with the effects of the deluge of data. Even after he unplugs, he craves the stimulation he gets from his electronic gadgets. He forgets things like dinner plans, and he has trouble focusing on his family.
His wife, Brenda, complains, "It seems like he can no longer be fully in the moment."
This is your brain on computers.
Scientists say juggling e-mail, phone calls and other incoming information can change how people think and behave. They say our ability to focus is being undermined by bursts of information.
These play to a primitive impulse to respond to immediate opportunities and threats. The stimulation provokes excitement — a dopamine squirt — that researchers say can be addictive. In its absence, people feel bored.
While many people say multitasking makes them more productive, research shows otherwise. Heavy multitaskers actually have more trouble focusing and shutting out irrelevant information, scientists say, and they experience more stress.
And scientists are discovering that even after the multitasking ends, fractured thinking and lack of focus persist. In other words, this is also your brain off computers.
Technology use can benefit the brain in some ways, researchers say. Imaging studies show the brains of Internet users become more efficient at finding information. And players of some video games develop better visual acuity.
More broadly, cell phones and computers have transformed life. They let people escape their cubicles and work anywhere. They shrink distances and handle countless mundane tasks, freeing up time for more exciting pursuits.
For better or worse, the consumption of media, as varied as e-mail and TV, has exploded. In 2008, people consumed three times as much information each day as they did in 1960. And they are constantly shifting their attention. Computer users at work change windows or check e-mail or other programs nearly 37 times an hour, new research shows.
The nonstop interactivity is one of the most significant shifts ever in the human environment, said Adam Gazzaley, a neuroscientist at the University of California, San Francisco.
Campbell, 43, came of age with the personal computer, and he is a heavier user of technology than most. But researchers say the habits and struggles of Campbell and his family typify what many experience — and what many more will, if trends continue.
For him, the tensions feel increasingly acute, and the effects harder to shake.
The Campbells recently moved to California from Oklahoma to start a software venture. Campbell's life revolves around computers.
He goes to sleep with a laptop or iPhone on his chest, and when he wakes, he goes online. He and Brenda Campbell, 39, head to the tidy kitchen in their four-bedroom hillside rental in Orinda, an affluent suburb of San Francisco, where she makes breakfast and watches a TV news feed in the corner of the computer screen while he uses the rest of the monitor to check his e-mail.
Major spats have arisen because Campbell escapes into video games during tough emotional stretches. On family vacations, he has trouble putting down his devices. When he rides the subway to San Francisco, he knows he will be offline for 221 seconds as the train goes through a tunnel.
Their 16-year-old son, Connor, tall and polite like his father, recently received his first C's, which his family blames on distraction from his gadgets. Their 8-year-old daughter, Lily, like her mother, playfully tells her father that he favors technology over family.
"I would love for him to totally unplug, to be totally engaged," says Brenda Campbell, who adds that he becomes "crotchety until he gets his fix." But she would not try to force a change.
"He loves it. Technology is part of the fabric of who he is," she says. "If I hated technology, I'd be hating him, and a part of who my son is, too."
As computers have changed, so has the understanding of the human brain. Until 15 years ago, scientists thought the brain stopped developing after childhood. Now they understand that its neural networks continue to develop, influenced by things like learning skills.
So not long after Eyal Ophir arrived at Stanford in 2004, he wondered whether heavy multitasking might be leading to changes in a characteristic of the brain long thought immutable: that humans can process only a single stream of information at a time.
Going back a half-century, tests had shown that the brain could barely process two streams, and could not simultaneously make decisions about them. But Ophir, a student-turned-researcher, thought multitaskers might be rewiring themselves to handle the load.
Ophir, like others around the country studying how technology bent the brain, was startled by what he discovered.
The test subjects were divided into two groups: those classified as heavy multitaskers based on their answers to questions about how they used technology, and those who were not.
In a test created by Ophir and his colleagues, subjects at a computer were briefly shown an image of red rectangles. Then they saw a similar image and were asked whether any of the rectangles had moved. It was a simple task until the addition of a twist: Blue rectangles were added, and the subjects were told to ignore them.
The multitaskers then did a significantly worse job than the nonmultitaskers at recognizing whether red rectangles had changed position. In other words, they had trouble filtering out the blue ones — the irrelevant information.
So, too, the multitaskers took longer than nonmultitaskers to switch among tasks, like differentiating vowels from consonants and then odd from even numbers. The multitaskers were shown to be less efficient at juggling problems.
Other tests at Stanford, an important center for research in this fast-growing field, showed multitaskers tended to search for new information rather than accept a reward for putting older, more valuable information to work.
Researchers say these findings point to an interesting dynamic: Multitaskers seem more sensitive than nonmultitaskers to incoming information.
The results also illustrate an age-old conflict in the brain, one that technology may be intensifying. A portion of the brain acts as a control tower, helping a person focus and set priorities. More primitive parts of the brain, like those that process sight and sound, demand that it pay attention to new information, bombarding the control tower when they are stimulated.
Researchers say there is an evolutionary rationale for the pressure this barrage puts on the brain. The lower-brain functions alert humans to danger, like a nearby lion, overriding goals like building a hut. In the modern world, the chime of incoming e-mail can override the goal of writing a business plan or playing catch with the children.
Melina Uncapher, a neurobiologist on the Stanford team, said she and other researchers were unsure whether the muddied multitaskers were simply prone to distraction and would have had trouble focusing in any era. But she added that the idea that information overload causes distraction was supported by more and more research.
A study at the University of California, Irvine, found that people interrupted by e-mail reported significantly increased stress compared with those left to focus. Stress hormones have been shown to reduce short-term memory, said Dr. Gary Small, a psychiatrist at the University of California, Los Angeles.
Preliminary research shows some people can more easily juggle multiple information streams. These "supertaskers" represent less than 3 percent of the population, according to scientists at the University of Utah.
Other research shows computer use has neurological advantages. In imaging studies, Small observed that Internet users showed greater brain activity than nonusers, suggesting they were growing their neural circuitry. At the University of Rochester, researchers found that players of some fast-paced video games could track the movement of a third more objects on a screen than nonplayers. They say the games can improve reaction and the ability to pick out details amid clutter.
There is a vibrant debate among scientists over whether technology's influence on behavior and the brain is good or bad, and how significant it is.
"The bottom line is, the brain is wired to adapt," said Steven Yantis, a professor of brain sciences at Johns Hopkins University. "There's no question that rewiring goes on all the time," he added. But he said it was too early to say whether the changes caused by technology were materially different from others in the past.
Ophir is loath to call the cognitive changes bad or good, though the impact on analysis and creativity worries him.
He is not just worried about other people. Shortly after he came to Stanford, a professor thanked him for being the one student in class paying full attention and not using a computer or phone. But he recently began using an iPhone and noticed a change; he felt its pull, even when playing with his daughter.
"The media is changing me," he said. "I hear this internal ping that says: Check e-mail and voice mail.
"I have to work to suppress it."