Is technology changing our brains? A new study by UCLA neuroscientist Gary Small adds to a growing body of research that says it is. And according to Small's new book, "iBrain: Surviving the Technological Alteration of the Modern Mind," a dramatic shift in how we gather information and communicate with one another has touched off an era of rapid evolution that may ultimately change the human brain as we know it. "Perhaps not since early man first discovered how to use a tool has the human brain been affected so quickly and so dramatically," he writes. "As the brain evolves and shifts its focus towards new technological skills, it drifts away from fundamental social skills."
The impact of technology on our circuitry should not come as a surprise. The brain's plasticity—its ability to change in response to different stimuli—is well known. Professional musicians have more gray matter in brain regions responsible for planning finger movements. And athletes' brains are bulkier in areas that control hand-eye coordination. That's because the more time you devote to a specific activity, the stronger the neural pathways responsible for executing that activity become. So it makes sense that people who process a constant stream of digital information would have more neurons dedicated to filtering that information. Still, that's not the same thing as evolution.
To see how the Internet might be rewiring us, Small and colleagues monitored the brains of 24 adults as they performed a simulated Web search, and again as they read a page of text. During the Web search, those who reported using the Internet regularly in their everyday lives showed twice as much signaling in brain regions responsible for decision-making and complex reasoning, compared with those who had limited Internet exposure. The findings, to be published in the American Journal of Geriatric Psychiatry, suggest that Internet use enhances the brain's capacity to be stimulated, and that Internet reading activates more brain regions than printed words. The research adds to previous studies that have shown that the tech-savvy among us possess greater working memory (meaning they can store and retrieve more bits of information in the short term), are more adept at perceptual learning (that is, adjusting their perception of the world in response to changing information), and have better motor skills.
Small says these differences are likely to be even more profound across generations, because younger people are exposed to more technology from an earlier age than older people. He refers to this as the brain gap. On one side, what he calls digital natives—those who have never known a world without e-mail and text messaging—use their superior cognitive abilities to make snap decisions and juggle multiple sources of sensory input. On the other side, digital immigrants—those who witnessed the advent of modern technology long after their brains had been hardwired—are better at reading facial expressions than they are at navigating cyberspace. "The typical immigrant's brain was trained in completely different ways of socializing and learning, taking things step-by-step and addressing one task at a time," he says. "Immigrants learn more methodically and tend to execute tasks more precisely."
But whether natural selection will favor one skill set over the other remains to be seen. For starters, there's no reason to believe the two behaviors are mutually exclusive. In fact, a 2005 Kaiser study found that young people who spent the most time engaged with high technology also spent the most time interacting face-to-face with friends and family. And as Small himself points out, digital natives and digital immigrants can direct their own neural circuitry—reaping the cognitive benefits of modern technology while preserving traditional social skills—simply by making time for both.
In the meantime, modern technology, and the skills it fosters, is evolving even faster than we are. There's no telling whether future iterations of computer games, online communities and the like will require more or less of the traditional social skills and learning strategies that we've spent so many eons cultivating. "Too many people write about this as if kids are in one country and adults are in another," says James Gee, a linguistics professor at the University of Wisconsin-Madison. What the future brain will look like is still anybody's guess.