Are Digital Devices Altering Our Brains?
Ten years ago technology writer Nicholas Carr published an article in the Atlantic entitled “Is Google Making Us Stupid?” He strongly suspected the answer was “yes.” Himself less and less able to focus, remember things or absorb more than a few pages of text, he accused the Internet of radically changing people’s brains. And that is just one of the grievances leveled against the Internet and the various devices we use to access it, including cell phones, tablets, game consoles and laptops. Often the complaints target video games that involve fighting or war, arguing that they cause players to become violent.
But digital devices also have fervent defenders—in particular the promoters of brain-training games, who claim that their offerings can help improve attention, memory and reflexes. Who, if anyone, is right?
The answer is less straightforward than you might think. Take Carr’s accusation. As evidence, he quoted findings of neuroscientists who showed that the brain is more plastic than previously understood. In other words, it has the ability to reprogram itself over time, which could account for the Internet’s effect on it. Yet in a 2010 opinion piece in the Los Angeles Times, psychologists Christopher Chabris, then at Union College, and Daniel J. Simons of the University of Illinois at Urbana-Champaign rebutted Carr’s view: “There is simply no experimental evidence to show that living with new technologies fundamentally changes brain organization in a way that affects one’s ability to focus,” they wrote. And the debate goes on.
The Case for Stupidity
Where does the idea that we are becoming “stupid” come from? It derives in part from the knowledge that digital devices capture our attention. A message from a friend, an anecdote shared on social networks or a sales promotion on an online site can act like a treat for the human brain. The desire for such “treats” can draw us to our screens repeatedly and away from other things we should be concentrating on.
People may feel overwhelmed by the constant input, but some believe they have become multitaskers: they imagine they can continually toggle back and forth between Twitter and work, even while driving, without losing an ounce of efficiency. But a body of research confirms that this impression is an illusion. When individuals try to do two or more things at once that require their attention, their performance suffers. Moreover, in 2013 Stéphane Amato, then at Aix-Marseille University in France, and his colleagues showed that surfing Web pages makes people susceptible to a form of cognitive bias known as the primacy effect: they weight the first few pieces of information they see more heavily than the rest.
Training does not improve the ability to multitask. In 2009 Eyal Ophir, then at Stanford University, and his colleagues discovered that multitasking on the Internet paradoxically makes users less effective at switching from one task to another. They are less able to allocate their attention and more vulnerable to distractions. Consequently, even members of the “digital native” generation are unlikely to develop the cognitive control needed to divide their time between several tasks or to instantly switch from one activity to another. In other words, digital multitasking does little more than produce a dangerous illusion of competence.
The good news is that you do not need to rewire your brain to preserve your attention span. You can help yourself by thinking about what distracts you most and by developing strategies to immunize yourself against those distractions. And you will need to exercise some self-control. Can’t resist Facebook notifications? Turn them off while you’re working. Tempted to play a little video game? Don’t leave your device where you can see it or within easy reach.
Evidence for Aggression
What about the charge that video games increase aggression? Multiple reports support this view. In a 2015 review of published studies, the American Psychological Association concluded that playing violent video games accentuates aggressive thoughts, feelings and behavior while diminishing empathy for victims. The conclusion comes both from laboratory research and from tracking populations of online gamers. In the case of the gamers, the more they played violent games, the more aggressive their behavior was.
The aggression research suffers from several limitations, however. For example, lab studies measure aggressiveness by offering participants the chance to inflict a punishment, such as a dose of very hot sauce to swallow—actions that are hardly representative of real life. Outside the lab, participants would probably give more consideration to the harmful nature of their actions. And studies of gamers struggle to establish causality: Do video games make people more violent, or do people with a fundamentally aggressive temperament tend to play video games?
Thus, more research is needed, and it will require a combination of different methods. Although the findings so far are preliminary, researchers tend to agree that some caution is in order, beginning with moderation and variety: an hour here and there spent playing fighting games is unlikely to turn you into a brainless psychopath, but it makes sense to avoid spending entire days at it.
Gaming for Better Brains?
On the benefit side of the equation, a number of studies claim that video games can improve reaction time, attention span and working memory. Action games, which are dynamic and engaging, may be particularly effective: immersed in a captivating environment, players learn to react quickly, focus on relevant information and remember it. In 2014, for example, Kara Blacker of Johns Hopkins University and her colleagues studied the impact of games in the Call of Duty series—in which players control soldiers—on visual working memory (short-term memory). The researchers found that 30 hours of playing improved this capacity.
The assessment consisted of asking participants a number of times whether a group of four to six colored squares was identical to another group, presented two minutes earlier. Once again, however, this situation is far from real life. Moreover, the extent to which players “transfer” their learning to everyday activities is debatable.
This issue of skill transfer is also a major challenge for the brain-training industry, which has been growing since the 2000s. These companies are generally very good at promoting themselves and assert that engaging in various exercises and computer games for a few minutes a day can improve memory, attention span and reaction time.
Posit Science, which offers the BrainHQ series of brain training and assessment, is one such company. Its tools include UFOV (for “useful field of view”). In one version of a UFOV-based game, a car and a road sign appear on a screen. Then another car appears. The player clicks on the original car and then on the spot where the road sign appeared. As the groups of objects scroll faster and faster, the exercise is supposed to improve reaction time.
The company’s Web site touts user testimonials and says its customers report that BrainHQ “has done everything from improving their bowling game, to enabling them to get a job, to reviving their creativity, to making them feel more confident about their future.” Findings from research, however, are less clear-cut. On one hand, Posit Science cites the Advanced Cognitive Training for Independent and Vital Elderly (ACTIVE) study to claim that UFOV training can improve overall reaction time in elderly players and reduce the risk that they will cause car crashes by almost 50 percent. But in a 2016 analysis of research on brain-training programs, Simons and his colleagues were far less laudatory. The paper, which includes an in-depth analysis of the ACTIVE study, says that the overall risk of having an accident—the most relevant criterion—decreased very little. Several reviews of the scientific literature come to much the same conclusion: brain-training products enhance performance on the tasks that are trained directly, but transfer to other tasks is often weak.
Dr. Kawashima’s Brain Training game, released by Nintendo in the mid-2000s, provides another example of conflicting results. In addition to attention and memory exercises, this game has players do calculations. Does the program improve overall arithmetic skills? No, according to a 2012 study of seniors by Siné McDougall and Becky House, both at Bournemouth University in England. A year earlier, though, Scottish psychologists David Miller and Derek Robertson found that the game did increase how fast children could calculate.
Overall, then, the results from studies are mixed. The benefits need to be evaluated better, and many questions need answering, such as how long an intervention should last and at what ages it might be effective. The answers may depend on the specific interventions being considered.
No Explosive Growth in Capacity
Any cognitive improvements from brain-training games probably will be marginal rather than an “explosion” of human mental capacities. Indeed, the measured benefits are much weaker and more ephemeral than those obtained through traditional techniques. For remembering things, for example, rather than training your recall with abstract tasks that have little bearing on reality, try testing your memory regularly and making the information as meaningful to your own life as possible: If you memorize a shopping list, ask yourself what recipe you are buying the ingredients for and for which day’s dinner. Unlike brain-training games, this kind of approach involves taking some initiative and makes you think about what you know.
Exercising our cognitive capacities is important for combating another modern hazard: the proliferation of fake news on social networks. In the same way that digital devices accentuate our tendency to become distracted, fake news exploits our natural inclination to believe what suits us. The solution to both challenges is education: more than ever, young people must be taught to develop their concentration, self-control and critical-thinking skills.