Has habitual Internet browsing altered kids’ brains? In this excerpt from his new book The Reading Mind: A Cognitive Approach to Understanding How the Mind Reads, Daniel T. Willingham argues that the brain is always changing, so the effects of web activities aren’t likely to be permanent. In fact, the biggest challenge for young readers may be staying focused on the printed page when having a smartphone means they never have to be bored.
Some observers—including prominent reading researcher Maryanne Wolf—have suggested that habitual Web reading, characterized by caroming from one topic to another and skimming when one alights, changes one’s ability to read deeply. Nicholas Carr popularized this sinister possibility with the question: “Is Google Making Us Stupid?” In that article (and in a follow-up book, The Shallows), Carr argued that something had happened to his brain. Years of quick pivots in his thinking prompted by Web surfing had left him unable to read a serious novel or long article.
We’ve all been there. You flick from a document to a website to check a fact. A few minutes later you’re three sites away, watching a video of a donkey eating a burrito (and perhaps, in a moment of clarity, asking “what am I doing with my life?”). The consequence of these inveterate rapid attention shifts, the argument goes, is an actual change in the wiring of the brain—one that renders us incapable of focusing attention. This sounds similar to the mental change many teachers feel they have observed in their students over the last decade or two: students can’t pay attention, and teachers feel they must do a song and dance to engage them.
I doubt kids’ brains have changed for the worse, and although a formal poll has not been taken, I suspect most cognitive psychologists are in my camp. First of all, sure, video games and surfing the Web change the brain. So does reading this book, buying gasoline, or seeing a stranger smile. The brain is adaptive, so it’s always changing.
Well, if it’s adaptive, couldn’t that mean that it would adapt to the need for constant shifts in attention, and maybe thereby lose the ability to sustain attention to one thing? I don’t think so because the basic architecture of the mind probably can’t be completely reshaped. Cognitive systems (vision, attention, memory, problem solving) are too interdependent. If one system changed in a fundamental way—such as attention losing the ability to stay focused on one object—that change would cascade through the entire cognitive system, affecting most or all aspects of thought. I suspect the brain is too conservative in its adaptability for that to happen, and if it had happened, I think the results would be much more obvious. The consequences wouldn’t be limited to our interest in reading longer texts; reading comprehension would drop, as would problem-solving ability, math achievement, and a host of higher cognitive functions that depend on attention and working memory.
More important, I don’t know of any good evidence that young people are worse at sustaining attention than their parents were at their age. They can sustain attention through a three-hour movie like Titanic, just as their parents did. They are capable of reading a novel they enjoy, like The Perks of Being a Wallflower. So I doubt that they can’t sustain attention. But being able to sustain attention is only half of the equation. You also have to deem something worthy of your attention, and that is where I think digital technologies may have their impact. They may change expectations.
I’m bored. Fix it.
Despite the diversity of activities afforded by digital technologies, I think many have two characteristics in common. Specifically, I think whatever experience the technology offers, you get it immediately—no waiting. Furthermore, producing this experience costs you very little—minimal effort. For example, if you’re watching a YouTube video and don’t like it, you can switch to another. In fact, the website makes it simple by displaying a list of suggestions. If you get tired of videos, you can check Snapchat. If that’s boring, look for something funny on theonion.com. Television has the same characteristics: cable offers a few score of channels, but if nothing appeals, get something from Netflix. When it comes to gaming, the carefully staircased pattern of challenge and reward is often pointed to as essential to a successful gaming experience. If the staircase is too steep, the game fails. Perhaps most important, those who own smartphones have sources of entertainment at all times. There is never a reason to be bored.
The consequence of long-term experience with digital technologies is not an inability to sustain attention. It’s impatience with boredom. It’s an expectation that I should always have something interesting to listen to, watch, or read, and that creating an interesting experience should require little effort. While a child’s choice to read or not should be seen in context of what else the child might do, the mind-boggling availability of experiences afforded by digital technologies means there is always something right at hand that one might do. Unless we’re really engrossed, we have the continuous, nagging suspicion: There’s a better way to spend my time than this. That’s why, when a friend sends me a video titled “Dog goes crazy over sprinkler—FuNNY!,” I find myself impatient if it’s not funny within the first 10 seconds. That’s why my nephew checks his phone at red lights, even when he’s not expecting any messages. That’s why teachers feel they must sing and dance to keep students’ attention. We’re not distractible. We just have a very low threshold for boredom.
If I’m right, there’s good news: the distractibility we’re all seeing is addressable. It’s not due to long-term changes in the brain that represent a fundamental (and unwanted) overhaul in how attention operates. It’s due to beliefs—beliefs about what is worthy of sustained attention, and about what brings rewarding experiences. Beliefs are difficult to change, true, but the prospect is less intimidating than repairing a permanently damaged brain.
Daniel T. Willingham is a professor at the University of Virginia.
Adapted with permission from The Reading Mind: A Cognitive Approach to Understanding How the Mind Reads, by Daniel T. Willingham, 2017, published by Jossey-Bass: A Wiley Brand. For more information, please visit http://www.wiley.com/WileyCDA/WileyTitle/productCd-1119301378.html
1. Rosenwald, M. S. (2014, April 6). Serious reading takes a hit from online scanning and skimming, researchers say. Washington Post. Retrieved from www.washingtonpost.com/local/serious-reading-takes-a-hit-from-online-scanning-and-skimming-researchers-say/2014/04/06/088028d2-b5d2-11e3-b899-20667de76985_story.html
2. Carr, N. (2008). Is Google making us stupid? Yearbook of the National Society for the Study of Education, 107(2), 89–94. http://doi.org/10.1111/j.1744-7984.2008.00172.x
3. Carr, N. (2011). The shallows: What the Internet is doing to our brains. New York: Norton.
4. Richtel, M. (2012, November 1). For better and for worse, technology use alters learning styles, teachers say. New York Times.
5. Steven Pinker and Roger Schank have both written in this vein. See: Pinker, S. (2010, January). Not at all. Retrieved from http://edge.org/q2010/q10_10.html#pinker. See also: Schank, R. (2010, January). The thinking process hasn’t changed in 50,000 years. Retrieved from www.edge.org/response-detail/11519
Last updated June 19, 2017