In his article “The Web Shatters Focus, Rewires Brains,” Nicholas Carr expresses his strong belief that Internet use is rewiring our brains, and not for the better. Carr makes it quite clear in his article that he is against the high prevalence of the Internet in our lives and that he feels this “addiction” to the Internet is harmful. He even goes as far as saying that “what we’re experiencing is, in a metaphorical sense, a reversal of the early trajectory of civilization: We are evolving from cultivators of personal knowledge into hunters and gatherers in the electronic data forest.” Carr does an excellent job terrifying the masses in this text, myself included when I first read his article. I am happy to say that while this threat could potentially exist in the future, there is no substantial proof that this is our current reality. But alas, I am getting ahead of myself.
For those who have not read Carr’s piece on Wired.com, allow me to illuminate you. Carr argues that frequent Internet use is creating “distinctive neural pathways” within our brains that increase our brain activity when reading on the web. This sounds like a good thing, right? Wrong! (According to Carr at least.) He suggests that this brain activity, caused by navigating through articles filled with hypertext and advertisements, “disrupts concentration” and “weakens comprehension.” Carr also comments that our “short-term storage is fragile: a break in our attention can sweep its contents from our mind.” No offense to Carr, whose own article is rife with hypertext links, but I would argue that the present-day human is more resilient than he would have us believe. I will concede that it is possible for such effects to occur in the future, but there is little evidence to support his claims today and more than enough to dispute them.
During my research, I discovered that while researchers agree there is some evidence the Internet is changing our brains, they do not see this as a cause for alarm. Instead, it seems that this heightened brain activity could be a sign that we are simply evolving with technology. Academicearth.org agrees with Carr that our memory has changed, suggesting, “Google is the friend with all of the expertise. If the sum of all knowledge is constantly available in our pockets, is it any wonder that we’ve stopped bothering to keep it in our heads?” We have evolved to no longer need to retain information that may not be critical to survival, such as who sang “Here Comes the Sun.” We now have an external memory drive in Google, which can have that information ready so you no longer have to wake up in the middle of the night because you finally remembered it was The Beatles.
Apparently, even schools have adapted to this approach, teaching students how to make their own connections rather than focusing on rote memorization. According to academicearth.org, “American schools have focused less on fact memorization and more on teaching students how to make innovative connections between the curriculum and real life. This way, it’s less about the knowledge you have, and more about how you use the information at hand.” After all, we praise early humans for evolving to use tools for survival. Why are we judging ourselves for doing the same?
Betsy Sparrow, Jenny Liu, and Daniel M. Wegner ask a similar question based on their Google/memory study results. Sparrow et al. note that while study participants were less likely to remember facts if they thought the information was readily accessible, our ability to use the Internet “gives us the advantage of access to a vast range of information” that we would not normally have. But Sparrow et al. do not believe this approach to information is anything new. Our dependency on the Internet is the same as our dependency “on all the knowledge we gain from our friends and co-workers and lose if they are out of touch.” As Lisa M. Krieger at Mercury News said, “we may not recall our aunt’s birthday, the name of a high school teacher or who gave us that nice bottle of wine –but someone we know does.” Krieger supports Sparrow’s argument against Carr, going as far as quoting Sparrow in “Google is changing your brain, study says, and don’t you forget it.” Krieger agrees with Sparrow that our reliance on the Internet “doesn’t prove that we’re incapable of thinking long and hard about anything…‘and it could be that once we stop worrying about memorizing dates and facts and names, we’re better able to concentrate.’” I am sure that anyone who promptly forgot everything they learned in middle school geometry upon entering high school would certainly agree.
Christian Jarrett, also at Wired.com, notes findings similar to those of Krieger and Sparrow et al. in “The Internet Probably Isn’t Ruining Your Teenager’s Brain.” Jarrett admits that “any cognitive neuroscientist will tell you that your brain adapts to whatever you do, so yes, if we engage in new activities such as more Internet use, the brain will change in response.” Jarrett, however, also makes it clear that there is little research to suggest our brains are deteriorating because of Internet use. In fact, Jarrett proposes quite the opposite, stating that “despite the fears spread by many commentators, there is actually a good deal of research suggesting positive psychological effects for teenagers from using the Internet.” His research implies that teens who use the Internet have boosted self-esteem and a stronger urge to participate in social clubs and athletic activities. Jarrett also points out that “it’s possible teens with certain brain profiles are drawn to greater Internet use and gaming, rather than those activities affecting their brains.” In other words, teens are attracted to gaming and are not necessarily addicted to the distractions Carr claims the Internet possesses. We could get into the popular belief that gaming is a distraction that is rotting teen brains, but let’s save that argument for another day.
By this point, I am sure you are questioning what you should believe. “Here’s the truth about how the Internet affects your brain,” by Fusion.net writer Kristen Brown, offers the reader a view from both sides. On one side of Brown’s text stands Susan Greenfield, a neurobiologist who believes the Internet is extremely harmful, to the point where she is certain Internet use is linked to autism. In the other corner stands Vaughan Bell, a psychologist at University College London who completely disagrees. After reading Brown’s article, I have a strong suspicion that Carr and Greenfield would be bosom companions.
Bell, on the other hand, would most likely disagree with Carr as he does with Greenfield. Bell’s main criticism, according to Brown at least, is that Greenfield has no scientific evidence to support her claims. Brown quotes him as saying, “The general finding is that those who use social networks to avoid social difficulties have reduced wellbeing, while use of social networks to deal with social challenges improves outcomes,” which goes against Greenfield’s claims “that social networking sites could negatively affect social interaction, interpersonal empathy, and personal identity.” The main issue between Greenfield and Bell comes down to a disagreement over whether or not the Internet is addictive. Carr would say we are addicted, that “we want to be interrupted, because each interruption—email, tweet, instant message, RSS headline—brings us a valuable piece of information.” My question, however, harkens back to our current discussion: Is this such a terrible thing?
Gregory Ulmer and Dennis Baron might ask the same question. Both Ulmer and Baron concern themselves with where technology has taken us in the past and what doors it will open for us in the future. Ulmer, in Introduction to Electracy, reminds us of a time when orality was the newest technology. Did the Greeks realize how quickly the alphabet and literacy would become the new and preferred technology? While most of us today do not consider the oral tradition, or even the written word, an advanced technology, at their creation they certainly were. Ulmer argues that just as we advanced from orality to literacy, we will advance into “electracy.” Electracy encompasses our new digitally focused world, where technology is our main means of transferring knowledge. Ulmer, however, does not feel that new technologies replace the ones of the past, but rather develop them. “Electracy,” he writes, “similarly is being invented, not to replace religion and science (orality and literacy), but to supplement them with a third dimension of thought, practice, and identity. “Electracy” is to digital media what literacy is to alphabetic writing: an apparatus, or social machine, partly technological, partly institutional.” In layman’s terms, there will always be a new technology that is “better” than the one before it, but instead of replacing the old, our society simply builds on what we already have. I think Ulmer would agree that our brains develop along with technology, but would also say that we retain and build on what we already know. We evolve for the better alongside technology. Technology, then, is just a tool to provide us with the means to evolve.
I believe Baron, who also voiced strong but encouraging viewpoints on technology and literacy, would agree with Sparrow, Krieger, Jarrett, Bell, and Ulmer. Baron writes, “when we write with cutting-edge tools, it is easy to forget that whether it consists of energized particles on a screen or ink embedded in paper or lines gouged into clay tablets, writing itself is always first and foremost a technology, a way of engineering materials in order to accomplish an end.” For Baron, this technology is something to utilize and not fear. He reminds us in “From Pencils to Pixels: The Stages of Literacy Technology” that “as the old technologies become automatic and invisible, we find ourselves more concerned with fighting or embracing what’s new. Ten years ago math teachers worried that if students were allowed to use calculators, they wouldn’t learn their arithmetic tables.” While many, including myself, are quite attached to the calculators on our phones, society has not lost the ability to add or multiply figures in one’s head (even if it does take us longer than Siri to discover the answer). I completely agree with Baron: “the computer is simply the latest step in a long line of writing technologies. In many ways its development parallels that of the pencil…though the computer seems more complex and is undoubtedly more expensive.”
While I cannot even attempt to say where the Internet will take us, or promise that we will never prove the Internet is robbing us of our memories or brains, I can say with some confidence that we all need to take a deep breath and embrace the positive. I do not fault Carr for being nervous. As Brown says, “the truth is the Internet is a young technology with a broader impact we can’t yet fully understand. With more research, though, one day we might.” But for now, when we do not have solid proof that we are taking steps back into the primordial ooze, let us rejoice in the fact that we do have proof that we are evolving as a species and that we have access to wonderful knowledge and technology we did not previously possess. I may not be able to remember who sang “Here Comes the Sun” (you can look at paragraph three, I won’t tell), but I know that if I need important, even life-saving, information it is only a few clicks away. Despite this belief, please do not think that I am naive enough to believe that the Internet is a panacea for our issues. I do not, however, believe we should banish something solely because we are afraid of change. So if evolving to use new tools makes us “hunters and gatherers in the electronic data forest,” I for one cannot wait to see what we become when we discover the next fire.