Is Technology Making Us Smarter -- or Dumber?
It cuts both ways, so the real answer is: It depends how we use it
By Annie Murphy Paul
Originally Posted On August 9, 2013
This article originally appeared on Annie Murphy Paul’s Brilliant Blog.
Is technology making us stupid — or smarter than we’ve ever been? Author Nicholas Carr memorably made the case for the former in his 2010 book The Shallows: What the Internet Is Doing to Our Brains. This fall brings a rejoinder of sorts from Clive Thompson, with his book Smarter Than You Think: How Technology Is Changing Our Minds for the Better.
Personally, I believe that technology can make us smarter or stupider, which underscores the need to develop a set of principles to guide our everyday behavior and ensure that tech is improving, not impeding, our mental processes.
I’d like to propose one such principle, in response to the question, "What kind of information do we need to have stored in our heads? And what kind can we leave 'in the cloud,' to be accessed as necessary?"
But first I want to tell you about the octopus who lives in a tree.
In 2005 researchers at the University of Connecticut asked a group of seventh graders to read a website full of information about the Pacific Northwest Tree Octopus, or Octopus paxarbolis. The Web page described the creature’s leafy habitat, diet and mating rituals in precise detail. Then, applying an analytical model they’d learned, the students evaluated the trustworthiness of the site and the information it offered.
Their assessment? The tree octopus was legit. All but one of the pupils rated the website as “very credible.” The headline of the university’s press release that heralded the results of the study read, “Researchers Find Kids Need Better Online Academic Skills.” It quoted Don Leu, professor of education at UConn and co-director of its New Literacies Research Lab, lamenting that classroom instruction in online reading is “woefully lacking.”
There’s something wrong with this picture — and it’s not just that the arboreal octopus is, of course, a fiction concocted by Leu and his colleagues to probe their subjects’ Internet savvy. The other fable here is the notion that the main thing students need is to learn online skills in school.
Clearly, what Leu’s seventh graders really require — what everyone needs, really — is knowledge and the ability to think objectively and critically. (One would like to believe that students would have been tipped off by the “fact” that the tree octopus’s natural predator is the Sasquatch.)
Adults Need Critical Thinking, Too
The Internet can be our best friend — or a true enemy of deep, critical thinking. I’m not only talking about “Nigerian prince” scams and other phishing schemes that abound online, but also a general casualness about what we consider “facts” and how quick we are to assume that anything we read online is factual.
Wikipedia, for instance, has become the bible for so many, and yet it’s not fact-checked. Also distressing is the proliferation of alarmist emails that are routinely accepted as truth, despite the ease with which such claims can be debunked at reliable sites like snopes.com.
Beyond that, people are failing to note the ever-widening chasm between fact and opinion. There’s a general belief that if you read it online, it’s true. Part of our gullibility stems from the fact that we’re also lacking in basic knowledge.
Indeed, there is evidence from cognitive science that critical-thinking skills cannot exist independent of factual knowledge. Dan Willingham, a professor of psychology at the University of Virginia, is a leading expert on how people learn. “Data from the last 30 years leads to a conclusion that is not scientifically challengeable: thinking well requires knowing facts — and that’s true not only because you need something to think about,” he says.
“The very processes that teachers care about most — critical thinking processes such as reasoning and problem solving — are intimately intertwined with factual knowledge that is stored in long-term memory, not just found in the environment.”
Just because you can Google the date of Black Tuesday doesn’t mean you understand why the Great Depression happened or how it compares with our recent economic slump. And sorting the wheat from the abundant online chaff requires more than simply evaluating the credibility of the source. (The tree octopus material was supplied by the “Kelvinic University branch of the Wild Haggis Conservation Society,” which sounded impressive to the seventh graders in Don Leu’s experiment.) It demands the knowledge of facts that can be used to independently verify or discredit the information on the screen.
There is no doubt that we need to innovate, collaborate and evaluate, to name three of the “21st-century skills” so dear to digital literacy enthusiasts. But such skills can’t be separated from the knowledge that gives rise to them. To innovate, you need to know what came before. To collaborate, you must contribute knowledge to the joint venture. And to evaluate, you have to compare new information with knowledge you’ve already mastered.
How to Use Technology to Make You Smarter
So here’s my suggestion for how to use the digital world to make you smarter. First, acquire the basic facts in any and all domains in which you want to perform well. This is the essential foundation for building skills — it can’t be “outsourced” to a search engine.
Second: Take advantage of computers’ “invariant memory” but, in equal measure, of your brain’s “elaborative memory.”
Computers are great for storing and retrieving information that shouldn’t change — say, the date and time of that appointment next week. A computer (unlike your brain) won’t misremember the time of the appointment as 3 p.m. instead of 2 p.m. But brains are the superior choice when you want information to change in interesting and useful ways: to connect with other facts and ideas, to acquire successive layers of meaning, to steep for a while in your accumulated knowledge and experience and so produce a richer mental brew.