Rules For Thinking In A Digital World


Is technology making us stupid—or smarter than we’ve ever been? Author Nicholas Carr memorably made the case for the former in his 2010 book The Shallows: What The Internet Is Doing To Our Brains. This fall we’ll have a rejoinder of sorts from writer Clive Thompson, with his book Smarter Than You Think: How Technology Is Changing Our Minds For The Better.

My own take: technology can make us smarter or stupider, and we need to develop a set of principles to guide our everyday behavior, making sure that tech is improving and not impeding our mental processes. Today I want to propose one such principle, in response to the important question: What kind of information do we need to have stored in our heads, and what kind can we leave “in the cloud,” to be accessed as necessary?

The answer will determine what we teach our students, what we expect our employees to know, and how we manage our own mental resources. But before I get to that answer, I want to tell you about the octopus who lives in a tree.

In 2005, researchers at the University of Connecticut asked a group of seventh graders to read a website full of information about the Pacific Northwest Tree Octopus, or Octopus paxarbolis. The Web page described the creature’s mating rituals, preferred diet, and leafy habitat in precise detail. Applying an analytical model they’d learned, the students evaluated the trustworthiness of the site and the information it offered.

Their judgment? The tree octopus was legit. All but one of the pupils rated the website as “very credible.” The headline of the university’s press release read, “Researchers Find Kids Need Better Online Academic Skills,” and it quoted Don Leu, professor of education at UConn and co-director of its New Literacies Research Lab, lamenting that classroom instruction in online reading is “woefully lacking.”

There’s something wrong with this picture, and it’s not just that the arboreal octopus is, of course, a fiction, presented by Leu and his colleagues to probe their subjects’ Internet savvy. The other fable here is the notion that the main thing these kids need—what all our kids really need—is to learn online skills in school. It would seem clear that what Leu’s seventh graders really require is knowledge: some basic familiarity with the biology of sea-dwelling creatures that would have tipped them off that the website was a whopper (say, when it explained that the tree octopus’s natural predator is the sasquatch).

But that’s not how an increasingly powerful faction within education sees the matter. They are the champions of “new literacies”—or “21st century skills” or “digital literacy” or a number of other faddish-sounding concepts. In their view, skills trump knowledge, developing “literacies” is more important than learning mere content, and all facts are now Google-able and therefore unworthy of committing to memory.

There is a flaw in this popular account. Robert Pondiscio, executive director at the nonprofit organization CitizenshipFirst (and a former fifth-grade teacher), calls it the "tree octopus problem": even the most sophisticated digital literacy skills won't help students and workers navigate the world if they don't have a broad base of knowledge about how the world actually operates. "When we fill our classrooms with technology and emphasize these new 'literacies,' we feel like we're reinventing schools to be more relevant," says Pondiscio. "But if you focus on the delivery mechanism and not the content, you're doing kids a disservice."

Indeed, evidence from cognitive science challenges the notion that skills can exist independent of factual knowledge. Dan Willingham, a professor of psychology at the University of Virginia, is a leading expert on how students learn. “Data from the last thirty years leads to a conclusion that is not scientifically challengeable: thinking well requires knowing facts, and that’s true not only because you need something to think about,” Willingham has written. “The very processes that teachers care about most—critical thinking processes such as reasoning and problem solving—are intimately intertwined with factual knowledge that is stored in long-term memory (not just found in the environment).”

Just because you can Google the date of Black Tuesday doesn’t mean you understand why the Great Depression happened or how it compares to our recent economic slump. And sorting the wheat from the abundant online chaff requires more than simply evaluating the credibility of the source (the tree octopus material was supplied by the “Kelvinic University branch of the Wild Haggis Conservation Society,” which sounded impressive to the seventh graders in Don Leu’s experiment). It demands the knowledge of facts that can be used to independently verify or discredit the information on the screen.

There is no doubt that the students of today, and the workers of tomorrow, will need to innovate, collaborate and evaluate, to name three of the “21st century skills” so dear to digital literacy enthusiasts. But such skills can’t be separated from the knowledge that gives rise to them. To innovate, you have to know what came before. To collaborate, you have to contribute knowledge to the joint venture. And to evaluate, you have to compare new information against knowledge you’ve already mastered.

So here’s a principle for thinking in a digital world, in two parts: First, acquire a base of fact knowledge in any domain in which you want to perform well. This base supplies the essential foundation for building skills, and it can’t be outsourced to a search engine.

Second: Take advantage of computers' invariant memory, but also of the brain's elaborative memory. Computers are great when you want to store information that shouldn't change: say, the date and time of that appointment next week. A computer (unlike your brain, or mine) won't misremember the time of the appointment as 3 PM instead of 2 PM. But brains are the superior choice when you want information to change, in interesting and useful ways: to connect with other facts and ideas, to acquire successive layers of meaning, to steep for a while in your accumulated knowledge and experience and so produce a richer mental brew.

That’s one principle for thinking in a digital world; over the next few months I’ll be introducing others. Now, your turn: Have you discovered any rules for using your mind in a world full of technology? Please share!


5 Responses to “Rules For Thinking In A Digital World”

  1. Rule #1: Don’t get distracted! With the amazing breadth of information out there, let alone sites like Facebook and Spotify, it’s easy to get sidetracked by things that aren’t what my students or I need to be learning about today – sometimes completely worthwhile material, just not relevant right now. I’d recommend something like meditation – serious mind training – alongside regular contemplation of personal knowledge goals. What did you want to upload into your brain today? Then later: did you accomplish what you wanted to do?

  2. The task of accumulating and retaining knowledge isn’t always fun. Therein lies the problem. Acquiring literacy skills and the fluency to innovate, collaborate and evaluate, even in an elementary way, involves some tedious drill and practice, and then more drill and practice. Necessary but not glamorous. But how powerful it is to harness students’ inherent curiosity for the accumulation of knowledge.

  3. When I taught a quantitative literacy course, I was struck by how much of what we think of as critical thought is really just knowledge. If I tell you, for example, that playing football correlates with lower college attendance rates, you’ll immediately ask me whether the study controlled for gender.

    Did you magically process every last detail of that in your head, or do you have a shortlist of likely confounders (gender, age, social class, income, etc.) that you quickly run through when you see a statistic like that? It happens so fast that it’s hard to tell, but my guess is that you’re more likely to spot a common confounder (gender) than an uncommon one (region of the country). So that seemingly brilliant process turns out to be pretty content-dominated.

    A similar thing goes for what I call “touchstone values”. If I tell you that over 100 million people in the U.S. declared bankruptcy last year, you can tell me that’s false right away. Why? Because you know there are about 315 million people in the country, and not 1 in 3 of them went bankrupt (especially when a non-trivial portion of the 315 million are children).

    There is a skill there: always converting numbers to a ratio, natural frequency, or percent to check plausibility. But if you don’t have an idea of what the population is, you’re never going to get to that point. It’s knowing the population that calls attention to that claim over other claims I might make.
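    The touchstone-value check described above can be sketched as a quick back-of-the-envelope calculation. A minimal sketch in Python, using the comment's own numbers; the 5% threshold is an arbitrary illustration, not a rule from the comment:

```python
# Plausibility check for the commenter's example claim:
# "over 100 million people in the U.S. declared bankruptcy last year."
US_POPULATION = 315_000_000   # touchstone value cited in the comment

claimed_bankruptcies = 100_000_000

# The skill: convert the raw number into a ratio against a known touchstone.
ratio = claimed_bankruptcies / US_POPULATION
print(f"The claim implies roughly {ratio:.0%} of all U.S. residents went bankrupt.")

def plausible(count, population, threshold=0.05):
    """Flag claims implying an implausibly large fraction of the population.

    The 5% threshold is a hypothetical cutoff chosen for illustration.
    """
    return count / population <= threshold

print(plausible(claimed_bankruptcies, US_POPULATION))  # -> False
```

    The knowledge (the 315 million touchstone) is what makes the skill (forming the ratio) possible: without a population figure to divide by, there is nothing for the check to run against.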

    Another thing I taught my students was the “Brown M&M test”, which relies a lot on facts. Basically, we followed Van Halen’s advice: look at the little, knowable things in an article, and if the authors get some of the simple facts wrong, don’t trust the bigger claims (they are either careless or lying). It’s based on this Van Halen tactic: http://www.snopes.com/music/artists/vanhalen.asp But again, the ability to run this Brown M&M trick requires that you have some knowledge of *something* in the document you are evaluating. Once the alarm bell goes off, then it’s skill — but getting there requires facts.

  4. Scott McLeod says:

    I know few advocates of ’21st century skills’ who think that facts aren’t important at all, that everything is Googleable and thus we don’t need to know anything. That’s a straw man argument, I believe, and it doesn’t apply to thoughtful, technology-informed proponents of different ways of learning, teaching, and schooling.
    Also see this link regarding the idea that one has to learn facts FIRST before other thinking and skill-building work can occur:
    Do students need to learn lower-level factual and procedural knowledge before they can do higher-order thinking?

  5. Michael Monchilov says:

    You raise very good points, and I agree with your perception of what the brain can be used for, but aren’t you shortchanging the computer a bit by casting it as little more than a good appointment book?
