These days, there’s a cottage industry around heralding the decline of Western culture and the end of civilization as we know it. We regularly see news stories about stupid people doing stupid things, and each election season brings books about how this political party or that is causing the end of America.
The other popular version of this motif is the “technology is the end of the world” perspective, as seen in books like The Shallows and The Dumbest Generation. Often, these sorts of books bring in “objective scientific data” about brain function, but rarely discuss it in its full complexity. As a result, they end up telling only part of the story—the bad part.
Case in point: an article entitled “Pretty Pictures: Can Images Stop Data Overload?” by Fiona Graham of BBC News this past week. The story covered some recent research into how the brain deals with data overload. Researchers wanted to know whether using information visualization techniques (in this case, mind maps) would help people acquire and understand information.
The results showed that when tasks were presented visually rather than with traditional text-based software applications, individuals used around 20% fewer cognitive resources. In other words, their brains were working a lot less hard. (Link to goofy infographic original.)
“Aha,” says the critic of technology. “I told you people who read those infographics were stupid. They use less of their brains!” I’ll concede that infographics are often terrible. But here’s the problem with that conclusion. The study continues:
“As a result, [individuals] performed more efficiently, and could remember more of the information when asked later.” (Emphasis added.)
The study suggests that there is no simple correlation between brain resource use and learning. In this case, using fewer cognitive resources doesn’t mean dumber. Indeed, it may actually mean smarter. These findings held for people working both as individuals and in groups.
One must always, of course, keep findings like these in perspective. Was the sample size large enough? Has the study been replicated? Did the researchers have a particular agenda (or product) that they wanted to support through their data or interpretation? These questions are essential, whether you take the Luddite or the technological-fetishist perspective.
But the most important question to keep in mind is this: do the claims of the research really support my conclusions? We play a lot of numbers games these days, and neither the “decline of Western civilization” crowd nor the “technology will save us” crowd is an exception. Falling SAT scores, less time spent reading books, more time spent reading overall, falling graduation rates, broader social circles, violent video games, etc., etc., etc. I don’t want to dispute that the times, they are a’changin’. But studies like this show that numbers don’t always say what we think they say—or want them to say.
At this point in history, claims about the long-term cognitive, psychological, and social effects of technology are provisional at best. We need to observe and reflect, but remain humble about our predictions. Gutenberg would have been hard-pressed to predict accurately how his invention would change the world.