Most of us have heard by now that overuse of digital media has degraded our focus. But perhaps the changes we are observing are behavioral, not cognitive.
In spring of 2014, Microsoft Canada released the results of a study of media consumers claiming that attention spans are shrinking. The report warned that the average attention span has dropped from 12 seconds to 8 since about the year 2000, a decline it attributed largely to the exponential increase in digital media use. The claim that humans now have a shorter attention span than a goldfish made the report perfect clickbait, and it worked--with most of the media seeming to ignore that the results were, from a Microsoft advertising point of view, rather conveniently favorable. Rather than prove that digital technology is undermining human cognition, however, this marketing study and the sloppy reporting that gave it legs only raise further questions about how our brains work, and especially how quickly they can adapt in response to changes in how we use them.
Whether the study itself holds up is debatable, but the headline-friendly "8-second attention span" line has worked its way into the conventional wisdom of popular neuroscience. Like many of the unhelpful generalizations about "the millennial generation," it has simply come to be accepted as true. A quick search of the phrase "attention span" in posts on LinkedIn, for instance, will show you what I mean. We seem to have accepted that smartphones and video games have destroyed human focus.
In the world of L&D, the "8 second" claim has been used to support a conceptual shift in instructional design widely known as "microlearning." In its most general sense, the term covers various instructional techniques for breaking information into smaller chunks, leaving behind day-long training seminars and wordy PowerPoint decks. We can probably all agree that this is a good thing.
I'm not sure our attention spans have anything to do with making the case for this shift, however. First, I suspect the "8 second" claim, and the insistence that increased technology use caused it, may be an example of mass confirmation bias (or a post hoc fallacy) surrounding the rapid rise of information technology and social media. Second, it's no secret in the education world that long lectures and training programs were never very effective. Finally--if you'll forgive the anecdote--in university classrooms I've seen little difference in attention span between the youngest students and their mid-career or retired classmates.
What I think has changed since the latest information revolution is learners' patience with instruction that fails to use either the best tools now available or our current knowledge of human cognition. People are accustomed to finding information quickly, on their own, when they need it. When they are confronted with learning that isn't built with smart design and technology, they don't respond well; often, they simply stop paying attention.
In our next post, we'll keep thinking about the concept of "microlearning" and ask whether there is a good common definition of the term, and broad agreement about its effectiveness.
Morris Davis, PhD, is an Associate Professor at Drew University and Senior Learning, Performance, & Design Consultant to Ontuitive. Twitter: @morleydj.