You’re Probably Not Reading This

02/03/2010

In between checking Twitter, Facebook, email, and trying to cook dinner, I noticed a new episode of Frontline premiered tonight.

When I finally finished cooking dinner1, I sat down to watch a revisit of a 2007 episode called “Growing Up Online.” The follow-up concerned “Our New Digital Nation.”

As I watched, I was reminded of something I learned long ago, and that I continue to be reminded of as I work on technologies related to the Internet: the way today’s teenagers experience technology, media, and the Internet… is actually pretty different from the way I experienced it.

If this is a topic that fascinates you, it’s certainly worth the ninety minutes; I found three elements particularly interesting:

  • Early in the episode, they explore a study on human multitasking conducted at Stanford.

    What is particularly interesting to me about the study is that they interview students at both MIT and Stanford—no slouches by any academic standard—and ask them how well they think they perform at multitasking.

    The question seems almost insulting: one student2 quips “I feel like the professors here do have to accept that we can multitask very well and that we do at all times, and so if they try and restrict us from doing it, it’s almost unfair, because we are completely capable.”

    Except… the punchline is that Frontline talks to a professor at Stanford who, armed with an MRI machine and the scientific method3, has shown that students—presumably at the top of their game, intellectually4—perform absolutely horribly at mundane tasks5 when they’re trying to multitask.

    Earlier in the story, an MIT professor describes this myth that the (current) human brain is good at multitasking as “Kool-Aid,” and based on these students’ perplexed and embarrassed reactions to their test results, I don’t think the analogy is far off. In fact, the Stanford doctor in charge of the study says, “Y’know, they understand the research; they’re smart kids. But they seem utterly convinced… it doesn’t apply to them.”

    Except… it totally does. The discrepancy between the science and what we think we can do is revisited throughout the episode, along with some interesting developments for today’s students.

    But the fact remains that science is showing that if you were born before (and this is me guessing) approximately 1995, and you think you can multitask… the harsh reality is: you’re completely and utterly fooling yourself.

    Quoth the Stanford doc: “Virtually all multitaskers think they’re brilliant at multitasking. And one of the big discoveries is: you know what? You’re really lousy at it. It turns out: multitaskers are terrible at every aspect of multitasking. … Recent work we’ve done suggests they’re worse at analytic reasoning.”

    The jury is still out on those in the tail end of the “millennial generation,” though.
  • Later in the episode, they cover a so-called “Army Experience Center,”6 which is effectively a gaming center in a low-income mall stocked with Xbox games where you can shoot at brown people, with Army recruiters walking around doing, and I quote, a “soft sell.”

    This is particularly distasteful, but what I found interesting was what a Major says: “Here in the Army Experience Center, it’s not the whole Army; you know, video games are never going to replicate the real thing. But it is a sampling experience, to pique your interest and maybe encourage you to go and learn more, just as [the] Apple [Store] is trying to do.”

    Just as the Apple Store is trying to do? Really?7
  • The episode also covers a study of 5- and 6-year-olds in which researchers hook them up to a virtual-reality simulator, ask them, “Do you remember swimming with dolphins when you were three?” and then proceed to simulate swimming with dolphins.

    A full 50% of them later said that they remembered swimming with dolphins in earlier years, except none of them ever had.

    This is interesting, academically… but the practical applications are as intriguing as they are horrifying.9

I think the coverage of this issue is most notable because you’re likely to have one of two reactions to it: you’ll either find the “scary” stuff… well… alien and scary, or you’ll completely relate and wonder why it’s a story at all. I suspect the difference comes down to which year in the ’90s you were born.

For my own part, it makes one thing plain: I’m not even 30, and yet I don’t experience the Internet, digital media, and “cyberspace” the same way someone entering college—someone the episode dubs a “digital native”—does.

And that… is notable.
_____________________
1 Homemade spaghetti, if you must know
2 A very arrogant-sounding “Lauren”
3 That “thing” most students were probably surfing Facebook while it was discussed in lecture
4 In terms of their lifetime
5 Identifying vowels and even numbers, in fact
6 Which cost, incidentally, $13 million
7 “There’s just one more thing: life-long, post-traumatic stress disorder”8
8 Although, Frontline does go into treatment of that, so maybe it’s ok…
9 Part of me is sad our brain doesn’t distinguish between reality and faux-reality, but… there you have it.