The Shallows: What the Internet Is Doing to Our Brains, by Nicholas Carr, Norton, 2010; PBS/Frontline interview: Professor Clifford Nass, Director of the CHIMe Lab (Communication between Humans and Interactive Media), Stanford University, 2009; “Zuckerberg: The World’s Richest Primatologist,” by Lionel Tiger, The Wall Street Journal, February 26, 2012; Hamlet’s BlackBerry: A Practical Philosophy for Building a Good Life in the Digital Age, by William Powers, Harper, 2010; Digital Nation, PBS, Netflix, 2010; “Decline But Not Fall,” by Russell Baker, The New York Review of Books, September 30, 2010; Super Sad True Love Story, by Gary Shteyngart, Random House, 2010; The Singularity Is Near: When Humans Transcend Biology, by Raymond Kurzweil, Viking, 2005; The Western Canon, by Harold Bloom, Harcourt Brace, 1994.
From the clicking of our thumbs, something bad this way comes?
Journalist Russell Baker sounds possessed of reliable common sense when he talks about the dire and growing troubles at The Washington Post and throughout print media:
“The great newspaper blight–that bizarre affliction that mysteriously turns the reading public away from the rich pleasures and delights of the daily press–is too often reduced to accounts of incomprehensible projects for adapting print to electronic keystrokes.”
Elsewhere in his lament, Baker judges:
“…(the) quest for the secret of how to create an authentic and profitable newspaper without using paper, ink, or a printing press …(has) prospects of success (that) seem slightly better than the likelihood of finding a pot of gold at a rainbow’s end.”
Harold Bloom, perhaps our most prodigious living reader of literature, writing way, way back in 1994, blames the decline of book-reading essentially on a madness among the stewards of great literature, whom he disgustedly groups in “The School of Resentment,” flourishing primarily in higher education during the period he terms “The Chaotic Age.” The betrayers have become ideological at the expense of the aesthetic, the latter having absolutely nothing to do, says Bloom, with correctness, sincerity, injustice, deprivation, fairness, academic attention, “career-enhancement” and especially a few publicity-mongering French cultural theorists. If anything, the literary aesthetic is usually “incorrect.” Regardless, says Bloom, that aesthetic has no redeeming social value. Reading Shakespeare won’t necessarily make you a better person, nor will the Bard’s lingering popularity make the world a better place.
Bloom is here more than guarded; he’s become radical as a savior of literature: we must do all we can, he says, to preserve it, even if it means isolating it.
It can’t and shouldn’t be isolated. But you can understand how the embattled Bloom got himself into this Maginot Line defense.
Herewith: Are these two very smart and accomplished minds saying anything very incisive about the human mind in the Digital Age?
You don’t want to be a curmudgeon, anti-technology and righteously nostalgic. To be so is to be naive, unrealistic, irresponsible and an enemy of the good.
But what about Clickety-Split? We should look unblinkingly at it.
What if the Great Digital is not at all neutral when it comes to our minds? Or, put more fundamentally, to our thinking? What if it’s not merely non-neutral but sometimes possibly lethal?
In short: What if the argument that guns themselves are neither good nor bad–that the trouble lies only with the comparatively few criminals who wield them–is not, in principle, an insightful way to think about the Great Digital?
Nicholas Carr’s The Shallows is an honest, openly confessional, hardworking, half-aware, prosaic mosaic about what Clickety-Split is doing to more and more of us, faster and faster, in the so-called wired, digital, global age. About the Digital Plague he suggests is suddenly upon us, distinctions of scale and breadth must be made; they are as yet incomplete but are taking enormous shape within a tidal wave of data collected on all fronts. It can now be said with confidence that the possibly diseased must number at least in the hundreds of millions. The loudest outcry comes from “multitasking” elites and corporate workers, gnawingly and ever more publicly worried in books and articles that they are losing their old minds; from invincibly cocky college students who, in their “super-multitasking,” reportedly hold an insular, defiant, wrong-headed confidence in their moment-to-moment distracted minds; and from the huge youthful (apparently mainly teen) crowds drowning in “social networking” and in computer games played to the point of occasionally fatal compulsion (notably but far from exclusively in South Korea).
Without giving much of an idea of it, Carr especially laments the late fade of “the literary mind.” He laments the already huge crowd of netroots wired to Google, YouTube, Facebook, Twitter, blogging and the rest: a mass disease of the mental shallows. U.S. teens are now said to send and receive just over 2,700 texts a month–90 a day–and they will soon make quaint those university students said to sit in lectures distractedly hypertexting on open laptops and texting on iPods with the dexterity of fair-to-middling pianists. (Are you there, Professor Bauerlein? Must we reread your The Dumbest Generation?) Literature professors Carr has interviewed say students no longer read literary classics or, for that matter, virtually any other texts; indeed, many people of all ages who once did so apparently no longer read any books.
Cubicles in the workplace should be renamed Clickables. Corporate staff have business meetings in which they send avatars to represent themselves in virtual conference rooms. Husbands and wives e-mail each other from adjoining home offices as though they were living backwards and becoming acquaintances. Kids live in cyberspace, wandering farther away all the time in Clickasia, a place of unconditional surrender to the adolescent anthropological super urge to “connect,” so that you squander the few years in which you’ve been structured to do much of your growing up (a certain stunned loneliness being essential to that learning), and where, as a member of a road gang of grasshoppers, you might have hooted at the Prodigal Son as you hopped past him on a summer day while he slouched back to his family.
Carr himself confesses that over the past several years, before his sinking sensation became The Shallows, he had been spending increasing and considerable time “foraging in the Web’s data thickets”–becoming, in short, an Internet junkie–“reading and writing e-mails, scanning headlines and blog posts, following Facebook updates, watching video streams, downloading music, or just tripping lightly from link to link”–and that he was no longer thinking the way he used to think.
He is unquestionably archetypal here and, though not a great mind diagnosing and treating this plague, he’ll do in sounding any needed alarm. His most disheartening symptom has been his growing inability to sustain the vital, single-minded reading of narrative. He thinks of himself as corrupted into craving immediate distraction because it had become immediate gratification. Otherwise, he thinks of the plague as the destroyer of calm, deep thinking: isn’t immediate gratification usually the enemy of learning and of creation of anything worthwhile except progeny? At the end of many a “busy” day of mental trivialities he came to understand that he had suffered a false sense of achievement.
He began to sense deeply that he was ruining his brain, and hence his mind, in some hellishly ironical, primal way.
In his confession he details early on being caught up in “the upgrade cycle.” From the start of the 1990s to nearly the present, he telescopes a familiar, breathless journey into the Digital Age: Apple’s earliest Macs featuring a RAM of one megabyte together with a hard drive of 20 megabytes and a small screen; Microsoft Word; early Excel; the pioneering HyperCard app; the leap outwards with the modem; the advent of online services such as CompuServe, Prodigy and AOL; upgrading to a Mac Performa 550 with CD-ROM drive and color screen, 500-megabyte hard drive and a “fast” 33-megahertz processor; on to a Zip drive; a CD burner; then becoming caught up in the new phenomenon called the Internet; then the new Netscape browser; speedier chips and modems; gigabyte hard drives; Yahoo, Amazon and eBay; MP3s; streaming video; broadband; Napster; Google; BlackBerrys; iPods; Wi-Fi; YouTube; Wikipedia; blogging; smartphones; thumb drives; netbooks; RSS feed readers; MySpace; Facebook; Digg; and Twitter.
Carr reiterates the common experience that long into this journey it came to be that:
“Hyperlinks and search engines delivered an endless supply of words to my screen, alongside pictures, sounds and videos. …the flood of free content turned into a tidal wave.… (And then) sometime in 2007, a serpent of doubt slithered into my info paradise.”
Carr, now indignant and sarcastically humorous about Clickety-Split, wants to find out as rigorously, deeply and specifically as he can what has happened to his mind and conduct. He talks to the still small but growing community of concerned neuroscientists about what he sees as the Great Digital Obsession, the Dismal Digital Hurricane, affecting our brains and minds, especially given the brain’s “plasticity.” Of course, we have far, far to go here in understanding, but he thinks that of late we have learned indisputably that brains reconfigure themselves far more radically than we’d thought as they adapt to stimuli, and that they do so from cradle to grave.
The Nurture Road is a five-lane superhighway between Nature’s curbs, a superhighway across whose lanes we can swerve wildly; and we are now doing so in historical sweep and compulsion.
To switch to Carr’s fundamental metaphor, many digiholics are now in this neural plasticity regressing to being once again infants loving to splash in shallow waters. They are training their brains for the quick fix. Surface skipping. Perish the thought of mere “snorkeling,” let alone deep sea diving, in streams of knowledge, especially unified, coherent narratives, even the most suspenseful ones whose art snares you into an enlightenment and even an epiphany about what really to expect from yourself, others, and the world.
Well, apparently it’s about neurotransmitters, at least so Carr concludes with significant (if perhaps not final) authority: serotonin, dopamine and the like reward, and always have rewarded, the brain’s adaptivity. We are sensuous creatures, at least three senses are engaged by your iPad, and the Eric Schmidts of the world want you to click as often as you can–to “data surf” and “surface skim” and never discover the Titanic ruins far below (a descent into the quiet that demands narrow, focused concentration to witness something of historical grimness, an invaluable reality and metaphor all at once). They are trying to turn you into Super Clickster. And herewith a great irony of our time: our alert ancestors clung to the mental shallows to survive in the uncivilized state of the wilds–they saw the flash of tawny fur behind a bush at the watering hole and, adrenalin pumping, scampered up a tree–living as hyper-alerts for whom deep thinking was like the cat’s fatal curiosity. Today we, who haven’t changed all that much in our psychophysiology, are in an ironic seduction becoming not only mentally shallow from immersion in the superficial, distracting, interrupting world of addictive clicking–of hyper hypertexting and hysterical (?) “social networking”–but are increasingly missing our earlier calmer, deeper minds.
We’re past Emerson’s ire at the apparitionlike steam engine that growled and howled and bellowed and grunted and interrupted his meditative peace on that summer day in the 1800s, but we know Something Isn’t Right. Since for Carr what’s wrong has to do with deep neural adaptivity–with billionfold electrochemically rewarded reconfiguration–the specific problem is…you don’t like to say it…addiction. Plainly, there’s the matter of withdrawal.
In a catchy summation of all this ponderousness: is “multitasking” a dumb, pandering word invented by business-savvy King Geeks and Merlin Geeks to promote and flatter addicts (“customers”)–a word that carries nothing of righteous, laudable, scientific, technical, business or philosophical probity? Carr would say, Yes! He would say the issue here is one of malady. The faux-technical “Attention Deficit Disorder” (ADD) stinks but comes much closer; “superficiality” and “distraction,” however, do plainly and perfectly well.
Is today’s classic anti-authoritarian advice, “Turn off, tune out, drop in”?
The most important drama of Carr’s difficult book (both in the writing and the reading) is the slow but gathering rebellion by the neuroscientific, literary, philosophical and traditionalist (not in the Fox News sense) communities, especially the scientific one, against the Digital Barons and their minions, motivated by the sad spectacle of more and more Hughs and Sues living most of their days in Clickdom. The Labsters are definitely on the counterattack, though their campaign is early.
Carr quotes a cultural critic quoting one of that critic’s friends who is recalling Dante as that friend visits the Googleplex (now there’s a pattern of the stuff-of-thought legends!): the gist is that amidst the slo-mo scenes of Golden Retrievers leaping through sprinklers on close-cropped, impeccable lawns and then the bright colors inside the Plex and then the toys everywhere, the friend has the sense that were the Devil to come to Earth, where better to be hiding than in the shadows in the corners of that shining hilltop Clicksle? Perhaps for Carr, Dante extended is: A new circle of the Damned–the Digitally Distracted exhaustingly driven to search after search, click after click, with a mocking stentorian voice-over shouting “Multitask!” to create an insufferable Din. Instead of Virgil as Dante’s Guide, it’s Marshall McLuhan whose enigmatic observation, “The medium is the message,” now makes overwhelming empirical sense, proven in a large demonstration in Vanity Fair by History itself.
Carr’s strenuous expedition through the less and less ruly cognitive-science community and some of its adjunct friends can be radically, and no doubt over-selectively, summarized herewith: Scott Karp, blogger extraordinaire, confesses he can no longer concentrate on one narrative long enough to read anything deeply; ditto Bruce Friedman, blogger, together with another blogger, Philip Davis; Katherine Hayles, Duke literature professor, confesses that her students can no longer read extended narrative; J.Z. Young, the eminent biologist, says the brain is in constant flux and that in our Clicking Times this f—s; Norman Doidge, a research psychiatrist, echoes Young, as does Michael Merzenich, cognitive scientist, who says all our neurocircuits are always changeable, including for the bad in a digital hysteria, citing Hebb’s Rule–“Brain cells that fire together wire together”–which recalls a waggish, anonymous neuroscientist’s gloss on the digital corporate world, “Survival of the busiest,” with Merzenich himself adding that multitaskers are “training their brains to pay attention to the crap”; cogno eminence Alvaro Pascual-Leone, who has shown that human thought patterns strongly affect brain anatomy; and last but foremost Stanford’s nasty professor Clifford Nass, who studies the Clickemmings and, as a welcome mean-ass, has said of the Heroic Multitasker: “(They are) suckers for irrelevancy…. Everything distracts them.”
These learned brutalities come from a wealth of late-breaking, already classic lab experiments by many of the above and their colleagues, on which Carr reports in slow detail and which seem to show that the Digital Multitaskateers have lousy recall and spotty concentration, wouldn’t draw an analogy at any deep level except by rare accident, probably believe an epiphany is a rock band, and, in glum sum, are hooked, rooked and cooked by the Clickmeisters (Google chose its precisely shaded blue lettering after comparisons of millions upon millions of clicks convinced it that one particular hue would rule every Hugh and Sue). Abandon Hope, All Ye Who Enter Into These Labs?
Of course, crucial modern assumptions about brain and mind are in play here. One might be called the Traditionalist view; it is Carr’s, as well as that of most of the others discussed here. Its grand assumption is that present, long-evolved human cognition is the only cognition that should be considered. Unless the traditionalist is arguing with those who make a second assumption–call it a coming Strong Artificial Intelligence, far too immense to discuss here–there is never much question of dealing with anything other than what human cognition is now, the given. The Traditionalist faces a problem, really a sort of trap, with modern information technology, the “Digital Age.” Until recently that problem has mainly been called “Information Overload.” But, as per above, the neuroscientific community, along with other scientific, technological, artistic and philosophical communities, is rapidly coming to understand it as something much more subtle and ironic. We can get a simplistic but useful gist from a present macro model of human cognition–one that will no doubt look hokey in a few decades but that helps get to the heart of the matter. The simplistic model of the human mind is this: there are long-term memory, short-term memory and an eminently crucial subdivision of short-term memory called the working memory. Here are salient quotations from Carr based on the work of Australian psychologist John Sweller:
“(The working memory) plays an instrumental role in the transfer of information into long-term memory and hence in the creation of our personal store of knowledge. …’We are conscious of what is in working memory and not conscious of anything else,’ says Sweller. …’We are only aware that something was stored in long-term memory when it is brought down into working memory,’ explains Sweller. …brain scientists have come to realize that long-term memory is actually the seat of understanding. It stores…’schemas.’ …’Our intellectual prowess is derived largely from the schemas we have acquired over long periods of time,’ says Sweller. …But the passage from working memory to long-term memory …forms the major bottleneck in our brain. …According to Sweller, current evidence suggests that ‘we can process no more than two to four elements at a given time with the actual number probably being at the lower [rather] than the higher end of this scale.’ …Imagine filling a bathtub with a thimble; that’s the challenge involved in transferring information from working memory into long-term memory. By regulating the velocity and intensity of information flow, media exert a strong influence on this process. When we read a book, the information faucet provides a steady drip…. …we can transfer all or most of the information, thimbleful by thimbleful, into long-term memory and forge the rich associations essential to the creation of schemas. With the Net, we face many information faucets, all going full blast. Our little thimble overflows as we rush from one faucet to the next. We’re able to transfer only a small portion of the information to long-term storage, and what we do transfer is a jumble of drops from different faucets, not a continuous, coherent stream from one source.”
What, really, must we make of the “bottleneck”? Well, is Nature a “perfect designer”? There’s a rhetorical question if ever there was one! As neuroscientists and anthropologists and others remind us, our minds weren’t made to think nearly so much as to act. So: Is civilization in defiance of Nature? Are the meager few crucial epiphanies we’ve had as a species miracles of rebellion against Nature as She constrains our brains?
One could certainly look at the Human Condition from this view. However, the traditionalist, I think, doesn’t often concern herself with this truism. The traditionalist critic of the Digital Trap (or, if you prefer, Seduction) fears the burgeoning multitasking cannot be stopped by any warnings, however scientific and/or wise and/or symptomatic, and cringes as, in the meantime, experiment after experiment gives the lie to “multitasking.”
Warning = casting pearls.
Is moderation in all things the answer? (But for insight here use your own good judgment; don’t read Hamlet’s BlackBerry, William Powers’s pathetic book on how to think positively about the Digital Age as adapted from the great philosophers.)
“The Digital Age” has brought wonders.
As usual: Is enlightened adaptivity the hope?
Carr is dour.
(Gary Shteyngart in his dystopian novel of an Orwellian nightmare of a Super-Digital Age in the near future, Super Sad True Love Story, is absurdly sour.)
Zuckerberg is Time Magazine’s “Person of the Year.”
As always, we have much to cheer and much to fear.
The fear? Might it be the dark Faustian bargain of technology?
The Dark Multiplier.
Seems like the old advice is good–get off the train often enough to see the countryside.