The Shallows

RATING: 8/10…READ: December 20, 2013

The Shallows examines our relationship to online reading and distraction. I know internet reading leaves me distracted, and I thought I would get bored with this book after a chapter, but Carr kept me turning the pages, learning about our relationship with the printed word and examining how our minds are changing as a result of constant, ADD-like distraction.

Get at Amazon

Notes:

What both enthusiast and skeptic miss is what McLuhan saw: that in the long run a medium’s content matters less than the medium itself in influencing how we think and act. As our window onto the world, and onto ourselves, a popular medium molds what we see and how we see it—and eventually, if we use it enough, it changes who we are, as individuals and as a society.

In the end, we come to pretend that the technology itself doesn’t matter. It’s how we use it that matters, we tell ourselves. The implication, comforting in its hubris, is that we’re in control. The technology is just a tool, inert until we pick it up and inert again once we set it aside.

[Media] supply the stuff of thought, but they also shape the process of thought. And what the Net seems to be doing is chipping away my capacity for concentration and contemplation. Whether I’m online or not, my mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.

The map and the clock belong to the fourth category, which might best be called, to borrow a term used in slightly different senses by the social anthropologist Jack Goody and the sociologist Daniel Bell, “intellectual technologies.” These include all the tools we use to extend or support our mental powers—to find and classify information, to formulate and articulate ideas, to share know-how and knowledge, to take measurements and perform calculations, to expand the capacity of our memory.

The typewriter is an intellectual technology. So are the abacus and the slide rule, the sextant and the globe, the book and the newspaper, the school and the library, the computer and the Internet. Although the use of any kind of tool can influence our thoughts and perspectives—the plow changed the outlook of the farmer, the microscope opened new worlds of mental exploration for the scientist—it is our intellectual technologies that have the greatest and most lasting power over what and how we think. They are our most intimate tools, the ones we use for self-expression, for shaping personal and public identity, and for cultivating relations with others.

Every intellectual technology, to put it another way, embodies an intellectual ethic, a set of assumptions about how the human mind works or should work. The map and the clock shared a similar ethic. Both placed a new stress on measurement and abstraction, on perceiving and defining forms and processes beyond those apparent to the senses.

Ultimately, it’s an invention’s intellectual ethic that has the most profound effect on us. The intellectual ethic is the message that a medium or other tool transmits into the minds and culture of its users.

“If the experience of modern society shows us anything,” observes the political scientist Langdon Winner, “it is that technologies are not merely aids to human activity, but also powerful forces acting to reshape that activity and its meaning.” Though we’re rarely conscious of the fact, many of the routines of our lives follow paths laid down by technologies that came into use long before we were born.

Because language is, for human beings, the primary vessel of conscious thought, particularly higher forms of thought, the technologies that restructure language tend to exert the strongest influence over our intellectual lives.

The history of language is also a history of the mind.

Language itself is not a technology. It’s native to our species. Our brains and bodies have evolved to speak and to hear words. A child learns to talk without instruction, as a fledgling bird learns to fly. Because reading and writing have become so central to our identity and culture, it’s easy to assume that they, too, are innate talents. But they’re not. Reading and writing are unnatural acts, made possible by the purposeful development of the alphabet and many other technologies. Our minds have to be taught how to translate the symbolic characters we see into the language we understand. Reading and writing require schooling and practice, the deliberate shaping of the brain.

Should the Egyptians learn to write, Thamus goes on, “it will implant forgetfulness in their souls: they will cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks.” The written word is “a recipe not for memory, but for reminder. And it is no true wisdom that you offer your disciples, but only its semblance.” Those who rely on reading for their knowledge will “seem to know much, while for the most part they know nothing.” They will be “filled, not with wisdom, but with the conceit of wisdom.”

Socrates, it’s clear, shares Thamus’s view. Only “a simple person,” he tells Phaedrus, would think that a written account “was at all better than knowledge and recollection of the same matters.” Far better than a word written in the “water” of ink is “an intelligent word graven in the soul of the learner” through spoken discourse. Socrates grants that there are practical benefits to capturing one’s thoughts in writing—“as memorials against the forgetfulness of old age”—but he argues that a dependence on the technology of the alphabet will alter a person’s mind, and not for the better. By substituting outer symbols for inner memories, writing threatens to make us shallower thinkers, he says, preventing us from achieving the intellectual depth that leads to wisdom and true happiness.

In a purely oral culture, thinking is governed by the capacity of human memory. Knowledge is what you recall, and what you recall is limited to what you can hold in your mind. Through the millennia of man’s preliterate history, language evolved to aid the storage of complex information in individual memory and to make it easy to exchange that information with others through speech.

The written word liberated knowledge from the bounds of individual memory and freed language from the rhythmical and formulaic structures required to support memorization and recitation.

Silent reading was largely unknown in the ancient world. The new codices, like the tablets and scrolls that preceded them, were almost always read aloud, whether the reader was in a group or alone.

According to an extensive 2009 study conducted by Ball State University’s Center for Media Design, most Americans, no matter what their age, spend at least eight and a half hours a day looking at a television, a computer monitor, or the screen of their mobile phone. Frequently, they use two or even all three of the devices simultaneously.

When we go online, we enter an environment that promotes cursory reading, hurried and distracted thinking, and superficial learning. It’s possible to think deeply while surfing the Net, just as it’s possible to think shallowly while reading a book, but that’s not the type of thinking the technology encourages and rewards.

The Net’s interactivity gives us powerful new tools for finding information, expressing ourselves, and conversing with others. It also turns us into lab rats constantly pressing levers to get tiny pellets of social or intellectual nourishment.

The researchers found that when people search the Net they exhibit a very different pattern of brain activity than they do when they read book-like text. Book readers have a lot of activity in regions associated with language, memory, and visual processing, but they don’t display much activity in the prefrontal regions associated with decision making and problem solving. Experienced Net users, by contrast, display extensive activity across all those brain regions when they scan and search Web pages.

According to Sweller, current evidence suggests that “we can process no more than about two to four elements at any given time with the actual number probably being at the lower [rather] than the higher end of this scale.”

Imagine filling a bathtub with a thimble; that’s the challenge involved in transferring information from working memory into long-term memory. By regulating the velocity and intensity of information flow, media exert a strong influence on this process. When we read a book, the information faucet provides a steady drip, which we can control by the pace of our reading. Through our single-minded concentration on the text, we can transfer all or most of the information, thimbleful by thimbleful, into long-term memory and forge the rich associations essential to the creation of schemas. With the Net, we face many information faucets, all going full blast. Our little thimble overflows as we rush from one faucet to the next. We’re able to transfer only a small portion of the information to long-term memory, and what we do transfer is a jumble of drops from different faucets, not a continuous, coherent stream from one source.

The information flowing into our working memory at any given moment is called our “cognitive load.” When the load exceeds our mind’s ability to store and process the information—when the water overflows the thimble—we’re unable to retain the information or to draw connections with the information already stored in our long-term memory. We can’t translate the new information into schemas. Our ability to learn suffers, and our understanding remains shallow.

She found that comprehension declined as the number of links increased. Readers were forced to devote more and more of their attention and brain power to evaluating the links and deciding whether to click on them. That left less attention and fewer cognitive resources to devote to understanding what they were reading.

Nielsen told his clients, “When you add verbiage to a page, you can assume that customers will read 18% of it.”

What we’re experiencing is, in a metaphorical sense, a reversal of the early trajectory of civilization: we are evolving from being cultivators of personal knowledge to being hunters and gatherers in the electronic data forest.

The Net is making us smarter, in other words, only if we define intelligence by the Net’s own standards. If we take a broader and more traditional view of intelligence—if we think about the depth of our thought rather than just its speed—we have to come to a different and considerably darker conclusion.

We’ve become ever more adept at working out the problems in the more abstract and visual sections of IQ tests while making little or no progress in expanding our personal knowledge, bolstering our basic academic skills, or improving our ability to communicate complicated ideas clearly. We’re trained, from infancy, to put things into categories, to solve puzzles, to think in terms of symbols in space. Our use of personal computers and the Internet may well be reinforcing some of those mental skills and the corresponding neural circuits by strengthening our visual acuity, particularly our ability to speedily evaluate objects and other stimuli as they appear in the abstract realm of a computer screen.

Taylor’s system of measurement and optimization is still very much with us; it remains one of the underpinnings of industrial manufacturing. And now, thanks to the growing power that computer engineers and software coders wield over our intellectual and social lives, Taylor’s ethic is beginning to govern the realm of the mind as well. The Internet is a machine designed for the efficient, automated collection, transmission, and manipulation of information, and its legions of programmers are intent on finding the “one best way”—the perfect algorithm—to carry out the mental movements of what we’ve come to describe as knowledge work.

“On the web,” says Mayer, “design has become much more of a science than an art. Because you can iterate so quickly, because you can measure so precisely, you can actually find small differences and mathematically learn which one is right.”

In one famous trial, the company tested forty-one different shades of blue on its toolbar to see which shade drew the most clicks from visitors. It carries out similarly rigorous experiments on the text it puts on its pages. “You have to try and make words less human and more a piece of the machinery,” explains Mayer.

Taylorism, he wrote, is founded on six assumptions: “that the primary, if not the only, goal of human labor and thought is efficiency; that technical calculation is in all respects superior to human judgment; that in fact human judgment cannot be trusted, because it is plagued by laxity, ambiguity, and unnecessary complexity; that subjectivity is an obstacle to clear thinking; that what cannot be measured either does not exist or is of no value; and that the affairs of citizens are best guided and conducted by experts.”

Google’s business strategy: nearly everything the company does is aimed at reducing the cost and expanding the scope of Internet use. Google wants information to be free because, as the cost of information falls, we all spend more time looking at computer screens and the company’s profits go up.

Google has let it be known that it won’t be satisfied until it stores “100% of user data.”

“When businesses like Google look at libraries, they do not merely see temples of learning,” wrote Robert Darnton, who, in addition to teaching at Harvard, oversees its library system. “They see potential assets or what they call ‘content,’ ready to be mined.”

To him, memorizing was far more than a means of storage. It was the first step in a process of synthesis, a process that led to a deeper and more personal understanding of one’s reading. He believed, as the classical historian Erika Rummel explains, that a person should “digest or internalize what he learns and reflect rather than slavishly reproduce the desirable qualities of the model author.” Far from being a mechanical, mindless process, Erasmus’s brand of memorization engaged the mind fully. It required, Rummel writes, “creativeness and judgment.”

Erasmus’s advice echoed that of the Roman Seneca, who also used a botanical metaphor to describe the essential role that memory plays in reading and in thinking. “We should imitate bees,” Seneca wrote, “and we should keep in separate compartments whatever we have collected from our diverse reading, for things conserved separately keep better. Then, diligently applying all the resources of our native talent, we should mingle all the various nectars we have tasted, and then turn them into a single sweet substance, in such a way that, even if it is apparent where it originated, it appears quite different from what it was in its original state.” Memory, for Seneca as for Erasmus, was as much a crucible as a container. It was more than the sum of things remembered. It was something newly made, the essence of a unique self.

The more times an experience is repeated, the longer the memory of the experience lasts. Repetition encourages consolidation. When researchers examined the physiological effects of repetition on individual neurons and synapses, they discovered something amazing. Not only did the concentration of neurotransmitters in synapses change, altering the strength of the existing connections between neurons, but the neurons grew entirely new synaptic terminals. The formation of long-term memories, in other words, involves not only biochemical changes but anatomical ones.

When our sleep suffers, studies show, so, too, does our memory.

“While an artificial brain absorbs information and immediately saves it in its memory, the human brain continues to process information long after it is received, and the quality of memories depends on how the information is processed.” Biological memory is alive. Computer memory is not.

Biological memory is in a perpetual state of renewal. The memory stored in a computer, by contrast, takes the form of distinct and static bits; you can move the bits from one storage drive to another as many times as you like, and they will always remain precisely as they were.

Evidence suggests, moreover, that as we build up our personal store of memories, our minds become sharper. The very act of remembering, explains clinical psychologist Sheila Crowell in The Neurobiology of Learning, appears to modify the brain in a way that can make it easier to learn ideas and skills in the future.

We don’t constrain our mental powers when we store new long-term memories. We strengthen them. With each expansion of our memory comes an enlargement of our intelligence. The Web provides a convenient and compelling supplement to personal memory, but when we start using the Web as a substitute for personal memory, bypassing the inner processes of consolidation, we risk emptying our minds of their riches.

The pocket calculator relieved the pressure on our working memory, letting us deploy that critical short-term store for more abstract reasoning. As the experience of math students has shown, the calculator made it easier for the brain to transfer ideas from working memory to long-term memory and encode them in the conceptual schemas that are so important to building knowledge.

“‘Learning how to think’ really means learning how to exercise some control over how and what you think,” said the novelist David Foster Wallace in a commencement address at Kenyon College in 2005. “It means being conscious and aware enough to choose what you pay attention to and to choose how you construct meaning from experience.” To give up that control is to be left with “the constant gnawing sense of having had and lost some infinite thing.”

The Web’s connections are not our connections—and no matter how many hours we spend searching and surfing, they will never become our connections.

William James, in concluding his 1892 lecture on memory, said, “The connecting is the thinking.” To which could be added, “The connecting is the self.”


People born into societies that celebrate individual achievement, like the United States, tend, for example, to be able to remember events from earlier in their lives than do people raised in societies that stress communal achievement, such as Korea.

Personal memory shapes and sustains the “collective memory” that underpins culture. What’s stored in the individual mind—events, facts, concepts, skills—is more than the “representation of distinctive personhood” that constitutes the self, writes the anthropologist Pascal Boyer. It’s also “the crux of cultural transmission.” Each of us carries and projects the history of the future. Culture is sustained in our synapses.

“I come from a tradition of Western culture,” wrote the playwright Richard Foreman, “in which the ideal (my ideal) was the complex, dense and ‘cathedral-like’ structure of the highly educated and articulate personality—a man or woman who carried inside themselves a personally constructed and unique version of the entire heritage of the West.” But now, he continued, “I see within us all (myself included) the replacement of complex inner density with a new kind of self—evolving under the pressure of information overload and the technology of the ‘instantly available.’” T. S. Eliot had a similar experience when he went from writing his poems and essays by hand to typing them. “Composing on the typewriter,” he wrote in a 1916 letter to Conrad Aiken, “I find that I am sloughing off all my long sentences which I used to dote upon. Short, staccato, like modern French prose. The typewriter makes for lucidity, but I am not sure that it encourages subtlety.”

When people came to rely on maps rather than their own bearings, they would have experienced a diminishment of the area of their hippocampus devoted to spatial representation.

As we “externalize” problem solving and other cognitive chores to our computers, we reduce our brain’s ability “to build stable knowledge structures”—schemas, in other words—that can later “be applied in new situations.”

We want friendly, helpful software. Why wouldn’t we? Yet as we cede to software more of the toil of thinking, we are likely diminishing our own brain power in subtle but meaningful ways. When a ditchdigger trades his shovel for a backhoe, his arm muscles weaken even as his efficiency increases. A similar trade-off may well take place as we automate the work of the mind.

As more journals moved online, scholars actually cited fewer articles than they had before. And as old issues of printed journals were digitized and uploaded to the Web, scholars cited more recent articles with increasing frequency. A broadening of available information led, as Evans described it, to a “narrowing of science and scholarship.”

A series of psychological studies over the past twenty years has revealed that after spending time in a quiet rural setting, close to nature, people exhibit greater attentiveness, stronger memory, and generally improved cognition.

“Simple and brief interactions with nature can produce marked increases in cognitive control.” Spending time in the natural world seems to be of “vital importance” to “effective cognitive functioning.”
