
January 23, 2012

Invisible Gorillas Are Everywhere

By William Pannapacker

By now most everyone has heard about an experiment that goes something like this: Students dressed in black or white bounce a ball back and forth, and observers are asked to keep track of the bounces by team members in white shirts. While that’s happening, another student dressed in a gorilla suit wanders into their midst, looks around, thumps his chest, then walks off, apparently unseen by most observers because they were so focused on the bouncing ball. Voilà: attention blindness.

The invisible-gorilla experiment is featured in Cathy Davidson’s new book, Now You See It: How the Brain Science of Attention Will Transform the Way We Live, Work, and Learn (Viking, 2011). Davidson is a founder of a nearly 7,000-member organization called Hastac, or the Humanities, Arts, Science, and Technology Advanced Collaboratory, which was started in 2002 to promote the use of digital technology in academe. It is closely affiliated with the digital humanities and reflects that movement’s emphasis on collaboration among academics, technologists, publishers, and librarians. Last month I attended Hastac’s fifth conference, held at the University of Michigan at Ann Arbor.

Davidson’s keynote lecture emphasized that many of our educational practices are not supported by what we know about human cognition. At one point, she asked members of the audience to answer a question: “What three things do students need to know in this century?” Without further prompting, everyone started writing down answers, as if taking a test. While we listed familiar concepts such as “information literacy” and “creativity,” no one questioned the process of working silently and alone. And noticing that invisible gorilla was the real point of the exercise.

Most of us are, presumably, the products of compulsory educational practices that were developed during the Industrial Revolution. And the way most of us teach is a relic of the steam age; it is designed to support a factory system by cultivating “attention, timeliness, standardization, hierarchy, specialization, and metrics,” Davidson said. One could say it was based on the best research of the time, but the studies of Frederick Winslow Taylor, among others, that undergird the current educational regime (according to Davidson) depend upon faked data supporting the preconceptions of the managerial class. Human beings don’t function like machines, and it takes a lot of discipline—what we call “classroom management”—to make them conform. Crucial perspectives are devalued and rejected, stifling innovation, collaboration, and diversity.

It wasn’t always that way. Educational practices that seem eternal, such as letter grades, started hardly more than a century ago; they paralleled a system imposed on the American Meat Packers Association in the era of The Jungle. (At first the meatpackers objected because, they argued, meat is too complex to be judged by letter grades.) The factory assembly line provided inspiration for the standardized bubble test, which was adopted as a means of sorting students for admission to college. Such practices helped to make education seem efficient, measurable, and meritocratic, but they tended to screen out collaborative approaches to problem-solving.

Drawing on her scholarly work in American literary history, Davidson argued that resistance to technology in education is not new. Every new technology takes time to become accepted by institutional cultures. Writing, for example, was once considered a degenerate, impoverished form of communication; it’s why we know about the teachings of Socrates only from the writings of Plato. When the print revolution produced cheap novels for a mass audience, popular works were regarded as bad for young people, especially women, who secreted books in their skirt “offices.” Following the long trajectory of the Protestant Reformation, you no longer needed someone to tell you what to think: You could read for yourself, draw your own conclusions, and possibly select your own society. Now the Internet offers a radical expansion of that process of liberation: It challenges institutional authority, it’s uncontrolled, and it has the potential to disrupt existing hierarchies, opening up new fields of vision, and enabling us to see things that we habitually overlook.

Browsing the 2012 conference program of the Modern Language Association, which includes nearly 60 sessions involving the digital humanities, Stanley Fish recently observed: “I remember, with no little nostalgia, the days when postmodernism in all its versions was the rage and every other session at the MLA convention announced that in theory’s wake everything would have to change.” Now the isms of prior decades—“multiculturalism, postmodernism, deconstruction, postcolonialism, neocolonialism, racism, racialism, feminism, queer theory”—seem to have retreated. But the ethos and disciplinary range of the digital humanities on display at Hastac suggest that this movement is not a replacement for the old order of “Theory” that reigned in the ’80s and ’90s so much as it is a practical fulfillment of that movement’s vision of a more inclusive, egalitarian, and decentralized educational culture.

Providing examples of how people have worked collaboratively, using the Internet, to develop effective responses to real-world problems, Davidson made a compelling argument for significant reforms in higher education (many more examples appear in her book). Too many of our vestigial practices, such as the tenure monograph and the large-room lecture, have become impediments to innovative scholarship. Students often learn in spite of our practices, absorbing more outside the structured classroom than in it. Google is not making the rising generations stupid, Davidson argued; on the contrary, they rely on it to teach themselves, and that experience is making students aware that invisible gorillas are everywhere—and that one of them is higher education as most of us know it.

I might add that, as the cost of traditional education rises beyond affordability for more and more students, they (and their employers) may increasingly decide that they don’t need us. We need to find more ways to expand and diversify higher education beyond traditional degrees earned in late adolescence. Without abandoning the value of preparing students for citizenship and a rewarding mental life, we need to develop more-flexible systems of transparent long-term and just-in-time credentialing, earned over the course of one’s life in response to changing needs and aspirations. Apparently to that end, Hastac is now supporting the exploration of digital “badges” signifying the mastery of specific skills, experiences, and knowledge.

Whatever the means, there is an emerging consensus that higher education has to change significantly, and Davidson makes a compelling case for the ways in which digital technology, allied with neuroscience, will play a leading role in that change.

Nevertheless, graduate students on Hastac panels—and especially in conversation—complain bitterly that their departments are not receptive to collaborative, digital projects. In most cases, their dissertation committees expect a written, 200-page proto-monograph; that’s nonnegotiable. Meanwhile, assistant professors complain that they can earn tenure only by producing one or perhaps two university press books that, in all likelihood, few people will read, when their energies might be more effectively directed toward online projects with, potentially, far greater impact.

During a Hastac session on publishing, one graduate student observed that digital humanists—for some time, at least—must expect to perform double labor: digital projects accompanied by traditional written publications about those projects. The MLA and the American Historical Association have established guidelines for evaluating digital projects, but most faculty members are not yet prepared to put those guidelines into effect. Doing so requires a radical change of perspective for scholars who have invested so much of their lives in written criticism as the gold standard. “The associate professors, especially,” one panelist noted, “judge the next generation by the standards they were expected to meet.” Senior professors seem more prepared to “let the kids do their thing.”

Holly Tucker, director of graduate studies in French and Italian at Vanderbilt University, noted that many graduate programs are failing to keep up. Graduate students can’t find local advisers in the digital humanities, and faculty members are diving into courses without adequate training. Meanwhile, increasing numbers of faculty job candidates are being asked about their engagement with the digital humanities and responding, “Well, I use Google and YouTube.” Support for the digital humanities is going to become important for attracting talented students who recognize that not having digital skills will make them unmarketable for an increasing percentage of academic or alt-academic positions. “Either get on the boat and help our graduate students,” observed Tucker, “or plan to suffer.”

Of course, that’s easier said at a wealthy private university than at a struggling public one, and it’s only a small part of a much larger vision of education being changed by technology. In his talk, Daniel Atkins, a professor of community informatics at Michigan, observed that more than 100 million people will become eligible for a college education in the next decade. To accommodate them, a new college would have to be built every week, starting now, even as public support is being withdrawn from much of higher education. The “grand challenge” of this era, he said, will be finding ways to cultivate that talent on a global scale. The knowledge infrastructure, according to Atkins, will no longer be “a publishing system so much as a networked public.” Education—aided by faster and cheaper networks—will become the shared life’s work of billions of people with access to the Internet, who will leverage the resources it provides.

In another Hastac keynote, James A. Leach, director of the National Endowment for the Humanities, observed that “the greatest force for civilizing mankind” may be “the new digital class led by the kind of scholarly inquiry symbolized by the digital humanities.”

Some of the claims made by digital humanists when speaking to the faithful may seem utopian to skeptical outsiders. “The digital humanities is the name of the new dispensation, and its prophets tell us that if we put our faith in it, we shall be saved,” observed Stanley Fish, in the manner of a prophet who has seen millennial predictions come and go. “But what exactly is it?” he asked, echoed by dozens of comments in The New York Times. “And how will its miracles be wrought?”

For some answers, I encourage those who might be inclined to dismiss the digital humanities on the basis of the titles of some conference papers to read Davidson’s book and view the keynote speeches from the Hastac convention. They can be found online. You don’t need to enroll at a university; you just need an Internet connection. And then you can join the conversation.

William Pannapacker is an associate professor of English at Hope College, in Holland, Mich. Many of his previous columns were published under a pseudonym, Thomas H. Benton. The views expressed here are his own and do not necessarily represent those of his employers.
