Posted & filed under In The News

Reblogged from Steve Wheeler’s “Learning with ‘e’s” website 

Tuesday, 17 January 2012

10Q: Cathy N. Davidson

Cathy N. Davidson first flashed on to my radar last year when the Times Higher Education news magazine invited me to review her new book Now You See It. When I read the summary, I remembered that Cathy had been the prime mover in one of the first large scale iPod education projects at Duke University several years before. Then it all clicked. I have to say I enjoyed her book immensely and learnt a lot from reading it. I got paid to do the review and it felt a little like theft, because I would gladly have paid to read the book for myself. You can read my review here, and I’m delighted to say that since it was published, I have been in touch with Cathy and have enjoyed some interesting conversations with her. She is truly one of the world’s visionaries when it comes to rethinking learning, and as one would expect, she is controversial too. Her books, and other projects she has engaged with over the years, have all received brickbats as well as bouquets. But that is the nature of innovation. It disrupts, and it discomforts. In today’s article, Cathy responds to my 10Q interview questions:

Who are you? I am a lifelong innovator and a lifelong college professor – and if you think those two things are contradictory, you may be right! I’ve spent my life doing unconventional work, both in the academy and in communities and the workplace, pushing for educational reform. I’m lucky that my home institution, Duke University, has rewarded me for being an iconoclast. For eight years (1998-2006) I was essentially the R and D person for the university, Duke’s first full-time Vice Provost for Interdisciplinary Studies (the first anywhere in the US in fact), charged with innovation across all eight schools of the university, and with the delicious mandate to “break things and make things.” We made national news with our iPod experiment where we gave students iPods the year they came out and challenged them to come up with educational uses. Steve Jobs was no fool and he got a lot of free R and D out of Duke, where we held the first ever academic “podcasting” (the quotation marks were on the poster) conference. More than that, we created a paradigm shift, since students, not faculty, led the innovation. That’s the byword of the kind of educator I strive to be, committed to learning (Kindergarten through the retirement home) that’s motivated, engaged, inspired and inspiring.

What and/or who inspires you? So many things inspire me but if I could name one it would be the Open Web. The World Wide Web is not a technology–it is a mode of interaction and communication that potentially empowers any of us to contribute to one another’s knowledge and well-being. It is the potential for participation, worldwide, with consequences that are real and palpable, that inspires me. It’s not just Wikipedia (although that’s inspiring enough) but the whole world of Do-It-Yourself possibilities that are really “Do-It-Together.” Of course there are downsides, too, but you asked what “inspires” me, and that’s it, the potential of collective change and participation. I like to remind people of historian Robert Darnton’s idea that there have really been only four “technologies” in human history that have changed the very terms of how we communicate as a species: writing (4000 BCE Mesopotamia), movable type (10th C China and 15th C Europe with Gutenberg), steam-powered presses/mass printing (late 18th C), and the Internet (commercially available since April 1993 with the release of the Mosaic 1.0 browser). We need to take that in. We are now seeing the first generation to come of age during humanity’s fourth great information age. That is inspiring.

Why did you choose to be an educator? Well, it’s the last thing anyone would have predicted. I was one of those students who excelled in certain things (mostly certain areas of math and writing), and was pretty abysmal in everything else. I was also in trouble. A lot. At his recent 85th birthday party, my father did a pretty funny stand-up act rehashing all the times I’d been kicked out of school, from kindergarten (really!) to middle school and then four times in high school. It was a set-up because he’d organized the family to create a little plaque for me, honoring my confirmation by the Senate after being nominated by President Obama to serve on the National Council on the Humanities. (“Our favorite family delinquent” or some such thing). Okay, so there is your answer: I’ve spent my entire life fighting the standard educational system and just turned that into a profession in the end, one that I’ve been very fortunate to flourish in. Interestingly, I’m often at gatherings of educational innovators where it turns out we were all in trouble more than once as students–and, as adults, still tend to think we were right.

What did you learn from your iPod experiment at Duke University? It was a bit devious, that was its real charm. We only gave the shiny new Duke-branded iPods “free” to first year students. The second, third, and fourth year students were furious at us. Of course we knew they would be. So we challenged them. We said if they came up with new learning uses for what was in 2003 a “music-listening device,” and if they could convince a prof to change a syllabus to include this new learning application in the course, then we would give a free iPod to the professor and every student in that class. Within one semester, we gave away more iPods to students who had come up with these learning applications than we had, without strings, to the first years. That’s success! We didn’t just come up with R and D for the late Mr. Jobs but we changed the terms of the conversation, acknowledging that the new generation had new technical skills but that their professors had much to offer in thinking not just about technology but about use, IP, safety, security, and on and on. We flipped the learning rules in an important way.

Sadly, we’ve heard from Apple that the experiment convinced them not to lead their marketing with the idea that a new release of a product was for formal education. They saw that we were creamed in the press and other media for this experiment when we announced it – and then there was silence when it was a big success. One Apple executive recently said to me, “You may notice we now talk about life, home, and inspiration – not education in our ads. That was a billion dollar insight, how unfairly Duke was treated, how afraid people, including the media, are of educational innovation.” Now, I’ve heard the next Apple release will be taking on formal education. If that is true, I’d like to think we were the front-runners of that change. Maybe society is finally ready, almost a decade later.

What does brain science contribute to our understanding of how we learn? The pundits are pushing the myth of mono-tasking, and the idea that somehow this era’s Internet and mobile technologies violate the otherwise straight, narrow, linear, focused ways we operate in the world. They argue that multitasking is making us inefficient, distracted, shallow, lonely, incapable of reading long or deep works, unable to memorize anything any more. That’s just bad neuroscience, not supported by research that isn’t front-loaded to support those conclusions. You look at the research design of 95% of the monotasking research and it is fraught, loaded, unhistorical, selective, and just plain prejudiced. Blaming technology for human distraction isn’t anything new. Socrates, after all, said similar things about the invention of writing (the Greek alphabet and the diacritical mark), and it was all downhill from there.

On the other hand, we have hundreds, even thousands of years, of Eastern traditions devoted to attention and mindfulness. In those traditions, meditation takes place in a quiet room, supposedly isolated from all distracting influence. Even there, instead of mindfulness, the world intrudes. That is what lifelong Buddhist practice is about, and neuroscience supports the Eastern idea that the mind is intrinsically (not extrinsically) susceptible to distraction. The work of Morcom and Fletcher at Cambridge and Raichle at Washington University suggests that 80% of the brain’s energy is spent talking to itself, that there is no “baseline” of attention from which the world distracts us.

On the positive side, we know that the brain learns by unlearning: when we are disrupted, when we make a mistake, we build on that. Habits are efficient, but they also get us into a lot of trouble since we can no longer see what is habitual. I believe in calculated, creative disruption as the single most important ingredient in learning. That’s how you write code, of course. You don’t memorize. You work on it until it works and, when it doesn’t, you figure out what does work.
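As a small, hypothetical illustration of that work-until-it-works loop (my sketch, not an example from the interview): a first attempt at a simple averaging function breaks on an edge case, and it is the error itself that teaches the fix.

```python
# First attempt, written from habit:
#   def mean(xs): return sum(xs) / len(xs)
# Running it on an empty list raises ZeroDivisionError.
# The mistake, not memorization, is what points to the fix below.

def mean(xs):
    """Average of a list of numbers, returning 0.0 for an empty list."""
    if not xs:          # the case the first attempt missed
        return 0.0
    return sum(xs) / len(xs)

print(mean([2, 4, 6]))  # 4.0
print(mean([]))         # 0.0, instead of crashing
```

The revised version exists only because the broken one was run, failed, and was read closely: learning by unlearning the habitual first draft.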

You had a lot of criticism and praise about your recent book Now You See It. What’s the fuss about? I would have been disappointed if no one was angry about the book because it would have meant I wasn’t pushing hard enough for change. (Yes, there’s a pattern here!) One point of the title “Now You See It” is about the phenomenon of attention blindness (in the jargon, “inattentional blindness”) that I discuss as neurobiology and as metaphor: the more you focus in one direction, the more you miss everywhere else. I advocate Learning 3.0, what I call “collaboration by difference,” where we learn best not from experts but from those who offer radically different points of view, including opposite forms of expertise (training, culture, age, experiences, all of the above).

The other point of Now You See It as a title is it strives for a major paradigm shift. If Frederick Winslow Taylor stamped the whole 20th century with “scientific labor management” that, I argue, was turned by the educational-industrial complex into “scientific learning management” that trained an Industrial Age way of learning and working, I wrote Now You See It for a similar transformation in the ways we measure, collaborate, learn, work, and, indeed, live. If you are aiming that high – for a reversal of a century’s worth of deciding what counts, what is valued, how you count, and who decides what counts – not everyone will love you. That said, I am so fortunate: I have given 41 invited lectures since September 7, all of them at institutions that are seeking to transform their methods for “humanity’s fourth great information age.” I can’t begin to accept all the invitations I receive. It is so gratifying and, yes, inspiring to know that many institutions are already moving in this direction. It’s not just me but a worldwide movement towards rethinking education for the 21st century. I see that in the HASTAC Scholars, the graduate and undergraduate students who are part of the nonprofit I co-founded, HASTAC, that is dedicated, as we say, to “learning the future together.”

You asked earlier what inspires me. This is what inspires me: all these millions of people around the world whose lives have changed since April 1993 and who now want to change the institutions of work and learning for this new way we all communicate and interact.

What interesting projects or research are you currently working on? Now You See It is a big book with an even bigger vision of how we need to transform not just education but also the workplace and the way we view our lives, from infancy to old age. I now want to do a series of small, focused Now You Do It books that are more activist in nature, that have URLs for lots of ordinary people who “see it” and are doing remarkable things to make a new interconnected vision happen. So I’ve been talking to a graphic novelist about a comics version, maybe even a YA version, intended for students, college and high school. I’m also interested in a book specifically for K-12 teachers and for parents explaining and giving possible, diverse blueprints for my idea that the “3 R’s” (reading, ‘riting, ‘rithmetic) were right for the 19th century schoolroom that was training people for the industrial workplace – but now we need to add a fourth R (pRogramming or algorithm …you choose the ‘R’ word!). I’m convinced that if five year olds learned programming along with fundamental digital literacies, responsibilities, and possibilities that they would fight to keep the Web open. They would see themselves as contributors, not just consumers. And the Industrial Age division of the “two cultures” (with human and social sciences and the arts on one side of the equation and science and technology on the other) would be shown to be as nonsensical as, in everyday life, it is. The Web is a technology of life. But if we let others program it, if we aren’t participating and contributing, then we are being technologized. As Douglas Rushkoff says, “program or be programmed.”

Finally, it really bothers me that so many open Web tech gatherings I attend are almost entirely white, middle-class guys. You can’t have a truly diverse, open technology created by a homogeneous group of creators. If everyone learned programming, you’d be learning a think-as-you-do method, a social method, a technology that was also about multimedia representation (the arts), and that would have profound social implications for a diverse future, with developers thinking of ways to reach an audience that was more diverse in every way.

If you could go back in time and do one thing differently, what would you change? I actually don’t like the idea of going “back in time and doing differently.” I want to go forward in time and do it differently! This year, for example, I’m learning how to draw again. I thought at one point in life that I wanted to be an artist but hadn’t drawn anything since college art classes. I love it. I’ve also taken an online writing course to see how those actually work. And I want to take a TechCrunch coding course online since I definitely need a refresher.

What changes in education would you like to see? Get rid of the tyranny of standardized testing. It was invented in 1914, to be as fast and efficient as the assembly line that turned out Model Ts, to deal with a teacher shortage in World War I. It was never meant to do more than, to quote Frederick Kelly, who created the test, assess “lower order thinking” for the masses. It’s the tail wagging the learning dog. Thank goodness for Finland and for a totally different model of thinking about learning that, ironically, turns out to rank #1 in the world according to the OECD, even though their Dewey-esque learn-then-do system, with its emphasis on equality not excellence, bans standardized testing in classrooms. I see Pasi Sahlberg as a fellow traveler and recommend his Finnish Lessons: What Can the World Learn from Educational Change in Finland?

What’s the most important advice you can give to educators? Focus on inspiring learning, not on test scores. Listen to your students. Don’t worry about expensive technology; think instead about the way kids live and work today and how students, at any age, can be prepared for a world where, according to one study, 65% of 15-year-olds will end up in careers not invented yet, and will change careers (not jobs but careers) 4-6 times in the course of their lives. Teach for disruption. And teach disruptively. That’s the key.
