Tag Archives: Chronicle of Higher Education

Dumpster Diving and other career moves: remembering the job market with Roger Whitson

It would be hard to say I enjoyed reading Roger Whitson’s very fine recent meditation in the Chronicle on the quest for a tenure-track job: his ambivalence at finding one, the mixed feelings of exaltation and guilt at getting what so many of his peers would never find, and the wrench of leaving behind an #altac existence where he had begun to make a home.

Hard to enjoy reading both because the story seems to typify what our academic life has actually become, and, frankly, because it reminded me too much of my own wandering years as a new academic a couple of decades ago.  I spent seven years full-time on the job market back in the day (if you count the last two years of graduate school).  I have estimated in the past that I must have applied for at least 700-800 jobs during those years–the idea of being targeted and selective a joke for a new father.  Fortunately I was actually only totally unemployed for four months during those years, though that was enough to plunge me thousands of dollars into debt paying for health insurance.  For five of those seven years I had full-time work in various visiting assistant positions, and for two of those visiting years I was paid so little I qualified for food stamps, though I never applied for the program.  I worked as many extra courses as I could to pay the bills–probably foolish for my career since my publishing slowed to a crawl, but it saved my pride.  I remember asking, naively, during an interview for one such visiting position whether it was actually possible to live in that area of the country on what I was going to be paid.  The chair interviewing me hesitated, then responded, “Well, of course, your wife can work.”

Only one of those years did I not get an interview, and only two of those years did I not get a campus interview, but even then this seemed like a very peculiar and unhelpful way to claim success for a beginning academic career.  We did not have anything called #altac in those days, and my plan B–which on my worst days I sometimes still wonder whether I should have followed–was to go back to cooking school and become a chef (I know, I know.  Another growth industry).  I never felt bad about pursuing a PhD in English, and I don’t think I would have even if I had gone on to become a chef.  The learning was worth it, to me, at least.

But I did grow distant from college friends who became vice-presidents of companies or doctors in growing practices, all of whom talked about their mortgages and vacations in the Caribbean or Colorado, while I was living in the cheapest two-bedroom apartment in Fairfax, Virginia, that I could find and fishing furniture, including my daughter’s first bed, out of a dumpster.  (The furniture was held together, literally, by duct tape; I had to pay for conferences.)  And I spent a lot of evenings walking off my anxiety in the park next to our apartment complex, reminding myself of how much I had to be thankful for.  After all, I had a job and could pay my bills through the creative juggling of credit card balances.  A lot of my friends had found no jobs at all.  A low-rent comparison, I realize, but I would take what solace I could get.

I do not resent those days now, but that depends a lot on my having come out the other side.  The sobering thought in all of this is realizing that in the world of academics today I should count myself one of the lucky ones.  Reading Roger’s essay, and the many like it that have been published in the last twenty years, I always get a sick, hollow feeling in the gut, remembering what it was like to wonder what would happen if….

Reading Roger’s essay I was struck again by the fact that this is now the permanent condition of academic life in the humanities.  My own job story began more than 20 years ago at Duke, and even then we were told that the job market had been miserable for 15 years (but was sure to get better by and by).  Thirty years is not a temporary downturn or academic recession.  It is a way of being.

The advent of MOOCs, all-online education, and for-profit universities is a response to the economics of higher education that is unlikely to make things any better for the freshly minted PhD.  While there are some exciting innovations here with a lot of promise for bringing learning to the many, it’s also the case that they are attractive and draw interest because they promise to do it more cheaply, which in the world of higher education means teaching more students with fewer faculty hours.  Roger’s most powerful line came toward the end:  “Until we realize that we are all contingent, we are all #altac, we all need to be flexible, and we are all in this together, we won’t be able to effectively deal with the crisis in the humanities with anything other than guilt.”

This is right, it seems to me.  In a world that is changing as rapidly and as radically as higher education, we are all as contingent as the reporters and editors in the newsrooms of proud daily newspapers.  It is easy to say that the person who “made it” was talented enough or smart enough or savvy enough, but mostly they, I, we were just lucky enough to come out the other side.  But we would be misguided to imagine that, because we made it into a world that at least resembled the world we imagined, that world will always be there.  We are an older institution and industry than music or radio or newspapers, but we are an industry and an institution nonetheless, and it seems to me that the change is upon us.  We are all contingent now.

Digitization and the fulfillment of the book

My colleague in the library here at Messiah College, Jonathan Lauer, has a very nice essay in the most recent Digital Campus edition of the Chronicle of Higher Education.  Jonathan makes an eloquent defense of the traditional book over and against the googlization and ebookification of everything.  In particular, he employs an extended metaphor drawn from the transition to aluminum bats at various levels of baseball to register his unease and reservations about the shift to electronic books and away from print that is profoundly and rapidly changing the nature of libraries as we’ve known them.  The essay is more evocative than argumentative, so there are a lot of different things going on, but a couple of Jonathan’s main points are that the enhancements we supposedly achieve with digitization projects come at a cost to our understanding of texts and at a cost to ourselves.

In the big leagues, wooden bats still matter. Keeping print materials on campus and accessible remains important for other reasons as well. Witness Andrew M. Stauffer’s recent Chronicle article, “The Troubled Future of the 19th-Century Book.” Stauffer, the director of the Networked Infrastructure for Nineteenth-Century Electronic Scholarship, cites several examples of what we all know intuitively. “The books on the shelves carry plenty of information lost in the process of digitization, no matter how lovingly a particular copy is rendered on the screen,” he writes. “There are vitally significant variations in the stacks: editions, printings, issues, bindings, illustrations, paper type, size, marginalia, advertisements, and other customizations in apparently identical copies.” Without these details, discernible only in physical copies, we are unable to understand a book’s total impact. Are we so easily seduced by the aluminum bat that we toss all wooden ones from the bat bag?

Let’s also acknowledge that our gadgets eventually program us. History teaches us that technologies often numb the very human capacities they amplify; in its most advanced forms, this is tantamount to auto-amputation. As weavers lost manual dexterity with their use of increasingly mechanized looms during the Industrial Revolution, so we can only imagine what effect GPS will have on the innate and learned ability of New York City cabbies to find their way around the five boroughs. Yet we practice auto-amputation at our own peril. We dare not abandon wooden bats for aluminum for those endeavors that demand prolonged attention, reflection, and the analysis and synthesis that sometimes lead to wisdom, the best result of those decidedly human endeavors that no gadget can exercise.

I have a lot of sympathy for Jonathan’s position; things like the revamping of the New York Public Library leave me with a queasy hole in my stomach.  I’ve had a running conversation with Beth Transue, another of our librarians, about our desire to start leading alumni tours of the world’s great libraries, but if we’re going to do so we had better get it done fast, because most of them won’t be around in a few more years, at least if the NYPL and its budgetary woes are anything to judge by.

At the same time, I think Jonathan overstates his case here.  I don’t think serious thinkers are assuming we’ll get rid of books entirely.  Although I think we are already living in what I’ve called an E-plus world, print will continue to be with us, serving many different purposes.  Jason Epstein over at the NYRB has a blog on this fact, and prognosticating the likely future and uses of the traditional book seems to be a growth industry at the moment.  I don’t think the average student is terribly interested in the material textuality that Jonathan references above, nor for that matter is the average scholar, the vast majority of whom remain interested in what people wrote, not how the publishers chose to package it.  But those issues will continue to be extremely important for cultural and social historians, and there will be some forms of work that can only be done with books.  Just as it is a tremendous boon to have Joyce’s manuscripts digitized, making them available to the general reader and to the scholar who cannot afford a trip to Ireland, so authoritative interpretations of Joyce’s method, biography, and life’s work will still require the trip to Ireland to see the thing itself, to capture what can’t be captured by a high-resolution camera.

That having been said, who would say that students studying Joyce should avoid examining the digitized manuscripts closely because they aren’t “the genuine article”?  Indeed, I strongly suspect that even the authoritative interpretations of those manuscripts will increasingly involve a commerce between examination of the physical object and close examination of digitized objects, since advanced DH work shows us time and time again that computerized forms of analysis can get at things the naked eye could never see.  So the fact that there are badly digitized copies of things in Google Books and beyond shouldn’t obscure the fact that there are some massively important scholarly opportunities here.

Jonathan’s second point is about the deeply human and quasi-spiritual aspects of engagement with traditional books that so many of us have felt over the years.  There’s something very true about this.  It is also true that our technologies can result in forms of self-amputation.  Indeed, if we are to take that point to heart, we need to admit that the technology of writing and reading is itself something that involves self-amputation.  Studies have shown that heavy readers alter their brains, and not always in a good sense.  We diminish the capacity of certain forms of memory, literally making ourselves absent-minded professors.  Other studies have suggested that persons in oral cultures have this capacity in heightened form, and some people argue that this generation is far more visually acute than those that preceded it, developing new abilities because of its engagement with visual texts.  So, indeed, our technologies alter us, and even result in self-amputation, but that is as true of the traditional book as of the internet.  This second point is Jonathan’s larger claim, since it attributes to the traditional book as such a superiority in something central to our humanity.  I am intrigued by this argument that the book is superior for serious reflection and for the quasi-spiritual aspects of study that we have come to treat as central to the humanities.

I admit, I don’t buy it.

First, I admit that I’m just wary about attributing essential human superiorities to historical artifacts and practices.  Homer as a collection of oral songs is not inherently inferior to the scrolls in which those songs were originally collected, or to the book form in which they found their apotheosis.  We have come to think of the book as exhibiting and symbolizing superior forms of humanity, but it’s not clear that the book form triumphed in the west because of these attributes.  Indeed, traditional Jews and others clearly think the scroll remains the superior spiritual form even to this day.  Rather, the codex triumphed for a variety of complicated reasons.  Partly, the Christian churches apparently wanted, for ideological reasons, to distinguish their own writings from the writings of the Jews.  There may have been some more substantive reasons as well, though that’s not entirely clear: Anthony Grafton points out that many of the Christian innovations with the codex seemed to focus on the desire to compare different kinds of texts side by side (an innovation, I will point out, for which the internet is in many ways better suited).  The codex also triumphed not because it was spiritually and intellectually superior but because it was, frankly, more efficient, cheaper, and easier to disseminate than its scrolly ancestors.  One good example is from the poet Martial, who explicitly ties the selling of his poetry in codex form to making it easily and efficiently accessible to the common person:  “Assign your book-boxes to the great, this copy of me one hand can grasp.”

The entire trend of book history has been toward this effort to make texts and what they contain more readily and easily available to more and more people.  From the early clay tablets to the mass market paperback that let you carry Plato in your hip pocket, the thrust of the book has been toward broader and broader dissemination, toward greater and greater ease of use, toward cheaper and cheaper accessibility.  The goal of writing, even when that writing was imprisoned in libraries that only the initiated could enter, as in Umberto Eco’s The Name of the Rose, has been open access.

The digitization that is occurring now comes to fulfill the book, not destroy it.

Second, I guess I no longer believe fully in the spiritual or intellectual superiority of codex forms, simply because that belief doesn’t comport with my experience.  As I do more and more of my reading of books on my various e-readers, I find that I have serious, contemplative, analytical, and synthetic engagements with all kinds of texts, from those hundreds of “pages” long to those not.  As I get used to the tools of various e-readers, there’s almost nothing that can’t be accomplished in some way on an e-reader that is accomplished in traditional books.  Although I now interact with texts differently in a spatial sense, I am able to take fuller and more copious notes, I am able to mark texts more easily, and if I can’t quite remember where something was in the book I can use a search engine to find not only a specific phrase or topic but every single instance of that topic in the book.  Moreover, because every text represents an act of contemplation on and conversation with other texts, I can at the touch of a screen go and read for myself the interlocutors embedded within a book, just as those interested in Jonathan’s essay can touch my link above and decide for themselves whether I am reading him fairly.  Thus there are very obviously some serious ways in which e-readers are superior for analytical and interpretive readings of texts, or at least their equal.

All this having been said, there remains one way in which I find the traditional paper book clearly superior to the e-book, and that has to do with my ability to make it mine.

I spoke a couple of days ago about the personal connection I felt to Kierkegaard in rereading him and discovering my many years of underlines, highlights, and marginalia.  I even confess that I read Kimi Cunningham Grant’s new memoir on my iPad, but I still bought a hardcover at the reading–not because I thought I would be able to analyze it more effectively in hardcover, but because I wanted her to sign it for me.

This is a personal connection to the book that isn’t unimportant, but it is about my personal biography, and Kimi’s.  It’s not about the text, and frankly I doubt that in the long run it will even be about literary history.  Some literary archivist somewhere is collecting all the shared comments on the Kindle version of Kimi’s book, and that massive marginalia will be fodder for some graduate student’s dissertation in a few decades.

I pity the poor graduate student who decides on such a project. But at least she won’t have to strain her eyes to decipher the handwriting.

Is hierarchy the problem? Higher Education Governance and Digital Revolutions

In the Digital Campus edition of the Chronicle, Gordon Freeman makes the now common lament that colleges and universities are not taking the kind of advantage of cloud technology that would enable superior student learning outcomes and more effective and efficient teaching. In doing so he lays the blame at the feet of the hierarchical nature of higher ed in a horizontal world.

Higher-education leaders, unlike the cloud-based companies of Silicon Valley, do not easily comprehend the social and commercial transformation gripping the world today. Indeed, there was a certain amount of gloating that the centuries-old higher-education sector survived the dot-com era. After all, textbooks are still in place, as are brick and mortar campuses.

The simple fact is that life is becoming more horizontal, while colleges remain hierarchical. We can expect the big shifts in higher education—where the smart use of digitization leads to degrees—to come from other countries.

And that’s sad, because the United States makes most of the new technologies that other parts of the world are more cleverly adapting, especially in education.

(via Instapaper)

I appreciate the metaphor of hierarchy versus horizontality, and I think it’s seductive, but I wonder if it’s accurate. It captures an American view of the world that imagines the bad guys as authoritarian and hierarchical and the good guys as democratic and dialogical.

Whether or not the horizontal crowd can ever do evil is a post for another day. I’m more interested in whether the slowness of higher ed to take up new modes of doing business is actually due to hierarchy. A colleague at a national liberal arts college that shall not be named told me that his president gave strong support to digital innovation in teaching and research, but that the faculty as a whole was slow on the uptake, which made institution-wide change difficult to achieve.

This rings true to me as an administrator. Change is slow not because we are too hierarchical, but because we take horizontality to be an inviolable virtue. Faculty are largely independent operators with a lot of room to implement change, or not, as they choose. Institution-wide change in educational programming takes not one big decision but a thousand small acts of persuasion and cajoling. I mostly think this is as it should be. It is the difficult opposite edge of academic freedom. It means that changing the direction of an institution is like changing the direction of an aircraft carrier, and doing so without an admiral who can set the course by fiat. There are problems with that, but I’m not sure that the changes required to make higher ed as nimble as Gordon Freeman desires would result in an educational system we’d like to have.

Cosmopolis, My Home Town

In my first school years growing up as a child of American missionaries in Papua New Guinea, my friends and I lined up outside our two-room schoolhouse every day, stood to attention, and sang “God Save the Queen” at the raising of the Australian flag.  We played soccer at recess.  And cricket.  I learned quickly to speak fluent pidgin–the standard language of commerce and conversion among the 1,000 different language groups on the island–and probably spoke as much pidgin in my four years there as I did English.  By the end of the first six months I spoke with an Aussie accent.

At the same time my friends and I were fiercely loyal Americans, even though America was mostly an idea our parents talked about.  A place in pictures we inhabited in the Polaroid versions of our infant selves.  I proudly proclaimed myself a Texan even though I had spent only the first two years of my life in Texas and had no living memory of it except hazy dream flashes of a  visit to a beach in Galveston.  Once, erudite already at the age of seven and reading my way through the World Book Encyclopedia, I proclaimed confidently that Australia was as big as the continental United States.  Fisticuffs ensued. My friends in utter disbelief that anything in the world could be so large as America–so large did it loom in our telescopic imaginations–and in disbelief too that I would have the temerity to state the blasphemy out loud.

I think this urgency to be American was born somehow out of an intuited recognition of our placelessness.  It was a longing to belong somewhere, and an acknowledgement that somehow, despite appearances, we were not entirely sure we belonged where we were. Unlike most of my friends, I returned to the States after only four years.  I shed my Aussie accent hurriedly.  When my father came to my third grade classroom in Bethany, Oklahoma, I refused to speak pidgin with him, embarrassed, pretending to forget.  No one played soccer.  No one had heard of cricket.  I semi-learned to throw a baseball, though my wife still throws better than I do.  For the first year back in the States, I rooted for the Texas Longhorns, before finally getting religion sometime right around 1970.  I’ve been a Sooner fan in good standing ever since.

This sense of cultural dislocation, of belonging and not belonging to two different countries and cultures, was, I think, felt much more acutely by my friends who remained in New Guinea for the duration of their childhoods.  And it has certainly been detailed and discussed much more movingly and thoughtfully by my former student here at Messiah College, Carmen McCain.  Still, I think this cultural lurching has remained important to me.  While I became thoroughly and unapologetically American, I retained a sense that people lived in other ways, that I had lived in other ways.  Somehow, to remain loyal to all the selves that I had been, I could never be loyal to just one place or just one people.  In that sense, I have always been drawn to a kind of cosmopolitan ideal, a recognition that the way we do things now is only a way of doing things now, bound by time, chance, and circumstance–that there are many different ways to live, and that these ways may be at different times taken up and inhabited.  And so the possibilities for our selves are not bounded by the blood we’ve been given or the ground to which we’ve been born.

At the same time, I’ve really been impressed lately by a couple of cautionary essays on the limitations of cosmopolitanism.  This week Peter Wood over at the Chronicle of Higher Education sounded a cautionary note about the ideal of global citizenship.

Being a “citizen of the world” sounds like a good and generous thing. Moreover it is one of those badges of merit that can be acquired at no particular cost. World citizens don’t face any of the ordinary burdens that come with citizenship in a regular polity: taxes, military services, jury duty, etc. Being a self-declared world citizen gives one an air of sophistication and a moral upper hand over the near-sighted flag-wavers without the bother of having to do anything.

Well, one can only say yes; this strikes me as eminently fair.  Though I will point out that lately the flag-wavers themselves seem not too interested in the basic things of a regular polity, like paying taxes.  Still, Wood has a point that cosmopolitanism can often devolve into a kind of irresponsible consumerist tourism–imbiber of all cultures, responsible for none.  He implies, rightly I think, that whatever the value of global awareness, the bulk of life is worked out in the nitty-gritty day-to-day of the local business of things.  All living, not just all politics, is local in some utterly conventional and inescapable sense.

Wood goes on to critique Martha Nussbaum, though it is a generous critique, it seems to me.

Higher education inevitably involves some degree of estrangement from the culture and the community in which a student began life. If a student truly engages liberal education, his horizons will widen and his capacity for comprehending and appreciating achievements outside his natal traditions will increase. Thus far I accept Nussbaum’s argument. But a good liberal-arts education involves a lot more than uprooting a student; showing him how limited and meager his life was before he walked into the classroom; and convincing him how much better he will be if he becomes a devotee of multiculturalism. Rather, a good liberal arts education brings a student back from that initial estrangement and gives him a tempered and deepened understanding of claims of citizenship—in a real nation, not in the figment of “world citizenship.”

I like a lot of what Wood is doing in passages like this, but I’m concerned that his only means of articulating a notion of particularity is through the category of the nation.  In a nation as big and baggy as the United States, does this give us a very robust sense of the local and particular?  And does it solve my basic problem: that I feel loyal to different localities, to the integrity of the memory of the person I have been and the people with whom I was and somehow still am?

I’m more attracted to what my colleague here at Messiah College, John Fea, has to say about cosmopolitanism in his recent and very good essay on the issue in academe, where he develops the concept of cosmopolitan rootedness as an ideal to strive after.

But this kind of liberal cosmopolitanism does not need to undermine our commitment to our local attachments. Someone who is practicing cosmopolitan rootedness engages the world from the perspective of home, however that might be defined. As Sanders writes:

To become intimate with your home region [or, I might add, one’s home institution], to know the territory as well as you can, to understand your life as woven into local life does not prevent you from recognizing and honoring the diversity of other places, cultures, ways. On the contrary, how can you value other places, if you do not have one of your own? If you are not yourself placed, then you wander the world like a sightseer, a collector of sensations, with no gauge for measuring what you see. Local knowledge is the grounding for global knowledge. (1993, 114)

Or to quote the late Christopher Lasch:

Without home culture, as it used to be called—a background of firmly held standards and beliefs—people will encounter the “other” merely as consumers of impressions and sensations, as cultural shoppers in pursuit of the latest novelties. It is important for people to measure their own values against others and to run the risk of changing their minds; but exposure to others will do them very little good if they have no mind to risk. (New Republic, 18 February 1991)

So is cosmopolitan rootedness possible in the academy? Can the way of improvement lead home? Can we think of our vocation and our work in terms of serving an institution? Our natural inclination is to say something similar to the comments in the aforementioned blog discussion. I can be loyal to an institution as long as the administration of the institution remains loyal to me. Fair enough. Administrators must be sensitive to the needs of their faculty, realizing that institutional loyalty is something that needs to be cultivated over time. But this kind of rootedness also requires faculty who are open to sticking it out because they believe in what the institution stands for—whatever that might be. (This, of course, means that the college or university must stand for something greater than simply the production of knowledge). It requires a certain form of civic humanism—the ideological opposite of Lockean contractualism—that is willing to, at times, sacrifice rank careerism for the good of the institution.

Instead of global citizenship, John is championing what administrators sometimes call institutional citizenship (and as an aside I would only say that John is exemplary in this regard, as his essay might suggest).  Yet I admit that I find myself still grasping after that thing that speaks out of our memories, those places to which we remain loyal in thought and speech because we have been committed to those locations too.  And I wonder, then, whether we might be loyal to the places and spaces of our imaginations, places and selves and worlds that we can imagine as places of becoming.  If I have been in and of these other places, how is that reflected in my being in and of this place I’m in, and how should that be imagined in light of the places I might also be in and of, if only not yet?

John, I know, is a loyal citizen of New Jersey, and defends it when none of the rest of us will. I wish him well.  And I am a loyal citizen of the Papua New Guinea of my memory, and I am a fiercely loyal southerner and southwesterner who takes an ethnic umbrage at the easy sneering about the south that springs unconsciously to the lips of northerners, and I am a fiercely loyal Oklahoman who believes the state has something to be proud of beyond its football team.

I am also, in some sense, a loyal citizen of the heaven of my imagining where all and everyone speak in the tongues of men and angels  and we hear each and every one in our own tongues, a transparent language without translation, a heaven where every northerner finally learns the proper way to say “Y’all.”

What theory of locality and cosmopolitanism can get at this sense that I am one body in a place, but that this body bears in its bones a loyalty to many places, growing full of spirit at the smell of cut grass after rain in the hills of Arkansas, nose pinching at the thought of the salty stink of Amsterdam, remembering faintly the sweat in the air and on the leaves of the banana trees in highland tropics of New Guinea?

Grading the Crowd

Can the wisdom of crowds apply to grading student papers, or to the evaluation of culture more generally?  What about the quality of a theological argument, or a decision about foreign policy?  We’re much taken with the idea of crowds and collaboration lately, and not without good reason.  I think there’s a great deal to be said for getting beyond the notion of the isolated individual at work in his study; especially in the humanities, I think we need to learn something from our colleagues in the sciences and think through what collaborative engagement as a team of scholars might look like as a norm rather than an exception.  At the same time, is there a limit to collective learning and understanding?  Can we detect the difference between the wisdom of a crowd and the rather mindless preferences of a clique, or a mob?  I found myself thinking about these things again this evening as I read Cathy Davidson’s latest piece in The Chronicle Review, “Please Give Me Your Divided Attention: Transforming Learning for the Digital Age.”

Cathy Davidson, "Now You See It"

I wrote about Davidson a couple of days ago–she’s around a lot lately, as authors tend to be when a new book comes out that a publisher has decided to push–and I feel almost bad at taking up only my crabbiest reactions to her recent work.  First, let me say that I briefly crossed paths with Davidson at Duke, where she was hired the year I finished my doctorate in English.  She seemed like a breath of fresh and genuine air in a department that could sometimes choke on its collective self-importance, and the enthusiasm and generosity and love of teaching that Davidson evinces in this essay were evident then as well, though I never had her for a class.  And, as this comment suggests, I think there’s a lot in this essay that’s really important to grapple with.  First, there is her account of the way she and some of her colleagues at Duke trusted students with an experiment in iPod pedagogy that paid off in so many unexpected ways–we now know a good bit of it was far ahead of its time.  Moreover, she paints a wonderful picture of students as collaborative teachers in the learning process in her course on the way neuroscience is changing everything.  Still, as with a lot of these things that focus on student-centeredness, I find that promising insights are blinded by what amounts to a kind of ideology, one that may not be as deeply informed about human action as it really ought to be.  I felt this way about Davidson’s discussion of grading.

 There are many ways of crowdsourcing, and mine was simply to extend the concept of peer leadership to grading. The blogosphere was convinced that either I or my students would be pulling a fast one if the grading were crowdsourced and students had a role in it. That says to me that we don’t believe people can learn unless they are forced to, unless they know it will “count on the test.” As an educator, I find that very depressing. As a student of the Internet, I also find it implausible. If you give people the means to self-publish—whether it’s a photo from their iPhone or a blog—they do so. They seem to love learning and sharing what they know with others. But much of our emphasis on grading is based on the assumption that learning is like cod-liver oil: It is good for you, even though it tastes horrible going down. And much of our educational emphasis is on getting one answer right on one test—as if that says something about the quality of what you have learned or the likelihood that you will remember it after the test is over.

Grading, in a curious way, exemplifies our deepest convictions about excellence and authority, and specifically about the right of those with authority to define what constitutes excellence. If we crowdsource grading, we are suggesting that young people without credentials are fit to judge quality and value. Welcome to the Internet, where everyone’s a critic and anyone can express a view about the new iPhone, restaurant, or quarterback. That democratizing of who can pass judgment is digital thinking. As I found out, it is quite unsettling to people stuck in top-down models of formal education and authority.

Davidson’s last-minute veering into ad hominem covers over the fact that she doesn’t provide any actual evidence for the superiority of her method; offers a cultural fact as a substantive good–if this is how things are done in the age of digital thinking, it must be good, you old fogies; crassly assumes that any theory of judgment that does not rely on the intuitions of 20-year-olds is necessarily anti-democratic and authoritarian; and glibly overlooks the social grounding within which her own experiment was even possible.  All of this does sound like a lot of the stuff that comes out of graduate departments in English, Duke not least of all, but I wonder if the judgment is really warranted.

An alternative example would be a class I occasionally teach, when I have any time to teach at all anymore, on book reviewing.  In the spirit of democratizing the classroom, I usually set this course up as a kind of book contest in which students choose books to review, and on the basis of those reviews books proceed through a process of winnowing until at last, with two books left, we write reviews of the finalists and then vote for our book of the year.  The wisdom of the crowd does triumph in some sense, because through a process of persuasion students have to convince their classmates which books are worth reading next.  The class is partly about the craft of book reviewing, partly about the business of book publishing, and partly about theories of value and evaluation.  We spend time not only thinking about how to write effective book reviews for different markets; we also discuss how theorists from Kant to Pierre Bourdieu to Barbara Herrnstein Smith treat the nature of value, all in an effort to think through what we are saying when we finally sit down and say one thing is better than another.

The first two times I taught this class, I gave the students different lists of books.  One list included books that had been shortlisted for book awards, one included first-time authors, and one included other books from notable publishers that I had collected during the previous year.  I told them that to begin the class they had to choose three books to read and review from the lists I had provided, and that at least one book had to be by a writer of color (my field of expertise being ethnic literature of the U.S., I reserved the right).  They could also choose one book through their own research to substitute for a book on one of my lists.  Debates are always spirited, and the reading is always interesting.  Students sometimes tell me that this was one of their favorite classes.

The most recent time I taught the class, I decided to take the democratizing one step further by allowing the students to choose all three books on their own.  Before class began I told them how to go about finding books through major industry organs like Publishers Weekly, as well as how to use the search engines on Amazon and elsewhere (their digital knowledge notwithstanding, students are often surprised at what you can do with a search engine).  The only other guidance was that students would ultimately have to justify their choices by defending in their reviews why they liked the books and thought they could be described as good works of literature, leaving open what we meant by terms like “good” and “literature,” since that was part of the purpose of the course.

The results were probably predictable but left me disheartened nonetheless.  Only one book out of the fifty-some books in the first round was by a writer of color.  A predictable problem, but one I had kept my fingers crossed would not occur.  More than half the books my students chose were from the romance, mystery, fantasy, and science fiction genres.  Strictly speaking I didn’t have a problem with that, since I think great works of fiction can be written in all kinds of genres, and most works of what we call literary fiction bear the fingerprints of their less reputable cousins (especially mystery writing, in my view, but that’s another post).  I thought there might be a chance that there would be some undiscovered gem in the midst.  I did not have the time to read all fifty books, of course, but relied on students to winnow for me and then tried to read every book that made it to the round of eight.  It’s fair to say that by my personal authoritarian aesthetic, none of the books that fell into that generic category could have been called a great work of fiction, though several of them were decent enough reads.  Still, I was happy to go along with this and see where things would take us, relatively sure that as things went on and we had to grapple with what it meant to evaluate prose, we would probably still come out with some pretty good choices.

Most of the works that I would have considered literary were knocked out by the second round, though Jonathan Franzen’s Freedom made it all the way to the finals, paired against Lisa Unger’s entry from 2010, whose title I can’t even remember now.  In the end the class split almost down the middle, but chose Unger’s book as the best book they had read during the course of the semester.  Not that I thought it was a terrible book.  It was a nice enough read, and Unger is a decent enough mystery writer.  But being asked to remember the book is a little like being asked to remember my last trip to McDonald’s.  One doesn’t go there for a memorable dining experience, and one doesn’t read Lisa Unger in order to come up with books that we will care to remember several weeks after having set them down.  But what was perhaps most intriguing to me was that after an hour-long discussion of the two books in which students offered spirited defenses of each writer, I asked them: if they could project themselves into the year 2020 and had to choose only one book to include on a syllabus for a course on the best books of 2010, which book would it be?  Without exception the students voted for Franzen’s book.  When I asked the students who changed their votes why this would be, they said, “We think that Franzen is more important; we just liked reading Unger more.”

This is the nub.  Can the wisdom of crowds decide what is most important?  To that, the answer can only be “sometimes.”  Just as often crowds choose what is conveniently at hand, what satisfies a sweet tooth, or even what satisfies the desire for revenge.  Is there a distinction between what is important or what is true and what is merely popular?  Collaboration can lead us past blindnesses, but it is not clear that the subjectivity of a crowd is anything but blind (in my original draft I typed “bling,” a telling typographical slip and one that may be truer and more interesting than “blind”).  It is not clear that crowds can consistently be relied upon to intuit what ought to last.  This may not be digital thinking, but at least it is thinking, something crowds cannot always be relied upon to do.

If we could really rely on crowds to make our choices, we would discover that there is really very little to choose between almost anything.  Going on Amazon, what is amazing is that four stars is the average score for the hundreds of thousands of books that are catalogued.  And popularity trumps everything:  Lisa Scottoline scores higher in a lot of cases than Jane Austen.  Literally everything is above average and worth my time.  This is because in the world of the crowd, people mostly choose to be with those crowds that are most like themselves and read those things that are most likely to reinforce their sense that they are in the right crowd to begin with.  This is true, as even elementary studies of internet usage have pointed out.  Liberals read other liberals, and delight in their wisdom and the folly of conservatives.  Conservatives read other conservatives and do likewise.  This too is digital thinking, and in this case it is quite easy to see that crowds can become authoritarian over and against the voice of the marginalized.  My students’ choosing not to read writers of color unless I tell them to is only one small reminder of that.

Which leads to one last observation.  I wonder, indeed, whether this experiment worked so well at Duke because students at Duke already know what it takes to get an A.  That is, in some sense Davidson is not really crowdsourcing at all but is relying on educational processes that deliver students well attuned to certain forms of cultural excellence, able to create effectively and “challenge” the status quo because they are already deeply embedded within those forms of cultural excellence and all the assumptions they entail.  That is, as with many pedagogical theories offered by folks at research institutions, Davidson isn’t theorizing from the crowd but from a tiny, elite, and extremely accomplished sample.  As Gerald Graff points out, most of us most want to teach the students who don’t need us, which means most of us want to teach at places where none of the students actually need us.  Davidson’s lauding her students for not really needing her to learn may merely be an index of their privilege, not the inherent wisdom of crowds or the superiority of her pedagogical method.  Her students have already demonstrated that they know what it takes to get A’s in almost everything, because they have them–and massively high test scores besides–or they are unusually gifted in other ways, or they wouldn’t be at Duke.  These are the students who not only read their entire AP reading list the summer before class started; they also read all the books related to the AP reading list and attended tutoring sessions to learn to write about them besides.

Let me hasten to say that there is absolutely nothing wrong with their being accomplished.  On some level, I was one of them, having gone to good private colleges and elite graduate programs.  But it is a mistake to assume that the well-learned practices of the elite, the cultural context that reinforces those practices, and the habits of mind that enable the kinds of things Davidson accomplished actually form the basis for a democratizing pedagogy for everyone.  Pierre Bourdieu 101.

On Reading and Linearity; or, the virtues of disorganization

OK, I’ve been hard-pressed to keep to my commitment to blog at least once a week.  But (he says hopefully) did anyone really miss me?

Gina Barreca has a nice piece over at the Chronicle of Higher Ed, where she talks about her own absolute disorganization as a personal librarian.  According to Barreca, book people fall into two basic types, the puritanical organizers and the bohemian disorganized, each of whom looks on the other with something of a pronounced moral disdain.

Personally I think I fall somewhere in the middle.  I am constantly anxious about the disorganization of my books, which I suppose makes me something of a Calvinist: aware of my fallenness away from an ideal order, but also aware of my inability to do anything about it.  Oh woeful disorganized man that I am, who will save my books from this body of sin and death?  Perhaps I’ll have to hire a life coach.

What my library would look like if I were guilt-free

I’m genuinely interested in how Barreca describes the effect of her disorganization on her reading.

So why do I prefer my own disorder to, for example, the brilliant ease offered by the books in my husband’s part of the library — the ones grouped alphabetically within their own periods?

For the same reasons I prefer a real-live bound, paper dictionary or thesaurus to a virtual one, which is the same reason I like libraries and bookstores, which just so happens to be the same reason I like reading promiscuously in the first place: You don’t know what delight an unexpected coupling will offer. There are literally unimagined pleasures arising from the surprising juxtaposition of unlikely words, materials, and texts.

How wonderful to discover what I didn’t know I was searching for, and what fun not to move, always, from A to B.

To some degree Barreca is flogging a distinction made by advocates of hypertext novels and the like.  Typically, we imagine reading as a linear activity that proceeds from beginning to end, and the supposed tyranny of the book reinforces this kind of reading process: always getting from point A to point B, no distractions in between.

What I like about Barreca’s offhanded comment is that it shows how bizarre a picture of reading that actually is.  When advocates of hypertext declaim pompously about the superiority of networked reading in comparison to the tyrannical linear insistence of the book, I always think they have never really read a book, or at least that their idea of reading is a game with which I am unfamiliar.

More typically, we read several books at once.  And we don’t read from beginning to end: we skip the dull parts, we read ahead to see if what we’re ploughing through at the moment is really worth it, we attend to the dialogue rather than the description, or vice versa.  We forget what we read a week ago and start over, or we forget and skip forward to something that looks interesting.  We give up halfway through and cast the text aside in despair.  The form of the book has always been much more malleable than those hermeneuts of suspicion have allowed.

None of this necessarily denies the superiority of hypertext for certain kinds of things, but at least in argument we ought to give accurate pictures of what reading normally entails, so that we can know how it is actually changing.