Monthly Archives: March 2012

Teaching Humanities to digital natives who may know more than we do.

I remember a story about the advent of the New Criticism in which one of those famous critic/scholar/teachers–I forget which one, but I want to say Cleanth Brooks or perhaps John Crowe Ransom–admitted to rushing home at night to read feverishly ahead in the texts he was teaching so that he was ready to go the following day. On the one hand, this is a familiar story to any new (or not so new) professor who’s trying to stay one step ahead of the onrushing train. On the other hand, it’s also the case that part of this was demanded by the fact that Brooks and others were trying to do something totally new for a literature classroom: the close, perspicacious reading whose minutest detail nevertheless resulted miraculously in a coherent organic whole. That kind of textual analysis was the meat of my own education, and to be honest, it hasn’t really changed all that much despite all the new (and now not so new) theories that came in with the advent of deconstruction and its descendants. On the undergraduate level we still, more or less, do the close reading, even if we now look for the way things fall apart or for hints and allegations of this or that cultural depravity.

But I am intrigued by just how hard Brooks/Ransom (or whoever it was) had to work to stay ahead of his students, in part because he really didn’t know entirely what he was doing. He wasn’t building on the secure corpus of knowledge that previous literary scholastics had received and passed on. Despite the mythic and quasi-priestly status that some New Critics projected–turning the critic into an all-knowing seer, and thus setting the stage for the later assertions that critics were really the equals or superiors of the novelists and poets they read and critiqued, knowing what those poor souls could only allude to and evoke–there was a very real sense in which the New Criticism was much more democratic than the literary scholasticism that preceded it. (I am sure Frank Lentricchia is exploding about now, or would be if he ever actually bothered to read me.) While it may not have been more democratic in the sense that the New Critics seemed to cast a mysterious aura about all they did, developing a new and arcane ritual language to accompany it, it was more democratic in the sense that the method was potentially available to everyone. Not everyone could have the time to read all the histories and all the letters, delve into the archives, and read the vast quantities of literature required for the literary scholasticism that characterized old-style literary history. But everyone could read the poem or the novel set in front of them. And potentially a smart undergraduate could see a good deal that the prof had missed, or point out the problems in particular interpretations. When the evidence of the poem was simply the poem itself, all the cards were on the table. No longer could a professor say to the quivering undergraduate, “Well, yes, but if you had bothered to read x, y, and z you would understand why your assertions about this poem’s place in literary history are totally asinine.” The average undergraduate is never in a place to dispute with a professor over the place of this or that figure in literary history, but they could, in fact, argue that a professor had gotten a poem wrong, that an interpretation didn’t hold up to a closer scrutiny of the facts. The feverish late-night work of my Brooks/Ransom avatar, like the feverish late-night work of many a new and not so new professor, is sometimes cast as a noble inclination to truth or knowledge, or the discipline. It is, in truth, very often the quest to avoid embarrassment at the hands of our smarter undergraduates, the quest for just enough knowledge or just enough preparation to make sure we justify our authority in the eyes of our skeptical younger charges.

I was thinking about this again while attending the Re:Humanities undergraduate DH conference at Swarthmore/Bryn Mawr/Haverford Thursday and Friday. Clearly, one of the biggest challenges to bringing DH fully on board in humanities disciplines is the simple fact that undergraduates often know as much, and often a great deal more, about the tools we are trying to employ. On the one hand, this is a tremendous challenge to mid-career academics who understandably have little interest in abandoning the approaches to scholarship, teaching, and learning that they have developed, that they understand, and that they continue to use effectively given the assumptions and possibilities of those tools as they are. It was ever thus, and to some degree colleges always remain one step behind the students they are attempting to educate, figuring out on the fly how our own education and experience can possibly apply in this day and hour.

However, I also wonder whether the democratization of the technological environment in the classroom isn’t a newly permanent state of affairs. The pace of technological change–at least for the present, and why would we assume that it should stop in the near or immediate future–means that there is some sense in which we are entering a period in the history of education in which educators will, in some sense, never know any more about the possibilities of the tools they are using than do the students they are teaching. Indeed, given the nature of the tools, it is quite likely that collectively the students know a great deal more about how to use the tools available to them and that they are likely to be attuned more quickly to the latest technological developments. What they don’t know–and what we as educators don’t know either–is how best to deploy those resources to do different kinds of humanistic work. The teleology of learning used to be fairly, if undemocratically, straightforward. The basic educational goal was to learn how to do what your teacher could do–with reading, with texts, with research. In our current age that teleology is completely, perhaps appropriately, disrupted. But that doesn’t alleviate the sense that we don’t know entirely what we should be teaching our students to do when we don’t entirely know what to do or how to do it ourselves.

Mortimer Adler famously wrote a book on “How to Read a Book,” and though people bemoaned Adler as an elitist and a snob, the basic idea was still important. Some people knew how to read books and others did not. I still think it’s the case that we take a tremendous amount for granted if we assume an undergraduate actually knows how to read an old-fashioned codex well. They don’t. On the other hand, we have no equivalent book that tells us “how to read…,” in part because we don’t know how to fill in the blank, though perhaps “digital artifacts” comes as close as anything. We’re not even sure what tools we should be using to do whatever it is we are doing as humanists in this day and age. No wonder most professors choose to continue to use books, even though I think the day is fast approaching when students won’t tolerate that, any more than an ancient would have tolerated the continued use of scrolls when a perfectly good codex was ready at hand. What the current technological changes are doing is radically democratizing the classroom on the level of the tool.

I did have a couple of signs of hope this past week at the Re:Humanities conference at Swarthmore. In the first place, if the educational system in the humanities is becoming radically democratized at the level of structure, I think it is safe to say there are many, many, many people using that democracy well. The students at the conference were doing stunningly good and creative work that was clearly contributing to our knowledge of the world around us–sometimes pursuing these projects independently but, most often, in partnership and in mentoring relationships with committed faculty. (It is, of course, also the case that people can use democracy poorly, as I’ve suggested elsewhere; this would be true in both the classroom and the body politic, so we should ask whether and where the democratization of our educational system is being used well, rather than assuming that because we use the word democracy we have named a substantive good).

Second, one of the chief insights I drew from the different speakers was that if we put the tools on the table as possibilities, students will surprise and amaze us with what they manage to come up with. What if we found ways to encourage students to get beyond the research paper and asked that they do serious creative and critical work with the tools they have every day at hand on their iPhones, laptops, and the like? What if we encouraged them to find the best ways to answer the kinds of questions humanists have always asked, and to identify the new questions and potential answers that new (and now not so new) technologies make possible? We will have to do this regardless, I think. The age demands it. And I suspect that there will be many, many more frantic late nights for faculty ahead. But I think those frantic late nights will be built less and less on the belief that we have to get on top of “the material” and “stay ahead” of our students. When they can bring in material we’ve never heard of with the touch of a finger on their iPhones, we have no hope of being on top of the material or staying ahead in any meaningful sense. Perhaps what we can do is inspire them to charge ahead, guide them to the edges of the landscape that we already know, and partner with them in the exploration of the landscapes that we haven’t yet discovered.

Digital History of the Brethren in Christ; writing in an age of distraction

I’ve been spending the last couple of days at the undergraduate conference on the digital humanities at Swarthmore College, getting a feel for what might be possible at the undergraduate level. Yesterday’s keynote by Alexandra Juhasz, whose book Learning from YouTube created a splash a couple of years ago, emphasized that we now have to write in an environment in which we know people will be distracted, and it may not be a feasible goal to overcome their distraction. Her own work is trying to account for what that might mean for multimedia writing, for either scholars or undergraduates. What does it mean to write for the distracted, knowing that they will remain distracted? I don’t quite have my brain around that rhetorical situation yet.

I’ve especially been excited to see the real-world possibilities for digital humanities projects and to imagine the kinds of things our undergraduates might do. My colleague John Fea over at The Way of Improvement Leads Home directed my attention to a great new digital history site by one of our graduates from Messiah College, Devin Manzullo-Thomas, on the history of the Brethren in Christ and evangelicals. Devin is now a graduate student in digital history at Temple. I’d love to see our undergraduates working with our faculty on a project like this.

What does an education for democracy look like?

I’ve been reading a good bit lately about the importance of education for democracy, most recently via the new Patheos post from my colleague John Fea. As is often the case, John roots his analysis of our current state of affairs in a comparison with the vision of the founding fathers in the early republic. Broadly speaking, the narrative John sketches is that we have moved from an education for democracy to an education for utility (or for jobs). Our contemporary discourse is focused almost exclusively on the purposes of education in procuring paying jobs for individuals and securing economic health for the nation. Of this current state of affairs, John notes the following:

But is the kind of training necessary for a service-oriented capitalist economy to function the same kind of training necessary for a democracy to flourish? It would seem that the study of history, literature, philosophy, chemistry, politics, anthropology, biology, religion, rhetoric, and economics is essential for producing the kind of informed citizen necessary for a democracy to thrive. Democracy requires what the late Christopher Lasch called “the lost art of argument”—the ability to engage unfamiliar ideas and enter “imaginatively into our opponent’s arguments, if only for the purpose of refuting them.” The liberal arts teach this kind of civil dialogue. The founders knew what they were talking about.

Some of what John is saying is echoed in Andrew Delbanco’s book, which I discussed a couple of days ago and have now made my way through a bit further. The virtue of Delbanco’s book is to push John’s analysis even further into the past, noting the high value that the Puritans put on education as a means of developing the whole person. In other words, the writers of the early republic had inherited what was essentially a religious ideal. We seek education fundamentally out of an ethical commitment to others and out of a religious commitment to a higher calling.

despite its history of misuse and abuse, there is something worth conserving in the claim, as Newman put it, that education “implies an action upon our mental nature, and the formation of a character.” College, more than brain-training for this or that functional task, should be concerned with character— the attenuated modern word for what the founders of our first colleges would have called soul or heart. Although we may no longer agree on the attributes of virtue as codified in biblical commandments or, for that matter, in Enlightenment precepts (Jefferson thought the aim of education was to produce citizens capable of “temperate liberty”), students still come to college not yet fully formed as social beings, and may still be deterred from sheer self-interest toward a life of enlarged sympathy and civic responsibility.

Delbanco, Andrew (2012-03-22). College: What it Was, Is, and Should Be (Kindle Locations 733-739). Princeton University Press. Kindle Edition.

Delbanco argues that the uniquely American insight about a college education–a gift as unique as, and perhaps more important than, jazz or Hollywood–is that this ideal of a transformative education is not limited to an elite but should in principle be available to all. This is why the American system of general education at the tertiary level is nearly unique in the modern world.

The question, however, is whether this ideal has ever been realized in practice. The answer is obviously no. College attendance was in fact very limited until quite recently, and the kind of education Jefferson and others imagined was, for the populace as a whole, achieved primarily through means other than a college education–in what we would now call high school or even earlier, since even compulsory high school was a post-republican ideal. Ironically, the very intense conflicts in the United States over the value of college, and over whether college should focus on liberal learning or professional preparation, are precisely a consequence of the efforts toward its democratization. The conflict between “practical” education for the masses and liberal education for the elite is a very long and old argument, one that has animated discussions about education throughout the twentieth century. Think of the conflict between W. E. B. Du Bois and Booker T. Washington over what kind of education was most likely to secure freedom for the average African American.

The more democratic American education has become, the more the question of what exactly we are preparing the average student for has been driven home. This is why both a liberal president like Barack Obama and conservative CEOs agree that what’s most important is education for a job. Those of us in the liberal arts, like John Fea and me, disagree. We show ourselves to be participants in a very old and long-standing debate in American education, one as yet unresolved, though proponents of a liberal education have been knocked to the mat pretty often lately.

What is the future of the book?–Anthony Grafton’s Keynote lecture at Messiah College

This past February we had the privilege of hearing from Dr. Anthony Grafton of Princeton University at our Humanities Symposium at Messiah College. Grafton is a formidable scholar and intellect, and a generous soul, a too-rare combination. The following video is his keynote lecture for the symposium. Grafton’s instincts are conservative; he readily admits his undying love for the codex and its manifold cultural institutions (libraries, used bookstores, even Barnes and Noble). At the same time, he harbors no illusions: the future of the book lies elsewhere. His lecture looks at what is threatened, what from the past should be valued and protected, but also what potential the future of the book might hold, and what values we should bring to bear to shape the book, which is, after all, a human institution.

Many thanks to Derick Esch, my work-study student, for his work in filming and producing this video. Other videos from the symposium can be found at the same Vimeo page.

Dr. Anthony Grafton: 2012 Humanities Symposium Keynote Address from Messiah Humanities on Vimeo.

Andrew Delbanco–What are the virtues of a college education?

I’ve begun reading Andrew Delbanco’s latest book, College: What It Was, Is, and Should Be, impressed by an essay in the Chronicle Review derived from the book. I’ve only made my way through the first chapter, but there are several things to note immediately.

First, Delbanco dances a little bit with the question of what college was. He shows how all of our current debates and lamentations about college life–students are too often debauched, professors teach too little and too poorly, and the college curriculum isn’t focused well enough on getting students jobs–are of very long standing, as common to our public discourse in 1776 as in 1976 and on to today. At the same time he shows how in some very real ways colleges have already abandoned, and are ever more quickly fleeing from, ideals that they once embodied, however imperfectly.

For Delbanco, the genius of college–as opposed to the professionally oriented university–is primarily to be found in an ethical imperative rather than an economic motive. Its main value is to establish a kind of personhood that is necessary for citizenship. Its qualities include the following:

1. A skeptical discontent with the present, informed by a sense of the past.

2. The ability to make connections among seemingly disparate phenomena.

3. Appreciation of the natural world, enhanced by knowledge of science and the arts.

4. A willingness to imagine experience from perspectives other than one’s own.

5. A sense of ethical responsibility.

These habits of thought and feeling are hard to attain and harder to sustain. They cannot be derived from exclusive study of the humanities, the natural sciences, or the social sciences, and they cannot be fully developed solely by academic study, no matter how well “distributed” or “rounded.” It is absurd to imagine them as commodities to be purchased by and delivered to student consumers. Ultimately they make themselves known not in grades or examinations but in the way we live our lives.

Delbanco, Andrew (2012-03-22). College: What it Was, Is, and Should Be (Kindle Locations 138-148). Princeton University Press. Kindle Edition.

For Delbanco, these qualities are essential to the functioning of a healthy democracy. He puts this most succinctly and eloquently, I thought, in his adaptation for the Chronicle Review, referencing Matthew Arnold and saying, “Knowledge of the past, in other words, helps citizens develop the capacity to think critically about the present–an indispensable attribute of a healthy democracy.” Amen and amen.

The problem, and Delbanco is of course aware of it, is that what college is, and is fast becoming exclusively, is a commodity purchased by and delivered to student customers. The economic metaphors for college life are triumphant, nowhere more clearly than in our discourse about whether a college education is “worth it.” The question of whether a college education is “worth it” is posed and answered these days in almost exclusively monetary terms. How much does it cost, and how much will you get for the investment?

Over and against this rather ruthless bottom line, Delbanco’s descriptions seem noble, but I’m a little afraid that they are so much tilting at windmills (I reserve judgment until I’ve actually finished the book). Only today I was discussing these matters with several of my faculty who are going to be attending the conference at Wake Forest, Rethinking Success: From the Liberal Arts to Careers in the 21st Century. Our career development director described to me parents who come to her asking for job statistics for their children as they choose between our small Christian college and other more well-known universities. The fundamental decisions are not related so much to the quality of education we could provide, nor to the kind of transformative potential their child might realize in an environment at Messiah College devoted to the development and integration of an intellectual, spiritual, and ethical life, but to whether in fact our graduates get jobs as readily and whether those jobs pay as much as their child’s other options. The difficulty for a college less well known than the Ivies Delbanco focuses on is to find a rhetoric and an educational program that holds up the flame of the education Delbanco imagines, while also speaking frankly and less idealistically to the ways in which that education can pay off in material ways.

It’s not that these are poor questions for parents to be asking; it’s just that these questions are unrelated to the kinds of things Delbanco says college is for and that many of us have believed it is for. Delbanco, of course, is trying to intervene in a useful way to alter the national discourse about what college ought to be about. Without a shift in that discourse, it’s impossible to imagine college being for what Delbanco says it should be for, except somewhere in the hidden and secret recesses of the academic heart.

What could our educational system learn from Wegmans?

Higher education in general has been almost inveterately averse to the idea that it should take a cue or two from the world of business, seeing our own ideals of educational transformation as going well beyond the utilitarian emphases of the for-profit world and its focus on the bottom line. I think this wariness is warranted, and my own sense is that education in the United States, on both the secondary and tertiary levels, is being severely damaged by an overemphasis on pre-professionalism at the expense of comprehensive intellectual and imaginative development.

Nevertheless, I think our wariness of big business suffers from the abstraction of the singular noun. “Business” is not one thing but many things, and businesses in the plural can be quite distinct and varied in their approaches to their own markets, blending bottom-line success with larger cultural and social goals within the culture of their companies and beyond. In particular, it seems to me, higher education could learn something from the way some businesses have placed a premium on higher levels of more informed service, and on the effort to provide higher-quality products that in some ways create the need and desire their markets require, rather than depending on discerning whatever it is we think people want or markets want or governments want.

I’ve been running across several articles recently along the lines of this one concerning Wegmans:

“Our employees are our number one asset, period,” said Kevin Stickles, the company’s vice-president for human resources. “The first question you ask is: ‘Is this the best thing for the employee?’ That’s a totally different model.”

Yet the company is profitable. Its prices are low. And it is lauded for exemplary customer service.

“When you think about employees first, the bottom line is better,” Stickles argued. “We want our employees to extend the brand to our customers.”

The Wegmans model is simple. A happy, knowledgeable and superbly trained employee creates a better experience for customers. Extraordinary service builds tremendous loyalty.

In a slightly different vein, I think Apple’s attention to providing the highest quality services to customers has paid off as well:

Part of the problem facing the non-Apples of the world is historical baggage. Big phone makers, such as HTC and Samsung, were never computing experts. As for PC makers, in the 1980s and 1990s, when Intel and Microsoft ruled, they had little choice but to focus on cutting costs to eke out a profit after paying the bill for those Pentium chips and Windows licenses.

Kerry Chrapliwy, a former executive in HP’s PC group, says that if a product did not turn into a blockbuster overnight at HP or Dell, it was often killed. “We were always fighting the philosophy at HP of, ‘How do I get this product to market at the lowest possible cost to the highest volume of people?’ There was not enough focus on delivering the right experience to people.”

Apple, by contrast, “had a worldview that said, ‘We’ll suck it up for three or four years and make it happen,’ ” said Roger McNamee, a co-founder of technology investment firm Elevation Partners.

I think both of these are examples of businesses that have succeeded in significant part because they worked counter to the tendency in “business” to cut first and ask questions later–whether the cuts came in employee benefits, in the number of employees, in the quality of life of employees (by, say, attempting to deliver more services at the same price by stacking more work into the lives of existing employees), or in the quality or quantity of services delivered. The brands of Wegmans and Apple alike have generated tremendous customer appreciation and loyalty even when and if people have to pay more for those services and those products.

Our educational system is a different animal, to be sure. Yet I still wonder whether the assault on secondary teachers and on faculty that is so pervasive in our political culture is really a prescription for educational success. What if, in fact, we created an environment in which higher levels of performance by teachers were the result of educational environments that treated investment in teachers as job one, and investment in unique and creative educational products (rather than the production of students able to perform on standardized tests) as job one-A?

And what would be the trade-off required of teachers or professors in creating that environment?  What if the price were something like loyalty to an institution and its mission and its immediate students rather than the more abstract loyalty that one has to one’s discipline or intellectual interests?

Teaching (to the) Tests

I was a little appalled to find out last week that my son is spending multiple days taking PSSA standardized tests in his school, a ritual that he shrugs his shoulders and rolls his eyes at, a teenager’s damning assessment that says wordlessly that this-is-just-one-more-stupid-thing-adults-make-me-do-for-my-own-good.

Teenage boys are notoriously bored with school and sneer at book-learning. Think of Huckleberry Finn as only one great American example. So I think my son’s attitude about school is not particularly new. But whereas Twain’s hero, in a great romantic tradition, saw books themselves as corrupt and the learning they provided as a corrupter of virtuous instincts, nothing could be further from the truth for my son. He reads ravenously, much more than I did at his age, which is saying quite a bit. And the level and range of his interests astonish me, from jazz poetry and history, to Hindu mysticism, to the thickest of contemporary literary novels like DeLillo’s Mao II and David Foster Wallace’s Infinite Jest. When he bought Wallace’s Infinite Jest he was nearly giddy with delight. “It’s like a giant two-fisted hamburger,” he said, laughing, holding it up to his open mouth. “Something you can eat up.” This same boy, who sees learning as a feast when left to the devices of books and their authors and his own imagination, seems to experience school as an emotional and intellectual Waste Land, a place he finds dull and dulling.

I can’t blame his teachers. They are talented and caring. But I can’t help wondering, with a chorus of others, whether something has gone badly wrong with our approaches to schooling, approaches that trap teachers as surely as they enervate students, or at least many male students.

I tend to agree with this piece from Diane Ravitch, whom I find to be one of our most interesting commentators on education, if only because she has become so unpredictable and yet so incisive in her analyses. From the NYRB, “Flunking Arne Duncan”:

Will Duncan’s policies improve public education?

No. Under pressure to teach to tests—which assess only English and math skills—many districts are reducing the time available for teaching the arts, history, civics, foreign languages, physical education, and other non-tested subjects. (Other districts are spending scarce dollars to create new tests for the arts, physical education, and those other subjects so they can evaluate all their teachers, not just those who teach reading and mathematics.) Reducing the time available for the arts, history, and other subjects will not improve education. Putting more time and money into testing reduces the time and money available for instruction. None of this promotes good education. None of this supports love of learning or good character or any other ideals for education. Such a mechanistic, anti-intellectual approach would not be tolerated for President Obama’s children, who attend an excellent private school. It should not be tolerated for the nation’s children, 90 percent of whom attend public schools. Grade: F.

Of course, I care about this for my son, and I am thankful to God that he is getting through his high school education before the ax has cleaved away programs in the arts and diminished offerings in other areas. But I also care about this as an educator and administrator in a college. In the first place, it is impossible to imagine that we will have students who are effective writers if they have no historical sense, no understanding of the arts, less exposure to the civic virtues, and fewer opportunities to read and think in languages other than their own. It is impossible to imagine that such students–even scoring better on a writing or a math exam–are better prepared for college than students from a generation ago, before we started worrying about their being left behind. It is impossible to imagine that we are getting better students simply because they can do a math problem successfully when they have never had to struggle to draw a picture or understand a symphony or reach into the past to engage the worlds of the ancients before them.
If this is the education we are giving them, we are producing neither saints nor scholars. We may be producing a generation of students (perhaps boys especially) who are bored mindless and who will look for their learning in more likely places, like between the covers of a book for which no one will think to devise a test.

Why digital humanities is already a basic skill, not just a specialist niche–Matthew Kirschenbaum

Sometimes I think we humanists “of a certain age,” to put the issue politely, imagine digital humanities as an optional activity, a niche that will be filled by an interesting cohort of young professors who take their place in the academy in yet another sub-discipline, something that research universities hire for and small colleges struggle hopelessly to replicate. It may indeed be that small colleges will struggle to integrate digital humanities into their own infrastructures, but I think the general picture of digital humanities as an optional sub-discipline will simply be unsustainable. The argument smells a little of the idea that e-books are a nice sub-genre of texts, but not something the average humanist has to worry that much about. I think, to the contrary, that digital humanities, and the multitude of techniques it entails, will become deeply integrated in a fundamental way with the basic methodologies of how we go about doing business, akin to knowing how to do close reading or how to maneuver our way through libraries.

Although pointing this out is not his main point, Matthew Kirschenbaum–already a digital humanities patron saint in many respects–has an essay in The Chronicle that points to this fact. Kirschenbaum is currently interested in how we preserve digital material, and the problems are just as complex as, if not more complex than, the general question of how and when to save print materials. More so to the degree that we cannot be sure the current forms in which we place our digital intelligence will actually be usable five years from now. The consequences for humanities research and writing are profound and must be considered. From Kirschenbaum:

Digital preservation is the sort of problem we like to assume others are thinking about. Surely someone, somewhere, is on the job. And, in lots of ways, that is true. Dire warnings of an approaching “digital dark ages” appear periodically in the media: Comparisons are often made to the early years of cinema—roughly half of the films made before 1950 have been lost because of neglect. 

But the fact is that enormous resources—government, industry, and academic—are being marshaled to attack the problem. In the United States, for example, the Library of Congress has been proactive through its National Digital Information Infrastructure and Preservation Program. Archivists of all stripes now routinely receive training in not only appraisal and conservation of digital materials but also metadata (documentation and description) and even digital forensics, through which we can stabilize and authenticate electronic records. (I now help teach such a course at the University of Virginia’s renowned Rare Book School.) Because of the skills of digital archivists, you can read former presidents’ e-mail messages and examine at Emory University Libraries a virtual recreation of Salman Rushdie’s first computer. Jason Scott’s Archive Team, meanwhile, working without institutional support, leaps into action to download and redistribute imperiled Web content.

What this suggests is that Rushdie’s biographers will need to know not so much how to sift through piles of letters as how to recreate digital archives that authors themselves may not be interested in preserving. Biographers of the present, and surely of the future, will have to be digital technicians as well as close readers of the digital archives they are able to recover.

Kirschenbaum goes on to suggest that most of us must do this work on our own and for ourselves, in preserving our own archives.

But despite those heroic efforts, most individuals must still be their own digital caretakers. You and I must take responsibility for our own personal digital legacy. There are no drive-through windows (like the old photo kiosks) where you can drop off your old floppies and pick up fresh files a day or two later. What commercial services are available tend to assume data are being recovered from more recent technology (like hard drives), and these also can be prohibitively expensive for average consumers. (Organizations like the Library of Congress occasionally sponsor public-information sessions and workshops to teach people how to retrieve data from old machines, but those are obviously catch as catch can.)

Research shows that many of us just put our old disks, CD’s, and whatnot into shoeboxes and hope that if we need them again, we’ll figure out how to retrieve the data they contain when the time comes. (In fact, researchers such as Cathy Marshall, at Microsoft Research, have found that some people are not averse to data loss—that the mishaps of digital life provide arbitrary and not entirely unwelcome opportunities for starting over with clean slates.)
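Kirschenbaum’s advice to be our own digital caretakers can be made concrete with even a very small tool. Here is a minimal sketch, in Python and purely as illustration (the archive folder and manifest file names are my own assumptions, not anything Kirschenbaum prescribes), of a checksum manifest that lets you notice when files in a personal archive have silently changed or gone missing:

```python
# A small, hypothetical example of "being your own digital caretaker":
# walk a personal archive folder and record a SHA-256 checksum for every
# file, so that later runs can report anything that has changed or vanished.
# The folder and manifest names below are assumptions for illustration.
import hashlib
import json
from pathlib import Path

ARCHIVE = Path("my_personal_archive")      # hypothetical folder of old files
MANIFEST = Path("archive_manifest.json")   # where the checksums are stored


def checksum(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


def build_manifest(root: Path) -> dict:
    """Map each file's path (relative to the archive root) to its checksum."""
    return {
        str(p.relative_to(root)): checksum(p)
        for p in sorted(root.rglob("*"))
        if p.is_file()
    }


if __name__ == "__main__":
    current = build_manifest(ARCHIVE)
    if MANIFEST.exists():
        previous = json.loads(MANIFEST.read_text())
        missing = sorted(set(previous) - set(current))
        changed = sorted(k for k in previous if k in current and previous[k] != current[k])
        print(f"missing files: {missing or 'none'}")
        print(f"changed files: {changed or 'none'}")
    MANIFEST.write_text(json.dumps(current, indent=2))
    print(f"recorded checksums for {len(current)} files")
```

Nothing so modest solves the deeper problems of obsolete formats and unreadable media, but it is the kind of habit a personal digital legacy seems to require.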

This last point, that some people are not averse to data loss, is of course an interesting problem. Authors have often been notoriously averse to having their mail probed and prodded for signs of conflicts and confessions, preferring that the “work” stand on its own. Stories of authors burning their letters and manuscripts are legion, nightmarishly so for the literary scholar. Such literary self-immolations are both harder and easier in a digital world. My drafts and emails can disappear at the touch of a button. On the other hand, I am told that a hard drive is never actually erased for those who are really in the know. Then again, the task of the scholar who sees a writer’s computer as his archive is in some ways vastly more difficult than that of the scholar whose writer was an assiduous collector of his typewritten drafts. Does every deletion and spell-correction count as a revision? What should we trace as an important change, and what should we disregard as detritus? These are, of course, the standard archival questions, but it seems to me they are exponentially more complicated in a digital archive, where a text may change a multitude of times in a single sitting, something not so possible in a typewritten world.
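To make the question concrete: deciding what counts as a revision quickly becomes a matter of comparing versions and choosing thresholds. The sketch below, again in Python and again only illustrative (the draft file names and the similarity cutoff are my assumptions, not a method Kirschenbaum proposes), separates near-identical line changes, the spell corrections and other detritus, from changes large enough to merit a scholar’s closer reading:

```python
# A minimal sketch of the kind of triage a "digital technician" biographer
# might attempt: compare two drafts of a text and separate substantive
# revisions from trivial ones (whitespace shifts, single-character fixes).
# The file names and the "triviality" threshold are hypothetical assumptions.
import difflib
from pathlib import Path


def classify_revisions(old_text: str, new_text: str, trivial_ratio: float = 0.95):
    """Label each changed span of lines as trivial or substantive.

    A replaced span whose similarity to its predecessor exceeds
    `trivial_ratio` is treated as detritus (e.g., a spell correction);
    anything else is flagged for the scholar's attention.
    """
    old_lines = old_text.splitlines()
    new_lines = new_text.splitlines()
    matcher = difflib.SequenceMatcher(None, old_lines, new_lines)

    trivial, substantive = [], []
    for tag, i1, i2, j1, j2 in matcher.get_opcodes():
        if tag == "equal":
            continue
        before = "\n".join(old_lines[i1:i2])
        after = "\n".join(new_lines[j1:j2])
        # How close are the old and new spans, character by character?
        similarity = difflib.SequenceMatcher(None, before, after).ratio()
        record = (tag, before, after, round(similarity, 3))
        if tag == "replace" and similarity >= trivial_ratio:
            trivial.append(record)
        else:
            substantive.append(record)
    return trivial, substantive


if __name__ == "__main__":
    # Hypothetical draft files recovered from an author's machine.
    old = Path("draft_v1.txt").read_text(encoding="utf-8")
    new = Path("draft_v2.txt").read_text(encoding="utf-8")
    trivial, substantive = classify_revisions(old, new)
    print(f"{len(trivial)} changes look like detritus (spell fixes, etc.)")
    print(f"{len(substantive)} changes look worth a closer reading:")
    for tag, before, after, sim in substantive:
        print(f"- {tag} (similarity {sim}):\n  was: {before!r}\n  now: {after!r}")
```

The point of the sketch is not the particular threshold but the fact that the archival judgment about what counts as detritus has to be encoded somewhere, and someone trained in the humanities ought to have a hand in encoding it.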
Well, these are the kinds of things Kirschenbaum takes up.  And having the tools to apply to such questions will be the task for every humanist in the future, not a narrow coterie.

Michael Sandel–The commodification of everything

I’ve enjoyed listening occasionally to Michael Sandel’s lectures in philosophy via iTunes. He has an interesting new article in the April Atlantic focusing on the ways in which nearly everything in American life, at least, has been reduced to a market value. Despite the admonition that money can’t buy me love, we are pretty sure that it can buy everything else, and that we are willing to sell just about anything, including body parts and personal dignity, for whatever the market will bear.

Sandel, somewhat peculiarly to my mind, traces this to a post-Cold War phenomenon.

WE LIVE IN A TIME when almost everything can be bought and sold. Over the past three decades, markets—and market values—have come to govern our lives as never before. We did not arrive at this condition through any deliberate choice. It is almost as if it came upon us.

As the Cold War ended, markets and market thinking enjoyed unrivaled prestige, and understandably so. No other mechanism for organizing the production and distribution of goods had proved as successful at generating affluence and prosperity. And yet even as growing numbers of countries around the world embraced market mechanisms in the operation of their economies, something else was happening. Market values were coming to play a greater and greater role in social life. Economics was becoming an imperial domain. Today, the logic of buying and selling no longer applies to material goods alone. It increasingly governs the whole of life.

The last-gasp Marxists I studied with at Duke had a word for what Sandel sees and decries, commodification, and it didn’t just mysteriously come upon us in the 1980s. Commodification, the rendering of every bit of life as a commodity that can be bought and sold, is the central thrust of capitalist economies in the twentieth century, perhaps the central feature of capitalism per se. The essential act of commodification is at the center of Marx’s understanding that the worker in some very real sense sells him or herself through selling his or her labor power. Thus, human beings were commodified well before people became willing to sell tattoos on their foreheads to advertise products. So Sandel’s perplexity and astonishment at this state of affairs in our contemporary economy strikes me as the perplexity of someone who has only recently awakened from a dream.

On the other hand, I do think Sandel is on to something. It is the case that despite this thrust of capitalist economies (and, to be frank, I’m not sure that Marxist economies were all that different), there have been sectors of culture, and their accompanying institutions, that resisted their own commodification. The edifice of modernism in the arts and literature is built on the notion that the arts could be a transcendent world apart from the degradations of the social world, including perhaps especially its markets. The difficulty and density of modern art and literature were built in part out of a desire that they not be marketable in any typical sense. Modern art was sometimes ugly precisely to draw attention to the difficulty and difference of its aesthetic and intellectual properties. It was meant not to sell, or at least not to sell too well. Remember that the next time a Picasso sells for millions. Similarly, the church, and in a different way educational institutions, retained a relative independence from the marketplace, or at least resisted the notion that they could be reduced to market forces. Whether claiming to provide access to the sacred or to enduring human values, religious and educational institutions served–even when they were corrupt or banal–to remind the culture that there was a world apart, something that called us to be better than ourselves, or at least reminded us that our present values were not all the values that there were.

Sandel rightly notes that that residue has all but disappeared, and the result has been a hollowing out of our public life and a debasement of our humanity.

In hopes of avoiding sectarian strife, we often insist that citizens leave their moral and spiritual convictions behind when they enter the public square. But the reluctance to admit arguments about the good life into politics has had an unanticipated consequence. It has helped prepare the way for market triumphalism, and for the continuing hold of market reasoning.

In its own way, market reasoning also empties public life of moral argument. Part of the appeal of markets is that they don’t pass judgment on the preferences they satisfy. They don’t ask whether some ways of valuing goods are higher, or worthier, than others. If someone is willing to pay for sex, or a kidney, and a consenting adult is willing to sell, the only question the economist asks is “How much?” Markets don’t wag fingers. They don’t discriminate between worthy preferences and unworthy ones. Each party to a deal decides for him- or herself what value to place on the things being exchanged.

This nonjudgmental stance toward values lies at the heart of market reasoning, and explains much of its appeal. But our reluctance to engage in moral and spiritual argument, together with our embrace of markets, has exacted a heavy price: it has drained public discourse of moral and civic energy, and contributed to the technocratic, managerial politics afflicting many societies today.

Sandel wonders about a way to connect to some kind of moral discourse that can inform public life, something that will reach beyond the reach of markets, but he clearly despairs that such a connection can be found. I think there’s good reason. Rapidly our educational institutions have become factories that shamelessly advertise themselves as places where people can make themselves into better commodities than they were before, and which build programs designed to sell themselves to the highest number of student-customers possible. Our religious institutions are floundering. Only today I read in Time magazine that the rise of so-called “nones”–people who claim to have no religious affiliation–is one of the most notable developments in our spiritual culture. Such people often seek to be spiritual but not religious on the grounds that religions are dogmatic and inflexible. I have come to wonder whether that dogmatism and inflexibility point to the hard-won truth that it is not good enough to just go along to get along.

One wonders, in fact, whether a spirituality based on getting along really provides a hard point of resistance to the tendency to see everything in life–whether my beliefs or my ethics–as an investment that must pay off if it is to be worth keeping. I wonder, too, whether our educational systems and institutions are up to the task of providing an education that isn’t just another instance of the market. As for art, writing, and literature, well, who knows? Modernism was not always commodified, though it very quickly became so. I do find it intriguing that this moment of hyper-commodification is also a time when there has been an explosion of free or relatively free writing and music on the internet. There is a small return to the notion of the artist as a community voice, with musicians and poets producing work for free on the internet and making their living through performance or through other jobs–escaping, or at least partially escaping, the notion that we produce work primarily to sell it. This is a small resistance, but worth thinking about.

I wonder if there are other ways our culture is equipped to resist, in a larger collective fashion, the turning of our lives into the image of a can of soup.

The purpose of intellectual work in a democracy–Tony Judt

I discovered Tony Judt very late, only really attending to his astonishing mind a year or so before he was stricken with Lou Gehrig’s disease. He is a man much further to the left than am I, but I was convicted by his conviction and convinced by his belief that the intellectual life should serve a purpose larger than itself–what might once have been quaintly called something like the realization of the Good Republic. The most recent NYRB has an excerpt from his new book, Thinking the Twentieth Century. What I admire in the excerpt is the recognition that “democracy” is not a word that signifies an inherent good. Like all things human, “democracy” may be used for good or ill, may work to enhance human decency and community, or may work to corrupt it. Most recently, in my own view, we have seen the ways in which our “democracy,” in both its electoral and legislative practices, debases rather than enhances our sense of our humanity. In a succinct summary from Judt: “Democracy has been the best short-term defense against undemocratic alternatives, but it is not a defense against its own genetic shortcomings. The Greeks knew that democracy is not likely to fall to the charms of totalitarianism, authoritarianism, or oligarchy; it’s much more likely to fall to a corrupted version of itself.”

Given this fact, Judt is suspicious of intellectual work that aligns itself in favor of grand abstractions like “democracy” or “freedom”, favoring instead a concrete and particularistic practice of nurturing and protecting the institutions and practices that make democracy possible:

I see the present century as one of growing insecurity brought about partly by excessive economic freedom, using the word in a very specific sense, and growing insecurity also brought about by climate change and unpredictable states. We are likely to find ourselves as intellectuals or political philosophers facing a situation in which our chief task is not to imagine better worlds but rather to think how to prevent worse ones. And that’s a slightly different sort of situation, where the kind of intellectual who draws big pictures of idealized, improvable situations may not be the person who is most worth listening to.

We may find ourselves asking how we can defend established legal or constitutional or human rights, norms, freedoms, institutions, and so on. We will not be asking whether the Iraq war was a good or not good way to bring democracy, freedom, liberty, the market, etc. to the Middle East; but rather, was it a prudent undertaking even if it achieved its objectives? Recall the opportunity costs: the lost potential to achieve other things with limited resources.

All this is hard for intellectuals, most of whom imagine themselves defending and advancing large abstractions. But I think the way to defend and advance large abstractions in the generations to come will be to defend and protect institutions and laws and rules and practices that incarnate our best attempt at those large abstractions. And intellectuals who care about these will be the people who matter most.

This seems to me to be true in cultural life as a whole, not only in our politics in a democratic society.  What are those institutions and practices that are most worth defending and nurturing that speak to our best efforts to incarnate large abstractions like “goodness”, “justice”, and “beauty”?