Category Archives: literary criticism

Teaching Humanities to digital natives who may know more than we do.

I remember a story about the advent of the New Criticism in which one of those famous critic/scholar/teachers–I forget which one, but I want to say Cleanth Brooks or perhaps John Crowe Ransom–admitted to rushing home at night to read feverishly ahead in the texts he was teaching so that he was ready to go the following day.  On the one hand, this is a familiar story to any new (or not so new) professor who’s trying to stay one step ahead of the onrushing train.  On the other hand, it’s also the case that part of this was demanded by the fact that Brooks and others were trying to do something totally new for a literature classroom: the close, perspicacious reading whose minutest detail nevertheless resulted miraculously in a coherent organic whole.  That kind of textual analysis was the meat of my own education, and to be honest, it hasn’t really changed all that much despite all the new (and now not so new) theories that came in with the advent of deconstruction and its descendants.  We still, more or less, on the undergraduate level do the close reading, even if we now look for the way things fall apart or for hints and allegations of this or that cultural depravity.

But I am intrigued by just how hard Brooks/Ransom (or whoever it was) had to work to stay ahead of his students, in part because he really didn’t know entirely what he was doing.  He wasn’t building on the secure corpus of knowledge that previous literary scholastics had received and passed on.  Despite the mythic and quasi-priestly status that some New Critics projected–turning the critic into an all-knowing seer, and thus setting the stage for the later assertions that critics were really the equals or superiors of the novelists and poets they read and critiqued, knowing what those poor souls could only allude to and evoke–there was a very real sense in which the New Criticism was much more democratic than the literary scholasticism that preceded it.  (I am sure Frank Lentricchia is exploding about now, or would be if he ever actually bothered to read me.)  While it may not have seemed more democratic–the New Critics cast a mysterious aura about all they did, developing a new and arcane ritual language to accompany it–it was more democratic in the sense that the method was potentially available to everyone.  Not everyone could have the time to read all the histories and all the letters, delve into the archives, and read the vast quantities of literature required for the literary scholasticism that characterized old-style literary history.  But everyone could read the poem or the novel set in front of them.  And potentially a smart undergraduate could see a good deal that the prof had missed, or point out the problems in particular interpretations.  When the evidence of the poem was simply the poem itself, all the cards were on the table.
No longer could a professor say to the quivering undergraduate, “Well, yes, but if you had bothered to read x, y, and z you would understand why your assertions about this poem’s place in literary history are totally asinine.”  The average undergraduate is never in a place to dispute with a professor on the place of this or that figure in literary history, but they could, in fact, argue that a professor had gotten a poem wrong, that an interpretation didn’t hold up to a closer scrutiny of the facts.  The feverish late-night work of my Brooks/Ransom avatar, like the feverish late-night work of many a new and not-so-new professor, is sometimes cast as a noble inclination toward truth, or knowledge, or the discipline.  It is, in truth, very often the quest to avoid embarrassment at the hands of our smarter undergraduates, the quest for just enough knowledge or just enough preparation to justify our authority in the eyes of our skeptical younger charges.

I was thinking about this again while attending the Re:Humanities undergraduate DH conference at Swarthmore/Bryn Mawr/Haverford Thursday and Friday. Clearly, one of the biggest challenges to bringing DH fully onboard in Humanities disciplines is the simple fact that undergraduates often know as much, and often a great deal more, about the tools we are trying to employ.  On the one hand, this is a tremendous challenge to mid-career academics who understandably have little interest in abandoning the approaches to scholarship, teaching, and learning that they have developed, that they understand, and that they continue to use effectively given the assumptions and possibilities of those tools as they are.  It was ever thus, and to some degree colleges remain always one step behind the students they are attempting to educate, figuring out on the fly how our own education and experience can possibly apply in this day and hour.

However, I also wonder whether the democratization of the technological environment in the classroom isn’t a newly permanent state of affairs.  The pace of technological change–at least for the present, and why would we assume it should stop in the near or immediate future–means that we are entering a period in the history of education in which educators will, in some sense, never know any more about the possibilities of the tools they are using than do the students they are teaching.  Indeed, given the nature of the tools, it is quite likely that collectively the students know a great deal more about how to use the tools available to them, and that they are likely to be attuned more quickly to the latest technological developments.  What they don’t know–and what we as educators don’t know either–is how best to deploy those resources to do different kinds of humanistic work.  The teleology of learning used to be fairly, if undemocratically, straightforward.  The basic educational goal was to learn how to do what your teacher could do–with reading, with texts, with research.  In our current age that teleology is completely, perhaps appropriately, disrupted.  But that doesn’t alleviate the sense that we don’t entirely know what we should be teaching our students to do when we don’t entirely know what to do or how to do it ourselves.

Mortimer Adler famously wrote a book called “How to Read a Book,” and though people bemoaned Adler as an elitist and a snob, the basic idea was still important.  Some people knew how to read books and others did not.  I still think it’s the case that we take a tremendous amount for granted if we assume an undergraduate actually knows how to read an old-fashioned codex well.  They don’t.  On the other hand, we have no equivalent book that tells us “how to read…,” in part because we don’t know how to fill in the blank, though perhaps “digital artifacts” comes as close as anything.  We’re not even sure what tools we should be using to do whatever it is we are doing as humanists in this day and age.  No wonder most professors choose to continue to use books, even though I think the day is fast approaching when students won’t tolerate that, any more than an ancient would have tolerated the continued use of scrolls when a perfectly good codex was available at hand.  What the current technological changes are doing is radically democratizing the classroom on the level of the tool.

I did have a couple of signs of hope this past week at the Re:Humanities conference at Swarthmore. In the first place, if the educational system in the humanities is becoming radically democratized at the level of structure, I think it is safe to say there are many, many, many people using that democracy well.  The students at the conference were doing stunningly good and creative work that was clearly contributing to our knowledge of the world around us–sometimes pursuing these projects independently or, most often, in partnership and mentoring relationships with committed faculty.  (It is, of course, also the case that people can use democracy poorly, as I’ve suggested elsewhere; this would be true in both the classroom and the body politic, so we should ask whether and where the democratization of our educational system is being used well, rather than assuming that because we use the word democracy we have named a substantive good.)

Second, one of the chief insights I drew from the different speakers was that if we put the tools on the table as possibilities, students will surprise and amaze us with what they manage to come up with.  What if we found ways to encourage students to get beyond the research paper and asked that they do serious creative and critical work with the tools they have every day at hand on their iPhones, laptops, and the rest?  What if we encouraged them to find the best ways to answer the kinds of questions humanists have always asked, and to identify the new questions and potential answers that new (and now not-so-new) technologies make possible?  We will have to do this regardless, I think.  The age demands it.  And I suspect that there will be many, many more frantic late nights for faculty ahead.  But I think those frantic late nights will be built less and less on the belief that we have to get on top of “the material” and “stay ahead” of our students.  When they can bring in material we’ve never heard of with the touch of a finger on their iPhones, we have no hope of being on top of the material or staying ahead in any meaningful sense.  Perhaps what we can do is inspire them to charge ahead, guide them to the edges of the landscape that we already know, and partner with them in the exploration of the landscapes that we haven’t yet discovered.

Why digital humanities is already a basic skill, not just a specialist niche–Matthew Kirschenbaum

Sometimes I think we humanists “of a certain age,” to put the issue politely, imagine digital humanities as an optional activity that will be filled by an interesting niche of young professors who take their place in the academy as yet another sub-discipline, something that research universities hire for and small colleges struggle hopelessly to replicate.  It may indeed be that small colleges will struggle to integrate digital humanities into their own infrastructures, but I think the general picture of Digital Humanities as an optional sub-discipline will simply be unsustainable.  The argument smells a little of the idea that e-books are a nice sub-genre of texts, but not something the average humanist has to worry that much about.  I think, to the contrary, that digital humanities, and the multitude of techniques it entails, will become integrated in a fundamental way with the basic methodologies of how we go about doing business, akin to knowing how to do close reading or how to maneuver our way through libraries.

Although pointing this out is not his main point, Matthew Kirschenbaum–already a Digital Humanities patron saint in many respects–has an essay in The Chronicle that points to this fact.  Kirschenbaum is currently interested in how we preserve digital material, and the problems are just as complex, if not more so, than the general question of how and when to save print materials.  More so to the degree that we cannot be sure that the current forms in which we place our digital intelligence will actually be usable five years from now.  The consequences for humanities research and writing are profound and must be considered. From Kirschenbaum:

Digital preservation is the sort of problem we like to assume others are thinking about. Surely someone, somewhere, is on the job. And, in lots of ways, that is true. Dire warnings of an approaching “digital dark ages” appear periodically in the media: Comparisons are often made to the early years of cinema—roughly half of the films made before 1950 have been lost because of neglect. 

But the fact is that enormous resources—government, industry, and academic—are being marshaled to attack the problem. In the United States, for example, the Library of Congress has been proactive through its National Digital Information Infrastructure and Preservation Program. Archivists of all stripes now routinely receive training in not only appraisal and conservation of digital materials but also metadata (documentation and description) and even digital forensics, through which we can stabilize and authenticate electronic records. (I now help teach such a course at the University of Virginia’s renowned Rare Book School.) Because of the skills of digital archivists, you can read former presidents’ e-mail messages and examine at Emory University Libraries a virtual recreation of Salman Rushdie’s first computer. Jason Scott’s Archive Team, meanwhile, working without institutional support, leaps into action to download and redistribute imperiled Web content.

What this suggests is that Rushdie’s biographers will need to know not so much how to sift through piles of letters as how to recreate digital archives that authors themselves may not be interested in preserving.  Biographers of the present, and surely of the future, will have to be digital technicians as well as close readers of the digital archives they are able to recover.

Kirschenbaum goes on to suggest that most of us must do this work on our own, and must do this work for ourselves, in preserving our own archives.

But despite those heroic efforts, most individuals must still be their own digital caretakers. You and I must take responsibility for our own personal digital legacy. There are no drive-through windows (like the old photo kiosks) where you can drop off your old floppies and pick up fresh files a day or two later. What commercial services are available tend to assume data are being recovered from more recent technology (like hard drives), and these also can be prohibitively expensive for average consumers. (Organizations like the Library of Congress occasionally sponsor public-information sessions and workshops to teach people how to retrieve data from old machines, but those are obviously catch as catch can.)

Research shows that many of us just put our old disks, CD’s, and whatnot into shoeboxes and hope that if we need them again, we’ll figure out how to retrieve the data they contain when the time comes. (In fact, researchers such as Cathy Marshall, at Microsoft Research, have found that some people are not averse to data loss—that the mishaps of digital life provide arbitrary and not entirely unwelcome opportunities for starting over with clean slates.)

This last, of course, is an interesting problem.  Authors have often been notoriously averse to having their mail probed and prodded for signs of conflicts and confessions, preferring that the “work” stand on its own. Stories of authors burning their letters and manuscripts are legion, nightmarishly so for the literary scholar.  Such literary self-immolations are both harder and easier in a digital world.  My drafts and emails can disappear at the touch of a button.  On the other hand, I am told that for those who are really in the know, a hard drive is never actually erased.  Then again, the task of the scholar who sees a writer’s computer as his archive is in some ways vastly more difficult than that of the scholar whose writer was an assiduous collector of his type-written drafts.  Does every deletion and spell-correction count as a revision?  What should we trace as an important change, and what should we disregard as detritus?  These are, of course, the standard archival questions, but it seems to me they are exponentially more complicated in a digital archive, where a text may change a multitude of times in a single sitting, something not so possible in a typewritten world.
Well, these are the kinds of things Kirschenbaum takes up.  And having the tools to apply to such questions will be the task for every humanist in the future, not a narrow coterie.

Living in an e-plus world: Students now prefer digital texts when given a choice

A recent blog post by Nick DeSantis in the Chronicle points to a survey by the Pearson Foundation suggesting that tablet ownership is on the rise.  That’s not surprising, but more significant is the fact that among tablet users there’s a clear preference for digital texts over the traditional paper codex, something we haven’t seen before even among college students of this wired generation:

One-fourth of the college students surveyed said they owned a tablet, compared with just 7 percent last year. Sixty-three percent of college students believe tablets will replace textbooks in the next five years—a 15 percent increase over last year’s survey. More than a third said they intended to buy a tablet sometime in the next six months.

This year’s poll also found that the respondents preferred digital books over printed ones. It’s a reversal of last year’s results and goes against findings of other recent studies, which concluded that students tend to choose printed textbooks. The new survey found that nearly six in 10 students preferred digital books when reading for class, compared with one-third who said they preferred printed textbooks.

I find this unsurprising, as it matches up pretty well with my own experience.  Five years ago I could never have imagined doing any significant reading on a tablet.  Now I do all my reading of scholarly journals and long-form journalism–e.g., The Atlantic, The New York Review of Books, The Chronicle Review–on my iPad.  And while I still tend to prefer the codex for reading novels and other book-length works, the truth is that preference is slowly eroding as well.  As I become more familiar with the forms of e-reading, the notions of its inherent inferiority, like the notions of any unreflective prejudice, gradually fade in the face of familiarity.

And yet I greet the news of this survey with a certain level of panic: not panic that it should happen at all, but panic that the pace of change is quickening and we are hardly prepared–and by “we” I mean we in the humanities, here in small colleges and elsewhere.  I’ve blogged on more than one occasion about my doubts about e-books and yet my sense of their inevitable ascendancy.  For instance, here on the question of whether e-books are being foisted on students by a cabal of publishers and administrators like myself out to save a buck (or make a buck, as the case may be), and here on the nostalgic but still real feeling I have that print codex forms of books possess an irreplaceable individuality and physicality that the mere presence of text in a myriad of e-forms does not suffice to replace.

But though I’ve felt the ascendancy of e-books was inevitable, I think I imagined a 15- or 20-year span in which print and e-books would mostly live side by side.  Our own librarians here at Messiah College talk about a “print-plus” model for libraries, as if e-books will remain primarily an add-on for some time to come.  I wonder.  Just as computing power increases exponentially, it seems to me that the half-life of print books is rapidly diminishing.  I now wonder whether we will have even five years before students expect their books to be digital–all their books, not just the hefty tomes for CHEM 101 that can be more nicely illustrated with iBooks Author, but their books for English and History classes as well.  This is an “e-plus” world, where print will increasingly be not the norm but the supplement, filling whatever gaps e-books have not yet bridged, whatever textual landscapes have not yet been digitized.

Despite warnings, we aren’t yet ready for an e-plus world.  Not only do we not know how to operate the apps that make these books available, we don’t even know how to critically study books in tablet form.  Yet learning what forms of critical engagement are possible and necessary will be required.  I suspect, frankly, that our current methods developed out of what was made possible by the forms texts took, rather than the forms following our methodological urgencies.  This means that the look of critical study in the classroom will change radically in the next ten years.  What will it look like?

Tonguecat by Peter Verhelst


My rating: 3 of 5 stars

I admire this book more than I like it. That is, I understand that Verhelst is pulling off a kind of writerly virtuosity, and I applaud appropriately. But I feel about it the way I feel about a good bit of contemporary music that appeals to the musical theorist rather than the musical ear. It’s possible to feel intellectually compelled but viscerally unmoved; that’s kind of where I end up with Verhelst and his cast of characters. The book recounts fantastical and horrific events in the aftermath of the apocalyptic end of an empire, but the book’s surfaces are icy, a little like the frigid ice age that descends on the countryside as a major event of the novel. The characters are frozen and statuesque, a little like the frozen corpses that litter the landscape. They remain untouchable, and so untouched and untouching. As a result, Verhelst’s story works like an allegory, but one from which I remain mostly removed and uncaring. I’m not sorry I read the book, but why go back? Given that the book is at least in part about terror, terrorism, empire, and totalitarianism, I’m not sure this is a great way to feel.

View all my reviews

In Praise of Reviews, Reviewing, and Reviewers

I think if I were born again by the flesh and not the spirit, I might choose to become a book reviewer in my second life.  Perhaps this is “true confessions,” since academics and novelists alike share their disdain for the review as a subordinate piece of work, and so for the reviewer as a lowly creature to be scorned.  I, however, love the review as a form and see it as a way of exercising creativity, rhetorical facility, and critical consciousness.  In other words, with reviews I feel like I bring together all the different parts of myself: the creativity and rhetorical facility I developed through an MFA, and the critical consciousness of my scholarly self developed in graduate school at Duke.  I developed my course on book-reviewing here at Messiah College precisely because I think it is one of the most challenging forms to do well: one must write engagingly and persuasively for a generally educated audience while also writing with enough informed intelligence to satisfy an academic one.

Like Jeffrey Wasserstrom in the Chronicle Review, I also love reading book reviews, and I often spend vacation days not catching up on the latest novel or theoretical tome but on all the book reviews I’ve seen and collected on Instapaper.  Wasserstrom’s piece goes against the grain of a lot of our thinking about book reviews, even mine, and it strikes me that he’s absolutely right about a lot of what he says.  First, I often tell students that one of the primary purposes of book reviewers is to help sift the wheat from the chaff and tell other readers what out there is worth reading.  This is true, but only partially so.

Another way my thinking diverges from Lutz’s relates to his emphasis on positive reviews’ influencing sales. Of course they can, especially if someone as influential as, say, Michiko Kakutani (whose New York Times reviews I often enjoy) or Margaret Atwood (whose New York Review of Books essays I never skip) is the one singing a book’s praises. When I write reviews, though, I often assume that most people reading me will not even consider buying the book I’m discussing, even if I enthuse. And as a reader, I gravitate toward reviews of books I don’t expect to buy, no matter how warmly they are praised.

Consider the most recent batch of TLS issues. As usual, I skipped the reviews of mysteries, even though these are precisely the works of fiction I tend to buy. And I read reviews of nonfiction books that I wasn’t contemplating purchasing. For instance, I relished a long essay by Toby Lichtig (whose TLS contributions I’d enjoyed in the past) that dealt with new books on vampires. Some people might have read the essay to help them decide which Dracula-related book to buy. Not me. I read it because I was curious to know what’s been written lately about vampires—but not curious enough to tackle any book on the topic.

What’s true regarding vampires is—I should perhaps be ashamed to say—true of some big fields of inquiry. Ancient Greece and Rome, for example. I like to know what’s being written about them but rarely read books about them. Instead, I just read Mary Beard’s lively TLS reviews of publications in her field.

Reviews do influence my book buying—just in a roundabout way. I’m sometimes inspired to buy books by authors whose reviews impress me. I don’t think Lichtig has a book out yet, but when he does, I’ll buy it. The last book on ancient Greece I purchased wasn’t one Mary Beard reviewed but one she wrote.

I can only say yes to this.  It’s very clear that I don’t just read book reviews in order to make decisions as a consumer.  I read book reviews because I like them for themselves, if they are well done, but also just to keep a finger on the pulse of what’s going on.  In other words, there’s a way in which I depend on good reviewers not to tell me what to buy, but to read in my place, since I can’t possibly read everything.  I can remain very glad, though, that some very good reader-reviewers are out there reading the many good things there are to read.  I need them so that I have a larger sense of the cultural landscape than I could possibly achieve by trying to read everything on my own.

Wasserstrom also champions the short review, and speculates on the tweeted review and its possibilities:

I’ve even been musing lately about the potential for tweet-length reviews. I don’t want those to displace other kinds, especially because they can too easily seem like glorified blurbs. But the best nuggets of some reviews could work pretty well within Twitter’s haiku-like constraints. Take my assessment of Kissinger’s On China. When I reviewed it for the June 13 edition of Time’s Asian edition, I was happy that the editors gave me a full-page spread. Still, a pretty nifty Twitter-friendly version could have been built around the best line from the Time piece: “Skip bloated sections on Chinese culture, focus on parts about author’s time in China—a fat book w/ a better skinnier one trying to get out.”

The basic insight here is critical.  Longer is not always better.  I’m even tempted to say not often better, the usual length of posts to this blog notwithstanding.  My experiences on Facebook suggest to me that we may be in a new era of the aphorism, one that may exalt the wit of the 18th century, in which the pithy riposte may be more telling than the blowsy dissertation.

A new challenge for my students in the next version of my book-reviewing class: write a review that is telling, accurate, and rhetorically effective in 160 characters or less.

Reading and Redemption

I saw Atonement last night, the Oscar-nominated film based on Ian McEwan’s award winning novel. I’m kind of vaguely interested in what happens to novels when they become films, but more so in films and novels that are in some way about the process of reading and writing. I have no idea about McEwan’s novel itself—I hope I can get to it someday—but I found the conceptual interaction between visual and textual storytelling—between viewing and reading—very layered and complex in the film. To some degree compelling, but also troubling.

Because I get to these things about three weeks after everyone’s seen the film, I’m going to assume whoever reads this post has already watched the movie (fair warning if you think the ending is given away). The interplay between reading/viewing and writing/performing is there throughout the film, of course. The main character, Briony, is a budding novelist of 13 whose urgent, hormone-driven plays are transparently presented as sublimated efforts to deal with her adolescent crush on an older young man, Robbie, who is in love with her older sister, Cecilia. This love of a young girl for an older man is perversely reversed when another young man about Robbie’s age rapes her young friend.

Briony has seen the rape, but using her well-practiced imagination, and perhaps revenging herself on Robbie for loving her sister instead of her, accuses Robbie of the deed. Briony’s decision to fabricate Robbie’s role is caught in the following clip. Too bad it doesn’t start just a bit earlier, where we see the two girls building a story based on their own fears, needs, and class stereotypes.

“Atonement” is, of course, about whether or not one can atone for the past. Can the past be repaired? Does Briony even need to atone for the past? Can a young girl of 13 be held responsible for an act, however reprehensible, that can readily be understood as an act of immaturity rather than an act of adult malice? Can any action by a much older and much changed Briony count in any way as atonement for the sins of the younger child she resembles but in no way repeats? Are our older selves, in so many ways discontinuous with the children that we were, even capable of repenting for sins that were in some very real sense committed by someone else? This distance is registered in the film by having actresses who are similar in appearance—at least in, implausibly, retaining the same haircut for approximately 60 years—but who are otherwise obviously very different people “playing” the same person. Again, this question of atonement is perversely registered in that the actual adult rapist “atones” for the past by eventually marrying the young girl he raped when she comes of age. While the true agent of brutality goes on to live out the Western mythology of human fulfillment in marriage, Robbie and Cecilia are forever separated by the sins of someone else.

For my purposes I’m interested in the layered question of whether writing and reading—whether an act of and engagement with the imagination—can atone for sins committed in the world. How does the imagination act on the world? This is most pronounced in the conclusion of the film where we cut to a latter-day television interview with an elderly and ill Briony, played by Vanessa Redgrave, who has just written her final novel, final because she has realized that she has incurable and progressive dementia that is gradually destroying her ability to remember and to use language.

We immediately understand as viewers that the movie we have just been watching is this last novel—rendered visually. We have been the reader/viewers of the novel, which is supposedly autobiographical. However, Briony/Vanessa Redgrave informs us that the story didn’t really end as she left it in the scenes we have just seen. Robbie did not return from Europe to be with Cecilia. He died on the beaches of Dunkirk from sepsis. Briony was never reconciled to Cecilia—as the film had just made it appear. Instead, Briony had been too cowardly to find her sister and make the attempt at reconciliation. Cecilia died in the bombing of London, living alone and estranged from her family because she had refused to believe that Robbie was a rapist and had refused to renounce her love for him.

The elder Vanessa Redgrave/Briony explains her decision to give the novel/movie a happy ending for two reasons: readers could not accept the reality, and the imagined ending was an act of repair, giving Robbie and Cecilia a life of joy together—symbolized by life on Dover Beach—that had eluded them because of Briony’s deception.

The two reasons, I think, work in very different directions, and finally don’t completely hold together. I’m not completely taken with the notion of readers needing the happy ending. It’s true, of course, that Hollywood films and any number of romance novels make their way in the world on the hunger for uncomplicated fantasy. But is it the case that human beings are so unused to the idea that the innocent die while the guilty go free and live happily ever after that we refuse it in our literature? Indeed, isn’t it our literature that teaches us this repeatedly? It feeds the generally tragic sense of reader-geeks that their own nobility is tragically unrecognized in the world at large. It is played out by English professors who grump that their C students get jobs right out of college that pay more than they make as tenured professors.

Still, this notion does comport with the general tenor of the film. Briony’s sin is first and foremost an act of the imagination. She “sees” what she wants to see so that it will reflect her own story in the world. Her refusal to allow the world to be more complicated than her own seeing is the source of her original accusation. In a very real sense, Briony’s imagination is what she must atone for. Imagination is her original sin—her writing is, after all, a particular way of reading the world that refuses to let the world be what it is truthfully. Her imagination is a thirteen-year-old’s act of violence on the world, and results in very real violence to many people down the line.

And so, can we really buy the elder Vanessa Redgrave/Briony’s assertion that she is somehow redeeming the lives of Cecilia and Robbie, giving them what they couldn’t otherwise have in reality—something she wants to understand as an act of generosity and even love? I’m not sure. To some degree this could be connected with the work of someone like Ernst Bloch, who insisted that the utopian function of art was to say “And Yet” to life, to insist that “reality” did not have the final say if that final say was understood to be beyond the act of human agency, human shaping, human imagination. In the same fashion, if atonement is possible, it seems to me that atonement must be an act that includes the imagination.

Still, is this an act that the imagination can carry out in reference to our own actions in the past? No human action is ever finished in and of itself. Rather, it is read and reread, and its meaning accrues and changes by the means and contexts through which it is reread. I sometimes tell students I prefer to understand God as a reader rather than as a writer. Redemption is an act of reading and discovering the possibilities in a life-text that could not have been imagined by those individuals and other historical agents who brought that life-text into being in the first place. But I guess what makes me leery of this particular act is Briony’s act of self-justifying imagination. Can Briony atone for the failures of her imagination by another act of the imagination that further falsifies the lives of those that she has damaged, however “innocently” or unknowingly? I tend to think that this isn’t atonement but self-justification.

On the other hand, what we finally get from Briony/Vanessa Redgrave is not imagination as atonement, but a very different secularized Christian practice—confession. Briony apparently tells the truth to the reader at the end of the story, and the reader/viewer is the only person in the position to forgive. Briony’s confession of what actually happened is, at least putatively, something that removes her own imagination as an agent in her own redemption. She no longer writes someone different from who she is, but says who she is and what she has done and failed to do, and what the consequences have been. The production of art that moves a reader is no compensation for the evil that produces it. But the frank confession of the truth is a work of art in which we recognize ourselves. We forgive her because we see in her all the unthinking dishonesties by which we have harmed others and ourselves. In her need for us, we recognize our own need for forgiving readers.

Critical Thinking and Cultural Literacy: Or, Is Unmasking Shakespeare Productive Cultural Work?

Ok, a slightly lame way of doing the blog entry today, but I spent a lot of time commenting on Mark Bauerlein’s blog at the Chronicle today, so I thought I’d just copy some of that and expand just a bit on what I had to say there.

In sum, Bauerlein argues that the case for critical thinking as a raison d’être for literary study is really only half the story for professors in the humanities, and perhaps especially in English. The other half is that we need to pass on an appreciation of a cultural tradition.

As a department chair, I’m used to giving the usual run-down on critical thinking in making arguments for English studies. They generally sell well with provosts and deans because they both seem to comport with traditional practices of the humanities while at the same time being a marketable skill to discuss with skeptical external constituencies. On the other hand, I’m not completely convinced that the humanities are the only place to get critical thinking skills. What, they aren’t doing critical thinking in the hard and social sciences? I think we sometimes assume that because different fields investigate different data sets, they are therefore not developing critical thinking. What is an economist doing but attempting to think critically about received wisdom as applied to sets of data in the economy?

Thus, I fully appreciate, while not going all the way with, Bauerlein’s argument that the humanities have to be about familiarizing students with a substantive subject matter and understanding its active or potential value in the world and for themselves. In my own terms, I think that English studies, especially, has to be more than a critical project; it has to be a constructive project as well.

My comments on Bauerlein’s blog were as follows:

I think the comments above suggesting that an exclusive identification of literary or humanistic studies with critique has become strangely vacuous are right on the mark. And, in reality, it’s not clear that critique per se has changed very much over the course of the last two or three decades. This is because critique must always have an object of its attention and is therefore always dependent on some kind of received culture.

In an older form of literary study, criticism meant not simple-minded passing on, nor simple-minded tearing apart, but critical evaluation. That is, what is worth passing on, what is worth reading, and for what reasons? The literary academy and the humanities more broadly have almost entirely defaulted on this particular task because to make an affirmative act of construction is to lay oneself open to the, I guess, humiliating preference for deconstruction or other forms of political critique.

In our curriculum I teach both the courses on literary theory and a course on book reviewing, and in both attempt to get students to think in concrete and critical ways about what’s worth reading and why. I have to say that students find the classes incredibly important to them. Far from feeling like the web—with its massive democratization of product and opinion—has done away with the need for discussion of value, they really find it an important question. Why should I spend my time with this book rather than that book? With Mark Bauerlein’s blog instead of Moby Dick? These are theoretical questions, critical questions, and questions that involve themselves in the construction of traditions and cultures rather than simply critiquing them.

In my own view, I think the current explosion of textual matter on the web—whether blogs, or online fictions, or newspapers, or e-books—has created a critical situation very similar to that which existed after the invention of the printing press. In a certain sense, the invention of the press changes the function of criticism. Prior to widely accessible print and the expansion of both reading audience and authorship beyond the narrow confines of the clerisy and aristocracy, criticism more or less existed to catalogue and discuss the characteristics of good writing. This was not, properly speaking, an evaluative project. Things that were published and preserved were, by and large, already considered good. “Criticism,” such as it was, was more a taxonomic affair, describing the goodness that was already known to exist.

After Gutenberg, criticism became the task of defining what, out of the immense amount of material on hand that could be read, really should be read. What was worth preserving? What things being produced by the new class of writer/readers deserved a status similar to that of the ancients as worthy of being preserved? To some degree, we are still at the dawning moment of that part of the internet revolution. What is really worth reading? Even, what is really worth writing? Is a blog worth doing? Is it real writing or is it conversation? Is real thinking going on, or is it ephemeral? To some degree popularity sites like Technorati or Digg that try to apply the democratic impulses of the web to blogs and the like are trying to serve an evaluative function: the wisdom of crowds applied to the function of criticism. Will this work for the long term? I have my doubts. There’s always been a tendency to insist that “best-sellers” are those things that are really valuable, but their value hasn’t been sustainable for more than a generation or two. I suspect that we are still working out the function of criticism at the present time. What shape will criticism take? How will we decide what is worth reading and writing? How will we decide what is being written—or perhaps we should now simply say “being produced”—on the web that should be passed down to our children as we attempt the inevitable human activity of forging a common culture?

After a variety of comments for Bauerlein with varying levels of vitriol in play, I followed up on a comment that made the argument that we need to be teaching things that students are comfortable with, but also things that sting them with their unfamiliarity.

My response:

Tim, I wonder in this day and age whether reading almost anything longer than a blog will be, for many students, a de-familiarizing and unsettling experience. That is, one doesn’t have to buy in to all the hype about a reading crisis to recognize that the nature of reading is changing, and the ability to read extended and complex texts has been eroding among college graduates.

Because we are so habituated by our own reading practices and training, we often make deeply flawed assumptions about what students will find de-familiarizing. And, to be honest, we often default to simple-minded notions of unfamiliar cultural content. “De-familiarization” first developed among formalists as a conception of how literary language served to shock readers from their comfortable linguistic frames of reference. On that score, I think we often find that contemporary students find reading much of anything “literary” at all to be unfamiliar, defamiliarizing, and unsettling. Especially so in poetry, but in a different register in long novels and plays they no longer even bother to try to read. Rather than experiencing the sting of defamiliarization in Shakespeare’s Tempest, students are quite as likely to go get the Sparknotes so they can pass the test and even write their essays.

In this kind of reading context, it seems to me that discussions of how to upset the cultural applecart on the basis of whether folks read Shakespeare or not are increasingly arcane and disconnected from the cultural realities in which long-form reading is taking place. While I agree that the task can’t be a simple passing on of received tradition, I think the cultural situation does call for engaging students with the question of why certain forms of reading may be valuable, and thinking through what texts might be worth the time required for reading them. In other words, the philosophical conception of “The Good” surely can’t be “Whatever has always been.” But it also surely can’t be “Whatever I decide might make my students talk in class,” or “Whatever an individual wants it to be.” To go this route is, I think, to give up on the question of “The Good” entirely, something I think most students are still unwilling to do.

This is something I find repeatedly in play among literary intellectuals. It’s almost as if we are so hermetically sealed within the discourses and practices of our discipline that we can’t conceive of a world where the fact of reading a book might be uncomfortable or unfamiliar for students. When I raise this problem at conferences, I repeatedly have professors reply by saying “Everyone I know reads.” I want to say “Duh. You work in an English department.”

This fact, I think, calls into question some of the basic premises of the canon wars that preoccupied folks at Duke while I was there as a grad student in the late eighties and early nineties. In the world that we are entering and are now in, people who read literature as an important part of their cultural lives are a distinct minority group whose members have more in common with one another, regardless of ethnicity, sexual identity, religion or gender, than they do with other members of their various identity groups—at least insofar as reading is concerned. That is, reading books, and reading literature especially, marks them out as different, as Other from the culture they inhabit—whether we are thinking of an ethnic, a national, a religious or a sexual culture. We need to recognize that we are quickly entering a world, and are already in it, wherein the simple fact of reading Moby Dick or Shakespeare will be a stinging act of defamiliarization that unsettles the cultural life of students.

This doesn’t mean that Bauerlein is right that we need to be passing on a received tradition—though I think students value that more than we sometimes realize. But it certainly does mean that we have to be involved in a constructive project and not simply a critical project.