Category Archives: literary criticism

“Beyond The Review: Reading African American Literature and Religion”

I was very happy this week to publish a new review essay in The Cresset for their Lent 2019 edition. On the one hand, publishing has not lost its charge, perhaps because I do it so rarely.  But beyond that, it was good to see in print my meditations on being part of a mini-movement in literary and cultural studies that has been taking religion in/and African American literature more seriously.  Besides my efforts on the Harlem Renaissance, there are many other, probably more important contributions going on, and I look at three recent works by Wallace Best, M. Cooper Harriss, and Josef Sorett to give a sense of the importance of what’s happening, beyond my evaluation of the works at hand.  I also liked the challenge of trying to write about academic literary criticism for a non-specialist audience, and of taking up the question of why lay readers ought to read criticism, even against their better judgment.  Not an easy task, and one at which I think I only partially succeeded, but one the editor liked enough to print at any rate.  A little flavor of this aspect of the review:

“Unlike breathing or the beating of the heart, reading is a skill developed within particular cultures, each with its own values and peculiarities, and each with its own notion of excellence. At its best, literary criticism models forms of readerly virtuosity that stretch our imagination beyond the straightforward pleasures of enjoying a good story. The best criticism allows us to know literature within a cultural ecosystem of reference and connection. In the normal course of things, we pluck books from the Barnes & Noble bookshelf or the Amazon algorithm as we might pluck up a flower in a field, enjoying (or not) the pleasure of the text. Literary criticism reads the role that flower plays in the field. It considers the ways it depends on or perhaps destroys other features of the field, or perhaps the ways other cultural ecosystems consider it a weed or an invasive species to be eradicated. While reading literary criticism is not always a walk in the park, doing so can make our pleasures more aware and engaged, delivering enhanced or other pleasures, much as we might take pleasure in not only the scent of the air, but in being able to name the flowers and the trees and understand our relationship to them and theirs to one another.”

You can read the rest of my review here…

Are writers afraid of the dark?

In a new blog post at NYRB, Tim Parks questions the notion that literature is about the stuff of life, suggesting that it might instead be a kind of withdrawal from the complexity and fearfulness of life itself:

So much, then, for a fairly common theme in literature. It’s understandable that those sitting comfortably at a dull desk to imagine life at its most intense might be conflicted over questions of courage and fear. It’s also more than likely that this divided state of mind is shared by a certain kind of reader, who, while taking a little time out from life’s turmoil, nevertheless likes to feel that he or she is reading courageous books.

The result is a rhetoric that tends to flatter literature, with everybody over eager to insist on its liveliness and import. “The novel is the one bright book of life,” D H Lawrence tells us. “Books are not life,” he immediately goes on to regret. “They are only tremulations on the ether. But the novel as a tremulation can make the whole man alive tremble.” Lawrence, it’s worth remembering, grew up in the shadow of violent parental struggles and would always pride himself on his readiness for a fight, regretting in one letter that he was too ill “to slap Frieda [his wife] in the eye, in the proper marital fashion,” but “reduced to vituperation.” Frieda, it has to be said, gave as good as she got. In any event words just weren’t as satisfying as blows, though Lawrence did everything he could to make his writing feel like a fight: “whoever reads me will be in the thick of the scrimmage,” he insisted.

In How Fiction Works James Wood tells us that the purpose of fiction is “to put life on the page” and insists that “readers go to fiction for life.” Again there appears to be an anxiety that the business of literature might be more to do with withdrawal; in any event one can’t help thinking that someone in search of life would more likely be flirting, traveling or partying. How often on a Saturday evening would the call to life lift my head from my books and have me hurrying out into the street.

(via Instapaper)

I was reminded in reading this of a graduate seminar with Franco Moretti wherein he said, almost as an aside, that we have an illusion that literature is complex and difficult, but that in fact, literature simplifies the complexity and randomness of life as it is.  In some sense literature is a coping mechanism.  I don’t remember a great deal more than that about the seminar–other than the fact that Moretti wasn’t too impressed with my paper on T.S. Eliot–but I do remember that aside.  It struck me as at once utterly convincing and yet disturbing, unsettling the notion that we in literature were dealing with the deepest and most complicated things in life.

On the other hand, I’m reminded of the old saw: literature may not be life, but, then, what is?  Parks strikes a bit of a graduate-student tone here, presenting the obvious as an earthshaking discovery without really advancing our understanding of what literature might actually be and do.  He seems to take delight in skewering without revealing or advancing understanding.  There’s a tendency to set up straw men to light afire, and then to strike the smug and knowing revelatory critical pose, when what one has revealed is more an invention of one’s own rhetoric than something worth thinking about.

This desire to convince oneself that writing is at least as alive as life itself, was recently reflected by a New York Times report on brain-scan research that claims that as we read about action in novels the relative areas of the brain—those that respond to sound, smell, texture, movement, etc.—are activated by the words. “The brain, it seems,” enthuses the journalist, “does not make much of a distinction between reading about an experience and encountering it in real life; in each case, the same neurological regions are stimulated.”

What nonsense! As if reading about sex or violence in any way prepared us for the experience of its intensity. (In this regard I recall my adolescent daughter’s recent terror on seeing our border collie go into violent death throes after having eaten some poison in the countryside. As the dog foamed at the mouth and twitched, Lucy was shivering, weeping, appalled. But day after day she reads gothic tales and watches horror movies with a half smile on her lips.)

I’m tempted to say “What nonsense!”  Parks’s willingness to use his daughter to dismiss a scientific finding strikes me as a bit like the homeschool student I once had who cited her father as an authority who disproved evolution.  Well.  The reference to the twitching dog invokes an emotion that in fact runs away–in a failure of critical nerve, perhaps?–from the difficult question of how exactly the brain processes and models fictional information, how that information relates to similar real-world situations in which people find themselves, and how people might use and interrelate both fictional and “real world” information.

Parks seems to have no consciousness whatsoever of the role of storytelling in modeling possibility, one of its most complex ethical and psychological effects.  It’s a very long-standing and accepted understanding that one reason we tell any stories at all is to provide models for living.  Because a model is a model, we need not assume it lacks courage or is somehow a cheat on the real stuff of life.  Horror stories and fairy tales help children learn to deal with fear, impart warning and knowledge and cultural prohibitions, and attempt to teach children in advance how to respond to threat, to fear, to violence, etcetera.  That those lessons are always inadequate to the moment itself hardly speaks against the need to have such mental models and maps.  It would be better to ask what we would do without them.  The writer who provides such models need not be skewered for it, since to write well and convincingly, to provide a model that serves that kind of ethical or psychic purpose, the writer must get close to those feelings of terror and disintegration.  It’s why there’s always been a tradition of writers like Hemingway or Sebastian Junger who go to war in order to get to that place within themselves where the emotions of the real can be touched.  It’s also why there’s always been a tradition of writers self-medicating with alcohol.

Thus I found Parks’s implied assumption that writers are cowering just a bit from the real stuff of life to be a cheap shot; and the cheap shot, in the cultural stories we tell each other, is usually associated with cowardice and weakness, whether in a writer or a fighter.  The novelists and poets Parks takes on deserve better.

Barack Obama’s Waste Land; President as First Reader

GalleyCat reported today that the new biography of Barack Obama gives an extensive picture of Obama’s literary interests, including a long excerpt of a letter in which Obama details his engagement with T.S. Eliot and his signature poem, The Waste Land. Obama’s analysis:

Eliot contains the same ecstatic vision which runs from Münzer to Yeats. However, he retains a grounding in the social reality/order of his time. Facing what he perceives as a choice between ecstatic chaos and lifeless mechanistic order, he accedes to maintaining a separation of asexual purity and brutal sexual reality. And he wears a stoical face before this. Read his essay on Tradition and the Individual Talent, as well as Four Quartets, when he’s less concerned with depicting moribund Europe, to catch a sense of what I speak. Remember how I said there’s a certain kind of conservatism which I respect more than bourgeois liberalism—Eliot is of this type. Of course, the dichotomy he maintains is reactionary, but it’s due to a deep fatalism, not ignorance. (Counter him with Yeats or Pound, who, arising from the same milieu, opted to support Hitler and Mussolini.) And this fatalism is born out of the relation between fertility and death, which I touched on in my last letter—life feeds on itself. A fatalism I share with the western tradition at times.

A Portrait of Barack Obama as a Literary Young Man – GalleyCat.

For a 22-year-old, you’d have to say this is pretty good. I’m impressed with the nuance of Obama’s empathetic imagination, both in his ability to perceive the differences among the three great conservative poets of that age and in his ability to identify with Eliot against his own political instincts. This is the kind of reading we’d like to inculcate in our students, and I think it lends credence to the notion that a mind trained in this kind of engagement might be better prepared for civic engagement than one that is not. But too often even literature profs are primarily readers of the camp, so to speak, lumping those not of their own political or cultural persuasion into the faceless, and largely unread, camp of the enemy, and appreciating without distinction those who further our pet or current causes.

This is too bad, reducing a richer sense of education for civic engagement to the narrower and counterproductive sense of reading as indoctrination. I think this older notion of education is the vision that motivated the founding fathers. Whatever one thinks of his politics, passages like this suggest to me that Obama could sit unembarrassed with Jefferson and Adams, discussing in all seriousness the relationship between poetry and public life. It would be a good thing to expect this of our presidents, rather than stumbling upon it by accident.

Annotating Kierkegaard; an intellectual’s appreciation

I am largely an intellectual because of Søren Kierkegaard.  I mean this primarily in terms of intellectual biography rather than genealogy.  A few days ago I noted briefly my own vocational journey into English at the hands of T.S. Eliot.  That is a true tale. However, at Eliot’s hands and through English alone as an undergraduate, I largely wanted to be the next great poet or novelist.  Kierkegaard taught me to think, or at least taught me that thinking was something a Christian could do, ought to do, with whatever capacity God had given him.  Through Kierkegaard I came to Walker Percy, subject of my undergraduate thesis, and then John Updike, subject of my first scholarly essay, and probably too to literary and cultural theory, which became a field of my doctoral studies and has remained a passion.  His writerly creativity, his playfulness with language, image, and authorial personae, never let me believe that critical writing was inherently inferior to fiction, even if it is often practiced poorly.

In honor of Kierkegaard’s birthday yesterday, I took down some of my old SK from the shelf and blew the dust off.  The old Walter Lowrie paperback editions that were $3.95 back in the day.  The rapturous and pious annotations that fill the margins are now cringe-inducing, but I am reminded of the passions a deeply felt intellectual engagement can arouse.  A lot of the passages are marked over in four or five different colors of highlighting and underlining, a way of trying to keep track, I suspect, of the many different readings I gave those books, a way of tracking the different person I was becoming.  And if I have now moved a long way from those Kierkegaardian roots into other, hipper modes of thinking, I’m also of an age where I’ve started realizing that the newest thing is not necessarily the best thing; maybe it only shows you what you already knew without realizing it, rather than what you need to know.

I still think The Great Dane wears well.  His comments on sectarianism, as well as his more general clarity about easy piety, speak to our own age as much as to his.  And I still wonder sometimes, deep down, whether my first love was not the best.

From Fear and Trembling:

The true knight of faith is always absolute isolation, the false knight is sectarian. This sectarianism is an attempt to leap away from the narrow path of the paradox and become a tragic hero at a cheap price. The tragic hero expresses the universal and sacrifices himself for it. The sectarian punchinello, instead of that, has a private theatre, i.e. several good friends and comrades who represent the universal just about as well as the beadles in The Golden Snuffbox represent justice. The knight of faith, on the contrary, is the paradox, is the individual, absolutely nothing but the individual, without connections or pretensions. This is the terrible thing which the sectarian manikin cannot endure. For instead of learning from this terror that he is not capable of performing the great deed and then plainly admitting it (an act which I cannot but approve, because it is what I do) the manikin thinks that by uniting with several other manikins he will be able to do it. But that is quite out of the question. In the world of spirit no swindling is tolerated. A dozen sectaries join arms with one another, they know nothing whatever of the lonely temptations which await the knight of faith and which he dares not shun precisely because it would be still more dreadful if he were to press forward presumptuously. The sectaries deafen one another by their noise and racket, hold the dread off by their shrieks, and such a hallooing company of sportsmen think they are storming heaven and think they are on the same path as the knight of faith who in the solitude of the universe never hears any human voice but walks alone with his dreadful responsibility.

The knight of faith is obliged to rely upon himself alone, he feels the pain of not being able to make himself intelligible to others, but he feels no vain desire to guide others. The pain is his assurance that he is in the right way, this vain desire he does not know, he is too serious for that. The false knight of faith readily betrays himself by this proficiency in guiding which he has acquired in an instant. He does not comprehend what it is all about, that if another individual is to take the same path, he must become entirely in the same way the individual and have no need of any man’s guidance, least of all the guidance of a man who would obtrude himself. At this point men leap aside, they cannot bear the martyrdom of being uncomprehended, and instead of this they choose conveniently enough the worldly admiration of their proficiency. The true knight of faith is a witness, never a teacher, and therein lies his deep humanity, which is worth a good deal more than this silly participation in others’ weal and woe which is honored by the name of sympathy, whereas in fact it is nothing but vanity. He who would only be a witness thereby avows that no man, not even the lowliest, needs another man’s sympathy or should be abased that another may be exalted. But since he did not win what he won at a cheap price, neither does he sell it out at a cheap price, he is not petty enough to take men’s admiration and give them in return his silent contempt, he knows that what is truly great is equally accessible to all.

Either there is an absolute duty toward God, and if so it is the paradox here described, that the individual as the individual is higher than the universal and as the individual stands in an absolute relation to the absolute / or else faith never existed, because it has always existed, or, to put it differently, Abraham is lost.

What is the Digital Humanities and Where can I get some of it?

As I’ve started listening in on Digital Humanities conversations over the past 12 to 18 months, and especially in the last three or four months as I’ve gotten more fully onto Twitter and understood its potential for academics, I’ve realized that I am mostly just a bobbing rubber duck in a great wave of ignorant interest in Digital Humanities.  “What is this thing, Digital Humanities, and where can I get some of it?” seems to be the general hue and cry, and I’ve added my own voice to the mix.  Some of that wave is no doubt driven by the general malaise that seems to be afflicting the humanistic ecosystem, as mid-career academics look back with rose-colored nostalgia to the culture wars of the 80s, when our classes were full and our conflicts were played out in middle-brow journals so we could feel self-important.  Maybe digital humanities will make us relevant again and keep our budgets from getting cut.

On the other hand, I think most people recognize that many different aspects of digital humanities practice seem to coalesce and provide responses to driving forces in academe at the moment: our students’ need for technical proficiency, the increasingly porous border between distance and bricks-and-mortar instruction, the need to connect effectively with the public while maintaining high standards of academic rigor, the need for our students to be involved in “real world” experiential learning, the effort to provide opportunities for serious and original undergraduate research, and the need to support collaborative forms of learning.

I have been terribly impressed with the generosity of Digital Humanities folks in responding to these repeated pleas to “Show me how to do like you.  Show me how to do it.”  There are a lot of different resources I’ve discovered over the past year, and as many more have been put out.  The most recent is a bibliographic blog compiled by Matthew Huculak.  As with any bibliography, it is an act of interpretation. There are inclusions I’ve not seen elsewhere–Franco Moretti’s book on data in literary studies was on the list and is now on mine.  [Though I admit this is at least as much because I had Moretti as a prof while a grad student at Duke, the same semester I completed my first essay ever on a Macintosh computer.]  The blog skews toward the literary–and I have increasingly realized that while there is a strong discourse of solidarity among digital humanists, the traditional disciplinary divisions still play a strong role in much of the practical work that actually gets done.  DH’ers have conferences together, but it’s not always clear that they work together on projects across their humanistic disciplines. There are also obvious omissions (I thought the new Journal of the Digital Humanities should have made the list).

My larger concern, though, is that I’m beginning to feel there may be a glut of introductory materials, so many different possible starting points that it is impossible to point to a place and say, “this is where to start.”  To some degree, on this score the Digital Humanities are reflecting what Geoffrey Harpham has indicated is a basic feature of the Humanities in general.

In a great many colleges and universities, there is no “Intro to English” class at all, because there is no agreement among the faculty on what constitutes a proper introduction to a field in which the goals, methods, basic concepts, and even objects are so loosely defined, and in which individual subjective experience plays such a large part. This lack of consensus has sometimes been lamented, but has never been considered a serious problem.

Geoffrey Galt Harpham. The Humanities and the Dream of America (p. 101). Kindle Edition.

The details don’t quite apply.  I’m not entirely sure individual subjective experience is at the heart of DH work, and that is one of the biggest bones of contention with DH work as it has been received in English departments (see my reflections on Fish and his comments on DH in yesterday’s post).  But I do think the generally pragmatic feel of humanities departments, where you can begin almost anywhere, in stark contrast to the methodical and even rigid approach to immersing students in the STEM disciplines, may be characteristic of DH as well.  Start where you are and get where you want to go.

In reading Harpham, I was reminded of one of Stanley Fish’s essays, which one I forget, in which he argues that the best way to introduce students to literary criticism is not to give them a theory, but to give them examples and say, “Go thou and do likewise.”  I’m increasingly feeling this is the case for the Digital Humanities.  Figure out what seems appealing to you, and then figure out what you have to figure out so you can do like that.

Distanced and Close Reading in literary study: Metaphors for love

I am old enough now to begin sentences with the phrase “I am old enough…”  Seriously, though, I am old enough now to feel that I have lived through one revolution, into a new orthodoxy, and now into the experience of a new revolution in literary studies.  In the ongoing debates I hear about the digital humanities versus whatever other kind of humanities happens to be at hand, I keep having this vertiginous sense of déjà vu, as if I’m hearing the same arguments I heard two decades ago, but transposed into a key just different enough that I can’t tell whether today’s debates are mere variations on a theme or some genuinely new frame of discourse.

The song that remains the same, I think, is the divide between the proponents of what gets called “distanced reading,” which in some hands is a shorthand for all things digital humanities (if it’s digital, it must be distanced, as compared to the human touch of paper, ink, and typewriters–how the industrial period came to be the sign and symbol of all things human and intimate I am not entirely clear), and close reading, which is somehow taken to be THE form of intimate human contact with the text.

This division is exemplified in Stanley Fish’s recent essay on the digital humanities in the New York Times, an argument that has the usual whiff of caustic Fishian insight, leavened with what I take to be a genuine if wary respect for what he sees in the practices of distanced reading.  Nevertheless, for Fish it is finally close reading that is genuinely the work of the humane critic devoted to intimacy with the text:

But whatever vision of the digital humanities is proclaimed, it will have little place for the likes of me and for the kind of criticism I practice: a criticism that narrows meaning to the significances designed by an author, a criticism that generalizes from a text as small as half a line, a criticism that insists on the distinction between the true and the false, between what is relevant and what is noise, between what is serious and what is mere play. Nothing ludic in what I do or try to do. I have a lot to answer for.

Ironically, in an earlier period it was Fish and precisely this kind of close reading (as practiced by deconstructionists) that was decried for its lack of seriousness, for the way it removed literature from the realm of human involvement and into the play of mere textuality.  By contrast, the distanced readers in those days imagined themselves as defenders of humanity (or, since humanism was a dirty word, at least defenders of the poor, the downtrodden, the miserable, the huddled masses).  Historicism read widely and broadly in the name of discourse, and proclaimed itself a liberating project, ferreting out the hidden political underbelly in a multitude of texts and considering literary criticism to be an act of responsible justice-seeking over and against the decadent jouissance-seekers of post-structuralism.

A recent blog post by Alex Reid takes up this same criticism of what he describes as the Close Reading industry, arguing for the ways digitization can free us from the tyranny of the industrialized close reader:

In the composition classroom, the widgets on the belt are student papers. If computers can read like people it’s because we have trained people to read like computers. The real question we should be asking ourselves is why are we working in this widget factory? And FYC essays are perhaps the best real world instantiation of the widget, the fictional product, produced merely as a generic example of production. They never leave the warehouse, never get shipped to market, and are never used for anything except test runs on the factory floor. 

In an earlier period, it was again the close-readers who were accused of being mechanistic, dry, and scientific as putatively more humanistic readers accused New Critics of an unfeeling scientism in their formalist attitude toward the text, cutting out every human affect in the quest for a serious and scientific study of literature.

I wonder, at root, whether this is the controlling metaphor, the key in which all our tunes in literary and cultural studies are played: a quest for the human that is not merely scientific, and yet an unrepressed desire for the authority of the scientist, to say things with security, to wear the mantle of authority that our culture apparently believes only a statistical method can endow.

It is probably a mark against my character that I tend to be a both/and pragmatist as a thinker.  I do not buy the notion that distanced reading is inconsequential, or somehow less about truth or less serious than the close rhetorical readings that Fish invokes.  At the same time, I am not too given to the euphoric and pugnacious challenges that can sometimes characterize digital humanities responses to the regnant forms of literary criticism.  At their best, Fishian forms of close reading are endowed not simply with acute attention, but with attention that seems to give birth to a form of wisdom that only attentiveness and close examination can provide, the kind of insistent close reading that led Gerard Manley Hopkins to seek the “inscape” of individual instances beyond categories, rather than simply the ways individuals fit into the vast landscapes popular in his post-romantic period.

I was reminded of this need to attend to the close properties of the individual use of language again in a recent article on Chaucer in the Chronicle. The writer attends to the detail of Chaucer’s language in a way that seems to reveal something important about the ways in which we are human.

translating Chaucer is like translating any other foreign language: The words are different from one language to the next. And then comes the third category, the most fascinating and the most aggravating because it is the trickiest: the false cognates, words that look like they should mean what they do in Modern English, but don’t. False cognates are especially aggravating, and fascinating when they carry their Middle and Modern English meanings simultaneously. These are exciting moments, when we see, through a kind of linguistic time-lapse photography, Chaucer’s language on its way to becoming our own.

In Middle English, for instance, countrefete means “to counterfeit,” as in “to fake,” but it also has the more flattering meaning of “to imitate.” Corage has not only the Modern English sense of bravery but also, frequently, overtones of sexual energy, desire, or potency. Corage takes its roots from the word coeur, or “heart,” and transplants them slightly southward. The same is true for solas, or “solace.” The “comfort,” “satisfaction,” or “pleasure” it entails is often sexual.

Lust might seem to pose no problem for the modern reader. Yet in the 14th century, the word, spelled as it is today, could mean any kind of desire or pleasure, though around that time it was beginning to carry a sexual connotation, too. And lest it seem as if false cognates always involve sex, take sely, or “silly.” It most often means “blessed” or “innocent,” as well as “pitiful” and “hapless,” but “foolish” was making its way in there, too.

A sentence like “The sely man felte for luste for solas” could mean “The pitiful man felt desire for comfort.” It could just as likely mean: “The foolish man felt lust for sex.” In Chaucer’s hands, it could mean both at once.

Chaucer was fully aware of the slipperiness of language. He delights in it; he makes his artistic capital from it. He is an inveterate punster. The Wife of Bath, for example, repeatedly puns on the word queynte (eventually the Modern English “quaint”). In the 14th century, the word means not only “curious” or “fascinating” but also the curious part of her female anatomy that most fascinates her five husbands. What’s more, the slipperiness of language gives Chaucer the tools to form his famous irony and ambiguity. If the way-too-pretty Prioress is “nat undergrowe” (“not undergrown”), how big is she?

(via Instapaper)

These kinds of particularities of language are the worthy objects of our attention as literary scholars.  At the same time, I do not think we need say that distanced reading plays no role in our understanding of such peculiarities.  A Chaucer project on the order of the Homer Multitext might actually deepen and multiply our understanding of Chaucer’s slipperiness and originality.  Likewise, vast database-driven analyses of every text written within a hundred years of Chaucer might allow us to discover the kinds of linguistic sources he was drawing on and manipulating anew for his own purposes; they might show us new creativities we had not imagined, or they might show us that things we had taken to be unique were fairly common stock-in-trade.
These kinds of knowledges could not be derived from a contest between methods, but only from a reading marked by attentiveness, skill and desire, one willing to draw on any resource to understand what one wishes to know, which used to be a metaphor for love.

Is Twitter the future of fiction? Micro-prose in an age of ADD

As I’ve mentioned, I’ve been struck by Alex Juhasz’s pronouncement at the Re:Humanities conference that we must learn what it means to write for an audience that is permanently distracted.  In response, I put up a Facebook post: “We need a rhetoric of the caption. A hermeneutic of the aphorism. Haiku as argument.”  My Provost at Messiah College–known for thorough and intricate argument–left a comment “I’m Doomed.”

Perhaps we all are, those of us who are more Faulkneresque than Carveresque in our stylistic leanings.  This latest from GalleyCat:

R.L. Stine, the author of the popular Goosebumps horror series for kids, gave his nearly 49,000 Twitter followers another free story this afternoon. To celebrate Friday the 13th, the novelist tweeted a mini-horror story called “The Brave One.” We’ve collected the posts below for your reading pleasure.

via R.L. Stine Publishes ‘The Brave Kid’ Horror Story on Twitter – GalleyCat.

Ok, I know it’s a silly reach to put Stine and Faulkner in the same paragraph, and to be honest I found Stine’s story trite.  On the other hand, I do think it’s obvious we’re now in an age wherein shorter prose with bigger impact may be the necessity.  Flash fiction is growing, and we can witness the immense popularity of NPR’s three-minute fiction contest.  These forms of fiction, and of writing in general, speak to the necessities of an art of the moment, rather than an art of immersion.  Literature, and prose in general, is ALWAYS responsive to the material and cultural forms of its own moment, and I think prose that is short and explosive, or prose that pierces beneath the surface of the reader’s psyche in a moment only to spread and eat its way into the unconscious when the moment of reading is long forgotten, is most likely the prose that is the order of the day.

BUT…Stine certainly doesn’t do it for me.  I don’t know a lot about Twitter fiction.  Is there any really good stuff out there on Twitter--as opposed to flash fiction written in a standard format, which I know more about? Or is it all carny-style self-promotion or unrealized theory at the moment?

[And what, I wonder, does this mean for the future of academic prose as well?  I’m a latecomer to Twitter myself, but I’ve been a little fascinated by the academic discourse that can occur there--but more on that some other time.]

Teaching Humanities to digital natives who may know more than we do.

I remember a story about the advent of the New Criticism in which one of those famous critic/scholar/teachers--I forget which one, but I want to say Cleanth Brooks or perhaps John Crowe Ransom--admitted to rushing home at night to read feverishly ahead in the texts he was teaching so that he would be ready to go the following day.  On the one hand, this is a familiar story to any new (or not so new) professor who’s trying to stay one step ahead of the onrushing train.  On the other hand, it’s also the case that part of this was demanded by the fact that Brooks and others were trying to do something totally new for a literature classroom: the close, perspicacious reading whose minutest detail nevertheless resulted miraculously in a coherent organic whole.  That kind of textual analysis was the meat of my own education, and to be honest, it hasn’t really changed all that much despite all the new (and now old) theories that came in with the advent of deconstruction and its descendants.  We still, more or less, on the undergraduate level do the close reading, even if we now look for the way things fall apart or for hints and allegations of this or that cultural depravity.

But I am intrigued by just how hard Brooks/Ransom (or whoever it was) had to work to stay ahead of his students, in part because he really didn’t know entirely what he was doing.  He wasn’t building on the secure corpus of knowledge that previous literary scholastics had received and passed on.  Despite the mythic and quasi-priestly status that some New Critics projected--turning the critic into an all-knowing seer, and thus setting the stage for the later assertions that critics were really the equals or superiors of the novelists and poets they read and critiqued, knowing what those poor souls could only allude to and evoke--there was a very real sense in which the New Criticism was much more democratic than the literary scholasticism that preceded it.  (I am sure Frank Lentricchia is exploding about now, or would be if he ever actually bothered to read me.)  While it may not have been more democratic in the sense that the New Critics seemed to cast a mysterious aura about all they did, developing a new and arcane ritual language to accompany it, it was more democratic in the sense that the method was potentially available to everyone.  Not everyone could have the time to read all the histories and all the letters and delve into the archives and read the vast quantities of literature required for the literary scholasticism that characterized old-style literary history.  But everyone could read the poem or the novel set in front of them.  And potentially a smart undergraduate could see a good deal that the prof had missed, or point out the problems in particular interpretations.  When the evidence of the poem was simply the poem itself, all the cards were on the table.
No longer could a professor say to the quivering undergraduate, “Well, yes, but if you had bothered to read x, y, and z you would understand why your assertions about this poem’s place in literary history are totally asinine.”  The average undergraduate is never in a place to dispute with a professor about the place of this or that figure in literary history, but they could, in fact, argue that a professor had gotten a poem wrong, that an interpretation didn’t hold up to a closer scrutiny of the facts.  The feverish late-night work of my Brooks/Ransom avatar, like the feverish late-night work of many a new and not-so-new professor, is sometimes cast as a noble inclination to truth, knowledge, or the discipline.  It is, in truth, very often the quest to avoid embarrassment at the hands of our smarter undergraduates, the quest for just enough knowledge or just enough preparation to make sure we justify our authority in the eyes of our skeptical younger charges.

I was thinking about this again while attending the Re:Humanities undergraduate DH conference at Swarthmore/Bryn Mawr/Haverford Thursday and Friday. Clearly, one of the biggest challenges to bringing DH fully onboard in humanities disciplines is the simple fact that undergraduates often know as much, and often a great deal more, about the tools we are trying to employ.  On the one hand, this is a tremendous challenge to mid-career academics, who understandably have little interest in abandoning the approaches to scholarship, teaching, and learning that they have developed, that they understand, and that they continue to use effectively given the assumptions and possibilities of those tools as they are.  It was ever thus, and to some degree colleges remain always one step behind the students they are attempting to educate, figuring out on the fly how our own education and experience can possibly apply in this day and hour.

However, I also wonder whether the democratization of the technological environment in the classroom isn’t a newly permanent state of affairs.  The pace of technological change--at least for the present, and why would we assume it should stop in the near or immediate future--means that there is some sense in which we are entering a period in the history of education in which educators will, in some sense, never know any more about the possibilities of the tools they are using than do the students they are teaching.  Indeed, given the nature of the tools, it is quite likely that collectively the students know a great deal more about how to use the tools available to them, and that they are likely to be more attuned more quickly to the latest technological developments.  What they don’t know--and what we as educators don’t know either--is how best to deploy those resources to do different kinds of humanistic work.  The teleology of learning used to be fairly, if undemocratically, straightforward.  The basic educational goal was to learn how to do what your teacher could do--with reading, with texts, with research.  In our current age that teleology is completely, perhaps appropriately, disrupted.  But that doesn’t alleviate the sense that we don’t know entirely what we should be teaching our students to do when we don’t entirely know what to do or how to do it ourselves.

Mortimer Adler famously wrote a book on “How to Read a Book,” and though people bemoaned Adler as an elitist and a snob, the basic idea was still important.  Some people knew how to read books and others did not.  I still think it’s the case that we take a tremendous amount for granted if we assume an undergraduate actually knows how to read an old-fashioned codex well.  They don’t.  On the other hand, we have no equivalent book that tells us “how to read….”, in part because we don’t know how to fill in the blank, though perhaps “digital artifacts” comes as close as anything.  We’re not even sure what tools we should be using to do whatever it is we are doing as humanists in this day and age.  No wonder most professors choose to continue to use books, even though I think the day is fast approaching when students won’t tolerate that, any more than an ancient would have tolerated the continued use of scrolls when a perfectly good codex was at hand.  What the current technological changes are doing is radically democratizing the classroom at the level of the tool.

I did have a couple of signs of hope this past week at the Re:Humanities conference at Swarthmore. In the first place, if the educational system in the humanities is becoming radically democratized at the level of structure, I think it is safe to say there are many, many, many people using that democracy well.  The students at the conference were doing stunningly good and creative work that was clearly contributing to our knowledge of the world around us--sometimes pursuing these projects independently or, most often, in partnership with and in mentoring relationships with committed faculty.  (It is, of course, also the case that people can use democracy poorly, as I’ve suggested elsewhere; this would be true in both the classroom and the body politic, so we should ask whether and where the democratization of our educational system is being used well, rather than assuming that because we use the word democracy we have named a substantive good.)

Secondly, one of the chief insights I drew from the different speakers was that if we put the tools on the table as possibilities, students will surprise and amaze us with what they manage to come up with.  What if we found ways to encourage students to get beyond the research paper and asked them to do serious creative and critical work with the tools they have every day at hand on their iPhones, laptops, and so on?  What if we encouraged them to find the best ways to answer the kinds of questions humanists have always asked, and to identify the new questions and potential answers that new (and now not so new) technologies make possible?  We will have to do this regardless, I think.  The age demands it.  And I suspect that there will be many, many more frantic late nights for faculty ahead.  But I think those frantic late nights will be built less and less on the belief that we have to get on top of “the material” and “stay ahead” of our students.  When they can bring in material we’ve never heard of with the touch of a finger on their iPhones, we have no hope of being on top of the material or staying ahead in any meaningful sense.  Perhaps what we can do is inspire them to charge ahead, guide them to the edges of the landscape we already know, and partner with them in the exploration of the landscapes we haven’t yet discovered.

Why digital humanities is already a basic skill, not just a specialist niche–Matthew Kirschenbaum

Sometimes I think we humanists “of a certain age,” to put the issue politely, imagine digital humanities as an optional activity that will be filled by an interesting niche of young professors who take their place in the academy in yet another sub-discipline, something that research universities hire for and small colleges struggle hopelessly to replicate.  It may indeed be that small colleges will struggle to integrate digital humanities into their own infrastructures, but I think the general picture of digital humanities as an optional sub-discipline will simply be unsustainable.  The argument smells a little of the idea that e-books are a nice sub-genre of texts, but not something the average humanist has to worry much about.  I think, to the contrary, that digital humanities, and the multitude of techniques it entails, will become deeply integrated in a fundamental way with the basic methodologies of how we go about doing business, akin to knowing how to do close reading or how to maneuver our way through libraries.

Although pointing out this fact is not his main point, Matthew Kirschenbaum--already a Digital Humanities patron saint in many respects--has an essay in The Chronicle that points to it.  Kirschenbaum is currently interested in how we preserve digital material, and the problems are just as complex as, if not more so than, the general question of how and when to save print materials.  More so to the degree that we cannot be sure that the current forms in which we place our digital intelligence will actually be usable five years from now.  The consequences for humanities research and writing are profound and must be considered. From Kirschenbaum:

Digital preservation is the sort of problem we like to assume others are thinking about. Surely someone, somewhere, is on the job. And, in lots of ways, that is true. Dire warnings of an approaching “digital dark ages” appear periodically in the media: Comparisons are often made to the early years of cinema—roughly half of the films made before 1950 have been lost because of neglect. 

But the fact is that enormous resources—government, industry, and academic—are being marshaled to attack the problem. In the United States, for example, the Library of Congress has been proactive through its National Digital Information Infrastructure and Preservation Program. Archivists of all stripes now routinely receive training in not only appraisal and conservation of digital materials but also metadata (documentation and description) and even digital forensics, through which we can stabilize and authenticate electronic records. (I now help teach such a course at the University of Virginia’s renowned Rare Book School.) Because of the skills of digital archivists, you can read former presidents’ e-mail messages and examine at Emory University Libraries a virtual recreation of Salman Rushdie’s first computer. Jason Scott’s Archive Team, meanwhile, working without institutional support, leaps into action to download and redistribute imperiled Web content.

What this suggests is that Rushdie’s biographers will need to know not so much how to sift through piles of letters as how to recreate digital archives that authors themselves may not be interested in preserving.  Biographers of the present, and surely of the future, will have to be digital technicians as well as close readers of the digital archives they are able to recover.

Kirschenbaum goes on to suggest that most of us must do this work on our own, and must do this work for ourselves, in preserving our own archives.

But despite those heroic efforts, most individuals must still be their own digital caretakers. You and I must take responsibility for our own personal digital legacy. There are no drive-through windows (like the old photo kiosks) where you can drop off your old floppies and pick up fresh files a day or two later. What commercial services are available tend to assume data are being recovered from more recent technology (like hard drives), and these also can be prohibitively expensive for average consumers. (Organizations like the Library of Congress occasionally sponsor public-information sessions and workshops to teach people how to retrieve data from old machines, but those are obviously catch as catch can.)

Research shows that many of us just put our old disks, CD’s, and whatnot into shoeboxes and hope that if we need them again, we’ll figure out how to retrieve the data they contain when the time comes. (In fact, researchers such as Cathy Marshall, at Microsoft Research, have found that some people are not averse to data loss—that the mishaps of digital life provide arbitrary and not entirely unwelcome opportunities for starting over with clean slates.)

This last, of course, is an interesting problem.  Authors have often been notoriously averse to having their mail probed and prodded for signs of conflicts and confessions, preferring that the “work” stand on its own. Stories of authors burning their letters and manuscripts are legion, nightmarishly so for the literary scholar.  Such literary self-immolations are both harder and easier in a digital world.  My drafts and emails can disappear at the touch of a button.  On the other hand, I am told that, for those who are really in the know, a hard drive is never actually erased.  Then again, the task of the scholar who sees a writer’s computer as his archive is in some ways vastly more difficult than that of the scholar whose writer was an assiduous collector of his typewritten drafts.  Does every deletion and spell-correction count as a revision?  What should we trace as an important change, and what should we disregard as detritus?  These are, of course, the standard archival questions, but it seems to me they are exponentially more complicated in a digital archive, where a text may change a multitude of times in a single sitting, something not so possible in a typewritten world.
Well, these are the kinds of things Kirschenbaum takes up.  And having the tools to apply to such questions will be the task of every humanist in the future, not of a narrow coterie.

Living in an e-plus world: Students now prefer digital texts when given a choice

A recent blog post by Nick DeSantis in the Chronicle points to a survey by the Pearson Foundation suggesting that tablet ownership is on the rise.  That’s not surprising, but more significant is the fact that among tablet users there’s a clear preference for digital texts over the traditional paper codex, something we haven’t seen before even among college students of this wired generation:

One-fourth of the college students surveyed said they owned a tablet, compared with just 7 percent last year. Sixty-three percent of college students believe tablets will replace textbooks in the next five years—a 15 percent increase over last year’s survey. More than a third said they intended to buy a tablet sometime in the next six months.

This year’s poll also found that the respondents preferred digital books over printed ones. It’s a reversal of last year’s results and goes against findings of other recent studies, which concluded that students tend to choose printed textbooks. The new survey found that nearly six in 10 students preferred digital books when reading for class, compared with one-third who said they preferred printed textbooks.

I find this unsurprising, as it matches up pretty well with my own experience.  Five years ago I could never have imagined doing any significant reading on a tablet.  Now I do all my reading of scholarly journals and long-form journalism--e.g., The Atlantic, The New York Review of Books, The Chronicle Review--on my iPad.  And while I still tend to prefer the codex for reading novels and other book-length works, the truth is that preference is slowly eroding as well.  As I become more familiar with the forms of e-reading, the notions of its inherent inferiority, like any unreflective prejudice, gradually fade in the face of familiarity.

And yet I greet the news of this survey with a certain level of panic, not panic that it should happen at all, but panic that the pace of change is quickening and we are hardly prepared, by we I mean we in the humanities here in small colleges and elsewhere.  I’ve blogged on more than one occasion about my doubts about e-books and yet my sense of their inevitable ascendancy.  For instance here on the question of whether e-books are being foisted on students by a cabal of publishers and administrators like myself out to save a buck (or make a buck as the case may be), and here on the nostalgic but still real feeling that I have that print codex forms of books have an irreplaceable individuality and physicality that the mere presence of text in a myriad of e-forms does not suffice to replace.

But though I’ve felt the ascendancy of e-books was inevitable, I think I imagined a 15- or 20-year span in which print and e-books would mostly live side by side.  Our own librarians here at Messiah College talk about a “print-plus” model for libraries, as if e-books will remain primarily an add-on for some time to come.  I wonder.  Just as computing power increases exponentially, it seems to me that the half-life of print books is rapidly diminishing.  I now wonder whether we have even five years before students will expect their books to be in electronic form--all their books, not just their hefty tomes for CHEM 101 that can be more nicely illustrated with iBooks Author, but also their books for English and History classes as well.  This is an “e-plus” world, where print will increasingly not be the norm but the supplement, filling whatever gaps e-books have not yet bridged, whatever textual landscapes have not yet been digitized.

Despite warnings, we aren’t yet ready for an e-plus world.  Not only do we not know how to operate the apps that make these books available, we don’t even know how to critically study books in tablet form.  Yet learning what forms of critical engagement are possible and necessary will be required.  I suspect, frankly, that our current methods developed out of what was made possible by the forms texts took, rather than the forms following our methodological urgencies.  This means that the look of critical study in the classroom will change radically in the next ten years.  What will it look like?