Category Archives: literature

Digitization and the fulfillment of the book

My colleague in the library here at Messiah College, Jonathan Lauer, has a very nice essay in the most recent Digital Campus edition of the Chronicle of Higher Education.  Jonathan makes an eloquent defense of the traditional book over and against the googlization and ebookification of everything.  He especially employs an extended metaphor drawn from the transition to aluminum bats in various levels of baseball to discuss his unease and reservations about the shift to electronic books and away from print that is profoundly and rapidly changing the nature of libraries as we’ve known them.  The essay is more evocative than argumentative, so there are a lot of different things going on, but a couple of Jonathan’s main points are that the enhancements we supposedly achieve with digitization projects come at a cost to our understanding of texts and at a cost to ourselves.

In the big leagues, wooden bats still matter. Keeping print materials on campus and accessible remains important for other reasons as well. Witness Andrew M. Stauffer’s recent Chronicle article, “The Troubled Future of the 19th-Century Book.” Stauffer, the director of the Networked Infrastructure for Nineteenth-Century Electronic Scholarship, cites several examples of what we all know intuitively. “The books on the shelves carry plenty of information lost in the process of digitization, no matter how lovingly a particular copy is rendered on the screen,” he writes. “There are vitally significant variations in the stacks: editions, printings, issues, bindings, illustrations, paper type, size, marginalia, advertisements, and other customizations in apparently identical copies.” Without these details, discernible only in physical copies, we are unable to understand a book’s total impact. Are we so easily seduced by the aluminum bat that we toss all wooden ones from the bat bag?

Let’s also acknowledge that our gadgets eventually program us. History teaches us that technologies often numb the very human capacities they amplify; in its most advanced forms, this is tantamount to auto-amputation. As weavers lost manual dexterity with their use of increasingly mechanized looms during the Industrial Revolution, so we can only imagine what effect GPS will have on the innate and learned ability of New York City cabbies to find their way around the five boroughs. Yet we practice auto-amputation at our own peril. We dare not abandon wooden bats for aluminum for those endeavors that demand prolonged attention, reflection, and the analysis and synthesis that sometimes lead to wisdom, the best result of those decidedly human endeavors that no gadget can exercise.

I have a lot of sympathy for Jonathan’s position; things like the revamping of the New York Public Library leave me with a queasy hole in my stomach.  I’ve had a running conversation with Beth Transue, another of our librarians, about our desire to start leading alumni tours of the world’s great libraries, but if we’re going to do so we had better get it done fast, because most of them won’t be around in a few more years, at least if the NYPL and its budgetary woes are anything to judge by.

At the same time, I think Jonathan overstates his case here.  I don’t think serious thinkers are assuming we’ll get rid of books entirely.  Although I think we are already living in what I’ve called an E-plus world, print will continue to be with us, serving many different purposes. Jason Epstein over at the NYRB has a blog on this fact, and prognosticating the likely future and uses of the traditional book seems to be a growth industry at the moment. I don’t think the average student is too terribly interested in the material textuality that Jonathan references above, nor for that matter is the average scholar, the vast majority of whom remain interested in what people wrote, not in how the publishers chose to package it.  But those issues will continue to be extremely important for cultural and social historians, and there will be some forms of work that can only be done with books.  Just as it is a tremendous boon to have Joyce’s manuscripts digitized, making them available to the general reader and to the scholar who cannot afford a trip to Ireland, those seeking authoritative interpretations of Joyce’s method, biography, and life’s work will still have to make the trip to Ireland to see the thing for themselves, to capture what can’t be captured by a high-resolution camera.

That having been said, who would say that students studying Joyce should avoid examining the digitized manuscripts closely because they aren’t “the genuine article”?  Indeed, I strongly suspect that even the authoritative interpretations of those manuscripts will increasingly be a commerce between examination of the physical object and close examination of digitized objects, since advanced DH work shows us time and time again that computerized forms of analysis can get at things the naked eye could never see.  So the fact that there are badly digitized copies of things in Google Books and beyond shouldn’t obscure the fact that there are some massively important scholarly opportunities here.

Jonathan’s second point is about the deeply human and quasi-spiritual aspects of engagement with traditional books that so many of us have felt over the years.  There’s something very true about this. It is also true that our technologies can result in forms of self-amputation.  Indeed, if we are to take that insight to heart, we need to admit that the technology of writing and reading is itself something that involves self-amputation.  Studies have shown that heavy readers alter their brains, and not always in a good sense.  We diminish the capacity of certain forms of memory, literally making ourselves absent-minded professors.  Other studies have suggested that persons in oral cultures have this capacity in heightened form, and some people argue that this generation is far more visually acute than those that preceded it, developing new abilities because of its engagement with visual texts.  So, indeed, our technologies alter us, and even result in self-amputation, but that is true of the traditional book as well as the internet.  This second point is Jonathan’s larger claim, since it attributes to traditional books as such a superiority in something central to humanity itself. I am intrigued by this argument that the book is superior for serious reflection and for the quasi-spiritual aspects of study that we have come to treat as central to the humanities.

I admit, I don’t buy it.

First, I admit that I’m just wary about attributing essential human superiorities to historical artifacts and practices.  Homer as a collection of oral songs is not inherently inferior to the scrolls in which those songs were first collected, nor to their apotheosis in book form.  We have come to think of the book as exhibiting and symbolizing superior forms of humanity, but it’s not clear that the book form triumphed in the West because of these attributes.  Indeed, traditional Jews and others clearly think the scroll remains the superior spiritual form even to this day.  Rather, the codex triumphed for a variety of complicated reasons.  Partly, Christian churches apparently wanted, for ideological reasons, to distinguish their own writings from the writings of the Jews.  There may have been some more substantive reasons as well, though that’s not entirely clear: Anthony Grafton points out that many of the Christian innovations with the codex seemed to focus on the desire to compare different kinds of texts side by side (an innovation, I will point out, at which the internet is in many ways easily superior).  The codex also triumphed not because it was spiritually and intellectually superior but because it was, frankly, more efficient, cheaper, and easier to disseminate than its scrolly ancestors.  One good example comes from the poet Martial, who explicitly ties the selling of his poetry in codex form to making it easily and efficiently accessible to the common person:  “Assign your book-boxes to the great, this copy of me one hand can grasp.”

The entire trend of book history has been toward this effort to make texts and what they contain more readily and easily available to more and more people.  From the early clay tablets to the mass market paperback that let you carry Plato in your hip pocket, the thrust of the book has been toward broader and broader dissemination, toward greater and greater ease of use, toward cheaper and cheaper accessibility.  The goal of writing, even when that writing was imprisoned in libraries that only the initiated could enter as in Umberto Eco’s The Name of the Rose, has been open access.

The digitization that is occurring now comes to fulfill the book, not destroy it.

Second, I guess I no longer believe fully in the spiritual or intellectual superiority of codex forms, simply because that belief doesn’t comport with my experience.  As I do more and more of my reading with my various e-readers, I find that I have serious, contemplative, analytical, and synthetic engagements with all kinds of texts, those hundreds of “pages” long and those not.  As I get used to the tools of various e-readers, there’s almost nothing accomplished with a traditional book that can’t be accomplished in some way on an e-reader.  Although I interact with texts differently now in a spatial sense, I am able to take fuller and more copious notes, I am able to mark texts more easily, and if I can’t quite remember where something was in the book I can use a search engine to find not only a specific phrase or topic but every single instance of that topic in the book.  Moreover, because every text represents an act of contemplation on and conversation with other texts, I can at the touch of a screen go and read for myself the interlocutors embedded within a book, just as those interested in Jonathan’s essay can touch my link above and decide for themselves whether I am reading him fairly.  Thus there are very obviously some ways in which e-readers are superior for serious analytical and interpretive readings of texts, or at least equal to them.
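
To make that search capacity concrete: here is a minimal sketch, in Python, of the kind of exhaustive search an e-reader performs when it returns every instance of a phrase. The file name and search term are hypothetical stand-ins, not any particular reader’s implementation.

```python
import re

def find_all(text, phrase):
    """Return (position, snippet) pairs for every occurrence of a phrase."""
    pattern = re.compile(re.escape(phrase), re.IGNORECASE)
    results = []
    for m in pattern.finditer(text):
        # Grab thirty characters of context on either side of the hit.
        start = max(m.start() - 30, 0)
        end = min(m.end() + 30, len(text))
        results.append((m.start(), text[start:end].replace("\n", " ")))
    return results

# Hypothetical plain-text edition; any e-book's extracted text would do.
with open("ulysses.txt", encoding="utf-8") as f:
    book = f.read()

for pos, snippet in find_all(book, "metempsychosis"):
    print(f"{pos:>8}: ...{snippet}...")
```

No index at the back of a paper book, however lovingly compiled, returns every hit with its surrounding context in an instant; that is the mechanical heart of the advantage I am describing.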

All this having been said, I will say that there remains one way in which I find the traditional paper book clearly superior to the e-book, and that has to do with my ability to make it mine.

I spoke a couple of days ago about the personal connection I felt to Kierkegaard in rereading him and discovering my many years of underlines, highlights, and marginalia.  I even confess that I read Kimi Cunningham Grant’s new memoir on my iPad, but I still bought a hard cover at the reading–not because I thought I would be able to analyze it more effectively in hard cover, but because I wanted her to sign it for me.

This is a personal connection to the book that isn’t unimportant, but that is about my personal biography, and Kimi’s.  It’s not about the text, and frankly I doubt it will in the long run even be about literary history.  Some literary archivist somewhere is collecting all the shared comments on the Kindle version of Kimi’s book, and that massive marginalia will be fodder for some graduate student’s dissertation in a few decades.

I pity the poor graduate student who decides on such a project. But at least she won’t have to strain her eyes to decipher the handwriting.

Kimi Cunningham Grant on Japanese American Experience

Last month I had the chance to hear Messiah alum Kimi Cunningham Grant read from her work and talk at The Midtown Scholar Bookstore with poet Julia Spicher Kasdorf about her new book, Silver Like Dust. The conversation was wide-ranging, touching on elements of the writing process, on issues associated with Japanese American history and experience, and on Kimi’s own grappling with her identity as a bi-racial writer whose connections to her Japanese American heritage have grown over the years of interviewing her grandmother and writing the book. I still have a blog post vaguely planned related to Kimi’s book; I think there are some really interesting things in there about books and reading. But for now I’m just happy to share this podcast recorded by the folks down at the Midtown. Catherine Lawrence and Eric Papenfuse do a tremendous amount for our community and for the world of books and writing generally. I’m glad we’ve got this record of a good event.

Kimi Cunningham Grant at the Midtown with Julia Spicher Kasdorf

Takeaways–NITLE Seminar: Undergraduates Collaborating in Digital Humanities Research

Yesterday afternoon at 3:00 about 30 Messiah College humanities faculty and undergraduates gathered to listen in on and virtually participate in the NITLE Seminar focusing on Undergraduates Collaborating in Digital Humanities Research.  A number of our faculty and students were tweeting the event, and a Storify version with our contributions can be found here.  I am amazed and gratified to have such a showing late on a Friday afternoon.  Students and faculty alike were engaged and interested by the possibilities they saw being pursued in undergraduate programs across the country, and our own conversation afterwards extended for more than a half hour beyond the seminar itself. Although most of us freely admit that we are only at the beginning and feeling our way, there was a broad agreement that undergraduate research and participation in Digital Humanities work was something we needed to keep pushing on.

If you are interested in reviewing the entire seminar, including chat room questions and the like, you can connect through this link.  I had to download WebEx in order to participate in the seminar, so you may need to do the same, even though the instructions I received said I wouldn’t need to.  My own takeaways from the seminar were as follows:

  • Undergraduates are scholars, not scholars in waiting.  If original scholarship is defined as increasing the fund of human knowledge, discovering and categorizing and interpreting data that helps us better understand human events and artifacts, developing tools that can be employed by other scholars who can explore and confirm or disconfirm or further your findings, these young people are scholars by any definition.
  • Digital Humanities research extends (and, to be sure, modifies) our traditional ways of doing humanities work;  it does not oppose it.  None of these young scholars felt inordinate tensions between their traditional humanities training and their digital humanities research.  A student who reviewed a database of 1,000 Russian folk tales extended and modified the understanding she had arrived at by the close reading of a dozen.  Digital Humanities tools enable closer reading and better contextual understanding of the poet Agha Shahid Ali, rather than pushing students away into extraneous material.
  • Many or most of these students learned their tools as they went along, within the context of what they were trying to achieve.  I was especially fascinated that a couple of the students had had no exposure to Digital Humanities work prior to their honors projects, and they learned the coding and digital savvy they needed as they went along.  Learning tools within the context of how they are needed seems to make more and more sense to me.  You would not teach a person how to use a hammer simply by giving them a board and nails, at least not if you don’t want them to get bored.  Rather, give them something to build, and show or have them figure out how the hammer and nails will help them get there.

I’m looking forward to the Places We’ll Go.

What is an education for? Remembering the American Revolution

History can remind us of just how expansive our ancestors could be, and how foreshortened our own vision has become.  One thing that makes our current discussion of higher education so difficult is the dramatic impoverishment of the range of our discourse about educational purposes: the narrower our frame of reference the more cramped our imagination, the more limited our creative responses to crisis, and the fewer our possible options.

Geoffrey Galt Harpham begins his sixth chapter with a citation from John Adams.

I must study Politicks and War that my sons may have liberty to study Mathematicks and Philosophy.  My sons ought to study Mathematicks and Philosophy, Geography, natural History, naval Architecture, navigation, Commerce and Agriculture, in order to give their Children a right to study Painting, Poetry, Musick, Architecture, Statuary, Tapestry and Porcelaine.

Of this particular citation and others like it, Harpham goes on to say,

[It] is worth recalling that once upon a time the ruling class–which had also been the revolutionary class–imagined that they were risking their lives, their fortunes, and their sacred honor in behalf of a futurity where what would come to be called the humanities would dominate the concerns of the citizenry.  The humanities, they felt, would represent the crowning achievement of a nation that, having prevailed in war, would build its new society on a foundation of such economic, political, military, and social security that citizens could enrich their lives by turning their attention to the study and appreciation of material and textual artifacts…Adams, Jefferson, and others believed that a general concern for the humanities represented not only the best possible future for the new nation but also the natural progression of mankind, if freed from fear and want.

We are, of course, a long way from that vision now, our educational vision cramped by a cultural imagination that extends no further than security–economic security first and foremost, but other kinds of security as well.  The quest for security leads fathers to discourage their sons’ interest in poetry and philosophy and to insist that they study business, or leads other students to declare as education majors so they “have something to fall back on.”  It’s worth noting that Adams spoke in a period far more precarious and insecure for the American republic than anything we face today, and so our current obsessive insistence that education be about employment first and always seems spiritually and ethically… empty.  In the midst of a national experiment that could still have failed, Adams was able to imagine that work existed for the higher purposes of education, rather than education existing for the “practical” purposes of work.

Not that there was no debate between advocates for what is now called professional education and what we continue to call the liberal arts.  It was, in some respects, ever thus, even if it seems more thus now than ever. Harpham points out that John Locke was a philosopher in favor of what we now call professional education and dismissive of the preciousness of the liberal arts.  Harpham also points out that it is a good thing the Lockes of the world did not win the argument and the Adamses did since no one would now be reading either one were it not for the continuing if weakened importance of a liberal arts education.

However,  I think there’s an irony in Adams’s formulation (and in Harpham’s appreciation of it) since it seems to assume that fear and want are defined qualities that can be addressed, finite needs that can be satisfied.  We live in a society that in some respects makes a living off the generation and regeneration of fear–the beneficiaries being our massive security industries–the prisons, the military, homeland security, gated communities, home security systems, and on and on.  We are also a culture defined by the generation of want rather than its satisfaction.  As much as I admired Steve Jobs, Apple is a company built on the generation of desire for things people never knew they wanted, and the iconic Apple is one small mythic reminder of the infinite allure of the new product hanging like fruit from the lowest shelf.

The irony of Adams’s formulation is that there is never any end of want, and our insatiable desires generate, at a minimum, the ongoing fear that we will somehow lose track of all our baubles or have them taken from us.  And our fundamental fears for our children have to do with the fear that they will have fewer baubles than we have.  And so, finally, if want and fear are potentially never-ending–like the wars that Adams feels compelled to study–what room is ever left for those higher human ideals that Adams deferred for himself? I think he deferred them unknowingly for his sons and daughters and granddaughters and grandsons as well. Are they not deferred always, if we begin with the belief that security is the means and education comes only at the end? In the world we have created we will never be secure enough for the poetry and philosophy that Adams at least desired for his progeny.

A couple of years ago I tried to think through my own rationale for the purposes of education.  You can listen to it here as you have interest:  Convocation Address: Education for Praise

Can a liberal arts ethos and a professionalized research faculty co-exist?

I’ve been reading Geoffrey Galt Harpham’s recent book, The Humanities and the Dream of America, whiling away the hours on my exercise bike.  Ok, half hours.  Having just finished Andrew Delbanco’s meditation on the nature of the college a week or so ago, I’ve been struck by the similarities between Harpham’s analysis and Delbanco’s account of the college as a distinctly American achievement.

For both, the fundamental antagonist to the ideals of the liberal arts has not been professional programs per se–though undergraduate programs in things like business, nursing, and engineering (and a host of others) are favorite bêtes noires of our current humanistic discourse about the purposes of education.  Rather, for both the real threat lies in the research university and the ethos of specialization that it entails.  This emphasis on specialized knowledge is itself inherently narrowing, opposed to the generous expansiveness of spirit that, at least in theory, characterizes the highest ideals of a liberal arts education.

Like Delbanco, Harpham draws on Newman as a first resource for the fully articulated ideal that education should enrich our human experience, fitting us primarily for a life well lived rather than for the specifics of a living.  I’m intrigued, though, that Harpham brings out this ethos not only as characterizing the curricular choices and the spiritual, ethical, and cultural teloi of the undergraduate [side note: we would no longer use a word like teloi; we would invoke learning objectives];  more than that, this ethos characterizes the faculty and their understanding of their role in the life of the mind.

Moreover, the institution devoted to producing Newman’s protege bears little resemblance to today’s institutions of higher learning. Newman’s idea of a university had no room for either specialized or comprehensive knowledge, and the professors who taught there should, he fervently believed–it is the very first statement in the book–have nothing to do with research, an obsessive activity best conducted by those whose minds were too sadly contracted to engage in full life. …

Newman intuitively understood that the real threat to liberal education in his own time was not the shade of Locke but the spirit of research then crescent in Germany. By the end of the nineteenth century, the United States had begun to Germanize its academy, with some university faculties organizing themselves into departments that granted degrees that came to constitute credentials. With credentialing came professionalism, and with professionalism growth. President Lawrence Lowell of Harvard (1909-33) laid the foundation for Harvard’s eminence as a research powerhouse by encouraging his faculty to think of themselves as professionals. This meant adopting impersonal criteria of scholarly competence within each discipline, cultivating a spirit of empirical and methodological rigor, and coming to agreement about what would count as standards of achievement in a small, self-certifying group of mandarins.

The new professionalism functioned as a way of insulating the university from extra-academic pressures by creating a separate world of academe that could be judged only by its own standards. But professionalism was accompanied by its dark familiars, partition and competition. A professionalized faculty was a faculty divided into units, each claiming considerable autonomy in matters of hiring and promotion, and competing with other units for salaries, students, office space, and prestige. Such competition naturally placed some stress on amity, and so while undergraduates were expected to enjoy four stress-free years in the groves of academe, the faculty in the same institutions were facing the prospect of going at it hammer and tong, competing with each other virtually face to face, for the rest of their lives.

Geoffrey Galt Harpham. The Humanities and the Dream of America (pp. 127-128). Kindle Edition.

This seems about right to me, and it might pass without comment, but it raises a troubling contradiction.  Among the strong defenses of the liberal arts that I hear, the notion that faculty should abandon research, or that the fullness of the liberal arts spirit is best embodied by a faculty full of generalists, never appears.  Indeed, quite the opposite.  We want an institution to display its commitment to the liberal arts precisely by giving more support to faculty research, so that faculty can become more and more specialized, more fully recognized in their professional disciplines, more disconnected from the immediacy of their liberal arts institutions, and less able to exemplify the generalist qualities that we say we value as fully humanizing.

What this argument for the liberal arts wants (I’m not saying there aren’t other arguments) is a research university, or at least a research college, with a commitment to research in the liberal arts and all the prestige that entails.  What we definitely do not want to do is give up the right to write books that no one wants to read so that we can demonstrate our particularly specialized knowledge.

The faculty, as Harpham recognizes, is fully professionalized, and in many respects in the liberal arts we have created specialized professional programs, imagining that our students are professors in embryo.  The college major is now, in many respects, a professional program, and it is worth noting that the idea of a college major is coextensive with the advent of the research university.  Indeed, I have heard the argument that we should have much larger majors in the humanities than we now have because the size of the major is a measure of its prestige in the college, competing as it would with the gargantuan programs in engineering, nursing, and many of the hard sciences, programs that students can barely finish in four years, if they can.  So much for our sentimental sop about the value of breadth of mind and spirit.

Can a research faculty that shows no real interest in giving up the ideals of research exemplify and support a genuine liberal arts ethos in an American college today (leaving aside the question of whether liberal arts colleges will survive at all)? I am not sure what the route out of this conundrum actually is.  I stopped in the middle of Harpham’s chapter where he actually has just noted that faculty in the liberal arts are essentially professionals and conceive of themselves as professionals in ways quite similar to their brethren in professional programs.  I am not sure where he is going with that insight, but I look forward to finding out.

Distanced and Close Reading in literary study: Metaphors for love

I am old enough now to begin sentences with the phrase “I am old enough…”  Seriously, though, I am old enough now to feel like I have lived through one revolution, into a new orthodoxy, and now the experience of a new revolution in literary studies.  In the ongoing debates I hear about the digital humanities versus whatever other kind of humanities happens to be at hand, I keep having this vertiginous sense of déjà vu, as if I’m hearing the same arguments I heard two decades ago, but transposed into a key just different enough that I can’t tell whether today’s debates are mere variations on a theme or some genuinely new frame of discourse.

The song that I think is remaining the same is the divide between the proponents of what gets called “distanced reading,” which in some hands is a shorthand for all things digital humanities (if it’s digital, it must be distanced, as compared to the human touch of paper, ink, and typewriters–how the industrial period came to be the sign and symbol of all things human and intimate I am not entirely clear), and close reading, which is somehow taken to be THE form of intimate human contact with the text.

This division is exemplified in Stanley Fish’s recent essay on the digital humanities in the New York Times, an argument that has the usual whiff of caustic Fishian insight leavened with what I take to be a genuine if wary respect for what he sees in the practices of distanced reading.  Nevertheless, for Fish, it is finally close reading that is genuinely the work of the humane critic devoted to intimacy with the text:

But whatever vision of the digital humanities is proclaimed, it will have little place for the likes of me and for the kind of criticism I practice: a criticism that narrows meaning to the significances designed by an author, a criticism that generalizes from a text as small as half a line, a criticism that insists on the distinction between the true and the false, between what is relevant and what is noise, between what is serious and what is mere play. Nothing ludic in what I do or try to do. I have a lot to answer for.

Ironically, in an earlier period it was Fish and precisely this kind of close reading (as practiced by deconstructionists) that was decried for its lack of seriousness, for the way it removed literature from the realm of human involvement and into the play of mere textuality.  By contrast, the distanced readers in those days imagined themselves as defenders of humanity (or, since humanism was a dirty word, at least defenders of the poor, the downtrodden, the miserable, the huddled masses).  Historicism read widely and broadly in the name of discourse, and proclaimed itself a liberating project, ferreting out the hidden political underbelly in a multitude of texts and considering literary criticism to be an act of responsible justice-seeking over and against the decadent jouissance-seekers of post-structuralism.

A recent blog post by Alex Reid takes up this same criticism of what he describes as the Close Reading industry, arguing for the ways digitization can free us from the tyranny of the industrialized close reader:

In the composition classroom, the widgets on the belt are student papers. If computers can read like people it’s because we have trained people to read like computers. The real question we should be asking ourselves is why are we working in this widget factory? And FYC essays are perhaps the best real world instantiation of the widget, the fictional product, produced merely as a generic example of production. They never leave the warehouse, never get shipped to market, and are never used for anything except test runs on the factory floor. 

In an earlier period, it was again the close-readers who were accused of being mechanistic, dry, and scientific as putatively more humanistic readers accused New Critics of an unfeeling scientism in their formalist attitude toward the text, cutting out every human affect in the quest for a serious and scientific study of literature.

I wonder, at root, whether this is the controlling metaphor, the key in which all our tunes in literary and cultural studies are played: a quest for the human that is not merely scientific, and yet an unrepressed desire for the authority of the scientist–to say things with security, to wear the mantle of authority that our culture apparently believes only a statistical method can endow.

It is probably a mark against my character that I tend to be a both/and pragmatist as a thinker.  I do not buy the notion that distanced reading is inconsequential, or somehow less about truth or less serious than the close rhetorical readings that Fish invokes.  At the same time, I am not too given to the euphoric and pugnacious challenges that can sometimes characterize digital humanities responses to the regnant forms of literary criticism.  At their best, Fishian forms of close reading are endowed not simply with acute attention, but with attention that seems to give birth to a form of wisdom that only attentiveness and close examination can provide, the kind of insistent close reading that led Gerard Manley Hopkins to seek the “inscape” of individual instances beyond categories, rather than simply the ways in which individuals fit into the vast landscapes popular in his post-romantic period.

I was reminded of this need to attend to the close properties of the individual use of language again in a recent article on Chaucer in the Chronicle. The writer attends to the detail of Chaucer’s language in a way that seems to reveal something important about the ways in which we are human.

translating Chaucer is like translating any other foreign language: The words are different from one language to the next. And then comes the third category, the most fascinating and the most aggravating because it is the trickiest: the false cognates, words that look like they should mean what they do in Modern English, but don’t. False cognates are especially aggravating, and fascinating when they carry their Middle and Modern English meanings simultaneously. These are exciting moments, when we see, through a kind of linguistic time-lapse photography, Chaucer’s language on its way to becoming our own.

In Middle English, for instance, countrefete means “to counterfeit,” as in “to fake,” but it also has the more flattering meaning of “to imitate.” Corage has not only the Modern English sense of bravery but also, frequently, overtones of sexual energy, desire, or potency. Corage takes its roots from the word coeur, or “heart,” and transplants them slightly southward. The same is true for solas, or “solace.” The “comfort,” “satisfaction,” or “pleasure” it entails is often sexual.

Lust might seem to pose no problem for the modern reader. Yet in the 14th century, the word, spelled as it is today, could mean any kind of desire or pleasure, though around that time it was beginning to carry a sexual connotation, too. And lest it seem as if false cognates always involve sex, take sely, or “silly.” It most often means “blessed” or “innocent,” as well as “pitiful” and “hapless,” but “foolish” was making its way in there, too.

A sentence like “The sely man felte for luste for solas” could mean “The pitiful man felt desire for comfort.” It could just as likely mean: “The foolish man felt lust for sex.” In Chaucer’s hands, it could mean both at once.

Chaucer was fully aware of the slipperiness of language. He delights in it; he makes his artistic capital from it. He is an inveterate punster. The Wife of Bath, for example, repeatedly puns on the word queynte (eventually the Modern English “quaint”). In the 14th century, the word means not only “curious” or “fascinating” but also the curious part of her female anatomy that most fascinates her five husbands. What’s more, the slipperiness of language gives Chaucer the tools to form his famous irony and ambiguity. If the way-too-pretty Prioress is “nat undergrowe” (“not undergrown”), how big is she?


These kinds of particularities of language are the worthy objects of our attention as literary scholars.  At the same time, I do not think we need say that distanced reading plays no role in our understanding of such peculiarities.  A Chaucer project on the order of the Homer Multitext might actually deepen and multiply our understanding of Chaucer’s slipperiness and originality.  And vast database-driven analyses of every text written within a hundred years of Chaucer might allow us to discover the kinds of linguistic sources he was drawing on and manipulating anew for his own purposes; they might show us new creativities we had not imagined, or they might show us that things we had taken to be unique were fairly common stock in trade.
These kinds of knowledges could not be derived from a contest between methods, but only from a reading marked by attentiveness, skill, and desire, one willing to draw on any resource to understand what one wishes to know–which used to be a metaphor for love.
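
To give one concrete, admittedly toy, version of what such a database-driven reading might look like: a minimal sketch, assuming a hypothetical directory of plain-text Middle English transcriptions, that asks whether a word like queynte is Chaucer’s peculiar property or common stock in the surrounding corpus.

```python
from collections import Counter
from pathlib import Path
import re

def word_counts(path):
    """Count word forms in one plain-text transcription."""
    text = Path(path).read_text(encoding="utf-8").lower()
    return Counter(re.findall(r"[a-z]+", text))

# Hypothetical corpus directory: Middle English texts, one file apiece.
corpus = {p.name: word_counts(p) for p in Path("middle_english_corpus").glob("*.txt")}

term = "queynte"
hits = {name: counts[term] for name, counts in corpus.items() if counts[term]}
print(f"'{term}' attested in {len(hits)} of {len(corpus)} texts")
for name, n in sorted(hits.items(), key=lambda kv: -kv[1]):
    print(f"  {name}: {n}")
```

A real project would need lemmatization and spelling normalization that Middle English badly complicates, but even this crude attestation count illustrates the kind of question distant reading can put to the archive that no single close reader could answer alone.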

Is Twitter the future of fiction? Micro-prose in an age of ADD

As I’ve mentioned, I’ve been struck by Alex Juhasz’s pronouncement at the Re:Humanities conference that we must learn what it means to write for an audience that is permanently distracted.  In response, I put up a Facebook post: “We need a rhetoric of the caption. A hermeneutic of the aphorism. Haiku as argument.”  My Provost at Messiah College–known for thorough and intricate argument–left a comment “I’m Doomed.”

Perhaps we all are, those of us who are more Faulkneresque than Carveresque in our stylistic leanings.  This latest from GalleyCat:

R.L. Stine, the author of the popular Goosebumps horror series for kids, gave his nearly 49,000 Twitter followers another free story this afternoon. To celebrate Friday the 13th, the novelist tweeted a mini-horror story called “The Brave One.” We’ve collected the posts below for your reading pleasure.

via R.L. Stine Publishes ‘The Brave Kid’ Horror Story on Twitter – GalleyCat.

Ok, I know it’s a silly reach to put Stine and Faulkner in the same paragraph, and to be honest I found Stine’s story trite.  On the other hand, I do think it’s obvious we’re now in an age wherein shorter prose with bigger impact may be the necessity.  Flash fiction is growing, and we can witness the immense popularity of NPR’s three-minute fiction contest.  These forms of fiction, and of writing in general, speak to the necessities of an art of the moment, rather than the art of immersion.  Literature, and prose in general, is ALWAYS responsive to the material and cultural forms of its own moment, and I think prose that is short and explosive, or prose that pierces beneath the surface of the reader’s psyche in a moment only to spread and eat its way into the unconscious when the moment of reading is long forgotten, is most likely the prose that is the order of the day.

BUT…Stine certainly doesn’t do it for me.  I don’t know a lot about Twitter fiction.  Is there any really good stuff out there on Twitter–as opposed to flash fiction written in a standard format, which I know more about? Or is it all carney-style self-promotion or unrealized theory at the moment?

[And what, I wonder, does this mean for the future of academic prose as well?  I’m a latecomer to Twitter myself, but I’ve been a little fascinated with the academic discourse that can occur there–but more on that some other time.]

Our Data, Our Selves: Data Mining for Self-Knowledge

If you haven’t read Gary Shteyngart’s Super Sad True Love Story, I would encourage you to go, sell all, buy, and do so.  I guess I would call it a dystopian black-comedic satire, and at one point I would have called it futuristic.  Now I’m not so sure.  The creepy thing is that about every other week there’s some new thing I notice that makes me say to myself, “Wow–that’s right out of Shteyngart.”  This latest from the NYTimes is another case in point.  The article traces the efforts of Stephen Wolfram to use his immense collection of personal data, from the records of his e-mail to the keystrokes on his computer, to analyze his life for patterns of creativity, productivity, and the like.

He put the system to work, examining his e-mail and phone calls. As a marker for his new-idea rate, he used the occurrence of new words or phrases he had begun using over time in his e-mail. These words were different from the 33,000 or so that the system knew were in his standard lexicon.

The analysis showed that the practical aspects of his days were highly regular — a reliable dip in e-mail about dinner time, and don’t try getting him on the phone then, either.

But he said the system also identified, as hoped, some of the times and circumstances of creative action. Graphs of what the system found can be seen on his blog, called “The Personal Analytics of My Life.”

The algorithms that Dr. Wolfram and his group wrote “are prototypes for what we might be able to do for everyone,” he said.

The system may someday end up serving as a kind of personal historian, as well as a potential coach for improving work habits and productivity. The data could also be a treasure trove for people writing their autobiographies, or for biographers entrusted with the information.
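
Wolfram’s “new-idea rate” marker is simple enough to sketch. Here is a minimal, hypothetical version in Python: given timestamped messages and a known standard lexicon, count the words that appear for the first time in each month. The data and lexicon below are invented stand-ins for decades of archived e-mail.

```python
from collections import defaultdict
import re

def new_word_rate(messages, lexicon):
    """messages: list of (month, text) pairs; lexicon: known standard vocabulary.
    Returns the count of never-before-seen words introduced in each month."""
    seen = set(lexicon)
    rate = defaultdict(int)
    for month, text in sorted(messages):
        for word in set(re.findall(r"[a-z']+", text.lower())):
            if word not in seen:
                seen.add(word)   # a word only counts as new once
                rate[month] += 1
    return dict(rate)

# Hypothetical messages standing in for a real e-mail archive.
messages = [
    ("1994-03", "Cellular automata again today."),
    ("2009-05", "Wolfram Alpha launches today."),
]
lexicon = {"cellular", "automata", "again", "today", "launches"}
print(new_word_rate(messages, lexicon))  # {'2009-05': 2}
```

The interesting analytical work, of course, lies in deciding which first appearances mark new ideas and which mark mere noise; the counting itself is trivial.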

Wolfram’s project is eerily like the processes in Shteyngart’s novel, whereby people have data scores that are immediately readable by themselves and others, and the main character obsesses continuously over the state of his data and judges the nature and potential of his relationships on the basis of the data of others.

Socrates was the first, I think, to say the unexamined life was not worth living, but I’m not entirely sure this was what he had in mind.  There is a weird distancing effect involved in this process by which we remove ourselves from ourselves and look at the numbers.

At the same time, I’m fascinated by the prospects, and I think it’s not all that different from the idea of “distanced reading” that is now becoming common in certain Digital Humanities practices in literature: analyzing hundreds or thousands of novels instead of reading two or three closely, in order to understand through statistical analysis the important trends in literary history at any particular point in time, as well as the way specific novels might fit into that statistical history.

Nevertheless, a novel isn’t a person.  I remain iffy about reducing myself to a set of numbers I can work to improve, modify, analyze, and interpret.  The examined life typically leads not to personal policies but to a sense of mystery: how much there is that we don’t know about ourselves, how much there is that can’t be reduced to what I can see or what I can count.  If I could understand my life by numbers, would I?

For your edification I include the book trailer for Shteyngart’s novel below.

Is the decline of the Humanities responsible for Wall Street Corruption?

Andrew Delbanco’s view in a recent Huffington Post essay is “Yes”, at least to some indeterminate degree, though I admit that the title of his post “A Modest Proposal” gave me some pause given its Swiftian connotations:

What I do know is that at the elite universities from which investment firms such as Goldman Sachs recruit much of their talent, most students are no longer seeking a broad liberal education. They want, above all, marketable skills in growth fields such as information technology. They study science, where the intellectual action is. They sign up for economics and business majors as avenues to the kind of lucrative career Mr. Smith enjoyed. Much is to be gained from these choices, for both individuals and society. But something is also at risk. Students are losing a sense of how human beings grappled in the past with moral issues that challenge us in the present and will persist into the future. This is the shrinking province of what we call “the humanities.”

For the past twenty years, the percentage of students choosing to major in the humanities — in literature, philosophy, history, and the arts — has been declining at virtually all elite universities. This means, for instance, that fewer students encounter the concept of honor in Homer’s Iliad, or Kant’s idea of the “categorical imperative” — the principle that Mr. Smith thinks is out of favor at Goldman: that we must treat other people as ends in themselves rather than as means to our own satisfaction. Mr. Smith was careful to say that he was not aware of anything illegal going on. But few students these days read Herman Melville’s great novella, Billy Budd, about the difficult distinction between law and justice.

Correlation is not cause, and it’s impossible to prove a causal relation between what students study in college and how they behave in their post-college lives. But many of us who still teach the humanities believe that a liberal education can strengthen one’s sense of solidarity with other human beings — a prerequisite for living generously toward others. One of the striking discoveries to be gained from an education that includes some knowledge of the past is that certain fundamental questions persist over time and require every generation to answer them for itself.

via Andrew Delbanco: A Modest Proposal.

This is consonant with Delbanco’s thesis–expressed in his book College: What It Was, Is, and Should Be–that education in college used to be about the education of the whole person but has gradually been emptied of the moral content of its originally religious, and more broadly civic, vision; the preprofessional and pecuniary imagination has become the dominant if not the sole rationale for pursuing an education. I am viscerally attracted to this kind of argument, so I offer a little critique rather than cheerleading.  First, while I do think it’s the case that an education firmly rooted in the humanities can provide for the kinds of deep moral reflection that forestall a purely instrumentalist view of our fellow citizens–or should I say consumers–it’s also very evidently the case that people with a deep commitment to the arts and humanities descend into moral corruption as easily as anyone else.  The deeply felt anti-Semitism of the dominant modernists would be one example, and the genteel and not-so-genteel racism of the Southern Agrarians would be another.  When Adorno said that there was no poetry after Auschwitz, he was only partly suggesting that the crimes of the twentieth century were beyond the redemption of literature;  he also meant more broadly that the dream that literature and culture could save us was itself a symptom of our illness, not a solution to it.  Delbanco might be too subject to this particular dream, I think.

Secondly, I think that this analysis runs close to blaming college students for not majoring in or studying more in the humanities, which is a little bit akin to blaming the victim–these young people have inherited the world we’ve given them, and we would do well to look at ourselves in the mirror and ask what kind of culture we’ve put in place that would make the frantic pursuit of economic gain, in the putative name of economic security, a sine qua non in the moral and imaginative lives of our children.

That having been said: yes, I do think the world–including the world of Wall Street–would be better if students took the time to read Billy Budd or Beloved, wedging it in somewhere between the work-study jobs, appointments with debt counselors, and multiple extracurriculars and leadership conferences that are now a prerequisite for a job after college.

Kurt Vonnegut, Zombie Author–Or, I am the hype that I descry

I picked up via Twitter yesterday that a Kurt Vonnegut novella that was twice rejected by major magazines has been published for the first time as an Amazon Single.  Dutifully, I downloaded the book, and it is now available via the Kindle app on my iPad, ready for reading. Perhaps a review will be in the offing if I can get around to it. (If I would blog less, I’m sure I would read more.)  The ways in which this leaves me feeling strange and uneasy require a catalogue.

1.  If the book was rejected multiple times, and Vonnegut in his later life never chose to publish it when he most assuredly could have, as an e-book or otherwise, why should I buy it now?  Is it because “it’s a Vonnegut,” and therefore worthy of my time?  I think this must not be true, since in the end Vonnegut didn’t think it was even worth his time.  Am I somehow trading in a cult of celebrity in which the Vonnegut industry keeps pumping out the undead wisdom–even if the quality of the book might end up being something akin to a black velvet painting of Elvis sold on the roadside beside a 7-Eleven in Arkansas?  Somehow I think here of Foucault’s inquiry into what exactly constitutes the work of an author.  I always kind of smiled at the question of whether an author’s laundry list is a part of his or her work.  I am now wondering whether Vonnegut’s laundry lists will be imaged and sold online as Amazon Singles.

2. I haven’t read half of the Vonnegut that Vonnegut himself thought was worth publishing.  Why should I purchase this latest for $1.99?  Because it’s easy and I didn’t even have to leave my couch to do it?  On the other hand, I have wasted a good bit more money than that on impulse purchases of literary magazines that now serve landfills, or perhaps could be fodder for Liz Laribee’s latest art project (the latter a worthy demise, I should say).

3.  Should I really enrich Rosetta Books and Amazon.com in pursuit of Vonnegut’s ghost?

4. Is there a problem with the fact that the internet erases distinctions between ephemera and things of “enduring value”?  What is the nature of “enduring value” when essentially everything can endure on an equal plane, and theoretically into eternity?  (I know my colleague Samuel Smith thinks the internet is more ephemeral than a paper book, but I have my doubts.  If we really get to the point of apocalypse at which all our digital resources are unavailable through some massive destruction of the grid, we won’t be reading our paper books either.  We’ll be burning them in our fireplaces.  Or cooking them in our soups.)  The intellectual world is flat, and if Rosetta Books had not chosen to publish the work, some enterprising graduate student with a scanner and an email account could have.

5.  Is it a problem that in writing this blog post–linking to websites, retweeting GalleyCat missives, posting to Facebook–I am flogging a book that I haven’t even read, becoming part of the industrial-internet-publishing complex that makes Kurt Vonnegut a Zombie Author who continues to fascinate and destroy?  I am the hype that I descry.