Category Archives: Books

Book futures, or, apocalypse now.

I led a discussion in the Adult Forum down at St. Stephens last Sunday in which I suggested that apocalyptic literature falls into basically two types: apocalyptic nihilism and apocalyptic redemption, the one seeing an end to everything, the other seeing destruction as the necessary precursor to renewal. There’s an awful lot of apocalypticism out there about the book these days, a good bit of it just assuming the book is going to hell in a handbasket.

I mentioned in my post earlier today that prognosticating the future of the book seems to be a growth industry. Indeed, we devoted an entire symposium to it here at Messiah College. Besides the recent articles I mentioned earlier from my colleague Jonathan Lauer, and another by Jason Epstein, I ran across this from John Thompson at the Huffington Post. A lot of it was the usual and obvious grist for the blogging mill, but I was intrigued by his final point, that the death of our current models for book production and dissemination may well lead to a flourishing of smaller independent publishing concerns:

Seventh, small publishing operations and innovative start-ups will proliferate, as the costs and complexities associated with the book supply chain diminish, and threats of disintermediation will abound, as both traditional and new players avail themselves of new technologies and the opportunities opened up by them to try to eat the lunch of their erstwhile collaborators.

This strikes me as a plausible idea, and an exciting one. Although Anthony Grafton lamented the loss of the demanding professional editor, I think there are an awful lot of talented, creative people out there who could bring new energy and innovation to the world of ebooks and print books alike. Fifty years from now we might look back and see this moment as one that heralded a new beginning for the book rather than its demise. If that’s not just so much rose-colored glasses.

Digitization and the fulfillment of the book

My colleague in the library here at Messiah College, Jonathan Lauer, has a very nice essay in the most recent Digital Campus edition of the Chronicle of Higher Education.  Jonathan makes an eloquent defense of the traditional book over and against the googlization and ebookification of everything.  He employs an extended metaphor drawn from the transition to aluminum bats in various levels of baseball to register his unease and reservations about the shift to electronic books and away from print that is profoundly and rapidly changing the nature of libraries as we’ve known them.  The essay is more evocative than argumentative, so there are a lot of different things going on, but a couple of Jonathan’s main points are that the enhancements we supposedly achieve with digitization projects come at a cost to our understanding of texts and at a cost to ourselves.

In the big leagues, wooden bats still matter. Keeping print materials on campus and accessible remains important for other reasons as well. Witness Andrew M. Stauffer’s recent Chronicle article, “The Troubled Future of the 19th-Century Book.” Stauffer, the director of the Networked Infrastructure for Nineteenth-Century Electronic Scholarship, cites several examples of what we all know intuitively. “The books on the shelves carry plenty of information lost in the process of digitization, no matter how lovingly a particular copy is rendered on the screen,” he writes. “There are vitally significant variations in the stacks: editions, printings, issues, bindings, illustrations, paper type, size, marginalia, advertisements, and other customizations in apparently identical copies.” Without these details, discernible only in physical copies, we are unable to understand a book’s total impact. Are we so easily seduced by the aluminum bat that we toss all wooden ones from the bat bag?

Let’s also acknowledge that our gadgets eventually program us. History teaches us that technologies often numb the very human capacities they amplify; in its most advanced forms, this is tantamount to auto-amputation. As weavers lost manual dexterity with their use of increasingly mechanized looms during the Industrial Revolution, so we can only imagine what effect GPS will have on the innate and learned ability of New York City cabbies to find their way around the five boroughs. Yet we practice auto-amputation at our own peril. We dare not abandon wooden bats for aluminum for those endeavors that demand prolonged attention, reflection, and the analysis and synthesis that sometimes lead to wisdom, the best result of those decidedly human endeavors that no gadget can exercise.

I have a lot of sympathy for Jonathan’s position; things like the revamping of the New York Public Library leave me with a queasy hole in my stomach.  I’ve had a running conversation with Beth Transue, another of our librarians, about our desire to start leading alumni tours of the world’s great libraries, but if we’re going to do so we’d better get it done fast, because most of them won’t be around in a few more years, at least if the NYPL and its budgetary woes are anything to judge by.

At the same time, I think Jonathan overstates his case here.  I don’t think serious thinkers are assuming we’ll get rid of books entirely.  Although I currently think we are already living in what I’ve called an E-plus world, print will continue to be with us, serving many different purposes. Jason Epstein over at the NYRB has a blog post on this fact, and prognosticating the likely future and uses of the traditional book seems to be a growth industry at the moment. I don’t think the average student is too terribly interested in the material textuality that Jonathan references above, nor for that matter is the average scholar, the vast majority of whom remain interested in what people wrote, not how the publishers chose to package it.  But those issues will continue to be extremely important for cultural and social historians, and there will be some forms of work that can only be done with physical books.  Just as it is a tremendous boon to have Joyce’s manuscripts digitized, making them available to the general reader and the scholar who cannot afford a trip to Ireland, those seeking authoritative interpretations of Joyce’s method, biography, and life’s work will still have to make the trip to Ireland to see the thing for themselves, to capture what can’t be captured by a high-resolution camera.

That having been said, who would say that students studying Joyce should avoid examining the digitized manuscripts closely because they aren’t “the genuine article”?  Indeed, I strongly suspect that even the authoritative interpretations of those manuscripts will increasingly involve a commerce between examination of the physical object and close examination of digitized objects, since advanced DH work shows us time and time again that computerized forms of analysis can get at things the naked eye could never see.  So the fact that there are badly digitized copies of things in Google Books and beyond shouldn’t obscure the fact that there are some massively important scholarly opportunities here.
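To give a toy illustration of the kind of thing computation can see that the eye misses (a minimal sketch in Python, not the method of any actual Joyce project; the two “witness” texts below are adapted and altered from the famous opening of Finnegans Wake purely for the example):

```python
# Collate two transcriptions of the same passage and flag every variant.
# Textual scholars do this at scale with dedicated tools; even the standard
# library can surface small differences a reader skimming by eye would miss.
import difflib

witness_a = "riverrun past Eve and Adams from swerve of shore to bend of bay"
witness_b = "riverrun, past Eve and Adam's, from swerve of shore to bend of bay"

for token in difflib.ndiff(witness_a.split(), witness_b.split()):
    if token.startswith(("-", "+")):
        print(token)  # e.g. "- riverrun" / "+ riverrun," marks a variant
```

Run across hundreds of digitized pages, a collation like this turns up punctuation shifts and variant readings that no single pass of the naked eye reliably catches.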

Jonathan’s second point is about the deeply human and quasi-spiritual aspects of engagement with traditional books that so many of us have felt over the years.  There’s something very true about this. It is also true that our technologies can result in forms of self-amputation.  Indeed, if we are to take this point to heart, we need to admit that the technology of writing and reading itself involves self-amputation.  Studies have shown that heavy readers alter their brains, and not always in a good sense.  We diminish the capacity of certain forms of memory, literally making ourselves absent-minded professors.  Other studies have suggested that persons in oral cultures have this capacity in heightened form, and some people argue that this generation is far more visually acute than those that preceded it, developing new abilities because of its engagement with visual texts.  So, indeed, our technologies alter us, and even result in self-amputation, but that is true of the traditional book as well as the internet.  This second point is Jonathan’s larger claim, since it attributes to traditional books as such a superiority in something central to our humanity. I am intrigued by this argument that the book is superior for serious reflection and for the quasi-spiritual aspects of study that we have come to treat as central to the humanities.

I admit, I don’t buy it.

First, I admit that I’m just wary about attributing essential human superiorities to historical artifacts and practices.  Homer as a collection of oral songs is not inherently inferior to the scrolls in which those songs were originally collected, nor to the book form in which they found their apotheosis.  We have come to think of the book as exhibiting and symbolizing superior forms of humanity, but it’s not clear that the book form triumphed in the West because of these attributes.  Indeed, traditional Jews and others clearly think the scroll remains the superior spiritual form even to this day.  Rather, the codex triumphed for a variety of complicated reasons.  Partly, the Christian churches apparently wanted, for ideological reasons, to distinguish their own writings from the writings of the Jews.  There may have been some more substantive reasons as well, though that’s not entirely clear: Anthony Grafton points out that many of the Christian innovations with the codex seemed to focus on the desire to compare different kinds of texts side by side (an innovation, I will point out, for which the internet is in many ways easily superior).  The codex also triumphed not because it was spiritually and intellectually superior but because it was, frankly, more efficient, cheaper, and easier to disseminate than its scrolly ancestors.  One good example comes from the poet Martial, who explicitly ties the selling of his poetry in codex form to making it easily and efficiently accessible to the common person:  “Assign your book-boxes to the great, this copy of me one hand can grasp.”

The entire trend of book history has been toward this effort to make texts and what they contain more readily and easily available to more and more people.  From the early clay tablets to the mass-market paperback that let you carry Plato in your hip pocket, the thrust of the book has been toward broader and broader dissemination, toward greater and greater ease of use, toward cheaper and cheaper accessibility.  The goal of writing, even when that writing was imprisoned in libraries that only the initiated could enter, as in Umberto Eco’s The Name of the Rose, has been open access.

The digitization that is occurring now comes to fulfill the book, not destroy it.

Second, I guess I no longer believe fully in the spiritual or intellectual superiority of codex forms, simply because it doesn’t comport with my experience.  As I do more and more of my reading of books with my various e-readers, I find that I have serious, contemplative, analytical, and synthetic engagements with all kinds of texts, those hundreds of “pages” long and those not.  As I get used to the tools of various e-readers, there’s almost nothing accomplished in traditional books that can’t be accomplished in some way on an e-reader.  Although I interact with texts differently now in a spatial sense, I am able to take fuller and more copious notes, I am able to mark texts more easily, and if I can’t quite remember where something was in the book I can use a search engine to find not only a specific phrase or topic, but every single instance of that topic in the book.  Moreover, because every text represents an act of contemplation on and conversation with other texts, I can at the touch of a screen go and read for myself the interlocutors embedded within a book, just as those interested in Jonathan’s essay can touch my link above and decide for themselves whether I am reading him fairly.  Thus there are very obviously some ways in which e-readers are superior for serious analytical and interpretive readings of texts, or at least their equal.
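That every-instance search is worth pausing on, because it really is new to reading. A minimal sketch of the idea (toy code, assuming nothing about any e-reader’s actual implementation):

```python
# A toy keyword-in-context search: return every occurrence of a phrase
# with a window of surrounding words, rather than only the first hit
# a reader happens to remember.
def concordance(text: str, phrase: str, window: int = 5) -> list[str]:
    words = text.split()
    target = [w.lower() for w in phrase.split()]
    clean = [w.strip(".,;:!?\"'").lower() for w in words]
    hits = []
    for i in range(len(clean) - len(target) + 1):
        if clean[i:i + len(target)] == target:
            left = " ".join(words[max(0, i - window):i])
            right = " ".join(words[i + len(target):i + len(target) + window])
            hits.append(f"...{left} [{phrase}] {right}...")
    return hits

# Every instance of, say, "self-amputation" in a chapter, each with its
# context, in milliseconds; the paper equivalent is rereading with a pencil.
```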

All this having been said, I will say that there remains one way that I find the traditional paper book the clear superior to the e-book, and that has to do with my ability to make it mine.

I spoke a couple of days ago about the personal connection I felt to Kierkegaard in rereading him and discovering my many years of underlines, highlights, and marginalia.  I even confess that I read Kimi Cunningham Grant’s new memoir on my iPad, but I still bought a hardcover at the reading–not because I thought I would be able to analyze it more effectively in hardcover, but because I wanted her to sign it for me.

This is a personal connection to the book that isn’t unimportant, but it is about my personal biography, and Kimi’s.  It’s not about the text, and frankly I doubt it will in the long run even be about literary history.  Some literary archivist somewhere is collecting all the shared comments on the Kindle version of Kimi’s book, and that massive marginalia will be fodder for some graduate student’s dissertation in a few decades.

I pity the poor graduate student who decides on such a project. But at least she won’t have to strain her eyes to decipher the handwriting.

Memories of Maurice: The Great Outpouring

The world is awash in memories of Maurice Sendak, and I’m so pleased that my colleague here at Messiah College, Anita Voelker, has a new editorial on her views of Sendak that just came out on the Fox News web site.  Anita is a wonderful teacher and a specialist in children’s lit. An excerpt from her article:

It is that time spent with you becoming one with the book that has me mourning Maurice Sendak. He was the miracle maker who could turn you into savvy kings or horrible ogres and still bring you back in time for a hot supper. 

Truth be known, Maurice Sendak allowed me to tame my unseen Wild Things. 

Sendak would have appreciated knowing he fed my grown-up spirit along side yours. He resented when his books were pigeon-holed for only children. 

So, today I call upon you and your now grown-up friends from childhood to honor Sendak by grabbing a copy of “Where the Wild Things Are” and reading it out loud. 

Read more: http://www.foxnews.com/opinion/2012/05/08/remembering-maurice-sendak-and-wild-things/

I’ve hesitated to add my own thoughts about Sendak’s passing to what seems like it could only be called The Great Outpouring, but I’ve been really touched and amazed at the broad and cross-generational appreciation that Sendak’s death is generating.  Everyone from my 17-year-old son to my 30-year-old former students to my fifty- and sixty-year-old colleagues felt a little sad today, and a lot happy at having lived in a world in which Maurice Sendak lived.

My personal memory of Sendak will always be of the public library in Durham, North Carolina.  I was an impoverished grad student with a wife and a two-year-old daughter.  Having next to nothing, we got our entertainment at free festivals in the park, tours of the art museum at Duke, and books and records checked out from the library downtown.  For a good long time we didn’t have a TV, and claimed not to want one as a way of staving off having to admit that we couldn’t have afforded one if we had wanted it.  Maurice Sendak was a favorite of my daughter Devon, and week after week we would check out Sendak.  In the Night Kitchen was a special favorite.  We were not really all that aware that In the Night Kitchen was, and still is, on lists of the most banned books.  I will say my mother was appalled that we read our daughter a book that allowed her to giggle and point repeatedly at a little boy’s genitals.  Ah well.  I’m sure we’ve done worse things to her in our parental lives than that.

Sometimes still, 20 years later, in fits of want and desire I will cry out “Milk! Milk! Milk for the morning cake!”  And I will remember Durham.  And Maurice Sendak.  And smile.

Kimi Cunningham Grant on Japanese American Experience

Last month I had the chance to hear Messiah alum Kimi Cunningham Grant read from her work and talk at The Midtown Scholar Bookstore with poet Julia Spicher Kasdorf about her new book, Silver Like Dust. The conversation was wide-ranging, touching on elements of the writing process, on issues associated with Japanese American history and experience, and on Kimi’s own grappling with her identity as a bi-racial writer whose connections to her Japanese American heritage have grown over the years of interviewing her grandmother and writing the book. I still have a blog post vaguely planned related to Kimi’s book; I think there are some really interesting things in there about books and reading. But for now I’m just happy to share this podcast recorded by the folks down at the Midtown. Catherine Lawrence and Eric Papenfuse do a tremendous amount for our community and for the world of books and writing generally. I’m glad we’ve got this record of a good event.

Kimi Cunningham Grant at the Midtown with Julia Spicher Kasdorf

How Do Blogging and Traditional Modes of Scholarly Production Relate?

In the latest edition of the Chronicle’s Digital Campus, Martin Weller makes some strong claims about the significance of blogging.  Recognizing the difficulty of measuring the value of a blog in comparison to traditional modes of scholarly publication, Weller believes that blogging is ultimately in the interest of both institutions and scholarship.

It’s a difficult problem, but one that many institutions are beginning to come to terms with. Combining the rich data available online that can reveal a scholar’s impact with forms of peer assessment gives an indication of reputation. Universities know this is a game they need to play—that having a good online reputation is more important in recruiting students than a glossy prospectus. And groups that sponsor research are after good online impact as well as presentations at conferences and journal papers.

Institutional reputation is largely created through the faculty’s online identity, and many institutions are now making it a priority to develop, recognize, and encourage practices such as blogging. For institutions and individuals alike, these practices are moving from specialist hobby to the mainstream. This is not without its risks, but as James Boyle, author of the book The Public Domain: Enclosing the Commons of the Mind (Yale University Press, 2008), argues, we tend to overstate the dangers of open approaches and overlook the benefits, while the converse holds true for the closed system.

The Virtues of Blogging as Scholarly Activity – The Digital Campus – The Chronicle of Higher Education.

The claim that an institutional reputation is largely created through the faculty’s online identity startled me when I first read it, an index no doubt of my deeply held and inveterate prejudice in favor of libraries.  But I have been trying to pound home to the faculty how utterly important our online presence is, and the internet–in many different modes–gives us the opportunity to create windows on humanities work that are not otherwise easily achieved–at least in comparison to some of the work done by our colleagues in the arts or in the sciences.  Blogging is one way of creating connection, of creating vision, and I think that, with a very few exceptions like the Ivies and the public Ivies, it is very much the case that your online presence matters more than anything else you can possibly do to establish your reputation in the public eye and in the eye of prospective students and their parents.

That is fairly easy to grasp.  The value of blogging to scholarship in general, or its relationship to traditional scholarship, remains more thorny and difficult to parse.  I’ve had conversations with my colleague John Fea over at The Way of Improvement Leads Home, and we both agree that in some sense scholars still have to have established some claim to speak through traditional modes of publication in order to give their scholarly blogging some sense of authority.  People listen to John about history because he’s published books and articles. [Why people listen to me I have no idea–although I do have books and articles, I have nothing like John’s reputation; it may have something to do with simply holding a position.  Because I am a dean at a college I can lay claim to certain kinds of experience that are relevant to discussing the humanities.]

I am not sure it will always be thus.  I think the day is fast approaching when publishing books will become less and less important as the arbiter of scholarly authority.  But for now, and perhaps for a good long time to come, blogging exists in an interesting symbiosis with other, more traditional forms of scholarship. Weller quotes John Naughton to this effect:  “Looking back on the history,” he writes, “one clear trend stands out: Each new technology increased the complexity of the ecosystem.”

I’ve read some things lately suggesting that blogging may be on its way out, replaced in the minds of the general public, I guess, by Facebook, Twitter, and Pinterest.  But for now I think it remains an interesting and somewhat hybrid academic form: a forum for serious thought and reasonably good writing, but not one that claims to be writing for the ages.  In some respects, I think the best blogging is more like a recovery of the eighteenth-century salon, wherein wit that demonstrated learning and acumen was highly valued, a basis of academic life that stood unembarrassed next to the more muscular form of the book. Blogging is one clearly important addition to the scholarly ecosystem, playing off of and extending traditional scholarship rather than simply replacing it.

In my own life right now, as an administrator, I have too little time during the school year to pursue the writing of a 40-page article or a 400-page book–nor, right now, do I have the interest or inclination (however much I want to get back and finish the dang Harlem Renaissance manuscript that sits moldering in my computer).  I do, however, continue to feel the need to contribute to the scholarly conversation surrounding the humanities and higher education in general.  Blogging is one good way to do that, and one that, like Weller, I find enjoyable, creative, and stress-relieving–even when I am writing my blog at 11:00 at night to make sure I can get something posted.  Ever the Protestant and his work ethic.

Can a liberal arts ethos and a professionalized research faculty co-exist?

I’ve been reading Geoffrey Galt Harpham’s recent book, The Humanities and the Dream of America, whiling away the hours on my exercise bike.  Ok, half hours.  Having just finished Andrew Delbanco’s meditation on the nature of the college a week or so ago, I’ve been struck by the similarities between Harpham’s analysis and Delbanco’s account of the college as a distinctly American achievement.

For both, the fundamental antagonist to the ideals of the liberal arts has not been professional programs per se–though undergraduate programs in things like business, nursing, and engineering (and a host of others) are favorite bêtes noires of our current humanistic discourse about the purposes of education.  Rather, for both the real threat lies in the research university and the ethos of specialization it entails.  This emphasis on specialized knowledge is itself inherently narrowing, opposed to the generous expansiveness of spirit that, at least in theory, characterizes the highest ideals of a liberal arts education.

Like Delbanco, Harpham draws on Newman as a first resource for the fully articulated ideal that education should enrich our human experience, fitting us primarily for a life well lived rather than for the specifics of earning a living.  I’m intrigued, though, that Harpham brings out this ethos not only as characterizing the curricular choices and the spiritual, ethical, and cultural teloi of the undergraduate [side note: we would no longer use a word like teloi; we would invoke learning objectives]; more than that, this ethos characterizes the faculty and their understanding of their role in the life of the mind.

Moreover, the institution devoted to producing Newman’s protege bears little resemblance to today’s institutions of higher learning. Newman’s idea of a university had no room for either specialized or comprehensive knowledge, and the professors who taught there should, he fervently believed—it is the very first statement in the book—have nothing to do with research, an obsessive activity best conducted by those whose minds were too sadly contracted to engage in full life. …

Newman intuitively understood that the real threat to liberal education in his own time was not the shade of Locke but the spirit of research then crescent in Germany. By the end of the nineteenth century, the United States had begun to Germanize its academy, with some university faculties organizing themselves into departments that granted degrees that came to constitute credentials. With credentialing came professionalism, and with professionalism growth. President Lawrence Lowell of Harvard (1909-33) laid the foundation for Harvard’s eminence as a research powerhouse by encouraging his faculty to think of themselves as professionals. This meant adopting impersonal criteria of scholarly competence within each discipline, cultivating a spirit of empirical and methodological rigor, and coming to agreement about what would count as standards of achievement in a small, self-certifying group of mandarins.

The new professionalism functioned as a way of insulating the university from extra-academic pressures by creating a separate world of academe that could be judged only by its own standards. But professionalism was accompanied by its dark familiars, partition and competition. A professionalized faculty was a faculty divided into units, each claiming considerable autonomy in matters of hiring and promotion, and competing with other units for salaries, students, office space, and prestige. Such competition naturally placed some stress on amity, and so while undergraduates were expected to enjoy four stress-free years in the groves of academe, the faculty in the same institutions were facing the prospect of going at it hammer and tong, competing with each other virtually face to face, for the rest of their lives.

Geoffrey Galt Harpham. The Humanities and the Dream of America (pp. 127-128). Kindle Edition.

This seems about right to me, and it might pass without comment, but it raises a troubling contradiction.  In most of the strong defenses of the liberal arts that I hear, the notion that faculty should abandon research, or that the fullness of the liberal arts spirit is best embodied by a faculty full of generalists, never appears.  Indeed, quite the opposite.  We want an institution to display its commitment to the liberal arts precisely by giving more support to faculty research, so that faculty can become more and more specialized, more fully recognized in their professional disciplines, more disconnected from the immediacy of their liberal arts institutions, and less able to exemplify the generalist qualities that we say we value as fully humanizing.

What this argument for the liberal arts wants (I’m not saying there aren’t other arguments) is a research university, or at least a research college, with a commitment to research in the liberal arts and all the prestige that entails.  What we definitely do not want to give up is the right to write books that no one wants to read so that we can demonstrate our particularly specialized knowledge.

The faculty, as Harpham recognizes, is fully professionalized, and in many respects in the liberal arts we have created specialized professional programs of our own, imagining that our students are professors in embryo.  The college major is now, in many respects, a professional program, and it is worth noting that the idea of a college major is coextensive with the advent of the research university.  Indeed, I have heard the argument that we should have much larger majors in the humanities than we do, because the size of the major is a measure of its prestige in the college, competing as it would with the gargantuan programs in engineering, nursing, and many of the hard sciences, programs that students can barely finish in four years, if they can.  So much for our sentimental sop about the value of breadth of mind and spirit.

Can a research faculty that shows no real interest in giving up the ideals of research exemplify and support a genuine liberal arts ethos in an American college today (leaving aside the question of whether liberal arts colleges will survive at all)? I am not sure what the route out of this conundrum actually is.  I stopped in the middle of Harpham’s chapter, where he has just noted that faculty in the liberal arts are essentially professionals and conceive of themselves as professionals in ways quite similar to their brethren in professional programs.  I am not sure where he is going with that insight, but I look forward to finding out.

Distanced and Close Reading in literary study: Metaphors for love

I am old enough now to begin sentences with the phrase “I am old enough…”  Seriously, though, I am old enough now to feel like I have lived through one revolution, into a new orthodoxy, and now into the experience of a new revolution in literary studies.  In the ongoing debates I hear about the digital humanities versus whatever other kind of humanities happens to be at hand, I keep having this vertiginous sense of déjà vu, as if I’m hearing the same arguments I heard two decades ago, but transposed into a key just different enough that I can’t tell whether today’s debates are mere variations on a theme or some genuinely new frame of discourse.

The song that remains the same, I think, is the divide between the proponents of what gets called “distanced reading,” which in some hands is a shorthand for all things digital humanities (if it’s digital, it must be distanced, as compared to the human touch of paper, ink, and typewriters–how the industrial period came to be the sign and symbol of all things human and intimate, I am not entirely sure), and close reading, which is somehow taken to be THE form of intimate human contact with the text.

This division is exemplified in Stanley Fish’s recent essay on the digital humanities in the New York Times, an argument that has the usual whiff of caustic Fishian insight, leavened with what I take to be a genuine if wary respect for what he sees in the practices of distanced reading.  Nevertheless, for Fish, it is finally close reading that is genuinely the work of the humane critic devoted to intimacy with the text:

But whatever vision of the digital humanities is proclaimed, it will have little place for the likes of me and for the kind of criticism I practice: a criticism that narrows meaning to the significances designed by an author, a criticism that generalizes from a text as small as half a line, a criticism that insists on the distinction between the true and the false, between what is relevant and what is noise, between what is serious and what is mere play. Nothing ludic in what I do or try to do. I have a lot to answer for.

Ironically, in an earlier period it was Fish and precisely this kind of close reading (as practiced by deconstructionists) that was decried for its lack of seriousness, for the way it removed literature from the realm of human involvement and into the play of mere textuality.  By contrast, the distanced readers of those days imagined themselves as defenders of humanity (or, since humanism was a dirty word, at least defenders of the poor, the downtrodden, the miserable, the huddled masses).  Historicism read widely and broadly in the name of discourse, and proclaimed itself a liberating project, ferreting out the hidden political underbelly in a multitude of texts and considering literary criticism to be an act of responsible justice-seeking over and against the decadent jouissance-seekers of post-structuralism.

A recent blog post by Alex Reid takes up this same criticism of what he describes as the Close Reading industry, arguing for the ways digitization can free us from the tyranny of the industrialized close reader:

In the composition classroom, the widgets on the belt are student papers. If computers can read like people it’s because we have trained people to read like computers. The real question we should be asking ourselves is why are we working in this widget factory? And FYC essays are perhaps the best real world instantiation of the widget, the fictional product, produced merely as a generic example of production. They never leave the warehouse, never get shipped to market, and are never used for anything except test runs on the factory floor. 

In an earlier period, it was again the close readers who were accused of being mechanistic, dry, and scientific, as putatively more humanistic readers accused the New Critics of an unfeeling scientism in their formalist attitude toward the text, cutting out every human affect in the quest for a serious and scientific study of literature.

I wonder, at root, whether this is the controlling metaphor, the key in which all our tunes in literary and cultural studies are played: a quest for the human that is not merely scientific, and yet an unrepressed desire for the authority of the scientist to say things with security, to wear the mantle of authority that our culture apparently only believes a statistical method can endow.

It is probably a mark against my character that I tend to be a both/and pragmatist as a thinker.  I do not buy the notion that distanced reading is inconsequential, or somehow less about truth or less serious than the close rhetorical readings that Fish invokes.  At the same time, I am not too given to the euphoric and pugnacious challenges that can sometimes characterize digital humanities responses to the regnant forms of literary criticism.  At their best, Fishian forms of close reading are endowed not simply with acute attention, but with attention that seems to give birth to a form of wisdom only attentiveness and close examination can provide, the kind of insistent close reading that led Gerard Manley Hopkins to seek the “inscape” of individual instances beyond categories, rather than simply the ways in which individuals fit into the vast landscapes popular in his post-romantic period.

I was reminded of this need to attend to the close properties of an individual use of language again in a recent article on Chaucer in the Chronicle. The writer attends to the detail of Chaucer’s language in a way that seems to reveal something important about the ways in which we are human:

translating Chaucer is like translating any other foreign language: The words are different from one language to the next. And then comes the third category, the most fascinating and the most aggravating because it is the trickiest: the false cognates, words that look like they should mean what they do in Modern English, but don’t. False cognates are especially aggravating, and fascinating when they carry their Middle and Modern English meanings simultaneously. These are exciting moments, when we see, through a kind of linguistic time-lapse photography, Chaucer’s language on its way to becoming our own.

In Middle English, for instance, countrefete means “to counterfeit,” as in “to fake,” but it also has the more flattering meaning of “to imitate.” Corage has not only the Modern English sense of bravery but also, frequently, overtones of sexual energy, desire, or potency. Corage takes its roots from the word coeur, or “heart,” and transplants them slightly southward. The same is true for solas, or “solace.” The “comfort,” “satisfaction,” or “pleasure” it entails is often sexual.

Lust might seem to pose no problem for the modern reader. Yet in the 14th century, the word, spelled as it is today, could mean any kind of desire or pleasure, though around that time it was beginning to carry a sexual connotation, too. And lest it seem as if false cognates always involve sex, take sely, or “silly.” It most often means “blessed” or “innocent,” as well as “pitiful” and “hapless,” but “foolish” was making its way in there, too.

A sentence like “The sely man felte for luste for solas” could mean “The pitiful man felt desire for comfort.” It could just as likely mean: “The foolish man felt lust for sex.” In Chaucer’s hands, it could mean both at once.

Chaucer was fully aware of the slipperiness of language. He delights in it; he makes his artistic capital from it. He is an inveterate punster. The Wife of Bath, for example, repeatedly puns on the word queynte (eventually the Modern English “quaint”). In the 14th century, the word means not only “curious” or “fascinating” but also the curious part of her female anatomy that most fascinates her five husbands. What’s more, the slipperiness of language gives Chaucer the tools to form his famous irony and ambiguity. If the way-too-pretty Prioress is “nat undergrowe” (“not undergrown”), how big is she?


These kinds of particularities of language are the worthy objects of our attention as literary scholars.  At the same time, I do not think we need say that distanced reading plays no role in our understanding of such peculiarities.  A Chaucer project on the order of the Homer Multitext might actually deepen and multiply our understanding of Chaucer’s slipperiness and originality.  Likewise, vast database-driven analyses of every text written within a hundred years of Chaucer might allow us to discover the kinds of linguistic sources he was drawing on and manipulating anew for his own purposes; they might show us new creativities we had not imagined, or they might show us that things we had taken to be unique were fairly common stock-in-trade.
These kinds of knowledges could not be derived from a contest between methods, but only from a reading marked by attentiveness, skill and desire, one willing to draw on any resource to understand what one wishes to know, which used to be a metaphor for love.
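For the curious, the database-driven pass I have in mind might look, in miniature, something like this (a toy sketch; the two-line “corpus” and the fragments adapted from Chaucerian lines are placeholders for what would really be thousands of digitized texts):

```python
# A toy distant-reading tally: count which words cluster around a polysemous
# Middle English word across a corpus. At scale, shifts in these collocates
# are one way to watch a word like "sely" drift from "blessed" toward "silly".
from collections import Counter

def collocates(corpus: dict[str, str], keyword: str, window: int = 4) -> Counter:
    counts = Counter()
    for title, text in corpus.items():
        words = [w.strip(".,;:!?").lower() for w in text.split()]
        for i, w in enumerate(words):
            if w == keyword:
                counts.update(words[max(0, i - window):i])   # words before
                counts.update(words[i + 1:i + 1 + window])   # words after
    return counts

# Placeholder corpus; a real project would load a whole century of texts.
corpus = {
    "fragment_a": "the sely man felte for luste for solas",
    "fragment_b": "this sely widwe and eek hir doghtres two",
}
print(collocates(corpus, "sely").most_common(5))
```

Nothing in such a tally replaces the close reader’s ear for the pun; it simply tells her where, across a hundred years of texts, the puns were likeliest to live.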

Do Humanities Programs Encourage the Computational Illiteracy of Their Students?

I think the knee-jerk and obvious answer to my question is “No.”  I think if humanities profs were confronted with the question of whether their students should develop their abilities in math (or more broadly in math, science and technology), many or most would say Yes.  On the other hand, I read the following post from Robert Talbert at the Chronicle of Higher Ed.  It got me thinking just a bit about how and whether we in the humanities contribute to an anti-math attitude among our own students, if not in the culture as a whole.

I’ve posted here before about mathematics’ cultural problem, but it’s really not enough even to say “it’s the culture”, because kids do not belong to a single monolithic “culture”. They are the product of many different cultures. There’s their family culture, which as Shaughnessy suggests either values math or doesn’t. There’s the popular culture, whose devaluing of education in general and mathematics in particular ought to be apparent to anybody not currently frozen in an iceberg. (The efforts of MIT, DimensionU, and others have a steep uphill battle on their hands.)

And of course there’s the school culture, which itself a product of cultures that are out of kids’ direct control. Sadly, the school culture may be the toughest one to change, despite our efforts at reform. As the article says, when mathematics is reduced to endless drill-and-practice, you can’t expect a wide variety of students — particularly some of the most at-risk learners — to really be engaged with it for long. I think Khan Academy is trying to make drill-and-practice engaging with its backchannel of badges and so forth, but you can only apply so much makeup to an inherently tedious task before learners see through it and ask for something more.

via Can Math Be Made Fun? – Casting Out Nines – The Chronicle of Higher Education.

This all rings pretty true to me.  There are similar versions of this in other disciplines.  In English, for instance, students unfortunately can easily learn to hate reading and writing through what they imbibe from popular culture or through what they experience in the school system.  For every hopeless math geek on television, there’s a reading geek to match.  Still and all, I wonder whether we in the humanities combat and intervene in the popular reputation of mathematics and technological expertise, or whether we just accept it, and in fact reinforce it.

I think, for instance, of the unconscious assumption that there are “math people” and “English people”; that is, there’s a pretty firmly rooted notion that people are born with certain proclivities and abilities, and that there is no point in addressing deficiencies in your literacy in other areas.  More broadly, I think we apply this to students, laughing in knowing agreement when they talk about coming to our humanities disciplines because they just weren’t a math person or a science person, or groaning together in the faculty lounge about how difficult it is to teach our general education courses to nursing students or to math students.  As if our own abilities were genetic.

In high school I was highly competent in both math and English, and this wasn’t all that unusual for students in the honors programs.  On the other hand, I tested out of math and never took another course in college, and none of my good humanistic teachers in college ever challenged me or asked me to question that decision.  I was encouraged to take more and different humanities courses (though, to be frank, my English teachers were suspicious of my interest in philosophy), but being “well-rounded” and “liberally educated” seems in retrospect to have been largely a matter of being well-rounded in only half of the liberal arts curriculum.  Science and math people were well-rounded in a different way, if they were well-rounded at all.

There’s a lot of reason to question this.  Not least of which being that, if our interests and abilities are genetic, we have seen a massive surge of the gene pool toward the STEM side of the equation, if enrollments in humanities majors are any judge.  I think it was Malcolm Gladwell who recently pointed out that genius has a lot less to do with giftedness than it does with practice and motivation.  Put 10,000 hours into almost anything and you will become a genius at it (not entirely true, but the general principle applies).  Extrapolating, we might say that even if students aren’t going to be geniuses in math and technology, they could actually get a lot better at it if they’d only try.

And there’s a lot of reason to ask them to try.  At the recent Rethinking Success conference at Wake Forest, one of the speakers, who did research into the transition of college students into the workplace, pounded the table and declared, “In this job market you must either be a technical student with a liberal arts education or a liberal arts major with technical savvy.  There is no middle ground.”  There is no middle ground.  What became quite clear to me at this conference is that companies mean it when they say they want students with a liberal arts background.  However, it was also very clear to me that they expect those students to have technical expertise that can be applied immediately to job performance. Speaker after speaker affirmed the value of the liberal arts.  They also emphasized the absolute and crying need for computational, mathematical, and scientific literacy.

In other words, we in the Humanities will serve our students extremely poorly if we accept their naive statements about their own genetic makeup, allowing them to proceed with a mathematical or scientific illiteracy that we would cry out against if the same levels of illiteracy were evident in others with respect to our own disciplines.

I’ve found, incidentally, in my conversations with my colleagues in information sciences or math or the sciences, that many of them are much more conversant in the arts and humanities than I or my colleagues are in even the generalities of science, mathematics, or technology.  This ought not to be the case, and in view of that, I and a few of my colleagues are considering taking some workshops in computer coding with our information sciences faculty.  We ought to work toward creating a generation of humanists that does not perpetuate our own levels of illiteracy, for their own sake and for the health of our disciplines in the future.

Teaching Latin on an iPad: An experiment at Messiah College

An example of some of the things that Messiah College is trying to do in experimenting with digital technology in the classroom.  My colleague Joseph Huffman is more pessimistic than I about the promise of iPads and e-books, but I’m just glad we have faculty trying to figure it out.  See the full post at the link below.

You might not expect a historian of Medieval and Renaissance Europe to be among the first educators at Messiah College to volunteer to lead a pilot project exploring the impact of mobile technology—in this case, the iPad—on students’ ability to learn. But that’s exactly what happened.

Joseph Huffman, distinguished professor of European history, and the eight students in his fall 2011 Intermediate Latin course exchanged their paper textbooks for iPads loaded with the required texts, relevant apps, supplementary PDFs and a Latin-English dictionary. The primary goal was to advance the learning of Latin. The secondary goal was to determine whether the use of the iPad improved, inhibited or did not affect their ability to learn a foreign language.

Why Latin? “A Latin course is about as traditional a humanities course as one can find,” Huffman says. Because any foreign language course requires deep and close readings of the texts, studying how student learning and engagement are affected by mobile technology is especially provocative in such a classic course. In addition, Latin fulfills general language course requirements and, therefore, classes are comprised of students from a variety of majors with, perhaps, diverse experiences with mobile technologies like iPads.

One aspect of the experiment was to explore whether students would engage the learning process differently with an iPad than a textbook.

The assumption, Huffman admits, is that today’s students likely prefer technology over books. Huffman’s experiences with his Latin 201 course—comprised of five seniors, two sophomores and one junior—challenged that commonly held assumption.

via Messiah College: Messiah News – Messiah College Homepage Features » iPad experiment.

Is Twitter the future of fiction? Micro-prose in an age of ADD

As I’ve mentioned, I’ve been struck by Alex Juhasz’s pronouncement at the Re:Humanities conference that we must learn what it means to write for an audience that is permanently distracted.  In response, I put up a Facebook post: “We need a rhetoric of the caption. A hermeneutic of the aphorism. Haiku as argument.”  My Provost at Messiah College–known for thorough and intricate argument–left a comment: “I’m Doomed.”

Perhaps we all are, those of us who are more Faulkneresque than Carveresque in our stylistic leanings.  This latest from GalleyCat:

R.L. Stine, the author of the popular Goosebumps horror series for kids, gave his nearly 49,000 Twitter followers another free story this afternoon. To celebrate Friday the 13th, the novelist tweeted a mini-horror story called “The Brave One.” We’ve collected the posts below for your reading pleasure.

via R.L. Stine Publishes ‘The Brave Kid’ Horror Story on Twitter – GalleyCat.

Ok, I know it’s a silly reach to put Stine and Faulkner in the same paragraph, and to be honest I found Stine’s story trite.  On the other hand, I do think it’s obvious we’re now in an age wherein shorter prose with bigger impact may be the necessity.  Flash fiction is growing, and we can witness the immense popularity of NPR’s three-minute fiction contest.  These forms of fiction, and of writing in general, speak to the necessities of an art of the moment, rather than an art of immersion.  Literature, and prose in general, is ALWAYS responsive to the material and cultural forms of its own moment, and I think prose that is short and explosive, or prose that pierces beneath the surface of the reader’s psyche in a moment only to spread and eat its way into the unconscious when the moment of reading is long forgotten, is most likely the prose that is the order of the day.

BUT…Stine certainly doesn’t do it for me.  I don’t know a lot about Twitter fiction.  Is there any really good stuff out there on Twitter–as opposed to flash fiction written in a standard format, which I know more about? Or is it all carney-style self-promotion or unrealized theory at the moment?
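Mechanically, at least, the form is simple enough; serializing a story over Twitter is at bottom a chunking problem. A toy sketch (using the then-current 140-character limit; not, of course, how Stine or GalleyCat actually produced theirs):

```python
# A toy serializer: split a story into tweet-sized installments at word
# boundaries, numbering each part. Assumes fewer than 100 installments,
# since it reserves only 8 characters for the " (nn/nn)" suffix.
def tweetify(story: str, limit: int = 140) -> list[str]:
    chunks, current = [], ""
    for word in story.split():
        if len(current) + len(word) + 1 > limit - len(" (99/99)"):
            chunks.append(current)
            current = word
        else:
            current = f"{current} {word}".strip()
    if current:
        chunks.append(current)
    total = len(chunks)
    return [f"{c} ({i}/{total})" for i, c in enumerate(chunks, start=1)]
```

The chunking, of course, is the easy part; whatever art survives it is the open question.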

[And what, I wonder, does this mean for the future of academic prose as well?  I’m a latecomer to Twitter myself, but I’ve been a little fascinated with the academic discourse that can occur there–but more on that some other time.]