Category Archives: Digital Humanities

Digital Archive as Advertisement: The Hemingway Papers

The pace at which digital material is being made available to the public and to students and scholars in the humanities is accelerating, whether one thinks of the digitization of books, the new MOOCs from MIT, Harvard, and others that will extend learning in the humanities and other fields, or the digitization of papers and manuscripts previously held in the highly restricted manuscript and rare-book sections of single libraries, like the James Joyce papers just released in Ireland.

Another addition to this list is the release of a new digitized collection of Hemingway’s writings for the Toronto Star.  The Star has put together the columns Hemingway wrote for the paper in the early 1920s, along with some stories about the writer.  On the whole I’m extremely happy that archives like this one are taking their place in the public eye.  I had a great course on Hemingway with Gerry Brenner while pursuing an MFA at the University of Montana, and the legacy of Hemingway was felt everywhere.  Still is, as far as I’m concerned.

At the same time, I admit that the Star site left me just a little queasy and raised a number of questions about the relationship between a commercial enterprise like the Star and digital work, and scholarly work more generally.  The first cue for me was the statement of purpose in the subtitle of the homepage:

The legendary writer’s reporting from the Toronto Star archives, featuring historical annotations by William McGeary, a former editor who researched Hemingway’s columns extensively for the newspaper, along with new insight and analysis from the Star’s team of Hemingway experts.

I hadn’t really realized that the Toronto Star was a center of Hemingway scholarship, but maybe I’ve missed something over the past 20 years.  Other similar statements emphasize the Star’s role in Hemingway’s life as much as anything about Hemingway himself:  emphases on the Star’s contributions to the great writer’s style (something that, if I remember correctly, Hemingway himself connected more to his time in Kansas City), on the way the Star nurtured the writer, and on the jovial times Hemingway had with Star editorial and news staff.  It sounds a little more like a family album than a serious scholarly take on what Hemingway was about in this period.  Indeed, there is even a straightforward advertisement on the page, sending you to the Toronto Star store where you can purchase newsprint editions of Hemingway’s columns.

I don’t really want to look a gift horse in the mouth.  There’s a lot of good stuff here, and just having the articles and columns available may be enough; I can ignore the rest.  Nevertheless, the web is a framing device that makes material available within a particular context, and here that context clearly has a distinct commercial angle.  It strikes me that this is a version of public literary history with all the problems of public history in general that my colleague John Fea talks about over at The Way of Improvement Leads Home.  Here, of course, it is not even really the public doing the literary history but a commercial enterprise that has a financial stake in making itself look good in light of Hemingway’s legacy.

The Star promises the site will grow, which is a good thing.  I hope it will grow in a way that allows for more genuine scholarly engagement with Hemingway’s legacy as well as more potential interactivity.  For now the site is static, with no opportunity for engagement at all; everything is controlled by the Star and its team of Hemingway experts.  We take it or we leave it.

For the moment I am taking it, but I worry about the ways commercial enterprises can shape our understanding of literary and cultural history for their own ends.  I wonder what others think about the role of commercial enterprises in establishing the context through which we think about literature and culture.

Digitization and the fulfillment of the book

My colleague in the library here at Messiah College, Jonathan Lauer, has a very nice essay in the most recent Digital Campus edition of the Chronicle of Higher Education.  Jonathan makes an eloquent defense of the traditional book over and against the googlization and ebookification of everything.  He employs an extended metaphor, drawn from the transition to aluminum bats at various levels of baseball, to register his unease and reservations about the shift away from print toward electronic books that is profoundly and rapidly changing the nature of libraries as we’ve known them.  The essay is more evocative than argumentative, so there are a lot of different things going on, but a couple of Jonathan’s main points are that the enhancements we supposedly achieve with digitization projects come at a cost to our understanding of texts and at a cost to ourselves.

In the big leagues, wooden bats still matter. Keeping print materials on campus and accessible remains important for other reasons as well. Witness Andrew M. Stauffer’s recent Chronicle article, “The Troubled Future of the 19th-Century Book.” Stauffer, the director of the Networked Infrastructure for Nineteenth-Century Electronic Scholarship, cites several examples of what we all know intuitively. “The books on the shelves carry plenty of information lost in the process of digitization, no matter how lovingly a particular copy is rendered on the screen,” he writes. “There are vitally significant variations in the stacks: editions, printings, issues, bindings, illustrations, paper type, size, marginalia, advertisements, and other customizations in apparently identical copies.” Without these details, discernible only in physical copies, we are unable to understand a book’s total impact. Are we so easily seduced by the aluminum bat that we toss all wooden ones from the bat bag?

Let’s also acknowledge that our gadgets eventually program us. History teaches us that technologies often numb the very human capacities they amplify; in its most advanced forms, this is tantamount to auto-amputation. As weavers lost manual dexterity with their use of increasingly mechanized looms during the Industrial Revolution, so we can only imagine what effect GPS will have on the innate and learned ability of New York City cabbies to find their way around the five boroughs. Yet we practice auto-amputation at our own peril. We dare not abandon wooden bats for aluminum for those endeavors that demand prolonged attention, reflection, and the analysis and synthesis that sometimes lead to wisdom, the best result of those decidedly human endeavors that no gadget can exercise.

I have a lot of sympathy for Jonathan’s position; things like the revamping of the New York Public Library leave me with a queasy hole in my stomach.  I’ve had a running conversation with Beth Transue, another of our librarians, about our desire to start leading alumni tours of the world’s great libraries, but if we’re going to do so we had better get it done fast, because most of them won’t be around in a few more years, at least if the NYPL and its budgetary woes are anything to judge by.

At the same time, I think Jonathan overstates his case here.  I don’t think serious thinkers assume we’ll get rid of books entirely.  Though we are already living in what I’ve called an E-plus world, print will continue to be with us, serving many different purposes. Jason Epstein over at the NYRB has a blog on this fact, and prognosticating the likely future and uses of the traditional book seems to be a growth industry at the moment. I don’t think the average student is terribly interested in the material textuality that Jonathan references above, nor for that matter is the average scholar, the vast majority of whom remain interested in what people wrote, not in how the publishers chose to package it.  But those issues will continue to be extremely important for cultural and social historians, and there will be some forms of work that can only be done with physical books.  Just as it is a tremendous boon to have Joyce’s manuscripts digitized, making them available to the general reader and to the scholar who cannot afford a trip to Ireland, those seeking authoritative interpretations of Joyce’s method, biography, and life’s work will still have to make the trip to Ireland to see the things for themselves, to capture what can’t be captured by a high-resolution camera.

That having been said, who would say that students studying Joyce should avoid examining the digitized manuscripts closely because they aren’t “the genuine article”?  Indeed, I strongly suspect that even authoritative interpretations of those manuscripts will increasingly involve a commerce between examination of the physical object and close examination of digitized objects, since advanced DH work shows us time and again that computerized forms of analysis can get at things the naked eye could never see.  So the fact that there are badly digitized copies of things in Google Books and beyond shouldn’t obscure the fact that there are some massively important scholarly opportunities here.

Jonathan’s second point is about the deeply human and quasi-spiritual aspects of engagement with traditional books that so many of us have felt over the years.  There’s something very true about this. It is also true that our technologies can result in forms of self-amputation.  Indeed, if we take this to heart, we need to admit that the technology of writing and reading is itself something that involves self-amputation.  Studies have shown that heavy readers alter their brains, and not always for the better: we diminish the capacity of certain forms of memory, literally making ourselves absent-minded professors.  Other studies have suggested that persons in oral cultures have this capacity in heightened form, and some people argue that the current generation is far more visually acute than those that preceded it, developing new abilities through its engagement with visual texts.  So, indeed, our technologies alter us, and even result in self-amputation, but that is as true of the traditional book as of the internet.  This second point is Jonathan’s larger claim, since it attributes to the traditional book as such a superiority in something central to our humanity. I am intrigued by this argument that the book is superior for serious reflection and for the quasi-spiritual aspects of study that we have come to treat as central to the humanities.

I admit, I don’t buy it.

First, I admit that I’m just wary about attributing essential human superiorities to historical artifacts and practices.  Homer as a collection of oral songs is not inherently inferior to the scrolls in which the songs were first collected, nor to the book form in which they found their apotheosis.  We have come to think of the book as exhibiting and symbolizing superior forms of humanity, but it’s not clear that the book form triumphed in the West because of these attributes.  Indeed, traditional Jews and others clearly think the scroll remains the superior spiritual form even to this day.  Rather, the codex triumphed for a variety of complicated reasons.  Partly, the Christian churches apparently wanted, for ideological reasons, to distinguish their own writings from the writings of the Jews.  There may have been more substantive reasons as well, though that’s not entirely clear: Anthony Grafton points out that many of the Christian innovations with the codex seem to have focused on the desire to compare different kinds of texts side by side (an innovation, I will point out, at which the internet is in many ways easily superior).  The codex also triumphed not because it was spiritually and intellectually superior but because it was, frankly, more efficient, cheaper, and easier to disseminate than its scrolly ancestors.  One good example comes from the poet Martial, who explicitly ties the selling of his poetry in codex form to making it easily and efficiently accessible to the common person:  “Assign your book-boxes to the great, this copy of me one hand can grasp.”

The entire trend of book history has been toward this effort to make texts and what they contain more readily and easily available to more and more people.  From the early clay tablets to the mass-market paperback that let you carry Plato in your hip pocket, the thrust of the book has been toward broader and broader dissemination, toward greater and greater ease of use, toward cheaper and cheaper accessibility.  The goal of writing, even when that writing was imprisoned in libraries that only the initiated could enter, as in Umberto Eco’s The Name of the Rose, has been open access.

The digitization that is occurring now comes to fulfill the book, not destroy it.

Second, I guess I no longer believe fully in the spiritual or intellectual superiority of codex forms, simply because that belief doesn’t comport with my experience.  As I do more and more of my reading on my various e-readers, I find that I have serious, contemplative, analytical, and synthetic engagements with all kinds of texts, those hundreds of “pages” long and those not.  As I get used to the tools of various e-readers, there’s almost nothing accomplished in traditional books that can’t be accomplished in some way on an e-reader.  Although I interact with texts differently now in a spatial sense, I am able to take fuller and more copious notes, I am able to mark texts more easily, and if I can’t quite remember where something was in the book I can use a search engine to find not only a specific phrase or topic but every single instance of that topic in the book.  Moreover, because every text represents an act of contemplation on, and conversation with, other texts, I can at the touch of a screen go and read for myself the interlocutors embedded within a book, just as those interested in Jonathan’s essay can touch my link above and decide for themselves whether I am reading him fairly.  So there are obviously some serious ways in which e-readers are superior for analytical and interpretive readings of texts, or at least their equal.

All this having been said, I will say that there remains one way in which I find the traditional paper book clearly superior to the e-book, and that has to do with my ability to make it mine.

I spoke a couple of days ago about the personal connection I felt to Kierkegaard in rereading him and discovering my many years of underlines, highlights, and marginalia.  I even confess that I read Kimi Cunningham Grant’s new memoir on my iPad, but I still bought a hardcover at the reading–not because I thought I would be able to analyze it more effectively in hardcover, but because I wanted her to sign it for me.

This is a personal connection to the book that isn’t unimportant, but it is about my personal biography, and Kimi’s.  It’s not about the text, and frankly I doubt that in the long run it will even be about literary history.  Some literary archivist somewhere is collecting all the shared comments on the Kindle version of Kimi’s book, and that massive marginalia will be fodder for some graduate student’s dissertation in a few decades.

I pity the poor graduate student who decides on such a project. But at least she won’t have to strain her eyes to decipher the handwriting.

Is hierarchy the problem? Higher Education Governance and Digital Revolutions

In the Digital Campus edition of the Chronicle, Gordon Freeman makes the now common lament that colleges and universities are not taking the kind of advantage of cloud technology that would enable superior student learning outcomes and more effective and efficient teaching. In doing so he lays the blame at the feet of the hierarchical nature of higher ed in a horizontal world.

Higher-education leaders, unlike the cloud-based companies of Silicon Valley, do not easily comprehend the social and commercial transformation gripping the world today. Indeed, there was a certain amount of gloating that the centuries-old higher-education sector survived the dot-com era. After all, textbooks are still in place, as are brick and mortar campuses.

The simple fact is that life is becoming more horizontal, while colleges remain hierarchical. We can expect the big shifts in higher education—where the smart use of digitization leads to degrees—to come from other countries.

And that’s sad, because the United States makes most of the new technologies that other parts of the world are more cleverly adapting, especially in education.

(via Instapaper)

I appreciate the metaphor of hierarchy versus horizontality, and I think it’s seductive, but I wonder if it’s accurate. It captures an American view of the world that imagines the bad guys as authoritarian and hierarchical and the good guys as democratic and dialogical.

Whether or not the horizontal crowd can ever do evil is a post for another day. I’m more interested in whether the slowness of higher ed to take up new modes of doing business is actually due to hierarchy. A colleague at a national liberal arts college that shall not be named told me that the president gave strong support to digital innovation in teaching and research, but that the faculty as a whole were slow on the uptake, which meant institution-wide change was difficult to achieve.

This rings true to me as an administrator. Change is slow not because we are too hierarchical but because we take horizontality to be an inviolable virtue. Faculty are largely independent operators with a lot of room to implement change, or not, as they choose. Institution-wide change in educational programming takes not one big decision but a thousand small acts of persuasion and cajoling. I mostly think this is as it should be; it is the difficult opposite edge of academic freedom. It means that changing the direction of an institution is like changing the direction of an aircraft carrier, and doing so without an admiral who can set the course by fiat. There are problems with that, but I’m not sure the changes required to make higher ed as nimble as Gordon Freeman desires would result in an educational system we’d like to have.

Is Twitter Destroying the English language?

Coming out of the NITLE seminar on Undergraduate Research in Digital Humanities, my title question was one of the more interesting ones on my mind.  Janis Chinn, a student at the University of Pittsburgh, posed it as a motivation for her research on shifts in linguistic register on Twitter.  I’m a recent convert to Twitter and see it as an interesting communication tool, but also as an information network aggregator.  I don’t really worry about whether Twitter is eroding my ability to write traditional academic prose, but then, I’ve inhabited that prose for so long that it’s more the case that I can’t easily adapt to the more restrictive conventions of Twitter.  And while I do think students are putting twitterisms in their papers, I don’t take this as different in kind from their tendency to use speech patterns as the basis for constructing their papers without recognizing the distinct conventions of academic prose.  So Twitter poses some interesting issues, but not ones that strike me as different in kind from other uses of language.

I gather from the website for her project that Janis is only at the beginning of her research and hasn’t developed her findings yet, but it looks like a fascinating study.  Part of her description of the work is as follows:

Speakers shift linguistic register all the time without conscious thought. One register is used to talk to professors, another for friends, another for close family, another for one’s grandparents. Linguistic register is the variety of language a speaker uses in a given situation. For example, one would not use the same kind of language to talk to one’s grandmother as to your friends. One avoids the use of slang and vulgar language in an academic setting, and the language used in a formal presentation is not the language used in conversation. This is not just a phenomenon in English, of course; in languages like Japanese there are special verbs only used in honorific or humble situations and different structures which can increase or decrease the politeness of a sentence to suit any situation. This sort of shift takes place effortlessly most of the time, but relatively new forms of communication such as Twitter and other social media sites may be blocking this process somehow.

In response to informal claims that the current generation’s language is negatively affected by modern communication tools like Twitter, Mark Liberman undertook a brief analysis comparing the inaugural addresses of various Presidents. This analysis can be found on the University of Pennsylvania’s popular linguistics blog “Language Log”. Remarkably, he found a significant trend of shortening sentence and word lengths over the last 200 years. My research, while not addressing this directly, will demonstrate whether using these services affects a user’s ability to shift linguistic registers to match the situation as they would normally be expected to.

Fascinating question in and of itself. I think on some level I’ve always been deeply aware of these kinds of shifts.  As a kid, when my parents were missionaries in New Guinea, I would speak with an Aussie accent with the kids at the school across the valley, while shifting back into my Okie brogue on the mission field and in my house.  And as I made my way into academe, my southernisms and southwesternisms gradually dropped away, with a very few exceptions–aware as I was that my accent somehow did not signal intelligence and accomplishment.  Mockery of southern white speech remains a last bastion of prejudice in the academy generally.  I don’t think these are the kinds of register shifts Janis is looking at, but it’s the same territory.

I’m also interested in the more general motivating questions.  If we could prove that Twitter inhibited the ability to shift registers, would that count as destroying or damaging the language in some sense?  If we could demonstrate that Twitter was leading people to use shorter and shorter sentences, or to be less and less able to comprehend sentences longer than 140 characters, would that signal an erosion of the language?  We must have some notion that language can be used in more and less effective ways, since we are all aware that communication can fail abysmally or succeed beyond our hopes, and usually ends up somewhere in between.  Does the restricted nature of Twitter limit or disable some forms of effective communication while simultaneously enabling others?  These are interesting questions, and I’m sure people more intelligent than I am are working on them.
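For the curious, the kind of surface measurement Liberman’s inaugural-address comparison rests on is simple enough to sketch in a few lines of Python. What follows is my own toy illustration, not his code; the sample strings are invented stand-ins rather than real addresses, and the sentence splitter is deliberately naive.

```python
import re

def surface_stats(text):
    """Return (mean words per sentence, mean characters per word)."""
    # Naive split on terminal punctuation; good enough for a rough comparison.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0, 0.0
    return len(words) / len(sentences), sum(map(len, words)) / len(words)

# Invented snippets standing in for an older and a more recent address.
samples = {
    "older": "It is with no small measure of gratitude and humility that I "
             "accept the solemn charge which the people have placed upon me.",
    "newer": "We gather today. We face hard choices. We will meet them together.",
}
for label, text in samples.items():
    wps, cpw = surface_stats(text)
    print(f"{label}: {wps:.1f} words/sentence, {cpw:.1f} chars/word")
```

The point is only that a claim like “sentences are getting shorter” is, at bottom, a pair of divisions run over a corpus; the hard and interesting part, which Janis’s project takes up, is connecting such crude measures to something as subtle as register.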

Takeaways–NITLE Seminar: Undergraduates Collaborating in Digital Humanities Research

Yesterday afternoon at 3:00, about 30 Messiah College humanities faculty and undergraduates gathered to listen in on and virtually participate in the NITLE Seminar focusing on Undergraduates Collaborating in Digital Humanities Research.  A number of our faculty and students were tweeting the event, and a Storify version with our contributions can be found here. I am amazed and gratified to have had such a showing late on a Friday afternoon.  Students and faculty alike were engaged and interested by the possibilities they saw being pursued in undergraduate programs across the country, and our own conversation afterwards extended more than a half hour beyond the seminar itself. Although most of us freely admit that we are only at the beginning and feeling our way, there was broad agreement that undergraduate research and participation in Digital Humanities work was something we needed to keep pushing on.

If you are interested in reviewing the entire seminar, including chat room questions and the like, you can connect through this link.  I had to download WebEx in order to participate in the seminar, so you may need to do the same, even though the instructions I received said I wouldn’t need to.  My own takeaways from the seminar were as follows:

  • Undergraduates are scholars, not scholars-in-waiting.  If original scholarship is defined as increasing the fund of human knowledge, discovering, categorizing, and interpreting data that helps us better understand human events and artifacts, and developing tools that other scholars can use to explore, confirm, disconfirm, or extend your findings, then these young people are scholars by any definition.
  • Digital Humanities research extends (and, to be sure, modifies) our traditional ways of doing humanities work; it does not oppose them.  None of these young scholars felt inordinate tension between their traditional humanities training and their digital humanities research.  A student who reviewed a database of 1,000 Russian folk tales extended and modified the understanding she had arrived at by the close reading of a dozen.  Digital Humanities tools enable closer reading and better contextual understanding of the poet Agha Shahid Ali, rather than pushing students away into extraneous material.
  • Many or most of these students learned their tools as they went along, within the context of what they were trying to achieve.  I was especially fascinated that a couple of the students had had no exposure to Digital Humanities work prior to their honors projects; they learned the coding and digital savvy they needed as they went.  Learning tools within the context in which they are needed makes more and more sense to me.  You would not teach a person how to use a hammer simply by giving them a board and nails, at least not if you don’t want them to get bored.  Rather, give them something to build, and show them, or have them figure out, how the hammer and nails will help them get there.

I’m looking forward to the Places We’ll Go.

More Undergraduate Research in the Digital Humanities

This afternoon the School of the Humanities at Messiah College will be connecting to the NITLE Symposium on Undergraduate Work in the Digital Humanities. Messiah College is currently considering making the development of undergraduate research, and especially collaborative research between faculty and students, a central theme of our next strategic plan.  Like many colleges and universities across the country, we see undergraduate research as a way of improving student learning outcomes and deepening students’ engagement with their education, while also providing more and better skills for life after college.

The push toward student research has some detractors–Andrew Delbanco and Geoffrey Galt Harpham among them–but I’ll blog some other time about my disagreement with them on liberal arts grounds.  I’ve been on record before as to how I think Digital Humanities is a (or THE) way to go with this effort within my own disciplines.  I was glad to receive the video below from Adeline Koh at Richard Stockton College, chronicling the achievements of the RE:Humanities conference at Swarthmore.  A nice overview of the conference.  If you look closely and don’t blink, there are a couple of shots of my colleague Larry Lake, and one of me apparently typing away distractedly on my iPad.  Although perhaps I was tweeting, achieving a transcendent level of attention and interaction without really having to listen.  🙂

This afternoon, the School of the Humanities here at Messiah College will consider further whether and how Digital Humanities might be applicable to our situation by participating in the NITLE symposium on this topic at 3:00.

What is the Digital Humanities and Where can I get some of it?

As I’ve started listening in on Digital Humanities conversations over the past 12 to 18 months, and especially in the last three or four months as I’ve gotten more fully onto Twitter and understood its potential for academics, I’ve realized that I am mostly just a bobbing rubber duck in a great wave of ignorant interest in Digital Humanities.  “What is this thing, Digital Humanities, and where can I get some of it?” seems to be a general hue and cry, and I’ve added my own voice to the mix.  Some of that wave is no doubt driven by the general malaise that seems to be afflicting the humanistic ecosystem, as mid-career academics look back with rose-colored nostalgia to the culture wars of the ’80s, when our classes were full and our conflicts were played out in middle-brow journals so we could feel self-important.  Maybe digital humanities will make us relevant again and keep our budgets from getting cut.

On the other hand, I think most people recognize that many different aspects of digital humanities practice seem to coalesce and provide responses to driving forces in academe at the moment:  our students’ need for technical proficiency, the increasingly porous border between distance and bricks-and-mortar instruction, the need to connect effectively with the public while maintaining high standards of academic rigor, the need for our students to be involved in “real world” experiential learning, the effort to provide opportunities for serious and original undergraduate research, and the need to support collaborative forms of learning.

I have been terribly impressed with the generosity of Digital Humanities folks in responding to these repeated pleas to “Show me how to do like you.  Show me how to do it.”   There are a lot of different things I’ve discovered over the past year, and many more that have been put out. The most recent is a bibliographic blog compiled by Matthew Huculak.  As with any bibliography, it is an act of interpretation. There are inclusions I’ve not seen elsewhere: Franco Moretti’s book on data in literary studies was on the list and is now on mine. [Though I admit this is at least as much because I had Moretti as a prof while a grad student at Duke, the same semester I completed my first essay ever on a Macintosh computer.]  The blog skews toward the literary, and I have increasingly realized that while there is a strong discourse of solidarity among digital humanists, the traditional disciplinary divisions still play a strong role in much of the practical work that actually gets done.  DH’ers have conferences together, but it’s not always clear that they work together on projects across their humanistic disciplines. There are also obvious omissions (I thought the new Journal of the Digital Humanities should have made the list).

My larger concern, though, is that I’m beginning to feel there may actually be a glut of introductory materials: so many different possible things to do at the beginning that it is impossible to point to a place and say, “this is where to start.”  To some degree, on this score, the Digital Humanities reflect what Geoffrey Harpham has indicated is a basic feature of the Humanities in general.

In a great many colleges and universities, there is no “Intro to English” class at all, because there is no agreement among the faculty on what constitutes a proper introduction to a field in which the goals, methods, basic concepts, and even objects are so loosely defined, and in which individual subjective experience plays such a large part. This lack of consensus has sometimes been lamented, but has never been considered a serious problem.

Geoffrey Galt Harpham. The Humanities and the Dream of America (p. 101). Kindle Edition.

The details don’t quite apply.  I’m not entirely sure individual subjective experience is at the heart of DH work, and that is one of the biggest bones of contention with DH work as it has been reflected in English departments (see my reflections on Fish and his comments on DH in yesterday’s post).  But I do think the generally pragmatic feel of humanities departments, where you can begin almost anywhere, in stark contrast to the methodical and even rigid approach to immersing students in the STEM disciplines, may be characteristic of DH as well.  Start where you are and get where you want to go.

In reading Harpham, I was reminded of one of Stanley Fish’s essays (which one I forget) in which he says the best way to introduce students to literary criticism is not to give them a theory, but to give them examples and say, “Go thou and do likewise.”  I’m increasingly feeling this is the case for the Digital Humanities: figure out what seems appealing to you, and then figure out what you have to figure out so you can do like that.

Distanced and Close Reading in literary study: Metaphors for love

I am old enough now to begin sentences with the phrase “I am old enough…”  Seriously, though, I am old enough now to feel that I have lived through one revolution, into a new orthodoxy, and now into a new revolution in literary studies.  In the ongoing debates I hear about the digital humanities versus whatever other kind of humanities happens to be at hand, I keep having a vertiginous sense of déjà vu, as if I’m hearing the same arguments I heard two decades ago, but transposed into a key just different enough that I can’t tell whether today’s debates are mere variations on a theme or a genuinely new frame of discourse.

The song that I think remains the same is the divide between the proponents of what gets called “distanced reading,” which in some hands is shorthand for all things digital humanities (if it’s digital, it must be distanced, as compared to the human touch of paper, ink, and typewriters–how the industrial period came to be the sign and symbol of all things human and intimate is not entirely clear to me), and close reading, which is somehow taken to be THE form of intimate human contact with the text.

This division is exemplified in Stanley Fish’s recent essay on the digital humanities in the New York Times, an argument with the usual whiff of caustic Fishian insight, leavened with what I take to be a genuine if wary respect for what he sees in the practices of distanced reading.  Nevertheless, for Fish it is finally close reading that is genuinely the work of the humane critic devoted to intimacy with the text:

But whatever vision of the digital humanities is proclaimed, it will have little place for the likes of me and for the kind of criticism I practice: a criticism that narrows meaning to the significances designed by an author, a criticism that generalizes from a text as small as half a line, a criticism that insists on the distinction between the true and the false, between what is relevant and what is noise, between what is serious and what is mere play. Nothing ludic in what I do or try to do. I have a lot to answer for.

Ironically, in an earlier period it was Fish and precisely this kind of close reading (as practiced by deconstructionists) that was decried for its lack of seriousness, for the way it removed literature from the realm of human involvement and into the play of mere textuality.  By contrast, the distanced readers of those days imagined themselves as defenders of humanity (or, since humanism was a dirty word, at least defenders of the poor, the downtrodden, the miserable, the huddled masses).  Historicism read widely and broadly in the name of discourse and proclaimed itself a liberating project, ferreting out the hidden political underbelly in a multitude of texts and considering literary criticism an act of responsible justice-seeking over and against the decadent jouissance-seekers of post-structuralism.

A recent blog post by Alex Reid takes up this same criticism of what he describes as the Close Reading industry, arguing for the ways digitization can free us from the tyranny of the industrialized close reader:

In the composition classroom, the widgets on the belt are student papers. If computers can read like people it’s because we have trained people to read like computers. The real question we should be asking ourselves is why are we working in this widget factory? And FYC essays are perhaps the best real world instantiation of the widget, the fictional product, produced merely as a generic example of production. They never leave the warehouse, never get shipped to market, and are never used for anything except test runs on the factory floor. 

In an earlier period, it was again the close readers who were accused of being mechanistic, dry, and scientific, as putatively more humanistic readers accused the New Critics of an unfeeling scientism in their formalist attitude toward the text, cutting out every human affect in the quest for a serious and scientific study of literature.

I wonder whether, at root, this is the controlling metaphor, the key in which all our tunes in literary and cultural studies are played: a quest for the human that is not merely scientific, joined to an unrepressed desire for the authority of the scientist, to say things with security, to wear the mantle of authority that our culture apparently believes only a statistical method can endow.

It is probably a mark against my character that I tend to be a both/and pragmatist as a thinker.  I do not buy the notion that distanced reading is inconsequential, or somehow less about truth or less serious than the close rhetorical readings that Fish invokes.  At the same time, I am not given to the euphoric and pugnacious challenges that sometimes characterize digital humanities responses to the regnant forms of literary criticism.  At their best, Fishian forms of close reading are endowed not simply with acute attention, but with attention that gives birth to a form of wisdom that only attentiveness and close examination can provide, the kind of insistent close reading that led Gerard Manley Hopkins to seek the “inscape” of individual instances beyond categories, rather than simply the ways in which individuals fit into the vast landscapes popular in his post-romantic period.

I was reminded of this need to attend to the close properties of individual uses of language by a recent article on Chaucer in the Chronicle. The writer attends to the detail of Chaucer’s language in a way that seems to reveal something important about the ways in which we are human.

translating Chaucer is like translating any other foreign language: The words are different from one language to the next. And then comes the third category, the most fascinating and the most aggravating because it is the trickiest: the false cognates, words that look like they should mean what they do in Modern English, but don’t. False cognates are especially aggravating, and fascinating when they carry their Middle and Modern English meanings simultaneously. These are exciting moments, when we see, through a kind of linguistic time-lapse photography, Chaucer’s language on its way to becoming our own.

In Middle English, for instance, countrefete means “to counterfeit,” as in “to fake,” but it also has the more flattering meaning of “to imitate.” Corage has not only the Modern English sense of bravery but also, frequently, overtones of sexual energy, desire, or potency. Corage takes its roots from the word coeur, or “heart,” and transplants them slightly southward. The same is true for solas, or “solace.” The “comfort,” “satisfaction,” or “pleasure” it entails is often sexual.

Lust might seem to pose no problem for the modern reader. Yet in the 14th century, the word, spelled as it is today, could mean any kind of desire or pleasure, though around that time it was beginning to carry a sexual connotation, too. And lest it seem as if false cognates always involve sex, take sely, or “silly.” It most often means “blessed” or “innocent,” as well as “pitiful” and “hapless,” but “foolish” was making its way in there, too.

A sentence like “The sely man felte for luste for solas” could mean “The pitiful man felt desire for comfort.” It could just as likely mean: “The foolish man felt lust for sex.” In Chaucer’s hands, it could mean both at once.

Chaucer was fully aware of the slipperiness of language. He delights in it; he makes his artistic capital from it. He is an inveterate punster. The Wife of Bath, for example, repeatedly puns on the word queynte (eventually the Modern English “quaint”). In the 14th century, the word means not only “curious” or “fascinating” but also the curious part of her female anatomy that most fascinates her five husbands. What’s more, the slipperiness of language gives Chaucer the tools to form his famous irony and ambiguity. If the way-too-pretty Prioress is “nat undergrowe” (“not undergrown”), how big is she?

(via Instapaper)

These kinds of particularities of language are the worthy objects of our attention as literary scholars.  At the same time, I do not think we need to say that distanced reading plays no role in our understanding of such peculiarities.  A Chaucer project on the order of the Homer Multitext might actually deepen and multiply our understanding of Chaucer’s slipperiness and originality.  And vast database-driven analyses of every text written within a hundred years of Chaucer might allow us to discover the kinds of linguistic sources he was drawing on and manipulating anew for his own purposes; they might show us new creativities we had not imagined; or they might show us that things we had taken to be unique were fairly common stock-in-trade.
These kinds of knowledges could not be derived from a contest between methods, but only from a reading marked by attentiveness, skill, and desire, one willing to draw on any resource to understand what one wishes to know, which used to be a metaphor for love.
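Since I’ve invoked the possibility, it is worth noting how unexotic the first steps of such database-driven analysis would be. Here is a toy sketch in Python, assuming a hypothetical folder of plain-text transcriptions called middle_english_corpus; it simply counts the company a word like corage keeps across the corpus, one crude way to begin asking whether a slippery usage was Chaucer’s own invention or common stock-in-trade.

```python
import re
from collections import Counter
from pathlib import Path

def collocates(corpus_dir, target, window=4):
    """Count words appearing within `window` words of `target`, corpus-wide."""
    counts = Counter()
    for path in Path(corpus_dir).glob("*.txt"):
        words = re.findall(r"[a-z']+", path.read_text(encoding="utf-8").lower())
        for i, word in enumerate(words):
            if word == target:
                # Tally the neighbors on either side of each occurrence.
                for neighbor in words[max(0, i - window): i + window + 1]:
                    if neighbor != target:
                        counts[neighbor] += 1
    return counts

# What keeps company with "corage" in our hypothetical corpus?
print(collocates("middle_english_corpus", "corage").most_common(20))
```

A real project would have to normalize the wild spelling variation of Middle English before counts like these meant much, but the basic operation is no more mysterious than this.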

Do Humanities Programs Encourage the Computational Illiteracy of Their Students?

I think the knee-jerk and obvious answer to my question is “No.”  If humanities profs were confronted with the question of whether their students should develop their abilities in math (or, more broadly, in math, science, and technology), many or most would say yes.  On the other hand, I recently read the following post from Robert Talbert at the Chronicle of Higher Ed, and it got me thinking just a bit about how and whether we in the humanities contribute to an anti-math attitude among our own students, if not in the culture as a whole.

I’ve posted here before about mathematics’ cultural problem, but it’s really not enough even to say “it’s the culture”, because kids do not belong to a single monolithic “culture”. They are the product of many different cultures. There’s their family culture, which as Shaughnessy suggests either values math or doesn’t. There’s the popular culture, whose devaluing of education in general and mathematics in particular ought to be apparent to anybody not currently frozen in an iceberg. (The efforts of MIT, DimensionU, and others have a steep uphill battle on their hands.)

And of course there’s the school culture, which is itself a product of cultures that are out of kids’ direct control. Sadly, the school culture may be the toughest one to change, despite our efforts at reform. As the article says, when mathematics is reduced to endless drill-and-practice, you can’t expect a wide variety of students — particularly some of the most at-risk learners — to really be engaged with it for long. I think Khan Academy is trying to make drill-and-practice engaging with its backchannel of badges and so forth, but you can only apply so much makeup to an inherently tedious task before learners see through it and ask for something more.

via Can Math Be Made Fun? – Casting Out Nines – The Chronicle of Higher Education.

This all rings pretty true to me.  There are similar versions of this in other disciplines.  In English, for instance, students can easily learn to hate reading and writing through what they imbibe from popular culture or through what they experience in the school system.  For every hopeless math geek on television, there’s a reading geek to match.  Still and all, I wonder whether we in the humanities combat and intervene in the popular reputation of mathematics and technological expertise, or whether we just accept it, and in fact reinforce it.

I think, for instance, of the unconscious assumption that there are “math people” and “English people”; that is, the pretty firmly rooted notion that people are born with certain proclivities and abilities, and that there is no point in addressing deficiencies in one’s literacy in other areas.  More broadly, I think we apply this to students, laughing in knowing agreement when they talk about coming to our humanities disciplines because they just weren’t math persons or science persons, or groaning together in the faculty lounge about how difficult it is to teach our general education courses to nursing students or math students.  As if our own abilities were genetic.

In high school I was highly competent in both math and English, and this wasn’t all that unusual for students in the honors programs.  On the other hand, I tested out of math and never took another course in college, and none of my good humanistic teachers in college ever challenged me to question that decision.  I was encouraged to take more and different humanities courses (though, to be frank, my English teachers were suspicious of my interest in philosophy), but being “well-rounded” and “liberally educated” seems in retrospect to have been largely a matter of being well-rounded in only half of the liberal arts curriculum.  Science and math people were well-rounded in a different way, if they were well-rounded at all.

There’s a lot of reason to question this, not least that, if our interests and abilities are genetic, we have seen a massive surge of the gene pool toward the STEM side of the equation, if enrollments in humanities majors are any judge.  I think it was Malcolm Gladwell who recently pointed out that genius has a lot less to do with giftedness than with practice and motivation.  Put 10,000 hours into almost anything and you will become a genius at it (not entirely true, but the general principle applies).  Extrapolating, we might say that even if students aren’t going to be geniuses in math and technology, they could get a lot better at it if they’d only try.

And there’s a lot of reason to ask them to try.  At the recent Rethinking Success conference at Wake Forest, one of the speakers, who researches the transition of college students into the workplace, pounded the table and declared, “In this job market you must either be a technical student with a liberal arts education or a liberal arts major with technical savvy.  There is no middle ground.”  There is no middle ground.  What became quite clear to me at this conference is that companies mean it when they say they want students with a liberal arts background.  However, it was also very clear that they expect those students to have technical expertise that can be applied immediately to job performance. Speaker after speaker affirmed the value of the liberal arts; they also emphasized the absolute and crying need for computational, mathematical, and scientific literacy.

In other words, we in the Humanities will serve our students extremely poorly if we accept their naive statements about their own genetic makeup, allowing them to proceed with a mathematical or scientific illiteracy that we would cry out against if the same levels of illiteracy were evident in others with respect to our own disciplines.

I’ve found, incidentally, in my conversations with colleagues in information sciences, math, and the sciences, that many of them are much more conversant in the arts and humanities than I or my colleagues are in even the generalities of science, mathematics, or technology.  This ought not to be the case, and in view of that, I and a few of my colleagues are considering taking some workshops in computer coding with our information sciences faculty.  We ought to work toward creating a generation of humanists that does not perpetuate our own levels of illiteracy, for their own sake and for the health of our disciplines in the future.

Teaching Latin on an iPad: An experiment at Messiah College

An example of some of the things that Messiah College is trying to do in experimenting with digital technology in the classroom.  My colleague Joseph Huffman is more pessimistic than I about the promise of iPads and e-books, but I’m just glad we have faculty trying to figure it out.  See the full post at the link below.

You might not expect a historian of Medieval and Renaissance Europe to be among the first educators at Messiah College to volunteer to lead a pilot project exploring the impact of mobile technology—in this case, the iPad—on students’ ability to learn. But that’s exactly what happened.

Joseph Huffman, distinguished professor of European history, and the eight students in his fall 2011 Intermediate Latin course exchanged their paper textbooks for iPads loaded with the required texts, relevant apps, supplementary PDFs and a Latin-English dictionary. The primary goal was to advance the learning of Latin. The secondary goal was to determine whether the use of the iPad improved, inhibited or did not affect their ability to learn a foreign language.

Why Latin? “A Latin course is about as traditional a humanities course as one can find,” Huffman says. Because any foreign language course requires deep and close readings of the texts, studying how student learning and engagement are affected by mobile technology is especially provocative in such a classic course. In addition, Latin fulfills general language course requirements and, therefore, classes are comprised of students from a variety of majors with, perhaps, diverse experiences with mobile technologies like iPads.

One aspect of the experiment was to explore whether students would engage the learning process differently with an iPad than a textbook.

The assumption, Huffman admits, is that today’s students likely prefer technology over books. Huffman’s experiences with his Latin 201 course—comprised of five seniors, two sophomores and one junior—challenged that commonly held assumption.

via Messiah College: Messiah News – Messiah College Homepage Features » iPad experiment.