Category Archives: Reading

We are all twitterers now: revisiting John McWhorter on Tweeting

Angus Grieve Smith over at the MLA group on LinkedIn pointed me toward John McWhorter’s take on Twitter from a couple of years back. [I admit to some embarrassment in referencing an article that’s TWO YEARS OLD!! But the presentism of writing for the web is a story for another time.] McWhorter’s basic case is that Twitter is not really writing at all, but a form of graphic speech. My term, not McWhorter’s. He points out that most people, even after learning to read and write, speak in bursts of 7 to 10 words, and that writing at its origins reflected these kinds of patterns. In other words, as speakers we are all twitterers. Of Twitter, McWhorter says:

 

The only other problem we might see in something both democratic and useful is that it will exterminate actual writing. However, there are no signs of this thus far. In 2009, the National Assessment of Educational Progress found a third of American eighth graders – the ones texting madly right now — reading at or above basic proficiency, but crucially, this figure has changed little since 1992, except to rise somewhat. Just as humans can function in multiple languages, they can also function in multiple kinds of language. An analogy would be the vibrant foodie culture that has thrived and even flowered despite the spread of fast food.

Who among us really fears that future editions of this newspaper will be written in emoticons? Rather, a space has reopened in public language for the chunky, transparent, WYSIWYG quality of the earliest forms of writing, produced by people who had not yet internalized a sense that the flavor of casual language was best held back from the printed page.

This speech on paper is vibrant, creative and “real” in exactly the way that we celebrate in popular forms of music, art, dance and dress style. Few among us yearn for a world in which the only music is classical, the only dance is ballet and daily clothing requires corsets and waistcoats. As such, we might all embrace a brave new world where we can both write and talk with our fingers.

via Talking With Your Fingers – NYTimes.com.

 

I mostly agree with this idea, that we are all code shifters. I’m less sanguine than McWhorter, however. His eighth-grade sample takes students before they’ve entered a period of schooling where they’d be expected to take on more serious reading and longer, more complex forms of writing. Twitter may not be to blame, but it’s not clear to me that the state of writing and reading comprehension at higher levels is doing all that well; there’s some pretty good evidence that it isn’t. So just as a foodie culture may thrive in the midst of a lot of fast food, it’s not clear that we ought to be complacent in the face of an obesity epidemic. In the same way, just because tweeting may not signal the demise of fine writing, it’s not clear that it’s helping the average writer become more facile and sophisticated in language use.

Distanced and Close Reading in literary study: Metaphors for love

I am old enough now to begin sentences with the phrase “I am old enough…”  Seriously, though, I am old enough now to feel like I have lived through one revolution, into a new orthodoxy, and now the experience of a new revolution in literary studies.  In the ongoing debates I hear about the digital humanities versus whatever other kind of humanities happens to be at hand, I keep having this vertiginous sense of déjà vu, as if I’m hearing the same arguments I heard two decades ago, but transposed into a key just different enough that I can’t tell whether today’s debates are mere variations on a theme or some genuinely new frame of discourse.

The song that I think is remaining the same is the divide between the proponents of what gets called “distanced reading,” which in some hands is a shorthand for all things digital humanities (if it’s digital, it must be distanced as compared to the human touch of paper, ink, and typewriters–how the industrial period came to be the sign and symbol of all things human and intimate I am not entirely clear), and close reading, which is somehow taken to be THE form of intimate human contact with the text.

This division is exemplified in Stanley Fish’s recent essay on the digital humanities in the New York Times, an argument that has the usual whiff of caustic Fishian insight leavened with what I take to be a genuine if wary respect for what he sees in the practices of distanced reading.  Nevertheless, for Fish, it is finally close reading that is genuinely the work of the humane critic devoted to intimacy with the text:

But whatever vision of the digital humanities is proclaimed, it will have little place for the likes of me and for the kind of criticism I practice: a criticism that narrows meaning to the significances designed by an author, a criticism that generalizes from a text as small as half a line, a criticism that insists on the distinction between the true and the false, between what is relevant and what is noise, between what is serious and what is mere play. Nothing ludic in what I do or try to do. I have a lot to answer for.

Ironically, in an earlier period it was Fish and precisely this kind of close reading (as practiced by deconstructionists) that was decried for its lack of seriousness, for the way it removed literature from the realm of human involvement and into the play of mere textuality.  By contrast, the distanced readers in those days imagined themselves as defenders of humanity (or, since humanism was a dirty word, at least defenders of the poor, the downtrodden, the miserable, the huddled masses).  Historicism read widely and broadly in the name of discourse, and proclaimed itself a liberating project, ferreting out the hidden political underbelly in a multitude of texts and considering literary criticism to be an act of responsible justice-seeking over and against the decadent jouissance-seekers of post-structuralism.

A recent blog post by Alex Reid takes up this same criticism of what he describes as the Close Reading industry, arguing for the ways digitization can free us from the tyranny of the industrialized close reader:

In the composition classroom, the widgets on the belt are student papers. If computers can read like people it’s because we have trained people to read like computers. The real question we should be asking ourselves is why are we working in this widget factory? And FYC essays are perhaps the best real world instantiation of the widget, the fictional product, produced merely as a generic example of production. They never leave the warehouse, never get shipped to market, and are never used for anything except test runs on the factory floor. 

In an earlier period, it was again the close readers who were accused of being mechanistic, dry, and scientific, as putatively more humanistic readers charged the New Critics with an unfeeling scientism in their formalist attitude toward the text, cutting out every human affect in the quest for a serious and scientific study of literature.

I wonder, at root, whether this is the controlling metaphor, the key in which all our tunes in literary and cultural studies are played: a quest for the human that is not merely scientific, and yet an unrepressed desire for the authority of the scientist to say things with security, to wear the mantle of authority that our culture apparently believes only a statistical method can endow.

It is probably a mark against my character that I tend to be a both/and pragmatist as a thinker.  I do not buy the notion that distanced reading is inconsequential, or somehow less about truth or less serious than the close rhetorical readings that Fish invokes.  At the same time, I am not too given to the euphoric and pugnacious challenges that can sometimes characterize digital humanities responses to the regnant forms of literary criticism.  At their best, Fishian forms of close reading are endowed not simply with acute attention, but with attention that seems to give birth to a form of wisdom that only attentiveness and close examination can provide, the kind of insistent close reading that led Gerard Manley Hopkins to seek the “inscape” of individual instances beyond categories, rather than simply the ways in which individuals fit into the vast landscapes popular in his post-romantic period.

I was reminded of this need to attend to the close properties of the individual use of language again in a recent article on Chaucer in the Chronicle. The writer attends to the detail of Chaucer’s language in a way that seems to reveal something important about the ways in which we are human.

translating Chaucer is like translating any other foreign language: The words are different from one language to the next. And then comes the third category, the most fascinating and the most aggravating because it is the trickiest: the false cognates, words that look like they should mean what they do in Modern English, but don’t. False cognates are especially aggravating, and fascinating when they carry their Middle and Modern English meanings simultaneously. These are exciting moments, when we see, through a kind of linguistic time-lapse photography, Chaucer’s language on its way to becoming our own.

In Middle English, for instance, countrefete means “to counterfeit,” as in “to fake,” but it also has the more flattering meaning of “to imitate.” Corage has not only the Modern English sense of bravery but also, frequently, overtones of sexual energy, desire, or potency. Corage takes its roots from the word coeur, or “heart,” and transplants them slightly southward. The same is true for solas, or “solace.” The “comfort,” “satisfaction,” or “pleasure” it entails is often sexual.

Lust might seem to pose no problem for the modern reader. Yet in the 14th century, the word, spelled as it is today, could mean any kind of desire or pleasure, though around that time it was beginning to carry a sexual connotation, too. And lest it seem as if false cognates always involve sex, take sely, or “silly.” It most often means “blessed” or “innocent,” as well as “pitiful” and “hapless,” but “foolish” was making its way in there, too.

A sentence like “The sely man felte for luste for solas” could mean “The pitiful man felt desire for comfort.” It could just as likely mean: “The foolish man felt lust for sex.” In Chaucer’s hands, it could mean both at once.

Chaucer was fully aware of the slipperiness of language. He delights in it; he makes his artistic capital from it. He is an inveterate punster. The Wife of Bath, for example, repeatedly puns on the word queynte (eventually the Modern English “quaint”). In the 14th century, the word means not only “curious” or “fascinating” but also the curious part of her female anatomy that most fascinates her five husbands. What’s more, the slipperiness of language gives Chaucer the tools to form his famous irony and ambiguity. If the way-too-pretty Prioress is “nat undergrowe” (“not undergrown”), how big is she?

(via Instapaper)

These kinds of particularities of language are the worthy objects of our attention as literary scholars.  At the same time, I do not think we need say that distanced reading plays no role in our understanding of such peculiarities.  A Chaucer project on the order of the Homer Multi-text might actually deepen and multiply our understanding of Chaucer’s slipperiness and originality.  Likewise, vast database-driven analyses of every text written within a hundred years of Chaucer might allow us to discover the kinds of linguistic sources he was drawing on and manipulating anew for his own purposes; they might show us new creativities we had not imagined, or they might show us that things we had taken to be unique were fairly common stock-in-trade.
These kinds of knowledges could not be derived from a contest between methods, but only from a reading marked by attentiveness, skill and desire, one willing to draw on any resource to understand what one wishes to know, which used to be a metaphor for love.
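To make that kind of database-driven check a little more concrete, here is a minimal sketch, in Python, of the sort of frequency comparison a distanced reader might run: counting how often a few of the Middle English forms discussed above turn up across a folder of plain-text transcriptions of roughly contemporaneous works. The corpus directory and file layout are hypothetical, and real work of this kind would also have to normalize the wild spelling variation of Middle English, which this toy example ignores.

```python
import re
from collections import Counter
from pathlib import Path

# Middle English forms discussed in the Chronicle excerpt above.
FORMS = {"countrefete", "corage", "solas", "sely", "queynte"}

def count_forms(corpus_dir: str) -> dict[str, Counter]:
    """Return, for each text in the (hypothetical) corpus folder,
    a tally of how often each target form appears (case-insensitive)."""
    counts: dict[str, Counter] = {}
    for path in Path(corpus_dir).glob("*.txt"):
        words = re.findall(r"[a-z]+", path.read_text(encoding="utf-8").lower())
        counts[path.stem] = Counter(w for w in words if w in FORMS)
    return counts

if __name__ == "__main__":
    # e.g. a folder of transcriptions: gower_confessio.txt, langland_piers.txt, ...
    for text, tally in count_forms("middle_english_corpus").items():
        print(text, dict(tally))
```

Even so crude a tally begins to answer the question posed above: is a given usage Chaucer’s own invention, or common stock-in-trade among his contemporaries?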

Do Humanities Programs Encourage the Computational Illiteracy of Their Students?

I think the knee-jerk and obvious answer to my question is “No.”  I think if humanities profs were confronted with the question of whether their students should develop their abilities in math (or more broadly in math, science and technology), many or most would say Yes.  On the other hand, I read the following post from Robert Talbert at the Chronicle of Higher Ed.  It got me thinking just a bit about how and whether we in the humanities contribute to an anti-math attitude among our own students, if not in the culture as a whole.

I’ve posted here before about mathematics’ cultural problem, but it’s really not enough even to say “it’s the culture”, because kids do not belong to a single monolithic “culture”. They are the product of many different cultures. There’s their family culture, which as Shaughnessy suggests either values math or doesn’t. There’s the popular culture, whose devaluing of education in general and mathematics in particular ought to be apparent to anybody not currently frozen in an iceberg. (The efforts of MIT, DimensionU, and others have a steep uphill battle on their hands.)

And of course there’s the school culture, which is itself a product of cultures that are out of kids’ direct control. Sadly, the school culture may be the toughest one to change, despite our efforts at reform. As the article says, when mathematics is reduced to endless drill-and-practice, you can’t expect a wide variety of students — particularly some of the most at-risk learners — to really be engaged with it for long. I think Khan Academy is trying to make drill-and-practice engaging with its backchannel of badges and so forth, but you can only apply so much makeup to an inherently tedious task before learners see through it and ask for something more.

via Can Math Be Made Fun? – Casting Out Nines – The Chronicle of Higher Education.

This all rings pretty true to me.  There are similar versions of this in other disciplines.  In English, for instance, students unfortunately can easily learn to hate reading and writing through what they imbibe from popular culture or through what they experience in the school system.  For every hopeless math geek on television, there’s a reading geek to match.  Still and all, I wonder whether we in the humanities combat and intervene in the popular reputation of mathematics and technological expertise, or whether we just accept it, and in fact reinforce it.

I think, for instance, of the unconscious assumption that there are “math people” and “English people”; that is, there’s a pretty firmly rooted notion that people are born with certain proclivities and abilities, and that there is no point in addressing deficiencies in one’s literacy in other areas.  More broadly, I think we apply this to students, laughing in knowing agreement when they talk about coming to our humanities disciplines because they just weren’t math people or science people, or groaning together in the faculty lounge about how difficult it is to teach our general education courses to nursing students or to math students.  As if our own abilities were genetic.

In high school I was highly competent in both math and English, and this tendency wasn’t all that unusual for students in the honors programs.  On the other hand, I tested out of math and never took another course in college, and none of my good humanistic teachers in college ever challenged me or asked me to question that decision.  I was encouraged to take more and different humanities courses (though, to be frank, my English teachers were suspicious of my interest in philosophy), but being “well-rounded” and “liberally educated” seems in retrospect to have been largely a matter of being well-rounded in only half of the liberal arts curriculum.  Science and math people were well-rounded in a different way, if they were well-rounded at all.

There’s a lot of reason to question this.  Not least is that, if our interests and abilities are genetic, we have seen a massive surge of the gene pool toward the STEM side of the equation, if enrollments in humanities majors are any judge.  I think it was Malcolm Gladwell who recently pointed out that genius has a lot less to do with giftedness than it does with practice and motivation.  Put 10,000 hours into almost anything and you will become a genius at it (not entirely true, but the general principle applies).  Extrapolating, we might say that even if students aren’t going to be geniuses in math and technology, they could actually get a lot better at it if they’d only try.

And there’s a lot of reason to ask them to try.  At the recent Rethinking Success conference at Wake Forest, one of the speakers who did research into the transition of college students into the workplace pounded the table and declared, “In this job market you must either be a technical student with a liberal arts education or a liberal arts major with technical savvy.  There is no middle ground.”  There is no middle ground.  What became quite clear to me at this conference is that companies mean it when they say they want students with a liberal arts background.  However, it was also very clear to me that they expect those students to have technical expertise that can be applied immediately to job performance.  Speaker after speaker affirmed the value of the liberal arts.  They also emphasized the absolute and crying need for computational, mathematical, and scientific literacy.

In other words, we in the Humanities will serve our students extremely poorly if we accept their naive statements about their own genetic makeup, allowing them to proceed with a mathematical or scientific illiteracy that we would cry out against if the same levels of illiteracy were evident in others with respect to our own disciplines.

I’ve found, incidentally, in my conversations with my colleagues in information sciences or math or the sciences, that many of them are much more conversant in the arts and humanities than I or my colleagues are in even the generalities of science, mathematics, or technology.  This ought not to be the case, and in view of that I and a few of my colleagues are considering taking some workshops in computer coding with our information sciences faculty.  We ought to work toward creating a generation of humanists that does not perpetuate our own levels of illiteracy, for their own sake and for the health of our disciplines in the future.

Teaching Latin on an iPad: An experiment at Messiah College

An example of some of the things that Messiah College is trying to do in experimenting with digital technology in the classroom.  My colleague Joseph Huffman is more pessimistic than I about the promise of iPads and e-books, but I’m just glad we have faculty trying to figure it out.  See the full post at the link below.

You might not expect a historian of Medieval and Renaissance Europe to be among the first educators at Messiah College to volunteer to lead a pilot project exploring the impact of mobile technology—in this case, the iPad—on students’ ability to learn. But that’s exactly what happened.

Joseph Huffman, distinguished professor of European history, and the eight students in his fall 2011 Intermediate Latin course exchanged their paper textbooks for iPads loaded with the required texts, relevant apps, supplementary PDFs and a Latin-English dictionary. The primary goal was to advance the learning of Latin. The secondary goal was to determine whether the use of the iPad improved, inhibited or did not affect their ability to learn a foreign language.

Why Latin? “A Latin course is about as traditional a humanities course as one can find,” Huffman says. Because any foreign language course requires deep and close readings of the texts, studying how student learning and engagement are affected by mobile technology is especially provocative in such a classic course. In addition, Latin fulfills general language course requirements and, therefore, classes are comprised of students from a variety of majors with, perhaps, diverse experiences with mobile technologies like iPads.

One aspect of the experiment was to explore whether students would engage the learning process differently with an iPad than a textbook.

The assumption, Huffman admits, is that today’s students likely prefer technology over books. Huffman’s experiences with his Latin 201 course—comprised of five seniors, two sophomores and one junior—challenged that commonly held assumption.

via Messiah College: Messiah News – Messiah College Homepage Features » iPad experiment.

Is Twitter the future of fiction? Micro-prose in an age of ADD

As I’ve mentioned, I’ve been struck by Alex Juhasz’s pronouncement at the Re:Humanities conference that we must learn what it means to write for an audience that is permanently distracted.  In response, I put up a Facebook post: “We need a rhetoric of the caption. A hermeneutic of the aphorism. Haiku as argument.”  My Provost at Messiah College–known for thorough and intricate argument–left a comment “I’m Doomed.”

Perhaps we all are, those of us who are more Faulkneresque than Carveresque in our stylistic leanings.  This latest from GalleyCat:

R.L. Stine, the author of the popular Goosebumps horror series for kids, gave his nearly 49,000 Twitter followers another free story this afternoon. To celebrate Friday the 13th, the novelist tweeted a mini-horror story called “The Brave One.” We’ve collected the posts below for your reading pleasure.

via R.L. Stine Publishes ‘The Brave Kid’ Horror Story on Twitter – GalleyCat.

Ok, I know it’s a silly reach to put Stine and Faulkner in the same paragraph, and to be honest I found Stine’s story trite.  On the other hand, I do think it’s obvious we’re now in an age wherein shorter prose with bigger impact may be the necessity.  Flash fiction is growing, and we can witness the immense popularity of NPR’s three-minute fiction contest.  These forms of fiction, and of writing in general, speak to the necessities of an art of the moment, rather than the art of immersion.  Literature, and prose in general, is ALWAYS responsive to the material and cultural forms of its own moment, and I think prose that is short and explosive, or prose that pierces beneath the surface of the reader’s psyche in a moment only to spread and eat its way into the unconscious when the moment of reading is long forgotten, is most likely the prose that is the order of the day.

BUT…Stine certainly doesn’t do it for me.  I don’t know a lot about Twitter fiction.  Is there any really good stuff out there on Twitter–as opposed to flash fiction written in a standard format, which I know more about?  Or is it all carney-style self-promotion or unrealized theory at the moment?

[And what, I wonder, does this mean for the future of academic prose as well?  I’m a latecomer to Twitter myself, but I’ve been a little fascinated with the academic discourse that can occur there, but more on that some other time.]

Is the laptop going the way of the codex? Technological nostalgia in the iPad imperium

My colleague John Fea over at The Way of Improvement Leads Home, pointed me to this essay by Alex Golub on the relative merits of the iPad and the laptop.  For Golub, the iPad is indispensable, but, as he puts it “it’s not a laptop and it never will be.”  Golub goes on with a litany of limitations that, in fact, I mostly agree with–too hard to produce things, too hard to multi-task, etcetera, etcetera.

On the other hand, I’m struck by the degree to which his lamentations strike me as just the sort of thing people are saying about the demise of the book.

Perhaps I am one of the old generation who will someday be put to shame by nimble-fingered young’uns tapping expertly away on their nanometer-thick iPad 7s, but I don’t think so. People may get used to the limitations of the device, but that doesn’t mean that it’s better than what came before.

In fact, I see this as one of the dangers of the iPad. I see them everywhere on campus, and I wonder to myself: Are my students really getting through college without a laptop? Frankly, the idea seems horrifying to me. I don’t doubt that they can do it — I worry what skills they are not learning because of the smallness (in every sense of that word) of the devices they learn on.

Read more: http://www.insidehighered.com/views/2012/04/09/essay-use-ipad-academics#ixzz1rbBPGr4L
Inside Higher Ed

Substitute the word “book” for every reference to the laptop and you’ve got a pretty good rendition of the typical concerns about the demise of the codex: profs in horror at the idea that students may someday come to their classes without books in hand and that they may be required to teach students from text on a screen. (Who am I kidding, the thought horrifies me still.)  As if somehow there were an inherent depth or proficiency of knowledge that is unavailable through this other form.  My college began an iPad experiment this year, and so far there’s been quite a bit of success, even if there are also hiccups.  Just yesterday I read an interview with Clive Thompson, who is reading War and Peace on his iPhone.  On his iPhone!

As I said, I’m reading War and Peace on my iPhone. But you can’t tell I’m reading War and Peace on my iPhone. When I take my kids to the park and they’re off playing while I’m reading War and Peace, I look like just some fatuous idiot reading his email. I almost went to CafePress and designed a T-shirt that said, “Piss off, I’m reading War and Peace on my iPhone.”

I mildly object to the notion that people look like fatuous idiots answering their email.  It’s what I spend about 80% of my day doing.  Nevertheless, I agree with the sentiment that simply because the embodiment or the tools of our intelligence are unfamiliar, we should not assume intelligence and learning aren’t present.

We’ve had the codex for about two millennia in one form or another.  We’ve had the laptop for less than 40.  I admit to being just a bit bemused at the foreshortening of our nostalgia for the good old days.

Our Data, Our Selves: Data Mining for Self-Knowledge

If you haven’t read Gary Shteyngart’s Super Sad True Love Story, I would encourage you to go, sell all, buy and do so.  I guess I would call it a dystopian black comedic satire, and at one point I would have called it futuristic.  Now I’m not so sure.  The creepy thing is that about every other week there’s some new thing I notice and I kind of say to myself “Wow–that’s right out of Shteyngart.”  This latest from the NYTimes is another case in point.  The article traces the efforts of Stephen Wolfram to use his immense collection of data, from the records of his email to the keystrokes on his computer, to analyze his life for patterns of creativity, productivity, and the like.

He put the system to work, examining his e-mail and phone calls. As a marker for his new-idea rate, he used the occurrence of new words or phrases he had begun using over time in his e-mail. These words were different from the 33,000 or so that the system knew were in his standard lexicon.

The analysis showed that the practical aspects of his days were highly regular — a reliable dip in e-mail about dinner time, and don’t try getting him on the phone then, either.

But he said the system also identified, as hoped, some of the times and circumstances of creative action. Graphs of what the system found can be seen on his blog, called “The Personal Analytics of My Life.”

The algorithms that Dr. Wolfram and his group wrote “are prototypes for what we might be able to do for everyone,” he said.

The system may someday end up serving as a kind of personal historian, as well as a potential coach for improving work habits and productivity. The data could also be a treasure trove for people writing their autobiographies, or for biographers entrusted with the information.

This is eerily like the processes in Shteyngart’s novel, whereby people have data scores that are immediately readable by themselves and others, and the main character obsesses continuously over the state of his data and judges the nature and potential of his relationships on the basis of other people’s data.

Socrates was the first, I think, to say the unexamined life was not worth living, but I’m not entirely sure this was what he had in mind.  There is a weird distancing effect involved in this process by which we remove ourselves from ourselves and look at the numbers.

At the same time, I’m fascinated by the prospects, and I think it’s not all that different from the idea of “distanced reading” that is now becoming common through certain digital humanities practices in literature: analyzing hundreds or thousands of novels, instead of reading two or three closely, in order to understand through statistical analysis the important trends in literary history at any particular point in time, as well as the way specific novels might fit into that statistical history.

Nevertheless, a novel isn’t a person.  I remain iffy about reducing myself to a set of numbers I can work to improve, modify, analyze, and interpret.  The examined life typically leads not to personal policies but to a sense of mystery: how much there is that we don’t know about ourselves, how much there is that can’t be reduced to what I can see or what I can count.  If I could understand my life by numbers, would I?
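Out of the same ambivalent fascination, here is a minimal sketch, in Python, of the kind of “personal analytics” the Times article describes: flagging words in one’s e-mail that fall outside a known personal lexicon (a stand-in for the 33,000-word standard lexicon mentioned above) and tallying message volume by hour of the day. The message format and the toy lexicon are my own assumptions for illustration, not Wolfram’s actual system.

```python
import re
from collections import Counter
from datetime import datetime

def new_words(messages, lexicon):
    """Count words in message bodies that fall outside the personal lexicon,
    a rough marker for the 'new-idea rate' described above."""
    novel = Counter()
    for _, body in messages:
        for word in re.findall(r"[a-z']+", body.lower()):
            if word not in lexicon:
                novel[word] += 1
    return novel

def messages_by_hour(messages):
    """Tally message volume by hour of day (e.g., the reliable dip at dinner time)."""
    return Counter(sent.hour for sent, _ in messages)

if __name__ == "__main__":
    lexicon = {"thanks", "for", "the", "draft", "of", "a", "on", "syllabus"}  # toy lexicon
    messages = [
        (datetime(2012, 4, 9, 9, 15), "Thanks for the draft of the syllabus"),
        (datetime(2012, 4, 9, 22, 40), "A thought on anthropodermic bindings"),
    ]
    print(new_words(messages, lexicon).most_common(5))
    print(sorted(messages_by_hour(messages).items()))
```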

For your edification, I include the book trailer for Shteyngart’s novel below.

Dispatches from the Digital Revolution

I know right now that I am partly subject to the enthusiasm of the new convert in seeing my object of adoration everywhere I turn, but truly, it seems that everywhere I turn these days I see the landslide toward a total digitalization of the world of the humanities.  Like a landslide, it may have looked a long way off at first, but it’s upon us now, and the rumble has become a roar.  As I said in this previous post, I think we’re a long way past a print-plus world, and we had better figure out how digital tools, whether simple things like e-books or the complex tools and methodologies associated with digitalization, are going to change what we are doing with ourselves and our students.  A few rumblings:

1. Robert Darnton announces that the Digital Public Library of America will be up and running by 2013.  Darnton, an advocate of public digitalization efforts that will prevent private entities like Google from controlling access to information, has spearheaded the effort to bring together the digitalization efforts of libraries around the globe.  According to the DPLA’s website, the purpose of the DPLA is described as follows:

Many universities, public libraries, and other public-spirited organizations have digitized materials that could be brought together under the frame of the DPLA, but these digital collections often exist in silos. Compounding this problem are disparate technical standards, disorganized and incomplete metadata, and a host of legal issues. No project has yet succeeded in bringing these different viewpoints, experiences, and collections together with leading technical experts and the best of private industry to find solutions to these complex challenges. Users have neither coherent access to these materials nor tools to use them in new and exciting ways, and institutions have no clear blueprint for creating a shared infrastructure to serve the public good. The time is right to launch an ambitious project to realize the great promise of the Internet for the advancement of sharing information and of using technology to enable new knowledge and discoveries in the United States.

2. Appearance of the Journal of Digital Humanities:  I already mentioned this yesterday, but I’ll go ahead and do it again.  It seems to me that Digital Humanities is coalescing into a force in academe–rather than a marginalized crew on the ragtag end–not unlike the massive changes that occurred in humanistic studies after 1966 and the advent of deconstruction and its step-children.  In my estimation the change may be even more massive–and perhaps more painful and more exciting–than those earlier changes, since deconstruction did not essentially change the tools of the trade: we still read books (and gradually included film, pop culture, and other media) and we still wrote papers about them.  While deconstruction may have been a more sophisticated and nifty-looking hammer, it was still basically a hammer.  Digital Humanities is changing humanistic work at the level of the tool, creating houses without hammers.

3. People who read e-books read more books than those who do not.  A new Pew Research Center study suggests the following:

a survey from the Pew Research Center’s Internet & American Life Project shows that e-book consumers in the U.S. are reading over a third more books than their print-only customers. According to the report, titled “The Rise of E-Reading,” the average reader of e-books says he or she has read 24 books in the past 12 months, compared with an average of 15 books by non–e-book consumers.

Overall, Pew found that the number of American adults who say they have read an e-book rose to 21%, compared to 17% reported just a few months ago in December 2011. That jump comes following a holiday season that saw a spike in the ownership of both tablet computers and dedicated e-readers.

I admit that I want to cavil a bit about this news.  It’s also been demonstrated that e-readers so far are overwhelmingly dominated by pulp fiction romances and mysteries, the kind of thing you can read easily in a day.  On the other hand, bookselling and reading in general have ALWAYS been dominated by the romance and mystery genres, so that’s nothing new.

The same Publishers Weekly article points to a study saying that e-readers are poised to take off with a massive global spike.  We’ve heard this before, but…  Well, I asked my boss the other day if I could purchase a Kindle so I could experiment with the Kindle library program.  I am over the edge and into the dark side of the abyss.

4. The New York Public Library opened up an amazing new database tool for the 1940 census–itself an amazing database just released by the U.S. government.  I haven’t totally figured out how to use it yet, but you can search for persons in the census, tag their location in GIS-based maps of New York City, and do multilayered searching of NYC based on the crowd-sourced effort at developing a digital social history of New York City.  According to this article in the Gothamist,

Kate Stober at the NYPL tells us it’s “more than just a research tool, we’ll be helping New Yorkers create a social history map of buildings and neighborhoods in the five boroughs. When you find an address, the tool pins it to both a 1940 map and a contemporary map, so you can see how the area has changed. You’re then invited to leave a note attached to the pin—memories, info about who lived there, what the neighborhood was like, questions… As people use the site, we’ll build a cultural map of New York in 1940 that will assist both professional historians and laypeople alike.” And that’s pretty amazing.

I’m especially fond of this article because it goes on to point out that the famous recluse J.D. Salinger was indeed living in plain sight on Park Avenue in New York City in 1940.  You just had to know his first name was Jerome and have faith that there couldn’t be more than one Jerome D. Salinger in Manhattan.  I think the question for humanist scholars will be what responsible teacher of the culture, art, history, politics, etcetera of America in the 1940s would not want to use this tool and insist that their students use it too.

It’s more than a rumble.

Anthropodermic Bibliopegy: Books in a pound of flesh

Among the other advantages of Twitter–besides finding out what famous people ate for breakfast–I discover knowledge that I find both nauseating and compelling.  In his recent discourse on the history of the book at Messiah College, Anthony Grafton did not manage to get into the arcana of bookbinding, or he might have filled us in a bit more on Anthropodermic Bibliopegy, a term I picked up via a tweet from the LA Times book review.  From the blog the chirurgeon’s apprentice: a website devoted to the horrors of pre-anaesthetic surgery:

The process of binding books using human flesh is known as ‘anthropodermic bibliopegy’. One of the earlier examples dates from the 17th century and currently resides in Langdell Law Library at Harvard University. It is a Spanish law book published in 1605. The colour of the binding is a ‘subdued yellow, with sporadic brown and black splotches like an old banana’. [1] On the last page, there is an inscription which reads:


The bynding of this booke is all that remains of my dear friende Jonas Wright, who was flayed alive by the Wavuma [possibly an African tribe from modern-day Zimbabwe, see below illustration] on the Fourth Day of August, 1632. King Mbesa did give me the book, it being one of poore Jonas chiefe possessions, together with ample of his skin to bynd it. Requiescat in pace. [2]

Although it seems macabre to our modern sensibilities, this book was rebound as a way of memorialising the life of Jonas Wright. In this way, it is similar to mourning jewellery made from the hair of the deceased and worn by the Victorians during the 19th century. It is a poignant reminder of the life that has been lost.

Poignant indeed, though I doubt I’ll be asking my wife if she would like a skin-covered book to remember me by.  The post goes on to note:

Anthropodermic bibliopegy reached its height of popularity during the French Revolution, when a fresh supply of bodies was always available. All sorts of books were wrapped in human skins, including a collection of poems by John Milton. One of the last known books to be bound in this fashion dates from 1893 and currently resides at Brown University. The binder did not have quite enough skin for the book, and thus split the piece into two – the front cover is bound using the outer layer of skin; the back cover and spine are bound using the inner layer of skin.

If you didn’t know better, you would think it was suede.

Gives new meaning to the idea of “Kindle Skins.”

Is the decline of the Humanities responsible for Wall Street Corruption?

Andrew Delbanco’s view in a recent Huffington Post essay is “Yes”, at least to some indeterminate degree, though I admit that the title of his post “A Modest Proposal” gave me some pause given its Swiftian connotations:

What I do know is that at the elite universities from which investment firms such as Goldman Sachs recruit much of their talent, most students are no longer seeking a broad liberal education. They want, above all, marketable skills in growth fields such as information technology. They study science, where the intellectual action is. They sign up for economics and business majors as avenues to the kind of lucrative career Mr. Smith enjoyed. Much is to be gained from these choices, for both individuals and society. But something is also at risk. Students are losing a sense of how human beings grappled in the past with moral issues that challenge us in the present and will persist into the future. This is the shrinking province of what we call “the humanities.”

For the past twenty years, the percentage of students choosing to major in the humanities — in literature, philosophy, history, and the arts — has been declining at virtually all elite universities. This means, for instance, that fewer students encounter the concept of honor in Homer’s Iliad, or Kant’s idea of the “categorical imperative” — the principle that Mr. Smith thinks is out of favor at Goldman: that we must treat other people as ends in themselves rather than as means to our own satisfaction. Mr. Smith was careful to say that he was not aware of anything illegal going on. But few students these days read Herman Melville’s great novella, Billy Budd, about the difficult distinction between law and justice.

Correlation is not cause, and it’s impossible to prove a causal relation between what students study in college and how they behave in their post-college lives. But many of us who still teach the humanities believe that a liberal education can strengthen one’s sense of solidarity with other human beings — a prerequisite for living generously toward others. One of the striking discoveries to be gained from an education that includes some knowledge of the past is that certain fundamental questions persist over time and require every generation to answer them for itself.

via Andrew Delbanco: A Modest Proposal.

This is consonant with Delbanco’s thesis–expressed in his book College: What It Was, Is, and Should Be–that education in college used to be about the education of the whole person but has gradually been emptied of the moral content of its originally religious and more broadly civic vision, and the preprofessional and pecuniary imagination has become the dominant if not the sole rationale for pursuing an education.  I am viscerally attracted to this kind of argument, so I offer a little critique rather than cheerleading.  First, while I do think it’s the case that an education firmly rooted in the humanities can provide for the kinds of deep moral reflection that forestalls a purely instrumentalist view of our fellow citizens–or should I say consumers–it’s also very evidently the case that people with a deep commitment to the arts and humanities descend into moral corruption as easily as anyone else.  The deeply felt anti-Semitism of the dominant modernists would be one example, and the genteel and not-so-genteel racism of the Southern Agrarians would be another.  When Adorno said that there was no poetry after Auschwitz, he was only partly suggesting that the crimes of the 20th century were beyond the redemption of literature; he also meant more broadly that the dream that literature and culture could save us was itself a symptom of our illness, not a solution to it.  Delbanco might be too subject to this particular dream, I think.

Secondly, I think that this analysis runs close to blaming college students for not majoring in or studying more in the humanities and is a little bit akin to blaming the victim–these young people have inherited the world we’ve given them, and we would do well to look at ourselves in the mirror and ask what kind of culture we’ve put in place that would make the frantic pursuit of economic gain in the putative name of economic security a sine qua non in the moral and imaginative lives of our children.

That having been said, yes, I do think the world–including the world of Wall Street–would be better if students took the time to read Billy Budd or Beloved, wedging it in somewhere between the work-study jobs, appointments with debt counselors, and multiple extracurriculars and leadership conferences that are now a prerequisite for a job after college.