Category Archives: cultural literacy

On being human: Lessons from Harvard Institute for Management and Leadership in Higher Education

In a specific sense I am an unabashed advocate of what has come to be called the applied humanities, roughly and broadly speaking the effort to connect the study of the humanities to identifiable and practical social goods. For me, it also includes the effort to develop humanities programs that take seriously that we are responsible (at least in significant part) for preparing our students for productive lives after college, preparation that I think should be embedded within humanities curricula, advising, cocurricular programming, and the general ethos and rhetoric we use to inculcate in our students what it means to be a humanist.

In several respects this conviction lies at the root of my advocacy for both digital humanities programs and for career planning and programming for liberal arts students, as different as these two areas seem on the surface. I have little room left anymore for the idea that “real learning” or intellectual work pulls up its skirts to avoid the taint of the marketplace or the hurly-burly of political arenas, and that we demonstrate the transcendent value of what we do over and above professional programs by repeatedly demonstrating our irrelevance. Far from diminishing the humanities, an insistence that what we do has direct and indirect, obvious and not so obvious connections to social value enhances the humanities. It’s not just a selling point to a doubting public. As I said yesterday, the only good idea is the idea that can be implemented. We ought to be proud of the fact that we can show directly how our students succeed in life, how they apply the things they’ve learned, how they find practical ways of making meaningful connections between their academic study and the world of work.

At the same time, I will admit that some versions of this argument leave me cold.  It risks saying that the only thing that is really valuable about the humanities is what is practically relevant to the marketplace. I greet this effort to make Wordsworth a useful version of a management seminar with a queasy stomach.

It may sound like a nice day out in beautiful surroundings, but can walking around Lake District sites synonymous with Romantic poet William Wordsworth really offer business leaders and local entrepreneurs the crucial insights they need?

That is precisely the claim of Wordsworth expert Simon Bainbridge, professor of Romantic studies at Lancaster University, who believes the writer can be viewed as a “management guru” for the 21st century.

Since 2007, the scholar has taken students down into caves and out on canoes to the island on Grasmere once visited by Wordsworth and fellow poet Samuel Taylor Coleridge, and to places where many of the former’s greatest works were written, for what he called “practical exercises linked to the themes of Wordsworth’s poetry.”

Such walks, which also have been incorporated into development days for individual firms, are now being offered as a stand-alone option for local and social entrepreneurs at a rate of £175 ($274) a day.

Read more: http://www.insidehighered.com/news/2012/08/09/businesses-pay-british-professor-teach-them-about-wordsworth#ixzz236bQaECf 
Inside Higher Ed 

I do not find the insight here wrong so much as sad.  If the only reason we can get people to read Wordsworth is because he will enhance their management skills, we have somehow misplaced a priority, and misunderstood the role that being a manager ought to play in our lives and in the social and economic life of our society.  It is the apparent reduction of all things and all meaning to the marketplace that is to be objected to and which every educational institution worthy of the name ought to furiously resist, not the fact of marketplaces themselves.

I was lucky enough this summer to attend the Harvard Institute for Management and Leadership in Education. To be honest, I went thinking I was going to get all kinds of advice on things like how to organize projects, how to manage budgets, how to promote programs, how to supervise personnel. There was some of that to be sure, but what struck me most was that the Institute, under the leadership of Bob Kegan, put a high, even principal, priority on the notion that managers have to first take care of who they are as human beings if they are to be the best people they can be for their colleagues and their institutions. You have to know your own faults and weaknesses, your own strengths, your dreams, and you have to have the imagination and strength of mind and heart (and body) to learn to attend to the gifts, faults, dreams, and nightmares of others before, or at least simultaneously with, your own. In other words, being a better manager is first and foremost about becoming a healthier, more humane, fuller human being.

The tendency of some applied humanities programs to show the relevance of poetry by showing that it has insights into management techniques, or the relevance of philosophy because it will help you write a better project proposal, misplaces causes and turns the human work of another imagination (in this case Wordsworth’s) into an instrumental opportunity. The reason for reading Wordsworth, first and foremost, is that Wordsworth is worth reading, and simultaneously that the encounter with Wordsworth will give you the opportunity to be a fuller, more imaginative, more thoughtful human being than you were before.

If you become that, you will have a chance to be a better manager. But even if you don’t become a better manager, or if you lose your job because your company is overtaken by Bain Capital or because students no longer choose to afford your pricey education, you will, nonetheless, be richer.

Hermeneutics of the stack and the list: unreading journals in print or online

The New York Times reported yesterday that The Wilson Quarterly will put out its final print issue in July (Wilson Quarterly to End Print Publication – NYTimes.com). The editorial staff seemed sanguine.

“We’re not going on the Web per se,” Steven Lagerfeld, the magazine’s editor, said in an interview. “We already have a Web site. The magazine will simply be published in a somewhat different form as an app,” first in the iTunes store and later on the Android platform.

And, to be honest, I’m sanguine too. Although I noted the demise of the University of Missouri Press with a half shudder last week, I have to admit that I don’t greet the demise of print journals with the same anxiety. I’ve recognized lately that I mostly buy paper journals so I can have access to their online manifestations, or because I feel guilty knowing that online long form journalism and feature writing has yet to find a way to monetize itself effectively. I try to do my part by littering my office and bedroom with stacks and stacks of largely unopened New York Reviews, New Yorkers, Chronicles of Higher Ed, and a few other lesser known magazines and specialist journals. But most of my non-book reading, long form or not, is done on my iPad.

I will leave aside the question of what we will do if good journals like the Wilson Quarterly really can’t survive on iTunes distribution (WQ only survived in paper because of the indulgence of the Woodrow Wilson International Center for Scholars). I’m more interested at the moment in the fact of the stack and what it signifies in the intellectual life. Every intellectual I know of is guilty of stockpiling books and journals that she never reads, and can never reasonably expect to, at least not if she has a day job. The stack is not simply a repository of knowledge and intellectual stimulation beckoning to the reader, drawing him away from other mundane tasks like reading or preparing for class with an ennobling idea of staying informed. (Side note: academia is the one place in life where every activity of daily life can be construed as tax deductible; just make a note about it and write “possible idea for future article” at the top of the page.)

No, the stack is also a signifier. It exists not so much to be read, since most academics give up hopelessly on the idea of reading every word of the journals they receive. The stack exists to be observed. Observed on the one hand by the academic him or herself, a reassuring sign of one’s own seriousness, that one reads such things and is conversant with the big ideas, or at least the nifty hot ideas, about culture high and low. The stack also exists to be observed by others: the rare student who comes by during office hours, the dean who happens to drop by to say hello, the colleagues coming in to ask you out for coffee–“Oh, you already got the latest issue of PMLA!” The stack suggests you are up to date, or intend to be. The stack communicates your values. Which journal do you put strategically out at the edge of the desk to be observed by others, and which do you stack heedlessly on top of the file cabinet? Even the hopelessly disheveled office can signify, as did Derrida’s constantly disheveled hair: I am too busy and thinking too many big thoughts to be concerned with neatness.

The stack, like the Wilson Quarterly, is on its way out, at least for academics.  I realized four or five years ago that e-books would signify the end of a certain form of identification since people would no longer self-consciously display their reading matter in coffee houses or on subways, every text hidden in the anonymous and private cover of the Kindle or now the iPad.  While I could connect now with other readers in Tibet or Siberia, I could not say off-handedly to the college student sitting next to me–“Oh, you’re reading Jonathan Safran Foer, I loved that book!”

The stack too is going and will soon be gone.  Replaced now by the endless and endlessly growing list of articles on Instapaper that I pretend I will get back to.  This has not yet had the effect of neatening my office, but it will remove one more chance at self-display.  I will soon be accountable only for what I know and what I can actually talk about, not what I can intimate by the stacks of unread paper sitting on my desk.

Digital Archive as Advertisement: The Hemingway Papers

The pace at which digital material is being made available to the public and to students and scholars in the humanities is accelerating, whether one thinks of the digitization of books, the new MOOCs from MIT and Harvard and others that will extend learning in the humanities and other fields, or the digitization of papers and manuscripts that were previously confined to highly restricted manuscript or rare book sections of single libraries, like the James Joyce Papers just released in Ireland.

Another addition to this list is the release of a new digitized collection of Hemingway’s writings for the Toronto Star.  The Star has put together the columns written by Hemingway for the paper in the early 20s, along with some stories about the  writer.  I’m basically extremely happy that archives like this and others are taking their place in the public eye.  I had a great course on Hemingway while pursuing an MFA at the University of Montana with Gerry Brenner, and the legacy of Hemingway was felt everywhere.  Still is as far as I’m concerned.

At the same time, I admit that the Star site left me just a little queasy and raised a number of questions about what the relationship is between a commercial enterprise like the Star and digital and scholarly work more generally. The first cue for me was the statement of purpose in the subtitle of the homepage:

The legendary writer’s reporting from the Toronto Star archives, featuring historical annotations by William McGeary, a former editor who researched Hemingway’s columns extensively for the newspaper, along with new insight and analysis from the Star’s team of Hemingway experts.

I hadn’t really realized that the Toronto Star was a center of Hemingway scholarship, but maybe I’ve missed something over the past 20 years.  Other similar statements emphasize the Star’s role in Hemingway’s life as much as anything about Hemingway himself:  emphases on the Star’s contributions to the great writer’s style (something that, if I remember, Hemingway himself connected more to his time in Kansas City), emphases on the way the Star nurtured the writer and on the jovial times Hemingway had with Star editorial and news staff.  Sounds a little more like a family album than a really serious scholarly take on what Hemingway was about in this period.  Indeed, there is even a straightforward and direct advertisement on the page as it sends you to the Toronto Star store where you can purchase newsprint editions of Hemingway’s columns.

I don’t really want to look a gift horse in the mouth. There’s a lot of good stuff here, and just having the articles and columns available may be enough; I can ignore the rest. Nevertheless, the web is a framing device that makes material available within a particular context, and here that context clearly has a distinct commercial angle. It strikes me that this is a version of public literary history that has all the problems of public history in general that my colleague John Fea talks about over at The Way of Improvement Leads Home. Here, of course, it is not even really the public doing the literary history but a commercial enterprise that has a financial stake in making itself look good in light of Hemingway’s legacy.

The Star promises the site will grow, which is a good thing. I hope it will grow in a way that allows for more genuine scholarly engagement with Hemingway’s legacy as well as more potential interactivity. The site is static, with no opportunity for engagement at all, so everything is controlled by the Star and its team of Hemingway experts. We take it or we leave it.

For the moment I am taking it, but I worry about the ways commercial enterprises can potentially shape our understanding of literary and cultural history for their own ends.  I wonder what others think about the role of commercial enterprises in establishing the context through which we think about literature and culture?

Barack Obama’s Waste Land; President as First Reader

GalleyCat reported today that the new biography of Barack Obama gives an extensive picture of Obama’s literary interests, including a long excerpt of a letter in which Obama details his engagement with T.S. Eliot and his signature poem, The Waste Land. Obama’s analysis:

Eliot contains the same ecstatic vision which runs from Münzer to Yeats. However, he retains a grounding in the social reality/order of his time. Facing what he perceives as a choice between ecstatic chaos and lifeless mechanistic order, he accedes to maintaining a separation of asexual purity and brutal sexual reality. And he wears a stoical face before this. Read his essay on Tradition and the Individual Talent, as well as Four Quartets, when he’s less concerned with depicting moribund Europe, to catch a sense of what I speak. Remember how I said there’s a certain kind of conservatism which I respect more than bourgeois liberalism—Eliot is of this type. Of course, the dichotomy he maintains is reactionary, but it’s due to a deep fatalism, not ignorance. (Counter him with Yeats or Pound, who, arising from the same milieu, opted to support Hitler and Mussolini.) And this fatalism is born out of the relation between fertility and death, which I touched on in my last letter—life feeds on itself. A fatalism I share with the western tradition at times.

A Portrait of Barack Obama as a Literary Young Man – GalleyCat.

For a 22-year-old, you’d have to say this is pretty good. I’m impressed with the nuance of Obama’s empathetic imagination, both in his ability to perceive the differences between the three great conservative poets of that age, and in his ability to identify with Eliot against his own political instincts. This is the kind of reading we’d like to inculcate in our students, and I think it lends credence to the notion that a mind trained in this kind of engagement might be better trained for civic engagement than one that is not. But too often even literature profs are primarily readers of the camp, so to speak, lumping those not of their own political or cultural persuasion into the faceless, and largely unread, camp of the enemy, and appreciating without distinction those who further our pet or current causes.

This is too bad, reducing a richer sense of education for civic engagement into the narrower and counterproductive sense of reading as indoctrination. I think this older notion is the vision of education that motivated the founding fathers. Whatever one thinks of his politics, passages like this suggest to me that Obama could sit unembarrassed with Jefferson and Adams discussing in all seriousness the relationship between poetry and public life. It would be a good thing to expect this of our presidents, rather than stumbling upon it by accident.

We are all twitterers now: revisiting John McWhorter on Tweeting

Angus Grieve Smith over at the MLA group on LinkedIn pointed me toward John McWhorter’s take on Twitter from a couple of years back. [I admit to some embarrassment in referencing an article that’s TWO YEARS OLD!! But the presentism of writing for the web is a story for another time.] McWhorter’s basic case is that Twitter is not really writing at all, but a form of graphic speech. My term, not McWhorter’s. He points out that most people, even after learning how to read and write, speak in bursts of 7 to 10 words, and that writing at its origins reflected these kinds of patterns. In other words, as speakers we are all twitterers. Of Twitter, McWhorter says:


The only other problem we might see in something both democratic and useful is that it will exterminate actual writing. However, there are no signs of this thus far. In 2009, the National Assessment of Education Performance found a third of American eighth graders – the ones texting madly right now — reading at or above basic proficiency, but crucially, this figure has changed little since 1992, except to rise somewhat. Just as humans can function in multiple languages, they can also function in multiple kinds of language. An analogy would be the vibrant foodie culture that has thrived and even flowered despite the spread of fast food.

Who among us really fears that future editions of this newspaper will be written in emoticons? Rather, a space has reopened in public language for the chunky, transparent, WYSIWYG quality of the earliest forms of writing, produced by people who had not yet internalized a sense that the flavor of casual language was best held back from the printed page.

This speech on paper is vibrant, creative and “real” in exactly the way that we celebrate in popular forms of music, art, dance and dress style. Few among us yearn for a world in which the only music is classical, the only dance is ballet and daily clothing requires corsets and waistcoats. As such, we might all embrace a brave new world where we can both write and talk with our fingers.

via Talking With Your Fingers – NYTimes.com.


I mostly agree with this idea, that we are all code shifters. I’m less sanguine than McWhorter, however. His eighth-grade sample takes students before they’ve entered a period of schooling in which they’d be expected to take on more serious reading and longer and more complex forms of writing. Twitter may not be to blame, but it’s not clear to me that the state of writing and reading comprehension at higher levels is doing all that well. There’s some pretty good evidence that it isn’t. So just as a foodie culture may thrive in the midst of a lot of fast food, it’s not clear that we ought to be complacent in the face of an obesity epidemic. In the same way, just because tweeting may not signal the demise of fine writing, it’s not clear that it’s helping the average writer become more facile and sophisticated in language use.

Is Twitter Destroying the English Language?

Coming out of the NITLE seminar on Undergraduate Research in Digital Humanities, my title question was one of the more interesting questions on my mind.  Janis Chinn, a student at the University of Pittsburgh, posed this question as a motivation for her research on shifts in linguistic register on Twitter.  I’m a recent convert to Twitter and see it as an interesting communication tool, but also an information network aggregator.  I don’t really worry about whether Twitter is eroding my ability to write traditional academic prose, but then, I’ve inhabited that prose for so long that it’s more the case that I can’t easily adapt to the more restrictive conventions of Twitter.  And while I do think students are putting twitterisms in their papers, I don’t take this as specifically different from the tendency of students to use speech patterns as the basis for constructing their papers without recognizing the different conventions of academic prose.  So Twitter poses some interesting issues, but not issues that strike me as different in kind from other kinds of language use.

I gather from the website for her project that Janis is only at the beginning of her research and hasn’t developed her findings yet, but it looks like a fascinating study.  Part of her description of the work is as follows:

Speakers shift linguistic register all the time without conscious thought. One register is used to talk to professors, another for friends, another for close family, another for one’s grandparents. Linguistic register is the variety of language a speaker uses in a given situation. For example, one would not use the same kind of language to talk to one’s grandmother as to your friends. One avoids the use of slang and vulgar language in an academic setting, and the language used in a formal presentation is not the language used in conversation. This is not just a phenomenon in English, of course; in languages like Japanese there are special verbs only used in honorific or humble situations and different structures which can increase or decrease the politeness of a sentence to suit any situation. This sort of shift takes place effortlessly most of the time, but relatively new forms of communication such as Twitter and other social media sites may be blocking this process somehow.

In response to informal claims that the current generation’s language is negatively affected by modern communication tools like Twitter, Mark Liberman undertook a brief analysis comparing the inaugural addresses of various Presidents. This analysis can be found on the University of Pennsylvania’s popular linguistics blog “Language Log”. Remarkably, he found a significant trend of shortening sentence and word lengths over the last 200 years. My research, while not addressing this directly, will demonstrate whether using these services affects a user’s ability to shift linguistic registers to match the situation as they would normally be expected to.

Fascinating question in and of itself. I think on some level I’ve always been deeply aware of these kinds of shifts.  As a kid, when my parents were missionaries in New Guinea, I would speak with an Aussie accent while I was with kids at the school across the valley, then shift back into my Okie brogue on the mission field and in my house.  And as I made my way into academe, my southern and southwesternisms gradually dropped away, with a very few exceptions–aware as I was that my accent somehow did not signal intelligence and accomplishment.  Mockery of southern white speech remains a last bastion of prejudice in the academy generally.  I don’t think these are the kinds of register shifts Janis is looking at, but it’s the same territory.

I’m also more interested in the general motive questions.  If we could prove that Twitter inhibited the ability to shift registers, would that count as destroying or damaging the language in some sense?  If we could demonstrate that Twitter was leading people to use shorter and shorter sentences, or to be less and less able to comprehend sentences longer than 160 characters, would this signal an erosion of the language?  We must have some notion that language can be used in more effective and less effective ways, since we are all very aware that communication can fail abysmally or succeed beyond our hopes, and usually ends up somewhere in between.  Does the restricted nature of Twitter limit or disable some forms of effective communication while simultaneously enabling others?  These are interesting questions.  I’m sure more intelligent people than I am are working on them.
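Out of curiosity about what Liberman’s comparison actually involves: the surface measures he reports (average sentence length and average word length) take only a few lines of code to compute. Below is a minimal sketch in Python; the two sample texts are invented stand-ins for a formal address and a run of tweets, and the crude tokenization is my own simplifying assumption, not anything drawn from Liberman’s or Chinn’s actual methods.

```python
import re
from statistics import mean

def surface_measures(text):
    """Average sentence length (in words) and average word length (in characters),
    the kind of crude surface measures used in sentence-length comparisons."""
    # Split on sentence-ending punctuation; rough, but adequate for a sketch.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    return {
        "avg_sentence_len": mean(len(re.findall(r"[A-Za-z']+", s)) for s in sentences),
        "avg_word_len": mean(len(w) for w in words),
    }

# Invented samples standing in for a formal address and a batch of tweets.
formal = ("We observe, in the conduct of our common affairs, that the habits of "
          "deliberation which sustain a republic are not easily acquired. They are "
          "the slow work of institutions, of education, and of patient argument.")
informal = "ok so this is wild. just saw the news. cant even. more later i guess."

for label, sample in (("formal", formal), ("informal", informal)):
    m = surface_measures(sample)
    print(f"{label}: {m['avg_sentence_len']:.1f} words per sentence, "
          f"{m['avg_word_len']:.1f} characters per word")
```

A script like this settles nothing about erosion, of course; it only shows how low the technical bar is for the kind of descriptive comparison these debates keep citing.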

Do Humanities Programs Encourage the Computational Illiteracy of Their Students?

I think the knee-jerk and obvious answer to my question is “No.”  I think if humanities profs were confronted with the question of whether their students should develop their abilities in math (or more broadly in math, science and technology), many or most would say Yes.  On the other hand, I read the following post from Robert Talbert at the Chronicle of Higher Ed.  It got me thinking just a bit about how and whether we in the humanities contribute to an anti-math attitude among our own students, if not in the culture as a whole.

I’ve posted here before about mathematics’ cultural problem, but it’s really not enough even to say “it’s the culture”, because kids do not belong to a single monolithic “culture”. They are the product of many different cultures. There’s their family culture, which as Shaughnessy suggests either values math or doesn’t. There’s the popular culture, whose devaluing of education in general and mathematics in particular ought to be apparent to anybody not currently frozen in an iceberg. (The efforts of MIT, DimensionU, and others have a steep uphill battle on their hands.)

And of course there’s the school culture, which itself a product of cultures that are out of kids’ direct control. Sadly, the school culture may be the toughest one to change, despite our efforts at reform. As the article says, when mathematics is reduced to endless drill-and-practice, you can’t expect a wide variety of students — particularly some of the most at-risk learners — to really be engaged with it for long. I think Khan Academy is trying to make drill-and-practice engaging with its backchannel of badges and so forth, but you can only apply so much makeup to an inherently tedious task before learners see through it and ask for something more.

via Can Math Be Made Fun? – Casting Out Nines – The Chronicle of Higher Education.

This all rings pretty true to me.  There are similar versions of this in other disciplines.  In English, for instance, students unfortunately can easily learn to hate reading and writing through what they imbibe from popular culture or through what they experience in the school system.  For every hopeless math geek on television, there’s a reading geek to match.  Still and all, I wonder whether we in the humanities combat and intervene in the popular reputation of mathematics and technological expertise, or whether we just accept it and in fact reinforce it.

I think, for instance, of the unconscious assumption that there are “math people” and “English people”; that is, there’s a pretty firmly rooted notion that people are born with certain proclivities and abilities and that there is no point in addressing deficiencies in your literacy in other areas.  More broadly, I think we apply this to students, laughing in knowing agreement when they talk about coming to our humanities disciplines because they just weren’t math people or science people, or groaning together in the faculty lounge about how difficult it is to teach our general education courses to nursing students or to math students.  As if our own abilities were genetic.

In high school I was highly competent in both math and English, and this tendency wasn’t all that unusual for students in the honors programs.  On the other hand, I tested out of math and never took another course in college, and none of my good humanistic teachers in college ever challenged me or asked me to question that decision.  I was encouraged to take more and different humanities courses (though, to be frank, my English teachers were suspicious of my interest in philosophy), but being “well-rounded” and “liberally educated” seems in retrospect to have been largely a matter of being well-rounded in only half of the liberal arts curriculum.  Science and math people were well-rounded in a different way, if they were well-rounded at all.

There’s a lot of reason to question this, not least that if our interests and abilities really were genetic, we would have to conclude that the gene pool has surged massively toward the STEM side of the equation, if enrollments in humanities majors are any judge.  I think it was Malcolm Gladwell who recently pointed out that genius has a lot less to do with giftedness than it does with practice and motivation.  Put 10,000 hours into almost anything and you will become a genius at it (not entirely true, but the general principle applies).  Extrapolating, we might say that even if students aren’t going to be geniuses in math and technology, they could actually get a lot better at it if they’d only try.

And there’s a lot of reason to ask them to try.  At the recent Rethinking Success conference at Wake Forest, one of the speakers, who does research on the transition of college students into the workplace, pounded the table and declared, “In this job market you must either be a technical student with a liberal arts education or a liberal arts major with technical savvy.  There is no middle ground.”  There is no middle ground.  What became quite clear to me at this conference is that companies mean it when they say they want students with a liberal arts background.  However, it was also very clear to me that they expect those students to have technical expertise that can be applied immediately to job performance. Speaker after speaker affirmed the value of the liberal arts.  They also emphasized the absolute and crying need for computational, mathematical, and scientific literacy.

In other words, we in the Humanities will serve our students extremely poorly if we accept their naive statements about their own genetic makeup, allowing them to proceed with a mathematical or scientific illiteracy that we would cry out against if the same levels of illiteracy were evident in others with respect to our own disciplines.

I’ve found, incidentally, in my conversations with my colleagues in information sciences, math, and the sciences, that many of them are much more conversant in the arts and humanities than I or my colleagues are in even the generalities of science, mathematics, or technology.  This ought not to be the case, and in view of that a few of my colleagues and I are considering taking some workshops in computer coding with our information sciences faculty.  We ought to work toward creating a generation of humanists that does not perpetuate our own levels of illiteracy, for their own sake and for the health of our disciplines in the future.

Interview with Andrew Delbanco: Students, you have saved others, now save yourselves

Following up on my recent posts on Andrew Delbanco (here, here, and here), there’s an interesting interview with Delbanco at The Chronicle of Higher Education as part of their Afterwords series, speaking further about his recent book:

Andrew Delbanco Interview–Chronicle of Higher Education

Mostly Delbanco covers the same territory here, and again, I admire his ideals.  I remain struck, though, by the way he puts the onus on students to resist the commercialization of college life. Again, I wonder why it is up to students to do this.  Don’t most of them end up working within an overwhelmingly overdetermined system, hopelessly recognizing that a college or university degree is necessary for their success in life, and realizing that in exchange for several tens of thousands of dollars in debt they are being offered a chance at a reasonably secure existence?  How can it be up to college students to resist this commercialization when college and university life is so thoroughly commercialized from the moment of the transaction–through admissions decisions that consider the ability to pay, to financial aid offerings, to debt loads, to student jobs necessary for paying basic expenses?  What student could avoid understanding that there is a deeply commercial angle to the transaction?

Note, I am not saying the commercialization of higher education should not be resisted, but it seems peculiar to me to put the emphasis on the need for students to do this.  The question ought to be: how do we change the structures of higher education that are making the commercialization of their education inevitable?

That is a tougher nut to crack than pleading with undergraduates to resist pecuniary interests and take humanities majors anyway.

Dispatches from the Digital Revolution

I know right now that I am partly subject to the enthusiasm of the new convert in seeing my object of adoration everywhere I turn, but truly, it seems that everywhere I turn these days I see the landslide toward a total digitalization of the world of the humanities.  Like a landslide, it may have looked a long way off at first, but it’s upon us now, and the rumble has become a roar.   As I said in this previous post, I think we’re a long way past a print-plus world, and we had better figure out how digital tools, whether simple things like e-books or the complex tools and methodologies associated with digitalization, are going to change what we are doing with ourselves and our students.  A few rumblings:

1. Robert Darnton announces that the Digital Public Library of America will be up and running by 2013.  Darnton, an advocate of public digitalization efforts that will prevent private entities like Google from controlling access to information, has spearheaded the effort to bring together the digitalization efforts of libraries around the globe.  According to the DPLA’s website, the purpose of the DPLA is focused in the following ways:

Many universities, public libraries, and other public-spirited organizations have digitized materials that could be brought together under the frame of the DPLA, but these digital collections often exist in silos. Compounding this problem are disparate technical standards, disorganized and incomplete metadata, and a host of legal issues. No project has yet succeeded in bringing these different viewpoints, experiences, and collections together with leading technical experts and the best of private industry to find solutions to these complex challenges. Users have neither coherent access to these materials nor tools to use them in new and exciting ways, and institutions have no clear blueprint for creating a shared infrastructure to serve the public good. The time is right to launch an ambitious project to realize the great promise of the Internet for the advancement of sharing information and of using technology to enable new knowledge and discoveries in the United States.

2. Appearance of the Journal of Digital Humanities:  I already mentioned this yesterday, but I’ll go ahead and do it again.  It seems to me that Digital Humanities is coalescing into a force in academe, rather than remaining a marginalized crew on the ragtag end, not unlike the massive changes that occurred in humanistic studies after 1966 with the advent of deconstruction and its step-children.  In my estimation the change may be even more massive, and perhaps more painful and more exciting, than those earlier changes, since deconstruction did not essentially change the tools of the trade: we still read books (and gradually included film, pop culture, and other media) and we still wrote papers about them.  While deconstruction may have been a more sophisticated and nifty-looking hammer, it was still basically a hammer.  Digital Humanities is changing humanistic work at the level of the tool, creating houses without hammers.

3. People who read e-books read more books than those who do not. A new Pew Research Center study suggests the following:

a survey from the Pew Research Center’s Internet & American Life Project shows that e-book consumers in the U.S. are reading over a third more books than their print-only customers. According to the report, titled “The Rise of E-Reading,” the average reader of e-books says he or she has read 24 books in the past 12 months, compared with an average of 15 books by non–e-book consumers.

Overall, Pew found that the number of American adults who say they have read an e-book rose to 21%, compared to 17% reported just a few months ago in December 2011. That jump comes following a holiday season that saw a spike in the ownership of both tablet computers and dedicated e-readers.

I admit that I want to cavil a bit about this news.  It’s also been demonstrated that e-readers so far are overwhelmingly dominated by pulp fiction romances and mysteries, the kind of thing you can read easily in a day.  On the other hand, book selling and reading in general has ALWAYS been dominated by the romance and mystery genres, so that’s nothing new.

The same Publishers Weekly article points to a study saying that e-readers are poised to take off with a massive global spike.  We’ve heard this before, but…. Well, I asked my boss the other day if I could purchase a Kindle so I could experiment with the Kindle library program.  I am over the edge and into the dark side of the abyss.

4. The New York Public Library opened up an amazing new database tool for the 1940 census–itself an amazing database just released by the U.S. government.  I haven’t totally figured out how to use it yet, but you can search for persons in the census, tag their locations in GIS-based maps of New York City, and do multilayered searching of NYC based on the crowd-sourced effort at developing a digital social history of New York City.  According to this article in the Gothamist,

Kate Stober at the NYPL tells us it’s “more than just a research tool, we’ll be helping New Yorkers create a social history map of buildings and neighborhoods in the five boroughs. When you find an address, the tool pins it to both a 1940 map and a contemporary map, so you can see how the area has changed. You’re then invited to leave a note attached to the pin—memories, info about who lived there, what the neighborhood was like, questions… As people use the site, we’ll build a cultural map of New York in 1940 that will assist both professional historians and laypeople alike.” And that’s pretty amazing.

I’m especially fond of this article because it goes on to point out that the famous recluse J.D. Salinger was indeed living in plain sight on Park Avenue in New York City in 1940.  You just had to know his first name was Jerome and have faith that there couldn’t be more than one Jerome D. Salinger in Manhattan.  I think the question for humanist scholars will be what responsible teacher of the culture, art, history, politics, etcetera of America in the 1940s would not want to use this tool and insist that their students use it too.

It’s more than a rumble.

Literacy in the Digital Humanities: Or, a clueless “noob” in digital academe

Today my faculty group focused on the Digital Humanities here at Messiah College had a great session with Ryan Cordell from St. Norbert’s College.  Ryan blogs regularly for ProfHacker at the Chronicle of Higher Education, and holds down the Digital Humanities fort (or perhaps leads the insurgency) at St. Norbert’s.  He’s also done some work advising liberal arts colleges in particular on projects in the Digital Humanities, so I thought he’d be a good choice for consulting.  I’m happy with the choice:  Ryan was practical and down-to-earth, while also pointing to really challenging and exciting places we could take some of our nascent ideas.  I think we came away with some good possibilities for next steps that will lead to concrete action in the next year.  I highly recommend Ryan if you’re looking for a consultant for starting or managing digital humanities projects in a smaller school setting.

Earlier in the day I had had the good luck to look in on a massive Twitter debate that was, unbeknownst to the participants, about, or at least precipitated by, me and a brief conversation I’d had with Ryan.  I’d told Ryan that one of my biggest concerns was professional development for faculty and getting them over some of the immediate humps of alienation that traditional humanistic scholars feel when confronted with what amounts to an alien DH world.  I mentioned the fact that I and one of my colleagues, David Pettegrew (who is himself much more versed in technical know-how than I am), went to a THATCamp and spent the first two or three hours feeling completely lost and at sea, unable to fully comprehend half the language that was being used or the tasks that we were being asked to implement. I mentioned to Ryan that I felt I probably needed to have had half a semester of a coding class before I would have gotten everything out of the THATCamp that I should have gotten.  Although that improved as things went along and we got into concrete projects, and although I found everyone very gracious and the atmosphere enthusiastic, I was worried that my faculty who were only interested in investigating (and perhaps then only after my pleading) would be discouraged or uninterested in engaging with DH if a THATCamp was their first experience.

Ryan mentioned this in a tweet yesterday.

All-twitter-hell broke loose.

Well, not really.  In fact it was a really fascinating and intellectually complex conversation–one I wouldn’t have thought could happen via Twitter.  I won’t try to completely replicate that conversation here.  You could go to Ryan’s Twitter feed and find the essentials for yourself.  It was clear, though, that Ryan’s tweet had touched what amounted to a raw digital nerve.  Some twitterers were flabbergasted that anyone would find a THATCamp too daunting or that it could ever be alienating.  Others assumed that the problem definitely must have been with me, that I was too shy to ask for help.  Ultimately the conversation turned to a pretty serious engagement with the question of whether there were genuinely insider and exclusive groups and hierarchies within DH.

As a “noob”–which I discovered in the course of the Twitter conversation yesterday is what I am–I am here to say, without a hint of condemnation, “Yes, yes, yes there are.”

For me, this is not a moral or even a political statement, though it was very clear to me that for many people in the conversation this was a moral or political concern.  To admit to hierarchies and exclusivity was a betrayal of the collaborative and radically democratic spirit that many feel is at the heart of DH work.  I will say that these collaborative aspects are part of what most attracts me to what’s going on in DH, as little as I actually know; I see it as a superb fit for some of the commitments my school has to the public humanities and to public service more generally, besides moving students into more collaborative learning environments that will be useful to them in the world they are entering.

However, any academic discourse that is imaginable, maybe any discourse that is imaginable at all, operates by exclusion and inclusion, simply given the facts that there are those who know the language and those who do not, those who are literate in the language and those who are not, those who are fluent in the language and those who are not, and those who are creators in, with, and of the language and those who are not.  It is impossible for me to imagine how this could be otherwise.

The reason DH can be difficult and alienating for beginners like me is that we don’t know enough of the language even to know what to ask for. I will say I mused over the question of whether I had just been too shy to ask for help at the THATCamp.  Being a shrinking violet is not really a quality that will get you terribly far in administration, so I doubt it, but it may be that I could have asked for more help.  The problem was, I felt so lost that I wasn’t entirely sure what kind of help to ask for.  This is a basic function of discourse: to understand the parameters of the language games you are playing, to know what questions to ask, what moves to make and when, and where to go for the vocabulary you need.  It’s why you need consultants like Ryan, or teachers who are in the know.  It’s the rationale for the title of my post, referencing Gerald Graff’s Clueless in Academe.  DH is obviously a part of academe, even in its alt-academic forms, and it is increasingly central to academic work in the humanities, and there are an awful lot of people who are clueless about where to begin.

There is nothing morally or politically wrong with this or with being a part of an in group.  To say there is would be to say there is something morally or politically wrong with being alive.  Hyper-Calvinists aside, I don’t think this is a tenable position.

The problem, however, from an administrator’s point of view–and I speak into this conversation primarily as an administrator who is trying to facilitate the work of others and promote the well-being of our students–is that the pathways toward accessing the language and practices of this world aren’t always terribly clear.  Indeed, ironically, I think some of the laudable democratic ethos in DH work and culture may contribute to this obscurity.  Because a THATCamp–and so much other DH work–is so democratically organized, it means that one experience, conference, or workshop may in fact work well for rank beginners, while another may really require attendees to be a little more versed in the basics before attending.

For me as a person and as a thinker, that’s fine.  I actually look forward to going to another THATCamp someday, even if I am just as lost as I was the first time around. My tenure no longer depends upon it–which gives me a freedom my junior faculty do not have.

However, as an administrator, that democratic quality is a disaster as I consider what kinds of professional development efforts to try to support with my faculty.  I would not be able to tell whether a particular experience would be appropriate for a rank beginner who is hesitantly interested or at least willing to give this a try.  Alternatively, I wouldn’t be able to know ahead of time whether a particular experience would be appropriate for a more advanced colleague who might go and get an iteration of the basics she already knows.  My ability to manage my budgets in a responsible fashion is hampered by my inability to gauge what kinds of professional development experiences I should pursue or promote with my colleagues who are at very different places in their experience of and expertise in DH methodologies and practices.

The traditional life of a humanist academic is elitist in its own obvious ways with its own arcana and exclusionary practices. But the pathway toward access to its languages is fairly well marked, even if it is now increasingly travelled successfully by the very lucky very few.  I could tell junior faculty members 10 years ago that if they wanted to succeed at my college they needed to do three or four things, and I could outline how they should go about doing them.  I don’t sense that kind of pathway to DH work, yet, so while I am wanting mightily to get my faculty more involved with some of these efforts, I’m also aware that without a clearer path for their own professional development, I may be as likely to facilitate confusion as I am to promote professional development.

This problem may simply disappear as DH becomes more and more central to the humanist enterprise, but I suspect as it does become more and more central that the pathways to access will have to become more and more clearly marked.  This means the development of disciplinary (or quasi-disciplinary–I am aware of the angst over thinking of DH as a discipline) protocols and expectations, and as importantly the expected means by which those elements of professional life are effectively accessed by beginners.

This means the recognition of certain gateways and the appointment of their gatekeepers, which all smacks a little bit of hierarchy and exclusion.  However, while it’s true that roadmaps undemocratically dominate a landscape, they also get you where you need to go.  And while gateways mark a boundary, they also let you in.