Category Archives: Humanities

Hermeneutics of the stack and the list: unreading journals in print or online

The New York Times reported yesterday that The Wilson Quarterly will put out its final print issue in July (Wilson Quarterly to End Print Publication – NYTimes.com). The editorial staff seemed sanguine.

“We’re not going on the Web per se,” Steven Lagerfeld, the magazine’s editor, said in an interview. “We already have a Web site. The magazine will simply be published in a somewhat different form as an app,” first in the iTunes store and later on the Android platform.

And, to be honest, I’m sanguine too.  Although I noted the demise of the University of Missouri Press with a half shudder last week, I have to admit that I don’t greet the demise of print journals with the same anxiety.  I’ve recognized lately that I mostly buy paper journals so I can have access to their online manifestations, or because I feel guilty knowing that online long-form journalism and feature writing has yet to find a way to monetize itself effectively. I try to do my part by littering my office and bedroom with stacks and stacks of largely unopened New York Reviews, New Yorkers, Chronicles of Higher Ed, and a few other lesser-known magazines and specialist journals. But most of my non-book reading, long form or not, is done on my iPad.

I will leave aside the question of what we will do if good journals like the Wilson Quarterly really can’t survive on iTunes distribution (WQ only survived in paper because of the indulgence of the Woodrow Wilson International Center for Scholars). I’m more interested at the moment in the fact of the stack and what it signifies in the intellectual life.  Every intellectual I know is guilty of stockpiling books and journals that she never reads, and can never reasonably expect to, at least not if she has a day job.  The stack is not simply a repository of knowledge and intellectual stimulation beckoning to the reader, drawing him away from other mundane tasks like reading or preparing for class with an ennobling idea of staying informed. (Side note:  academia is the one place in life where every activity of daily life can be construed as tax deductible; just make a note about it and write “possible idea for future article” at the top of the page.)

No, the stack is also a signifier.  It exists not so much to be read, since most academics give up hopelessly on the idea of reading every word of the journals they receive.  The stack exists to be observed.  Observed, on the one hand, by the academic him or herself, a reassuring sign of one’s own seriousness, that one reads such things and is conversant with the big ideas, or at least the nifty hot ideas, about culture high and low.  The stack also exists to be observed by others:  the rare student who comes by during office hours, the dean who happens to drop by to say hello, the colleagues coming in to ask you out for coffee–“Oh, you already got the latest issue of PMLA!” The stack suggests you are up to date, or intend to be.  The stack communicates your values.  Which journal do you put strategically out at the edge of the desk to be observed by others, and which do you stack heedlessly on top of the file cabinet?  Even the hopelessly disheveled office can signify, as did Derrida’s constantly disheveled hair: I am too busy and thinking too many big thoughts to be concerned with neatness.

The stack, like the Wilson Quarterly, is on its way out, at least for academics.  I realized four or five years ago that e-books would signify the end of a certain form of identification since people would no longer self-consciously display their reading matter in coffee houses or on subways, every text hidden in the anonymous and private cover of the Kindle or now the iPad.  While I could connect now with other readers in Tibet or Siberia, I could not say off-handedly to the college student sitting next to me–“Oh, you’re reading Jonathan Safran Foer, I loved that book!”

The stack too is going and will soon be gone.  Replaced now by the endless and endlessly growing list of articles on Instapaper that I pretend I will get back to.  This has not yet had the effect of neatening my office, but it will remove one more chance at self-display.  I will soon be accountable only for what I know and what I can actually talk about, not what I can intimate by the stacks of unread paper sitting on my desk.

Dumpster Diving and other career moves: remembering the job market with Roger Whitson

It would be hard to say I enjoyed reading Roger Whitson’s very fine recent meditation in the Chronicle on the quest for a tenure-track job: his ambivalence on finding one, the mixed feelings of exaltation and guilt at getting what so many of his peers would never find, and at leaving behind an #altac existence where he had begun to make a home.

Hard to enjoy reading both because the story seems to typify what our academic life has actually become, and, frankly, because it reminded me too much of my own wandering years as a new academic a couple of decades ago.  I spent seven years full-time on the job market back in the day (if you count the last two years of graduate school).  I have estimated in the past that I must have applied for at least 700-800 jobs during those years–the idea of being targeted and selective a joke for a new father.  Fortunately I was actually only totally unemployed for four months during those years, though that was enough to plunge me thousands of dollars into debt paying for health insurance.  For five of those seven years I had full-time work in various visiting assistant positions, and for two of those visiting years I was paid so little I qualified for food stamps, though I never applied for the program.  I worked as many extra courses as I could to pay the bills–probably foolish for my career, since my publishing slowed to a crawl, but it saved my pride.  I remember asking, naively, during an interview for one such visiting position whether it was actually possible to live in that area of the country on what I was going to be paid.  The chair interviewing me at the time hesitated, then responded, “Well, of course, your wife can work.”

Only one of those years did I not get an interview, and only two of those years did I not get a campus interview, but even then this seemed like a very peculiar and unhelpful way to claim success for a beginning academic career.  We did not have anything called #altac in those days, and my plan B–which on my worst days I sometimes still wonder whether I should have followed–was to go back to cooking school and become a chef (I know, I know.  Another growth industry).  I never felt bad about pursuing a PhD in English, and I don’t think I would have even if I had gone on to become a chef.  The learning was worth it, to me, at least.

But I did grow distant from college friends who became vice-presidents of companies or doctors in growing practices, all of whom talked about their mortgages and vacations in the Caribbean or Colorado, while I was living in the cheapest two-bedroom apartment in Fairfax, Virginia that I could find and fishing furniture, including my daughter’s first bed, out of a dumpster.  (The furniture was held together, literally, by duct tape; I had to pay for conferences.) And I spent a lot of evenings walking off my anxiety through the park next to our apartment complex, reminding myself of how much I had to be thankful for.  After all, I had a job and could pay my bills through the creative juggling of credit card balances. A lot of my friends had found no jobs at all.  A low-rent comparison, I realize, but I would take what solace I could get.

I do not resent those days now, but that depends a lot on my having come out the other side.  The sobering thought in all of this is in realizing that in the world of academics today I should count myself one of the lucky ones.  Reading Roger’s essay, and the many like it that have been published in the last twenty years, I always get a sick hollow feeling in the gut, remembering what it was like to wonder what would happen if….

Reading Roger’s essay I was struck again by the fact that this is now the permanent condition of academic life in the humanities.  My own job story began more than 20 years ago at Duke, and even then we were told that the job market had been miserable for 15 years (but was sure to get better by and by).  Thirty years is not a temporary downturn or academic recession.  It is a way of being.

The advent of MOOCs, all-online education, and for-profit universities is a response to the economics of higher education that is unlikely to make things any better for the freshly minted PhD.  While there are some exciting innovations here that have a lot of promise for extending learning to the many, it’s also the case that they are attractive and draw interest because they promise to do it more cheaply, which in the world of higher education means teaching more students with fewer faculty hours.  Roger’s most powerful line came toward the end:  “Until we realize that we are all contingent, we are all #altac, we all need to be flexible, and we are all in this together, we won’t be able to effectively deal with the crisis in the humanities with anything other than guilt.”

This is right, it seems to me.  In a world that is changing as rapidly and as radically as higher education, we are all as contingent as the reporters and editors in the newsrooms of proud daily newspapers.  It is easy to say that the person who “made it” was talented enough or smart enough or savvy enough, but mostly they, I, we were just lucky enough to come out the other side.  But we would be misguided to imagine that because we made it into a world that at least resembled the world we imagined, that world will always be there.  We are an older institution and industry than music or radio or newspapers, but we are an industry and an institution nonetheless, and it seems to me that the change is upon us.  We are all contingent now.

Why students of the Humanities should look for jobs in Silicon Valley

Ok, I’ll risk sounding like a broken record to say again that the notion that humanities students are ill-positioned for solid careers after college is simply misguided.  It still bears repeating.  This latest from Vivek Wadhwa at the Washington Post gives yet more confirmation that employers are not looking for specific majors but for skills, abilities, and creativity–a package that can come with any major whatsoever, and that often comes with students in the humanities and social sciences.

Using the example of Damon Horowitz, who possesses degrees in both philosophy and engineering, whose unofficial title at Google is In-House Philosopher, and whose official title is Director of Engineering, Wadhwa points out the deep need for humanities and social science students in the work of technology companies–a need that isn’t just special pleading from a humanist but is made vivid in the actual hiring practices of Silicon Valley companies.

Venture Capitalists often express disdain for startup CEOs who are not engineers. Silicon Valley parents send their kids to college expecting them to major in a science, technology, engineering or math (STEM) discipline. The theory goes as follows: STEM degree holders will get higher pay upon graduation and get a leg up in the career sprint.

The trouble is that theory is wrong. In 2008, my research team at Duke and Harvard surveyed 652 U.S.-born chief executive officers and heads of product engineering at 502 technology companies. We found that they tended to be highly educated: 92 percent held bachelor’s degrees, and 47 percent held higher degrees. But only 37 percent held degrees in engineering or computer technology, and just two percent held them in mathematics. The rest have degrees in fields as diverse as business, accounting, finance, healthcare, arts and the humanities.

Yes, gaining a degree made a big difference in the sales and employment of the company that a founder started. But the field that the degree was in was not a significant factor. ….

I’d take that a step further. I believe humanities majors make the best project managers, the best product managers, and, ultimately, the most visionary technology leaders. The reason is simple. Technologists and engineers focus on features and too often get wrapped up in elements that may be cool for geeks but are useless for most people. In contrast, humanities majors can more easily focus on people and how they interact with technology. A history major who has studied the Enlightenment or the rise and fall of the Roman Empire may be more likely to understand the human elements of technology and how ease of use and design can be the difference between an interesting historical footnote and a world-changing technology.

via Why Silicon Valley needs humanities PhDs – The Washington Post.

Again, at the risk of sounding like a broken record, these sound like the kind of findings emphasized at the Rethinking Success Conference that I have now blogged on several times.  (I’ve heard theories that people come to be true believers if they hear a story 40 times.  So far I’ve only blogged on this 12 times, so I’ll keep going for a while longer.)  Although I still doubt that it would be a good thing for a philosopher to go to Silicon Valley with no tech experience whatsoever, a philosopher who had prepared himself by acquiring some basic technical skills alongside his philosophy degree might be in a particularly good position indeed.  Worth considering.

Side note: the Post article points to a nice little bio of Damon Horowitz.  I suspect there are not many folks in Silicon Valley who can talk about the ethics of tech products in terms that invoke Kant and John Stuart Mill.  Maybe there should be more.

Digital Humanities as Culture Difference: Adeline Koh on Hacking and Yacking

My colleague Bernardo Michael in the History department here has been pressing me to understand that, properly understood, Digital Humanities should be deeply connected to our College-wide efforts to address questions of diversity and what the AAC&U calls inclusive excellence.  (Bernardo also serves as the special assistant to the President for diversity affairs.)  At first blush, I will admit, this has seemed counter-intuitive to me, and I have struggled to articulate the priority between my interest in developing new efforts in Digital Humanities that I tie to our college’s technology plan and my simultaneous concern with furthering our institution’s diversity plan (besides a general ethical interest, my primary field of study over the past 20 years has been multicultural American literature).

Nevertheless, I’ve started seeing more and more of Bernardo’s point as I’ve engaged in the efforts to get things started in Digital Humanities.  For one thing, the practices and personages of the digital world are talked about in cultural terms:  we use language like “digital natives” and “digital culture” and “netizens”–cultural terms that attempt to articulate new forms of social and cultural being.  In the practical terms of trying to create lift-off for some of these efforts, an administrator faces the negotiation of multiple institutional cultures, and the challenging effort to get faculty–not unreasonably happy and proud about their achievements within their own cultural practices–to see that they actually need to become conversant in the languages and practices of an entirely different, digital culture.

Thus I increasingly see that Bernardo is right: just as we need to acclimate ourselves to and become familiar with other kinds of cultural differences in the classroom, and just as our teaching needs to begin to reflect the values of diversity and global engagement, our teaching practices also need to engage students as digital natives.  Using technology in the classroom or working collaboratively with students on digital projects isn’t simply instrumental–that is, it isn’t simply about getting students familiar with things they will need for a job.  It is, in many ways, about cultural engagement, respect, and awareness.  How must our own cultures within academe adjust and change to engage with a new and increasingly not-so-new culture–one that is increasingly central and dominant in all of our cultural practices?

Adeline Koh over at Richard Stockton College (and this fall at Duke, I think) has a sharp post on these kinds of issues, focusing more on the divide between theory and practice–or yacking and hacking–in Digital Humanities.  Adeline has more theory hope than I do, but I like what she’s probing in her piece, and I especially like where she ends up:

If computation is, as Cathy N. Davidson (@cathyndavidson) and Dan Rowinski have been arguing, the fourth “R” of 21st century literacy, we very much need to treat it the way we already do existing human languages: as modes of knowledge which unwittingly create cultural valences and systems of privilege and oppression. Frantz Fanon wrote in Black Skin, White Masks: “To speak a language is to take on a world, a civilization.”  As Digital Humanists, we have the responsibility to interrogate and to understand what kind of world, and what kind of civilization, our computational languages and forms create for us. Critical Code Studies is an important step in this direction. But it’s important not to stop there, but to continue to try to expand upon how computation and the digital humanities are underwritten by theoretical suppositions which still remain invisible.

More Hack, Less Yack?: Modularity, Theory and Habitus in the Digital Humanities | Adeline Koh
http://www.adelinekoh.org/blog/2012/05/21/more-hack-less-yack-modularity-theory-and-habitus-in-the-digital-humanities/

I suspect that Bernardo and Adeline would have a lot to say to each other.

Do all Canadian Professors wear funny robes and require a moderator? The Book Is Not Dead [jenterysayers.com]

Jentery Sayers at the University of Victoria posted a really interesting video set from a debate the humanities faculty put on about the book, or the death thereof.  Couldn’t help being interested, since it’s what’s absorbed me generally for the past several years, and since we here at Messiah had our own symposium on the book this past February.

I embedded one video below with part of Jentery’s speech–unfortunately split between two videos, and Jentery’s head is cut off some of the time, a talking body instead of a talking head.  The whole set is on Jentery’s website, apparently somewhere on the University of Victoria site, and of course on YouTube.  Worth my time this evening, though perhaps it says something about me that I am spending a Friday night watching Canadian professors dressed in robes and addressing one another as “Madame Prime Minister” and “Leader of the Opposition.” Better than Monty Python.

The event is described as follows:

As independent bookstores close their doors, newspapers declare bankruptcy, and young people are more familiar with negotiating digitized data, it seems that the era of the printed word may be on its way out. Indeed, the emergence of digital humanities research seems to imply that, even in the most book-centric fields, the written word may be obsolete. Join us for a good-humoured look at whether the book is dead or if rumours of its demise are premature.

via The Book Is Not Dead [jenterysayers.com].

Takeaway line from Jentery:  “New Media Remediates Old Media.”  I’m still unpacking that, but I like Jentery’s general sense of the commerce between the traditional Gutenberg book and new media.  It does seem to me that in a lot of ways this interaction between media forms is really what’s happening right now.  Every book published has a website, a Facebook page, and authors who interact with readers via Twitter and personal blogs.  A lot of what goes on in new media is repackaging and mashups of old media.  I do think, though, that it’s also the case that old media repackages new media as well.  Movies end up as books, and blogs become books that become movies.

It seems to me that our divisions between English and film/communication/digital media might make less and less sense.  Would it make more sense to imagine books simply as “media” and have media studies, rather than imagining these things separately?

The other memorable line was someone quoting McLuhan:  “Old technologies become new art forms.”  Or words to that effect. I think this is right, and in the long haul I keep thinking this may be the destiny of the traditional book, though I could be proven wrong. I think bookbinders could be a growth industry, as could publishers that specialize in high-end book products.  I’ve mulled over the question of the book becoming an art object several times before, so I won’t bother to do it again here.

Side note:  Jentery Sayers was extremely generous with his time, attention, and intelligence in engaging with a number of faculty and students at Messiah College last week.  A lot of good ideas and great energy, even if the computer hookup was less than desirable. Much appreciated.  The clip of Jentery’s speech is below:

 

Katrina Gulliver’s 10 Commandments of Twitter for Academics – With Exegetical Commentary

I’m a Johnny-come-lately to Twitter, as I’ve mentioned on this blog before, and I’ve got the zeal of a new convert.  It was thus with great delight that I ran across Katrina Gulliver’s ten commandments of twittering for academics.  It’s a worthwhile article, but I’ll only list the ten commandments themselves, along with my self-evaluation of how I’m doing.

1. Put up an avatar. It doesn’t really matter what the picture is, but the “egg picture” (the default avatar for new accounts) makes you look like a spammer. [I CONFESS I WAS AN EGG FOR SEVERAL MONTHS BUT FINALLY GOT AROUND TO AN AVATAR THREE OR FOUR WEEKS AGO. LUCKILY, TWITTER CONVERTS EVERYTHING TO YOUR AVATAR IMMEDIATELY, SO IF YOU ARE JUST AN EGG YOU CAN COVER OVER A MULTITUDE OF SINS BY UPLOADING AN AVATAR.  IT’S VERY NEARLY A RELIGIOUS EXPERIENCE]

2. Don’t pick a Twitter name that is difficult to spell or remember. [I WOULD ADD TO THIS THAT IT COULD BE GOOD TO PICK SOMETHING FAIRLY SHORT.  MY OWN HANDLE IS MY NAME, @PETERKPOWERS, BUT THAT TAKES UP A LOT OF CHARACTERS OUT OF THE TWITTER LIMIT]

3. Tweet regularly. [DONE.  I AM NOT YET TO THE STAGE OF ANNOYING MY WIFE, BUT SHE DOESN’T REALIZE THAT’S WHAT I’M DOING ON MY IPAD;  I MIGHT ALSO SAY DON’T TWEET TOO REGULARLY, ESPECIALLY NOT IF YOU ARE LISTING SPECIFIC PERSONS.  NO ONE WANTS THEIR PHONE GOING OFF CONSTANTLY]

4. Don’t ignore people who tweet at you. Set Twitter to send you an e-mail notification when you get a mention or a private message. If you don’t do that, then check your account frequently. [AGREED, ALTHOUGH I STRUGGLE WITH WHETHER TO CONTACT EVERY PERSON WHO FOLLOWS ME;  NOT LIKE I’M INUNDATED, BUT I DON’T HAVE TONS OF TIME.  I TRY TO ACKNOWLEDGE FOLLOWS IF THE SELF-DESCRIPTION SUGGESTS THE PERSON IS CLOSELY CONNECTED TO MY PROFESSIONAL LIFE AND INTERESTS]

5. Engage in conversation. Don’t just drop in to post your own update and disappear. Twitter is not a “broadcast-only” mechanism; it’s CB radio. [DOING THIS, BUT IT TOOK ME A WHILE TO GET AROUND TO IT.  HOWEVER, I’M BETTER AT THIS THAN AT STRIKING UP CONVERSATIONS WITH STRANGERS AT PARTIES]

6. Learn the hashtags for your subject field or topics of interest, and use them.[OK, I DON’T REALLY DO THIS ONE THAT MUCH.  EXCEPT SOME WITH DIGITAL HUMANITIES.  I HAVEN’T FOUND THAT FOLLOWING HASHTAGS OUTSIDE OF #DIGITALHUMANITIES HAS GOTTEN ME ALL THAT FAR]

7. Don’t just make statements. Ask questions. [DONE]

8. Don’t just post links to news articles. I don’t need you to be my aggregator.[I’M NOT SURE ABOUT THIS ONE.  I ACTUALLY THINK TWITTER’S AGGREGATOR QUALITIES ARE AMONG ITS MOST IMPORTANT FEATURES.  FOR PEOPLE I RESPECT IN THE FIELD OF DH, FOR INSTANCE, I REALLY LIKE THEM TO TELL ME WHAT THEY ARE READING AND WHAT THEY LIKE.  DAN COHEN, MARK SAMPLE, RYAN CORDELL, ADELINE KOH–ALL OF THEM ARE READING OR IN CONTACT WITH REALLY IMPORTANT STUFF, AND I WANT THEM TO PASS IT ALONG.  I’D AGREE THAT JUST POSTING LINKS AT RANDOM MIGHT BE COUNTERPRODUCTIVE, BUT IF YOU ARE BUILDING A REPUTATION FOR BEING IN TOUCH WITH GOOD STUFF IN PARTICULAR AREAS, I THINK POSTING LINKS IS ONE GOOD WAY OF BUILDING AN ONLINE PERSONA.  ON THE OTHER HAND, IN THE STRICT SENSE OF THE TEXT, I AGREE THAT I DON’T REALLY NEED POSTS OF NEWS ARTICLES PER SE.  I FOLLOW THE NEWS OUTLETS’ OWN TWITTER FEEDS FOR THAT KIND OF THING]

9. Do show your personality. Crack some jokes. [DOES TWEETING MY CONTRIBUTION TO INTERNATIONAL MONTY PYTHON STATUS DAY COUNT?]

10. Have fun. [TOO MUCH FUN.  I’VE GOT TO GET BACK TO WORK]

Related note: I’ve been having a robust, sometimes contentious, sometimes inane discussion about Twitter over at the MLA LinkedIn group.  I’d be happy to have someone join that conversation as well.

Digital Archive as Advertisement: The Hemingway Papers

The pace at which digital material is being made available to the public and to students and scholars in the humanities is accelerating, whether one thinks of the digitization of books, the new MOOCs from MIT and Harvard and others that will extend learning in the humanities and other fields, or the digitization of papers and manuscripts that were previously held in highly restricted manuscript or rare book sections of single libraries, like the James Joyce Papers just released in Ireland.

Another addition to this list is the release of a newly digitized collection of Hemingway’s writings for the Toronto Star.  The Star has put together the columns written by Hemingway for the paper in the early 1920s, along with some stories about the writer.  I’m basically extremely happy that archives like this one are taking their place in the public eye.  I had a great course on Hemingway with Gerry Brenner while pursuing an MFA at the University of Montana, and the legacy of Hemingway was felt everywhere.  Still is, as far as I’m concerned.

At the same time, I admit that the Star site left me just a little queasy and raised a number of questions about the relationship between a commercial enterprise like the Star and digital and scholarly work more generally.  The first cue for me was the statement of purpose in the subtitle of the homepage:

The legendary writer’s reporting from the Toronto Star archives, featuring historical annotations by William McGeary, a former editor who researched Hemingway’s columns extensively for the newspaper, along with new insight and analysis from the Star’s team of Hemingway experts.

I hadn’t really realized that the Toronto Star was a center of Hemingway scholarship, but maybe I’ve missed something over the past 20 years.  Other similar statements emphasize the Star’s role in Hemingway’s life as much as anything about Hemingway himself:  emphases on the Star’s contributions to the great writer’s style (something that, if I remember correctly, Hemingway himself connected more to his time in Kansas City), on the way the Star nurtured the writer, and on the jovial times Hemingway had with Star editorial and news staff.  It sounds a little more like a family album than a really serious scholarly take on what Hemingway was about in this period.  Indeed, there is even a straightforward and direct advertisement on the page, which sends you to the Toronto Star store where you can purchase newsprint editions of Hemingway’s columns.

I don’t really want to look a gift horse in the mouth.  There’s a lot of good stuff here, and just having the articles and columns available may be enough; I can ignore the rest.  Nevertheless, the web is a framing device that makes material available within a particular context, and here that context clearly has a distinct commercial angle.  It strikes me that this is a version of public literary history that has all the problems of public history in general that my colleague John Fea talks about over at The Way of Improvement Leads Home.  Here, of course, it is not even really the public doing the literary history but a commercial enterprise that has a financial stake in making itself look good in light of Hemingway’s legacy.

The Star promises the site will grow, which is a good thing.  I hope it will grow in a way that allows for more genuine scholarly engagement with Hemingway’s legacy as well as more potential interactivity.  The site is static, with no opportunity for engagement at all, so everything is controlled by the Star and its team of Hemingway experts.  We take it or we leave it.

For the moment I am taking it, but I worry about the ways commercial enterprises can shape our understanding of literary and cultural history for their own ends.  What do others think about the role of commercial enterprises in establishing the context through which we think about literature and culture?