Category Archives: Humanities

Paul Kalanithi: When Breath Becomes Air

When Breath Becomes AirWhen Breath Becomes Air by Paul Kalanithi

My rating: 5 of 5 stars

My friend and colleague, Devin Manzullo-Thomas, warned me that he broke down weeping on a Chicago subway as he finished the book, to the consternation of his fellow passengers. I found it easier to weep discreetly in a Barnes & Noble cafe. As one might guess of a book that opens with a young, hopeful, and brilliant doctor discovering at the beginning of his career that he has what is by all accounts an incurable cancer, death awaits. And awaits relentlessly. But the knowledge that the book ends in our common human destiny does little to steel the reader for the way the heart breaks against the stony shores of that certainty. We are not given the witness of his death; no autobiography could do that. But we are given the witness of his dying and are included in that process, one that is by turns noble and wretched and then, in some sense, ennobling of the author, of those around him, and perhaps even of us as readers.

While focused on death, the book is not morbid. If it ends in tears, it is not even really depressing in any meaningful sense of that word. What we are given is the striving after understanding, and the effort to make meaning out of experience with the only tools we have been given to do it, through our language. The book is fascinating for its testimony to the difficulties of the life of the mind and the body, to the agonies of both surgery and uncertainty. Kalanithi began as a devotee of language and discusses his turn away from the study of literature and what might loosely be described as “the mind” and towards the study of the brain and the body in the attempt to understand more completely the physical mechanism by which, after all, we come to speak and conceive of ourselves as having something called a soul or a mind. Kalanithi does not go deeply enough into this transition away from literature and language for my personal taste, though he offers the throwaway line that he thought the study of literature had become too embroiled in the study of politics and was not leading him into the study of meaning. He also seems to think that the study of literature is too divorced from the world of action, of doing something in the world that makes a difference, a difference he thought he could make in the healing professions. Kalanithi’s book is not about a theory of literature, but if it were I would want to argue with him more forcefully here. There is, after all, nothing more thoroughly politicized in our day and age than the practice of medicine, though his book does not take up the question of who gets a chance at healing at a first-rate medical center and why that might be. I might want to argue as well, as I have at other times, that the embrace of literature in particular, and of the humanities more broadly–indeed, education per se–is actually one part of the world’s healing, though we fail too often to recognize it as such.
As Milton suggested, the purpose of education is to repair the ruin of our first parents (and perhaps the ruin of our own parents, and of ourselves, I might add).

But these are not the main points of Kalanithi’s book. He manages to make the beauty and power of the study of science, and especially the study of medicine, real and persuasive. More broadly, he shows convincingly that a life of intense study is a life worth living, that the life of the mind matters, even if that point is made more poignant by the fact that that life is cut off by a disease of the body that cannot be studied away and that as yet remains beyond the realm of human comprehension and control. It is interesting to me that, in the end, Kalanithi returns to literature to make meaning of the death that he cannot, that none of us can, avoid. He makes mention of his return to stories about living and dying, stories that tried to make sense of the fact of death. Some of his final memories are of sitting with his infant daughter on his lap, reading to her the words of Eliot’s “The Waste Land,” a fact that might be comical were it not that it rings true for a father given over to the power of language to give meaning where meaning seems tenuous at best, and were it not that I and other fathers I know in the English-major set do or have done precisely the same thing.

Of course, he made his own story in the end. A story, to be sure, that could not have been written without his unimaginably deep engagement with science and medicine. But finally his story lives for us, tells us something about what it means to be human, because it lives for us in words.

View all my reviews

Can we create a humanities for the 21st century?: Reflections on Cathy Davidson

I’ve been invited to serve on a panel at the Lilly Fellows Administrators Workshop this fall in New Orleans, so I’ll use the event as an excuse to revive this blog–famous last words–by reflecting on some reading I’m doing in preparation. Broadly speaking, since we’ve done a lot of work in this area at Messiah College, I’ve been asked to talk on how the humanities can connect to career preparation as part of a conference that focuses on connecting mission and post-baccalaureate success.

Sometimes, I admit, I think these kinds of discussions end up being far too narrowly cast for my taste: humanists concede that we must do something to address our current and never-ending crisis or crises, and so we talk about career preparation as if it is a concession, something that we will do if we have to as long as we can keep doing the idealistic things that we have always done. Or else something that we will do for the moment even as we look nostalgically to the past or longingly toward a future in which the economy is better, our budgets are sound, and our classrooms are burgeoning. On this view, humanities faculty engaging with career preparation is a necessary evil or a pragmatic necessity, but it never really gets to the root of, or affects, a fundamental understanding of what the humanities are about. As an administrator, I admit that I have become pretty pragmatic and willing to put up with more than my share of necessary evils. Nevertheless, I confess that I find this view of engagement with student careers seriously wanting.

I think that the halcyon days of yore are not returning, and even if they did it might not actually be all that great a thing. Rather, what I want to believe we are about at Messiah College–when we do curricular revision to include more attention to career concerns, when we train faculty advisors to address vocational issues, when we work to connect internships, service learning, and other forms of experiential learning directly to our liberal arts course work, or when we begin new projects in the digital humanities–is creating a humanities for the 21st century.

In this, I resonate sympathetically when I read Cathy Davidson, or hear her speak as I did last year at the CIC conference in Pittsburgh.  Davidson’s ruling metaphor, it seems to me, is that our current forms of education, even humanities education, are appropriate to an industrial era, but that we have yet to develop an education appropriate to our own era.

I read again this afternoon her essay on these issues from Academe a few years back, Strangers on a Train.  A passage that particularly stuck out:

If you look at the curriculum in most humanities departments, you would barely notice that there is a crisis and there has been one for decades. At most colleges and universities, humanities departments continue to have a hierarchy of requirements and teaching assignments that imply that the department’s chief mission is to train students for professional careers in the humanities. Most humanities departments do not seem designed to prepare students for any and all careers, including in the sciences, even though all careers require reading, writing, critical thinking, theoretical analysis, historical perspective, and cross-cultural knowledge.

Davidson rightly points out that one consequence of mass education as we have come to know it is that liberal arts programs have tended to become pre-professional in their orientation, but in a bad or deleterious sense. That is, we think mostly that we are preparing future graduate students in the humanities, or we organize our curricula as if we are doing that. Davidson’s essay is a clarion call, if a somewhat unspecific one, to get beyond this form of the humanities toward a broader-based approach to the vocational needs of contemporary students. Ironically, it seems to me, this might ultimately make our humanities programs more genuinely liberal arts programs, designed broadly rather than for discipline-specific expertise.

The one issue Davidson doesn’t address here is one that I think leaves humanities programs resistant to change along the lines she seems to be envisioning. That is, so long as we argue that humanities programs are the best preparation for a flexible career in the future, that they give students superior skills in communication and analysis, and that statistics show our students do relatively well in the job market overall, it remains unclear why the pre-graduate-school model needs to change. I have heard this argument stated eloquently: “Yes, we prepare you so you can go to graduate school; but if you don’t, you’ve been prepared for everything else as well because of all the great communication skills we’ve given you.”

I don’t actually agree with this argument, but it is a genuine argument. Where it falls short, I think, is in an overconfidence that our students know how to translate knowledge between fields of practice. This is, I think, a false assumption. Conversations with business and career development professionals over the past four or five years have convinced me that humanities students regularly and commonly struggle to articulate the relationship between what they have done with their education and the needs of employers. As I have put it in the past, we broaden our students’ horizons admirably, but we resist teaching them how to walk into those horizons, or don’t even think to do so. Indeed, in the worst case, where professors or departments give students only non-instrumental arguments for their fields–“this is inherently worth studying”–we implicitly teach students that they positively should not make connections between their academic fields and some other pathway or endeavor. Students then not only fail to receive practice in applying their knowledge, not only are left inarticulate about other career directions, but can come to feel unconsciously that it is inappropriate for them even to try. I have had students–STUDENTS!–say to me, “I know we aren’t supposed to worry about whether humanities major X connects to a job, but….”

Fortunately, this kind of statement is becoming increasingly rare at Messiah College.  Whatever a humanities program for the 21st century should look like, outcomes ought to include that students have had practice applying their program of study to non-academic work environments, and that students can effectively and shamelessly articulate the value of their program of study in the humanities to future employers.

A podcast of Cathy Davidson’s talk to the CIC, “Educating Students for Their Future, Not Our Past,” is available here.

The slide show from her presentation is available here.

The Good Lord Bird: A Review

“The Good Lord Bird” is a book I ought to like. I liked McBride’s first book, the memoir “The Color of Water,” an appreciative remembrance of and meditation on being a black man with a white mother. I liked McBride personally when he visited Messiah College. He is a fantastic musician, whose passion for music I share. Like most Americans, I’m intrigued by the fanatical John Brown. The novel’s theme is race in America, the fatal subject of many of the greatest American novels, from Moby-Dick to Huckleberry Finn to Absalom, Absalom! to Invisible Man to Beloved. It’s James McBride’s fourth book, about the time in a career when one starts to see signature work produced. It won the National Book Award. Everything about it says BIG. Good writer, big book, big subject.

But the book left me cold.

Well, really somewhat more than lukewarm, which feels like cold when you are eagerly expecting greatness. I liked many passages. It was better than watching Netflix…most of the time. But I expect more than that from a National Book Award winner, and couldn’t help but wonder what the judges were thinking, though it also crossed my mind that American fiction must have been in a bad way in 2013 if this was the best we could produce. Apparently this thought crossed McBride’s mind as well since, according to the Times story on the 2013 National Book Award ceremony, McBride was stunned to have won and hadn’t even prepared a speech.

The book has often been compared to Huckleberry Finn. Indeed, the novel’s cover insists on it, evoking the ubiquitous straw hat that seems to accompany every edition of the novel and rendition of Huck’s story since that American classic was written. Henry, the novel’s protagonist, is a cross-dressing early adolescent struggling to find his own way and his own identity in a world full of people who impose at will their visions of what he is or ought to be. There is no river–a good bit of the novel takes place on the plains of Kansas and Iowa, in its own more desiccated way as stark, brutal, and unforgiving a wild thing as Huck’s Mississippi. In a different sense, John Brown himself is Henry’s river, a wild thing, a force of nature with a logic and will of his own that bends things to himself, like a Kansas tornado or a Mississippi flood, and does the bending in part through his own fanatical certainty in the will of God.

But the differences between Huckleberry Finn and The Good Lord Bird also point to the latter’s narrative problems. In Huckleberry Finn, the river is a character of its own, but it is never the character that we care about. From the opening sentence we know we care about Huck Finn and coming to know him. The river is finally an occasion for the main narrative character, and not the other way around. If the river drives Huck through coincidence, or if Huck follows it where it carries him, finally the reader follows Huck, only sticking with the river because of that human story. In The Good Lord Bird I could never muster much concern for Henry, who followed John Brown for no good reason, didn’t seem to want to be there, and didn’t seem to learn all that much from his travels. In the Times story cited above, McBride says he enjoyed writing the novel because “It was always nice to have somebody whose world I could just fall into and follow him around.” The novel has that feel: Henry just falls into the world and follows John Brown around for no reason we can adequately figure out. In this sense, the novel feels all the way through a bit like Huckleberry Finn feels in its disappointing conclusion, when Huck, who has captured our imagination and matured before our interior eyes, turns freedom and maturity into a banal melodrama at the hands of his would-be mentor, Tom Sawyer.

In the acknowledgements section at the end of the book, McBride thanks the many who kept the memory of John Brown alive. I’m not sure this book will do that very well. When, for the space of several dozen pages in the middle of the novel, John Brown disappears from the scene and Henry is left to his own devices, we realize we just don’t care that much about him. And if Henry is not himself a compelling character, if Henry is not more than an empty sleeve of a boy becoming a man, then we can’t really find John Brown all that compelling either. Adolescents drifting through life are compelled by many things, some of them profound, many or most of them not. John Brown is a terrifying and complex figure in American history, one whom Americans still regard uncertainly as both a brutal terrorist and a freedom fighter, and the contradiction is felt all the more in that Brown was indeed on the right moral side of the arc of history when it comes to the story of race in America, unlike so many other of our white founding fathers and mothers. But because Henry does not himself seem to struggle in any meaningful way with the desperate question of whether in pursuing freedom we are winging into flight or stumbling over a precipice, we ultimately don’t feel in this John Brown the terror that can be the birth pangs of freedom. Instead, McBride’s John Brown shifts uncomfortably between being a joke and being meaningfully sincere.

A worthy subject for a Netflix melodrama, but I was hoping for more.

What is a liberal art: Elizabeth Stone on the vocation vs. vocational in higher education

This summer I’m working sporadically on what I hope will turn into a paper on Critical Vocationalism for the NEMLA session that I hope will draw some substantial proposals for next year’s conference in Harrisburg. I’m trying to get my brain around exactly what Gerald Graff and Paul Jay might mean by Critical Vocationalism, since they leave the term underdefined in their own advocacy for the idea as a new defense of the humanities and the liberal arts. To that end I read Elizabeth Stone’s essay on the conflict between vocation and vocationalism, published a few years back in the Chronicle. I’m struck by how we seem to be stuck in a holding pattern, with nothing really advancing or changing in our discourse about the liberal arts in general and the humanities specifically, with the possible exception that we must now lament that the debt our students are carrying has more than doubled in a decade.

Stone’s essay does point out some dimensions of the problem that I think are important to keep trying to talk about. For instance, she points out that we are not just having an enrollment crisis in the liberal arts; we are having a crisis of definition. What are the liberal arts, and why are they that instead of something else? For Stone:

So, platonically speaking, I don’t really know what a liberal art is (although I know it’s not auto mechanics), because there seems to be no single characteristic — old, new, theoretical, vocational, quantitative, qualitative, a matter of content, a matter of perspective — common to all liberal arts.

In practice, then, a liberal art is a little like obscenity. We faculty members know it when we see it, even if we can’t quite define it. But there isn’t anything approaching consensus. Because I’m a parent — of one son with a new B.A. and another who’s now a freshman at a liberal-arts college — I’ve seen more than my share of college catalogs over the past half-dozen years. All of them assert the value of the liberal arts, but at some colleges that includes computer science, industrial design, physical education, and even engineering.

If you are a pragmatist, as I tend to be in my weaker moments, this could strike you as merely a self-serving argumentative move. Since “liberal arts” tends to be defined differently in different periods of history and even in different institutional contexts, the liberal arts must not really be anything at all. In my own college, the Humanities–traditional and sometimes sole remaining bastion of the liberal arts–are defined to include not only Philosophy, History, and English (uncontroversial), but also Religion (unconventional but still uncontroversial), Biblical Studies and Film Production (a number of raised eyebrows), and programs like Public Relations, Christian Ministries, and Chinese Business and Spanish Business (pandemonium). The Platonist suggests that if there is no essence that unites these disparate fields, then there is no there there, no thing that we can call the liberal arts as opposed to any other thing.

I’m not really interested in answering this question, though I will say I am more interested in Wittgenstein’s notion of family resemblances than in Plato’s forms. What Stone makes clear is that in the absence of any defining essence, the liberal arts largely define themselves by what they are against or what they are not–a version of Aquinas’s theological via negativa, which defines God only by saying what God is not, just as most of us build our identities by aversion to our evil others. That evil other for the liberal arts is usually vocationalism. Over and against our money-grubbing brethren interested in mere vocationalism, we posit the higher-order values of vocation, of calling, of transcendent value, or at least of critical thinking.

The problem with this according to Stone is that we don’t have to probe very deeply beneath the skin of what we call the liberal arts to discover an always already fallen vocationalism in who we are and what we do.

Since it’s people like me who are often seen fretting that the liberal arts are being waylaid by the thugs of Mammon, I think it’s time that people like me acknowledged our own dirty little secret. I’ll go first and admit that I, for one, have an unseemly number of vocational courses in my undergraduate past, and the reason is that those courses were directly related to a job I had my eye on: I was a teenage English major, in training to be an English professor.

Stone’s suggestion here strikes me as having two different meanings. First, many of our liberal arts disciplines have had vocational ends in some sense, even if that sense was never fully articulated and was endlessly deferred. Aquinas’s notion that the liberal arts are things studied for their own sake nevertheless raises the question of why something studied for its own sake should be a required course of study in a society or a seminary. We must admit that most disciplines of the liberal arts were specifically conceived of as appropriate training for young men, to prepare them for positions of leadership. To be sure, the “higher order” issues of character and spiritual formation have always been around, but young men were explicitly required to pursue studies in these fields in order to prepare for something, specifically to occupy adult roles of leadership as the elites of particular Western societies. Moreover, some liberal arts as we now conceive of them were not even designed for elites. My own discipline of English came into the academy in England first and foremost as an appropriate course of study in what would have been the equivalent of British community colleges, schools for the working classes and for women, even while English was looked down upon by the more cultured classes. So we turn our faces away from vocationalism almost like those afraid to recognize their kinship with the adulterated masses.

Also, it seems to me that Stone is suggesting that we ought to recognize that we have increasingly organized our liberal arts curricula around professional (and so vocational) ideals. We have tended for the past few decades to imagine undergraduate education at its best as preparing students for potential graduate study, and have valued most those students who looked just like us, could talk just like us, and wanted to prepare to be just like us. We have accepted a vocational model of education common to the research universities and the professional schools and baptized it in the name of the liberal arts. This is why so much of the discussion of a crisis in the humanities is preoccupied with a crisis of graduate students not getting jobs. That is actually a symptom of a much larger crisis: we cannot fully imagine a larger social purpose that does not rely on our self-replication.

What, I wonder, would an education in the liberal arts look like that took as its explicit task to better prepare students for participation as informed citizens AND as informed workers outside the world of academe? In other words, an education that took as its explicit purpose to produce workers who are not like us and do not aspire to be like us. This might be a baseline for critical vocationalism.

Humanities for Humanities’ Sake?

In a recent Inside Higher Ed post, Scott Jaschik discussed Going Global, an international higher education conference in Dubai. According to Jaschik, whose post is titled “Humanities Besieged, Worldwide,” the plight of the humanities is not much different anywhere else in the world than it is in the United States. Indeed, it may be worse elsewhere, since in the United States the importance of the humanities is sustained in some fashion through their predominant place in core liberal arts curricula, a tradition not common to other forms of education in the world but de rigueur in the United States. Jaschik quotes Jo Beall of the British Council:

Beall described going to university websites in Britain and finding the humanities “positioned in very functional or utilitarian terms.” She found many references to how students gain from taking humanities courses “because it will help them do well in science and technology.” In other cases, departments describe how much employers value the “transferable skills” that one picks up in humanities courses.

While not disagreeing that the humanities can help in those ways, Beall noted that many scholars are “very uncomfortable with this marketing of the humanities” and lament that it is no longer possible to argue for the value of “art for art’s sake.” Instead, the humanities end up “as co-dependent” to other programs, she said.

Read more: http://www.insidehighered.com/news/2013/03/07/educators-consider-struggles-humanities-worldwide#ixzz2N3bmjfg1
Inside Higher Ed

I do think this gets at an important conundrum that humanities scholars face. Rooted in a tradition that emphasizes learning for its own sake, we have never grown comfortable with a discourse of usefulness. But is this a legacy of the liberal arts, or is it a legacy of romanticism in its idealistic but finally inadequate protest against the economic systems in which it was embedded and by which it was sustained? The liberal arts were never envisioned in antiquity as being anti-utilitarian in an ontological sense. Study was always to some purpose, and its functions were quite explicit in giving the “free” or “liberal” person equipment for living appropriate to his (usually his) station in life. Similarly, in later periods what has come to be known as the humanities served comparably useful purposes, as usefulness was determined by the various cultures in which they existed: useful for growing closer to God, for instance, or for preparing for ministry, or for some other function within the clerisy.

It seems fruitless to me to continue a romantic protest against economic systems and economic gain per se from within the systems that sustain us since the humanities as an institutional entity will simply disappear if that is all we can say about them.  We act in bad faith with our students if we ask them to take on tens of thousands of dollars of debt, and then rule out the question of whether they will be able to pay off their loans once they graduate.

On being human: Lessons from Harvard Institute for Management and Leadership in Higher Education

In a specific sense I am an unabashed advocate of what has come to be called the applied humanities: roughly and broadly speaking, the effort to connect the study of the humanities to identifiable and practical social goods. For me, it also includes the effort to develop humanities programs that take seriously that we are responsible (at least in significant part) for preparing our students for productive lives after college, preparation that I think should be embedded within humanities curricula, advising, cocurricular programming, and the general ethos and rhetoric we use to inculcate in our students what it means to be a humanist.

In several respects this conviction lies at the root of my advocacy for both digital humanities programs and for career planning and programming for liberal arts students, as different as these two areas seem to be on the surface.  I have little room left any more for the idea that “real learning” or intellectual work pulls up its skirts to avoid the taint of the marketplace or the hurly-burly of political arenas and that we demonstrate the transcendent value of what we do over and above professional programs by repeatedly demonstrating our irrelevance.  Far from diminishing the humanities, an insistence that what we do has direct and indirect, obvious and not so obvious connections to social value enhances the humanities.  It’s not just a selling point to a doubting public.  As I said yesterday, the only good idea is the idea that can be implemented.  We ought to be proud of the fact that we can show directly how our students succeed in life, how they apply the things they’ve learned, how they find practical ways of making meaningful connections between their academic study and the world of work.

At the same time, I will admit that some versions of this argument leave me cold. They risk saying that the only thing really valuable about the humanities is what is practically relevant to the marketplace. I greet this effort to make Wordsworth a useful version of a management seminar with a queasy stomach.

It may sound like a nice day out in beautiful surroundings, but can walking around Lake District sites synonymous with Romantic poet William Wordsworth really offer business leaders and local entrepreneurs the crucial insights they need?

That is precisely the claim of Wordsworth expert Simon Bainbridge, professor of Romantic studies at Lancaster University, who believes the writer can be viewed as a “management guru” for the 21st century.

Since 2007, the scholar has taken students down into caves and out on canoes to the island on Grasmere once visited by Wordsworth and fellow poet Samuel Taylor Coleridge, and to places where many of the former’s greatest works were written, for what he called “practical exercises linked to the themes of Wordsworth’s poetry.”

Such walks, which also have been incorporated into development days for individual firms, are now being offered as a stand-alone option for local and social entrepreneurs at a rate of £175 ($274) a day.

Read more: http://www.insidehighered.com/news/2012/08/09/businesses-pay-british-professor-teach-them-about-wordsworth#ixzz236bQaECf 
Inside Higher Ed 

I do not find the insight here wrong so much as sad.  If the only reason we can get people to read Wordsworth is because he will enhance their management skills, we have somehow misplaced a priority, and misunderstood the role that being a manager ought to play in our lives and in the social and economic life of our society.  It is the apparent reduction of all things and all meaning to the marketplace that is to be objected to and which every educational institution worthy of the name ought to furiously resist, not the fact of marketplaces themselves.

I was lucky enough this summer to attend the Harvard Institute for Management and Leadership in Education. To be honest, I went thinking I was going to get all kinds of advice on things like how to organize projects, how to manage budgets, how to promote programs, and how to supervise personnel. There was some of that, to be sure, but what struck me most was that the Institute, under the leadership of Bob Kegan, put a high, even principal, priority on the notion that managers have to first take care of who they are as human beings if they are to be the best people they can be for their colleagues and their institutions. You have to know your own faults and weaknesses, your own strengths, your dreams, and you have to have the imagination and strength of mind and heart (and body) to learn to attend to the gifts and faults and dreams and nightmares of others before, or at least simultaneously with, your own. In other words, being a better manager is first and foremost about becoming a healthier, more humane, fuller human being.

The tendency of some applied humanities programs to show the relevance of poetry by showing that it has insights into management techniques, or the relevance of philosophy because it will help you write a better project proposal, is to misplace causes and to turn the human work of another imagination (in this case Wordsworth’s) into an instrumental opportunity.  The reason for reading Wordsworth, first and foremost, is because Wordsworth is worth reading, and simultaneously because the encounter with Wordsworth will give you the opportunity to be a fuller, more imaginative, more thoughtful human being than you were before.

If you become that, you will have a chance to be a better manager.  But even if you don’t become a better manager, or if you lose your job because your company is overtaken by Bain Capital or because students no longer choose to afford your pricey education, you will, nonetheless, be richer.

The Words matter: Digital Humanities Vocabulary 101

I always know that if the hits on my blog spike it has less to do with anything I’ve said than with the fact that some good soul or souls out there have bothered to mention me on their own much more well-read and insightful blogs. This has happened several times with the far-flung folks in the Digital Humanities, whom I mostly know only virtually through Twitter and other social media, as well as with my colleague John Fea over at The Way of Improvement Leads Home, whom I acknowledge both virtually and when I pass him in the hallway. Just today, Rebecca Davis over at NITLE wrote a blog post mentioning me and some of my own floundering engagement as an administrator trying to get my mind around what was happening in this field.

Just a word about Rebecca’s post. She’s begun a very interesting project, partly in response to the floundering of folks like me, designed to provide a glossary of terms to help beginners get a grip on things and begin to navigate the thickets of the Digital Humanities. She ran an experiment at a couple of conferences to get things started:

Normally, academics getting to know a new discipline would read about it before doing it. But, the ethos of doing in digital humanities is so strong, that THATCamps ask beginners to engage in doing digital humanities (more hack, less yack). To that end, at my last workshop (at the Institute for Pedagogy in the Liberal Arts hosted by Oxford College of Emory University), I came up with a strategy to let beginners do and help the digital humanities veterans be sensitive to specialist vocabulary. I asked my workshop participants to write down on a post-it note every term they heard from me or other workshop participants that they didn’t know. Then we added them to our workshop wiki and set about defining them. We accumulated quite a few terms (35 total). Although my co-teacher Sean Lind, Digital Services Librarian at Oxford, ended up contributing most of the definitions, I think the list was still useful as an indicator of terms veterans need to be prepared to define.

I repeated the experiment at THATCamp LAC 2012 by proposing a session on a digital humanities glossary and setting up a google doc for the glossary. I think that session happened, though I didn’t make it. Certainly terms were added to the doc throughout the THATcamp, with a final total of 28 terms.

Looking at this admittedly small sample, let me share some preliminary conclusions. There were only five terms that both lists shared (one of which I had contributed by initiating each list with the acronym DH):

  • Crowdsourcing
  • DH = Digital Humanities
  • Hashtag
  • Open Access (OA)
  • TEI = Text Encoding Initiative

I love this idea, love the activity, and I hope that Rebecca’s idea for a glossary takes off. The lists she’s come up with so far seem about right. I will say I ended my first THATCamp still entirely clueless about what TEI stood for, and I’m still not entirely sure I could define “XML” for anyone else, even though I think I know it when I see it. (In my defense, I actually did know what crowdsourcing, hashtag, and open access indicated, although I hadn’t the foggiest how you did any of them.)

Regarding hacking and yacking, I am, so far, more of a digital humanities advocate than a digital humanities practitioner, a position necessitated both by my ignorance and by my position as an administrator with too little time to read his email, much less pursue digital humanities projects. From this position as a facilitator, I feel a little reluctant to take a position, other than to say words matter. Having the word for something gives you one way to act on the world. I’ve always been deeply moved by the section of The Autobiography of Malcolm X wherein he describes learning to read by reading the dictionary. This seems right. If you want to act in a world, learn its words; start to speak its language even if at first you are only stringing nouns together into something that only vaguely resembles a sentence. Words become the necessary means of action. Thus, I think that Rebecca’s project will be a boon to those who are still at the stage of the DH language game where they are mostly pointing and grunting.

I started this post thinking I was going to write about intellectual generosity. How important it is and what it looks like when you find it. That will have to wait, but I will say I have appreciated the large hearted generosity of the many folks in DH who know they are blazing a trail and are willing to lay out signposts and serve as guides to others on the path.

Hermeneutics of the stack and the list: unreading journals in print or online

The New York Times reported yesterday that The Wilson Quarterly will put out its final print issue in July (Wilson Quarterly to End Print Publication – NYTimes.com). The editorial staff seemed sanguine.

“We’re not going on the Web per se,” Steven Lagerfeld, the magazine’s editor, said in an interview. “We already have a Web site. The magazine will simply be published in a somewhat different form as an app,” first in the iTunes store and later on the Android platform.

And, to be honest, I’m sanguine too.  Although I noted the demise of the University of Missouri Press with a half shudder last week, I have to admit that I don’t greet the demise of print journals with the same anxiety.  I’ve recognized lately that I mostly buy paper journals so I can have access to their online manifestations, or because I feel guilty knowing that online long-form journalism and feature writing has yet to find a way to monetize itself effectively. I try to do my part by littering my office and bedroom with stacks and stacks of largely unopened New York Reviews, New Yorkers, Chronicles of Higher Ed, and a few other lesser-known magazines and specialist journals. But most of my non-book reading, long form or not, is done on my iPad.

I will leave aside the question of what we will do if good journals like the Wilson Quarterly really can’t survive on iTunes distribution (WQ only survived in paper because of the indulgence of the Woodrow Wilson International Center for Scholars). I’m more interested at the moment in the fact of the stack and what it signifies in the intellectual life.  Every intellectual I know of is guilty of stockpiling books and journals that she never reads, and can never reasonably expect to, at least not if she has a day job.  The stack is not simply a repository of knowledge and intellectual stimulation beckoning to the reader, drawing her away from other mundane tasks like reading or preparing for class with an ennobling idea of staying informed. (Side note: academia is the one place in life where every activity of daily life can be construed as tax deductible; just make a note about it and write “possible idea for future article” at the top of the page.)

No, the stack is also a signifier.  It exists not so much to be read, since most academics give up hopelessly on the idea of reading every word of the journals that they receive.  The stack exists to be observed.  Observed on the one hand by the academic him or herself, a reassuring sign of one’s own seriousness, that one reads such things and is conversant with the big ideas, or at least the nifty hot ideas, about culture high and low.  The stack also exists to be observed by others: the rare student who comes by during office hours, the dean who happens to drop by to say hello, the colleagues coming in to ask you out for coffee–“Oh, you already got the latest issue of PMLA!”  The stack suggests you are up to date, or intend to be.  The stack communicates your values.  Which journal do you put strategically out at the edge of the desk to be observed by others, and which do you stack heedlessly on top of the file cabinet?  Even the hopelessly disheveled office can signify, as did Derrida’s constantly disheveled hair: I am too busy and thinking too many big thoughts to be concerned with neatness.

The stack, like the Wilson Quarterly, is on its way out, at least for academics.  I realized four or five years ago that e-books would signify the end of a certain form of identification since people would no longer self-consciously display their reading matter in coffee houses or on subways, every text hidden in the anonymous and private cover of the Kindle or now the iPad.  While I could connect now with other readers in Tibet or Siberia, I could not say off-handedly to the college student sitting next to me–“Oh, you’re reading Jonathan Safran Foer, I loved that book!”

The stack too is going and will soon be gone.  Replaced now by the endless and endlessly growing list of articles on Instapaper that I pretend I will get back to.  This has not yet had the effect of neatening my office, but it will remove one more chance at self-display.  I will soon be accountable only for what I know and what I can actually talk about, not what I can intimate by the stacks of unread paper sitting on my desk.

Dumpster Diving and other career moves: remembering the job market with Roger Whitson

It would be hard to say I enjoyed reading Roger Whitson’s very fine recent meditation in the Chronicle on the quest for a tenure-track job: his ambivalence on finding one, the mixed feelings of exaltation and guilt at getting what so many of his peers would never find, and of leaving behind an #Altac existence where he had begun to make a home.

Hard to enjoy reading both because the story seems to typify what our academic life has actually become, and, frankly, because it reminded me too much of my own wandering years as a new academic a couple of decades ago.  I spent seven years full-time on the job market back in the day (if you count the last two years of graduate school).  I have estimated in the past that I must have applied for at least 700-800 jobs during those years–the idea of being targeted and selective a joke for a new father.  Fortunately I was actually only totally unemployed for four months during those years, though that was enough to plunge me thousands of dollars into debt paying for health insurance.  For five of those seven years I had full-time work in various visiting assistant positions, and for two of those visiting years I was paid so little I qualified for food stamps, though I never applied for the program.  I worked as many extra courses as I could to pay the bills–probably foolish for my career since publishing slowed to a crawl, but it saved my pride.  I remember asking, naively, during an interview for one such visiting position whether it was actually possible to live in that area of the country on what I was going to be paid.  The chair interviewing me at the time hesitated, then responded, “Well, of course, your wife can work.”

Only one of those years did I not get an interview, and only two of those years did I not get a campus interview, but even then this seemed like a very peculiar and unhelpful way to claim success for a beginning academic career.  We did not have anything called #altac in those days, and my plan B–which on my worst days I sometimes still wonder whether I should have followed–was to go back to cooking school and become a chef (I know, I know.  Another growth industry).  I never felt bad about pursuing a PhD in English, and I don’t think I would have even if I had gone on to become a chef.  The learning was worth it, to me, at least.

But I did grow distant from college friends who became vice-presidents of companies or doctors in growing practices, all of whom talked about their mortgages and vacations in the Caribbean or Colorado, while I was living in the cheapest two-bedroom apartment in Fairfax, Virginia, that I could find and fishing furniture, including my daughter’s first bed, out of a dumpster.  (The furniture was held together, literally, by duct tape; I had to pay for conferences.)  And I spent a lot of evenings walking off my anxiety through the park next to our apartment complex, reminding myself of how much I had to be thankful for.  After all, I had a job and could pay my bills through the creative juggling of credit card balances.  A lot of my friends had found no jobs at all.  A low-rent comparison, I realize, but I would take what solace I could get.

I do not resent those days now, but that depends a lot on my having come out the other side.  The sobering thought in all of this is in realizing that in the world of academics today I should count myself one of the lucky ones.  Reading Roger’s essay, and the many like it that have been published in the last twenty years, I always get a sick hollow feeling in the gut, remembering what it was like to wonder what would happen if….

Reading Roger’s essay I was struck again with the fact that this is now the permanent condition of academic life in the humanities.  My own job story began more than 20 years ago at Duke, and even then we were told that the job market had been miserable for 15 years (but was sure to get better by and by).  Thirty years is not a temporary downturn or academic recession.  It is a way of being.

The advent of MOOCs, all-online education, and for-profit universities is a response to the economics of higher education that is unlikely to make things any better for the freshly minted PhD.  While there are some exciting innovations here that have a lot of promise for extending learning to the many, it’s also the case that they are attractive and draw interest because they promise to do it more cheaply, which in the world of higher education means teaching more students with fewer faculty hours.  Roger’s most powerful line came toward the end:  “Until we realize that we are all contingent, we are all #altac, we all need to be flexible, and we are all in this together, we won’t be able to effectively deal with the crisis in the humanities with anything other than guilt.”

This is right, it seems to me.  In a world that is changing as rapidly and as radically as higher education, we are all as contingent as the reporters and editors in the newsrooms of proud daily newspapers.  It is easy to say that the person who “made it” was talented enough or smart enough or savvy enough, but mostly they, I, we were just lucky enough to come out the other side.  But we would be misguided to imagine that because we made it into a world that at least resembled the world we imagined, that world will always be there.  We are an older institution and industry than music or radio or newspapers, but we are an industry and an institution nonetheless, and it seems to me that the change is upon us.  We are all contingent now.

Why students of the Humanities should look for jobs in Silicon Valley

OK, I’ll risk sounding like a broken record to say again that the notion that humanities students are ill-positioned for solid careers after college is simply misguided.  It still bears repeating.  This latest from Vivek Wadhwa at the Washington Post gives yet more confirmation that employers are not looking for specific majors but for skills and abilities and creativity, and that package can come with any major whatsoever; it often comes with students in the humanities and social sciences.

Citing Damon Horowitz, who possesses degrees in both philosophy and engineering, whose unofficial title at Google is In-House Philosopher, and whose official title is Director of Engineering, Wadhwa points out the deep need for humanities and social science students in the work of technology companies, a need that isn’t just special pleading from a humanist but is made vivid in the actual hiring practices of Silicon Valley companies.

Venture Capitalists often express disdain for startup CEOs who are not engineers. Silicon Valley parents send their kids to college expecting them to major in a science, technology, engineering or math (STEM) discipline. The theory goes as follows: STEM degree holders will get higher pay upon graduation and get a leg up in the career sprint.

The trouble is that theory is wrong. In 2008, my research team at Duke and Harvard surveyed 652 U.S.-born chief executive officers and heads of product engineering at 502 technology companies. We found that they tended to be highly educated: 92 percent held bachelor’s degrees, and 47 percent held higher degrees. But only 37 percent held degrees in engineering or computer technology, and just two percent held them in mathematics. The rest have degrees in fields as diverse as business, accounting, finance, healthcare, arts and the humanities.

Yes, gaining a degree made a big difference in the sales and employment of the company that a founder started. But the field that the degree was in was not a significant factor. ….

I’d take that a step further. I believe humanities majors make the best project managers, the best product managers, and, ultimately, the most visionary technology leaders. The reason is simple. Technologists and engineers focus on features and too often get wrapped up in elements that may be cool for geeks but are useless for most people. In contrast, humanities majors can more easily focus on people and how they interact with technology. A history major who has studied the Enlightenment or the rise and fall of the Roman Empire may be more likely to understand the human elements of technology and how ease of use and design can be the difference between an interesting historical footnote and a world-changing technology.

via Why Silicon Valley needs humanities PhDs – The Washington Post.

Again, at the risk of sounding like a broken record, this sounds like the kind of findings emphasized at the Rethinking Success Conference that I have now blogged on several times.  (I’ve heard theories that people come to be true believers if they hear a story 40 times.  So far I’ve only blogged on this 12 times, so I’ll keep going for a while longer.)  Although I still doubt that it would be a good thing for a philosopher to go to Silicon Valley with no tech experience whatsoever, a philosopher who had prepared himself by acquiring some basic technical skills alongside his philosophy degree might be in a particularly good position indeed.  Worth considering.

Side note: the Post article points to a nice little bio of Damon Horowitz.  I suspect there are not many folks in Silicon Valley who can talk about the ethics of tech products in terms that invoke Kant and John Stuart Mill.  Maybe there should be more.