
Michael Hart and Project Gutenberg

I felt an unaccountable sense of loss at reading tonight in the Chronicle of Higher Education (paper edition, no less) that the founder of Project Gutenberg, Michael Hart, has died at the age of 64.  This is a little strange since I had no idea who had founded Project Gutenberg until I read the obituary.  But Project Gutenberg I know, and I think I knew immediately when I ran across it that it was already, and would continue to be, an invaluable resource for readers and scholars, even though I’ve never been much of a champion of e-books.  I guess it felt a bit like belatedly discovering the identity of a person who had given me a great gift, without ever having had the chance to thank him.

One aspect of Hart’s vision for Project Gutenberg struck me in relation to some of the things I’ve been thinking about in the Digital Humanities: his decision to go with something simple and nearly universal as an interface rather than trying to be sexy, with it, and up to date.  Says the Chronicle:

His early experiences clearly informed his choices regarding Project Gutenberg. He was committed to lo-fi—the lowest reasonable common denominator of textual presentation. That was for utterly pragmatic reasons: He wanted his e-texts to be readable on 99 percent of the existing systems of any era, and so insisted on “Plain-Vanilla ASCII” versions of all the e-texts generated by Project Gutenberg.

That may seem a small—even retro—conceit, but in fact it was huge. From the 80s on, as the Internet slowly became more publicly manifest, there were many temptations to be “up to date”: a file format like WordStar, TeX, or LaTeX in the 1980s, or XyWrite, MS Word, or Adobe Acrobat in the 90s and 2000s, might provide far greater formatting features (italics, bold, tab stops, font selections, extracts, page representations, etc.) than ASCII. But because Mr. Hart had tinkered with technology all his life, he knew that “optimal formats” always change, and that today’s hi-fi format was likely to evolve into some higher-fi format in the next year or two. Today’s ePub version 3.01 was, to Mr. Hart, just another mile marker along the highway. To read an ASCII e-text, via FTP, or via a Web browser, required no change of the presentational software—thereby ensuring the broadest possible readership.

Mr. Hart’s choice meant that the Project Gutenberg corpus—now 36,000 works—would always remain not just available, but readable. What’s more, it has been growing, in every system since.

This is no small thing.  The ephemeral character of digital humanities projects bothers me.  By ephemeral I don’t mean they are intellectually without substance; I think the intellectual character of the work can be quite profound.  However, the forms in which the work is done can disappear or be outdated tomorrow.  Hart’s decision to use ASCII is in some sense an effort to replicate the durability of the book.  Books, for all the fragility of paper, have a remarkable endurance and stability overall.  The basic form doesn’t change, and a book used by a reader in the Middle Ages is, more or less, still usable by me in the same fashion.  By contrast, I can’t even open some of my old files in my word processor.  I think the work I did was substantial, but the form it was placed in was not enduring.  Hart’s decision makes sense to me, but I’m not sure how it might be extended to other kinds of projects in the digital humanities.
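(A small illustration of the point, and mine rather than Hart’s: reading a Plain-Vanilla text requires nothing more than fetching the file and printing its lines, in any general-purpose language. The sketch below uses only Python’s standard library; the URL is a stand-in for any Project Gutenberg plain-text link, not a specific endorsement.)

```python
# A minimal sketch of the durability argument: a plain-text e-text needs
# no particular presentational software, only the ability to read lines
# of characters. The URL is illustrative; substitute any Project
# Gutenberg plain-text link.
from urllib.request import urlopen

URL = "https://www.gutenberg.org/cache/epub/1342/pg1342.txt"  # assumed example link

with urlopen(URL) as response:
    text = response.read().decode("utf-8", errors="replace")

# The "format" is just lines of text; show the opening of the book.
for line in text.splitlines()[:20]:
    print(line)
```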

Discussion of Henry Jenkins and Lev Manovich

I’m also blogging occasionally over at digitalhumanity.wordpress.org, our site for discussing Digital Humanities at Messiah College.  You’re invited to check in and see our flailing around as we try to get our minds around whatever it is that goes on with this field and try to think about how we might contribute.  Today we had a discussion of some work by Henry Jenkins and Lev Manovich.  A few of my notes can be found here.

Uncreative Writing: Kenneth Goldsmith and Liz Laribee on Originality in the Digital Age

Professors have endless angst over the new possibilities for plagiarism and other forms of intellectual property theft in the digital age.  But according to Kenneth Goldsmith in the Chronicle Review, such anxiety misses the point that we have long since entered a new age of uncreative creativity, a fact to be celebrated rather than lamented since it points to our having gotten beyond simplistic and romantic or modernist notions of the creative individual.  Of course, Goldsmith is promoting his new book, which I guess he would take to be some kind of act of creation and for which I’m guessing he will gain his portion of individual profits—though if he wants to share the profits with all those from whom his ideas derive in an uncreative fashion, I’m sure they will oblige.

My snarky comment aside, I think there’s something to Goldsmith’s ideas, encapsulated in his title “It’s Not Plagiarism. In the Digital Age, It’s ‘Repurposing.’”  As Goldsmith puts it:

The prominent literary critic Marjorie Perloff has recently begun using the term “unoriginal genius” to describe this tendency emerging in literature. Her idea is that, because of changes brought on by technology and the Internet, our notion of the genius—a romantic, isolated figure—is outdated. An updated notion of genius would have to center around one’s mastery of information and its dissemination. Perloff has coined another term, “moving information,” to signify both the act of pushing language around as well as the act of being emotionally moved by that process. She posits that today’s writer resembles more a programmer than a tortured genius, brilliantly conceptualizing, constructing, executing, and maintaining a writing machine.

Perloff’s notion of unoriginal genius should not be seen merely as a theoretical conceit but rather as a realized writing practice, one that dates back to the early part of the 20th century, embodying an ethos in which the construction or conception of a text is as important as what the text says or does. Think, for example, of the collated, note-taking practice of Walter Benjamin’s Arcades Project or the mathematically driven constraint-based works by Oulipo, a group of writers and mathematicians.

Today technology has exacerbated these mechanistic tendencies in writing (there are, for instance, several Web-based versions of Raymond Queneau’s 1961 laboriously hand-constructed Hundred Thousand Billion Poems), inciting younger writers to take their cues from the workings of technology and the Web as ways of constructing literature. As a result, writers are exploring ways of writing that have been thought, traditionally, to be outside the scope of literary practice: word processing, databasing, recycling, appropriation, intentional plagiarism, identity ciphering, and intensive programming, to name just a few.

I really do think there is something to this notion that there is a mark of “creativity”—sanitized or put under erasure (to use that hoary old theoretical term) by the quotation marks—in the ways in which we appropriate and redeploy sources from other areas on the internet.  We create personae through citation, quotation, sharing, and commentary rather than through creative acts that spring fully formed from our minds and imaginations: what we choose to cite, how we choose to comment on it, whom we share it with, and what other citations we assemble together with it in a kind of linguistic collage.  On one level this is old stuff, as Goldsmith points out, stretching back to a particular strand of modernism and even beyond.  Indeed, to go with a different reference to Benjamin, the figure of the storyteller is one who is best understood under the sign of repetition and appropriation, retelling stories that take on new meanings through their performance within particular contexts, rather than creating novel stories that exist on the page in the effort to create their own context.

Good behavior is the proper posture of the weak. (or, Jamaica Kincaid)

I’m reminded in this of some of the work of my friend and former student Liz Laribee, whose art, made up of assemblages of leftovers, I find visually provocative and surprisingly moving.  About her work, Liz says the following:

My work almost always involves the repurposing of something else, and it’s in this process that I am trying to find meaning. Here, I used discarded bits and overlooked scraps of this bookstore to continue telling stories. The authors I’ve chosen are layered in my life in ways I can’t even quite tell you about. The dime novel poems force a new meaning to make room for a cheekier, sleuthier past

I’m not exactly sure what Liz means by a cheekier, sleuthier past, but what I take from it is that detritus, the schlocky stuff our commercial culture seems to vomit out and then shovel into a corner, is not something to be lamented so much as it is an opportunity, an occasion for a new kind of creativity that takes the vacuous surfaces of that commercial culture and creates a surprising visual and emotional depth.

Goldsmith thinks we are still too absolutely captive to old ways of doing things and that writing and literature have descended into irrelevance as a result.  He advocates the development of a writing machine that moves us beyond the cult of personality and intended effect and into a realm of fortuitous and occasional affect.  Students need to be forced, he thinks, not to be original in the old sense, but to be repetitive and to find whatever newness there is through this act of what Liz calls “repurposing.”

All this, of course, is technology-driven. When the students arrive in class, they are told that they must have their laptops open and connected. And so we have a glimpse into the future. And after seeing what the spectacular results of this are, how completely engaged and democratic the classroom is, I am more convinced that I can never go back to a traditional classroom pedagogy. I learn more from the students than they can ever learn from me. The role of the professor now is part party host, part traffic cop, full-time enabler.

The secret: the suppression of self-expression is impossible. Even when we do something as seemingly “uncreative” as retyping a few pages, we express ourselves in a variety of ways. The act of choosing and reframing tells us as much about ourselves as our story about our mother’s cancer operation. It’s just that we’ve never been taught to value such choices.

After a semester of my forcibly suppressing a student’s “creativity” by making her plagiarize and transcribe, she will tell me how disappointed she was because, in fact, what we had accomplished was not uncreative at all; by not being “creative,” she had produced the most creative body of work in her life. By taking an opposite approach to creativity—the most trite, overused, and ill-defined concept in a writer’s training—she had emerged renewed and rejuvenated, on fire and in love again with writing.

Having worked in advertising for many years as a “creative director,” I can tell you that, despite what cultural pundits might say, creativity—as it’s been defined by our culture, with its endless parade of formulaic novels, memoirs, and films—is the thing to flee from, not only as a member of the “creative class” but also as a member of the “artistic class.” At a time when technology is changing the rules of the game in every aspect of our lives, it’s time for us to question and tear down such clichés and reconstruct them into something new, something contemporary, something—finally—relevant.

I think there is something to this, although I doubt traditional novels and stories will disappear or should, any more than the writing of novels did away with storytelling in the old sense in any absolute way.  But I do think we need to think through, and not only in creative writing classes, what we might mean in encouraging our students to come up with their own original ideas, their personal arguments.

How might this notion change what we are doing, recognizing that we are in a period in which creative work, either artistic or academic, is primarily an act of redeploying, distributing, and remaking, rather than being original in the old sense of that word?

More observations on getting started with Digital Humanities

On the train home after a good day in Philly. The afternoon sessions focused a good deal on project management. David and I both agreed that in some respects it was a session that could have been presented at any kind of meeting whatsoever and didn’t seem particularly geared toward digital humanities issues. However, I do think it is germane to humanists simply because it goes back to the whole issue of working collaboratively. With very few exceptions, we are more or less trained to work alone in our humanities disciplines; indeed, we glorify, support, and materially reward working alone, as I suggested in my last post on THATCamp Philly. So I think it is helpful for humanists to think through very basic things like what it takes to plan a project, to run a meeting, to break a project down into workable parts, to keep people running on schedule, and to make people actually collaborate instead of setting off on their own (and instead of just providing them with information or telling them what to do–i.e., holding non-collaborative meetings). These are all issues that do not come naturally to humanists, and in many respects I think I am only figuring them out now after seven years as a department chair and three years as a dean. These skills are essential if digital humanities requires collaborative work in order to function at any kind of high level.

Among the very simple gains from the day was an introduction to the possibilities of Google Docs. This comes as no surprise to the rest of the world, I’m sure, but I really have not moved out of the structured environs of an office software suite and/or a learning management system. In my very brief exposure, Google Docs made these ways of doing work seem really quite clunky and inconvenient, though I haven’t actually tried to work with Google Docs at this point. I really want to figure out a way of conducting some of my basic work in this environment, with shared documents in the cloud. We need to be having upside-down meetings in some respect–where a lot of stuff gets done in virtual or distanced environments so that face-to-face meetings can be used for other kinds of high-level issues. I’m not sure where to begin, but I want to experiment a little more personally and then see if there’s any way of incorporating it into some of the meetings I’m responsible for as a dean.

David and I both agreed that we were terribly out of our league when it came to understanding some of the basic language and references to what was going on. We were both disappointed that we won’t be able to come back tomorrow since we both intuited in some sense that it would be better to gain this knowledge by actually plunging in and trying to make sense of what people are doing on actual projects, rather than trying to fill in all the background before we ever get started. If I’m right that this is in great part about gaining facility in a language, I think both David and I arrived at the conclusion that it would be better to learn by immersion rather than believing that we should first learn all the grammatical rules. In that sense, maybe there is no right place to start, and we just have to have a practical goal: decide where we would like to get in the landscape and start walking there. We’ll figure out what we need as we go along.

A couple of final notes:

I am getting too old to get up at 4:00 after going to bed at 11:30.

Trying to keep up with Facebook while also listening to a lecture does not work, no matter what my students say.

Cosmopolis, My Home Town

In my first school years growing up as a child of American missionaries in Papua New Guinea, my friends and I lined up outside our two-room school house every day, stood to attention, and sang “God Save the Queen” to the raising of the Australian flag.  We played soccer at recess.  And cricket.  I learned quickly to speak a fluent pidgin–the standard language of commerce and conversion among the 1000 different language groups on the Island–and probably spoke as much pidgin in my four years there as I did English.  By the end of the first six months I spoke with an Aussie accent.

At the same time my friends and I were fiercely loyal Americans, even though America was mostly an idea our parents talked about.  A place in pictures we inhabited in the Polaroid versions of our infant selves.  I proudly proclaimed myself a Texan even though I had spent only the first two years of my life in Texas and had no living memory of it except hazy dream flashes of a  visit to a beach in Galveston.  Once, erudite already at the age of seven and reading my way through the World Book Encyclopedia, I proclaimed confidently that Australia was as big as the continental United States.  Fisticuffs ensued. My friends in utter disbelief that anything in the world could be so large as America–so large did it loom in our telescopic imaginations–and in disbelief too that I would have the temerity to state the blasphemy out loud.

I think this urgency to be American was born somehow out of an intuited recognition of our placelessness.  It was a longing to belong somewhere, and an acknowledgement that somehow, despite appearances, we were not entirely sure we belonged where we were. Unlike most of my friends, I returned to the States after only four years.  I shed my Aussie accent hurriedly.  When my father came to my third-grade classroom in Bethany, Oklahoma, I refused to speak pidgin with him, embarrassed, pretending to forget.  No one played soccer.  No one had heard of cricket.  I semi-learned to throw a baseball, though my wife still throws better than I do.  For the first year back in the States, I rooted for the Texas Longhorns, before finally getting religion sometime right around 1970.  I’ve been a Sooner fan in good standing ever since.

This sense of cultural dislocation, of belonging and not belonging to two different countries and cultures, was, I think, felt much more acutely by my friends who remained in New Guinea for the duration of their childhoods.  And it has certainly been detailed and discussed much more movingly and thoughtfully by my former student here at Messiah College, Carmen McCain.  Still, I think this cultural lurching has remained important to me.  While I became thoroughly and unapologetically American, I retained a sense that people lived in other ways, that I had lived in other ways.  Somehow, to remain loyal to all the selves that I had been, I could never be loyal to just one place or just one people.  In that sense, I have always been drawn to a kind of cosmopolitan ideal, a recognition that the way we do things now is only a way of doing things now, bound by time, chance, and circumstance–that there are many different ways to live, and that these ways may be at different times taken up and inhabited.  And so the possibilities for our selves are not bounded by the blood we’ve been given or the ground to which we’ve been born.

At the same time, I’ve really been impressed lately by a couple of cautionary essays on the limitations of cosmopolitanism.  This week Peter Wood over at the Chronicle of Higher Education sounded a cautionary note about the ideal of global citizenship.

Being a “citizen of the world” sounds like a good and generous thing. Moreover it is one of those badges of merit that can be acquired at no particular cost. World citizens don’t face any of the ordinary burdens that come with citizenship in a regular polity: taxes, military services, jury duty, etc. Being a self-declared world citizen gives one an air of sophistication and a moral upper hand over the near-sighted flag-wavers without the bother of having to do anything.

Well, one can only say yes, this strikes me as entirely fair.  Though I will point out that it seems to me that a lot of times recently the flag-wavers seem not too interested in the basic things of a regular polity, like paying taxes.  Still, Wood has a point that cosmopolitanism can often devolve into a kind of irresponsible consumerist tourism–imbiber of all cultures, responsible for none.  He implies, rightly I think, that whatever the values of global awareness, the bulk of life is worked out in the nitty-gritty day-to-day of the local business of things.  All living, not just all politics, is local in some utterly conventional and inescapable sense.

Wood goes on to critique Martha Nussbaum, though it is a generous critique, it seems to me.

Higher education inevitably involves some degree of estrangement from the culture and the community in which a student began life. If a student truly engages liberal education, his horizons will widen and his capacity for comprehending and appreciating achievements outside his natal traditions will increase. Thus far I accept Nussbaum’s argument. But a good liberal-arts education involves a lot more than uprooting a student; showing him how limited and meager his life was before he walked into the classroom; and convincing him how much better he will be if he becomes a devotee of multiculturalism. Rather, a good liberal arts education brings a student back from that initial estrangement and gives him a tempered and deepened understanding of claims of citizenship—in a real nation, not in the figment of “world citizenship.”

I like a lot of what Wood is doing in passages like this, but I’m concerned that his only means of articulating a notion of particularity is through the category of the nation.  In a nation as big and baggy as the United States, does this give us a really very robust sense of the local and particular?  And does it solve my basic problem that I feel loyal to different localities, to the integrity of the memory of the person I have been and the people with whom I was, and somehow still am?

I’m more attracted to what my colleague here at Messiah College, John Fea, has to say about cosmopolitanism in his recent and very good essay on the issue in academe, where he develops the concept of cosmopolitan rootedness as an ideal to strive after.

But this kind of liberal cosmopolitanism does not need to undermine our commitment to our local attachments. Someone who is practicing cosmopolitan rootedness engages the world from the perspective of home, however that might be defined. As Sanders writes:

To become intimate with your home region [or, I might add, one’s home institution], to know the territory as well as you can, to understand your life as woven into local life does not prevent you from recognizing and honoring the diversity of other places, cultures, ways. On the contrary, how can you value other places, if you do not have one of your own? If you are not yourself placed, then you wander the world like a sightseer, a collector of sensations, with no gauge for measuring what you see. Local knowledge is the grounding for global knowledge. (1993, 114)

Or to quote the late Christopher Lasch:

Without home culture, as it used to be called—a background of firmly held standards and beliefs—people will encounter the “other” merely as consumers of impressions and sensations, as cultural shoppers in pursuit of the latest novelties. It is important for people to measure their own values against others and to run the risk of changing their minds; but exposure to others will do them very little good if they have no mind to risk. (New Republic, 18 February 1991)

So is cosmopolitan rootedness possible in the academy? Can the way of improvement lead home? Can we think of our vocation and our work in terms of serving an institution? Our natural inclination is to say something similar to the comments in the aforementioned blog discussion. I can be loyal to an institution as long as the administration of the institution remains loyal to me. Fair enough. Administrators must be sensitive to the needs of their faculty, realizing that institutional loyalty is something that needs to be cultivated over time. But this kind of rootedness also requires faculty who are open to sticking it out because they believe in what the institution stands for—whatever that might be. (This, of course, means that the college or university must stand for something greater than simply the production of knowledge). It requires a certain form of civic humanism—the ideological opposite of Lockean contractualism—that is willing to, at times, sacrifice rank careerism for the good of the institution.

Instead of global citizenship, John is championing what administrators sometimes call institutional citizenship (and as an aside I would only say that John is exemplary here, as this essay might suggest).  Yet I admit that I find myself still grasping after that thing that speaks out of our memories, those places to which we remain loyal in thought and speech because we have been committed to those locations too.  And I wonder then if it is possible that we might be loyal to the places and spaces of our imaginations, places and selves and worlds that we can imagine as places of becoming.  If I have been in and of these other places, how is that reflected in being in and of this place I’m in, and how should that be imagined in light of the places I might also be in and of, if only not yet?

John, I know, is a loyal citizen of New Jersey, and defends it when none of the rest of us will. I wish him well.  And I am a loyal citizen of the Papua New Guinea of my memory, and I am a fiercely loyal southerner and southwesterner who takes an ethnic umbrage at the easy sneering about the South that springs unconsciously to the lips of northerners, and I am a fiercely loyal Oklahoman who believes the state has something to be proud of beyond its football team.

I am also, in some sense, a loyal citizen of the heaven of my imagining where all and everyone speak in the tongues of men and angels and we hear each and every one in our own tongues, a transparent language without translation, a heaven where every northerner finally learns the proper way to say “Y’all.”

What theory of locality and cosmopolitanism can get at this sense that I am one body in a place, but that this body bears in its bones a loyalty to many places, growing full of spirit at the smell of cut grass after rain in the hills of Arkansas, nose pinching at the thought of the salty stink of Amsterdam, remembering faintly the sweat in the air and on the leaves of the banana trees in highland tropics of New Guinea?

In Praise of Reviews, Reviewing, and Reviewers

I think if I were born again by the flesh and not the spirit, I might choose to become a book reviewer in my second life.  Perhaps this is “true confessions,” since academics and novelists alike share a disdain for the review as a subordinate piece of work, and for the reviewer as a lowly creature to be scorned.  However, I love the review as a form and see it as a way of exercising creativity, rhetorical facility, and critical consciousness.  In other words, with reviews I feel like I bring together all the different parts of myself: the creativity and rhetorical facility I developed through an MFA, and the critical consciousness of my scholarly self developed in graduate school at Duke.  I developed my course on book reviewing here at Messiah College precisely because I think it is one of the most challenging forms to do well: to write engagingly and persuasively for a generally educated audience while also writing with enough informed intelligence for an academic one.

Like Jeffrey Wasserstrom in the Chronicle Review, I also love reading book reviews, and often spend vacation days not catching up on the latest novel or theoretical tome, but on all the book reviews I’ve seen and collected on Instapaper.  Wasserstrom’s piece goes against the grain of a lot of our thinking about book reviews, even mine, and it strikes me that he’s absolutely right about a lot of what he says.  First, I often tell students that one of the primary purposes of book reviewers is to help sift wheat from chaff and tell other readers what out there is worth reading.  This is true, but only partially so.

Another way my thinking diverges from Lutz’s relates to his emphasis on positive reviews’ influencing sales. Of course they can, especially if someone as influential as, say, Michiko Kakutani (whose New York Times reviews I often enjoy) or Margaret Atwood (whose New York Review of Books essays I never skip) is the one singing a book’s praises. When I write reviews, though, I often assume that most people reading me will not even consider buying the book I’m discussing, even if I enthuse. And as a reader, I gravitate toward reviews of books I don’t expect to buy, no matter how warmly they are praised.

Consider the most recent batch of TLS issues. As usual, I skipped the reviews of mysteries, even though these are precisely the works of fiction I tend to buy. And I read reviews of nonfiction books that I wasn’t contemplating purchasing. For instance, I relished a long essay by Toby Lichtig (whose TLS contributions I’d enjoyed in the past) that dealt with new books on vampires. Some people might have read the essay to help them decide which Dracula-related book to buy. Not me. I read it because I was curious to know what’s been written lately about vampires—but not curious enough to tackle any book on the topic.

What’s true regarding vampires is—I should perhaps be ashamed to say—true of some big fields of inquiry. Ancient Greece and Rome, for example. I like to know what’s being written about them but rarely read books about them. Instead, I just read Mary Beard’s lively TLS reviews of publications in her field.

Reviews do influence my book buying—just in a roundabout way. I’m sometimes inspired to buy books by authors whose reviews impress me. I don’t think Lichtig has a book out yet, but when he does, I’ll buy it. The last book on ancient Greece I purchased wasn’t one Mary Beard reviewed but one she wrote.

I can only say yes to this.  It’s very clear that I don’t just read book reviews in order to make decisions as a consumer.  I read book reviews because I like them for themselves, if they are well done, but also just to keep some kind of finger on the pulse of what’s going on.  In other words, there’s a way in which I depend on good reviewers not to read in order to tell me what to buy, but to read in my place, since I can’t possibly read everything.  I can remain very glad, though, that some very good reader-reviewers are out there reading the many good things there are to read.  I need them so I have a larger sense of the cultural landscape than I could possibly achieve by trying to read everything on my own.

Wasserstrom also champions the short review, and speculates on the tweeted review and its possibilities:

I’ve even been musing lately about the potential for tweet-length reviews. I don’t want those to displace other kinds, especially because they can too easily seem like glorified blurbs. But the best nuggets of some reviews could work pretty well within Twitter’s haiku-like constraints. Take my assessment of Kissinger’s On China. When I reviewed it for the June 13 edition of Time’s Asian edition, I was happy that the editors gave me a full-page spread. Still, a pretty nifty Twitter-friendly version could have been built around the best line from the Time piece: “Skip bloated sections on Chinese culture, focus on parts about author’s time in China—a fat book w/ a better skinnier one trying to get out.”

The basic insight here is critical.  Longer is not always better.  I’m even tempted to say not often better, the usual length of posts to this blog notwithstanding.  My experiences on Facebook suggest to me that we may be in a new era of the aphorism, one that may exalt the wit of the 18th century, in which the pithy riposte may be more telling than the blowsy dissertation.

A new challenge for my students in the next version of my book-reviewing class: write a review that is telling, accurate, and rhetorically effective in 160 characters or less.

Create or Die: Humanities and the Boardroom

A couple of years ago, when Dana Gioia, poet and former head of the NEA, came to Messiah College to speak to our faculty and the honors program, I asked him if he felt there were connections between his work as a poet and his other careers as a high-level government administrator and business executive.  Gioia hemmed and hawed a bit, but finally said no; he thought he was probably working out of two different sides of his brain.

I admit to thinking the answer was a little uncreative for so creative a person.  I’ve always been a bit bemused by and interested in the various connections I make between the different parts of myself.  Although I’m not a major poet like Gioia, I did complete an MFA and continue to think of myself in one way or another as a writer, regardless of what job I happen to be doing at the time.  And I actually think working on an MFA for those two years back in Montana has had a positive effect on my life as an academic and my life as an administrator.  Some day I plan to write an essay on the specifics, but it is partially related to the notion that my MFA did something to cultivate my creativity, to use the title of a very good article in the Chronicle Review by Steven Tepper and George Kuh.  According to Tepper and Kuh,

To fuel the 21st-century economic engine and sustain democratic values, we must unleash and nurture the creative impulse that exists within every one of us, or so say experts like Richard Florida, Ken Robinson, Daniel Pink, Keith Sawyer, and Tom Friedman. Indeed, just as the advantages the United States enjoyed in the past were based in large part on scientific and engineering advances, today it is cognitive flexibility, inventiveness, design thinking, and nonroutine approaches to messy problems that are essential to adapt to rapidly changing and unpredictable global forces; to create new markets; to take risks and start new enterprises; and to produce compelling forms of media, entertainment, and design.

Tepper and Kuh, a sociologist and a professor of education respectively, argue forcefully that creativity is not doled out by the forces of divine or genetic fate but is something that can be learned through hard work.  This idea is supported by recent findings that what musical and other artistic prodigies share is not so much genetic markers as practice: according to Malcolm Gladwell, 10,000 hours of it.  Tepper and Kuh believe the kinds of practice and discipline necessary for cultivating a creative mindset are deeply present in the arts, but only marginally present in many other popular disciplines on campus.

Granted, other fields, like science and engineering, can nurture creativity. That is one reason collaborations among artists, scientists, and engineers spark the powerful breakthroughs described by the Harvard professor David Edwards (author of Artscience, Harvard University Press, 2008); Xerox’s former chief scientist, John Seely Brown; and the physiologist Robert Root-Bernstein. It is also the case that not all arts schools fully embrace the creative process. In fact, some are so focused on teaching mastery and artistic conventions that they are far from hotbeds of creativity. Even so, the arts might have a special claim to nurturing creativity.

A recent national study conducted by the Curb Center at Vanderbilt University, with Teagle Foundation support, found that arts majors integrate and use core creative abilities more often and more consistently than do students in almost all other fields of study. For example, 53 percent of arts majors say that ambiguity is a routine part of their coursework, as assignments can be taken in multiple directions. Only 9 percent of biology majors say that, 13 percent of economics and business majors, 10 percent of engineering majors, and 7 percent of physical-science majors. Four-fifths of artists say that expressing creativity is typically required in their courses, compared with only 3 percent of biology majors, 16 percent of economics and business majors, 13 percent of engineers, and 10 percent of physical-science majors. And arts majors show comparative advantages over other majors on additional creativity skills—reporting that they are much more likely to have to make connections across different courses and reading; more likely to deploy their curiosity and imagination; more likely to say their coursework provides multiple ways of looking at a problem; and more likely to say that courses require risk taking.

Tepper and Kuh focus on the arts for their own purposes, and I realized that in thinking about the ways an MFA had helped me, I was thinking about an arts discipline in many respects.  However, the fact that the humanities are nowhere in their article set me to thinking.  How do the humanities stack up in cultivating creativity, and is this a selling point for parents and prospective students to consider as they imagine what their kids should study in college?  These are my reflections on the seven major “creative” qualities that Tepper and Kuh believe we should cultivate, and how the humanities might do in each of them.

  1. the ability to approach problems in nonroutine ways using analogy and metaphor;
    • I think the humanities excel in this area for a variety of reasons.  For some disciplines like English and Modern Languages, we focus on the way language works with analogy and metaphor and we encourage its effective use in writing.  But more than that, I think many humanities disciplines can encourage nonroutine ways of approaching problems—though, of course, many professors in any discipline can be devoted to doing the same old things the same old way.
  2. conditional or abductive reasoning (posing “what if” propositions and reframing problems);
    • Again, I think a lot of our humanities disciplines approach things in this fashion.  To some degree this is because of a focus on narrative.  One way of getting at how narrative works and understanding the meaning of a narrative at hand is to pose alternative scenarios.  What if we had not dropped the bomb on Japan?  What if Lincoln had not been shot?  How would different philosophical assumptions lead to different outcomes for an ethical problem?
  3. keen observation and the ability to see new and unexpected patterns;
    • I think we especially excel in this area.  Most humanities disciplines are deeply devoted to close and careful readings that discover patterns that are beneath the surface.  The patterns of imagery in a poem, the connections between statecraft and everyday life, the relationship between dialogue and music in a film.  And so forth.  Recognizing patterns in problems can lead to novel and inventive solutions.
  4. the ability to risk failure by taking initiative in the face of ambiguity and uncertainty;
    • I’m not sure if the humanities put a premium on this inherently or not.  I know a lot of professors (or at least I as a professor) would rather students take a risk in doing something different on a project and have it not quite work than do the predictable thing.  But I don’t know that this is something inherent to our disciplines.  I do think, however, that our disciplines inculcate a high level of comfort with uncertainty and ambiguity.  Readings of novels or the complexities of history require us not to go for easy solutions but to recognize and work within the ambiguity of situations.
  5. the ability to heed critical feedback to revise and improve an idea;
    • Again, I would call this a signature strength, especially as it manifests itself in writing in our disciplines and in the back-and-forth exchange among students and between student and teacher.  Being willing to risk an idea or an explanation, have it critiqued, and then revise one’s thinking in response to more persuasive arguments.
  6. a capacity to bring people, power, and resources together to implement novel ideas; and
    • I admit that on this one I kind of think the humanities might be weaker than we should be, at least as manifested in our traditional ways of doing things.  Humanists in most of our disciplines are notorious individualists who would rather spend their time in libraries.  We don’t get much practice at gathering people together with the necessary resources to work collaboratively.  This can happen, but instinctively humanists often want to be left alone.  This is an area where I think a new attention to collaborative and project-based learning could help the humanities a great deal, something we could learn from our colleagues in the sciences and in arts like theatre and music.  I’m hopeful that some of the new attention we are giving to digital humanities at Messiah College will lead us in this direction.
  7. the expressive agility required to draw on multiple means (visual, oral, written, media-related) to communicate novel ideas to others.
    • A partial strength.  I do think we inculcate good expressive abilities in written and oral communication, and we value novel ideas.  Except for our folks in film and communication—as well as our cousins in art history—we are less good at working imaginatively with visual and other media-related resources.  This has been one of my priorities as dean, but something that’s hard to generate from the ground up.

Well, that’s my take.  I wonder what others think?

Many Stories, One World

YHWH’s Image

With this clay He began to coat His shins,
cover His thighs, His chest. He continued this
layering, and, when He had been wholly
interred, He parted the clay at His side, and
retreated from it, leaving the image of Himself
to wander in what remained of that early
morning mist.
—Scott Cairns

I wrote a little tepidly about Scott Cairns’s collection “Recovered Body” a little while back. Nevertheless, I find that “YHWH’s Image”–partially quoted above– has been sticking with me, as poems somehow seem to do. I keep coming back to that image of the creation, so different and yet so right, as if Cairns has shown me both the truth of human intimacy with God and our ache at God’s absence.

I may have been thinking about this a bit since I’m leading a discussion of Genesis 1 and 2 at my church this Sunday. No Biblical scholar am I, but I’ve been mulling over the endless troubling that goes on about the two different accounts of creation, as if this somehow counted against the truth of the Biblical passages. “Those silly ancient Hebrews,” we seem to say, “didn’t they realize they put two completely different stories right next to each other?”  A modern chauvinism.  As it happens, I also read this week a very fine essay in manuscript by my colleague at Messiah College, Brian Smith, a scholar of the Hebrew Bible, who points out that there’s also a creation story in the first part of Proverbs, very different from the first chapters of Genesis, proclaiming the primacy of Wisdom. And, of course, I realized that there’s another creation story in the first chapter of John, where in the beginning we have not God brooding over the deep but the Word with and as God. I’m sure if I knew my Bible better a number of other creation stories might spring to mind–I’ve come to doubt there are only four. And yet even with so many, Scott Cairns’s poem got me thinking about some truth about the creation story that is genuinely new but somehow consistent with all the others and with the teachings I’ve received about God’s relationship to the world as creator and redeemer.

And yet we worry and fret that there are two stories and they don’t match up.

What if it is the case that the creation of the world out of nothing is so beyond our imagining that getting the one right story isn’t the point? What if it is the case that the great rupture of creation is so beyond our comprehension that we are set to storytelling, not so we could capture the single truth of what happened, but so we could bear testimony to its mystery?

Testimony, it seems to me, is an important word because it bears witness to a given fact and is in some way accepted as testimony only to the extent that it repeats what everyone already knows, but in a way that bears an individual stamp.  The truthtelling that is testimony is both repetitive and unique, as we are urged on to bear witness to the facts of a known or remembered world.  The recent rehearsals of oral histories about 9/11 are this kind of record. All of them sound familiar, and yet all of them are unique and different, bearing some new witness to the day a world changed for those who were there. Scott Cairns’s poem strikes me this way: right and consistent with everything I think I have heard and believed, but bearing its own stamp, its own word, never heard before, such that I know God the Creator in some new way.

For many poststructuralists/postmodernists, the multiplicity of stories about events in the world gives credence to the notion of incommensurable realities–in some sense we make the world through our speaking about it.  And so there is no world to be spoken about, but instead a plethora of worlds carried and clashing on the wings of our words. In this view, the notion of one world or a singular truth is oppressive, squashing down our human ingenuity.

But what if it were the case that the one act of creation created a world so singular and beyond our knowing that we are called upon to bear testimony to that one world through our own stories?  The one great rupture of creation calls into being our own acts of creation out of testimony to an event we affirm but cannot encompass.  If this were the case, then I would stop worrying about the fact that there are two or three or four or more stories about creation in the Bible. I might be surprised that there weren’t more.

Wouldn’t something as startling and unprecedented as a world need more than one story?

Read two poems and call me in the morning

I freely admit that we literary types are wont to say everything comes back to story and that words can change the world.  (We are also given to words like “wont,” but that is another matter.)  This is mostly just the self-aggrandizement that accompanies any disciplinary passion.  But I also really think there is something serious to the idea that stories can make a big difference to hearts and minds, and therefore to our bodies.  This might be because my father was a practicing physician but also a storyteller, and somehow these things always fit together in my mind: the well-told story fixed something, put something broken back together, almost as literally as my father set a bone or wrapped a cast.

Some of you know that I have a growing interest in what is sometimes called the medical humanities, and I have been deeply interested in how the literary arts have been used in healing practices in everything from psych wards to cancer wards to Alzheimer’s facilities.  The recent lovely piece from Kristin Sanner in the Chronicle Review, “The Literature Cure,” is not quite so officially about the medical humanities, but it gets at this idea that somehow literature can be involved in our healing and our wholeness, even in the midst of dying and disease.  A few notes from Sanner on how her battle with breast cancer changed her sense of literature, and on how her commitment to literature changed her understanding of and response to her disease:

English majors and reluctant students of literature in my general-education courses often ask me questions like: “What exactly can you do with a degree in English?” “Why read all of these books from the past?” Before my cancer diagnosis, I would answer the English majors with practical examples. You could go into publishing or journalism, I told them, or go to graduate or law school. To the general-education students, I would answer with a vague “literature enriches our lives and makes us more well-rounded individuals.”

After my cancer diagnosis, my responses changed, becoming more universal and less practical. We read and study literature, I told my students, because it helps us understand how to live and how to die. It shows us how to persevere in the face of adversity, how to reach into our personal depths and find both meaning and will. It reminds us of the dichotomous fragility and tenacity of earthly living. It also teaches us how to care for those who suffer.

At a time when colleges and universities are making unilateral cuts to humanities programs, these reasons seem pertinent. Each of us, unfortunately, will experience adversity at some point in our lives. Many of us will find ourselves facing a tragedy, a trauma, or a loss that cannot be explained in simple terms. Conventional medicine and science may help us cope in a practical, physical sense—they may even cure us of our illnesses and pain. Religious faith will temper the suffering for some. But it is our universal stories—written, oral, and visual—that help us navigate through these adversities with grace and courage. For many of us, stories give us the hope that we may be able to bear the burdens of our afflictions and live fully, even as we are dying. Stories teach us that suffering and perseverance unify us as humans in a way that transcends race, religion, and class.

[…]

Throughout my battle with cancer, I have turned to literature and writing to make sense of this miserable and mysterious disease. Books help me understand that human suffering is universal. They have also taught me empathy—how to reach out to others who suffer. In a world where spite and hatred mark the rhetoric of so many, such an intangible attribute should be a vital, required outcome of every student’s educational experience.

Indeed.