
How Do Blogging and Traditional Modes of Scholarly Production Relate?

In the latest edition of the Chronicle’s Digital Campus, Martin Weller makes some strong claims about the significance of blogging. Recognizing the difficulty of measuring the value of a blog in comparison to traditional modes of scholarly publication, Weller argues that blogging is ultimately in the interest of both institutions and scholarship.

It’s a difficult problem, but one that many institutions are beginning to come to terms with. Combining the rich data available online that can reveal a scholar’s impact with forms of peer assessment gives an indication of reputation. Universities know this is a game they need to play—that having a good online reputation is more important in recruiting students than a glossy prospectus. And groups that sponsor research are after good online impact as well as presentations at conferences and journal papers.

Institutional reputation is largely created through the faculty’s online identity, and many institutions are now making it a priority to develop, recognize, and encourage practices such as blogging. For institutions and individuals alike, these practices are moving from specialist hobby to the mainstream. This is not without its risks, but as James Boyle, author of the book The Public Domain: Enclosing the Commons of the Mind (Yale University Press, 2008), argues, we tend to overstate the dangers of open approaches and overlook the benefits, while the converse holds true for the closed system.

The Virtues of Blogging as Scholarly Activity – The Digital Campus – The Chronicle of Higher Education.

The claim that an institutional reputation is largely created through the faculty’s online identity startled me when I first read it, an index no doubt of my deeply held and inveterate prejudice in favor of libraries. But I have been trying to pound away at the faculty about how utterly important our online presence is, and the internet–in many different modes–gives us the opportunity to create windows on humanities work that are not otherwise easily achieved, at least in comparison to some of the work done by our colleagues in the arts or in the sciences. Blogging is one way of creating connection, of creating vision, and I think that, with a very few exceptions like the Ivies and the public Ivies, your online presence matters more than anything else you can possibly do to establish your reputation in the public eye and in the eyes of prospective students and their parents.

That is fairly easy to grasp. The value of blogging to scholarship in general, or its relationship to traditional scholarship, remains more thorny and difficult to parse. I’ve had conversations with my colleague John Fea over at The Way of Improvement Leads Home, and we both agree that in some sense scholars still have to have established some claim to speak through traditional modes of publication in order to give their scholarly blogging some sense of authority. People listen to John about history because he’s published books and articles. [Why people listen to me I have no idea–although I do have books and articles, I have nothing like John’s reputation. It may have something to do with simply holding a position: because I am a dean at a college, I can lay claim to certain kinds of experience that are relevant to discussing the humanities.]

I am not sure it will always be thus. I think the day is fast approaching when publishing books will become less and less important as the arbiter of scholarly authority. But for now, and perhaps for a good long time to come, blogging exists in an interesting symbiosis with other traditional forms of scholarship. Weller quotes John Naughton to this effect: “Looking back on the history,” he writes, “one clear trend stands out: Each new technology increased the complexity of the ecosystem.”

I’ve read some things lately that say blogging may be on its way out, replaced in the minds of the general public, I guess, by Facebook, Twitter, and Pinterest. But for now I think it remains an interesting and somewhat hybrid academic form: a forum for serious thought and reasonably good writing, but not one that claims to be writing for the ages. In some respects, I think the best blogging is more like a recovery of the eighteenth-century salon, wherein wit that demonstrated learning and acumen was highly valued, and perhaps formed a basis of academic life that stood unembarrassed next to the more muscular form of the book. Blogging is one clearly important addition to the scholarly ecosystem, playing off of and extending traditional scholarship rather than simply replacing it.

In my own life right now, as an administrator, I have too little time during the school year to pursue the writing of a 40-page article or a 400-page book; nor, right now, do I have the interest or inclination (however much I want to get back and finish the dang Harlem Renaissance manuscript that sits moldering in my computer). I do, however, continue to feel the need to contribute to the scholarly conversation surrounding the humanities and higher education in general. Blogging is one good way to do that, and one that, like Weller, I find enjoyable, creative, and stress-relieving, even when I am writing my blog at 11:00 at night to make sure I can get something posted. Ever the Protestant and his work ethic.

Takeaways–NITLE Seminar: Undergraduates Collaborating in Digital Humanities Research

Yesterday afternoon at 3:00, about 30 Messiah College humanities faculty and undergraduates gathered to listen in on and virtually participate in the NITLE Seminar focusing on Undergraduates Collaborating in Digital Humanities Research. A number of our faculty and students were tweeting the event, and a Storify version with our contributions can be found here. I am amazed and gratified to have such a showing late on a Friday afternoon. Students and faculty alike were engaged and interested by the possibilities they saw being pursued in undergraduate programs across the country, and our own conversation afterwards extended for more than a half hour beyond the seminar itself. Although most of us freely admit that we are only at the beginning and feeling our way, there was broad agreement that undergraduate research and participation in Digital Humanities work was something we needed to keep pushing on.

If you are interested in reviewing the entire seminar, including chat-room questions and the like, you can connect through this link. I had to download WebEx in order to participate in the seminar, so you may need to do the same, even though the instructions I received said I wouldn’t need to. My own takeaways from the seminar were as follows:

  • Undergraduates are scholars, not scholars-in-waiting. If original scholarship is defined as increasing the fund of human knowledge (discovering, categorizing, and interpreting data that helps us better understand human events and artifacts, and developing tools that other scholars can employ to explore, confirm, disconfirm, or further one’s findings), then these young people are scholars by any definition.
  • Digital Humanities research extends (and, to be sure, modifies) our traditional ways of doing humanities work; it does not oppose them. None of these young scholars felt inordinate tension between their traditional humanities training and their digital humanities research. A student who reviewed a database of 1,000 Russian folk tales extended and modified the understanding she had arrived at through the close reading of a dozen. Digital Humanities tools enable closer reading and better contextual understanding of the poet Agha Shahid Ali, rather than pushing students away into extraneous material.
  • Many or most of these students learned their tools as they went along, within the context of what they were trying to achieve. I was especially fascinated that a couple of the students had had no exposure to Digital Humanities work prior to their honors projects; they learned the coding and digital savvy they needed as they went along. Learning tools within the context in which they are needed makes more and more sense to me. You would not teach a person how to use a hammer simply by giving them a board and nails, at least not if you don’t want them to get bored. Rather, give them something to build, and show them, or have them figure out, how the hammer and nails will help them get there.

I’m looking forward to the Places We’ll Go.

Is Twitter the future of fiction? Micro-prose in an age of ADD

As I’ve mentioned, I’ve been struck by Alex Juhasz’s pronouncement at the Re:Humanities conference that we must learn what it means to write for an audience that is permanently distracted. In response, I put up a Facebook post: “We need a rhetoric of the caption. A hermeneutic of the aphorism. Haiku as argument.” My Provost at Messiah College–known for thorough and intricate argument–left a comment: “I’m Doomed.”

Perhaps we all are, those of us who are more Faulkneresque than Carveresque in our stylistic leanings.  This latest from GalleyCat:

R.L. Stine, the author of the popular Goosebumps horror series for kids, gave his nearly 49,000 Twitter followers another free story this afternoon. To celebrate Friday the 13th, the novelist tweeted a mini-horror story called “The Brave One.” We’ve collected the posts below for your reading pleasure.

via R.L. Stine Publishes ‘The Brave Kid’ Horror Story on Twitter – GalleyCat.

OK, I know it’s a silly reach to put Stine and Faulkner in the same paragraph, and to be honest I found Stine’s story trite. On the other hand, I do think it’s obvious we’re now in an age wherein shorter prose with bigger impact may be the necessity. Flash fiction is growing, and we can witness the immense popularity of NPR’s Three-Minute Fiction contest. These forms of fiction, and of writing in general, speak to the necessities of an art of the moment rather than an art of immersion. Literature, and prose in general, is ALWAYS responsive to the material and cultural forms of its own moment, and I think prose that is short and explosive, or prose that pierces beneath the surface of the reader’s psyche in a moment only to spread and eat its way into the unconscious when the moment of reading is long forgotten, is most likely the prose that is the order of the day.

BUT…Stine certainly doesn’t do it for me. I don’t know a lot about Twitter fiction. Is there any really good stuff out there on Twitter, as opposed to flash fiction written in a standard format, which I know more about? Or is it all carny-style self-promotion or unrealized theory at the moment?

[And what, I wonder, does this mean for the future of academic prose as well? I’m a latecomer to Twitter myself, but I’ve been a little fascinated with the academic discourse that can occur there. More on that some other time.]

Literacy in the Digital Humanities: Or, a clueless “noob” in digital academe

Today my faculty group focused on the Digital Humanities here at Messiah College had a great session with Ryan Cordell from St. Norbert College. Ryan blogs regularly for ProfHacker at the Chronicle of Higher Education, and holds down the Digital Humanities fort (or perhaps leads the insurgency) at St. Norbert. He has also done some work advising liberal arts colleges in particular on projects in the Digital Humanities, so I thought he’d be a good choice for consulting. I’m happy with the choice: Ryan was practical and down-to-earth, while also pointing to really challenging and exciting places we could take some of our nascent ideas. I think we came away with some good possibilities for next steps that will lead to concrete action in the next year. I highly recommend Ryan if you’re looking for a consultant for starting or managing digital humanities projects in a smaller school setting.

Earlier in the day I had had the good luck to look in on a massive Twitter debate that was, unbeknownst to the participants, about or at least precipitated by me and a brief conversation I’d had with Ryan. I’d told Ryan that one of my biggest concerns was professional development for faculty: getting them over some of the immediate humps of alienation that traditional humanistic scholars feel when confronted with what amounts to an alien DH world. I mentioned that I and one of my colleagues, David Pettegrew–who is himself much more versed in technical know-how than I am–went to a THATCamp and spent the first two or three hours feeling completely lost and at sea, unable to fully comprehend half the language that was being used or the tasks we were being asked to implement. I told Ryan I felt I would have needed half a semester of a coding class to get everything out of the THATCamp that I should have. Although that improved as things went along and we got into concrete projects, and I found everyone very gracious and the atmosphere enthusiastic, I was worried that faculty of mine who were only mildly interested in investigating DH (and perhaps then only after my pleading) would be discouraged or uninterested in engaging with it if a THATCamp was their first experience.

Ryan mentioned this in a tweet yesterday.

All-twitter-hell broke loose.

Well, not really. In fact it was a really fascinating and intellectually complex conversation, one I wouldn’t have thought could happen via Twitter. I won’t try to replicate that conversation completely here; you can go to Ryan’s Twitter feed and find the essentials for yourself. It was clear, though, that Ryan’s tweet had touched what amounted to a raw digital nerve. Some twitterers were flabbergasted that anyone would find a THATCamp too daunting or that it could ever be alienating. Others assumed that the problem must have been with me, that I was too shy to ask for help. Ultimately the conversation turned to a pretty serious engagement with the question of whether there were genuinely insider and exclusive groups and hierarchies within DH.

As a “noob”–which I discovered in the course of the Twitter conversation yesterday is what I am–I am here to say without a hint of condemnation, “Yes, yes, yes there are.”

For me, this is not a moral or even a political statement, though it was very clear to me that for many people in the conversation this was a moral or political concern. To admit to hierarchies and exclusivity was a betrayal of the collaborative and radically democratic spirit that many feel is at the heart of DH work. I will say that these collaborative aspects are part of what most attracts me to what’s going on in DH, as little as I actually do know. I see it as a superb fit for some of the commitments my school has to the public humanities and to public service more generally, besides moving students into more collaborative learning environments that will be useful to them in the world they are entering.

However, any academic discourse that is imaginable, maybe any discourse that is imaginable at all, operates by exclusion and inclusion, simply given the facts that there are those who know the language and those who do not, those who are literate in the language and those who are not, those who are fluent in the language and those who are not, and those who are creators in, with, and of the language and those who are not. It is impossible for me to imagine how this could be otherwise.

The reason DH can be difficult and alienating for beginners like me is that we don’t know enough of the language even to know what to ask for. I will say I mused over the question of whether I had just been too shy to ask for help at the THATCamp. Being a shrinking violet is not really a quality that will get you terribly far in administration, so I doubt it, but it may be that I could have asked for more help. The problem was, I felt so lost that I wasn’t entirely sure what kind of help to ask for. This is a basic function of discourse: to understand the parameters of the language games you are playing, to know what questions to ask, what moves to make and when, and where to go for the vocabulary you need. It’s why you need consultants like Ryan, or teachers who are in the know. It’s the rationale for the title of my post, referencing Gerald Graff’s Clueless in Academe. DH is obviously a part of academe, even in its alt-academic forms, and it is increasingly central to academic work in the humanities, and there are an awful lot of people who are clueless about where to begin.

There is nothing morally or politically wrong with this or with being a part of an in-group. To say there is would be to say there is something morally or politically wrong with being alive. Hyper-Calvinists aside, I don’t think this is a tenable position.

The problem, however, from an administrator’s point of view–and I speak into this conversation primarily as an administrator who is trying to facilitate the work of others and promote the well-being of our students–is that the pathways toward accessing the language and practices of this world aren’t always terribly clear. Indeed, ironically, I think some of the laudable democratic ethos in DH work and culture may contribute to this obscurity. Because a THATCamp–and so much other DH work–is so democratically organized, one experience, conference, or workshop may in fact work really well for rank beginners, while another may require attendees to be a little more versed in the basics before attending.

For me as a person and as a thinker, that’s fine.  I actually look forward to going to another THATCamp someday, even if I am just as lost as I was the first time around. My tenure no longer depends upon it–which gives me a freedom my junior faculty do not have.

However, as an administrator, I find that democratic quality a disaster as I consider what kinds of professional development efforts to support with my faculty. I would not be able to tell whether a particular experience would be appropriate for a rank beginner who is hesitantly interested, or at least willing to give this a try. Alternatively, I wouldn’t be able to know ahead of time whether a particular experience would be appropriate for a more advanced colleague, who might go and get a rehash of the basics she already knows. My ability to manage my budgets responsibly is hampered by my inability to gauge what kinds of professional development experiences I should pursue or promote with colleagues who are at very different places in their experience of, and expertise in, DH methodologies and practices.

The traditional life of a humanist academic is elitist in its own obvious ways, with its own arcana and exclusionary practices. But the pathway toward access to its languages is fairly well marked, even if it is now successfully travelled by only the very lucky few. I could tell junior faculty members 10 years ago that if they wanted to succeed at my college they needed to do three or four things, and I could outline how they should go about doing them. I don’t sense that kind of pathway to DH work yet, so while I want mightily to get my faculty more involved with some of these efforts, I’m also aware that without a clearer path for their own professional development, I may be as likely to facilitate confusion as to promote professional development.

This problem may simply disappear as DH becomes more and more central to the humanist enterprise, but I suspect that as it does, the pathways to access will have to become more and more clearly marked. This means the development of disciplinary (or quasi-disciplinary–I am aware of the angst over thinking of DH as a discipline) protocols and expectations and, as importantly, the expected means by which those elements of professional life are effectively accessed by beginners.

This means the recognition of certain gateways and the appointment of their gatekeepers, which all smacks a little bit of hierarchy and exclusion.  However, while it’s true that roadmaps undemocratically dominate a landscape, they also get you where you need to go.  And while gateways mark a boundary, they also let you in.

Why digital humanities is already a basic skill, not just a specialist niche–Matthew Kirschenbaum

Sometimes I think we humanists “of a certain age,” to put the issue politely, imagine digital humanities as an optional activity carried on by an interesting niche of young professors who take their place in the academy as yet another sub-discipline, something that research universities hire for and small colleges struggle hopelessly to replicate. It may indeed be that small colleges will struggle to integrate digital humanities into their own infrastructures, but I think the general picture of Digital Humanities as an optional sub-discipline will simply be unsustainable. The argument smells a little of the idea that e-books are a nice sub-genre of texts, but not something the average humanist has to worry that much about. I think, to the contrary, that digital humanities, and the multitude of techniques it entails, will become integrated in a fundamental way with the basic methodologies of how we go about doing business, akin to knowing how to do close reading or how to maneuver our way through libraries.

Although pointing out this fact is not his main point, Matthew Kirschenbaum–already a Digital Humanities patron saint in many respects–has an essay in The Chronicle that points to it. Kirschenbaum is currently interested in how we preserve digital material, and the problems are just as complex as, if not more complex than, the general question of how and when to save print materials. More so to the degree that we cannot be sure that the current forms in which we place our digital intelligence will actually be usable five years from now. The consequences for humanities research and writing are profound and must be considered. From Kirschenbaum:

Digital preservation is the sort of problem we like to assume others are thinking about. Surely someone, somewhere, is on the job. And, in lots of ways, that is true. Dire warnings of an approaching “digital dark ages” appear periodically in the media: Comparisons are often made to the early years of cinema—roughly half of the films made before 1950 have been lost because of neglect. 

But the fact is that enormous resources—government, industry, and academic—are being marshaled to attack the problem. In the United States, for example, the Library of Congress has been proactive through its National Digital Information Infrastructure and Preservation Program. Archivists of all stripes now routinely receive training in not only appraisal and conservation of digital materials but also metadata (documentation and description) and even digital forensics, through which we can stabilize and authenticate electronic records. (I now help teach such a course at the University of Virginia’s renowned Rare Book School.) Because of the skills of digital archivists, you can read former presidents’ e-mail messages and examine at Emory University Libraries a virtual recreation of Salman Rushdie’s first computer. Jason Scott’s Archive Team, meanwhile, working without institutional support, leaps into action to download and redistribute imperiled Web content.

What this suggests is that Rushdie’s biographers will have to know not so much how to sift through piles of letters as how to recreate digital archives that authors themselves may not be interested in preserving. Biographers of the present, and surely of the future, will have to be digital technicians as well as close readers of the digital archives they are able to recover.

Kirschenbaum goes on to suggest that most of us must do this work on our own and for ourselves, preserving our own archives.

But despite those heroic efforts, most individuals must still be their own digital caretakers. You and I must take responsibility for our own personal digital legacy. There are no drive-through windows (like the old photo kiosks) where you can drop off your old floppies and pick up fresh files a day or two later. What commercial services are available tend to assume data are being recovered from more recent technology (like hard drives), and these also can be prohibitively expensive for average consumers. (Organizations like the Library of Congress occasionally sponsor public-information sessions and workshops to teach people how to retrieve data from old machines, but those are obviously catch as catch can.)

Research shows that many of us just put our old disks, CD’s, and whatnot into shoeboxes and hope that if we need them again, we’ll figure out how to retrieve the data they contain when the time comes. (In fact, researchers such as Cathy Marshall, at Microsoft Research, have found that some people are not averse to data loss—that the mishaps of digital life provide arbitrary and not entirely unwelcome opportunities for starting over with clean slates.)

This last, of course, is an interesting problem. Authors have often been notoriously averse to having their mail probed and prodded for signs of conflicts and confessions, preferring that the “work” stand on its own. Stories of authors burning their letters and manuscripts are legion, nightmarishly so for the literary scholar. Such literary self-immolations are both harder and easier in a digital world. My drafts and emails can disappear at the touch of a button. On the other hand, I am told that a hard drive is never actually erased for those who are really in the know. Then again, the task of the scholar who sees a writer’s computer as his archive is in some ways vastly more difficult than that of the scholar whose writer was an assiduous collector of his typewritten drafts. Does every deletion and spell-correction count as a revision? What should we trace as an important change, and what should we disregard as detritus? These are, of course, the standard archival questions, but it seems to me they are exponentially more complicated in a digital archive, where a text may change a multitude of times in a single sitting, something not so possible in a typewritten world.
Well, these are the kinds of things Kirschenbaum takes up.  And having the tools to apply to such questions will be the task for every humanist in the future, not a narrow coterie.

In Praise of Reviews, Reviewing, and Reviewers

I think if I were born again by the flesh and not the spirit, I might choose to become a book reviewer in my second life. Perhaps this is “true confessions,” since academics and novelists alike share a disdain for the review as a subordinate piece of work, and for the reviewer as a lowly creature to be scorned. I, however, love the review as a form and see it as a way of exercising creativity, rhetorical facility, and critical consciousness. In other words, with reviews I feel like I bring together all the different parts of myself: the creativity and rhetorical facility I developed through an MFA, and the critical consciousness of my scholarly self developed in graduate school at Duke. I developed my course on book reviewing here at Messiah College precisely because I think the review is one of the most challenging forms to do well: it requires writing engagingly and persuasively for a generally educated audience while also writing with enough informed intelligence to satisfy an academic one.

Like Jeffrey Wasserstrom in the Chronicle Review, I also love reading book reviews, and often spend vacation days catching up not on the latest novel or theoretical tome but on all the book reviews I’ve seen and collected on Instapaper. Wasserstrom’s piece goes against the grain of a lot of our thinking about book reviews, even mine, and it strikes me that he’s absolutely right about a lot of what he says. First, I often tell students that one of the primary purposes of book reviewers is to help sift wheat from chaff and tell other readers what out there is worth the reading. This is true, but only partially so.

Another way my thinking diverges from Lutz’s relates to his emphasis on positive reviews’ influencing sales. Of course they can, especially if someone as influential as, say, Michiko Kakutani (whose New York Times reviews I often enjoy) or Margaret Atwood (whose New York Review of Books essays I never skip) is the one singing a book’s praises. When I write reviews, though, I often assume that most people reading me will not even consider buying the book I’m discussing, even if I enthuse. And as a reader, I gravitate toward reviews of books I don’t expect to buy, no matter how warmly they are praised.

Consider the most recent batch of TLS issues. As usual, I skipped the reviews of mysteries, even though these are precisely the works of fiction I tend to buy. And I read reviews of nonfiction books that I wasn’t contemplating purchasing. For instance, I relished a long essay by Toby Lichtig (whose TLS contributions I’d enjoyed in the past) that dealt with new books on vampires. Some people might have read the essay to help them decide which Dracula-related book to buy. Not me. I read it because I was curious to know what’s been written lately about vampires—but not curious enough to tackle any book on the topic.

What’s true regarding vampires is—I should perhaps be ashamed to say—true of some big fields of inquiry. Ancient Greece and Rome, for example. I like to know what’s being written about them but rarely read books about them. Instead, I just read Mary Beard’s lively TLS reviews of publications in her field.

Reviews do influence my book buying—just in a roundabout way. I’m sometimes inspired to buy books by authors whose reviews impress me. I don’t think Lichtig has a book out yet, but when he does, I’ll buy it. The last book on ancient Greece I purchased wasn’t one Mary Beard reviewed but one she wrote.

I can only say yes to this. It’s very clear that I don’t just read book reviews in order to make decisions as a consumer. I read book reviews because I like them for themselves, if they are well done, but also just to keep some kind of finger on the pulse of what’s going on. In other words, there’s a way in which I depend on good reviewers not to tell me what to buy, but to read in my place, since I can’t possibly read everything. I can remain very glad, though, that some very good reader-reviewers are out there reading the many good things there are to read. I need them so I have a larger sense of the cultural landscape than I could possibly achieve by trying to read everything on my own.

Wasserstrom also champions the short review, and speculates on the tweeted review and its possibilities:

I’ve even been musing lately about the potential for tweet-length reviews. I don’t want those to displace other kinds, especially because they can too easily seem like glorified blurbs. But the best nuggets of some reviews could work pretty well within Twitter’s haiku-like constraints. Take my assessment of Kissinger’s On China. When I reviewed it for the June 13 edition of Time’s Asian edition, I was happy that the editors gave me a full-page spread. Still, a pretty nifty Twitter-friendly version could have been built around the best line from the Time piece: “Skip bloated sections on Chinese culture, focus on parts about author’s time in China—a fat book w/ a better skinnier one trying to get out.”

The basic insight here is critical. Longer is not always better; I’m even tempted to say not often better, the usual length of posts to this blog notwithstanding. My experiences on Facebook suggest to me that we may be in a new era of the aphorism, one that may exalt the wit of the 18th century, in which the pithy riposte could be more telling than the blowsy dissertation.

A new challenge for my students in the next version of my book-reviewing class: write a review that is telling, accurate, and rhetorically effective in 160 characters or less.

Create or Die: Humanities and the Boardroom

A couple of years ago, when the poet and former head of the NEA Dana Gioia came to Messiah College to speak to our faculty and the honors program, I asked him whether he felt there were connections between his work as a poet and his other careers as a high-level government administrator and business executive. Gioia hemmed and hawed a bit, but finally said no; he thought he was probably working out of two different sides of his brain.

I admit to thinking the answer was a little uncreative for so creative a person. I’ve always been a bit bemused by and interested in the various connections I make between the different parts of myself. Although I’m not a major poet like Gioia, I did complete an MFA and continue to think of myself in one way or another as a writer, regardless of what job I happen to be doing at the time. And I actually think working on an MFA for those two years back in Montana has had a positive effect on my life as an academic and my life as an administrator. Some day I plan to write an essay on the specifics, but it is partially related to the notion that my MFA did something to cultivate my creativity, to use the title of a very good article in the Chronicle Review by Steven Tepper and George Kuh. According to Tepper and Kuh,

To fuel the 21st-century economic engine and sustain democratic values, we must unleash and nurture the creative impulse that exists within every one of us, or so say experts like Richard Florida, Ken Robinson, Daniel Pink, Keith Sawyer, and Tom Friedman. Indeed, just as the advantages the United States enjoyed in the past were based in large part on scientific and engineering advances, today it is cognitive flexibility, inventiveness, design thinking, and nonroutine approaches to messy problems that are essential to adapt to rapidly changing and unpredictable global forces; to create new markets; to take risks and start new enterprises; and to produce compelling forms of media, entertainment, and design.

Tepper and Kuh, a sociologist and a professor of education respectively, argue forcefully that creativity is not doled out by the forces of divine or genetic fate, but is something that can be learned through hard work. This is an idea supported by recent findings that what musical and other artistic prodigies share is not so much genetic markers as practice: according to Malcolm Gladwell, 10,000 hours of it. Tepper and Kuh believe the kinds of practice and discipline necessary for cultivating a creative mindset are deeply present in the arts, but only marginally present in many other popular disciplines on campus.

Granted, other fields, like science and engineering, can nurture creativity. That is one reason collaborations among artists, scientists, and engineers spark the powerful breakthroughs described by the Harvard professor David Edwards (author of Artscience, Harvard University Press, 2008); Xerox’s former chief scientist, John Seely Brown; and the physiologist Robert Root-Bernstein. It is also the case that not all arts schools fully embrace the creative process. In fact, some are so focused on teaching mastery and artistic conventions that they are far from hotbeds of creativity. Even so, the arts might have a special claim to nurturing creativity.

A recent national study conducted by the Curb Center at Vanderbilt University, with Teagle Foundation support, found that arts majors integrate and use core creative abilities more often and more consistently than do students in almost all other fields of study. For example, 53 percent of arts majors say that ambiguity is a routine part of their coursework, as assignments can be taken in multiple directions. Only 9 percent of biology majors say that, 13 percent of economics and business majors, 10 percent of engineering majors, and 7 percent of physical-science majors. Four-fifths of artists say that expressing creativity is typically required in their courses, compared with only 3 percent of biology majors, 16 percent of economics and business majors, 13 percent of engineers, and 10 percent of physical-science majors. And arts majors show comparative advantages over other majors on additional creativity skills—reporting that they are much more likely to have to make connections across different courses and reading; more likely to deploy their curiosity and imagination; more likely to say their coursework provides multiple ways of looking at a problem; and more likely to say that courses require risk taking.

Tepper and Kuh focus on the arts for their own purposes, and I realized that in thinking about the ways an MFA had helped me, I was thinking about an arts discipline in many respects. However, the fact that the humanities are nowhere in their article set me to thinking. How do the humanities stack up in cultivating creativity, and is this a selling point for parents and prospective students to consider as they imagine what their kids should study in college? These are my reflections on the seven major “creative” qualities that Tepper and Kuh believe we should cultivate, and how the humanities might do in each of them.

  1. the ability to approach problems in nonroutine ways using analogy and metaphor;
    • I think the humanities excel in this area for a variety of reasons.  For some disciplines like English and Modern Languages, we focus on the way language works with analogy and metaphor and we encourage its effective use in writing.  But more than that, I think many humanities disciplines can encourage nonroutine ways of approaching problems—though, of course, many professors in any discipline can be devoted to doing the same old things the same old way.
  2. conditional or abductive reasoning (posing “what if” propositions and reframing problems);
    • Again, I think a lot of our humanities disciplines approach things in this fashion, to some degree because of a focus on narrative. One way of getting at how a narrative works and understanding its meaning is to pose alternative scenarios. What if we had not dropped the bomb on Japan? What if Lincoln had not been shot? How would different philosophical assumptions lead to different outcomes to an ethical problem?
  3. keen observation and the ability to see new and unexpected patterns;
    • I think we especially excel in this area.  Most humanities disciplines are deeply devoted to close and careful readings that discover patterns that are beneath the surface.  The patterns of imagery in a poem, the connections between statecraft and everyday life, the relationship between dialogue and music in a film.  And so forth.  Recognizing patterns in problems can lead to novel and inventive solutions.
  4. the ability to risk failure by taking initiative in the face of ambiguity and uncertainty;
    • I’m not sure whether the humanities put a premium on this inherently or not. I know a lot of professors (or at least I as a professor) prefer that their students take a risk in doing something different on a project and have it not quite work than do the predictable thing. But I don’t know that this is something inherent to our disciplines. I do think, however, that our disciplines inculcate a high level of comfort with uncertainty and ambiguity. Readings of novels or the complexities of history require us not to go for easy solutions but to recognize and work within the ambiguity of situations.
  5. the ability to heed critical feedback to revise and improve an idea;
    • Again, I would call this a signature strength, especially as it manifests itself in writing in our disciplines and in the back-and-forth exchange between students, and between student and teacher: being willing to risk an idea or an explanation, have it critiqued, and then revise one’s thinking in response to more persuasive arguments.
  6. a capacity to bring people, power, and resources together to implement novel ideas; and
    • I admit that on this one I think the humanities might be weaker than we should be, at least as manifested in our traditional ways of doing things. Humanists in most of our disciplines are notorious individualists who would rather spend their time in libraries. We don’t get much practice at gathering people together with the necessary resources to work collaboratively. This can happen, but instinctively humanists often want to be left alone. This is an area where I think a new attention to collaborative and project-based learning could help the humanities a great deal, something we could learn from our colleagues in the sciences and in arts like theatre and music. I’m hopeful that some of the new attention we are giving to digital humanities at Messiah College will lead us in this direction.
  7. the expressive agility required to draw on multiple means (visual, oral, written, media-related) to communicate novel ideas to others.
    • A partial strength. I do think we inculcate good expressive abilities in written and oral communication, and we value novel ideas. Except for our folks in film and communication—as well as our cousins in art history—we are less good at working imaginatively with visual and other media-related resources. This has been one of my priorities as dean, but something that’s hard to generate from the ground up.

Well, that’s my take.  I wonder what others think?