Category Archives: Humanities

Teaching (to the) Tests

I was a little appalled to find out last week that my son is spending multiple days taking PSSA standardized tests in his school, a ritual that he shrugs his shoulders and rolls his eyes at, a teenager’s damning assessment that says wordlessly that this-is-just-one-more-stupid-thing-adults-make-me-do-for-my-own-good.

Teenage boys are notoriously bored with school and sneer at book-learning.  Think of Huckleberry Finn as only one great American example.  So I think my son’s attitude about school is not particularly new.  But whereas Twain’s hero, in a great romantic tradition, saw books themselves as corrupt and the learning they provided as a corrupter of virtuous instincts, nothing could be further from the truth for my son.  He reads ravenously, much more than I did at his age, which is saying quite a bit.  And the level and range of his interests astonish me, from jazz poetry and history, to Hindu mysticism, to the thickest of contemporary literary novels like DeLillo’s Mao II and David Foster Wallace’s Infinite Jest.  When he bought Infinite Jest he was nearly giddy with delight.  “It’s like a giant two-fisted hamburger,” he said, laughing, holding it up to his open mouth.  “Something you can eat up.”  This same boy who sees learning as a feast when left to the devices of books and their authors and his own imagination seems to experience school as an emotional and intellectual Waste Land, a place he finds dull and dulling.

I can’t blame his teachers.  They are talented and caring.  But I can’t help wondering with a chorus of others, whether something has gone badly wrong with our approaches to schooling, approaches that trap teachers as surely as they enervate students, or at least many male students.

I tend to agree with this blog post from Diane Ravitch, whom I find to be one of our most interesting commentators on education, if only because she has become so unpredictable and yet so incisive in her analyses.  From the NYRB, “Flunking Arne Duncan”:

Will Duncan’s policies improve public education?

No. Under pressure to teach to tests—which assess only English and math skills—many districts are reducing the time available for teaching the arts, history, civics, foreign languages, physical education, and other non-tested subjects. (Other districts are spending scarce dollars to create new tests for the arts, physical education, and those other subjects so they can evaluate all their teachers, not just those who teach reading and mathematics.) Reducing the time available for the arts, history, and other subjects will not improve education. Putting more time and money into testing reduces the time and money available for instruction. None of this promotes good education. None of this supports love of learning or good character or any other ideals for education. Such a mechanistic, anti-intellectual approach would not be tolerated for President Obama’s children, who attend an excellent private school. It should not be tolerated for the nation’s children, 90 percent of whom attend public schools. Grade: F.

Of course, I care about this for my son, and I am thankful to God that he is getting through his high school education before the ax has cleaved away programs in the arts and diminished offerings in other areas.  But I also care about this as an educator and administrator in a college.  In the first place, it is impossible to imagine that we will have students who are effective writers if they have no historical sense, no understanding of the arts, less exposure to the civic virtues, and fewer opportunities to read and think in languages other than their own.  It is impossible to imagine that such students–even scoring better on a writing or a math exam–are better prepared for college than students from a generation ago, before we started worrying about their being left behind.  It is impossible to imagine that we are getting a better student simply because they can do a math problem successfully when they have never had to struggle to draw a picture or understand a symphony or reach into the past to engage the worlds of the ancients before them.
If this is the education we are giving them, we are producing neither saints nor scholars.  We may be producing a generation of students (perhaps boys especially) who are bored mindless and who will look for their learning in more likely places, like the cover of a book for which no one will think to devise a test.

Why digital humanities is already a basic skill, not just a specialist niche–Matthew Kirschenbaum

Sometimes I think we humanists “of a certain age,” to put the issue politely, imagine digital humanities as an optional activity carried on by an interesting niche of young professors who take their place in the academy in yet another sub-discipline, something that research universities hire for and small colleges struggle hopelessly to replicate.  It may indeed be that small colleges will struggle to integrate digital humanities into their own infrastructures, but I think the general picture of Digital Humanities as an optional sub-discipline will simply be unsustainable.  The argument smells a little of the idea that e-books are a nice sub-genre of texts, but not something the average humanist has to worry that much about.  I think, to the contrary, that digital humanities, and the multitude of techniques it entails, will become integrated in a fundamental way with the basic methodologies of how we go about doing business, akin to knowing how to do close reading or how to maneuver our way through libraries.

Although pointing this out is not his main point, Matthew Kirschenbaum–already a Digital Humanities patron saint in many respects–has an essay in The Chronicle that makes the case.  Kirschenbaum is currently interested in how we preserve digital material, and the problems are just as complex, if not more so, than the general question of how and when to save print materials.  More so to the degree that we cannot be sure that the current forms in which we place our digital intelligence will actually be usable five years from now.  The consequences for humanities research and writing are profound and must be considered.  From Kirschenbaum:

Digital preservation is the sort of problem we like to assume others are thinking about. Surely someone, somewhere, is on the job. And, in lots of ways, that is true. Dire warnings of an approaching “digital dark ages” appear periodically in the media: Comparisons are often made to the early years of cinema—roughly half of the films made before 1950 have been lost because of neglect. 

But the fact is that enormous resources—government, industry, and academic—are being marshaled to attack the problem. In the United States, for example, the Library of Congress has been proactive through its National Digital Information Infrastructure and Preservation Program. Archivists of all stripes now routinely receive training in not only appraisal and conservation of digital materials but also metadata (documentation and description) and even digital forensics, through which we can stabilize and authenticate electronic records. (I now help teach such a course at the University of Virginia’s renowned Rare Book School.) Because of the skills of digital archivists, you can read former presidents’ e-mail messages and examine at Emory University Libraries a virtual recreation of Salman Rushdie’s first computer. Jason Scott’s Archive Team, meanwhile, working without institutional support, leaps into action to download and redistribute imperiled Web content.

What this suggests is that Rushdie’s biographers will need to know not so much how to sift through piles of letters as how to recreate digital archives that authors themselves may not be interested in preserving.  Biographers of the present, and surely of the future, will have to be digital technicians as well as close readers of the digital archives they are able to recover.
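The “stabilize and authenticate” work Kirschenbaum describes often begins with something humble: a fixity check, recording a cryptographic checksum of a file at the moment of ingest so any later alteration can be detected.  A minimal sketch in Python’s standard library (the file name in the comment is my invention, not anything from an actual archive):

```python
import hashlib

def fixity_checksum(path, algorithm="sha256", chunk_size=65536):
    """Compute a checksum of a file so its integrity can be re-verified later."""
    digest = hashlib.new(algorithm)
    with open(path, "rb") as f:
        # Read in chunks so even very large disk images can be hashed.
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# An archivist records the checksum at ingest, e.g.
#   original = fixity_checksum("rushdie_draft_001.doc")   # hypothetical file
# and re-computes it years later; any mismatch means the record has changed.
```

The point is not the particular algorithm but the discipline: the checksum travels with the record, and re-verification becomes routine rather than heroic.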

Kirschenbaum goes on to suggest that most of us must do this work on our own, for ourselves, in preserving our own archives.

But despite those heroic efforts, most individuals must still be their own digital caretakers. You and I must take responsibility for our own personal digital legacy. There are no drive-through windows (like the old photo kiosks) where you can drop off your old floppies and pick up fresh files a day or two later. What commercial services are available tend to assume data are being recovered from more recent technology (like hard drives), and these also can be prohibitively expensive for average consumers. (Organizations like the Library of Congress occasionally sponsor public-information sessions and workshops to teach people how to retrieve data from old machines, but those are obviously catch as catch can.)

Research shows that many of us just put our old disks, CD’s, and whatnot into shoeboxes and hope that if we need them again, we’ll figure out how to retrieve the data they contain when the time comes. (In fact, researchers such as Cathy Marshall, at Microsoft Research, have found that some people are not averse to data loss—that the mishaps of digital life provide arbitrary and not entirely unwelcome opportunities for starting over with clean slates.)

This last, of course, is an interesting problem.  Authors have often been notoriously averse to having their mail probed and prodded for signs of conflicts and confessions, preferring that the “work” stand on its own.  Stories of authors burning their letters and manuscripts are legion, nightmarishly so for the literary scholar.  Such literary self-immolations are both harder and easier in a digital world.  My drafts and emails can disappear at the touch of a button.  On the other hand, I am told that a hard drive is never actually erased for those who are really in the know.  Then again, the task of the scholar who sees a writer’s computer as his archive is in some ways vastly more difficult than that of the scholar whose writer was an assiduous collector of his type-written drafts.  Does every deletion and spell-correction count as a revision?  What should we trace as an important change, and what should we disregard as detritus?  These are, of course, the standard archival questions, but it seems to me they are exponentially more complicated in a digital archive, where a text may change a multitude of times in a single sitting, something not so possible in a typewritten world.
Well, these are the kinds of things Kirschenbaum takes up.  And having the tools to apply to such questions will be the task for every humanist in the future, not a narrow coterie.
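At bottom, deciding what counts as a revision is a comparison problem, and even the standard library of a scripting language gives the scholar a starting point.  A sketch using Python’s difflib (the two “draft” sentences are invented for illustration):

```python
import difflib

draft_1 = "The archive is a place of memory and loss."
draft_2 = "The digital archive is a site of memory, loss, and recovery."

# unified_diff lists what was removed (-) and added (+) between two
# saved states of a draft, the way a scholar might want to see them.
diff = list(difflib.unified_diff(
    draft_1.splitlines(), draft_2.splitlines(),
    fromfile="draft_1", tofile="draft_2", lineterm=""))
for line in diff:
    print(line)

# SequenceMatcher gives a rough similarity ratio between 0 and 1:
# how much of the earlier draft survived the revision?
ratio = difflib.SequenceMatcher(None, draft_1, draft_2).ratio()
```

A tool like this cannot say which changes are significant and which are detritus; it can only make every change visible, which is where the archival judgment begins.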

Michael Sandel–The commodification of everything

I’ve enjoyed listening occasionally to Michael Sandel’s lectures in philosophy via iTunes.  He has an interesting new article in the April Atlantic focusing on the ways in which nearly everything in American life, at least, has been reduced to a market value.  Despite the admonition that money can’t buy me love, we are pretty sure that it can buy everything else, and that we are willing to sell just about anything, including body parts and personal dignity, for whatever the market will bear.

Sandel, somewhat peculiarly to my mind, traces this to a post-Cold War phenomenon.

WE LIVE IN A TIME when almost everything can be bought and sold. Over the past three decades, markets—and market values—have come to govern our lives as never before. We did not arrive at this condition through any deliberate choice. It is almost as if it came upon us.

As the Cold War ended, markets and market thinking enjoyed unrivaled prestige, and understandably so. No other mechanism for organizing the production and distribution of goods had proved as successful at generating affluence and prosperity. And yet even as growing numbers of countries around the world embraced market mechanisms in the operation of their economies, something else was happening. Market values were coming to play a greater and greater role in social life. Economics was becoming an imperial domain. Today, the logic of buying and selling no longer applies to material goods alone. It increasingly governs the whole of life.

The last-gasp Marxists I studied with at Duke had a word for what Sandel sees and descries: commodification.  And it didn’t mysteriously just come upon us in the 1980s.  Commodification, the rendering of every bit of life as a commodity that can be bought and sold, is the central thrust of capitalist economies in the 20th century, perhaps the central feature of capitalism per se.  The essential act of commodification is at the center of Marx’s understanding that the worker in some very real sense sells him or herself through selling his or her labor power.  Thus, human beings were commodified well before people became willing to sell tattoos on their foreheads to advertise products.  So Sandel’s perplexity and astonishment at this state of affairs in our contemporary economy strikes me as the perplexity of someone who has only recently awakened from a dream.

On the other hand, I do think Sandel is on to something.  It is the case that despite this thrust of capitalist economies (and, to be frank, I’m not sure that Marxist economies were all that different), there have been sectors of culture and their accompanying institutions that resisted their own commodification.  The edifice of modernism in the arts and literature is built on the notion that the arts could be a transcendent world apart from the degradations of the social world, including perhaps especially its markets.  The difficulty and density of modern art and literature was built in part out of a desire that it not be marketable in any typical sense.  Modern art was sometimes ugly precisely to draw attention to the difficulty and difference of its aesthetic and intellectual properties.  It was meant not to sell, or at least not to sell too well.  Remember that the next time a Picasso sells for millions.  Similarly, the church and in a different way educational institutions retained a relative independence from the marketplace, or at least resisted the notion that they could be reduced to market forces.  Whether claiming to provide access to the sacred or to enduring human values, religious institutions and educational institutions served–even when they were corrupt or banal–to remind the culture that there was a world apart, something that called us to be better than ourselves, or at least reminded us that our present values were not all the values that there were.

Sandel rightly notes that that residue has all but disappeared, and the result has been a hollowing out of our public life and a debasement of our humanity.

In hopes of avoiding sectarian strife, we often insist that citizens leave their moral and spiritual convictions behind when they enter the public square. But the reluctance to admit arguments about the good life into politics has had an unanticipated consequence. It has helped prepare the way for market triumphalism, and for the continuing hold of market reasoning.

In its own way, market reasoning also empties public life of moral argument. Part of the appeal of markets is that they don’t pass judgment on the preferences they satisfy. They don’t ask whether some ways of valuing goods are higher, or worthier, than others. If someone is willing to pay for sex, or a kidney, and a consenting adult is willing to sell, the only question the economist asks is “How much?” Markets don’t wag fingers. They don’t discriminate between worthy preferences and unworthy ones. Each party to a deal decides for him- or herself what value to place on the things being exchanged.

This nonjudgmental stance toward values lies at the heart of market reasoning, and explains much of its appeal. But our reluctance to engage in moral and spiritual argument, together with our embrace of markets, has exacted a heavy price: it has drained public discourse of moral and civic energy, and contributed to the technocratic, managerial politics afflicting many societies today.

Sandel wonders about a way to reconnect public life to some kind of moral discourse, something that will extend beyond the reach of markets, but he clearly despairs that such a connection can be found.  I think there’s good reason.  Our educational institutions have rapidly become factories that shamelessly advertise themselves as places where people can make themselves into better commodities than they were before, and which build programs designed to sell themselves to the highest number of student-customers possible.  Our religious institutions are floundering.  Only today I read in Time magazine that the rise of so-called “nones”–people who claim to have no religious affiliation–is one of the most notable developments in our spiritual culture.  Such people often seek to be spiritual but not religious on the grounds that religions are dogmatic and inflexible.  I have come to wonder whether that dogmatism and inflexibility points to the hard-won truth that it is not good enough to just go along to get along.

One wonders, in fact, whether a spirituality based on getting along really provides a hard point of resistance to the tendency to see everything in life–whether my beliefs or my ethics–as an investment that must pay off if it is to be worth keeping.  I wonder, too, whether our educational systems and institutions are up to the task of providing an education that isn’t just another instance of the market.  As for art, writing, and literature?  Well, who knows?  Modernism was not always commodified, though it very quickly became so.  I do find it intriguing that this point of hyper-commodification is also a time when there has been an explosion of free or relatively free writing and music on the internet.  There is a small return to the notion of the artist as a community voice, with musicians and poets producing work for free on the internet and making their living through performance or through other jobs–escaping, or at least partially escaping, the notion that we produce work primarily to sell it.  This is a small resistance, but worth thinking about.

I wonder if there are other ways our culture is equipped to resist, in a larger collective fashion, the turning of our lives into the image of a can of soup.

Living in an e-plus world: Students now prefer digital texts when given a choice

A recent blog post by Nick DeSantis in the Chronicle points to a survey by the Pearson Foundation suggesting that tablet ownership is on the rise.  That’s not surprising, but more significant is the fact that among tablet users there’s a clear preference for digital texts over the traditional paper codex, something we haven’t seen before even among college students of this wired generation:

One-fourth of the college students surveyed said they owned a tablet, compared with just 7 percent last year. Sixty-three percent of college students believe tablets will replace textbooks in the next five years—a 15 percent increase over last year’s survey. More than a third said they intended to buy a tablet sometime in the next six months.

This year’s poll also found that the respondents preferred digital books over printed ones. It’s a reversal of last year’s results and goes against findings of other recent studies, which concluded that students tend to choose printed textbooks. The new survey found that nearly six in 10 students preferred digital books when reading for class, compared with one-third who said they preferred printed textbooks.

I find this unsurprising, as it matches up pretty well with my own experience.  Five years ago I could never have imagined doing any significant reading on a tablet.  Now I do all my reading of scholarly journals and long-form journalism–e.g. The Atlantic, the New York Review of Books, The Chronicle Review–on my iPad.  And while I still tend to prefer the codex for reading novels and other book-length works, the truth is that that preference is slowly eroding as well.  As I become more familiar with the forms of e-reading, the notions of its inherent inferiority, like any unreflective prejudice, gradually fade in the face of familiarity.

And yet I greet the news of this survey with a certain level of panic, not panic that it should happen at all, but panic that the pace of change is quickening and we are hardly prepared–by we I mean we in the humanities, here in small colleges and elsewhere.  I’ve blogged on more than one occasion about my doubts about e-books and yet my sense of their inevitable ascendancy.  For instance, here on the question of whether e-books are being foisted on students by a cabal of publishers and administrators like myself out to save a buck (or make a buck, as the case may be), and here on the nostalgic but still real feeling I have that print codex forms of books possess an irreplaceable individuality and physicality that the mere presence of text in a myriad of e-forms does not suffice to replace.

But though I’ve felt the ascendancy of e-books was inevitable, I think I imagined a 15- or 20-year span in which print and e-books would mostly live side by side.  Our own librarians here at Messiah College talk about a “print-plus” model for libraries, as if e-books will remain primarily an add-on for some time to come.  I wonder.  Just as computing power increases exponentially, it seems to me that the half-life of print books is rapidly diminishing.  I now wonder whether we will have five years before students will expect their books to be digital–all their books, not just their hefty tomes for CHEM 101 that can be more nicely illustrated with iBooks Author, but also their books for English and History classes as well.  This is an “e-plus” world where print will increasingly not be the norm, but the supplement to fill whatever gaps e-books have not yet bridged, whatever textual landscapes have not yet been digitized.

Despite warnings, we aren’t yet ready for an e-plus world.  Not only do we not know how to operate the apps that make these books available, we don’t even know how to critically study books in tablet form.  Yet learning what forms of critical engagement are possible and necessary will be required.  I suspect, frankly, that our current methods developed out of what was made possible by the forms that texts took, rather than forms following our methodological urgencies.  This means that the look of critical study in the classroom will change radically in the next ten years.  What will it look like?

The United States–Land of the Second Language Illiterates

Approximately 80% of American citizens are monolingual, and the largest part of the rest are immigrants and their children.  This statistic comes from Russell Berman, outgoing president of the MLA, in his valedictory message to the MLA convention in Seattle this past January.  Berman’s address is a rousing and also practical defense of the humanities in general, and especially of the role of second-language learning in developing a society fully capable of engaging the global culture within which it is situated.  From his address:

Let us remember the context. According to the National Foreign Language Center, some 80% of the US population is monolingual. Immigrant populations and heritage speakers probably make up the bulk of the rest. Americans are becoming a nation of second-language illiterates, thanks largely to the mismanagement of our educational system from the Department of Education on down. In the European Union, 50% of the population older than 15 reports being able to carry on a conversation in a non-native language, and the EU has set a goal of two non-native languages for all its citizens.

Second language learning enhances first language understanding: many adults can recall how high school Spanish, French, or German—still the three main languages offered—helped them gain a perspective on English—not only in terms of grammar but also through insights into the complex shift in semantic values across cultural borders. For this reason, we in the MLA should rally around a unified language learning agenda: teachers of English and teachers of other languages alike teach the same students, and we should align our pedagogies to contribute cooperatively to holistic student learning. We are all language teachers. For this reason, I call on English departments to place greater importance on second language knowledge, perhaps most optimally in expectations for incoming graduate students. Literature in English develops nowhere in an English-only environment; writing in any language always takes place in a dialectic with others. With that in mind, I want to express my gratitude to the American Studies Association for recently adopting a statement supportive of the MLA’s advocacy for language learning.

Berman goes on to recognize, however, that this strong and reasonable call for educated people to be conversant in more than one language is largely sent echoless into the void.  Less than 0.1% of the discretionary budget of the Department of Education goes to support language learning.  Indeed, I suspect this is because so many of us, even in higher education, are dysfunctional in a second language.  I often tell people grimly that I can ask how to go to the bathroom in four different languages.

Nevertheless, in an age when we call for global engagement, imagine the importance of preparedness for a global marketplace, and want our students to be citizens of the world, it is irresponsible to continue to imagine that the world will conveniently learn to speak English for our sakes.


Should Humanities students learn to code?

One of the big questions on our minds in the digital humanities working group is whether and to what degree humanities students (and faculty!) need to have digital literacy or even fluency.  Should students be required to move beyond the ability to write a blog or create a wiki toward understanding and even implementing the digital tools that make blogs and wikis and databases possible?  This essay from Anastasia Salter takes up the issue, largely in the affirmative, in a review of Douglas Rushkoff’s Program or Be Programmed: Read more at Should Humanities students learn to code?.

Michael Hart and Project Gutenberg

I felt an unaccountable sense of loss at reading tonight in the Chronicle of Higher Education (paper edition, no less) that the founder of Project Gutenberg, Michael Hart, has died at the age of 64.  This is a little strange, since I had no idea who had founded Project Gutenberg until I read the obituary.  But Project Gutenberg I know, and I think I knew immediately when I ran across it that it was already and would continue to be an invaluable resource for readers and scholars, this even though I’ve never been much of a champion of e-books.  I guess it felt a bit like I had discovered belatedly the identity of a person who had given me a great gift and never had the chance to thank him.

One aspect of Hart’s vision for Project Gutenberg struck me in relation to some of the things I’ve been thinking about regarding the Digital Humanities.  That is Hart’s decision to go with an interface that was simple and nearly universal rather than trying to be sexy, with-it, and up to date.  Says the Chronicle:

His early experiences clearly informed his choices regarding Project Gutenberg. He was committed to lo-fi—the lowest reasonable common denominator of textual presentation. That was for utterly pragmatic reasons: He wanted his e-texts to be readable on 99 percent of the existing systems of any era, and so insisted on “Plain-Vanilla ASCII” versions of all the e-texts generated by Project Gutenberg.

That may seem a small—even retro—conceit, but in fact it was huge. From the 80s on, as the Internet slowly became more publicly manifest, there were many temptations to be “up to date”: a file format like WordStar, TeX, or LaTeX in the 1980s, or XyWrite, MS Word, or Adobe Acrobat in the 90s and 2000s, might provide far greater formatting features (italics, bold, tab stops, font selections, extracts, page representations, etc.) than ASCII. But because Mr. Hart had tinkered with technology all his life, he knew that “optimal formats” always change, and that today’s hi-fi format was likely to evolve into some higher-fi format in the next year or two. Today’s ePub version 3.01 was, to Mr. Hart, just another mile marker along the highway. To read an ASCII e-text, via FTP, or via a Web browser, required no change of the presentational software—thereby ensuring the broadest possible readership.

Mr. Hart’s choice meant that the Project Gutenberg corpus—now 36,000 works—would always remain not just available, but readable. What’s more, it has been growing, in every system since.

This is no small thing.  The ephemeral character of digital humanities projects bothers me.  By ephemeral I don’t mean they are intellectually without substance.  I think the intellectual character of the work can be quite profound.  However, the forms in which the work is done can disappear or be outdated tomorrow.  Hart’s decision to use ASCII is in some sense an effort to replicate the durability of the book.  Books, for all the fragility of paper, have a remarkable endurance and stability overall.  The basic form doesn’t change, and a book used by a reader in the Middle Ages is, more or less, still usable by me in the same fashion.  By contrast, I can’t even open some of my old files in my word processor.  I think the work I did was substantial, but the form it was placed in was not enduring.  Hart’s decision makes sense to me, but I’m not sure how it might be extended to other kinds of projects in the digital humanities.
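Hart’s bet on “Plain-Vanilla ASCII” can even be stated mechanically: a text is future-proof in his sense when every byte decodes under the oldest and most universal encoding, so that any system of any era can display it.  A small illustrative check in Python (the sample sentences here are my own, not drawn from any Gutenberg e-text):

```python
def is_plain_vanilla_ascii(data: bytes) -> bool:
    """True if the bytes are readable as 7-bit ASCII,
    Hart's lowest reasonable common denominator."""
    try:
        data.decode("ascii")
        return True
    except UnicodeDecodeError:
        # Any byte above 0x7F means the file depends on some
        # richer, and therefore more perishable, encoding.
        return False
```

A richer format fails the test the moment it embellishes the text; that is exactly the trade Hart refused to make.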

Discussion of Henry Jenkins and Lev Manovich

I’m also blogging occasionally over at digitalhumanity.wordpress.org, our site for discussing Digital Humanities at Messiah College.  You’re invited to check in and see our flailing around as we try to get our minds around whatever it is that goes on with this field and try to think about how we might contribute.  Today we had a discussion of some work by Henry Jenkins and Lev Manovich.  A few of my notes can be found here.

Create or Die: Humanities and the Boardroom

A couple of years ago, when the poet and former head of the NEA Dana Gioia came to Messiah College to speak to our faculty and the honors program, I asked him if he felt there were connections between his work as a poet and his other careers as a high-level government administrator and business executive.  Gioia hemmed and hawed a bit, but finally said no; he thought he was probably working out of two different sides of his brain.

I admit to thinking the answer was a little uncreative for so creative a person.  I’ve always been a bit bemused by and interested in the various connections that I make between the different parts of myself.  Although I’m not a major poet like Gioia, I did complete an MFA and continue to think of myself in one way or another as a writer, regardless of what job I happen to be doing at the time.  And I actually think working on an MFA for those two years back in Montana has had a positive effect on my life as an academic and my life as an administrator.  Some day I plan to write an essay on the specifics, but it is partially related to the notion that my MFA did something to cultivate my creativity, to use the title of a very good article in the Chronicle Review by Steven Tepper and George Kuh.  According to Tepper and Kuh,

To fuel the 21st-century economic engine and sustain democratic values, we must unleash and nurture the creative impulse that exists within every one of us, or so say experts like Richard Florida, Ken Robinson, Daniel Pink, Keith Sawyer, and Tom Friedman. Indeed, just as the advantages the United States enjoyed in the past were based in large part on scientific and engineering advances, today it is cognitive flexibility, inventiveness, design thinking, and nonroutine approaches to messy problems that are essential to adapt to rapidly changing and unpredictable global forces; to create new markets; to take risks and start new enterprises; and to produce compelling forms of media, entertainment, and design.

Tepper and Kuh, a sociologist and a professor of education respectively, argue forcefully that creativity is not doled out by the forces of divine or genetic fate, but is something that can be learned through hard work.  This idea is supported by recent findings that what musical and other artistic prodigies share is not so much genetic markers as practice: according to Malcolm Gladwell, 10,000 hours of it.  Tepper and Kuh believe the kinds of practice and discipline necessary for cultivating a creative mindset are deeply present in the arts, but only marginally present in many other popular disciplines on campus.

Granted, other fields, like science and engineering, can nurture creativity. That is one reason collaborations among artists, scientists, and engineers spark the powerful breakthroughs described by the Harvard professor David Edwards (author of Artscience, Harvard University Press, 2008); Xerox’s former chief scientist, John Seely Brown; and the physiologist Robert Root-Bernstein. It is also the case that not all arts schools fully embrace the creative process. In fact, some are so focused on teaching mastery and artistic conventions that they are far from hotbeds of creativity. Even so, the arts might have a special claim to nurturing creativity.

A recent national study conducted by the Curb Center at Vanderbilt University, with Teagle Foundation support, found that arts majors integrate and use core creative abilities more often and more consistently than do students in almost all other fields of study. For example, 53 percent of arts majors say that ambiguity is a routine part of their coursework, as assignments can be taken in multiple directions. Only 9 percent of biology majors say that, 13 percent of economics and business majors, 10 percent of engineering majors, and 7 percent of physical-science majors. Four-fifths of artists say that expressing creativity is typically required in their courses, compared with only 3 percent of biology majors, 16 percent of economics and business majors, 13 percent of engineers, and 10 percent of physical-science majors. And arts majors show comparative advantages over other majors on additional creativity skills—reporting that they are much more likely to have to make connections across different courses and reading; more likely to deploy their curiosity and imagination; more likely to say their coursework provides multiple ways of looking at a problem; and more likely to say that courses require risk taking.

Tepper and Kuh focus on the arts for their own purposes, and I realized that in thinking about the ways an MFA had helped me I was thinking about an arts discipline in many respects.  However, the fact that the humanities are nowhere in their article set me to thinking.  How do the humanities stack up in cultivating creativity, and is this a selling point for parents and prospective students to consider as they imagine what their kids should study in college?  What follows are my reflections on the seven major "creative" qualities that Tepper and Kuh believe we should cultivate, and how the humanities might do in each of them.

  1. the ability to approach problems in nonroutine ways using analogy and metaphor;
    • I think the humanities excel in this area for a variety of reasons.  In some disciplines, like English and Modern Languages, we focus on the ways language works through analogy and metaphor, and we encourage their effective use in writing.  But more than that, I think many humanities disciplines can encourage nonroutine ways of approaching problems—though, of course, many professors in any discipline can be devoted to doing the same old things the same old way.
  2. conditional or abductive reasoning (posing “what if” propositions and reframing problems);
    • Again, I think a lot of our humanities disciplines approach things in this fashion.  To some degree this is because of a focus on narrative.  One way of getting at how narrative works, and of understanding the meaning of the narrative at hand, is to pose alternative scenarios.  What if we had not dropped the bomb on Japan?  What if Lincoln had not been shot?  How would different philosophical assumptions lead to different outcomes for an ethical problem?
  3. keen observation and the ability to see new and unexpected patterns;
    • I think we especially excel in this area.  Most humanities disciplines are deeply devoted to close and careful readings that discover patterns beneath the surface: the patterns of imagery in a poem, the connections between statecraft and everyday life, the relationship between dialogue and music in a film, and so forth.  Recognizing patterns in problems can lead to novel and inventive solutions.
  4. the ability to risk failure by taking initiative in the face of ambiguity and uncertainty;
    • I’m not sure whether the humanities put a premium on this inherently.  I know many professors (or at least I, as a professor) would rather students take a risk by doing something different on a project and have it not quite work than do the predictable thing.  But I don’t know that this is inherent to our disciplines.  I do think, however, that our disciplines inculcate a high level of comfort with uncertainty and ambiguity.  Readings of novels or the complexities of history require us not to reach for easy solutions but to recognize and work within the ambiguity of situations.
  5. the ability to heed critical feedback to revise and improve an idea;
    • Again, I would call this a signature strength, especially as it manifests itself in writing in our disciplines and in the back-and-forth exchange among students and between student and teacher.  Students learn to risk an idea or an explanation, have it critiqued, and then revise their thinking in response to more persuasive arguments.
  6. a capacity to bring people, power, and resources together to implement novel ideas; and
    • I admit that on this one I think the humanities might be weaker than we should be, at least as manifested in our traditional ways of doing things.  Humanists in most of our disciplines are notorious individualists who would rather spend their time in libraries.  We don’t get much practice at gathering people together with the necessary resources to work collaboratively.  This can happen, but instinctively humanists often want to be left alone.  This is an area where I think new attention to collaborative and project-based learning could help the humanities a great deal, something we could learn from our colleagues in the sciences and in arts like theatre and music.  I’m hopeful that some of the new attention we are giving to digital humanities at Messiah College will lead us in this direction.
  7. the expressive agility required to draw on multiple means (visual, oral, written, media-related) to communicate novel ideas to others.
    • A partial strength.  I do think we inculcate good expressive abilities in written and oral communication, and we value novel ideas.  But except for our folks in film and communication—as well as our cousins in art history—we are less adept at working imaginatively with visual and other media-related resources.  This has been one of my priorities as dean, but it is something that’s hard to generate from the ground up.

Well, that’s my take.  I wonder what others think?