Tag Archives: humanities

Deep and Wide: Katharine Brooks on Becoming a T-shaped Professional

Earlier today I blogged on the need for humanities students to take seriously becoming more literate in science, technology and mathematics, both in order to pursue the ideal of a well-rounded liberal arts education and, very pragmatically, in order to prepare themselves for the world of work. Katharine Brooks (UT-Austin), one of the keynoters at the Rethinking Success conference at Wake Forest, takes up the same point in a different manner in her reflections on the need for job candidates coming out of college to present themselves as T-shaped individuals: persons with deep knowledge of one or two areas and broad knowledge of several.

According to those in the talent-seeking field, the most sought-after candidates for management, consulting, research, and other leadership positions are T-shaped. The vertical stem of the T is the foundation: an in-depth specialized knowledge in one or two fields. The horizontal crossbar refers to the complementary skills of communication (including negotiation), creativity, the ability to apply knowledge across disciplines, empathy (including the ability to see from other perspectives), and an understanding of fields outside your area of expertise.

Organizations need workers with specialized knowledge who can also think broadly about a variety of areas, and apply their knowledge to new settings. Since T-shaped professionals possess skills and knowledge that are both broad and deep, developing and promoting your T-shaped talent may be the ticket to your career success now and in the future.

The term “T-shaped” isn’t new: it’s been in use since the 1990s but mostly in consulting and technical fields. Several companies, including IDEO and McKinsey & Company, have used this concept for years because they have always sought multidisciplinary workers who are capable of responding creatively to unexpected situations. Many companies have also developed interdisciplinary T teams to solve problems.

Dr. Phil Gardner at Michigan State University, who researches and writes regularly on recruiting trends, has been researching the concept of the T-shaped professional and the T-shaped manager. At the recent Rethinking Success Conference at Wake Forest University, Dr. Gardner described the ideal job candidate as a “liberal arts student with technical skills” or a “business/engineering student with humanities training”— in other words, a T-shaped candidate. Dr. Gardner is currently developing a guide for college students based on this concept. He notes that “while the engineers are out in front on this concept – every field will require T professional development.”

As my post earlier today suggested, I think this kind of approach is absolutely crucial for graduates in humanities programs, and we ought to shape our curricula–both within the majors and in our general education programs–in such a way that we produce students confident in and able to articulate the ways in which their education and experiences have made them both deep and broad.
If I can take a half step back from those assertions and plunge in another direction, I will only point out that there is a way in which this particular formulation may let my brethren who are in technical fields off the hook a little too easily.  If it is the case that engineers are leaders in this area, I will say that the notion of breadth that is entailed may be a fairly narrow one, limited to the courses that students are able to get in their general education curriculum.
My colleague Ray Norman, who is the Dean of our School of Science, Engineering and Health, has talked with me on more than one occasion about how desirable his engineering graduates are because they have had several courses in the humanities.  I am glad for that, but I point out to him that it is absolutely impossible for his engineering graduates to even minor in a field in my area, much less dream of a double major.  About a decade ago, when I was chair of the English department, I went to a colleague who has since vacated the chair of the engineering department, asking if we could talk about ways to encourage some interchange between our departments, such that I could encourage my majors to take more engineering or other technical courses, and he could encourage his engineers to minor in English.  He was enthusiastic but also regretful.  He’d love to have my English majors in his program, but he couldn’t send his engineers my way; the size of the engineering curriculum meant it was absolutely impossible for his students to take anything but the required courses in general education.
I don’t hold this against my colleagues; they labor under accreditation standards and national expectations in the discipline.  But I do think it raises again important questions about what an undergraduate education is for, questions explored effectively in Andrew Delbanco’s recent book.  Should undergraduate programs be so large and so professionally oriented that students are unable to take a minor or possibly a double major?  Whistling into the wind, I say they should not.  Breadth and depth should not mean the ability to know ONLY one thing really well; it ought to mean knowing AT LEAST one thing really well, and a couple of other things pretty well, and several other things generally well.
Oddly enough, it is far easier for liberal arts students to achieve this richer kind of breadth and depth, if they only will.  A major in history, a minor in sociology, a second minor in information sciences, a couple of internships working on web-development at a museum, a college with a robust general education program.  There’s a T-shape to consider.
[Side note: It was Camp Hill Old Home Week at Wake Forest and the Rethinking Success conference last week.  At a dinner for administrators and career officers hosted by Jacquelyn Fetrow, Dean of the College of Arts and Sciences, Jacque and Katharine Brooks discovered they’d both grown up in Camp Hill, and both within a half dozen blocks of where I now live.  Small world, and Camp Hill is leading it :-)]

Do Humanities Programs Encourage the Computational Illiteracy of Their Students?

I think the knee-jerk and obvious answer to my question is “No.”  I think if humanities profs were confronted with the question of whether their students should develop their abilities in math (or more broadly in math, science and technology), many or most would say Yes.  On the other hand, I read the following post from Robert Talbert at the Chronicle of Higher Ed.  It got me thinking just a bit about how and whether we in the humanities contribute to an anti-math attitude among our own students, if not in the culture as a whole.

I’ve posted here before about mathematics’ cultural problem, but it’s really not enough even to say “it’s the culture”, because kids do not belong to a single monolithic “culture”. They are the product of many different cultures. There’s their family culture, which as Shaughnessy suggests either values math or doesn’t. There’s the popular culture, whose devaluing of education in general and mathematics in particular ought to be apparent to anybody not currently frozen in an iceberg. (The efforts of MIT, DimensionU, and others have a steep uphill battle on their hands.)

And of course there’s the school culture, which is itself a product of cultures that are out of kids’ direct control. Sadly, the school culture may be the toughest one to change, despite our efforts at reform. As the article says, when mathematics is reduced to endless drill-and-practice, you can’t expect a wide variety of students — particularly some of the most at-risk learners — to really be engaged with it for long. I think Khan Academy is trying to make drill-and-practice engaging with its backchannel of badges and so forth, but you can only apply so much makeup to an inherently tedious task before learners see through it and ask for something more.

via Can Math Be Made Fun? – Casting Out Nines – The Chronicle of Higher Education.

This all rings pretty true to me.  There are similar versions of this in other disciplines.  In English, for instance, students unfortunately can easily learn to hate reading and writing through what they imbibe from popular culture or through what they experience in the school system.  For every hopeless math geek on television, there’s a reading geek to match.  Still and all, I wonder whether we in the humanities combat and intervene in the popular reputation of mathematics and technological expertise, or whether we just accept it, and in fact reinforce it.

I think, for instance, of the unconscious assumption that there are “math people” and “English people”; that is, there’s a pretty firmly rooted notion that people are born with certain proclivities and abilities and there is no point in addressing deficiencies in your literacy in other areas.  More broadly, I think we apply this to students, laughing in knowing agreement when they talk about coming to our humanities disciplines because they just weren’t a math person or a science person, or groaning together in the faculty lounge about how difficult it is to teach our general education courses to nursing students or to math students.  As if our own abilities were genetic.

In high school I was highly competent in both math and English, and this wasn’t all that unusual for students in the honors programs.  On the other hand, I tested out of math and never took another course in college, and none of my good humanistic teachers in college ever challenged me or asked me to question that decision.  I was encouraged to take more and different humanities courses (though, to be frank, my English teachers were suspicious of my interest in philosophy), but being “well-rounded” and “liberally educated” seems in retrospect to have been largely a matter of being well-rounded in only half of the liberal arts curriculum.  Science and math people were well-rounded in a different way, if they were well-rounded at all.

There’s a lot of reason to question this.  Not least is that if our interests and abilities are genetic, we have seen a massive surge of the gene pool toward the STEM side of the equation, if enrollments in humanities majors are any judge.  I think it was Malcolm Gladwell who recently pointed out that genius has a lot less to do with giftedness than it does with practice and motivation.  Put 10,000 hours into almost anything and you will become a genius at it (not entirely true, but the general principle applies).  Extrapolating, we might say that even if students aren’t going to be geniuses in math and technology, they could actually get a lot better at it if they’d only try.

And there’s a lot of reason to ask them to try.  At the recent Rethinking Success conference at Wake Forest, one of the speakers who did research into the transition of college students into the workplace pounded the table and declared, “In this job market you must either be a technical student with a liberal arts education or a liberal arts major with technical savvy.  There is no middle ground.”  There is no middle ground.  What became quite clear to me at this conference is that companies mean it when they say they want students with a liberal arts background.  However, it was also very clear to me that they expect those students to have technical expertise that can be applied immediately to job performance.  Speaker after speaker affirmed the value of the liberal arts.  They also emphasized the absolute and crying need for computational, mathematical, and scientific literacy.

In other words, we in the Humanities will serve our students extremely poorly if we accept their naive statements about their own genetic makeup, allowing them to proceed with a mathematical or scientific illiteracy that we would cry out against if the same levels of illiteracy were evident in others with respect to our own disciplines.

I’ve found, incidentally, in my conversations with my colleagues in information sciences or math or the sciences, that many of them are much more conversant in the arts and humanities than I or my colleagues are in even the generalities of science, mathematics, or technology.  This ought not to be the case, and in view of that a few of my colleagues and I are considering taking some workshops in computer coding with our information sciences faculty.  We ought to work toward creating a generation of humanists that does not perpetuate our own levels of illiteracy, for their own sake and for the health of our disciplines in the future.

Pilgrimage to the Mountaintop in Search of CEOs

One sign of the crisis of confidence in the humanities is that we keep feeling compelled to trot out CEOs to make our case for us.  It’s a little like the way we cited Freud in graduate school even if we believed the emperor had no clothes, just because we knew our professors believed he did.  And so, while we’d like to be citing John Henry Newman on the Idea of a Christian University or Socrates on the tragedy of an unexamined life, we look to the world of business for hopeful confirmation.  This is the way of both presidents and preachers, so why not professors?

I’m not too proud to play that game, so I note this recent essay from Jason Trennert in Forbes, reminding us again that there are lessons important to the boardroom that are learned best in history books and not in business seminars.

I was fortunate enough to attend great schools, earning both a bachelor’s degree in economics and an MBA, and I’ve wondered more times than I care to admit in the last few years whether I learned a damn thing.

After considerable thought, I’ve come to the conclusion that the broader, more liberal arts-oriented courses I took in my undergraduate years did far more to help me to adapt to what was deemed to be “economically unprecedented” than the more technical lessons I learned in business school. Not once in the last three years did I feel compelled to develop more complex mathematical models to help me discern what was happening.

This was due, at least in part, to an almost immediate revelation that it was these same models that sowed the seeds of the financial collapse in the first place. The financial crisis didn’t prompt me to do more math but to read quite a bit more history.

Is the decline of the Humanities responsible for Wall Street Corruption?

Andrew Delbanco’s view in a recent Huffington Post essay is “Yes”, at least to some indeterminate degree, though I admit that the title of his post “A Modest Proposal” gave me some pause given its Swiftian connotations:

What I do know is that at the elite universities from which investment firms such as Goldman Sachs recruit much of their talent, most students are no longer seeking a broad liberal education. They want, above all, marketable skills in growth fields such as information technology. They study science, where the intellectual action is. They sign up for economics and business majors as avenues to the kind of lucrative career Mr. Smith enjoyed. Much is to be gained from these choices, for both individuals and society. But something is also at risk. Students are losing a sense of how human beings grappled in the past with moral issues that challenge us in the present and will persist into the future. This is the shrinking province of what we call “the humanities.”

For the past twenty years, the percentage of students choosing to major in the humanities — in literature, philosophy, history, and the arts — has been declining at virtually all elite universities. This means, for instance, that fewer students encounter the concept of honor in Homer’s Iliad, or Kant’s idea of the “categorical imperative” — the principle that Mr. Smith thinks is out of favor at Goldman: that we must treat other people as ends in themselves rather than as means to our own satisfaction. Mr. Smith was careful to say that he was not aware of anything illegal going on. But few students these days read Herman Melville’s great novella, Billy Budd, about the difficult distinction between law and justice.

Correlation is not cause, and it’s impossible to prove a causal relation between what students study in college and how they behave in their post-college lives. But many of us who still teach the humanities believe that a liberal education can strengthen one’s sense of solidarity with other human beings — a prerequisite for living generously toward others. One of the striking discoveries to be gained from an education that includes some knowledge of the past is that certain fundamental questions persist over time and require every generation to answer them for itself.

via Andrew Delbanco: A Modest Proposal.

This is consonant with Delbanco’s thesis–expressed in his book College: What It Was, Is, and Should Be–that education in college used to be about the education of the whole person but has gradually been emptied of the moral content of its originally religious and more broadly civic vision, and the preprofessional and pecuniary imagination has become the dominant if not the sole rationale for pursuing an education.  I am viscerally attracted to this kind of argument, so I offer a little critique rather than cheerleading.  First, while I do think it’s the case that an education firmly rooted in the humanities can provide for the kinds of deep moral reflection that forestalls a purely instrumentalist view of our fellow citizens–or should I say consumers–it’s also very evidently the case that people with a deep commitment to the arts and humanities descend into moral corruption as easily as anyone else.  The deeply felt anti-Semitism of the dominant modernists would be one example, and the genteel and not-so-genteel racism of the Southern Agrarians would be another.  When Adorno said that there was no poetry after Auschwitz, he was only partly suggesting that the crimes of the 20th century were beyond the redemption of literature; he also meant more broadly that the dream that literature and culture could save us was itself a symptom of our illness, not a solution to it.  Delbanco might be too subject to this particular dream, I think.

Secondly, I think that this analysis runs close to blaming college students for not majoring in or studying more in the humanities and is a little bit akin to blaming the victim–these young people have inherited the world we’ve given them, and we would do well to look at ourselves in the mirror and ask what kind of culture we’ve put in place that would make the frantic pursuit of economic gain in the putative name of economic security a sine qua non in the moral and imaginative lives of our children.

That having been said: yes, I do think the world–including the world of Wall Street–would be better if students took the time to read Billy Budd or Beloved, wedging them in somewhere between the work-study jobs, appointments with debt counselors, and multiple extracurriculars and leadership conferences that are now a prerequisite for a job after college.

Digital History of the Brethren in Christ; writing in an age of distraction

I’ve been spending the last couple of days at the undergraduate conference on the digital humanities at Swarthmore College, getting a feel for what might be possible at the undergraduate level. Yesterday’s keynote by Alexandra Juhasz, whose book Learning from YouTube created a splash a couple of years ago, emphasized that we now have to write in an environment in which we know people will be distracted, and it may not be a feasible goal to overcome their distraction. Her own work is trying to account for what that might mean for multimedia writing, for either scholars or undergraduates. What does it mean to write for the distracted, knowing that they will remain distracted? I don’t quite have my brain around that rhetorical situation yet.

I’ve especially been excited to see the real-world possibilities for digital humanities projects and to imagine the kinds of things our undergraduates might do. My colleague John Fea, over at The Way of Improvement Leads Home, directed my attention to a great new digital history site by one of our graduates from Messiah College, Devin Manzullo-Thomas, on the history of the Brethren in Christ and evangelicals. Devin is now a graduate student in digital history at Temple. I’d love to see our undergraduates working with our faculty on a project like this.

Living in an e-plus world: Students now prefer digital texts when given a choice

A recent blog post by Nick DeSantis in the Chronicle points to a survey by the Pearson Foundation suggesting that tablet ownership is on the rise.  That’s not surprising, but more significant is the fact that among tablet users there’s a clear preference for digital texts over the traditional paper codex, something we haven’t seen before even among college students of this wired generation:

One-fourth of the college students surveyed said they owned a tablet, compared with just 7 percent last year. Sixty-three percent of college students believe tablets will replace textbooks in the next five years—a 15 percent increase over last year’s survey. More than a third said they intended to buy a tablet sometime in the next six months.

This year’s poll also found that the respondents preferred digital books over printed ones. It’s a reversal of last year’s results and goes against findings of other recent studies, which concluded that students tend to choose printed textbooks. The new survey found that nearly six in 10 students preferred digital books when reading for class, compared with one-third who said they preferred printed textbooks.

I find this unsurprising, as it matches up pretty well with my own experience.  Five years ago I could never have imagined doing any significant reading on a tablet.  Now I do all my reading of scholarly journals and long-form journalism–e.g., The Atlantic, the New York Review of Books, The Chronicle Review–on my iPad.  And while I still tend to prefer the codex for reading novels and other book-length works, the truth is that preference is slowly eroding as well.  As I become more familiar with the forms of e-reading, the notions of its inherent inferiority, like the notions of any unreflective prejudice, gradually fade in the face of familiarity.

And yet I greet the news of this survey with a certain level of panic, not panic that it should happen at all, but panic that the pace of change is quickening and we are hardly prepared, by we I mean we in the humanities here in small colleges and elsewhere.  I’ve blogged on more than one occasion about my doubts about e-books and yet my sense of their inevitable ascendancy.  For instance here on the question of whether e-books are being foisted on students by a cabal of publishers and administrators like myself out to save a buck (or make a buck as the case may be), and here on the nostalgic but still real feeling that I have that print codex forms of books have an irreplaceable individuality and physicality that the mere presence of text in a myriad of e-forms does not suffice to replace.

But though I’ve felt the ascendancy of e-books was inevitable, I think I imagined a 15- or 20-year span in which print and e-books would mostly live side by side.  Our own librarians here at Messiah College talk about a “print-plus” model for libraries, as if e-books will remain primarily an add-on for some time to come.  I wonder.  Just as computing power increases exponentially, it seems to me that the half-life of print books is rapidly diminishing.  I now wonder whether we will have even five years before students expect their books to be digital–all their books, not just their hefty tomes for CHEM 101 that can be more nicely illustrated with iBooks Author, but also their books for English and History classes as well.  This is an “e-plus” world, where print will increasingly not be the norm but the supplement, filling whatever gaps e-books have not yet bridged, whatever textual landscapes have not yet been digitized.

Despite warnings, we aren’t yet ready for an e-plus world.  Not only do we not know how to operate the apps that make these books available, we don’t even know how to critically study books in tablet form.  Yet learning what forms of critical engagement are possible and necessary will be required.  I suspect, frankly, that our current methods developed out of what was made possible by the forms that texts took, rather than forms following our methodological urgencies.  This means that the look of critical study in the classroom will change radically in the next ten years.  What will it look like?

Michael Hart and Project Gutenberg

I felt an unaccountable sense of loss at reading tonight in the Chronicle of Higher Education (paper edition, no less) that the founder of Project Gutenberg, Michael Hart, has died at the age of 64.  This is a little strange, since I had no idea who had founded Project Gutenberg until I read the obituary.  But Project Gutenberg I know, and I think I knew immediately when I ran across it that it was already and would continue to be an invaluable resource for readers and scholars, this even though I’ve never been much of a champion of e-books.  I guess it felt a bit like I had discovered belatedly the identity of a person who had given me a great gift and never had the chance to thank him.

One aspect of Hart’s vision for Project Gutenberg struck me in relationship to some of the things I’ve been thinking about in the digital humanities.  That’s Hart’s decision to go with something simple and nearly universal as an interface rather than trying to be sexy, with-it, and up to date.  Says the Chronicle:

His early experiences clearly informed his choices regarding Project Gutenberg. He was committed to lo-fi—the lowest reasonable common denominator of textual presentation. That was for utterly pragmatic reasons: He wanted his e-texts to be readable on 99 percent of the existing systems of any era, and so insisted on “Plain-Vanilla ASCII” versions of all the e-texts generated by Project Gutenberg.

That may seem a small—even retro—conceit, but in fact it was huge. From the 80s on, as the Internet slowly became more publicly manifest, there were many temptations to be “up to date”: a file format like WordStar, TeX, or LaTeX in the 1980s, or XyWrite, MS Word, or Adobe Acrobat in the 90s and 2000s, might provide far greater formatting features (italics, bold, tab stops, font selections, extracts, page representations, etc.) than ASCII. But because Mr. Hart had tinkered with technology all his life, he knew that “optimal formats” always change, and that today’s hi-fi format was likely to evolve into some higher-fi format in the next year or two. Today’s ePub version 3.01 was, to Mr. Hart, just another mile marker along the highway. To read an ASCII e-text, via FTP, or via a Web browser, required no change of the presentational software—thereby ensuring the broadest possible readership.

Mr. Hart’s choice meant that the Project Gutenberg corpus—now 36,000 works—would always remain not just available, but readable. What’s more, it has been growing, in every system since.

This is no small thing.  The ephemeral character of digital humanities projects bothers me.  By ephemeral I don’t mean they are intellectually without substance; I think the intellectual character of the work can be quite profound.  However, the forms in which the work is done can disappear or be outdated tomorrow.  Hart’s decision to use ASCII is in some sense an effort to replicate the durability of the book.  Books, for all the fragility of paper, have a remarkable endurance and stability overall.  The basic form doesn’t change, and a book used by a reader in the Middle Ages is, more or less, still usable by me in the same fashion.  By contrast, I can’t even open some of my old files in my word processor.  I think the work I did was substantial, but the form it was placed in was not enduring.  Hart’s decision makes sense to me, but I’m not sure how it might be extended to other kinds of projects in the digital humanities.

Create or Die: Humanities and the Boardroom

A couple of years ago when poet and former head of the NEA, Dana Gioia, came to Messiah College to speak to our faculty and the honors program, I asked him if he felt there were connections between his work as a poet and his other careers as a high-level government administrator and business executive.  Gioia hemmed and hawed a bit, but finally said no;  he thought he was probably working out of two different sides of his brain.

I admit to thinking the answer was a little uncreative for so creative a person.  I’ve always been a bit bemused by and interested in the various connections that I make between the different parts of myself.  Although I’m not a major poet like Gioia, I did complete an MFA and continue to think of myself in one way or another as a writer, regardless of what job I happen to be doing at the time.  And I actually think working on an MFA for those two years back in Montana has had a positive effect on my life as an academic and my life as an administrator.  Some day I plan to write an essay on the specifics, but it is partially related to the notion that my MFA did something to cultivate my creativity, to use the title of a very good article in the Chronicle Review by Steven Tepper and George Kuh.  According to Tepper and Kuh,

To fuel the 21st-century economic engine and sustain democratic values, we must unleash and nurture the creative impulse that exists within every one of us, or so say experts like Richard Florida, Ken Robinson, Daniel Pink, Keith Sawyer, and Tom Friedman. Indeed, just as the advantages the United States enjoyed in the past were based in large part on scientific and engineering advances, today it is cognitive flexibility, inventiveness, design thinking, and nonroutine approaches to messy problems that are essential to adapt to rapidly changing and unpredictable global forces; to create new markets; to take risks and start new enterprises; and to produce compelling forms of media, entertainment, and design.

Tepper and Kuh, a sociologist and a professor of education respectively, argue forcefully that creativity is not doled out by the forces of divine or genetic fate but is something that can be learned through hard work, an idea supported by recent findings that what musical and other artistic prodigies share is not so much genetic markers as practice (according to Malcolm Gladwell, 10,000 hours of it).  Tepper and Kuh believe the kinds of practice and discipline necessary for cultivating a creative mindset are deeply present in the arts, but only marginally present in many other popular disciplines on campus.

Granted, other fields, like science and engineering, can nurture creativity. That is one reason collaborations among artists, scientists, and engineers spark the powerful breakthroughs described by the Harvard professor David Edwards (author of Artscience, Harvard University Press, 2008); Xerox’s former chief scientist, John Seely Brown; and the physiologist Robert Root-Bernstein. It is also the case that not all arts schools fully embrace the creative process. In fact, some are so focused on teaching mastery and artistic conventions that they are far from hotbeds of creativity. Even so, the arts might have a special claim to nurturing creativity.

A recent national study conducted by the Curb Center at Vanderbilt University, with Teagle Foundation support, found that arts majors integrate and use core creative abilities more often and more consistently than do students in almost all other fields of study. For example, 53 percent of arts majors say that ambiguity is a routine part of their coursework, as assignments can be taken in multiple directions. Only 9 percent of biology majors say that, 13 percent of economics and business majors, 10 percent of engineering majors, and 7 percent of physical-science majors. Four-fifths of artists say that expressing creativity is typically required in their courses, compared with only 3 percent of biology majors, 16 percent of economics and business majors, 13 percent of engineers, and 10 percent of physical-science majors. And arts majors show comparative advantages over other majors on additional creativity skills—reporting that they are much more likely to have to make connections across different courses and reading; more likely to deploy their curiosity and imagination; more likely to say their coursework provides multiple ways of looking at a problem; and more likely to say that courses require risk taking.

Tepper and Kuh focus on the arts for their own purposes, and I realized that in thinking about the ways an MFA had helped me I was thinking about an arts discipline in many respects.  However, the fact that the humanities are nowhere in their article set me to thinking.  How do the humanities stack up in cultivating creativity, and is this a selling point for parents and prospective students to consider as they imagine what their kids should study in college?  These are my reflections on the seven major “creative” qualities that Tepper and Kuh believe we should cultivate, and how the humanities might fare in each of them.

  1. the ability to approach problems in nonroutine ways using analogy and metaphor;
    • I think the humanities excel in this area for a variety of reasons.  In some disciplines, like English and modern languages, we focus on the way language works through analogy and metaphor, and we encourage its effective use in writing.  But more than that, I think many humanities disciplines can encourage nonroutine ways of approaching problems—though, of course, many professors in any discipline can be devoted to doing the same old things the same old way.
  2. conditional or abductive reasoning (posing “what if” propositions and reframing problems);
    • Again, I think a lot of our humanities disciplines approach things in this fashion.  To some degree this is because of a focus on narrative.  One way of getting at how narrative works, and of understanding the meaning of a narrative at hand, is to pose alternative scenarios.  What if we had not dropped the bomb on Japan?  What if Lincoln had not been shot?  How would different philosophical assumptions lead to different outcomes to an ethical problem?
  3. keen observation and the ability to see new and unexpected patterns;
    • I think we especially excel in this area.  Most humanities disciplines are deeply devoted to close and careful readings that discover patterns that are beneath the surface.  The patterns of imagery in a poem, the connections between statecraft and everyday life, the relationship between dialogue and music in a film.  And so forth.  Recognizing patterns in problems can lead to novel and inventive solutions.
  4. the ability to risk failure by taking initiative in the face of ambiguity and uncertainty;
    • I’m not sure whether the humanities inherently put a premium on this.  I know a lot of professors (or at least I as a professor) prefer that their students take a risk in doing something different on a project and have it not quite work than do the predictable thing.  But I don’t know that this is something inherent to our disciplines.  I do think, however, that our disciplines inculcate a high level of comfort with uncertainty and ambiguity.  Readings of novels or the complexities of history require us not to go for easy solutions but to recognize and work within the ambiguity of situations.
  5. the ability to heed critical feedback to revise and improve an idea;
    • Again, I would call this a signature strength, especially as it manifests itself in writing in our disciplines and in the back-and-forth exchange between students, and between student and teacher.  Being willing to risk an idea or an explanation, have it critiqued, and then revise one’s thinking in response to more persuasive arguments.
  6. a capacity to bring people, power, and resources together to implement novel ideas; and
    • I admit that on this one I kind of think the humanities might be weaker than we should be, at least as manifested in our traditional way of doing things.  Humanists in most of our disciplines are notorious individualists who would rather spend their time in libraries.  We don’t get much practice at gathering people together with the necessary resources to work collaboratively.  This can happen, but instinctively humanists often want to be left alone.  This is an area where I think a new attention to collaborative and project-based learning could help the humanities a great deal, something we could learn from our colleagues in the sciences and in arts like theatre and music.  I’m hopeful that some of the new attention we are giving to digital humanities at Messiah College will lead us in this direction.
  7. the expressive agility required to draw on multiple means (visual, oral, written, media-related) to communicate novel ideas to others.
    • A partial strength.  I do think we inculcate good expressive abilities in written and oral communication, and we value novel ideas.  Except for our folks in film and communication—as well as our cousins in art history—we are less good at working imaginatively with visual and other media-related resources.  This has been one of my priorities as dean, but something that’s hard to generate from the ground up.

Well, that’s my take.  I wonder what others think?

We’re all pre-professional now

I’ve been catching up this evening on a backlog of reading I’ve stored on Instapaper.  (I’m thinking “backlog” might be the right word: I don’t think you can have a “stack” on an iPad.)  A few weeks back Cathy Davidson down at Duke University had an interesting piece on whether college is for everyone.  Davidson’s basic thesis, as the title suggests, is no.  Despite the nationalist rhetoric that attends our discussions of higher education–we will be a stronger America if every Tom, Dick, and Henrietta has a four-year degree–maybe, Davidson suggests, we’d have a better society if we attended to and nurtured the multiple intelligences and creativities that abound in our society, and recognized that many of those were best nurtured somewhere else than in a college or university:

The world of work — the world we live in — is so much more complex than the quite narrow scope of learning measured and tested by college entrance exams and in college courses. There are so many viable and important and skilled professions that cannot be outsourced to either an exploitative Third World sweatshop or a computer, that require face-to-face presence, and a bucketload of skills – but that do not require a college education: the full range of IT workers, web designers, body workers (such as deep tissue massage), yoga and Pilates instructors, fitness educators, hairdressers, retail workers, food industry professionals, entertainers and entertainment industry professionals, construction workers, dancers, artists, musicians, entrepreneurs, landscapers, nannies, elder-care professionals, nurse’s aides, dog trainers, cosmetologists, athletes, sales people, fashion designers, novelists, poets, furniture makers, auto mechanics, and on and on.

All those jobs require specialized knowledge and intelligence, but most people who end up in those jobs have had to fight for the special form their intelligence takes because, throughout their lives, they have never seen their particular ability and skill set represented as a discipline, rewarded with grades, put into a textbook, or tested on an end-of-grade exam. They have had to fight for their identity and dignity, their self-worth and the importance of their particular genius in the world, against a highly structured system that makes knowledge into a hierarchy with creativity, imagination, and the array of so-called “manual skills” not just at the bottom but absent.

Moreover, Davidson argues that not only does our current educational system fail to recognize and value these kinds of skills on the front end; when we actually get students into college, we narrow their interests yet further:

All of the multiple ways that we learn in the world, all the multiple forms of knowing we require in order to succeed in a life of work, is boiled down to an essential hierarchical subject matter tested in a way to get one past the entrance requirements and into a college. Actually, I agree with Ken Robinson that, if we are going to be really candid, we have to admit that it’s actually more narrow even than that: we’re really, implicitly training students to be college professors. That is our tacit criterion for “brilliance.” For, once you obtain the grail of admission to higher ed, you are then disciplined (put into majors and minors) and graded as if the only end of your college work were to go on to graduate school where the end is to prepare you for a profession, with university teaching of the field at the pinnacle of that profession.

Which brings me to my title.  We’re all pre-professional now.  Since the advent of the university, if not before, there’s been a partisan debate between growing pre-professional programs and what are defined as the “traditional liberal arts,” though in current practice, given the cachet of science programs in the world of work, this argument is sometimes really between the humanities and the rest of the world.

Nevertheless, I think Davidson points out that in the actual practice of the humanities in many departments around the country, this distinction is specious.  Many humanities programs conceive of themselves as preparing students for grad school.  In the humanities.  In other words, we imagine ourselves as ideally preparing students who are future professionals in our profession.  These are the students who receive our attention, the students we hold up as models, the students we teach to, and the students for whom we construct our curricula, offer our honors, and save our best imaginations.  What is this, if not a description of a pre-professional program?  So captive are we to this conceptual structure that it becomes hard to imagine what it would mean to form an English major, history major, or philosophy major whose primary implicit or explicit goal was not to reproduce itself, but to produce individuals who will work in the world of business–which most of them will do–or in non-profit organizations, or in churches and synagogues, or somewhere else that we cannot even begin to imagine.  We get around this with a lot of talk about transferable skills, but we actually don’t do a great deal to help our students understand what those skills are or what they might transfer to.  So I think Davidson is right to point this out and to suggest that there’s something wrongheaded going on.

That having been said, a couple of points of critique:

Davidson rightly notes these multiple intelligences and creativities, and she rightly notes that we have a drastically limited conception of society if we imagine a four-year degree is the only way to develop them effectively.  But Davidson remains silent on the other roles of higher education, the forming of an informed citizenry being only one.  Some other things I’ve seen from Davidson, including her new book Now You See It, suggest she’s extremely excited about all the informal ways that students are educating themselves and doubtful of the traditional roles of higher education: in her view, higher education’s traditional role as a producer and disseminator of knowledge has been drastically undermined.  I have my doubts.  It is unclear that a couple of decades of the internet have actually produced a more informed citizenry.  Oh, yes, informed in all kinds of ways about all kinds of stuff, like the four thousand sexual positions in the Kama Sutra, but informed in a way that allows for effective participation in the body politic?  I’m not so sure.

I think this is so because to be informed is not simply to possess information, but to be shaped, to be in-formed.  In higher education this means receiving a context for how to receive and understand information, tools for analyzing, evaluating, and using information, and the means for creating new knowledge for oneself.  To be sure, the institutions of higher education are not the only places this happens, but it is clear that it doesn’t just happen willy-nilly because people have a Fios connection.

What higher education can and should give, then, is a lot of the values and abilities that are associated with a liberal arts education traditionally conceived–as opposed to being conceived as a route to a professorship–and these are values, indeed, that everyone should possess.  Whether it requires everyone to have a four year degree is an open question.  It may be that we need to rethink our secondary educational programs in such a way that they inculcate liberal arts learning in a much more rigorous and effective way than they do now.  But I still doubt that the kind of learning I’m talking about can be achieved simply by 17 year olds in transformed high schools.  Higher education should be a place for the maturing and transformation of young minds toward a larger understanding of the world and their responsibilities to it, which it sometimes is today, but should be more often.

Humanities and the workplace: or, bodysurfing the tsunami

As I suggested in my last post on the demise of Borders, book lovers have lived in an eternal tension between the transcendent ideals their reading often fosters and the commercial realities upon which widespread literacy has depended. The same tension is broadly evident in the humanities’ response to professional programs, or more broadly to the question of career preparation. We are not wrong to say that an education in history or English is much more than career preparation; nor are we wrong to insist that a college education has to be about much more than pre-professional training. (Not least because most college graduates end up doing things a long distance from their original preparation, and we ought to see that the humanities in combination with other knowledge in the arts and sciences are at least as good at preparing students for the twists and turns of their eventual careers, and perhaps even better, than fields focused on narrower practical preparation.)

However, we are absolutely wrong to assume that questions of career are extraneous, or ought to be secondary, to our students or to our thinking about how we approach curricula.

Daniel Everett, dean of arts and sciences at Bentley University, offers a provocative reflection on the need to integrate the humanities into professional education. According to Everett:

“Programs that take in students without proper concern for their future or provision for post-graduate opportunities — how they can use what they have learned in meaningful work — need to think about the ethics of their situation. Students no longer come mainly from the leisured classes that were prominent at the founding of higher education. Today they need to find gainful employment in which to apply all the substantive things they learn in college. Majors that give no thought to that small detail seem to assume that since the humanities are good for you, the financial commitment and apprenticeship between student and teacher is fully justified. But in these cases, the numbers of students benefit the faculty and particular programs arguably more than they benefit the students themselves. This is a Ponzi scheme. Q.E.D.”

These are harsh words, but worth considering. I tend not to like Bentley’s particular solutions to the degree that they reduce the humanities to an enriching complement to the important business of, well, business. However, I do think we need to find new ways of constructing our majors that will prepare students for the realities of 21st-century employment. Majors that allowed for concentrations in digital humanities would prepare students to engage the changing nature of our disciplines while also gaining technical skills that could serve them well in business. New joint programs with the sciences, like those found in medical humanities programs, could prepare students in new ways for work in the health care industry. Everett warns of what may happen if humanities programs don’t creatively remake themselves to meet the changing needs of our contemporary world:

“If, like me, you believe that the humanities do have problems to solve, I hope you agree that they are not going to be solved by lamenting the change in culture and exhorting folks to get back on course. That’s like holding your finger up to stop a tidal wave. Thinking like this could mean that new buildings dedicated to the humanities will wind up as mausoleums for the mighty dead rather than as centers of engagement with modern culture and the building of futures in contemporary society.”

Again, I don’t like all of the particular responses Everett has advocated, but I do agree that there is a problem to be addressed that continued proclamations about transferable skills are unlikely to solve. What is sometimes called the applied humanities may be a way of riding the wave rather than being drowned by it.