Category Archives: Humanities

What are the public responsibilities of private education?

I’ve been thinking a lot lately about education for the public good and what that must mean. It’s a sign of an impoverished civic imagination that the most we can come up with is that the purpose of an education is to get a better job so you can raise American competitiveness in the global marketplace. I’ve been using Andrew Delbanco as a part-time foil in these reflections. In a new interview over at Inside Higher Ed, Delbanco takes on this general issue yet again.

Interview with author of new book on the past and future of higher education | Inside Higher Ed
http://www.insidehighered.com/news/2012/05/02/interview-author-new-book-past-and-future-higher-education

If a college functions well, it should break down, or at least diminish, the distinction between private and public good. Genuinely educated persons recognize how much they owe to the society that has furnished them with opportunities, and they feel an obligation to give back. This doesn’t mean that a college should teach its students to be ascetics or try to turn them into saints. Personal ambition will always be part of what a successful education requires and rewards. But a good college fosters an atmosphere of public-spiritedness. It teaches its students that individuals depend for fulfillment on community, and that a true community is constituted by responsible individuals.

(via Instapaper)

I like so much of this, but I think Delbanco is only addressing half the question. That is, it seems to me we have a generally compromised sense of public-spiritedness as such in the United States. Our national purposes have been reduced drastically to a kind of civic consumerism. Students, and the rest of us, imagine that we are being public-spirited by pursuing what it takes to get a job, no longer conceiving of “the public” in a rich, complex fashion that can be activated outside the context of warfare and external threats to abstractly defined freedoms. I agree with Delbanco, but I wonder whether he is invoking an older notion of public-spiritedness that has itself become impoverished. Education as a private good reflects a culture that can only imagine the public through private metaphors and private actions.

[Side note: I met Delbanco in the bathroom at the Rethinking Success conference. He seemed stunned that someone had actually read his book. As opposed, I guess, to just reading the excerpts in the Chronicle Review.]

We are all twitterers now: revisiting John McWhorter on Tweeting

Angus Grieve Smith over at the MLA group on LinkedIn pointed me toward John McWhorter’s take on Twitter from a couple of years back. [I admit to some embarrassment in referencing an article that’s TWO YEARS OLD!! But the presentism of writing for the web is a story for another time.] McWhorter’s basic case is that Twitter is not really writing at all, but a form of graphic speech (my term, not McWhorter’s). He points out that most people, even after they know how to read and write, speak in bursts of 7 to 10 words, and that writing at its origins reflected these kinds of patterns. In other words, as speakers we are all twitterers. Of Twitter, McWhorter says:


The only other problem we might see in something both democratic and useful is that it will exterminate actual writing. However, there are no signs of this thus far. In 2009, the National Assessment of Education Performance found a third of American eighth graders – the ones texting madly right now — reading at or above basic proficiency, but crucially, this figure has changed little since 1992, except to rise somewhat. Just as humans can function in multiple languages, they can also function in multiple kinds of language. An analogy would be the vibrant foodie culture that has thrived and even flowered despite the spread of fast food.

Who among us really fears that future editions of this newspaper will be written in emoticons? Rather, a space has reopened in public language for the chunky, transparent, WYSIWYG quality of the earliest forms of writing, produced by people who had not yet internalized a sense that the flavor of casual language was best held back from the printed page.

This speech on paper is vibrant, creative and “real” in exactly the way that we celebrate in popular forms of music, art, dance and dress style. Few among us yearn for a world in which the only music is classical, the only dance is ballet and daily clothing requires corsets and waistcoats. As such, we might all embrace a brave new world where we can both write and talk with our fingers.

via Talking With Your Fingers – NYTimes.com.


I mostly agree with this idea, that we are all code shifters. I’m less sanguine than McWhorter, however. His eighth-grade sample catches students before they’ve entered a period of schooling where they’d be expected to take on more serious reading and longer and more complex forms of writing. Twitter may not be to blame, but it’s not clear to me that the state of writing and reading comprehension at higher levels is doing all that well. There’s some pretty good evidence that it isn’t. So just as a foodie culture may thrive in the midst of a lot of fast food, it’s not clear that we ought to be complacent in the face of an obesity epidemic. In the same way, just because tweeting may not signal the demise of fine writing, it’s not clear that it’s helping the average writer become more facile and sophisticated in language use.

Unemployed Philosophers Abounding; Or, It’s Much Less Fun to Talk About Unemployed Business Majors

Philosophers are in the news these days. From what I can tell from the media, un- and underemployed philosophy majors are sprouting from the sidewalks, infesting Occupy America movements, and crowding the lines for openings in the barista business. I am reminded of the line in T. S. Eliot’s The Waste Land where he witnesses the hordes of urbanites crossing London Bridge and imagines them as an original infestation of the walking dead:

Philosophers, so many, I had not realized unemployment had undone so many.

The proliferation is all the more astonishing since my own Department of Philosophy begs, borrows, and steals students from other departments to make a living. From what I can gauge in the news media, they are not looking in the right places, because every news reporter alive seems to find them easy pickin’s right at hand on every street corner.

A few days ago I posted on a peculiar opinion piece from Frank Bruni at the New York Times, wherein philosophers and anthropologists were given as examples of what’s wrong with the American educational system, graduating as it does hordes of unemployable thinkers with their heads too far in the clouds to realize the damage they are doing to themselves by reading Immanuel Kant. This morning in my local newspaper I was treated to Nate Beeler’s editorial cartoon, featuring an unkempt and bewildered-looking philosophy major on a street corner begging for food, his sign suggesting that he will “epistemologize for food.” Finally, my day was topped off by an NPR story on the grim prospects for this year’s college grads. The story finished with an interview with the apparently omnipresent philosophy major and noted, mockingly, that the student intended to pursue medical school after finishing his philosophy degree. Good to see that at least some philosophy major has some sense. I was actually thinking about how wonderful it was to find a student so accomplished in both the sciences and the humanities. More fool I.

How philosophers came to represent the ills of recent college graduates is beyond reckoning. Though I did do some reckoning. According to statistics from the Department of Education, between 2006 and 2011 American colleges and universities graduated approximately 117,891 philosophy majors. In the same period, those same colleges and universities graduated 1,687,105 business majors. Give or take.

According to a Georgetown University study, the unemployment rate for recent humanities majors is about 9.4%, which means that we probably have about 11,081 unemployed philosophy majors running around loose and unattended.

By comparison, according to the same Georgetown study, 7.4% of recent business majors are unemployed, which means that roughly 126,532 business majors are running around loose and unattended. Give or take.
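For anyone who wants to check my math, here is a quick back-of-the-envelope sketch (Python used purely as a calculator; the degree counts and unemployment rates are the rough figures cited above, so treat the outputs as approximations, give or take):

    # Rough check of the figures above: Dept. of Education degree counts (2006-2011)
    # and the Georgetown unemployment rates for recent graduates, as cited in this post.
    philosophy_grads = 117_891
    business_grads = 1_687_105

    unemployed_philosophy = philosophy_grads * 0.094   # about 11,082
    unemployed_business = business_grads * 0.074       # about 124,846, give or take

    print(round(unemployed_philosophy))
    print(round(unemployed_business))
    print(round(unemployed_business / unemployed_philosophy))  # roughly 11 unemployed business majors per philosopher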

I think the outcome of this entirely off-the-cuff analysis is that the average person crowding into line for barista openings at Starbucks is probably not a philosopher. I’m wondering why there are no interviews with business majors on how they feel about the fact that their educational choices did not prepare them for the job market.

We shouldn’t laugh off the difficulties behind these figures. Recent college graduates are desperately hurting, whether they majored in philosophy or business; they are loaded with debt and many are not finding jobs. And while philosophers are struggling marginally more than some others, the point is that philosophy majors are not hurting in some extraordinary fashion because they have chosen to major in philosophy. This is a generational problem visited on this generation of students through political, economic, and cultural decisions that were not of their making. To trash philosophy students as if they were witless is a snide form of victimizing the victims of a system and culture these students did not create. It relieves us of responsibility to the many who are struggling and enables us to imagine that it is all their fault because of the poor educational choices they happened to make. Ironically, it enables us to ignore the plight of some 126,000 unemployed business majors as well, since they have all come to be represented by unkempt and irresponsible philosophers.

I don’t buy it. A student thoughtful enough to read and think through Kant is thoughtful enough to be aware of what she might be getting herself into as a philosophy major. Such students deserve better than mockery and contempt. They deserve our gratitude for reminding us that an education is about more than just the bottom line. That we do not give them this is to our discredit, not to theirs.

Should we have college majors at all?

As I’ve suggested before, one of the more startling pronouncements at the Rethinking Success conference last month came from Stanton Green at Monmouth University, who, as I remember it, pounded the table and declared that the college major was the worst thing to happen to higher education in the past 150 years. I’ve thought for a while that a real negative of our current system is the emphasis we put on students selecting a major even before they get to college–a practice driven largely by the need of large professional programs to get students started on their careers from the first semester.

Jeff Selingo at the Chronicle has an interesting blog post this morning on what exactly students think about all the revolution and transformation talk that’s going on in higher ed. He picks up on this question of the importance of the major, finding, anecdotally at least, that students are less convinced of the importance of the major than we are:

Majors don’t matter. Perhaps a better question is why we force students to pick a major at all. The number of majors on campus has proliferated in the last two decades, but some academics, such as Mark Taylor or Roger Schank, think we should abolish our traditional notion of majors and build the undergraduate curriculum around broad ideas or problems we face, like water and food production.

Sure, some of the students I talked with were focused on pursuing a specific profession (marketing, for instance) and wanted a degree that would give them a skill set to secure the right internships that eventually would lead to a full-time job. But most of the students said they were less concerned with picking the right major than they were with choosing the classes that would expose them to new subjects or help them connect ideas across disciplines.

via Did Anyone Ask the Students?, Part I – Next – The Chronicle of Higher Education.

Of course, getting rid of the college major would require a massive transformation of what it means to be a college, not just a college student, and a move away from a narrowly defined, research-driven or professionally oriented conception of the major. There’s no sign yet that we would be willing to do that, or that prospective students would respond well to a college that did away with majors entirely.

Even Selingo seems inconsistent on this point, since just prior to this observation about the unimportance of majors, he says we need much more intense levels of career preparation in college so that students do not waste time figuring out what they want to do and what they should major in. How these two assertions end up in paragraphs that sit next to each other, I’m not entirely sure, but it may just signal our confusion: we recognize that, except in some very specific circumstances, majors don’t matter as much as we think they do, and yet we can still only imagine a college education as preparation for a specific career.

Maybe if we thought of college as preparing students to blaze a trail for their own professional and personal journey instead of following a predetermined career path, we’d be able to relieve ourselves of the belief that students need to figure out what they are going to do with their lives when they are 17, and that forever after their fates will be determined by a choice made in ignorance by students who cannot possibly know the kinds of people they will be or the opportunities they will have when they are 22, much less 32 or 52.

So I wonder: do readers of this blog think it’s possible to imagine a world of higher education in which majors don’t exist?

Is hierarchy the problem? Higher Education Governance and Digital Revolutions

In the Digital Campus edition of the Chronicle, Gordon Freeman makes the now common lament that colleges and universities are not taking the kind of advantage of cloud technology that would enable superior student learning outcomes and more effective and efficient teaching. In doing so he lays the blame at the feet of the hierarchical nature of higher ed in a horizontal world.

Higher-education leaders, unlike the cloud-based companies of Silicon Valley, do not easily comprehend the social and commercial transformation gripping the world today. Indeed, there was a certain amount of gloating that the centuries-old higher-education sector survived the dot-com era. After all, textbooks are still in place, as are brick and mortar campuses.

The simple fact is that life is becoming more horizontal, while colleges remain hierarchical. We can expect the big shifts in higher education—where the smart use of digitization leads to degrees—to come from other countries.

And that’s sad, because the United States makes most of the new technologies that other parts of the world are more cleverly adapting, especially in education.

(via Instapaper)

I appreciate the metaphor of hierarchy versus horizontality, and I think it’s seductive, but I wonder if it’s accurate. It captures an American view of the world that imagines the bad guys as authoritarian and hierarchical and the good guys as democratic and dialogical.

Whether or not the horizontal crowd can ever do evil is a post for another day. I’m more interested in whether the slowness of higher ed to take up new modes of doing business is actually due to hierarchy. A colleague at a national liberal arts college that shall not be named told me that the president gave strong support to digital innovation in teaching and research, but that the faculty as a whole was slow on the uptake, which meant institution-wide change was difficult to achieve.

This rings true to me as an administrator. Change is slow not because we are too hierarchical, but because we take horizontality to be an inviolable virtue. Faculty are largely independent operators with a lot of room to implement change, or not, as they choose. Institution-wide change in educational programming takes not one big decision but a thousand small acts of persuasion and cajoling. I mostly think this is as it should be. It is the difficult opposite edge of academic freedom. It means that changing the direction of an institution is like changing the direction of an aircraft carrier, and doing so without an admiral who can indicate the direction by fiat. There are problems with that, but I’m not sure that the changes required to make higher ed as nimble as Gordon Freeman desires would result in an educational system we’d like to have.

How Do Blogging and Traditional Modes of Scholarly Production Relate?

In the latest edition of the Chronicle’s Digital Campus, Martin Weller makes some strong claims about the significance of blogging. Recognizing the difficulty of measuring the value of a blog in comparison to traditional modes of journal publication, Weller believes that blogging is ultimately in the interest of both institutions and scholarship.

It’s a difficult problem, but one that many institutions are beginning to come to terms with. Combining the rich data available online that can reveal a scholar’s impact with forms of peer assessment gives an indication of reputation. Universities know this is a game they need to play—that having a good online reputation is more important in recruiting students than a glossy prospectus. And groups that sponsor research are after good online impact as well as presentations at conferences and journal papers.

Institutional reputation is largely created through the faculty’s online identity, and many institutions are now making it a priority to develop, recognize, and encourage practices such as blogging. For institutions and individuals alike, these practices are moving from specialist hobby to the mainstream. This is not without its risks, but as James Boyle, author of the book The Public Domain: Enclosing the Commons of the Mind (Yale University Press, 2008), argues, we tend to overstate the dangers of open approaches and overlook the benefits, while the converse holds true for the closed system.

The Virtues of Blogging as Scholarly Activity – The Digital Campus – The Chronicle of Higher Education.

The claim that an institutional reputation is largely created through the faculty’s online identity startled me when I first read it, an index no doubt of my deeply held and inveterate prejudice in favor of libraries. But I have been trying to pound away at this point with our faculty: our online presence is utterly important, and the internet–in many different modes–gives us the opportunity to create windows on humanities work that are not otherwise easily achieved, at least in comparison to some of the work done by our colleagues in the arts or in the sciences. Blogging is one way of creating connection, of creating vision, and I think that, with a very few exceptions like the Ivies and the public Ivies, your online presence matters more than any other thing you can possibly do to establish your reputation in the public eye and in the eyes of prospective students and their parents.

That is fairly easy to grasp. The value of blogging to scholarship in general, or its relationship to traditional scholarship, remains thornier and more difficult to parse. I’ve had conversations with my colleague John Fea over at The Way of Improvement Leads Home, and we both agree that, in some sense, scholars still have to have established some claim to speak through traditional modes of publication in order to give their scholarly blogging some sense of authority. People listen to John about history because he’s published books and articles. [Why people listen to me I have no idea–although I do have books and articles, I have nothing like John’s reputation; it may have something to do with simply holding a position. Because I am a dean at a college, I can lay claim to certain kinds of experience that are relevant to discussing the humanities.]

I am not sure it will always be thus. I think the day is fast approaching when publishing books will become less and less important as the arbiter of scholarly authority. But for now, and perhaps for a good long time to come, blogging exists in an interesting symbiosis with other, more traditional forms of scholarship. Weller quotes John Naughton to this effect: “Looking back on the history,” he writes, “one clear trend stands out: Each new technology increased the complexity of the ecosystem.”

I’ve read some things lately that say blogging may be on its way out, replaced in the minds of the general public, I guess, by Facebook, Twitter, and Pinterest. But for now I think it remains an interesting and somewhat hybrid academic form: a forum for serious thought and reasonably good writing, but not one that claims to be writing for the ages. In some respects, I think the best blogging is more like a recovery of the eighteenth-century salon, wherein wit that demonstrated learning and acumen was highly valued, and perhaps a basis of academic life that stood unembarrassed next to the more muscular form of the book. Blogging is one clearly important addition to the scholarly ecosystem, playing off of and extending traditional scholarship rather than simply replacing it.

In my own life right now, as an administrator, I have too little time during the school year to pursue the writing of a 40-page article or a 400-page book–nor, at the moment, do I have the interest or inclination (however much I want to get back and finish the dang Harlem Renaissance manuscript that sits moldering on my computer). I do, however, continue to feel the need to contribute to the scholarly conversation surrounding the humanities and higher education in general. Blogging is one good way to do that, and one that, like Weller, I find enjoyable, creative, and stress-relieving–even when I am writing my blog at 11:00 at night to make sure I can get something posted. Ever the Protestant and his work ethic.

Majoring in the Extreme Humanities

Playing Scrabble the other day, I looked up the word “selvages” online and in the process discovered the sport of extreme scrap quilting. I still can’t get my mind around the concept, since I thought that scrap quilting was by its nature designed to be the opposite of extreme, but apparently it is a “thing,” since it calls up 750,000 hits on Google in one form or another. I can’t quite figure out the difference between extreme scrap quilting and regular scrap quilting, but I’m sure that if it’s important to my happiness someone will let me know. Or even if it’s not.

I take it that extreme scrap quilting is on the order of extreme eating, extreme couponing, extreme makeovers, and extreme other things. Indeed, it appears that in order for something to be noticed as special and different, it must become extreme and unusual and call attention to itself.

I’ve concluded that this is one of the problems with the Humanities. We are not extreme enough. We need to shake off the image of the sedate professor in elbow patches and figure out new ways to make our disciplines sufficiently life-threatening to attract interest. If we were more extreme, we could have sexier advertisements in college brochures and more positive coverage in the national press.

I struggled to come up with a few examples, but I wonder if others could come up with more.

“Extreme Hemingway 101”–Read Hemingway on a safari in Africa. You will be injected with a form of gangrene, and a rescue plane will fly you to the side of Mount Kilimanjaro. If you make it out alive, your grand prize will be a year for two in an isolated cabin in Idaho. By the end of this course you will truly understand what it meant to be Ernest Hemingway. Because we will spend so much time flying around the world, we will only have time for the one short story. But lots and lots and lots of experiential learning.

“Extreme Poetry 302”–Competitors will rack up debt and be given jobs as baristas. The competitor who is willing to go the longest without health benefits and adequate housing will be rewarded with a publishing contract, with a $2,000.00 subvention fee for the cover art. [Oh, wait….we already do that one for real.]

“Extreme History 291”–Students will be put out in sod houses on the Kansas prairie without electricity, food, or running water in order to relive America’s westward expansion. Students from the extreme archery team will provide realistic attacks on the settlers in an effort to help students better understand the responses of the colonized to their colonizers. [I think this was actually some kind of television show already, but why not steal a good idea.]

“Extreme Philosophy 479”–In an extreme version of Aristotle’s peripatetic school, students will be required to run a marathon on a treadmill while wearing specially designed headsets that allow them to watch all the Slavoj Zizek videos currently posted on YouTube [because we realize students are not professional marathoners, we believe there will be sufficient time to actually accomplish this assignment]. A final exam focused on actually reading Zizek is optional.

I’m sure there must be other possibilities.  I’d love to hear of them.

[True story: in writing this blog post just now I googled “extreme humanities” and came up with several Indian sites for hair weaves made of real human hair; I kid you not. Judging from the website I looked at, it appears there’s an unnerving desire for “virgin human hair.” I had not realized this was a consideration in the baldness-management industry. “Extreme higher education,” more grimly, starts out with several pages of mostly news stories focusing on extreme cuts to higher education.]

Is it irresponsible to advise undergraduates to major in the Humanities?

I am not usually given to screeds about the press. I advised the newspaper here at Messiah College for several years, I sponsored a recent overhaul of our journalism curriculum, and I continue to have broad if now somewhat indirect responsibility for student media here at the college. And, secretly, in my heart of hearts I think we need a lot more professors in the humanities looking for ways to have second careers in journalism, communicating directly with the public in accessible terms about the thorny difficulties of their work. So I appreciate journalists; I think they have a hard job that is mostly underappreciated. The only world worse than a world with a free press is a world without one.

That having been said, today’s piece in the NY Times by Frank Bruni is opinion, and it strikes me as thoughtless opinion, mostly just sounding the cant notes about a liberal arts education that are increasingly becoming the common nonsense of the American public at large. Although I agree with Bruni that a great deal needs to be done to address the job prospects and job preparation of American college students, the wisdom in his prescriptions is scant and would likely result in an educational program less helpful to students, not more. Says Bruni, after lamenting the job prospects of anthropology and philosophy majors (there are hordes of them out there, have you noticed):

I single out philosophy and anthropology because those are two fields — along with zoology, art history and humanities — whose majors are least likely to find jobs reflective of their education level, according to government projections quoted by the Associated Press. But how many college students are fully aware of that? How many reroute themselves into, say, teaching, accounting, nursing or computer science, where degree-relevant jobs are easier to find? Not nearly enough, judging from the angry, dispossessed troops of Occupy Wall Street.

The thing is, today’s graduates aren’t just entering an especially brutal economy. They’re entering it in many cases with the wrong portfolios. To wit: as a country we routinely grant special visas to highly educated workers from countries like China and India. They possess scientific and technical skills that American companies need but that not enough American students are acquiring.

via The Imperiled Promise of College – NYTimes.com.

I can’t get past the irony that Bruni was an English major in college and has a degree in journalism. Real growth industries. I realize that’s an ad hominem, but frankly, Frank ought to know better.

My overriding concern is that these bromides about channeling students into areas where there supposedly will be jobs rest on multiple assumptions that are shaky at best, sand at worst.

First, it is terribly misguided to believe that what a student thinks they want when they are 17–or what we think they ought to want–is an adequate index of what they will want or what they will succeed at. College is first and foremost a time of exploration and development, a time of discovery. Most students change their major after entering college, most end up doing something after college that is not directly related to their fields of study, and most will change fields and go in different directions during the course of their lives. When I was 17, I thought I wanted to be a lawyer and possibly go into politics; I began as a major in history and political science, then shifted to English because I enjoyed the classes more. I had a conversion experience at the hands of T. S. Eliot, William Faulkner, and Joe McClatchey–not the poet, but the Victorianist at Wheaton College, where I did my undergraduate work, and the best teacher I ever had–and decided to go for a degree in creative writing in the hopes that I would be the next Walker Percy. It wasn’t until my second year of graduate school that I decided I loved higher education and wanted a PhD, and it wasn’t until I was nearly 50 that I decided administration could be a noble calling. (Others still doubt it.) A long way from my early dream of being a congressman or a senator, a long way from the dream of being a William Faulkner or a Hemingway.

Second, it is irresponsible to believe that we can know what the hot jobs will be in 2020, much less 2030 or 2040, despite our prognostications. Five years ago finance majors were littering the coffee shops of Camp Hill (OK, there’s only one), having graduated from the best colleges only to end up back home living with their parents. In my own case, I am very sure that whatever the hot jobs were in the 1970s, novelist and senator were not among them. But whatever we thought they were, I’m sure they aren’t all that hot any more. We do not, in fact, know what turns the economy will take, though we can know that we need students who are broadly educated, in whom creativity has been inculcated and encouraged, and who possess the flexibility and the skills that can be adapted to a rapidly changing job environment. There’s nothing about majoring in philosophy or anthropology that prevents students from having that kind of “portfolio”–indeed, those majors do much to produce the skills students will need, and in combination with general education and elective choices that develop their skills and knowledge base in a technical field or in business, such a student could be extremely desirable for a wide range of jobs in the economy of the future.

Third, WHY PICK ON PHILOSOPHY? It makes up less than one half of one percent of all college majors in the country, and anthropology not too many more. Does Bruni really believe this is a solution to our economic difficulties? GET RID OF PHILOSOPHY MAJORS. There’s a political slogan with legs and an economic program with brio. Why even pick on humanities majors as a whole? Depending on which set of majors you take up, they make up between 8 and 12 percent of the nationwide student population and have for a very long time. Their unemployment rates are somewhat higher than the nation’s as a whole–though not so drastic as the doomsayers suggest–but there are so many fewer of us that it is laughable to believe the unemployment problem is going to be solved by getting those who remain to drop what they are doing and become unhappy engineers. Bruni was an English major, so I will forgive his weaknesses with statistics.

Finally, is it really healthy for the nation to believe that we are going to be better off creating an educational system in which all students are wedged into jobs for which they are ill-suited, for which they have no personal gifts or desires, and through which they have fewer and fewer options? Is this really what education for a free society will look like? When I was young, we decried the fact that the Soviet Union forced students into narrow frames of life in the name of Soviet five-year plans. We now do this in the name of markets and call it “incentivizing.”

It is not irresponsible to believe that colleges should do more to prepare students for the job market that will await them, but it is irresponsible to believe we will solve the problems facing students by forcing them all into preprofessional or technical majors. Indeed, if I can be forgiven one more point, it is bizarre that Bruni thinks a student’s portfolio is made up of his or her college major. A student brings, or ought to bring, an entire panoply of experiences associated with college life, in and outside the classroom, and through internships and other forms of learning as well. Believing that we can solve students’ problems by channeling them into a major demonstrates a poor understanding of both how education works and how the job market works.

We need to do better, but doing what Bruni suggests would be doing disaster, not doing better. We need to remember, first, to paraphrase Andy Chan down at Wake Forest, that we are in the education business, not the job-placement business, even if students getting jobs is important. We are not a job shop; we are a culture and community that touches the whole of students’ lives–including their preparation for a career after college. When we are at our best, that is for their individual good and their individual portfolios, but also for the good of the nation as a whole–not just its economy. That is what responsible education looks like.

Is Twitter Destroying the English language?

Coming out of the NITLE seminar on Undergraduate Research in Digital Humanities, my title question was one of the more interesting questions on my mind. Janis Chinn, a student at the University of Pittsburgh, posed it as a motivation for her research on shifts in linguistic register on Twitter. I’m a recent convert to Twitter and see it as an interesting communication tool, but also an information network aggregator. I don’t really worry about whether Twitter is eroding my ability to write traditional academic prose, but then, I’ve inhabited that prose for so long it’s more the case that I can’t easily adapt to the more restrictive conventions of Twitter. And while I do think students are putting twitterisms in their papers, I don’t take this as specifically different from the familiar tendency of students to use speech patterns as the basis for constructing their papers without recognizing the different conventions of academic prose. So Twitter poses some interesting issues, but not issues that strike me as different in kind from other kinds of language use.

I gather from the website for her project that Janis is only at the beginning of her research and hasn’t developed her findings yet, but it looks like a fascinating study.  Part of her description of the work is as follows:

Speakers shift linguistic register all the time without conscious thought. One register is used to talk to professors, another for friends, another for close family, another for one’s grandparents. Linguistic register is the variety of language a speaker uses in a given situation. For example, one would not use the same kind of language to talk to one’s grandmother as to your friends. One avoids the use of slang and vulgar language in an academic setting, and the language used in a formal presentation is not the language used in conversation. This is not just a phenomenon in English, of course; in languages like Japanese there are special verbs only used in honorific or humble situations and different structures which can increase or decrease the politeness of a sentence to suit any situation. This sort of shift takes place effortlessly most of the time, but relatively new forms of communication such as Twitter and other social media sites may be blocking this process somehow.

In response to informal claims that the current generation’s language is negatively affected by modern communication tools like Twitter, Mark Liberman undertook a brief analysis comparing the inaugural addresses of various Presidents. This analysis can be found on University of Pennsylvania‘s popular linguistics blog “Language Log”. Remarkably, he found a significant trend of shortening sentence and word lengths over the last 200 years. My research, while not addressing this directly, will demonstrate whether using these services affects a user’s ability to shift linguistic registers to match the situation as they would normally be expected to.

Fascinating question in and of itself. I think on some level I’ve always been deeply aware of these kinds of shifts. As a kid, when my parents were missionaries in New Guinea, I would speak with an Aussie accent while I was with the kids at the school across the valley, while shifting back into my Okie brogue on the mission field and in my house. And as I made my way into academe, my southernisms and southwesternisms gradually dropped away, with a very few exceptions–aware as I was that my accent somehow did not signal intelligence and accomplishment. Mockery of southern white speech remains a last bastion of prejudice in the academy generally. I don’t think these are the kinds of register shifts Janis is looking at, but it’s the same territory.

I’m also more interested in the general motive questions. If we could prove that Twitter inhibited the ability to shift registers, would that count as destroying or damaging the language in some sense? If we could demonstrate that Twitter was leading people to use shorter and shorter sentences–or to be less and less able to comprehend sentences longer than 160 characters–would this signal an erosion of the language? We must have some notion that language can be used in more and less effective ways, since we are all very aware that communication can fail abysmally or succeed beyond our hopes, and usually ends up somewhere in between. Does the restricted nature of Twitter limit or disable some forms of effective communication while simultaneously enabling others? These are interesting questions. I’m sure more intelligent people than I am are working on them.

Takeaways–NITLE Seminar: Undergraduates Collaborating in Digital Humanities Research

Yesterday afternoon at 3:00, about 30 Messiah College humanities faculty and undergraduates gathered to listen in on and virtually participate in the NITLE Seminar focusing on Undergraduates Collaborating in Digital Humanities Research. A number of our faculty and students were tweeting the event, and a Storify version with our contributions can be found here. I am amazed and gratified to have such a showing late on a Friday afternoon. Students and faculty alike were engaged and intrigued by the possibilities they saw being pursued in undergraduate programs across the country, and our own conversation afterwards extended for more than a half hour beyond the seminar itself. Although most of us freely admit that we are only at the beginning and feeling our way, there was broad agreement that undergraduate research and participation in Digital Humanities work was something we needed to keep pushing on.

If you are interested in reviewing the entire seminar, including the chat-room questions and the like, you can connect through this link. I had to download WebEx in order to participate in the seminar, so you may need to do the same, even though the instructions I received said I wouldn’t need to. My own takeaways from the seminar were as follows:

  • Undergraduates are scholars, not scholars-in-waiting. If original scholarship is defined as increasing the fund of human knowledge, discovering and categorizing and interpreting data that helps us better understand human events and artifacts, and developing tools that can be employed by other scholars who can explore, confirm, disconfirm, or extend your findings, then these young people are scholars by any definition.
  • Digital Humanities research extends (and, to be sure, modifies) our traditional ways of doing humanities work; it does not oppose them. None of these young scholars felt inordinate tension between their traditional humanities training and their digital humanities research. A student who reviewed a database of 1,000 Russian folk tales extended and modified the understanding she had arrived at by close reading of a dozen. Digital Humanities tools enable closer reading and better contextual understanding of the poet Agha Shahid Ali, rather than pushing students away into extraneous material.
  • Many or most of these students learned their tools as they went along, within the context of what they were trying to achieve. I was especially fascinated that a couple of the students had had no exposure to Digital Humanities work prior to their honors projects, and they learned the coding and digital savvy they needed along the way. Learning tools in the context where they are needed makes more and more sense to me. You would not teach a person how to use a hammer simply by giving them a board and nails, at least not if you don’t want them to get bored. Rather, give them something to build, and show them, or have them figure out, how the hammer and nails will help them get there.

I’m looking forward to the Places We’ll Go.