Category Archives: technology

Is hierarchy the problem? Higher Education Governance and Digital Revolutions

In the Digital Campus edition of the Chronicle, Gordon Freeman makes the now-common lament that colleges and universities are not taking the kind of advantage of cloud technology that would enable superior student learning outcomes and more effective and efficient teaching. In doing so he lays the blame at the feet of the hierarchical nature of higher ed in a horizontal world.

Higher-education leaders, unlike the cloud-based companies of Silicon Valley, do not easily comprehend the social and commercial transformation gripping the world today. Indeed, there was a certain amount of gloating that the centuries-old higher-education sector survived the dot-com era. After all, textbooks are still in place, as are brick and mortar campuses.

The simple fact is that life is becoming more horizontal, while colleges remain hierarchical. We can expect the big shifts in higher education—where the smart use of digitization leads to degrees—to come from other countries.

And that’s sad, because the United States makes most of the new technologies that other parts of the world are more cleverly adapting, especially in education.

(via Instapaper)

I appreciate the metaphor of hierarchy versus horizontality, and I think it’s seductive, but I wonder if it’s accurate. It captures an American view of the world that imagines the bad guys as authoritarian and hierarchical and the good guys as democratic and dialogical.

Whether or not the horizontal crowd can ever do evil is a post for another day. I’m more interested in whether the slowness of higher ed to take up new modes of doing business is actually due to hierarchy. A colleague at a national liberal arts college that shall not be named told me that his president gave strong support to digital innovation in teaching and research, but that the faculty as a whole was slow on the uptake, which made institution-wide change difficult to achieve.

This rings true to me as an administrator. Change is slow not because we are too hierarchical, but because we take horizontality to be an inviolable virtue. Faculty are largely independent operators with a lot of room to implement change, or not, as they choose. Institution-wide change in educational programming takes not one big decision, but a thousand small acts of persuasion and cajoling. I mostly think this is as it should be. It is the difficult opposite edge of academic freedom. It means that changing the direction of an institution is like changing the direction of an aircraft carrier, and doing so without an admiral who can set the course by fiat. There are problems with that, but I’m not sure the changes required to make higher ed as nimble as Gordon Freeman desires would result in an educational system we’d like to have.

Is it irresponsible to advise undergraduates to major in the Humanities?

I am not usually given to screeds about the press. I advised the newspaper here at Messiah College for several years, I sponsored a recent overhaul of our journalism curriculum, and I continue to have broad if now somewhat indirect responsibility for student media here at the college. And, secretly, in my heart of hearts, I think we need a lot more professors in the humanities pursuing second careers in journalism, communicating directly with the public in accessible terms about the thorny difficulties of their work. So I appreciate journalists, and I think they have a hard job that is mostly underappreciated. The only thing worse than a world with a free press would be a world without one.

That having been said, today’s piece in the NY Times by Frank Bruni is opinion, and it strikes me as thoughtless opinion, mostly just sounding the cant notes about a liberal arts education that are increasingly becoming the common nonsense of the American public at large. Although I agree with Bruni that a great deal needs to be done to address the job prospects and job preparation of American college students, the wisdom in his prescriptions is scant and would likely result in an educational program less helpful to students, not more. Says Bruni, after lamenting the job prospects of anthropology and philosophy majors (there are hordes of them out there, have you noticed?):

I single out philosophy and anthropology because those are two fields — along with zoology, art history and humanities — whose majors are least likely to find jobs reflective of their education level, according to government projections quoted by the Associated Press. But how many college students are fully aware of that? How many reroute themselves into, say, teaching, accounting, nursing or computer science, where degree-relevant jobs are easier to find? Not nearly enough, judging from the angry, dispossessed troops of Occupy Wall Street.

The thing is, today’s graduates aren’t just entering an especially brutal economy. They’re entering it in many cases with the wrong portfolios. To wit: as a country we routinely grant special visas to highly educated workers from countries like China and India. They possess scientific and technical skills that American companies need but that not enough American students are acquiring.

via The Imperiled Promise of College – NYTimes.com.

I can’t get past the irony that Bruni was an English major in college and has a degree in journalism. Real growth industries. I realize that’s an ad hominem, but frankly, Frank ought to know better.

My overriding concern is that these bromides about channeling students into areas where jobs supposedly await rest on multiple assumed grounds that are shaky at best, sand at worst.

First, it is terribly misguided to believe that what a student thinks they want at 17 (or what we think they ought to want) is an adequate index of what they will want or what they will succeed at. College is first and foremost a time of exploration and development, a time of discovery. Most students change their major after entering college, most end up doing something after college that is not directly related to their fields of study, and most will change fields and go in different directions over the course of their lives. When I was 17 I thought I wanted to be a lawyer and possibly go into politics. I began as a major in History and Political Science, then shifted to English because I enjoyed the classes more. I had a conversion experience at the hands of T.S. Eliot, William Faulkner, and Joe McClatchey (not the poet, but the Victorianist at Wheaton College, where I did my undergraduate work, and the best teacher I ever had) and decided to pursue a degree in creative writing in the hopes that I would be the next Walker Percy. It wasn’t until my second year of graduate school that I decided I loved higher education and wanted a PhD, and it wasn’t until I was nearly 50 that I decided administration could be a noble calling. (Others still doubt it.) A long way from my early dream of being a congressman or a senator, a long way from the dream of being a William Faulkner or a Hemingway.

It is secondarily irresponsible to believe that we can know what the hot jobs will be in 2020, much less 2030 or 2040, despite our prognostications. Five years ago Finance majors were littering the coffeeshops of Camp Hill (ok, there’s only one), having graduated from the best colleges only to be back home living with their parents. In my own case, I am very sure that whatever the hot jobs were in the 1970s, novelist and senator were not among them. But whatever we thought they were, I’m sure they aren’t all that hot any more. We do not, in fact, know what turns the economy will take, though we do know that we need students who are broadly educated, in whom creativity has been inculcated and encouraged, and who possess the flexibility and skills that can be adapted to a rapidly changing job environment. There’s nothing about majoring in philosophy or anthropology that prevents students from having that kind of “portfolio”–indeed, those majors do much to produce the skills students will need, and in combination with a general education and elective choices that develop their skills and knowledge base in a technical field or in business, such students could be extremely desirable for a wide range of jobs in the economy of the future.

Thirdly, WHY PICK ON PHILOSOPHY?  It makes up less than one half of one percent of all college majors in the country, and anthropology majors number not many more. Does Bruni really believe this is a solution to our economic difficulties?  GET RID OF PHILOSOPHY MAJORS.  There’s a political slogan with legs and an economic program with brio. Why even pick on humanities majors as a whole?  Depending on which set of majors you count, they make up between 8 and 12 percent of the nationwide student population, and have for a very long time. Their unemployment rates are somewhat higher than those of the nation as a whole–though not so drastic as the doomsayers suggest–but there are so many fewer of us that it is laughable to believe the unemployment problem will be solved by getting those who remain to drop what they are doing and become unhappy engineers. Bruni was an English major, so I will forgive his weaknesses with statistics.

Finally, is it really healthy for the nation to believe that we will be better off creating an educational system in which all students are wedged into jobs for which they are ill suited, for which they have no personal gifts or desires, and through which they have fewer and fewer options?  Is this really what education for a free society will look like?  When I was young, we decried the fact that the Soviet Union forced students into narrow frames of life in the name of Soviet five-year plans. We now do this in the name of markets and call it “incentivizing.”

It is not irresponsible to believe that colleges should do more to prepare students for the job market that will await them, but it is irresponsible to believe we will solve the problems facing students by forcing them all into preprofessional or technical majors. Indeed, if I can be forgiven one more point, it is bizarre that Bruni thinks a student’s portfolio is made up solely of his or her college major. A student brings, or ought to bring, an entire panoply of experiences associated with college life, in and outside the classroom, and through internships and other forms of learning as well. Believing that we can solve students’ problems by channeling them into a major demonstrates a poor understanding of both how education works and how the job market works.

We need to do better, but doing what Bruni suggests would be courting disaster, not doing better. We need to remember first, to paraphrase Andy Chan down at Wake Forest, that we are in the education business, not the job-placement business, even if students getting jobs is important. We are not a job shop; we are a culture and community that touches the whole of students’ lives–including their preparations for a career after college. When we are at our best, that is for their individual good and their individual portfolios, but also for the good of the nation as a whole–not just its economy. That is what responsible education looks like.

Is Twitter Destroying the English Language?

Coming out of the NITLE seminar on Undergraduate Research in Digital Humanities, my title question was one of the more interesting questions on my mind. Janis Chinn, a student at the University of Pittsburgh, posed this question as a motivation for her research on shifts in linguistic register on Twitter. I’m a recent convert to Twitter and see it as an interesting communication tool, but also an information network aggregator. I don’t really worry about whether Twitter is eroding my ability to write traditional academic prose, but then, I’ve inhabited that prose for so long that it’s more the case that I can’t easily adapt to the more restrictive conventions of Twitter. And while I do think students are putting twitterisms in their papers, I don’t take this as specifically different from the tendency of students to use speech patterns as the basis for constructing their papers without recognizing the different conventions of academic prose. So Twitter poses some interesting issues, but not issues that strike me as different in kind from other uses of language.

I gather from the website for her project that Janis is only at the beginning of her research and hasn’t developed her findings yet, but it looks like a fascinating study.  Part of her description of the work is as follows:

Speakers shift linguistic register all the time without conscious thought. One register is used to talk to professors, another for friends, another for close family, another for one’s grandparents. Linguistic register is the variety of language a speaker uses in a given situation. For example, one would not use the same kind of language to talk to one’s grandmother as to your friends. One avoids the use of slang and vulgar language in an academic setting, and the language used in a formal presentation is not the language used in conversation. This is not just a phenomenon in English, of course; in languages like Japanese there are special verbs only used in honorific or humble situations and different structures which can increase or decrease the politeness of a sentence to suit any situation. This sort of shift takes place effortlessly most of the time, but relatively new forms of communication such as Twitter and other social media sites may be blocking this process somehow.

In response to informal claims that the current generation’s language is negatively affected by modern communication tools like Twitter, Mark Liberman undertook a brief analysis comparing the inaugural addresses of various Presidents. This analysis can be found on the University of Pennsylvania’s popular linguistics blog “Language Log”. Remarkably, he found a significant trend of shortening sentence and word lengths over the last 200 years. My research, while not addressing this directly, will demonstrate whether using these services affects a user’s ability to shift linguistic registers to match the situation as they would normally be expected to.
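The statistics Liberman tracked are easy to approximate. Here is a minimal sketch (my own illustration, not Liberman’s actual code, and the sample passages below are invented placeholders, not real inaugural addresses) of how one might compute average sentence length and average word length for a passage of prose:

```python
import re

def sentence_word_stats(text):
    """Return (avg words per sentence, avg characters per word) for a passage."""
    # Split on terminal punctuation; keep only non-empty sentences.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    avg_sentence_len = len(words) / len(sentences)
    avg_word_len = sum(len(w) for w in words) / len(words)
    return avg_sentence_len, avg_word_len

# Compare an ornate older style with a terse modern one.
ornate = ("It is with humility, and with a profound sense of the obligations "
          "which attend this office, that I assume the duties entrusted to me.")
terse = "We face hard times. We will meet them. We will not fail."
print(sentence_word_stats(ornate))
print(sentence_word_stats(terse))
```

Run over two centuries of addresses rather than two toy strings, a downward drift in both numbers is the sort of trend Liberman reported.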

Fascinating question in and of itself. I think on some level I’ve always been deeply aware of these kinds of shifts. As a kid, when my parents were missionaries in New Guinea, I would speak with an Aussie accent while I was with kids at the school across the valley, then shift back into my Okie brogue on the mission field and in my house. And as I made my way into academe, my southern- and southwesternisms gradually dropped away, with very few exceptions–aware as I was that my accent somehow did not signal intelligence and accomplishment. Mockery of southern white speech remains a last bastion of prejudice in the academy generally. I don’t think these are the kinds of register shifts Janis is looking at, but it’s the same territory.

I’m also interested in the more general motivating questions. If we could prove that Twitter inhibited the ability to shift registers, would that count as destroying or damaging the language in some sense?  If we could demonstrate that Twitter was leading people to use shorter and shorter sentences–or to be less and less able to comprehend sentences longer than 160 characters–would this signal an erosion of the language?  We must have some notion that language can be used in more and less effective ways, since we are all very aware that communication can fail abysmally or succeed beyond our hopes, and usually ends up somewhere in between. Does the restricted nature of Twitter limit or disable some forms of effective communication while simultaneously enabling others?  These are interesting questions. I’m sure people more intelligent than I am are working on them.

Takeaways–NITLE Seminar: Undergraduates Collaborating in Digital Humanities Research

Yesterday afternoon at 3:00, about 30 Messiah College humanities faculty and undergraduates gathered to listen in on and virtually participate in the NITLE Seminar focusing on Undergraduates Collaborating in Digital Humanities Research. A number of our faculty and students were tweeting the event, and a Storify version with our contributions can be found here. I am amazed and gratified to have such a showing late on a Friday afternoon. Students and faculty alike were engaged and interested by the possibilities they saw being pursued in undergraduate programs across the country, and our own conversation afterwards extended for more than a half hour beyond the seminar itself. Although most of us freely admit that we are only at the beginning and feeling our way, there was broad agreement that undergraduate research and participation in Digital Humanities work was something we needed to keep pushing on.

If you are interested in reviewing the entire seminar, including chat-room questions and the like, you can connect through this link. I had to download WebEx in order to participate in the seminar, so you may need to do the same, even though the instructions I received said I wouldn’t need to. My own takeaways from the seminar were as follows:

  • Undergraduates are scholars, not scholars in waiting.  If original scholarship is defined as increasing the fund of human knowledge, discovering and categorizing and interpreting data that helps us better understand human events and artifacts, developing tools that can be employed by other scholars who can explore and confirm or disconfirm or further your findings, these young people are scholars by any definition.
  • Digital Humanities research extends (and, to be sure, modifies) our traditional ways of doing humanities work; it does not oppose them. None of these young scholars felt inordinate tension between their traditional humanities training and their digital humanities research. A student who reviewed a database of 1,000 Russian folk tales extended and modified the understanding she had arrived at by the close reading of a dozen. Digital Humanities tools enable closer reading and better contextual understanding of the poet Agha Shahid Ali, rather than pushing students away into extraneous material.
  • Many or most of these students learned their tools as they went along, within the context of what they were trying to achieve. I was especially fascinated that a couple of the students had had no exposure to Digital Humanities work prior to their honors projects; they learned the coding and digital savvy they needed as they went along. Learning tools in the context in which they are needed makes more and more sense to me. You would not teach a person how to use a hammer simply by handing them a board and nails–at least not if you don’t want them to get bored. Rather, give them something to build, and show them (or have them figure out) how the hammer and nails will help them get there.

I’m looking forward to the Places We’ll Go.

More Undergraduate Research in the Digital Humanities

This afternoon the School of the Humanities at Messiah College will be connecting to the NITLE Symposium on undergraduate work in the Digital Humanities. Messiah College is currently considering making the development of undergraduate research, and especially collaborative research between faculty and students, a central theme of our next strategic plan. Like many colleges and universities across the country, we see undergraduate research as a way of deepening students’ learning and engagement with their education, while also providing more and better skills for life after college.

The push toward student research has some detractors–Andrew DelBanco and Geoffrey Galt Harpham among them–but I’ll blog at some other time about my disagreement with them on liberal arts grounds. I’ve been on record before as to how I think Digital Humanities is a (or THE) way to go with this effort within my own disciplines. I was glad to receive the video below from Adeline Koh at Richard Stockton College, chronicling the achievements of the RE:Humanities conference at Swarthmore. It is a nice overview of the conference. If you look closely and don’t blink, there are a couple of shots of my colleague Larry Lake, and one of me apparently typing away distractedly on my iPad. Although perhaps I was tweeting, and thereby achieving a transcendent level of attention and interaction without really having to listen.  🙂

This afternoon, the School of the Humanities here at Messiah College will consider further whether and how the Digital Humanities might apply to our situation by participating in the NITLE symposium on this topic at 3:00.

What is the Digital Humanities and Where can I get some of it?

As I’ve started listening in on Digital Humanities conversations over the past 12 to 18 months–and especially in the last three or four months, as I’ve gotten more fully onto Twitter and understood its potential for academics–I’ve realized that I am mostly just a bobbing rubber duck in a great wave of ignorant interest in Digital Humanities. “What is this thing, Digital Humanities, and where can I get some of it?” seems to be the general hue and cry, and I’ve added my own voice to the mix. Some of that wave is no doubt driven by the general malaise that seems to be afflicting the humanistic ecosystem, as mid-career academics look back with rose-colored nostalgia to the culture wars of the ’80s, when our classes were full and our conflicts were played out in middle-brow journals so we could feel self-important. Maybe digital humanities will make us relevant again and keep our budgets from getting cut.

On the other hand, I think most people recognize that many different aspects of digital humanities practice seem to coalesce and provide responses to driving forces in academe at the moment: our students’ need for technical proficiency, the increasingly porous border between distance and brick-and-mortar instruction, the need to connect effectively with the public while maintaining high standards of academic rigor, the need for our students to be involved in “real world” experiential learning, the effort to provide opportunities for serious and original undergraduate research, and the need to support collaborative forms of learning.

I have been terribly impressed with the generosity of Digital Humanities folks in responding to these repeated pleas to “Show me how to do like you. Show me how to do it.”  There are a lot of different things I’ve discovered over the past year, and many more that have been put out. The most recent is a bibliographic blog compiled by Matthew Huculak. As with any bibliography, it is an act of interpretation. There are inclusions I’ve not seen elsewhere–Franco Moretti’s book on data in literary studies was on the list and is now on mine. [Though I admit this is at least as much because I had Moretti as a prof while a grad student at Duke, the same semester I completed my first essay ever on a Macintosh computer.]  The blog skews toward the literary–and I have increasingly realized that while there is a strong discourse of solidarity among digital humanists, the traditional disciplinary divisions still play a strong role in much of the practical work that is actually done. DH’ers have conferences together, but it’s not always clear that they work together on projects across their humanistic disciplines. There are also obvious omissions (I thought the new Journal of the Digital Humanities should have made the list).

My larger concern, though, is that I’m beginning to feel there may actually be a glut of introductory materials–so many different possible things to do at the beginning that it is impossible to point to a place and say, “this is where to start.”  To some degree, on this score, the Digital Humanities are reflecting what Geoffrey Harpham has indicated is a basic feature of the Humanities in general.

In a great many colleges and universities, there is no “Intro to English” class at all, because there is no agreement among the faculty on what constitutes a proper introduction to a field in which the goals, methods, basic concepts, and even objects are so loosely defined, and in which individual subjective experience plays such a large part. This lack of consensus has sometimes been lamented, but has never been considered a serious problem.

Geoffrey Galt Harpham. The Humanities and the Dream of America (p. 101). Kindle Edition.

The details don’t quite apply. I’m not entirely sure individual subjective experience is at the heart of DH work, and that is one of the biggest bones of contention with DH work as it has been received in English departments (see my reflections on Fish and his comments on DH in yesterday’s post). But I do think the general pragmatic feel of humanities departments, where you can begin almost anywhere–in stark contrast to the methodical and even rigid approach to immersing students in the STEM disciplines–may be characteristic of DH as well. Start where you are and get where you want to go.

In reading Harpham, I was reminded of one of Stanley Fish’s essays (which one I forget) in which he argues that the best way to introduce students to literary criticism is not to give them a theory, but to give them examples and say, “Go thou and do likewise.”  I’m increasingly feeling this is the case for the Digital Humanities. Figure out what seems appealing to you, and then figure out what you have to figure out so you can do like that.

Distanced and Close Reading in literary study: Metaphors for love

I am old enough now to begin sentences with the phrase “I am old enough…”  Seriously, though, I am old enough now to feel like I have lived through one revolution, into a new orthodoxy, and now the experience of a new revolution in literary studies. In the ongoing debates I hear about the digital humanities versus whatever other kind of humanities happens to be at hand, I keep having this vertiginous sense of deja vu, as if I’m hearing the same arguments I heard two decades ago, but transposed into a key just different enough that I can’t tell whether today’s debates are mere variations on a theme or some genuinely new frame of discourse.

The song that I think remains the same is the divide between the proponents of what gets called “distanced reading,” which in some hands is a shorthand for all things digital humanities (if it’s digital, it must be distanced, as compared to the human touch of paper, ink, and typewriters–how the industrial period came to be the sign and symbol of all things human and intimate I am not entirely clear), and close reading, which is somehow taken to be THE form of intimate human contact with the text.

This division is exemplified in Stanley Fish’s recent essay on the digital humanities in the New York Times, an argument that has the usual whiff of caustic Fishian insight, leavened with what I take to be a genuine if wary respect for what he sees in the practices of distanced reading. Nevertheless, for Fish, it is finally close reading that is genuinely the work of the humane critic devoted to intimacy with the text:

But whatever vision of the digital humanities is proclaimed, it will have little place for the likes of me and for the kind of criticism I practice: a criticism that narrows meaning to the significances designed by an author, a criticism that generalizes from a text as small as half a line, a criticism that insists on the distinction between the true and the false, between what is relevant and what is noise, between what is serious and what is mere play. Nothing ludic in what I do or try to do. I have a lot to answer for.

Ironically, in an earlier period it was Fish and precisely this kind of close reading (as practiced by deconstructionists) that was decried for its lack of seriousness, for the way it removed literature from the realm of human involvement and into the play of mere textuality. By contrast, the distanced readers of those days imagined themselves as defenders of humanity (or, since humanism was a dirty word, at least defenders of the poor, the downtrodden, the miserable, the huddled masses). Historicism read widely and broadly in the name of discourse, and proclaimed itself a liberating project, ferreting out the hidden political underbelly in a multitude of texts and considering literary criticism to be an act of responsible justice-seeking over and against the decadent jouissance-seekers of post-structuralism.

A recent blog post by Alex Reid takes up this same criticism of what he describes as the Close Reading industry, arguing for the ways digitization can free us from the tyranny of the industrialized close reader:

In the composition classroom, the widgets on the belt are student papers. If computers can read like people it’s because we have trained people to read like computers. The real question we should be asking ourselves is why are we working in this widget factory? And FYC essays are perhaps the best real world instantiation of the widget, the fictional product, produced merely as a generic example of production. They never leave the warehouse, never get shipped to market, and are never used for anything except test runs on the factory floor. 

In an earlier period, it was again the close readers who were accused of being mechanistic, dry, and scientific, as putatively more humanistic readers accused the New Critics of an unfeeling scientism in their formalist attitude toward the text, cutting out every human affect in the quest for a serious and scientific study of literature.

I wonder, at root, whether this is the controlling metaphor, the key in which all our tunes in literary and cultural studies are played: a quest for the human that is not merely scientific, joined to an unrepressed desire for the authority of the scientist–to say things with security, to wear the mantle of authority that our culture apparently believes only a statistical method can endow.

It is probably a mark against my character that I tend to be a both/and pragmatist as a thinker. I do not buy the notion that distanced reading is inconsequential, or somehow less about truth or less serious than the close rhetorical readings that Fish invokes. At the same time, I am not much given to the euphoric and pugnacious challenges that can sometimes characterize digital humanities responses to the regnant forms of literary criticism. At their best, Fishian forms of close reading are endowed not simply with acute attention, but with attention that seems to give birth to a form of wisdom that only attentiveness and close examination can provide–the kind of insistent close reading that led Gerard Manley Hopkins to seek the “inscape” of individual instances beyond categories, rather than simply the ways in which individuals fit into the vast landscapes popular in his post-romantic period.

I was reminded again of this need to attend to the close properties of an individual use of language in a recent article on Chaucer in the Chronicle. The writer attends to the detail of Chaucer’s language in a way that seems to reveal something important about the ways in which we are human.

translating Chaucer is like translating any other foreign language: The words are different from one language to the next. And then comes the third category, the most fascinating and the most aggravating because it is the trickiest: the false cognates, words that look like they should mean what they do in Modern English, but don’t. False cognates are especially aggravating, and fascinating when they carry their Middle and Modern English meanings simultaneously. These are exciting moments, when we see, through a kind of linguistic time-lapse photography, Chaucer’s language on its way to becoming our own.

In Middle English, for instance, countrefete means “to counterfeit,” as in “to fake,” but it also has the more flattering meaning of “to imitate.” Corage has not only the Modern English sense of bravery but also, frequently, overtones of sexual energy, desire, or potency. Corage takes its roots from the word coeur, or “heart,” and transplants them slightly southward. The same is true for solas, or “solace.” The “comfort,” “satisfaction,” or “pleasure” it entails is often sexual.

Lust might seem to pose no problem for the modern reader. Yet in the 14th century, the word, spelled as it is today, could mean any kind of desire or pleasure, though around that time it was beginning to carry a sexual connotation, too. And lest it seem as if false cognates always involve sex, take sely, or “silly.” It most often means “blessed” or “innocent,” as well as “pitiful” and “hapless,” but “foolish” was making its way in there, too.

A sentence like “The sely man felte for luste for solas” could mean “The pitiful man felt desire for comfort.” It could just as likely mean: “The foolish man felt lust for sex.” In Chaucer’s hands, it could mean both at once.

Chaucer was fully aware of the slipperiness of language. He delights in it; he makes his artistic capital from it. He is an inveterate punster. The Wife of Bath, for example, repeatedly puns on the word queynte (eventually the Modern English “quaint”). In the 14th century, the word means not only “curious” or “fascinating” but also the curious part of her female anatomy that most fascinates her five husbands. What’s more, the slipperiness of language gives Chaucer the tools to form his famous irony and ambiguity. If the way-too-pretty Prioress is “nat undergrowe” (“not undergrown”), how big is she?

(via Instapaper)

These kinds of particularities of language are the worthy objects of our attention as literary scholars.  At the same time, I do not think we need say that distanced reading plays no role in our understanding of such peculiarities.  A Chaucer project on the order of the Homer Multitext might actually deepen and multiply our understanding of Chaucer’s slipperiness and originality.  Similarly, vast database-driven analyses of every text written within a hundred years of Chaucer might allow us to discover the kinds of linguistic sources he was drawing on and manipulating anew for his own purposes; they might show us new creativities we had not imagined, or they might show us that things we had taken to be unique were fairly common stock-in-trade.
These kinds of knowledges could not be derived from a contest between methods, but only from a reading marked by attentiveness, skill and desire, one willing to draw on any resource to understand what one wishes to know, which used to be a metaphor for love.
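To gesture at what such a database-driven comparison might involve at its very simplest, here is a hypothetical sketch in Python: it counts how often a word such as *sely* occurs per thousand words in two toy samples, the kind of frequency comparison that, scaled up to a real corpus of Middle English texts, might begin to show whether a usage is distinctively Chaucerian or common stock. The samples and function names are my own illustrative inventions, not part of any actual Chaucer project.

```python
import re
from collections import Counter

def word_frequencies(text):
    """Count frequencies of lowercased alphabetic words in a text."""
    words = re.findall(r"[a-z]+", text.lower())
    return Counter(words)

def relative_rate(counts, word):
    """Occurrences of `word` per 1,000 words, guarding against empty texts."""
    total = sum(counts.values())
    return 1000 * counts[word] / total if total else 0.0

# Toy stand-ins for a Chaucer sample and a contemporary corpus.
chaucer_sample = "the sely man felte for luste for solas and the sely wyf"
contemporary_sample = "the man felte luste and the wyf felte solas"

chaucer_rate = relative_rate(word_frequencies(chaucer_sample), "sely")
contemporary_rate = relative_rate(word_frequencies(contemporary_sample), "sely")
print(chaucer_rate, contemporary_rate)
```

A real study would of course need a properly normalized corpus (Middle English spelling varies wildly), but the underlying move, comparing relative frequencies across bodies of text, is the same.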

Deep and Wide: Katharine Brooks on Becoming a T-shaped Professional

Earlier today I blogged on the need for humanities students to become more literate in science, technology, and mathematics, both to pursue the ideal of a well-rounded liberal arts education and, very pragmatically, to prepare themselves for the world of work. Katharine Brooks (UT-Austin), one of the keynoters at the Rethinking Success conference at Wake Forest, takes up the same point in a different manner in her reflections on the need for job candidates coming out of college to present themselves as T-shaped individuals, persons with deep knowledge of one or two areas and broad knowledge of several.

According to those in the talent-seeking field, the most sought-after candidates for management, consulting, research, and other leadership positions are T-shaped. The vertical stem of the T is the foundation: an in-depth specialized knowledge in one or two fields. The horizontal crossbar refers to the complementary skills of communication (including negotiation), creativity, the ability to apply knowledge across disciplines, empathy (including the ability to see from other perspectives), and an understanding of fields outside your area of expertise.

Organizations need workers with specialized knowledge who can also think broadly about a variety of areas, and apply their knowledge to new settings. Since T-shaped professionals possess skills and knowledge that are both broad and deep, developing and promoting your T-shaped talent may be the ticket to your career success now and in the future.

The term “T-shaped” isn’t new: it’s been in use since the 1990s but mostly in consulting and technical fields. Several companies, including IDEO and McKinsey & Company, have used this concept for years because they have always sought multidisciplinary workers who are capable of responding creatively to unexpected situations. Many companies have also developed interdisciplinary T teams to solve problems.

Dr. Phil Gardner at Michigan State University, who researches and writes regularly on recruiting trends, has been researching the concept of the T-shaped professional and the T-shaped manager. At the recent Rethinking Success Conference at Wake Forest University, Dr. Gardner described the ideal job candidate as a “liberal arts student with technical skills” or a “business/engineering student with humanities training”— in other words, a T-shaped candidate. Dr. Gardner is currently developing a guide for college students based on this concept. He notes that “while the engineers are out in front on this concept – every field will require T professional development.”

As my post earlier today suggested, I think this kind of approach to things is absolutely crucial for graduates in humanities programs, and we ought to shape our curricula–both within the majors and in our general education programs– in such a way that we are producing students confident in and able to articulate the ways in which their education and experiences have made them both deep and broad.
If I can take a half step back from those assertions and plunge in another direction, I will only point out that this particular formulation may let my brethren in the technical fields off the hook a little too easily.  If it is the case that engineers are leaders in this area, the notion of breadth entailed may be a fairly narrow one, limited to the courses that students are able to get in their general education curriculum.
My colleague Ray Norman, the Dean of our School of Science, Engineering and Health, has talked with me on more than one occasion about how desirable his engineering graduates are because they have had several courses in the humanities.  I am glad for that, but I point out to him that it is absolutely impossible for his engineering graduates to even minor in a field in my area, much less dream of a double major.  About a decade ago, when I was chair of the English department, I went to a colleague who has since vacated the chair of the engineering department, asking if we could talk about ways to encourage some interchange between our departments, such that I could encourage my majors to take more engineering or other technical courses, and he could encourage his engineers to minor in English.  He was enthusiastic but also regretful.  He’d love to have my English majors in his program, but he couldn’t send his engineers my way;  the size of the engineering curriculum meant it was absolutely impossible for his students to take anything but the required courses in general education.
I don’t hold this against my colleagues; they labor under accreditation standards and national expectations in the discipline.  But I do think it raises again important questions about what an undergraduate education is for, questions explored effectively in Andrew Delbanco’s recent book.  Should undergraduate programs be so large and so professionally oriented that students are unable to take a minor or possibly a double major?  Whistling into the wind, I say they should not.  Breadth and depth should not mean the ability to know ONLY one thing really well;  it ought to mean knowing AT LEAST one thing really well, and a couple of other things pretty well, and several other things generally well.
Oddly enough, it is far easier for liberal arts students to achieve this richer kind of breadth and depth, if they only will.  A major in history, a minor in sociology, a second minor in information sciences, a couple of internships working on web-development at a museum, a college with a robust general education program.  There’s a T-shape to consider.
[Side note:  It was Camp Hill Old Home Week at Wake Forest and the Rethinking Success conference last week.  At a dinner for administrators and career officers hosted by Jacquelyn Fetrow, Dean of the College of Arts and Sciences, Jacque and Katharine Brooks discovered they’d both grown up in Camp Hill, and both within a half dozen blocks of where I now live.  Small world, and Camp Hill is leading it :-)]

Do Humanities Programs Encourage the Computational Illiteracy of Their Students?

I think the knee-jerk and obvious answer to my question is “No.”  I think if humanities profs were confronted with the question of whether their students should develop their abilities in math (or more broadly in math, science and technology), many or most would say Yes.  On the other hand, I read the following post from Robert Talbert at the Chronicle of Higher Ed.  It got me thinking just a bit about how and whether we in the humanities contribute to an anti-math attitude among our own students, if not in the culture as a whole.

I’ve posted here before about mathematics’ cultural problem, but it’s really not enough even to say “it’s the culture”, because kids do not belong to a single monolithic “culture”. They are the product of many different cultures. There’s their family culture, which as Shaughnessy suggests either values math or doesn’t. There’s the popular culture, whose devaluing of education in general and mathematics in particular ought to be apparent to anybody not currently frozen in an iceberg. (The efforts of MIT, DimensionU, and others have a steep uphill battle on their hands.)

And of course there’s the school culture, which is itself a product of cultures that are out of kids’ direct control. Sadly, the school culture may be the toughest one to change, despite our efforts at reform. As the article says, when mathematics is reduced to endless drill-and-practice, you can’t expect a wide variety of students — particularly some of the most at-risk learners — to really be engaged with it for long. I think Khan Academy is trying to make drill-and-practice engaging with its backchannel of badges and so forth, but you can only apply so much makeup to an inherently tedious task before learners see through it and ask for something more.

via Can Math Be Made Fun? – Casting Out Nines – The Chronicle of Higher Education.

This all rings pretty true to me.  There are similar versions of this in other disciplines.  In English, for instance, students unfortunately can easily learn to hate reading and writing through what they imbibe from popular culture or through what they experience in the school system.  For every hopeless math geek on television, there’s a reading geek to match.  Still and all, I wonder whether we in the humanities combat and intervene in the popular reputation of mathematics and technological expertise, or whether we just accept it, and in fact reinforce it.

I think, for instance, of the unconscious assumption that there are “math people” and “English people”;  that is, there’s a pretty firmly rooted notion that people are born with certain proclivities and abilities, and that there is no point in addressing deficiencies in your literacy in other areas.  More broadly, I think we apply this to students, laughing in knowing agreement when they talk about coming to our humanities disciplines because they just weren’t math people or science people, or groaning together in the faculty lounge about how difficult it is to teach our general education courses to nursing students or to math students.  As if our own abilities were genetic.

In high school I was highly competent in both math and English, and this tendency wasn’t all that unusual for students in the honors programs.  On the other hand, I tested out of math and never took another course in college, and none of my good humanistic teachers in college ever challenged me or asked me to question that decision.  I was encouraged to take more and different humanities courses (though, to be frank, my English teachers were suspicious of my interest in philosophy), but being “well-rounded” and “liberally educated” seems in retrospect to have been largely a matter of being well-rounded in only half of the liberal arts curriculum.  Science and math people were well-rounded in a different way, if they were well-rounded at all.

There’s a lot of reason to question this, not least that if our interests and abilities were genetic, we would have to conclude that the gene pool has surged massively toward the STEM side of the equation, if enrollments in humanities majors are any judge.  I think it was Malcolm Gladwell who recently pointed out that genius has a lot less to do with giftedness than it does with practice and motivation.  Put 10,000 hours into almost anything and you will become a genius at it (not entirely true, but the general principle applies).  Extrapolating, we might say that even if students aren’t going to be geniuses in math and technology, they could actually get a lot better at it if they’d only try.

And there’s a lot of reason to ask them to try.  At the recent Rethinking Success conference at Wake Forest, one of the speakers who did research into the transition of college students into the workplace pounded the table and declared, “In this job market you must either be a technical student with a liberal arts education or a liberal arts major with technical savvy.  There is no middle ground.”  There is no middle ground.  What became quite clear to me at this conference is that companies mean it when they say they want students with a liberal arts background.  However, it was also very clear to me that they expect those students to have technical expertise that can be applied immediately to job performance.  Speaker after speaker affirmed the value of the liberal arts.  They also emphasized the absolute and crying need for computational, mathematical, and scientific literacy.

In other words, we in the Humanities will serve our students extremely poorly if we accept their naive statements about their own genetic makeup, allowing them to proceed with a mathematical or scientific illiteracy that we would cry out against if the same levels of illiteracy were evident in others with respect to our own disciplines.

I’ve found, incidentally, in my conversations with my colleagues in information sciences, math, and the sciences, that many of them are much more conversant in the arts and humanities than I or my colleagues are in even the generalities of science, mathematics, or technology.  This ought not to be the case, and in view of that, a few of my colleagues and I are considering taking some workshops in computer coding with our information sciences faculty.  We ought to work toward creating a generation of humanists that does not perpetuate our own levels of illiteracy, for their own sake and for the health of our disciplines in the future.

Teaching Latin on an iPad: An experiment at Messiah College

An example of some of the things that Messiah College is trying to do in experimenting with digital technology in the classroom.  My colleague Joseph Huffman is more pessimistic than I about the promise of iPads and e-books, but I’m just glad we have faculty trying to figure it out.  See the full post at the link below.

You might not expect a historian of Medieval and Renaissance Europe to be among the first educators at Messiah College to volunteer to lead a pilot project exploring the impact of mobile technology—in this case, the iPad—on students’ ability to learn. But that’s exactly what happened.

Joseph Huffman, distinguished professor of European history, and the eight students in his fall 2011 Intermediate Latin course exchanged their paper textbooks for iPads loaded with the required texts, relevant apps, supplementary PDFs and a Latin-English dictionary. The primary goal was to advance the learning of Latin. The secondary goal was to determine whether the use of the iPad improved, inhibited or did not affect their ability to learn a foreign language.

Why Latin? “A Latin course is about as traditional a humanities course as one can find,” Huffman says. Because any foreign language course requires deep and close readings of the texts, studying how student learning and engagement are affected by mobile technology is especially provocative in such a classic course. In addition, Latin fulfills general language course requirements and, therefore, classes are comprised of students from a variety of majors with, perhaps, diverse experiences with mobile technologies like iPads.

One aspect of the experiment was to explore whether students would engage the learning process differently with an iPad than a textbook.

The assumption, Huffman admits, is that today’s students likely prefer technology over books. Huffman’s experiences with his Latin 201 course—comprised of five seniors, two sophomores and one junior—challenged that commonly held assumption.

via Messiah College: Messiah News – Messiah College Homepage Features » iPad experiment.