Tag Archives: digital humanities

Teaching Humanities to digital natives who may know more than we do.

I remember a story about the advent of the New Criticism in which one of those famous critic/scholar/teachers–I forget which one, but I want to say Cleanth Brooks or perhaps John Crowe Ransom–admitted to rushing home at night to read feverishly ahead in the texts he was teaching so that he was ready to go the following day.  On the one hand, this is a familiar story to any new (or not so new) professor who's trying to stay one step ahead of the onrushing train.  On the other hand, it's also the case that part of this was demanded by the fact that Brooks and others were trying to do something totally new for a literature classroom: the close, perspicacious reading whose minutest detail nevertheless resulted miraculously in a coherent organic whole.  That kind of textual analysis was the meat of my own education, and to be honest, it hasn't really changed all that much despite all the new (and now not so new) theories that came in with the advent of deconstruction and its descendants.  We still, more or less, on the undergraduate level do the close reading, even if we now look for the way things fall apart or for hints and allegations of this or that cultural depravity.

But I am intrigued by just how hard Brooks/Ransom (or whoever it was) had to work to stay ahead of his students, in part because he really didn't know entirely what he was doing.  He wasn't building on the secure corpus of knowledge that previous literary scholastics had received and passed on.  Despite the mythic and quasi-priestly status that some New Critics projected–turning the critic into an all-knowing seer, and thus setting the stage for the later assertions that critics were really the equals or superiors of the novelists and poets they read and critiqued, knowing what those poor souls could only allude to and evoke–there was a very real sense in which the New Criticism was much more democratic than the literary scholasticism that preceded it.  (I am sure Frank Lentricchia is exploding about now, or would be if he ever actually bothered to read me.)  While it may not have seemed more democratic, given that the New Critics cast a mysterious aura about all they did, developing a new and arcane ritual language to accompany it, it was more democratic in the sense that the method was potentially available to everyone.  Not everyone could have the time to read all the histories and all the letters, delve into the archives, and read the vast quantities of literature required for the literary scholasticism that characterized old-style literary history.  But everyone could read the poem or the novel set in front of them.  And potentially a smart undergraduate could see a good deal that the prof had missed, or point out the problems in particular interpretations.  When the evidence of the poem was simply the poem itself, all the cards were on the table.  No longer could a professor say to the quivering undergraduate, "Well, yes, but if you had bothered to read x, y, and z, you would understand why your assertions about this poem's place in literary history are totally asinine."  The average undergraduate is never in a place to dispute with a professor over the place of this or that figure in literary history, but they could, in fact, argue that a professor had gotten a poem wrong, that an interpretation didn't hold up to a closer scrutiny of the facts.  The feverish late-night work of my Brooks/Ransom avatar, like the feverish late-night work of many a new and not so new professor, is sometimes cast as a noble devotion to truth, knowledge, or the discipline.  It is, in truth, very often the quest to avoid embarrassment at the hands of our smarter undergraduates, the quest for just enough knowledge or just enough preparation to justify our authority in the eyes of our skeptical younger charges.

I was thinking about this again while attending the Re:Humanities undergraduate DH conference at Swarthmore/Bryn Mawr/Haverford Thursday and Friday. Clearly, one of the biggest challenges to bringing DH fully on board in Humanities disciplines is the simple fact that undergraduates often know as much, and often a great deal more, about the tools we are trying to employ.  On the one hand, this is a tremendous challenge to mid-career academics who understandably have little interest in abandoning the approaches to scholarship, teaching, and learning that they have developed, that they understand, and that they continue to use effectively given the assumptions and possibilities of those tools as they are.  It was ever thus, and to some degree colleges remain always one step behind the students they are attempting to educate, figuring out on the fly how our own education and experience can possibly apply in this day and hour.

However, I also wonder whether the democratization of the technological environment in the classroom isn't a newly permanent state of affairs.  The pace of technological change–at least for the present, and why would we assume it will slow in the near or intermediate future?–means that we are entering a period in the history of education in which educators will, in some sense, never know more about the possibilities of the tools they are using than do the students they are teaching.  Indeed, given the nature of the tools, it is quite likely that collectively the students know a great deal more about how to use the tools available to them, and that they will be attuned more quickly to the latest technological developments.  What they don't know–and what we as educators don't know either–is how best to deploy those resources to do different kinds of humanistic work.  The teleology of learning used to be fairly, if undemocratically, straightforward.  The basic educational goal was to learn how to do what your teacher could do–with reading, with texts, with research.  In our current age that teleology is completely, perhaps appropriately, disrupted.  But that doesn't alleviate the sense that we don't know entirely what we should be teaching our students to do when we don't entirely know what to do or how to do it ourselves.

Mortimer Adler famously wrote a book called "How to Read a Book," and though people bemoaned Adler as an elitist and a snob, the basic idea was still important: some people knew how to read books and others did not.  I still think it's the case that we take a tremendous amount for granted if we assume an undergraduate actually knows how to read an old-fashioned codex well.  They don't.  On the other hand, we have no equivalent book that tells us "how to read…," in part because we don't know how to fill in the blank, though perhaps "digital artifacts" comes as close as anything.  We're not even sure what tools we should be using to do whatever it is we are doing as humanists in this day and age.  No wonder most professors choose to continue to use books, even though I think the day is fast approaching when students won't tolerate that, any more than an ancient reader would have tolerated the continued use of scrolls when a perfectly good codex was ready at hand.  What the current technological changes are doing is radically democratizing the classroom on the level of the tool.

I did have a couple of signs of hope this past week at the Re:Humanities conference at Swarthmore. In the first place, if the educational system in the humanities is becoming radically democratized at the level of structure, I think it is safe to say there are many, many, many people using that democracy well.  The students at the conference were doing stunningly good and creative work that was clearly contributing to our knowledge of the world around us–sometimes pursuing these projects independently but most often in partnership and mentoring relationships with committed faculty.  (It is, of course, also the case that people can use democracy poorly, as I've suggested elsewhere; this would be true in both the classroom and the body politic, so we should ask whether and where the democratization of our educational system is being used well, rather than assuming that because we use the word democracy we have named a substantive good.)

Second, one of the chief insights I drew from the different speakers was that if we put the tools on the table as possibilities, students will surprise and amaze us with what they manage to come up with.  What if we found ways to encourage students to get beyond the research paper and asked that they do serious creative and critical work with the tools they have at hand every day on their iPhones, laptops, and so on?  What if we encouraged them to find the best ways to answer the kinds of questions humanists have always asked, and to identify the new questions and potential answers that new (and now not so new) technologies make possible?  We will have to do this regardless, I think.  The age demands it.  And I suspect that there are many, many more frantic late nights for faculty ahead.  But I think those frantic late nights will be built less and less on the belief that we have to get on top of "the material" and "stay ahead" of our students.  When they can bring in material we've never heard of with the touch of a finger on their iPhones, we have no hope of being on top of the material or staying ahead in any meaningful sense.  Perhaps what we can do is inspire them to charge ahead, guide them to the edges of the landscape that we already know, and partner with them in the exploration of the landscapes that we haven't yet discovered.

Digital History of the Brethren in Christ; writing in an age of distraction

I've been spending the last couple of days at the undergraduate conference on the digital humanities at Swarthmore College, getting a feel for what might be possible at the undergraduate level. Yesterday's keynote by Alexandra Juhasz, whose book Learning from YouTube created a splash a couple of years ago, emphasized that we now have to write in an environment in which we know people will be distracted, and that it may not be a feasible goal to overcome their distraction. Her own work is trying to account for what that might mean for multimedia writing, whether by scholars or undergraduates. What does it mean to write for the distracted, knowing that they will remain distracted? I don't quite have my brain around that rhetorical situation yet.

I've especially been excited to see the real-world possibilities for digital humanities projects and to imagine the kinds of things our undergraduates might do. My colleague John Fea over at The Way of Improvement Leads Home directed my attention to a great new digital history site on the history of the Brethren in Christ and evangelicals by one of our graduates from Messiah College, Devin Manzullo-Thomas. Devin is now a graduate student in digital history at Temple. I'd love to see our undergraduates working with our faculty on a project like this.

Michael Hart and Project Gutenberg

I felt an unaccountable sense of loss at reading tonight in the Chronicle of Higher Education (paper edition, no less) that the founder of Project Gutenberg, Michael Hart, has died at the age of 64.  This is a little strange since I had no idea who had founded Project Gutenberg until I read the obituary.  But Project Gutenberg I know, and I think I knew immediately when I ran across it that it was already, and would continue to be, an invaluable resource for readers and scholars, even though I've never been much of a champion of e-books.  I guess it felt a bit like belatedly discovering the identity of a person who had given me a great gift, without ever having had the chance to thank him.

One aspect of Hart's vision for Project Gutenberg struck me in relation to some of the things I've been thinking about with the Digital Humanities: Hart's decision to go with something simple and nearly universal as an interface rather than trying to be sexy, with-it, and up to date.  Says the Chronicle:

His early experiences clearly informed his choices regarding Project Gutenberg. He was committed to lo-fi—the lowest reasonable common denominator of textual presentation. That was for utterly pragmatic reasons: He wanted his e-texts to be readable on 99 percent of the existing systems of any era, and so insisted on “Plain-Vanilla ASCII” versions of all the e-texts generated by Project Gutenberg.

That may seem a small—even retro—conceit, but in fact it was huge. From the 80s on, as the Internet slowly became more publicly manifest, there were many temptations to be “up to date”: a file format like WordStar, TeX, or LaTeX in the 1980s, or XyWrite, MS Word, or Adobe Acrobat in the 90s and 2000s, might provide far greater formatting features (italics, bold, tab stops, font selections, extracts, page representations, etc.) than ASCII. But because Mr. Hart had tinkered with technology all his life, he knew that “optimal formats” always change, and that today’s hi-fi format was likely to evolve into some higher-fi format in the next year or two. Today’s ePub version 3.01 was, to Mr. Hart, just another mile marker along the highway. To read an ASCII e-text, via FTP, or via a Web browser, required no change of the presentational software—thereby ensuring the broadest possible readership.

Mr. Hart’s choice meant that the Project Gutenberg corpus—now 36,000 works—would always remain not just available, but readable. What’s more, it has been growing, in every system since.

This is no small thing.  The ephemeral character of digital humanities projects bothers me.  By ephemeral I don't mean they are intellectually without substance; I think the intellectual character of the work can be quite profound.  However, the forms in which the work is done can disappear or be outdated tomorrow.  Hart's decision to use ASCII is in some sense an effort to replicate the durability of the book.  Books, for all the fragility of paper, have a remarkable endurance and stability overall.  The basic form doesn't change, and a book used by a medieval reader is, more or less, still usable by me in the same fashion.  By contrast, I can't even open some of my old files in my word processor.  I think the work I did was substantial, but the form it was placed in was not enduring.  Hart's decision makes sense to me, but I'm not sure how it might be extended to other kinds of projects in the digital humanities.
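Hart's wager is easy to see in practice: the only "presentational software" a Plain-Vanilla ASCII e-text requires is whatever can read and print a string. A minimal sketch in Python (the URL is illustrative only; Project Gutenberg's file layout has varied over the years):

```python
# A plain-text e-text needs no format library, parser, or renderer:
# a language's most basic I/O is already a complete "reader."
from urllib.request import urlopen

# Illustrative Project Gutenberg plain-text URL (Pride and Prejudice);
# the site's exact file paths have changed over time.
URL = "https://www.gutenberg.org/files/1342/1342-0.txt"

with urlopen(URL) as response:
    text = response.read().decode("utf-8", errors="replace")

# The same bytes would have been readable over FTP in 1991 and will be
# readable by whatever replaces today's tools.
print(text[:500])
```

Whether a comparable lowest reasonable common denominator exists for maps, databases, and visualizations is exactly the question the rest of the digital humanities still has to answer.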

Uncreative Writing: Kenneth Goldsmith and Liz Laribee on Originality in the Digital Age

Professors have endless angst over the new possibilities for plagiarism and other forms of intellectual property theft in the digital age.  But according to Kenneth Goldsmith in the Chronicle Review, such anxiety misses the point that we long ago entered a new age of uncreative creativity, a fact to be celebrated rather than lamented since it points to our having gotten beyond simplistic, romantic or modernist notions of the creative individual.  Of course, Goldsmith is promoting his new book, which I guess he would take to be some kind of act of creation and for which I'm guessing he will gain his portion of individual profits—though if he wants to share the profits with all those from whom his ideas derive in an uncreative fashion, I'm sure they will oblige.

My snarky comment aside, I think there's something to Goldsmith's ideas, encapsulated in his title "It's Not Plagiarism. In the Digital Age, It's 'Repurposing.'"  As Goldsmith puts it:

The prominent literary critic Marjorie Perloff has recently begun using the term “unoriginal genius” to describe this tendency emerging in literature. Her idea is that, because of changes brought on by technology and the Internet, our notion of the genius—a romantic, isolated figure—is outdated. An updated notion of genius would have to center around one’s mastery of information and its dissemination. Perloff has coined another term, “moving information,” to signify both the act of pushing language around as well as the act of being emotionally moved by that process. She posits that today’s writer resembles more a programmer than a tortured genius, brilliantly conceptualizing, constructing, executing, and maintaining a writing machine.

Perloff’s notion of unoriginal genius should not be seen merely as a theoretical conceit but rather as a realized writing practice, one that dates back to the early part of the 20th century, embodying an ethos in which the construction or conception of a text is as important as what the text says or does. Think, for example, of the collated, note-taking practice of Walter Benjamin’s Arcades Project or the mathematically driven constraint-based works by Oulipo, a group of writers and mathematicians.

Today technology has exacerbated these mechanistic tendencies in writing (there are, for instance, several Web-based versions of Raymond Queneau’s 1961 laboriously hand-constructed Hundred Thousand Billion Poems), inciting younger writers to take their cues from the workings of technology and the Web as ways of constructing literature. As a result, writers are exploring ways of writing that have been thought, traditionally, to be outside the scope of literary practice: word processing, databasing, recycling, appropriation, intentional plagiarism, identity ciphering, and intensive programming, to name just a few.
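The Queneau reference is worth pausing over, because the arithmetic is the whole point: ten sonnets of fourteen lines each, with line i of any sonnet substitutable for line i of any other (Queneau built the source sonnets so that any combination still rhymes and scans), yields 10^14, a hundred thousand billion, possible poems. A minimal sketch of that combinatorial "writing machine" in Python (the placeholder lines are mine, not Queneau's):

```python
import random

# Queneau's constraint: 10 source sonnets of 14 lines each, where line i
# of any sonnet can stand in for line i of any other. That gives 10**14
# distinct sonnets. The corpus below is a stand-in for Queneau's text.
corpus = [[f"sonnet {s + 1}, line {i + 1}" for i in range(14)]
          for s in range(10)]

def random_sonnet():
    """Assemble one of the 10**14 possible poems, line by line."""
    return [random.choice(corpus)[i] for i in range(14)]

print(f"{10 ** 14:,} possible poems")  # 100,000,000,000,000
print("\n".join(random_sonnet()))
```

The author's contribution, on this model, is the constraint; the assembling is left to the machine, which is just what Perloff's writer-as-programmer figure suggests.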

I really do think there is something to this notion that there is a mark of "creativity"—sanitized or put under erasure (to use that hoary old theoretical term) by the quotation marks—in the ways in which we appropriate and redeploy sources from other areas of the internet.  We create personae through citation, quotation, sharing, and commentary rather than through creative acts that spring fully formed from our minds and imagination: what we choose to cite, how we choose to comment on it, whom we share it with, what other citations we assemble together with it in a kind of linguistic collage.  On one level this is old stuff, as Goldsmith points out, stretching back to a particular strand of modernism and even beyond.  Indeed, to go with a different reference to Benjamin, the figure of the storyteller is one who is best understood under the sign of repetition and appropriation, retelling stories that take on new meanings through their performance within particular contexts, rather than creating novel stories that exist on the page in the effort to create their own context.

[Image: Liz Laribee, "Good behavior is the proper posture of the weak." (or, Jamaica Kincaid)]

I'm reminded in this of some of the work of my friend and former student Liz Laribee, whose art, assembled out of leftovers, I find visually provocative and surprisingly moving.  About her work, Liz says the following:

My work almost always involves the repurposing of something else, and it's in this process that I am trying to find meaning. Here, I used discarded bits and overlooked scraps of this bookstore to continue telling stories. The authors I've chosen are layered in my life in ways I can't even quite tell you about. The dime novel poems force a new meaning to make room for a cheekier, sleuthier past.

I'm not exactly sure what Liz means by a cheekier, sleuthier past, but what I take from it is that detritus, the schlocky stuff our commercial culture seems to vomit out and then shovel into a corner, is not something to be lamented so much as to be viewed as an opportunity: an occasion for a new kind of creativity that takes the vacuous surfaces of that commercial culture and creates a surprising visual and emotional depth.

Goldsmith thinks we are still too captive to old ways of doing things, and that writing and literature have descended into irrelevance as a result.  He advocates the development of a writing machine that moves us beyond the cult of personality and intended effect and into a realm of fortuitous and occasional affect.  Students need to be forced, he thinks, not to be original in the old sense, but to be repetitive and to find whatever newness there is through this act of what Liz calls "repurposing."  Goldsmith again:

All this, of course, is technology-driven. When the students arrive in class, they are told that they must have their laptops open and connected. And so we have a glimpse into the future. And after seeing what the spectacular results of this are, how completely engaged and democratic the classroom is, I am more convinced that I can never go back to a traditional classroom pedagogy. I learn more from the students than they can ever learn from me. The role of the professor now is part party host, part traffic cop, full-time enabler.

The secret: the suppression of self-expression is impossible. Even when we do something as seemingly “uncreative” as retyping a few pages, we express ourselves in a variety of ways. The act of choosing and reframing tells us as much about ourselves as our story about our mother’s cancer operation. It’s just that we’ve never been taught to value such choices.

After a semester of my forcibly suppressing a student’s “creativity” by making her plagiarize and transcribe, she will tell me how disappointed she was because, in fact, what we had accomplished was not uncreative at all; by not being “creative,” she had produced the most creative body of work in her life. By taking an opposite approach to creativity—the most trite, overused, and ill-defined concept in a writer’s training—she had emerged renewed and rejuvenated, on fire and in love again with writing.

Having worked in advertising for many years as a “creative director,” I can tell you that, despite what cultural pundits might say, creativity—as it’s been defined by our culture, with its endless parade of formulaic novels, memoirs, and films—is the thing to flee from, not only as a member of the “creative class” but also as a member of the “artistic class.” At a time when technology is changing the rules of the game in every aspect of our lives, it’s time for us to question and tear down such clichés and reconstruct them into something new, something contemporary, something—finally—relevant.

I think there is something to this, although I doubt traditional novels and stories will disappear or should, any more than the writing of novels did away with storytelling in the old sense in any absolute way.  But I do think we need to think through, and not only in creative writing classes, what we might mean in encouraging our students to come up with their own original ideas, their personal arguments.

How might this notion change what we are doing, recognizing that we are in a period in which creative work, either artistic or academic, is primarily an act of redeploying, distributing, and remaking, rather than being original in the old sense of that word?

More observations on getting started with Digital Humanities

On the train home after a good day in Philly. The afternoon sessions focused a good deal on project management. David and I both agreed that in some respects it was a session that could have been presented at any kind of meeting whatsoever and didn't seem particularly geared toward digital humanities issues. However, I do think it is germane to humanists simply because it goes back to the whole issue of working collaboratively. With very few exceptions, we are more or less trained to work alone in our humanities disciplines; indeed, we glorify, support, and materially reward working alone, as I suggested in my last post on THATCamp Philly. So I think it is helpful for humanists to think through very basic things: what it takes to plan a project, to run a meeting, to break a project down into workable parts, to keep people running on schedule, and to make people actually collaborate instead of setting off on their own (and instead of just providing them with information or telling them what to do–i.e., holding non-collaborative meetings). These are all issues that do not come naturally to humanists, and in many respects I think I am only figuring them out now after seven years as a department chair and three years as a dean. These skills are essential if digital humanities requires collaborative work in order to function at any kind of high level.

Among the very simple gains from the day was an introduction to the possibilities of Google Docs. This comes as no surprise to the rest of the world, I'm sure, but I really have not moved out of the structured environs of an office software suite and/or a learning management system. In my very brief exposure, Google Docs made these ways of doing work seem really quite clunky and inconvenient, though I haven't actually tried to work with Google Docs at this point. I really want to figure out a way of conducting some of my basic work in this environment, with shared documents in the cloud. We need to be having upside-down meetings in some respects–where a lot of the work gets done in virtual or distanced environments so that face-to-face meetings can be used for other kinds of high-level issues. I'm not sure where to begin, but I want to experiment a little more personally and then see if there's any way of incorporating it into some of the meetings I'm responsible for as a dean.

David and I both agreed that we were terribly out of our league when it came to understanding some of the basic language and references to what was going on. We are both disappointed that we won't be able to come back tomorrow, since we both intuited in some sense that it would be better to gain this knowledge by actually plunging in and trying to make sense of what people are doing on actual projects, rather than trying to fill in all the background before we ever get started. If I'm right that this is in great part about gaining facility in a language, I think both David and I arrived at the conclusion that it would be better to learn by immersion rather than believing that we should first learn all the grammatical rules. In that sense, maybe there is no right place to start, and we just have to have a practical goal: decide where we would like to get in the landscape and start walking there. We'll figure out what we need as we go along.

A couple of final notes:

I am getting too old to get up at 4:00 after going to bed at 11:30.

Trying to keep up with Facebook while also listening to a lecture does not work, no matter what my students say.

Humanities and the workplace: or, bodysurfing the tsunami.

As I suggested in my last post on the demise of Borders, book lovers have lived in an eternal tension between the transcendent ideals their reading often fosters and the commercial realities upon which widespread literacy has depended. The same tension is broadly evident in the Humanities' response to professional programs, or just more broadly to the question of career preparation. We are not wrong to say that an education in history or English is much more than career preparation; nor are we wrong to insist that a college education has to be about much more than pre-professional training. (Not least because most college graduates end up doing things a long distance from their original preparation, and we ought to see that the humanities in combination with other knowledges in the arts and sciences are at least as good as, and perhaps even better than, narrower practical preparations at readying students for the twists and turns of their eventual careers.)

However, we are absolutely wrong to assume that questions of career are extraneous, or ought to be secondary, to our students or to our thinking about how we approach curricula.

Daniel Everett, dean of arts and sciences at Bentley University, offers a provocative reflection on the need to integrate the humanities into professional education. According to Everett:

“Programs that take in students without proper concern for their future or provision for post-graduate opportunities — how they can use what they have learned in meaningful work — need to think about the ethics of their situation. Students no longer come mainly from the leisured classes that were prominent at the founding of higher education. Today they need to find gainful employment in which to apply all the substantive things they learn in college. Majors that give no thought to that small detail seem to assume that since the humanities are good for you, the financial commitment and apprenticeship between student and teacher is fully justified. But in these cases, the numbers of students benefit the faculty and particular programs arguably more than they benefit the students themselves. This is a Ponzi scheme. Q.E.D.”

These are harsh words, but worth considering. I tend not to like Bentley's particular solutions, to the degree that they reduce the humanities to an enriching complement to the important business of, well, business. However, I do think we need to think of new ways of creating our majors that will prepare students for the realities of 21st-century employment. Majors that allowed for concentrations in digital humanities would prepare students to engage the changing nature of our disciplines while also gaining technical skills that could serve them well in business. New joint programs with the sciences, like those found in medical humanities programs, could prepare students in new ways for work in the health care industry. Everett warns of what may happen if humanities programs don't creatively remake themselves to meet the changing needs of our contemporary world:

“If, like me, you believe that the humanities do have problems to solve, I hope you agree that they are not going to be solved by lamenting the change in culture and exhorting folks to get back on course. That’s like holding your finger up to stop a tidal wave. Thinking like this could mean that new buildings dedicated to the humanities will wind up as mausoleums for the mighty dead rather than as centers of engagement with modern culture and the building of futures in contemporary society.”

Again, I don't like all of the particular responses Everett has advocated, but I do agree that there is a problem to be addressed, one that continued proclamations about transferable skills are unlikely to solve. What is sometimes called the applied humanities may be a way of riding the wave rather than being drowned by it.