Category Archives: technology

The Words Matter: Digital Humanities Vocabulary 101

I always know that if the hits on my blog spike, it has less to do with anything I’ve said than with the fact that some good soul or souls out there have bothered to mention me on their own much better-read and more insightful blogs. This has happened several times with the far-flung folks in the Digital Humanities whom I mostly know only virtually through Twitter and other social media, as well as with my colleague John Fea over at The Way of Improvement Leads Home, whom I acknowledge both virtually and when I pass him in the hallway. Just today, Rebecca Davis over at NITLE wrote a blog post mentioning me and some of my own floundering engagement as an administrator trying to get my mind around what is happening in this field.

Just a word about Rebecca’s post. She’s begun a very interesting project, partly in response to the floundering of folks like me, designed to provide a glossary of terms to help beginners get a grip on things and begin to navigate the thickets of the Digital Humanities. She ran an experiment at a couple of conferences to get things started:

Normally, academics getting to know a new discipline would read about it before doing it. But, the ethos of doing in digital humanities is so strong, that THATCamps ask beginners to engage in doing digital humanities (more hack, less yack). To that end, at my last workshop (at the Institute for Pedagogy in the Liberal Arts hosted by Oxford College of Emory University), I came up with a strategy to let beginners do and help the digital humanities veterans be sensitive to specialist vocabulary. I asked my workshop participants to write down on a post-it note every term they heard from me or other workshop participants that they didn’t know. Then we added them to our workshop wiki and set about defining them. We accumulated quite a few terms (35 total). Although my co-teacher Sean Lind, Digital Services Librarian at Oxford, ended up contributing most of the definitions, I think the list was still useful as an indicator of terms veterans need to be prepared to define.

I repeated the experiment at THATCamp LAC 2012 by proposing a session on a digital humanities glossary and setting up a google doc for the glossary. I think that session happened, though I didn’t make it. Certainly terms were added to the doc throughout the THATcamp, with a final total of 28 terms.

Looking at this admittedly small sample, let me share some preliminary conclusions. There were only five terms that both lists shared (one of which I had contributed by initiating each list with the acronym DH):

  • Crowdsourcing
  • DH = Digital Humanities
  • Hashtag
  • Open Access (OA)
  • TEI = Text Encoding Initiative
I love this idea, love the activity, and I hope that Rebecca’s idea for a glossary takes off. The lists she’s come up with to start seem about right. I will say I ended my first THATCamp still entirely clueless about what TEI stood for, and I’m still not entirely sure I could define “XML” for anyone else, even though I think I know it when I see it. (In my defense, I actually did know what crowdsourcing, hashtag, and open access indicated, although I hadn’t the foggiest how you did any of them.)
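Since I’ve just confessed I can’t quite define these terms, here is my amateur attempt at a minimal sketch—an invented illustration, not drawn from any real project. XML is simply a way of wrapping labeled tags around text, and TEI is a shared vocabulary of such tags for the humanities. A barebones TEI document is a short header describing the file, plus the text itself:

    <TEI xmlns="http://www.tei-c.org/ns/1.0">
      <teiHeader>
        <fileDesc>
          <titleStmt>
            <title>Sonnet 18 (sample encoding)</title>
          </titleStmt>
          <publicationStmt>
            <p>Encoded for illustration only.</p>
          </publicationStmt>
          <sourceDesc>
            <p>Transcribed from a public-domain text.</p>
          </sourceDesc>
        </fileDesc>
      </teiHeader>
      <text>
        <body>
          <lg type="sonnet">
            <l n="1">Shall I compare thee to a summer’s day?</l>
            <l n="2">Thou art more lovely and more temperate:</l>
          </lg>
        </body>
      </text>
    </TEI>

Even a beginner can see the idea: the tags say what each piece of the text is (a line group, a line), and TEI standardizes the tag names so that projects can share and compare their texts.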
Regarding hacking and yacking, I am, so far, more of a digital humanities advocate than a digital humanities practitioner, a stance necessitated both by my ignorance and by my position as an administrator with too little time to read his email, much less pursue digital humanities projects. From this vantage as a facilitator, I feel a little reluctant to take sides, other than to say that words matter. Having the word for something gives you one way to act on the world. I’ve always been deeply moved by the section of The Autobiography of Malcolm X wherein he describes learning to read by reading the dictionary. This seems right. If you want to act in a world, learn its words; start to speak its language, even if at first you are only stringing nouns together into something that only vaguely resembles a sentence. Words become the necessary means of action. Thus, I think that Rebecca’s project will be a boon to those who are still at the stage of the DH language game where they are mostly pointing and grunting.

I started this post thinking I was going to write about intellectual generosity, how important it is and what it looks like when you find it. That will have to wait, but I will say I have appreciated the large-hearted generosity of the many folks in DH who know they are blazing a trail and are willing to lay out signposts and serve as guides to others on the path.

Hermeneutics of the stack and the list: unreading journals in print or online

The New York Times reported yesterday that The Wilson Quarterly will put out its final print issue in July (Wilson Quarterly to End Print Publication – NYTimes.com). The editorial staff seemed sanguine.

“We’re not going on the Web per se,” Steven Lagerfeld, the magazine’s editor, said in an interview. “We already have a Web site. The magazine will simply be published in a somewhat different form as an app,” first in the iTunes store and later on the Android platform.

And, to be honest, I’m sanguine too. Although I noted the demise of the University of Missouri Press with a half shudder last week, I have to admit that I don’t greet the demise of print journals with the same anxiety. I’ve recognized lately that I mostly buy paper journals so I can have access to their online manifestations, or because I feel guilty knowing that online long-form journalism and feature writing has yet to find a way to monetize itself effectively. I try to do my part by littering my office and bedroom with stacks and stacks of largely unopened New York Reviews, New Yorkers, Chronicles of Higher Ed, and a few other lesser-known magazines and specialist journals. But most of my non-book reading, long form or not, is done on my iPad.

I will leave aside the question of what we will do if good journals like the Wilson Quarterly really can’t survive on iTunes distribution (WQ only survived in paper because of the indulgence of the Woodrow Wilson International Center for Scholars). I’m more interested at the moment in the fact of the stack and what it signifies in the intellectual life. Every intellectual I know of is guilty of stockpiling books and journals that she never reads, and can never reasonably expect to, at least not if she has a day job. The stack is not simply a repository of knowledge and intellectual stimulation beckoning to the reader, drawing her away from other mundane tasks like reading or preparing for class with the ennobling idea of staying informed. (Side note: academia is the one place in life where every activity of daily life can be construed as tax deductible; just make a note about it and write “possible idea for future article” at the top of the page.)

No, the stack is also a signifier. It exists not so much to be read, since most academics give up hopelessly on the idea of reading every word of the journals they receive. The stack exists to be observed. Observed on the one hand by the academic herself, as a reassuring sign of one’s own seriousness, that one reads such things and is conversant with the big ideas, or at least the nifty hot ideas, about culture high and low. The stack also exists to be observed by others: the rare student who comes by during office hours, the dean who happens to drop by to say hello, the colleagues coming in to ask you out for coffee–“Oh, you already got the latest issue of PMLA!” The stack suggests you are up to date, or intend to be. The stack communicates your values. Which journal do you put strategically out at the edge of the desk to be observed by others, and which do you stack heedlessly on top of the file cabinet? Even the hopelessly disheveled office can signify, as did Derrida’s constantly disheveled hair: I am too busy and thinking too many big thoughts to be concerned with neatness.

The stack, like the Wilson Quarterly, is on its way out, at least for academics.  I realized four or five years ago that e-books would signify the end of a certain form of identification since people would no longer self-consciously display their reading matter in coffee houses or on subways, every text hidden in the anonymous and private cover of the Kindle or now the iPad.  While I could connect now with other readers in Tibet or Siberia, I could not say off-handedly to the college student sitting next to me–“Oh, you’re reading Jonathan Safran Foer, I loved that book!”

The stack too is going and will soon be gone.  Replaced now by the endless and endlessly growing list of articles on Instapaper that I pretend I will get back to.  This has not yet had the effect of neatening my office, but it will remove one more chance at self-display.  I will soon be accountable only for what I know and what I can actually talk about, not what I can intimate by the stacks of unread paper sitting on my desk.

Enjoy your summer reading.

Faster! Faster!: Technology, Recreation, and Being Human

A nice essay from novelist Graham Swift in the New York Times on the issues of reading, writing, speed, and leisure. A lot of what’s here is well-travelled ground, though travelled well again by Swift. I especially noted his sense of the way in which time-saving has become the means by which we are enslaved to time.

A good novel is like a welcome pause in the flow of our existence; a great novel is forever revisitable. Novels can linger with us long after we’ve read them — even, and perhaps particularly, novels that compel us to read them, all other concerns forgotten, in a single intense sitting. We may sometimes count pages as we read, but I don’t think we look at our watches to see how time is slipping away.

That, in fact, is the position of the skeptical nonreader who says, “I have no time to read,” and who deems the pace of life no longer able to accommodate the apparently laggard process of reading books. We have developed a wealth of technologies that are supposed to save us time for leisurely pursuits, but for some this has only made such pursuits seem ponderous and archaic. ­“Saving time” has made us slaves to speed.

via The Narrative Physics of Novels – NYTimes.com.

To some degree Swift is picking up on a perpetual conundrum in the advance of technology, a dialectic by which we pursue technological ends to make our lives easier, more convenient, less consumed by work, and more open to enrichment. Making onerous tasks more efficient has been the dream of technology from indoor plumbing to the washing machine to email. In short, we pursue technological means to make our lives more human.

And in some ways and places, we are able to achieve that end. Who would want, really, to live in the Middle Ages anywhere except in Second Life? Your expected life span at birth would have been about 30 years, compared to a global average today in the mid to upper 60s, and those 30 years would have been far more grinding and difficult than what most of the world experiences today (with, of course, important and grievous exceptions). You would likely have been hopelessly illiterate, cut off from even the possibility of entering a library (much less purchasing a handmade codex in a bookstore), and you would have had no means of being informed of what happened in the next valley last week, much less what happened in Beijing 10 minutes ago. It is little wonder that becoming a monk or a priest ranked high on the list of desirable medieval occupations. Where else were you guaranteed a reward in heaven, as well as at least some access to those things we consider basic features of our contemporary humanity–literacy, education, art, music, a life not under the dominion of physical labor? What we usually mean when we romanticize the ancient world (or for that matter the 1950s) is that we want all the fruits of our modern era without the new enslavements that accompany them.

At the same time, of course, our technological advances have often been promoted as a gift to humankind in general, but they have as readily been employed to advance a narrow version of human productivity in the marketplace. Our technologies facilitate fast communication; this mostly means that we are now expected to communicate more than ever, and they also raise expectations about just exactly what can get done. Technology vastly expands the range of information we can thoughtfully engage, but it increases the sense that we are responsible for knowing something about everything, instead of knowing everything about the few dozen books my great-grandparents might have had in their possession. One reason the vaunted yeoman farmer knew something about Shakespeare, could memorize vast expanses of the Bible, and could endure sermons and speeches that lasted for hours is because he didn’t have a Twitter feed. Nor did he have an Outlook calendar that becomes an endless to-do list generated by others.

I do think the novel, even in its e-book form, resists this need for speed.  On the other hand, it is worth saying that reading like this must be practiced like other things.  I find that when I take a couple of vacation days for a long weekend (like this weekend), it takes me about 2/3 of a day to slow down and relax and allow myself to pause.  Luckily, I can do this more readily with novels, even at the end of a hectic and too full day or week.  But that might be possible because I learned how to do it in another world, one without the bells and whistles that call for my attention through my multiple devices with their glowing LCDs.

Novel reading is a learned skill, and I wonder whether our students learn it well enough. Re-creation is a learned skill, one we need to be fully ourselves, and I do wonder whether we lose that capacity for pause in our speedy lives.

Revolution and Reformation in Higher Education: Anya Kamenetz’s DIY U

It’s a sign of the fast-changing times in higher education that I just finished reading Anya Kamenetz’s DIY U and it already feels just a little bit dated–not terribly so, since it is a kind of futurist fiction about higher education written in 2010–and I feel frustrated at the notion that great new ideas and books to consider are solving yesterday’s problems by the time I get around to them. The shelf life for this kind of thing seems to be about a year, and 2010 seems like an eon ago in both publishing and higher education. This is too bad, because I actually think there is some important ethical thinking about higher education going on in the book that gets obscured both by the speed of the author and the speed with which the educational times are leaving even this book behind.

A few examples: the term MOOC, all the rage since the new cooperative ventures of Harvard, MIT, Yale, Stanford, and others, is barely mentioned as such–there are a couple of notes about it, but the notion that Ivy League schools would start en masse to give their educational content away for free isn’t given much attention in this book (indeed, institutions of higher education seem largely to be the problem rather than a part of innovative solutions in Kamenetz’s view). Similarly, the recent scandals and shenanigans in the for-profit sector barely rate a mention from Kamenetz, and yet their pervasiveness at the present moment casts an inescapable pall over the idea that the for-profits are the best or even a good way forward. Kamenetz offers a few gestures of critique at the for-profit educational industry, but seems more enamored of the innovations they can offer. I’m less sanguine about the creative destruction of capitalism when it comes to education, and that shades my own reception of the book.

Overall I liked this book a great deal, but I do think the rosy and largely uncritical view of the present suggests a few problems.  The book catalogues the florid variety of things going on in higher education, championing every change or possibility that’s out there on an equal plane without too much discrimination.  There are a few gestures here and there toward critical thinking about these new possibilities, but mostly things fall into the following rough equations:

Current higher education system = exclusionary + hierarchical + expensive + tradition centered = bad

Anything new = good (or at least potential good)

On some level this strikes me as a convert’s story. Kamenetz went to Yale College, for goodness’ sake, not Kaplan University. So it may be that she is a kind of Martin Luther, or at least his publicist. One well imagines Kamenetz in the Reformation glorifying every sect that came down the pike as good because it wasn’t the Catholic Church and was returning power to the people. Or the believer who wakes one morning to realize she believes nothing that her parents’ church believes, and so is fascinated and wildly attracted to the notion that some people out there worship turnips.

Not sure if anyone actually worships turnips, but you get the point; it’s difficult in the midst of a reformation to discriminate and figure out who is Martin Luther, Menno Simons, John Calvin, or William Tyndale, and who is just the latest crackpot televangelist hawking his wares. Moreover, it takes a lot of discrimination–and probably more distance than we can afford right now–to figure out which parts of Luther, Simons, Calvin, and Tyndale were the things worth keeping and which were, well, more like the crackpot televangelists of their own day. Are Phoenix, Kaplan, and other for-profits really helping poorer students in a way that the bad and exclusive traditional university is not, or are they really fleecing most of them in the name of hope and prosperity–something a good many televangelists and other American hucksters are well known for?

This book is not where we’ll get that kind of analysis and considered attention about what we really ought to do next, where we ought to put what weight and influence we have. And I admit, to some degree that’s asking this book to be something it isn’t. We need books like this that are more provocations and manifestos than reflective analyses. We also have to have someone who writes the revolution from the inside, with all the enthusiasms that entails.

But that means this is a fast book, subject to the strengths and weaknesses that speed provides, one weakness being a little bit of factual sloppiness and a penchant for hasty and oversimplified analysis that sells well to the journalistic ear. For instance, Kamenetz uses a recurrent metaphor of the higher-education institution as a church that the contemporary world increasingly doesn’t need, and she draws an analogy by saying that statistics show that church attendance has dropped from 40 to 25 percent. The problem is that the article she cites actually says that regular church attendance has remained consistently at 25 percent for the past couple of decades and has declined only slightly since 1950. Other studies peg that number at 40 percent. No study I know of (I’m not an expert)–and certainly not the one that Kamenetz cites–suggests it’s dropped from 40 to 25 percent.

Another annoying instance is a recurrent statement that administrators of higher-education institutions are committed to maintaining the status quo. This is spoken like someone who never actually talked to an administrator, or perhaps is only speaking about Yale College, which for the most part really doesn’t need to change. Nearly every administrator I know of or have talked to is thinking furiously, sometimes frantically, and sometimes creatively, about how our institutions can change to meet the challenges we face and better serve the public with our various educational missions. Unless it is the case that Kamenetz is arguing that institutions are simply for the status quo because they are institutions and unwilling to pass quietly into the night. But this would be jejune. It sounds good to the anti-institutional American ear, but it’s doubtful policy for advancing higher education.

These kinds of issues are individually small, but collectively they are annoying, and to someone who is involved in the institutional side of higher education and is informed about the issues, they are glaring. What it might mean is that the book won’t get the kind of attention in higher-education institutions that it deserves.

Which is too bad, since I think the book ought to be required reading for administrators, if only to debate its urgency. What the book lacks in critical discrimination it makes up for with passionate and detailed pronouncement–a good sermon can be good for the academic soul. For one thing, it might help us realize that the way things have always been done isn’t even the way things are being done now for an increasingly large share of the population. Just as churches change–however slowly–in the face of historical movements and transformations, higher education is and will be changing as well. Many of the ideas detailed in Kamenetz’s book help us see the extent to which those changes are occurring and lend new urgency to the question of what those changes mean for us in higher education. There’s even a good deal available that could help us think about how best to reform our own practices to meet our current highest ideals, rather than seeing this as a war of good and evil over the minds of the next generation.

I was especially drawn to Kamenetz’s notion of a community of practice–something she drew from Jean Lave and Etienne Wenger:

Such communities are defined by shared engagement in a task and shared understanding of goals and means to reach them. In the classic progression of a community of practice, an apprentice presents herself to the community and takes on simple beginning tasks at the elbow of an expert. Everyone is participating in real-world tasks, not academic exercises, so the learner’s actions have consequences right away. This stage is known as “legitimate peripheral participation.” As she progresses she continuously reinforces her learning by teaching others as well. In a community of practice it is understood that you are just as likely to learn from the mistakes of fellow beginners, or from people with just slightly more experience, as from wizened elders. Virtual communities of practice are thriving on the internet, among bloggers, gamers, designers and programmers. These groups have little choice but to teach each other–information technology has been changing so fast for the past few decades that traditional schools and curricula can’t keep up.

This last, of course, is very true. I think the question of time for learning and play in higher education is a big problem, as I pointed out a couple of weeks ago. But even given that, I’m struck by the ways in which what she describes seems already characteristic of the practice of Digital Humanists, as I understand the basics of this particular practice. Something like the Homer Multitext project, which includes students from first-year Greek classes to fourth-year Greek majors, is one instance of this.

Beyond this, I am struck by the ethical impulses entailed here and in much of Kamenetz’s work.  She points out that the original meanings of words we associate with universities had to do with something like this notion of community–university and college pointing to the notion of guild or community, a gathering of like-minded people pursuing a common vocation.

This ethical impulse in Kamenetz’s work is what I find most attractive and most usable. She connects her manifesto to the work of Paulo Freire and other Catholic priest/intellectuals who were deeply invested in the notion of universal, active, and engaged education for what my church growing up called “the least of these.” This is a notion that faculty at my faith-based institution can root themselves in and catch a vision for, and one that I think many other public-minded intellectuals could embrace regardless of the particulars of their beliefs.

What would it mean for us to take advantage of the latest innovations in technology, not because they could save the institution money, and not because they could save faculty time, but because we could imagine them as a way of taking what we have to those who have need of it?

What if the world were really our classroom, not just the 30 students in front of us who can afford (or not afford) to be there?

What difference would it make to our practice, our politics, our thinking, teaching, and scholarship?

Digital Humanities, “techno-lust”, and the Personal Economy of Professional Development: Ryan Cordell on simplification

It’s a sign of my unsimple and exceedingly overstuffed life that I’ve only now gotten around to reading Ryan Cordell’s ProfHacker piece from last week. Ryan is moving to a new position at Northeastern (kudos!), and he’s taken the ritual of eliminating the clutter and junk of a household as a metaphor for the need to prioritize and simplify our professional practices, and, in his own instance, to get rid of the techno-gadgets and gizmos that curiosity and past necessity have brought his way.

I have confessed before my appreciation for Henry David Thoreau—an odd thinker, perhaps, for a ProfHacker to esteem. Nevertheless, I think Thoreau can be a useful antidote to unbridled techno-lust. As I wrote in that earlier post, “I want to use gadgets and software that will help me do things I already wanted to do—but better, or more efficiently, or with more impact.” I don’t want to accumulate things for their own sake.

…..

I relate this not to brag, but to start a conversation about necessity. We talk about tech all the time here at ProfHacker. We are, most of us at least, intrigued by gadgets and gizmos. But there comes a time to simplify: to hold a garage sale, sell used gadgets on Gazelle, or donate to Goodwill.

via Simplify, Simplify! – ProfHacker – The Chronicle of Higher Education.

Ryan’s post comes as I’ve been thinking a lot about the personal economy that we all must bring to the question of our working lives, our sense of personal balance, and our personal desire for professional development and fulfillment.  As I have been pushing hard for faculty in my school to get more engaged with issues surrounding digital pedagogy and to consider working with students to develop projects in digital humanities, I have been extremely aware that the biggest challenge is not faculty resistance or a lack of faculty curiosity–though there is sometimes that.  The biggest challenge is the simple fact of a lack of faculty time.  At a small teaching college our lives are full, and not always in a good way.  There is extremely little bandwidth to imagine or think through new possibilities, much less experiment with them.

At our end-of-year school meeting I posed to faculty the question of what they had found to be the biggest challenge in the past year, so that we could think through what to do about it in the future. Amy, the faculty member in my school who may be the most passionate about trying out the potential of technology in the classroom, responded, “No time to play.” Amy indicated that she had bought ten new apps for her iPad that year but had not had any time to just sit around and experiment with them in order to figure out everything they could do and imagine new possibilities for her classroom and the rest of her work. The need for space, the need for play, is necessary for the imagination, for learning, and for change.

It is necessary for excellence, but it is easily the thing we value least in higher education.

Discussing this same issue with my department chairs, I heard one of them say that she didn’t really care how much extra money I gave them to do work on digital humanities and pedagogy; what she really needed was extra time.

This is, I think, a deep problem generally, and a very deep problem at a teaching college with a heavy teaching load and restricted budgets. (At the same time, I do recognize that some of the biggest innovations in digital pedagogy have come from community colleges with far higher teaching loads than ours.) I think, frankly, that this is at the root of some of the slow pace of change in higher education generally. Faculty are busy people, despite the stereotype of the professor with endless time to just sit around mooning about nothing. And books are… simple. We know how to use them, they work pretty well, they are standardized in terms of their technical specifications, and we don’t have to reinvent the wheel every time we buy one.

Not so with the gadgets, gizmos, and applications that we accumulate rapidly with what Ryan describes as “techno-lust.” (I have not yet been accused of having this, but I am sure someone will use it on me now.) Unless driven by a personal passion, I think most faculty and administrators make an implicit and not irrational decision: “This is potentially interesting, but it would be just one more thing to do.” This problem is exacerbated by the fact that changes in technology seem to speed up and diversify rather than slow down and focus. Technology doesn’t seem to simplify our lives or make them easier, despite claims to greater efficiency. Indeed, in the initial effort just to get familiar or figure out possibilities, technology seems to add to the clutter.

I do not know a good way around this problem:  the need for play in an overstuffed and frantic educational world that so many of us inhabit.  One answer–just leave it alone and not push for innovation–doesn’t strike me as plausible in the least.  The world of higher education and learning is shifting rapidly under all of our feet, and the failure to take steps to address that change creatively will only confirm the stereotype of higher education as a dinosaur unable to respond to the educational needs of a public.

I’m working with the Provost to see if I can pilot a program that would give a course release to a faculty member to develop his or her abilities in technology in order to redevelop a class or develop a project with a student. But this is a very small drop in the midst of a very big bucket of need. And given the frantic pace of perpetual change that seems characteristic of contemporary technology, it seems like the need for space to play, and the lack of it, is going to be a perpetual feature of our personal professional economies for a very long time to come.

Any good ideas? How could I make space for professional play in the lives of faculty? Or, for that matter, in my own? How could faculty do it for themselves? Is there a means of decluttering our professional lives to make genuine space for something new?

Why students of the Humanities should look for jobs in Silicon Valley

Ok, I’ll risk sounding like a broken record to say again that the notion that humanities students are ill-positioned for solid careers after college is simply misguided. It still bears repeating. This latest from Vivek Wadhwa at the Washington Post gives yet more confirmation that employers are not looking for specific majors but for skills and abilities and creativity, and that package can come with any major whatsoever; it often comes with students in the humanities and social sciences.

Using the example of Damon Horowitz, who possesses degrees in both philosophy and engineering, whose unofficial title at Google is In-House Philosopher, and whose official title is Director of Engineering, Wadhwa points out the deep need for humanities and social science students in the work of technology companies, a need that isn’t just special pleading from a humanist but is made vivid in the actual hiring practices of Silicon Valley companies.

Venture Capitalists often express disdain for startup CEOs who are not engineers. Silicon Valley parents send their kids to college expecting them to major in a science, technology, engineering or math (STEM) discipline. The theory goes as follows: STEM degree holders will get higher pay upon graduation and get a leg up in the career sprint.

The trouble is that theory is wrong. In 2008, my research team at Duke and Harvard surveyed 652 U.S.-born chief executive officers and heads of product engineering at 502 technology companies. We found that they tended to be highly educated: 92 percent held bachelor’s degrees, and 47 percent held higher degrees. But only 37 percent held degrees in engineering or computer technology, and just two percent held them in mathematics. The rest have degrees in fields as diverse as business, accounting, finance, healthcare, arts and the humanities.

Yes, gaining a degree made a big difference in the sales and employment of the company that a founder started. But the field that the degree was in was not a significant factor. ….

I’d take that a step further. I believe humanities majors make the best project managers, the best product managers, and, ultimately, the most visionary technology leaders. The reason is simple. Technologists and engineers focus on features and too often get wrapped up in elements that may be cool for geeks but are useless for most people. In contrast, humanities majors can more easily focus on people and how they interact with technology. A history major who has studied the Enlightenment or the rise and fall of the Roman Empire may be more likely to understand the human elements of technology and how ease of use and design can be the difference between an interesting historical footnote and a world-changing technology.

via Why Silicon Valley needs humanities PhDs – The Washington Post.

Again, at the risk of sounding like a broken record, this sounds like the kind of findings emphasized at the Rethinking Success Conference that I have now blogged on several times. (I’ve heard theories that people come to be true believers if they hear a story 40 times. So far I’ve only blogged on this 12 times, so I’ll keep going for a while longer.) Although I still doubt that it would be a good thing for a philosopher to go to Silicon Valley with no tech experience whatsoever, a philosopher who had prepared himself by acquiring some basic technical skills alongside his philosophy degree might be in a particularly good position indeed. Worth considering.

Side note: the Post article points to a nice little bio of Damon Horowitz. I suspect there are not many folks in Silicon Valley who can talk about the ethics of tech products in terms that invoke Kant and John Stuart Mill. Maybe there should be more.

Celebrating the liberal arts in the marketplace (cautiously)

A new survey of 225 employers just out emphasizes the continuing value of the liberal arts in the employment market.

More interesting, at least for those of us who got some parental grief over our college choice, was the apparent love being shown for liberal arts majors. Thirty percent of surveyed employers said they were recruiting liberal arts types, second only to the 34 percent who said they were going after engineering and computer information systems majors. Trailing were finance and accounting majors, as only 18 percent of employers said they were recruiting targets.

“The No. 1 skill that employers are looking for are communication skills and liberal arts students who take classes in writing and speaking,” said Dan Schawbel, founder of Millennial Branding and an expert on Generation Y. “They need to become good communicators in order to graduate with a liberal arts degree. Companies are looking for soft skills over hard skills now because hard skills can be learned, while soft skills need to be developed.”

via Survey On Millennial Hiring Highlights Power Of Liberal Arts – Daily Brief – Portfolio.com.

I don’t particularly like the soft skills/hard skills dichotomy. However, this fits my general sense, blogged on before, that the hysteria over liberal arts majors’ lack of employability is, well, hysteria: something manufactured by reporters needing something to talk about.

At the same time, I think the somewhat glib and easy tone of this particular article calls for some caution. Digging into the statistics provided even in the summary suggests that liberal arts majors need to be supplementing their education with concrete experiences and coursework that will provide a broad panoply of skills and abilities. Fifty percent of employers, for instance, say they are looking for students who held leadership positions on campus, a stat before which even engineers and computer scientists must kneel in obeisance. Similarly, nearly 69% say they are looking for coursework relevant to the position being pursued. My general sense is that you can sell your Shakespeare course to a lot of employers, but it might be helpful if you sold Shakespeare alongside the website you built for the course, or alongside the three courses you took in computer programming.

Generally speaking, then, I think these statistics confirm the ideas propounded by the Rethinking Success conference in suggesting that students really need to be developing themselves as T-shaped candidates for positions, broad and deep, with a variety of skills and experiences to draw on and some level of expertise that has been, preferably, demonstrated through experiences like internships or project-based creativity.

Speaking of Rethinking Success, the entire website is now up with all the relevant videos. The session with Philip Gardner from Michigan State is embedded below. It was Gardner who impressed me with his emphasis that students need to realize they must be either liberal arts students with technical skills or technical students with liberal arts skills if they are going to have a chance in the current job market.

Dreaming of Heaven? Connection and Disconnection in Cathy Davidson’s Commencement address at UNC

Cathy Davidson and I crossed paths very briefly at Duke what now seems ages ago, she one of the second wave of important hires during Duke’s heyday in the late 80s and 90s, me a graduate student nearly finished and regretting that I never had the chance to study with what the graduate-student scuttlebutt told me was a great professor. I was sorry to miss the connection. And it’s one of the ironies of our current information age that I am more “connected” to her now in some respects than I ever was during the single year we were in physical proximity at Duke: following her tweets, following her blog at HASTAC, checking in on this or that review or interview as it pops up in my Twitter feed or in this or that electronic medium.

I’m sure, of course, that she has no idea who I am.

In the past several years, of course, Davidson has become one of the great intellectual cheerleaders for the ways our current digital immersion is changing us as human beings, much for the better in Davidson’s understanding. Recently Davidson gave the commencement address at the UNC School of Information and Library Science and emphasized the ways in which our information age is changing even our understanding of post-collegiate adulthood, in that it enables, or seems to enable, the possibility of permanent connection.

How do you become an adult?   My students and I spent our last class together talking about the many issues at the heart of this complex, unanswerable question, the one none of us ever stops asking.  One young woman in my class noted that, while being a student meant being constantly together—in dorms, at parties, in class—life on the other side of graduation seemed dauntingly “individual.”  Someone else piped up that at least that problem could be solved with a list serv or a Facebook page.  From the occasional email I receive from one or another of them, I know the students in that class came up with a way to still stay in touch with one another. 

 In the fourth great Information Age,  distance doesn’t have to mean loss in the same way it once did.  If Modernity—the third Industrial Age of Information—was characterized by alienation, how can we use the conditions of our connected Information Age to lessen human alienation, disruption of community, separation, loss?  I’m talking about the deep  “social life of information,” as John Seely Brown would say, not just its technological affordances.  How can we make sure that we use the communication technologies of our Age to help one another, even as our lives take us to different destinations?  How can we make sure our social networks are also our human and humane safety net?  

via Connection in the Age of Information: Commencement Address, School of Information and Library Science, UNC | HASTAC.

At the end of her address Davidson asked the graduates from UNC–ILS to stand and address one another:

And now make your colleague a promise. The words are simple, but powerful, and I know you won’t forget them:  Please say to one another, “I promise we will stay connected.” 

There’s something powerful and affecting about this, but I’ll admit that it gave me some pause, both because I think it is a promise that is fundamentally impossible to keep, even amidst the powers of our social networks, and because I’m not sure it would be an absolutely positive thing if we were able to keep it faithfully.

The dream of permanent and universal connection, of course, is a dream of heaven, an infinite and unending reconciliation whereby the living and the dead speak one to another in love without ceasing. But there are many reasons why this remains a dream of heaven rather than a fact of life, not least being our finite capacity for connection. According to some cognitive theorists, human beings have the capacity for maintaining stable relationships with at most about 200 to 250 people, with many putting the number much lower. I am not a cognitive scientist, so I won’t argue for the accuracy of a number, and I can’t really remember at the moment whether Davidson addresses this idea in her recent work, but to me the general principle seems convincing. While the internet might offer the allure of infinite connection, and while we might always be able to add more computing power to our servers, and while the human brain is no doubt not yet tapped out in its capacities, it remains the case that we are finite, limited, and… human. This means that while I value the 600 friends I have on Facebook, and the much smaller congregation that visits my blog, and those who follow me or whom I follow on Twitter, and a number with whom I have old-fashioned and boring face-to-face relationships in the flesh, I am meaningfully and continuously connected to only a very few of them compared to the number of connections I have in the abstract. This leads to the well-known phenomenon of the joyous and thrilling reconnection with high school friends on Facebook, followed by long fallow periods punctuated only by the thumbs-up “like” button for the occasional post about new grandchildren. We are connected, but we are mostly still disconnected.

And, I would say, a good thing too.

That is, it seems to me that there can be significant value in becoming disconnected, whether intentionally or not. For one thing, disconnection gives space for the experience of the different and unfamiliar. One concern we’ve had in our study abroad programs is that students will sometimes stay so connected to the folks back home–i.e., their online comfort zone–that they will not fully immerse in or connect with the cultures they are visiting. In other words, they miss an opportunity for new growth and engagement with difference because they are unwilling to let go of the connections they already have and are working, sometimes feverishly, to maintain.

Stretched through time, we might say that something very similar occurs if it becomes imperative that we maintain connections with the communities, with the relational selves, of our past to the extent that we cannot engage with the relational possibilities of our present. In order to be fully present to those connections that are actually significant to me–even those relationships that are maintained primarily online–I have to let hordes and hordes of relationships die or lie fallow, maintained only through the fiction of connection that my Facebook and Twitter newsfeeds happen to allow.

Of course, I don’t think saying any of this is necessarily earth-shattering. I am very sure that the vast majority of my Facebook connections are not pining away about the fact that I am not working hard at maintaining strong connections with every single one of them. Indeed, I doubt most of them will even know I wrote this post, since they will miss it in their newsfeed. A good many of them are probably secretly annoyed that I write a daily blog that appears in their newsfeed, but for the sake of our connection they graciously overlook the annoyance.

On the other hand, I do think there is a broad principle about what it means to be human at stake. Connection isn’t the only value. Losing connection, separation, dying to some things and people and selves so that new selves can live: these are values that our age doesn’t talk much about, caught up as we are in our dreams of a heaven of infinite connection. They are, however, facts, and even values, that make any kind of living at all possible.

Katrina Gulliver’s 10 Commandments of Twitter for Academics – With Exegetical Commentary

I’m a Johnny-come-lately to Twitter, as I’ve mentioned on this blog before. I’ve got the zeal of a new convert. It was thus with great delight that I ran across Katrina Gulliver’s Ten Commandments of twittering for academics. It’s a worthwhile article, but I’ll list only the ten commandments themselves, along with my self-evaluation of how I’m doing.

1. Put up an avatar. It doesn’t really matter what the picture is, but the “egg picture” (the default avatar for new accounts) makes you look like a spammer. [I CONFESS I WAS AN EGG FOR SEVERAL MONTHS BUT FINALLY GOT AROUND TO AN AVATAR THREE OR FOUR WEEKS AGO. LUCKILY, TWITTER CONVERTS EVERYTHING TO YOUR AVATAR IMMEDIATELY, SO IF YOU ARE JUST AN EGG YOU CAN COVER OVER A MULTITUDE OF SINS IMMEDIATELY BY UPLOADING AN AVATAR. IT’S VERY NEARLY A RELIGIOUS EXPERIENCE]

2. Don’t pick a Twitter name that is difficult to spell or remember. [I WOULD ADD TO THIS THAT IT COULD BE GOOD TO PICK SOMETHING FAIRLY SHORT.  MY OWN HANDLE IS MY NAME, @PETERKPOWERS, BUT THAT TAKES UP A LOT OF CHARACTERS OUT OF THE TWITTER LIMIT]

3. Tweet regularly. [DONE.  I AM NOT YET TO THE STAGE OF ANNOYING MY WIFE, BUT SHE DOESN’T REALIZE THAT’S WHAT I’M DOING ON MY IPAD;  I MIGHT ALSO SAY DON’T TWEET TOO REGULARLY, ESPECIALLY NOT IF YOU ARE LISTING SPECIFIC PERSONS.  NO ONE WANTS THEIR PHONE GOING OFF CONSTANTLY]

4. Don’t ignore people who tweet at you. Set Twitter to send you an e-mail notification when you get a mention or a private message. If you don’t do that, then check your account frequently. [AGREED, ALTHOUGH I STRUGGLE WITH WHETHER TO CONTACT EVERY PERSON WHO FOLLOWS ME;  NOT LIKE I’M INUNDATED, BUT I DON’T HAVE TONS OF TIME.  I TRY TO ACKNOWLEDGE FOLLOWS IF THE SELF-DESCRIPTION SUGGESTS THE PERSON IS CLOSELY CONNECTED TO MY PROFESSIONAL LIFE AND INTERESTS]

5. Engage in conversation. Don’t just drop in to post your own update and disappear. Twitter is not a “broadcast-only” mechanism; it’s CB radio. [DOING THIS, BUT IT TOOK ME A WHILE TO GET AROUND TO IT.  HOWEVER, I’M BETTER AT THIS THAN AT STRIKING UP CONVERSATIONS WITH STRANGERS AT PARTIES]

6. Learn the hashtags for your subject field or topics of interest, and use them. [OK, I DON’T REALLY DO THIS ONE THAT MUCH, EXCEPT SOME WITH DIGITAL HUMANITIES. I HAVEN’T FOUND THAT FOLLOWING HASHTAGS OUTSIDE OF #DIGITALHUMANITIES HAS GOTTEN ME ALL THAT FAR]

7. Don’t just make statements. Ask questions. [DONE]

8. Don’t just post links to news articles. I don’t need you to be my aggregator. [I’M NOT SURE ABOUT THIS ONE. I ACTUALLY THINK TWITTER’S AGGREGATOR QUALITY IS ONE OF ITS MOST IMPORTANT FEATURES. FOR PEOPLE I RESPECT IN THE FIELD OF DH, FOR INSTANCE, I REALLY LIKE THEM TO TELL ME WHAT THEY ARE READING AND WHAT THEY LIKE. DAN COHEN, MARK SAMPLE, RYAN CORDELL, ADELINE KOH: ALL OF THEM ARE READING OR IN CONTACT WITH REALLY IMPORTANT STUFF, AND I WANT THEM TO PASS IT ALONG. I’D AGREE THAT JUST POSTING LINKS AT RANDOM MIGHT BE COUNTERPRODUCTIVE, BUT IF YOU ARE BUILDING A REPUTATION FOR BEING IN TOUCH WITH GOOD STUFF IN PARTICULAR AREAS, I THINK POSTING LINKS IS ONE GOOD WAY OF BUILDING AN ONLINE PERSONA. ON THE OTHER HAND, ON A STRICT READING OF THE TEXT, I AGREE THAT I DON’T REALLY NEED POSTS OF NEWS ARTICLES PER SE. I FOLLOW THE NEWS TWITTER FEEDS THEMSELVES FOR THAT KIND OF THING]

9. Do show your personality. Crack some jokes. [DOES TWEETING MY CONTRIBUTION TO INTERNATIONAL MONTY PYTHON STATUS DAY COUNT?]

10. Have fun. [TOO MUCH FUN.  I’VE GOT TO GET BACK TO WORK]

Related note: I’ve been having a robust, sometimes contentious, sometimes inane discussion about Twitter over at the MLA LinkedIn group. I’d be happy to have someone join that conversation as well.

Digitization and the fulfillment of the book

My colleague in the library here at Messiah College, Jonathan Lauer, has a very nice essay in the most recent Digital Campus edition of the Chronicle of Higher Education. Jonathan makes an eloquent defense of the traditional book over and against the googlization and ebookification of everything. He employs an extended metaphor drawn from the transition to aluminum bats at various levels of baseball to discuss his unease and reservations about the shift to electronic books and away from print that is profoundly and rapidly changing the nature of libraries as we’ve known them. The essay is more evocative than argumentative, so there’s a lot of different things going on, but a couple of Jonathan’s main points are that the enhancements we supposedly achieve with digitization projects come at a cost to our understanding of texts and at a cost to ourselves.

In the big leagues, wooden bats still matter. Keeping print materials on campus and accessible remains important for other reasons as well. Witness Andrew M. Stauffer’s recent Chronicle article, “The Troubled Future of the 19th-Century Book.” Stauffer, the director of the Networked Infrastructure for Nineteenth-Century Electronic Scholarship, cites several examples of what we all know intuitively. “The books on the shelves carry plenty of information lost in the process of digitization, no matter how lovingly a particular copy is rendered on the screen,” he writes. “There are vitally significant variations in the stacks: editions, printings, issues, bindings, illustrations, paper type, size, marginalia, advertisements, and other customizations in apparently identical copies.” Without these details, discernible only in physical copies, we are unable to understand a book’s total impact. Are we so easily seduced by the aluminum bat that we toss all wooden ones from the bat bag?

Let’s also acknowledge that our gadgets eventually program us. History teaches us that technologies often numb the very human capacities they amplify; in its most advanced forms, this is tantamount to auto-amputation. As weavers lost manual dexterity with their use of increasingly mechanized looms during the Industrial Revolution, so we can only imagine what effect GPS will have on the innate and learned ability of New York City cabbies to find their way around the five boroughs. Yet we practice auto-amputation at our own peril. We dare not abandon wooden bats for aluminum for those endeavors that demand prolonged attention, reflection, and the analysis and synthesis that sometimes lead to wisdom, the best result of those decidedly human endeavors that no gadget can exercise.

I have a lot of sympathy for Jonathan’s position; things like the revamping of the New York Public Library leave me with a queasy hole in my stomach. I’ve had a running conversation with Beth Transue, another of our librarians, about our desire to start leading alumni tours of the world’s great libraries, but if we’re going to do so we’d better get it done fast, because most of them won’t be around in a few more years, at least if the NYPL and its budgetary woes are anything to judge by.

At the same time, I think Jonathan overstates his case here. I don’t think serious thinkers are assuming we’ll get rid of books entirely. Although I currently think we are already living in what I’ve called an E-plus world, print will continue to be with us, serving many different purposes. Jason Epstein over at the NYRB has a blog on this fact, and prognosticating the likely future and uses of the traditional book seems to be a growth industry at the moment. I don’t think the average student is too terribly interested in the material textuality that Jonathan references above, nor for that matter is the average scholar, the vast majority of whom remain interested in what people wrote, not how the publishers chose to package it. But those issues will continue to be extremely important for cultural and social historians, and there will be some forms of work that can only be done with books. Just as it is a tremendous boon to have Joyce’s manuscripts digitized, making them available for the general reader and the scholar who cannot afford a trip to Ireland, authoritative interpretations of Joyce’s method, biography, and life’s work will still require the trip to Ireland to see the thing itself, to capture what can’t be captured by a high-resolution camera.

That having been said, who would say that students studying Joyce should avoid examining the digitized manuscripts closely because they aren’t “the genuine article”? Indeed, I strongly suspect that even the authoritative interpretations of those manuscripts will increasingly be a commerce between examination of the physical object and close examination of digitized objects, since advanced DH work shows us time and time again that computerized forms of analysis can get at things the naked eye could never see. So the fact that there are badly digitized copies of things in Google Books and beyond shouldn’t obscure the fact that there are some massively important scholarly opportunities here.

Jonathan’s second point is about the deeply human and quasi-spiritual aspects of engagement with traditional books that so many of us have felt over the years. There’s something very true about this. It is also true that our technologies can result in forms of self-amputation. Indeed, if we take this claim to heart, we need to admit that the technology of writing and reading itself involves self-amputation. Studies have shown that heavy readers alter their brains, and not always in a good sense. We diminish the capacity for certain forms of memory, literally making ourselves absent-minded professors. Other studies have suggested that persons in oral cultures have this capacity in heightened form, and some people argue that this generation is far more visually acute than those that preceded it, developing new abilities because of its engagement with visual texts. So, indeed, our technologies alter us, and even result in self-amputation, but that is true of the traditional book as well as the internet. This second point is Jonathan’s larger claim, since it attributes to the traditional book as such a superiority in something central to our humanity. I am intrigued by this argument that the book is superior for serious reflection and for the quasi-spiritual aspects of study that we have come to treat as central to the humanities.

I admit, I don’t buy it.

First, I admit that I’m just wary about attributing essential human superiorities to historical artifacts and practices. Homer as a collection of oral songs is not inherently inferior to the scrolls within which those songs were originally collected, or to the book form in which they later found their apotheosis. We have come to think of the book as exhibiting and symbolizing superior forms of humanity, but it’s not clear that the book form triumphed in the West because of these attributes. Indeed, traditional Jews and others clearly think the scroll remains the superior spiritual form even to this day. Rather, the codex triumphed for a variety of complicated reasons. Partly, Christian churches apparently wanted, for ideological reasons, to distinguish their own writings from the writings of the Jews. There may have been some more substantive reasons as well, though that’s not entirely clear: Anthony Grafton points out that many of the Christian innovations with the codex seemed to focus on the desire to compare different kinds of texts side by side (an innovation, I will point out, for which the internet is in many ways easily superior). The codex also triumphed not because it was spiritually and intellectually superior but because it was, frankly, more efficient, cheaper, and easier to disseminate than its scrolly ancestors. One good example comes from the poet Martial, who explicitly ties the selling of his poetry in codex form to making it easily and efficiently accessible to the common person: “Assign your book-boxes to the great, this copy of me one hand can grasp.”

The entire trend of book history has been toward this effort to make texts and what they contain more readily and easily available to more and more people. From the early clay tablets to the mass-market paperback that let you carry Plato in your hip pocket, the thrust of the book has been toward broader and broader dissemination, toward greater and greater ease of use, toward cheaper and cheaper accessibility. The goal of writing, even when that writing was imprisoned in libraries that only the initiated could enter, as in Umberto Eco’s The Name of the Rose, has been open access.

The digitization that is occurring now comes to fulfill the book, not destroy it.

Secondarily, I guess I no longer believe fully in the spiritual or intellectual superiority of codex forms, simply because that belief doesn’t comport with my experience. As I do more and more of my reading of books with my various e-readers, I find that I have serious, contemplative, analytical, and synthetic engagements with all kinds of texts, from those hundreds of “pages” long to those not. As I get used to the tools of various e-readers, there’s almost nothing accomplished in traditional books that can’t be accomplished in some way on an e-reader. Although I interact with texts differently now in a spatial sense, I am able to take fuller and more copious notes, I am able to mark texts more easily, and if I can’t quite remember where something was in the book, I can use the search function to find not only a specific phrase or topic but every single instance of that topic in the book. Moreover, because every text represents an act of contemplation on and conversation with other texts, I can at the touch of a screen go and read for myself the interlocutors embedded within a book, just as those interested in Jonathan’s essay can touch my link above and decide for themselves whether I am reading him fairly. Thus there are very obviously and seriously some ways in which e-readers are superior for serious analytical and interpretive readings of texts, or at least equal to them.

All this having been said, there remains one way in which I find the traditional paper book clearly superior to the e-book, and that has to do with my ability to make it mine.

I spoke a couple of days ago about the personal connection I felt to Kierkegaard in rereading him and discovering my many years of underlines, highlights, and marginalia. I even confess that I read Kimi Cunningham Grant’s new memoir on my iPad but still bought a hardcover at the reading–not because I thought I would be able to analyze it more effectively in hardcover, but because I wanted her to sign it for me.

This is a personal connection to the book that isn’t unimportant, but that is about my personal biography, and Kimi’s.  It’s not about the text, and frankly I doubt it will in the long run even be about literary history.  Some literary archivist somewhere is collecting all the shared comments on the Kindle version of Kimi’s book, and that massive marginalia will be fodder for some graduate student’s dissertation in a few decades.

I pity the poor graduate student who decides on such a project. But at least she won’t have to strain her eyes to decipher the handwriting.