Category Archives: higher education

Revolution and Reformation in Higher Education: Anya Kamenetz’s DIY U

It's a sign of the fast-changing times in higher education that I just finished reading Anya Kamenetz's DIY U and it already feels just a little bit dated–not terribly so, since it is a kind of futurist fiction about higher education written in 2010–and I feel frustrated at the notion that great new ideas and books to consider are solving yesterday's problems by the time I get around to them.  The shelf life for this kind of thing seems to be about a year, and 2010 seems like an eon ago in both publishing and in higher education.  This is too bad, because I actually think there is some important ethical thinking about higher education going on in the book that gets obscured both by the speed of the author and by the speed with which the educational times are leaving even this book behind.

A few examples: the term MOOC, all the rage since the new cooperative ventures of Harvard, MIT, Yale, Stanford and others, is barely mentioned as such–there are a couple of notes about it, but the notion that Ivy League schools would start en masse to give their educational content away for free isn't given much attention in this book (indeed, institutions of higher education seem largely to be the problem rather than a part of innovative solutions in Kamenetz's view).  Similarly, the recent scandals and shenanigans in the for-profit sector barely rate a mention for Kamenetz, and yet their pervasiveness at the present moment casts an inescapable pall over the idea that the for-profits are the best or even a good way forward.  Kamenetz offers a few gestures of critique at the for-profit educational industry, but seems more enamored of the innovations they can offer.  I'm less sanguine about the creative destruction of capitalism when it comes to education, and that shades my own reception of the book.

Overall I liked this book a great deal, but I do think the rosy and largely uncritical view of the present suggests a few problems.  The book catalogues the florid variety of things going on in higher education, championing every change or possibility that’s out there on an equal plane without too much discrimination.  There are a few gestures here and there toward critical thinking about these new possibilities, but mostly things fall into the following rough equations:

Current higher education system = exclusionary + hierarchical + expensive + tradition centered = bad

Anything new = good (or at least potential good)

On some level this strikes me as a convert's story.  Kamenetz went to Yale College, for goodness' sake, not Kaplan University.  So it may be that she is a kind of Martin Luther, or at least his publicist.  One well imagines Kamenetz in the Reformation glorifying every sect that came down the pike as good because it wasn't the Catholic Church and was returning power to the people.  Or the believer who wakes one morning to realize she believes nothing that her parents' church believes, and so is fascinated and wildly attracted to the notion that some people out there worship turnips.

Not sure if anyone actually worships turnips, but you get the point; it's difficult in the midst of a reformation to discriminate and figure out who is Martin Luther, Menno Simons, John Calvin, or William Tyndale, and who is just the latest crackpot televangelist hawking his wares.  Moreover, it takes a lot of discrimination–and probably more distance than we can afford right now–to figure out which parts of Luther, Simons, Calvin and Tyndale were the things worth keeping and which were, well, more like the crackpot televangelists of their own day.  Are Phoenix, Kaplan, and other for-profits really helping poorer students in a way that the bad and exclusive traditional university is not, or are they really fleecing most of them in the name of hope and prosperity–something a good many televangelists and other American hucksters are well known for?

This book is not where we'll get that kind of analysis and considered attention about what we really ought to do next, where we ought to put what weight and influence we have.  And I admit, to some degree that's asking this book to be something it isn't.  We need books like this that are more provocations and manifestos than reflective analyses.  We also have to have someone who writes the revolution from the inside with all the enthusiasms that entails.

But that means this is a fast book, subject to the strengths and weaknesses that speed provides, one weakness being a little bit of factual sloppiness and a penchant for hasty and oversimplified analysis that sells well to the journalistic ear.  For instance, Kamenetz uses a recurrent metaphor of the higher educational institution as a church that the contemporary world increasingly doesn't need, and she draws an analogy by saying that statistics show that church attendance has dropped from 40 to 25 percent.  The problem is that the article she cites actually says that regular church attendance has remained consistently at 25 percent for the past couple of decades and has declined only slightly since 1950.  Other studies peg that number at 40 percent.  No study I know of (I'm not an expert)–and certainly not the one that Kamenetz cites–suggests it has dropped from 40 to 25 percent.

Another annoying instance is a recurrent statement that administrators of higher education institutions are committed to maintaining the status quo.  This is spoken like someone who never actually talked to an administrator, or perhaps is only speaking about Yale College, which for the most part really doesn't need to change.  Nearly every administrator I know of or have talked to is thinking furiously, sometimes frantically, and sometimes creatively, about how our institutions can change to meet the challenges we face and better serve the public with our various educational missions.  Unless it is the case that Kamenetz is arguing that institutions are simply for the status quo because they are institutions and unwilling to pass quietly into the night.  But this would be jejune.  It sounds good to the anti-institutional American ear, but it's doubtful policy for advances in higher education.

These kinds of issues are individually small, but collectively they are annoying, and to someone who is involved in the institutional side of higher education and is informed about the issues, they are glaring.  What it might mean is that the book won't get the kind of attention in higher education institutions that it deserves.

Which is too bad, since I think the book ought to be required reading for administrators, if only to debate its urgency.  What the book lacks in critical discrimination it makes up for with passionate and detailed pronouncement–a good sermon can be good for the academic soul.  For one thing, it might help us realize that the way things have always been done isn't even the way things are being done now for a larger and larger share of the population.  Just as churches change–however slowly–in the face of historical movements and transformations, higher education is and will be changing as well.  Many of the ideas detailed in Kamenetz's book help us see the extent to which those changes are occurring and lend new urgency to the question of what those changes mean for us in higher education.  There's even a good deal available that could help us think about how best to reform our own practices to meet our current highest ideals, rather than seeing this as a war of good and evil over the minds of the next generation.

I was especially drawn to Kamenetz’s notion of a community of practice–something she drew from Jean Lave and Etienne Wenger:

Such communities are defined by shared engagement in a task and shared understanding of goals and means to reach them.  In the classic progression of a community of practice, an apprentice presents herself to the community and takes on simple beginning tasks at the elbow of an expert.  Everyone is participating in real-world tasks, not academic exercises, so the learner's actions have consequences right away.  This stage is known as "legitimate peripheral participation."  As she progresses she continuously reinforces her learning by teaching others as well.  In a community of practice it is understood that you are just as likely to learn from the mistakes of fellow beginners, or from people with just slightly more experience, as from wizened elders.  Virtual communities of practice are thriving on the internet, among bloggers, gamers, designers and programmers.  These groups have little choice but to teach each other–information technology has been changing so fast for the past few decades that traditional schools and curricula can't keep up.

This last, of course, is very true.  I think the question of time for learning and play in higher education is a big problem, as I pointed out a couple of weeks ago.  But even given that, I'm struck by the ways in which what she describes already seems characteristic of the practice of digital humanists, as I understand the basics of that particular practice.  Something like the Homer Multitext project, which includes students from first-year Greek classes to fourth-year Greek majors, is one instance of this.

Beyond this, I am struck by the ethical impulses entailed here and in much of Kamenetz’s work.  She points out that the original meanings of words we associate with universities had to do with something like this notion of community–university and college pointing to the notion of guild or community, a gathering of like-minded people pursuing a common vocation.

This ethical impulse in Kamenetz's work is what I find most attractive and most usable.  She connects her manifesto to the work of Paulo Freire and other Catholic intellectuals who were deeply invested in the notion of universal, active, and engaged education for what my church growing up called "the least of these."  This is a notion that faculty at my faith-based institution can root themselves in and catch a vision for, and one that I think many other public-minded intellectuals could embrace regardless of the particulars of their beliefs.

What would it mean for us to take advantage of the latest innovations in technology, not because they could save the institution money, and not because they could save faculty time, but because we could imagine them as a way of taking what we have to those who have need of it?

What if the world were really our classroom, not just the 30 students in front of us who can afford (or not afford) to be there?

What difference would it make to our practice, our politics, our thinking, teaching, and scholarship?

Dumpster Diving and other career moves: remembering the job market with Roger Whitson

It would be hard to say I enjoyed reading Roger Whitson’s very fine recent meditation in the Chronicle on the quest for a tenure-track job, his ambivalent feelings on finding one, the mixed feelings of exaltation and guilt at getting what so many of his peers would never find, and of leaving behind an #Altac existence where he had begun to make a home.

Hard to enjoy reading both because the story seems to typify what our academic life has actually become, and, frankly, because it reminded me too much of my own wandering years as a new academic a couple of decades ago.  I spent seven years full-time on the job market back in the day (if you count the last two years of graduate school).  I have estimated in the past that I must have applied for at least 700-800 jobs during those years–the idea of being targeted and selective a joke for a new father.  Fortunately, I was actually only totally unemployed for four months during those years, though that was enough to plunge me thousands of dollars into debt paying for health insurance.  For five of those seven years I had full-time work in various visiting assistant positions, and for two of those visiting years I was paid so little I qualified for food stamps, though I never applied for the program.  I worked as many extra courses as I could to pay the bills–probably foolish for my career, since publishing slowed to a crawl, but it saved my pride.  I remember asking, naively, during an interview for one such visiting position whether it was actually possible to live in that area of the country on what I was going to be paid.  The chair interviewing me at the time hesitated, then responded, “Well, of course, your wife can work.”

Only one of those years did I not get an interview, and only two of those years did I not get a campus interview, but even then this seemed like a very peculiar and unhelpful way to claim success for a beginning academic career.  We did not have anything called #altac in those days, and my plan B–which on my worst days I sometimes still wonder whether I should have followed–was to go back to cooking school and become a chef (I know, I know.  Another growth industry).  I never felt bad about pursuing a PhD in English, and I don’t think I would have even if I had gone on to become a chef.  The learning was worth it, to me, at least.

But I did grow distant from college friends who became vice-presidents of companies or doctors in growing practices, all of whom talked about their mortgages and vacations in the Caribbean or Colorado, while I was living in the cheapest two-bedroom apartment in Fairfax, Virginia, that I could find and fishing furniture, including my daughter's first bed, out of a dumpster.  (The furniture was held together, literally, by duct tape; I had to pay for conferences.)  And I spent a lot of evenings walking off my anxiety through the park next to our apartment complex, reminding myself of how much I had to be thankful for.  After all, I had a job and could pay my bills through the creative juggling of credit card balances.  A lot of my friends had found no jobs at all.  A low-rent comparison, I realize, but I would take what solace I could get.

I do not resent those days now, but that depends a lot on my having come out the other side.  The sobering thought in all of this is in realizing that in the world of academics today I should count myself one of the lucky ones.  Reading Roger’s essay, and the many like it that have been published in the last twenty years, I always get a sick hollow feeling in the gut, remembering what it was like to wonder what would happen if….

Reading Roger’s essay I was struck again with the fact that this is now the permanent condition of academic life in the humanities.  My own job story began more than 20 years ago at Duke, and even then we were told that the job market had been miserable for 15 years (but was sure to get better by and by).  30 years is not a temporary downturn or academic recession.  It is a way of being.

The advent of MOOCs, all-online education, and for-profit universities is a response to the economics of higher education that is unlikely to make things any better for the freshly minted PhD.  While there are some exciting innovations here that have a lot of promise for increasing learning for the many, it's also the case that they are attractive and draw interest because they promise to do it more cheaply, which in the world of higher education means teaching more students with fewer faculty hours.  Roger's most powerful line came toward the end:  “Until we realize that we are all contingent, we are all #altac, we all need to be flexible, and we are all in this together, we won’t be able to effectively deal with the crisis in the humanities with anything other than guilt.”

This is right, it seems to me.  In a world that is changing as rapidly and as radically as higher education, we are all as contingent as the reporters and editors in the newsrooms of proud daily newspapers.  It is easy to say that the person who “made it” was talented enough or smart enough or savvy enough, but mostly they, I, we were just lucky enough to come out the other side.  And we would be misguided to imagine that because we made it into a world that at least resembled the world we imagined, that world will always be there.  We are an older institution and industry than music or radio or newspapers, but we are an industry and an institution nonetheless, and it seems to me that the change is upon us.  We are all contingent now.

Digital Humanities, “techno-lust”, and the Personal Economy of Professional Development: Ryan Cordell on simplification

It's a sign of my unsimple and exceedingly overstuffed life that I've only now gotten around to reading Ryan Cordell's ProfHacker piece from last week.  Ryan is moving to a new position at Northeastern (kudos!), and he's taken the ritual of eliminating the clutter and junk of a household as a metaphor for the need to prioritize and simplify our professional practices, and, in his own instance, to get rid of the techno gadgets and gizmos that curiosity and past necessity have brought his way.

I have confessed before my appreciation for Henry David Thoreau—an odd thinker, perhaps, for a ProfHacker to esteem. Nevertheless, I think Thoreau can be a useful antidote to unbridled techno-lust. As I wrote in that earlier post, “I want to use gadgets and software that will help me do things I already wanted to do—but better, or more efficiently, or with more impact.” I don’t want to accumulate things for their own sake.

…..

I relate this not to brag, but to start a conversation about necessity. We talk about tech all the time here at ProfHacker. We are, most of us at least, intrigued by gadgets and gizmos. But there comes a time to simplify: to hold a garage sale, sell used gadgets on Gazelle, or donate to Goodwill.

via Simplify, Simplify! – ProfHacker – The Chronicle of Higher Education.

Ryan’s post comes as I’ve been thinking a lot about the personal economy that we all must bring to the question of our working lives, our sense of personal balance, and our personal desire for professional development and fulfillment.  As I have been pushing hard for faculty in my school to get more engaged with issues surrounding digital pedagogy and to consider working with students to develop projects in digital humanities, I have been extremely aware that the biggest challenge is not faculty resistance or a lack of faculty curiosity–though there is sometimes that.  The biggest challenge is the simple fact of a lack of faculty time.  At a small teaching college our lives are full, and not always in a good way.  There is extremely little bandwidth to imagine or think through new possibilities, much less experiment with them.

At our end-of-the-year school meeting I posed to faculty the question of what they had found to be the biggest challenge of the past year, so that we could think through what to do about it in the future.  Amy, my faculty member who may be the most passionate about trying out the potential of technology in the classroom, responded, “No time to play.”  Amy indicated that she had bought ten new apps for her iPad that year, but had not had any time to just sit around and experiment with them in order to figure out everything they could do and imagine new possibilities for her classroom and the rest of her work.  The need for space, the need for play, is necessary for the imagination, for learning, and for change.

It is necessary for excellence, but it is easily the thing we value least in higher education.

Discussing this same issue with my department chairs, one of them said that she didn’t really care how much extra money I would give them to do work on digital humanities and pedagogy, what she really needed was extra time.

This is, I think, a deep problem generally and a very deep problem at a teaching college with a heavy teaching load and restricted budgets.  (At the same time, I do admit that I recognize some of the biggest innovations in digital pedagogy have come from community colleges with far higher teaching loads than ours).  I think, frankly, that this is at the root of some of the slow pace of change in higher education generally. Faculty are busy people despite the stereotype of the professor with endless time to just sit around mooning about nothing.  And books are….simple.  We know how to use them, they work pretty well, they are standardized in terms of their technical specifications, and we don’t have to reinvent the wheel every time we buy one.

Not so with the gadgets, gizmos, and applications that we accumulate rapidly with what Ryan describes as “techno-lust”. (I have not yet been accused of having this, but I am sure someone will use it on me now).  Unless driven by a personal passion, I think most faculty and administrators make an implicit and not irrational decision–“This is potentially interesting, but it would be just one more thing to do.”  This problem is exacerbated by the fact that the changes in technology seem to speed up and diversify rather than slow down and focus.  Technology doesn’t seem to simplify our lives or make them easier despite claims to greater efficiency.  Indeed, in the initial effort to just get familiar or figure out possibilities, technology just seems to add to the clutter.

I do not know a good way around this problem:  the need for play in an overstuffed and frantic educational world that so many of us inhabit.  One answer–just leave it alone and not push for innovation–doesn’t strike me as plausible in the least.  The world of higher education and learning is shifting rapidly under all of our feet, and the failure to take steps to address that change creatively will only confirm the stereotype of higher education as a dinosaur unable to respond to the educational needs of a public.

I'm working with the Provost to see if I can pilot a program that would give a course release to a faculty member to develop his or her abilities in technology in order to redevelop a class or develop a project with a student.  But this is a very small drop in the midst of a very big bucket of need.  And given the frantic pace of perpetual change that seems to be characteristic of contemporary technology, it seems like the need for space to play, and the lack of it, is going to be a perpetual characteristic of our personal professional economies for a very long time to come.

Any good ideas?  How could I make space for professional play in the lives of faculty?  Or, for that matter, in my own?  How could faculty do it for themselves?  Is there a means of decluttering our professional lives to make genuine space for something new?

Why students of the Humanities should look for jobs in Silicon Valley

Ok, I’ll risk sounding like a broken record to say again that the notion that humanities students are ill-positioned for solid careers after college is simply misguided.  It still bears repeating.  This latest from Vivek Wadhwa at the Washington Post gives yet more confirmation of the notion that employers are not looking for specific majors but for skills and abilities and creativity, and that package can come with any major whatsoever, and it often comes with students in the humanities and social sciences.

Using Damon Horowitz as an example–he holds degrees in both philosophy and engineering, and his unofficial title at Google is In-House Philosopher while his official title is Director of Engineering–Wadhwa points out the deep need for humanities and social science students in the work of technology companies, a need that isn't just special pleading from a humanist but is made vivid in the actual hiring practices of Silicon Valley companies.

Venture Capitalists often express disdain for startup CEOs who are not engineers. Silicon Valley parents send their kids to college expecting them to major in a science, technology, engineering or math (STEM) discipline. The theory goes as follows: STEM degree holders will get higher pay upon graduation and get a leg up in the career sprint.

The trouble is that theory is wrong. In 2008, my research team at Duke and Harvard surveyed 652 U.S.-born chief executive officers and heads of product engineering at 502 technology companies. We found that they tended to be highly educated: 92 percent held bachelor’s degrees, and 47 percent held higher degrees. But only 37 percent held degrees in engineering or computer technology, and just two percent held them in mathematics. The rest have degrees in fields as diverse as business, accounting, finance, healthcare, arts and the humanities.

Yes, gaining a degree made a big difference in the sales and employment of the company that a founder started. But the field that the degree was in was not a significant factor. ….

I’d take that a step further. I believe humanities majors make the best project managers, the best product managers, and, ultimately, the most visionary technology leaders. The reason is simple. Technologists and engineers focus on features and too often get wrapped up in elements that may be cool for geeks but are useless for most people. In contrast, humanities majors can more easily focus on people and how they interact with technology. A history major who has studied the Enlightenment or the rise and fall of the Roman Empire may be more likely to understand the human elements of technology and how ease of use and design can be the difference between an interesting historical footnote and a world-changing technology.

via Why Silicon Valley needs humanities PhDs – The Washington Post.

Again, at the risk of sounding like a broken record, this sounds like the kind of finding emphasized at the Rethinking Success Conference that I have now blogged on several times.  (I've heard theories that people come to be true believers if they hear a story 40 times.  So far I've only blogged on this 12 times, so I'll keep going for a while longer.)  Although I still doubt that it would be a good thing for a philosopher to go to Silicon Valley with no tech experience whatsoever, a philosopher who had prepared himself by acquiring some basic technical skills alongside his philosophy degree might be in a particularly good position indeed.  Worth considering.

Side note: the Post article points to a nice little bio of Damon Horowitz.  I suspect there are not many folks in Silicon Valley who can talk about the ethics of tech products in terms that invoke Kant and John Stuart Mill.  Maybe there should be more.

Digital Humanities as Culture Difference: Adeline Koh on Hacking and Yacking

My colleague Bernardo Michael in the History department here has been pressing me to understand that, properly understood, Digital Humanities should be deeply connected to our college-wide efforts to address questions of diversity and what the AAC&U calls inclusive excellence.  (Bernardo also serves as the special assistant to the President for diversity affairs.)  At first blush, I will admit, this has seemed counter-intuitive to me, and I have struggled to articulate the priority between my interest in developing new efforts in Digital Humanities that I tie to our college's technology plan and my simultaneous concern with furthering our institution's diversity plan (besides a general ethical interest, my primary field of study over the past 20 years has been multicultural American literature).

Nevertheless, I’ve started seeing more and more of Bernardo’s point as I’ve engaged in the efforts to get things started in Digital Humanities.  For one thing, the practices and personages of the digital world are talked about in cultural terms:  We use language like “digital natives” and “digital culture” and “netizens”–cultural terms that attempt to articulate new forms of social and cultural being.  In the practical terms of trying to create lift-off for some of these efforts, an administrator faces the negotiation of multiple institutional cultures, and the challenging effort to get faculty–not unreasonably happy and proud about their achievements within their own cultural practices–to see that they actually need to become conversant in the languages and practices of an entirely different and digital culture.

Thus I increasingly see that Bernardo is right;  just as we need to acclimate ourselves and become familiar with other kinds of cultural differences in the classroom, and just as our teaching needs to begin to reflect the values of diversity and global engagement, our teaching practices also need to engage students as digital natives.  Using technology in the classroom or working collaboratively with students on digital projects isn’t simply instrumental–i.e. it isn’t simply about getting students familiar with things they will need for a job.  It is, in many ways, about cultural engagement, respect, and awareness.  How must our own cultures within academe adjust and change to engage with a new and increasingly not so new culture–one that is increasingly central and dominant to all of our cultural practices?

Adeline Koh over at Richard Stockton College (and this fall at Duke, I think) has a sharp post on these kinds of issues, focusing more on the divide between theory and practice, or yacking and hacking, in Digital Humanities.  Adeline has more theory hope than I do, but I like what she's probing in her piece and I especially like where she ends up:

If computation is, as Cathy N. Davidson (@cathyndavidson) and Dan Rowinski have been arguing, the fourth “R” of 21st century literacy, we very much need to treat it the way we already do existing human languages: as modes of knowledge which unwittingly create cultural valences and systems of privilege and oppression. Frantz Fanon wrote in Black Skin, White Masks: “To speak a language is to take on a world, a civilization.”  As Digital Humanists, we have the responsibility to interrogate and to understand what kind of world, and what kind of civilization, our computational languages and forms create for us. Critical Code Studies is an important step in this direction. But it’s important not to stop there, but to continue to try to expand upon how computation and the digital humanities are underwritten by theoretical suppositions which still remain invisible.

More Hack, Less Yack?: Modularity, Theory and Habitus in the Digital Humanities | Adeline Koh
http://www.adelinekoh.org/blog/2012/05/21/more-hack-less-yack-modularity-theory-and-habitus-in-the-digital-humanities/

I suspect that Bernardo and Adeline would have a lot to say to each other.

Celebrating the liberal arts in the marketplace (cautiously)

A new survey of 225 employers just out emphasizes the continuing value of the liberal arts in the employment market.

More interesting, at least for those of us who got some parental grief over our college choice, was the apparent love being shown for liberal arts majors. Thirty percent of surveyed employers said they were recruiting liberal arts types, second only to the 34 percent who said they were going after engineering and computer information systems majors. Trailing were finance and accounting majors, as only 18 percent of employers said they were recruiting targets.

“The No. 1 skill that employers are looking for are communication skills and liberal arts students who take classes in writing and speaking,” said Dan Schawbel, founder of Millennial Branding and an expert on Generation Y. “They need to become good communicators in order to graduate with a liberal arts degree. Companies are looking for soft skills over hard skills now because hard skills can be learned, while soft skills need to be developed.”

via Survey On Millennial Hiring Highlights Power Of Liberal Arts – Daily Brief – Portfolio.com.

I don't particularly like the soft skills/hard skills dichotomy.  However, this fits my general sense, blogged on before, that the hysteria over liberal arts majors' lack of employability is, well, hysteria–something manufactured by reporters needing something to talk about.

At the same time, I think the somewhat glib and easy tone of this particular article calls for some caution.  Digging into the statistics provided even in the summary suggests that liberal arts majors need to be supplementing their education with concrete experiences and coursework that will provide a broad panoply of skills and abilities.  50% of employers, for instance, say they are looking for students who held leadership positions on campus, a stat before which even engineers and computer scientists must kneel in obeisance.  Similarly, nearly 69% say they are looking for coursework relevant to the position being pursued.  My general sense is that you can sell your Shakespeare course to a lot of employers, but it might be helpful if you sold Shakespeare alongside the website you built for the course or alongside the three courses you took in computer programming.

Generally speaking, then, I think these statistics confirm the ideas propounded by the Rethinking Success conference in suggesting that students really need to be developing themselves as T-shaped candidates for positions, broad and deep, with a variety of skills and experiences to draw on and some level of expertise that has been, preferably, demonstrated through experiences like internships or project-based creativity.

Speaking of Rethinking Success, the entire website is now up with all the relevant videos.  The session with Philip Gardner from Michigan State is embedded below.  It was Gardner who impressed me by his emphasis that students need to realize that they either need to be liberal arts students with technical skills or technical students with liberal arts skills if they are going to have a chance in the current job market.

Katrina Gulliver’s 10 Commandments of Twitter for Academics – With Exegetical Commentary

I'm a Johnny-come-lately to Twitter, as I've mentioned on this blog before, and I've got the zeal of a new convert.  It was thus with great delight that I ran across Katrina Gulliver's ten commandments of twittering for academics.  It's a worthwhile article, but I'll only list the ten commandments themselves, along with my self-evaluation of how I'm doing.

1. Put up an avatar. It doesn’t really matter what the picture is, but the “egg picture” (the default avatar for new accounts) makes you look like a spammer. [I CONFESS I WAS AN EGG FOR SEVERAL MONTHS BUT FINALLY GOT AROUND TO AN AVATAR THREE OR FOUR WEEKS AGO.  LUCKILY TWITTER CONVERTS EVERYTHING TO YOUR AVATAR IMMEDIATELY, SO IF YOU ARE JUST AN EGG YOU CAN COVER OVER A MULTITUDE OF SINS IMMEDIATELY BY UPLOADING AN AVATAR.  IT’S VERY NEARLY A RELIGIOUS EXPERIENCE]

2. Don’t pick a Twitter name that is difficult to spell or remember. [I WOULD ADD TO THIS THAT IT COULD BE GOOD TO PICK SOMETHING FAIRLY SHORT.  MY OWN HANDLE IS MY NAME, @PETERKPOWERS, BUT THAT TAKES UP A LOT OF CHARACTERS OUT OF THE TWITTER LIMIT]

3. Tweet regularly. [DONE.  I AM NOT YET TO THE STAGE OF ANNOYING MY WIFE, BUT SHE DOESN’T REALIZE THAT’S WHAT I’M DOING ON MY IPAD;  I MIGHT ALSO SAY DON’T TWEET TOO REGULARLY, ESPECIALLY NOT IF YOU ARE LISTING SPECIFIC PERSONS.  NO ONE WANTS THEIR PHONE GOING OFF CONSTANTLY]

4. Don’t ignore people who tweet at you. Set Twitter to send you an e-mail notification when you get a mention or a private message. If you don’t do that, then check your account frequently. [AGREED, ALTHOUGH I STRUGGLE WITH WHETHER TO CONTACT EVERY PERSON WHO FOLLOWS ME;  NOT LIKE I’M INUNDATED, BUT I DON’T HAVE TONS OF TIME.  I TRY TO ACKNOWLEDGE FOLLOWS IF THE SELF-DESCRIPTION SUGGESTS THE PERSON IS CLOSELY CONNECTED TO MY PROFESSIONAL LIFE AND INTERESTS]

5. Engage in conversation. Don’t just drop in to post your own update and disappear. Twitter is not a “broadcast-only” mechanism; it’s CB radio. [DOING THIS, BUT IT TOOK ME A WHILE TO GET AROUND TO IT.  HOWEVER, I’M BETTER AT THIS THAN AT STRIKING UP CONVERSATIONS WITH STRANGERS AT PARTIES]

6. Learn the hashtags for your subject field or topics of interest, and use them. [OK, I DON’T REALLY DO THIS ONE THAT MUCH.  EXCEPT SOME WITH DIGITAL HUMANITIES.  I HAVEN’T FOUND THAT FOLLOWING HASHTAGS OUTSIDE OF #DIGITALHUMANITIES HAS GOTTEN ME ALL THAT FAR]

7. Don’t just make statements. Ask questions. [DONE]

8. Don’t just post links to news articles. I don’t need you to be my aggregator. [I’M NOT SURE ABOUT THIS ONE.  I ACTUALLY THINK TWITTER’S AGGREGATOR QUALITY IS ONE OF ITS MOST IMPORTANT FEATURES.  FOR PEOPLE I RESPECT IN THE FIELD OF DH, FOR INSTANCE, I REALLY LIKE THEM TO TELL ME WHAT THEY ARE READING AND WHAT THEY LIKE.  DAN COHEN, MARK SAMPLE, RYAN CORDELL, ADELINE KOH–ALL OF THEM ARE READING OR IN CONTACT WITH REALLY IMPORTANT STUFF, AND I WANT THEM TO PASS IT ALONG.  I’D AGREE THAT JUST POSTING LINKS AT RANDOM MIGHT BE COUNTERPRODUCTIVE, BUT IF YOU ARE BUILDING A REPUTATION FOR BEING IN TOUCH WITH GOOD STUFF IN PARTICULAR AREAS, I THINK POSTING LINKS IS ONE GOOD WAY OF BUILDING AN ONLINE PERSONA.  ON THE OTHER HAND, IN THE STRICT SENSE OF THE TEXT, I AGREE THAT I DON’T REALLY NEED POSTS OF NEWS ARTICLES PER SE.  I FOLLOW THE NEWS TWITTER FEEDS THEMSELVES FOR THAT KIND OF THING]

9. Do show your personality. Crack some jokes. [DOES TWEETING MY CONTRIBUTION TO INTERNATIONAL MONTY PYTHON STATUS DAY COUNT?]

10. Have fun. [TOO MUCH FUN.  I’VE GOT TO GET BACK TO WORK]

On a related note, I've been having a robust, sometimes contentious, sometimes inane discussion about Twitter over at the MLA LinkedIn group.  I'd be happy to have someone join that conversation as well.

Assessment, Knowledge, and Magic in the Humanities

I do not recommend becoming chair of the accreditation team at your college if you value your mental health.  I do, however, recommend it if you want to understand the multitude of cultures that make up your own institution and to think through how they all fit together, or not, in the common educational enterprise.  Like all living academics, it seems, we are grappling, with various levels of success, with the assessment tsunami that has hit higher education in the past decade or so.  One feature that's very evident is that different parts of campus have different attitudes toward assessment and its virtues or evils.  Among us humanists, there's still a large contingent that believes that what we do is unassessable, that our value can't be measured but that at the same time it should be obvious to everyone.

I think this approach is mostly self-defeating;  it seems to me that it mostly invokes a mystification that, if taken literally, means that we can’t even know ourselves what we mean when we say the humanities have a value that should be recognized by the institutions in which they live and move and have their being.  I do think there is a truth to which this mystification speaks.  Some of the most important moments in learning, perhaps the most crucial moments in learning, in the humanities are unreplicable and so unmeasurable. The fact that reading Soren Kierkegaard changed my life in some fundamental sense and filled me with a love for the life of the mind that has never since been expended is not a fact that means Kierkegaard should be required reading for everyone, as if reading SK were like learning an algebraic equation.

On the other hand, the fact of these transformative experiences–and most academics have had such liminal experiences or they wouldn't be academics–shouldn't lead us to say that nothing is measurable in what we do, or that because the things we can measure are not the liminal experiences that made us who we are, they are therefore not worth assessing at all.  This would be like saying that because the really crucial things in music are things like Verdi's Otello or Handel's Messiah, we shouldn't bother to see if a music teacher's methods are helping students learn to play Bach two-part inventions effectively.  It's less sexy, but if students can't do analogous things reasonably well, they won't ever be in a position to have the kinds of transformative experiences with humanistic work that we ourselves value and recognize as fundamental to who we are and who we hope our students will become.

Over at Digital Digs, Alex Reid has a very good post on assessment and the humanities, where he points to the need to develop forms of knowledge that help us get at things that will lead to useful change, and points out that doing this well is related to our oft-professed desire to be pursuing work that makes a difference in the world:

This is why, when it comes to assessment, I always ask “What kind of knowledge would we require in order to make a substantive change?” That question asks not only about the specific knowledge statement but the process by which the knowledge is constructed. Anecdotes are not strong enough. And my concern for the humanities is that it doesn’t believe that any knowledge is strong enough to make such decisions. This, of course, does not mean that curriculum doesn’t happen or that changes don’t occur. It simply means that we deny ourselves the opportunity to produce knowledge that is strong enough to inform decision-making. Instead we are left with individual feelings, opinions, and beliefs and whatever they amount to. A skeptic might say that this is all that humanistic knowledge has ever been. 

But I can’t believe that. I can’t afford to believe that. If we believe that as humanists we cannot produce knowledge of real value with the strength to make changes in the world, then what would we be doing as teachers or scholars? We would be engaged in some kind of self-pleasuring activity, perhaps with the idea that our performances might instill in others (through some quasi-magical, sympathetic incantation) a similar practice of finding self-pleasure (or aesthetic appreciation) through a purely subjective/cultural/discursive encounter with the objects we study. No doubt there is a strong strand of such thinking in the humanities, especially in English, that goes back at least to Matthew Arnold (though in his case the self-pleasure was imbued with a chaste religiosity rather than the psycho-sexual implications one probably sees here). However, no one would imagine self-pleasure as the sole goal of humanistic study. We must be able to produce knowledge that has the strength to make changes. And that requires an understanding of how knowledge is constructed and operates in a world that isn’t divided into natural, social, and discursive realms. And this is as true for our research and teaching as it is for assessment.

via digital digs: constructing academic knowledge.

Reid points out that, over and against this, we are usually driven by classroom lore–anecdotes about our students that seem to identify problems and lead to certain forms of common knowledge, but that never actually rise to the level of knowledge that can make a difference.  This is what we ought to be seeking in the humanities: knowledge that will make a difference in our students' lives.  That we clearly can't replicate those magical moments that all of us have had with books and culture doesn't mean we shouldn't be attending to those more mundane items that are the ground through which that magic happens.  And so we need to figure out basic questions like the following:

  • Have our students established a fundamental level of disciplinary literacy such that they are able to make connections across the discipline and find connections for their work in other disciplines?
  • Do our students understand how to enter into a disciplinary conversation through effective research, the development of an argument with a point of view, and a broad grasp (appropriate to an undergraduate) of the issues that are at stake for the argument in the discipline or in the culture at large?
  • Do our students understand logical fallacies, the appropriate use of evidence, and the nature of different rhetorical situations?
  • Can our students effectively discuss the application of their humanistic knowledge to non-academic areas of life, and can they effectively articulate the relationship of the skills and abilities they’ve developed to the world of work and careers following college?  (An outcome I realize not everyone may embrace, but which I have come to think of as fundamental following the Rethinking Success conference of a few weeks ago).

There are probably others, maybe many others that are more important, but these would be a start, and we ought to be willing to work to find the tools that will effectively measure such things even though none of these speak to the magical moments we and our students have when we are seized anew by an idea.

Barack Obama’s Waste Land; President as First Reader

GalleyCat reported today that the new biography of Barack Obama gives an extensive picture of Obama's literary interests, including a long excerpt of a letter in which Obama details his engagement with T.S. Eliot and his signature poem, The Waste Land. Obama's analysis:

Eliot contains the same ecstatic vision which runs from Münzer to Yeats. However, he retains a grounding in the social reality/order of his time. Facing what he perceives as a choice between ecstatic chaos and lifeless mechanistic order, he accedes to maintaining a separation of asexual purity and brutal sexual reality. And he wears a stoical face before this. Read his essay on Tradition and the Individual Talent, as well as Four Quartets, when he’s less concerned with depicting moribund Europe, to catch a sense of what I speak. Remember how I said there’s a certain kind of conservatism which I respect more than bourgeois liberalism—Eliot is of this type. Of course, the dichotomy he maintains is reactionary, but it’s due to a deep fatalism, not ignorance. (Counter him with Yeats or Pound, who, arising from the same milieu, opted to support Hitler and Mussolini.) And this fatalism is born out of the relation between fertility and death, which I touched on in my last letter—life feeds on itself. A fatalism I share with the western tradition at times.

A Portrait of Barack Obama as a Literary Young Man – GalleyCat.

For a 22-year-old, you'd have to say this is pretty good. I'm impressed with the nuance of Obama's empathetic imagination, both in his ability to perceive the differences between the three great conservative poets of that age and in his ability to identify with Eliot against his own political instincts. This is the kind of reading we'd like to inculcate in our students, and I think it lends credence to the notion that a mind trained in this kind of engagement might be better prepared for civic engagement than one that is not. But too often even literature profs are primarily readers of the camp, so to speak, lumping those not of their own political or cultural persuasion into the faceless, and largely unread, camp of the enemy, and appreciating without distinction those who further our pet or current causes.

This is too bad, reducing a richer sense of education for civic engagement into the narrower and counterproductive sense of reading as indoctrination. I think the older notion was a vision of education that motivated the founding fathers. Whatever one thinks of his politics, passages like this suggest to me that Obama could sit unembarrassed with Jefferson and Adams discussing in all seriousness the relationship between poetry and public life. It would be a good thing to expect this of our presidents, rather than stumbling upon it by accident.

Annotating Kierkegaard; an intellectual’s appreciation

I am largely an intellectual because of Soren Kierkegaard.  I mean this primarily in terms of intellectual biography rather than genealogy.  A few days ago I noted briefly my own vocational journey into English at the hands of T.S. Eliot.  That is a true tale.  However, at Eliot's hands, and through English alone as an undergraduate, I largely wanted to be the next great poet or novelist.  Kierkegaard taught me to think, or at least taught me that thinking was something a Christian could do, ought to do, with whatever capacity God had given him.  Through Kierkegaard I came to Walker Percy, subject of my undergraduate thesis, and then John Updike, subject of my first scholarly essay, and probably too to literary and cultural theory, which became a field of my doctoral studies and has remained a passion.  His writerly creativity, his playfulness with language, image, and authorial personae, never let me believe that critical writing was inherently inferior to fiction, even if it is often practiced poorly.

In honor of Kierkegaard's birthday yesterday, I took down some of my old SK from the shelf and blew the dust off.  The old Walter Lowrie paperback editions that were $3.95 back in the day.  The rapturous and pious annotations that fill the margins are now cringe-inducing, but I am reminded of the passions a deeply felt intellectual engagement can arouse.  A lot of the passages are marked over in four or five different colors of highlights and underlining, a way of trying to keep track, I suspect, of the many different readings I gave those books back in the day, a way of tracking the different person I was becoming.  And if I have now moved a long way from those Kierkegaardian roots into other, hipper modes of thinking, I'm also of an age where I've started realizing that the newest thing is not necessarily the best thing; it may only show you what you already knew without realizing it, rather than what you need to know.

I still think the Great Dane wears well.  His comments on sectarianism, as well as his more general clarity about easy piety, speak as much to our own age as to his.  And I still wonder sometimes, deep down, whether my first love was not the best.

From Fear and Trembling:

The true knight of faith is always absolute isolation, the false knight is sectarian. This sectarianism is an attempt to leap away from the narrow path of the paradox and become a tragic hero at a cheap price. The tragic hero expresses the universal and sacrifices himself for it. The sectarian punchinello, instead of that, has a private theatre, i.e. several good friends and comrades who represent the universal just about as well as the beadles in The Golden Snuffbox represent justice. The knight of faith, on the contrary, is the paradox, is the individual, absolutely nothing but the individual, without connections or pretensions. This is the terrible thing which the sectarian manikin cannot endure. For instead of learning from this terror that he is not capable of performing the great deed and then plainly admitting it (an act which I cannot but approve, because it is what I do) the manikin thinks that by uniting with several other manikins he will be able to do it. But that is quite out of the question. In the world of spirit no swindling is tolerated. A dozen sectaries join arms with one another, they know nothing whatever of the lonely temptations which await the knight of faith and which he dares not shun precisely because it would be still more dreadful if he were to press forward presumptuously. The sectaries deafen one another by their noise and racket, hold the dread off by their shrieks, and such a hallooing company of sportsmen think they are storming heaven and think they are on the same path as the knight of faith who in the solitude of the universe never hears any human voice but walks alone with his dreadful responsibility.

The knight of faith is obliged to rely upon himself alone, he feels the pain of not being able to make himself intelligible to others, but he feels no vain desire to guide others. The pain is his assurance that he is in the right way, this vain desire he does not know, he is too serious for that. The false knight of faith readily betrays himself by this proficiency in guiding which he has acquired in an instant. He does not comprehend what it is all about, that if another individual is to take the same path, he must become entirely in the same way the individual and have no need of any man’s guidance, least of all the guidance of a man who would obtrude himself. At this point men leap aside, they cannot bear the martyrdom of being uncomprehended, and instead of this they choose conveniently enough the worldly admiration of their proficiency. The true knight of faith is a witness, never a teacher, and therein lies his deep humanity, which is worth a good deal more than this silly participation in others’ weal and woe which is honored by the name of sympathy, whereas in fact it is nothing but vanity. He who would only be a witness thereby avows that no man, not even the lowliest, needs another man’s sympathy or should be abased that another may be exalted. But since he did not win what he won at a cheap price, neither does he sell it out at a cheap price, he is not petty enough to take men’s admiration and give them in return his silent contempt, he knows that what is truly great is equally accessible to all.

Either there is an absolute duty toward God, and if so it is the paradox here described, that the individual as the individual is higher than the universal and as the individual stands in an absolute relation to the absolute / or else faith never existed, because it has always existed, or, to put it differently, Abraham is lost.