Category Archives: education

Living in an e-plus world: Students now prefer digital texts when given a choice

A recent blog post by Nick DeSantis in the Chronicle points to a survey by the Pearson Foundation suggesting that tablet ownership is on the rise.  That’s not surprising, but more significant is the fact that among tablet users there’s a clear preference for digital texts over the traditional paper codex, something we haven’t seen before even among college students of this wired generation:

One-fourth of the college students surveyed said they owned a tablet, compared with just 7 percent last year. Sixty-three percent of college students believe tablets will replace textbooks in the next five years—a 15 percent increase over last year’s survey. More than a third said they intended to buy a tablet sometime in the next six months.

This year’s poll also found that the respondents preferred digital books over printed ones. It’s a reversal of last year’s results and goes against findings of other recent studies, which concluded that students tend to choose printed textbooks. The new survey found that nearly six in 10 students preferred digital books when reading for class, compared with one-third who said they preferred printed textbooks.

I find this unsurprising, as it matches up pretty well with my own experience.  Five years ago I could never have imagined doing any significant reading on a tablet.  Now I do all my reading of scholarly journals and long-form journalism–The Atlantic, the New York Review of Books, The Chronicle Review–on my iPad.  And while I still tend to prefer the codex for novels and other book-length works, the truth is that preference is slowly eroding as well.  As I become more familiar with the forms of e-reading, the notion of its inherent inferiority, like any unreflective prejudice, gradually fades in the face of familiarity.

And yet I greet the news of this survey with a certain level of panic: not panic that it should happen at all, but panic that the pace of change is quickening and we are hardly prepared (by “we” I mean those of us in the humanities, here at small colleges and elsewhere).  I’ve blogged on more than one occasion about my doubts about e-books, and yet also about my sense of their inevitable ascendancy.  For instance, here on the question of whether e-books are being foisted on students by a cabal of publishers and administrators like myself out to save a buck (or make a buck, as the case may be), and here on the nostalgic but still real feeling I have that the print codex has an irreplaceable individuality and physicality that the mere presence of text in a myriad of e-forms does not suffice to replace.

But though I’ve felt the ascendancy of e-books was inevitable, I think I imagined a 15- or 20-year span in which print and e-books would mostly live side by side.  Our own librarians here at Messiah College talk about a “print-plus” model for libraries, as if e-books will remain primarily an add-on for some time to come.  I wonder.  Just as computing power increases exponentially, it seems to me that the half-life of print books is rapidly diminishing.  I now wonder whether we will have even five years before students expect all their books to be digital: not just the hefty tomes for CHEM 101 that can be more nicely illustrated with iBooks Author, but their books for English and History classes as well.  This is an “e-plus” world, where print will increasingly be not the norm but the supplement, filling whatever gaps e-books have not yet bridged, whatever textual landscapes have not yet been digitized.

Despite warnings, we aren’t yet ready for an e-plus world.  Not only do we not know how to operate the apps that make these books available, we don’t even know how to critically study books in tablet form.  Yet we will have to learn what forms of critical engagement are possible and necessary.  I suspect, frankly, that our current methods developed out of what was made possible by the forms that texts took, rather than the forms following our methodological urgencies.  This means that the look of critical study in the classroom will change radically in the next ten years.  What will it look like?

The United States–Land of the Second-Language Illiterates

Approximately 80% of American citizens are monolingual, and the largest part of the rest are immigrants and their children. This statistic comes from Russell Berman, outgoing president of the MLA, in his valedictory address to the MLA convention in Seattle this past January. Berman’s address is a rousing and practical defense of the humanities in general, and especially of the role of second language learning in developing a society fully capable of engaging the global culture within which it is situated. From his address:

Let us remember the context. According to the National Foreign Language Center, some 80% of the US population is monolingual. Immigrant populations and heritage speakers probably make up the bulk of the rest. Americans are becoming a nation of second-language illiterates, thanks largely to the mismanagement of our educational system from the Department of Education on down. In the European Union, 50% of the population older than 15 reports being able to carry on a conversation in a non-native language, and the EU has set a goal of two non-native languages for all its citizens.

Second language learning enhances first language understanding: many adults can recall how high school Spanish, French, or German—still the three main languages offered—helped them gain a perspective on English—not only in terms of grammar but also through insights into the complex shift in semantic values across cultural borders. For this reason, we in the MLA should rally around a unified language learning agenda: teachers of English and teachers of other languages alike teach the same students, and we should align our pedagogies to contribute cooperatively to holistic student learning. We are all language teachers. For this reason, I call on English departments to place greater importance on second language knowledge, perhaps most optimally in expectations for incoming graduate students. Literature in English develops nowhere in an English-only environment; writing in any language always takes place in a dialectic with others. With that in mind, I want to express my gratitude to the American Studies Association for recently adopting a statement supportive of the MLA’s advocacy for language learning.

Berman goes on to recognize, however, that this strong and reasonable call for educated people to be conversant in more than one language is largely sent echoless into the void. Less than .1% of the discretionary budget of the Department of Education goes to support language learning. Indeed, I suspect this is because so many of us, even in higher education, are dysfunctional in a second language. I often tell people grimly that I can ask how to go to the bathroom in four different languages.

Nevertheless, in an age when we call for global engagement, imagine the importance of preparedness for a global marketplace, and want our students to be citizens of the world, it is irresponsible to continue to imagine that the world will conveniently learn to speak English for our sakes.


Should Humanities students learn to code?

One of the big questions that’s been on our minds in the digital humanities working group is whether and to what degree humanities students (and faculty!) need to have digital literacy or even fluency.  Should students be required to move beyond the ability to write a blog or create a wiki toward understanding and even implementing the digital tools that make blogs, wikis, and databases possible?  This essay from Anastasia Salter takes up the issue, largely in the affirmative, in a review of Douglas Rushkoff’s Program or Be Programmed.  Read more at: Should Humanities students learn to code?

Discussion of Henry Jenkins and Lev Manovich

I’m also blogging occasionally over at digitalhumanity.wordpress.org, our site for discussing Digital Humanities at Messiah College.  You’re invited to check in and see our flailing around as we try to get our minds around whatever it is that goes on with this field and try to think about how we might contribute.  Today we had a discussion of some work by Henry Jenkins and Lev Manovich.  A few of my notes can be found here.

In Praise of Reviews, Reviewing, and Reviewers

I think if I were born again by the flesh and not the spirit, I might choose to become a book reviewer in my second life.  Perhaps this is “true confessions,” since academics and novelists alike share a disdain for the review as a subordinate piece of work, and so for the reviewer as a lowly creature to be scorned.  However, I love the review as a form and see it as a way of exercising creativity, rhetorical facility, and critical consciousness.  In other words, with reviews I feel like I bring together all the different parts of myself: the creativity and rhetorical facility I developed through an MFA, and the critical consciousness of my scholarly self developed in graduate school at Duke.  I created my course on book reviewing here at Messiah College precisely because I think the review is one of the most challenging forms to do well: it must be engaging and persuasive for a generally educated audience while also informed and intelligent enough for an academic one.

Like Jeffrey Wasserstrom in the Chronicle Review, I also love reading book reviews, and I often spend vacation days catching up not on the latest novel or theoretical tome but on all the book reviews I’ve seen and collected on Instapaper.  Wasserstrom’s piece goes against the grain of a lot of our thinking about book reviews, even mine, and it strikes me that he’s absolutely right about a lot of what he says.  I often tell students that one of the primary purposes of book reviewers is to help sift wheat from chaff and tell other readers what is worth reading.  This is true, but only partially so.

Another way my thinking diverges from Lutz’s relates to his emphasis on positive reviews’ influencing sales. Of course they can, especially if someone as influential as, say, Michiko Kakutani (whose New York Times reviews I often enjoy) or Margaret Atwood (whose New York Review of Books essays I never skip) is the one singing a book’s praises. When I write reviews, though, I often assume that most people reading me will not even consider buying the book I’m discussing, even if I enthuse. And as a reader, I gravitate toward reviews of books I don’t expect to buy, no matter how warmly they are praised.

Consider the most recent batch of TLS issues. As usual, I skipped the reviews of mysteries, even though these are precisely the works of fiction I tend to buy. And I read reviews of nonfiction books that I wasn’t contemplating purchasing. For instance, I relished a long essay by Toby Lichtig (whose TLS contributions I’d enjoyed in the past) that dealt with new books on vampires. Some people might have read the essay to help them decide which Dracula-related book to buy. Not me. I read it because I was curious to know what’s been written lately about vampires—but not curious enough to tackle any book on the topic.

What’s true regarding vampires is—I should perhaps be ashamed to say—true of some big fields of inquiry. Ancient Greece and Rome, for example. I like to know what’s being written about them but rarely read books about them. Instead, I just read Mary Beard’s lively TLS reviews of publications in her field.

Reviews do influence my book buying—just in a roundabout way. I’m sometimes inspired to buy books by authors whose reviews impress me. I don’t think Lichtig has a book out yet, but when he does, I’ll buy it. The last book on ancient Greece I purchased wasn’t one Mary Beard reviewed but one she wrote.

I can only say yes to this.  It’s very clear that I don’t just read book reviews in order to make decisions as a consumer.  I read book reviews because I like them for themselves, if they are well done, but also to keep a finger on the pulse of what’s going on.  In other words, there’s a way in which I depend on good reviewers not to read in order to tell me what to buy, but to read in my place, since I can’t possibly read everything.  I remain very glad, though, that some very good reader-reviewers are out there reading the many good things there are to read.  I need them so I can have a larger sense of the cultural landscape than I could possibly achieve by trying to read everything on my own.

Wasserstrom also champions the short review, and speculates on the tweeted review and its possibilities:

I’ve even been musing lately about the potential for tweet-length reviews. I don’t want those to displace other kinds, especially because they can too easily seem like glorified blurbs. But the best nuggets of some reviews could work pretty well within Twitter’s haiku-like constraints. Take my assessment of Kissinger’s On China. When I reviewed it for the June 13 edition of Time’s Asian edition, I was happy that the editors gave me a full-page spread. Still, a pretty nifty Twitter-friendly version could have been built around the best line from the Time piece: “Skip bloated sections on Chinese culture, focus on parts about author’s time in China—a fat book w/ a better skinnier one trying to get out.”

The basic insight here is critical.  Longer is not always better.  I’m even tempted to say not often better, the usual length of posts to this blog notwithstanding.  My experiences on Facebook suggest that we may be in a new era of the aphorism, one that may exalt the wit of the 18th century, in which the pithy riposte may be more telling than the blowsy dissertation.

A new challenge for my students in the next version of my book-reviewing class: write a review that is telling, accurate, and rhetorically effective in 140 characters or less.

Create or Die: Humanities and the Boardroom

A couple of years ago, when the poet and former NEA chairman Dana Gioia came to Messiah College to speak to our faculty and the honors program, I asked him if he felt there were connections between his work as a poet and his other careers as a high-level government administrator and business executive.  Gioia hemmed and hawed a bit, but finally said no; he thought he was probably working out of two different sides of his brain.

I admit to thinking the answer was a little uncreative for so creative a person.  I’ve always been a bit bemused by, and interested in, the various connections I make between the different parts of myself.  Although I’m not a major poet like Gioia, I did complete an MFA and continue to think of myself in one way or another as a writer, regardless of what job I happen to be doing at the time.  And I actually think working on an MFA for those two years back in Montana has had a positive effect on my life as an academic and my life as an administrator.  Some day I plan to write an essay on the specifics, but it is partially related to the notion that my MFA did something to cultivate my creativity, to use the title of a very good article in the Chronicle Review by Steven Tepper and George Kuh.  According to Tepper and Kuh,

To fuel the 21st-century economic engine and sustain democratic values, we must unleash and nurture the creative impulse that exists within every one of us, or so say experts like Richard Florida, Ken Robinson, Daniel Pink, Keith Sawyer, and Tom Friedman. Indeed, just as the advantages the United States enjoyed in the past were based in large part on scientific and engineering advances, today it is cognitive flexibility, inventiveness, design thinking, and nonroutine approaches to messy problems that are essential to adapt to rapidly changing and unpredictable global forces; to create new markets; to take risks and start new enterprises; and to produce compelling forms of media, entertainment, and design.

Tepper and Kuh, a sociologist and a professor of education respectively, argue forcefully that creativity is not doled out by the forces of divine or genetic fate but is something that can be learned through hard work.  The idea is supported by recent findings that what musical and other artistic prodigies share is not so much genetic markers as practice: according to Malcolm Gladwell, 10,000 hours of it.  Tepper and Kuh believe the kinds of practice and discipline necessary for cultivating a creative mindset are deeply present in the arts, but only marginally present in many other popular disciplines on campus.

Granted, other fields, like science and engineering, can nurture creativity. That is one reason collaborations among artists, scientists, and engineers spark the powerful breakthroughs described by the Harvard professor David Edwards (author of Artscience, Harvard University Press, 2008); Xerox’s former chief scientist, John Seely Brown; and the physiologist Robert Root-Bernstein. It is also the case that not all arts schools fully embrace the creative process. In fact, some are so focused on teaching mastery and artistic conventions that they are far from hotbeds of creativity. Even so, the arts might have a special claim to nurturing creativity.

A recent national study conducted by the Curb Center at Vanderbilt University, with Teagle Foundation support, found that arts majors integrate and use core creative abilities more often and more consistently than do students in almost all other fields of study. For example, 53 percent of arts majors say that ambiguity is a routine part of their coursework, as assignments can be taken in multiple directions. Only 9 percent of biology majors say that, 13 percent of economics and business majors, 10 percent of engineering majors, and 7 percent of physical-science majors. Four-fifths of artists say that expressing creativity is typically required in their courses, compared with only 3 percent of biology majors, 16 percent of economics and business majors, 13 percent of engineers, and 10 percent of physical-science majors. And arts majors show comparative advantages over other majors on additional creativity skills—reporting that they are much more likely to have to make connections across different courses and reading; more likely to deploy their curiosity and imagination; more likely to say their coursework provides multiple ways of looking at a problem; and more likely to say that courses require risk taking.

Tepper and Kuh focus on the arts for their own purposes, and I realize that in thinking about the ways an MFA had helped me, I was in many respects thinking about an arts discipline.  However, the fact that the humanities are nowhere in their article set me to thinking.  How do the humanities stack up in cultivating creativity, and is this a selling point for parents and prospective students to consider as they imagine what their kids should study in college?  These are my reflections on the seven major “creative” qualities that Tepper and Kuh believe we should cultivate, and on how the humanities might do in each of them.

  1. the ability to approach problems in nonroutine ways using analogy and metaphor;
    • I think the humanities excel in this area for a variety of reasons.  For some disciplines like English and Modern Languages, we focus on the way language works with analogy and metaphor and we encourage its effective use in writing.  But more than that, I think many humanities disciplines can encourage nonroutine ways of approaching problems—though, of course, many professors in any discipline can be devoted to doing the same old things the same old way.
  2. conditional or abductive reasoning (posing “what if” propositions and reframing problems);
    • Again, I think a lot of our humanities disciplines approach things in this fashion.  To some degree this is because of a focus on narrative.  One way of getting at how a narrative works and understanding its meaning is to pose alternative scenarios.  What if we had not dropped the bomb on Japan?  What if Lincoln had not been shot?  How would different philosophical assumptions lead to different outcomes to an ethical problem?
  3. keen observation and the ability to see new and unexpected patterns;
    • I think we especially excel in this area.  Most humanities disciplines are deeply devoted to close and careful readings that discover patterns that are beneath the surface.  The patterns of imagery in a poem, the connections between statecraft and everyday life, the relationship between dialogue and music in a film.  And so forth.  Recognizing patterns in problems can lead to novel and inventive solutions.
  4. the ability to risk failure by taking initiative in the face of ambiguity and uncertainty;
    • I’m not sure whether the humanities inherently put a premium on this or not.  I know a lot of professors (or at least I as a professor) would rather students take a risk in doing something different on a project and have it not quite work than do the predictable thing.  But I don’t know that this is something inherent to our disciplines.  I do think, however, that our disciplines inculcate a high level of comfort with uncertainty and ambiguity.  Readings of novels or the complexities of history require us not to go for easy solutions but to recognize and work within the ambiguity of situations.
  5. the ability to heed critical feedback to revise and improve an idea;
    • Again, I would call this a signature strength, especially as it manifests itself in writing in our disciplines and in the back-and-forth exchange between students, and between student and teacher: being willing to risk an idea or an explanation, have it critiqued, and then revise one’s thinking in response to more persuasive arguments.
  6. a capacity to bring people, power, and resources together to implement novel ideas; and
    • I admit that on this one I think the humanities might be weaker than we should be, at least as manifested in our traditional ways of doing things.  Humanists in most of our disciplines are notorious individualists who would rather spend their time in libraries.  We don’t get much practice at gathering people together with the necessary resources to work collaboratively.  This can happen, but instinctively humanists often want to be left alone.  This is an area where a new attention to collaborative and project-based learning could help the humanities a great deal, something we could learn from our colleagues in the sciences and in arts like theatre and music.  I’m hopeful that some of the new attention we are giving to digital humanities at Messiah College will lead us in this direction.
  7. the expressive agility required to draw on multiple means (visual, oral, written, media-related) to communicate novel ideas to others.
    • A partial strength.  I do think we inculcate good expressive abilities in written and oral communication, and we value novel ideas.  But except for our folks in film and communication—as well as our cousins in art history—we are less good at working imaginatively with visual and other media-related resources.  This has been one of my priorities as dean, but something that’s hard to generate from the ground up.

Well, that’s my take.  I wonder what others think?

Grading the Crowd

Can the wisdom of crowds apply to grading student papers, or to the evaluation of culture more generally?  What about the quality of a theological argument, or a decision about foreign policy?  We’re taken a lot with the idea of crowds and collaboration lately, and not without good reason.  I think there’s a great deal to be said for getting beyond the notion of the isolated individual at work in his study; especially in the humanities, I think we need to learn something from our colleagues in the sciences and think through what collaborative engagement as a team of scholars might look like as a norm rather than an exception.  At the same time, is there a limit to collective learning and understanding?  Can we detect the difference between the wisdom of the crowd and the rather mindless preferences of a clique, or a mob?  I found myself thinking about these things again this evening as I read Cathy Davidson’s latest piece in The Chronicle Review, “Please Give Me Your Divided Attention: Transforming Learning for the Digital Age.”

Cathy Davidson, "Now You See It"

I wrote about Davidson a couple of days ago–she’s around a lot lately, as authors tend to be when a new book comes out that a publisher has decided to push–and I feel almost bad about taking up only my crabbiest reactions to her recent work.  First, let me say that I briefly crossed paths with Davidson at Duke, where she was hired the year I finished my doctorate in English.  She seemed like a breath of fresh and genuine air in a department that could sometimes choke on its collective self-importance, and the enthusiasm, generosity, and love of teaching that Davidson evinces in this essay were evident then as well, though I never had her for class.  And, as this suggests, I think there’s a lot in this essay that’s really important to grapple with.  Her account of the ways she and some of her colleagues at Duke trusted students with an experiment in iPod pedagogy shows that the trust paid off in many unexpected ways, and we now know a good bit of that work was far ahead of its time.  Moreover, she paints a wonderful picture of students as collaborative teachers in the learning process in her course on the way neuroscience is changing everything.  Still, as with a lot of these things that focus on student-centeredness, I find that promising insights are blinded by what amounts to a kind of ideology that may not be as deeply informed about human action as it really ought to be.  I felt this way in Davidson’s discussion of grading.

 There are many ways of crowdsourcing, and mine was simply to extend the concept of peer leadership to grading. The blogosphere was convinced that either I or my students would be pulling a fast one if the grading were crowdsourced and students had a role in it. That says to me that we don’t believe people can learn unless they are forced to, unless they know it will “count on the test.” As an educator, I find that very depressing. As a student of the Internet, I also find it implausible. If you give people the means to self-publish—whether it’s a photo from their iPhone or a blog—they do so. They seem to love learning and sharing what they know with others. But much of our emphasis on grading is based on the assumption that learning is like cod-liver oil: It is good for you, even though it tastes horrible going down. And much of our educational emphasis is on getting one answer right on one test—as if that says something about the quality of what you have learned or the likelihood that you will remember it after the test is over.

Grading, in a curious way, exemplifies our deepest convictions about excellence and authority, and specifically about the right of those with authority to define what constitutes excellence. If we crowdsource grading, we are suggesting that young people without credentials are fit to judge quality and value. Welcome to the Internet, where everyone’s a critic and anyone can express a view about the new iPhone, restaurant, or quarterback. That democratizing of who can pass judgment is digital thinking. As I found out, it is quite unsettling to people stuck in top-down models of formal education and authority.

Davidson’s last-minute veering into ad hominem covers over the fact that she doesn’t provide any actual evidence for the superiority of her method; offers a cultural fact in place of a substantive good—if this is how things are done in the age of digital thinking, it must be good, you old fogies—; crassly assumes that any theory of judgment that does not rely on the intuitions of 20-year-olds is necessarily anti-democratic and authoritarian; and glibly overlooks the social grounding within which her own experiment was even possible.  All of this does sound like a lot of the stuff that comes out of graduate departments in English, Duke not least of all, but I wonder if the judgment is really warranted.

An alternate example would be a class I occasionally teach, when I have any time to teach at all anymore, on book reviewing.  In the spirit of democratizing the classroom, I usually set this course up as a kind of book contest: students choose books to review, and on the basis of those reviews the books proceed through a process of winnowing until, with two books left, we write reviews of the finalists and vote for our book of the year.  The wisdom of the crowd does triumph in some sense, because through a process of persuasion students have to convince their classmates which books are worth reading next.  The class is partly about the craft of book reviewing, partly about the business of book publishing, and partly about theories of value and evaluation.  We spend time not only thinking about how to write effective book reviews for different markets but also discussing how theorists from Kant to Pierre Bourdieu to Barbara Herrnstein Smith treat the nature of value, all in an effort to think through what we are saying when we finally sit down and say one thing is better than another.

The first two times I taught this class, I gave the students different lists of books.  One list included books that had been shortlisted for book awards, one included first-time authors, and one included other books from notable publishers that I had collected during the previous year.  I told them that to begin the class they had to choose three books to read and review from the lists I had provided, and that at least one book had to be by a writer of color (my field of expertise being ethnic literature of the U.S., I reserved the right).  They could also choose one book through their own research to substitute for a book on one of my lists.  Debates are always spirited, and the reading is always interesting.  Students sometimes tell me that this was one of their favorite classes.

The most recent time I taught the class, I decided to take the democratizing one step further by allowing the students to choose three books entirely on their own.  Before class began I told them how to go about finding books through major industry organs like Publishers Weekly, as well as how to use the search engines on Amazon and elsewhere (their digital knowledge notwithstanding, students are often surprised at what you can do with a search engine).  The only other guidance was that students would ultimately have to justify their choices by defending in their reviews why they liked the books and thought they could be described as good works of literature, leaving open what we meant by terms like “good” and “literature,” since that was part of the purpose of the course.

The results were probably predictable but left me disheartened nonetheless.  Only one book out of the fifty-some in the first round was by a writer of color.  A predictable problem, but one I had kept my fingers crossed would not occur.  More than half the books my students chose were from the romance, mystery, fantasy, and science fiction genres.  Strictly speaking I didn’t have a problem with that, since I think great works of fiction can be written in all kinds of genres, and most works of what we call literary fiction bear the fingerprints of their less reputable cousins (especially mystery writing, in my view, but that’s another post).  I thought there might be a chance of some undiscovered gem in the mix.  I don’t have the time to read all fifty books, of course, but rely on students to winnow for me and then try to read every book that gets to the round of eight.  It’s fair to say that, in my personal authoritarian aesthetic, none of the books that fell into that generic category could have been called a great work of fiction, though several of them were decent enough reads.  Still, I was happy to go with this and see where things would take us, relatively sure that as things went on and we had to grapple with what it meant to evaluate prose, we would probably still come out with some pretty good choices.

Most of the works I would have considered literary were knocked out by the second round, though Jonathan Franzen’s Freedom made it all the way to the finals, paired against Lisa Unger’s entry from 2010, whose title I can’t even remember now.  In the end the class split almost down the middle, but chose Unger’s book as the best book they had read during the course of the semester.  Not that I thought it was a terrible book.  It was a nice enough read, and Unger is a decent enough mystery writer.  But being asked to remember the book is a little like being asked to remember my last trip to McDonald’s.  One doesn’t go there for a memorable dining experience, and one doesn’t read Lisa Unger in order to come up with books we will care to remember several weeks after having set them down.  But what was perhaps most intriguing to me was that after an hour-long discussion of the two books in which students offered spirited defenses of each writer, I asked them: if they could project themselves into the year 2020 and had to choose only one book to include on a syllabus for a course on the best books of 2010, which book would it be?  Without exception the students voted for Franzen’s book.  When I asked the students who changed their votes why this would be, they said, “We think that Franzen is more important; we just liked reading Unger more.”

This is the nub.  Can the wisdom of crowds decide what is most important?  To that, the answer can only be “sometimes.”  As often, crowds choose what is conveniently at hand, what satisfies a sweet tooth or even the desire for revenge.  Is there a distinction between what is important or what is true and what is merely popular?  Collaboration can lead us past blindnesses, but it is not clear that the subjectivity of a crowd is anything but blind (in my original draft I typed “bling,” a telling typographical slip and one that may be truer and more interesting than “blind”).  It is not clear that crowds can consistently be relied upon to intuit what ought to last.  This may not be digital thinking, but at least it is thinking, something crowds cannot always be relied upon to do.

If we could really rely on crowds to make our choices, we would discover that there is really very little to choose between almost anything.  On Amazon, what is amazing is that four stars is the average score for the hundreds of thousands of books that are catalogued.  And popularity trumps everything:  Lisa Scottoline scores higher in a lot of cases than Jane Austen.  Literally everything is above average and worth my time.  This is because in the world of the crowd, people mostly choose to be with the crowds that are most like themselves and read the things most likely to reinforce their sense that they are in the right crowd to begin with.  This is true, as even elementary studies of internet usage have pointed out.  Liberals read other liberals, and delight in their wisdom and the folly of conservatives.  Conservatives read other conservatives and do likewise.  This too is digital thinking, and in this case it is quite easy to see that crowds can become authoritarian over and against the voice of the marginalized.  My students’ choosing not to read writers of color unless I tell them to is only one small reminder of that.

Which leads to one last observation.  I wonder, indeed, whether this experiment worked so well at Duke because students at Duke already know what it takes to get an A.  That is, in some sense Davidson is not really crowdsourcing at all but is relying on the educational processes that deliver students well attuned to certain forms of cultural excellence, able to create effectively and “challenge” the status quo because they are already deeply embedded within those forms of cultural excellence and all the assumptions they entail.  That is, as with many pedagogical theories offered by folks at research institutions, Davidson isn’t theorizing from the crowd but from a tiny, elite, and extremely accomplished sample.  As Gerald Graff points out, most of us most want to teach the students who don’t need us, which means most of us want to teach at places where none of the students actually need us.  Davidson’s lauding her students for not really needing her to learn may merely be an index of their privilege, not of the inherent wisdom of crowds or the superiority of her pedagogical method.  Her students have already demonstrated that they know what it takes to get A’s in almost everything, because they have them—and massively high test scores besides—or they are unusually gifted in other ways, or they wouldn’t be at Duke.  These are the students who not only read their whole AP reading list the summer before class started; they also read all the books related to the AP reading list and attended tutoring sessions to learn to write about them besides.

Let me hasten to say that there is absolutely nothing wrong with their being accomplished.  On some level, I was one of them, having gone to good private colleges and elite graduate programs.  But it is a mistake to assume that the well-learned practices of the elite, the cultural context that reinforces those practices, and the habits of mind that enable the kinds of things Davidson accomplished actually form the basis for a democratizing pedagogy for everyone.  Pierre Bourdieu 101.

We’re all pre-professional now

I’ve been catching up this evening on a backlog of reading I’ve stored on Instapaper.  (I’m thinking “backlog” might be the right word:  I don’t think you can have a “stack” on an iPad.)  A few weeks back Cathy Davidson down at Duke University had an interesting piece on whether college is for everyone.  Davidson’s basic thesis, as the title suggests, is no.  Despite the nationalist rhetoric that attends our discussions of higher education–we will be a stronger America if every Tom, Dick, and Henrietta has a four-year degree–maybe, Davidson suggests, we’d have a better society if we attended to and nurtured the multiple intelligences and creativities that abound in our society, and recognized that many of those are best nurtured somewhere other than in a college or university:

The world of work — the world we live in — is so much more complex than the quite narrow scope of learning measured and tested by college entrance exams and in college courses. There are so many viable and important and skilled professions that cannot be outsourced to either an exploitative Third World sweatshop or a computer, that require face-to-face presence, and a bucketload of skills – but that do not require a college education: the full range of IT workers, web designers, body workers (such as deep tissue massage), yoga and Pilates instructors, fitness educators, hairdressers, retail workers, food industry professionals, entertainers and entertainment industry professionals, construction workers, dancers, artists, musicians, entrepreneurs, landscapers, nannies, elder-care professionals, nurse’s aides, dog trainers, cosmetologists, athletes, sales people, fashion designers, novelists, poets, furniture makers, auto mechanics, and on and on.

All those jobs require specialized knowledge and intelligence, but most people who end up in those jobs have had to fight for the special form their intelligence takes because, throughout their lives, they have never seen their particular ability and skill set represented as a discipline, rewarded with grades, put into a textbook, or tested on an end-of-grade exam. They have had to fight for their identity and dignity, their self-worth and the importance of their particular genius in the world, against a highly structured system that makes knowledge into a hierarchy with creativity, imagination, and the array of so-called “manual skills” not just at the bottom but absent.

Moreover, Davidson argues that not only does our current educational system fail to recognize and value these kinds of skills on the front end; when we actually get students into college, we narrow their interests yet further:

All of the multiple ways that we learn in the world, all the multiple forms of knowing we require in order to succeed in a life of work, is boiled down to an essential hierarchical subject matter tested in a way to get one past the entrance requirements and into a college. Actually, I agree with Ken Robinson that, if we are going to be really candid, we have to admit that it’s actually more narrow even than that: we’re really, implicitly training students to be college professors. That is our tacit criterion for “brilliance.” For, once you obtain the grail of admission to higher ed, you are then disciplined (put into majors and minors) and graded as if the only end of your college work were to go on to graduate school where the end is to prepare you for a profession, with university teaching of the field at the pinnacle of that profession.

Which brings me to my title.  We’re all pre-professional now.  Since the advent of the university, if not before, there has been a partisan debate between growing pre-professional programs and what are defined as the “traditional liberal arts,” though in current practice, given the cachet of science programs in the world of work, this argument is sometimes really between the humanities and the rest of the world.

Nevertheless, I think Davidson points out that in the actual practice of the humanities in many departments around the country, this distinction is specious.  Many humanities programs conceive of themselves as preparing students for grad school.  In the humanities.  In other words, we imagine ourselves as ideally preparing students who are future professionals in our profession.  These are the students who receive our attention, the students we hold up as models, the students we teach to, and the students for whom we construct our curricula, offer our honors, and save our best imaginations.  What is this, if not a description of a pre-professional program?  So captive are we to this conceptual structure that it becomes hard to imagine what it would mean to form an English major, History major, or Philosophy major whose primary implicit or explicit goal was not to reproduce itself, but to produce individuals who will work in the world of business–which most of them will do–or in non-profit organizations, or in churches and synagogues, or somewhere else that we cannot even begin to imagine.  We get around this with a lot of talk about transferable skills, but we don’t actually do a great deal to help our students understand what those skills are or what they might transfer to.  So I think Davidson is right to point this out and to suggest that there’s something wrongheaded going on.

That having been said, a couple of points of critique:

Davidson rightly notes these multiple intelligences and creativities, and she rightly notes that we have a drastically limited conception of society if we imagine a four-year degree is the only way to develop these intelligences and creativities in an effective fashion.  But Davidson remains silent on the other roles of higher education, the forming of an informed citizenry being only one.  Some other things I’ve seen from Davidson, including her new book Now You See It, suggest she’s extremely excited about all the informal ways that students are educating themselves, and that she doubts higher education’s traditional roles, believing its role as a producer and disseminator of knowledge has been drastically undermined.  I have my doubts.  It is unclear that a couple of decades of the internet have actually produced a more informed citizenry.  Oh, yes, informed in all kinds of ways about all kinds of stuff, like the four thousand sexual positions in the Kama Sutra, but informed in a way that allows for effective participation in the body politic?  I’m not so sure.

I think this is so because to be informed is not simply to possess information, but to be shaped, to be in-formed.  In higher education this means receiving a context for how to receive and understand information, tools for analyzing, evaluating, and using information, and the means for creating new knowledge for oneself.  To be sure, the institutions of higher education are not the only places where this happens, but it is clear that it doesn’t just happen willy-nilly because people have a Fios connection.

What higher education can and should give, then, is a lot of the values and abilities associated with a liberal arts education traditionally conceived–as opposed to conceived as a route to a professorship–and these are values, indeed, that everyone should possess.  Whether that requires everyone to have a four-year degree is an open question.  It may be that we need to rethink our secondary educational programs so that they inculcate liberal arts learning in a much more rigorous and effective way than they do now.  But I still doubt that the kind of learning I’m talking about can be achieved simply by 17-year-olds in transformed high schools.  Higher education should be a place for the maturing and transformation of young minds toward a larger understanding of the world and their responsibilities to it, which it sometimes is today, but should be more often.

Humanities and the workplace: or, bodysurfing the tsunami

As I suggested in my last post on the demise of Borders, book lovers have lived in an eternal tension between the transcendent ideals their reading often fosters and the commercial realities upon which widespread literacy has depended. The same tension is broadly evident in the humanities’ response to professional programs, or more broadly to the question of career preparation. We are not wrong to say that an education in history or English is much more than career preparation; nor are we wrong to insist that a college education has to be about much more than pre-professional training. (Not least because most college graduates end up doing things a long distance from their original preparation, and we ought to see that the humanities in combination with other knowledges in the arts and sciences are at least as good at preparing students for the twists and turns of an eventual career, and perhaps even better, than fields focused on narrower practical preparations.)

However, we are absolutely wrong to assume that questions of career are extraneous, or that they ought to be secondary to our students or to our thinking about how we approach curricula.

Daniel Everett, dean of arts and sciences at Bentley University, offers a provocative reflection on the need to integrate the humanities into professional education. According to Everett:

“Programs that take in students without proper concern for their future or provision for post-graduate opportunities — how they can use what they have learned in meaningful work — need to think about the ethics of their situation. Students no longer come mainly from the leisured classes that were prominent at the founding of higher education. Today they need to find gainful employment in which to apply all the substantive things they learn in college. Majors that give no thought to that small detail seem to assume that since the humanities are good for you, the financial commitment and apprenticeship between student and teacher is fully justified. But in these cases, the numbers of students benefit the faculty and particular programs arguably more than they benefit the students themselves. This is a Ponzi scheme. Q.E.D.”

These are harsh words, but worth considering. I tend not to like Bentley’s particular solutions to the degree that they reduce the humanities to an enriching complement to the important business of, well, business. However, I do think we need to find new ways of creating majors that will prepare students for the realities of 21st-century employment. Majors that allowed for concentrations in digital humanities would prepare students to engage the changing nature of our disciplines while also gaining technical skills that could serve them well in business. New joint programs with the sciences, like those found in medical humanities programs, could prepare students in new ways for work in the health care industry. Everett warns of what may happen if humanities programs don’t creatively remake themselves to meet the changing needs of our contemporary world:

“If, like me, you believe that the humanities do have problems to solve, I hope you agree that they are not going to be solved by lamenting the change in culture and exhorting folks to get back on course. That’s like holding your finger up to stop a tidal wave. Thinking like this could mean that new buildings dedicated to the humanities will wind up as mausoleums for the mighty dead rather than as centers of engagement with modern culture and the building of futures in contemporary society.”

Again, I don’t like all of the particular responses Everett advocates, but I do agree that there is a problem to be addressed, one that continued proclamations about transferable skills are unlikely to solve. What is sometimes called the applied humanities may be a way of riding the wave rather than being drowned by it.

Ephemeral, all too ephemeral; Or, does anybody remember what Russell Jacoby wrote yesterday?

David Rothman over at Teleread announced today that I’ll be blogging for them on a semi-regular basis. As I’ve indicated on this blog in the past, I think Teleread is a very good portal for getting good information about all things reading, with a special emphasis on e-books and the general interface between the digital world and books. Also, though I’m not involved in this whole end of things, they function as an advocacy group for universal computer and open library access. A worthy cause in my estimation.

David wasn’t dissuaded by my protestations that I don’t consider myself an expert on reading, e-books, or blogging. I’ll try to keep it from being too much the amateur hour.

But wait, some people are saying that blogging is THE amateur hour, to the destruction of our civilization. So perhaps I’ll just do my best to contribute to the end of all things.

Seriously though, as this blog reaches its one-month anniversary, I’ve realized that I’ve been thinking at least as much about how blogging works as writing as I have about how reading functions in our contemporary culture. Russell Jacoby, whom I generally respect for his now somewhat ancient call for a renewal of public intellectual work, wrote a kind of scathing dismissal of the contemporary academy, and along with it a rather sniffy dismissal of blogging.

I think Jacoby’s categorical thinking is part and parcel of our problem in grappling with the nature of reading and writing—and thinking and intellectual work—in the contemporary world. It is true that blogs are somewhat ephemeral, but on the other hand, how many of the many periodical essays of the 1950s do people really remember? Moreover, if blogging specifically eschews the media of print and permanence, we might ask what kind of intellectual work, what kind of cultural work generally, is being done by this kind of writing.

It seems to me that applying the standard of permanent value to any public intellectual work is wrongheaded. The things Edmund Wilson and Lionel Trilling had to say in the 30s, 40s, and 50s can hardly apply to our world at the dawn of the 21st century. But more importantly, they did a specific kind of public intellectual work that was appropriate for the media and the age in which they worked.

Blogging can be “id writing,” as it was described in the New York Times the other day. But it can also be a kind of intellectual and cultural work much different from those we are used to. The 1950s saw both a lot of what we now envy as influential public intellectual work and the explosion onto the scene of Playboy magazine. Perhaps this is why people claim even today to buy it for the writing.

My point is that magazines and middle- to upper-middle-brow journals were a medium through which to do particular kinds of intellectual work. It seems to me that blogs and other kinds of interactive and multimedia formations are the site of potentially important intellectual work today.

But what is the nature of that work? How does it differ from that of the past? Well, for one thing, I think it aspires more to the kind of interchanges that went on in the coffeehouse culture of the 18th century, when intellectual work was not yet located (imprisoned?) in the as-yet-undeveloped academy. It’s much more like the intellectual work that goes on in classrooms or in informal exchanges with colleagues and students after a lecture or reading.

Academics mostly sniff at this kind of thing as beneath their consideration. But it seems to me that it is precisely out of these kinds of exchanges that enduring and important intellectual work comes. This is why I decided to start blogging a month ago. I wanted to imagine an intellectual space, a conversational space, that wasn’t imprisoned by the conventions of academic work. I also wanted to imagine doing writing that carried more of the energy I feel in conversations with good friends about topics I care about deeply.

I think, honestly, that this will lead to another kind of writing that is more conventional. I still want to publish a book on reading in the contemporary cultural imaginary. On the other hand, even as I continue to envision a more conventional form of intellectual work, I’m coming to think that we are mistaken if we believe blogging or other kinds of intellectual exchange on the web can only be justified if they lead to that kind of thing. If they do, great. But if they don’t, they are their own kind of good. When Jacoby criticizes blogs for their ephemerality, he is holding out the standard of books, and even more of books that continue to be read decades after they are written, as the only worthy definition of intellectual work. I dispute that. The immediacy of blogging, the effort to grapple in written words with difficult problems, the opportunity to have friends and strangers read and respond: these are their own goods. This is a kind of intellectual work worth doing regardless of what other forms it may lead to, and regardless of whether it leads anywhere else at all. Just as the engagement I have in my classroom is its own good and produces knowledge worth having in me and in my students, the process of writing, of thinking through issues on a daily or weekly basis, is a knowledge worth pursuing even if it disappears next week or next year.

Well, enough for now. I’ll try to come back later and put in some links for Jacoby’s piece and some other things I refer to in this post. Right now I’ve got to go party with my wife. She thinks I love my blog more than her, so I better go prove her wrong.