
Grading the Crowd

Can the wisdom of crowds apply to grading student papers, or to the evaluation of culture more generally? What about the quality of a theological argument, or a decision about foreign policy? We're much taken with the idea of crowds and collaboration lately, and not without good reason. I think there's a great deal to be said for getting beyond the notion of the isolated individual at work in his study; especially in the humanities, I think we need to learn something from our colleagues in the sciences and think through what collaborative engagement as a team of scholars might look like as a norm rather than an exception. At the same time, is there a limit to collective learning and understanding? Can we detect the difference between the wisdom of the crowd and the rather mindless preferences of a clique, or a mob? I found myself thinking about these things again this evening as I read Cathy Davidson's latest piece in The Chronicle Review, "Please Give Me Your Divided Attention: Transforming Learning for the Digital Age."

Cathy Davidson, "Now You See It"

I wrote about Davidson a couple of days ago–she's around a lot lately, as authors tend to be when a new book comes out that a publisher has decided to push—and I feel almost bad about taking up only my crabbiest reactions to her recent work. First, let me say that I briefly crossed paths with Davidson at Duke, where she was hired the year I finished my doctorate in English. She seemed like a breath of fresh and genuine air in a department that could sometimes choke on its collective self-importance, and the enthusiasm, generosity, and love of teaching that Davidson evinces in this essay were evident then as well, though I never had her for class. And, as this comment suggests, I think there's a lot in this essay that's really important to grapple with. First, there's her account of the ways that she and some of her colleagues at Duke trusted students with an experiment in iPod pedagogy, which paid off in many unexpected ways and, we now know, was far ahead of its time. Moreover, she paints a wonderful picture of students as collaborative teachers in the learning process in her course on the way neuroscience is changing everything. Still, as with a lot of these things that focus on student-centeredness, I find that promising insights are obscured by what amounts to a kind of ideology, one that may not be as deeply informed about human action as it really ought to be. I felt this way about Davidson's discussion of grading.

 There are many ways of crowdsourcing, and mine was simply to extend the concept of peer leadership to grading. The blogosphere was convinced that either I or my students would be pulling a fast one if the grading were crowdsourced and students had a role in it. That says to me that we don’t believe people can learn unless they are forced to, unless they know it will “count on the test.” As an educator, I find that very depressing. As a student of the Internet, I also find it implausible. If you give people the means to self-publish—whether it’s a photo from their iPhone or a blog—they do so. They seem to love learning and sharing what they know with others. But much of our emphasis on grading is based on the assumption that learning is like cod-liver oil: It is good for you, even though it tastes horrible going down. And much of our educational emphasis is on getting one answer right on one test—as if that says something about the quality of what you have learned or the likelihood that you will remember it after the test is over.

Grading, in a curious way, exemplifies our deepest convictions about excellence and authority, and specifically about the right of those with authority to define what constitutes excellence. If we crowdsource grading, we are suggesting that young people without credentials are fit to judge quality and value. Welcome to the Internet, where everyone’s a critic and anyone can express a view about the new iPhone, restaurant, or quarterback. That democratizing of who can pass judgment is digital thinking. As I found out, it is quite unsettling to people stuck in top-down models of formal education and authority.

Davidson's last-minute veering into ad hominem covers over the fact that she doesn't provide any actual evidence for the superiority of her method, that she offers a cultural fact as a substantive good—if this is how things are done in the age of digital thinking, it must be good, you old fogies—that she seems to assume, rather crassly, that any theory of judgment that does not rely on the intuitions of twenty-year-olds is necessarily anti-democratic and authoritarian, and that she glibly overlooks the social grounding within which her own experiment was even possible. All of this does sound like a lot of the stuff that comes out of graduate departments in English, Duke not least of all, but I wonder if the judgment is really warranted.

An alternate example would be a class I occasionally teach, when I have any time to teach at all anymore, on book reviewing. In the spirit of democratizing the classroom, I usually set this course up as a kind of book contest: students choose books to review, and on the basis of those reviews books proceed through a process of winnowing until at last, with two books left, we write reviews of the finalists and then vote for our book of the year. The wisdom of the crowd does triumph in some sense, because through a process of persuasion students have to convince their classmates which books are worth reading next. The class is partly about the craft of book reviewing, partly about the business of book publishing, and partly about theories of value and evaluation. We spend time not only thinking about how to write effective book reviews for different markets but also discussing how theorists from Kant to Pierre Bourdieu to Barbara Herrnstein Smith treat the nature of value, all in an effort to think through what we are saying when we finally sit down and say that one thing is better than another.

The first two times I taught this class, I gave the students different lists of books. One list included books that had been shortlisted for book awards, one list included first-time authors, and one list included other books from notable publishers that I had collected during the previous year. I told them that to begin the class they had to choose three books to read and review from the lists I had provided, and that at least one book had to be by a writer of color (my field of expertise being ethnic literature of the U.S., I reserved the right). They could also choose one book through their own research to substitute for a book on one of my lists. Debates are always spirited, and the reading is always interesting. Students sometimes tell me that it was one of their favorite classes.

The most recent time I taught the class, I decided to take the democratizing one step further by allowing the students to choose three books entirely on their own. Before class began I told them how to go about finding books through major industry organs like Publishers Weekly, as well as how to use the search engines on Amazon and elsewhere—their digital savvy notwithstanding, students are often surprised at what you can do with a search engine. The only other guidance was that students would ultimately have to justify their choices by defending in their reviews why they liked the books and thought they could be described as good works of literature, leaving open what we meant by terms like "good" and "literature," since that was part of the purpose of the course.

The results were probably predictable but left me disheartened nonetheless. Only one book out of the fifty-some books in the first round was by a writer of color. A predictable problem, but one I had kept my fingers crossed would not occur. More than half the books chosen by my students were from the romance, mystery, fantasy, and science fiction genres. Strictly speaking I didn't have a problem with that, since I think great works of fiction can be written in all kinds of genres, and most works of what we call literary fiction bear the fingerprints of their less reputable cousins (especially mystery writing, in my view, but that's another post). I thought there might be some undiscovered gem in the mix. I did not have the time to read all fifty books, of course, but relied on students to winnow for me and then tried to read every book that made it to the round of eight. It's fair to say that in my personal authoritarian aesthetic, none of the books that fell into those genre categories could have been called a great work of fiction, though several of them were decent enough reads. Still, I was happy to go along and see where things would take us, relatively sure that as things went on and we had to grapple with what it meant to evaluate prose, we would probably still come out with some pretty good choices.

Most of the works that I would have considered literary were knocked out by the second round, though Jonathan Franzen's Freedom made it all the way to the finals, paired against Lisa Unger's entry from 2010, whose title I can't even remember now. In the end the class split almost down the middle, but chose Unger's book as the best book they had read during the course of the semester. Not that I thought it was a terrible book. It was a nice enough read, and Unger is a decent enough mystery writer. But being asked to remember the book is a little like being asked to remember my last trip to McDonald's. One doesn't go there for a memorable dining experience, and one doesn't read Lisa Unger in order to come up with books we will care to remember several weeks after having set them down. But what was perhaps most intriguing to me was that after an hour-long discussion of the two books, in which students offered spirited defenses of each writer, I asked them: if they could project themselves into the year 2020 and had to choose only one book to include on a syllabus for a course on the best books of 2010, which book would it be? Without exception the students voted for Franzen's book. When I asked the students who changed their votes why this would be, they said, "We think that Franzen is more important; we just liked reading Unger more."

This is the nub. Can the wisdom of crowds decide what is most important? To that, the answer can only be "sometimes." Just as often, crowds choose what is conveniently at hand, what satisfies a sweet tooth, or even the desire for revenge. Is there a distinction between what is important or what is true and what is merely popular? Collaboration can lead us past blindnesses, but it is not clear that the subjectivity of a crowd is anything but blind (in my original draft I typed "bling," a telling typographical slip, and one that may be truer and more interesting than "blind"). It is not clear that crowds can consistently be relied upon to intuit what ought to last. This may not be digital thinking, but at least it is thinking, something crowds cannot always be relied upon to do.

If we could really rely on crowds to make our choices, we would discover that there is really very little to choose between almost anything. Go on Amazon, and what is amazing is that four stars is the average score for the hundreds of thousands of books that are catalogued there. And popularity trumps everything: Lisa Scottoline scores higher in a lot of cases than Jane Austen. Literally everything is above average and worth my time. This is because in the world of the crowd, people mostly choose to be with those crowds that are most like themselves and read those things that are most likely to reinforce the sense they have that they were in the right crowd to begin with. This is true, as even elementary studies of internet usage have pointed out. Liberals read other liberals, and delight in their wisdom and the folly of conservatives. Conservatives read other conservatives and do likewise. This too is digital thinking, and in this case it is quite easy to see that crowds can become authoritarian over and against the voice of the marginalized. My students' choosing not to read writers of color unless I tell them to is only one small reminder of that.

Which leads to one last observation. I wonder, indeed, whether this experiment worked so well at Duke because students at Duke already know what it takes to get an A. That is, in some sense Davidson is not really crowdsourcing at all but is relying on the educational processes that deliver students well attuned to certain forms of cultural excellence, able to create effectively and "challenge" the status quo because they are already deeply embedded within those forms of cultural excellence and all the assumptions they entail. That is, as with many pedagogical theories offered by folks at research institutions, Davidson isn't theorizing from the crowd but from a tiny, elite, and extremely accomplished sample. As Gerald Graff points out, most of us most want to teach the students who don't need us, which means most of us want to teach at places where none of the students actually need us. Davidson's lauding her students for not really needing her in order to learn may merely be an index of their privilege, not of the inherent wisdom of crowds or the superiority of her pedagogical method. Her students have already demonstrated that they know what it takes to get A's in almost everything because they have them—and massively high test scores besides—or they are unusually gifted in other ways, or they wouldn't be at Duke. These are the students who not only read their whole AP reading list the summer before class started but also read all the books related to the AP reading list, and attended tutoring sessions to learn to write about them besides.

Let me hasten to say that there is absolutely nothing wrong with their being accomplished. On some level, I was one of them, having gone to good private colleges and elite graduate programs. But it is a mistake to assume that the well-learned practices of the elite, the cultural context that reinforces those practices, and the habits of mind that enable the kinds of things Davidson accomplished actually form the basis for a democratizing pedagogy for everyone. Pierre Bourdieu 101.

Tchotchkes R'Us: Formerly known as Barnes and Noble, Booksellers

Like a beaten dog wagging its tail as it returns to its master for one more slap, I keep returning to Barnes and Noble, hoping for a dry bone or at least a condescending pat on the head. Mostly I get the slap. I'm wondering lately how much longer B&N can hold on to the subtitle of its name with a straight face. I admit that for purists Barnes and Noble was never much of a bookseller in the first place, the corporate ambiance just a bit too antiseptic for the crowd that prefers its books straight, preferably with the slightest scent of dust and mold. But for a person who has spent the entirety of his life in flyover country, Barnes and Noble and its recently deceased cousin Borders seemed something like salvation. If the ambiance was corporate, the books were real, and they were many. If there were too few from independent publishers, there were more than enough good books to last any reader a lifetime, and I spent many hours on my own wandering the shelves, feeling that ache all readers know, the realization that there are too many good books and one lifetime will never be enough.

Barnes and Noble became a family affair for us. One way I induced the habit of reading in my kids was to promise them I'd take them to B&N anytime they finished a book and buy them another one. The ploy paid off. My kids read voraciously, son and daughter alike, putting the lie to the notion that kids today have to choose between reading and surfing. My kids do both just fine, and I think this is attributable in no small part to the fact that our family time together was spent wandering the endless aisles of bookstores, imagining the endless possibilities, what the world would be like if we only had enough time to read them all. Other families go on kayak trips; we read books. I'm not sorry for the tradeoffs.

All that is mostly over, for paper books anyway. My son and I still go over to Barnes and Noble, but on the last three trips we've come out saying the same thing to one another without prompting–worthless. Aisle after aisle of bookshelves in our local store is being replaced by toys and tchotchkes designed to…do what? It's not even clear. At least when the place was dominated by books it was clear that this was where you went for books. Now it seems like a vaguely upscale Walmart with a vast toy section. I'm waiting for the clothing section to open up soon.

I don't think we should underestimate the consequences of these changes for booksellers and book readers. Although readers will still be able to get books via your local Amazon.com, the place of books is changing in radical ways. The advent of e-books is completely reordering the business of bookselling–and I would say the culture of books as well. An article in the Economist notes that books are following into the sucking vortex that has swallowed the music and newspaper industries all but whole. Among the biggest casualties is the bricks-and-mortar bookstore, and this is of no small consequence to the effort to sell books in general:

Perhaps the biggest problem, though, is the gradual disappearance of the shop window. Brian Murray, chief executive of HarperCollins, points out that a film may be released with more than $100m of marketing behind it. Music singles often receive radio promotion. Publishers, on the other hand, rely heavily on bookstores to bring new releases to customers’ attention and to steer them to books that they might not have considered buying. As stores close, the industry loses much more than a retail outlet. Publishers are increasingly trying to push books through online social networks. But Mr Murray says he hasn’t seen anything that replicates the experience of browsing a bookstore.

Confession: I actually enjoy browsing Amazon, and I read book reviews endlessly. But I think this article is right that there is nothing quite like the experience of browsing the shelves at a bookstore, in part because it is a kind of communal event. It is not that there are more choices–there aren't; there are far more choices online. Nor is it necessarily that things are better organized. I have a better chance of finding things relevant to my interests through a search engine than through a chance encounter in the stacks. And, indeed, The Strand is a bookstore that leaves me vaguely nauseated and dizzy, both because there is too much choice and because there is too little organization. But the physical fact of browsing with one's companions through the stacks, the chance encounter with a book you had heard about but never seen, the excitement of discovery, the anxious calculations–at least if you are an impoverished graduate student or a new parent–as to whether you have enough cash on hand to make the purchase now or should take a chance that the book will disappear if you wait: all of these get left behind in the sterility of the online exchange. The bookstore is finally a cultural location, a location of culture, where book-minded people go for the buzz they get from being around other book-minded people. I can get my books from Amazon, and I actually don't mind getting them as e-books, avoiding all the fuss of going down and having a face-to-face transaction with a seller. But that face-to-face is part of the point, it seems to me. Even though book-buying has always fundamentally been an exchange of cash for commodity, the failure to see that it was also more than that is the cultural poverty of the world that Amazon creates. With bookstores dying a rapid death and libraries close upon their heels, I'm feeling a little like a man without a country, since the country of books is the one to which I've always been most loyal.

I am, of course, sounding old and crotchety. The same article in the Economist notes that IKEA is now changing its bookshelf line in anticipation that people will no longer use the shelves for books.

TO SEE how profoundly the book business is changing, watch the shelves. Next month IKEA will introduce a new, deeper version of its ubiquitous “BILLY” bookcase. The flat-pack furniture giant is already promoting glass doors for its bookshelves. The firm reckons customers will increasingly use them for ornaments, tchotchkes and the odd coffee-table tome—anything, that is, except books that are actually read.

I suspect this may be true. Bookshelves and books alike may become craft items, things produced by those curious folks who do things by hand, items you can only buy at craft fairs and auctions, something you can't find at Wal-Mart or Barnes and Noble.

We’re all pre-professional now

I've been catching up this evening on a backlog of reading I've stored on Instapaper. (I'm thinking "backlog" might be the right word: I don't think you can have a "stack" on an iPad.) A few weeks back Cathy Davidson down at Duke University had an interesting piece on whether college is for everyone. Davidson's basic thesis, as the title suggests, is no. Despite the nationalist rhetoric that attends our discussions of higher education–we will be a stronger America if every Tom, Dick, and Henrietta has a four-year degree–maybe, Davidson suggests, maybe we'd have a better society if we attended to and nurtured the multiple intelligences and creativities that abound in our society, and recognized that many of those are best nurtured somewhere other than a college or university:

The world of work — the world we live in — is so much more complex than the quite narrow scope of learning measured and tested by college entrance exams and in college courses. There are so many viable and important and skilled professions that cannot be outsourced to either an exploitative Third World sweatshop or a computer, that require face-to-face presence, and a bucketload of skills – but that do not require a college education: the full range of IT workers, web designers, body workers (such as deep tissue massage), yoga and Pilates instructors, fitness educators, hairdressers, retail workers, food industry professionals, entertainers and entertainment industry professionals, construction workers, dancers, artists, musicians, entrepreneurs, landscapers, nannies, elder-care professionals, nurse’s aides, dog trainers, cosmetologists, athletes, sales people, fashion designers, novelists, poets, furniture makers, auto mechanics, and on and on.

All those jobs require specialized knowledge and intelligence, but most people who end up in those jobs have had to fight for the special form their intelligence takes because, throughout their lives, they have never seen their particular ability and skill set represented as a discipline, rewarded with grades, put into a textbook, or tested on an end-of-grade exam. They have had to fight for their identity and dignity, their self-worth and the importance of their particular genius in the world, against a highly structured system that makes knowledge into a hierarchy with creativity, imagination, and the array of so-called "manual skills" not just at the bottom but absent.

Moreover, Davidson argues that not only does our current educational system fail to recognize and value these kinds of skills on the front end, but when we actually get students into college we narrow their interests yet further:

All of the multiple ways that we learn in the world, all the multiple forms of knowing we require in order to succeed in a life of work, is boiled down to an essential hierarchical subject matter tested in a way to get one past the entrance requirements and into a college. Actually, I agree with Ken Robinson that, if we are going to be really candid, we have to admit that it’s actually more narrow even than that: we’re really, implicitly training students to be college professors. That is our tacit criterion for “brilliance.” For, once you obtain the grail of admission to higher ed, you are then disciplined (put into majors and minors) and graded as if the only end of your college work were to go on to graduate school where the end is to prepare you for a profession, with university teaching of the field at the pinnacle of that profession.

Which brings me to my title. We're all pre-professional now. Since the advent of the university, if not before, there has been a partisan debate between growing pre-professional programs and what are defined as the "traditional liberal arts," though in current practice, given the cachet of science programs in the world of work, this argument is sometimes really between the humanities and the rest of the world.

Nevertheless, I think Davidson points out that in the actual practice of the humanities in many departments around the country, this distinction is specious. Many humanities programs conceive of themselves as preparing students for grad school. In the humanities. In other words, we imagine ourselves as ideally preparing students who are future professionals in our profession. These are the students who receive our attention, the students we hold up as models, the students we teach to, and the students for whom we construct our curricula, offer our honors, and save our best imaginations. What is this, if not a description of a pre-professional program? So captive are we to this conceptual structure that it becomes hard to imagine what it would mean to form an English major, history major, or philosophy major whose primary implicit or explicit goal was not to reproduce itself, but to produce individuals who will work in the world of business–which most of them will do–or in non-profit organizations, or in churches and synagogues, or somewhere else that we cannot even begin to imagine. We get around this with a lot of talk about transferable skills, but we actually don't do a great deal to help our students understand what those skills are or what they might transfer to. So I think Davidson is right to point this out and to suggest that there's something wrongheaded going on.

That having been said, a couple of points of critique:

Davidson rightly notes these multiple intelligences and creativities, and she rightly notes that we have a drastically limited conception of society if we imagine a four-year degree is the only way to develop these intelligences and creativities in an effective fashion. But Davidson remains silent on the other roles of higher education, the forming of an informed citizenry being only one. Some other things I've seen from Davidson, including her new book Now You See It, suggest she's extremely excited about all the informal ways that students are educating themselves, and that she doubts the traditional roles of higher education, believing its traditional role as a producer and disseminator of knowledge has been drastically undermined. I have my doubts. It is unclear that a couple of decades of the internet have actually produced a more informed citizenry. Oh, yes, informed in all kinds of ways about all kinds of stuff, like the four thousand sexual positions in the Kama Sutra, but informed in a way that allows for effective participation in the body politic? I'm not so sure.

I think this is so because to be informed is not simply to possess information, but to be shaped, to be in-formed. In higher education this means receiving a context for how to receive and understand information, tools for analyzing, evaluating, and using information, and the means for creating new knowledge for oneself. To be sure, the institutions of higher education are not the only place this happens, but it is clear that it doesn't just happen willy-nilly because people have a Fios connection.

What higher education can and should give, then, is a lot of the values and abilities associated with a liberal arts education traditionally conceived–as opposed to one conceived as a route to a professorship–and these are values, indeed, that everyone should possess. Whether that requires everyone to have a four-year degree is an open question. It may be that we need to rethink our secondary educational programs so that they inculcate liberal arts learning in a much more rigorous and effective way than they do now. But I still doubt that the kind of learning I'm talking about can be achieved simply by 17-year-olds in transformed high schools. Higher education should be a place for the maturing and transformation of young minds toward a larger understanding of the world and their responsibilities to it, which it sometimes is today, but should be more often.

Humanities and the workplace: or, bodysurfing the tsunami

As I suggested in my last post on the demise of Borders, book lovers have lived in an eternal tension between the transcendent ideals their reading often fosters and the commercial realities upon which widespread literacy has depended. The same tension is broadly evident in the humanities' response to professional programs, or more broadly to the question of career preparation. We are not wrong to say that an education in history or English is much more than career preparation; nor are we wrong to insist that a college education has to be about much more than pre-professional training. (Not least because most college graduates end up doing things a long distance from their original preparation, and we ought to see that the humanities in combination with other knowledges in the arts and sciences are at least as good at preparing students for the twists and turns of their eventual careers as, and perhaps even better than, fields focused on narrower practical preparations.)

However, we are absolutely wrong to assume that questions of career are extraneous or ought to be secondary to our students or to our thinking about how we approach curricula.

Daniel Everett, dean of arts and sciences at Bentley University, offers a provocative reflection on the need to integrate the humanities into professional education. According to Everett:

“Programs that take in students without proper concern for their future or provision for post-graduate opportunities — how they can use what they have learned in meaningful work — need to think about the ethics of their situation. Students no longer come mainly from the leisured classes that were prominent at the founding of higher education. Today they need to find gainful employment in which to apply all the substantive things they learn in college. Majors that give no thought to that small detail seem to assume that since the humanities are good for you, the financial commitment and apprenticeship between student and teacher is fully justified. But in these cases, the numbers of students benefit the faculty and particular programs arguably more than they benefit the students themselves. This is a Ponzi scheme. Q.E.D.”

These are harsh words, but worth considering. I tend not to like Bentley's particular solutions to the degree that they reduce the humanities to an enriching complement to the important business of, well, business. However, I do think we need to find new ways of creating our majors that will prepare students for the realities of 21st-century employment. Majors that allowed for concentrations in digital humanities would prepare students to engage the changing nature of our disciplines while also gaining technical skills that could serve them well in business. New joint programs with the sciences, like those found in medical humanities programs, could prepare students in new ways for work in the health care industry. Everett warns of what may happen if humanities programs don't creatively remake themselves to meet the changing needs of our contemporary world:

“If, like me, you believe that the humanities do have problems to solve, I hope you agree that they are not going to be solved by lamenting the change in culture and exhorting folks to get back on course. That’s like holding your finger up to stop a tidal wave. Thinking like this could mean that new buildings dedicated to the humanities will wind up as mausoleums for the mighty dead rather than as centers of engagement with modern culture and the building of futures in contemporary society.”

Again, I don't like all of the particular responses Everett has advocated, but I do agree that there is a problem to be addressed, one that continued proclamations about transferable skills are unlikely to solve. What is sometimes called the applied humanities may be a way of riding the wave rather than being drowned by it.

Borders fantasies

Book lovers have always turned a blind eye to the god Mammon, remembering only with regret the fact that the house they live in with their truest love was bought and paid for by the leering uncle down the street. Jonathan Gourlay took up that ill-begotten relationship in a nice personal essay on the demise of Borders. For Gourlay, Borders was an opportunity missed, its demise figured long ago in its decision to skip a flirtation with the labor movement and bend the knee to a corporatist ethos.

“For Borders, which first opened in 1971, the end began when it was sold to K-Mart in 1992. By the time I got there, three years later, only a few of the stalwart Borders believers remained to try to change the store from within. Within a few months of my arrival, Neil gave up and retired to play in his band, The Human Rays. I don’t know if the band was real or Neil just thought it was amusing to retire and join The Human Rays. His friendly management style didn’t jibe with the new owners.

“Neil’s replacement was a guy named Doug. Doug had the personality of a pair of brown corduroy pants. We all hated Doug. We hated him because he was not Neil. Underneath that hatred was a hatred of what Doug represented: corporate masters and the loss of our own identity. With Neil we labored under the impression that we were cool. Under Doug we just labored.”

This romantic vision may have a grain of truth. But it's worth remembering that scrolls and codices required immense amounts of money–and so wealthy patrons who had to be appeased–to produce. And at some level, when you come down to it, a book is a commodity as surely as a Coke can. We like to imagine ourselves as countercultural in our love of texts, but that romantic ethos is purchased at a price like everything else.

The promise of the Internet is in part the idea of thinking and writing and reading without the necessary intervention of the dollar bill. But is that too just a romantic fancy?

Conrad’s Typhoon: or, An Ode to My iPad

Joseph Conrad

Typhoon by Joseph Conrad

My rating: 4 of 5 stars


I think one reason I don't write and publish more than I do is that I am far too slow on the trigger. The ubiquity of blogging hasn't helped this any, since I usually find that someone else much more intelligent and articulate than I has blogged on what I think of as MY SUBJECT in a manner far more perspicacious, acute, and interesting than I could manage. Take Charles Simic's meditation on boredom during the recent power outages along the east coast, blogged over at the NYRB. I had several of those Yes-that-is-exactly-what-was-on-the-tip-of-my-tongue moments reading lines like these:

“We sit with our heads bowed as if trying to summon spirits, while in truth struggling to see what’s on our dinner plates. Being temporarily unable to use the technology we’ve grown dependent on to inform ourselves about the rest of the world, communicate with others, and pass the time, is a reminder of our alarming dependence on them.”

Of course, these words weren't actually on the tip of my tongue, but by imagining that the poet is only telling us what we have always known but could not say so well, we are able to give ourselves credit for a lot of intelligence and imagination that we don't actually possess. Simic goes on to talk about the notable demise of reading and other delights like radio in the face of our ubiquitous gadgets. Now, of course, reading books on a rainy afternoon or listening to a radio show has the faint reek of quaintness, and we can't manage to champion these distractions with a straight face as relics of authenticity. Simic reminds us that reading too was a form of distraction, as surely as an iPhone.

“All of this reminded me of the days of my youth when my family, like so many others, lived in a monastic solitude when the weather was bad, since we had no television. It wasn’t in church, but on dark autumn days and winter nights that I had an inkling of what they meant when they spoke about eternity. Everyone read in order to escape boredom. I had friends so addicted to books, their parents were convinced they were going crazy with so many strange stories and ideas running like fever through their brains, not to mention becoming hard of hearing, after failing to perform the simplest household chores like letting the cat out.

“Living in a quiet neighborhood made it even worse. Old people stared out of windows at all hours, when they were not staring at the walls. There were radios, but their delights—with the exception of a few programs—were reserved for the grownups only. Thousands died of ennui in such homes. Others joined the navy, got married, or moved to California. Even so, looking back now, I realize how much I owe to my boredom. Drowning in it, I came face to face with myself as if in a mirror.”

Be that as it may, I lived out this boredom during the last hurricane by taking up Conrad's Typhoon, the Project Gutenberg version, on a recommendation I received via my Facebook friending of the New Yorker magazine. (Let's be frank, folks. Oprah's book club is absolutely yesterday.) Too dark to read, yes, but unlike the youthful Simic I had one gadget in hand that bore its own light, my trusty iPad, fully charged and functioning.

When I began blogging three years ago at Read, Write, Now (a title I have come to detest, so future bloggers, choose carefully), I had a suspicious and doubtful mindset about e-books, e-readers, and many things e-in-general. To be sure, I saw the advantages of blogging as a means of immediate intellectual self-gratification, and even then I think I felt that a great deal of writing and reading, especially in the academic world, would migrate effectively online. But I could not imagine, then, that an electronic gadget could take the place of paper. I wrote about the fact that I freely took my paper books into saunas and bathtubs, that I could find my way through paper books more quickly and simply than with a scrolling sidebar, that I didn't have to worry about whether it was sunny outside. And the smell, the smell, the smell. E-books were sterile, it seemed to me. In a word, inauthentic.

I may still believe some of this, but I believe it less than I used to, largely due to my iPad.

The steamer Nan Shan in the Storm

To come back to the ostensible purpose of this review, Conrad's Typhoon: it was the first full book I had read on my iPad, if a novella of a hundred-some-odd pages can be thought of as a full book. And the verdict is that it was like reading…well…a book. The interface felt book-like, I could adjust the light to the needs of my aging eyes, and I could read more clearly than I could have managed by candlelight. I've always worried about the ability to personalize texts, but iBooks lets me underline, and if anything I personalized this text more than I might have some others, since my handwriting is unreadable and my notes in paper books cryptic and unintelligible. By contrast, the marginalia tool in iBooks is clean and my notes were copious. Perhaps above all, I loved my iPad for remaining charged and working when everything else failed, leaving me in the dark and to my own devices. Scary what I might find in that mirror. I read the entire book undistracted by Facebook or my email apps, but I took comfort in knowing they were available for my distraction should I need them.

Now as to Typhoon itself. I want to say “Yes,” with qualifications. The story is gripping and intense, a naturalist drama of man against nature that becomes a kind of paean to stoic and pedestrian endurance, though one that is ironic and complicated in the end. The main human character is Captain MacWhirr, whose name betokens a machine-like efficiency. He is a man of small intellect, little imagination, and no intellectual curiosity. Because of this it is hard to describe him as actually courageous in the teeth of the hurricane. While a more imaginative man might have hidden his response to the terrors of the outrageous sea in cryptic understatement, MacWhirr is mostly just given to small emotion and small imagination.

Captain MacWhirr was trying to do up the top button of his oilskin coat with unwonted haste. The hurricane, with its power to madden the seas, to sink ships, to uproot trees, to overturn strong walls and dash the very birds of the air to the ground, had found this taciturn man in its path, and, doing its utmost, had managed to wring out a few words. Before the renewed wrath of winds swooped on his ship, Captain MacWhirr was moved to declare, in a tone of vexation, as it were: “I wouldn’t like to lose her.”

One doesn't come away from this novel feeling grand and heroic and triumphant about human beings. On the other hand, one doesn't come away feeling that human beings are small and accidental, as one does, for instance, in reading Stephen Crane's "The Open Boat." Instead, endurance seems something to be achieved, and we end up happy for MacWhirr that he has achieved it, knowing we'd rather have him dull and unimaginative, but steady, were we caught in the writhing seas ourselves.

The story as a whole is gripping and seems to reveal something about both our human frailty and our strength and complexity, making it more than just a good adventure story. If I had read it first, I’m sure I would say that The Perfect Storm reminded me of it in being only partly a book about humans against the storm, and as much or more about humans against themselves.

One thing keeps me from a wholehearted endorsement. It really is the case that the depictions of the Chinese in the book are deeply troubling. Passages in which the Chinese are cast as jabbering animals or as writhing forces of nature are offensive and hard to find a way to redeem. I have always thought the criticism of Heart of Darkness was perhaps unearned, since the thesis of that book had always seemed to me to be the evils of imperialism. But there is no redeeming theme that I can find for the representation of the Chinese coolies as brutes, and I found myself less inclined to defend Conrad, either here or for Heart of Darkness, than I was before I began. To say this is not to say that the book is not worth reading, since there is no good human thing that is free of the scent of corruption, but it is to say that the goodness in the book does not overcome that corruption, and it reminds this reader at least that human beings are mixed creations, leaving us to admire and cringe in the same moment.
