My rating: 5 of 5 stars
My friend and colleague, Devin Manzullo-Thomas, warned me that he broke down weeping on a Chicago subway as he finished the book, to the consternation of his fellow passengers. I found it easier to weep discreetly in a Barnes and Noble cafe. As one might guess of a book that opens with the discovery by a young and hopeful and brilliant doctor at the beginning of his career that he has what by all accounts is an incurable cancer, death awaits. And awaits relentlessly. But the knowledge that the book ends in our common human destiny does little to steel the reader for the way the heart breaks against the stony shores of that certainty. We are not given the witness of his death; no autobiography could do that. But we are given the witness of his dying and are included in that process, one that is by turns noble and wretched and then, in some sense, ennobling of the author, of those around him, and perhaps even of us as readers.
While focused on death, the book is not morbid. If it ends in tears, it is not even really depressing in any meaningful sense of that word. What we are given is the striving after understanding, and the effort to make meaning out of experience with the only tools we have been given to do it, through our language. The book is fascinating for its testimony to the difficulties of the life of the mind and the body, to the agonies of both surgery and uncertainty. Kalanithi began as a devotee of language and discusses his turn away from the study of literature and what might loosely be described as “the mind” and towards the study of the brain and the body in the attempt to understand more completely the physical mechanism by which, after all, we come to speak and conceive of ourselves as having something called a soul or a mind. Kalanithi does not go deeply enough into this transition away from literature and language for my personal taste, though he offers the throwaway line that he thought the study of literature had become too embroiled in the study of politics and was not leading him into the study of meaning. However, he also seems to think that the study of literature is too divorced from the world of action, of doing something in the world that makes a difference, a difference he thought he could make in the healing professions. Kalanithi’s book is not about a theory of literature, but if it were I would want to argue with him more forcefully here. There is, after all, nothing more thoroughly politicized in our day and age than the practice of medicine, though his book does not seem to take up the question of who gets to be there to have a chance at healing at a first-rate medical center and why that might be. I might want to argue as well, as I have at other times, that the embrace of literature in particular, and of the humanities more broadly–indeed, education per se–is actually one part of the world’s healing, though we fail too often to recognize it as such.
As Milton suggested, the purpose of education is to repair the ruin of our first parents (and perhaps the ruin of our own parents, and of ourselves, I might add).
But these are not the main points of Kalanithi’s book. He manages to make the beauty and power of the study of science, and especially the study of medicine, real and persuasive. More broadly he shows convincingly that a life of intense study is a life worth living, that the life of the mind matters, even if that point is made more poignant by the fact that that life is cut off by a disease of the body that cannot be studied away and that as yet remains beyond the realm of human comprehension and control. It is interesting to me that, in the end, Kalanithi returns to literature to make meaning of the death that he cannot, that none of us can, avoid. He makes mention of his return to stories about living and dying, stories that tried to make sense of the fact of death. Some of his final memories are of sitting with his infant daughter on his lap, reading to her the words of Eliot’s “The Waste Land”, a fact that might be comical were it not for the fact that it rings true for a father given over to the power of language to give meaning where meaning seems tenuous at best, were it not for the fact that I and other fathers I know in the English major set do or have done precisely the same thing.
Of course, he made his own story in the end. A story, to be sure, that could not have been written without his unimaginably deep engagement with science and medicine. But finally his story lives for us, tells us something about what it means to be human, because it lives for us in words.
Having been away from this blog for a while, I am struck by my vaguely perceived need to offer explanations, as if I needed excuses for beginning again or for stopping in the first place. It was also evident to me that I found this impulse, unresisted, littering blogs across the web. The blog of my friend and former student, Carmen McCain, is rife with repeated apologies for her inconsistent blogging. Another friend and former student, Liz Laribee, also seems to apologize for not blogging about as often as she actually blogs. I know that in the past when I’ve gone on unexplained hiatus, I’ve begun again with an apology. Just for grins I did a quick Google search for “apologize for not blogging more” and got 16,600 hits for that exact combination. Apparently we are legion and we are a sorry lot.
There seem to be several versions of this particular literary genre. In one variety the blogger abjectly denounces herself for moral turpitude, admitting to various venial weaknesses like preferring Facebook or watching television rather than keeping to the rough moral discipline of the keyboard. Others plead busyness, or grovel repentantly in admitting they were only sick and surely could have opened up the laptop. Some seek remission of their sins by requesting the reader’s empathy, including long and engrossing lists of ills and misfortunes that make the travails of the biblical Job look like a trip to the gym with a particularly rigorous drill instructor. My particular favorite is the blogger who apologizes but lets the reader know that he was really up to much more important or much more interesting things and that we are lucky he is back at all. In some instances it seems that people spend a good deal of their blogging time ruminating about how they should be blogging more, much as I talk about how little time I have for exercise while I am sitting on my couch in the evening.
Many such blog posts recognize that they are enacting an internet cliche by apologizing, but do so anyway. I’m intrigued. What does this apology signify about blogging as a form of writing, about the kind of audience the author imagines, about the relationship with that audience? Novelists do not apologize for the years or decades between novels, nor for that matter do essayists, short story writers, or poets. It seems more important to have something worth saying than to say something with great regularity. While such writers may flog themselves for not writing, they do so privately or to their editors and fellow writers–readers be damned. Newspaper columnists will announce their absence for a sabbatical, without apology I might add, but mostly they go on vacation without comment other than the dry, italicized editorial note that “[Insert opinionated name here] will return in September after his vacation to the Bahamas where he is working on a book and enjoying his family.”
Only bloggers bother to apologize for not writing.
As if their readers really cared.
I suspect this has something to do with the illusion of intimacy that is made possible by interactivity. I have come to “know” a number of people through my blog or through Twitter and Facebook, and since this electronic transmission is the sum total of our human experience together it is a little bit akin to having kept up a loose friendship by phone and then having not phoned for a good long time. On the other hand, I suspect too that it has something to do with the fact that bloggers suffer from the anxiety of silence. The writer who publishes in the New York Times knows that her work will have readers. The writer for the Podunk Times knows she had at least one reader who thought her work was worthwhile, since an editor decided to publish it.
The blogger, on the other hand, flings words into space like dust.
The apology has the appearance of a statement intended to right a wrong I, the blogger, have done to you, the reader, by not blessing you with my words and wisdom these last two-some-odd months and days.
In fact, the apology is a blogger’s plea. Hear me now. Confirm my existence as a writer of some sort or another by clicking on my blog anew. While I may truly have ignored you if I know you or, more likely, while I may truly have no idea on earth who you are, I need you nonetheless. To drive up my blog stats. To share me on Facebook. To “like” my post and so like me. To “follow” my blog to the ends of the earth even when there is nothing there to follow. Though I am bloggus absconditus, wait for me like the ancients waited for the gods. Make me matter.
And so I am back. For today, with no promise for tomorrow. Without apology.
It’s a pleasure to pick up a novel and know from the first lines that it will be worth the read. We usually have to give more of ourselves over to novels than to other forms of writing–a problem in our frantic clickable age. With stuff that I remand to my Instapaper account, I can glance through the first paragraph and decide if I want to keep reading, and if I read three or four paragraphs and am not convinced it’s worth the time, there’s no point in agonizing about whether to keep on going. Same with a book of poetry: If I read the first three or four poems and there’s nothing there other than the author’s reputation, I mostly put it aside–though I might admit that it is my failing and not the poet’s.
But novels are a different horse. A lot of novels require 50 pages and sometimes more before we can get fully immersed in the writer’s imaginative world, feeling our way into the nuances and recesses of possibility, caring enough about the characters or events or language or ideas (and preferably all four) to let the writer land us like netted fish. I think I’ve written before about the experience of reading yet one more chapter, still hoping and believing in the premise or the scenario or the author’s reputation. I’ve finished books that disappointed me, though my persistence was more like a teenaged boy fixed up on a date with the best girl in school, pretending until the evening clicks to a close that he isn’t really bored to tears, things just haven’t gotten started yet.
But Madison Smartt Bell’s Doctor Sleep didn’t make me wait. I bought the book on reputation and topic. I loved Bell’s All Souls Rising, but got derailed by the follow-ups in his Haitian trilogy, never quite losing myself in the Caribbean madness that made the first book the deserved winner of an array of awards. Thus disappointed, I hadn’t really picked up Bell’s work since, though I vaguely felt I ought to. From the first sentences of the novel, I was under the spell. The choice of words is purposeful since the book is about a hypnotherapist who, while helping others solve all manner of problems and difficulties through his particular gift at putting them under, can neither solve his own problems nor put himself to sleep: He suffers from a crippling case of insomnia.
Like any good novel, the meanings are thickly layered. In some respects I found myself thinking of the Apostle Paul’s dictum that wretched man that he was, he knew what he should do, and he wanted to do it, but he could not do the very thing he knew to do, and, indeed, the very thing he did not want to do was the very thing he did. The tale of all things human, the disjunction between knowledge and will, between thought and desire and act. The main character’s skills as a hypnotist are deeply related to his metaphysical wanderings amidst the mystics and heretics of the past, most particularly Giordano Bruno, burned at the stake because he claimed the church sought to promote good through force rather than through love. Ironically, the main character knows all this, knows in his head that love is the great mystic unity of which the mystics speak, and yet turns away from love into abstraction, failing to love women because he cannot see them as human beings to whom he might be joined, seeing them instead as mystic abstractions through which he wants to escape the world.
In the end, accepting love means accepting death, which means accepting sleep–something that seems so natural to so many, but if you have suffered from insomnia as I do, you realize that surrendering to sleep is a strange act of grace, one that cannot be willed, but can only be received.
I think in some ways, too, there’s a lot of reflection in this book on the power of words and stories, their ability to put us under. So, perhaps inevitably, it is a book about writing and reading on some very deep level. Adrian, Doctor Sleep, takes people on a journey into their unconscious through words, and his patients surrender to him willingly. Indeed, Adrian believes, with most hypnotists, that only those who want to be hypnotized actually can be. This is not so far from Coleridge’s notion of the willing suspension of disbelief. I do not believe Adrian’s metaphysical mumbo-jumbo, but for the space of the novel I believe it utterly. We readers want to be caught. We want to lose ourselves at least for that space and that time, so that reading becomes a little like the gift of sleep, a waking dream.
Under the spell of writing we allow Bell to take us into another world that is, surprisingly, like our own, one in which we see our own abstractedness, our own anxieties, our own petty crimes and misdemeanors, our own failures to love.
One of the surprise pleasures afforded by Twitter has been following The Paris Review (@parisreview) and getting tweets linking me to their archives of author interviews. I know I could just go to the website, but it feels like a daily act of grace to run across the latest in my twitter feed, as if these writers are finding me in the ether rather than me searching for them dutifully.
(In my heart of hearts I am probably still a Calvinist; the serendipity of these lesser gods finding me is so much better than the tedious duty of seeking them out).
This evening over dinner I read the latest, a 1978 interview with Joyce Carol Oates, a real gem.
Do you find emotional stability is necessary in order to write? Or can you get to work whatever your state of mind? Is your mood reflected in what you write? How do you describe that perfect state in which you can write from early morning into the afternoon?
One must be pitiless about this matter of “mood.” In a sense, the writing will create the mood. If art is, as I believe it to be, a genuinely transcendental function—a means by which we rise out of limited, parochial states of mind—then it should not matter very much what states of mind or emotion we are in. Generally I’ve found this to be true: I have forced myself to begin writing when I’ve been utterly exhausted, when I’ve felt my soul as thin as a playing card, when nothing has seemed worth enduring for another five minutes . . . and somehow the activity of writing changes everything. Or appears to do so. Joyce said of the underlying structure of Ulysses—the Odyssean parallel and parody—that he really didn’t care whether it was plausible so long as it served as a bridge to get his “soldiers” across. Once they were across, what does it matter if the bridge collapses? One might say the same thing about the use of one’s self as a means for the writing to get written. Once the soldiers are across the stream . . .
Oates doesn’t blog, I think, and I wouldn’t dare to hold my daily textural gurgitations up next to Oates’s stupendous artistic outpouring. On the other hand, I resonated with this, thinking about what writing does for me at the end of the day. I’ve had colleagues ask me how I have the time to write every day, my sometimes longish diatribes about this or that subject that has caught my attention. Secretly my answer is “How could I not?”
OK, I know that for a long time this blog lay fallow, but I have repented of that and returned to my better self. Mostly (tonight is an exception), I do my blog late, after 10:00–late for someone over 50–like a devotion. I just pick up something I’ve read that day, like Joyce Carol Oates, and do what English majors are trained to do: find a connection. Often I’m exhausted and cranky from the day–being an administrator is no piece of cake (but then, neither is being alive, so what do I have to complain about?). Mostly I write as if I were talking to someone about the connections that I saw, the problems that it raised (or, more rarely, solved).
It doesn’t take that long–a half hour to an hour, and mostly I’ve given up television entirely. I tell people I seem to think in paragraphs–sometimes very bad paragraphs, but paragraphs nevertheless–and years of piano lessons have left me a quick typist. Sometimes I write to figure out what I think, sometimes to figure out whether what I think matters, sometimes to resolve a conundrum I have yet to figure out at work or at home, sometimes to make an impression (I am not above vanity).
But always I write because the day and my self disappear. As Oates says above, the activity of writing changes everything, or at least appears to do so. Among the everything that it changes is me. I am most myself when I lose the day and myself in words.
Yesterday on Facebook I cited Paul Fussell saying “If I didn’t have writing, I’d be running down the street hurling grenades in people’s faces.”
Well, though I work at a pacifist school and it is incorrect to say so, that seems about right.
A brief follow-up on my post from earlier today responding to Tim Parks’s notion over at the New York Review of Books that literature is actually characterized by fear and withdrawal from life rather than engagement with it. Later in the day I read Salman Rushdie’s post at the New Yorker on censorship, a redaction of his Arthur Miller Freedom to Write Lecture delivered a few days ago. Rushdie brings out the idea that, indeed, writers can be afraid, but it is a fear born from the fact of their writing rather than their writing being a compensation for it. Censorship is a direct attack on the notion of the right to think and write, and Rushdie argues that this can be paralyzing to the writer’s act.
The creative act requires not only freedom but also this assumption of freedom. If the creative artist worries if he will still be free tomorrow, then he will not be free today. If he is afraid of the consequences of his choice of subject or of his manner of treatment of it, then his choices will not be determined by his talent, but by fear. If we are not confident of our freedom, then we are not free.
Rushdie goes on to chronicle the martyrs of writing, who have had a great deal to be afraid of because of their writing (a point made in responses to Parks’s blog as well).
You will even find people who will give you the argument that censorship is good for artists because it challenges their imagination. This is like arguing that if you cut a man’s arms off you can praise him for learning to write with a pen held between his teeth. Censorship is not good for art, and it is even worse for artists themselves. The work of Ai Weiwei survives; the artist himself has an increasingly difficult life. The poet Ovid was banished to the Black Sea by a displeased Augustus Caesar, and spent the rest of his life in a little hellhole called Tomis, but the poetry of Ovid has outlived the Roman Empire. The poet Mandelstam died in one of Stalin’s labor camps, but the poetry of Mandelstam has outlived the Soviet Union. The poet Lorca was murdered in Spain, by Generalissimo Franco’s goons, but the poetry of Lorca has outlived the fascistic Falange. So perhaps we can argue that art is stronger than the censor, and perhaps it often is. Artists, however, are vulnerable.
This is powerful stuff, though I’ll admit I started feeling like there was an uncomfortable contradiction in Rushdie’s presentation. Although Rushdie’s ostensible thesis is that “censorship is not good for art,” he goes on after this turn to celebrate the dangerousness of writing. According to Rushdie, all great art challenges the status quo and unsettles convention:
Great art, or, let’s just say, more modestly, original art is never created in the safe middle ground, but always at the edge. Originality is dangerous. It challenges, questions, overturns assumptions, unsettles moral codes, disrespects sacred cows or other such entities. It can be shocking, or ugly, or, to use the catch-all term so beloved of the tabloid press, controversial. And if we believe in liberty, if we want the air we breathe to remain plentiful and breathable, this is the art whose right to exist we must not only defend, but celebrate. Art is not entertainment. At its very best, it’s a revolution.
It remains unclear to me how Rushdie can have it both ways. If art is going to be revolutionary, it cannot possibly be safe, and it cannot but expect efforts to censor it. If there is no resistance to art, then there is no need for revolution, everything will be the safe middle ground, and there will be no possibility of great art.
I am not sure which way Rushdie wants it, and I wonder what my readers think. Does great art exist apart from resistance and opposition? If it does not, does it make sense to long for a world in which such opposition does not exist? Does Rushdie want to be edgy and push boundaries, but to do so safely? Is this a contradictory and impossible desire?
You can also listen to Rushdie’s lecture below:
In a new blog at NYRB, Tim Parks questions the notion that literature is about the stuff of life and instead might be a kind of withdrawal from the complexity and fearfulness of life itself:
So much, then, for a fairly common theme in literature. It’s understandable that those sitting comfortably at a dull desk to imagine life at its most intense might be conflicted over questions of courage and fear. It’s also more than likely that this divided state of mind is shared by a certain kind of reader, who, while taking a little time out from life’s turmoil, nevertheless likes to feel that he or she is reading courageous books.
The result is a rhetoric that tends to flatter literature, with everybody over eager to insist on its liveliness and import. “The novel is the one bright book of life,” D H Lawrence tells us. “Books are not life,” he immediately goes on to regret. “They are only tremulations on the ether. But the novel as a tremulation can make the whole man alive tremble.” Lawrence, it’s worth remembering, grew up in the shadow of violent parental struggles and would always pride himself on his readiness for a fight, regretting in one letter that he was too ill “to slap Frieda [his wife] in the eye, in the proper marital fashion,” but “reduced to vituperation.” Frieda, it has to be said, gave as good as she got. In any event words just weren’t as satisfying as blows, though Lawrence did everything he could to make his writing feel like a fight: “whoever reads me will be in the thick of the scrimmage,” he insisted.
In How Fiction Works James Wood tells us that the purpose of fiction is “to put life on the page” and insists that “readers go to fiction for life.” Again there appears to be an anxiety that the business of literature might be more to do with withdrawal; in any event one can’t help thinking that someone in search of life would more likely be flirting, traveling or partying. How often on a Saturday evening would the call to life lift my head from my books and have me hurrying out into the street.
I was reminded in reading this of a graduate seminar with Franco Moretti wherein he said, almost as an aside, that we have an illusion that literature is complex and difficult, but that in fact, literature simplifies the complexity and randomness of life as it is. In some sense literature is a coping mechanism. I don’t remember a great deal more than that about the seminar–other than the fact that Moretti wasn’t too impressed with my paper on T.S. Eliot–but I do remember that aside. It struck me as at once utterly convincing and yet disturbing, unsettling the notion that we in literature were dealing with the deepest and most complicated things in life.
On the other hand, I’m reminded of the old saw, literature may not be life, but, then, what is? Parks seems to strike a little bit of a graduate studenty tone here in presenting the obvious as an earthshaking discovery, without really advancing our understanding of what literature might actually be and do. Parks seems to take delight in skewering without revealing or advancing understanding. There’s a tendency to set up straw men to light afire, and then strike the smug and knowing revelatory critical pose, when what one has revealed is more an invention of one’s own rhetoric than something that might be worth thinking about.
This desire to convince oneself that writing is at least as alive as life itself, was recently reflected by a New York Times report on brain-scan research that claims that as we read about action in novels the relative areas of the brain—those that respond to sound, smell, texture, movement, etc.—are activated by the words. “The brain, it seems,” enthuses the journalist, “does not make much of a distinction between reading about an experience and encountering it in real life; in each case, the same neurological regions are stimulated.”
What nonsense! As if reading about sex or violence in any way prepared us for the experience of its intensity. (In this regard I recall my adolescent daughter’s recent terror on seeing our border collie go into violent death throes after having eaten some poison in the countryside. As the dog foamed at the mouth and twitched, Lucy was shivering, weeping, appalled. But day after day she reads gothic tales and watches horror movies with a half smile on her lips.)
I’m tempted to say “What nonsense!” Parks’s willingness to use his daughter to dismiss a scientific finding strikes me as a bit like the homeschool student I once had who cited her father as an authority who disproved evolution. Well. The reference to the twitching dog invokes emotion that in fact runs away–in a failure of critical nerve perhaps?–from the difficult question of how exactly the brain processes and models fictional information, how that information relates to similar real-world situations in which people find themselves, and how people might use and interrelate both fictional and “real world” information.
Parks seems to have no consciousness whatsoever of the role of storytelling in modeling possibility, one of its most complex ethical and psychological effects. It’s a very long-standing and accepted understanding that one reason we tell any stories at all is to provide models for living. Because a model is a model, we need not assume it lacks courage or is somehow a cheat on the real stuff of life. Horror stories and fairy tales help children learn to deal with fear, impart warning and knowledge and cultural prohibitions to children, and attempt to teach them in advance how to respond to threat, to fear, to violence, etcetera. That those lessons are always inadequate to the moment itself hardly speaks against the need to have such mental models and maps. It would be better to ask what we would do without them. The writer who provides such models need not be skewered for that, since to write well and convincingly, to provide a model that serves that kind of ethical or psychic purpose, the writer him- or herself must get close to those feelings of terror and disintegration. It’s why there’s always been a tradition of writers like Hemingway or Sebastian Junger who go to war in order to get into that place within themselves where the emotions of the real can be touched. It’s also why there’s always been a tradition of writers self-medicating with alcohol.
Thus, I kind of found Parks’s implied assumption that writers are cowering just a bit from the real stuff of life to be a cheap shot–the kind of thing that, in the cultural stories we tell each other, is usually associated with cowardice and weakness, whether in a writer or a fighter. The novelists and poets Parks takes on deserve better.
Yesterday in my comments on Carmen McCain’s post, I quoted Susan Sontag in all seriousness. I might have thought better of doing so if I had bothered first to take in this image:
This from a collection of photos at Flavorwire of writers in various stages of un-work. Mostly these folks do not look inebriated, but with Hunter S. Thompson, Papa Hemingway, and Kurt Vonnegut in the mix, I would remain none too sure. It is comforting to know that writers are people too, just like you and me. Though I will say that unlike Hunter S. Thompson, I have never driven down the Vegas strip with a naked blow-up doll sitting in my lap. No doubt it is this kind of self-repression that is keeping me from being the writer I was meant to be.
Side Note: A personal favorite is of screenwriter Dalton Trumbo working in the bathtub. Which leads to a writerly twist on the drunken parlor game question: Most unusual place you’ve ever done it? Your writing, I mean?
My colleague in the library here at Messiah College, Jonathan Lauer, has a very nice essay in the most recent Digital Campus edition of the Chronicle of Higher Education. Jonathan makes an eloquent defense of the traditional book over and against the googlization and ebookification of everything. He especially employs an extended metaphor drawn from the transition to aluminum bats in various levels of baseball to discuss his unease and reservations about the shifts to electronic books and away from print that are profoundly and rapidly changing the nature of libraries as we’ve known them. The essay is more evocative than argumentative, so there are a lot of different things going on, but a couple of Jonathan’s main points are that the enhancements we supposedly achieve with digitization projects come at a cost to our understanding of texts and at a cost to ourselves.
In the big leagues, wooden bats still matter. Keeping print materials on campus and accessible remains important for other reasons as well. Witness Andrew M. Stauffer’s recent Chronicle article, “The Troubled Future of the 19th-Century Book.” Stauffer, the director of the Networked Infrastructure for Nineteenth-Century Electronic Scholarship, cites several examples of what we all know intuitively. “The books on the shelves carry plenty of information lost in the process of digitization, no matter how lovingly a particular copy is rendered on the screen,” he writes. “There are vitally significant variations in the stacks: editions, printings, issues, bindings, illustrations, paper type, size, marginalia, advertisements, and other customizations in apparently identical copies.” Without these details, discernible only in physical copies, we are unable to understand a book’s total impact. Are we so easily seduced by the aluminum bat that we toss all wooden ones from the bat bag?
Let’s also acknowledge that our gadgets eventually program us. History teaches us that technologies often numb the very human capacities they amplify; in its most advanced forms, this is tantamount to auto-amputation. As weavers lost manual dexterity with their use of increasingly mechanized looms during the Industrial Revolution, so we can only imagine what effect GPS will have on the innate and learned ability of New York City cabbies to find their way around the five boroughs. Yet we practice auto-amputation at our own peril. We dare not abandon wooden bats for aluminum for those endeavors that demand prolonged attention, reflection, and the analysis and synthesis that sometimes lead to wisdom, the best result of those decidedly human endeavors that no gadget can exercise.
I have a lot of sympathy for Jonathan’s position; things like the revamping of the New York Public Library leave me with a queasy hole in my stomach. I’ve had a running conversation with Beth Transue, another of our librarians, about our desire to start leading alumni tours of the world’s great libraries, but if we’re going to do so, we had better get it done fast, because most of those libraries won’t be around in a few more years, at least if the NYPL and its budgetary woes are anything to judge by.
At the same time, I think Jonathan overstates his case here. I don’t think serious thinkers are assuming we’ll get rid of books entirely. Although I think we are already living in what I’ve called an E-plus world, print will continue to be with us, serving many different purposes. Jason Epstein over at the NYRB has blogged on this fact, and prognosticating the likely future and uses of the traditional book seems to be a growth industry at the moment. I don’t think the average student is terribly interested in the material textuality that Jonathan references above, nor for that matter is the average scholar, the vast majority of whom remain interested in what people wrote, not how publishers chose to package it. But those issues will continue to be extremely important for cultural and social historians, and there will be some forms of work that can only be done with physical books. Just as it is a tremendous boon to have Joyce’s manuscripts digitized, making them available to the general reader and to the scholar who cannot afford a trip to Ireland, those producing authoritative interpretations of Joyce’s method, biography, and life’s work will still have to make the trip to Ireland to see the thing for themselves, to capture what can’t be captured by a high-resolution camera.
That having been said, who would say that students studying Joyce should avoid examining the digitized manuscripts closely because they aren’t “the genuine article”? Indeed, I strongly suspect that even authoritative interpretations of those manuscripts will increasingly involve a commerce between examination of the physical object and close examination of digitized objects, since advanced DH work shows us time and time again that computerized forms of analysis can get at things the naked eye could never see. So the fact that there are badly digitized copies of things in Google Books and beyond shouldn’t obscure the fact that there are some massively important scholarly opportunities here.
Jonathan’s second point is about the deeply human and quasi-spiritual aspects of engagement with traditional books that so many of us have felt over the years. There’s something very true about this. It is also true that our technologies can result in forms of self-amputation. Indeed, if we take this point to heart, we need to admit that the technology of writing and reading is itself something that involves self-amputation. Studies have shown that heavy readers alter their brains, and not always in a good sense. We diminish the capacity of certain forms of memory, literally making ourselves absent-minded professors. Other studies have suggested that persons in oral cultures have this capacity in heightened form, and some people argue that the current generation is far more visually acute than those that preceded it, developing new abilities because of its engagement with visual texts. So, indeed, our technologies alter us, and even result in self-amputation, but that is true of the traditional book as well as the internet. This second point is Jonathan’s larger claim, since it asserts for traditional books as such a superiority in something central to humanity as such. I am intrigued by this argument that the book is superior for serious reflection and for the quasi-spiritual aspects of study that we have come to treat as central to the humanities.
I admit, I don’t buy it.
First, I admit that I’m simply wary about attributing essential human superiorities to historical artifacts and practices. Homer as a collection of oral songs is not inherently inferior to the scrolls in which those songs were originally collected, which then found their apotheosis in the book form. We have come to think of the book as exhibiting and symbolizing superior forms of humanity, but it’s not clear that the book form triumphed in the West because of these attributes. Indeed, traditional Jews and others clearly think the scroll remains the superior spiritual form even to this day. Rather, the codex triumphed for a variety of complicated reasons. Partly, the Christian churches apparently wanted, for ideological reasons, to distinguish their own writings from the writings of the Jews. There may have been some more substantive reasons as well, though that’s not entirely clear: Anthony Grafton points out that many of the Christian innovations with the codex seemed to focus on the desire to compare different kinds of texts side by side (an innovation, I will point out, for which the internet is in many ways easily superior). The codex also triumphed not because it was spiritually and intellectually superior but because it was, frankly, more efficient, cheaper, and easier to disseminate than its scrolly ancestors. One good example comes from the poet Martial, who explicitly ties the selling of his poetry in codex form to making it easily and efficiently accessible to the common person: “Assign your book-boxes to the great, this copy of me one hand can grasp.”
The entire trend of book history has been toward this effort to make texts and what they contain more readily and easily available to more and more people. From the early clay tablets to the mass-market paperback that let you carry Plato in your hip pocket, the thrust of the book has been toward broader and broader dissemination, toward greater and greater ease of use, toward cheaper and cheaper accessibility. The goal of writing, even when that writing was imprisoned in libraries that only the initiated could enter, as in Umberto Eco’s The Name of the Rose, has been open access.
The digitization that is occurring now comes to fulfill the book, not destroy it.
Second, I guess I no longer believe fully in the spiritual or intellectual superiority of codex forms, simply because that belief doesn’t comport with my experience. As I do more and more of my reading with my various e-readers, I find that I have serious, contemplative, analytical, and synthetic engagements with all kinds of texts, from those hundreds of “pages” long to those that are not. As I get used to the tools of various e-readers, there’s almost nothing accomplished in a traditional book that can’t be accomplished in some way on an e-reader. Although I now interact with texts differently in a spatial sense, I am able to take fuller and more copious notes, I am able to mark texts more easily, and if I can’t quite remember where something was in the book I can use a search engine to find not only a specific phrase or topic but every single instance of that topic in the book. Moreover, because every text represents an act of contemplation on and conversation with other texts, I can at the touch of a screen go and read for myself the interlocutors embedded within a book, just as those interested in Jonathan’s essay can follow my link above and decide for themselves whether I am reading him fairly. Thus there are very obviously some serious ways in which e-readers are superior for analytical and interpretive readings of texts, or at least their equal.
All this having been said, I will say that there remains one way that I find the traditional paper book the clear superior to the e-book, and that has to do with my ability to make it mine.
I spoke a couple of days ago about the personal connection I felt to Kierkegaard in rereading him and discovering my many years of underlines, highlights, and marginalia. I even confess that I read Kimi Cunningham Grant’s new memoir on my iPad, but I still bought a hardcover at the reading–not because I thought I would be able to analyze it more effectively in hardcover, but because I wanted her to sign it for me.
This is a personal connection to the book that isn’t unimportant, but that is about my personal biography, and Kimi’s. It’s not about the text, and frankly I doubt it will in the long run even be about literary history. Some literary archivist somewhere is collecting all the shared comments on the Kindle version of Kimi’s book, and that massive marginalia will be fodder for some graduate student’s dissertation in a few decades.
I pity the poor graduate student who decides on such a project. But at least she won’t have to strain her eyes to decipher the handwriting.
Angus Grieve Smith over at the MLA group on LinkedIn pointed me toward John McWhorter’s take on Twitter from a couple of years back. [I admit to some embarrassment in referencing an article that’s TWO YEARS OLD!! But the presentism of writing for the web is a story for another time.] McWhorter’s basic case is that Twitter is not really writing at all, but a form of graphic speech (my term, not McWhorter’s). He points out that most people, even after learning to read and write, speak in bursts of 7 to 10 words, and that writing at its origins reflected these kinds of patterns. In other words, as speakers we are all twitterers. Of Twitter, McWhorter says:
The only other problem we might see in something both democratic and useful is that it will exterminate actual writing. However, there are no signs of this thus far. In 2009, the National Assessment of Education Performance found a third of American eighth graders – the ones texting madly right now — reading at or above basic proficiency, but crucially, this figure has changed little since 1992, except to rise somewhat. Just as humans can function in multiple languages, they can also function in multiple kinds of language. An analogy would be the vibrant foodie culture that has thrived and even flowered despite the spread of fast food.
Who among us really fears that future editions of this newspaper will be written in emoticons? Rather, a space has reopened in public language for the chunky, transparent, WYSIWYG quality of the earliest forms of writing, produced by people who had not yet internalized a sense that the flavor of casual language was best held back from the printed page.
This speech on paper is vibrant, creative and “real” in exactly the way that we celebrate in popular forms of music, art, dance and dress style. Few among us yearn for a world in which the only music is classical, the only dance is ballet and daily clothing requires corsets and waistcoats. As such, we might all embrace a brave new world where we can both write and talk with our fingers.
I mostly agree with this idea that we are all code shifters. I’m less sanguine than McWhorter, however. His eighth-grade sample takes students before they’ve entered a period of schooling where they’d be expected to take on more serious reading and longer and more complex forms of writing. Twitter may not be to blame, but it’s not clear to me that the state of writing and reading comprehension at higher levels is doing all that well; there’s some pretty good evidence that it isn’t. So just as a foodie culture may thrive in the midst of a lot of fast food, it’s not clear that we ought to be complacent in the face of an obesity epidemic. In the same way, just because tweeting may not signal the demise of fine writing, it’s not clear that it’s helping the average writer become more facile and sophisticated in language use.