
Dreaming of Sleep: Madison Smartt Bell’s Doctor Sleep

It’s a pleasure to pick up a novel and know from the first lines that it will be worth the read. We usually have to give more of ourselves over to novels than to other forms of writing–a problem in our frantic clickable age. With stuff that I remand to my Instapaper account, I can glance through the first paragraph and decide if I want to keep reading, and if I read three or four paragraphs and am not convinced it’s worth the time, there’s no point in agonizing about whether to keep going. Same with a book of poetry: If I read the first three or four poems and there’s nothing there other than the author’s reputation, I mostly put it aside–though I might admit that it is my failing and not the poet’s.

But novels are a different horse. A lot of novels require 50 pages and sometimes more before we can get fully immersed in the writer’s imaginative world, feeling our way into the nuances and recesses of possibility, caring enough about the characters or events or language or ideas (and preferably all four) to let the writer land us like netted fish. I think I’ve written before about the experience of reading yet one more chapter, still hoping and believing in the premise or the scenario or the author’s reputation. I’ve finished books that disappointed me, though my persistence was more like that of a teenaged boy fixed up on a date with the best girl in school, pretending until the evening clicks to a close that he isn’t really bored to tears, that things just haven’t gotten started yet.

But Madison Smartt Bell’s Doctor Sleep didn’t make me wait. I bought the book on reputation and topic. I loved Bell’s All Souls Rising, but got derailed by the follow-ups in his Haitian trilogy, never quite losing myself in the Caribbean madness that made the first book the deserved winner of an array of awards. Thus disappointed, I hadn’t really picked up Bell’s work since, though I vaguely felt I ought to. From the first sentences of the novel, his first, I was under the spell. The choice of words is purposeful, since the book is about a hypnotherapist who, while helping others solve all manner of problems and difficulties through his particular gift at putting them under, can neither solve his own problems nor put himself to sleep: He suffers from a crippling case of insomnia.

Like any good novel, the meanings are thickly layered. In some respects I found myself thinking of the Apostle Paul’s dictum that, wretched man that he was, he knew what he should do, and he wanted to do it, but he could not do the very thing he knew to do; indeed, the very thing he did not want to do was the very thing he did. The tale of all things human, the disjunction between knowledge and will, between thought and desire and act. The main character’s skills as a hypnotist are deeply related to his metaphysical wanderings amidst the mystics and heretics of the past, most particularly Giordano Bruno, burned at the stake because he claimed the church sought to promote good through force rather than through love. Ironically, the main character knows all this, knows in his head that love is the great unity of which the mystics speak, and yet turns away from love into abstraction, failing to love women because he cannot see them as human beings to whom he might be joined, seeing them instead as mystic abstractions through which he wants to escape the world.

In the end, accepting love means accepting death, which means accepting sleep–something that seems so natural to so many, but if you have suffered from insomnia as I do, you realize that surrendering to sleep is a strange act of grace, one that cannot be willed, but can only be received.

I think in some ways, too, there’s a lot of reflection in this book on the power of words and stories, their ability to put us under. So, perhaps inevitably, it is a book about writing and reading on some very deep level. Adrian, Doctor Sleep, takes people on a journey into their unconscious through words, and his patients surrender to him willingly. Indeed, Adrian believes, with most hypnotists, that only those who want to be hypnotized actually can be. This is not so far from Coleridge’s notion of the willing suspension of disbelief. I do not believe Adrian’s metaphysical mumbo-jumbo, but for the space of the novel I believe it utterly. We readers want to be caught. We want to lose ourselves at least for that space and that time, so that reading becomes a little like the gift of sleep, a waking dream.

Under the spell of writing we allow Bell to take us into another world that is, surprisingly, like our own, one in which we see our own abstractedness, our own anxieties, our own petty crimes and misdemeanors, our own failures to love.

Dumpster Diving and other career moves: remembering the job market with Roger Whitson

It would be hard to say I enjoyed reading Roger Whitson’s very fine recent meditation in the Chronicle on the quest for a tenure-track job: his ambivalence on finding one, the mixed feelings of exaltation and guilt at getting what so many of his peers would never find, and at leaving behind an #altac existence where he had begun to make a home.

Hard to enjoy reading both because the story seems to typify what our academic life has actually become and, frankly, because it reminded me too much of my own wandering years as a new academic a couple of decades ago.  I spent seven years full-time on the job market back in the day (if you count the last two years of graduate school).  I have estimated in the past that I must have applied for at least 700-800 jobs during those years–the idea of being targeted and selective a joke for a new father.  Fortunately I was actually only totally unemployed for four months during those years, though that was enough to plunge me thousands of dollars into debt paying for health insurance.  For five of those seven years I had full-time work in various visiting assistant positions, and for two of those visiting years I was paid so little I qualified for food stamps, though I never applied for the program.  I worked as many extra courses as I could to pay the bills–probably foolish for my career, since publishing slowed to a crawl, but it saved my pride.  I remember asking, naively, during an interview for one such visiting position whether it was actually possible to live in that area of the country on what I was going to be paid.  The chair interviewing me at the time hesitated, then responded, “Well, of course, your wife can work.”

Only one of those years did I not get an interview, and only two of those years did I not get a campus interview, but even then this seemed like a very peculiar and unhelpful way to claim success for a beginning academic career.  We did not have anything called #altac in those days, and my plan B–which on my worst days I sometimes still wonder whether I should have followed–was to go back to cooking school and become a chef (I know, I know.  Another growth industry).  I never felt bad about pursuing a PhD in English, and I don’t think I would have even if I had gone on to become a chef.  The learning was worth it, to me, at least.

But I did grow distant from college friends who became vice-presidents of companies or doctors in growing practices, all of whom talked about their mortgages and vacations in the Caribbean or Colorado, while I was living in the cheapest two-bedroom apartment in Fairfax, Virginia, that I could find and fishing furniture, including my daughter’s first bed, out of a dumpster.  (The furniture was held together, literally, by duct tape; I had to pay for conferences.) And I spent a lot of evenings walking off my anxiety through the park next to our apartment complex, reminding myself of how much I had to be thankful for.  After all, I had a job and could pay my bills through the creative juggling of credit card balances. A lot of my friends had found no jobs at all.  A low-rent comparison, I realize, but I would take what solace I could get.

I do not resent those days now, but that depends a lot on my having come out the other side.  The sobering thought in all of this is in realizing that in the world of academics today I should count myself one of the lucky ones.  Reading Roger’s essay, and the many like it that have been published in the last twenty years, I always get a sick hollow feeling in the gut, remembering what it was like to wonder what would happen if….

Reading Roger’s essay I was struck again by the fact that this is now the permanent condition of academic life in the humanities.  My own job story began more than 20 years ago at Duke, and even then we were told that the job market had been miserable for 15 years (but was sure to get better by and by).  Thirty years is not a temporary downturn or academic recession.  It is a way of being.

The advent of MOOCs, all-online education, and for-profit universities is a response to the economics of higher education that is unlikely to make things any better for the freshly minted PhD.  While there are some exciting innovations here that have a lot of promise for bringing learning to the many, it’s also the case that they are attractive and draw interest because they promise to do it more cheaply, which in the world of higher education means teaching more students with fewer faculty hours.  Roger’s most powerful line came toward the end:  “Until we realize that we are all contingent, we are all #altac, we all need to be flexible, and we are all in this together, we won’t be able to effectively deal with the crisis in the humanities with anything other than guilt.”

This is right, it seems to me.  In a world that is changing as rapidly and as radically as higher education, we are all as contingent as the reporters and editors in the newsrooms of proud daily newspapers.  It is easy to say that the person who “made it” was talented enough or smart enough or savvy enough, but mostly they, I, we were just lucky enough to come out the other side.  But we would be misguided to imagine that because we made it into a world that at least resembled the world we imagined, that world will always be there.  We are an older institution and industry than music or radio or newspapers, but we are an industry and an institution nonetheless, and it seems to me that the change is upon us.  We are all contingent now.

The Best of Times, the Worst of Times: The U of Missouri Press is closing, Jennifer Egan is Tweeting

A bad week for publishing, but sometimes it seems like they all are.  First I was greeted with the news that the New Orleans Times-Picayune has cut back circulation to three days a week.  Later in the same day, three papers in Alabama announced a similar move to downsize and reduce circulation.  Apparently being an award-winning newspaper that does heroic community service in the midst of the disaster of a century is no longer enough.

Then today’s twitter feed brought me news of another University Press closing.

University of Missouri Press is closing after more than five decades of operation, UM System President Tim Wolfe announced this morning.

The press, which publishes about 30 books a year, will begin to be phased out in July, although a more specific timeline has not been determined.

Ten employees will be affected. Clair Willcox, editor in chief, declined to comment but did note that neither he nor any of the staff knew about the change before a midmorning meeting.

In a statement, Wolfe said even though the state kept funding to the university flat this year, administrators “take seriously our role to be good stewards of public funds, to use those funds to achieve our strategic priorities and re-evaluate those activities that are not central to our core mission.”

via University of Missouri Press is closing | The Columbia Daily Tribune – Columbia, Missouri.

Plenty has been said about the worrisome demise of daily papers and what the transformation of journalism into an online blogosphere really means for the body politic.  Will the Huffington Post, after all, actually cover anything in New Orleans if the paper goes under entirely?  Reposting is still not reporting, and having opinions at a distance is great fun but not exactly a form of knowledge.

The demise of or cutbacks to university presses is less bemoaned in the national press or blogosphere, but it is still worrisome.  Although I am now a believer in the possibilities of serious intellectual work occurring online, I am not yet convinced the demise of the serious scholarly book with a small audience would be a very good thing.  Indeed, I believe the best online work remains in a kind of symbiotic relationship with the traditional scholarly monograph or journal.  I keep my fingers crossed that this is merely an instance of creative destruction, and not an instance of destruction plain and simple.

On a more hopeful note, I will say that I thoroughly enjoyed the New Yorker tweeting Jennifer Egan’s latest story “Black Box” and am looking forward to the next installments.  I’d encourage everyone to “listen in,” if that’s what you do on Twitter, but if you can’t, you can read it in a more traditional but still twitterish form at the New Yorker Page-Turner site.  To get the Twitter feed, go to @NYerFiction.  The reviews have been mixed, but I liked it a great deal.  Egan is a great writer, less full of herself than some others; she has a great deal to say, and she’s willing to experiment with new ways to say it.  Her last novel, A Visit from the Goon Squad, experimented with PowerPoint-like slides within the text.  And there’s a nice article over at Wired about the piece, suggesting it may be signaling a revival of serialized fiction.

Let’s hope so; it will make up for the loss of the U of Missouri Press, at least for today.

Blogging as textual meditation: Joyce Carol Oates and The Paris Review

One of the surprise pleasures afforded by Twitter has been following The Paris Review (@parisreview) and getting tweets linking me to their archives of author interviews.  I know I could just go to the website, but it feels like a daily act of grace to run across the latest in my twitter feed, as if these writers are finding me in the ether rather than me searching for them dutifully.

(In my heart of hearts I am probably still a Calvinist;  the serendipity of these lesser gods finding me is so much better than the tedious duty of seeking them out).

This evening over dinner I read the latest, a 1978 interview with Joyce Carol Oates, a real gem.

INTERVIEWER

Do you find emotional stability is necessary in order to write? Or can you get to work whatever your state of mind? Is your mood reflected in what you write? How do you describe that perfect state in which you can write from early morning into the afternoon?

OATES

One must be pitiless about this matter of “mood.” In a sense, the writing will create the mood. If art is, as I believe it to be, a genuinely transcendental function—a means by which we rise out of limited, parochial states of mind—then it should not matter very much what states of mind or emotion we are in. Generally I’ve found this to be true: I have forced myself to begin writing when I’ve been utterly exhausted, when I’ve felt my soul as thin as a playing card, when nothing has seemed worth enduring for another five minutes . . . and somehow the activity of writing changes everything. Or appears to do so. Joyce said of the underlying structure of Ulysses—the Odyssean parallel and parody—that he really didn’t care whether it was plausible so long as it served as a bridge to get his “soldiers” across. Once they were across, what does it matter if the bridge collapses? One might say the same thing about the use of one’s self as a means for the writing to get written. Once the soldiers are across the stream . . .

via Paris Review – The Art of Fiction No. 72, Joyce Carol Oates.

Oates doesn’t blog, I think, and I wouldn’t dare to hold my daily textual gurgitations up next to Oates’s stupendous artistic outpouring.  On the other hand, I resonated with this, thinking about what writing does for me at the end of the day.  I’ve had colleagues ask me how I have the time to write every day, my sometimes longish diatribes about this or that subject that has caught my attention.  Secretly my answer is “How could I not?”

Ok, I know that for a long time this blog lay fallow, but I have repented of that and returned to my better self. Mostly (tonight is an exception), I do my blog late, after 10:00–late for someone over 50–like a devotion.  I just pick up something I’ve read that day, like Joyce Carol Oates, and do what English majors are trained to do:  find a connection.  Often I’m exhausted and cranky from the day–being an administrator is no piece of cake (but then, neither is being alive, so what do I have to complain about?).  Mostly I write as if I were talking to someone about the connections that I saw, the problems they raised (or, more rarely, solved).

It doesn’t take that long–a half hour to an hour–and mostly I’ve given up television entirely.  I tell people I seem to think in paragraphs–sometimes very bad paragraphs, but paragraphs nevertheless–and years of piano lessons have left me a quick typist.  Sometimes I write to figure out what I think, sometimes to figure out whether what I think matters, sometimes to resolve a conundrum I have yet to figure out at work or at home, sometimes to make an impression (I am not above vanity).

But always I write because the day and my self disappear.  As Oates says above, the activity of writing changes everything, or at least appears to do so. Among the everything that it changes is me.  I am most myself when I lose the day and myself in words.

Yesterday on Facebook I cited Paul Fussell saying  “If I didn’t have writing, I’d be running down the street hurling grenades in people’s faces.”

Well, though I work at a pacifist school and it is incorrect to say so, that seems about right.

Digital Humanities, “techno-lust”, and the Personal Economy of Professional Development: Ryan Cordell on simplification

It’s a sign of my unsimple and exceedingly overstuffed life that I’ve only now gotten around to reading Ryan Cordell’s ProfHacker piece from last week.  Ryan is moving to a new position at Northeastern (kudos!), and he’s taken the ritual of eliminating the clutter and junk of a household as a metaphor for the need to prioritize and simplify our professional practices and, in his own instance, to get rid of the techno-gadgets and gizmos that curiosity and past necessity have brought his way.

I have confessed before my appreciation for Henry David Thoreau—an odd thinker, perhaps, for a ProfHacker to esteem. Nevertheless, I think Thoreau can be a useful antidote to unbridled techno-lust. As I wrote in that earlier post, “I want to use gadgets and software that will help me do things I already wanted to do—but better, or more efficiently, or with more impact.” I don’t want to accumulate things for their own sake.

…

I relate this not to brag, but to start a conversation about necessity. We talk about tech all the time here at ProfHacker. We are, most of us at least, intrigued by gadgets and gizmos. But there comes a time to simplify: to hold a garage sale, sell used gadgets on Gazelle, or donate to Goodwill.

via Simplify, Simplify! – ProfHacker – The Chronicle of Higher Education.

Ryan’s post comes as I’ve been thinking a lot about the personal economy that we all must bring to the question of our working lives, our sense of personal balance, and our personal desire for professional development and fulfillment.  As I have been pushing hard for faculty in my school to get more engaged with issues surrounding digital pedagogy and to consider working with students to develop projects in digital humanities, I have been extremely aware that the biggest challenge is not faculty resistance or a lack of faculty curiosity–though there is sometimes that.  The biggest challenge is the simple fact of a lack of faculty time.  At a small teaching college our lives are full, and not always in a good way.  There is extremely little bandwidth to imagine or think through new possibilities, much less experiment with them.

At our end-of-the-year school meeting I posed to faculty the question of what they had found to be the biggest challenge in the past year so that we could think through what to do about it in the future.  Amy, my faculty member who may be the most passionate about trying out the potential of technology in the classroom, responded, “No time to play.”  Amy indicated that she had bought ten new apps for her iPad that year but had not had any time to just sit around and experiment with them in order to figure out everything they could do and imagine new possibilities for her classroom and the rest of her work.  The need for space, the need for play, is necessary for the imagination, for learning, and for change.

It is necessary for excellence, but it is easily the thing we value least in higher education.

When I discussed this same issue with my department chairs, one of them said that she didn’t really care how much extra money I would give them to do work on digital humanities and pedagogy; what she really needed was extra time.

This is, I think, a deep problem generally and a very deep problem at a teaching college with a heavy teaching load and restricted budgets.  (At the same time, I do recognize that some of the biggest innovations in digital pedagogy have come from community colleges with far higher teaching loads than ours.)  I think, frankly, that this is at the root of some of the slow pace of change in higher education generally. Faculty are busy people, despite the stereotype of the professor with endless time to just sit around mooning about nothing.  And books are…simple.  We know how to use them, they work pretty well, they are standardized in terms of their technical specifications, and we don’t have to reinvent the wheel every time we buy one.

Not so with the gadgets, gizmos, and applications that we accumulate rapidly with what Ryan describes as “techno-lust.” (I have not yet been accused of having this, but I am sure someone will use it on me now.)  Unless driven by a personal passion, most faculty and administrators make, I think, an implicit and not irrational decision–“This is potentially interesting, but it would be just one more thing to do.”  This problem is exacerbated by the fact that the changes in technology seem to speed up and diversify rather than slow down and focus.  Technology doesn’t seem to simplify our lives or make them easier despite claims to greater efficiency.  Indeed, in the initial effort to just get familiar or figure out possibilities, technology just seems to add to the clutter.

I do not know a good way around this problem:  the need for play in the overstuffed and frantic educational world that so many of us inhabit.  One answer–just leave it alone and not push for innovation–doesn’t strike me as plausible in the least.  The world of higher education and learning is shifting rapidly under all of our feet, and the failure to take steps to address that change creatively will only confirm the stereotype of higher education as a dinosaur unable to respond to the educational needs of the public.

I’m working with the Provost to see if I can pilot a program that would give a course release to a faculty member to develop his or her abilities in technology in order to redevelop a class or develop a project with a student.  But this is a very small drop in the midst of a very big bucket of need.  And given the frantic pace of perpetual change that seems to be characteristic of contemporary technology, it seems like the need for space to play, and the lack of it, is going to be a perpetual feature of our personal professional economies for a very long time to come.

Any good ideas?  How could I make space for professional play in the lives of faculty? Or, for that matter, in my own? How could faculty do it for themselves?  Is there a means of decluttering our professional lives to make genuine space for something new?

Why students of the Humanities should look for jobs in Silicon Valley

Ok, I’ll risk sounding like a broken record to say again that the notion that humanities students are ill-positioned for solid careers after college is simply misguided.  It still bears repeating.  This latest from Vivek Wadhwa at the Washington Post gives yet more confirmation that employers are not looking for specific majors but for skills and abilities and creativity, and that package can come with any major whatsoever; it often comes with students in the humanities and social sciences.

Using the example of Damon Horowitz, who holds degrees in both philosophy and engineering, whose unofficial title at Google is In-House Philosopher, and whose official title is Director of Engineering, Wadhwa points out the deep need for humanities and social science students in the work of technology companies, a need that isn’t just special pleading from a humanist but is made vivid in the actual hiring practices of Silicon Valley companies.

Venture Capitalists often express disdain for startup CEOs who are not engineers. Silicon Valley parents send their kids to college expecting them to major in a science, technology, engineering or math (STEM) discipline. The theory goes as follows: STEM degree holders will get higher pay upon graduation and get a leg up in the career sprint.

The trouble is that theory is wrong. In 2008, my research team at Duke and Harvard surveyed 652 U.S.-born chief executive officers and heads of product engineering at 502 technology companies. We found that they tended to be highly educated: 92 percent held bachelor’s degrees, and 47 percent held higher degrees. But only 37 percent held degrees in engineering or computer technology, and just two percent held them in mathematics. The rest have degrees in fields as diverse as business, accounting, finance, healthcare, arts and the humanities.

Yes, gaining a degree made a big difference in the sales and employment of the company that a founder started. But the field that the degree was in was not a significant factor. ….

I’d take that a step further. I believe humanity majors make the best project managers, the best product managers, and, ultimately, the most visionary technology leaders. The reason is simple. Technologists and engineers focus on features and too often get wrapped up in elements that may be cool for geeks but are useless for most people. In contrast, humanities majors can more easily focus on people and how they interact with technology. A history major who has studied the Enlightenment or the rise and fall of the Roman Empire may be more likely to understand the human elements of technology and how ease of use and design can be the difference between an interesting historical footnote and a world-changing technology.

via Why Silicon Valley needs humanities PhDs – The Washington Post.

Again, at the risk of sounding like a broken record, this sounds like the kind of findings emphasized at the Rethinking Success Conference that I have now blogged on several times.  (I’ve heard theories that people come to be true believers if they hear a story 40 times.  So far I’ve only blogged on this 12 times, so I’ll keep going for a while longer.)  Although I still doubt that it would be a good thing for a philosopher to go to Silicon Valley with no tech experience whatsoever, a philosopher who had prepared himself by acquiring some basic technical skills alongside his philosophy degree might be in a particularly good position indeed.  Worth considering.

Side note: the Post article points to a nice little bio of Damon Horowitz.  I suspect there are not many folks in Silicon Valley who can talk about the ethics of tech products in terms that invoke Kant and John Stuart Mill.  Maybe there should be more.

Digital Humanities as Culture Difference: Adeline Koh on Hacking and Yacking

My colleague Bernardo Michael in the History department here has been pressing me to understand that, properly understood, Digital Humanities should be deeply connected to our college-wide efforts to address questions of diversity and what the AAC&U calls inclusive excellence.  (Bernardo also serves as the special assistant to the President for Diversity Affairs.)  At first blush, I will admit, this has seemed counter-intuitive to me, and I have struggled to articulate the priority between my interest in developing new efforts in Digital Humanities that I tie to our college’s technology plan and my simultaneous concern with furthering our institution’s diversity plan (besides just a general ethical interest, my primary field of study over the past 20 years has been multicultural American literature).

Nevertheless, I’ve started seeing more and more of Bernardo’s point as I’ve engaged in the efforts to get things started in Digital Humanities.  For one thing, the practices and personages of the digital world are talked about in cultural terms:  We use language like “digital natives” and “digital culture” and “netizens”–cultural terms that attempt to articulate new forms of social and cultural being.  In the practical terms of trying to create lift-off for some of these efforts, an administrator faces the negotiation of multiple institutional cultures, and the challenging effort to get faculty–not unreasonably happy and proud about their achievements within their own cultural practices–to see that they actually need to become conversant in the languages and practices of an entirely different and digital culture.

Thus I increasingly see that Bernardo is right;  just as we need to acclimate ourselves and become familiar with other kinds of cultural differences in the classroom, and just as our teaching needs to begin to reflect the values of diversity and global engagement, our teaching practices also need to engage students as digital natives.  Using technology in the classroom or working collaboratively with students on digital projects isn’t simply instrumental–i.e. it isn’t simply about getting students familiar with things they will need for a job.  It is, in many ways, about cultural engagement, respect, and awareness.  How must our own cultures within academe adjust and change to engage with a new and increasingly not so new culture–one that is increasingly central and dominant to all of our cultural practices?

Adeline Koh over at Richard Stockton College (and this fall at Duke, I think) has a sharp post on these kinds of issues, focusing more on the divide between theory and practice, or yacking and hacking, in Digital Humanities.  Adeline has more theory hope than I do, but I like what she’s probing in her piece, and I especially like where she ends up:

If computation is, as Cathy N. Davidson (@cathyndavidson) and Dan Rowinski have been arguing, the fourth “R” of 21st century literacy, we very much need to treat it the way we already do existing human languages: as modes of knowledge which unwittingly create cultural valences and systems of privilege and oppression. Frantz Fanon wrote in Black Skin, White Masks: “To speak a language is to take on a world, a civilization.”  As Digital Humanists, we have the responsibility to interrogate and to understand what kind of world, and what kind of civilization, our computational languages and forms create for us. Critical Code Studies is an important step in this direction. But it’s important not to stop there, but to continue to try to expand upon how computation and the digital humanities are underwritten by theoretical suppositions which still remain invisible.

More Hack, Less Yack?: Modularity, Theory and Habitus in the Digital Humanities | Adeline Koh
http://www.adelinekoh.org/blog/2012/05/21/more-hack-less-yack-modularity-theory-and-habitus-in-the-digital-humanities/

I suspect that Bernardo and Adeline would have a lot to say to each other.

Are Writers Afraid of the Dark–Part II: Salman Rushdie’s contradictory views of censorship

A brief follow-up on my post from earlier today responding to Tim Parks’s notion over at the New York Review of Books that literature is actually characterized by fear and withdrawal from life rather than engagement with it.  Later in the day I read Salman Rushdie’s post at the New Yorker on censorship, a redaction of his Arthur Miller Freedom to Write Lecture delivered a few days ago.  Rushdie brings out the idea that, indeed, writers can be afraid, but it is a fear born from the fact of their writing rather than their writing being a compensation for it.  Censorship is a direct attack on the right to think and write, and Rushdie argues that this can be paralyzing to the writer’s act.

The creative act requires not only freedom but also this assumption of freedom. If the creative artist worries if he will still be free tomorrow, then he will not be free today. If he is afraid of the consequences of his choice of subject or of his manner of treatment of it, then his choices will not be determined by his talent, but by fear. If we are not confident of our freedom, then we are not free.

via Salman Rushdie’s PEN World Voices Lecture on Censorship : The New Yorker.

Rushdie goes on to chronicle the martyrs of writing, who have had a great deal to be afraid of because of their writing (a point made in responses to Parks’s blog as well).

You will even find people who will give you the argument that censorship is good for artists because it challenges their imagination. This is like arguing that if you cut a man’s arms off you can praise him for learning to write with a pen held between his teeth. Censorship is not good for art, and it is even worse for artists themselves. The work of Ai Weiwei survives; the artist himself has an increasingly difficult life. The poet Ovid was banished to the Black Sea by a displeased Augustus Caesar, and spent the rest of his life in a little hellhole called Tomis, but the poetry of Ovid has outlived the Roman Empire. The poet Mandelstam died in one of Stalin’s labor camps, but the poetry of Mandelstam has outlived the Soviet Union. The poet Lorca was murdered in Spain, by Generalissimo Franco’s goons, but the poetry of Lorca has outlived the fascistic Falange. So perhaps we can argue that art is stronger than the censor, and perhaps it often is. Artists, however, are vulnerable.

Read more http://www.newyorker.com/online/blogs/books/2012/05/on-censorship-salman-rushdie.html#ixzz1vMorpS00

This is powerful stuff, though I’ll admit I started feeling like there was an uncomfortable contradiction in Rushdie’s presentation.  Although Rushdie’s ostensible thesis is that “censorship is not good for art,” he goes on after this turn to celebrate the dangerousness of writing.  According to Rushdie, all great art challenges the status quo and unsettles convention:

Great art, or, let’s just say, more modestly, original art is never created in the safe middle ground, but always at the edge. Originality is dangerous. It challenges, questions, overturns assumptions, unsettles moral codes, disrespects sacred cows or other such entities. It can be shocking, or ugly, or, to use the catch-all term so beloved of the tabloid press, controversial. And if we believe in liberty, if we want the air we breathe to remain plentiful and breathable, this is the art whose right to exist we must not only defend, but celebrate. Art is not entertainment. At its very best, it’s a revolution.

Read more http://www.newyorker.com/online/blogs/books/2012/05/on-censorship-salman-rushdie.html#ixzz1vMpu97Bt

It remains unclear to me how Rushdie can have it both ways.  If art is going to be revolutionary, it cannot possibly be safe, and it cannot but expect efforts to censor it.  If there is no resistance to art, then there is no need for revolution, everything will be the safe middle ground, and there will be no possibility of great art.

I am not sure which way Rushdie wants it, and I wonder what my readers think.  Does great art exist apart from resistance and opposition?  If it does not, does it make sense to long for a world in which such opposition does not exist?  Does Rushdie want to be edgy and push boundaries, but to do so safely?  Is this a contradictory and impossible desire?

You can also listen to Rushdie’s lecture below:

Are writers afraid of the dark?

In a new blog post at NYRB, Tim Parks questions the notion that literature is about the stuff of life, suggesting instead that it might be a kind of withdrawal from the complexity and fearfulness of life itself:

So much, then, for a fairly common theme in literature. It’s understandable that those sitting comfortably at a dull desk to imagine life at its most intense might be conflicted over questions of courage and fear. It’s also more than likely that this divided state of mind is shared by a certain kind of reader, who, while taking a little time out from life’s turmoil, nevertheless likes to feel that he or she is reading courageous books.

The result is a rhetoric that tends to flatter literature, with everybody over eager to insist on its liveliness and import. “The novel is the one bright book of life,” D H Lawrence tells us. “Books are not life,” he immediately goes on to regret. “They are only tremulations on the ether. But the novel as a tremulation can make the whole man alive tremble.” Lawrence, it’s worth remembering, grew up in the shadow of violent parental struggles and would always pride himself on his readiness for a fight, regretting in one letter that he was too ill “to slap Frieda [his wife] in the eye, in the proper marital fashion,” but “reduced to vituperation.” Frieda, it has to be said, gave as good as she got. In any event words just weren’t as satisfying as blows, though Lawrence did everything he could to make his writing feel like a fight: “whoever reads me will be in the thick of the scrimmage,” he insisted.

In How Fiction Works James Wood tells us that the purpose of fiction is “to put life on the page” and insists that “readers go to fiction for life.” Again there appears to be an anxiety that the business of literature might be more to do with withdrawal; in any event one can’t help thinking that someone in search of life would more likely be flirting, traveling or partying. How often on a Saturday evening would the call to life lift my head from my books and have me hurrying out into the street.

(via Instapaper)

I was reminded in reading this of a graduate seminar with Franco Moretti wherein he said, almost as an aside, that we have an illusion that literature is complex and difficult, but that in fact, literature simplifies the complexity and randomness of life as it is.  In some sense literature is a coping mechanism.  I don’t remember a great deal more than that about the seminar–other than the fact that Moretti wasn’t too impressed with my paper on T.S. Eliot–but I do remember that aside.  It struck me as at once utterly convincing and yet disturbing, unsettling the notion that we in literature were dealing with the deepest and most complicated things in life.

On the other hand, I’m reminded of the old saw: literature may not be life, but, then, what is?  Parks seems to strike a bit of a graduate-studenty tone here in presenting the obvious as an earthshaking discovery, without really advancing our understanding of what literature might actually be and do.  Parks seems to take delight in skewering without revealing or advancing understanding.  There’s a tendency to set up straw men to light afire, and then strike the smug and knowing revelatory critical pose, when what one has revealed is more an invention of one’s own rhetoric than something that might be worth thinking about.

This desire to convince oneself that writing is at least as alive as life itself, was recently reflected by a New York Times report on brain-scan research that claims that as we read about action in novels the relative areas of the brain—those that respond to sound, smell, texture, movement, etc.—are activated by the words. “The brain, it seems,” enthuses the journalist, “does not make much of a distinction between reading about an experience and encountering it in real life; in each case, the same neurological regions are stimulated.”

What nonsense! As if reading about sex or violence in any way prepared us for the experience of its intensity. (In this regard I recall my adolescent daughter’s recent terror on seeing our border collie go into violent death throes after having eaten some poison in the countryside. As the dog foamed at the mouth and twitched, Lucy was shivering, weeping, appalled. But day after day she reads gothic tales and watches horror movies with a half smile on her lips.)

I’m tempted to say “What nonsense!”  Parks’s willingness to use his daughter to dismiss a scientific finding strikes me as a bit like the homeschool student I once had who cited her father as an authority who disproved evolution.  Well.  The reference to the twitching dog invokes emotion that in fact runs away–in a failure of critical nerve, perhaps?–from the difficult question of how exactly the brain processes and models fictional information, how that information relates to similar real-world situations in which people find themselves, and how people might use and interrelate both fictional and “real world” information.

Parks seems to have no consciousness whatsoever of the role of storytelling in modeling possibility, one of its most complex ethical and psychological effects.  It’s a very long-standing and accepted understanding that one reason we tell any stories at all is to provide models for living.  Because a model is a model, we need not assume it lacks courage or is somehow a cheat on the real stuff of life.  Horror stories and fairy tales help children learn to deal with fear, impart warning and knowledge and cultural prohibitions to children, and attempt to teach them in advance how to respond to threat, to fear, to violence, etcetera.  That those lessons are always inadequate to the moment itself hardly speaks against the need to have such mental models and maps.  It would be better to ask what we would do without them.  The writer who provides such models need not be skewered for that, since to write well and convincingly, to provide a model that serves that kind of ethical or psychic purpose, the writer him or herself must get close to those feelings of terror and disintegration.  It’s why there’s always been a tradition of writers like Hemingway or Sebastian Junger who go to war in order to get into that place within themselves where the emotions of the real can be touched.  It’s also why there’s always been a tradition of writers self-medicating with alcohol.

Thus, I found Parks’s implied assumption that writers are cowering just a bit from the real stuff of life to be something of a cheap shot–cowering being the thing that, in the cultural stories we tell each other, is usually associated with cowardice and weakness, in a writer or a fighter.  The novelists and poets Parks takes on deserve better.

Celebrating the liberal arts in the marketplace (cautiously)

A new survey of 225 employers just out emphasizes the continuing value of the liberal arts in the employment market.

More interesting, at least for those of us who got some parental grief over our college choice, was the apparent love being shown for liberal arts majors. Thirty percent of surveyed employers said they were recruiting liberal arts types, second only to the 34 percent who said they were going after engineering and computer information systems majors. Trailing were finance and accounting majors, as only 18 percent of employers said they were recruiting targets.

“The No. 1 skill that employers are looking for are communication skills and liberal arts students who take classes in writing and speaking,” said Dan Schawbel, founder of Millennial Branding and an expert on Generation Y. “They need to become good communicators in order to graduate with a liberal arts degree. Companies are looking for soft skills over hard skills now because hard skills can be learned, while soft skills need to be developed.”

via Survey On Millennial Hiring Highlights Power Of Liberal Arts – Daily Brief – Portfolio.com.

I don’t particularly like the soft-skills/hard-skills dichotomy.  However, this fits my general sense, blogged on before, that the hysteria over liberal arts majors’ lack of employability is, well, hysteria: something manufactured by reporters needing something to talk about.

At the same time, I think the somewhat glib and easy tone of this particular article calls for some caution.  Digging into the statistics provided even in the summary suggests that liberal arts majors need to be supplementing their education with concrete experiences and coursework that will provide a broad panoply of skills and abilities.  Fifty percent of employers, for instance, say they are looking for students who held leadership positions on campus, a stat before which even engineers and computer scientists must kneel in obeisance.  Similarly, nearly 69% say they are looking for coursework relevant to the position being pursued.  My general sense is that you can sell your Shakespeare course to a lot of employers, but it might be helpful if you sold Shakespeare alongside the website you built for the course or alongside the three courses you took in computer programming.

Generally speaking, then, I think these statistics confirm the ideas propounded by the Rethinking Success conference in suggesting that students really need to be developing themselves as T-shaped candidates for positions, broad and deep, with a variety of skills and experiences to draw on and some level of expertise that has been, preferably, demonstrated through experiences like internships or project-based creativity.

Speaking of Rethinking Success, the entire website is now up with all the relevant videos.  The session with Philip Gardner from Michigan State is embedded below.  It was Gardner who impressed me with his emphasis that students need to realize they must either be liberal arts students with technical skills or technical students with liberal arts skills if they are going to have a chance in the current job market.