A nice essay from novelist Graham Swift in the New York Times on the issues of reading, writing, speed, and leisure. A lot of what’s here is well-travelled ground, though travelled well again by Swift. I especially noted his sense of the way in which time-saving has become the means by which we are enslaved to time.
A good novel is like a welcome pause in the flow of our existence; a great novel is forever revisitable. Novels can linger with us long after we’ve read them — even, and perhaps particularly, novels that compel us to read them, all other concerns forgotten, in a single intense sitting. We may sometimes count pages as we read, but I don’t think we look at our watches to see how time is slipping away.
That, in fact, is the position of the skeptical nonreader who says, “I have no time to read,” and who deems the pace of life no longer able to accommodate the apparently laggard process of reading books. We have developed a wealth of technologies that are supposed to save us time for leisurely pursuits, but for some this has only made such pursuits seem ponderous and archaic. “Saving time” has made us slaves to speed.
To some degree Swift is picking up on a perpetual conundrum in the advancement of technology, a dialectic by which we pursue technological ends to make our lives easier, more convenient, less consumed by work, and more open to enrichment. Making onerous tasks more efficient has been the dream of technology from indoor plumbing to the washing machine to email. In short, we pursue technological means to make our lives more human.
And in some ways and places, we are able to achieve that end. Who would want, really, to live in the Middle Ages anywhere except in Second Life? Your expected life span at birth would have been about 30 years, compared to a global average today in the mid to upper 60s, and those 30 years would have been far more grinding and difficult than what most of the world experiences today (with, of course, important and grievous exceptions). You would likely have been hopelessly illiterate, cut off from even the possibility of entering a library (much less purchasing a handmade codex in a bookstore), and you would have had no means of being informed of what happened in the next valley last week, much less what happened in Beijing 10 minutes ago. It is little wonder that becoming a monk or a priest ranked high on the list of desirable medieval occupations. Where else were you guaranteed a reward in heaven, as well as at least some access to those things we consider basic features of our contemporary humanity–literacy, education, art, music, a life not under the dominion of physical labor? What we usually mean when we romanticize the ancient world (or for that matter the 1950s) is that we want all the fruits of our modern era without the new enslavements that accompany them.
At the same time, of course, our technological advances have often been promoted as a gift to humankind in general, but they have as readily been employed to advance a narrow version of human productivity in the marketplace. Our technologies facilitate fast communication; this mostly means that we are now expected to communicate more than ever, and it also raises expectations about just exactly what can get done. Technology vastly expands the range of information we can thoughtfully engage, but it increases the sense that we are responsible for knowing something about everything, instead of knowing everything about the few dozen books my great-grandparents might have had in their possession. One reason the vaunted yeoman farmer knew something about Shakespeare, could memorize vast expanses of the Bible, and could endure sermons and speeches that lasted for hours is that he didn’t have a twitter feed. Nor did he have an Outlook calendar that becomes an endless to-do list generated by others.
I do think the novel, even in its e-book form, resists this need for speed. On the other hand, it is worth saying that reading like this must be practiced, like other things. I find that when I take a couple of vacation days for a long weekend (like this weekend), it takes me about two-thirds of a day to slow down, relax, and allow myself to pause. Luckily, I can do this more readily with novels, even at the end of a hectic and too-full day or week. But that might be possible because I learned how to do it in another world, one without the bells and whistles that call for my attention through my multiple devices with their glowing LCDs.
Novel reading is a learned skill, and I wonder whether our students learn it well enough. Re-creation is a learned skill, one we need to be fully ourselves, and I do wonder whether we lose that capacity for pause in our speedy lives.
A bad week for publishing, but sometimes it seems like they all are. First I was greeted with the news that the New Orleans Times-Picayune has cut back circulation to three days a week. Later the same day, three papers in Alabama announced a similar move to downsize and reduce circulation. Apparently being an award-winning newspaper that does heroic community service in the midst of the disaster of a century is no longer enough.
Then today’s twitter feed brought me news of another University Press closing.
University of Missouri Press is closing after more than five decades of operation, UM System President Tim Wolfe announced this morning.
The press, which publishes about 30 books a year, will begin to be phased out in July, although a more specific timeline has not been determined.
Ten employees will be affected. Clair Willcox, editor in chief, declined to comment but did note that neither he nor any of the staff knew about the change before a midmorning meeting.
In a statement, Wolfe said even though the state kept funding to the university flat this year, administrators “take seriously our role to be good stewards of public funds, to use those funds to achieve our strategic priorities and re-evaluate those activities that are not central to our core mission.”
Plenty has been said about the worrisome demise of daily papers and what the transformation of journalism into an online blogosphere really means for the body politic. Will the Huffington Post, after all, actually cover anything in New Orleans if the paper goes under entirely? Reposting is still not reporting, and having opinions at a distance is great fun but not exactly a form of knowledge.
The demise of or cutbacks to university presses is less bemoaned in the national press or blogosphere, but it is still worrisome. Although I am now a believer in the possibilities of serious intellectual work occurring online, I am not yet convinced that the demise of the serious scholarly book with a small audience would be a good thing. Indeed, I believe the best online work remains in a kind of symbiotic relationship with the traditional scholarly monograph or journal. I keep my fingers crossed that this is merely an instance of creative destruction, and not an instance of destruction plain and simple.
On a more hopeful note, I will say that I thoroughly enjoyed the New Yorker tweeting Jennifer Egan’s latest story “Black Box” and am looking forward to the next installments. I’d encourage everyone to “listen in,” if that’s what you do on twitter, but if you can’t, you can read it in a more traditional but still twitterish form at the New Yorker Page-Turner site. To get the twitter feed, go to @NYerFiction. The reviews have been mixed, but I liked it a great deal. Egan is a great writer, less full of herself than some others; she has a great deal to say, and she’s willing to experiment with new ways to say it. Her last novel, A Visit from the Goon Squad, experimented with PowerPoint-like slides within the text. And there’s a nice article over at Wired about the piece, suggesting it may be signaling a revival of serialized fiction.
Let’s hope so; it will make up for the loss of the University of Missouri Press, at least today.
One of the surprise pleasures afforded by Twitter has been following The Paris Review (@parisreview) and getting tweets linking me to their archives of author interviews. I know I could just go to the website, but it feels like a daily act of grace to run across the latest in my twitter feed, as if these writers are finding me in the ether rather than me searching for them dutifully.
(In my heart of hearts I am probably still a Calvinist; the serendipity of these lesser gods finding me is so much better than the tedious duty of seeking them out).
This evening over dinner I read the latest, a 1978 interview with Joyce Carol Oates, a real gem.
Do you find emotional stability is necessary in order to write? Or can you get to work whatever your state of mind? Is your mood reflected in what you write? How do you describe that perfect state in which you can write from early morning into the afternoon?
One must be pitiless about this matter of “mood.” In a sense, the writing will create the mood. If art is, as I believe it to be, a genuinely transcendental function—a means by which we rise out of limited, parochial states of mind—then it should not matter very much what states of mind or emotion we are in. Generally I’ve found this to be true: I have forced myself to begin writing when I’ve been utterly exhausted, when I’ve felt my soul as thin as a playing card, when nothing has seemed worth enduring for another five minutes . . . and somehow the activity of writing changes everything. Or appears to do so. Joyce said of the underlying structure of Ulysses—the Odyssean parallel and parody—that he really didn’t care whether it was plausible so long as it served as a bridge to get his “soldiers” across. Once they were across, what does it matter if the bridge collapses? One might say the same thing about the use of one’s self as a means for the writing to get written. Once the soldiers are across the stream . . .
Oates doesn’t blog, I think, and I wouldn’t dare to hold my daily textual gurgitations up next to Oates’s stupendous artistic outpouring. On the other hand, I resonated with this, thinking about what writing does for me at the end of the day. I’ve had colleagues ask me how I have the time to write every day, my sometimes longish diatribes about this or that subject that has caught my attention. Secretly my answer is “How could I not?”
OK, I know that for a long time this blog lay fallow, but I have repented of that and returned to my better self. Mostly (tonight is an exception), I do my blog late, after 10:00–late for someone over 50–like a devotion. I just pick up something I’ve read that day, like Joyce Carol Oates, and do what English majors are trained to do: find a connection. Often I’m exhausted and cranky from the day–being an administrator is no piece of cake (but then, neither is being alive, so what do I have to complain about?). Mostly I write as if I were talking to someone about the connections that I saw, the problems that they raised (or, more rarely, solved).
It doesn’t take that long–a half hour to an hour–and mostly I’ve given up television entirely. I tell people I seem to think in paragraphs–sometimes very bad paragraphs, but paragraphs nevertheless–and years of piano lessons have left me a quick typist. Sometimes I write to figure out what I think, sometimes to figure out whether what I think matters, sometimes to resolve a conundrum I have yet to figure out at work or at home, sometimes to make an impression (I am not above vanity).
But always I write because the day and my self disappear. As Oates says above, the activity of writing changes everything, or at least appears to do so. Among the everything that it changes is me. I am most myself when I lose the day and myself in words.
Yesterday on Facebook I cited Paul Fussell saying “If I didn’t have writing, I’d be running down the street hurling grenades in people’s faces.”
Well, though I work at a pacifist school and it is incorrect to say so, that seems about right.
It’s a sign of my unsimple and exceedingly overstuffed life that I’ve only now gotten around to reading Ryan Cordell’s ProfHacker piece from last week. Ryan is moving to a new position at Northeastern (kudos!), and he has taken the ritual of eliminating the clutter and junk of a household as a metaphor for the need to prioritize and simplify our professional practices and, in his own instance, to get rid of the techno gadgets and gizmos that curiosity and past necessity have brought his way.
I have confessed before my appreciation for Henry David Thoreau—an odd thinker, perhaps, for a ProfHacker to esteem. Nevertheless, I think Thoreau can be a useful antidote to unbridled techno-lust. As I wrote in that earlier post, “I want to use gadgets and software that will help me do things I already wanted to do—but better, or more efficiently, or with more impact.” I don’t want to accumulate things for their own sake.
I relate this not to brag, but to start a conversation about necessity. We talk about tech all the time here at ProfHacker. We are, most of us at least, intrigued by gadgets and gizmos. But there comes a time to simplify: to hold a garage sale, sell used gadgets on Gazelle, or donate to Goodwill.
Ryan’s post comes as I’ve been thinking a lot about the personal economy that we all must bring to the question of our working lives, our sense of personal balance, and our personal desire for professional development and fulfillment. As I have been pushing hard for faculty in my school to get more engaged with issues surrounding digital pedagogy, and to consider working with students to develop projects in digital humanities, I have been acutely aware that the biggest challenge is not faculty resistance or a lack of faculty curiosity–though there is sometimes that. The biggest challenge is the simple fact of a lack of faculty time. At a small teaching college our lives are full, and not always in a good way. There is precious little bandwidth to imagine or think through new possibilities, much less experiment with them.
At our end-of-the-year school meeting I asked faculty what they had found to be the biggest challenge of the past year, so that we could think through what to do about it in the future. Amy, the faculty member in my school who may be the most passionate about trying out the potential of technology in the classroom, responded, “No time to play.” Amy indicated that she had bought ten new apps for her iPad that year but had not had any time to just sit around and experiment with them, to figure out everything they could do and imagine new possibilities for her classroom and the rest of her work. That space, that room to play, is necessary for the imagination, for learning, and for change.
It is necessary for excellence, but it is easily the thing we value least in higher education.
When I discussed this same issue with my department chairs, one of them said that she didn’t really care how much extra money I would give them to do work on digital humanities and pedagogy; what she really needed was extra time.
This is, I think, a deep problem generally, and a very deep problem at a teaching college with a heavy teaching load and restricted budgets. (At the same time, I do admit that some of the biggest innovations in digital pedagogy have come from community colleges with far higher teaching loads than ours.) I think, frankly, that this is at the root of some of the slow pace of change in higher education generally. Faculty are busy people, despite the stereotype of the professor with endless time to just sit around mooning about nothing. And books are…simple. We know how to use them, they work pretty well, they are standardized in terms of their technical specifications, and we don’t have to reinvent the wheel every time we buy one.
Not so with the gadgets, gizmos, and applications that we accumulate rapidly with what Ryan describes as “techno-lust.” (I have not yet been accused of having this, but I am sure someone will use it on me now.) Unless driven by a personal passion, most faculty and administrators make an implicit and not irrational decision: “This is potentially interesting, but it would be just one more thing to do.” The problem is exacerbated by the fact that changes in technology seem to speed up and diversify rather than slow down and focus. Technology doesn’t seem to simplify our lives or make them easier, despite claims to greater efficiency. Indeed, in the initial effort to just get familiar or figure out possibilities, technology just seems to add to the clutter.
I do not know a good way around this problem: the need for play in the overstuffed and frantic educational world that so many of us inhabit. One answer–just leave it alone and don’t push for innovation–doesn’t strike me as plausible in the least. The world of higher education and learning is shifting rapidly under all of our feet, and the failure to take steps to address that change creatively will only confirm the stereotype of higher education as a dinosaur unable to respond to the educational needs of the public.
I’m working with the Provost to see if I can pilot a program that would give a course release to a faculty member to develop his or her abilities in technology in order to redevelop a class or develop a project with a student. But this is a very small drop in the midst of a very big bucket of need. And given the frantic pace of perpetual change that seems to be characteristic of contemporary technology, the need for space to play, and the lack of it, is going to be a perpetual feature of our personal professional economies for a very long time to come.
Any good ideas? How could I make space for professional play in the lives of faculty? Or, for that matter, in my own? How could faculty do it for themselves? Is there a means of decluttering our professional lives to make genuine space for something new?
A brief follow-up on my post from earlier today responding to Tim Parks’s notion, over at the New York Review of Books, that literature is actually characterized by fear and withdrawal from life rather than engagement with it. Later in the day I read Salman Rushdie’s post at the New Yorker on Censorship, a redaction of his Arthur Miller Freedom to Write Lecture delivered a few days ago. Rushdie suggests that, indeed, writers can be afraid, but that it is a fear born from the fact of their writing rather than their writing being a compensation for it. Censorship is a direct attack on the right to think and write, and Rushdie brings out the idea that this can be paralyzing to the writer’s act.
The creative act requires not only freedom but also this assumption of freedom. If the creative artist worries if he will still be free tomorrow, then he will not be free today. If he is afraid of the consequences of his choice of subject or of his manner of treatment of it, then his choices will not be determined by his talent, but by fear. If we are not confident of our freedom, then we are not free.
Rushdie goes on to chronicle the martyrs of writing, who have had a great deal to be afraid of because of their writing (a point made in responses to Parks’s blog as well).
You will even find people who will give you the argument that censorship is good for artists because it challenges their imagination. This is like arguing that if you cut a man’s arms off you can praise him for learning to write with a pen held between his teeth. Censorship is not good for art, and it is even worse for artists themselves. The work of Ai Weiwei survives; the artist himself has an increasingly difficult life. The poet Ovid was banished to the Black Sea by a displeased Augustus Caesar, and spent the rest of his life in a little hellhole called Tomis, but the poetry of Ovid has outlived the Roman Empire. The poet Mandelstam died in one of Stalin’s labor camps, but the poetry of Mandelstam has outlived the Soviet Union. The poet Lorca was murdered in Spain, by Generalissimo Franco’s goons, but the poetry of Lorca has outlived the fascistic Falange. So perhaps we can argue that art is stronger than the censor, and perhaps it often is. Artists, however, are vulnerable.
This is powerful stuff, though I’ll admit I started feeling like there was an uncomfortable contradiction in Rushdie’s presentation. Although Rushdie’s ostensible thesis is that “censorship is not good for art,” he goes on after this turn to celebrate the dangerousness of writing. According to Rushdie, all great art challenges the status quo and unsettles convention:
Great art, or, let’s just say, more modestly, original art is never created in the safe middle ground, but always at the edge. Originality is dangerous. It challenges, questions, overturns assumptions, unsettles moral codes, disrespects sacred cows or other such entities. It can be shocking, or ugly, or, to use the catch-all term so beloved of the tabloid press, controversial. And if we believe in liberty, if we want the air we breathe to remain plentiful and breathable, this is the art whose right to exist we must not only defend, but celebrate. Art is not entertainment. At its very best, it’s a revolution.
It remains unclear to me how Rushdie can have it both ways. If art is going to be revolutionary, it cannot possibly be safe, and it cannot but expect efforts at censorship. If there is no resistance to art, then there is no need for revolution; everything will be the safe middle ground, and there will be no possibility of great art.
I am not sure which way Rushdie wants it, and I wonder what my readers think. Does great art exist apart from resistance and opposition? If it does not, does it make sense to long for a world in which such opposition does not exist? Does Rushdie want to be edgy and push boundaries, but to do so safely? Is this a contradictory and impossible desire?
You can also listen to Rushdie’s lecture below:
In a new blog post at NYRB, Tim Parks questions the notion that literature is about the stuff of life, suggesting instead that it might be a kind of withdrawal from the complexity and fearfulness of life itself:
So much, then, for a fairly common theme in literature. It’s understandable that those sitting comfortably at a dull desk to imagine life at its most intense might be conflicted over questions of courage and fear. It’s also more than likely that this divided state of mind is shared by a certain kind of reader, who, while taking a little time out from life’s turmoil, nevertheless likes to feel that he or she is reading courageous books.
The result is a rhetoric that tends to flatter literature, with everybody over eager to insist on its liveliness and import. “The novel is the one bright book of life,” D H Lawrence tells us. “Books are not life,” he immediately goes on to regret. “They are only tremulations on the ether. But the novel as a tremulation can make the whole man alive tremble.” Lawrence, it’s worth remembering, grew up in the shadow of violent parental struggles and would always pride himself on his readiness for a fight, regretting in one letter that he was too ill “to slap Frieda [his wife] in the eye, in the proper marital fashion,” but “reduced to vituperation.” Frieda, it has to be said, gave as good as she got. In any event words just weren’t as satisfying as blows, though Lawrence did everything he could to make his writing feel like a fight: “whoever reads me will be in the thick of the scrimmage,” he insisted.
In How Fiction Works James Wood tells us that the purpose of fiction is “to put life on the page” and insists that “readers go to fiction for life.” Again there appears to be an anxiety that the business of literature might be more to do with withdrawal; in any event one can’t help thinking that someone in search of life would more likely be flirting, traveling or partying. How often on a Saturday evening would the call to life lift my head from my books and have me hurrying out into the street.
I was reminded in reading this of a graduate seminar with Franco Moretti wherein he said, almost as an aside, that we have an illusion that literature is complex and difficult, but that in fact literature simplifies the complexity and randomness of life as it is. In some sense literature is a coping mechanism. I don’t remember a great deal more than that about the seminar–other than the fact that Moretti wasn’t too impressed with my paper on T.S. Eliot–but I do remember that aside. It struck me as at once utterly convincing and yet disturbing, unsettling the notion that we in literature were dealing with the deepest and most complicated things in life.
On the other hand, I’m reminded of the old saw: literature may not be life, but then, what is? Parks strikes a bit of a graduate-studenty tone here, presenting the obvious as an earthshaking discovery without really advancing our understanding of what literature might actually be and do. Parks seems to take delight in skewering without revealing or advancing understanding. There’s a tendency to set up straw men to light afire, and then to strike the smug and knowing revelatory critical pose, when what one has revealed is more an invention of one’s own rhetoric than something that might be worth thinking about.
This desire to convince oneself that writing is at least as alive as life itself, was recently reflected by a New York Times report on brain-scan research that claims that as we read about action in novels the relative areas of the brain—those that respond to sound, smell, texture, movement, etc.—are activated by the words. “The brain, it seems,” enthuses the journalist, “does not make much of a distinction between reading about an experience and encountering it in real life; in each case, the same neurological regions are stimulated.”
What nonsense! As if reading about sex or violence in any way prepared us for the experience of its intensity. (In this regard I recall my adolescent daughter’s recent terror on seeing our border collie go into violent death throes after having eaten some poison in the countryside. As the dog foamed at the mouth and twitched, Lucy was shivering, weeping, appalled. But day after day she reads gothic tales and watches horror movies with a half smile on her lips.)
I’m tempted to say “What nonsense!” Parks’s willingness to use his daughter to dismiss a scientific finding strikes me as a bit like the homeschool student I once had who cited her father as an authority who disproved evolution. Well. The reference to the twitching dog invokes emotion that in fact runs away–in a failure of critical nerve, perhaps?–from the difficult questions of how exactly the brain processes and models fictional information, how that information relates to similar real-world situations in which people find themselves, and how people might use and interrelate both fictional and “real world” information.
Parks seems to have no consciousness whatsoever of the role of storytelling in modeling possibility, one of its most complex ethical and psychological effects. It is a long-standing and accepted understanding that one reason we tell stories at all is to provide models for living. Because a model is a model, we need not assume it lacks courage or is somehow a cheat on the real stuff of life. Horror stories and fairy tales help children learn to deal with fear, impart warnings, knowledge, and cultural prohibitions, and attempt to teach children in advance how to respond to threat, to fear, to violence, etcetera. That those lessons are always inadequate to the moment itself hardly speaks against the need for such mental models and maps. It would be better to ask what we would do without them. The writer who provides such models need not be skewered for it: to write well and convincingly, to provide a model that serves that kind of ethical or psychic purpose, the writer must get close to those feelings of terror and disintegration. It’s why there has always been a tradition of writers like Hemingway or Sebastian Junger who go to war in order to get into that place within themselves where the emotions of the real can be touched. It’s also why there has always been a tradition of writers self-medicating with alcohol.
Thus I found Parks’s implied assumption that writers are cowering just a bit from the real stuff of life to be something of a cheap shot, the kind of blow that, in the cultural stories we tell each other, is usually associated with cowardice and weakness, whether in a writer or a fighter. The novelists and poets Parks takes on deserve better.
Yesterday in my comments on Carmen McCain’s post, I quoted Susan Sontag in all seriousness. I might have thought better of doing so if I had bothered first to take in this image:
This is from a collection of photos at Flavorwire of writers in various stages of un-work. Mostly these folks do not look inebriated, but with Hunter S. Thompson, Papa Hemingway, and Kurt Vonnegut in the mix, I would remain none too sure. It is comforting to know that writers are people too, just like you and me. Though I will say that, unlike Hunter S. Thompson, I have never driven down the Vegas strip with a naked blow-up doll sitting in my lap. No doubt it is this kind of self-repression that is keeping me from being the writer I was meant to be.
Side Note: A personal favorite is of screenwriter Dalton Trumbo working in the bathtub. Which leads to a writerly twist on the drunken parlor game question: Most unusual place you’ve ever done it? Your writing, I mean?
I led a discussion in the Adult Forum down at St. Stephens last Sunday in which I suggested that apocalyptic literature falls into basically two types, apocalyptic nihilism and apocalyptic redemption: the one seeing an end to everything, the other seeing destruction as the necessary precursor to renewal. There’s an awful lot of apocalypticism out there about the book these days, a good bit of it just assuming the book is going to hell in a handbasket.
I mentioned in my post earlier today that prognosticating the future of the book seems to be a growth industry. Indeed, we devoted an entire symposium to it here at Messiah College. Besides the recent articles I mentioned earlier, one from my colleague Jonathan Lauer and another by Jason Epstein, I ran across this from John Thompson at Huffington Post. A lot of it was the usual and obvious grist for the blogging mill, but I was intrigued by his final point: that the death of our current models for book production and dissemination may well lead to a flourishing of smaller independent publishing concerns.
Seventh, small publishing operations and innovative start-ups will proliferate, as the costs and complexities associated with the book supply chain diminish, and threats of disintermediation will abound, as both traditional and new players avail themselves of new technologies and the opportunities opened up by them to try to eat the lunch of their erstwhile collaborators.
This strikes me as a plausible idea, and an exciting one. Although Anthony Grafton lamented the loss of the demanding professional editor, I think there are an awful lot of talented, creative people out there who could bring new energy and innovation to the world of ebooks and print books alike. Fifty years from now we might be able to look back and see this moment as one that heralded a new beginning for the book rather than its demise. If that’s not just so much rose-colored glasses.
As I’ve suggested before, one of the more startling pronouncements at the Rethinking Success conference last month came from Stanton Green of Monmouth University, who, in my memory, pounded the table as he declared that the college major was the worst thing to happen in higher education in the past 150 years. I’ve thought for a while that a real negative of our current system is the emphasis we put on students selecting a major even before they get to college–a practice driven largely by the need of large professional programs to get students started on their careers from the first semester.
Jeff Selingo at the Chronicle has an interesting blog post this morning on what exactly students think about all the revolution and transformation talk that’s going on in higher ed. He picks up on this question of the importance of the major, finding, anecdotally at least, that students are less convinced of its importance than we are:
Majors don’t matter. Perhaps a better question is why we force students to pick a major at all. The number of majors on campus has proliferated in the last two decades, but some academics, such as Mark Taylor or Roger Schank, think we should abolish our traditional notion of majors and build the undergraduate curriculum around broad ideas or problems we face, like water and food production.
Sure, some of the students I talked with were focused on pursuing a specific profession (marketing, for instance) and wanted a degree that would give them a skill set to secure the right internships that eventually would lead to a full-time job. But most of the students said they were less concerned with picking the right major than they were with choosing the classes that would expose them to new subjects or help them connect ideas across disciplines.
Of course, getting rid of the college major would require a massive transformation of what it means to be a college, not just a college student, and a move away from a narrowly defined research-oriented or professionally oriented definition of the major. There’s no sign yet that we would be willing to do that, or that prospective students would respond well to a college that did away with majors entirely.
Even Selingo seems inconsistent here, since just before making this point about the unimportance of majors, he says we need much more intense career preparation in college so that students do not waste time figuring out what they want to do and what they should major in. How these two assertions end up in adjacent paragraphs, I’m not entirely sure, but it may just signal our confusion: we recognize that, except in some very specific circumstances, majors don’t matter as much as we think they do, yet we still somehow can only imagine a college education as preparation for a specific career.
Maybe if we thought of college as preparing students to blaze a trail on their own professional and personal journey, instead of following a predetermined career path, we could relieve ourselves of the belief that students must figure out what they are going to do with their lives at 17, and that forever after their fates will be determined by a choice made in ignorance by students who cannot possibly know the kinds of people they will be, or the opportunities they will have, when they are 22, much less 32 or 52.
So I wonder: do readers of this blog think it’s possible to imagine a world of higher education in which majors don’t exist?