Tag Archives: creativity
Uncreative Writing: Kenneth Goldsmith and Liz Laribee on Originality in the Digital Age
Professors have endless angst over the new possibilities for plagiarism and other forms of intellectual property theft in the digital age. But according to Kenneth Goldsmith in the Chronicle Review, such anxiety misses the point that we have long since entered a new age of uncreative creativity, a fact to be celebrated rather than lamented since it points to our having gotten beyond simplistic and romantic or modernist notions of the creative individual. Of course, Goldsmith is promoting his new book, which I guess he would take to be some kind of act of creation and for which I’m guessing he will gain his portion of individual profits—though if he wants to share the profits with all those from whom his ideas derive in an uncreative fashion, I’m sure they will oblige.
My snarky comment aside, I think there’s something to Goldsmith’s ideas, encapsulated in his title “It’s Not Plagiarism. In the Digital Age, It’s ‘Repurposing.’” As Goldsmith puts it:
The prominent literary critic Marjorie Perloff has recently begun using the term “unoriginal genius” to describe this tendency emerging in literature. Her idea is that, because of changes brought on by technology and the Internet, our notion of the genius—a romantic, isolated figure—is outdated. An updated notion of genius would have to center around one’s mastery of information and its dissemination. Perloff has coined another term, “moving information,” to signify both the act of pushing language around as well as the act of being emotionally moved by that process. She posits that today’s writer resembles more a programmer than a tortured genius, brilliantly conceptualizing, constructing, executing, and maintaining a writing machine.
Perloff’s notion of unoriginal genius should not be seen merely as a theoretical conceit but rather as a realized writing practice, one that dates back to the early part of the 20th century, embodying an ethos in which the construction or conception of a text is as important as what the text says or does. Think, for example, of the collated, note-taking practice of Walter Benjamin’s Arcades Project or the mathematically driven constraint-based works by Oulipo, a group of writers and mathematicians.
Today technology has exacerbated these mechanistic tendencies in writing (there are, for instance, several Web-based versions of Raymond Queneau’s 1961 laboriously hand-constructed Hundred Thousand Billion Poems), inciting younger writers to take their cues from the workings of technology and the Web as ways of constructing literature. As a result, writers are exploring ways of writing that have been thought, traditionally, to be outside the scope of literary practice: word processing, databasing, recycling, appropriation, intentional plagiarism, identity ciphering, and intensive programming, to name just a few.
I really do think there is something to this notion that there is a mark of “creativity”—sanitized or put under erasure (to use that hoary old theoretical term) by the quotation marks—in the ways in which we appropriate and redeploy sources from other areas on the internet. We create personae through citation, quotation, sharing, and commentary rather than through creative acts that spring fully formed from our minds and imagination: what we choose to cite and how we choose to comment on it, whom we share it with, what other citations we assemble together with it in a kind of linguistic collage. On one level this is old stuff, as Goldsmith points out, stretching back to a particular strand of modernism and even beyond. Indeed, to go with a different reference to Benjamin, the figure of the storyteller is one who is best understood under the sign of repetition and appropriation, retelling stories that take on new meanings through their performance within particular contexts, rather than creating novel stories that exist on the page in the effort to create their own context.

Good behavior is the proper posture of the weak. (or, Jamaica Kincaid)
I’m reminded in this of some of the work of my friend and former student Liz Laribee, whose art, made up out of an assemblage of leftovers, I find visually provocative and surprisingly moving. About her work, Liz says the following:
My work almost always involves the repurposing of something else, and it’s in this process that I am trying to find meaning. Here, I used discarded bits and overlooked scraps of this bookstore to continue telling stories. The authors I’ve chosen are layered in my life in ways I can’t even quite tell you about. The dime novel poems force a new meaning to make room for a cheekier, sleuthier past.
I’m not exactly sure what Liz means by a cheekier, sleuthier past, but what I take from it is that detritus, the schlocky stuff our commercial culture seems to vomit out and then shovel into a corner, is not something to be lamented so much as something to be viewed as an opportunity, an occasion for a new kind of creativity that takes the vacuous surfaces of that commercial culture and creates a surprising visual and emotional depth.
Goldsmith thinks we are still too absolutely captive to old forms of doing things and thinks writing and literature have descended into irrelevance as a result. He advocates for the development of a writing machine that moves us beyond the cult of personality and intended effect and into a realm of fortuitous and occasional affect. Students need to be forced, he thinks, not to be original in the old sense, but to be repetitive and to find whatever newness there is through this act of what Liz calls “repurposing.”
All this, of course, is technology-driven. When the students arrive in class, they are told that they must have their laptops open and connected. And so we have a glimpse into the future. And after seeing what the spectacular results of this are, how completely engaged and democratic the classroom is, I am more convinced that I can never go back to a traditional classroom pedagogy. I learn more from the students than they can ever learn from me. The role of the professor now is part party host, part traffic cop, full-time enabler.
The secret: the suppression of self-expression is impossible. Even when we do something as seemingly “uncreative” as retyping a few pages, we express ourselves in a variety of ways. The act of choosing and reframing tells us as much about ourselves as our story about our mother’s cancer operation. It’s just that we’ve never been taught to value such choices.
After a semester of my forcibly suppressing a student’s “creativity” by making her plagiarize and transcribe, she will tell me how disappointed she was because, in fact, what we had accomplished was not uncreative at all; by not being “creative,” she had produced the most creative body of work in her life. By taking an opposite approach to creativity—the most trite, overused, and ill-defined concept in a writer’s training—she had emerged renewed and rejuvenated, on fire and in love again with writing.
Having worked in advertising for many years as a “creative director,” I can tell you that, despite what cultural pundits might say, creativity—as it’s been defined by our culture, with its endless parade of formulaic novels, memoirs, and films—is the thing to flee from, not only as a member of the “creative class” but also as a member of the “artistic class.” At a time when technology is changing the rules of the game in every aspect of our lives, it’s time for us to question and tear down such clichés and reconstruct them into something new, something contemporary, something—finally—relevant.
I think there is something to this, although I doubt traditional novels and stories will disappear or should, any more than the writing of novels did away with storytelling in the old sense in any absolute way. But I do think we need to think through, and not only in creative writing classes, what we might mean in encouraging our students to come up with their own original ideas, their personal arguments.
How might this notion change what we are doing, recognizing that we are in a period in which creative work, either artistic or academic, is primarily an act of redeploying, distributing, and remaking, rather than being original in the old sense of that word?
Create or Die: Humanities and the Boardroom
A couple of years ago, when poet and former head of the NEA Dana Gioia came to Messiah College to speak to our faculty and the honors program, I asked him if he felt there were connections between his work as a poet and his other careers as a high-level government administrator and business executive. Gioia hemmed and hawed a bit, but finally said no; he thought he was probably working out of two different sides of his brain.
I admit to thinking the answer was a little uncreative for so creative a person. I’ve always been a bit bemused by and interested in the various connections that I make between the different parts of myself. Although I’m not a major poet like Gioia, I did complete an MFA and continue to think of myself in one way or another as a writer, regardless of what job I happen to be doing at the time. And I actually think working on an MFA for those two years back in Montana has had a positive effect on my life as an academic and my life as an administrator. Someday I plan to write an essay on the specifics, but it is partially related to the notion that my MFA did something to cultivate my creativity, to use the title of a very good article in the Chronicle Review by Steven Tepper and George Kuh. According to Tepper and Kuh,
To fuel the 21st-century economic engine and sustain democratic values, we must unleash and nurture the creative impulse that exists within every one of us, or so say experts like Richard Florida, Ken Robinson, Daniel Pink, Keith Sawyer, and Tom Friedman. Indeed, just as the advantages the United States enjoyed in the past were based in large part on scientific and engineering advances, today it is cognitive flexibility, inventiveness, design thinking, and nonroutine approaches to messy problems that are essential to adapt to rapidly changing and unpredictable global forces; to create new markets; to take risks and start new enterprises; and to produce compelling forms of media, entertainment, and design.
Tepper and Kuh, a sociologist and a professor of education respectively, argue forcefully that creativity is not doled out by the forces of divine or genetic fate but is something that can be learned through hard work. That idea is supported by recent findings that what musical and other artistic prodigies share is not so much genetic markers as practice: according to Malcolm Gladwell, 10,000 hours of it. Tepper and Kuh believe the kinds of practice and discipline necessary for cultivating a creative mindset are deeply present in the arts, but only marginally present in many other popular disciplines on campus.
Granted, other fields, like science and engineering, can nurture creativity. That is one reason collaborations among artists, scientists, and engineers spark the powerful breakthroughs described by the Harvard professor David Edwards (author of Artscience, Harvard University Press, 2008); Xerox’s former chief scientist, John Seely Brown; and the physiologist Robert Root-Bernstein. It is also the case that not all arts schools fully embrace the creative process. In fact, some are so focused on teaching mastery and artistic conventions that they are far from hotbeds of creativity. Even so, the arts might have a special claim to nurturing creativity.
A recent national study conducted by the Curb Center at Vanderbilt University, with Teagle Foundation support, found that arts majors integrate and use core creative abilities more often and more consistently than do students in almost all other fields of study. For example, 53 percent of arts majors say that ambiguity is a routine part of their coursework, as assignments can be taken in multiple directions. Only 9 percent of biology majors say that, 13 percent of economics and business majors, 10 percent of engineering majors, and 7 percent of physical-science majors. Four-fifths of artists say that expressing creativity is typically required in their courses, compared with only 3 percent of biology majors, 16 percent of economics and business majors, 13 percent of engineers, and 10 percent of physical-science majors. And arts majors show comparative advantages over other majors on additional creativity skills—reporting that they are much more likely to have to make connections across different courses and reading; more likely to deploy their curiosity and imagination; more likely to say their coursework provides multiple ways of looking at a problem; and more likely to say that courses require risk taking.
Tepper and Kuh focus on the arts for their own purposes, and I realized that in thinking about the ways that an MFA had helped me I was thinking about an arts discipline in many respects. However, the fact that the humanities are nowhere in their article set me to thinking. How do the humanities stack up in cultivating creativity, and is this a selling point for parents and prospective students to consider as they imagine what their kids should study in college? These are my reflections on the seven major “creative” qualities that Tepper and Kuh believe we should cultivate, and how the humanities might do in each of them.
- the ability to approach problems in nonroutine ways using analogy and metaphor;
- I think the humanities excel in this area for a variety of reasons. In some disciplines, like English and Modern Languages, we focus on the way language works through analogy and metaphor, and we encourage its effective use in writing. But more than that, I think many humanities disciplines can encourage nonroutine ways of approaching problems—though, of course, many professors in any discipline can be devoted to doing the same old things the same old way.
- conditional or abductive reasoning (posing “what if” propositions and reframing problems);
- Again, I think a lot of our humanities disciplines approach things in this fashion. To some degree this is because of a focus on narrative. One way of getting at how narrative works and understanding the meaning of a narrative at hand is to pose alternative scenarios. What if we had not dropped the bomb on Japan? What if Lincoln had not been shot? How would different philosophical assumptions lead to different outcomes to an ethical problem?
- keen observation and the ability to see new and unexpected patterns;
- I think we especially excel in this area. Most humanities disciplines are deeply devoted to close and careful readings that discover patterns that are beneath the surface. The patterns of imagery in a poem, the connections between statecraft and everyday life, the relationship between dialogue and music in a film. And so forth. Recognizing patterns in problems can lead to novel and inventive solutions.
- the ability to risk failure by taking initiative in the face of ambiguity and uncertainty;
- I’m not sure if the humanities put a premium on this inherently or not. I know a lot of professors (or at least I as a professor) would rather their students take a risk in doing something different on a project and have it not quite work than do the predictable thing. But I don’t know that this is something inherent to our discipline. I do think, however, that our disciplines inculcate a high level of comfort with uncertainty and ambiguity. Readings of novels or the complexities of history require us not to go for easy solutions but to recognize and work within the ambiguity of situations.
- the ability to heed critical feedback to revise and improve an idea;
- Again, I would call this a signature strength, especially as it manifests itself in writing in our discipline and in the back-and-forth exchange between students and between student and teacher. Being willing to risk an idea or an explanation, have it critiqued, and then revise one’s thinking in response to more persuasive arguments.
- a capacity to bring people, power, and resources together to implement novel ideas; and
- I admit that on this one I kind of think the humanities might be weaker than we should be, at least as manifested in our traditional way of doing things. Humanists in most of our disciplines are notorious individualists who would rather spend their time in libraries. We don’t get much practice at gathering people together with the necessary resources to work collaboratively. This can happen, but instinctively humanists often want to be left alone. This is an area where I think a new attention to collaborative and project-based learning could help the humanities a great deal, something we could learn from our colleagues in the sciences and in arts like theatre and music. I’m hopeful that some of the new attention we are giving to digital humanities at Messiah College will lead us in this direction.
- the expressive agility required to draw on multiple means (visual, oral, written, media-related) to communicate novel ideas to others.
- A partial strength. I do think we inculcate good expressive abilities in written and oral communication, and we value novel ideas. Except for our folks in film and communication—as well as our cousins in art history—we are less good at working imaginatively with visual and other media-related resources. This has been one of my priorities as dean, but it is something that’s hard to generate from the ground up.
Well, that’s my take. I wonder what others think?
Bodies and Books–II
I’ve continued reading Karin Littau’s Theories of Reading. The second chapter is mostly a schematic history of reading that will be familiar to anyone who’s read some stuff about that history. Still, I was struck anew by two aspects of that history.
First, Littau rehearses the manifest distinctions between our own (gradually eroding??) views of textual authorship and those of earlier periods. According to Littau, there’s no real way to distinguish the copying of a text from the creation of a text in the Middle Ages (which makes me think that more than a few of our students would be more textually at home in the Middle Ages than in our contemporary academy). One reason for the fluidity between “copying” and “creating,” she writes, was “‘the common classical and Christian view of poetic inspiration’, in accordance with which ‘the poet does not originate the poem but is the inspired channel for a divine act of creation’ (Selden 1988: 303). In pre-print culture an author, or auctor, was therefore less a creator of a given work than its assembler, whose rights to the work extended merely to the physical object of the manuscript he or she had produced in the first instance rather than the text as the fruit of his or her private consciousness, as is the case in the copyright law now” (16).
The relationship to our own modes of electronic creation almost doesn’t bear pointing out. How many blogs are simply compilations of materials generated elsewhere, and yet we still think of them as something we’ve somehow produced or written, unique only in their assemblage, not in their creation?
Still, I’m more interested in the implications of the latter part of the quote. I wonder especially whether this doesn’t reaffirm the notion that trying to get back to original intention springs from a god-like view of authorship. However, in the ancient world, the idea that the words were divinely inspired allowed them to be disseminated endlessly into new texts and new assemblages, without worrying fastidiously about the point of historical origin in a particular writer in a particular time and place. By contrast, our own view of the author as Godlike locates that divine authority in a specific moment of history, to which we have to return to the point of exhaustion.
I wonder how this plays out especially among Christian views of scriptural authority and inspiration. Our own historicism insists on grappling with the historical uniqueness and situatedness of the point of creation–with the author, if one can be determined–and in so doing ironically discards a sense of authorship, authority, and inspiration that would have been common at these earlier points in history. To some degree we make the text captive to history, rather than releasing it to new and unforeseen forms of assemblage and creativity.
Well, this is too much for me to flesh out right now, and I’m not sure it would go anywhere anyway.
Cryptomnesia revisited (is this even possible?)
I mentioned in my last post that I had an intuitive sense that cryptomnesia must somehow be deeply important to creativity, but that I was also sure that someone else had already said it.
Turns out I was right on both scores, or at least on the score that someone already said it. Siri Carpenter reports in the Monitor on Psychology that four psychologists have conducted tests whose results suggest the process of forgetting that one has learned something is important to creativity:
The mechanisms that underlie cryptomnesia also have important implications for creativity, Marsh believes. Recently, he, Landau, Hicks and psychologist Thomas B. Ward, PhD, of Texas A&M University, have examined how unconscious learning affects the creative process. In one series of experiments, for example, the researchers asked participants to draw novel space creatures. They found that when participants were first shown a few examples of space creatures that all contained some features in common, such as four legs, antennae and a tail, participants reliably included those features in their own drawings–even though they were instructed not to copy any of the features used in the examples.
“If we want to understand how it is that people design skyscrapers, or write music, or write a New York Times best seller,” Marsh concludes, “I think we need to acknowledge that nothing we design is ever truly novel–every creative effort contains vestiges of what we have experienced in the past.”
I admit that this particular experiment and its definition of creativity strike me as kind of lame, but it still gets at the fact that creation is a cobbling together of things we’ve already known rather than a fashioning tout court. My earlier intuition is more along the lines that people we view as truly creative have the ability to forget, which is as important as the ability to remember. By absorbing ideas, stories, images, and yet forgetting their original context, we are freed to recombine them in new and interesting ways more readily than those who remain slaves to context and origin.
This is, of course, no respecter of intellectual property rights, but don’t forget that Shakespeare’s best ideas were plagiarism–perhaps both conscious and unconscious–of other less accomplished playwrights.
There are apparently a couple of other documented literary cases of cryptomnesia, one pretty definite, the other speculative. At the New York Observer, Ron Rosenbaum wrote an essay about the speculation of a professor that Nabokov’s clear use of someone else’s Lolita-like story represents a case not so much of conscious plagiarism as of cryptomnesia. Professor Michael Maar provides some pretty well-documented evidence that Robert Musil recognized his own cryptomnesia in a couple of scenes of Man Without Qualities.
There’s also an interesting claim at Andrew Brown, Queen’s Counsel, an intellectual property website, to the effect that creative people are especially susceptible to cryptomnesia.
The psychiatrist C G Jung in his book Man and His Symbols outlines that the brain never forgets an impression, no matter how slight. The mind has an ability to recall old impressions particularly during a creative process and what is perceived as a “new” creation can in fact be past memories subconsciously recalled. This can give rise to subconscious plagiarism or what psychiatrists call cryptomnesia.
Those with left brain creativity such as artists, composers or fashion designers can all be particularly susceptible to subconscious copying.
I also ran across a number of writers who say they are terrified of cryptomnesia. I suspect that, in part, it’s graduate-student syndrome. Always frantic to keep track of the last footnote to demonstrate that the three sentences of ideas they actually have are really their own, the writers of most dissertations produce work more dully derivative than a sophomore’s thesis. I, at least, didn’t feel like I was really becoming a scholar until I knew the field well enough to forget where I had learned things. This started happening about 10 years after I finished graduate school.
In general I suspect that the books of folks worried about cryptomnesia aren’t all that good. They are so obsessed with the question of whether they are original that they don’t leave themselves any space to be creative.
Cryptomnesia: Originality, thy name is plagiarism
For the past several years I’ve used the Footprints poem in my literary theory class to discuss theories of evaluation and aesthetic quality. (For those of you unfamiliar with the Footprints poem, I want to say first, “What planet have you been living on?” There is, after all, no more popular item of American religious kitsch than the Footprints poem. My very conservative guess is that it has been shellacked to about 14 million pieces of 1X4 plank board at Vacation Bible Schools across the country.–For those of you unfamiliar with Vacation Bible School…well…bear with me. This may be a my culture/your culture sort of thing. In any case, just enter Footprints poem in any search engine and prepare to be inundated.)
As I was saying, I’ve used this “poem”–in some versions it’s kind of more of a paragraph–to talk about theories of aesthetic distinction. My very good English majors are often aghast that there could be a serious debate about the aesthetic qualities of such a piece of tripe. On the other hand, they are often very chary of the notion that someone could tell them that some things are better than others. Good Americans all, their instincts tell them that it is elitist in a sad and undemocratic fashion to assert that Wagner is “better” musically than Eminem, or that some things are just inherently better than other things. When the question is brought home to them through Sunday School poetry, however, we deal with why they believe that the Footprints poem is inferior. If it is inferior, what justification do we use for saying that some things are better than other things? Is making this claim an objective claim, or is it merely a subjective preference they’ve developed through their years of being elitist English majors? If Hopkins really is better than the Footprints poem, should we take it upon ourselves to teach people who love the Footprints poem that they are really rotting their aesthetic brains and ought to be reading Gerard Manley Hopkins? And if we really believe people would be better off reading Hopkins, why is it such a leap to believe that Wagner really is better than LL Cool J, or that in general everyone would be better off listening to opera than to Country and Western or Rock and Roll?
Found out today that I can now use the Footprints poem to teach about the philosophy and history of authorship as well, another main topic of the course. Hank Stuever over at the Washington Post has written a great piece on the contested authorship of the Footprints poem. Turns out that lawsuits abound, and that no fewer than three people are claiming ownership of the text, though one of them also claims to have written the words to a famous Beatles tune when she was a pre-teen. There are claims and counterclaims, with drafts and manuscripts and other forensic evidence.
And we only thought Shakespeare worthy of this attention.
One academic has even traced the basic essence of the poem back to a sermon in the 1880s and has asserted that it’s possible that nobody actually wrote the poem. Says Stuever:
Last fall, in an online article for the Poetry Foundation, a Brooklyn journalist and literary sleuth named Rachel Aviv traced elements of “Footprints” to a sermon delivered in 1880, and raised the tantalizing possibility that nobody really wrote “Footprints in the Sand.” Those who have claimed to, Aviv noted, may be suffering from the collective “accidental plagiarism” that Carl Jung explored in his paper “Cryptomnesia” more than a century ago.
Everyone knows a cryptomnesiast, of one sort or another. It’s your cousin who stood up at Peepaw’s funeral and tried to pass off the “Do not cry, I did not die” poem as his own; or those crafty tykes who keep submitting bits of Shel Silverstein as original verse to The Post’s kids’ poetry contest. It’s the woman who sends you a sympathy card after your dog dies, with her handwritten version of the (also disputed) fable about dogs waiting for their masters in Heaven. It’s your church pastor or corporate motivational speaker who keeps coming up with those amazing “I-recently-met-a-man-who” anecdotes to illustrate his point.
Something can be so profound, so true, (so “duh”) that the cryptomnesiast is sure she thought it up herself. There is very little you can buy at a crafts fair or in the self-help section at Barnes & Noble that doesn’t have a whiff of the unattributed.
This happens to me ALL the time. I’m always running across published stuff out there on the internet that I KNOW I came up with first. My brain is the great unacknowledged source of most of the intellectual creativity out there in the last couple of decades.
Ok, seriously though, I do have this experience of feverishly working out an idea only to discover someone else has already done it, and much better than I could even if given world enough and time. Cryptomnesia is apparently a specific form of memory in which you recall something without realizing that you are remembering it, because you don’t recall ever learning or reading it in the first place.
A new explanation for Borges’s story about the rewritten version of Don Quixote.
I find, in fact, as I get older that I do this all the time. I was reading in the Chronicle of Higher Education earlier today and ran across a blog post by an aging professor who talks about how he downloaded an article and wrote feverish and exalted notes about it, thrilled at these new ideas and contributions to his own work, until he discovered somewhat later that he had already read the article and had written feverish and exalted notes about it, thrilled at these new ideas and contributions to his own work.
Okay, I really have done this. And at 48 I don’t really consider myself aged. I’m reading Dickens’s Hard Times at the moment, and I keep having the nagging suspicion that I’ve read it before, but I can’t remember enough to know what is going to happen next. Does this count as a memory?
Although it’s laughable on one level to think of the many people wanting to claim ownership of the Footprints poem, I wonder if there’s not some more serious relationship between cryptomnesia and creativity. It’s not an original thing to say that creativity is primarily a matter of rearrangement, of finding creative connections between the many things that are, rather than a discovery or manufacture of the absolutely new. There are people out there with absolutely unerring memories, who remember the details of their lives from many years ago–what they had for breakfast, how long it took to eat, whether they burped or sniffled after downing the last bit of egg. I suspect, though I cannot know, that such people would find it intensely difficult to be “original.” They would experience themselves self-consciously plagiarizing or replicating experiences all the time. If I could not forget that I had read something, I would find myself less free to combine it anew with other things that I have forgotten that I learned. Recombination is made possible by dislocation, by tearing something away from its original context. My guess is that cryptomnesia makes that possible.
In the same fashion, we imagine ourselves as unique and original individuals until we awake in our forties to discover that we like the same kind of socks as our fathers, that we scratch our noses in the same way between sentences, that our lower lips protrude while thinking just like his, and that we roll our thumbs around one another in precisely the same annoying way that he rolled his thumbs when we were teenagers.
Not that any of that has ever happened to me. But I’m sure if we had to live in a constant awareness of all the ways our lives are imitative, we would never be able to allow ourselves to invent new contexts and meanings for all the things we’ve plagiarized from others.
But I’m sure that someone already said this better.
Of Bloggers, Bookworms, and Bibliomaniacs
Because I’ve been teaching Ralph Waldo Emerson’s essays, “The Poet” and “The American Scholar,” I’ve been spending a good bit of time over at rwe.org, which describes itself as “The Internet’s Complete Guide to the Life and Works of Ralph Waldo Emerson.” The description is awfully modest for a site devoted to a thinker whom literary theorist Harold Bloom described as “God,” but it is also awfully accurate. Indeed, every time I visit rwe.org, I find myself thinking wistfully, “What if everything on the web devoted to literature were actually this good, this complete, this organized, this useful?”
Would that it were so, but the net’s strength tends to be volume, while quality, completeness, and organization are hit and miss. Of course, a lot of these things depend not only on copyright laws, but also on attracting a devoted following willing to do the work necessary. By all evidence, and not just that of Harold Bloom, the cult of Emerson remains strong. The Emersonians over at rwe.org have clearly done a great work for all of us by creating a digital monument to this most seminal of American thinkers.
Which is itself an irony and an occasion for thought. In the first place, Emerson wasn’t much given to monuments or to being monumental. In the second place, what exactly would this thinker who believed immersion in nature was the first responsibility of “Man Thinking” think about our dependence on technology, our bleary-eyed devotion to the glowing screen, our aching backs as we bend over our keyboards, our pasty complexions that testify that we have all but forgotten the existence of the sun?
My first guess is that he would be appalled both by his own monumentality and by our unnatural lives. Though, at the same time, it isn’t impossible to imagine Emerson not only as the God of Harold Bloom but also as the first progenitor of netizens everywhere.
My sense of Emerson’s displeasure centers on his general unease with reading. This from “The American Scholar”:
The theory of books is noble. The scholar of the first age received into him the world around; brooded thereon; gave it the new arrangement of his own mind, and uttered it again. It came into him, life; it went out from him, truth. It came to him, short-lived actions; it went out from him, immortal thoughts. It came to him, business; it went from him, poetry. It was dead fact; now, it is quick thought. It can stand, and it can go. It now endures, it now flies, it now inspires. Precisely in proportion to the depth of mind from which it issued, so high does it soar, so long does it sing.
Yet hence arises a grave mischief. The sacredness which attaches to the act of creation, — the act of thought, — is transferred to the record. The poet chanting, was felt to be a divine man: henceforth the chant is divine also. The writer was a just and wise spirit: henceforward it is settled, the book is perfect; as love of the hero corrupts into worship of his statue. Instantly, the book becomes noxious: the guide is a tyrant. The sluggish and perverted mind of the multitude, slow to open to the incursions of Reason, having once so opened, having once received this book, stands upon it, and makes an outcry, if it is disparaged. Colleges are built on it. Books are written on it by thinkers, not by Man Thinking; by men of talent, that is, who start wrong, who set out from accepted dogmas, not from their own sight of principles. Meek young men grow up in libraries, believing it their duty to accept the views, which Cicero, which Locke, which Bacon, have given, forgetful that Cicero, Locke, and Bacon were only young men in libraries, when they wrote these books.
Hence, instead of Man Thinking, we have the bookworm. Hence, the book-learned class, who value books, as such; not as related to nature and the human constitution, but as making a sort of Third Estate with the world and the soul. Hence, the restorers of readings, the emendators, the bibliomaniacs of all degrees.
(From RWE.org – The Works of Ralph Waldo Emerson)
I wish I had thought of the name “Bibliomaniac” when I was getting my personal blog started. The description is apt. Though for Emerson, of course, a sign of damnation. He would have detested the accompanying image from Northwestern University’s Library, however much I love it. But really, isn’t this more or less the image of not just Man Reading Books, but also Man Reading Computer Screen? (Or even Man Blogging?) Through my devotion to the thoughts and words of others, I drift gradually from my own authenticity, my own innate and good self-expression, my personal experience of the Over-Soul. To be derivative is to be damned, and the only sure way to avoid derivation is to not read at all.
Well, to be sure, Emerson doesn’t go quite this far. But he was suspicious of the obeisance we give to thinkers of old. Writing when he was a relatively young man, he probably didn’t give a lot of thought to the fact that he, like all flesh living, was on the way to becoming a thinker of old. And he could not have imagined me poring over and ingesting his words like a bookworm as I prepare to teach a class on him as one of the monumental literary theorists of the nineteenth century.
(Sidebar: What metaphor must we now use in place of bookworm in the world of pixels? A computer virus? Another term like “typewriter” that my someday-grandchildren will not recollect and will marvel at as an index of my age and lack of cool? Who am I kidding—my son already marvels at these things. Of course, if we give up reading altogether, spending all our time blogging, we won’t have to worry about having a different word.)
But there is a place for good reading in Emerson, and as I’ve suggested elsewhere, it has something to do with reading as a creative act. Says Emerson:
I would not be hurried by any love of system, by any exaggeration of instincts, to underrate the Book. We all know, that, as the human body can be nourished on any food, though it were boiled grass and the broth of shoes, so the human mind can be fed by any knowledge. And great and heroic men have existed, who had almost no other information than by the printed page. I only would say, that it needs a strong head to bear that diet. One must be an inventor to read well. As the proverb says, “He that would bring home the wealth of the Indies, must carry out the wealth of the Indies.” There is then creative reading as well as creative writing. When the mind is braced by labor and invention, the page of whatever book we read becomes luminous with manifold allusion. Every sentence is doubly significant, and the sense of our author is as broad as the world. We then see, what is always true, that, as the seer’s hour of vision is short and rare among heavy days and months, so is its record, perchance, the least part of his volume. The discerning will read, in his Plato or Shakspeare, only that least part, — only the authentic utterances of the oracle; — all the rest he rejects, were it never so many times Plato’s and Shakspeare’s.
I think this kind of thing, along with Emerson’s deeply felt sense of the interconnections of the immediate world with a world beyond—and with everything else in the world–is where people get the idea that Emerson was some kind of ancient God of the blogosphere. Indeed, Christopher Lydon a few years back said just this thing in an extended blog post called “A God for Bloggers.” The post at its original site is long gone, but is copied in full here. In part, Lydon argues:
Here’s my point. When we talk about this Internet and this blogging software, this techno-magic that encourages each of us to be expressive voices in an open, universal network of across-the-board conversation, we are speaking of an essentially Emersonian device for an essentially Emersonian exercise. Starting with the electronics. “Invent a better mousetrap,” as Emerson wrote, “and the world will beat a path to your door.”
There’s a part of me that thinks Emerson would have loved the fact that Lydon’s post had disappeared, or almost disappeared. This is the perfect condition of reading as far as Emerson is concerned: let the book/blog have its say and go away.
To Lydon’s actual content, I want to say….yeah, kind of….but not really. In the first place, there’s a way in which the technology of blogging and reading blogs tethers us to society—Emerson’s worst dirty word—in a way that books did not, this despite the aura of freedom that surrounds computerworld.
Even with the magnificence of access, I am struck by how physically limited I am in terms of my mode of access. My computer needs a proximate cord and electricity and connections—electricity even if I have a wireless connection, and reliable wireless connections are still hard to come by. Because I know next to nothing about the workings of this machine I’m writing on, because I can do nothing to control my internet connection, because I have to have access to various levels of anonymous administrators and their vast electronic resources, I am in some sense even more dependent, more inescapably tethered to society and its mores and its conventions than Emerson could have ever imagined.
We have the lovely illusion of independent creativity in our isolation, in our logorrhea of the keyboard, in our incessant speech. It’s a little like the way cocaine makes the addict think he’s an all-powerful sex machine. The real power lies with the man who provides the fix. Or doesn’t. In this case, my internet administrator, or more dumbly, the squirrel that gets itself electrocuted in the router box or powerline.
By comparison, a book is a model of self-reliance, even compared to e-books with megabatteries. I can drop my copy of Ulysses in a lake, and if I’m quick enough I can probably set it by a fire, let it dry for a while, and be just fine. Then again, if not, I have a new and ready supply of toilet paper, Kleenex, and firestarter.
By comparison, my daughter’s iPod died irreparably after sitting next to a sweating water bottle for thirty minutes. Sitting in the sauna today, I was wondering—can an e-book stand the heat, stand those rivers of sweat that dripped off my nose into the creases of the cheap newsprint I was perusing? Could be, but I would be afraid to try. If I ruin my newspaper I’m out three bucks. If I ruin my dedicated e-reader—the one I will supposedly buy someday—I would be out $400 plus however many hundreds of dollars of books I had stored up. Emerson might well look at bloggers and e-books and the like and see not evidence of infinite expressibility, but of cows in a pen.
Not saying he would be right, but he would have reason. So says this cow.
Similarly, I have my doubts about the idea that blogging, simply because it is a form of expression, is the kind of expressiveness that Emerson had in mind. Indeed, for Emerson the deadliest thing for individual authenticity was repetition, was convention. And yet, does it take very much time on the web to realize just how repetitive blogging really is, just how much of it is reaction rather than response? How much of it is profane ejaculation rather than creative reading? Indeed, I’m often bemused by how many blogs are largely cut-and-paste jobs of other blogs. Bloggers not only don’t come up with words of their own, they explicitly and joyfully use the words of others as a substitute for words of their own.
By comparison my own blog is a paean to self-indulgence. I actually write these things myself. Mostly.
At least all this lack of originality originates with me.
In other words, there’s a part of the net’s emphasis on collaboration and connectivity that speaks to Emerson’s optimistic view of the interconnectivity of all things. There’s another part of it that speaks to Emerson’s sense of “the sluggish and perverted mind of the multitude,” where people let groups think for them rather than thinking, and speaking, for themselves.
You don’t, of course, have to rely on me for this. Rely on yourself…and on the administrators at RWE.org, and on your computer engineer, and on your electrical grid, and on your software engineer… and on…and on. Well you get my drift.
Go over to the folks at RWE.org and read Emerson for yourself.