Category Archives: internet culture

Enjoy your summer reading. Faster! Faster!: Technology, Recreation, and Being Human

A nice essay from novelist Graham Swift in the New York Times on the issues of reading, writing, speed, and leisure. A lot of what’s here is well-travelled ground, though travelled well again by Swift. I especially noted his sense that time-saving has become the means by which we are enslaved to time.

A good novel is like a welcome pause in the flow of our existence; a great novel is forever revisitable. Novels can linger with us long after we’ve read them — even, and perhaps particularly, novels that compel us to read them, all other concerns forgotten, in a single intense sitting. We may sometimes count pages as we read, but I don’t think we look at our watches to see how time is slipping away.

That, in fact, is the position of the skeptical nonreader who says, “I have no time to read,” and who deems the pace of life no longer able to accommodate the apparently laggard process of reading books. We have developed a wealth of technologies that are supposed to save us time for leisurely pursuits, but for some this has only made such pursuits seem ponderous and archaic. “Saving time” has made us slaves to speed.

via The Narrative Physics of Novels – NYTimes.com.

To some degree Swift is picking up on a perpetual conundrum in the advancement of technology, a dialectic by which we pursue technological ends to make our lives easier, more convenient, less consumed by work, and more open to enrichment. Making onerous tasks more efficient has been the dream of technology from indoor plumbing to the washing machine to email. In short, we pursue technological means to make our lives more human.

And in some ways and places, we are able to achieve that end. Who would want, really, to live in the Middle Ages anywhere except in Second Life? Your expected life span at birth would have been about 30 years, compared to a global average today in the mid to upper 60s, and those 30 years would have been far more grinding and difficult than what most of the world experiences today (with, of course, important and grievous exceptions). You would likely have been hopelessly illiterate, cut off from even the possibility of entering a library (much less purchasing a handmade codex in a bookstore), and you would have had no means of being informed of what happened in the next valley last week, much less what happened in Beijing 10 minutes ago. It is little wonder that becoming a monk or a priest ranked high on the list of desirable medieval occupations. Where else were you guaranteed a reward in heaven, as well as at least some access to those things we consider basic features of our contemporary humanity–literacy, education, art, music, a life not under the dominion of physical labor? What we usually mean when we romanticize the ancient world (or for that matter the 1950s) is that we want all the fruits of our modern era without the new enslavements that accompany them.

At the same time, of course, our technological advances have often been promoted as a gift to humankind in general, but they have as readily been employed to advance a narrow version of human productivity in the marketplace. Our technologies facilitate fast communication; this mostly means that we are now expected to communicate more than ever, and it also raises expectations about just exactly what can get done. Technology vastly expands the range of information we can thoughtfully engage, but it also increases the sense that we are responsible for knowing something about everything, instead of knowing everything about the few dozen books my great-grandparents might have had in their possession. One reason the vaunted yeoman farmer knew something about Shakespeare, could memorize vast expanses of the Bible, and could endure sermons and speeches that lasted for hours is that he didn’t have a Twitter feed. Nor did he have an Outlook calendar that becomes an endless to-do list generated by others.

I do think the novel, even in its e-book form, resists this need for speed. On the other hand, it is worth saying that reading like this must be practiced, like anything else. I find that when I take a couple of vacation days for a long weekend (like this weekend), it takes me about 2/3 of a day to slow down, relax, and allow myself to pause. Luckily, I can do this more readily with novels, even at the end of a hectic and too-full day or week. But that might be possible because I learned how to do it in another world, one without the bells and whistles that call for my attention through my multiple devices with their glowing LCDs.

Novel reading is a learned skill, and I wonder whether our students learn it well enough. Re-creation is a learned skill, one we need to be fully ourselves, and I do wonder whether we lose that capacity for pause in our speedy lives.

Digital Humanities as Culture Difference: Adeline Koh on Hacking and Yacking

My colleague Bernardo Michael in the History department here has been pressing me to understand that, properly understood, Digital Humanities should be deeply connected to our College-wide efforts to address questions of diversity and what the AAC&U calls inclusive excellence. (Bernardo also serves as the special assistant to the President for Diversity Affairs.) At first blush, I will admit, this has seemed counter-intuitive to me, and I have struggled to articulate the priority between my interest in developing new efforts in Digital Humanities that I tie to our college’s technology plan and my simultaneous concern with furthering our institution’s diversity plan (besides a general ethical interest, my primary field of study over the past 20 years has been multicultural American literature).

Nevertheless, I’ve started seeing more and more of Bernardo’s point as I’ve engaged in the efforts to get things started in Digital Humanities. For one thing, the practices and personages of the digital world are talked about in cultural terms: we use language like “digital natives” and “digital culture” and “netizens”–cultural terms that attempt to articulate new forms of social and cultural being. In practical terms, an administrator trying to create lift-off for some of these efforts faces the negotiation of multiple institutional cultures, and the challenging effort to get faculty–not unreasonably happy and proud about their achievements within their own cultural practices–to see that they actually need to become conversant in the languages and practices of an entirely different, digital culture.

Thus I increasingly see that Bernardo is right: just as we need to acclimate ourselves to and become familiar with other kinds of cultural difference in the classroom, and just as our teaching needs to begin to reflect the values of diversity and global engagement, our teaching practices also need to engage students as digital natives. Using technology in the classroom or working collaboratively with students on digital projects isn’t simply instrumental–i.e., it isn’t simply about getting students familiar with things they will need for a job. It is, in many ways, about cultural engagement, respect, and awareness. How must our own cultures within academe adjust and change to engage with a new, and increasingly not so new, culture–one that is ever more central and dominant in all of our cultural practices?

Adeline Koh over at Richard Stockton College (and this fall at Duke, I think) has a sharp post on these kinds of issues, focusing more on the divide between theory and practice, or yacking and hacking, in Digital Humanities. Adeline has more theory hope than I do, but I like what she’s probing in her piece, and I especially like where she ends up:

If computation is, as Cathy N. Davidson (@cathyndavidson) and Dan Rowinski have been arguing, the fourth “R” of 21st century literacy, we very much need to treat it the way we already do existing human languages: as modes of knowledge which unwittingly create cultural valences and systems of privilege and oppression. Frantz Fanon wrote in Black Skin, White Masks: “To speak a language is to take on a world, a civilization.”  As Digital Humanists, we have the responsibility to interrogate and to understand what kind of world, and what kind of civilization, our computational languages and forms create for us. Critical Code Studies is an important step in this direction. But it’s important not to stop there, but to continue to try to expand upon how computation and the digital humanities are underwritten by theoretical suppositions which still remain invisible.

More Hack, Less Yack?: Modularity, Theory and Habitus in the Digital Humanities | Adeline Koh
http://www.adelinekoh.org/blog/2012/05/21/more-hack-less-yack-modularity-theory-and-habitus-in-the-digital-humanities/

I suspect that Bernardo and Adeline would have a lot to say to each other.

Dreaming of Heaven? Connection and Disconnection in Cathy Davidson’s Commencement address at UNC

Cathy Davidson and I crossed paths very briefly at Duke, what now seems ages ago, she one of the second wave of important hires during Duke’s heyday in the late 80s and 90s, me a graduate student nearly finished and regretting that I never had the chance to study with someone the graduate student scuttlebutt told me was a great professor. I was sorry to miss the connection. And it’s one of the ironies of our current information age that I am more “connected” to her now in some respects than I ever was during the single year we were in physical proximity at Duke: following her tweets, following her blog at HASTAC, checking in on this or that review or interview as it pops up on my Twitter feed or in this or that electronic medium.

I’m sure, of course, that she has no idea who I am.

In the past several years, of course, Davidson has become one of the great intellectual cheerleaders for the ways our current digital immersion is changing us as human beings, much for the better in Davidson’s understanding. Recently Davidson gave the commencement address at the UNC School of Information and Library Science and emphasized the ways in which our information age is changing even our understanding of post-collegiate adulthood, in the way it enables, or seems to enable, the possibility of permanent connection.

How do you become an adult?   My students and I spent our last class together talking about the many issues at the heart of this complex, unanswerable question, the one none of us ever stops asking.  One young woman in my class noted that, while being a student meant being constantly together—in dorms, at parties, in class—life on the other side of graduation seemed dauntingly “individual.”  Someone else piped up that at least that problem could be solved with a list serv or a Facebook page.  From the occasional email I receive from one or another of them, I know the students in that class came up with a way to still stay in touch with one another. 

 In the fourth great Information Age,  distance doesn’t have to mean loss in the same way it once did.  If Modernity—the third Industrial Age of Information—was characterized by alienation, how can we use the conditions of our connected Information Age to lessen human alienation, disruption of community, separation, loss?  I’m talking about the deep  “social life of information,” as John Seely Brown would say, not just its technological affordances.  How can we make sure that we use the communication technologies of our Age to help one another, even as our lives take us to different destinations?  How can we make sure our social networks are also our human and humane safety net?  

via Connection in the Age of Information: Commencement Address, School of Information and Library Science, UNC | HASTAC.

At the end of her address Davidson asked the graduates from UNC–ILS to stand and address one another:

And now make your colleague a promise. The words are simple, but powerful, and I know you won’t forget them:  Please say to one another, “I promise we will stay connected.” 

There’s something powerful and affecting about this, but I’ll admit that it gave me some pause, both because I think it is a promise that is fundamentally impossible to keep, even amid the powers of our social networks, and because I’m not sure it would be an entirely positive thing if we were able to keep it faithfully.

The dream of permanent and universal connection, of course, is a dream of heaven, an infinite and unending reconciliation whereby the living and the dead speak one to another in love without ceasing. But there are many reasons why this remains a dream of heaven rather than a fact of life, not least being our finite capacity for connection. According to some cognitive theorists, human beings have the capacity for maintaining stable relationships with at most about 200 to 250 people, and many put the number much lower (Robin Dunbar’s famous estimate is around 150). I am not a cognitive scientist, so I won’t argue for the accuracy of any particular number, and I can’t really remember at the moment whether Davidson addresses this idea in her recent work, but to me the general principle seems convincing. While the internet might offer the allure of infinite connection, and while we might always be able to add more computing power to our servers, and while the human brain is no doubt not yet tapped out in its capacities, it remains the case that we are finite, limited, and…human. This means that while I value the 600 friends I have on Facebook, and the much smaller congregation that visits my blog, and those who follow me or whom I follow on Twitter, and a number with whom I have old-fashioned and boring face-to-face relationships in the flesh, I am meaningfully and continuously connected to only a very few of them compared to the number of connections I have in the abstract. This leads to the well-known phenomenon of the joyous and thrilling reconnection with high school friends on Facebook, followed by long fallow periods punctuated only by the thumbs-up “like” button for the occasional post about new grandchildren. We are connected, but we are mostly still disconnected.

And, I would say, a good thing too.

That is, it seems to me that there can be significant values to becoming disconnected, whether intentionally or not.  For one thing, disconnection gives space for the experience of the different and unfamiliar.  One concern we’ve had in our study abroad programs is that students will sometimes stay so connected to the folks back home–i.e. their online comfort zone–that they will not fully immerse in or connect with the cultures that they are visiting.  In other words, they miss an opportunity for new growth and engagement with difference because they are unwilling to let go of the connections they already have and are working, sometimes feverishly, to maintain.

Stretched through time, we might say that something very similar occurs if it becomes imperative to maintain the connections, the communities, the relational selves of our past to the extent that we cannot engage with the relational possibilities of our present. In order to be fully present to those connections that are actually significant to me–even those relationships that are maintained primarily online–I have to let hordes and hordes of relationships die or lie fallow, maintained only through the fiction of connection that my Facebook and Twitter news feeds happen to allow.

Of course, I don’t think saying any of this is particularly earth-shattering. I am very sure that the vast majority of my Facebook connections are not pining away over the fact that I am not working hard at maintaining strong connections with every single one of them. Indeed, I doubt most of them will even know I wrote this post, since they will miss it in their news feeds. A good many of them are probably secretly annoyed that I write a daily blog that appears in their feeds at all, but for the sake of our connection they graciously overlook the annoyance.

On the other hand, I do think there is a broad principle about what it means to be human that’s at stake.  Connection isn’t the only value.  Losing connection, separation, dying to some things and people and selves so some new selves can live.  These are values that our age doesn’t talk much about, caught up as we are in our dreams of a heaven of infinite connection. They are, however, facts and even values that make any kind of living at all possible.

Do all Canadian Professors wear funny robes and require a moderator? The Book Is Not Dead [jenterysayers.com]

Jentery Sayers at the University of Victoria posted a really interesting video set from a debate the humanities faculty put on about the book, or the death thereof. I couldn’t help being interested, since it’s what has absorbed me generally for the past several years, and since we here at Messiah had our own symposium on the book this past February.

I embedded one video below with part of Jentery’s speech–unfortunately split between two videos, and Jentery’s head is cut off some of the time, a talking body instead of a talking head. The whole set is on Jentery’s website, apparently somewhere on the University of Victoria’s site, and of course on YouTube. Worth my time this evening, though perhaps it says something about me that I am spending a Friday night watching Canadian professors dressed in robes and addressing one another as “Madame Prime Minister” and “Leader of the Opposition”. Better than Monty Python.

The event is described as follows:

 As independent bookstores close their doors, newspapers declare bankruptcy and young people are more familiar with negotiating digitized data, it seems that the era of the printed word may be on it’s way out. Indeed, the emergency of digital humanities research seems to imply that, even in the most book-centric fields, the written word may be obsolete. Join us for a good-humoured look at whether the book is dead or if rumours of its demise are premature.

via The Book Is Not Dead [jenterysayers.com].

Takeaway line from Jentery: “New Media Remediates Old Media”. I’m still unpacking that, but I like Jentery’s general sense of the commerce between the traditional Gutenberg book and new media. It does seem to me that in a lot of ways this interaction between media forms is really what’s happening right now. Every book published has a website and a Facebook page, and authors interact with readers via Twitter and personal blogs. A lot of what goes on in new media is repackaging and mashups of old media. I do think, though, that it’s also the case that old media repackages new media as well. Movies end up as books, and blogs become books that become movies.

It seems to me that our divisions between English and film/communication/digital media might make less and less sense. Would it make more sense to imagine books themselves as “media” and simply have media studies, rather than imagining these things separately?

The other memorable line was someone quoting McLuhan: “Old technologies become new art forms.” Or words to that effect. I think this is right, and in the long haul I keep thinking this may be the destiny of the traditional book, though I could be proven wrong. I think bookbinders could be a growth industry, as well as publishers that specialize in high-end book products. I’ve mulled over the question of the book becoming an art object several times before, so I won’t bother to do it again here.

Side note: Jentery Sayers was extremely generous with his time, attention, and intelligence in engaging with a number of faculty and students at Messiah College last week. A lot of good ideas and great energy, even if the computer hookup was less than desirable. Much appreciated. The clip of Jentery’s speech is below:

 

Katrina Gulliver’s 10 Commandments of Twitter for Academics – With Exegetical Commentary

I’m a Johnny-come-lately to Twitter, as I’ve mentioned on this blog before. I’ve got the zeal of a new convert. It was thus with great delight that I ran across Katrina Gulliver’s ten commandments of Twitter for academics. It’s a worthwhile article, but I’ll list only the ten commandments themselves, along with my self-evaluation of how I’m doing.

1. Put up an avatar. It doesn’t really matter what the picture is, but the “egg picture” (the default avatar for new accounts) makes you look like a spammer. [I CONFESS I WAS AN EGG FOR SEVERAL MONTHS BUT FINALLY GOT AROUND TO AN AVATAR THREE OR FOUR WEEKS AGO. LUCKILY, TWITTER CONVERTS EVERYTHING TO YOUR NEW AVATAR IMMEDIATELY, SO IF YOU ARE STILL AN EGG YOU CAN COVER OVER A MULTITUDE OF SINS AT ONCE BY UPLOADING ONE. IT’S VERY NEARLY A RELIGIOUS EXPERIENCE.]

2. Don’t pick a Twitter name that is difficult to spell or remember. [I WOULD ADD TO THIS THAT IT COULD BE GOOD TO PICK SOMETHING FAIRLY SHORT.  MY OWN HANDLE IS MY NAME, @PETERKPOWERS, BUT THAT TAKES UP A LOT OF CHARACTERS OUT OF THE TWITTER LIMIT]

3. Tweet regularly. [DONE.  I AM NOT YET TO THE STAGE OF ANNOYING MY WIFE, BUT SHE DOESN’T REALIZE THAT’S WHAT I’M DOING ON MY IPAD;  I MIGHT ALSO SAY DON’T TWEET TOO REGULARLY, ESPECIALLY NOT IF YOU ARE LISTING SPECIFIC PERSONS.  NO ONE WANTS THEIR PHONE GOING OFF CONSTANTLY]

4. Don’t ignore people who tweet at you. Set Twitter to send you an e-mail notification when you get a mention or a private message. If you don’t do that, then check your account frequently. [AGREED, ALTHOUGH I STRUGGLE WITH WHETHER TO CONTACT EVERY PERSON WHO FOLLOWS ME;  NOT LIKE I’M INUNDATED, BUT I DON’T HAVE TONS OF TIME.  I TRY TO ACKNOWLEDGE FOLLOWS IF THE SELF-DESCRIPTION SUGGESTS THE PERSON IS CLOSELY CONNECTED TO MY PROFESSIONAL LIFE AND INTERESTS]

5. Engage in conversation. Don’t just drop in to post your own update and disappear. Twitter is not a “broadcast-only” mechanism; it’s CB radio. [DOING THIS, BUT IT TOOK ME A WHILE TO GET AROUND TO IT.  HOWEVER, I’M BETTER AT THIS THAN AT STRIKING UP CONVERSATIONS WITH STRANGERS AT PARTIES]

6. Learn the hashtags for your subject field or topics of interest, and use them. [OK, I DON’T REALLY DO THIS ONE THAT MUCH.  EXCEPT SOME WITH DIGITAL HUMANITIES.  I HAVEN’T FOUND THAT FOLLOWING HASHTAGS OUTSIDE OF #DIGITALHUMANITIES HAS GOTTEN ME ALL THAT FAR]

7. Don’t just make statements. Ask questions. [DONE]

8. Don’t just post links to news articles. I don’t need you to be my aggregator. [I’M NOT SURE ABOUT THIS ONE.  I ACTUALLY THINK TWITTER’S AGGREGATOR QUALITY IS ONE OF ITS MOST IMPORTANT FEATURES.  FOR PEOPLE I RESPECT IN THE FIELD OF DH, FOR INSTANCE, I REALLY LIKE THEM TO TELL ME WHAT THEY ARE READING AND WHAT THEY LIKE.  DAN COHEN, MARK SAMPLE, RYAN CORDELL, ADELINE KOH: ALL OF THEM ARE READING OR IN CONTACT WITH REALLY IMPORTANT STUFF, AND I WANT THEM TO PASS IT ALONG.  I’D AGREE THAT JUST POSTING LINKS AT RANDOM MIGHT BE COUNTERPRODUCTIVE, BUT IF YOU ARE BUILDING A REPUTATION FOR BEING IN TOUCH WITH GOOD STUFF IN PARTICULAR AREAS, I THINK POSTING LINKS IS ONE GOOD WAY OF BUILDING AN ONLINE PERSONA.  ON THE OTHER HAND, TAKING THE COMMANDMENT STRICTLY, I AGREE THAT I DON’T REALLY NEED POSTS OF NEWS ARTICLES PER SE.  I FOLLOW THE NEWS OUTLETS’ OWN TWITTER FEEDS FOR THAT KIND OF THING]

9. Do show your personality. Crack some jokes. [DOES TWEETING MY CONTRIBUTION TO INTERNATIONAL MONTY PYTHON STATUS DAY COUNT?]

10. Have fun. [TOO MUCH FUN.  I’VE GOT TO GET BACK TO WORK]

On a related note, I’ve been having a robust, sometimes contentious, sometimes inane discussion about Twitter over at the MLA LinkedIn group. I’d be happy to have someone join that conversation as well.

We are all twitterers now: revisiting John McWhorter on Tweeting

Angus Grieve-Smith over at the MLA group on LinkedIn pointed me toward John McWhorter’s take on Twitter from a couple of years back. [I admit to some embarrassment in referencing an article that’s TWO YEARS OLD!! But the presentism of writing for the web is a story for another time.] McWhorter’s basic case is that Twitter is not really writing at all, but a form of graphic speech (my term, not McWhorter’s). He points out that most people, even once they know how to read and write, speak in bursts of 7 to 10 words, and that writing at its origins reflected these kinds of patterns. In other words, as speakers we are all twitterers. Of Twitter, McWhorter says:

 

The only other problem we might see in something both democratic and useful is that it will exterminate actual writing. However, there are no signs of this thus far. In 2009, the National Assessment of Education Performance found a third of American eighth graders – the ones texting madly right now — reading at or above basic proficiency, but crucially, this figure has changed little since 1992, except to rise somewhat. Just as humans can function in multiple languages, they can also function in multiple kinds of language. An analogy would be the vibrant foodie culture that has thrived and even flowered despite the spread of fast food.

Who among us really fears that future editions of this newspaper will be written in emoticons? Rather, a space has reopened in public language for the chunky, transparent, WYSIWYG quality of the earliest forms of writing, produced by people who had not yet internalized a sense that the flavor of casual language was best held back from the printed page.

This speech on paper is vibrant, creative and “real” in exactly the way that we celebrate in popular forms of music, art, dance and dress style. Few among us yearn for a world in which the only music is classical, the only dance is ballet and daily clothing requires corsets and waistcoats. As such, we might all embrace a brave new world where we can both write and talk with our fingers.

via Talking With Your Fingers – NYTimes.com.

 

I mostly agree with this idea that we are all code shifters. I’m less sanguine than McWhorter, however. His eighth-grade sample takes students before they’ve entered a period of schooling where they’d be expected to take on more serious reading and longer and more complex forms of writing. Twitter may not be to blame, but it’s not clear to me that the state of writing and reading comprehension at higher levels is doing all that well; there’s some pretty good evidence that it isn’t. So just as a foodie culture may thrive in the midst of a lot of fast food, it’s not clear that we ought to be complacent in the face of an obesity epidemic. In the same way, just because tweeting may not signal the demise of fine writing, it’s not clear that it’s helping the average writer become more fluent and sophisticated in language use.

Is hierarchy the problem? Higher Education Governance and Digital Revolutions

In the Digital Campus edition of the Chronicle, Gordon Freeman makes the now common lament that colleges and universities are not taking the kind of advantage of cloud technology that would enable superior student learning outcomes and more effective and efficient teaching. In doing so he lays the blame at the feet of the hierarchical nature of higher ed in a horizontal world.

Higher-education leaders, unlike the cloud-based companies of Silicon Valley, do not easily comprehend the social and commercial transformation gripping the world today. Indeed, there was a certain amount of gloating that the centuries-old higher-education sector survived the dot-com era. After all, textbooks are still in place, as are brick and mortar campuses.

The simple fact is that life is becoming more horizontal, while colleges remain hierarchical. We can expect the big shifts in higher education—where the smart use of digitization leads to degrees—to come from other countries.

And that’s sad, because the United States makes most of the new technologies that other parts of the world are more cleverly adapting, especially in education.

(via Instapaper)

I appreciate the metaphor of hierarchy versus horizontality, and I think it’s seductive, but I wonder if it’s accurate. It captures an American view of the world that imagines the bad guys as authoritarian and hierarchical and the good guys as democratic and dialogical.

Whether or not the horizontal crowd can ever do evil is a post for another day. I’m more interested in whether the slowness of higher ed to take up new modes of doing business is actually due to hierarchy. A colleague at a national liberal arts college that shall not be named told me that his president gave strong support to digital innovation in teaching and research, but that the faculty as a whole were slow on the uptake, which meant institution-wide change was difficult to achieve.

This rings true to me as an administrator. Change is slow not because we are too hierarchical, but because we take horizontality to be an inviolable virtue. Faculty are largely independent operators with a lot of room to implement change, or not, as they choose. Institution-wide change in educational programming takes not one big decision but a thousand small acts of persuasion and cajoling. I mostly think this is as it should be; it is the difficult opposite edge of academic freedom. It means that changing the direction of an institution is like changing the direction of an aircraft carrier, and doing so without an admiral who can set the course by fiat. There are problems with that, but I’m not sure that the changes required to make higher ed as nimble as Gordon Freeman desires would result in an educational system we’d like to have.

Teaching Latin on an iPad: An experiment at Messiah College

An example of some of the things that Messiah College is trying to do in experimenting with digital technology in the classroom.  My colleague Joseph Huffman is more pessimistic than I about the promise of iPads and e-books, but I’m just glad we have faculty trying to figure it out.  See the full post at the link below.

You might not expect a historian of Medieval and Renaissance Europe to be among the first educators at Messiah College to volunteer to lead a pilot project exploring the impact of mobile technology—in this case, the iPad—on students’ ability to learn. But that’s exactly what happened.

Joseph Huffman, distinguished professor of European history, and the eight students in his fall 2011 Intermediate Latin course exchanged their paper textbooks for iPads loaded with the required texts, relevant apps, supplementary PDFs and a Latin-English dictionary. The primary goal was to advance the learning of Latin. The secondary goal was to determine whether the use of the iPad improved, inhibited or did not affect their ability to learn a foreign language.

Why Latin? “A Latin course is about as traditional a humanities course as one can find,” Huffman says. Because any foreign language course requires deep and close readings of the texts, studying how student learning and engagement are affected by mobile technology is especially provocative in such a classic course. In addition, Latin fulfills general language course requirements and, therefore, classes are comprised of students from a variety of majors with, perhaps, diverse experiences with mobile technologies like iPads.

One aspect of the experiment was to explore whether students would engage the learning process differently with an iPad than a textbook.

The assumption, Huffman admits, is that today’s students likely prefer technology over books. Huffman’s experiences with his Latin 201 course—comprised of five seniors, two sophomores and one junior—challenged that commonly held assumption.

via Messiah College: Messiah News – Messiah College Homepage Features » iPad experiment.

Is the laptop going the way of the codex? Technological nostalgia in the iPad imperium

My colleague John Fea, over at The Way of Improvement Leads Home, pointed me to this essay by Alex Golub on the relative merits of the iPad and the laptop. For Golub, the iPad is indispensable, but, as he puts it, “it’s not a laptop and it never will be.” Golub goes on with a litany of limitations that, in fact, I mostly agree with–too hard to produce things, too hard to multitask, etcetera, etcetera.

On the other hand, I’m struck by the degree to which his lamentations strike me as just the sort of thing people are saying about the demise of the book.

Perhaps I am one of the old generation who will someday be put to shame by nimble-fingered young’uns tapping expertly away on their nanometer-thick iPad 7s, but I don’t think so. People may get used to the limitations of the device, but that doesn’t mean that it’s better than what came before.

In fact, I see this as one of the dangers of the iPad. I see them everywhere on campus, and I wonder to myself: Are my students really getting through college without a laptop? Frankly, the idea seems horrifying to me. I don’t doubt that they can do it — I worry what skills they are not learning because of the smallness (in every sense of that word) of the devices they learn on.

Read more: http://www.insidehighered.com/views/2012/04/09/essay-use-ipad-academics#ixzz1rbBPGr4L
Inside Higher Ed

Substitute the word “book” for every reference to the laptop and you’ve got a pretty good rendition of the typical concerns with the demise of the codex: profs in horror at the idea that students may someday come to their classes without books in hand and that they themselves may be required to teach from text on a screen. (Who am I kidding, the thought horrifies me still.) As if somehow there were an inherent depth or proficiency of knowledge that is unavailable through this other form. My college began an iPad experiment this year, and so far there’s been quite a bit of success, even if there are also hiccups. Just yesterday I read an interview with Clive Thompson, who is reading War and Peace on his iPhone. On his iPhone!

As I said, I’m reading War and Peace on my iPhone. But you can’t tell I’m reading War and Peace on my iPhone. When I take my kids to the park and they’re off playing while I’m reading War and Peace, I look like just some fatuous idiot reading his email. I almost went to CafePress and designed a T-shirt that said, “Piss off, I’m reading War and Peace on my iPhone.”

I mildly object to the notion that people look like fatuous idiots answering their email.  It’s what I spend about 80% of my day doing.  Nevertheless, I agree with the sentiment that simply because the embodiment or the tools of our intelligence are unfamiliar, we should not assume intelligence and learning aren’t present.

We’ve had the codex for about two millennia in one form or another.  We’ve had the laptop for less than 40.  I admit to being just a bit bemused at the foreshortening of our nostalgia for the good old days.

Our Data, Our Selves: Data Mining for Self-Knowledge

If you haven’t read Gary Shteyngart’s Super Sad True Love Story, I would encourage you to go, sell all, buy it, and do so. I guess I would call it a dystopian black comedic satire, and at one point I would have called it futuristic. Now I’m not so sure. The creepy thing is that about every other week there’s some new thing I notice and I kind of say to myself, “Wow–that’s right out of Shteyngart.” This latest from the NYTimes is another case in point. The article traces the efforts of Stephen Wolfram to use his immense collection of personal data, from the records of his email to the keystrokes on his computer, to analyze his life for patterns of creativity, productivity, and the like.

He put the system to work, examining his e-mail and phone calls. As a marker for his new-idea rate, he used the occurrence of new words or phrases he had begun using over time in his e-mail. These words were different from the 33,000 or so that the system knew were in his standard lexicon.

The analysis showed that the practical aspects of his days were highly regular — a reliable dip in e-mail about dinner time, and don’t try getting him on the phone then, either.

But he said the system also identified, as hoped, some of the times and circumstances of creative action. Graphs of what the system found can be seen on his blog, called “The Personal Analytics of My Life.”

The algorithms that Dr. Wolfram and his group wrote “are prototypes for what we might be able to do for everyone,” he said.

The system may someday end up serving as a kind of personal historian, as well as a potential coach for improving work habits and productivity. The data could also be a treasure trove for people writing their autobiographies, or for biographers entrusted with the information.
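Out of curiosity, here is roughly what that “new-idea rate” marker amounts to in code. This is a minimal sketch of my own in Python, not Wolfram’s actual system: it assumes you already have a dated email corpus and a list of “standard lexicon” words (both hypothetical placeholders here), and it simply counts how many never-before-seen words turn up each month.

```python
from collections import Counter
import re

def new_word_rate(emails, base_lexicon):
    """Count first-time words per month in a stream of dated emails.

    emails: iterable of (datetime, text) pairs, sorted by date.
    base_lexicon: set of words already considered standard.
    Returns a Counter mapping "YYYY-MM" to the number of words
    first seen in that month.
    """
    seen = set(base_lexicon)
    counts = Counter()
    for sent_at, text in emails:
        for word in re.findall(r"[a-z']+", text.lower()):
            if word not in seen:          # a word never used before
                seen.add(word)
                counts[sent_at.strftime("%Y-%m")] += 1
    return counts

# Hypothetical usage; the corpus loader and lexicon file are placeholders.
# emails = load_email_corpus("my_mail.mbox")            # [(datetime, str), ...]
# lexicon = set(open("standard_lexicon.txt").read().split())
# print(new_word_rate(emails, lexicon))
```

Even a toy version like this makes the appeal obvious: once the words are counted, the “new-idea rate” is just a time series you can plot and stare at, which is exactly the kind of self-quantification the article describes.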

This is eerily like the processes in Shteyngart’s novel, whereby people have data scores that are immediately readable by themselves and others, and the main character obsesses continuously over the state of his data and judges the nature and potential of his relationships on the basis of the data of others.

Socrates was the first, I think, to say the unexamined life was not worth living, but I’m not entirely sure this was what he had in mind.  There is a weird distancing effect involved in this process by which we remove ourselves from ourselves and look at the numbers.

At the same time, I’m fascinated by the prospects, and I think it’s not all that different from the idea of “distant reading” that is now becoming common in certain digital humanities practices in literature: analyzing hundreds or thousands of novels, instead of reading two or three closely, in order to understand through statistical analysis the important trends in literary history at any particular point in time, as well as the way specific novels might fit into that statistical history.
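Distant reading, at its simplest, is just this kind of counting done across a whole corpus rather than within a single book. Here is a small illustrative Python sketch of my own (a toy example, not any particular project’s method) that tracks how often one term appears per decade across a pile of novels; the corpus loader is a hypothetical placeholder.

```python
from collections import defaultdict
import re

def term_trend(novels, term):
    """Rate of a term per 10,000 words, grouped by decade.

    novels: iterable of (year, full_text) pairs.
    Returns {decade: occurrences of the term per 10,000 words},
    a crude distant-reading measure of a term's rise and fall.
    """
    hits = defaultdict(int)
    totals = defaultdict(int)
    target = term.lower()
    for year, text in novels:
        decade = (year // 10) * 10
        words = re.findall(r"[a-z']+", text.lower())
        totals[decade] += len(words)
        hits[decade] += words.count(target)
    return {d: 10000 * hits[d] / totals[d] for d in sorted(totals) if totals[d]}

# Hypothetical usage; load_corpus stands in for however the texts are gathered.
# novels = load_corpus("novels/")   # [(1851, "Call me Ishmael..."), ...]
# print(term_trend(novels, "railway"))
```

Scaled up to thousands of texts and many terms at once, this is the statistical history the distant readers are after.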

Nevertheless, a novel isn’t a person. I remain iffy about reducing myself to a set of numbers I can work to improve, modify, analyze, and interpret. The examined life typically leads not to personal policies but to a sense of mystery: how much there is that we don’t know about ourselves, how much there is that can’t be reduced to what I can see or what I can count. If I could understand my life by numbers, would I?

For your edification I include the book trailer for Shteyngart’s novel below.