No Arguments for the Elimination of Anything
Randy Lewis/University of Texas at Austin

iPhone Abacus

Before I complain about how incredibly lucky I am, technologically speaking, let me make something very clear. I am a grateful occupant of the present moment. I am not interested in going back to the 1970s, to a childhood populated by stuck-key typewriters and televisions the size of Buicks, not to mention germy pay phones, tissue-paper airmail, and film strips with wobbly-voiced narrators. Even worse is the prospect of time traveling back to the 19th century, when booming 3D movies and sleek smart phones had not yet supplanted the minstrel show and the abacus.

If anything, the recent holiday season has intensified—and complicated—my gratitude for the technological here and now. Like many of you, I am the proud owner of a new iPhone, the latest addition to an impressive electronic arsenal that keeps me from falling out of step with the slightest twitch of our culture’s nonstop media frenzy. Now more than ever, I live in a constant pirouette, spinning between news updates, push notifications, and urgent work emails flashing on the various screens that surround me even when I sleep. Websites, television programs, text messages—they barely register in my mind before I flit to the next jolt of electrons that might bring some flicker of joy.

It would seem a perfect match between product and consumer: the glittering wares of the postmodern media bazaar versus the anxious drudgery that characterizes so much of contemporary American life. But the more I stare into the expensive screens that I am lucky enough to possess, the more I feel a vague anxiety stirring. Why am I so stimulated but rarely satisfied by this orgy of electronic activity? Why does disenchanting reality haunt my fantasies of blissful connectivity? Why can’t I simply enjoy the show?

I’d write it off to the quirks of personal psychology, but I know that I am not alone in feeling something amiss in the emerging mediascape. How many of us have a secret or even unconscious longing to escape the constant looking and seeing, buzzing and tweeting, of our seductive screen culture? The iPhone has barely celebrated its fifth birthday, but already it’s an addictive presence that follows us everywhere, even into places where it should never be. For instance, our hospitals are now staffed with iPhone junkies, including, as the New York Times reports, a “neurosurgeon making personal calls during an operation, a nurse checking airfares during surgery and a poll showing that half of technicians running bypass machines had admitted texting during a procedure.”1

Even if techno-mania doesn’t induce medical malpractice that will shorten my life, the incessant beeping of everything everywhere is driving me slightly mad—so much so that I wonder what it would be like to live without it all (once again). I am not talking about a quick Internet holiday or a symbolic “TV turn-off” day, but instead something unthinkable in our present state of mind: the permanent unplugging of all media devices, from TVs to computers to smart phones to video games, in one big Luddite freak-out. To even evoke such possibilities seems wrong and dangerous, like something out of dystopian science fiction. Surely it would result in roving bands of Mad Max villains and Kevin Costner drinking his own urine on a sea of post-apocalyptic despair?

Mad Max reaches for his iPhone

Strangely, we’re willing to imagine such grim scenarios in general, whether it’s the Christian “rapture” of the Left Behind series or the apocalyptic landscape of I Am Legend or The Terminator—but not in regard to our beloved consumer electronics. How can we account for the rapidity with which these devices have become forever superglued to our bodies? How can we explain our waning ability to imagine anything else? It is one of the great aporias of our times, a strange hole in our collective imagination.

Perhaps high tech is the only game in town, the only place where, as Charlie Sheen puts it, we’re winning. A friend of mine, the anthropologist Norman Stolzoff, made this point to me recently. As he put it, technology is the rare part of our society in which we have something concrete to show for all our bluster about innovation and amelioration. Taking away our greatest success story would just be cruel.

Yet… are we not curious about how it would feel to experience the “great unplugging”? Would we relish the ensuing silence as we restore the old ways of communicating and connecting with one another? Or would we lapse into a languorous funk without Google and HBO, Avatar and Annoying Orange? Would we feel permanently stuck in the isolation tank of our own boredom, marooned with the hideousness of our own organic thoughts? Would we start sketching the “Real Housewives” on the walls of our condos in crayon, breathlessly narrating their erotic adventures like ancient bards singing the tale of Odysseus and the sirens? Would we pine for our iPhones, laptops, and flatscreen TVs like postmodern amputees cursing the loss of our cyborg appendages? Would we grieve for our machines?

Probably. But what fascinates me is how loath we are to even imagine this scenario. We are increasingly unwilling to contemplate the absence of the various screens that convey so much of our entertainment, sociality, and labor. Like Francis Fukuyama’s Cold War “End of History” argument, in which capitalism’s apparent triumph over socialism foreclosed any discussion of alternatives, the new media juggernaut is so powerful that it has blotted out our ability to imagine anything else. We are all hopeless screenagers now.

Once, long ago, in a land just before TRS-80s and Colecovision, Americans could still imagine cutting the umbilical cord to mother media. In 1977, former ad-man Jerry Mander wrote an influential book called Four Arguments for the Elimination of Television, which takes aim at what he deems “a totally horrible” and “irredeemable” technology; “we’d all be much better off without it.” His critique was blistering. “Television offers neither rest nor stimulation,” Mander lamented. “Television inhibits your ability to think, but it does not lead to freedom of mind, relaxation or renewal. It leads to a more exhausted mind.”2 Of course, Mander encountered resistance even then. “Are you really going to advocate its elimination?” he was asked repeatedly while researching the book, even by people who claimed to hate television. Much to his astonishment, even the haters were unable to imagine life without TV, prompting him to wonder, “why it is so unthinkable that we might eliminate a whole technology?”

[youtube]http://www.youtube.com/watch?v=m3NBEurnIqY&feature=youtu.be[/youtube]

Recent interview with Jerry Mander

Today, Mander’s bold renunciation seems as much a relic of the 1970s as Billy Beer and Barry Manilow. As TV has been joined by a host of new media devices that offer endless distraction (and increasingly endless labor), we only hear tepid, partial calls for individual reform, never systemic abolition. In the first days of 2012, Pico Iyer wrote a searching piece in the New York Times about “trying to escape the constant stream of too much information,” but his solution was simply the occasional act of personal renunciation: a morning without email here, an unplugged week in a Benedictine monastery there. Although Iyer acknowledges that “all the data in the world cannot teach us how to sift through data; images don’t show us how to process images,” he can only suggest that we leave our cell phones at home during a Saturday walk, or find some recompense in the organic rigor of “yoga, or meditation, or tai chi.”3

Writing in a similar vein a few days later in Slate, Katie Roiphe wondered if we could even go back to an unplugged world:

If you ask any 60-year-old what life was like before the Internet they will likely say they “don’t remember.” How can they not remember the vast bulk of their adult life? The advent of our online lives is so transforming, so absorbing, so passionate that daily life beforehand is literally unimaginable.4

Literally unimaginable is the part that stuns me. Unless it’s just a shallow reflection of unfounded bourgeois certitude, a kind of upper-middle-class Whiggism that assumes that history moves in one direction—toward my comfy perch on a Pottery Barn sofa—Roiphe is describing a disturbing sort of cultural rigidity in the contemporary US. It’s what politicians used to call a failure of vision.

Of course, a few Americans are able to imagine themselves shorn of “all mod cons,” not just the latest iPhone. In his book What Technology Wants, Kevin Kelly looks at the Amish, whose rejection of modern technology seems impressively complete at first glance. Yet what Kelly discovers is that even the Amish remain dependent on the hidden benefits of technology for the kerosene in their lamps, the metal in their tools, and the cotton in their clothes, which means that their renunciation depends upon our system of manufacturing and high-tech distribution. In other words, the Amish renunciation is in symbiosis with our techno-lust. “The Amish lifestyle is too familiar to poor peasants in China or India to have any meaning there,” Kelly points out. “Such elegant rejection can only exist in, and because of, a modern technium.”5

Kelly is a practical, adaptive sort: he wants us to use the tools that make our tasks easier, whether it’s a chainsaw or an iPad. And he is right. Atavism is not the answer to our technological woes: thoughtful adaptation is. We are in a transitional moment—when are we not?—in regard to the proliferation of new communication technologies in our midst. Their sudden omnipresence is a boon to the consciousness that Homo sapiens has evolved over 100,000 years, but also, almost imperceptibly and certainly without fanfare, a splinter in our eye.

I think it is this both/and perspective that I would emphasize. I’m tired of well-funded techno-utopians shouting down a few ragged techno-Cassandras in an “either/or” battle for the soul of our culture. Instead of ignoring (or exaggerating) the downside of our proliferating screen culture, we could weigh the benefits and the drawbacks in the same thoughtful conversation. Over a decade ago, in The Rise of the Image and the Fall of the Word, media scholar Mitchell Stephens gestured in this direction when he reminded us that all technology comes with a price.6 Digging into the Greek origins of the word, Stephens noted that “techne” comes from the Greek for “knowledge about how to make things,” and that this knowledge is what Prometheus stole when he took fire from the gods and passed it to humans. His punishment for sharing the intellectual property of the gods? He was chained to a rock where a vulture ate his liver daily (assuming your gin-bloated liver weighs a full pound—otherwise, the price is 6 ounces of healthy liver). For Stephens, the moral is clear: the price of any technological know-how is a pound of flesh. In other words, new technologies always come with a price, one that is often hidden or obscured in the hype that accompanies each new advance. Cultural maturity, I would argue, allows for this sort of both/and thinking in lieu of hysterical polarization.

By thinking more dialectically about new media, we could even ask some useful questions: how can we minimize the pound of flesh that we sacrifice to our beautiful new devices? How do we prevent our employers from colonizing our spiffy new devices, sneaking in more and more work obligations where creativity, relaxation, and community might be found? How can we minimize the jolts to our psyche that we experience in a mediascape of constant interruption? How do we bring greater depth to the luminous surfaces of our iPhones and laptops?

I’m not giving up my techno-goodies: they’ll have to pry my iPhone from my cold dead hands. And I’m not volunteering to become an information hermit, vainly shutting out the noise of the world with fingers in my ears. Still, I’m glad that the Amish and other techno-skeptics are out there somewhere in our cultural imagination. Their quiet renunciation reminds us that we could, and perhaps at times should, live without these sleek machines that we find deliriously addictive, pleasurable, maddening, and exhausting. After all, our technology may improve and enliven our lives, but not without a price. We should never forget that pound of flesh.

Image Credits:

1. iPhone Abacus
2. Mad Max reaches for his iPhone
3. Recent interview with Jerry Mander

Please feel free to comment.

  1. As if going to the doctor weren’t bad enough: “As Doctors Use More Devices, Potential for Distraction Grows,” New York Times, December 15, 2011.
  2. Jerry Mander, Four Arguments for the Elimination of Television (NY: William Morrow, 1978), 347, 211.
  3. Pico Iyer, “The Joy of Quiet,” New York Times, January 1, 2012.
  4. See Katie Roiphe’s essay on the Freedom app.
  5. Kevin Kelly, What Technology Wants (NY: Viking, 2010), 231.
  6. Mitchell Stephens, The Rise of the Image and the Fall of the Word (NY: Oxford University Press, 1998).

Comments

  • This polemic against technology is reminiscent of Nicholas Carr’s The Shallows, which casts Web 2.0 technological conveniences as the death knell of the deeply contemplative psyche. An individual can sever the link between short-term and long-term memory with confidence now that a quick Google or Wikipedia search holds all the answers. A cloud-based server system, while convenient, is scary in that we are comfortably moving our data not just out of local storage, but out of our own memories. No longer do we feel the need to focus, to study, to ponder deeply.

    Technological progress is not an upward climb to achieving maximal efficiency. The multiplicity of screens surrounding us and the blitz of cross-platform rich internet applications introduce redundancy along with convenience. Any data can be accessed through several possible avenues, ported between local and cloud storage, and bookmarked or flagged for instantaneous access rather than committed to memory.

    Netflix streaming brought channel-flipping, once reserved for television, to the world of film consumption. I cannot tell you how many times I have started viewing a film and switched over to something else five minutes in after concluding that I was not sufficiently engaged. Perhaps this is why my instant streaming queue is constantly pushing the 500-title limit.

    The part that worries me most is the effect on creativity, as Jerry Mander touches on in his recollection of having nothing to do in a time before television: “that bottom-space of nothing to do was in some ways the root of creativity, because you say ‘I’m going to do something now'”. Flipping on a screen or some other form of rich media consumption, whether passive or active, is now the default. Rarely are we forced to face the abyss, to delve deep into our interior state, or to immerse ourselves in nature free of electronic influence. The problem is evident to the point that several nonprofit organizations have rallied behind this cause. Rather than fight disease, hunger, or other issues critical to human existence, groups such as Campaign for a Commercial-Free Childhood and Unplug and Reconnect focus on efforts such as “Screen-Free Week”.

    Though these efforts are fighting a losing battle against technological progress and mass consumer adoption, they do have a certain cachet. Attention must be paid to each device we own, if for no other reason than to rationalize having paid for it in the first place. Once our attention is divided among a desktop, a laptop, a television, an iPod, a smartphone, an e-reader, a smart TV, and any number of other devices, this fragmentation can quickly result in increased stress rather than convenience.

  • The key, I suppose, is learning the discipline to compartmentalize our spheres of influence – to balance data collection and information sharing with dedicated blocks of introspection and creation.

  • Thank you for this informative, insightful, and relatable commentary. I appreciate (and very much identify with) your acknowledgement of contradictory feelings in the face of modern technological interaction: arriving at an understanding of the excesses of our reliance brings with it a certain trepidation towards sacrificing, or, at least, minimizing, these interactions. Yet, I believe that specific developments in television and the Internet landscapes have the power to “improve and enliven our lives,” without the dramatic price indicated by Mitchell Stephens.

    Imagining the surrender of easy access to information is what inspires the most hesitation in picturing a “systemic abolition” of media devices and technologies. The flow of the Internet, the ease of jumping from site to site, of redirecting thoughts and interests with a mouse click or swipe of the keyboard, seems irreplaceable. Even the simple act of reading this article – itself, a non-technological behavior – was modified by this flow and became increasingly technological, as I clicked on many of the embedded links for further information and clarification.

    Jerry Mander talks about how viewers reside in a “relatively unconscious mode” while watching television, marked by its “non-participatory and accepting” behavioral patterns. This neglects to acknowledge the developments in interactive television practices, not the least of which is viewer participation in programming-related contests, polls, and elections. Furthermore, for defined segments of the population, watching television is a stimulating and (semi-)educational experience, and can be tied back, in many cases, to additional research via online media. For example, in a fourth season episode of Mad Men, “The Chrysanthemum and the Sword,” there is mention of a Dr. Lyle Evans, a reference that, during the airing of that episode, prompted extensive Internet searches and follow-up blogging in order to find the name’s source. Though the placement was ultimately deemed a ruse, an Easter egg planted to mock modern technological behaviors, the very fact that the hoax was successful in getting audiences to engage in these behaviors perfectly illustrates the informative potential of using the Internet in conjunction with television.

    This approach to television and the Internet, appreciating them as tools capable of informing and broadening worldviews, is one way to add “greater depth to the luminous surfaces of our iPhones and laptops.” There are still, undoubtedly, sacrifices being made here, flesh ceded to the technological birds of prey as we lose time for physical exercise, human interaction, and creative output. And yet, we are adding flesh as well: as a mind grows and deepens, some of that matter can be retrieved, perhaps permanently. While I cannot be sure that this is the most significant or successful way to mediate the “mediascape of constant interruption,” I believe it makes good use of this reality and lends agency back to the viewer.

  • I truly enjoyed reading this thoughtful and well-constructed discussion on the relentless grasp of technology. I found it remarkably even-handed, admitting a noteworthy ambivalence (with regard to what the contemporary human relationship with technology is and ought to be) on your part. Like Brian, who commented earlier, I am tempted to advocate “learning the discipline to compartmentalize our spheres of influence – to balance data collection and information sharing with dedicated blocks of introspection and creation.” However, I am concerned that such compartmentalization misses the way in which technology has become so terrifyingly all-consuming. Like you, I will not readily give up my iPhone or other modern technological luxuries. Even the very act of engaging in thoughtful discussion (as provided by Flow) requires the use of technology. What’s more, the very nature of the required technology permits those contributing to the discussion to explore other areas of interest even as they contribute (full disclosure: Espn.com and TheAtlantic.com are both open on my browser right now). This arguably has the potential to allow for greater insight, for the access to this information may permit us to synthesize it all and to ultimately produce very meaningful and insightful work that combines multiple spheres of interest. On the other hand, it is certainly reasonable to assume that such access to these diverse spheres may finally lead us to intellectual paralysis should we choose to never get off the ride of information accumulation. Given this, Brian’s notion of compartmentalization provides us with a simple and promising solution.

    The ability to separate these realms, however, and to function between them, is not, strictly speaking, what concerns me. In your discussion of Mitchell Stephens, you mentioned that techne (knowledge about how to make things) would require a sacrifice of a pound of flesh. Emma, in her response, mentioned that flesh is possibly being added as well. I do not argue this point. However, I do wonder about what we are really losing (or have already lost) and what we may be adding in its stead. It is important to remember that techne was often held in opposition to episteme, true knowledge disinterested in any final product (the degree of opposition, and the relationship between the two, of course varied from philosopher to philosopher). What I wonder, what truly concerns me, is the extent to which technological fetishization has stunted epistemological investigation. This is not to say that the two must be mutually exclusive, but rather that knowledge of craft seems to have supplanted pure knowledge. Again, while I think technology can have positive effects on our ability to think, I am increasingly concerned about technology’s conversations with itself. That is to say that media is becoming increasingly self-reflexive, working within a world of its own creation. In many ways, technology has assumed the role of two mirrors facing one another, so that the foundation of the reflection and the reflection itself are entirely inseparable. It is an iPhone within an iPhone within an iPhone within an…

    While this is likely coming across as a diatribe against technology, I want to stress that I do not believe that technology in itself is to blame. Rather, the duty to ensure that technological concern does not eradicate disinterested thought lies with the users of technology. Much like fire, technology is neither good nor bad; invoking Don Draper, this “change simply is.” Prometheus may have given man fire, but it is up to man to ensure that fire is used to light the dark and not burn the flesh. If it is the latter, then the sacrifice will certainly be much greater than the pound of flesh discussed by Stephens. If it is the former, then perhaps the flesh lost can be replaced, stronger and more durable than it may have originally been.

