
Category Archives: Communication

Remembering Sam Becker and university citizenship

Sam Becker, for whom the University of Iowa Department of Communication Studies building is named, and whose six-decade career was highly accomplished, passed away on November 8.  While I was a doctoral student at Iowa in the 1990’s, Becker was already retired but still ever-present, and by the sheer randomness of graduate student office assignment two colleagues and I ended up right across the hall from his small retirement office.  He was often available for conversation and also extraordinarily productive, a professor who gave the lie to the all-too-popular idea, currently dominant in the humanities and some of the social sciences, that the only way to get real research done is to work in seclusion, either behind a closed office door or, even better, from the house.  Looking down hallways of closed faculty office doors – and I mean this as no insult, since Iowa is typical, I think, of today’s academic culture – I was always struck by the fact that the most open and accessible professors – Gronbeck, Ochs, Becker – were also among the most research productive.

By my time in the program, Dr. Becker was only occasionally teaching, but he taught one of my very first classes, a one-credit-hour professionalization seminar that met, as I recall it, for only about 45 or 50 minutes a week.  We were coached on the protocols of academic citation, taught the mechanics of the main communication associations (Sam’s lifelong commitment to the National Communication Association meant we heard most about that one), and we talked about how one best organizes one’s research work.  I believe it was there that he told the story about how he first got hooked on academic scholarship in communication.  He was a young undergraduate at the university in the 1940’s and was encouraged to turn a classroom paper into a publication, which he landed in one of the leading outlets for communication research.  Over the next sixty years he produced more than 100 peer-reviewed research essays, advised nearly 70 doctoral dissertations, and won wide acclaim for his work (NCA president, NCA Distinguished Scholar, recipient of the association’s first mentorship award, and many more, including a number of honors on the Iowa campus that include the rare privilege of a namesake building).

Professor Becker’s death evokes in me, against my better judgment perhaps, a nostalgic desire for a sort of academic culture that likely no longer exists.  The temptation to nostalgia when it comes to the academic past is fraught.  Even as the American public universities threw open their doors and programs in the 1960’s and 70’s, they were far from perfect, and the political constraints under which professors work today are in some respects incomparably different.  And universities look a lot different through the eyes of a professor than they do through the eyes of a graduate student.  It is easier to imagine public university work as a sort of exotic salon culture, the pure life of the mind where professors think great thoughts in communion with their colleagues, when one’s schedule, overloaded as graduate student life always is, consists of one intellectual interaction after another, seminar to seminar and great book to great book.  The academic life performed for graduate students, indeed for all students, is simply not the same as the one lived in a profession as dominated by committee meetings as by discussions of big ideas.  Comparisons between past and present too often fail.

But my nostalgia lingers.

Sam Becker represented a style of academic life, and an extraordinary commitment to building local programmatic excellence, that is harder to find today (and in my world so infrequent as to be essentially nonexistent), living as we do at a time when many professors understandably draw their main intellectual sustenance from longer-distance networking – social media, blog- and listserv-centered – and find themselves too informationally enriched (or, alternatively, overstimulated) and even overwhelmed by those gushing sources to desire anything but minimal face-to-face engagement with on-campus colleagues.  Part of this, I believe, is the characteristic affinity of rather-shy-and-life-of-the-mind-driven academics for the more controllable interactions of online and distance encounter; it is easier to present a polished, more clever persona through Facebook and blogging than in the heat of a tedious faculty meeting, and so many gravitate to the New Comfort Zones of virtual engagement.

Entire academic generations have been mentored to the view that their most assured path to professional success is isolation – keep your head down, don’t overcommit, set up a home office and be disciplined about working there, spend as few hours on campus as you can, because if word gets out that you’re available then colleagues and students will eat you alive and rob you of all your productive energy.  This advice is self-reinforcing: when one resolves to spend only ten hours a week on campus, then, not surprisingly, those ten hours quickly fill to capacity as students figure out they are the only opportunities for real access not coordinated by email.  The approach affords little time to linger, for lingering is time wasting.  “Sorry!  I’m drowning; gotta run!” becomes an easy refrain.

All this is understandable and not unreasonable.  I’m as prone to the temptations as anyone.  The seductive blend of intellectual (over)stimulation, where ideas can be consumed at any pace one prefers, and staged (or scripted) encounters managed from the comfort of the computer desk chair, can simply feel more enriching than sitting through a long research presentation or a comprehensive examination defense.

Donovan Ochs, a Becker colleague at Iowa, and Sam Becker, both veterans of military service (I have the sense that had something to do with it), put in pretty regular daily schedules.  Ochs, with whom I had the chance to study classical rhetoric and do an independent study on Aristotle, often put in 8-to-5 days.  As I recall it, Donovan wore a tie every day, even in the 1990’s when few others did, and his door was always open apart from times when he was in private meetings or teaching.  When I asked him once how he got any work done under those conditions, he was plainly surprised at the question, and his reply – what I do here is my work – led to wider conversations about academic life.  Ochs noted that an open-door policy did not prevent his research productivity, since the mornings typically gave him many undisturbed hours to write and think.  His door wasn’t open to enable empty chit-chat – he was always profoundly encouraging but kept conversations mainly work focused.  And because he worked, seriously worked, for the duration of the regular day, he avoided the guilt so many of us feel at thinking we should be working at all hours of the night.  I always had the sense Ochs went home with a clean conscience – he had a life apart from Aristotle, a healthy set of diverse family and life interests, and retirement presented no apparent trauma for him.

It is simply impossible to generalize about the state of faculty engagement given the diversity of campus environments, and unproductive even to try.  There remain, of course, more pastoral campus settings where relatively smaller student cohorts and perhaps better-supported faculty lives enable the creation of close intellectual community that at some level still adheres to the wider mythology of enclaved campus life.  But life for students and professors at the big state universities, and I suspect even in places where campus life is residential and intensively communal, is changing.  If the National Survey of Student Engagement is to be trusted, students report less frequent conversations with their classroom professors outside of regular class times.  Michael Crow, the president of Arizona State University and a key (and controversial) national advocate for delivering high-quality and research-intensive educational outcomes to very high numbers of enrolled students (ASU is now one of the nation’s largest universities), often repeats the idea that demographic surges require a model of education that is not numerically exclusive (rejecting the backward logic by which the more people a school turns away, the better its reputation).  If public state institutions cannot find ways to serve well the students who are academically gifted but not financially or intellectually elite enough to enroll at the most exclusive private schools, Crow often says, we’ll end up with a two-tiered system where the rich kids are educated by professors and the rest are educated by computers.

The truth is that the big public universities are fast veering in the latter direction, not in the sense that MOOCs educate our students but that the experience, especially in the first couple of years, can be awfully impersonal, if not on account of large classes then because so many early classes are taught by graduate students and temporary faculty whose good teaching may nonetheless insufficiently convey a sense of place and local intellectual tradition.  The wider incentive structures are too often negative:  no pay raises, the demoralized sense that follows from more frequently expressed taxpayer hostility to higher education, the pressures to win grants and relentlessly publish papers, accountability pressures that seem to require more and more administrative meetings, the idea that one must always stay on the job market or likely never see a raise at home, the growing number of students and in some states the expectation of higher instructional workloads, a tendency to think of day-to-day intellectual connectivity as simply more uncompensated service.  All this lures professors away from the committed work of building local loyalty and into more defensive practices that feel like simple self-preservation but are also, I suspect, self-defeating, because they only accelerate a vicious cycle of brief and highly focused teaching and mentorship alternated with long stretches away.  Participate in a sustained reading group?  Sorry, I just don’t have any time for that.  Organize a campus colloquium, film or lecture series?  Ditto.  And since everyone else is overwhelmed too, what would be the point?  No one would come.  Did you see the lead essay in the new QJS?  I’m curious what you thought.  Gosh, I’m months behind on that kind of reading – all my energy has to go to my book.  What energized you at the last big national conference?  Oh, couldn’t make it – and how could I when the university gives so little for professional development support?

The picture I’ve just drawn is exaggerated, thankfully, but I suspect that even as caricature it retains a certain familiarity.  Fortunately the energetic participation new faculty bring to academic programs is inspirational, and idealism trumps low morale for the many staff and faculty who sustain both distance-networked and local connectivity.  Whatever the incentives, every department includes professors at all ranks who pour their energies into building real collective intellectual communities.  It might also be added that the struggle I’m describing may be most accentuated in the humanities, where the norms of academic research are only slowly shifting away from the lone-professor-writing-her-latest-book to practices of team-based interdisciplinarity.  The beneficial consequences of globally networked disciplinary conversation arose for important reasons – the generation of new knowledge is more dynamic than ever before in human history, even despite data suggesting that (at least in communication) the research work is increasingly localized in smaller numbers of publishing faculty (a recent analysis in speech communication showed that something like 85% of all professors had not published anything or received external support for their projects in the previous five years).  But I wonder whether the number of high-productivity and communally engaged scholars can be sustained when their morale is under assault too, because the dynamics induced by this style of mentorship and by reduced support bring into ever-starker relief the old 20/80 rule, where 20% do 80% of the work.  As 20/80 becomes 10/90, this is how intellectual dynamism, and universities, die.

Sam Becker’s career suggests a thought experiment that asks whether the considerable benefits of 21st-century intellectual life can be improved by some integration of the professional practices of the 20th.  I want to hypothesize that what so often seems like the depressing path of today’s stressed system of public higher education need not necessarily be accepted as a New Normal.  If public higher education is to retain its historical vitality, changes will have to happen on many fronts.  Taxpayers and legislators will need to be persuaded of public education’s value.  Reasonable systems of accountability will need to document the outcomes of pedagogical encounter, I know.  But there is a role for us faculty to play as well, and Sam Becker’s professional life suggests some of the possibilities.  Becker knew that good and committed scholars who simply show up day after day and make themselves open to engaged discussions with others, both online and in person, attract other smart students and teachers to join in ways that energize the common enterprise, and that calling it quits at the end of the workday creates intellectual sustainability too, as people find time away every single day to recharge.  He saw, because he so often created it himself, the vital and passionate sense of connection that emerges as intelligent participants in the educational experience talk to each other and rev up excitement about ideas one discussion at a time.  He realized that when everyone is present and engaged in program building, service work is made more manageable by division among larger numbers of connected co-workers.  I cannot prove it, but my suspicion is that the great intellectually vital centers of communication scholarship were (are) built more fully by acts of local loyalism than by enterprising free-agent academic nomadism.

The key is not simply hallway cultures of greater presence; such cultures must also entail high degrees of intellectual openness, a refusal to see the scholarly enterprise as ideational warfare or zero-sum, even in contexts where resourcing is finite.  And this was another of the Becker legacies.  During his five decades in the department, communication studies nurtured, developed, and then in some cases spun off new academic units, including theater and film.  Those discussions were not always smooth or even friendly, and Becker had strong opinions.  But what he always preserved, as I saw it, was a culture of openness to new and productive work – it led him to shift over his own career from interests in quantitative social science to the qualitative research of British cultural studies and then back again.  No department is ever entirely free of intellectual entanglements – smart people will tend always to prefer their own lines of inquiry and can too easily fail to see the value of the efforts undertaken by others.  But so long as there are some Beckers around, the inclinations to either/or warfare that have consumed whole programs in acrimony can be channeled productively into both/and collective accomplishment.

Fantasies, perhaps.  But these are ideas whose lifelong embodiment in one Samuel L. Becker – Eagle Scout, decorated war hero, “Mr. University of Iowa,” champion of social justice and the idea that public education enriches us all, extraordinary teacher and scholar and administrator – remain for me compelling, even given the New Normals of this new century.

When the map seems larger than the territory

On one of the websites for students of rhetorical theory, conversation has recently focused on the status of psychoanalytic criticism and the question of whether its insights are being willfully ignored by the larger field.  Josh Gunn kicked off the discussion, in part, by noting that despite recent interest, “rhetorical theory — at least on the communication [studies] side – is hampered by a certain blind spot caused by the avoidance of psychoanalysis, and more specifically, the inadmissibility of the category of the unconscious.”  Gunn rightly wonders at the absurdity of this given how many revered figures in rhetorical theory have been explicitly influenced by or have reacted against Freud, Lacan, Klein, Jung and others.

In the ensuing back-and-forth a range of perspectives have been expressed:  some writing to agree that psychoanalysis does seem to provoke unique antipathy from students assigned to encounter it, others speculating on the causes (is it because communication was more a journal than a book field?  did the discipline’s work in response to behaviorism inoculate scholars against its insights?  has psychoanalysis been more widely tainted, thus deterring investigation from the outset?), and so on.  Perhaps not surprisingly, some of the explanations veer to the therapeutic – several responses convey anecdotes of a visceral (and by implication anti-intellectual) refusal to take psychoanalytic work seriously:  sneering senior scholars, wink-wink-nudge-nudge sorts of boundary policing behavior, and the (not-so-)subtle steering of graduate students away from the theoretical insights of psychoanalysis.

As I’ve been thinking about all this I don’t find myself in particular disagreement, except that I don’t think this phenomenon is unique to psychoanalysis.  Rather, I think what we are seeing are the ongoing consequences of theoretical hyper-specialization, of which this is simply one of many local occurrences.  In contrast to those who continue to announce the Death of Theory, it seems to me that we are still working to live with the consequences of having at our disposal So Many Seriously Elaborated Theories, which in turn gives rise to a mostly frustrating situation where the maps seem richer, or at least larger, than the territory.

I do not note this to endorse hostility to the elaboration of theoretical sophistication, but simply to note how glutted we are with it.  I think the symptoms of this, at least in communication studies, are everywhere:  A more ready willingness to abandon the mega-conferences in preference for more intimate niche meetings where one can burrow in and keep up.  The tendency to assign secondary sources even in doctoral seminars, or, when primary works are struggled with, to isolate them from the conversations in which they participated (which results in an alternative tendency to see originary controversies, such as the big arguments between Foucault and Sartre, or Fraser and Habermas, as pretty much settled history to be filed in the same category with “the earth is round”).  A growing impatience with arm’s-length review, peer review, and the other gate-keeping work undertaken in the effort to make comprehensible the incoming ocean of new material, and a more widespread corrosive cynicism about the larger enterprise.  The increasing frequency of major conference presentations, even by serious senior scholars, that don’t seem to say much of anything new but mostly offer a repetition of the theoretically same.  An inclination to see friendly work as that which fully appreciates the rich nuance of my own tradition, and hostile work as that which reduces my tradition to caricature.  A wider tendency to see the dissertation not as evidencing a student’s ability to undertake a serious research project, but as an indication of the project whose trajectory will forever define a career.

Another key marker is the level of defensiveness, sometimes veering into animus, I often hear expressed by the advocates of every perspective who feel their work is under siege:  Marxist theory, argumentation studies, close textual analysis, historical/archival work, postcolonial and queer theory, cultural studies, feminist scholarship, and the list could be considerably lengthened.  All feel under attack and to some extent sustain intellectual solidarity by insisting enemies are at the gate.  And within these traditions fragmentation continues apace – a longstanding theme in a number of the convention conversations I hear is how scholars who for many years have labored to make visible the cultural contributions of gays and lesbians see themselves as today marginalized by queer theory, and in turn how queer theory seems to be marginalizing bisexual and transgendered approaches.  This is a theme not limited to rhetorical studies but one more widely sensed within the broader inquiry of communication scholars:  the television studies people feel like they aren’t taken seriously, and so do the performance theorists, the cinema studies scholars, the interpersonal researchers, the quantoids, the public opinion theorists, those who first encountered communication through forensics or theater, the TV and film production faculty, ethnographers, organizational communication scholars, mass communication empiricists, public relations practitioners, and those who teach students for industry work.

As my career has taken me in the direction of administrative work, I see the same trends more widely as they shape conversations within the humanities and beyond.  When I first had the audacity in a meeting of chairs from the full range of disciplines to say that external resources are harder to find in the humanities – I thought everyone agreed with that – I was surprised that the most assertive push-back came from a colleague in biology, who was there to argue in detail his relative deprivation within the wider university.  His case was not absurd:  it is hard to argue anyone is properly supported in the modern public research university.

I don’t see this defensiveness as a reflection of bad faith or of animus.  For in a sense all of us are right – one does have to exercise eternal vigilance in defending one’s research perspective, because in a universe of so many well-elaborated accounts of human behavior the most likely danger is being forgotten or overshadowed given the broader cacophony.  Thus the paradox that while more journals are now published in the humanities than ever before, the individual researchers I talk with see fewer and fewer outlets available for their sort of work.  Or, to further mix the metaphors, there are not only more intellectual fortresses today, but they are better fortified against attack and protected against the wandering tourist and amateur dabbler than ever before.

It is true, I suppose, that within each theoretical community are some who treat, say, Anti-Oedipus or Lacan’s seminars or the Prison Notebooks or the Rhetoric as holy scripture.  But the issue is less that each theorist has induced a cult than that, in general, scholars who are otherwise persuaded they cannot possibly know every perspective well tend to stick with the one rich approach into which they were first acculturated.  And so what was and is seen by some as a sort of happy theoretical pluralism, a view still promoted by the wider impulses to boundary-cross and be interdisciplinary and all the rest, has devolved into a more frequently expressed surliness about colleagues who “won’t do the work to stay current,” a wider reliance on secondary sources like the Dummies guides and Cambridge Companions, the more frequent play (in responding to outsider critics) of the “you don’t appreciate the subtlety of my theory when it comes to ___” card, and an even more common resort by the basically friendly to the tactic of heavy-note-taking silence or the helpful “you should really read [insert my theorist],” or, more generally, “have you thought about this?” conference response or query.  One of the most common questions I hear my colleagues ask of one another is one I often ask myself:  “If you could recommend three or four short and accessible overviews to ____ that would help me get up to speed, what would you suggest?”  It’s asking for an academic life preserver.

Less of all this is sparked by ill will or ideological refusal than by the simple unwillingness to confess “I am unable to offer a thoughtful response to your read of Ranciere because I didn’t know he would be discussed today and so I didn’t have the chance to bone up on my Ranciere for Dummies, and because it takes every minute I have available for intellectual work just to keep up on my Burke.”  The eye-rolling response is thus sometimes less reflective of substantively well-grounded opposition than the expression of a weirdly humble recognition of the game we think everyone is playing:  the gotcha stratagem of “there s/he goes again, showing off everything s/he knows about Cicero.”  At a time when credible humanistic research is said to be impossible apart from mastery of all social theory, all of the philosophical and aesthetic traditions, (increasingly) the life sciences (cognitive theory, evolutionary psychology, accounts of chaos and networks and more), and the globalized set of artifacts that underwrite comparative work, the task seems overwhelming.

My point is not to be alarmist or defeatist about the enterprise.  Specialization is not new, and it has elicited expressions of concern for generations.  To some extent the theoretical proliferation is self-correcting – people participating in a bounded academic conversation do move on, and not every carefully enunciated perspective finds a following.  There remain exceptionally skilled intellectuals who seem to know everything and who are apparently able to keep up with all the wider literatures.  And too often the expressed difficulties in “keeping up” exaggerate the challenge in an age when more resources than ever are available to enable one’s informational literacy, and when “I don’t have the time to understand [feminist] [critical race] [queer] theory” is a too-convenient excuse to ignore perspectives that elites brushed off even when Renaissance Giants Walked the Earth and only had to stay current on the sum of human knowledge contained in fifty books.

And because the challenges of surfing the sea of new literature and getting others interested in one’s work are by now so universal, I have little to add to the range of problematic correctives.  The idea of reinstating a common canon holds little appeal, and for good reason.  Nor is announcing the Death of Theory, or insisting on the Priority of the Local or the Case, especially compelling.  My own preference, given a background in debating, is to “teach the controversies,” but that approach isn’t ideologically innocent either.  If book publishers survive, I think the impetus to anthologies that now characterizes cinema studies is likely to expand more widely within communication scholarship.  But there are dangers in too readily steering doctoral students toward hyper-specialization, in paper writing revved up out of fast tours of JSTOR and Project MUSE, and in too quickly acceding to happy talk about theoretical pluralism.  Better, in our own intellectual labors, to listen insistently to and reach out to other perspectives and work like hell to keep up with the wider world of humanistic scholarship.

And sometimes, if only as a mechanism to preserve one’s sanity, a little eye rolling may also be in order.  Just keep it to yourself please.

The future of globalized literary history

A 2008 special issue of New Literary History (vol. 39) focuses on the future of literary history (and, relatedly, comparative literary studies) given globalization.  To some extent one can track the complicated history of World Literature through the early and influential essays of Rene Wellek, who advocated for comparative scholarship even as he warned against the dangers of investing disciplinary energy in the search for covering laws and causal relationships between literature and the wider society.  The titles of Wellek’s much-cited 1958 talk, “The Crisis of Comparative Literature,” and his 1973 essay “The Fall of Literary History” convey some sense of his pessimism about the prospects for defensible work.

Of course the very term World Literature has to be carefully used since one must always demarcate the multiple possibilities implied by the phrase.  Some use World Literature to reference all the literature produced in the world, some see it as referring to Kant and Goethe’s dream (Goethe in 1827:  “a universal world literature is in the process of being constituted”) of an international body of transcendently superb literature, and still others to reference those few novels that have found truly international fame.  And so some who are invested in comparative work today, often undertaken to throw American cultural productions into a wider perspective of circulation and resistance, prefer terms like transcultural literary history (Pettersson).  In the context of the theoretical care one must take even to begin this kind of work (the complications of which are unwittingly revealed in Walter Veit’s summation of Linda Hutcheon’s call for a “new history of literature” which “has to be constructed as a relational, contrapuntal, polycommunal, polyethnic, multiperspectival comparative history”), the project remains inherently appealing:  who would oppose the idea of research that induces cross-cultural sensitivity and understanding, even realizing its final impossibility?

Transcultural literary studies buzzed along for decades, at least since the 1950’s, when the International Comparative Literature Association first met to debate the potential for doing literary historical work, but new attention has been given to the field thanks to two much-discussed interventions:  Franco Moretti’s essay, “Conjectures on World Literature” (which forms the anchor for a 2004 anthology on Debating World Literature released by Verso and which formed a sort of introduction to his widely read book a year later), and Pascale Casanova’s The World Republic of Letters (trans. M.B. DeBevoise; Cambridge, Mass.:  Harvard UP, 2004).  Moretti’s work has gotten a lot of attention given his heretical view that the sheer quantity of the world’s literature, which now escapes the possibilities of close textual analysis, requires distant reading, which is to say sophisticated forms of macro-data analysis that can reveal patterns of novelistic diffusion worldwide.
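To make “distant reading” slightly more concrete, here is a minimal sketch of the idea in its loosest sense – aggregate simple counts across a whole shelf of texts rather than interpreting any one of them closely.  This is my own illustrative toy, not Moretti’s method (his analyses are far more sophisticated), and the corpus directory and keyword list are hypothetical.

```python
# A toy "distant reading" pass: count a few thematic keywords across an
# entire corpus instead of interpreting any single text closely.
# The novels/ directory and the keyword list are hypothetical placeholders.
from collections import Counter
from pathlib import Path
import re

def keyword_profile(text: str, keywords: set[str]) -> Counter:
    """Count occurrences of the selected keywords in one text."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return Counter(t for t in tokens if t in keywords)

corpus_dir = Path("novels")                  # hypothetical folder of plain-text novels
keywords = {"empire", "nation", "market"}    # hypothetical thematic markers

profiles = {
    path.name: keyword_profile(path.read_text(encoding="utf-8"), keywords)
    for path in sorted(corpus_dir.glob("*.txt"))
}

for name, counts in profiles.items():
    print(name, dict(counts))
```

Even something this crude gestures at the shift in scale Moretti has in mind:  the unit of analysis becomes the pattern across hundreds of texts, not the individual text.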

But things get tricky fast.  Fredric Jameson, who leads off and who has long expressed skepticism about the work of literary historians (noting in an address to a 1984 Hong Kong conference on Rewriting Literary History that “few of us think of our work in terms of literary history,” and having subsequently called repeated attention to the essentially ahistorical nature of postmodernity), argues that the dialectical impulses of economic globalization simultaneously promise cultural liberation even as the economic chains are slowly tightened, and in ways that finally limit the range of cultural productions as well.  To be concrete, Jameson highlights how global capital appears to open all cultures to all populations, even as, over time, a shrinking number of transnational conglomerates end up ultimately stifling all but the handful of mainly English-language novels able to turn a profit.  He is especially keen on Michael Mann’s argument that the global economy is “encaging” – that is, as Jameson describes it, “the new global division of labor is” organized so that “at first it is useful for certain countries to specialize…. Today, however, when self-sufficiency is a thing of the past, and when no single country, no matter what its fertility, any longer feeds itself, it becomes clearer what this irreversibility means.  You cannot opt out of the international division of labor any longer” (376).

The cage ensnares more tightly – and not only because “smaller national publishers are absorbed into gigantic German or Spanish publishing empires,” but because a handful of mega-publishers end up publishing all the textbooks kids read even as budding authors everywhere are subtly persuaded to buy in because of their “instinctive desire to be read by the West and in particular in the United States and in the English language:  to be read and to be seen and observed by this particular Big Other” (377).  So what are literary historians to do that will not invariably make them simply complicit in all this?  Jameson, a little bizarrely I think, argues for a sort of criticism that imagines the world-making possibilities of novels-yet-unwritten-that-one-imagines-as-ultimately-failing-to-liberate.  This sort of creative criticism “raises the ante,” according to Jameson, because it helps its audiences recognize the actual “persistence, if insufficiently imagined and radicalized, of current stereotypes of literary history” (381).

Brian Stock, at the University of Toronto, reads the current scene from within the larger new traditions of developmental and cognitive psychology and cognitive neuroscience.  What work done in these areas suggests is that reading has a profound cognitive (and universal) influence on human beings, whose plastic minds are essentially reconfigured by repeated participation in practices of literacy.  As Stock sees it, “the only way in which reading can be related to the ubiquitous problem of globalization in communications, without running the risk of new types of intellectual colonization, is by demonstrating that it is in the genetic inheritance for interpreting language in its written or ideographic form that is the truly ‘global’ phenomenon, since it is potentially shared by everyone who can read…  [I]f this approach can be agreed upon, the natural partner of globalization will become a scientifically defended pluralism” (406).

Walter Veit, at Monash University, sees the interpretive key as residing in temporality, which can never be linguistically articulated (Paul Ricoeur:  “temporality cannot be spoken of in the direct discourse of phenomenology”) except in novelistic narrative, where the arc of the narrative makes some sense of time’s passage and where, following Hayden White, the linguistic operations of rhetorical tropes and figures provide metaphorical access to the otherwise inexpressible.  One is left with a more sanguine sense of the future within these terms:  both for an analysis of the multiple ways in which the world’s literatures construct time and its passing, and with respect to literary criticism, which is always embedded in the particular and always-changing practices of its time and audiences.  Such a view is well supplemented by Nirvana Tanoukhi’s claim that the challenge of understanding transnational literature is also foundationally a question of scale and locale and spaces of production.

The work of literary history, and the conceptualization even of its very possibility, is, finally,  a representative anecdote for the broader work of the humanities.  This is a theme apprehended both by Hayden White, who notes that the questions raised in the symposium reflect the larger conditions of historical knowledge as such, and by David Bleich, who notes the close affinity between the work of literary historians and the broader work of the university (where “scholars have always been involved in the mixing of societies, the sharing of languages and literatures, and the teaching of their findings and understandings,” pg. 497).  The university plays a culturally central role in translating other cultures (for students, for the audiences of its research) that is fraught with all the perils of the work of writing intercultural history – hubris, caricature, misapprehension.  But the effort to make sense of the wider world, however risky, is also indispensable, if only because the alternatives – unmitigated arrogance and blinkered ignorance – are so much worse.

An approaching Singularity?

When Ray Kurzweil published his bestseller, The Singularity is Near, in 2005, the skeptical response reverberated widely, but his track record when it comes to having made accurate predictions has been uncanny.  In the late 1980’s it was Kurzweil who anticipated that soon a computer could be programmed to defeat a human opponent in chess; by 1997 IBM’s Deep Blue was beating Garry Kasparov.  His prediction that within several decades humans will regularly assimilate machines to the body seemed, as Michael Skapinker recently put it, “crazy,” “except that we are already introducing machines into our bodies.  Think of pacemakers – or the procedure for Parkinson’s disease that involves inserting wires into the brain and placing a battery pack in the chest to send electric impulses through them.”

Kurzweil obviously has something more dramatic in mind than pacemakers.  The term singularity describes both the center of a black hole, where the universe’s laws don’t hold, and that turning point in human history where the forward momentum of machine development (evolution?) will have so quickly accelerated as to outpace human brainpower and arguably human controls.  For Kurzweil the potential implications are socially and scientifically transformational:  as Skapinker catalogs them, “We will be able to live far longer – long enough to be around for the technological revolution that will enable us to live forever.  We will be able to resist many of the diseases, such as cancer, that plague us now, and ally ourselves with digital versions of ourselves that will become increasingly more intelligent than we are.”

Kurzweil’s positions have attracted admirers and detractors.  Bill Gates seems to be an admirer (Kurzweil is “the best person I know at predicting the future of artificial intelligence”).  Others have criticized the claims as hopelessly exaggerated; Douglas Hofstadter admires elements of the work but has also said it presents something like a mix of fine food and “the craziest sort of dog excrement.”  A particular criticism is how much of Kurzweil’s claim rests on what critics call the “exponential growth fallacy.”  As Paul Davies put it in a review of The Singularity is Near:  “The key point about exponential growth is that it never lasts.  The conditions for runaway expansion are always peculiar and temporary.”  Kurzweil responds that the conditions for a computational explosion are essentially unique; as he put it in an interview:  “what we see actually in these information technologies is that the exponential growth associated with a particular paradigm… may come to an end, but that doesn’t stop the ongoing exponential progression of information technology – it just yields to another paradigm.”  Kurzweil’s projection of the trend lines has him predicting that by 2027, computers will surpass human intelligence, and by 2045 “strictly biological humans won’t be able to keep up” (qtd. in O’Keefe, pg. 62).
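The shape of the disagreement over the “exponential growth fallacy” can be seen in a toy calculation.  Here is a minimal sketch, with every constant an illustrative assumption of mine rather than a figure from Kurzweil or his critics, of how extrapolating a fixed doubling time yields a crossover date – the very step Davies doubts can be sustained:

```python
# Toy extrapolation: assume machine capacity doubles on a fixed schedule and
# compare it against a fixed (assumed) estimate of human-brain capacity.
# Every constant below is a hypothetical placeholder, not a sourced figure.

machine_ops = 1e15      # assumed machine capacity today, operations per second
human_ops = 1e16        # assumed rough human-brain equivalent
doubling_time = 1.5     # assumed years per doubling
year = 2009.0

while machine_ops < human_ops:
    machine_ops *= 2
    year += doubling_time

print(f"Crossover under these assumptions: around {year:.0f}")
# Davies's objection is that doubling_time never stays constant this long;
# Kurzweil's reply is that when one paradigm saturates, another takes over
# and keeps the exponential curve going.
```

Whether a date like 2027 or 2045 seems plausible thus turns entirely on whether the assumed doubling schedule can be extended that far into the future.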

Now Kurzweil has been named chancellor of a new Singularity University, coordinated by a partnership between NASA and Google.  The idea is simultaneously bizarre and compelling.  The institute is roughly modeled on the International Space University in Strasbourg, where the idea is to bring together Big Thinkers who can, by their interdisciplinary conversations and collaboration, tackle the impossible questions.  One wonders whether the main outcome will be real research or wannabe armchair metaphysical speculation – time will tell, of course.  NASA’s role seems to be simply that it has agreed to let the “university” rent space at its Moffett Field Ames Research Center facility in California.  The money comes from Peter Diamandis (X Prize Foundation chair), Google co-founder Larry Page, Moses Znaimer (the media impresario), and tuition revenue (the nine-week program is charging $25,000, scholarships available).  With respect to the latter the odds seem promising – in only two days 600 potential students applied.

The conceptual issues surrounding talk of a Singularity go right to the heart of the humanistic disciplines, starting with the manner in which it complicates anew and at the outset what one means by the very term human.  The Kurzweil proposition forces the issue by postulating that the exponential rate of information growth and processing capacity will finally result in a transformational break.  When one considers the capacity of human beings to stay abreast of all human knowledge that characterized, say, the 13th century, when Europe’s largest library (housed at the Sorbonne) held only 1,338 volumes, and contrasts that with the difficulty one would encounter today in simply keeping up with, say, research on William Shakespeare or Abraham Lincoln, the age-old humanistic effort to induce practices of close reading and thoughtful contemplation can seem anachronistically naive.

One interesting approach for navigating these issues is suggested in a 2007 essay by Mikhail Epstein.  Epstein suggests that the main issue for the humanities lies less in the sheer quantity of information and its potentially infinite trajectory (where, as Kurzweil has implied, an ever-expanding computational mind finally brings order to the Universe) than in the already evident mismatch between the finite human mind and the accumulated informational inheritance of humanity.  Human beings live for a short period of time, and within the limited timeline of even a well-lived life, the amount of information one can absorb and put to good use will always be easily swamped by the accumulated knowledge of the centuries.  And this is a problem, moreover, that worsens with each generation.  Epstein argues that this results in an ongoing collective trauma, first explained by Marxist theory as inducing both vertigo and alienation, then by the existentialists as an inevitability of the human condition, and now by poststructuralists who (and Epstein concedes this is an oversimplification) take reality itself “as delusional, fabricated, or infinitely deferred” (19).  Epstein sees all this as evidencing the traumatizing incapacity of humans to comprehend in any detailed way their own collective history or thought.  The postmodern sensibility is revealed in such aesthetic traditions as Russian conceptualism, “which from the 1970s to the 1990s was occupied with cliches of totalitarian ideology,” and which “surfaced in the poetry and visual art of Russian postmodernism” in ways “insistently mechanical, distant, and insensitive” (21).  There and elsewhere, “the senses are overwhelmed with signs and images, but the intellect no longer admits and processes them” (22).

The problem to which Epstein calls attention – the growing gap between a given human and the total of humanity – is not necessarily solved by the now well-established traditions that have problematized the Enlightenment sense of a sovereign human.  In Epstein’s estimation, the now-pluralized sense of the human condition brought into being by multiculturalism has only accentuated the wider social trends to particularization and hyper-specialization:  the problem is that “individuals will continue to diversify and specialize:  they will narrow their scope until the words humans and humanity have almost nothing in common” (27).

The wider work on transhumanism and cyborg bodies reflects a longer tradition of engagement with the challenge posed by technological transformation and the possibilities it presents for physical reinvention.  At its best, and in contrast to the more culturally salient cyborg fantasies enacted by Star Trek and the Terminator movies, this work refuses the utopian insistence in some of the popular accounts that technology will fully eradicate disease, environmental risk, war, and death itself.  This can be accomplished by a range of strategies, one of which is to call attention to the essentially religious impulses in the work, all in line with long-standing traditions of intellectual utopianism that imagine wholesale transformation as an object to be greatly desired.  James Carey used to refer to America’s “secular religiosity,” and in doing so followed Lewis Mumford’s critique of the nation’s “machano-idolatry” (qtd. in Dinerstein, pg. 569).  Among the cautionary lessons of such historical contextualization is the reminder of how often thinkers like Kurzweil present their liberatory and also monstrous fantasies as inevitabilities simply to be managed in the name of human betterment.

SOURCES:  Michael Skapinker, “Humanity 2.0:  Downsides of the Upgrade,” Financial Times, 10 February 2009, pg. 11; Mikhail Epstein, “Between Humanity and Human Beings:  Information Trauma and the Evolution of the Species,” Common Knowledge 13.1 (2007), pgs. 18-32; Paul Davies, “When Computers Take Over:  What If the Current Exponential Increase in Information-Processing Power Could Continue Unabated,” Nature 440 (23 March 2006); Brian O’Keefe, “Check One:  __ The Smartest, or __ The Nuttiest Futurist on Earth,” Fortune, 14 May 2007, pgs. 60-69; Myra Seaman, “Becoming More (Than) Human:  Affective Posthumanisms, Past and Future,” Journal of Narrative Theory 37.2 (Summer 2007), pgs. 246-275; Joel Dinerstein, “Technology and Its Discontents:  On the Verge of the Posthuman,” American Quarterly (2006), pgs. 569-595.

Remembering Harold Pinter

Several of the obituaries for Harold Pinter, the Nobel Prize-winning playwright who died on Christmas Eve, see the puzzle of his life as centered on the question of how so happy a person could remain so consistently angry.  The sense of anger, or perhaps sullenness is the better word, arises mainly from the diffidence of his theatrical persona, the independence of his best characters even, as it were, from himself, and of course his increasingly assertive left-wing politics.  The image works, despite its limitations, because even as cancer left him gaunt in recent years he remained active and in public view, becoming something of a spectral figure.  And of course many who were not fans of his theatrical work (from the hugely controversial Birthday Party, to the critically acclaimed Caretaker, and then further forays into drama and film) mainly knew him through his forceful opposition to Bush and Blair and their Iraq policies and the larger entanglements of American empire.

But Pinter, and this is true I think of all deeply intellectual figures, cannot be reduced to the terms provocateur or leftist.  In this case, to be sure, simple reductions are wholly inadequate to the task given his very methods of work:  one of his most abiding theatrical legacies is his insistence that dramatic characters are inevitably impenetrable – they owe us no “back story,” nor are their utterances ever finally comprehensible, any more than are our interactions in the real world of performed conversation.  And so Pinter set characters loose whom even he could not predict or control, an exercise that often meant his productions were themselves angering as audiences struggled to make sense of the unfolding stories.  As the Economist put it, “his characters rose up randomly… and then began to play taunting games with him.  They resisted him, went their own way.  There was no true or false in them.  No certainty, no verifiable past…  Accordingly, in his plays, questions went unanswered.  Remarks were not risen to.”

So what does all this say about the ends of communication?  For Pinter they are not connected to metaphysical reflection or understanding (this was Beckett’s domain; it is somehow fitting that Pinter’s last performance was in Beckett’s Krapp’s Last Tape, played from a wheelchair), but to simple self-defense, a cover for the emptiness underneath (Pinter: “cover for nakedness”), a response to loneliness where silence often does just as well as words.  And so this is both a dramatic device (the trait that makes a play Pinteresque) and a potentially angering paradox:  “Despite the contentment of his life he felt exposed to all the winds, naked and shelterless.  Only lies would protect him, and as a writer he refused to lie.  That was politicians’ work, criminal Bush or supine Blair, or the work of his critics” (Economist).  Meanwhile, the audience steps into Pinter’s worlds as if into a subway conversation; as Cox puts it, “The strangers don’t give you any idea of their backgrounds, and it’s up to the eavesdropper to decide what their relationships are, who’s telling the truth, and what they’re talking about.”

The boundaries that lie between speaking and silence are policed by timing, and Pinter once said he learned the value of a good pause from watching Jack Benny perform at the Palladium in the early 1950’s.  One eulogist recalls the “legendary note” Pinter once sent to the actor Michael Hordern:  “Michael, I wrote dot, dot, dot, and you’re giving me dot, dot.”  As Siegel notes:  “It made perfect sense to Hordern.”  The shifting boundaries of communication, which in turn provide traces of the shifting relations of power in a relationship, can devolve into cruelty or competition where both players vie for one-up status even as all the rest disintegrates around them.  As his biographer, Michael Billington, put it, “Pinter has always been obsessed with the way we use language to mask primal urges.  The difference in the later plays is not simply that they move into the political arena, but they counterpoint the smokescreen of language with shocking and disturbing images of torture, punishment, and death.”  At the same time, and this because Pinter was himself an actor and knew how to write for them, the written texts always seemed vastly simpler on paper than in performance – and this is not because simple language suggests symbolic meaning (Pinter always resisted readings of his work that found symbolic power in this or that gesture) but because the dance of pauses and stutters and speaking ends up enacting scenes of apparently endless complexity.

For scholars of communication who attend to his work, then, Pinter poses interesting puzzles, and even at their most cryptic his plays bump up against the possibilities and limits of language.  One such riddle, illuminated in an essay by Dirk Visser, is that while most Pinter critics see his plays as revealing the failures of communication, Pinter himself refused to endorse such a reading, which he said misapprehended his efforts.  And as one moves through his pieces, the realization slowly emerges (or, in some cases, arrives with the first line) that language is not finally representational of reality, nor even instrumental (where speakers say certain things to achieve certain outcomes).  Pinter helps one see how language can both stabilize and unmoor meaning, even in the same instant (this is the subject of an interesting analysis of Pinter’s drama written by Marc Silverstein), and his work both reflects and straddles the transition from modernism to postmodernism he was helping to write into existence (a point elaborated by Varun Begley).

His politics were similarly complicated, I think, a view that runs contrary to the propagandists who simply read him as a leftist traitor, and a fascist at that.  His attacks on Bush/Blair were often paired in the press with his defense of Milosevic, as if implying a sort of left-wing fascism where established liberal power is always wrong.  But his intervention in the Milosevic trial was not to defend the war criminal but to argue for a fair and defensible due process, and this insistence on the truth of a thing was at the heart of his compelling Nobel address.  Critics saw his hyperbole as itself a laughable performative contradiction (here he is, talking about the truth, when he hopelessly exaggerates himself).  I saw a long interview done with Charlie Rose, replayed at Pinter’s death, where Rose’s impulse was to save Pinter from this contradiction, and from himself (paraphrasing Rose):  “Surely your criticism is not of all the people in America and Britain, but only made against particular leaders.”  “Surely you do not want to oversimplify things.”  Pinter agreed he was not accusing everyone of war crimes but also refused to offer broader absolution, since his criticism was of a culture that allowed and enabled lies as much as of leaders who perpetuated them without consequence.  Bantering with Rose, that is to say, he refused to take the bait, and the intentional contradictions persisted.  His Nobel speech (which was videotaped for delivery because he could not travel to Stockholm and is thus available for viewing online) starts with this compelling paragraph:

In 1958 I wrote the following:  “There are no hard distinctions between what is real and what is unreal, nor between what is true and what is false.  A thing is not necessarily either true or false; it can be both true and false.”  I believe that these assertions still make sense and do still apply to the exploration of reality through art.  So as a writer I stand by them but as a citizen I cannot.  As a citizen I must ask:  What is true?  What is false?

What was so angering for many was Pinter’s suggestion that the American leadership (and Blair too) had committed war crimes that had first to be recognized and tallied and then perpetrators held to account:

The United States supported and in many cases engendered every right wing military dictatorship in the world after the end of the Second World War. I refer to Indonesia, Greece, Uruguay, Brazil, Paraguay, Haiti, Turkey, the Philippines, Guatemala, El Salvador, and, of course, Chile.  The horror the United States inflicted upon Chile in 1973 can never be purged and can never be forgiven.  Hundreds of thousands of deaths took place throughout these countries.  Did they take place?  And are they in all cases attributable to US foreign policy?  The answer is yes they did take place and they are attributable to American foreign policy.  But you wouldn’t know it.  It never happened.  Nothing ever happened.  Even while it was happening it wasn’t happening.  It didn’t matter.  It was of no interest.  The crimes of the United States have been systematic, constant, vicious, remorseless, but very few people have actually talked about them.  You have to hand it to America.  It has exercised a quite clinical manipulation of power worldwide while masquerading as a force for universal good.  It’s a brilliant, even witty, highly successful act of hypnosis.

The argument is offensive to many (when the Nobel was announced, the conservative critic Roger Kimball said it was “not only ridiculous but repellent”), though for a playwright most attentive to the power of the obscuring mask and the underlying and sometimes savage operations of power they obscure, it is all of a piece.  McNulty:  “But for all his vehemence and posturing, Pinter was too gifted with words and too astute a critic to be dismissed as an ideological crank.  He was also too deft a psychologist, understanding what the British psychoanalyst D. W. Winnicott meant when he wrote that ‘being weak is as aggressive as the attack of the strong on the weak’ and that the repressive denial of personal aggressiveness is perhaps even more dangerous than ranting and raving.”

As the tributes poured in, the tension between the simultaneous arrogance (a writer refuses to lie) and the humility (he felt exposed to all the winds, naked and shelterless) arose again and again.  The London theatre critic John Peter gets at this when he notes in passing how Pinter “doesn’t like being asked how he is.”  And then, in back-to-back sentences:  “A big man, with a big heart, and one who had the rare virtue of being able to laugh at himself.  Harold could be difficult, oh yes.”  David Wheeler (at the ART in Cambridge, Massachusetts):  “What I enjoyed [of my personal meeting with him] was the humility of it, and his refusal to accept the adulation of us mere mortals.”  Michael Billington:  “Pinter’s politics were driven by a deep-seated moral disgust… But Harold’s anger was balanced by a rare appetite for life and an exceptional generosity to those he trusted.”  Ireland’s Sunday Independent:  “Pinter was awkward and cussed… It was the cussedness of massive intellect and a profound sense of outrage.”

Others were more unequivocal.  David Hare:  “Yesterday when you talked about Britain’s greatest living playwright, everyone knew who you meant.  Today they don’t.  That’s all I can say.”  Joe Penhall:  Pinter was “my alpha and beta…  I will miss him and mourn him like there’s no tomorrow.”  Frank Gillen (editor of the Pinter Review):  “He created a body of work that will be performed as long as there is theater.”  Sir Michael Gambon:  “He was our God, Harold Pinter, for actors.”

Pinter’s self-selected eulogy conveys, I think, the complication – a passage from No Man’s Land – “And so I say to you, tender the dead as you would yourself be tendered, now, in what you would describe as your life.”  Gentle.  Charitable.  But also a little mocking.  A little difficult.  And finally, inconclusive.

SOURCES:  Beyond Pinter’s own voluminous work, of course – Marc Silverstein, Harold Pinter and the Language of Cultural Power (Bucknell UP, 1993); Varun Begley, Harold Pinter and the Twilight of Modernism (U Toronto P, 2005); “Harold Pinter,” Economist, 3 January 2009, pg. 69; Ed Siegel, “Harold Pinter, Dramatist of Life’s Menace, Dies,” Boston Globe, 26 December 2008, pg. A1; John Peter, “Pinter:  A Difficult But (Pause) Lovely Man Who Knew How to Apologise,” Sunday Times (London), 28 December 2008, pgs. 2-3; Gordon Cox and Timothy Gray, “Harold Pinter, 1930-2008,” Daily Variety, 29 December 2008, pg. 2; Charles McNulty, “Stilled Voices, Sardonic, Sexy:  Harold Pinter Conveyed a World of Perplexing Menace with a Vocabulary All His Own,” Los Angeles Times, 27 December 2008, pg. E1; Dirk Visser, “Communicating Torture: The Dramatic Language of Harold Pinter,” Neophilologus 80 (1996): 327-340; Matt Schudel, “Harold Pinter, 78,” Washington Post, 26 December 2008, pg. B5; Michael Billington, “Harold Pinter 1930-2008,” Guardian (London), 27 December 2008, pg. 15; Esther Addley, “Harold Pinter 1930-2008,” Guardian (London), 27 December 2008, pg. 14; Frank Gillen, “Farewell to an Artist, Friend,” St. Petersburg Times (Florida), 4 January 2009, pg. 4E; “Unflagging in His Principles and Unrivalled in His Genius,” Sunday Independent (Ireland), 28 December 2008; Dominic Dromgoole, “In the Shadow of a Giant,” Sunday Times (London), 28 December 2008, pgs. 1-2; Mel Gussow and Ben Brantley, “Harold Pinter, Whose Silences Redefined Drama, Dies at 78,” New York Times, 26 December 2008, pg. A1.

Whale communication

Every couple of weeks a grants database configured to track communication research sends me alerts about sponsored projects on animal communication.  I am always surprised to see them, since the field of speech communication in which I’m trained (in contrast to the science of language development) engages the topic only peripherally, either as a matter of passing relevance in introductory texts for students who inevitably raise the subject or in wholly anecdotal ways.  A famous essay by the literary critic Kenneth Burke distinguishes animal communication as motion (his term for instinctive and unthinking movement) from human action; although ideological convictions can make us instinctively hop around this way or that, Burke argued that our capacity for intersubjective communication enables thoughtful and motivated acts.  As influential as Burke is for scholars of rhetorical process, it has never seemed to me that he had a sophisticated understanding of, say, primate interaction, and of course his work in the 1950s and 1960s predated the most significant anthropological and biological research on animal sociability, some of which comes very close to demonstrating complex structures of communicated meaning.

I was reminded of all this, and of my own ignorance, while watching a PBS Nature documentary aired the other night in Atlanta, which told the story of animal caretakers who have rescued aging chimpanzees kept in lifelong captivity, either because they were raised as circus performers or because they served as subjects of medical research.  Many of the chimps, who can live to 35 years or more, were infected with HIV or hepatitis or other illnesses in the hope of discovering cures for humans.  One of the common characteristics of these animals is that they have lived so long in cages and on concrete pads that they may fear a return to more natural habitats, although that rightly does not prevent advocates for their humane treatment from endeavoring to restore them to such living conditions.  One of the sad moments in the documentary showed a chimp reunited with one of his trainers after more than a quarter century; on renewed contact the old chimp seemed to greet his former captor with immediate affection.  The professionals who cared for the ape learned that he liked ice cream cones, and so for the last couple of weeks of his life they brought him one every day, which he would happily devour, not by swallowing it whole but by licking it and nibbling through the cone as any human being might.  One of the happiest moments showed a chimp released at last from concrete and caged confinement into an almost open park – the animal raced out, off the concrete pad onto the grass, and scrambled to the top of the tallest tree on the property.  The good people who fought to create such a habitat wept tears of joy.  A small victory for primate rights, to be sure, or perhaps simply for the humane treatment of animals who have been, let us say it, tortured.  As I watched all this I realized I was as moved by the fact that the old chimp ate ice cream in ways that made him seem human as I was by the other chimp’s sprinting return into an environment harder to see as recognizably human.

We’ve all heard the statistic that humans are separated genetically from bonobos, gorillas, and chimpanzees by only a percent or two.  And although the genetic distance between humans and dolphins is far greater (a new book by Maddalena Bearzi and Craig Stanford notes that our lineage split from the dolphins’ more than 100 million years ago), dolphins and whales are also astonishingly clever, in part because their neocortex is considerably larger than ours.  The aim of the book (Beautiful Minds: The Parallel Lives of Great Apes and Dolphins, Harvard UP), as the title and the authors’ credentials imply (Bearzi studies dolphins, Stanford primates), is to point out the affinities between these animals (and by doing so to implicitly close the gap we typically perceive between them and us):  all “spend their lives navigating a complicated social environment.  An ability to learn quickly and make subtle choices lets individuals track the moods, intentions and shifting dominance ranks of others with an eye to increasing their own position within the social hierarchy…[D]olphins and apes are assessing rivals and friends, inventing distractions, employing bribes, and forming alliances to get [food and mates], relying at least as much on cunning as on brute strength” (Chadwick).

The point of all this is not to establish or prove to skeptics that animals are capable of communication – the authors take that conclusion as already established; the real questions of interest concern communicative variation among species.  While debate remains over the extent to which non-humans can socially transmit learned information, there is by now no doubt that “dolphins and great apes grasp language to some extent.  [Bearzi and Stanford] describe dolphins responding readily to commands that require an understanding of the meaning and order of words – i.e., syntax – while studiously ignoring researchers who issued illogical orders” (Chadwick).

David Rothenberg has been working on the prospects for interspecies communication for some time now, and his 2005 book Why Birds Sing recounts his efforts to create duets with birds.  Birdsong is interesting because, as Chadwick (and of course Rothenberg) notes, birds sing far more than is biologically necessary – Rothenberg is thus interested in discovering whether birds sing as a form of entertainment, or because singing brings them joy.  With the discovery that whales make elaborate music beneath the waves, efforts were made to decode whalesong, motivated in part by the thought that we and they could perhaps talk to each other.  As Chadwick notes:  “Enthusiasts grabbed flutes and lutes and boated out to serenade minds in the waters, intending to open up lines of contact.  The typical reaction from the whales was avoidance, and the dream evaporated.  Rothenberg thinks it may be time to try again.”  His new book, Thousand Mile Song:  Whale Music in a Sea of Sound (Basic Books), recounts efforts to make recognizable contact with whales both in captivity and in the wild.  The book pays special attention to the clicks and whistle-like sounds made by sperm whales, and maps out the different patterns that distinguish whale families.

In Star Trek IV, Spock mind-melds with a humpback whale, who is able to seamlessly bring him up to date on how ocean pollution has eviscerated habitats.  The film, of course, is predicated on a hopeless but lingering cultural fantasy that holds out the possibility of mental transference and total communicative transparency, a dream no less utopian in the human context than in cross-species encounters.  But I wish Rothenberg well anyway and understand his work as deeply serious.  If research only gets as far as decoding signs of distress or hunger in other species, that alone will have justified the effort.

SOURCES:  Douglas H. Chadwick, “Cool sounds for killer whales,” Times Literary Supplement, 14 November 2008, pg. 26; David Rothenberg, “Whale Music: Anatomy of an Interspecies Duet,” Leonardo Music Journal 18 (2008): pgs. 47-53.

The limits of global connectivity

The idea that everyone and everything is networked is an ancient one, but it has become a thoroughly pervasive way of describing the world; it is a commonplace that influence, wealth, and ideas shape the unfolding of human history through increasingly dispersed and horizontally organized arrangements.  Two essays in the new (January/February 2009) Foreign Affairs, written from drastically different perspectives, offer opposed accounts not because one lacks optimism (to the contrary, both are exceptionally hopeful about America’s prospects) but because one takes the measure of the opportunities created by a globally networked society while the other emphasizes its perils.  Robert Gates, in an essay presumably penned before he knew he would continue as head of the Pentagon, describes the security challenges posed by a “new age” in which unconventional insurgencies will require both military and diplomatic flexibility.  And, in an essay that clearly sees the cup as half full, Anne-Marie Slaughter, dean of the Woodrow Wilson School at Princeton, argues for the wonderful benefits that can accrue to a smart America in a “networked century.”  Where Gates emphasizes how the globalized context enables instantaneous and network-destroying threats, Slaughter is more attuned to the ways a networked world creates possibilities for collaboration in which the United States can assert its more hopeful cultural worldviews without shouldering the full price or institutional risks.

Many recent events have evoked the downsides of globalization, where small coordinated attacks can cripple or derail whole societies even as globalized media coverage brings small acts of resistance or terror to worldwide attention.  The attacks on Mumbai were a vivid reminder that the potential targets of attack (in this case tourist and religious centers) are almost innumerable; the same outcome might have been attained by attacks on any of the world’s major hotels, shopping centers, schools, or churches.  The recent arrest of the Russian arms dealer Viktor Bout has also brought into view the shadow circuits of arms selling and the secret networks of commerce that often allow rogue governments to purchase almost anything they can afford.  Bout was said to have sold military hardware to everyone from the Taliban to Latin American revolutionaries to Hizbullah and even perhaps to rebels in Chechnya.

Or one might also take note of the alarming uptick in documented piracy over the past twenty years.  The recent episodes off the African coast, where Somali pirates have been attacking cruise ships and cargo vessels (about a week ago the United Nations Security Council extended its authorization allowing countries to enter Somali waters to intercept pirates), also show how globalization provides ready resources for nongovernmental organizations (whether legitimate or criminal) as much as for governments.  While the reported number of pirate attacks has declined since 2003 (445 in that year, 263 in 2007), the enterprise is organized around the anticipation of pure profit, pirates regularly demand ransoms for hostages, and the long-term decrease may well be reversed when the final statistics are reported for 2008.  A report issued recently by Chatham House argued that the Somali pirates are clearly aiming to raise cash for attacks on the Somali government and on behalf of some radical Islamic groups.  And a report from the London-based International Maritime Bureau puts the average demanded ransom in the one-million-dollar range.

Piracy is described by Homer in both the Iliad and the Odyssey, and even then the difficulty of distinguishing naval heroes from villains was well recognized; Homer said it was hard to tell them apart because “both set off in their long ships to distant shores to plunder and kill.  The difference… seems only to be their god-given fate.”  Today those who think about globalization might, following Tony Blair and Bill Clinton and the broader discourses of neoliberalism, note that all this simply illustrates the inevitability of the opportunities created by globalization, which in turn suggests that good governance should aim not at ending globalization but at channeling it in more socially beneficial and just ways.  But the powerful structures of opportunity and potential profit created by global financial networks should also serve as a cautionary note to those inclined to see international regulation as easily set in motion.  Appeals to international consensus offer cold comfort to the victims of Viktor Bout’s arms sales or of the Somali pirates, who move about watched but mainly unimpeded.
