Start here

Mark Noll and the potential contributions of Christian scholarship

Mark Noll’s Scandal of the Evangelical Mind (1994), a book that scandalized the evangelical mind by noting that it wasn’t much in evidence (Noll then scandalized some further when he announced in 2006 that he was leaving Wheaton College, after 27 years on the faculty, for Notre Dame), received a sequel of sorts in 2011 with Jesus Christ and the Life of the Mind (Eerdmans).  Life of the Mind moves in a more hopeful direction by reconnecting with one of the most ancient of theological questions, often shorthanded as the distinction between Jerusalem and Athens:  how does one reconcile the life of faith with the life of the mind?

The very question can seem absurd.  Some Christian traditions have revered intellectual work, understood as supplemental to or even constitutive of faith, and the world’s great centers of learning include many dedicated to propagation of the faith – but within the contours of profoundly thoughtful efforts to apprehend God’s creation through both the registers of reason and the more affectively sensitive mechanisms of intuition or simple unquestioning belief.  For advocates of those traditions – I have in mind the towering scholarly accomplishments of Catholicism and of the Jesuits, the Episcopalians with their metaphorical three-legged stool, but also the textually rigorous insistence that animates many of the Protestant and fundamentalist traditions and brings intellectual coherence to the “priesthood of the believer” (such as the originary impulse of the Churches/Disciples of Christ, founded by the Campbells and Barton Stone, to win converts by way of rigorous interdenominational debates) – a faith inconsistent with the dictates of rationality is a belief not worth having.  Why would one worship a God who cannot be apprehended, if only in part, by use of the very mental capacity that most fully distinguishes humans as God’s creation?

But the New Testament itself provides ammunition to those who see the Gospel as requiring a renunciation of the foolish dictates of reason.  The Apostle Paul thunders at the church in Corinth in a tone that taunts the ivory tower elites of his time:

For it is written:  “I will destroy the wisdom of the wise; the intelligence of the intelligent I will frustrate.”  Where is the wise man?  Where is the scholar?  Where is the philosopher of this age?  Has not God made foolish the wisdom of this world?  For since in the wisdom of God the world through its wisdom did not know him, God was pleased through the foolishness of what was preached to save those who believe.  Jews demand miraculous signs and Greeks look for wisdom, but we preach Christ crucified:  a stumbling block to Jews and foolishness to Gentiles, but to those whom God has called, both Jews and Greeks, Christ the power of God and the wisdom of God.  For the foolishness of God is wiser than man’s wisdom, and the weakness of God is stronger than man’s strength.  Brothers, think of what you were when you were called.  Not many of you were wise by human standards; not many were influential; not many were of noble birth.  But God chose the foolish things of the world to shame the wise; God chose the weak things of the world to shame the strong.  He chose the lowly things of this world and the despised things – and the things that are not – to nullify the things that are, so that no one may boast before him.  (1 Cor. 1:19-29)

There is much to say about this passage, and about related passages in the Book of Acts that describe moments of encounter between budding Christian doctrine and the worldly philosophers.  But to some, Paul is here recommending the abandonment of scholasticism and the deep methods of inquiry that can incline humans to hubris.  Augustine and others famously warned against confidence in academic inquiry – how might one have confidence that truth will emerge out of the exchanges conducted among fools? – all presumably to be renounced in preference for the interactions that, conducted in prayer, bring human frailty into contact with Divine perfection.  And yet the New Testament also recounts multiple scenes of attempted conversion predicated not on the performance of miracles or the enactment of loving care but on the incisive work of public argument (e.g., Acts 6:8-10; 9:28-30; 17:16-17; 18:27-28; 19:8-10).  The message regarding scholarship is thus often read as profoundly mixed:  helpful as a tactic of potential conversion but also dangerous, not only because of its possible inducement to hubris but because clever sophistry (of the type Satan practiced on Jesus as he wandered the wilderness for forty days and nights, or attempted in his jousting with God over Job) can lead the innocent astray.

When it comes to those Christians who have made professional commitments to the work of the public university, the issue is further complicated.  A life built on unwavering adherence to the Christian gospel can be understood as profoundly at odds with the spirit of skepticism and unending inquiry that underwrites the academy.  Only a few short steps lead many believers to see secular institutions (like, for example, public universities) as inevitably hostile to Christian discipleship.  Meanwhile expressions of doubt, the very lifeblood of academic inquiry, are too easily read as heretical when articulated in religious settings.  Athens and Jerusalem are thus apprehended as two worlds completely divided and incommensurable one to the other.  (This, I think, is deeply unfortunate, and it has always seemed to me that faith traditions would be made stronger by welcoming and working through expressions of doubt.  There is support for this position in the New Testament gospels – in one case, recounted at Mark 9:24, the father of a demon-possessed boy comes to Jesus and asks that his son be healed.  Jesus says something like Everything is possible for those who believe.  The father replies by expressing a paradox often felt by even the most dedicated believers:  I do believe; help me overcome my unbelief.  Importantly, Jesus is not offended by the contradiction but heals the boy.  And when the famous doubting Thomas expresses his skepticism about the resurrection, Jesus does not throw him out; rather, as recounted in John’s gospel, Jesus replies “Put your finger here; see my hands.  Reach out your hand and put it into my side.”  Doubt is heard as an invitation to fellowship and grace and not read as blasphemy.)

What I take as the animating impulse of Noll’s recent work is the risk that the phrase “Christian intellectual” will be thought a contradiction in terms, and the related consequence that Christianity, if seen as embracing anti-intellectualism, will repel brilliant seekers.  His project is to argue the consistency of scholarship with Christianity and, more than that, to assert that Christians who do scholarship importantly enrich academic work.

A common approach in taking up this issue is to cite scripture on the topic of noble work.  In a number of places believers are called, irrespective of the location or nature of their employment, to excellence in the workplace, and I have heard these admonitions cited to induce even professors into offering their dedicated and best work.  Some examples:  The Old Testament Proverbs in several places advocate diligence (12:24 – “diligent hands will rule”; 14:23 – “all hard work brings a profit”).  Or from the Acts of the Apostles (5:38-39), a test that much academic work seems easily to pass:  “For if their purpose or activity is of human origin, it will fail.  But if it is from God, you will not be able to stop these men; you will only find yourselves fighting against God.”  Or, alternatively, the commendation made in the letter to the Colossian church at 3:17:  “Do it all in the name of the Lord Jesus” (which one might read as a command to dedicate all work, especially the work of the mind, to God’s honor); later (3:23), “Whatever you do, work at it with all your heart…”  Or, from the first letter to the Corinthian Christians, an injunction essentially to “bloom where you are planted”:  “Nevertheless, each one should retain the place in life that the Lord assigned to him and to which God has called him.”  Versions of the same idea are repeated three times in that one chapter (7:17, 7:20, 7:24) alone.  In the letter to the church in Ephesus, Paul writes (6:5-8):  “Obey earthly masters with respect and fear and with sincerity of heart, just as you would obey Christ…  like slaves of Christ doing the will of God from your heart.  Serve wholeheartedly, as if you were serving the Lord, not men.”

But this is not the path laid out by Prof. Noll.  Instead, Life of the Mind searches scripture for those places where insights into intellectual life can be abstracted into a philosophy of Christian scholarship.  What Noll finds everywhere are invitations to closer scrutiny and deeper inquiry.  In the Christian creeds and in the major doctrinal statements found in the New Testament (such as the first chapter of Colossians, 1:16-17), he finds claims about the created world that he reads as inviting Christians to respond to creation with the impulse to further explore and learn.  In the statements of Jesus to which I’ve alluded already (especially, for Noll:  “Come, and see!”), he apprehends a scholarly impulse which one can credit by faith with always rewarding closer scrutiny.  What Noll advocates is a faithful confidence that deeper engagement with the protocols of learning will lead thinkers closer to God and not further away:

The specific requirements for Christian scholarship all grow naturally from Christian worship inspired by love:  confidence in the ability to gain knowledge about the world because the world was brought into being through Jesus Christ; commitment to careful examination of the objects of study through “coming and seeing”; trust that good scholarship and faithful discipleship cannot ultimately conflict; humility from realizing that learning depends at every step on a merciful God; and gratitude in acknowledging that all good gifts come from above.  If, as Christians believe, “all the treasures of wisdom and knowledge” are hid in Christ (Col 2:3), the time is always past for talking about treasure hunting.  The time is always now to unearth treasure, offer it to others for critique or affirmation, and above all find in it new occasions to glorify the one who gives the treasure and is the treasure himself.  (p. 149).

Shortly after its publication, the great Yale theologian Nicholas Wolterstorff wrote a positive review that nonetheless wondered whether Noll’s three chapters of discipline-by-discipline case studies were rich enough to make a compelling case for Christian contributions to scholarship.  He wrote:

Let me add that whereas the Christological case that Noll makes for Christians engaging in serious learning seems to me both compelling and rich, the guidelines that he teases out of classic Christology for how we actually engage in learning strike me as rather thin by comparison.  Christians, he says, will affirm contingency.  They will affirm particularity.  With the Incarnation in mind they will insist, by analogy, that ascribing a natural cause to some event is compatible with ascribing it to God as well.  They will resist the pride characteristic of intellectuals.  All true; but very general and abstract.

That point is well taken, although given the common radical separation of secular and sacred intellectual inquiry, it may be that the simple articulation of a Christian alternative will itself provoke deeper thinking.

For me, the trickier question is whether, despite the intellectual payoffs to be found in the great faith traditions, they should ever be strongly asserted in the public university.  One need not condemn Christians to silence in the public square to recognize that in an institution aiming to welcome and encourage thinkers from all backgrounds and perspectives, the forceful articulation of Christian theological imperatives risks doing as much damage to the open spirit of inquiry (by silencing those who will wonder whether they can freely disagree with faith commitments so deeply held) as good.  I wonder.  It may be that the scholarly work Noll advocates is best undertaken in explicitly religious institutions, from which its findings and main claims can be disseminated more widely as an implicit corrective to the narrower work of a public educational system that will of rightful necessity orient its efforts to reach more widely.

For me, then, Noll’s work finally raises this question:  Even conceding the strongest case for Christian scholarship (which is to say, the case that an articulated Christian worldview can enliven any disciplinary conversation), does it then follow that Christian commitments should be always and everywhere articulated?  Or, put a bit differently, does every workplace obligate the believer to proselytize?  Is it possible that, as Paul made tents to raise money for his missionary journeys, there were days when he simply and quietly engaged in craftsmanship without preaching to his colleagues?  As the New Testament figure Lydia dealt in purple cloth, which we are told she did to fund the work of the church, did she try to determine how this or that scriptural verse might better inform her craft?  Or were these believers content simply to segment their good work, willing to concentrate their evangelism within other locales where Christian testimony would be more gratefully received than the tent or cloth workshops?

Remembering Sam Becker and university citizenship

Sam Becker, for whom the University of Iowa Department of Communication Studies building is named, and whose six-decade career was highly accomplished, passed away on November 8.  While I was a doctoral student at Iowa in the 1990’s, Becker was already retired but still ever-present, and by the sheer randomness of graduate student office assignment two colleagues and I ended up right across the hall from his small retirement office.  He was often available for conversation and also extraordinarily productive, a professor who gave the lie to the all-too-popular idea, now dominant in the humanities and some of the social sciences, that the only way to get real research done is to work in seclusion either behind a closed office door or, even better, from the house.  Looking down hallways of closed faculty office doors – and I mean this as no insult, since Iowa is typical, I think, of today’s academic culture – I was always struck by the fact that the most open and accessible professors – Gronbeck, Ochs, Becker – were also among the most research productive.

By my time in the program, Dr. Becker was teaching only occasionally, but he taught one of my very first classes, a one-credit-hour professionalization seminar that met, as I recall it, for only about 45 or 50 minutes a week.  We were coached on the protocols of academic citation, taught the mechanics of the main communication associations (Sam’s lifelong commitment to the National Communication Association meant we heard most about that one), and we talked about how one best organizes one’s research work.  I believe it was there that he told the story about how he first got hooked on academic scholarship in communication.  He was a young undergraduate at the university in the 1940’s, and was encouraged to turn a classroom paper into a publication, which he landed in one of the leading outlets for communication research.  Over the next sixty years he produced more than 100 peer-reviewed research essays, advised nearly 70 doctoral dissertations, and won wide acclaim for his work (NCA president, NCA Distinguished Scholar, recipient of the association’s first mentorship award, and many more, including a number of honors on the Iowa campus that include the rare privilege of a namesake building).

Professor Becker’s death evokes in me, against my better judgment perhaps, a nostalgic desire for a sort of academic culture that likely no longer exists.  The temptation to nostalgia when it comes to the academic past is fraught.  Even as the American public universities threw open their doors and programs in the 1960’s and 70’s, they were far from perfect, and the political constraints under which professors work today are in some respects incomparably different.  And universities look a lot different through the eyes of a professor than they do through the eyes of a graduate student.  It is easier to imagine public university work as a sort of exotic salon culture, the pure life of the mind where professors think great thoughts in communion with their colleagues, when one’s schedule, overloaded as graduate student life always is, consists of one intellectual interaction after another, seminar to seminar and great book to great book.  The academic life performed for graduate students, indeed for all students, is simply not the same as the one lived in a profession as dominated by committee meetings as by discussions of big ideas.  Comparisons between past and present too often fail.

But my nostalgia lingers.

Sam Becker represented a style of academic life, and an extraordinary commitment to building local programmatic excellence, that is harder to find today (and in my world so infrequent as to be essentially nonexistent).  We live at a time when many professors understandably find their main intellectual sustenance in longer-distance networking – social media, blog- and listserv-centered – and find themselves too informationally enriched (or, alternatively, overstimulated) and even overwhelmed by those gushing sources to desire anything but minimal face-to-face engagement with on-campus colleagues.  Part of this, I believe, is the characteristic affinity of rather shy, life-of-the-mind-driven academics for the more controllable interactions of online and distance encounter; it is easier to present a polished, more clever persona through Facebook and blogging than in the heat of a tedious faculty meeting, and so many gravitate to the New Comfort Zones of virtual engagement.

Entire academic generations have been mentored to the view that their most assured path to professional success is isolation – keep your head down, don’t overcommit, set up a home office and be disciplined about working there, spend as few hours on campus as you can, because if word gets out that you’re available then colleagues and students will eat you alive and rob you of all your productive energy.  This advice is self-reinforcing:  when one resolves to spend only ten hours a week on campus then, not surprisingly, those ten hours quickly fill to capacity as students figure out they are the only opportunities for real access not coordinated by email.  The approach affords little time to linger, for lingering is time wasted.  Sorry!  I’m drowning; gotta run! becomes an easy refrain.

All this is understandable and not unreasonable.  I’m as prone to the temptations as anyone.  The seductive blend of intellectual (over)stimulation, where ideas can be consumed at whatever pace one prefers, and staged (or scripted) encounters managed from the comfort of the computer desk chair can simply feel more enriching than sitting through a long research presentation or a comprehensive examination defense.

Donovan Ochs, a Becker colleague at Iowa, and Becker himself, both veterans of military service (I have the sense that had something to do with it), put in pretty regular daily schedules.  Ochs, with whom I had the chance to study classical rhetoric and do an independent study on Aristotle, often put in 8-to-5 days.  As I recall it he wore a tie every day, even in the 1990’s when few others did, and his door was always open apart from times when he was in private meetings or teaching.  When I asked him once how he got any work done under those conditions, he was plainly surprised at the question, and his reply – what I do here is my work – led to wider conversations about academic life.  Ochs noted that an open-door policy did not prevent his research productivity, since the mornings typically gave him many undisturbed hours to write and think.  His door wasn’t open to enable empty chit-chat – he was always profoundly encouraging but kept conversations mainly work focused.  And because he worked, seriously worked, for the duration of the regular day, he avoided the guilt so many of us feel at thinking we should be working at all hours of the night.  I always had the sense Ochs went home with a clean conscience – he had a life apart from Aristotle, a healthy set of diverse family and life interests, and retirement presented no apparent trauma for him.

It is simply impossible to generalize about the state of faculty engagement given the diversity of campus environments, and unproductive even to try; there remain, of course, more pastoral campus settings where relatively smaller student cohorts and perhaps better-supported faculty lives enable the creation of close intellectual community that at some level still adheres to the wider mythology of enclaved campus life.  But life for students and professors at the big state universities, and I suspect even in places where campus life is residential and intensively communal, is changing.  If the National Survey of Student Engagement is to be trusted, students report less frequent conversations with their classroom professors outside of regular class times.  Michael Crow, the president of Arizona State University and a key (and controversial) national advocate for delivering high-quality, research-intensive educational outcomes to very large numbers of enrolled students (ASU is now one of the nation’s largest universities), often repeats the idea that demographic surges require a model of education that is not numerically exclusive (rejecting the backward logic by which the more people a school turns away, the better its reputation).  If public state institutions cannot find ways to serve well the students who are academically gifted but not financially or intellectually elite enough to enroll at the most exclusive private schools, Crow often says, we’ll end up with a two-tiered system where the rich kids are educated by professors and the rest are educated by computers.

The truth is that the big public universities are fast veering in the latter direction – not in the sense that MOOCs educate our students, but in the sense that the experience, especially in the first couple of years, can be awfully impersonal, if not on account of large classes then because so many early classes are taught by graduate students and temporary faculty whose good teaching may nonetheless insufficiently convey a sense of place and local intellectual tradition.  The wider incentive structures are too often negative:  no pay raises, the demoralized sense that follows from ever more frequently expressed taxpayer hostility to higher education, the pressures to win grants and relentlessly publish papers, accountability pressures that seem to require more and more administrative meetings, the idea that one must always stay on the job market or likely never get a pay raise at home, the growing number of students and in some states the expectation of higher instructional workloads, a tendency to think of day-to-day intellectual connectivity as simply more uncompensated service.  All this lures professors away from the committed work of building local loyalty and into more defensive practices that feel like simple self-preservation but are also, I suspect, self-defeating, because they only accelerate a vicious cycle of brief and highly focused teaching and mentorship alternated with long stretches away.  Participate in a sustained reading group?  Sorry, I just don’t have any time for that.  Organize a campus colloquium, film or lecture series?  Ditto.  And since everyone else is overwhelmed too, what would be the point?  No one would come.  Did you see the lead essay in the new QJS?  I’m curious what you thought.  Gosh, I’m months behind on that kind of reading – all my energy has to go to my book.  What energized you at the last big national conference?  Oh, couldn’t make it – and how could I when the university gives so little for professional development support?

The picture I’ve just drawn is exaggerated, thankfully, but I suspect that even as caricature it retains a certain familiarity.  Fortunately the energetic participation new faculty bring to academic programs is inspirational, and idealism trumps low morale for the many staff and faculty who sustain both distance-networked and local connectivity.  Whatever the incentives, every department includes professors at all ranks who pour their energies into building real collective intellectual communities.  It might also be added that the struggle I’m describing may be most accentuated in the humanities, where the norms of academic research are only slowly shifting away from the lone professor writing her latest book toward practices of team-based interdisciplinarity.  Globally networked disciplinary conversation arose for important reasons – the generation of new knowledge is more dynamic than ever before in human history – even despite data suggesting that (at least in communication) the research work is increasingly concentrated among smaller numbers of publishing faculty (a recent analysis in speech communication showed that something like 85% of all professors had not published anything or received external support for their projects in the previous five years).  But I wonder whether the number of highly productive and communally engaged scholars can be sustained when their morale is under assault too, because the dynamics induced by the mentorship and reduced support I have described bring into ever-starker relief the old 20/80 rule, where 20% do 80% of the work.  As 20/80 becomes 10/90, this is how intellectual dynamism, and universities, die.

Sam Becker’s career suggests a thought experiment:  can the considerable benefits of 21st-century intellectual life be improved by some integration of the professional practices of the 20th?  I want to hypothesize that what so often seems like the depressing path of today’s stressed system of public higher education need not be accepted as a New Normal.  If public higher education is to retain its historical vitality, changes will have to happen on many fronts.  Taxpayers and legislators will need to be persuaded of public education’s value.  Reasonable systems of accountability will need to document the outcomes of pedagogical encounter, I know.  But there is a role for us faculty to play as well, and Sam Becker’s professional life suggests some of the possibilities.  Becker knew that good and committed scholars who simply show up day after day and make themselves open to engaged discussions with others, both online and in person, attract other smart students and teachers in ways that energize the common enterprise – and that calling it quits at the end of the workday creates intellectual sustainability too, as people find time away every single day to recharge.  He saw, because he so often created it himself, the vital and passionate sense of connection that emerges as intelligent participants in the educational experience talk to each other and rev up excitement about ideas one discussion at a time.  He realized that when everyone is present and engaged in program building, service work is made more manageable by division among larger numbers of connected co-workers.  I cannot prove it, but my suspicion is that the great intellectually vital centers of communication scholarship were (are) built more fully by acts of local loyalism than by enterprising free-agent academic nomadism.

The key is not simply hallway cultures of greater presence; such cultures must also entail high degrees of intellectual openness, a refusal to see the scholarly enterprise as ideational warfare or zero-sum, even in contexts where resourcing is finite.  And this was another of the Becker legacies.  During his five decades in the department, communication studies nurtured, developed, and then in some cases spun off new academic units, including theater and film.  Those discussions were not always smooth or even friendly, and Becker had strong opinions.  But what he always preserved, as I saw it, was a culture of openness to new and productive work – it led him to shift over his own career from quantitative social science to qualitative research in the vein of British cultural studies and then back again.  No department is ever entirely free of intellectual entanglements – smart people will tend always to prefer their own lines of inquiry and can too easily fail to see the value of the efforts undertaken by others.  But so long as there are some Beckers around, the inclinations to either/or warfare that have consumed whole programs in acrimony can be channeled productively into both/and collective accomplishment.

Fantasies, perhaps.  But these are ideas whose lifelong embodiment in one Samuel L. Becker – Eagle Scout, decorated war hero, “Mr. University of Iowa,” champion of social justice and of the idea that public education enriches us all, extraordinary teacher and scholar and administrator – remains for me compelling, even given the New Normals of this new century.

When the map seems larger than the territory

On one of the websites for students of rhetorical theory, conversation has recently focused on the status of psychoanalytic criticism and the question of whether its insights are being willfully ignored by the larger field.  Josh Gunn kicked off the discussion, in part, by noting that despite recent interest, “rhetorical theory — at least on the communication [studies] side – is hampered by a certain blind spot caused by the avoidance of psychoanalysis, and more specifically, the inadmissibility of the category of the unconscious.”  Gunn rightly wonders at the absurdity of this given how many revered figures in rhetorical theory have been explicitly influenced by or have reacted against Freud, Lacan, Klein, Jung and others.

In the ensuing back-and-forth a range of perspectives have been expressed:  some writing to agree that psychoanalysis does seem to provoke unique antipathy from students assigned to encounter it, others speculating on the causes (is it because communication was more a journal than a book field?  did the discipline’s work in response to behaviorism inoculate scholars against its insights?  has psychoanalysis been more widely tainted, thus deterring investigation from the outset?), and so on.  Perhaps not surprisingly, some of the explanations veer to the therapeutic – several responses convey anecdotes of a visceral (and by implication anti-intellectual) refusal to take psychoanalytic work seriously:  sneering senior scholars, wink-wink-nudge-nudge sorts of boundary policing behavior, and the (not-so-)subtle steering of graduate students away from the theoretical insights of psychoanalysis.

As I’ve been thinking about all this, I don’t find myself in particular disagreement, except that I don’t think this phenomenon is unique to psychoanalysis.  Rather, I think what we are seeing are the ongoing consequences of theoretical hyper-specialization, of which these are simply several of many local occurrences.  In contrast to those who continue to announce the Death of Theory, it seems to me that we are still working to live with the consequences of having at our disposal So Many Seriously Elaborated Theories, which in turn gives rise to a mostly frustrating situation where the maps seem richer, or at least larger, than the territory.

I do not say this to endorse hostility to theoretical sophistication, but simply to note how glutted we are with it.  I think the symptoms, at least in communication studies, are everywhere:  A readier willingness to abandon the mega-conferences in preference for more intimate niche meetings where one can burrow in and keep up.  The tendency to assign secondary sources even in doctoral seminars, or, when primary works are wrestled with, to isolate them from the conversations in which they participated (which results in an alternative tendency to see originary controversies, such as the big arguments between Foucault and Sartre, or Fraser and Habermas, as pretty much settled history to be filed in the same category with “the earth is round”).  A growing impatience with arms-length peer review and other gate-keeping work undertaken to make comprehensible the incoming ocean of new material, and a more widespread corrosive cynicism about the larger enterprise.  The increasing frequency of major conference presentations, even those given by serious senior scholars, that don’t seem to say much of anything new but mostly offer a repetition of the theoretically same.  An inclination to see friendly work as fully appreciating the rich nuance of my own tradition, and hostile work as reducing my tradition to caricature.  A wider tendency to see the dissertation not as evidence of a student’s ability to undertake a serious research project, but as an indication of the project whose trajectory will forever define a career.

Another key marker is the level of defensiveness, sometimes veering into animus, I often hear expressed by the advocates of every perspective who feel their work is under siege:  Marxist theory, argumentation studies, close textual analysis, historical/archival work, postcolonial and queer theory, cultural studies, feminist scholarship, and the list could be considerably lengthened.  All feel under attack, and to some extent sustain intellectual solidarity by insisting enemies are at the gate.  And within these traditions fragmentation continues apace – a longstanding theme in a number of the convention conversations I hear is how scholars who for many years have labored to make visible the cultural contributions of gays and lesbians see themselves as marginalized today by queer theory, and in turn how queer theory seems to be marginalizing bisexual and transgender approaches.  This theme is not limited to rhetorical studies but is more widely sensed across the broader inquiry of communication scholars:  the television studies people feel like they aren’t taken seriously, and so do the performance theorists, the cinema studies scholars, the interpersonal researchers, the quantoids, the public opinion theorists, those who first encountered communication through forensics or theater, the TV and film production faculty, ethnographers, organizational communication scholars, mass communication empiricists, public relations practitioners, and those who teach students for industry work.

As my career has taken me in the direction of administrative work, I see the same trends more widely as they shape conversations within the humanities and beyond.  When I first had the audacity, in a meeting of chairs from the full range of disciplines, to say that external resources are harder to find in the humanities – I thought everyone agreed with that – I was surprised that the most assertive push-back came from a colleague in biology, who proceeded to argue in detail his own relative deprivation within the wider university.  His case was not absurd:  it is hard to argue that anyone is properly supported in the modern public research university.

I don’t see this defensiveness as a reflection of bad faith or of animus.  For in a sense all of us are right – one does have to exercise eternal vigilance in defending one’s research perspective, because in a universe of so many well-elaborated accounts of human behavior the most likely danger is being forgotten or overshadowed given the broader cacophony.  Thus the paradox that while more journals are now published in the humanities than ever before, the individual researchers I talk with see fewer and fewer outlets available for their sort of work.  Or, to further mix the metaphors, there are not only more intellectual fortresses today, but they are better fortified against attack and protected against the wandering tourist and amateur dabbler than ever before.

It is true, I suppose, that within each theoretical community are some who treat, say, Anti-Oedipus or Lacan’s seminars or the Prison Notebooks or the Rhetoric as holy scripture.  But the issue is less that each theorist has induced a cult than that, in general, scholars who are otherwise persuaded they cannot possibly know every perspective well tend to stick with the one rich approach into which they were first acculturated.  And so what was and is seen by some as a sort of happy theoretical pluralism, a view still promoted by the wider impulses to boundary-cross and be interdisciplinary and all the rest, has devolved into a more frequently expressed surliness about colleagues who “won’t do the work to stay current,” a wider reliance on secondary sources like the Dummies guides and Cambridge Companions, the more frequent play (in responding to outsider critics) of the “you don’t appreciate the subtlety of my theory when it comes to ___” card, and an even more common resort by the basically friendly to the tactic of heavy-note-taking silence or the helpful “you should really read [insert my theorist],” or, more generally, the “have you thought about this?” conference response or query.  One of the most common questions I hear my colleagues ask of one another is one I often ask myself:  “If you could recommend three or four short and accessible overviews of ____ that would help me get up to speed, what would you suggest?”  It’s asking for an academic life preserver.

Less of all this is sparked by ill will or ideological refusal than by the simple unwillingness to confess “I am unable to offer a thoughtful response to your read of Ranciere because I didn’t know he would be discussed today and so I didn’t have the chance to beef up on my Ranciere for Dummies, and because it takes every minute I have available for intellectual work just to keep up on my Burke.”  The eye-rolling response is thus sometimes less reflective of substantively well-grounded opposition than the expression of a weirdly humble recognition of the game we think everyone is playing:  the gotcha stratagem of “there s/he goes again, showing off everything s/he knows about Cicero.”  At a time when credible humanistic research is said to be impossible apart from mastery of all social theory, all of the philosophical and aesthetic traditions, (increasingly) the life sciences (cognitive theory, evolutionary psychology, accounts of chaos and networks and more), and the globalized set of artifacts that underwrites comparative work, the task seems overwhelming.

My point is not to be alarmist or defeatist about the enterprise.  Specialization is not new, and has elicited expressions of concern for generations.  To some extent the theoretical proliferation is self-correcting – people participating in a bounded academic conversation do move on, and not every carefully enunciated perspective finds a following.  There remain exceptionally skilled intellectuals who seem to know everything and who are apparently able to keep up with all the wider literatures.  And too often the expressed difficulties in “keeping up” exaggerate the challenge in an age when more resources than ever are available to enable one’s informational literacy, and when “I don’t have the time to understand [feminist] [critical race] [queer] theory” is a too-convenient excuse to ignore perspectives that elites brushed off even when Renaissance Giants Walked the Earth and had only to stay current on the sum of human knowledge contained in fifty books.

And because the challenges of surfing the sea of new literature and getting others interested in one’s work are by now so universal, I have little to add to the range of proposed correctives, each problematic in its own way.  The idea of reinstating a common canon holds little appeal, and for good reason.  Nor is announcing the Death of Theory, or insisting on the Priority of the Local or the Case, especially compelling.  My own preference, given a background in debating, is to “teach the controversies,” but that approach isn’t ideologically innocent either.  If book publishers survive, I think the impetus to anthologies that now characterizes cinema studies is likely to expand more widely within communication scholarship.  But there are dangers in too readily recommending hyper-specialization to doctoral students, in paper writing revved up out of fast tours of JSTOR and Project Muse, and in too quickly acceding to happy talk about theoretical pluralism.  Better, in our own intellectual labors, to listen insistently to other perspectives, reach out to them, and work like hell to keep up with the wider world of humanistic scholarship.

And sometimes, if only as a mechanism to preserve one’s sanity, a little eye rolling may also be in order.  Just keep it to yourself please.

The future of globalized literary history

A 2008 special issue of New Literary History (vol. 39) focuses on the future of literary history (and, relatedly, of comparative literary studies) under globalization.  To some extent one can track the complicated history of World Literature through the early and influential essays of Rene Wellek, who advocated comparative scholarship even as he warned against the dangers of investing disciplinary energy in the search for covering laws and causal relationships between literature and the wider society.  The titles of Wellek’s much-cited 1958 talk, “The Crisis of Comparative Literature,” and his 1973 essay “The Fall of Literary History,” convey some sense of his pessimism about the prospects for defensible work.

Of course the very term World Literature has to be used carefully, since one must always demarcate the multiple possibilities implied by the phrase.  Some use World Literature to reference all the literature produced in the world, some see it as referring to Kant and Goethe’s dream (Goethe in 1827:  “a universal world literature is in the process of being constituted”) of an international body of transcendently superb literature, and still others use it to reference those few novels that have found truly international fame.  And so some who are invested in comparative work today, often undertaken to throw American cultural productions into a wider perspective of circulation and resistance, prefer terms like transcultural literary history (Pettersson).  Given the theoretical care one must take even to begin this kind of work (the complications of which are unwittingly revealed in Walter Veit’s summation of Linda Hutcheon’s call for a “new history of literature” that “has to be constructed as a relational, contrapuntal, polycommunal, polyethnic, multiperspectival comparative history”), the project remains inherently appealing:  who would oppose the idea of research that induces cross-cultural sensitivity and understanding, even realizing its final impossibility?

After buzzing along for decades – or at least since the 1950’s, when the International Comparative Literature Association first met to debate the potential for doing literary-historical work – transcultural literary studies has received new attention thanks to two much-discussed interventions:  Franco Moretti’s essay “Conjectures on World Literature” (which anchors Debating World Literature, a 2004 anthology released by Verso, and which formed a sort of introduction to his widely read book a year later) and Pascale Casanova’s The World Republic of Letters (trans. M.B. DeBevoise; Cambridge, Mass.:  Harvard UP, 2004).  Moretti’s work has gotten a lot of attention given his heretical view that the sheer quantity of the world’s literature, which escapes the possibilities of close textual analysis, now requires distant reading – which is to say, sophisticated forms of macro-data analysis that can reveal patterns of novelistic diffusion worldwide.

But things get tricky fast.  Fredric Jameson, who leads off the issue and who has long expressed skepticism about the work of literary historians (noting in an address to a 1984 Hong Kong conference on Rewriting Literary History that “few of us think of our work in terms of literary history,” and having subsequently called repeated attention to the essentially ahistorical nature of postmodernity), argues that the dialectical impulses of economic globalization simultaneously promise cultural liberation even as the economic chains are slowly tightened, in ways that finally limit the range of cultural productions as well.  To be concrete, Jameson highlights how global capital appears to open all cultures to all populations, even as, over time, a shrinking number of transnational conglomerates end up stifling all but the handful of mainly English-language novels able to turn a profit.  He is especially keen on Michael Mann’s argument that the global economy is “encaging” – that is, as Jameson describes it, “the new global division of labor is” organized so that “at first it is useful for certain countries to specialize….  Today, however, when self-sufficiency is a thing of the past, and when no single country, no matter what its fertility, any longer feeds itself, it becomes clearer what this irreversibility means.  You cannot opt out of the international division of labor any longer” (376).

The cage ensnares more tightly – and not only because “smaller national publishers are absorbed into gigantic German or Spanish publishing empires,” but because a handful of mega-publishers end up publishing all the textbooks kids read even as budding authors everywhere are subtly persuaded to buy in because of their “instinctive desire to be read by the West and in particular in the United States and in the English language:  to be read and to be seen and observed by this particular Big Other” (377).  So what are literary historians to do that will not invariably make them simply complicit in all this?  Jameson, a little bizarrely I think, argues for a sort of criticism that imagines the world-making possibilities of novels-yet-unwritten-that-one-imagines-as-ultimately-failing-to-liberate.  This sort of creative criticism “raises the ante,” according to Jameson, because it helps its audiences recognize the actual “persistence, if insufficiently imagined and radicalized, of current stereotypes of literary history” (381).

Brian Stock, at the University of Toronto, reads the current scene from within the newer traditions of developmental and cognitive psychology and cognitive neuroscience.  Work done in these areas suggests that reading has a profound cognitive (and universal) influence on human beings, whose plastic minds are essentially reconfigured by repeated participation in practices of literacy.  As Stock sees it, “the only way in which reading can be related to the ubiquitous problem of globalization in communications, without running the risk of new types of intellectual colonization, is by demonstrating that it is in the genetic inheritance for interpreting language in its written or ideographic form that is the truly ‘global’ phenomenon, since it is potentially shared by everyone who can read…  [I]f this approach can be agreed upon, the natural partner of globalization will become a scientifically defended pluralism” (406).

Walter Veit, at Monash University, sees the interpretive key as residing in temporality, which can never be linguistically articulated (Paul Ricoeur:  “temporality cannot be spoken of in the direct discourse of phenomenology”) except in novelistic narrative, where the arc of the narrative makes some sense of time’s passage and where, following Hayden White, the linguistic operations of rhetorical tropes and figures provide metaphorical access to the otherwise inexpressible.  One is left with a more sanguine sense of the future on these terms:  both for an analysis of the multiple ways in which the world’s literatures construct time and its passing, and with respect to literary criticism, which is always embedded in the particular and always-changing practices of its time and audiences.  Such a view is well supplemented by Nirvana Tanoukhi’s claim that the challenge of understanding transnational literature is also foundationally a question of scale and locale and spaces of production.

The work of literary history, and the conceptualization even of its very possibility, is, finally, a representative anecdote for the broader work of the humanities.  This is a theme apprehended both by Hayden White, who notes that the questions raised in the symposium reflect the larger conditions of historical knowledge as such, and by David Bleich, who notes the close affinity between the work of literary historians and the broader work of the university (where “scholars have always been involved in the mixing of societies, the sharing of languages and literatures, and the teaching of their findings and understandings,” pg. 497).  The university plays a culturally central role in translating other cultures (for students, for the audiences of its research) that is fraught with all the perils of the work of writing intercultural history – hubris, caricature, misapprehension.  But the effort to make sense of the wider world, however risky, is also indispensable, if only because the alternatives – unmitigated arrogance and blinkered ignorance – are so much worse.

Thalberg: Making the piano sensitive

Sigismund Thalberg’s piano performance tour of the United States prior to the Civil War came at a key point in the nation’s cultural emergence on the world scene.  By the 1830’s the United States’ cultural and social elite knew the musical works of Haydn, Beethoven, Mozart, and Bach, but the acquired tastes of the refined classical tradition had not reached the American masses, and European musical refinement was often caricatured.  In New York City the Philharmonic had been organized as a voluntary association in 1842, yet even by the late 1850’s (on the brink of the Civil War) efforts to expand concerts into mid-day matinees struggled to find audiences.  It wasn’t until 1881 that “the Boston Symphony became the nation’s first permanent full-time orchestra” (Horowitz).  Meanwhile, as the director of the Parisian Opera had put it in the 1840’s, in language characteristic of the European prejudice:  “We look upon America as an industrial country – excellent for electric telegraphs and railroads but not for Art.”

The mismatch between American and European musical sensibilities created a mutual cycle of mistrust and disparagement that did not really begin to crumble until Anton Rubinstein’s tour of the States in the 1870’s.  Before then, European impresarios often appealed to their publics in ways more reminiscent of circus promotion, with publicity machines wildly ramped up (as when P.T. Barnum debuted Jenny Lind as an “angel” descended from heaven).  But the irony of the Paris Opera director’s comment is that it was precisely America’s midcentury industrialization that enabled its cultural transformation.  The frenetic pace of Thalberg’s American concertizing was only possible because he was able to perform in the evening and then often travel all night on the trains, on a rapidly expanding and precisely organized transportation system that gave predictability to one’s efforts to reach small towns like Kenosha and Sandusky and Natchez and Atlanta.

Thalberg’s tour – his first American concert took place in New York City in November 1856, his last in Peoria in June 1858, followed by a sudden and still unexplained return to Europe – was a noticeable contrast to the earlier hype around Jenny Lind, partly because his reputation as a European master needed little exaggeration.  Training in piano had already emerged as a marker of middle-class respectability, and Thalberg’s sheet music was well known to American music students.  News of Thalberg’s status as the most celebrated European pianist after Franz Liszt had long circulated by the time of his arrival (and of course Liszt’s refusal to tour the United States left the stage to Thalberg’s crowd-pleasing sensibilities).

The generally exuberant reaction Thalberg received from American audiences reiterated the enthusiasm Europe had shown for his virtuosity twenty years earlier.  Reacting to a Parisian performance given in 1836, one reviewer described Thalberg this way:  “From under his fingers there escape handfuls of pearls.”  As time passed, the early rapture was revved into a feud, with partisans of Liszt (most notably Berlioz) taking sides against partisans of Thalberg (most notably Mendelssohn).  The whole thing came to a head in a famously reported Paris recital where both Liszt and Thalberg played.  The March 1837 event, where tickets sold for charity at an extravagant $8 apiece, featured each playing one of his famous fantasias:  Liszt played his volcanic transcription of Pacini’s Niobe and Thalberg his subtler version of Rossini’s Moise.  The outcome, though history has sided with Liszt, was judged a close call at the time.  Some viewed the contest as a tie.  A 1958 account by Vera Mikol argued that the winner was, “in the eyes of the ladies, Thalberg; according to the critics, Liszt.”

Thalberg’s reputation, though sustained by sold-out European performances, faded on the continent even as his worldwide reach (with concert tours in Russia, Holland, Spain, and Brazil) expanded.  Robert Schumann was notoriously hostile, and in his writing used the term “a la Thalberg” as a slur for lightweight compositions.  Mendelssohn remained an admirer.  The sparkling romanticism of Thalberg’s compositions made them simultaneously popular (and stylistically imitated) and critically panned.  It is evidence of both impulses that when Jenny Lind launched her American tour, the concert opened with a two-piano performance of Thalberg’s transcription of Norma.

But what made Thalberg extraordinary was not necessarily best displayed in his compositions, and this is undoubtedly why his reputation has so seriously abated.  Even in his day his compositions were often criticized for their repetitive impulse to showcase technique.  The key, for his audiences, was a compositional trick popularized and perhaps even invented by Thalberg and imitated everywhere:  the melodic line was passed from thumb to thumb while the other fingers ranged widely across arpeggios above and below, an effect that made the player sound as if he had three hands.  Audiences were so impressed with this illusion that in some cities they reportedly stood up to get a better glimpse of his hands on the keys.

The key for his admiring critics, meanwhile, lay in his tone and touch.  For Mendelssohn, Thalberg “restores one’s desire for playing and studying as everything really perfect does.”  Ernest Legouve wrote that “Thalberg never pounded.  What constituted his superiority, what made the pleasure of hearing him play a luxury to the ear, was pure tone.  I have never heard such another, so full, so round, so soft, so velvety, so sweet, and still so strong.”  Arthur Pougin, memorializing Thalberg at his death, said it was he “who, for the first time we had seen, made the piano sensitive,” which was to say that in the eyes of other players, he had mastered the art of pressing against the limits of the instrument so as to make it sound most like the singing human voice.  It was this that Thalberg himself sought to highlight when he titled his own piano text L’Art du chant applique au piano, or The Art of Song Applied to the Piano.

Academic debate continues about the role of Thalberg and the other early European virtuosos who toured the States.  Some defend him as representing the necessary first step in tutoring America in musical sophistication, all while softening the more difficult numbers with crowd-pleasing fantasias that classed up songs like “The Last Rose of Summer” and “Home Sweet Home.”  Others render a harsher judgment – Mikol closed her 1958 essay on Thalberg with this highly negative assessment:  “we should not underestimate the part he played a hundred years ago in delaying our musical coming-of-age.”  Part of the sharp discrepancy relates to differing views of the emergence of high culture.  R. Allen Lott’s From Paris to Peoria credits the impresario experience with laying the groundwork for a richer American culture, in which the audience experience of the classical repertoire was “sacralized” over time.  Contra Lawrence Levine and others, who have argued that this period shows how cultural norms were imposed by rich elites as a method of bringing the masses under disciplined control, Lott and Ralph Locke credit not elite control but a widely shared eagerness for intensive aesthetic experiences that transcended class divisions.  Still others point to the deeply gendered responses this sort of musical performance elicited – women were not allowed entry into the evening theatre without a male escort, and so Thalberg and others added afternoon matinees where women could attend unaccompanied.

Today Thalberg is mainly forgotten – my own interest in him came from seeing one of his works performed on campus two months ago by a Mississippi musicologist – but in towns across America he and the other touring virtuosos provoked both class antagonisms and reactions as enthralled as those of worshippers “slain in the spirit,” which makes it difficult to judge either perspective uniquely correct.  In Boston a huge controversy erupted when Thalberg’s manager sought to limit ticket sales to the upper class (by briefly requiring patrons to provide a “correct address”); the papers had a field day denouncing foreign snobbery and defending music for its democratizing potential.

Meanwhile, audiences were enthralled and carried to heights of emotional ecstasy by the concerts themselves; press accounts often noted that listeners wept.  Thalberg managed to achieve this response, amazingly, through pure technique rather than the usual theatrics; as a Boston reviewer put it, there was “no upturning of the eyes or pressing of the hand over the heart as if seized by a sudden cramp in that region, the said motions caused by a sudden fit of thankfulness.”  Others, sometimes in small towns but even in New York City, already the nation’s cultural capital, reacted with disdain, a fact that led one of the city’s preeminent critics to ask, “Why will mere glitter so far outweigh solid gold with the multitude?”  Still others attended not to hear the music but to display their social status.

Such reactions persist to this day in the nation’s symphony halls, but even as audiences reproduce the prejudices of their time, it is hard not to be moved by the more singular reaction of that same New York correspondent who, upon hearing Thalberg play the opening of Beethoven’s Emperor, said that even as “it fell dead upon the audience, …I drank it in as the mown grass does the rain.  A great soul was speaking to mine, and I communed with him.”

SOURCES:  R. Allen Lott, From Paris to Peoria:  How European Piano Virtuosos Brought Classical Music to the American Heartland (Oxford:  Oxford University Press, 2003); Vera Mikol, “The Influence of Sigismund Thalberg on American Musical Taste, 1830-1872,” Proceedings of the American Philosophical Society 102.5 (20 October 1958), pgs. 464-468; Joseph Horowitz, online review of Vera Lawrence’s Strong on Music:  The New York Music Scene in the Days of George Templeton Strong, vol. 3 (Chicago:  University of Chicago Press, 1999), at http://www.josephhorowitz.com; Lawrence Levine, Highbrow/Lowbrow:  The Emergence of Cultural Hierarchy in America (Cambridge, Mass.:  Harvard University Press, 1988); Ralph Locke, “Music Lovers, Patrons, and the ‘Sacralization’ of Culture in America,” 19th-Century Music 17 (Fall 1993), pgs. 149-173; E. Douglas Bomberger, “The Thalberg Effect:  Playing the Violin on the Piano,” Musical Quarterly 75.2 (Summer 1991), pgs. 198-208.

The importance of watching

I’m not quite finished with it yet, but Paul Woodruff’s recent The Necessity of Theatre: The Art of Watching and Being Watched (Oxford:  Oxford University Press, 2008) makes a compelling case for treating theatre as central to the human experience.  Woodruff’s point is not to reiterate the now-familiar claim that theatrical drama mirrors human experience, although I assume he would agree with thinkers like Kenneth Burke (who insisted that theatricality is not a metaphor for human life but that our interactions are fundamentally dramatically charged).  Rather, theatre, which he defines (repeatedly) as “the art by which human beings make or find human action worth watching, in a measured time and place” (18), enacts much of what is basic to human sociability.

Theatre and life alike are about watching, the maintenance of appropriate distance, and the way collective observation validates human interaction (as when public witness validates a marriage ceremony, or when justice, itself animated by witnesses, is made collectively persuasive).

The book is a little frustrating – Woodruff is a philosopher, and the book starts by discovering the river (and making its boldest claim up front) and then guiding the reader through all the connected tributaries, which can be tedious when the journey starts to feel less like a riverboat cruise and more like navigating sandbars.  That is, the project proceeds too fully as a definitional typology of theatre, an approach that performatively contradicts the most important thing about theatre itself:  finding audiences and keeping them interested.  Woodruff also has a tendency to keep announcing how important his claims are:  “Formally, however, I can point out already that [my] definition has an elegance that should delight philosophers trained in the classics” (39).  “This is bold” (67).  “My proposed definition of theatre is loaded” (68).  And so on.

But along the way Woodruff says a lot of interesting things.  Some examples:

•  “Justice needs a witness.  Wherever justice is done in the public eye, there is theatre, and the theatre helps make the justice real” (9).  

•  “People need theatre.  They need it the way they need each other – the way they need to gather, to talk things over, to have stories in common, to share friends and enemies.  They need to watch, together, something human.  Without this…, well, without this we would be a different sort of species.  Theatre is as distinctive of human beings, in my view, as language itself” (11).

•  “Politics needs all of us to be witnesses, if we are to be a democracy and if we are to believe that our politics embody justice.  In democracy, the people hold their leaders accountable, but the people cannot do this if they are kept in the dark.  Leaders who work in closed meetings are darkening the stage of public life and they are threatening justice” (23).

•  “The whole art of theatre is the one we must be able to practice in order to secure our bare, naked cultural survival” (26).

•  “A performance of Antigone has more in common with a football game than it does with a film of Antigone” (44).

I began by cheating, I suppose, by reading the epilogue, where Woodruff notes:  “I do not mean this book to be an answer to Plato and Rousseau…, because I think theatre in our time is not powerful enough to have real enemies.  Theatre does have false friends, however, and they would confine it to a precious realm in the fine arts.  We need to pull theatre away from its false friends, but we have a greater task.  We need to defend theatre against the idea that it is irrelevant, that it is an elitist and a dying art, kept alive by a few cranks in a culture attuned only to film and television.  I want to support the entire boldness of my title:  The Necessity of Theatre” (231).

An approaching Singularity?

When Ray Kurzweil published his bestseller, The Singularity is Near, in 2005, the skeptical response reverberated widely, but his track record of accurate predictions has been uncanny.  In the late 1980s it was Kurzweil who anticipated that a computer would soon be programmed to defeat a human opponent in chess; by 1997 IBM’s Deep Blue was beating Garry Kasparov.  His prediction that within several decades humans will regularly assimilate machines into the body seemed, as Michael Skapinker recently put it, “crazy,” “except that we are already introducing machines into our bodies.  Think of pacemakers – or the procedure for Parkinson’s disease that involves inserting wires into the brain and placing a battery pack in the chest to send electric impulses through them.”

Kurzweil obviously has something more dramatic in mind than pacemakers.  The term singularity describes both the center of a black hole, where the universe’s laws don’t hold, and that turning point in human history when the forward momentum of machine development (evolution?) will have accelerated so quickly as to outpace human brainpower and arguably human control.  For Kurzweil the potential implications are socially and scientifically transformational:  as Skapinker catalogs them, “We will be able to live far longer – long enough to be around for the technological revolution that will enable us to live forever.  We will be able to resist many of the diseases, such as cancer, that plague us now, and ally ourselves with digital versions of ourselves that will become increasingly more intelligent than we are.”

Kurzweil’s positions have attracted admirers and detractors.  Bill Gates seems to be an admirer (Kurzweil is “the best person I know at predicting the future of artificial intelligence”).  Others have criticized the claims as hopelessly exaggerated; Douglas Hofstadter admires elements of the work but has also said it presents something like a mix of fine food and “the craziest sort of dog excrement.”  A particular criticism is how much of Kurzweil’s claim rests on what critics call the “exponential growth fallacy.”  As Paul Davies put it in a review of The Singularity is Near:  “The key point about exponential growth is that it never lasts.  The conditions for runaway expansion are always peculiar and temporary.”  Kurzweil responds that the conditions for a computational explosion are essentially unique; as he put it in an interview:  “what we see actually in these information technologies is that the exponential growth associated with a particular paradigm… may come to an end, but that doesn’t stop the ongoing exponential progression of information technology – it just yields to another paradigm.”  Kurzweil’s projection of the trend lines has him predicting that by 2027, computers will surpass human intelligence, and by 2045 “strictly biological humans won’t be able to keep up” (qtd. in O’Keefe, pg. 62).
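The arithmetic at stake in this dispute is simple, which is part of both its appeal and its peril.  Below is a minimal sketch, not Kurzweil’s actual model, assuming a hypothetical fixed eighteen-month doubling period for computing capacity; it shows how quickly a constant doubling time produces astronomical numbers, which is precisely why Davies insists the exponent cannot last and why Kurzweil must appeal to a succession of paradigms to keep the curve alive.

```python
# A minimal sketch of naive exponential extrapolation -- not Kurzweil's
# actual model.  The 18-month doubling period is an illustrative
# assumption borrowed from popular statements of Moore's law.

def projected_capacity(years_out: float, doubling_years: float = 1.5) -> float:
    """Relative computing capacity after `years_out` years, normalized to 1.0 today."""
    return 2 ** (years_out / doubling_years)

for years in (10, 20, 40):
    print(f"{years:>2} years out: ~{projected_capacity(years):,.0f}x today's capacity")

# Prints roughly 102x, 10,321x, and over 100,000,000x -- the last being
# exactly the kind of runaway figure Davies argues no physical process
# sustains indefinitely.
```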

Now Kurzweil has been named chancellor of a new Singularity University, coordinated by a partnership between NASA and Google.  The idea is simultaneously bizarre and compelling.  The institute is roughly modeled on the International Space University in Strasbourg, where the idea is to bring together Big Thinkers who can, by their interdisciplinary conversations and collaboration, tackle the impossible questions.  One wonders whether the main outcome will be real research or armchair metaphysical speculation – time will tell, of course.  NASA’s role seems limited to letting the “university” rent space at its Ames Research Center facility at Moffett Field in California.  The money comes from Peter Diamandis (X Prize Foundation chair), Google co-founder Larry Page, Moses Znaimer (the media impresario), and tuition revenue (the nine-week program is charging $25,000, scholarships available).  On the tuition front the odds seem promising – 600 potential students applied in only two days.

The conceptual issues surrounding talk of a Singularity go right to the heart of the humanistic disciplines, starting with the way it complicates anew what one means by the very term human.  The Kurzweil proposition forces the issue by postulating that the exponential rate of information growth and processing capacity will finally result in a transformational break.  Consider the capacity of human beings to stay abreast of all human knowledge in, say, the 13th century, when Europe’s largest library (housed at the Sorbonne) held only 1,338 volumes, and contrast that with the difficulty one would encounter today in simply keeping up with research on, say, William Shakespeare or Abraham Lincoln:  the age-old humanistic effort to induce practices of close reading and thoughtful contemplation can seem anachronistically naive.

One interesting approach for navigating these issues is suggested in a 2007 essay by Mikhail Epstein.  Epstein suggests that the main issue for the humanities lies less in the sheer quantity of information and its potentially infinite trajectory (where, as Kurzweil has implied, an ever-expanding computational mind finally brings order to the Universe) than in the already evident mismatch between the finite human mind and the accumulated informational inheritance of humanity.  Human beings live for a short period of time, and within the limited timeline of even a well-lived life, the amount of information one can absorb and put to good use will always be easily swamped by the accumulated knowledge of the centuries.  This is a problem, moreover, that worsens with each generation.  Epstein argues that the result is an ongoing collective trauma, first explained by Marxist theory as inducing both vertigo and alienation, then by the existentialists as an inevitability of the human condition, and now by poststructuralists who (and Epstein concedes this is an oversimplification) take reality itself “as delusional, fabricated, or infinitely deferred” (19).  Epstein sees all this as evidencing the traumatizing incapacity of humans to comprehend in any detailed way their own collective history or thought.  The postmodern sensibility is revealed in such aesthetic traditions as Russian conceptualism, “which from the 1970s to the 1990s was occupied with cliches of totalitarian ideology,” and which “surfaced in the poetry and visual art of Russian postmodernism” in ways “insistently mechanical, distant, and insensitive” (21).  There and elsewhere, “the senses are overwhelmed with signs and images, but the intellect no longer admits and processes them” (22).

The problem to which Epstein calls attention – the growing gap between a given human and the totality of humanity – is not necessarily solved by the now well-established traditions that have problematized the Enlightenment sense of a sovereign human.  In Epstein’s estimation, the now-pluralized sense of the human condition brought into being by multiculturalism has only accentuated the wider social trends toward particularization and hyper-specialization:  the problem is that “individuals will continue to diversify and specialize:  they will narrow their scope until the words humans and humanity have almost nothing in common” (27).

The wider work on transhumanism and cyborg bodies reflects a longer tradition of engagement with the challenge posed by technological transformation and the possibilities it presents for physical reinvention.  At its best, and in contrast to the more culturally salient cyborg fantasies enacted by Star Trek and the Terminator movies, this work refuses the utopian insistence in some of the popular accounts that technology will fully eradicate disease, environmental risk, war, and death itself.  This can be accomplished by a range of strategies, one of which is to call attention to the essentially religious impulses in the work, in line with long-standing traditions of intellectual utopianism that imagine wholesale transformation as an object to be greatly desired.  James Carey used to refer to America’s “secular religiosity,” and in doing so followed Lewis Mumford’s critique of the nation’s “mechano-idolatry” (qtd. in Dinerstein, pg. 569).  Among the cautionary lessons of such historical contextualization is the reminder of how often thinkers like Kurzweil present their liberatory and also monstrous fantasies as inevitabilities simply to be managed in the name of human betterment.

SOURCES:  Michael Skapinker, “Humanity 2.0:  Downsides of the Upgrade,” Financial Times, 10 February 2009, pg. 11; Mikhail Epstein, “Between Humanity and Human Beings:  Information Trauma and the Evolution of the Species,” Common Knowledge 13.1 (2007), pgs. 18-32; Paul Davies, “When Computers Take Over:  What If the Current Exponential Increase in Information-Processing Power Could Continue Unabated,” Nature 440 (23 March 2006); Brian O’Keefe, “Check One:  __ The Smartest, or __ The Nuttiest Futurist on Earth,” Fortune, 14 May 2007, pgs. 60-69; Myra Seaman, “Becoming More (Than) Human:  Affective Posthumanisms, Past and Future,” Journal of Narrative Theory 37.2 (Summer 2007), pgs. 246-275; Joel Dinerstein, “Technology and Its Discontents:  On the Verge of the Posthuman,” American Quarterly (2006), pgs. 569-595.
