Amateur Humanist

Start here

Whale communication

Every couple of weeks a grants database I subscribe to sends me alerts about sponsored projects relating to animal communication.  I am always a little surprised to see these alerts, since the field of speech communication in which I'm trained (in contrast to the science of language development) engages the topic only peripherally, either treating it as a matter of passing relevance in introductory texts for students who inevitably raise the subject, or in fully anecdotal ways.  A famous essay by the literary critic Kenneth Burke thus distinguishes animal communication as motion (his term for instinctive and unthinking movement) from human action; although ideological convictions can make us instinctively hop this way or that, Burke argued that our capacity for intersubjective communication enables thoughtful and motivated acts.  As influential as Burke is for scholars of rhetoric, it has never occurred to me that he had a sophisticated understanding of, say, primate interaction, and of course his work in the 1950s and 1960s predated the most significant anthropological and biological research on animal sociability, some of which comes very close to suggesting, or even proving beyond doubt, complex structures of communicated meaning.

I was reminded of all this, and of my own ignorance, while watching a PBS Nature documentary aired the other night in Atlanta, which told the story of animal caretakers who have rescued aging chimpanzees kept in lifelong captivity, either because they were raised as circus performers or as subjects of medical research.  Many of the chimps, who can live to 35 years or older, carry infections of HIV or hepatitis or other illnesses, administered in the hope of discovering cures for humans.  One common characteristic of these animals is that they have lived so long in cages and on concrete pads that they may fear a return to more natural habitats, although that rightly does not prevent advocates for their humane treatment from endeavoring to restore them to such living conditions.  One of the sad moments in the documentary showed a chimp who recognized one of his trainers after not having seen him for more than a quarter century; on renewed contact the old chimp seemed to recognize his former captor immediately, and with affection.  The professionals who cared for the chimp learned that he liked ice cream cones, and so for the last couple weeks of his life they brought him one every day, which he would happily devour, not by swallowing it whole but by licking it and nibbling through the cone as any human being might.  One of the happiest moments showed a chimp released at last from concrete and caged confinement into an almost open park – he raced off the concrete pad onto the grass and scrambled to the top of the tallest tree on the property.  The good people who fought to create such a habitat wept tears of joy.  A small victory for primate rights, to be sure, or perhaps simply for the humane treatment of animals who have been, let us say it, tortured.
As I watched all this I realized I was as moved by the old chimp eating ice cream in ways that made him seem human as I was by the other chimp's sprinting return to surroundings harder to recognize as human.

We’ve all heard the statistic that less than one percent of genetic separation distinguishes humans from bonobos, gorillas, or chimpanzees.  And although the genetic distance between humans and dolphins is just a bit greater (a new book by Maddalena Bearzi and Craig Stanford notes that we split away from dolphins more than 100 million years ago), dolphins and whales are also astonishingly clever, in part because their neocortex is considerably larger than ours.  The point of the book (Beautiful Minds: The Parallel Lives of Great Apes and Dolphins, Harvard UP), as the title and the authors’ credentials imply (Bearzi studies dolphins, Stanford primates), is to highlight the affinities between these animals (and by doing so to implicitly close the gap we typically perceive between them and us):  all “spend their lives navigating a complicated social environment.  An ability to learn quickly and make subtle choices lets individuals track the moods, intentions and shifting dominance ranks of others with an eye to increasing their own position within the social hierarchy…[D]olphins and apes are assessing rivals and friends, inventing distractions, employing bribes, and forming alliances to get [food and mates], relying at least as much on cunning as on brute strength” (Chadwick).

The point of all this is not to establish or prove to skeptics that animals are capable of communication – these authors take that conclusion as already established; the real questions of interest have more to do with communicative variation among species.  While debate remains over the extent to which non-humans possess the capacity to socially transmit learned information, there is by now no doubt that “dolphins and great apes grasp language to some extent.  [Bearzi and Stanford] describe dolphins responding readily to commands that require an understanding of the meaning and order of words – i.e., syntax – while studiously ignoring researchers who issued illogical orders” (Chadwick).

David Rothenberg has been working on the prospects for interspecies communication for some time now, and his 2005 book Why Birds Sing recounts his efforts to create duets with birds.  Birdsong is interesting because, as Chadwick (and of course Rothenberg) notes, birds sing far more than is biologically necessary – Rothenberg is thus interested to discover whether birds sing as a form of entertainment, or because it might bring them joy.  With the discovery that whales make elaborate music beneath the waves, efforts were made to decode whalesong, in part motivated by the thought that we could perhaps talk to each other.  As Chadwick notes:  “Enthusiasts grabbed flutes and lutes and boated out to serenade minds in the waters, intending to open up lines of contact.  The typical reaction from the whales was avoidance, and the dream evaporated.  Rothenberg thinks it may be time to try again.”  His new book, Thousand Mile Song:  Whale Music in a Sea of Sound (Basic Books), recounts efforts to make recognizable contact with whales both in captivity and in the wild.  The book pays special attention to the clicks and whistle-like sounds made by sperm whales, and maps out the different patterns that distinguish whale families.

In one of the Star Trek movies, Spock mind-melds with a humpback whale, who is able to seamlessly bring Spock up to date on how ocean pollution has eviscerated habitats.  The film, of course, is predicated on a hopeless but lingering cultural fantasy that holds out the possibility of mental transference and total communicative transparency, a dream no less utopian when considered in the human context than in cross-species encounters.  But I wish Rothenberg well anyway and understand his work as deeply serious.  If research only gets as far as decoding signs of distress or hunger in other species, then that alone will have justified the effort.

SOURCES:  Douglas H. Chadwick, “Cool sounds for killer whales,” Times Literary Supplement, 14 November 2008, p. 26; David Rothenberg, “Whale Music: Anatomy of an Interspecies Duet,” Leonardo Music Journal 18 (2008): pp. 47-53.

The limits of global connectivity

The idea that everyone and everything is networked is an ancient one but now a thoroughly pervasive way of describing the world; it is a commonplace that influence, wealth, and ideas shape the unfolding of human history via increasingly dispersed and networked arrangements.  Two essays in the new (January/February 2009) Foreign Affairs, written from drastically different perspectives, offer accounts opposed not because one lacks optimism (to the contrary, both are exceptionally hopeful for America’s prospects) but because one takes measure of the opportunities created by a globally networked society while the other emphasizes its perils.  Robert Gates, in an essay presumably penned before he knew he would continue working as head of the Pentagon, describes the security challenges posed by a “new age” in which unconventional insurgencies will require both military and diplomatic flexibility.  And, in an essay that clearly sees the cup as half full, Anne-Marie Slaughter, dean of the Woodrow Wilson School at Princeton, argues for the wonderful benefits that can accrue to a smart America in a “networked century.”  Where Gates emphasizes how the globalized context enables instantaneous and network-destroying threats, Slaughter is more attuned to the ways a networked world creates possibilities for collaboration in which the United States can assert its more hopeful cultural worldviews without shouldering the full price or institutional risks.

Many recent events have evoked the downsides of globalization, where small coordinated attacks can cripple or derail whole societies, even as globalized media coverage brings small acts of resistance or terror to worldwide attention.  The attacks on Mumbai were a vivid reminder that the potential targets of attack (in this case tourist and religious centers) are almost innumerable; the same outcome might have been attained by attacks on any of the world’s major hotels, shopping centers, schools, or churches.  The recent arrest of the Russian arms dealer Viktor Bout has also brought into view the shadow circuits of arms selling and the secret networks of commerce that often allow rogue governments to purchase almost anything they can afford.  Bout was said to have sold military hardware to everyone from the Taliban to Latin American revolutionaries to Hizbullah, and perhaps even to rebels in Chechnya.

Or one might also take note of the alarming uptick in documented piracy over the past twenty years.  The recent episodes off the African coast, where Somali pirates have been attacking cruise ships and cargo vessels (about a week ago the United Nations Security Council extended its authorization allowing countries to enter Somali waters to intercept pirates), also show how globalization provides ready resources for nongovernmental organizations (whether legitimate or criminal) as much as for governments.  While the reported number of pirate attacks has been declining since 2003 (445 in that year, 263 in 2007), the enterprise is driven by the anticipation of pure profit, pirates regularly demand ransom payments, and it looks like the long-term decrease may be reversed when the final statistics are reported for 2008.  A report issued recently by Chatham House argued that the Somali pirates are clearly aiming to raise cash for attacks on the Somali government and on behalf of some radical Islamic groups.  And a report from the London-based International Maritime Bureau says the average demanded ransom is in the one-million-dollar range.

Piracy is described by Homer in both the Iliad and the Odyssey, and even then the difficulty of distinguishing naval heroes from villains was well recognized; Homer said it was hard to tell them apart because “both set off in their long ships to distant shores to plunder and kill.  The difference… seems only to be their god-given fate.”  Today those who think about globalization might, following Tony Blair and Bill Clinton and the broader discourses of neoliberalism, note that all this simply illustrates the inevitability of the opportunities created by globalization, which in turn suggests that good governance should aim not at ending globalization but at channeling it in more socially beneficial and just ways.  But the powerful structures of opportunity and potential profit created by global financial networks should also serve as a cautionary note to those inclined to see international regulation as easily set in motion.  The slow emergence of such international consensus offers cold comfort to the victims of Viktor Bout’s arms sales or of the Somali pirates, who move, watched but mainly unimpeded.

Harvard’s Arts Task Force

This past Wednesday a Harvard task force appointed by president Drew Gilpin Faust released a report advocating an expanded role for the arts there.  The report is interesting in large part because it calls attention to a circumstance common on many campuses, where the arts are ubiquitous – theatrical productions and exhibitions running all the time – but marginalized within the work of the modern research university, relegated to the extracurricular and programmatic sidelines.  While Harvard’s circumstances are obviously not generalizable given its tremendous wealth and status as the nation’s leading private university, the task force led by the Shakespeare scholar Stephen Greenblatt makes a compelling claim for artistic centrality.

To those who regularly teach the creative arts none of the main arguments will seem new, but they are eloquently put, and I think well suited to the audiences who control the collective resources of comprehensive universities and who tend, even if only subliminally, to discount the arts (and for that matter the humanities) as doing mainly peripheral or service work, while the real useful knowledge emerging from college campuses is made in science laboratories and professional schools.  In addressing such a worldview, and it is pervasive, the report defends the intellectual practices of artistic invention as wholly necessary to intellectual work.  As the report argues:

The quarantining of arts practice in the sphere of the extra-curricular creates a false dichotomy.  It leads students (and, on occasion, their teachers) to assume falsely that the qualities of successful art-making – the breaking apart of the carapace of routine, the empowerment of the imagination, the careful selection and organization of elements that contribute to an overarching, coherent design, the rigorous elimination of all that does not contribute to this design, the achievement of a deepened intelligibility in the external and internal world – do not belong in the work they are assigned to undertake in the curriculum…  On the contrary, the forms of thinking inculcated in art training are valuable both in themselves and in what they help to enhance elsewhere in the curriculum.  These include the development of craft, the sharpening of focus and concentration, and the empowerment of the imagination.  Art-making is an expressive practice:  it nurtures intense alertness to the intellectual and emotional resources of the human means of communication, in all their complexity.  It requires both acute observation and critical reflection.  This self-reflection – the drive to interrogate conventions, displace genres, challenge inherited codes of meaning – encourages risk-taking and an ability to endure repeated failures.  It fosters both intelligent imitation and a desire to conceive and bring forth what has hitherto been unimaginable.

The report also evokes the increasingly accepted claim that the most demanding intellectual problems require interdisciplinary approaches, and the pedagogical insistence that students learn best by making rather than by hearing, both arguments mobilized to make the case that training in the arts is not just a luxurious supplement but a necessary ingredient of serious scholarly endeavor.  Although the examples are of necessity anecdotal (and for obvious reasons limited to Harvard alumni), cases are brought forward where distinguished work was enabled by exposure to the arts:  T.S. Eliot, W.E.B. Du Bois and others are mentioned as having challenged dominant paradigms because of their involvement with a range of disciplines including the arts.

When the arts are mainly championed as extracurricular events in which well-rounded individuals will participate but not specialize, another danger arises, and “a quite specific view of the arts” is encouraged:  “Art, in this view, is a thing entirely bound up with pleasure.  Purely voluntary, it stands apart from the sphere of obligation, high seriousness, and professional training.”  And when the arts are “deemed… to be extracurricular, many students remain oblivious to the hard work – the careful training, perception, and intelligence – that the arts require.  They know that writing essays is a skilled and time-consuming labor.  They recognize that problem sets in math and science are meant to be difficult.  But ask them to photograph a landscape, compose a short story, or direct a scene rather than write an analytical essay and they will almost universally assume that the exercise will be quickly and easily dispatched.  The problem is not that they believe art-making is trivial but rather that they believe that talent alone, and not thought or diligence, will determine the outcome.”

And yet the report also makes a sustained case for the role the arts can play in nurturing happiness, by which is meant not the fleeting delight that comes from a moving sonata or an entertaining comedy but “something more than the acquisition of technical mastery, something beyond the amassing and exchange of information necessary for the advancement of human learning” – a happiness that “entails an intensified participation in the natural and human realms, a vital union of spirit and matter at once facilitated and symbolized by works of art.”

The report obviously moves rather quickly to Harvard-specific recommendations, including a call for an expanded arts presence on the new Allston campus, concrete support for artists-in-residence, and new graduate programs in the arts.  Some of these will work elsewhere and some won’t.  But even at the level of specifics, it is hard to imagine that the task force’s calls – for using graduate arts programs (especially new MFA degree programs) to leverage artistic excellence, for creating an interdisciplinary artistic Hothouse where new collaborations might be nurtured, and for treating all campus spaces as potential sites of exhibition and aesthetic practice – would not be well justified on any comprehensive research or liberal arts campus.

These arguments are made with rhetorical sensitivity, offered in a way I think unlikely to offend either artists who might be inclined to see such a case as slighting their hard work, or non-artists whose academic positions would less typically have them thinking seriously about the role art might play.  All the more reason that the report should be widely read and its central claims broadly deployed.

Ranking the world’s best orchestras

Thinking I would become better informed about the world’s classical music scene, I recently subscribed to the magazine Gramophone, and the very first issue I received, December 2008, announces on its cover a ranking of “the world’s greatest orchestras.”  The ranking has been deeply controversial, especially in Philadelphia, whose orchestra was omitted.  Here is the list:

1.  Royal Concertgebouw
2.  Berlin Philharmonic
3.  Vienna Philharmonic
4.  London Symphony Orchestra
5.  Chicago Symphony Orchestra
6.  Bavarian Radio Symphony Orchestra
7.  Cleveland Orchestra
8.  Los Angeles Philharmonic
9.  Budapest Festival Orchestra
10.  Dresden Staatskapelle
11.  Boston Symphony Orchestra
12.  New York Philharmonic
13.  San Francisco Symphony Orchestra
14.  Mariinsky Theatre Orchestra
15.  Russian National Orchestra
16.  St. Petersburg Philharmonic
17.  Leipzig Gewandhaus
18.  Metropolitan Opera Orchestra
19.  Saito Kinen Orchestra
20.  Czech Philharmonic

The insult to Philadelphia is especially sharp given that the Orchestra plays in one of the most magnificent symphony halls in the world (the gorgeous cello-shaped Kimmel Center for the Performing Arts), and because Gramophone listed them in a box headlined “Past Glories,” along with the NBC Symphony Orchestra, which was disbanded decades ago.  (The Gramophone editor did email the Philadelphia Inquirer to note that Philadelphia had made the top thirty.)  Some have wondered whether the critics are still mad that Philadelphia dumped Christoph Eschenbach.

On first opening the article, I thought (fleetingly) that I might see the Atlanta Symphony Orchestra listed, if not in the top twenty then maybe as an up-and-comer – to my novice ear their playing is crisper and their programming more wide-ranging and interesting under Robert Spano – but no such luck.  And how could I defend such a judgment anyway, having never heard most of the orchestras that made the final ranking?

The methodology of the list, which no matter how devised would be impossible to defend, was to have a small number of critics (eleven all told) consider live performance, recorded output, community contributions, and “the ability to maintain iconic status in an increasingly competitive contemporary climate,” whatever that means.  As one might expect, this procedure has many commentators scratching their heads.  Some have articulated alternative criteria, such as the quality of the conductor (certainly the symbiotic relationship between conductor and ensemble cannot be easily reduced to a ranking of this sort).  One blogger has noted that, given how unlikely it is that any of the panel had seen live performances by all these organizations, the better tack would have been a ranking by widely traveled conductors, who presumably have a better comparative sense of talent.

Does the surprisingly high ranking for the LSO reflect the fact that Gramophone is produced in Great Britain?  Does it matter that the Mariinsky and Vienna and Metropolitan Opera ensembles mostly play in the pit?  Doesn’t critical buzz lag actual quality of playing, and shouldn’t that be considered?  Should the strong recording histories of some orchestras necessarily fortify their place on the list when others may have been arbitrarily constrained?  One commentator noted that Berlin under Karajan got recorded every time they sneezed, whereas New York was (arguably) under-recorded in the same period for reasons beyond its control (union contracts, etc.).  Shouldn’t it matter that some groups will be fabulous playing Bruckner and lousy playing Wagner?  How should one weigh “best playing” against “most interesting programs”?  Don’t reviewing critics get ideas in their heads that sediment and are difficult to change even when the facts on the ground are fast-evolving?  And given the inevitable variability in repertoire, how can one plausibly make apples-and-oranges comparisons with any validity?  As the London Telegraph (November 26) put it,

The exercise is fundamentally preposterous, not because the placings might trigger controversy but because there are no such absolutes in music.  Asked to name the eighth “best” singer in the world or the 15th “best” violinist, we might all be hard pressed.  The same goes for orchestras, where there are considerably more variables to take into account.  Different traditions, different repertoires, halls and conductors all have an impact.  No matter how scientifically this poll was conducted, or who was involved in the voting, it bypasses the fact that the quality to celebrate in orchestras is not their top-twenty status but their diversity and the individual attributes they might bring to the performance of music.

Considering the number one orchestra, which is without doubt an incredible ensemble, brings some of these concerns into sharp relief:  Mariss Jansons has only been conducting in Amsterdam since 2004, when he replaced Riccardo Chailly, who started out highly controversial (though the unanimous pick of the players, he was their first-ever non-Dutch conductor) and was criticized early on by some for wrecking the Concertgebouw sound, though he gradually gained fans and won respect.  So does this rating reflect true world’s-best artistry, or Jansons’ conducting honeymoon, or the aftermath of the Chailly age?

The classical music blogger for the Guardian (UK), Tom Service, asks:  “Is the Dresden Staatskapelle really almost twice as good as the Leipzig Gewandhaus?  Should the New York Philharmonic be more highly ranked than the San Francisco Symphony when Michael Tilson Thomas’s reign in San Francisco has been infinitely more interesting than Lorin Maazel’s at Lincoln Centre?”

And so all seem agreed that the ranking is an absurdity, yes?  But the impulse to rank (which will undoubtedly move a lot of Gramophone issues) is as hard to resist for editors as it is for those recognized – you can bet the Dutch orchestra put atop the list will forever refer to itself as “having been named the world’s greatest orchestra by the authoritative publication Gramophone.”

Cosmetic surgery and ugly ducklings

A recent issue of Configurations (vol. 15, 2007, pp. 1-99) focuses on plastic-surgery transformation television shows, especially Fox’s The Swan.  Introducing the essays that follow, Bernadette Wegenstein cites some startling statistics:  (a) since 1997 the number of plastic surgery procedures in the United States has increased 446 percent; (b) in 2006, 7,915 Botox injections and 16,477 rhinoplasties were performed on adolescents; and (c) in 2006, 3,181,592 Botox injections were performed overall.  Much of this jump corresponds to the popularity of the makeover shows – ABC’s Extreme Makeover (2002), FX’s Nip/Tuck (2003), E!’s Dr. 90210, MTV’s I Want a Famous Face, and Fox’s The Swan (the last three premiered in 2004) – and in 2004, the peak year for these shows, surgeries increased by 44%.  In The Swan’s third season, the producers received 500,000 applications from people wanting to be selected.

The Swan receives most of the attention in this collection, I suspect because it connects to a fairy tale we all know so well (whereas the MTV show is just creepy, and watching someone who wants to look exactly like Cher feels like voyeurism).  The childhood fantasy that one is trapped in a life one finds miserable, and that someday someone will come along to acknowledge the awful misrecognition that produced such a trap (babies switched at birth!  you are really a princess from Monaco! and so on), is among the most influential in our culture.  It helps explain everything from the way our culture fetishizes superheroes (whose true fantastic nature is obscured from all but a few confidants in on the secret) to the still-popular Hollywood/modeling-agency fantasy that fame and fortune arrive when one is discovered serving cheeseburgers by an agent who sees the true Brad Pitt under all the acne.  These myths imply that you don’t really have to do anything to be fabulous except wait around to be discovered.

There is nothing wrong with wanting to be a better person, and as these essays note, many of those appearing on these shows are clearly working through issues of severe trauma and disparagement, and one can’t help but want to see their lives improved (part of the success of the shows is how they position audience members as supportive friends).  Personal maturation is in part achieved by bumping up against one’s physical and intellectual limits and struggling to become a better person as those constraints are undone or acknowledged, facing inevitable loss and joy to come to terms with what one can become.  But of course the same shows perpetuate some dangerous mythologies:  that external physical transformation can do anything more than briefly forestall the normal mechanisms of aging and deterioration, that quick and intensive periods of professional intervention can correct years of sedimented bad habits, or that one’s true individuality is best attained by reshaping one’s body under the scalpel to look like a million others or some mythical collective ideal type.  It doesn’t help that so often the transformations are framed as the outcome of necessary humiliation and overhyped emotionalism.  Wegenstein is on target when she asks whether “the desires on display in our makeover culture represent ‘the best or the worst of our nature,’” and the essays do a good job of exploring both sides – Pamela Oresan-Weine argues, for example, that while such TV shows send the wrong message to those fantasizing about, say, a nose job as the cure to all that ails them, they may send a more beneficial message to wider audiences, which are reminded “that one’s true nature may not, at first, be easily recognized by self or other, but that one’s value will become apparent if one endures.”  Transformation shows can be a lesson in tolerance.

I have education on my mind as I write this, both because I work in a university and because just this morning I read another review of the new book by Charles Murray (author of The Bell Curve), whose main claim is that the problem with American higher education is that, in the name of total transformation, too many kids who lack the aptitude to succeed are lured into colleges and universities where they have no business being.  As I read about the strengths and weaknesses of total-transformation TV shows, the parallels to education emerged in interesting ways.  When one of The Swan’s analysts calls positive attention to the fact that contestants succeed because they “are surrounded by a panel of experts who hold them in mind, think about what they need, what is best for them,” it is hard not to think about the similar role teachers can play.  And when that same author argues that personal maturation is partly the result of youthful narcissism coming to terms with one’s limits, any teacher will recognize the process from having watched students struggle to learn with confidence.  Although Stanley Fish probably doesn’t agree with Murray on anything, his new book also warns against the dangers produced by overclaiming the benefits of education.

Even as I kept returning to the analogy between transformational cosmetic surgery and transformational education, I chafed against it, since I tend to hate what those shows implicitly promise but love the promises of education.  After all, reality shows lie by implying that transformation is easy (most of the difficult and long process of weight loss and nutritional change is kept off camera).  Then again, riding the Atlanta subway this morning, I saw a large advertisement for one of the city’s best business schools implying that wealth is only six college courses away.  And so one wonders.  Many college teachers I know have grown concerned with the “entitlement culture” that dominates their students’ mindset, where students seem to feel they deserve an “A” because of how hard they worked on an assignment (regardless of the quality of the work) or special consideration just because of who they are and how busy they are otherwise.  All kinds of culprits are named when I hear this complaint, but to some extent the national fantasy that college serves as the final validating point of entry into a life of fame and fortune (preached at almost every “follow your bliss” commencement ceremony in America and in every university promotional flyer) likely also plays a part, despite the essential truthfulness in the advertisements.

Ultimately the thought experiment of contrasting Barbie-doll surgery TV shows to the experience of, say, the typical public university, leads also to important distinctions, and my point is not to disparage education by claiming that it is no more or less transformational than a breast augmentation operation.  Cosmetic surgery reshapes the body, education the mind, and the benefits from the latter are usually more lasting.  Surgery forces the body into ever tightening constraints (as proved, I think, by that woman who has had more than twenty operations to look like Loni Anderson but now looks like a trauma survivor), whereas education opens possibilities and enlarges identity.  But the dangers of one are also to some extent the dangers of the other, and the comparison is a useful reminder that education is not easy, transformation takes real and extended work, wisdom is both a function of surpassing one’s limits and also recognizing them, and that personal change is never purely individualistic bootstrap work, but relies on a network of support (teachers, coaches, families) available to both recognize talent and support its maturation when the hard work seems hardly worth the effort.

On guilty pleasures

In Atlanta they have started to play wall-to-wall Christmas music on some of the radio stations, which in general catches me too early since thirty days of “Grandma Got Run Over By a Reindeer” can yank the Christmas spirit out of just about anyone.  But some of the songs I’ve heard clearly fall into the category of guilty pleasures, by which I mean those I’m usually ashamed to share with others.

Here is a wholly random example that I’m trying to sort out right now:  I am totally wrapped up in the Mormon Tabernacle Choir version of a Christmas hymn called “What Shall We Give to the Babe in the Manger.”  I heard it somewhere last week and then had to research to figure out the song title, and then download it from iTunes, and now can’t get it out of my head.  I’m moved by it, in part I know simply because of the early spirit of the season, and also because the lyrics, basic though they are, are gentle and rather wistful.  I guess I shouldn’t be the least bit embarrassed by this – millions of people around the world are taken with MTC music, not the least among them the seven million Mormons in the United States – but I am, if just a little.

I think this is true of everyone, especially those in the iTunes and MP3 and YouTube universe, but my popular musical habits are lately driven by one kitschy song after the next.  Last year my Christmas music pathology was the Michael Crawford version of “O Holy Night,” which many people found creepy because the idea of the Phantom of the Opera singing about Jesus struck them as just, well, plain wrong, but I found it rather spectacular.  The year before it was the Century Men singing “Oh Beautiful Child of Bethlehem,” which dares to pair the all-male choral number with a hammered dulcimer backdrop that makes it sound, gasp!, downright Appalachian.

More broadly, in the last six months alone I’ve worked through (and am now mainly over – see, there’s the denial part) fast obsessions with Shirley Bassey (love the Living Tree and that bad Moonraker theme), the baton-twirling kid who almost won the British version of American Idol (the YouTube video of him on stage with his little grandmother watching from the side stage, which was admittedly edited to achieve a reaction, choked me up every time), Ben Folds singing The Luckiest, Marvin Gaye doing the Star Spangled Banner at some basketball game twenty years ago, and the exuberant debut video of Riverdance at the Eurovision contest years ago (especially the very end where the performers can’t seem to believe the reaction they have elicited).  But I can’t blame it all on new media or the YouTube video archive:  my guilty music pleasures – which include Lynn Anderson singing Rocky Top, the Bellamy Brothers’ Let Your Love Flow, Sniff ’n’ the Tears’ Driver’s Seat, Dolly Parton’s Hard Candy Christmas, the O’Jays doing Love Train, that sappy Andy Williams song Dear Heart, Nina Simone covering the George Harrison song Here Comes the Sun, Dionne Warwick (before she went all occult) doing almost anything – have been going on for years.

And how am I supposed to defend any of this?  Please help me.

Some I know mask their true guilty pleasures – the number of people who answer questions about their “favorite music” by waxing on about Mozart and Mahler surely exceeds the number who actually have their car radios set to the classical music station – or wear their shameful preferences right there on their sleeve.  But when millions of Americans trumpet their love for achy-breaky country music and it has become the most popular radio format in America, can one really feel guilty about it?  Dollywood and NASCAR and Graceland and Desperate Housewives are at some level pure kitsch, but when millions watch or attend and whole cottage industries exist to serve the needs of their fans, these institutions drop off the true guilty pleasure list.

Some portion of anyone’s guilty pleasures derive from events that transport them back to childhood, and we’ve all seen people defending the indefensible (like Baby Got Back! or Toni Basil’s Oh Mickey) with a shrug and a Sorry, high school favorite!  A recent entry in this genre was written into a newspaper column by Melissa Ruggieri:  “Not that I’ve ever hidden my devotion to Duran Duran and Bon Jovi.  Look, I grew up with them.  They were my teen crushes” (can you hear the defensiveness?).  I suspect this throwback logic explains my truly excessive love of Cat Stevens’ Morning Has Broken (it meant a lot to me in college), Neil Sedaka’s Laughter in the Rain (every time I hear it I remember what high school felt like, even though none of my high school peers would have been caught dead listening to it), and Supertramp’s School and Goodbye Stranger (when I hear those I instantly remember my senior year).  Even more than those:  Al Stewart’s Year of the Cat and Crosby/Stills/Nash/Young’s Teach Your Children (I think I was the only American who liked Walter Mondale more after he soundtracked a campaign-closing TV ad to that song), and, get ready to cringe, that Harpers Bizarre version of The 59th Street Bridge Song (you know, “feelin’ groovy”).

And maybe most of all: Andy Williams’ Moon River, which I can’t hear without being instantly transported into the wistfulness of Breakfast at Tiffany’s.  I imagine millions of others must feel the same way:  why else would it have been featured as a key soundtrack moment in Sex and the City (in the episode where Big leaves NYC for wine country it plays in his empty apartment) or in that aching scene from the HBO version of Angels in America where a relationship traumatically ended too soon is remembered as an imagined dance to that song?  So perhaps I get a pass on that one.

But I can’t blame it all on my childhood and neither can you.  Some guilty pleasures just have to be left alone for what they are:  pure moments of pathos that caught you short at the right (or wrong) time.  And this is why they are so hard to admit, since it is embarrassing to confess that one actually fell for the shameless manipulation of emotion in that movie or TV show or Obama (or, maybe for you, Palin) speech.  Or to admit that you secretly enjoyed that gruesome moment when Anthony Hopkins’ Hannibal dines on Ray Liotta’s brain.  As Bill Everhart has put it, “a guilty pleasure can be a movie that is so bad it’s good, so unapologetically maudlin, violent, shameless, or ridiculous that you can’t help but love it.”  For me those would include All That Jazz!, Moulin Rouge, The Natural, Field of Dreams, and yes, Dr. Zhivago and The Music Man and The Sound of Music (confession:  I want to go to one of those sing-along Sound of Music screenings).

Some of the explanation for guilty pleasures is contextual:  in some neighborhoods a trip to Disney World is the dream vacation of a lifetime, while in others it signifies a secret shame.  One gets some sense of this from the responses volunteered by academics answering a “guilty pleasures” inquiry from the Chronicle of Higher Education:  professors named People magazine, karaoke, Texas hold ‘em, Jimmy Buffett, cheesy historical fiction, comic books, and Barry Manilow.  Imagine the shock in the faculty lounge!  But in Dallas or Philly or Oakland or Chicago and outside ivy-covered walls, I’m not sure any of these would raise any concerns.  Well, except maybe Barry Manilow.

Others are things from which we derive pleasure even though we know they are bad for us, like Krispy Kreme donuts or clove cigarettes or shopping on QVC.  An entire scholarly cottage industry has arisen to explain why so many women end up compulsively hooked on media images that create impossible weight and beauty standards, and on romance novels that even today celebrate the idea of being swept away by a Prince Charming bad boy figure (an example of this research is a recent essay by Maxine Leeds Craig).  I know I’m not supposed to like the endless-one-take “A Pretty Girl is Like a Melody” sequence in The Great Ziegfeld (1936), but I’m transfixed when I see it.  Maybe the parallel event for you is Titanic.  Henry Jenkins (who has just announced his intention to leave MIT for the University of Southern California, a fact I know because of the guilty pleasure I derive from academic gossip) has written a book, The Wow Climax: Tracing the Emotional Impact of Popular Culture (NYU Press, 2007), that spends considerable effort accounting for this impact, often in surprisingly sympathetic ways (especially given how often popular culture is disparaged even within cultural studies, which tends to dismiss it as reproducing capitalism and racism and sexism).

A good test of one’s genuine and unfabricated guilty pleasures is to uncover the answer to these questions:  What things in the world, because they move you (perhaps even to tears), do you insist on repeatedly experiencing alone?  Or, since tears are not the only measure of strong reaction:  what things rev up or inspire you (and thus bring you back to the experience time after time) that you would never confess to anyone else?  

Go ahead, name them out loud:  Billy Idol’s Rebel Yell.  Anything by ELO, but especially Mr. Blue Sky.  The Carol Burnett Show, particularly at the end every week when she tugged her ear and you knew she was saying hi to her grandmother.  Meet Joe Black.  Shirley Temple movies.  Michael Jackson music made before he went crazy.  The old Andy Griffith Show, especially any episode with Barney or Goober.  Rip Taylor.  V for Vendetta.  College marching bands.  Show Boat, especially Ol’ Man River or the New Year’s Eve moment when the singing prodigal daughter brings tears to her father’s eyes with her increasingly confident rendition of After the Ball.  The Jerry Lewis Muscular Dystrophy telethon.  Hush Hush Sweet Charlotte.  ABBA.  Any novel by Nora Roberts or Barbara Cartland.  Those old James Bond movies; yes, even the most absurd ones starring Roger Moore and featuring Jaws.  Xanadu (the movie, not the Broadway musical).  The Carpenters.  Brian’s Song.

There now:  don’t you feel better?

And if your memory needs to be jogged, you can consult The Encyclopedia of Guilty Pleasures (by Sam Stall, Lou Harry, and Julia Spalding, published by Quirk).  The book “celebrates the joys of cheesy pleasures such as Wayne Newton, Baywatch, motion-sensitive mounted fish that break into song, and those pestilential ‘collectible’ plates and figurines from the Franklin Mint” (Loeffler).  “Going into this,” says Harry, “I had no idea there was a real Chef Boyardee.”

Meanwhile, I’ll be sticking with my official public replies:  the second movement of Beethoven’s Emperor Concerto (which is, I must say, genuinely sublime), Hitchcock’s Vertigo, anything by Sondheim, Tchaikovsky’s Violin Concerto in D Major, Frontline and Charlie Rose and the Lehrer Newshour, Rachmaninoff’s 3rd Piano Concerto (the opening minutes provide the best bang for the buck in the repertoire)…  And don’t let me forget CSPAN and Shakespeare and any film by Godard or Bergman or Renoir or any theatrical production of Beckett…

SOURCES:  Bill Everhart, “Guilty Pleasures,” Berkshire Eagle (Pittsfield, Massachusetts), 7 March 2008, Sunday magazine; Melissa Ruggieri, “Guilty Pleasures Still Sound Sweet,” Richmond Times Dispatch, 16 May 2008, pg. E-9; William Loeffler, “Guilty Pleasures:  Our Secret Shame,” Pittsburgh Tribune Review, 20 March 2005; “Guilty Pleasures Revisited,” Chronicle of Higher Education, 6 June 2008, Chronicle Review, pg. 4; Maxine Leeds Craig, “Race, Beauty, and the Tangled Knot of a Guilty Pleasure,” Feminist Theory 7.2 (2006): 159-177.

How global warming imperils our history

C. Brian Rose, president of the Archaeological Institute of America, introduced the November/December 2008 issue of Archaeology with an editorial that begins as follows:

Global warming is real and it is one of the gravest threats facing our shared cultural heritage.  According to the National Oceanic and Atmospheric Administration, the ten warmest years have all occurred since 1995, and the UN’s Environment Program notes that the world’s glaciers are receding at a record pace.  This situation brings a cascade of problems that are having a catastrophic impact on archeological sites.  Melting of ice and permafrost endanger most frozen sites on the continents, while rising sea levels promote the erosion and submergence of others…  Examples in recent years include Ötzi, the late Neolithic herdsman discovered in the Italian Alps; the 550-year-old Native American hunter whose body was recovered from a melting glacier in British Columbia; and the Inca human sacrifices found on Andean peaks.  Similarly endangered are the frozen burials of Eurasian nomads… Remains of 5,000 year old stone houses built by Neolithic farmers and hunters at Skara Brae, Orkney, may have to be dismantled and moved inland for protection.  Portions of the ruins of Nan Madol, an ancient political and religious center on the Pacific island of Pohnpei in Micronesia, may soon be submerged.

In the context of the larger consequences of global climate change, these effects on the historical record may seem incidental or modest, but of course the losses might be permanent and, as Rose notes, they are not that difficult to document now.  He calls for a joint UNESCO, NASA, and ESA program of rapid satellite imaging to map glaciers, since ultraviolet readings can lead investigators to burial sites.

Every other year the World Monuments Fund releases a “world monuments watch list” to call attention to endangered sites.  For the first time, the 2008 list names global climate change as a cause of urgent concern, noting that “several sites… are threatened right now by flooding, encroaching desert, and changing weather patterns.”  Two examples:  (a) Herschel Island, Canada, “home to ancient Inuit sites and a historic whaling town at the edge of the Yukon Territory that are being lost to the rising sea and melting permafrost in this fastest-warming part of the world”; and (b) Sonargaon-Panam City, Bangladesh, “a former medieval trading hub and crossroads of culture, whose long-neglected and deteriorating architecture is increasingly threatened by flooding in this low-lying country, one of the most vulnerable to the impacts of global warming.”  The dangers, because they are likely to approach gradually, are easy to ignore, and in the context of archeological sites where the main evidence is already obscured and not in plain sight, awful losses might be occurring without anyone even knowing it.

Despite such warnings, there is little evidence of policy action to conserve heritage sites, perhaps not surprising given the lack of action on climate change’s broader consequences.  A recent study published in the journal Climatic Change notes that in Great Britain, where some emphasis has been placed on cataloging climate effects, the “lack of a widespread consideration of heritage has resulted in a relatively low profile more generally for the subject.”  A 2005 UK Environment Agency report organized to set “a science agenda… in light of climate change impacts and adaptation” never mentions heritage preservation.

The danger does not derive simply from changing levels in oceans and rivers.  A 2006/2007 “Heritage at Risk” report argues that climate change was partly responsible for the summer 2007 fires that were among “the largest catastrophes in the Mediterranean in the last century.”  Warming was at fault because it made fires more common and intense; research reported by the Athens Observatory notes that global warming also changes soil humidity levels, which further contributes to fire susceptibility.  While climate change is not the only cause of fires, their 2007 severity raised alarms in the historical preservation community, especially given damage to “our cultural heritage in the Peloponnese.  This included the Arcadian landscapes, Byzantine churches and monasteries, Apollo Epicurius at Bassae (a World Heritage Site), the Antiquities in Ilieia and especially the archeological site of Olympia (also a World Heritage Site).  There was damage to the area surrounding the Olympia archeological site.  The Kladios stream, a tributary of the Alpheios River, was burnt to a great extent, whereas the Kronios Hill was burnt entirely.  The park and the surroundings of the International Olympic Academy were destroyed.  Furthermore, some slopes near the ancient stadium were also burnt.”

The Centre for Sustainable Heritage at University College London released (in 2005) a major report on these issues, Climate Change and the Historic Environment, authored by May Cassar.  The document summarizes a comprehensive effort to catalog the risks, but for me its most compelling moment comes at the start, where it quotes Titania’s “weather speech” from A Midsummer Night’s Dream (Act II, Scene I), a passage that eerily anticipates the threat and may even have been prompted by the “meteorologically turbulent time when Shakespeare was writing his play” (Cassar):

                …the spring, the summer,
                The childing autumn, angry winter, change
                Their wonted liveries, and the mazed world
                By their increase, now knows not which is which:
                And this same progeny of evil comes
                From our debate, from our dissension;
                We are their parents and original.

SOURCES:  A.J. Howard et al., “The impact of climate change on archaeological resources in Britain: A catchment scale assessment,” Climatic Change 91 (2008): 405-422; May Cassar, Climate Change and the Historic Environment (London: English Heritage and the UK Climate Impacts Programme, 2005).

William Eggleston invented color

The Whitney in New York has just opened a major retrospective of William Eggleston’s long career as an innovator in photography (William Eggleston:  Democratic Camera, Photographs and Video, 1961-2008).  The show perhaps brings full circle a journey that has been mainly centered in the American south and the Mississippi Delta (Memphis most of all) but that, beginning with a 1976 exhibit at the Museum of Modern Art (MOMA), has had galvanizing force for the wider arts.

Although the MOMA had exhibited color photography once before and had shown photos in its galleries as far back as 1932, its decision to showcase Eggleston and his color-saturated pictures in 1976 was exceptionally controversial.  At the time the New York Times said it was “the most hated show of the year.”  “Critics didn’t just dislike it; they were outraged.  Much the way viewers were aghast when Manet exhibited Olympia, a portrait of a prostitute, many in the art community couldn’t figure out why Eggleston was shooting in color” (Belcove).  Eggleston’s subjects can be seen as totally mundane, and his public refusal to illuminate how his main works are staged proved infuriating (in fact Eggleston has long insisted that he never poses his subjects, arguing, astonishingly, that these are in every case single-shot images and that either he gets the shot or moves on to the next without regret).  Prior to Eggleston, art photography was most often black-and-white.  Thus, for students of the art and practice of photography, and given his enormous visual influence, one can say without complete hyperbole that William Eggleston invented color.

Well, maybe that is a little hyperbolic.  After all, those seeking the origins of color might better look back to the period of the “Cambrian Explosion” 543 million years ago, when the diversification of the species was sparked by the evolutionary development of vision; in that time, “color first arose to help determine who ate dinner and who ended up on the plate” (Finlay 389).  Or one might look to the late Cretaceous period – prior to that, “plants did not produce flowers and colored leaves.”  Further elaborating this perspective, Finlay (391) writes:

As primates gained superior color vision from the Paleocene to the Oligocene (65 to 38 million years ago), the world for the first time blossomed into a range of hues.  At the same time, other creatures and plants also evolved and settled into ecological niches.  Flowering plants (angiosperms) radiated, developing colored buds and fruits; vivid insects and birds colonized the plants, attracted by their tints and serving to disperse their pollen and seeds.  Plants, insects, birds, and primates evolved in tandem, with color playing a crucial role in the survival and proliferation of each.  The heart of these developments lay in upland tropical Africa, where lack of cloud cover and therefore greater luminance resulted in selective evolutionary pressure for intense coloration.

It states the obvious, but I’ll say it anyway:  colors, along with the human capacity to recognize and distinguish among them, transform human experience.  Part of the reason why Aristotle so famously preferred drawing to color is that the latter can too easily overwhelm one’s critical capacities (for him this was evidenced by the fact that a viewer in the presence of rich color has to step back, color blurring at close range, and in the process a necessary distancing will inevitably divert audiences from attending to the artistic details present in good drawing).  Plato had disdained color too, thinking it merely an ornamental, ephemeral and surface distraction, a view oddly recalled later by Augustine, who warned against the threat posed by the “queen of colors” who “works by a seductive and dangerous sweetness to season the life of those who blindly love the world” (qtd. in Finlay, 400).  It was only in the 12th century that Christians came fully around to color, at about the time stained glass technology was undergoing fast refinement; suddenly colored lights were seen as evoking the Divine and True Light of God.

But for centuries color was dismissed as feminine and theoretically disparaged since it “is impossible to grasp and evanescent in thought; it transcends language, clouds the intellect, and evades categorization” (Finlay, 401).  Color was thus seen as radically irrational by the thinking and professing classes – Cato the Elder said that colores floridi (florid colors) were foreign to republican virtue – all of this an interesting contrast to the Egyptian kings who saturated their tombs with gorgeous coloration and to the Greeks who ignored Aristotle’s warnings and painted their Parthenon bright blue and their heroic marble sculptures right down to the red pupils we would today prefer to digitize out since they apparently evoke the idea of Satanic possession.

The history of color is regularly bifurcated by scholars into work emphasizing chromophilia (the love of color) and chromophobia, which by contrast has often reflected an elite view that color is garish and low class.  Wittgenstein concluded that the radically subjective response to color could never be specified in a manner adequate to philosophy:  “there is merely an inability to bring the concepts into some kind of order.  We stand there like the ox in front of the newly-painted stall door” (qtd. in Finlay, pg. 383).

In the context of early film production and the industry’s emerging use of color and then Technicolor, colors were seen by some as a “threat to classical standards of legibility and coherence,” necessitating close control:

For instance, filmmakers monitored compositions for unwanted color contrasts, sometimes termed visual magnets, that might vie for attention with the narratively salient details of a scene.  Within a few years the body of conventions for regulating color’s function as a spatial cue had been widely adopted.  The most general guideline was that background information should be carried by cool colors of low saturation, leaving warm, saturated hues for the foreground.  Narrative interest should coincide with the point of greatest color contrast. (Higgins)

The ongoing power of such conventions has recently led Brian Price, a film scholar at Oklahoma State University, to argue that the imposition of saturated and abstracted color in recent films made by Claire Denis and Hou Hsiao-Hsien exemplify a resistive threat to globalized filmmaking and its industrial grip on the world’s imagination.

A paradox in Eggleston’s work is that although his subjects – Elvis’ Graceland, southern strip malls, the run-down architecture produced as often by the simple ravages of time and nature as by neglect – are dated and immediately evocative of a completely different though not wholly lost and variously tempoed time, his photographs seem timeless.  Like the man himself, described by one journalist as “out of place and out of time,” Eggleston captures elements of modern life that persist, and his attention to the formalistic properties of color and framing makes his work arresting even for those uninterested in or unimpressed by the odd assemblages of southern culture that constitute his most interesting subjects.  This paradox, in turn, can produce a sense in the viewer of vague dread, as if the contradictions inhabited by the idea of serendipitous composition reveal dangers of which we are customarily unaware.  At the same time, because Eggleston has never seemed interested in documentary reportage and has defaulted to literal photographs that instead accentuate the commonplace, he “belongs to that rare and disappearing breed, the instinctive artist who seems to see into and beyond what we refer to as the ‘everyday’” (O’Hagan).

Technically speaking, Eggleston beat others to the punch because his personal wealth enabled him to produce very high quality and expensive prints of his best work; another benefit of this wealth may be that, as Juergen Teller has put it, “he has never had the pressure of being commercial.”  The dye-transfer print process he has used since the 1960s (Eggleston resists the shift to the digital camera and image manipulation, simply noting that it is an instrument he does not know how to play) was borrowed from high-end advertising.  And although rejected early on and in some quarters – the conservative art critic Hilton Kramer notoriously described his 1976 New York exhibit as “perfectly banal” – he has been honored late in life as a prophet in his own time:  a lifetime achievement award from the International Center of Photography and another from Getty, and other honors from the National Arts Club and others too numerous to mention.  Eggleston seems immune to the critiques whether hostile or friendly, a fact reflected in the details of his mercurial and sometimes weird personal life but also in his refusal to talk talk talk about his work:  “A picture is what it is, and I’ve never noticed that it helps to talk about them, or answer specific questions about them, much less volunteer information in words.  It wouldn’t make any sense to explain them.  Kind of diminishes them.”

The distinctive Eggleston aesthetic has influenced David Lynch (readily evident in his Blue Velvet), Gus Van Sant (e.g., Elephant, an explicit homage), Sofia Coppola (The Virgin Suicides; “it was the beauty of banal details that was inspirational”), the band Primal Scream (his “Troubled Waters” forms the cover art for Give Out But Don’t Give Up) and many others.  David Byrne is a friend and Eudora Welty was a fan.  Curiously, despite his influence on avant-garde cinema and his own efforts at videography, Eggleston professes faint interest in film, although he is said to like Hitchcock.

Finlay has noted that “Brilliant color was rare in the premodern world.  An individual watching color television, strolling through a supermarket, or examining a box of crayons sees a larger number of bright, saturated hues in a few moments than did most persons in a traditional society in a lifetime” (398).  What was true of premodernity was also true of photography wings in the world’s major art museums.  Until William Eggleston.

SOURCES:  Holland Cotter, “Old South Meets New, in Living Color,” New York Times, 6 November 2008; Sean O’Hagan, “Out of the Ordinary,” The (London) Observer, 25 July 2004; Rebecca Bengal, “Southern Gothic: William Eggleston is Even More Colorful than His Groundbreaking Photographs,” New York Magazine, 2 November 2008; Julie Belcove, “William Eggleston,” W Magazine, November 2008; Scott Higgins, “Color Accents and Spatial Itineraries,” Velvet Light Trap, no. 62 (Fall 2008): 68-70; Brian Price, “Color, the Formless, and Cinematic Eros,” Framework 47.1 (Spring 2006): 22-35; Jacqueline Lichtenstein, The Eloquence of Color:  Rhetoric and Painting in the French Classical Age, trans. Emily McVarish (Berkeley:  University of California Press, 1993); Robert Finlay, “Weaving the Rainbow:  Visions of Color in World History,” Journal of World History 18.4 (2007): 383-431; Christopher Phillips, “The Judgment Seat of Photography,” October 22 (October 1982): 27-63.

The remarkable Evelyn Glennie

Evelyn Glennie is in the city this weekend performing the Atlanta Symphony Orchestra premiere of a percussion concerto written by John Corigliano (the Juilliard School professor whose score for The Red Violin won the Academy Award).  The Conjurer, first performed this February in Pittsburgh, is interesting because it wholly foregrounds the percussionist.  It accomplishes this both physically (the soloist is situated in front of the conductor with the full array of instruments on which she will perform, organized, as are the movements, by wood, metal, and skin) and aurally, since the orchestral setting is reduced to strings only and the melodic and tonal work of the piece is wholly carried by the solo artist.  Corigliano, in a pre-performance talk, explained the challenges of writing percussion-centered pieces, which include the fact that many of the main percussive instruments (like, obviously, the snare drum) do not enable melodic expression; as a result one often leaves a percussion performance mainly remembering the orchestra and the melody it played, and nothing about the soloist except that she or he was running around and expressive.  Relying on the full range of available strategies to combat these tendencies, Corigliano arranges the work so that the soloist is physically in front and aurally dominant, and temporally so that each movement starts with a true percussion solo into which the orchestra only slowly intrudes and then fully joins.

I’m embarrassed to say that I had never heard of or encountered Evelyn Glennie, the amazing artist for whom the work was commissioned.  I had no idea that she has been named a Dame by the British queen or that she has been a global percussion icon for decades.  Some sense of her contributions is summarized in her biography, available at her website and reproduced in the evening’s concert program:

Evelyn Glennie is the first person in musical history to successfully create and sustain a full-time career as a solo percussionist…  For the first ten years of her career virtually every performance she gave was in some way a first.  Her diversity of collaborations have included performance artists such as Nana Vasconcelos, Kodo, Bela Fleck, Bjork, Bobby McFerrin, Sting, Emmanuel Ax, Kings Singers, Mormon Tabernacle Choir and Fred Frith.  Evelyn has commissioned 150 new works for solo percussion from many of the world’s most eminent composers and also composes and records music for film and television.  Her first high quality drama produced a score so original that she was nominated for a British Academy of Film and Television Arts awards (BAFTA’s); the U.K. equivalent of the Oscars.

Glennie has also won two Grammy Awards and her website also refers to her design work (she makes jewelry) and her multimedia collaborations.  Many online resources are available that showcase her artistry and intellect, including a quite interesting talk she gave in the TED series and YouTube clips from her performances.  An award-winning documentary has also been made centered on her work; I found it mesmerizing.

The press coverage attending her performances doesn’t quite do justice to the experience of seeing her on the stage.  A common theme is that Glennie (as a St. Petersburg Times report put it) “dashes around stage like a woman possessed, darting from marimba to tom-toms to cymbals to bongos to every other imaginable instrument that can be struck, shook, rattled, and rolled” (Variety described her as “bright-eyed, wiry, and pointedly articulate,” not to mention “tattooed and wild-haired”).  And there is some truth in these descriptions, compounded for some when they first realize that Glennie performs barefoot.  But such labels deflect attention from the incredible discipline and precision that percussive performance demands – drumming accomplished as pure abandon would be torture, not art or music.

For Glennie, the universe of sounds enabled by the full repertoire of percussive instruments not only reveals the primal impulses of human culture (along with its wide variability) but also connects with audiences in a fuller sensory way than sound waves simply hitting eardrums.  In a Glennie performance one is struck by the holistic manner in which sound so visibly courses through her body and gestures, animating her evident passion for the acoustic possibilities she evokes.  Whether her connection to a particular instrument is mediated by sticks or whether she is physically fused with it (as with the so-called “talking drum,” which a player holds between the legs and plays with the hands, and whose sound the legs can reshape and contort by squeezing against the flexible frame), Glennie reveals how the performer actually embodies the music.  As she put it in an interview with the Wisconsin State Journal in Madison, “We have to listen; we have to listen through the entire body, and by bringing all the senses together…  It’s something that I’m refining myself every time I pick my sticks up.  To have that kind of fluidity means that you’re constantly listening, and I don’t mean listening through the ears – listening through the entire body.  That makes a massive difference in how you experience sound, not just music.”

It was only after watching her remarkable ASO performance that I learned something about Glennie that is widely reported but on which she prefers not to dwell:  Evelyn Glennie has been profoundly deaf since the age of 12.  For me this is astonishing, less in the typical artist-overcomes-disability sense one hears so often when discussing Beethoven and others than because one leaves with the profound impression that facing her own physical limits has produced artistry that far transcends the historical limits of the percussion repertoire – artistry that, curiously, might not have been imagined as richly by those who hear in the traditional sense.

SOURCES:  John Fleming, “The Perfect Touch,” St. Petersburg Times, 24 August 2007, pg. 1E; Edward Ortiz, “A Feel For Music: Evelyn Glennie Hears With Her Body – And Makes Percussion an Adventure,” Sacramento Bee, 2 December 2007, pg. TK23; Amanda Henry, “Percussionist Evelyn Glennie Gets New Emphasis For the Role,” Wisconsin State Journal, 24 March 2004, pg. D1; Eddie Cockrell, “Review of Touch the Sound,” Variety, 20 December 2004, pg. 50; Donald Munro, “Flying Solo:  A Long List of Accomplishments Travels Alongside this Talented Musician,” Fresno Bee, 22 April 2008, pg. E1

The other William Ayers

Driving to work yesterday I heard one of Atlanta’s conservative talk radio hosts announce, with a mixture of pride and wistfulness, that as a concession to Barack Obama’s victory he had thrown out all his “research” on William Ayers, about whose violent past he had been preaching for months.  Now that Obama has been chosen by the voters to lead the nation, the talk show host noted, it was time to move past Ayers and Jeremiah Wright and on to larger topics.  At the same time, though, while Sarah Palin has been insisting that the association (however modest) still matters, Ayers himself has emerged into the public spotlight, giving some recent interviews (he was on Good Morning America the other morning) and publishing some op-ed pieces.

As the election unfolded, only passing notice was typically given to the other/older William Ayers, the University of Illinois (Chicago) professor of education.  Now that November 4th has passed, and accepting for the moment the impulse to bracket his past to better understand his influence today as an advocate for educational reform, I’ve been reading some of his work on social justice pedagogy.  It was this work, actually, that led him to cross paths with Obama, since their mutual interest in school reform led both to agree to serve on the same Chicago board of directors, an association that obviously led Obama’s critics to question the wisdom of his political and intellectual alliances.

Ayers has a way of getting right to the point, a trait much on display in the recent interviews and one that also makes him an interesting writer.  One book review he authored starts:  “Drawing on traditional methods and straightforward approaches… Vonovskis fails to add anything new to the story of the origins of Head Start despite constant and irritating assertions to the contrary.”  And an essay co-authored with Michael Klonsky begins, “Each day, children in Chicago are cheated out of a challenging, meaningful, or even an adequate education…  Despite the well-publicized crime rate in Chicago’s poor neighborhoods, the greatest robbery is not in the streets, but in the schools.”  But Ayers’ purpose is not mere attention-grabbing or op-ed hyperbole, for he quickly backs up such provocative claims with truly appalling data about urban education.  The Chicago research, which appeared in 1994, noted, for instance, that as of that year “reading scores in almost half of Chicago’s schools are in the lowest 1% of the nation.”

Ayers’ work in Chicago does partly mirror the logic of his anti-war activism, which was animated by the view that criminal negligence demands a proportionally urgent response (this was the argument he made on GMA in justifying his participation in anti-Vietnam War insurgency: what he saw as America’s murderous policies in southeast Asia were so monstrous that they demanded even violent opposition).  In the context of education reform, this has led to the mobilization of what might best be considered a social movement, organized to provide tangible opposition to schooling bureaucracies.  And this, in turn, leads to a wide-scale systemic perspective that attends as much to the macro-allocation (or misallocation) of educational funds as to the local dynamics of this or that classroom.  Schools in Illinois, as elsewhere, are funded by property taxes, and because urban property values tend to be lower, they generate less revenue than ends up available in the richer suburbs.  In 1992, Illinois voters narrowly rejected a statewide constitutional amendment to provide funding equalization (a constitutional amendment requires 60% support, while this one received 56%).

The passions elicited by the issue of educating children run deep.  Ayers recounts the firestorm evoked when, in 1988, then-governor of Illinois James Thompson resisted higher funding for Chicago schools – he didn’t want to throw more money into a “black hole.”  When one of Chicago’s representatives in the state legislature accused Thompson of having made a racist comment, pundits accused the legislator of playing the race card.  But such back-and-forths are not surprising given the complex history of racial politics that has characterized the city, not to mention the long period of conflict between the city and its teachers union that led to a regular cycle of walkouts in the 1980s and ’90s.  One can gather some sense of Ayers’ fuller indictment in the following passage, also written in the mid-1990s:

Returning to Chicago [from a discussion of schooling in South Africa], a similarly persuasive argument can be made that the failure of some schools and some children is not due to a failure of the system.  That is, if one suspends for a moment the rhetoric of democratic participation, fairness, and justice, and acknowledges (even tentatively) that our society, too, is one of privilege and oppression, inequality, class divisions, and racial and gender stratifications, then one might view the schools as a whole as doing an adequate job both of sorting youngsters for various roles in society and convincing them that they deserve their privileges and their failures.  Sorting students may be the single, brutal accomplishment of U.S. schools, even if it runs counter to the ideal of education as a process that opens possibilities, provides opportunities to challenge and change fate, and empowers people to control their own lives.  The wholesale swindle in Chicago, then, is neither incidental nor accidental; rather, it is an expression of the smooth functioning of the system.

The movement that emerged as a reaction to the frustrating situation in Chicago was in large measure centered on the idea of accountability, a rhetorical rubric that can accommodate both conservatives (who might prefer to emphasize how schools fail to respond to or engage the interests of parents) and liberals (who might prefer to emphasize the need for greater investments, paired with oversight better able to hold bureaucracies to account).  Emerging as it did under the leadership of Mayor Harold Washington, the mobilization of parents and educational reformers brought (Ayers and Klonsky argue) African-American parents to the forefront, along with the traditional themes of civil rights organizing (grassroots activity, decentralization, desegregation, community empowerment).  But the reformers were also assisted by then-recent academic research that provided concrete data able to call attention to the true problems.  Early on, Mayor Washington was able to bring together mainly minority parents and white business leaders, all of whom shared concerns about poor schooling, but that coalition fragmented when the funding issue percolated to the top of the reform agenda (community leaders favored more equitable tax policies and greater funding, while many in the business community were opposed).

Starting with the local is an ongoing theme in Ayers’ work, and in an essay he wrote in 1988 it becomes an explicit focus of his account of his past.  Ayers wrote:  “My experience with making change leaves me unimpressed with theories of change, big theories and little theories alike.  Big theories are often compelling because of their bold self-assurance and their tidy certainty…, [but] too often the self-activity of people is lost in a kind of determinism…  Small theories of change promise a different kind of certainty, but they fail as often for missing the larger context…”  Such a view, in turn, has shaped Ayers’ subsequent work on education as social justice, in which he repeatedly insists he is seeking not airy abstraction but on-the-ground changes for children.

Ayers departs from social justice accounts of education that see schooling as a mechanism for improving students’ economic and social prospects.  For Ayers such an approach reflects a certain naivete, since it rests on a basic endorsement of the overall forces and institutions that shape society and often constrain progress even for the well educated (the emphasis in such an approach can rest too fully on equipping under-educated students for society, without enabling changes in the political and social system that would make the resulting educated citizens more welcome).  Ayers thus argues that social justice education has to be politically empowering even as basic life skills are inculcated – schools might be imagined as also fostering real political agency.

The challenge, of course, is that education is complicated, and the dynamics of successful teaching cannot be reduced to axiomatic rules teachable in college education classrooms.  In Teaching Toward Freedom, his 2004 book, Ayers (channeling Walt Whitman) cites the following as offering a more hopeful (and explicitly poetic) view of the well-formed citizen:

Love the earth and the sun and the animals, despise riches, give alms to everyone that asks, stand up for the stupid and the crazy, devote your income and labor to others, hate tyrants, argue not concerning God, have patience and indulgence toward the people, take off your hat to nothing known or unknown or to any man or number of men, go freely with powerful uneducated persons and with the young and with the mothers of families, re-examine all you have been told at school or church or in any book, dismiss whatever insults your soul, and your very flesh shall be a great poem.

SOURCE:  William Ayers, “The Republican’s Favorite Whipping Boy, Former Student Radical William Ayers Tells What it Was Like to Be Painted as a Symbol of Evil by McCain and Palin,” Montreal Gazette, 8 November 2008, pg. B7; Colin Moynihan, “Ex-Radical Talks of Education and Justice, Not Obama,” New York Times, 27 October 2008, pg. A22; William Ayers and Michael Klonsky, “Navigating a Restless Sea:  The Continuing Struggle to Achieve a Decent Education for African American Youngsters in Chicago,” Journal of Negro Education 63.1 (1994): pgs. 5-18; Ayers, “The Shifting Ground of Curriculum Thought and Everyday Practice,” Theory into Practice 31.3 (Summer 1992): pgs. 259-263; Ayers, “Problems and Possibilities of Radical Reform:  A Teacher Educator Reflects on Making Change,” Peabody Journal of Education 65.2 (Winter 1988): pgs. 35-50;  Emery Hyslop-Margison, “Teaching for Social Justice,” Journal of Moral Education 34.2 (June 2005): pgs. 251-256; John Pulley, “Former Radicals, Now Professors, Draw Ire of Alumni at Two Universities,” Chronicle of Higher Education, 16 November 2001, pg. A32.