Frank Furedi, a sociologist at the University of Kent, got into trouble in May 2001 when he wrote an essay in the Times of London casting aspersions on the contemporary university. The critique grew into a little book he published in 2004, Where Have All the Intellectuals Gone? (London: Continuum), which was reissued in a second edition two years later along with his replies to critics. His argument is not entirely unfamiliar, of course: thinkers of real depth who used to be taken seriously by their cultures (the Bertrand Russells and Hannah Arendts) have been replaced with glib talking heads like Sean Hannity and Chris Matthews, and worse, by corporate hacks and political spinners.
The university bears some responsibility for this historical development, says Furedi, because the cachet formerly attached to a life of learning, conducted in the search for Truth, has been culturally degraded. Truth has been transmuted into knowledge, which in turn has been entirely commodified and judged only by its capitalist use value, and so now even educational leaders tend to defend the enterprise in instrumental terms.
I am never entirely sure how to react to these arguments. One is often sympathetic to the critiques, but one also rebels against the generalizations that tend to characterize them. There is a long history of disparaging the masses, despite Mark Bauerlein’s recent claim (made in his book, The Dumbest Generation) that “no cohort in human history has opened such a fissure between its material conditions and its intellectual attainments.” Universities are too large, too numerous, too diverse, to sustain blanket claims like “the humanities are dead” or “students are ill-prepared for college.” It actually depends. More students than ever are pursuing college educations, but it seems true that their interests are more careerist than ever. More books are being published in the humanistic disciplines than ever, but their audiences seem to be shrinking.
However one judges these macro-claims, the status of the intellectual does seem diminished in a culture where everyone has an opinion, knowledge is open source, there is no danger in ignorance since blink instincts are usually reliable, and wisdom resides in crowds. At a media studies conference a couple of months ago, I was in the audience for a roundtable on television studies, and one of the professors there asked what new knowledge media professors can be expected to add. In an earlier age, it was the role of the scholar to do the careful documentary and historical work of tracking emergent changes in the wider culture that might otherwise have passed unnoticed. Today, all the television executives are blogging. They, presumably, have access to the inside debates in a more sophisticated way than the average professor sitting in, say, Dubuque. And if they mischaracterize the industry trends, the other professionals will be right there taking note of it in the comment section, not to mention in their own blogs.
And why actually learn anything when one can just track it all down on the internet? During the controversy over Barack Obama’s fictitious claim that one of his relatives had helped liberate Auschwitz (a claim he quickly corrected), I heard Wolf Blitzer glibly say on CNN (I’m paraphrasing, so don’t be deceived by the quote marks), “Of course, as everyone knows, Senator Obama’s uncle could not have liberated Auschwitz since the Soviets liberated that camp.” Did Blitzer really know this? Does everyone really know this? I’m skeptical – my immediate reaction was that some staffer had either inserted this information into the script based on the Republican oppo-research memo or had downloaded it from Wikipedia. But the claim – everyone knows – isn’t quite inaccurate, since in a sense everyone does know, or has the capacity to find it out online.
Leon Wieseltier, the New Republic critic, has this to say in this week’s issue: “Now there is nothing more dazzling on the web, and hence in the galaxy, than video, and every newspaper, every magazine, every bedroom, is a television studio, and so it is the end of significant speech that I anxiously envision. It is being usurped by talkativeness. All those prattling heads – how can people spend so much time watching other people speak?… [The] American fear of silence may finally be retired, because silence, like oblivion, is no more… Go to a dreary website called Bigthink.com and ‘browse big ideas’ – identity, life and death, truth and justice – and you will see what I mean. Here is the false promise of the end of obscurity: profundity for busy people, delivered by people with time on their hands; an electronic gospel of relaxation; a slick hoax upon real thinking.”
Clearly the vastly wider distribution of information enabled by the web is a remarkable collective achievement, and the sheer intensity of online discussions can be civically invigorating. There is something quite amazing about Wikipedia, “constructed in less than eight years by strangers who disagreed about all kinds of things but who were drawn to a shared, not-for-profit purpose” (Nicholson Baker, writing in the New York Review of Books, March 20). But what is the role of the intellectual in such a context? In the early days the argument was that Wikipedia lacked accuracy, and that the skills of those with more substantive and editorial training were therefore still essential; that claim seems to have subsided, and in a recent essay published in the American Journalism Review, Donna Shaw called attention to the fact that even journalists are increasingly relying on Wikipedia.
Perhaps the intellectual’s role, if not to offer definitive or definitively useful knowledge, is to play the Greek chorus (or rather the Simon Cowell) role: nudging the audience to do the right thing, offering insights where appropriate. Maybe I’ll blog about that sometime.