But the potential cost of cowardice does not fully explain why it haunts us so, why Adam Smith wrote that “No character is more contemptible than that of a coward,” or why Kierkegaard described the “cowardly soul” as “the most miserable thing one can imagine.” A traitor probably has greater potential and certainly more intent to do harm than a coward, but his act can also have an element of daring about it, an element that despite ourselves we respect — and one that a cowardly act lacks. The proverbial traitor stabs you in the back, but at least he does something; at least he is present. The proverbial coward strives, more than anything, for absence, and if he (he is proverbially male) can’t be absent, if he can’t run or hide, then his body betrays his cowardice through incontinence. The coward combines the destructive and the pathetic in a singularly repellent way.
This revilement has great power, most obviously in the military context. The belief that the other side is cowardly, coupled with the punishment and prevention of cowardice on one’s own side, feeds confidence; the belief that it would be cowardly not to fight feeds belligerence. The pattern is evident in, perhaps essential to, every war.
But the contempt for cowardice applies far from the field of battle, too. In the new film “Force Majeure,” a Swedish man named Tomas is having lunch with his family on the deck of a ski lodge when he sees an avalanche coming. Rather than leaping to shield his wife and children, he runs away (after grabbing his sunglasses and cellphone). The avalanche, it turns out, was man-made, controlled, harmless. The rest of the movie explores the ruinous aftermath of this episode. Tomas fearfully failed in his duty as father and husband. Now his family may be collapsing under a different sort of avalanche — the shame of cowardice.
That avalanche is not quite as monolithic as it might appear to be. There are wrinkles, paradoxes. It has often been acknowledged that excessive fear of being cowardly can itself be cowardly (the 9/11 attackers may have been cowardly in this way) — and did I say cowardly acts had no element of daring? An 18th-century proverb had it that “every man would be a coward if he durst,” and the Soviet practice of executing deserters during World War II led General Georgy Zhukov to declare that “it takes a brave man to be a coward in the Red Army.”
Contempt for the archetypal cowardice of war seems to be fading in an age when some weapons are so fearsome that the very idea of being excessively fearful seems absurd, when we know that reactions to fear depend on physiological factors such as cortisol levels, and when we know that past traumas can diminish our capacity to deal with present ones.
Yet the contempt for cowardice seems too deeply rooted in us to disappear. William James wrote that “our ancestors have bred pugnacity into our bone and marrow, and thousands of years of peace won’t breed it out of us,” an assertion supported by the emerging consensus that our closest relatives in the animal world, chimpanzees, have a natural tendency toward violence against their fellow chimps.
This is not necessarily a reason to despair. Our pugnacity and the contempt for cowardice that goes with it do not condemn us to war. James thought we could apply them to constructive rather than destructive ends, and fight “the moral equivalent of war.”
Perhaps the most common and profound cowardice has to do with the long tradition of not thinking about it. A rigorous and nuanced consideration of the idea — one applicable not so much to terrorists as to ourselves — can help us cultivate what James called “toughness without callousness.” It can help us think critically about our own fears and obligations — to family, community, and country, and to causes beyond any of those. It can help us honor these obligations more than aspiring to heroism can. And it can help us appreciate, even as we emulate, those many veterans who thought about cowardice and resisted it, who, without ever crediting themselves with courage, managed their fears and did their duty.
By GEORGE YANCY and NAOMI ZACK NOVEMBER 5, 2014 7:00 PM
This is the first in a series of interviews with philosophers on race that I am conducting for The Stone. This week’s conversation is with Naomi Zack, a professor of philosophy at the University of Oregon and the author of “The Ethics and Mores of Race: Equality After the History of Philosophy.” The interview was conducted by email and edited. — George Yancy
George Yancy: What motivates you to work as a philosopher in the area of race?
Naomi Zack: I am mainly motivated by a great need to work and not to be bored, and I have a critical bent. I think there is a lot of work to be done concerning race in the United States, and a lot of ignorance and unfairness that still needs to be uncovered and corrected. I received my doctorate in philosophy from Columbia University in 1970 and then became absent from academia until 1990. When I returned it had become possible to write about real issues and apply analytic skills to social ills and other practical forms of injustice. My first book, “Race and Mixed Race” (1991), was an analysis of the incoherence of U.S. black/white racial categories in their failure to allow for mixed race. In “Philosophy of Science and Race,” I examined the lack of a scientific foundation for biological notions of human races, and in “The Ethics and Mores of Race,” I turned to the absence of ideas of universal human equality in the Western philosophical tradition.
I’m also interested in the role of the university in homelessness and have begun to organize an ongoing project for the University of Oregon’s Community Philosophy Institute, with a unique website.
G.Y.: How can critical philosophy of race shed unique light on what has happened, and is still happening, in Ferguson, Mo.?
N.Z.: Critical philosophy of race, like critical race theory in legal studies, seeks to understand the disadvantages of nonwhite racial groups in society (blacks especially) by understanding social customs, laws, and legal practices. What’s happening in Ferguson is the result of several recent historical factors and deeply entrenched racial attitudes, as well as a breakdown in participatory democracy.
G.Y.: Would you put this in more concrete terms?
N.Z.: Let’s work backwards on this. Middle-class and poor blacks in the United States do less well than whites with the same income on many measures of human well-being: educational attainment, family wealth, employment, health, longevity, infant mortality. You would think that in a democracy, people in such circumstances would vote for political representatives on all levels of government who would be their advocates. But the United States, along with other rich Western consumer societies, has lost its active electorate (for a number of reasons that I won’t go into here). So when something goes wrong, when a blatant race-related injustice occurs, people get involved in whatever political action is accessible to them. They take to the streets, and if they do that persistently and in large enough numbers, first the talking heads and then the big media start to pay attention. And that gets the attention of politicians who want to stay in office.
It’s too soon to tell, but “Don’t Shoot” could become a real political movement — or it could peter out as the morally outraged self-expression of the moment, like Occupy Wall Street.
But the value of money pales in contrast to the tragedy this country is now forced to deal with. A tragedy is the result of a mistake, of an error in judgment that is based on habit and character, which brings ruin. In recent years, it seems as though more unarmed young black men are shot by local police who believe they are doing their duty and whose actions are for the most part within established law.
In Ferguson, the American public has awakened to images of local police, fully decked out in surplus military gear from our recent wars in Iraq and Afghanistan, who are deploying all that in accordance with a now widespread “broken windows” policy, which was established on the hypothesis that if small crimes and misdemeanors are checked in certain neighborhoods, more serious crimes will be deterred. But this policy quickly intersected with police racial profiling already in existence to result in what has recently become evident as a propensity to shoot first. All of that surplus military gear now stands behind such actions, and should offend members of the public who protest.
G.Y.: How does this “broken windows” policy relate to the tragic deaths of young black men/boys?
N.Z.: People are now stopped by the police for suspicion of misdemeanor offenses and those encounters quickly escalate. The death of Michael Brown, like the death of Trayvon Martin before him and the death of Oscar Grant before him, may be but the tip of an iceberg.
Young black men are the convenient target of choice in the tragic intersection of the broken windows policy, the domestic effects of the war on terror and police racial profiling.
G.Y.: Why do you think that young black men are disproportionately targeted?
N.Z.: Exactly why unarmed young black men are the target of choice, as opposed to unarmed young white women, or unarmed old black women, or even unarmed middle-aged college professors, is an expression of a long American tradition of suspicion and terrorization of members of those groups who have the lowest status in our society and have suffered the most extreme forms of oppression, for centuries. What’s happening now in Ferguson is the crystallization of our grief. Don’t Shoot!
We also need to understand the basic motives of whole human beings, especially those with power. The local police have a lot of power — they are “the law” for all practical purposes.
Police in the United States are mostly white and mostly male. Some confuse their work roles with their own characters. As young males, they naturally pick out other young male opponents. They have to win, because they are the law, and they have the moral charge of protecting. So young black males, who have less status than they do, and are already more likely to be imprisoned than young white males, are natural suspects.
G.Y.: But aren’t young black males also stereotyped according to white racist assumptions?
N.Z.: Yes. Besides the police, a large segment of the white American public believes they are in danger from blacks, especially young black men, who they think want to rape young white women. This is an old piece of American mythology that has been invoked to justify crimes against black men, going back to lynching. The perceived danger of blacks becomes very intense when blacks are harmed. And so today, whenever an unarmed black man is shot by a police officer and the black community protests, whites in the area buy more guns.
This whole scenario is insane. The recent unarmed young black male victims of police and auxiliary police shootings have not been criminals. Their initial reactions to being confronted by police are surprise and outrage, because they cannot believe they are suspects or that merely looking black makes them suspicious. Maybe their grandfathers told them terrible stories, but after the Civil Rights movements and advancement for middle-class blacks, we are supposed to be beyond legally sanctioned racial persecution. Their parents may not have taught them the protocol for surviving police intervention. And right now the airwaves and Internet are buzzing with the anxiety of parents of young black men. They now have to caution their sons: “Yes, I know you don’t get into trouble, and I know you are going to college, but you have to listen to me about what to do and what not to do if you are ever stopped by the police. Your life depends on it. . . Don’t roll your eyes at me, have you heard what happened to Trayvon Martin and Michael Brown?”
G.Y.: We can safely assume white parents don’t need to have this talk with their children. Do you think white privilege is at work in this context?
N.Z.: The term “white privilege” is misleading. A privilege is special treatment that goes beyond a right. It’s not so much that being white confers privilege but that not being white means being without rights in many cases. Not fearing that the police will kill your child for no reason isn’t a privilege. It’s a right. But I think that is what “white privilege” is meant to convey, that whites don’t have many of the worries nonwhites, especially blacks, do. I was talking to a white friend of mine earlier today. He has always lived in the New York City area. He couldn’t see how the Michael Brown case had anything to do with him. I guess that would be an example of white privilege.
Other examples of white privilege include all of the ways that whites are unlikely to end up in prison for some of the same things blacks do, not having to worry about skin-color bias, not having to worry about being pulled over by the police while driving or stopped and frisked while walking in predominantly white neighborhoods, having more family wealth because your parents and other forebears were not subject to Jim Crow and slavery. Probably all of the ways in which whites are better off than blacks in our society are forms of white privilege. In the normal course of events, in the fullness of time, these differences will even out. But the sudden killings of innocent, unarmed youth bring it all to a head.
G.Y.: The fear of black bodies — the racist mythopoetic constructions of black bodies — has been perpetuated throughout the history of America. The myth of the black male rapist, for example, in “Birth of a Nation.” But even after the civil rights movements and other instances of raised awareness and progress, black bodies continue to be considered “phobogenic objects,” as Frantz Fanon would say.
N.Z.: Fanon, in his “Black Skin, White Masks,” first published in France, in 1952, quoted the reaction of a white child to him: “Look, a Negro! . . . Mama, see the Negro! I’m frightened!” Over half a century later, it hasn’t changed much in the United States. Black people are still imagined to have a hyper-physicality in sports, entertainment, crime, sex, politics, and on the street. Black people are not seen as people with hearts and minds and hopes and skills but as ciphers that can stand in for anything whites themselves don’t want to be or think they can’t be. And so, from a black perspective, the black self that whites serve up to them is not who they are as human beings. This exaggeration of black physicality is dehumanizing.
G.Y.: Given this, why have so many adopted the idea that we live in a post-racial moment in America?
N.Z.: I don’t know where the idea of “post-racial” America came from. It may have begun when minorities were encouraged to buy homes they could not afford so that bankers could bet against their ability to make their mortgage payments, before the real estate crash of 2007-08. It sounds like media hype to make black people feel more secure so that they will be more predictable consumers — if they can forget about the fact that blacks are about four times as likely as whites to be in the criminal justice system. If America is going to become post-racial, it will be important to get the police on board with that. But it’s not that difficult to do. A number of minority communities have peaceful and respectful relations with their local police. Usually it requires negotiation, bargaining, dialogue — all of which can be set up at very little cost. In addition, police departments could use intelligent camera-equipped robots or drones to question suspects before human police officers approach them. It’s the human contact that is deadly here, because it lacks humanity. Indeed, the whole American system of race has always lacked humanity because it’s based on fantastic biological speculations that scientists have now discarded, for all empirical purposes.
G.Y.: So is it your position that race is a social construct? If so, why don’t we just abandon the concept?
N.Z.: Yes, race is through and through a social construct, previously constructed by science, now by society, including its most extreme victims. But we cannot abandon race, because people would still discriminate and there would be no nonwhite identities from which to resist. Also, many people just don’t want to abandon race and they have a fundamental right to their beliefs. So race remains with us as something that needs to be put right.
George Yancy is a professor of philosophy at Duquesne University. He has written, edited and co-edited numerous books, including “Black Bodies, White Gazes,” “Look, a White!” and “Pursuing Trayvon Martin,” co-edited with Janine Jones.
How will the people take the Met to the 21st century?
By: Eric S. Caruncho
06:48 AM January 10th, 2016
“It’s not just a structure, it’s a repository of memory. It’s part of the national patrimony,” says Gerard Lico, the architect in charge of the restoration of the Manila Metropolitan Theater, better known as the “Met.”
Lico came to this realization after the National Commission for Culture and the Arts (NCCA), the government agency which owns the Met, launched a Facebook campaign asking for volunteers to help clean it up.
Once a proud Manila landmark, the historic Art Deco building fell into a derelict state after it was abandoned in 1996. Now a magnet for the city’s homeless, it’s been covered in garbage and graffiti ever since.
“Initially, we asked for only 50 architecture student volunteers for the cleanup drive,” he recalls. To his amazement, the NCCA was flooded with thousands of responses, and not only from students; there were so many that the office e-mail server crashed.
“I was surprised at the involvement of the people,” he says after the first cleanup last Dec. 12. “The community has an attachment to the building. I gave the students a tour of the building, and they were saddened by its state. It was an emotional moment.”
Lico hopes this emotional connection can help sustain public support for the restoration effort, a Herculean task that he estimates will take five to 10 years, and eventually cost more than P500 million, well beyond the P270 million given by Malacañang for the acquisition of the Met.
There are encouraging signs.
Lico says that Sen. Pia Cayetano has offered to kick in supplementary funds amounting to P43 million, to help cover startup costs. The Organisasyon ng Pilipinong Mang-aawit (OPM) and the United Architects of the Philippines (UAP), as well as the French and Spanish governments, have also signified their willingness to help.
There’s no question as to the historical importance of the Met, designed by architect Juan M. Arellano in the Art Deco style and opened in 1931 as Manila’s premier performance venue. Lavishly decorated with stained glass, wrought-iron grillwork, murals by Fernando Amorsolo and numerous sculptures, the Met hosted performances by world-class artists in its heyday.
“The Met was envisioned in 1931 as the ‘people’s theater,’” says Lico.
He believes it can be so once more.
“There are many schools of thought on restoration,” he says. “I’m not a purist. We have to reinvent the building to fit the needs of modern life.”
A professor of architecture at the University of the Philippines, Lico also has a master’s degree in art history and a Ph.D. in Philippine studies. He is the author of “Arkitekturang Filipino: A History of Architecture and Urbanism in the Philippines.”
He was also UP’s campus architect until 2014, when he returned to private practice designing institutional buildings. But he dropped most of his projects to focus on the Met when NCCA chair Felipe de Leon Jr. asked him to head the Met restoration team, a once-in-a-lifetime opportunity.
“This is what I really want to do,” he says. “This is how I want to leave my mark. I’m gung-ho on the project.”
Adaptive reuse, he says, is the key to not just restoring, but also reviving, the Met. “In restoration, you can’t just impose what you want on the people. We realized that it would be impossible to revive the Met as a theater only,” he notes. “It wouldn’t be sustainable. The audience just isn’t there anymore.”
In keeping with the idea of the new Met as the people’s theater, the NCCA has been conducting focus group discussions with veteran and emerging artists, cultural workers, students and other stakeholders to determine the best use for the restored building.
“Definitely we will retain the theater,” says Lico. “But the surrounding facility, which is huge—what should we do with it? Some have suggested that it could be a one-stop shop for Philippine arts and culture—a cultural mall, if you will, showcasing the best of Philippine arts and culture.”
The Met could form part of a “culture circuit”—a network that would include the National Museum and other destinations in the vicinity.
We also have to think beyond the building itself, he says, and take into account its environment, which includes Mehan Garden, Lawton Plaza and the Post Office building — an area which has sadly gone to seed in the last few decades. A comprehensive master plan for the whole area is sorely needed, he adds.
Since the NCCA doesn’t have the capacity to manage properties, Lico says it may eventually enter into a private sector partnership to ensure the restored Met’s sustainability.
For now, however, the first priority is to restore the Met.
“It looks small from the outside, but if you go inside, it’s really large,” he points out. “In fact, the theater seats 1,700. Apart from that, there are offices all around. The intention in the 1930s was to rent out those arcade spaces.”
Lico estimates that 80 percent of the building, built from reinforced concrete, remains structurally sound.
But there are serious problems to be solved. Water penetration in parts of the building has corroded the rebar, causing the concrete to burst.
A tunnel underneath the Met that once formed part of its cooling system has allowed water to accumulate, turning what used to be the orchestra pit into a stagnant pond. Vegetation has also grown into concrete surfaces, causing cracks.
“As much as possible, we want to maintain authenticity and integrity,” he says. “When you restore, you don’t invent what is not there. But we are not frozen in time. The 1978 restoration by Mrs. Marcos made some additions, and we will keep those additions because they are part of the history of the building. They add a layer. We want to add a layer of the 21st century, but we will maintain the iconic status of the Met.”
Since the NCCA is the government agency charged with preserving heritage sites, the restoration of the Met should also be a textbook example of the right way to do it.
The right way begins with a consultative approach that involves the community. Hence, the continuing dialogue with stakeholders, other government agencies and the private sector.
By mid-January, he says, the entire structure will be fenced off to discourage the informal settlers who transform the Met into their dormitory after dark. The fence itself will be used as a billboard detailing the history of the Met, to educate passersby on the significance of the structure.
The public will also be allowed to see the work on the building as it progresses. Hard hats will be provided for this purpose.
In addition to student volunteers, a professional cleanup crew will be hired to dispose of the hazardous debris, including asbestos, that has accumulated inside the building.
For the preparatory work, high-tech solutions will be used, he adds. This includes 3-D scanning to determine the structural soundness of existing members, and ground-penetrating radar to study the hidden parts of the foundation.
“Restoration is a very meticulous undertaking,” says Lico. “Restoring a building is always more expensive than building a new structure. But in the end, it’s an investment in culture and the creative industries.”
The Stone is a forum for contemporary philosophers and other thinkers on issues both timely and timeless.
How do we relate to our past, and what might this tell us about how to relate to our future? One of the most provocative approaches to this question comes from Friedrich Nietzsche, whose doctrine of the eternal return asks this: “What if some day or night a demon were to steal after you into your loneliest loneliness and say to you: ‘This life as you now live it and have lived it, you will have to live once more and innumerable times more’”? To ask myself the question of the eternal return is to wonder about the worth of what I have done, to inquire whether it would stand the test of being done innumerable times again.
There is, however, a more disturbing worry underneath this one. For me to be able to ask the question of the eternal return already supposes that I have come into existence; and the question may arise of whether I should affirm the conditions that brought me into existence, not innumerable times but even once. To see the bite of this worry, let me share a bit of my own past. Had Hitler not come to power in Germany, the Holocaust and World War II would not have happened. Had World War II not happened, my father would not have signed up for officer’s training school. Had he not signed up, he would not have gone to college, majored in economics, and then moved to New York for a job. And so he would not have met my mother. In short, without the Holocaust I would not be here.
We need not look very deeply to see how many people’s existence requires the occurrence of the Holocaust. And as Peter Atterton has argued recently here, all of us can trace our existence back to some mass atrocity or another (if not the Holocaust, then perhaps to slavery or to the Crusades).
How, then, might we relate to the past, and specifically to the fact that we owe our existence to one or another historical atrocity (or, for that matter, to a host of other events: weather patterns, feelings of lust, etc.)? One suggestion, a pessimistic one, is offered by another philosopher, R. Jay Wallace, in his book “The View From Here.” Wallace argues that to affirm my existence, to say yes to it, requires that I affirm (among other unpalatable things) the past that led to it. To be sure, he does not claim that we must feel good about it. We might wish that our existence had come about another way. However, he argues that we cannot have what he calls “all-in regret” about it. It’s unfortunate that our existence had to arise this way, but since that’s the way it happened, affirming our existence requires affirming the past that led to it. It is no wonder that he calls his position one of “modest nihilism.”
But must we affirm the past that led to our existence? Must we be modest nihilists? For one thing, it is open to us to say that it would have been better for us not to have been born and for the Holocaust not to have happened. From a more cosmic perspective (assuming that recent history would not have offered us a comparable horror), we might say that it would have been better had the Holocaust not occurred and the planet been filled with people different from us. When Atterton concludes his column by saying that we have no right to exist, I take it this is precisely what he is claiming. And, as far as it goes, I agree with him.
But that is not where the question should make us most uncomfortable, and not where Wallace stakes his ground. To affirm our existence is not a matter of what we think would be cosmically or impersonally better. It is to say what we prefer, what we would choose. Would I prefer that the Holocaust or slavery or the Crusades not have happened and that I not exist? If I were somehow allowed to rewind the tape of history and then let it go forward again in a way that prevented one of these atrocities, and thus my existence, would I do it? That is a more troubling question for those of us who are attached to our lives.
I would like to think that, at least in my better moments, I would, however reluctantly, acquiesce to that deal. At those times when I have a more vivid encounter with the Holocaust, for instance, when at the Holocaust Memorial Museum in Washington I saw the shoes of many who had perished in the camps, I think I would, with difficulty, be willing to trade my existence for the lives of its victims. (It is another and even more vexing question whether I would trade my children’s lives to spare theirs, recognizing that my children’s existence requires my own. But for reasons outside the scope of this essay, I believe that that is a question for my offspring to answer rather than me.) I don’t know for sure what I would do, but I hope I would be able to rise to the occasion.
If this is right, then perhaps the proper attitude to take toward the monstrosities that gave rise to us might be called one of acceptance rather than affirmation. We are the products of histories we cannot change, histories that contain atrocities we cannot undo. We know that it would have been better if those horrors had not happened and, consequently, we had not been born, and in nobler moments we might even prefer that it had been that way. Our lives are rooted in tragedies that have no reparation, and in that they are inescapably tainted. We must accept this, but we need not affirm it. The difference lies in what we would have been willing to do, given the opportunity.
At this point, however, someone might ask why it matters what I, or any one of us, would do in an imaginary scenario that cannot possibly happen. The Holocaust happened; it cannot be prevented retroactively. So why should we take up any attitude toward our existence in relation to it? There are two reasons for doing so, one more philosophically reflective and the other more practical. The philosophically reflective reason is this: We condemn the Holocaust. I believe most of us would say that it should not have occurred. But had it not occurred, many of us would not be here. So what is our attitude toward the Holocaust, really? Do we really condemn it, or do we not? Asking the questions I am posing here will reveal to us aspects of who we are in ways that we may or may not find comfortable.
The second reason is practical. If we would be willing to sacrifice our existence for the sake of preventing past horrors, what would we be willing to sacrifice of ourselves to prevent horrors now and in the future? And why are so many of us (and I include myself here) not doing so? I should note here that the situation of the past is not exactly symmetrical to that of the future. There is a complication. If I had not existed, I would not technically have lost anything, because there would have been no “I” to lose it in the first place. (Of course, it’s even more complicated than that. I have to exist to consider the possibility of my never having existed.) However, now that I do exist, in sacrificing myself I do stand to lose something — my future existence.
Nevertheless, with that caveat in mind, a willingness to sacrifice our existence in the past should be matched by a willingness to sacrifice at least something of value now or in the future to prevent or mitigate new atrocities. What would we be willing to sacrifice for the refugees from Syria or the potential victims of police violence, or the impoverished undocumented workers in our country — those whose troubles will help determine who our children and grandchildren are? What would we be willing to sacrifice to prevent the enormous consequences of climate change, which seem already to be multiplying their victims? And if we’re not prepared to make some sacrifice, what does this in turn say about our relation to the horrors that gave rise to us? Our relation to the past and our relation to the future are not entirely distinct from each other. In asking about one, we offer answers — and perhaps not answers we would prefer to acknowledge — to the other.
As a new year is upon us, then, we might do better to renew rather than to forget our old acquaintance with the past, and allow that to be a guide to our future.
Todd May is a professor of philosophy at Clemson University, and the author of, most recently, “A Significant Life.”
By ROBERT FRODEMAN and ADAM BRIGGLE JANUARY 11, 2016 3:21 AM
The Stone is a forum for contemporary philosophers and other thinkers on issues both timely and timeless.
The history of Western philosophy can be presented in a number of ways. It can be told in terms of periods — ancient, medieval and modern. We can divide it into rival traditions (empiricism versus rationalism, analytic versus Continental), or into various core areas (metaphysics, epistemology, ethics). It can also, of course, be viewed through the critical lens of gender or racial exclusion, as a discipline almost entirely fashioned for and by white European men.
Yet despite the richness and variety of these accounts, all of them pass over a momentous turning point: the locating of philosophy within a modern institution (the research university) in the late 19th century. This institutionalization of philosophy made it into a discipline that could be seriously pursued only in an academic setting. This fact represents one of the enduring failures of contemporary philosophy.
Take this simple detail: Before its migration to the university, philosophy had never had a central home. Philosophers could be found anywhere — serving as diplomats, living off pensions, grinding lenses, as well as holding university posts. Afterward, if they were “serious” thinkers, the expectation was that philosophers would inhabit the research university. Against the inclinations of Socrates, philosophers became experts like other disciplinary specialists. This occurred even as they taught their students the virtues of Socratic wisdom, which highlights the role of the philosopher as the non-expert, the questioner, the gadfly.
Philosophy, then, as the French thinker Bruno Latour would have it, was “purified” — separated from society in the process of modernization. This purification occurred in response to at least two events. The first was the development of the natural sciences, as a field of study clearly distinct from philosophy, circa 1870, and the appearance of the social sciences in the decade thereafter. Before then, scientists were comfortable thinking of themselves as “natural philosophers” — philosophers who studied nature; and the predecessors of social scientists had thought of themselves as “moral philosophers.”
The second event was the placing of philosophy as one more discipline alongside these sciences within the modern research university. A result was that philosophy, previously the queen of the disciplines, was displaced, as the natural and social sciences divided the world between them.
This is not to claim that philosophy had reigned unchallenged before the 19th century. The role of philosophy had shifted across the centuries and in different countries. But philosophy in the sense of a concern about who we are and how we should live had formed the core of the university since the church schools of the 11th century. Before the development of a scientific research culture, conflicts among philosophy, medicine, theology and law consisted of internecine battles rather than clashes across yawning cultural divides. Indeed, these older fields were widely believed to hang together in a grand unity of knowledge — a unity directed toward the goal of the good life. But this unity shattered under the weight of increasing specialization by the turn of the 20th century.
Early 20th-century philosophers thus faced an existential quandary: With the natural and social sciences mapping out the entirety of both theoretical and institutional space, what role was there for philosophy? A number of possibilities were available: Philosophers could serve as 1) synthesizers of academic knowledge production; 2) formalists who provided the logical undergirding for research across the academy; 3) translators who brought the insights of the academy to the world at large; 4) disciplinary specialists who focused on distinctively philosophical problems in ethics, epistemology, aesthetics and the like; or 5) some combination of these roles.
There might have been room for all of these roles. But in terms of institutional realities, there seems to have been no real choice. Philosophers needed to embrace the structure of the modern research university, which consists of various specialties demarcated from one another. That was the only way to secure the survival of their newly demarcated, newly purified discipline. “Real” or “serious” philosophers had to be identified, trained and credentialed. Disciplinary philosophy became the reigning standard for what would count as proper philosophy.
This was the act of purification that gave birth to the concept of philosophy most of us know today. As a result, and to a degree rarely acknowledged, the institutional imperative of the university has come to drive the theoretical agenda. If philosophy was going to have a secure place in the academy, it needed its own discrete domain, its own arcane language, its own standards of success and its own specialized concerns.
Having adopted the same structural form as the sciences, it’s no wonder philosophy fell prey to physics envy and feelings of inadequacy. Philosophy adopted the scientific modus operandi of knowledge production, but failed to match the sciences in terms of making progress in describing the world. Much has been made of this inability of philosophy to match the cognitive success of the sciences. But what has passed unnoticed is philosophy’s all-too-successful aping of the institutional form of the sciences. We, too, produce research articles. We, too, are judged by the same coin of the realm: peer-reviewed products. We, too, develop sub-specializations far from the comprehension of the person on the street. In all of these ways we are so very “scientific.”
Our claim, then, can be put simply: Philosophy should never have been purified. Rather than being seen as a problem, “dirty hands” should have been understood as the native condition of philosophic thought — present everywhere, often interstitial, essentially interdisciplinary and transdisciplinary in nature. Philosophy is a mangle. The philosopher’s hands were never clean and were never meant to be.
There is another layer to this story. The act of purification accompanying the creation of the modern research university was not just about differentiating realms of knowledge. It was also about divorcing knowledge from virtue. Though it seems foreign to us now, before purification the philosopher (and natural philosopher) was assumed to be morally superior to other sorts of people. The 18th-century thinker Joseph Priestley wrote that “a Philosopher ought to be something greater and better than another man.” Philosophy, understood as the love of wisdom, was seen as a vocation, like the priesthood. It required significant moral virtues (foremost among these were integrity and selflessness), and the pursuit of wisdom in turn further inculcated those virtues. The study of philosophy elevated those who pursued it. Knowing and being good were intimately linked. It was widely understood that the point of philosophy was to become good rather than simply to collect or produce knowledge.
As the historian Steven Shapin has noted, the rise of disciplines in the 19th century changed all this. The implicit democracy of the disciplines ushered in an age of “the moral equivalence of the scientist” to everyone else. The scientist’s privileged role was to provide the morally neutral knowledge needed to achieve our goals, whether good or evil. This put an end to any notion that there was something uplifting about knowledge. The purification made it no longer sensible to speak of nature, including human nature, in terms of purposes and functions. By the late 19th century, Kierkegaard and Nietzsche had proved the failure of philosophy to establish any shared standard for choosing one way of life over another. This is how Alasdair MacIntyre explained philosophy’s contemporary position of insignificance in society and marginality in the academy. There was a brief window when philosophy could have replaced religion as the glue of society; but the moment passed. People stopped listening as philosophers focused on debates among themselves.
Once knowledge and goodness were divorced, scientists could be regarded as experts, but there were no morals or lessons to be drawn from their work. Science derives its authority from impersonal structures and methods, not from the superior character of the scientist. The individual scientist is no different from the average Joe; he or she has, as Shapin has written, “no special authority to pronounce on what ought to be done.” For many, science became a paycheck, and the scientist became a “de-moralized” tool enlisted in the service of power, bureaucracy and commerce.
Here, too, philosophy has aped the sciences by fostering a culture that might be called “the genius contest.” Philosophic activity devolved into a contest to prove just how clever one can be in creating or destroying arguments. Today, a hyperactive productivist churn of scholarship keeps philosophers chained to their computers. Like the sciences, philosophy has largely become a technical enterprise, the only difference being that we manipulate words rather than genes or chemicals. Lost is the once common-sense notion that philosophers are seeking the good life — that we ought to be (in spite of our failings) model citizens and human beings. Having become specialists, we have lost sight of the whole. The point of philosophy now is to be smart, not good. It has been the heart of our undoing.
Robert Frodeman and Adam Briggle teach in the department of philosophy and religion at the University of North Texas. They are co-authors of the forthcoming “Socrates Tenured: The Institutions of 21st-Century Philosophy.”