Against experience: The perils of relativism

Created 23 Aug 2014 • Last modified 22 Aug 2022

I discuss a common thread in a wide variety of social problems, which is skepticism towards the notion of objective truth and reverence towards personal experience. These problems include pseudoscience, pseudomedicine, belief in the supernatural, political conspiracy theories, questionable philosophies and policies in the name of social justice, questionable educational practices, and a lack of concern for the consequences of one's own life. If we want to make the world a better place, we should believe in the world.

Empiricism and relativism

I think one of the biggest problems with today's society is widespread distrust of empiricism in favor of relativism. There are several problematic social trends that, although their causal connections to each other are unclear, reflect this core theme.

First, I'll explicate the theme a bit. I'm using neither "empiricism" nor "relativism" in an entirely standard way. By "empiricism", I mean all sorts of systematic knowledge and means of knowledge acquisition. Such means include reading textbooks, being instructed by experts, writing mathematical proofs, performing scientific experiments, conducting polls, deliberately honing a skill through practice and study, and simply systematically observing the world. By "relativism", I mean personal feelings regarded as knowledge, and means of knowledge acquisition that center on personal feelings. Superficially, this can involve many of the same activities as empiricism, such as reading books and being instructed by experts, but in the case of relativism, one is interested not in what has been demonstrated about some objective phenomenon, but in the perceptions, reactions, and interpretations of the participants. In empiricism, perception is a means to the end of knowledge of objective truths, whereas in relativism, perception is the end unto itself.

A key difference between empiricism and relativism, in fact, is the notion of objective truth. The empiricist assumption is that there is a single physical world that existed before there were any humans to think about it, and continues to exist as the ultimate source of all human experience. As the history of science shows, empirical methods are far from perfect, and humans (being naturally subjective creatures) are liable to misapply these methods, but they do tend to haltingly converge on consensus descriptions of the external world. Empiricism claims that these descriptions successively approximate correct descriptions, that is, the truth. Relativism rejects the centrality (or even, in some cases, the possibility) of objective truth in favor of opinions and consensus themselves. (See Haack, 1996, for a more detailed development of a form of empiricism, which Haack calls "innocent realism", that is robust to the criticisms of strong metaphysical realism that motivate the many forms of relativism. The basic idea is that there is only one real world, to which multiple descriptions correspond, and these descriptions can be compatible despite being different.)

So, empiricism and relativism are similar to Enlightenment and Romantic ideas, respectively. You could also call them positivist versus anti-positivist, as in the science wars. Relativism in particular is related to post-structuralism and postmodernism. In terms of political alignment, relativism has traditionally been most vigorously defended by left-leaning scholars and social activists, but to say that relativism is the left of empiricism, as I originally claimed when I first wrote this essay in 2014, is probably overgeneralizing. "Post-truth" politics, as popularized by fake news and Donald Trump, is a particularly visible and influential kind of right-wing relativism today.

In case it wasn't already clear, I'm not a neutral party when it comes to this divide. To put it bluntly: empiricism rules; relativism drools. In the following sections, I'll enumerate some of the dangers that relativism poses.

Pseudoscience and pseudomedicine

This is the most obvious and maybe the most important one. Many authors have documented and decried public resistance to scientific consensus on subjects like global warming, evolution by means of natural selection, psychic powers, ghosts, homeopathy, and vaccination. The lay public is willing to see scientists as authority figures, but no more than that. They see scientific opinion as nothing more than an opinion, giving no credit to the research that went into it. If their personal experience seems to contradict such opinions—if they vaccinate their child and their child develops autism—then of course they're going to believe what they saw with their own eyes over mere testimony. Similarly, if a friend's child is vaccinated and develops autism, a typical person is going to believe their friend before they believe a bunch of academics they've never met.

An emphasis on knowledge through feelings and personal experience also frequently appears when it comes to religion and traditional values. For example, people may say God exists because they feel his presence. Or they may decry homosexuality as immoral because of a feeling of disgust. Often these ideas are not pseudoscience per se because they don't make empirical claims, like "vaccines cause autism". But religion, in particular, is anti-scientific in a more abstract sense because it requires the existence of the supernatural (that is, phenomena that don't behave according to consistent rules), and one of the core assumptions of the scientific method is that the supernatural doesn't exist (that is, that everything behaves according to consistent rules). If the supernatural were allowed in scientific thinking, no empirical reasoning would be possible because any observation could be compromised at will by supernatural influences; the supernatural, by definition, need not be consistent. So it's not that science has shown, or can ever show, that no gods or ghosts or fairies exist; it's that the scientific method has empiricism as one of its foundational assumptions, and empiricism means not even entertaining the idea that such things could exist. (More precisely: it only allows us to entertain notions of God, ghosts, etc. that are falsifiable, hence not supernatural per se, and in practice, most aren't.) If we want to understand the world, we need to work on the assumption that it can be understood.

As a rule, the closer a domain of human life is to personal experience, the more likely people are to take a relativist rather than empiricist stance on it. So mathematics, being as far removed from personal experience as a field can be, is only treated relativistically by the canniest of philosophers (e.g., Lakoff & Núñez, 2000). Physics is slightly more approachable, since it's about the physical world rather than pure abstractions, so a few more people are willing to believe weird things about it: there are more crank physicists than crank mathematicians. By the time we get to psychology, though, laypeople are barely aware of the concept of treating human behavior as part of objective reality, let alone willing to believe empirical research about behavior over their own everyday experiences. In sociology, the relativist disease has infected a sizable number of researchers themselves, and in cultural anthropology, few empiricists are left in 2014. Empirical research that would have been conducted under the auspices of cultural anthropology in the early 20th century would now typically be in sociology or social psychology.

The cruel irony is that the closer a field is to personal experience, the more important it is to stick fast to empiricism, because the deluge of personal experience is liable to mislead us. (If you have any doubt that all those personal experiences do in fact mislead us, rather than happening to agree with research, look no further than Nisbett & Wilson, 1977.) In the case of algebraic geometry or something, at least we don't have many prejudices that apply.

The value of lived experience

Perhaps encouraged by longstanding relativist arguments from the humanities and non-empirical social sciences, many people see testimony of personal experiences—to use the academic term, "lived experiences"—as singularly valuable. Academic research uses open-ended interviews with small numbers of people to document lived experiences concerning everything from household chemical exposure (Altman et al., 2008) to the use of Playboy magazine in the construction of one's masculine self-concept (Beggan & Allison, 2003). Qualitative research of this kind can hardly be expected to do more than reproduce the prejudices of the respondents. In the words of Haack (1995, pp. 400–401),

It can be granted without further ado that all persons should be treated with respect; and that it is undesirable to encourage an attitude of suspicion or disrespect for what is unfamiliar merely because it is unfamiliar. But it doesn't follow, and neither is it true, that all opinions, practices, institutions, traditions, are equally deserving of respect.…

I find much to admire in the life of the Kalahari bushmen: their closeness to the natural world, the vigor of their cave paintings, their delight in music and dancing, their taking for granted, in the extraordinary harshness of their conditions of life, that "if one eats, all eat." And thinking about the remarkable ingenuity of the triple-jointed poisoned arrows with which they hunt their game, I am set to wondering in a new way about what the social or intellectual conditions were that led to the rise of modern science in seventeenth-century Europe. But it doesn't follow, and neither is it true, that Bushman myths about the origin of the world or the causes of the seasons, and so forth, are on a par with the best scientific theorizing.

Beyond the pages of scholarly journals, interest in lived experiences appears most reliably in the context of anti-discrimination, when businesses, schools, and other organizations are attempting to make their institution less prejudiced against oppressed groups such as women, black people, and gay people. It is entirely proper, from an empirical perspective, that oppressed people such as these are polled as part of a comprehensive effort to understand how the oppressed are unduly disadvantaged. But an empirical perspective also recognizes that the oppressed are not the only sources of information on this matter, and may be mistaken, and (like all human beings) are inclined to see themselves and their friends as blameless and whoever opposes them as stupid, malicious, or both. Empiricists need not reject the common-sense principle that each person is the sole authority on their own subjective experience: you're automatically right about your own immediate feelings, just not about anything else. Too often, people in a mindset of anti-discrimination take a relativist perspective that the lived experiences of the oppressed are not to be questioned. For example, Schlosser (2015) describes how universities may take a student's expression of discomfort as unimpeachable evidence of a professor's wrongdoing.

Now, somebody committed to relativism itself would not be happy with thinking about anti-discrimination like this, because treating all opinions as equal would require treating the opinions of the oppressor class as equal to the opinions of the oppressed. In practice, due to familiar sociopolitical alliances, relativist self-proclaimed champions of multiculturalism, social justice, and the like generally see the oppressed as epistemically (and morally) superior. In debates about gender, a woman's opinion is preferred to a man's; in debates about race, a black person's opinion is preferred to a white person's; and so on. On the Internet, one can find extreme forms of this attitude, such as the assertion that white people's opinions about their own actions are irrelevant (Alexa, 2012):

Dear White people who have opinions on whether or not certain behaviors are racist against PoC [people of color; i.e., nonwhites]… the next time you want to inject your opinions about whether your actions are hurtful to another group, remember: you are not part of the affected group. Therefore, nobody gives a fancy flying tin foil sugar coated fuck about your opinions on it.

In what sense is this philosophy still relativist? In that perceptions of one kind or another, not the external world, are the ultimate epistemic authority.

Defining gender

A related idea shows up concerning gender, in the context of transgenderism. The idea is that people are the sole authority on their own gender. Adam (2012) puts it like this (emphasis in original):

Our cultural framework tends to tell us that their [people's] bodies may contradict their statements — that there's no way you could be a guy with XX chromosomes, or a genderless person with an obvious beard. But the trans person is the one who's right, and the simplistic framework is the model that's wrong. Gender is not dependent on physical appearances, or on the word of doctors, friends, family. The individuals are the ones who get to assert their own identity.

Transsexual women genuinely are women, regardless of their being born with bodies that people tend to consider "male." Transsexual men genuinely are men, no less so than any other man.

Some writers go further still and argue that, for example, it is factually incorrect to refer to someone who is anatomically male as "male-bodied" (or "anatomically male", for that matter) so long as the person considers herself female. Because, the logic goes, if the person is female, so is her body.

These are relativist views of gender because, again, they grant subjective experience ultimate authority. They forbid us from defining "male" or "female" in terms of observable phenomena, such as anatomy (does the person have a penis?) or social behavior (does the person use the men's room or the women's room?). Notice that empirical notions of gender are what people use in most contexts, lay and scientific. The idea of explicitly asking people their preferred pronouns hasn't caught on in everyday social interaction. We just infer people's genders on the basis of clothes, hairstyle, voice, body shape, and many other factors, and wait to be corrected.

Is this only a matter of semantics, of what the words "gender", "male", and "female" mean? Possibly, although the intensity of these claims ("Transsexual women genuinely are women"; emphasis in original) suggests the writers think that empirical notions of gender are somehow wrong. I've argued that the question of who counts as "female" etc. is something of a red herring; our real concern should be how to make gender-related decisions, such as bathroom segregation, and these questions are ultimately independent of how you define gender.

Living a full life

I introduced the distinction between empiricism and relativism as an epistemological one, that is, as a distinction between conflicting notions of what counts as knowledge and how to get it. But the distinction also has an ethical dimension; that is, it can also be a distinction between ideas of how one should spend one's life.

Ethical relativism (and here, I admit, I'm getting even further from standard uses of the word "relativism"), like epistemological relativism, emphasizes the value of personal experience. A life well lived is a life of rich and varied experiences, called a "full life". Ethical empiricism, in accordance with a focus on the external world, judges a person's life in terms of consequences on the world. What ultimately matters is not a person's experiences during life but what's left behind after they're dead. A life well lived is a life that makes the world a better place.

Ethical relativism serves as the justification for a lot of arguably irrational behavior. For example, a lot of people go to college, spending years of their youth and tens of thousands of dollars, not in order to receive specific training or to qualify for a specific career but for the sheer undergraduate experience (which is related to the thinking behind the liberal arts). Likewise people may eat in a way that they themselves believe to be unhealthy because they see little point in a longer lifespan if they can't spend it guzzling soda (in the words of a villain from a James Bond movie, "There's no point in living if you can't feel alive"). Likewise people may perform dangerous stunts, such as rock climbing with insufficient equipment. Many academics and government agencies would like to persuade the public to be more prudent—to wash their hands, wear seatbelts, avoid recreational drugs, and use condoms—but this will be an uphill battle so long as the public sees a sheltered, sterilized life as not worth living.

My perception is that in the ethical domain, relativism is a lot more common than empiricism, even among intellectuals who are strong epistemological empiricists. I think it's particularly common for atheists to be ethical relativists because, without a divine calling, they see no alternative to living for living's sake.

Experiential education

Over the past century, interest has increased in learning by doing. To be sure, learning a skill requires practice, and sometimes one needs a skill just to grasp a concept: for example, one can't expect to understand much of real analysis without knowing how to prove a theorem. Experiential education goes much further than this and has students do elaborate activities in the name of reinforcing simple ideas. To give a particularly silly but real example, in elementary school (in Tribeca, a well-to-do neighborhood of Lower Manhattan, in the 1990s), my class was doing a unit on Arctic wildlife. One of the facts we were to learn was that animals like polar bears have thick layers of fat to keep themselves warm. To help learn this, we had to smother our hands with shortening and plunge them into ice-cold water. I wasn't allowed to cease this activity until I reported that my hand didn't feel cold in the water, which was frustrating not least because I knew that fat insulated against cold, the teacher knew it, and yet I still had to do this song and dance that didn't even serve an evaluative purpose. There's evidently no need to make somebody learn something the hard way (by trying it) if you can just tell them, as the teacher had already told me.

Why would a teacher bust out the Crisco and ice water when they could just give a lecture? Some theories of education may suggest this will lead to more effective learning, such as Howard Gardner's theory of multiple intelligences (the activity would exercise bodily-kinesthetic intelligence), or the theory of learning modalities (kinesthetic learners would benefit). However, my (limited) understanding of educational psychology is that these theories have fared poorly in empirical tests (e.g., see Hunt, 2011, for a description of the thin support for Gardner's theory of multiple intelligences). I'm inclined to think that the appeal of experiential education, and hence multiple intelligences and learning modalities, is that people are more interested in experiences than in testable, practical knowledge. Which sounds more likely to appeal to parents of a certain mindset: a school where kids read textbooks, or a school where kids maintain their own ant farms and go kayaking?

Experiential education may also be justified as an attempt to train not the specific skill being called upon—like maneuvering a kayak, or smearing ersatz blubber on one's hands—but much more general capacities, like critical thinking, problem-solving, self-control, teamwork, and civic responsibility. This seems an admirable goal. Critical thinking, for example, is something we do all the time, and it's obviously necessary for success in every domain of life. The problem is that these capacities don't, in fact, seem to be trainable, unitary skills. It is a general finding in psychology that skill transfer—people's capacity to apply skills gained in one domain to another domain—is astonishingly weak. For example, Sims and Mayer (2002) found that the skill in manipulating Tetris shapes gained from playing hours of Tetris did not transfer to similar shape-manipulation problems with shapes other than those in Tetris. If a change of shape is enough to make my Tetris skills useless, can I expect my Tetris skills to transfer all the way to problem-solving in some entirely unrelated domain? On a similar note, my training in experimental psychology has taught me how to think critically about psychology, but this knowledge doesn't help at all if I try to think critically about physics. If I want to know how to read a physics paper, I'll have to learn physics.

Finally, note that ideas similar to experiential education also appear in undergraduate settings. The whole liberal-arts philosophy of introducing people to diverse fields of scholarship and making them into generally educated people puts a lot of faith into the idea of experience as the essence of learning, and can distract from any more focused educational goals (will reading the Iliad make you a better physical therapist?). To do one of those short-term study-abroad programs, in an age when text and images can circumnavigate the globe in an instant, is to take this relativist view on education to its logical extreme.

Celebrating diversity

Another troubling aspect of the aforementioned organizational anti-discrimination efforts is the attitude that the differences between people on which discrimination is based are good. It is construed as good not only that any given dimension of difference (gender, race, etc.) exists, but that people with wide variability on the dimension coexist in the organization (e.g., that the organization comprises people of many different races). This philosophy is called "celebrating diversity" or "multiculturalism". (Confusingly, the word "multiculturalism" is also used to mean the mere existence of differences between people, or tolerance of different people. Here I'll use it exclusively to mean celebration.)

Cynically, one could interpret celebrating diversity as an attempt to do lip service to principles of social justice without making aggressive policy changes or admitting that, given that every organization is part of a much larger society which is itself prejudiced, there is only so much one organization can do. But celebrating diversity can also be justified from relativism. What better way to get people with all sorts of experiences into your organization than to get people of all sorts of races?

Here is the weird thing about multiculturalism. Do proponents of multiculturalism believe that, except for the effects of prejudice, different categories are equally good? That it's no better to be white than black or vice versa, not even in certain domains? If so, the only possible reason for any disadvantage a category suffers is prejudice—they're mistreated on account of their category. All that ostensible diversity is a mere distraction from individual differences, a hindrance to treating individuals according to their merits. Diversity can only do harm, so we should be finding ways to overcome it, not celebrate it. I suppose the argument would be that the differences between categories are still meaningful even if there is no meaningful sense in which one category is better than another. But this is nonsense. In what sense can a feature of a person matter if it never gives them some advantage or disadvantage, if it's never desirable to have it or lack it? Well, if you're a relativist, I suppose you can believe that different categories are simply incommensurable.

Other proponents of multiculturalism might indeed believe that, for example, black people are better than white people, at least when it comes to, say, moral decision-making. This seems odd in that the whole business of multiculturalism is ostensibly combating discrimination on the basis of category, and here we are, discriminating on the basis of category (and not merely with the goal of compensating for other discrimination, as in affirmative action). All I can say besides that is that judgments about which categories are better than which others should be based on empirical research, and lots of it. The history of science provides many examples of investigators' preexisting prejudices getting the better of them in such politically sensitive work.


References

Adam. (2012). TransWhat? • Confused? Start here.

Alexa. (2012, September 27). Dear White people who have opinions on whether or not certain behaviors are racist against PoC.

Altman, R. G., Morello-Frosch, R., Brody, J. G., Rudel, R., Brown, P., & Averick, M. (2008). Pollution comes home and gets personal: Women's experience of household chemical exposure. Journal of Health and Social Behavior, 49(4), 417–435. doi:10.1177/002214650804900404

Beggan, J. K., & Allison, S. T. (2003). What sort of man reads Playboy? The self-reported influence of Playboy on the construction of masculinity. Journal of Men's Studies, 11(2), 189–206. doi:10.3149/jms.1102.189

Haack, S. (1995). Multiculturalism and objectivity. Partisan Review, 62(3), 397–405.

Haack, S. (1996). Reflections on relativism: From momentous tautology to seductive contradiction. Noûs, 30, 297–315.

Hunt, E. (2011). Taking intelligence beyond psychometrics. In Human intelligence. New York, NY: Cambridge University Press. ISBN 978-0-521-70781-7.

Lakoff, G., & Núñez, R. E. (2000). Where mathematics comes from. New York, NY: Basic Books. ISBN 978-0-465-03771-1.

Nisbett, R. E., & Wilson, T. D. (1977). Telling more than we can know: Verbal reports on mental processes. Psychological Review, 84, 231–259. doi:10.1037/0033-295X.84.3.231

Schlosser, E. (2015, June 3). I'm a liberal professor, and my liberal students terrify me.

Sims, V. K., & Mayer, R. E. (2002). Domain specificity of spatial expertise: The case of video game players. Applied Cognitive Psychology, 16(1), 97–115. doi:10.1002/acp.759