Whence pseudoscience?
An epidemiological approach
In this paper, we develop an epidemiological approach to account for the typical features and persistent popularity of pseudoscience. An epidemiology of pseudoscience aims at explaining why some beliefs become widely distributed whereas others do not, and hence seeks to identify the factors that exert a causal effect on this distribution. We pinpoint and discuss several factors that promote the dissemination of pseudoscientific beliefs. In particular, we argue that such beliefs spread widely because they are intuitively appealing, hitchhike on the authority of science, and successfully immunize themselves against criticism.
Keywords: pseudoscience, epidemiology of representations, human cognition, epistemic vigilance, science mimicry.
«Systems of misbeliefs can develop adaptive rationales of their own, which cross-cut human purposes and intentions»
Introduction
Since the beginning of the modern era, science and technology have made immense progress. We have probed the strangest realms of physical reality, unravelled the evolutionary origins of biological complexity and diversity, and we are making new discoveries about the peculiarities of the human mind every day. We have developed tools to look deep into the universe and its past; we prevent, cure and have even eradicated diseases that have cost millions of lives; and we apply our knowledge of genetics to develop new medicines and plants that reduce the use of insecticides in agriculture. Yet, despite these impressive scientific and technological advances, people continue to believe the weirdest things. Pseudoscience and neighbouring types of irrational belief remain rampant. Creationists insist that God created the universe and life on earth no more than 10,000 years ago; highly educated groups oppose vaccination and prefer quackery such as homeopathy to modern medicine; and radical environmentalists scare people into opposing a technology that contributes to the development of sustainable agriculture. Why do such irrational beliefs remain so popular and persistent? To answer this question, we have developed an epidemiological approach in a series of papers. In this review, we explain what such an approach entails, and we summarize and discuss our most important findings.
An epidemiology of representations
The term epidemiological refers to the epidemiology of representations, a naturalistic model of culture first developed by cognitive anthropologist and philosopher Dan Sperber (1996). According to this model, culture is not a thing but a property, namely the property of being more or less widely distributed. Thus, to explain culture is to explain why some items (ideas, practices, artefacts, beliefs, etc.) are more popular than others. The analogy with the epidemiology of diseases runs as follows: just as human bodies are vulnerable to certain pathogens and not others, so human minds are more susceptible to particular beliefs (and other representations) than to others. Some representations are more contagious than others. We submit that pseudoscience consists of such highly contagious beliefs. Hence, to explain the popularity of pseudosciences, we need to explain why these beliefs appeal so strongly to so many people.
The epidemiological approach aims at explaining macro-level cultural phenomena in terms of micro-level interactions between individuals. As such, it is, as Lewens (2015) puts it, a «kinetic theory», analogous to the theory of gases that accounts for macroscopic phenomena in terms of processes at the molecular level. In certain contexts, it is useful to adopt the perspective of cultural representations «adapting» to our mental susceptibilities over the course of many microscopic interactions. For instance, we can say that portrait art «exploits» our face recognition system (Morin, 2013).
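To make the analogy concrete, consider the toy simulation below. It is our illustration only, not a model from the epidemiological literature; the population size and the per-encounter «appeal» values are arbitrary assumptions. It shows how a macro-level distribution of representations emerges from nothing but repeated micro-level transmission events between individuals.

```python
# Toy illustration (ours): beliefs spreading through random pairwise
# encounters. APPEAL gives the hypothetical probability that a belief is
# passed on in a single encounter; everything else is held equal.
import random

random.seed(42)

POPULATION = 1000                # number of individuals
ENCOUNTERS = 20000               # random one-on-one meetings
APPEAL = {"A": 0.30, "B": 0.05}  # assumed per-encounter transmission odds

# Each mind holds a set of beliefs; seed ten initial carriers of each.
minds = [set() for _ in range(POPULATION)]
for i in range(10):
    minds[i].add("A")
    minds[i + 10].add("B")

for _ in range(ENCOUNTERS):
    speaker, hearer = random.sample(range(POPULATION), 2)
    for belief in minds[speaker]:
        if random.random() < APPEAL[belief]:
            minds[hearer].add(belief)

for belief in ("A", "B"):
    count = sum(belief in mind for mind in minds)
    print(f"Belief {belief}: held by {count}/{POPULATION} individuals")
```

Despite identical starting conditions, the more «contagious» belief ends up far more widely distributed, which is all that «cultural» means on this account.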
In the case of pseudoscience, this representation-centric approach brings two important advantages. First, it allows us to get a better grip on the cui bono question, and to address questions that are difficult to resolve in a traditional framework. Instead of asking what is in it for the purveyors and believers of pseudoscience, we can ask what is in it for the beliefs themselves. As we have explained in detail elsewhere (Boudry & Hofhuis, 2017), systems of misbeliefs can develop adaptive rationales of their own, which cross-cut human purposes and intentions. Sometimes the interests of individuals and beliefs overlap, but certainly not always. This is the case, for instance, when a person stops medical treatment for cancer and opts for ineffective herbal remedies. The beliefs may prosper, but the patient is unwittingly harming himself. In such cases, the beliefs themselves provide the only available repository of purposes, i.e., the only answer to the cui bono question. Second, from a populational point of view, we can afford to be agnostic about the question of intentionality on the part of purveyors of pseudoscience. Some might be conscious frauds who willingly mislead people, but in other, and perhaps most, cases they are true believers themselves, having no clue as to why they have adopted these particular beliefs. Beliefs are shaped and distributed through cognitive and communicative processes, as if by the workings of an invisible hand (Boyer, 2001). In this sense, individuals are but links in the causal cognitive chains through which beliefs spread (Sperber, 1996).
«Many irrational beliefs tend to adopt the trappings of science. This is what justifies our calling them ‘pseudoscience’»
Hence, we will talk about beliefs as intentional agents, adopting certain strategies and so on. However, we will use such intentional language only in the way evolutionary biologists talk about organisms adapting to their environment, as for instance when a frog is said to have evolved flashy colours in order to scare off predators. In fact, the frog has no idea why it has such colours. It is not even aware that it has a skin, let alone a coloured one. Natural selection has done the «thinking» for the animal. Similarly, cultural evolutionary processes shape beliefs by adapting them to the peculiarities of the human mind and the environments the mind interacts with. As such, patterns emerge that create the impression of beliefs strategically transforming in ways that maximize their interests (Blancke, Boudry, & Pigliucci, 2017). In the following, we briefly discuss three strategies that pseudoscientific beliefs have adopted to expand and stabilize their cultural success: intuitive appeal, science mimicry and immunization against criticism.
«People will prefer beliefs with a scientific certificate, so that they, in turn, can use it as an argument to justify their beliefs and convince others»
Intuitive appeal
One important factor that determines the shape and popularity of beliefs is the universal make-up of the human mind. The epidemiology of representations predicts that, ceteris paribus, the beliefs that manage to tap into our intuitive expectations stand the greatest chance of becoming popular and thus cultural. These expectations are constituted by our intuitive ontologies, which generate non-conscious, automatic and spontaneous inferences about particular relevant domains of the world around us (Boyer & Barrett, 2005). From a very young age, for instance, children have the intuitive expectation that inanimate objects will not move by themselves and that they will not suddenly disappear (Spelke, 1990). These expectations are part of our intuitive physics. We also hold intuitions about the biological world, an intuitive biology. Psychological essentialism, for instance, is the mental disposition by which we assume that an organism contains an invisible and immutable core (an essence) that determines its behaviour, development and identity (Gelman, 2004). Another intuition, teleological thinking, explains natural and biological phenomena in terms of their function or goal: for instance, that rain exists to water the plants, or that lions exist to be displayed in the zoo (e.g., Kelemen, 1999). We also have an intuitive psychology, by which we automatically explain other people’s behaviour in terms of their mental states, such as intentions and emotions. Being an exceptionally social species, such thinking comes very naturally to us, which explains why we also overextend it to natural objects and events.
These intuitive expectations are highly robust, and strongly affect how beliefs will transform and stabilize, in other words, what sorts of beliefs will become cultural. Pseudoscience owes its cultural success largely to the fact that it manages to exploit our intuitive expectations (Boudry, Blancke, & Pigliucci, 2015). Our essentialist inclinations render us vulnerable to creationist beliefs that species are immutable and fixed categories, and that they have remained more or less the same since creation, with no change across species barriers (Blancke & De Smedt, 2013). They also make us susceptible to belief in homeopathy, which holds that water retains the essence of a substance even after that substance has been diluted to the point where not a single molecule can be detected anymore. Essentialism also underlies the widespread opposition to genetically modified organisms, in that it makes people more critical of applications involving transgenesis, i.e., modifications using DNA from a different species – even though DNA is DNA, no matter where it comes from (Blancke, Van Breusegem, De Jaeger, Braeckman, & Van Montagu, 2015). Teleological and intentional intuitions make us vulnerable to creationist beliefs and new-age beliefs about Mother Nature, but also to conspiracy theories and beliefs about UFOs and alien abduction, all of which postulate intentional agents where none are present. The fact that the mind holds a system dedicated to dealing exclusively with minds and not bodies also makes us vulnerable to dualist assumptions, which, in turn, make people susceptible to believing in ghosts and other non-bodily agents. Hence, an epidemiological perspective not only explains the typical features of pseudoscientific beliefs, but also accounts for their popularity and persistence. Because such beliefs tap into our intuitions, people can easily grasp, remember and communicate them. Scientific beliefs, in contrast, are often highly counterintuitive, demanding considerably more cognitive effort to process. Hence, they cannot simply hitchhike on human minds to become popular, but require specific institutional support. The unnaturalness of science puts it at a serious disadvantage, leaving plenty of fertile ground for more natural, but irrational, beliefs (Boudry et al., 2015).
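This asymmetry can be sketched in the same toy idiom. Again, this is our illustration only; the attractor position, distortion rate and correction strength are made-up parameters. Counterintuitive content drifts towards its intuitive «attractor» with every retelling, unless an institution periodically pulls it back.

```python
# Toy sketch (ours): a belief represented as a point between an intuitive
# "attractor" (0.0) and its counterintuitive scientific form (1.0). Each
# retelling distorts it towards the attractor; institutional correction
# (schooling, textbooks) periodically pulls it back. All parameters are
# arbitrary assumptions, chosen only to make the contrast visible.
ATTRACTOR, SCIENCE = 0.0, 1.0
PULL = 0.2        # assumed fraction of the gap closed per retelling
CORRECTION = 0.5  # assumed strength of institutional correction

def after_retellings(steps, corrected):
    belief = SCIENCE
    for step in range(1, steps + 1):
        belief += PULL * (ATTRACTOR - belief)          # intuitive distortion
        if corrected and step % 5 == 0:
            belief += CORRECTION * (SCIENCE - belief)  # institutional repair
    return belief

print(f"Unsupported after 30 retellings: {after_retellings(30, False):.3f}")
print(f"With institutional support:      {after_retellings(30, True):.3f}")
```

Without support, the content collapses almost entirely onto its intuitive version; with periodic correction, it stabilizes somewhere in between, which is roughly where popular understanding of science tends to sit.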
Science mimicry
Although irrational beliefs fly in the face of science, many of them also tend to adopt the trappings of science. This is what justifies our calling them «pseudoscience». They pretend to be scientific, but they fail to live up to the criteria of epistemic warrant and rationality we expect from good science (Hansson, 2009). But why would weird beliefs emulate science? The reason is that many people regard science as an epistemic authority, a trustworthy source of information. Even though people may dislike some of the findings of science, they are still impressed by its technological success and cultural prestige. As a result, it pays to present your beliefs as bearing the imprimatur of «science».
«Purveyors of pseudoscience will try their very best to have their papers published in respectable academic journals»
This is an instance of what Sperber et al. (2010) have termed «epistemic vigilance», i.e., the ability to discriminate between reliable and unreliable bits of information. In an uncertain world, people need to distinguish reliable reports from mere rumour, and trustworthy sources from liars. Any organism that opens itself up to outside information, but fails to exercise some measure of epistemic vigilance, would be a sitting duck for manipulators and liars. Epistemic vigilance is exercised in two broad ways: towards the content of a belief and towards its source. When assessing new information, people can check the content of the information for internal consistency and for coherence with background beliefs. Alternatively, they can check the source of the information and examine whether it is competent and knowledgeable, whether it has a good reputation, and whether it has any hidden agenda. In other words, despite what some popular psychology books tell us (e.g., Ariely, 2009), people are not gullible fools who accept just about anything they hear on the spot.
«In order to stand a decent chance of cultural survival, therefore, pseudoscience needs protective measures against reality»
For any belief to stand a chance of success in the competition for human minds, therefore, it needs to get past these screening procedures. As we discussed at length above, pseudoscientific beliefs tend to tap into our intuitive assumptions, which means that they are more likely to cohere with our background beliefs. As a result, people will lower their vigilance. Furthermore, because pseudoscientific beliefs tend to adopt all the outward trappings of science, people tend to consider them a trustworthy source of information. People hold science in high esteem, but often they have only a limited understanding of what exactly «science» amounts to, and in virtue of which features science has accrued cultural prestige (namely, that it consists of practices that generate the best available knowledge within a certain domain). They might be impressed by the technological advancements that science delivers, or they might ascribe authority to science merely on the grounds of its abstruse and technical language, its use of sophisticated equipment and experimentation, and its reliance on quantifiable results and statistics. In itself, this is not a problem, as these are often pretty good indications of scientific standing. However, this situation creates opportunities for irrational beliefs. They can mimic the outward features of science to create a trustworthy impression, thus exploiting the mechanisms for epistemic vigilance. Purveyors of pseudoscience will try their very best to have their papers published in respectable academic journals, and they will flaunt their academic credentials in order to convince people that their beliefs are trustworthy. Apart from that, we should also take into account the limitations of epistemic vigilance itself. The mechanisms for epistemic vigilance evolved to deal with one-on-one interactions, in which it is relatively easy to check the content and gauge the trustworthiness of the source, not with complex matters of science and pseudoscience, where the content is abstruse and often (partly) incomprehensible, and the reliability of the source depends on complex chains of trust and expertise.
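Schematically, and purely as our own illustration (the weights and threshold below are arbitrary stand-ins), one can think of vigilance as a two-channel screen that mimicry games from the source side:

```python
# Schematic sketch (ours): epistemic vigilance as two screening channels,
# one for content coherence with background beliefs, one for source trust.
# Weights and threshold are arbitrary, purely for illustration.
def passes_vigilance(content_coherence, source_trust, threshold=0.5):
    """Accept a claim if the combined screening score clears the bar."""
    return 0.5 * content_coherence + 0.5 * source_trust >= threshold

coherence = 0.6  # an intuitive claim already coheres well with priors...

# ...so mimicry only needs to inflate the source channel, not the warrant.
print(passes_vigilance(coherence, source_trust=0.2))  # bare assertion: False
print(passes_vigilance(coherence, source_trust=0.7))  # jargon, journals: True
```

The point of the sketch is simply that neither channel tracks actual epistemic warrant: lab coats, jargon and journal publications raise the source score without improving the claim itself.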
Most pseudosciences will not only mimic the features of science, but also explicitly claim the honorific title of science. In other words, the quality label of science is explicitly used as an argument to persuade people and thus to exploit people’s epistemic vigilance (on this function of arguments, see Mercier & Sperber, 2011, 2017). In a cultural environment where science is acknowledged as an epistemic authority, emphasizing that a belief is scientific – even if it is not – is a convincing argument indeed. Moreover, at the receiver’s end, people will prefer beliefs with a scientific certificate, so that they, in turn, can use it as an argument to justify their beliefs and convince others. The result is that, in particular cases, irrational beliefs that mimic science outcompete similar beliefs that do not. In other cases, pseudoscientists will downplay the authority of science, pretending to offer an «alternative way of knowing» that supposedly stands on an equal footing with science. And sometimes, as in the case of so-called «creation science», the two strategies are merged.
The attraction of pseudoscience is not just a cognitive phenomenon, but has a motivational component as well, which is captured by the notion of epistemic negligence: the idea that people are lazy reasoners. They are easily satisfied with beliefs and arguments that they have come to hold on an intuitive basis or on the basis of trust. Understanding scientific concepts and theories requires a lot of effort, an investment which most people simply – and somewhat understandably – are not prepared to make. As a result, even people who profess belief in modern science have only a superficial understanding of the relevant theories and concepts, which they tend to distort into more intuitive representations. For instance, even people who endorse the theory of evolution still harbour teleological intuitions about the direction of evolution, and still struggle with a purely populational understanding of species boundaries. As a result, popular conceptions of modern scientific theories actually border on pseudoscience anyway. This closes the mental gap between real science and pseudoscience, thus creating the ideal circumstances for pseudoscience to flourish and to present itself as the real thing (for more details, see Blancke et al., 2017).
Immune to criticism
Even though some irrational beliefs have an advantage over scientific ones, in that they typically tap into our intuitions, they also have a significant drawback. They are potentially destabilized by falsifying evidence, and by rational criticism. This is where scientific beliefs have a head start: because they are supported by empirical evidence and are internally consistent, they can afford to expose themselves to empirical testing.
Every pseudoscience, one way or another, is confronted with this resistance of the world out there. If pseudosciences want to hold sway over human minds, intuitive appeal and cultural mimicry will not be enough. Beliefs that are false in a manner that is immediately obvious or open to inspection are unlikely to gain wide acceptance, even if they are intuitive. As we noted above, people are not as gullible as is often supposed. If a belief is palpably false, people will be unlikely to endorse it.
In order to stand a decent chance of cultural survival, therefore, pseudosciences need protective measures against reality. In one way or another, they have to ensure that they are never threatened by empirical evidence and rational criticism. For this reason, one of the recurring features of pseudosciences is their reliance on immunizing strategies, which inoculate the theory against falsification and criticism (Boudry & Braeckman, 2011, 2012). There are many different ways to forestall falsification and prevent critical scrutiny. Many pseudosciences contain theory-internal explanations for opposition against the belief system itself, which Boudry and Braeckman dubbed «epistemic defence mechanisms». For instance, Sigmund Freud famously suggested that the opposition against psychoanalysis is a ringing confirmation of one of its main predictions: that critics are under the spell of unconscious resistance, desperate to cover up the inconvenient truths of Freudian theory. Scientologists and Marxists have constructed their own versions of the resistance argument. It is quite a neat device, as it provides a trump card that can be played in any discussion, against any given argument.
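A final toy sketch (ours again; the encounter rate and abandonment probabilities are invented) shows why such defence mechanisms pay off in populational terms: a belief that deflects disconfirming evidence loses its carriers far more slowly.

```python
# Toy sketch (ours): carriers of a belief repeatedly meet disconfirming
# evidence; an "immunized" belief makes abandonment far less likely.
# All rates are invented, chosen only to make the contrast visible.
import random

random.seed(7)

def carriers_left(drop_odds, carriers=1000, rounds=50, evidence_rate=0.3):
    """Each round, every carrier meets disconfirming evidence with
    probability evidence_rate and then abandons the belief with
    probability drop_odds."""
    for _ in range(rounds):
        remaining = 0
        for _ in range(carriers):
            if random.random() < evidence_rate and random.random() < drop_odds:
                continue  # this carrier gives up the belief
            remaining += 1
        carriers = remaining
    return carriers

print("Exposed belief:  ", carriers_left(drop_odds=0.20))  # few remain
print("Immunized belief:", carriers_left(drop_odds=0.01))  # most remain
```

Even without any advantage in recruitment, the immunized belief retains most of its carriers after the same exposure to criticism.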
If a belief system postulates invisible intentional agents, as many pseudosciences do, a whole range of immunizing strategies opens up: the secret conspirators may be planting false evidence to throw us off the scent, and the visiting extraterrestrials may wish to escape detection by earthlings. Ghosts can play hide and seek, and the devil may tempt us with clever sceptical arguments (some creationists believe that Satan himself whispered the idea of evolution in Darwin’s ear).
Parapsychology has a whole set of built-in defence mechanisms for fending off unwelcome findings. In particular, many parapsychologists believe that the presence of inquisitive minds disturbs psychic phenomena, an effect they call «negative psi vibration» or «catapsi» (notice the technical jargon). They also believe that psi forces are shy and actively evade detection, thus explaining the lack of empirical evidence.
«Parapsychology has a whole set of built-in defence mechanisms for fending off unwelcome findings»
Another immunizing tactic is to turn central concepts and claims into moving targets, amenable to a range of interpretations. Astrology and assorted forms of fortune-telling provide good cases in point. Horoscopes look as if they contain specific predictions or interesting observations about your character, but as soon as they are threatened with falsification, they become vague or turn into metaphors.
Concluding remarks
Why do irrational beliefs still thrive in the age of science, and why do they often adopt the trappings of science? An epidemiological approach to culture allows us to answer these questions. First, many such beliefs, though they have no basis in reality, appeal to universal human intuitions. This gives them a significant head start over scientific beliefs, which are famously inimical to our intuitive worldview. Second, because science is held in high esteem in our culture, mainly in virtue of the technological fruits it bears, it pays for irrational beliefs to adopt the outward trappings of science. Given that people have a poor understanding of what grounds the authority of science, they will have a hard time telling the difference, and they will fall for this sort of cultural mimicry. Ironically, explicitly claiming the imprimatur of science has proved a successful strategy, even for beliefs that are anything but scientific and that are roundly rejected by the scientific community. Third, even though pseudoscientific beliefs, unlike scientific ones, have no basis in reality, they have evolved clever tricks to avoid exposure to destabilizing falsifications and to escape critical scrutiny. Thus, pseudosciences have developed their own immune systems, ensuring their hold over people’s minds.
References
Ariely, D. (2009). Predictably irrational: The hidden forces that shape our decisions (rev. and exp. ed.). New York: Harper Collins.
Blancke, S., Boudry, M., & Pigliucci, M. (2017). Why do irrational beliefs mimic science? The cultural evolution of pseudoscience. Theoria, 83(1), 78–97. doi: 10.1111/theo.12109
Blancke, S., & De Smedt, J. (2013). Evolved to be irrational? Evolutionary and cognitive foundations of pseudosciences. In M. Pigliucci & M. Boudry (Eds.), The philosophy of pseudoscience (pp. 361–379). Chicago: The University of Chicago Press.
Blancke, S., Van Breusegem, F., De Jaeger, G., Braeckman, J., & Van Montagu, M. (2015). Fatal attraction: The intuitive appeal of GMO opposition. Trends in Plant Science, 20(7), 414–418. doi: 10.1016/j.tplants.2015.03.011
Boudry, M., Blancke, S., & Pigliucci, M. (2015). What makes weird beliefs thrive? The epidemiology of pseudoscience. Philosophical Psychology, 28(8), 1177–1198. doi: 10.1080/09515089.2014.971946
Boudry, M., & Braeckman, J. (2011). Immunizing strategies and epistemic defense mechanisms. Philosophia, 39(1), 145–161. doi: 10.1007/s11406-010-9254-9
Boudry, M., & Braeckman, J. (2012). How convenient! The epistemic rationale of self-validating belief systems. Philosophical Psychology, 25(3), 341–364. doi: 10.1080/09515089.2011.579420
Boudry, M., & Hofhuis, S. (2017). Parasites of the mind. How cultural representations can subvert human interests. PhilSci Archive. Retrieved from http://philsci-archive.pitt.edu/id/eprint/13207
Boyer, P. (2001). Religion explained. The evolutionary origins of religious thought. New York: Basic Books.
Boyer, P., & Barrett, H. C. (2005). Domain specificity and intuitive ontology. In D. M. Buss (Ed.), The handbook of evolutionary psychology (pp. 96–118). Hoboken: Wiley.
Gelman, S. A. (2004). Psychological essentialism in children. Trends in Cognitive Sciences, 8(9), 404–409. doi: 10.1016/j.tics.2004.07.001
Hansson, S. O. (2009). Cutting the Gordian knot of demarcation. International Studies in the Philosophy of Science, 23(3), 237–243. doi: 10.1080/02698590903196007
Kelemen, D. (1999). Why are rocks pointy? Children’s preference for teleological explanations of the natural world. Developmental Psychology, 35(6), 1440–1452. doi: 10.1037/0012-1649.35.6.1440
Lewens, T. (2015). Cultural evolution. Conceptual challenges. Oxford: Oxford University Press.
Mercier, H., & Sperber, D. (2011). Why do humans reason? Arguments for an argumentative theory. Behavioral and Brain Sciences, 34(2), 57–74. doi: 10.1017/s0140525x10000968
Mercier, H., & Sperber, D. (2017). The enigma of reason. Cambridge, MA: Harvard University Press.
Morin, O. (2013). How portraits turned their eyes upon us: Visual preferences and demographic change in cultural evolution. Evolution and Human Behavior, 34(3), 222–229. doi: 10.1016/j.evolhumbehav.2013.01.004
Spelke, E. S. (1990). Principles of object perception. Cognitive Science, 14(1), 29–56. doi: 10.1207/s15516709cog1401_3
Sperber, D. (1996). Explaining culture. A naturalistic approach. Oxford: Blackwell.
Sperber, D., Clement, F., Heintz, C., Mascaro, O., Mercier, H., Origgi, G., & Wilson, D. (2010). Epistemic vigilance. Mind & Language, 25(4), 359–393.
Acknowledgements
This research was funded by Ghent University (BOF13/24J/089) and the Flemish Fund for Scientific Research (FWO/G001013N). The authors would like to thank Christophe Heintz for his helpful comments.