Deliberate Ignorance: The “Whens” and the “Whys” of not wanting to know

March 7, 2022 | By Serena Iacobucci

In Aeschylus’s telling of the ancient Greek tragedy, Apollo granted Cassandra the gift of prophecy. “What! A kind gift from the son of Zeus himself, right?” Not really. The god Apollo was in love with the beautiful priestess, and she had promised him her favors; after receiving the gift, however, she went back on her word. Enraged by the refusal, the god turned his gift into a curse: she could still see the future, but nobody would ever believe her predictions and prophecies.

The priestess predicted the destruction of Troy, but her warnings went unheeded while a cheerful crowd welcomed the wooden horse built by the Greeks as a sign of surrender. Equally powerless, she witnessed the death of her father Priam. As a tragic epilogue to a tragic life, she predicted, and chose to face, her own death.

The punishment Apollo inflicted on Cassandra was thus heartbreaking: the prophetess was doomed to a fate of deep, unheard loneliness. Despite Cassandra’s sad destiny, however, humankind has always placed great value on the ability to predict the future. This fascination goes beyond pure hedonic attraction: its utilitarian value has always been associated – for example – with the possibility of incorporating new data and information into our decision-making and, therefore, of making better choices.

No god was there to grant us humans this powerful gift, so we did the best we could, moving from the sacred and the mystical to more scientific approaches. If we take disciplines such as astrology and divination as the first attempts to predict the future, the steps we have taken since are quite remarkable. Consider, for example, the design and application of increasingly accurate machine learning algorithms in trading and finance, which can produce financial forecasts thanks to their ability to process huge amounts of data and time series quickly and easily [1].

In the biomedical field, genomic research allows us to perform predictive and accurate pre-diagnostics for early detection and prevention of a wide range of diseases [2]. In rather less dramatic scenarios, some claim to be able to predict (with 90% accuracy) when a newly married couple will divorce [3].

With the gradual increase in the predictive power of the methods on which we rely, the fine line between the unknown and the knowable keeps receding… Apparently, we can make ourselves far more like Cassandra than we thought possible, even without Apollo’s help.

The value of knowledge

Knowledge has always been appreciated and admired, and often financially rewarded. For this reason, human beings – across the history of (mostly Western) philosophical thought – seem to have always pursued the goal of seeking and acquiring as much information as possible: from Adam and Eve, who ate from the tree of knowledge in the Garden of Eden in violation of God’s own prohibition, through Aristotle, Hobbes and Bacon, for whom the thirst for knowledge was intertwined with an ardent desire similar, though not fully reducible, to curiosity.

From there we arrive at modern psychology, as well as economic theories of utility maximization, whose models assume that more information translates into greater bargaining power [4].

Even the founding father of psychoanalysis, Freud, warned us to free ourselves from the yoke of knowledge repression (denial, or the so-called “ostrich policy”) that we put in place to protect ourselves from the hurt that can sometimes result from gaining unwanted access to painful information.

An ostrich hiding its head in the sand.
Photo by Wolfgang Hasselmann on Unsplash

Unlike Cassandra, we are neither condemned nor forced to peek into our future. Indeed, we are free to (and thus often choose not to) take a genetic test to find out whether we will develop a disease for which there is a genetic predisposition or a family history. In the same vein, nobody will force us to find out the odds of getting divorced right before walking down the aisle (though that would be the perfect time to gain such information so as to make an informed choice).

The question that many have asked themselves – Prof. Gerd Gigerenzer among them (for those unfamiliar with his studies, take a minute to read his work on risk literacy, will ya?) – is that of willful ignorance. When and why do we decide to go down the road of deliberate ignorance, rather than discover new and relevant (albeit painful) information from which we might benefit at some point in the future?

Deliberate ignorance

“Human behavior, however, is inconsistent with the proposition that knowledge is always perceived to be valuable.”

Sharot & Sunstein, 2020 [5].

Let us first clarify what we mean by “knowledge space,” which is a combinatorial structure that describes the possible knowledge states of human beings as learners.

Ignorance – accordingly – is a state of the knowledge space in which the answer to a question is unknown. This question may concern any event – past, present or future – and the answer may, therefore, be knowable with certainty or with a certain degree of uncertainty [6].

By deliberate ignorance, on the other hand, we mean the active decision of not wanting to know: we choose not to access a particular piece of information, although it is contained in a knowledge space where it is readily and easily knowable.

According to Sweeny et al. [7], deliberate ignorance can result from inaction as much as from action – for example, actively refusing to listen to information that someone else is willing to share with us.

Albert and Lukas [8] reported this definition in graphical form – via a knowledge space – consisting of a series of N questions and their answers, whether qualitative (“Is Darth Vader my father?” Yes/No) or quantitative (“What is the probability of dying in a car accident when driving 400 km?”).

Fig. 1. An individual knowledge space that includes deliberate ignorance, represented as a set of N questions (author’s reworking from Gigerenzer and Garcia-Retamero, 2017 [6]).
The + signs mark the questions for which one knows the answer; the – signs mark those for which one does not (their number is Ni); the – signs inside black circles mark the questions for which one neither has an answer nor wants to know it (Ndi), despite having the possibility to do so.
For this reason, deliberate ignorance exists if Ni ≥ Ndi > 0.
Source: Gigerenzer & Garcia-Retamero, 2017

So, formally, we speak of deliberate ignorance if, and only if:

Condition 1. One opts for ignorance even if there is no cost to accessing the information;

Condition 2. One opts for ignorance at the expense of one’s own personal interest.

Condition 1, therefore, makes it clear that we are talking about someone whose behavior is not aligned with rational information-seeking [9]: a rational agent should decide whether to acquire additional information by weighing its expected benefit against the cost involved. If that cost is zero, there is no rational reason to opt out and not access the information.

Condition 2 emphasizes that this piece of neglected information is relevant (or will be relevant in the future) for the agent who – however – chooses to ignore it.
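To make the formal definition a bit more concrete, here is a minimal Python sketch – not from the paper, and with field names and the simple counting chosen purely for illustration – of a knowledge space in which some answers are unknown and some of those are deliberately ignored:

```python
from dataclasses import dataclass

@dataclass
class Question:
    text: str
    answer_known: bool   # does the agent already know the answer?
    access_cost: float   # cost of obtaining the answer (0 = free)
    relevant: bool       # is the answer relevant to the agent's interests?
    wants_to_know: bool  # does the agent choose to find out?

def is_deliberately_ignored(q: Question) -> bool:
    """Conditions 1 and 2: the answer is free to obtain and relevant,
    yet the agent chooses not to access it."""
    return (not q.answer_known
            and q.access_cost == 0   # Condition 1: no cost of access
            and q.relevant           # Condition 2: against one's own interest
            and not q.wants_to_know)

# A toy knowledge space of N questions
knowledge_space = [
    Question("Is Darth Vader my father?", True, 0, True, True),
    Question("Will I develop late-onset Alzheimer's?", False, 0, True, False),
    Question("What is the sex of my unborn child?", False, 0, True, False),
]

n_i = sum(not q.answer_known for q in knowledge_space)           # unanswered questions
n_di = sum(is_deliberately_ignored(q) for q in knowledge_space)  # deliberately ignored
print(f"Ni = {n_i}, Ndi = {n_di}, deliberate ignorance: {n_i >= n_di > 0}")
```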

Another substantial feature of deliberate ignorance is that it is related neither to cognitive limitations – such as lack of memory – nor to biases of any sort – such as confirmation bias. In the specific case of confirmation bias, for example, the agents involved – albeit biased (duh) – are still actively looking for information, whereas those who choose deliberate ignorance avoid looking altogether.

Another aspect of this type of deliberate ignorance concerns its difference from agnotology, i.e. deliberate, culturally induced ignorance or doubt, typically created to sell a product or win favor, particularly through the publication of inaccurate or misleading scientific data [10]. In this case, similar to what happens with confirmation bias, we are certainly talking about a (rather oxymoronic) anti-epistemic attitude toward knowledge. Agnotology, however, derives from external forces – for example, the will of some politicians or pressure groups, such as, trivially, the tobacco or arms lobbies deliberately manipulating and spreading misinformation to pursue their own interests. For further information about the topic, we suggest listening to this interview with Dr. Robert Proctor, a Stanford professor of the History of Science. Deliberate ignorance, on the other hand, operates at the individual level, presupposing a conscious choice made independently of external or lobbying pressures.

The reasons for deliberate ignorance

Let’s get to the point, then. Why should we actively decide that we don’t want to know the answer to questions of great personal relevance, even when that answer could be obtained freely and effortlessly? Gigerenzer and Garcia-Retamero [6] propose four reasons:

Reason 1. To avoid negative emotional reactions: think of James Watson, to whom (together with Francis Crick) we owe the discovery of the structure of DNA (Editor’s note: notwithstanding his positions on eugenics and the controversies over race and intelligence, we obviously take the author’s story only as a relevant example of willful ignorance). Watson, who was personally involved in the collection of genomic data during his studies on the sequencing of the human genome, requested that his own genetic information on the ApoE gene not be disclosed to him, because an association had been demonstrated between this gene and late-onset Alzheimer’s disease (LOAD), which had caused the death of one of his grandmothers and remains incurable. Possessing that genetic information would have meant facing the actual probability of developing the illness himself at some point in the future – a scenario he was unwilling to face (for more information: ApoE Genotype, Alzheimer’s Disease | Lab Tests Online-EN).

Reason 2. Save a surprise and keep the suspense going: just think about couples expecting a baby and deciding not to find out the sex of the newborn.

Reason 3. To gain a strategic advantage: this might seem counter-intuitive, yet, according to some [11], there may be strategic advantages to willful blindness in various domains – such as banking, where, according to Margaret Heffernan, willful blindness helps bankers and policy makers underestimate and overlook risks and deflect potential future criticism, as happened after the 2008 crisis.

In behavioral economics, it was game theory – first and foremost Schelling, in his 1956 essay “An Essay on Bargaining” [12] – that challenged the much-vaunted strategic hegemony of information bearers in bargaining scenarios. We shouldn’t underestimate that – in these contexts – deliberate ignorance sometimes allows us to avoid taking any sort or degree of responsibility for what is happening [13]. Gigerenzer and Garcia-Retamero [6] give as an example a (less bloody) version of the Chicken Game. Imagine two people walking toward each other. One of the two is inattentive and is looking at their cell phone: by not checking the road ahead, they are deliberately choosing to ignore the information they could easily get, i.e. “is there anyone I could collide with in front of me?”

However, the more careful pedestrian will avoid the collision, and the inattentive one will effortlessly benefit from it while willfully ignoring the road ahead. A related (and much more mainstream) concept is the Dunning-Kruger effect [14]. Not being aware of one’s limits could increase motivation and self-confidence to the point of improving one’s performance. For further discussion, see Hertwig and Engel [13], who devote an entire paper to ignorance as a strategic expedient (see, specifically, “Deliberate ignorance as a performance-enhancing device”). The authors note that a prediction revealing a large discrepancy between desired and potential performance could generate a state of arousal (e.g. performance anxiety) able to compromise the actual performance [15, 16]. Similarly, the tendency to produce optimistic predictions – although based on an inaccurate assessment of the probability of failure – could be decisive when choosing whether or not to undertake an ambitious project [17].

 The authors say: “It is possible that no textbook would ever be written, no house built, and no work composed if people based their decision on the progress and success of similar endeavors.”

Hertwig, R., & Engel, C. (2016).

Reason 4. Deliberate ignorance is used to ensure impartiality (note that Justice herself is depicted blindfolded). Consider job interviews in which certain personal information about the candidate is withheld in order to avoid bias of any kind. Or think about blind auditions: in 1952, the Boston Symphony Orchestra decided to select musicians by having candidates perform behind a screen.

A blind audition for the Vienna Philharmonic Orchestra. Photo by Jun Keller.

Thanks to this technique (adopted from 1980 onwards by the other four major US orchestras as well, i.e. the Chicago Symphony Orchestra, the Cleveland Orchestra, the New York Philharmonic and the Philadelphia Orchestra), by the end of the 1990s the proportion of women in the orchestras had increased from 12% to 20-30%.

Deliberate ignorance: the possible role of regret

Let’s focus now only on the first two reasons for deliberate ignorance – those related to avoiding negative emotions or preserving positive ones (e.g. avoiding knowing that you’ll get sick, or deciding not to know the sex of your newborn).

Specifically, let’s think about negative ones, such as regret and remorse.

Regret, specifically, is a negative emotion that we feel after having selected choice A (e.g. not having taken out a policy on that trip booked for Easter 2020…) and having discovered that choice B would have been more advantageous (taking out travel insurance). The anticipated regret that we may experience influences the choice itself.

At this point, together with the two conditions already specified:

Condition 1. One opts for ignorance even though there is no cost to accessing the information;

Condition 2. Ignorance is chosen at the expense of personal interest.

We must add a third condition, related to the possibility of feeling regret:

Condition 3. The possibility of getting feedback about the alternative outcome (i.e. the outcome of the option not chosen).

The existence of feedback is fundamental, therefore, to the possibility of feeling regret. Think of the classic experiments in which you must choose between a certain win of 50 euros and a win of 100 euros with 50% probability. Many risk-averse participants would choose the certain win of 50 euros. This preference, however, is also fully consistent with anticipatory regret: we choose the certain option to avoid the possibility of having to deal – in the future – with the scenario in which we won nothing because of our greed. (For a re-examination of how risk aversion and regret aversion interact, see the behavioral studies of Zeelenberg et al. (1996) and Zeelenberg (1999), as well as the more recent neuroimaging studies of Coricelli et al. (2005) [18-20].)

Thus, a fifth reason comes into play (please bear with me here: let’s not confuse the three Conditions with the four Reasons mentioned above):

Reason 5. The possibility that knowing the outcome of the unchosen option could be either favorable or unfavorable for us.

This last reason, which Gigerenzer and Garcia-Retamero [6] describe as an “approach-avoidance conflict,” occurs when reaching a goal could lead to outcomes that may be either desirable or undesirable. This is a crucial conflict – one that classical theories of rational information seeking partially resolve, as mentioned earlier, by identifying a trade-off between the benefits of obtaining more information and the costs of engaging in its search. In these theories, however:

a) knowledge is always considered beneficial;

b) there is always a cost to endure for information seeking.

In contrast, the theory of ignorance dictated by anticipated regret applies precisely where:

a)  there is a chance that knowledge may have negative consequences;

b)  there is a negligible cost (or even no cost at all) to obtain it.

For this reason, where classical theories assume that the expected utility of an option depends only on the positive or negative outcomes of that option multiplied by their probabilities, here we assume that choice also depends on the anticipated regret evoked by learning the outcome of the alternative option we give up [i.a. 21, 22].
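As a rough numerical illustration of the difference, here is a small Python sketch of the 50-euro example above. The linear utility, the regret weight k and the feedback probability are arbitrary assumptions chosen for illustration, not parameters from the paper:

```python
# Sure option A: 50 euros. Risky option B: 100 euros with p = 0.5, otherwise 0.
p = 0.5
sure, win, lose = 50.0, 100.0, 0.0
k = 0.8                   # weight given to anticipated regret (assumption)
feedback_on_gamble = 0.0  # chance of learning the gamble's outcome if you take the sure 50

# Classical expected utility (utility = monetary amount): the options are equivalent.
eu_sure = sure
eu_risky = p * win + (1 - p) * lose

# Anticipated regret requires feedback about the foregone option (Condition 3).
# The sure amount is always known, so losing the gamble guarantees regret;
# regret over skipping the gamble arises only if its outcome is ever revealed.
regret_sure = feedback_on_gamble * p * (win - sure)
regret_risky = (1 - p) * (sure - lose)

value_sure = eu_sure - k * regret_sure     # 50.0
value_risky = eu_risky - k * regret_risky  # 30.0 -> the sure option now wins

print(value_sure, value_risky)
```

Setting feedback_on_gamble to 1 makes the two options symmetric again, which is exactly why the availability of feedback (Condition 3) is what gives anticipated regret its bite.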

But let’s get to the experimental part.

Gigerenzer and Garcia-Retamero [6] investigated this phenomenon through two experiments, in Germany and Spain, on two populations of about 900 respondents.

In the first experiment, they asked four questions about events with a positive outcome and four about events with a negative outcome, in order to see whether the valence of the question made any difference.

Some examples of events with a purely negative outcome are:

– “Would you like to know today when your partner will die?” No: 89%

– “Would you like to know today when you will die?” No: 87%

– “Imagine you just got married. Would you like to know today why your marriage will fail?” No: 86%

Some positive events, on the other hand, are:

– “Would you like to know if there is life after death?” No: 56%

– “Suppose that, with your partner, you are expecting a baby. The sex of the baby can be reliably determined with an ultrasound. Would you want to know the sex of your baby before birth?” No: 40.3%.

Gigerenzer and Garcia-Retamero [6] thus noted “widespread deliberate ignorance” for both negative and positive events – a finding that is hard to reconcile with the presumed human desire to avoid or reduce uncertainty and ambiguity, or with the need for cognitive closure [23, 24].

It is equally difficult to reconcile what has just been said with more classical rational choice theory. Let’s just take negative events. Acquiring that information would certainly give one a good advantage: by knowing beforehand one can maximize one’s wellbeing (see Becker [25] – on the foresight of “forward-looking agents”).

But let’s see an example (Editor’s note: sorry for the crudeness of the next few sentences, all credit goes to the Nobel laureate just mentioned!).

Regardless of the motives – whether selfishness, altruism, or even pure masochism – knowing in advance the date or cause of death of one’s partner could allow us to maximize wellbeing in a wide variety of ways: from better planning one’s savings and retirement to allocating time to spend together, for example, in the last moments of one’s life. Similar arguments can be made with respect to the question “Would you want to know if and when you’re going to get divorced?” (on this subject, we recommend reading our paper on the Sunk Cost Fallacy).

Long story short: if you’re rational you might want to know – particularly if it doesn’t cost you anything. Yet, experimental evidence shows just the opposite: ignorance is not the exception but the rule. At the same time, we might say that, if we were forward-looking, we would at least want to know more about events with potential positive outcomes. Knowing the sex of one’s own newborn – again according to Becker – would allow us to reduce uncertainty and plan ahead. Becker’s far-sightedness, however, leaves no room for surprise, suspense, or even anticipated regret. But we all know we’re not THAT rational, so all of these variables play a huge role when deciding whether to obtain more information about such significant events in our lives.

Gigerenzer and Garcia-Retamero [6], moreover, point out that the closer one gets to the age at which a negative event is most likely to occur (e.g. divorce, death of a partner, health problems in old age), the more likely one is to deliberately choose ignorance. This finding would, therefore, seem to contradict:


– the assumption that information about imminent events is generally more relevant.

– the tendency to consider younger people more present- rather than future-oriented in all domains. Theoretically speaking, younger individuals should apply a higher discount rate to information that is more distant in time (for clarification on the concept of temporal discount rate, read here; see also the brief numerical sketch below). If this assumption were (always) true, the propensity to choose not to know should decrease – and not increase – with age. With the sort of information we have been discussing so far, however, we observe the opposite effect.
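To make the discounting argument concrete, here is a back-of-the-envelope sketch in Python. The 5% annual rate and the nominal value of the information are arbitrary assumptions for illustration, not figures from the study:

```python
# Present value of information about an event expected around age 60,
# under simple exponential discounting (illustrative assumptions only).
V = 1.0      # value of the information at the time of the event
rate = 0.05  # annual discount rate

def discounted_value(current_age: int, event_age: int = 60) -> float:
    years_ahead = event_age - current_age
    return V / (1 + rate) ** years_ahead

print(f"20-year-old: {discounted_value(20):.2f}")  # ~0.14
print(f"55-year-old: {discounted_value(55):.2f}")  # ~0.78
```

If the discounted value of knowing were the whole story, the fifty-five-year-old should be far keener to find out than the twenty-year-old; the survey data show the reverse.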

This behavior is rather self-explanatory if we look at it through the lens of the anticipated regret hypothesis: a young person, say a twenty-year-old, will be more willing to find out whether they are going to die at 60 rather than 70, and will not be as devastated by such news as a fifty-five-year-old would be.

Another interesting result, especially from a public policy perspective, shows that those individuals who prefer to remain ignorant are more risk averse and are more willing to buy life or legal insurance (when, of course, such insurance is not already mandatory) – confirming the crucial role played by anticipated regret.

What about behavior?

From the study discussed above, it looks like a very high percentage of people actively decide to ignore relevant information and, thus, not to know. But these are, of course, just self-reported measures or behavioral intentions; we are not measuring actual behavior (yet…). So, is that also what people would actually do? One possible criticism of this line of research is certainly that, in some scenarios, people know that their choice is completely hypothetical – since there is no way to accurately predict the kind of information that was presented to them during the experiment (yet, again…). Would some answers change if we knew of the existence of a technology that, for example, calculates our exact date and time of death?

Well, we don’t know (ironic…). But we can still look at the questions that were about, let’s say, parents knowing the sex of their future children (information that we can obviously access easily and with great reliability). Further studies [26] reported that, in real life, a good 31% of surveyed parents-to-be decided that they did not want to know the sex of the newborn – in line with the percentages reported by Gigerenzer and Garcia-Retamero [6], where, bear in mind, this choice was only the result of a hypothetical scenario and not of reality.

One of the main reasons reported by those who decide not to find out before birth is precisely “to keep the gender surprise at birth.” However, it would be inaccurate to generalize this finding and assert that reported deliberate ignorance would translate into behavior in all the other hypothetical scenarios presented.

For millennia, the desire to know has seemed hardwired into human nature. People – driven by an alleged, elusive rationality – were expected to opt for preventive screenings and genetic tests and to monitor their own health, if given the opportunity to do so free of charge.

For decades, across the 20th century, philosophy and sociology, as well as psychology and economics, worked on decision-making models that take for granted that:

– “more information is better,” provided that such information is genuine and that the costs of acquiring it do not exceed its benefits [9];

– new knowledge must necessarily be acquired and used to update the prior probability distributions on which rational decisions are based [27].

The decision not to know seems counter-intuitive and irrational while we’re talking (or writing) about it, and yet it is clearly a choice that, in everyday life, we probably make again and again.

The truth is that we – unlike Cassandra – seem to hold an even greater power: we can decide not to know, closing our eyes so as to be surprised by life’s events and to ease the fear of remorse and the burden of suffering.

References

[1] Gomber, P., & Zimmermann, K. (2018). Algorithmic trading in practice. In The Oxford Handbook of Computational Economics and Finance. Oxford University Press.

[2] Cawthon, R. M., Smith, K. R., O’Brien, E., Sivatchenko, A., & Kerber, R. A. (2003). Association between telomere length in blood and mortality in people aged 60 years or older. The Lancet, 361(9355), 393-395.

[3] Gottman, J. M., & Levenson, R. W. (2002). A two-factor model for predicting when a couple will divorce: Exploratory analyses using 14-year longitudinal data. Family Process, 41(1), 83-96.

[4] Conrads, J., & Irlenbusch, B. (2013). Strategic ignorance in ultimatum bargaining. Journal of Economic Behavior & Organization, 92, 104-115.

[5] Sharot, T., & Sunstein, C. R. (2020). How people decide what they want to know. Nature Human Behaviour, 4(1), 14-19.

[6] Gigerenzer, G., & Garcia-Retamero, R. (2017). Cassandra’s regret: The psychology of not wanting to know. Psychological review, 124(2), 179.

[7] Sweeny, K., Melnyk, D., Miller, W., & Shepperd, J. A. (2010). Information avoidance: Who, what, when, and why. Review of General Psychology, 14, 340–353. http://dx.doi.org/10.1037/a0021288

[8] Albert, D., & Lukas, J. (Eds.). (1999). Knowledge spaces: Theories, empirical research, applications. Mahwah, NJ: Erlbaum.

[9] Stigler, G. J. (1961). The economics of information. Journal of Political Economy, 69, 213–225. http://dx.doi.org/10.1086/258464

[10] Treccani.it (2018). Available online: https://www.treccani.it/enciclopedia/agnotologia_(altro)/

[11] Admati, A. R., & Hellwig, M. (2013). The bankers’ new clothes. Princeton, NJ: Princeton University Press.

[12] Schelling, T. C. (1956). An essay on bargaining. The American Economic Review, 46(3), 281-306.

[13] Hertwig, R., & Engel, C. (2016). Homo ignorans: Deliberately choosing not to know. Perspectives on Psychological Science, 11, 359–372. http://dx.doi.org/10.1177/1745691616635594

[14] Dunning, D. (2011). The Dunning–Kruger effect: On being ignorant of one’s own ignorance. In Advances in experimental social psychology (Vol. 44, pp. 247-296). Academic Press.

[15] Kluger, A. N., & DeNisi, A. (1996). The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychological Bulletin, 119, 254–284. doi:10.1037/0033-2909.119.2.254

[16] Kluger, A. N., & DeNisi, A. (1998). Feedback interventions: Toward the understanding of a double-edged sword. Current Directions in Psychological Science, 7, 67–72. doi:10.1111/1467-8721.ep10772989

[17] Kahneman, D., & Lovallo, D. (1993). Timid choices and bold forecasts: A cognitive perspective on risk taking. Management science, 39(1), 17-31.

[18] Zeelenberg, M. (1999). Anticipated regret, expected feedback and behavioral decision-making. Journal of Behavioral Decision Making, 12, 93–106.

[19] Zeelenberg, M., Beattie, J., van der Pligt, J., & de Vries, N. K. (1996). Consequences of regret aversion: Effects of expected feedback on risky decision making. Organizational Behavior and Human Decision Processes, 65, 148 –158. http://dx.doi.org/10.1006/obhd.1996.0013

[20] Coricelli, G., Critchley, H. D., Joffily, M., O’Doherty, J. P., Sirigu, A., & Dolan, R. J. (2005). Regret and its avoidance: a neuroimaging study of choice behavior. Nature neuroscience, 8(9), 1255-1262.

[21] Mellers, B. A., Schwartz, A., Ho, K., & Ritov, I. (1997). Decision affect theory: Emotional reactions to the outcomes of risky options. Psychological Science, 8, 423–429. http://dx.doi.org/10.1111/j.1467-9280.1997.tb00455.x

[22] Mellers, B. A., Schwartz, A., & Ritov, I. (1999). Emotion-based choice. Journal of Experimental Psychology: General, 128, 332–345. http://dx.doi.org/10.1037/0096-3445.128.3.332

[23] Hogarth, R. (1987). Judgment and choice (2nd ed.). New York, NY: Wiley; Janis, I. L., & Mann, L. (1977). Decision making. New York, NY: Free Press.

[24] Kruglanski, A. W. (2004). The psychology of closed mindedness. New York, NY: Psychology Press.

[25] Becker, G. S. (1993). The economic way of looking at behavior. Journal of Political Economy, 101, 385–409. http://dx.doi.org/10.1086/261880

[26] Kooper, A. J. A., Pieters, J. J., Eggink, A. J., Feuth, T. B., Feenstra, I., Wijnberger, L. D. E., … Smits, A. P. T. (2012). Why do parents prefer to know the fetal sex as part of invasive prenatal testing? ISRN Obstetrics and Gynecology, 2012, 524537. http://dx.doi.org/10.5402/2012/524537

[27] Good, I. J. (1967). On the principle of total evidence. The British Journal for the Philosophy of Science, 17, 319–321. http://dx.doi.org/10.1093/bjps/17.4.319