Illusion of explanatory depth

The illusion of explanatory depth (IOED) is a cognitive bias where people believe that they understand a topic much better than they actually do.[1][2] The term was coined by Yale researchers Leonid Rozenblit and Frank Keil in 2002.[1][3] The effect is strongest for explanatory knowledge, whereas people tend to be better at self-assessments for procedural, narrative, or factual knowledge.[2][4] The IOED has been demonstrated in the domains of devices, natural phenomena, folk theories, and politics.[5][6]

Another description of the IOED is that "we mistake our familiarity with a situation for an understanding of how it works".[7] It may also explain the perception that knowledge in psychology is simple or obvious.[7] The illusion is related to the Dunning–Kruger effect.[1][8] The more highly people rate their knowledge, the greater the strength of the illusion.[9] However, the IOED's effect (overestimation of knowledge) applies to almost everyone, whereas the Dunning–Kruger effect (overestimation of competence) only applies to those of low to moderate competence.[8] The degree to which people overestimate their knowledge is higher when knowing about the topic is perceived as socially desirable.[10]

Practical significance

The IOED allows people to hold extreme political positions while being relatively uninformed about relevant topics.[11] A 2018 study in the United States found that the IOED is associated with belief in conspiracy theories for political topics, but not for non-political topics.[12]

The illusion can be combated by asking people to explain the topic, but not by simply asking them to provide reasons for their beliefs.[1][11] Other research has shown that when people are asked to justify their position, their beliefs become more extreme rather than less, so the type of explanation requested may be important: asking for reasons may lead people to strengthen their beliefs by selectively recalling support for their position, whereas asking for a mechanistic explanation forces them to confront their lack of knowledge.[11]

Origin

The term IOED was coined by Yale researchers Leonid Rozenblit and Frank Keil in 2002.[3] In an experiment conducted with 16 Yale undergraduate students, they asked participants to rate their understanding of devices and simple items.[13][3][5] Participants were then asked to generate a detailed explanation of how each item worked and to re-rate their understanding of that item.[13][3][5] Ratings were consistently lower after generating an explanation, suggesting that participants recognized their lack of understanding only after attempting to explain.[13][3][5] In their paper, "The Misunderstood Limits of Folk Science: An Illusion of Explanatory Depth", Rozenblit and Keil concluded that having to explain basic concepts or mechanisms confronts people with the reality that they may not understand the subject as well as they think they do.[5][3][13]

References

  1. Waytz, Adam (2017). "2017: What scientific term or concept ought to be more widely known?". Edge.org. Retrieved 26 January 2022.
  2. Rozenblit, Leonid; Keil, Frank (2002). "The misunderstood limits of folk science: an illusion of explanatory depth". Cognitive Science. Wiley. 26 (5): 521–562. doi:10.1207/s15516709cog2605_1. ISSN 0364-0213. PMC 3062901. PMID 21442007.
  3. "The Illusion of Explanatory Depth". The Decision Lab. Retrieved 26 January 2022.
  4. Mills, Candice M; Keil, Frank C (2004). "Knowing the limits of one's understanding: The development of an awareness of an illusion of explanatory depth". Journal of Experimental Child Psychology. Elsevier BV. 87 (1): 1–32. doi:10.1016/j.jecp.2003.09.003. ISSN 0022-0965. PMID 14698687.
  5. Zeveney, Andrew; Marsh, Jessecae (2016). "The Illusion of Explanatory Depth in a Misunderstood Field: The IOED in Mental Disorders" (PDF). Cognitive Science Society: 1020.
  6. Rozenblit, Leonid; Keil, Frank (2002). "The misunderstood limits of folk science: an illusion of explanatory depth". Cognitive Science. 26 (5): 521–562. doi:10.1207/s15516709cog2605_1. ISSN 1551-6709. PMC 3062901. PMID 21442007.
  7. Stafford, Tom (February 2007). "Isn't it all just obvious?". The Psychologist. Retrieved 28 January 2022.
  8. Chromik, Michael; Eiband, Malin; Buchner, Felicitas; Krüger, Adrian; Butz, Andreas (13 April 2021). I Think I Get Your Point, AI! The Illusion of Explanatory Depth in Explainable AI. New York, NY, USA: ACM. doi:10.1145/3397481.3450644.
  9. Lawson, Rebecca (2006). "The science of cycology: Failures to understand how everyday objects work". Memory & Cognition. Springer Science and Business Media LLC. 34 (8): 1667–1675. doi:10.3758/bf03195929. ISSN 0090-502X. PMID 17489293. S2CID 4998257.
  10. Gaviria, Christian; Corredor, Javier (23 June 2021). "Illusion of explanatory depth and social desirability of historical knowledge". Metacognition and Learning. Springer Science and Business Media LLC. 16 (3): 801–832. doi:10.1007/s11409-021-09267-7. ISSN 1556-1623. S2CID 237878736.
  11. Fernbach, Philip M.; Rogers, Todd; Fox, Craig R.; Sloman, Steven A. (25 April 2013). "Political Extremism Is Supported by an Illusion of Understanding". Psychological Science. SAGE Publications. 24 (6): 939–946. doi:10.1177/0956797612464058. ISSN 0956-7976. PMID 23620547. S2CID 6173291.
  12. Vitriol, Joseph A.; Marsh, Jessecae K. (15 June 2018). "The illusion of explanatory depth and endorsement of conspiracy beliefs". European Journal of Social Psychology. Wiley. 48 (7): 955–969. doi:10.1002/ejsp.2504. ISSN 0046-2772. S2CID 149811872.
  13. Rozenblit, Leonid; Keil, Frank (2002-09-01). "The misunderstood limits of folk science: an illusion of explanatory depth". Cognitive Science. 26 (5): 521–562. doi:10.1207/s15516709cog2605_1. ISSN 0364-0213. PMC 3062901. PMID 21442007.