
In our Study of the Day feature series, we highlight a research publication related to a John Templeton Foundation-supported project, connecting the fascinating and unique research we fund to important conversations happening around the world.

David Dunning and Justin Kruger are among the few contemporary psychologists whose last names have escaped into the popular consciousness in adjectival form. Their 1999 article “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments” codified what is now known as the Dunning–Kruger effect, a handy shorthand for the rationale (or lack thereof) behind all kinds of hubristic human foolishness. Journalists now write about Dunning–Kruger economics, Dunning–Kruger geopolitics, a Dunning–Kruger presidency. When someone, somewhere, does something especially stupid, David Dunning is often the person the reporter calls for a comment.

But are unskilled people disproportionately prone to be unaware of what they don’t know? In a newly published study, Yuhan Han and Dunning conducted a series of experiments exploring how experts in climate science, psychological statistics, and investment think about the limits of their own expertise. They measured experts’ and non-experts’ metaknowledge (knowledge about what they do or don’t know) by quizzing them and recording not only their answers but also their degree of confidence in each answer.

They found that experts do tend, on average, to be less overconfident than non-experts — but that the effect could be explained by the fact that the experts got more answers right (and were correctly confident about their right answers). When Han and Dunning looked only at the answers people got wrong, though, they found that experts showed equal or higher confidence in their incorrect answers compared to non-experts. While one might hope that experts would be more attuned to the limits of their expertise than laypeople, that doesn’t seem to be the case. “[I]t appears,” Han and Dunning write, “that expertise is associated with knowing with more certainty what one knows but conceals awareness of what one does not know.”

Some 2,500 years ago the Chinese magistrate Zhong You received a lesson in metaknowledge from his teacher: “I tell you what it is to know. To say you know when you know, and to say you do not when you do not, that is knowledge.” Today the second half of this definition, recorded in the second book of Confucius’ Analects, remains both eminently sensible and incredibly difficult to live out, no matter one’s level of expertise.

Still Curious?

Read Han and Dunning’s paper, “Metaknowledge of Experts Versus Nonexperts: Do Experts Know Better What They Do and Do Not Know?” 


Nate Barksdale writes about the intersection of science, history, philosophy, faith and popular culture. He was editor of the magazine re:generation quarterly and is a frequent contributor to History.com.