“Ignorance of ignorance”: On overconfidence and intellectual humility 

Fighting a common cognitive bias with humility.

Should you do your own research or trust the experts? Photo by Leeloo Thefirst via Pexels.

I heard a phrase once in an advertisement, and while I have been unable to track down the origin, it returns to the forefront of my mind randomly. “Only ingredients you can pronounce” is the tagline for the unknown product — it could have been cosmetics, food, medication or anything else commonly advertised. The reason I have not forgotten this line is because it irritates me.

On my bedside table, I have medicine for my seasonal allergies. I would not be able to tell you all of the ingredients in the gel capsules, and I definitely would not be able to pronounce all of them correctly, but I know these elements come together to create something that makes it possible for me to function. 

There are many things I do not know, and do not need to know. I do not have the time or the desire to take pharmacology and chemistry classes. I have to trust that educated people, who know better than I do, have created a safe, effective product.

I do not think it is wrong to want to be informed about the products you consume. In fact, seeking out information can lead to a sense of security in one's choices, especially when it comes to medications.

What concerns me is that alongside the desire to know all the ingredients of a product, and the expectation of being able to understand and pronounce each and every one, I often see another, more troubling phrase: "Do your own research."

As a journalist, I love research. Sometimes I find myself falling down the proverbial “rabbit hole” into a topic I will never even write about. 

But I know there are limitations to the research I am able to complete. Sometimes my search for information ends because I run into a paywall. Other times, I find discouraging or even boring results. 

“Do your own research.”

When it comes to decisions about important aspects of my life, whether medical, academic or financial, I will do my own research to a point, but I also recognize I need help, or a second opinion from a professional.

I am intensely aware of my own blind spots and shortcomings. I am also keenly aware of the ignorance of the people I know who claim to have done their own research. 

“Have you done research?” I want to ask. “Or have you found information that confirms your opinion a third of the way down the first page of Google search results and decided that is enough to make you confident in your decisions?” 

I actually envy the confidence of these folks, to a point. While I am constantly second-guessing myself, or doubting my own knowledge, they seem to have an abundance of certainty and the willingness to stand behind their statements.

There is a name for this, as I discovered while doing, you guessed it, research.

Per Encyclopedia Britannica, "a cognitive bias whereby people with limited knowledge or competence in a given intellectual or social domain greatly overestimate their own knowledge" is called the Dunning-Kruger effect.

In simpler terms, this effect describes the phenomenon of people with very little knowledge of or background in a subject believing they are in fact very well-informed. Because of their lack of experience with the topic, they may believe it to be far simpler than it is.

“Because they are unaware of their deficiencies, such people generally assume that they are not deficient,” the Encyclopedia Britannica entry continues. 

David Dunning, for whom the effect is partially named, calls this “meta-ignorance,” or “ignorance of ignorance.” 

While this may seem harmless at first glance, this thought process has far-reaching consequences. 

Dunning listed the costs of overconfident thinking, including increased risk-taking behavior, diagnostic mistakes and conflict in interpersonal relationships. 

A 2022 study published in Frontiers in Psychology found a correlation between meta-ignorance and the belief in conspiracy theories. 

Authors Andrea Vranic, Ivana Hromatko and Mirjana Tonkovic wrote, “Overconfidence in one’s own reasoning abilities was negatively correlated with an objective measure of reasoning … and positively correlated with the endorsement of conspiracy theories, indicating that the so-called Dunning-Kruger effect plays a role in pseudoscientific conspiratorial thinking regarding COVID-19.” 

It is hard to escape the further consequences of this line of thought. If one trusts one's own uninformed viewpoint and believes conspiracies, theoretical thought can soon become action.

Would we have suffered so many deaths, and the ongoing battle against this virus, if we had a few fewer armchair experts?

Looking at my own acquaintances, I see a timeline many other people may recognize as well. Supposed research, not based in fact but in confirming personal opinions, led to decisions about COVID-19 safety or lack thereof. Then came a refusal of vaccination or proven treatments. A further step was often sharing these misplaced beliefs on social media, arguing with anyone who disagreed, and encouraging so-called “sheep” to do their own research on the matter. 

Unfortunately, in addition to leading to poor decision making, those under the influence of the Dunning-Kruger effect also become further entrenched in their ignorance. 

"Overconfidence in one's attitudes and understanding tunes down the acceptance of knowledge and ideas across partisan boundaries and leads to polarization," wrote Vranic, Hromatko and Tonkovic. "In turn, it decreases the likelihood of encountering information which might challenge or modify the existing beliefs and assumptions." 

Changing the minds of those around us may not be simple, but there is an antidote to meta-ignorance in each of our individual lives. 

The concept of “intellectual humility” is beginning to be studied by actual, educated experts. 

In an article about intellectual humility in the peer-reviewed “Nature” journal, a group of psychologists defined the concept as follows: “Intellectual humility involves recognizing that there are gaps in one’s knowledge and that one’s current beliefs might be incorrect.” 

This form of humility is not exactly the same as what we typically think of when we hear the word. Instead of trying to avoid bragging or drawing attention to ourselves, intellectual humility is a thought process in which we take stock of our knowledge, beliefs and background, and understand where we might fall short.

“Intellectual humility involves recognizing that there are gaps in one’s knowledge and that one’s current beliefs might be incorrect”

It can be difficult to realize we are fallible and may not have all the answers. In this way, being ignorant of our own ignorance might be a more comfortable option. 

“Being intellectually humble involves embracing uncertainty and ambiguity, and entertaining the possibility that even one’s closely held beliefs might be incorrect,” the “Nature” journal article said.

However, in taking a leap of faith — or rather humility — there are benefits. 

Writing for UC Berkeley’s Greater Good Magazine, Mark Leary said intellectually humble people tend to pay more attention to the quality of evidence and facts they are presented with. They also take more time to read about viewpoints that contradict their own, and think more about them. 

“People high in intellectual humility more carefully consider the evidence on which their beliefs are based, are vigilant to the possibility that they might be incorrect, consider the perspectives of other informed people (including those whose viewpoints differ from theirs), and revise their views when evidence warrants,” Leary said. 

Though intellectual humility may have some downsides, including longer decision-making time, indecision and a degree of self-doubt, the potential outcomes hold more good than bad.

Beyond the personal level, the increased practice of intellectual humility could have broader cultural and societal impact.

“Research suggests that intellectual humility can decrease polarization, extremism and susceptibility to conspiracy beliefs, increase learning and discovery and foster scientific credibility,” the authors of the “Nature” article said. 

People also tend to view those who can admit they are wrong, or change their minds in light of new information, as more trustworthy and worthy of respect.

“Fortunately, people can increase in intellectual humility both through a personal decision to be more intellectually humble and through interventions that help people confront their intellectual overconfidence and take steps to reduce it,” Leary said.

Maybe that starts with not feeling the need to know how to pronounce every ingredient in your deodorant. Or for others it may be found in doing actual, solid research and acknowledging said research may contradict a previously held belief. 

I still find myself questioning my own knowledge and where it comes from. I’ve shared my opinions on the matter here and backed them up with facts from reputable sources — that’s what an opinion article should be after all — but have I also fallen prey to the Dunning-Kruger effect? Have I, a journalism student, read articles on psychology that align with my viewpoint and stopped my learning and research at that point? 

I guess you’ll have to do your own research and find out. 


Heather Taylor can be reached at [email protected]