Critical thinking

We’ve already established that information can be biased. Now it’s time to look at our own bias.

Studies have shown that we are more likely to accept information when it fits into our existing worldview, a phenomenon known as confirmation or myside bias (for examples, see Kappes et al., 2020; McCrudden & Barnes, 2016; Pilditch & Custers, 2018). Wittebols (2019) defines it as a “tendency to be psychologically invested in the familiar and what we believe and less receptive to information that contradicts what we believe” (p. 211). Quite simply, we may reject information that doesn’t support our existing thinking.

This can manifest in a number of ways; Hahn and Harris (2014) suggest four main behaviours:

  • Searching only for information that supports our held beliefs
  • Failing to critically evaluate information that supports our held beliefs (accepting it at face value) while explaining away or being overly critical of information that might contradict them
  • Becoming set in our thinking, once an opinion has been formed, and deliberately ignoring any new information on the topic
  • Being overconfident about the validity of our held beliefs.

Peters (2020) also suggests that we’re more likely to remember information that supports our way of thinking, further cementing our bias. Taken together, the research suggests that bias has a huge impact on the way we think. To learn more about how and why bias can impact our everyday thinking, watch this short video.

Filter bubbles and echo chambers

The theory of filter bubbles emerged in 2011, proposed by internet activist Eli Pariser. He defined a filter bubble as “your own personal unique world of information that you live in online” (Pariser, 2011, 4:21). At the time, Pariser focused on the impact of algorithms connected with social media platforms and search engines, which prioritised content and personalised results based on the individual’s past online activity, suggesting that “the Internet is showing us what it thinks we want to see, but not necessarily what we should see” (Pariser, 2011, 3:47; watch his TED talk if you’d like to know more).

Our understanding of filter bubbles has now expanded to recognise that individuals also select and create their own filter bubbles. This happens when you seek out like-minded individuals or sources, or follow friends and people you admire on social media: people with whom you’re likely to share beliefs, points of view, and interests. Barack Obama (2017) addressed the concept of filter bubbles in his presidential farewell address:

For too many of us it’s become safer to retreat into our own bubbles, whether in our neighbourhoods, or on college campuses, or places of worship, or especially our social media feeds, surrounded by people who look like us and share the same political outlook and never challenge our assumptions… Increasingly we become so secure in our bubbles that we start accepting only information, whether it’s true or not, that fits our opinions, instead of basing our opinions on the evidence that is out there. (Obama, 2017, 22:57)

Filter bubbles are not unique to the social media age. Previously, the term echo chamber was used to describe the same phenomenon in the news media, where different channels exist catering to different points of view. Within an echo chamber, people are able to seek out information that supports their existing beliefs without encountering information that might challenge, contradict, or oppose them.

Other forms of bias

There are many different ways in which bias can affect the way you think and how you process new information. Try the quiz below to discover some additional forms of bias, or check out BuzzFeed’s 2017 article on cognitive bias.

Cognitive Bias: How We Are Wired to Misjudge

By Charlotte Ruhl; edited by Saul Mcleod, PhD, and Olivia Guy-Evans, MSc (Simply Psychology)

Have you ever been so busy talking on the phone that you don’t notice the light has turned green and it is your turn to cross the street?

Have you ever shouted, “I knew that was going to happen!” after your favorite baseball team gave up a huge lead in the ninth inning and lost?

Or have you ever found yourself only reading news stories that further support your opinion?

These are just a few of the many instances of cognitive bias that we experience every day of our lives. But before we dive into these different biases, let’s backtrack first and define what bias is.


What is Cognitive Bias?

Cognitive bias is a systematic error in thinking, affecting how we process information, perceive others, and make decisions. It can lead to irrational thoughts or judgments and is often based on our perceptions, memories, or individual and societal beliefs.

Biases are unconscious and automatic processes designed to make decision-making quicker and more efficient. Cognitive biases can be caused by many things, such as heuristics (mental shortcuts), social pressures, and emotions.

Broadly speaking, bias is a tendency to lean in favor of or against a person, group, idea, or thing, usually in an unfair way. Biases are natural, a product of human nature, and they don’t simply exist in a vacuum or in our minds: they affect the way we make decisions and act.

In psychology, there are two main branches of biases: conscious and unconscious. Conscious or explicit bias is intentional — you are aware of your attitudes and the behaviors resulting from them (Lang, 2019).

Explicit bias can be good because it helps provide you with a sense of identity and can lead you to make good decisions (for example, being biased towards healthy foods).

However, these biases can often be dangerous when they take the form of conscious stereotyping.

On the other hand, unconscious bias, or cognitive bias, represents a set of unintentional biases — you are unaware of your attitudes and the behaviors resulting from them (Lang, 2019).

Cognitive bias is often a result of your brain’s attempt to simplify information processing: we receive roughly 11 million bits of information per second but can only consciously process about 40 bits per second (Orzan et al., 2012).

Therefore, we often rely on mental shortcuts (called heuristics) to help make sense of the world with relative speed. As such, these errors tend to arise from problems related to thinking: memory, attention, and other mental mistakes.
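The scale of that simplification is easy to check with back-of-the-envelope arithmetic. The snippet below is a rough illustration using only the figures quoted above; it computes what fraction of incoming information is consciously processed.

```python
# Rough arithmetic using the figures quoted above (Orzan et al., 2012):
# ~11 million bits of sensory information received per second,
# ~40 bits consciously processed per second.
received_bits_per_sec = 11_000_000
processed_bits_per_sec = 40

share = processed_bits_per_sec / received_bits_per_sec
print(f"{share:.2e}")          # 3.64e-06
print(f"{share * 100:.5f}%")   # 0.00036% of incoming information
```

On these figures, conscious processing handles fewer than four bits in every million received, which is why shortcuts are unavoidable.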

Cognitive biases can be beneficial because they do not require much mental effort and can allow you to make decisions relatively quickly, but like conscious biases, unconscious biases can also take the form of harmful prejudice that serves to hurt an individual or a group.

Although it may feel like there has been a recent rise of unconscious bias, especially in the context of police brutality and the Black Lives Matter movement, this is not a new phenomenon.

Thanks to Tversky and Kahneman (and several other psychologists who have paved the way), we now have an existing dictionary of our cognitive biases.

Again, these biases occur as an attempt to simplify the complex world and make information processing faster and easier. This section will dive into some of the most common forms of cognitive bias.


Confirmation Bias

Confirmation bias is the tendency to interpret new information as confirmation of your preexisting beliefs and opinions while giving disproportionately less consideration to alternative possibilities.

Real-World Examples

Since Wason’s 1960 experiment, real-world examples of confirmation bias have gained attention.

This bias often seeps into the research world when psychologists selectively interpret data or ignore unfavorable data to produce results that support their initial hypothesis.

Confirmation bias is also incredibly pervasive on the internet, particularly with social media. We tend to read online news articles that support our beliefs and fail to seek out sources that challenge them.

Various social media platforms, such as Facebook, help reinforce our confirmation bias by feeding us stories that we are likely to agree with, further pushing us into echo chambers of political polarization.

Some examples of confirmation bias are especially harmful, specifically in the context of the law. For example, a detective may identify a suspect early in an investigation, seek out confirming evidence, and downplay falsifying evidence.

Experiments

The confirmation bias dates back to 1960 when Peter Wason challenged participants to identify a rule applying to triples of numbers.

Participants were first told that the sequence 2, 4, 6 fit the rule; they then had to generate triples of their own and were told whether each fit the rule. The rule was simple: any ascending sequence.

Not only did participants have an unusually difficult time discovering this rule, instead devising overly complicated hypotheses, but they also generated only triples that confirmed their preexisting hypotheses (Wason, 1960).
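Wason’s task is simple enough to sketch in a few lines of code. The snippet below is an illustrative reconstruction, not Wason’s original materials; it shows why confirmation-driven testing fails: triples chosen to confirm a “numbers increasing by 2” hypothesis all pass, so the hypothesis is never falsified, while a single disconfirming test would expose the real rule.

```python
# Illustrative sketch of Wason's 2-4-6 task: the hidden rule is simply
# "any strictly ascending sequence", but confirmation-biased testing
# never reveals it.

def hidden_rule(triple):
    """The experimenter's rule: the three numbers must be strictly ascending."""
    a, b, c = triple
    return a < b < c

# A biased participant hypothesises "numbers increasing by 2" and only
# tests triples that would CONFIRM that hypothesis:
confirming_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
print([hidden_rule(t) for t in confirming_tests])  # all True -> hypothesis looks "confirmed"

# A disconfirming test (one the biased participant never tries) would
# expose the real rule immediately:
print(hidden_rule((1, 2, 50)))  # True, even though it doesn't fit "+2"
print(hidden_rule((6, 4, 2)))   # False: descending
```

The only way to learn the rule is to try triples your hypothesis predicts should fail, which is exactly what Wason’s participants avoided doing.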

Explanations

But why does confirmation bias occur? It’s partially due to the effect of desire on our beliefs. In other words, certain desired conclusions (ones that support our beliefs) are more likely to be processed by the brain and labeled as true (Nickerson, 1998).

This motivational explanation is often coupled with a more cognitive theory.

The cognitive explanation argues that because our minds can only focus on one thing at a time, it is hard to parallel process (see information processing for more information) alternate hypotheses, so, as a result, we only process the information that aligns with our beliefs (Nickerson, 1998).

Another theory explains confirmation bias as a way of enhancing and protecting our self-esteem.

As with the self-serving bias (see more below), our minds choose to reinforce our preexisting ideas because being right helps preserve our sense of self-esteem, which is important for feeling secure in the world and maintaining positive relationships (Casad, 2019).

Although confirmation bias has obvious consequences, you can still work towards overcoming it by being open-minded and willing to look at situations from a different perspective than you might be used to (Luippold et al., 2015).

Even though this bias is unconscious, training your mind to become more flexible in its thought patterns will help mitigate the effects of this bias.

Hindsight Bias

Hindsight bias refers to the tendency to perceive past events as more predictable than they actually were (Roese & Vohs, 2012). There are cognitive and motivational explanations for why we ascribe so much certainty to knowing the outcome of an event only once the event is completed.

Hindsight Bias Example

When sports fans know the outcome of a game, they often question certain decisions coaches make that they otherwise would not have questioned or second-guessed.

And fans are also quick to remark that they knew their team was going to win or lose, but, of course, they only make this statement after their team actually did win or lose.

Although research has demonstrated that hindsight bias isn’t necessarily mitigated by mere recognition of the bias (Pohl & Hell, 1996), you can still make a conscious effort to remind yourself that you can’t predict the future and motivate yourself to consider alternate explanations.

It’s important to do all we can to reduce this bias because when we are overly confident about our ability to predict outcomes, we might make future risky decisions that could have potentially dangerous outcomes.

Building on Tversky and Kahneman’s growing list of heuristics, researchers Baruch Fischhoff and Ruth Beyth (1975) were the first to directly investigate the hindsight bias in an empirical setting.

The team asked participants to judge the likelihood of several different outcomes of former U.S. president Richard Nixon’s visit to Beijing and Moscow.

After Nixon returned to the United States, participants were asked to recall the likelihood they had initially assigned to each outcome.

Fischhoff and Beyth found that for events that actually occurred, participants greatly overestimated the initial likelihood they assigned to those events.

That same year, Fischhoff (1975) introduced a new method for testing the hindsight bias – one that researchers still use today.

Participants are given a short story with four possible outcomes, and they are told that one is true. When they are then asked to assign the likelihood of each specific outcome, they regularly assign a higher likelihood to whichever outcome they have been told is true, regardless of how likely it actually is.
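Fischhoff’s four-outcome design can be sketched as a toy model. The probabilities and the size of the shift below are hypothetical, chosen only to illustrate the pattern he observed: the outcome labelled “true” is assigned a higher likelihood than it would receive in foresight.

```python
# Toy sketch of the hindsight shift in Fischhoff's (1975) design.
# All numbers here are hypothetical illustrations, not data from the study.

foresight = {"A": 0.25, "B": 0.25, "C": 0.25, "D": 0.25}  # no outcome revealed

def with_hindsight(probs, revealed, shift=0.20):
    """Inflate the revealed outcome's judged probability, then renormalise."""
    boosted = {k: (v + shift if k == revealed else v) for k, v in probs.items()}
    total = sum(boosted.values())
    return {k: v / total for k, v in boosted.items()}

# Told that outcome "B" actually happened, a participant rates it higher:
hind = with_hindsight(foresight, "B")
print(round(hind["B"], 3))  # 0.375, up from 0.25 in foresight
```

The renormalisation step mirrors the empirical finding: the “known” outcome gains probability at the expense of the alternatives, regardless of how likely it actually was.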

But hindsight bias does not only exist in artificial settings. In 1993, Dorothee Dietrich and Matthew Olson asked college students to predict how the U.S. Senate would vote on the confirmation of Supreme Court nominee Clarence Thomas.

Before the vote, 58% of participants predicted that he would be confirmed, but after his actual confirmation, 78% of students said that they thought he would be approved – a prime example of the hindsight bias. And this form of bias extends beyond the research world.

From the cognitive perspective, hindsight bias may result from distortions of our memories of what we knew or believed before an event occurred (Inman, 2016).

It is easier to recall information that is consistent with our current knowledge, so our memories become warped in a way that agrees with what actually did happen.

Motivational explanations of the hindsight bias point to the fact that we are motivated to live in a predictable world (Inman, 2016).

When surprising outcomes arise, our expectations are violated, and we may experience negative reactions as a result. Thus, we rely on the hindsight bias to avoid these adverse responses to certain unanticipated events and reassure ourselves that we actually did know what was going to happen.

Self-Serving Bias

Self-serving bias is the tendency to take personal responsibility for positive outcomes and blame external factors for negative outcomes.

You would be right to ask how this is similar to the fundamental attribution error (Ross, 1977), which identifies our tendency to overemphasize internal factors for other people’s behavior while attributing external factors to our own.

The distinction is that the self-serving bias is concerned with valence, that is, how good or bad an event or situation is, and it applies only to events in which you are the actor.

In other words, if a driver cuts in front of you as the light turns green, the fundamental attribution error might cause you to think that they are a bad person and not consider the possibility that they were late for work.

On the other hand, the self-serving bias is exercised when you are the actor. In this example, you would be the driver cutting in front of the other car, which you would tell yourself is because you are late (an external attribution to a negative event) as opposed to it being because you are a bad person.

From sports to the workplace, self-serving bias is incredibly common. For example, athletes are quick to take responsibility for personal wins, attributing their successes to their hard work and mental toughness, but point to external factors, such as unfair calls or bad weather, when they lose (Allen et al., 2020).

In the workplace, people attribute internal factors when they are hired for a job but external factors when they are fired (Furnham, 1982). And in the office itself, workplace conflicts are given external attributions, while successes, whether a persuasive presentation or a promotion, are awarded internal explanations (Walther & Bazarova, 2007).

Additionally, self-serving bias is more prevalent in individualistic cultures, which place emphasis on self-esteem levels and individual goals, and it is less prevalent among individuals with depression (Mezulis et al., 2004), who are more likely to take responsibility for negative outcomes.

Overcoming this bias can be difficult because it is at the expense of our self-esteem. Nevertheless, practicing self-compassion – treating yourself with kindness even when you fall short or fail – can help reduce the self-serving bias (Neff, 2003).

The leading explanation for the self-serving bias is that it is a way of protecting our self-esteem (similar to one of the explanations for the confirmation bias).

We are quick to take credit for positive outcomes and divert the blame for negative ones to boost and preserve our individual ego, which is necessary for confidence and healthy relationships with others (Heider, 1982).

Another theory argues that self-serving bias occurs when surprising events arise. When certain outcomes run counter to our expectations, we ascribe external factors, but when outcomes are in line with our expectations, we attribute internal factors (Miller & Ross, 1975).

An extension of this theory asserts that we are naturally optimistic, so negative outcomes come as a surprise and receive external attributions as a result.

Anchoring Bias

Anchoring bias is closely related to the decision-making process. It occurs when we rely too heavily on pre-existing information or on the first piece of information we receive (the anchor) when making a decision.

For example, if you first see a T-shirt that costs $1,000 and then see a second one that costs $100, you’re more likely to view the second shirt as cheap than you would if the first shirt you saw had cost $120. Here, the price of the first shirt influences how you view the second.

Anchoring Bias Example

Sarah is looking to buy a used car. The first dealership she visits has a used sedan listed for $19,000. Sarah takes this initial listing price as an anchor and uses it to evaluate prices at other dealerships.

When she sees another similar used sedan priced at $18,000, that price seems like a good bargain compared to the $19,000 anchor price she saw first, even though the actual market value is closer to $16,000.

When Sarah finds a comparable used sedan priced at $15,500, she continues perceiving that price as cheap compared to her anchored reference price.

Ultimately, Sarah purchases the $18,000 sedan, overlooking that all of the prices seemed like bargains only in relation to the initial high anchor price.

The key elements that demonstrate anchoring bias here are:

  • Sarah establishes an initial reference price based on the first listing she sees ($19k)
  • She uses that initial price as her comparison/anchor for evaluating subsequent prices
  • This biases her perception of the market value of the cars she looks at after the initial anchor is set
  • She makes a purchase decision aligned with her anchored expectations rather than a more objective market value

Multiple theories seek to explain the existence of this bias.

One theory, known as anchoring and adjustment, argues that once an anchor is established, people insufficiently adjust away from it to arrive at their final answer, so their final guess or decision is closer to the anchor than it otherwise would have been (Tversky & Kahneman, 1974).

And when people experience a greater cognitive load (the amount of information the working memory can hold at any given time; for example, a difficult decision as opposed to an easy one), they are more susceptible to the effects of anchoring.
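The anchoring-and-adjustment account lends itself to a one-line model. The sketch below is purely illustrative, and the `adjustment_rate` parameter is an assumption, not an empirically fitted value: estimates start at the anchor and move toward the true value, but not far enough.

```python
# Illustrative sketch (not a fitted model) of anchoring-and-adjustment:
# the final estimate moves from the anchor toward the true value, but
# the adjustment is insufficient (adjustment_rate < 1).

def anchored_estimate(anchor, true_value, adjustment_rate=0.5):
    """Final judgment after insufficiently adjusting away from the anchor.
    adjustment_rate is a hypothetical parameter: 1.0 would mean full
    (unbiased) adjustment, 0.0 would mean staying at the anchor."""
    return anchor + adjustment_rate * (true_value - anchor)

# Sarah's used-car example: $19,000 anchor, ~$16,000 actual market value.
print(anchored_estimate(19_000, 16_000))  # 17500.0 -- still above market value
# A different anchor drags the estimate in the other direction:
print(anchored_estimate(12_000, 16_000))  # 14000.0
```

The model also captures the cognitive-load finding: a smaller `adjustment_rate` (less capacity to adjust) leaves the final estimate closer to the anchor.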

Another theory, selective accessibility, holds that although we assume the anchor is not a suitable answer (or, returning to the earlier example, a suitable price), when we evaluate the second stimulus (the second shirt) we look for ways in which it is similar to or different from the anchor, resulting in the anchoring effect (Mussweiler & Strack, 1999).

A final theory posits that providing an anchor changes someone’s attitudes to be more favorable to the anchor, which then biases future answers to have similar characteristics as the initial anchor.

Although there are many different theories for why we experience anchoring bias, they all agree that it affects our decisions in real ways (Wegner et al., 2001).

The first study that brought this bias to light was during one of Tversky and Kahneman’s (1974) initial experiments. They asked participants to compute the product of numbers 1-8 in five seconds, either as 1x2x3… or 8x7x6…

Participants did not have enough time to calculate the answer, so they had to estimate based on their first few calculations.

They found that those who computed the small multiplications first (i.e., 1x2x3…) gave a median estimate of 512, but those who computed the larger multiplications first gave a median estimate of 2,250 (although the actual answer is 40,320).

This demonstrates how the initial few calculations influenced the participant’s final answer.
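The arithmetic behind the experiment is easy to verify. The snippet below computes the true product and contrasts it with the two median estimates reported above.

```python
import math

# The product participants had to estimate in five seconds: 1*2*3*...*8
actual = math.factorial(8)
print(actual)             # 40320

# Median estimates reported by Tversky and Kahneman (1974):
ascending_median = 512    # group shown 1 x 2 x 3 x ...
descending_median = 2250  # group shown 8 x 7 x 6 x ...

# The group anchored on larger leading terms lands higher, yet both
# groups underestimate the true product by more than an order of magnitude:
print(descending_median > ascending_median)  # True
print(actual // descending_median)           # 17 -- even the higher median is ~18x too low
```

Both medians fall far short of 40,320 because the first few partial products anchor the extrapolation, and the adjustment away from that anchor is insufficient.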

Availability Bias

Availability bias (also commonly referred to as the availability heuristic) refers to the tendency to think that examples of things that readily come to mind are more common than is actually the case.

In other words, information that comes to mind faster influences the decisions we make about the future. And just like with the hindsight bias, this bias is related to an error of memory.

But instead of being a memory fabrication, it is an overemphasis on a certain memory.

In the workplace, if someone is being considered for a promotion but their boss recalls one bad thing that happened years ago but left a lasting impression, that one event might have an outsized influence on the final decision.

Another common example is buying lottery tickets because the lifestyle and benefits of winning are more readily available in mind (and the potential emotions associated with winning or seeing other people win) than the complex probability calculation of actually winning the lottery (Cherry, 2019).

A final common example used to demonstrate the availability heuristic describes how seeing several television shows or news reports about shark attacks (or anything sensationalized by the news, such as serial killers or plane crashes) might make you think such incidents are relatively common, even though they are actually very rare.

Regardless, this thinking might make you less inclined to go in the water the next time you go to the beach (Cherry, 2019).

As with most cognitive biases, the best way to overcome them is by recognizing the bias and being more cognizant of your thoughts and decisions.

And because we fall victim to this bias when our brain relies on quick mental shortcuts in order to save time, slowing down our thinking and decision-making process is a crucial step to mitigating the effects of the availability heuristic.

Researchers think this bias occurs because the brain is constantly trying to minimize the effort necessary to make decisions, and so we rely on certain memories – ones that we can recall more easily – instead of having to endure the complicated task of calculating statistical probabilities.

Two main types of memories are easier to recall: 1) those that more closely align with the way we see the world and 2) those that evoke more emotion and leave a more lasting impression.

This first type of memory was identified in 1973, when Tversky and Kahneman, our cognitive bias pioneers, conducted a study in which they asked participants if more words begin with the letter K or if more words have K as their third letter.

Although many more words have K as their third letter, 70% of participants said that more words begin with K, because such words are not only easier to recall, but recalling them aligns more closely with how we organize words in memory (we index words by their first letter far more readily than by their third).

In terms of the second type of memory, the same duo ran an experiment 10 years later, in 1983, in which half the participants were asked to guess the likelihood that a massive flood would occur somewhere in North America, while the other half had to guess the likelihood of a flood occurring due to an earthquake in California.

Although the latter is much less likely, participants still said that this would be much more common because they could recall specific, emotionally charged events of earthquakes hitting California, largely due to the news coverage they receive.

Together, these studies highlight how memories that are easier to recall greatly influence our judgments and perceptions about future events.

Inattentional Blindness

A final popular form of cognitive bias is inattentional blindness . This occurs when a person fails to notice a stimulus that is in plain sight because their attention is directed elsewhere.

For example, while driving a car, you might be so focused on the road ahead of you that you completely fail to notice a car swerve into your lane of traffic.

Because your attention is directed elsewhere, you aren’t able to react in time, potentially leading to a car accident. Experiencing inattentional blindness has its obvious consequences (as illustrated by this example), but, like all biases, it is not impossible to overcome.

Many theories seek to explain why we experience this form of cognitive bias. In reality, it is probably some combination of these explanations.

Conspicuity theory holds that certain sensory stimuli (such as bright colors) and cognitive stimuli (such as something familiar) are more likely to be processed, so stimuli that don’t fit into one of these two categories might be missed.

The mental workload theory describes how when we focus a lot of our brain’s mental energy on one stimulus, we are using up our cognitive resources and won’t be able to process another stimulus simultaneously.

Similarly, some psychologists explain how we attend to different stimuli with varying levels of attentional capacity, which might affect our ability to process multiple stimuli simultaneously.

In other words, an experienced driver might be able to see that car swerve into the lane because they are using fewer mental resources to drive, whereas a beginner driver might be using more resources to focus on the road ahead and unable to process that car swerving in.

A final explanation argues that because our attentional and processing resources are limited, our brain dedicates them to what fits into our schemas or our cognitive representations of the world (Cherry, 2020).

Thus, when an unexpected stimulus comes into our line of sight, we might not be able to process it on the conscious level. The following example illustrates how this might happen.

The most famous study demonstrating the inattentional blindness phenomenon is the invisible gorilla study (Simons & Chabris, 1999). The experiment asked participants to watch a video of two groups of players passing basketballs and count how many times the white team passed the ball.

Participants are able to accurately report the number of passes, but what they fail to notice is a gorilla walking directly through the middle of the circle.

Because this would not be expected, and because our brain is using up its resources to count the number of passes, we completely fail to process something right before our eyes.

A real-world example of inattentional blindness occurred in 1995 when Boston police officer Kenny Conley was chasing a suspect and ran by a group of officers who were mistakenly holding down an undercover cop.

Conley was convicted of perjury and obstruction of justice because he supposedly saw the fight between the undercover cop and the other officers and lied about it to protect the officers, but he stood by his word that he really hadn’t seen it (due to inattentional blindness) and was ultimately exonerated (Pickel, 2015).

The key to overcoming inattentional blindness is to maximize your attention by avoiding distractions such as checking your phone. And it is also important to pay attention to what other people might not notice (if you are that driver, don’t always assume that others can see you).

By working on expanding your attention and minimizing unnecessary distractions that will use up your mental resources, you can work towards overcoming this bias.

Preventing Cognitive Bias

As we know, recognizing these biases is the first step to overcoming them. But there are other small strategies we can follow in order to train our unconscious mind to think in different ways.

From strengthening our memory and minimizing distractions to slowing down our decision-making and improving our reasoning skills, we can work towards overcoming these cognitive biases.

An individual can evaluate his or her own thought process, also known as metacognition (“thinking about thinking”), which provides an opportunity to combat bias (Flavell, 1979).

This multifactorial process involves (Croskerry, 2003):

(a) acknowledging the limitations of memory, (b) seeking perspective while making decisions, (c) being able to self-critique, (d) choosing strategies to prevent cognitive error.

Many strategies used to avoid bias that we describe are also known as cognitive forcing strategies, which are mental tools used to force unbiased decision-making.

The History of Cognitive Bias

The term cognitive bias was first coined in the 1970s by Israeli psychologists Amos Tversky and Daniel Kahneman, who used this phrase to describe people’s flawed thinking patterns in response to judgment and decision problems (Tversky & Kahneman, 1974).

Tversky and Kahneman’s research program, the heuristics and biases program, investigated how people make decisions given limited resources (for example, limited time to decide which food to eat or limited information to decide which house to buy).

As a result of these limited resources, people are forced to rely on heuristics or quick mental shortcuts to help make their decisions.

Tversky and Kahneman wanted to understand the biases associated with this judgment and decision-making process.

To do so, the two researchers relied on a research paradigm that presented participants with some type of reasoning problem with a computed normative answer (they used probability theory and statistics to compute the expected answer).

Participants’ responses were then compared with the predetermined solution to reveal the systematic deviations in the mind.

After running several experiments with countless reasoning problems, the researchers were able to identify numerous norm violations that result when our minds rely on these cognitive biases to make decisions and judgments (Wilke & Mata, 2012).

Key Takeaways

  • Cognitive biases are unconscious errors in thinking that arise from problems related to memory, attention, and other mental mistakes.
  • These biases result from our brain’s efforts to simplify the incredibly complex world in which we live.
  • Confirmation bias, hindsight bias, the mere exposure effect, self-serving bias, the base rate fallacy, anchoring bias, availability bias, the framing effect, inattentional blindness, the ecological fallacy, and the false consensus effect are some of the most common examples of cognitive bias.
  • Cognitive biases directly affect our safety, interactions with others, and how we make judgments and decisions in our daily lives.
  • Although these biases are unconscious, there are small steps we can take to train our minds to adopt a new pattern of thinking and mitigate the effects of these biases.

Allen, M. S., Robson, D. A., Martin, L. J., & Laborde, S. (2020). Systematic review and meta-analysis of self-serving attribution biases in the competitive context of organized sport. Personality and Social Psychology Bulletin, 46 (7), 1027-1043.

Casad, B. (2019). Confirmation bias. Retrieved from https://www.britannica.com/science/confirmation-bias

Cherry, K. (2019). How the availability heuristic affects your decision-making. Retrieved from https://www.verywellmind.com/availability-heuristic-2794824

Cherry, K. (2020). Inattentional blindness can cause you to miss things in front of you. Retrieved from https://www.verywellmind.com/what-is-inattentional-blindness-2795020

Dietrich, D., & Olson, M. (1993). A demonstration of hindsight bias using the Thomas confirmation vote. Psychological Reports, 72 (2), 377-378.

Fischhoff, B. (1975). Hindsight is not equal to foresight: The effect of outcome knowledge on judgment under uncertainty. Journal of Experimental Psychology: Human Perception and Performance, 1 (3), 288.

Fischhoff, B., & Beyth, R. (1975). I knew it would happen: Remembered probabilities of once-future things. Organizational Behavior and Human Performance, 13 (1), 1-16.

Furnham, A. (1982). Explanations for unemployment in Britain. European Journal of Social Psychology, 12 (4), 335-352.

Heider, F. (1982). The psychology of interpersonal relations. Psychology Press.

Inman, M. (2016). Hindsight bias. Retrieved from https://www.britannica.com/topic/hindsight-bias

Lang, R. (2019). What is the difference between conscious and unconscious bias? FAQs. Retrieved from https://engageinlearning.com/faq/compliance/unconscious-bias/what-is-the-difference-between-conscious-and-unconscious-bias/

Luippold, B., Perreault, S., & Wainberg, J. (2015). Auditor’s pitfall: Five ways to overcome confirmation bias. Retrieved from https://www.babson.edu/academics/executive-education/babson-insight/finance-and-accounting/auditors-pitfall-five-ways-to-overcome-confirmation-bias/

Mezulis, A. H., Abramson, L. Y., Hyde, J. S., & Hankin, B. L. (2004). Is there a universal positivity bias in attributions? A meta-analytic review of individual, developmental, and cultural differences in the self-serving attributional bias. Psychological Bulletin, 130 (5), 711.

Miller, D. T., & Ross, M. (1975). Self-serving biases in the attribution of causality: Fact or fiction? Psychological Bulletin, 82 (2), 213.

Most, S. B., Simons, D. J., Scholl, B. J., Jimenez, R., Clifford, E., & Chabris, C. F. (2001). How not to be seen: The contribution of similarity and selective ignoring to sustained inattentional blindness. Psychological Science, 12 (1), 9-17.

Mussweiler, T., & Strack, F. (1999). Hypothesis-consistent testing and semantic priming in the anchoring paradigm: A selective accessibility model. Journal of Experimental Social Psychology, 35 (2), 136-164.

Neff, K. (2003). Self-compassion: An alternative conceptualization of a healthy attitude toward oneself. Self and Identity, 2 (2), 85-101.

Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2 (2), 175-220.

Orzan, G., Zara, I. A., & Purcarea, V. L. (2012). Neuromarketing techniques in pharmaceutical drugs advertising. A discussion and agenda for future research. Journal of Medicine and Life, 5 (4), 428.

Pickel, K. L. (2015). Eyewitness memory. The handbook of attention, 485-502.

Pohl, R. F., & Hell, W. (1996). No reduction in hindsight bias after complete information and repeated testing. Organizational Behavior and Human Decision Processes, 67 (1), 49-58.

Roese, N. J., & Vohs, K. D. (2012). Hindsight bias. Perspectives on Psychological Science, 7 (5), 411-426.

Ross, L. (1977). The intuitive psychologist and his shortcomings: Distortions in the attribution process. In Advances in experimental social psychology (Vol. 10, pp. 173-220). Academic Press.

Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5 (2), 207-232.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185 (4157), 1124-1131.

Tversky, A., & Kahneman, D. (1983). Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment. Psychological Review, 90 (4), 293.

Tversky, A., & Kahneman, D. (1992). Advances in prospect theory: Cumulative representation of uncertainty. Journal of Risk and Uncertainty, 5 (4), 297-323.

Walther, J. B., & Bazarova, N. N. (2007). Misattribution in virtual groups: The effects of member distribution on self-serving bias and partner blame. Human Communication Research, 33 (1), 1-26.

Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12 (3), 129-140.

Wegener, D. T., Petty, R. E., Detweiler-Bedell, B. T., & Jarvis, W. B. G. (2001). Implications of attitude change theories for numerical anchoring: Anchor plausibility and the limits of anchor effectiveness. Journal of Experimental Social Psychology, 37 (1), 62-69.

Wilke, A., & Mata, R. (2012). Cognitive bias. In Encyclopedia of human behavior (pp. 531-535). Academic Press.

Further Information

Test yourself for bias.

  • Project Implicit (IAT Test) From Harvard University
  • Implicit Association Test From the Social Psychology Network
  • Test Yourself for Hidden Bias From Teaching Tolerance
  • How The Concept Of Implicit Bias Came Into Being With Dr. Mahzarin Banaji, Harvard University, author of Blindspot: Hidden Biases of Good People. 5:28 minutes; includes transcript
  • Understanding Your Racial Biases With John Dovidio, PhD, Yale University. From the American Psychological Association. 11:09 minutes; includes transcript
  • Talking Implicit Bias in Policing With Jack Glaser, Goldman School of Public Policy, University of California Berkeley. 21:59 minutes
  • Implicit Bias: A Factor in Health Communication With Dr. Winston Wong, Kaiser Permanente. 19:58 minutes
  • Bias, Black Lives and Academic Medicine Dr. David Ansell on Your Health Radio (August 1, 2015). 21:42 minutes
  • Uncovering Hidden Biases Google talk with Dr. Mahzarin Banaji, Harvard University
  • Impact of Implicit Bias on the Justice System. 9:14 minutes
  • Students Speak Up: What Bias Means to Them. 2:17 minutes
  • Weight Bias in Health Care From Yale University. 16:56 minutes
  • Gender and Racial Bias In Facial Recognition Technology. 4:43 minutes

Journal Articles

  • Mitchell, G. (2018). An implicit bias primer. Virginia Journal of Social Policy & the Law, 25, 27–59.
  • Nosek, B. A., Greenwald, A. G., & Banaji, M. R. (2007). The Implicit Association Test at age 7: A methodological and conceptual review. Automatic Processes in Social Thinking and Behavior, 4, 265–292.
  • Hall, W. J., Chapman, M. V., Lee, K. M., Merino, Y. M., Thomas, T. W., Payne, B. K., … & Coyne-Beasley, T. (2015). Implicit racial/ethnic bias among health care professionals and its influence on health care outcomes: A systematic review. American Journal of Public Health, 105 (12), e60–e76.
  • Burgess, D., Van Ryn, M., Dovidio, J., & Saha, S. (2007). Reducing racial bias among health care providers: Lessons from social-cognitive psychology. Journal of General Internal Medicine, 22 (6), 882–887.
  • Boysen, G. A. (2010). Integrating implicit bias into counselor education. Counselor Education & Supervision, 49 (4), 210–227.
  • Christian, S. (2013). Cognitive biases and errors as cause—and journalistic best practices as effect. Journal of Mass Media Ethics, 28 (3), 160–174.
  • Whitford, D. K., & Emerson, A. M. (2019). Empathy intervention to reduce implicit bias in pre-service teachers. Psychological Reports, 122 (2), 670–688.


Critical Thinking

Critical thinking is a widely accepted educational goal. Its definition is contested, but the competing definitions can be understood as differing conceptions of the same basic concept: careful thinking directed to a goal. Conceptions differ with respect to the scope of such thinking, the type of goal, the criteria and norms for thinking carefully, and the thinking components on which they focus. Its adoption as an educational goal has been recommended on the basis of respect for students’ autonomy and preparing students for success in life and for democratic citizenship. “Critical thinkers” have the dispositions and abilities that lead them to think critically when appropriate. The abilities can be identified directly; the dispositions indirectly, by considering what factors contribute to or impede exercise of the abilities. Standardized tests have been developed to assess the degree to which a person possesses such dispositions and abilities. Educational intervention has been shown experimentally to improve them, particularly when it includes dialogue, anchored instruction, and mentoring. Controversies have arisen over the generalizability of critical thinking across domains, over alleged bias in critical thinking theories and instruction, and over the relationship of critical thinking to other types of thinking.

  • 2.1 Dewey’s Three Main Examples
  • 2.2 Dewey’s Other Examples
  • 2.3 Further Examples
  • 2.4 Non-Examples
  • 3. The Definition of Critical Thinking
  • 4. Its Value
  • 5. The Process of Thinking Critically
  • 6. Components of the Process
  • 7. Contributory Dispositions and Abilities
  • 8.1 Initiating Dispositions
  • 8.2 Internal Dispositions
  • 9. Critical Thinking Abilities
  • 10. Required Knowledge
  • 11. Educational Methods
  • 12.1 The Generalizability of Critical Thinking
  • 12.2 Bias in Critical Thinking Theory and Pedagogy
  • 12.3 Relationship of Critical Thinking to Other Types of Thinking
  • Other Internet Resources
  • Related Entries

Use of the term ‘critical thinking’ to describe an educational goal goes back to the American philosopher John Dewey (1910), who more commonly called it ‘reflective thinking’. He defined it as

active, persistent and careful consideration of any belief or supposed form of knowledge in the light of the grounds that support it, and the further conclusions to which it tends. (Dewey 1910: 6; 1933: 9)

and identified a habit of such consideration with a scientific attitude of mind. His lengthy quotations of Francis Bacon, John Locke, and John Stuart Mill indicate that he was not the first person to propose development of a scientific attitude of mind as an educational goal.

In the 1930s, many of the schools that participated in the Eight-Year Study of the Progressive Education Association (Aikin 1942) adopted critical thinking as an educational goal, for whose achievement the study’s Evaluation Staff developed tests (Smith, Tyler, & Evaluation Staff 1942). Glaser (1941) showed experimentally that it was possible to improve the critical thinking of high school students. Bloom’s influential taxonomy of cognitive educational objectives (Bloom et al. 1956) incorporated critical thinking abilities. Ennis (1962) proposed 12 aspects of critical thinking as a basis for research on the teaching and evaluation of critical thinking ability.

Since 1980, an annual international conference in California on critical thinking and educational reform has attracted tens of thousands of educators from all levels of education and from many parts of the world. Also since 1980, the state university system in California has required all undergraduate students to take a critical thinking course. Since 1983, the Association for Informal Logic and Critical Thinking has sponsored sessions in conjunction with the divisional meetings of the American Philosophical Association (APA). In 1987, the APA’s Committee on Pre-College Philosophy commissioned a consensus statement on critical thinking for purposes of educational assessment and instruction (Facione 1990a). Researchers have developed standardized tests of critical thinking abilities and dispositions; for details, see the Supplement on Assessment . Educational jurisdictions around the world now include critical thinking in guidelines for curriculum and assessment. Political and business leaders endorse its importance.

For details on this history, see the Supplement on History .

2. Examples and Non-Examples

Before considering the definition of critical thinking, it will be helpful to have in mind some examples of critical thinking, as well as some examples of kinds of thinking that would apparently not count as critical thinking.

Dewey (1910: 68–71; 1933: 91–94) takes as paradigms of reflective thinking three class papers of students in which they describe their thinking. The examples range from the everyday to the scientific.

Transit : “The other day, when I was down town on 16th Street, a clock caught my eye. I saw that the hands pointed to 12:20. This suggested that I had an engagement at 124th Street, at one o'clock. I reasoned that as it had taken me an hour to come down on a surface car, I should probably be twenty minutes late if I returned the same way. I might save twenty minutes by a subway express. But was there a station near? If not, I might lose more than twenty minutes in looking for one. Then I thought of the elevated, and I saw there was such a line within two blocks. But where was the station? If it were several blocks above or below the street I was on, I should lose time instead of gaining it. My mind went back to the subway express as quicker than the elevated; furthermore, I remembered that it went nearer than the elevated to the part of 124th Street I wished to reach, so that time would be saved at the end of the journey. I concluded in favor of the subway, and reached my destination by one o’clock.” (Dewey 1910: 68-69; 1933: 91-92)

Ferryboat : “Projecting nearly horizontally from the upper deck of the ferryboat on which I daily cross the river is a long white pole, having a gilded ball at its tip. It suggested a flagpole when I first saw it; its color, shape, and gilded ball agreed with this idea, and these reasons seemed to justify me in this belief. But soon difficulties presented themselves. The pole was nearly horizontal, an unusual position for a flagpole; in the next place, there was no pulley, ring, or cord by which to attach a flag; finally, there were elsewhere on the boat two vertical staffs from which flags were occasionally flown. It seemed probable that the pole was not there for flag-flying.

“I then tried to imagine all possible purposes of the pole, and to consider for which of these it was best suited: (a) Possibly it was an ornament. But as all the ferryboats and even the tugboats carried poles, this hypothesis was rejected. (b) Possibly it was the terminal of a wireless telegraph. But the same considerations made this improbable. Besides, the more natural place for such a terminal would be the highest part of the boat, on top of the pilot house. (c) Its purpose might be to point out the direction in which the boat is moving.

“In support of this conclusion, I discovered that the pole was lower than the pilot house, so that the steersman could easily see it. Moreover, the tip was enough higher than the base, so that, from the pilot's position, it must appear to project far out in front of the boat. Moreover, the pilot being near the front of the boat, he would need some such guide as to its direction. Tugboats would also need poles for such a purpose. This hypothesis was so much more probable than the others that I accepted it. I formed the conclusion that the pole was set up for the purpose of showing the pilot the direction in which the boat pointed, to enable him to steer correctly.” (Dewey 1910: 69-70; 1933: 92-93)

Bubbles : “In washing tumblers in hot soapsuds and placing them mouth downward on a plate, bubbles appeared on the outside of the mouth of the tumblers and then went inside. Why? The presence of bubbles suggests air, which I note must come from inside the tumbler. I see that the soapy water on the plate prevents escape of the air save as it may be caught in bubbles. But why should air leave the tumbler? There was no substance entering to force it out. It must have expanded. It expands by increase of heat, or by decrease of pressure, or both. Could the air have become heated after the tumbler was taken from the hot suds? Clearly not the air that was already entangled in the water. If heated air was the cause, cold air must have entered in transferring the tumblers from the suds to the plate. I test to see if this supposition is true by taking several more tumblers out. Some I shake so as to make sure of entrapping cold air in them. Some I take out holding mouth downward in order to prevent cold air from entering. Bubbles appear on the outside of every one of the former and on none of the latter. I must be right in my inference. Air from the outside must have been expanded by the heat of the tumbler, which explains the appearance of the bubbles on the outside. But why do they then go inside? Cold contracts. The tumbler cooled and also the air inside it. Tension was removed, and hence bubbles appeared inside. To be sure of this, I test by placing a cup of ice on the tumbler while the bubbles are still forming outside. They soon reverse” (Dewey 1910: 70–71; 1933: 93–94).
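Dewey’s reasoner in Bubbles is implicitly appealing to the thermal expansion of a gas at roughly constant pressure, i.e. Charles’s law. The sketch below makes the arithmetic explicit; the temperatures are illustrative values of my own choosing, since Dewey gives none.

```python
# Dewey's "Bubbles" hypothesis rests on Charles's law for a gas at
# roughly constant pressure: V2 / V1 = T2 / T1 (temperatures in kelvin).
# The 20 C and 60 C figures below are illustrative assumptions.

def expansion_ratio(t_cold_c, t_hot_c):
    """Volume ratio when air warms from t_cold_c to t_hot_c (Celsius)."""
    return (t_hot_c + 273.15) / (t_cold_c + 273.15)

# Room-temperature air trapped against a tumbler fresh from hot suds:
ratio = expansion_ratio(20.0, 60.0)
print(f"volume increase: {100 * (ratio - 1):.1f}%")  # ≈ 13.6%
```

Even a modest warming expands the trapped air by over a tenth of its volume, enough to push bubbles out past the soapy seal; cooling then reverses the process, exactly as the reasoner observes.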

Dewey (1910, 1933) sprinkles his book with other examples of critical thinking. We will refer to the following.

Weather : A man on a walk notices that it has suddenly become cool, thinks that it is probably going to rain, looks up and sees a dark cloud obscuring the sun, and quickens his steps (1910: 6–10; 1933: 9–13).

Disorder : A man finds his rooms on his return to them in disorder with his belongings thrown about, thinks at first of burglary as an explanation, then thinks of mischievous children as being an alternative explanation, then looks to see whether valuables are missing, and discovers that they are (1910: 82–83; 1933: 166–168).

Typhoid : A physician diagnosing a patient whose conspicuous symptoms suggest typhoid avoids drawing a conclusion until more data are gathered by questioning the patient and by making tests (1910: 85–86; 1933: 170).

Blur : A moving blur catches our eye in the distance, we ask ourselves whether it is a cloud of whirling dust or a tree moving its branches or a man signaling to us, we think of other traits that should be found on each of those possibilities, and we look and see if those traits are found (1910: 102, 108; 1933: 121, 133).

Suction pump : In thinking about the suction pump, the scientist first notes that it will draw water only to a maximum height of 33 feet at sea level and to a lesser maximum height at higher elevations, selects for attention the differing atmospheric pressure at these elevations, sets up experiments in which the air is removed from a vessel containing water (when suction no longer works) and in which the weight of air at various levels is calculated, compares the results of reasoning about the height to which a given weight of air will allow a suction pump to raise water with the observed maximum height at different elevations, and finally assimilates the suction pump to such apparently different phenomena as the siphon and the rising of a balloon (1910: 150–153; 1933: 195–198).
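The 33-foot limit in Suction pump follows from barometric arithmetic: a suction pump only removes air, so atmospheric pressure does the lifting, and the maximum column height is h = P / (ρg). A quick check with standard constants reproduces the figure the scientist observes:

```python
# Maximum height a suction pump can lift water: atmospheric pressure
# supports a water column of height h = P / (rho * g).

P_SEA_LEVEL = 101_325      # Pa, standard atmosphere at sea level
RHO_WATER = 1000.0         # kg/m^3
G = 9.80665                # m/s^2, standard gravity
M_TO_FT = 3.28084

h_m = P_SEA_LEVEL / (RHO_WATER * G)
print(f"max lift at sea level: {h_m:.2f} m = {h_m * M_TO_FT:.1f} ft")
# → max lift at sea level: 10.33 m = 33.9 ft
```

At higher elevations P is smaller, so the maximum height drops, which is precisely the covariation the scientist selects for attention.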

Diamond : A passenger in a car driving in a diamond lane reserved for vehicles with at least one passenger notices that the diamond marks on the pavement are far apart in some places and close together in others. Why? The driver suggests that the reason may be that the diamond marks are not needed where there is a solid double line separating the diamond line from the adjoining lane, but are needed when there is a dotted single line permitting crossing into the diamond lane. Further observation confirms that the diamonds are close together when a dotted line separates the diamond lane from its neighbour, but otherwise far apart.

Rash : A woman suddenly develops a very itchy red rash on her throat and upper chest. She recently noticed a mark on the back of her right hand, but was not sure whether the mark was a rash or a scrape. She lies down in bed and thinks about what might be causing the rash and what to do about it. About two weeks before, she began taking blood pressure medication that contained a sulfa drug, and the pharmacist had warned her, in view of a previous allergic reaction to a medication containing a sulfa drug, to be on the alert for an allergic reaction; however, she had been taking the medication for two weeks with no such effect. The day before, she began using a new cream on her neck and upper chest; against the new cream as the cause was the mark on the back of her hand, which had not been exposed to the cream. She began taking probiotics about a month before. She also recently started new eye drops, but she supposed that manufacturers of eye drops would be careful not to include allergy-causing components in the medication. The rash might be a heat rash, since she recently was sweating profusely from her upper body. Since she is about to go away on a short vacation, where she would not have access to her usual physician, she decides to keep taking the probiotics and using the new eye drops but to discontinue the blood pressure medication and to switch back to the old cream for her neck and upper chest. She forms a plan to consult her regular physician on her return about the blood pressure medication.

Candidate : Although Dewey included no examples of thinking directed at appraising the arguments of others, such thinking has come to be considered a kind of critical thinking. We find an example of such thinking in the performance task on the Collegiate Learning Assessment (CLA+), which its sponsoring organization describes as

a performance-based assessment that provides a measure of an institution’s contribution to the development of critical-thinking and written communication skills of its students. (Council for Aid to Education 2017)

A sample task posted on its website requires the test-taker to write a report for public distribution evaluating a fictional candidate’s policy proposals and their supporting arguments, using supplied background documents, with a recommendation on whether to endorse the candidate.

Immediate acceptance of an idea that suggests itself as a solution to a problem (e.g., a possible explanation of an event or phenomenon, an action that seems likely to produce a desired result) is “uncritical thinking, the minimum of reflection” (Dewey 1910: 13). On-going suspension of judgment in the light of doubt about a possible solution is not critical thinking (Dewey 1910: 108). Critique driven by a dogmatically held political or religious ideology is not critical thinking; thus Paulo Freire (1968 [1970]) is using the term (e.g., at 1970: 71, 81, 100, 146) in a more politically freighted sense that includes not only reflection but also revolutionary action against oppression. Derivation of a conclusion from given data using an algorithm is not critical thinking.

What is critical thinking? There are many definitions. Ennis (2016) lists 14 philosophically oriented scholarly definitions and three dictionary definitions. Following Rawls (1971), who distinguished his conception of justice from a utilitarian conception but regarded them as rival conceptions of the same concept, Ennis maintains that the 17 definitions are different conceptions of the same concept. Rawls articulated the shared concept of justice as

a characteristic set of principles for assigning basic rights and duties and for determining… the proper distribution of the benefits and burdens of social cooperation. (Rawls 1971: 5)

Bailin et al. (1999b) claim that, if one considers what sorts of thinking an educator would take not to be critical thinking and what sorts to be critical thinking, one can conclude that educators typically understand critical thinking to have at least three features.

  • It is done for the purpose of making up one’s mind about what to believe or do.
  • The person engaging in the thinking is trying to fulfill standards of adequacy and accuracy appropriate to the thinking.
  • The thinking fulfills the relevant standards to some threshold level.

One could sum up the core concept that involves these three features by saying that critical thinking is careful goal-directed thinking. This core concept seems to apply to all the examples of critical thinking described in the previous section. As for the non-examples, their exclusion depends on construing careful thinking as excluding jumping immediately to conclusions, suspending judgment no matter how strong the evidence, reasoning from an unquestioned ideological or religious perspective, and routinely using an algorithm to answer a question.

If the core of critical thinking is careful goal-directed thinking, conceptions of it can vary according to its presumed scope, its presumed goal, one’s criteria and threshold for being careful, and the thinking component on which one focuses. As to its scope, some conceptions (e.g., Dewey 1910, 1933) restrict it to constructive thinking on the basis of one’s own observations and experiments, others (e.g., Ennis 1962; Fisher & Scriven 1997; Johnson 1992) to appraisal of the products of such thinking. Ennis (1991) and Bailin et al. (1999b) take it to cover both construction and appraisal. As to its goal, some conceptions restrict it to forming a judgment (Dewey 1910, 1933; Lipman 1987; Facione 1990a). Others allow for actions as well as beliefs as the end point of a process of critical thinking (Ennis 1991; Bailin et al. 1999b). As to the criteria and threshold for being careful, definitions vary in the term used to indicate that critical thinking satisfies certain norms: “intellectually disciplined” (Scriven & Paul 1987), “reasonable” (Ennis 1991), “skillful” (Lipman 1987), “skilled” (Fisher & Scriven 1997), “careful” (Bailin & Battersby 2009). Some definitions specify these norms, referring variously to “consideration of any belief or supposed form of knowledge in the light of the grounds that support it and the further conclusions to which it tends” (Dewey 1910, 1933); “the methods of logical inquiry and reasoning” (Glaser 1941); “conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication” (Scriven & Paul 1987); the requirement that “it is sensitive to context, relies on criteria, and is self-correcting” (Lipman 1987); “evidential, conceptual, methodological, criteriological, or contextual considerations” (Facione 1990a); and “plus-minus considerations of the product in terms of appropriate standards (or criteria)” (Johnson 1992).
Stanovich and Stanovich (2010) propose to ground the concept of critical thinking in the concept of rationality, which they understand as combining epistemic rationality (fitting one’s beliefs to the world) and instrumental rationality (optimizing goal fulfillment); a critical thinker, in their view, is someone with “a propensity to override suboptimal responses from the autonomous mind” (2010: 227). These variant specifications of norms for critical thinking are not necessarily incompatible with one another, and in any case presuppose the core notion of thinking carefully. As to the thinking component singled out, some definitions focus on suspension of judgment during the thinking (Dewey 1910; McPeck 1981), others on inquiry while judgment is suspended (Bailin & Battersby 2009), others on the resulting judgment (Facione 1990a), and still others on the subsequent emotive response (Siegel 1988).

In educational contexts, a definition of critical thinking is a “programmatic definition” (Scheffler 1960: 19). It expresses a practical program for achieving an educational goal. For this purpose, a one-sentence formulaic definition is much less useful than articulation of a critical thinking process, with criteria and standards for the kinds of thinking that the process may involve. The real educational goal is recognition, adoption and implementation by students of those criteria and standards. That adoption and implementation in turn consists in acquiring the knowledge, abilities and dispositions of a critical thinker.

Conceptions of critical thinking generally do not include moral integrity as part of the concept. Dewey, for example, took critical thinking to be the ultimate intellectual goal of education, but distinguished it from the development of social cooperation among school children, which he took to be the central moral goal. Ennis (1996, 2011) added to his previous list of critical thinking dispositions a group of dispositions to care about the dignity and worth of every person, which he described as a “correlative” (1996) disposition without which critical thinking would be less valuable and perhaps harmful. An educational program that aimed at developing critical thinking but not the correlative disposition to care about the dignity and worth of every person, he asserted, “would be deficient and perhaps dangerous” (Ennis 1996: 172).

Dewey thought that education for reflective thinking would be of value to both the individual and society; recognition in educational practice of the kinship to the scientific attitude of children’s native curiosity, fertile imagination and love of experimental inquiry “would make for individual happiness and the reduction of social waste” (Dewey 1910: iii). Schools participating in the Eight-Year Study took development of the habit of reflective thinking and skill in solving problems as a means to leading young people to understand, appreciate and live the democratic way of life characteristic of the United States (Aikin 1942: 17–18, 81). Harvey Siegel (1988: 55–61) has offered four considerations in support of adopting critical thinking as an educational ideal. (1) Respect for persons requires that schools and teachers honour students’ demands for reasons and explanations, deal with students honestly, and recognize the need to confront students’ independent judgment; these requirements concern the manner in which teachers treat students. (2) Education has the task of preparing children to be successful adults, a task that requires development of their self-sufficiency. (3) Education should initiate children into the rational traditions in such fields as history, science and mathematics. (4) Education should prepare children to become democratic citizens, which requires reasoned procedures and critical talents and attitudes. To supplement these considerations, Siegel (1988: 62–90) responds to two objections: the ideology objection that adoption of any educational ideal requires a prior ideological commitment and the indoctrination objection that cultivation of critical thinking cannot escape being a form of indoctrination.

Despite the diversity of our 11 examples, one can recognize a common pattern. Dewey analyzed it as consisting of five phases:

  • suggestions , in which the mind leaps forward to a possible solution;
  • an intellectualization of the difficulty or perplexity into a problem to be solved, a question for which the answer must be sought;
  • the use of one suggestion after another as a leading idea, or hypothesis , to initiate and guide observation and other operations in collection of factual material;
  • the mental elaboration of the idea or supposition as an idea or supposition ( reasoning , in the sense in which reasoning is a part, not the whole, of inference); and
  • testing the hypothesis by overt or imaginative action. (Dewey 1933: 106–107; italics in original)

The process of reflective thinking consisting of these phases would be preceded by a perplexed, troubled or confused situation and followed by a cleared-up, unified, resolved situation (Dewey 1933: 106). The term ‘phases’ replaced the term ‘steps’ (Dewey 1910: 72), thus removing the earlier suggestion of an invariant sequence. Variants of the above analysis appeared in (Dewey 1916: 177) and (Dewey 1938: 101–119).

The variant formulations indicate the difficulty of giving a single logical analysis of such a varied process. The process of critical thinking may have a spiral pattern, with the problem being redefined in the light of obstacles to solving it as originally formulated. For example, the person in Transit might have concluded that getting to the appointment at the scheduled time was impossible and have reformulated the problem as that of rescheduling the appointment for a mutually convenient time. Further, defining a problem does not always follow after or lead immediately to an idea of a suggested solution. Nor should it do so, as Dewey himself recognized in describing the physician in Typhoid as avoiding any strong preference for this or that conclusion before getting further information (Dewey 1910: 85; 1933: 170). People with a hypothesis in mind, even one to which they have a very weak commitment, have a so-called “confirmation bias” (Nickerson 1998): they are likely to pay attention to evidence that confirms the hypothesis and to ignore evidence that counts against it or for some competing hypothesis. Detectives, intelligence agencies, and investigators of airplane accidents are well advised to gather relevant evidence systematically and to postpone even tentative adoption of an explanatory hypothesis until the collected evidence rules out with the appropriate degree of certainty all but one explanation. Dewey’s analysis of the critical thinking process can be faulted as well for requiring acceptance or rejection of a possible solution to a defined problem, with no allowance for deciding in the light of the available evidence to suspend judgment. Further, given the great variety of kinds of problems for which reflection is appropriate, there is likely to be variation in its component events. 

Perhaps the best way to conceptualize the critical thinking process is as a checklist whose component events can occur in a variety of orders, selectively, and more than once. These component events might include:

  • noticing a difficulty;
  • defining the problem;
  • dividing the problem into manageable sub-problems;
  • formulating a variety of possible solutions to the problem or sub-problem;
  • determining what evidence is relevant to deciding among possible solutions to the problem or sub-problem;
  • devising a plan of systematic observation or experiment that will uncover the relevant evidence;
  • carrying out the plan of systematic observation or experimentation;
  • noting the results of the systematic observation or experiment;
  • gathering relevant testimony and information from others;
  • judging the credibility of testimony and information gathered from others;
  • drawing conclusions from gathered evidence and accepted testimony; and
  • accepting a solution that the evidence adequately supports (cf. Hitchcock 2017: 485).

Checklist conceptions of the process of critical thinking are open to the objection that they are too mechanical and procedural to fit the multi-dimensional and emotionally charged issues for which critical thinking is urgently needed (Paul 1984). For such issues, a more dialectical process is advocated, in which competing relevant world views are identified, their implications explored, and some sort of creative synthesis attempted.

If one considers the critical thinking process illustrated by the 11 examples, one can identify distinct kinds of mental acts and mental states that form part of it. To distinguish, label and briefly characterize these components is a useful preliminary to identifying abilities, skills, dispositions, attitudes, habits and the like that contribute causally to thinking critically. Identifying such abilities and habits is in turn a useful preliminary to setting educational goals. Setting the goals is in its turn a useful preliminary to designing strategies for helping learners to achieve the goals and to designing ways of measuring the extent to which learners have done so. Such measures provide both feedback to learners on their achievement and a basis for experimental research on the effectiveness of various strategies for educating people to think critically. Let us begin, then, by distinguishing the kinds of mental acts and mental events that can occur in a critical thinking process.

  • Observing : One notices something in one’s immediate environment (sudden cooling of temperature in Weather , bubbles forming outside a glass and then going inside in Bubbles , a moving blur in the distance in Blur , a rash in Rash ). Or one notes the results of an experiment or systematic observation (valuables missing in Disorder , no suction without air pressure in Suction pump ).
  • Feeling : One feels puzzled or uncertain about something (how to get to an appointment on time in Transit , why the diamonds vary in frequency in Diamond ). One wants to resolve this perplexity. One feels satisfaction once one has worked out an answer (to take the subway express in Transit , diamonds closer when needed as a warning in Diamond ).
  • Wondering : One formulates a question to be addressed (why bubbles form outside a tumbler taken from hot water in Bubbles , how suction pumps work in Suction pump , what caused the rash in Rash ).
  • Imagining : One thinks of possible answers (bus or subway or elevated in Transit , flagpole or ornament or wireless communication aid or direction indicator in Ferryboat , allergic reaction or heat rash in Rash ).
  • Inferring : One works out what would be the case if a possible answer were assumed (valuables missing if there has been a burglary in Disorder , earlier start to the rash if it is an allergic reaction to a sulfa drug in Rash ). Or one draws a conclusion once sufficient relevant evidence is gathered (take the subway in Transit , burglary in Disorder , discontinue blood pressure medication and new cream in Rash ).
  • Knowledge : One uses stored knowledge of the subject-matter to generate possible answers or to infer what would be expected on the assumption of a particular answer (knowledge of a city’s public transit system in Transit , of the requirements for a flagpole in Ferryboat , of Boyle’s law in Bubbles , of allergic reactions in Rash ).
  • Experimenting : One designs and carries out an experiment or a systematic observation to find out whether the results deduced from a possible answer will occur (looking at the location of the flagpole in relation to the pilot’s position in Ferryboat , putting an ice cube on top of a tumbler taken from hot water in Bubbles , measuring the height to which a suction pump will draw water at different elevations in Suction pump , noticing the frequency of diamonds when movement to or from a diamond lane is allowed in Diamond ).
  • Consulting : One finds a source of information, gets the information from the source, and makes a judgment on whether to accept it. None of our 11 examples include searching for sources of information. In this respect they are unrepresentative, since most people nowadays have almost instant access to information relevant to answering any question, including many of those illustrated by the examples. However, Candidate includes the activities of extracting information from sources and evaluating its credibility.
  • Identifying and analyzing arguments : One notices an argument and works out its structure and content as a preliminary to evaluating its strength. This activity is central to Candidate . It is an important part of a critical thinking process in which one surveys arguments for various positions on an issue.
  • Judging : One makes a judgment on the basis of accumulated evidence and reasoning, such as the judgment in Ferryboat that the purpose of the pole is to provide direction to the pilot.
  • Deciding : One makes a decision on what to do or on what policy to adopt, as in the decision in Transit to take the subway.

By definition, a person who does something voluntarily is both willing and able to do that thing at that time. Both the willingness and the ability contribute causally to the person’s action, in the sense that the voluntary action would not occur if either (or both) of these were lacking. For example, suppose that one is standing with one’s arms at one’s sides and one voluntarily lifts one’s right arm to an extended horizontal position. One would not do so if one were unable to lift one’s arm, if for example one’s right side was paralyzed as the result of a stroke. Nor would one do so if one were unwilling to lift one’s arm, if for example one were participating in a street demonstration at which a white supremacist was urging the crowd to lift their right arm in a Nazi salute and one were unwilling to express support in this way for the racist Nazi ideology. The same analysis applies to a voluntary mental process of thinking critically. It requires both willingness and ability to think critically, including willingness and ability to perform each of the mental acts that compose the process and to coordinate those acts in a sequence that is directed at resolving the initiating perplexity.

Consider willingness first. We can identify causal contributors to willingness to think critically by considering factors that would cause a person who was able to think critically about an issue nevertheless not to do so (Hamby 2014). For each factor, the opposite condition thus contributes causally to willingness to think critically on a particular occasion. For example, people who habitually jump to conclusions without considering alternatives will not think critically about issues that arise, even if they have the required abilities. The contrary condition of willingness to suspend judgment is thus a causal contributor to thinking critically.

Now consider ability. In contrast to the ability to move one’s arm, which can be completely absent because a stroke has left the arm paralyzed, the ability to think critically is a developed ability, whose absence is not a complete absence of ability to think but absence of ability to think well. We can identify the ability to think well directly, in terms of the norms and standards for good thinking. In general, to be able to do well the thinking activities that can be components of a critical thinking process, one needs to know the concepts and principles that characterize their good performance, to recognize in particular cases that the concepts and principles apply, and to apply them. The knowledge, recognition and application may be procedural rather than declarative. It may be domain-specific rather than widely applicable, and in either case may need subject-matter knowledge, sometimes of a deep kind.

Reflections of the sort illustrated by the previous two paragraphs have led scholars to identify the knowledge, abilities and dispositions of a “critical thinker”, i.e., someone who thinks critically whenever it is appropriate to do so. We turn now to these three types of causal contributors to thinking critically. We start with dispositions, since arguably these are the most powerful contributors to being a critical thinker, can be fostered at an early stage of a child’s development, and are susceptible to general improvement (Glaser 1941: 175).

8. Critical Thinking Dispositions

Educational researchers use the term ‘dispositions’ broadly for the habits of mind and attitudes that contribute causally to being a critical thinker. Some writers (e.g., Paul & Elder 2006; Hamby 2014; Bailin & Battersby 2016) propose to use the term ‘virtues’ for this dimension of a critical thinker. The virtues in question, although they are virtues of character, concern the person’s ways of thinking rather than the person’s ways of behaving towards others. They are not moral virtues but intellectual virtues, of the sort articulated by Zagzebski (1996) and discussed by Turri, Alfano, and Greco (2017).

On a realistic conception, thinking dispositions or intellectual virtues are real properties of thinkers. They are general tendencies, propensities, or inclinations to think in particular ways in particular circumstances, and can be genuinely explanatory (Siegel 1999). Sceptics argue that there is no evidence for a specific mental basis for the habits of mind that contribute to thinking critically, and that it is pedagogically misleading to posit such a basis (Bailin et al. 1999a). Whatever their status, critical thinking dispositions need motivation for their initial formation in a child—motivation that may be external or internal. As children develop, the force of habit will gradually become important in sustaining the disposition (Nieto & Valenzuela 2012). Mere force of habit, however, is unlikely to sustain critical thinking dispositions. Critical thinkers must value and enjoy using their knowledge and abilities to think things through for themselves. They must be committed to, and lovers of, inquiry.

A person may have a critical thinking disposition with respect to only some kinds of issues. For example, one could be open-minded about scientific issues but not about religious issues. Similarly, one could be confident in one’s ability to reason about the theological implications of the existence of evil in the world but not in one’s ability to reason about the best design for a guided ballistic missile.

Critical thinking dispositions can usefully be divided into initiating dispositions (those that contribute causally to starting to think critically about an issue) and internal dispositions (those that contribute causally to doing a good job of thinking critically once one has started) (Facione 1990a: 25). The two categories are not mutually exclusive. For example, open-mindedness, in the sense of willingness to consider alternative points of view to one’s own, is both an initiating and an internal disposition.

Using the strategy of considering factors that would block people with the ability to think critically from doing so, we can identify as initiating dispositions for thinking critically attentiveness, a habit of inquiry, self-confidence, courage, open-mindedness, willingness to suspend judgment, trust in reason, wanting evidence for one’s beliefs, and seeking the truth. We consider briefly what each of these dispositions amounts to, in each case citing sources that acknowledge them.

  • Attentiveness : One will not think critically if one fails to recognize an issue that needs to be thought through. For example, the pedestrian in Weather would not have looked up if he had not noticed that the air was suddenly cooler. To be a critical thinker, then, one needs to be habitually attentive to one’s surroundings, noticing not only what one senses but also sources of perplexity in messages received and in one’s own beliefs and attitudes (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Habit of inquiry : Inquiry is effortful, and one needs an internal push to engage in it. For example, the student in Bubbles could easily have stopped at idle wondering about the cause of the bubbles rather than reasoning to a hypothesis, then designing and executing an experiment to test it. Thus willingness to think critically needs mental energy and initiative. What can supply that energy? Love of inquiry, or perhaps just a habit of inquiry. Hamby (2015) has argued that willingness to inquire is the central critical thinking virtue, one that encompasses all the others. It is recognized as a critical thinking disposition by Dewey (1910: 29; 1933: 35), Glaser (1941: 5), Ennis (1987: 12; 1991: 8), Facione (1990a: 25), Bailin et al. (1999b: 294), Halpern (1998: 452), and Facione, Facione, & Giancarlo (2001).
  • Self-confidence : Lack of confidence in one’s abilities can block critical thinking. For example, if the woman in Rash lacked confidence in her ability to figure things out for herself, she might just have assumed that the rash on her chest was the allergic reaction to her medication against which the pharmacist had warned her. Thus willingness to think critically requires confidence in one’s ability to inquire (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Courage : Fear of thinking for oneself can stop one from doing it. Thus willingness to think critically requires intellectual courage (Paul & Elder 2006: 16).
  • Open-mindedness : A dogmatic attitude will impede thinking critically. For example, a person who adheres rigidly to a “pro-choice” position on the issue of the legal status of induced abortion is likely to be unwilling to consider seriously the issue of when in its development an unborn child acquires a moral right to life. Thus willingness to think critically requires open-mindedness, in the sense of a willingness to examine questions to which one already accepts an answer but which further evidence or reasoning might cause one to answer differently (Dewey 1933; Facione 1990a; Ennis 1991; Bailin et al. 1999b; Halpern 1998, Facione, Facione, & Giancarlo 2001). Paul (1981) emphasizes open-mindedness about alternative world-views, and recommends a dialectical approach to integrating such views as central to what he calls “strong sense” critical thinking.
  • Willingness to suspend judgment : Premature closure on an initial solution will block critical thinking. Thus willingness to think critically requires a willingness to suspend judgment while alternatives are explored (Facione 1990a; Ennis 1991; Halpern 1998).
  • Trust in reason : Since distrust in the processes of reasoned inquiry will dissuade one from engaging in it, trust in them is an initiating critical thinking disposition (Facione 1990a: 25; Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001; Paul & Elder 2006). In reaction to an allegedly exclusive emphasis on reason in critical thinking theory and pedagogy, Thayer-Bacon (2000) argues that intuition, imagination, and emotion have important roles to play in an adequate conception of critical thinking that she calls “constructive thinking”. From her point of view, critical thinking requires trust not only in reason but also in intuition, imagination, and emotion.
  • Seeking the truth : If one does not care about the truth but is content to stick with one’s initial bias on an issue, then one will not think critically about it. Seeking the truth is thus an initiating critical thinking disposition (Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001). A disposition to seek the truth is implicit in more specific critical thinking dispositions, such as trying to be well-informed, considering seriously points of view other than one’s own, looking for alternatives, suspending judgment when the evidence is insufficient, and adopting a position when the evidence supporting it is sufficient.

Some of the initiating dispositions, such as open-mindedness and willingness to suspend judgment, are also internal critical thinking dispositions, in the sense of mental habits or attitudes that contribute causally to doing a good job of critical thinking once one starts the process. But there are many other internal critical thinking dispositions. Some of them are parasitic on one’s conception of good thinking. For example, it is constitutive of good thinking about an issue to formulate the issue clearly and to maintain focus on it. For this purpose, one needs not only the corresponding ability but also the corresponding disposition. Ennis (1991: 8) describes it as the disposition “to determine and maintain focus on the conclusion or question”, Facione (1990a: 25) as “clarity in stating the question or concern”. Other internal dispositions are motivators to continue or adjust the critical thinking process, such as willingness to persist in a complex task and willingness to abandon nonproductive strategies in an attempt to self-correct (Halpern 1998: 452). For a list of identified internal critical thinking dispositions, see the Supplement on Internal Critical Thinking Dispositions .

Some theorists postulate skills, i.e., acquired abilities, as operative in critical thinking. It is not obvious, however, that a good mental act is the exercise of a generic acquired skill. Inferring an expected time of arrival, as in Transit , has some generic components but also uses non-generic subject-matter knowledge. Bailin et al. (1999a) argue against viewing critical thinking skills as generic and discrete, on the ground that skilled performance at a critical thinking task cannot be separated from knowledge of concepts and from domain-specific principles of good thinking. Talk of skills, they concede, is unproblematic if it means merely that a person with critical thinking skills is capable of intelligent performance.

Despite such scepticism, theorists of critical thinking have listed as general contributors to critical thinking what they variously call abilities (Glaser 1941; Ennis 1962, 1991), skills (Facione 1990a; Halpern 1998) or competencies (Fisher & Scriven 1997). Amalgamating these lists would produce a confusing and chaotic cornucopia of more than 50 possible educational objectives, with only partial overlap among them. It makes sense instead to try to understand the reasons for the multiplicity and diversity, and to make a selection according to one’s own reasons for singling out abilities to be developed in a critical thinking curriculum. Two reasons for diversity among lists of critical thinking abilities are the underlying conception of critical thinking and the envisaged educational level. Appraisal-only conceptions, for example, involve a different suite of abilities than constructive-only conceptions. Some lists, such as those in (Glaser 1941), are put forward as educational objectives for secondary school students, whereas others are proposed as objectives for college students (e.g., Facione 1990a).

The abilities described in the remaining paragraphs of this section emerge from reflection on the general abilities needed to do well the thinking activities identified in section 6 as components of the critical thinking process described in section 5 . The derivation of each collection of abilities is accompanied by citation of sources that list such abilities and of standardized tests that claim to test them.

Observational abilities : Careful and accurate observation sometimes requires specialist expertise and practice, as in the case of observing birds and observing accident scenes. However, there are general abilities of noticing what one’s senses are picking up from one’s environment and of being able to articulate clearly and accurately to oneself and others what one has observed. It helps in exercising them to be able to recognize and take into account factors that make one’s observation less trustworthy, such as prior framing of the situation, inadequate time, deficient senses, poor observation conditions, and the like. It helps as well to be skilled at taking steps to make one’s observation more trustworthy, such as moving closer to get a better look, measuring something three times and taking the average, and checking what one thinks one is observing with someone else who is in a good position to observe it. It also helps to be skilled at recognizing respects in which one’s report of one’s observation involves inference rather than direct observation, so that one can then consider whether the inference is justified. These abilities come into play as well when one thinks about whether and with what degree of confidence to accept an observation report, for example in the study of history or in a criminal investigation or in assessing news reports. Observational abilities show up in some lists of critical thinking abilities (Ennis 1962: 90; Facione 1990a: 16; Ennis 1991: 9). There are items testing a person’s ability to judge the credibility of observation reports in the Cornell Critical Thinking Tests, Levels X and Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). Norris and King (1983, 1985, 1990a, 1990b) is a test of ability to appraise observation reports.

Emotional abilities : The emotions that drive a critical thinking process are perplexity or puzzlement, a wish to resolve it, and satisfaction at achieving the desired resolution. Children experience these emotions at an early age, without being trained to do so. Education that takes critical thinking as a goal needs only to channel these emotions and to make sure not to stifle them. Collaborative critical thinking benefits from ability to recognize one’s own and others’ emotional commitments and reactions.

Questioning abilities : A critical thinking process needs transformation of an inchoate sense of perplexity into a clear question. Formulating a question well requires not building in questionable assumptions, not prejudging the issue, and using language that in context is unambiguous and precise enough (Ennis 1962: 97; 1991: 9).

Imaginative abilities : Thinking directed at finding the correct causal explanation of a general phenomenon or particular event requires an ability to imagine possible explanations. Thinking about what policy or plan of action to adopt requires generation of options and consideration of possible consequences of each option. Domain knowledge is required for such creative activity, but a general ability to imagine alternatives is helpful and can be nurtured so as to become easier, quicker, more extensive, and deeper (Dewey 1910: 34–39; 1933: 40–47). Facione (1990a) and Halpern (1998) include the ability to imagine alternatives as a critical thinking ability.

Inferential abilities : The ability to draw conclusions from given information, and to recognize with what degree of certainty one’s own or others’ conclusions follow, is universally recognized as a general critical thinking ability. All 11 examples in section 2 of this article include inferences, some from hypotheses or options (as in Transit , Ferryboat and Disorder ), others from something observed (as in Weather and Rash ). None of these inferences is formally valid. Rather, they are licensed by general, sometimes qualified substantive rules of inference (Toulmin 1958) that rest on domain knowledge—that a bus trip takes about the same time in each direction, that the terminal of a wireless telegraph would be located on the highest possible place, that sudden cooling is often followed by rain, that an allergic reaction to a sulfa drug generally shows up soon after one starts taking it. It is a matter of controversy to what extent the specialized ability to deduce conclusions from premisses using formal rules of inference is needed for critical thinking. Dewey (1933) locates logical forms in setting out the products of reflection rather than in the process of reflection. Ennis (1981a), on the other hand, maintains that a liberally-educated person should have the following abilities: to translate natural-language statements into statements using the standard logical operators, to use appropriately the language of necessary and sufficient conditions, to deal with argument forms and arguments containing symbols, to determine whether in virtue of an argument’s form its conclusion follows necessarily from its premisses, to reason with logically complex propositions, and to apply the rules and procedures of deductive logic. Inferential abilities are recognized as critical thinking abilities by Glaser (1941: 6), Facione (1990a: 9), Ennis (1991: 9), Fisher & Scriven (1997: 99, 111), and Halpern (1998: 452). 
Items testing inferential abilities constitute two of the five subtests of the Watson Glaser Critical Thinking Appraisal (Watson & Glaser 1980a, 1980b, 1994), two of the four sections in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), three of the seven sections in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), 11 of the 34 items on Forms A and B of the California Critical Thinking Skills Test (Facione 1990b, 1992), and a high but variable proportion of the 25 selected-response questions in the Collegiate Learning Assessment (Council for Aid to Education 2017).

Experimenting abilities : Knowing how to design and execute an experiment is important not just in scientific research but also in everyday life, as in Rash . Dewey devoted a whole chapter of his How We Think (1910: 145–156; 1933: 190–202) to the superiority of experimentation over observation in advancing knowledge. Experimenting abilities come into play at one remove in appraising reports of scientific studies. Skill in designing and executing experiments includes the acknowledged abilities to appraise evidence (Glaser 1941: 6), to carry out experiments and to apply appropriate statistical inference techniques (Facione 1990a: 9), to judge inductions to an explanatory hypothesis (Ennis 1991: 9), and to recognize the need for an adequately large sample size (Halpern 1998). The Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) includes four items (out of 52) on experimental design. The Collegiate Learning Assessment (Council for Aid to Education 2017) makes room for appraisal of study design in both its performance task and its selected-response questions.

Consulting abilities : Skill at consulting sources of information comes into play when one seeks information to help resolve a problem, as in Candidate . Ability to find and appraise information includes ability to gather and marshal pertinent information (Glaser 1941: 6), to judge whether a statement made by an alleged authority is acceptable (Ennis 1962: 84), to plan a search for desired information (Facione 1990a: 9), and to judge the credibility of a source (Ennis 1991: 9). Ability to judge the credibility of statements is tested by 24 items (out of 76) in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) and by four items (out of 52) in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). The Collegiate Learning Assessment’s performance task requires evaluation of whether information in documents is credible or unreliable (Council for Aid to Education 2017).

Argument analysis abilities : The ability to identify and analyze arguments contributes to the process of surveying arguments on an issue in order to form one’s own reasoned judgment, as in Candidate . The ability to detect and analyze arguments is recognized as a critical thinking skill by Facione (1990a: 7–8), Ennis (1991: 9) and Halpern (1998). Five items (out of 34) on the California Critical Thinking Skills Test (Facione 1990b, 1992) test skill at argument analysis. The Collegiate Learning Assessment (Council for Aid to Education 2017) incorporates argument analysis in its selected-response tests of critical reading and evaluation and of critiquing an argument.

Judging skills and deciding skills: Skill at judging and deciding is skill at recognizing what judgment or decision the available evidence and argument supports, and with what degree of confidence. It is thus a component of the inferential skills already discussed.

Lists and tests of critical thinking abilities often include two more abilities: identifying assumptions and constructing and evaluating definitions.

In addition to dispositions and abilities, critical thinking needs knowledge: of critical thinking concepts, of critical thinking principles, and of the subject-matter of the thinking.

We can derive a short list of concepts whose understanding contributes to critical thinking from the critical thinking abilities described in the preceding section. Observational abilities require an understanding of the difference between observation and inference. Questioning abilities require an understanding of the concepts of ambiguity and vagueness. Inferential abilities require an understanding of the difference between conclusive and defeasible inference (traditionally, between deduction and induction), as well as of the difference between necessary and sufficient conditions. Experimenting abilities require an understanding of the concepts of hypothesis, null hypothesis, assumption and prediction, as well as of the concept of statistical significance and of its difference from importance. They also require an understanding of the difference between an experiment and an observational study, and in particular of the difference between a randomized controlled trial, a prospective correlational study and a retrospective (case-control) study. Argument analysis abilities require an understanding of the concepts of argument, premiss, assumption, conclusion and counter-consideration. Additional critical thinking concepts are proposed by Bailin et al. (1999b: 293), Fisher & Scriven (1997: 105–106), and Black (2012).

According to Glaser (1941: 25), ability to think critically requires knowledge of the methods of logical inquiry and reasoning. If we review the list of abilities in the preceding section, however, we can see that some of them can be acquired and exercised merely through practice, possibly guided in an educational setting, followed by feedback. Searching intelligently for a causal explanation of some phenomenon or event requires that one consider a full range of possible causal contributors, but it seems more important that one implements this principle in one’s practice than that one is able to articulate it. What is important is “operational knowledge” of the standards and principles of good thinking (Bailin et al. 1999b: 291–293). But the development of such critical thinking abilities as designing an experiment or constructing an operational definition can benefit from learning their underlying theory. Further, explicit knowledge of quirks of human thinking seems useful as a cautionary guide. Human memory is not just fallible about details, as people learn from their own experiences of misremembering, but is so malleable that a detailed, clear and vivid recollection of an event can be a total fabrication (Loftus 2017). People seek or interpret evidence in ways that are partial to their existing beliefs and expectations, often unconscious of their “confirmation bias” (Nickerson 1998). Not only are people subject to this and other cognitive biases (Kahneman 2011), of which they are typically unaware, but it may be counter-productive for one to make oneself aware of them and try consciously to counteract them or to counteract social biases such as racial or sexual stereotypes (Kenyon & Beaulac 2014). 
It is helpful to be aware of these facts and of the superior effectiveness of blocking the operation of biases—for example, by making an immediate record of one’s observations, refraining from forming a preliminary explanatory hypothesis, blind refereeing, double-blind randomized trials, and blind grading of students’ work.

Critical thinking about an issue requires substantive knowledge of the domain to which the issue belongs. Critical thinking abilities are not a magic elixir that can be applied to any issue whatever by somebody who has no knowledge of the facts relevant to exploring that issue. For example, the student in Bubbles needed to know that gases do not penetrate solid objects like a glass, that air expands when heated, that the volume of an enclosed gas varies directly with its temperature and inversely with its pressure, and that hot objects will spontaneously cool down to the ambient temperature of their surroundings unless kept hot by insulation or a source of heat. Critical thinkers thus need a rich fund of subject-matter knowledge relevant to the variety of situations they encounter. This fact is recognized in the inclusion among critical thinking dispositions of a concern to become and remain generally well informed.

Experimental educational interventions, with control groups, have shown that education can improve critical thinking skills and dispositions, as measured by standardized tests. For information about these tests, see the Supplement on Assessment .

What educational methods are most effective at developing the dispositions, abilities and knowledge of a critical thinker? Abrami et al. (2015) found that, in the experimental and quasi-experimental studies they analyzed, dialogue, anchored instruction, and mentoring each increased the effectiveness of the educational intervention, and that they were most effective when combined. They also found that in these studies a combination of separate instruction in critical thinking with subject-matter instruction in which students are encouraged to think critically was more effective than either by itself. However, the difference was not statistically significant; that is, it might have arisen by chance.

Most of these studies lack the longitudinal follow-up required to determine whether the observed differential improvements in critical thinking abilities or dispositions continue over time, for example until high school or college graduation. For details on studies of methods of developing critical thinking skills and dispositions, see the Supplement on Educational Methods .

12. Controversies

Scholars have denied the generalizability of critical thinking abilities across subject domains, have alleged bias in critical thinking theory and pedagogy, and have investigated the relationship of critical thinking to other kinds of thinking.

McPeck (1981) attacked the thinking skills movement of the 1970s, including the critical thinking movement. He argued that there are no general thinking skills, since thinking is always thinking about some subject-matter. It is futile, he claimed, for schools and colleges to teach thinking as if it were a separate subject. Rather, teachers should lead their pupils to become autonomous thinkers by teaching school subjects in a way that brings out their cognitive structure and that encourages and rewards discussion and argument. As some of his critics (e.g., Paul 1985; Siegel 1985) pointed out, McPeck’s central argument needs elaboration, since it has obvious counter-examples in writing and speaking, for which (up to a certain level of complexity) there are teachable general abilities even though they are always about some subject-matter. To make his argument convincing, McPeck needs to explain how thinking differs from writing and speaking in a way that does not permit useful abstraction of its components from the subject-matters with which it deals. He has not done so. Nevertheless, his position that the dispositions and abilities of a critical thinker are best developed in the context of subject-matter instruction is shared by many theorists of critical thinking, including Dewey (1910, 1933), Glaser (1941), Passmore (1980), Weinstein (1990), and Bailin et al. (1999b).

McPeck’s challenge prompted reflection on the extent to which critical thinking is subject-specific. McPeck argued for a strong subject-specificity thesis, according to which it is a conceptual truth that all critical thinking abilities are specific to a subject. (He did not however extend his subject-specificity thesis to critical thinking dispositions. In particular, he took the disposition to suspend judgment in situations of cognitive dissonance to be a general disposition.) Conceptual subject-specificity is subject to obvious counter-examples, such as the general ability to recognize confusion of necessary and sufficient conditions. A more modest thesis, also endorsed by McPeck, is epistemological subject-specificity, according to which the norms of good thinking vary from one field to another. Epistemological subject-specificity clearly holds to a certain extent; for example, the principles in accordance with which one solves a differential equation are quite different from the principles in accordance with which one determines whether a painting is a genuine Picasso. But the thesis suffers, as Ennis (1989) points out, from vagueness of the concept of a field or subject and from the obvious existence of inter-field principles, however broadly the concept of a field is construed. For example, the principles of hypothetico-deductive reasoning hold for all the varied fields in which such reasoning occurs. A third kind of subject-specificity is empirical subject-specificity, according to which as a matter of empirically observable fact a person with the abilities and dispositions of a critical thinker in one area of investigation will not necessarily have them in another area of investigation.

The thesis of empirical subject-specificity raises the general problem of transfer. If critical thinking abilities and dispositions have to be developed independently in each school subject, how are they of any use in dealing with the problems of everyday life and the political and social issues of contemporary society, most of which do not fit into the framework of a traditional school subject? Proponents of empirical subject-specificity tend to argue that transfer is more likely to occur if there is critical thinking instruction in a variety of domains, with explicit attention to dispositions and abilities that cut across domains. But evidence for this claim is scanty. There is a need for well-designed empirical studies that investigate the conditions that make transfer more likely.

It is common ground in debates about the generality or subject-specificity of critical thinking dispositions and abilities that critical thinking about any topic requires background knowledge about the topic. For example, the most sophisticated understanding of the principles of hypothetico-deductive reasoning is of no help unless accompanied by some knowledge of what might be plausible explanations of some phenomenon under investigation.

Critics have objected to bias in the theory, pedagogy and practice of critical thinking. Commentators (e.g., Alston 1995; Ennis 1998) have noted that anyone who takes a position has a bias in the neutral sense of being inclined in one direction rather than others. The critics, however, are objecting to bias in the pejorative sense of an unjustified favoring of certain ways of knowing over others, frequently alleging that the unjustly favored ways are those of a dominant sex or culture (Bailin 1995). These ways favor:

  • reinforcement of egocentric and sociocentric biases over dialectical engagement with opposing world-views (Paul 1981, 1984; Warren 1998)
  • distancing from the object of inquiry over closeness to it (Martin 1992; Thayer-Bacon 1992)
  • indifference to the situation of others over care for them (Martin 1992)
  • orientation to thought over orientation to action (Martin 1992)
  • being reasonable over caring to understand people’s ideas (Thayer-Bacon 1993)
  • being neutral and objective over being embodied and situated (Thayer-Bacon 1995a)
  • doubting over believing (Thayer-Bacon 1995b)
  • reason over emotion, imagination and intuition (Thayer-Bacon 2000)
  • solitary thinking over collaborative thinking (Thayer-Bacon 2000)
  • written and spoken assignments over other forms of expression (Alston 2001)
  • attention to written and spoken communications over attention to human problems (Alston 2001)
  • winning debates in the public sphere over making and understanding meaning (Alston 2001)

A common thread in this smorgasbord of accusations is dissatisfaction with focusing on the logical analysis and evaluation of reasoning and arguments. While these authors acknowledge that such analysis and evaluation is part of critical thinking and should be part of its conceptualization and pedagogy, they insist that it is only a part. Paul (1981), for example, bemoans the tendency of atomistic teaching of methods of analyzing and evaluating arguments to turn students into more able sophists, adept at finding fault with positions and arguments with which they disagree but even more entrenched in the egocentric and sociocentric biases with which they began. Martin (1992) and Thayer-Bacon (1992) cite with approval the self-reported intimacy with their subject-matter of leading researchers in biology and medicine, an intimacy that conflicts with the distancing allegedly recommended in standard conceptions and pedagogy of critical thinking. Thayer-Bacon (2000) contrasts the embodied and socially embedded learning of her elementary school students in a Montessori school, who used their imagination, intuition and emotions as well as their reason, with conceptions of critical thinking as

thinking that is used to critique arguments, offer justifications, and make judgments about what are the good reasons, or the right answers. (Thayer-Bacon 2000: 127–128)

Alston (2001) reports that her students in a women’s studies class were able to see the flaws in the Cinderella myth that pervades much romantic fiction but in their own romantic relationships still acted as if all failures were the woman’s fault and still accepted the notions of love at first sight and living happily ever after. Students, she writes, should

be able to connect their intellectual critique to a more affective, somatic, and ethical account of making risky choices that have sexist, racist, classist, familial, sexual, or other consequences for themselves and those both near and far… critical thinking that reads arguments, texts, or practices merely on the surface without connections to feeling/desiring/doing or action lacks an ethical depth that should infuse the difference between mere cognitive activity and something we want to call critical thinking. (Alston 2001: 34)

Some critics portray such biases as unfair to women. Thayer-Bacon (1992), for example, has charged modern critical thinking theory with being sexist, on the ground that it separates the self from the object and causes one to lose touch with one’s inner voice, and thus stigmatizes women, who (she asserts) link self to object and listen to their inner voice. Her charge does not imply that women as a group are on average less able than men to analyze and evaluate arguments. Facione (1990c) found no difference by sex in performance on his California Critical Thinking Skills Test. Kuhn (1991: 280–281) found no difference by sex in either the disposition or the competence to engage in argumentative thinking.

The critics propose a variety of remedies for the biases that they allege. In general, they do not propose to eliminate or downplay critical thinking as an educational goal. Rather, they propose to conceptualize critical thinking differently and to change its pedagogy accordingly. Their pedagogical proposals arise logically from their objections. They can be summarized as follows:

  • Focus on argument networks with dialectical exchanges reflecting contesting points of view rather than on atomic arguments, so as to develop “strong sense” critical thinking that transcends egocentric and sociocentric biases (Paul 1981, 1984).
  • Foster closeness to the subject-matter and feeling connected to others in order to inform a humane democracy (Martin 1992).
  • Develop “constructive thinking” as a social activity in a community of physically embodied and socially embedded inquirers with personal voices who value not only reason but also imagination, intuition and emotion (Thayer-Bacon 2000).
  • In developing critical thinking in school subjects, treat as important neither skills nor dispositions but opening worlds of meaning (Alston 2001).
  • Attend to the development of critical thinking dispositions as well as skills, and adopt the “critical pedagogy” practised and advocated by Freire (1968 [1970]) and hooks (1994) (Dalgleish, Girard, & Davies 2017).

A common thread in these proposals is treatment of critical thinking as a social, interactive, personally engaged activity like that of a quilting bee or a barn-raising (Thayer-Bacon 2000) rather than as an individual, solitary, distanced activity symbolized by Rodin’s The Thinker. One can get a vivid description of education with the former type of goal from the writings of bell hooks (1994, 2010). Critical thinking for her is open-minded dialectical exchange across opposing standpoints and from multiple perspectives, a conception similar to Paul’s “strong sense” critical thinking (Paul 1981). She abandons the structure of domination in the traditional classroom. In an introductory course on black women writers, for example, she assigns students to write an autobiographical paragraph about an early racial memory, then to read it aloud as the others listen, thus affirming the uniqueness and value of each voice and creating a communal awareness of the diversity of the group’s experiences (hooks 1994: 84). Her “engaged pedagogy” is thus similar to the “freedom under guidance” implemented in John Dewey’s Laboratory School of Chicago in the late 1890s and early 1900s. It incorporates the dialogue, anchored instruction, and mentoring that Abrami et al. (2015) found to be most effective in improving critical thinking skills and dispositions.

What is the relationship of critical thinking to problem solving, decision-making, higher-order thinking, creative thinking, and other recognized types of thinking? One’s answer to this question obviously depends on how one defines the terms used in the question. If critical thinking is conceived broadly to cover any careful thinking about any topic for any purpose, then problem solving and decision making will be kinds of critical thinking, if they are done carefully. Historically, ‘critical thinking’ and ‘problem solving’ were two names for the same thing. If critical thinking is conceived more narrowly as consisting solely of appraisal of intellectual products, then it will be disjoint from problem solving and decision making, which are constructive.

Bloom’s taxonomy of educational objectives used the phrase “intellectual abilities and skills” for what had been labeled “critical thinking” by some, “reflective thinking” by Dewey and others, and “problem solving” by still others (Bloom et al. 1956: 38). Thus, the so-called “higher-order thinking skills” at the taxonomy’s top levels of analysis, synthesis and evaluation are just critical thinking skills, although they do not come with general criteria for their assessment (Ennis 1981b). The revised version of Bloom’s taxonomy (Anderson et al. 2001) likewise treats critical thinking as cutting across those types of cognitive process that involve more than remembering (Anderson et al. 2001: 269–270). For details, see the Supplement on History .

As to creative thinking, it overlaps with critical thinking (Bailin 1987, 1988). Thinking about the explanation of some phenomenon or event, as in Ferryboat, requires creative imagination in constructing plausible explanatory hypotheses. Likewise, thinking about a policy question, as in Candidate, requires creativity in coming up with options. Conversely, creativity in any field needs to be balanced by critical appraisal of the draft painting or novel or mathematical theory.

  • Abrami, Philip C., Robert M. Bernard, Eugene Borokhovski, David I. Waddington, C. Anne Wade, and Tonje Person, 2015, “Strategies for Teaching Students to Think Critically: A Meta-analysis”, Review of Educational Research , 85(2): 275–314. doi:10.3102/0034654314551063
  • Aikin, Wilford M., 1942, The Story of the Eight-year Study, with Conclusions and Recommendations , Volume I of Adventure in American Education , New York and London: Harper & Brothers. [ Aikin 1942 available online ]
  • Alston, Kal, 1995, “Begging the Question: Is Critical Thinking Biased?”, Educational Theory , 45(2): 225–233. doi:10.1111/j.1741-5446.1995.00225.x
  • –––, 2001, “Re/Thinking Critical Thinking: The Seductions of Everyday Life”, Studies in Philosophy and Education , 20(1): 27–40. doi:10.1023/A:1005247128053
  • American Educational Research Association, 2014, Standards for Educational and Psychological Testing / American Educational Research Association, American Psychological Association, National Council on Measurement in Education , Washington, DC: American Educational Research Association.
  • Anderson, Lorin W., David R. Krathwohl, Peter W. Airasian, Kathleen A. Cruikshank, Richard E. Mayer, Paul R. Pintrich, James Raths, and Merlin C. Wittrock, 2001, A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives , New York: Longman, complete edition.
  • Bailin, Sharon, 1987, “Critical and Creative Thinking”, Informal Logic , 9(1): 23–30. [ Bailin 1987 available online ]
  • –––, 1988, Achieving Extraordinary Ends: An Essay on Creativity , Dordrecht: Kluwer. doi:10.1007/978-94-009-2780-3
  • –––, 1995, “Is Critical Thinking Biased? Clarifications and Implications”, Educational Theory , 45(2): 191–197. doi:10.1111/j.1741-5446.1995.00191.x
  • Bailin, Sharon and Mark Battersby, 2009, “Inquiry: A Dialectical Approach to Teaching Critical Thinking”, in Juho Ritola (ed.), Argument Cultures: Proceedings of OSSA 09 , CD-ROM (pp. 1–10), Windsor, ON: OSSA. [ Bailin & Battersby 2009 available online ]
  • –––, 2016, “Fostering the Virtues of Inquiry”, Topoi , 35(2): 367–374. doi:10.1007/s11245-015-9307-6
  • Bailin, Sharon, Roland Case, Jerrold R. Coombs, and Leroi B. Daniels, 1999a, “Common Misconceptions of Critical Thinking”, Journal of Curriculum Studies , 31(3): 269–283. doi:10.1080/002202799183124
  • –––, 1999b, “Conceptualizing Critical Thinking”, Journal of Curriculum Studies , 31(3): 285–302. doi:10.1080/002202799183133
  • Berman, Alan M., Seth J. Schwartz, William M. Kurtines, and Steven L. Berman, 2001, “The Process of Exploration in Identity Formation: The Role of Style and Competence”, Journal of Adolescence , 24(4): 513–528. doi:10.1006/jado.2001.0386
  • Black, Beth (ed.), 2012, An A to Z of Critical Thinking , London: Continuum International Publishing Group.
  • Bloom, Benjamin Samuel, Max D. Engelhart, Edward J. Furst, Walter H. Hill, and David R. Krathwohl, 1956, Taxonomy of Educational Objectives. Handbook I: Cognitive Domain , New York: David McKay.
  • Casserly, Meghan, 2012, “The 10 Skills That Will Get You Hired in 2013”, Forbes , Dec. 10, 2012. Available at https://www.forbes.com/sites/meghancasserly/2012/12/10/the-10-skills-that-will-get-you-a-job-in-2013/#79e7ff4e633d ; accessed 2017 11 06.
  • Center for Assessment & Improvement of Learning, 2017, Critical Thinking Assessment Test , Cookeville, TN: Tennessee Technological University.
  • Cohen, Jacob, 1988, Statistical Power Analysis for the Behavioral Sciences , Hillsdale, NJ: Lawrence Erlbaum Associates, 2nd edition.
  • College Board, 1983, Academic Preparation for College. What Students Need to Know and Be Able to Do , New York: College Entrance Examination Board, ERIC document ED232517.
  • Commission on the Relation of School and College of the Progressive Education Association, 1943, Thirty Schools Tell Their Story , Volume V of Adventure in American Education , New York and London: Harper & Brothers.
  • Council for Aid to Education, 2017, CLA+ Student Guide . Available at http://cae.org/images/uploads/pdf/CLA_Student_Guide_Institution.pdf ; accessed 2017 09 26.
  • Dalgleish, Adam, Patrick Girard, and Maree Davies, 2017, “Critical Thinking, Bias and Feminist Philosophy: Building a Better Framework through Collaboration”, Informal Logic , 37(4): 351–369. [ Dalgleish et al. available online ]
  • Dewey, John, 1910, How We Think , Boston: D.C. Heath. [ Dewey 1910 available online ]
  • –––, 1916, Democracy and Education: An Introduction to the Philosophy of Education , New York: Macmillan.
  • –––, 1933, How We Think: A Restatement of the Relation of Reflective Thinking to the Educative Process , Lexington, MA: D.C. Heath.
  • –––, 1936, “The Theory of the Chicago Experiment”, Appendix II of Mayhew & Edwards 1936: 463–477.
  • –––, 1938, Logic: The Theory of Inquiry , New York: Henry Holt and Company.
  • Dominguez, Caroline (coord.), 2018a, A European Collection of the Critical Thinking Skills and Dispositions Needed in Different Professional Fields for the 21st Century , Vila Real, Portugal: UTAD. Available at http://bit.ly/CRITHINKEDUO1 ; accessed 2018 04 09.
  • ––– (coord.), 2018b, A European Review on Critical Thinking Educational Practices in Higher Education Institutions , Vila Real: UTAD. Available at http://bit.ly/CRITHINKEDUO2 ; accessed 2018 04 14.
  • Dumke, Glenn S., 1980, Chancellor’s Executive Order 338 , Long Beach, CA: California State University, Chancellor’s Office. Available at https://www.calstate.edu/eo/EO-338.pdf ; accessed 2017 11 16.
  • Ennis, Robert H., 1958, “An Appraisal of the Watson-Glaser Critical Thinking Appraisal”, The Journal of Educational Research , 52(4): 155–158. doi:10.1080/00220671.1958.10882558
  • –––, 1962, “A Concept of Critical Thinking: A Proposed Basis for Research on the Teaching and Evaluation of Critical Thinking Ability”, Harvard Educational Review , 32(1): 81–111.
  • –––, 1981a, “A Conception of Deductive Logical Competence”, Teaching Philosophy , 4(3/4): 337–385. doi:10.5840/teachphil198143/429
  • –––, 1981b, “Eight Fallacies in Bloom’s Taxonomy”, in C. J. B. Macmillan (ed.), Philosophy of Education 1980: Proceedings of the Thirty-seventh Annual Meeting of the Philosophy of Education Society , Bloomington, IL: Philosophy of Education Society, pp. 269–273.
  • –––, 1984, “Problems in Testing Informal Logic, Critical Thinking, Reasoning Ability”, Informal Logic , 6(1): 3–9. [ Ennis 1984 available online ]
  • –––, 1987, “A Taxonomy of Critical Thinking Dispositions and Abilities”, in Joan Boykoff Baron and Robert J. Sternberg (eds.), Teaching Thinking Skills: Theory and Practice , New York: W. H. Freeman, pp. 9–26.
  • –––, 1989, “Critical Thinking and Subject Specificity: Clarification and Needed Research”, Educational Researcher , 18(3): 4–10. doi:10.3102/0013189X018003004
  • –––, 1991, “Critical Thinking: A Streamlined Conception”, Teaching Philosophy , 14(1): 5–24. doi:10.5840/teachphil19911412
  • –––, 1996, “Critical Thinking Dispositions: Their Nature and Assessability”, Informal Logic , 18(2–3): 165–182. [ Ennis 1996 available online ]
  • –––, 1998, “Is Critical Thinking Culturally Biased?”, Teaching Philosophy , 21(1): 15–33. doi:10.5840/teachphil19982113
  • –––, 2011, “Critical Thinking: Reflection and Perspective Part I”, Inquiry: Critical Thinking across the Disciplines , 26(1): 4–18. doi:10.5840/inquiryctnews20112613
  • –––, 2013, “Critical Thinking across the Curriculum: The Wisdom CTAC Program”, Inquiry: Critical Thinking across the Disciplines , 28(2): 25–45. doi:10.5840/inquiryct20132828
  • –––, 2016, “Definition: A Three-Dimensional Analysis with Bearing on Key Concepts”, in Patrick Bondy and Laura Benacquista (eds.), Argumentation, Objectivity, and Bias: Proceedings of the 11th International Conference of the Ontario Society for the Study of Argumentation (OSSA), 18–21 May 2016 , Windsor, ON: OSSA, pp. 1–19. Available at http://scholar.uwindsor.ca/ossaarchive/OSSA11/papersandcommentaries/105 ; accessed 2017 12 02.
  • –––, 2018, “Critical Thinking Across the Curriculum: A Vision”, Topoi , 37(1): 165–184. doi:10.1007/s11245-016-9401-4
  • Ennis, Robert H., and Jason Millman, 1971, Manual for Cornell Critical Thinking Test, Level X, and Cornell Critical Thinking Test, Level Z , Urbana, IL: Critical Thinking Project, University of Illinois.
  • Ennis, Robert H., Jason Millman, and Thomas Norbert Tomko, 1985, Cornell Critical Thinking Tests Level X & Level Z: Manual , Pacific Grove, CA: Midwest Publication, 3rd edition.
  • –––, 2005, Cornell Critical Thinking Tests Level X & Level Z: Manual , Seaside, CA: Critical Thinking Company, 5th edition.
  • Ennis, Robert H. and Eric Weir, 1985, The Ennis-Weir Critical Thinking Essay Test: Test, Manual, Criteria, Scoring Sheet: An Instrument for Teaching and Testing , Pacific Grove, CA: Midwest Publications.
  • Facione, Peter A., 1990a, Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction , Research Findings and Recommendations Prepared for the Committee on Pre-College Philosophy of the American Philosophical Association, ERIC Document ED315423.
  • –––, 1990b, California Critical Thinking Skills Test, CCTST – Form A , Millbrae, CA: The California Academic Press.
  • –––, 1990c, The California Critical Thinking Skills Test--College Level. Technical Report #3. Gender, Ethnicity, Major, CT Self-Esteem, and the CCTST , ERIC Document ED326584.
  • –––, 1992, California Critical Thinking Skills Test: CCTST – Form B, Millbrae, CA: The California Academic Press.
  • –––, 2000, “The Disposition Toward Critical Thinking: Its Character, Measurement, and Relationship to Critical Thinking Skill”, Informal Logic , 20(1): 61–84. [ Facione 2000 available online ]
  • Facione, Peter A. and Noreen C. Facione, 1992, CCTDI: A Disposition Inventory , Millbrae, CA: The California Academic Press.
Copyright © 2018 by David Hitchcock <hitchckd@mcmaster.ca>

The Stanford Encyclopedia of Philosophy is copyright © 2016 by The Metaphysics Research Lab, Center for the Study of Language and Information (CSLI), Stanford University

Library of Congress Catalog Data: ISSN 1095-5054

Contextual Debiasing and Critical Thinking: Reasons for Optimism

  • Published: 26 April 2016
  • Volume 37, pages 103–111 (2018)

  • Vasco Correia

In this article I argue that most biases in argumentation and decision-making can and should be counteracted. Although biases can prove beneficial in certain contexts, I contend that they are generally maladaptive and need correction. Yet critical thinking alone seems insufficient to mitigate biases in everyday contexts. I develop a contextualist approach, according to which cognitive debiasing strategies need to be supplemented by extra-psychic devices that rely on social and environmental constraints in order to promote rational reasoning. Finally, I examine several examples of contextual debiasing strategies and show how they can contribute to enhancing critical thinking at a cognitive level.

Despite mounting criticism, Meliorism remains the dominant position among philosophers and psychologists. See for example Elster (2007), Evans (2007), Kahneman (2011), Kenyon and Beaulac (2014), Stanovich (2011), Larrick (2004), Croskerry et al. (2013a), Tetlock (2005), Wilson and Brekke (1994).

Taylor (1989, p. 237) explicitly acknowledges this aspect: “Unrealistic optimism might lead people to ignore legitimate risks in their environment and to fail to take measures to offset those risks”.

See, for example, Aberdein (2010) and Cohen (2009).

Psychologists distinguish between two kinds of cognitive illusions: motivational (or “hot”) biases, on the one hand, which stem from the influence of emotions and interests on cognitive processes, and cognitive (or “cold”) biases, on the other hand, which stem from inferential errors due to cognitive malfunctioning (Kunda 1990; Nisbett 1993).

Cf. Fisher (2011, p. 4), Lau (2011, p. 2), Siegel (1988, p. 32).

The “tools” metaphor can also be found in other approaches that stress the importance of non-cognitive (or extra-psychic) devices as means to promote rationality: Soll et al. (2015) refer to “debiasing tools”, Hogarth (2001) to “decision-making tools”, Elster (1989) to the “toolbox of mechanisms”, and Gigerenzer and Selten (2002) to the “adaptive toolbox”.

See, for example, Kenyon and Beaulac (2014), Larrick (2004), Soll et al. (2015).

Aberdein A (2010) Virtue in argument. Argumentation 24(2):165–179

Ainslie G (2005) Précis of breakdown of will. Behav Brain Sci 28:635–673

Anderson C, Sechler E (1986) Effects of explanation and counterexplanation on the development and use of social theories. J Pers Soc Psychol 50:24–54

Arkes H (1981) Impediments to accurate clinical judgment and possible ways to minimize their impact. J Consult Clin Psychol 49:323–330

Arkes H (1991) Costs and benefits of judgment errors. Psychol Bull 110(13):486–498

Brest P, Krieger L (2010) Problem solving, decision making and professional judgment. Oxford University Press, Oxford

Budden A, Tregenza T, Aarssen L, Koricheva J, Leimu R, Lortie CJ (2008) Double-blind review favours increased representation of female authors. Trends Ecol Evol 23(1):4–6

Cohen J (1981) Can human irrationality be experimentally demonstrated? In: Adler J, Rips L (eds) Reasoning. Cambridge University Press, Cambridge

Cohen D (2009) Keeping an open mind and having a sense of proportion as virtues in argumentation. Cogency 1(2):49–64

Croskerry P, Singhal G, Mamede S (2013a) Cognitive debiasing 1: origins of bias and theory of debiasing. Qual Saf 22(2):58–64

Croskerry P, Singhal G, Mamede S (2013b) Cognitive debiasing 2: impediments to and strategies for change. Qual Saf 22(2):65–72

Davidson D (1985) Incoherence and irrationality. Dialectica 39(4):345–353

Dick Cheney’s Suite Demands (2006) Retrieved January 8, 2016, from http://www.thesmokinggun.com/file/dick-cheneys-suite-demands

Dunning D (2009) Disbelief and the neglect of environmental context. Behav Brain Sci 32:517–518

Elster J (1989) Nuts and bolts for the social sciences. Cambridge University Press, Cambridge

Elster J (2007) Explaining social behavior. Cambridge University Press, Cambridge

Engel P (ed) (2000) Believing and accepting. Kluwer, Dordrecht

Evans J (2007) Hypothetical thinking. Psychology Press, New York

Fischhoff B (1982) Debiasing. In: Kahneman D, Slovic P, Tversky A (eds) Judgment under uncertainty. Cambridge University Press, Cambridge

Fischhoff B (2002) Heuristics and biases in application. In: Gilovich T, Griffin D, Kahneman D (eds) Heuristics and biases. Cambridge University Press, Cambridge

Fisher A (2011) Critical thinking: an introduction. Cambridge University Press, Cambridge

Galinsky A, Moskowitz G, Gordon B (2000) Perspective taking. J Pers Soc Psychol 784:708–724

Gigerenzer G (2008) Rationality for mortals. Oxford University Press, Oxford

Gigerenzer G, Selten R (2002) Bounded rationality. MIT Press, Cambridge

Gigerenzer G, Todd P (2000) Précis of simple heuristics that make us smart. Behav Brain Sci 23:727–780

Hirt E, Markman K (1995) Multiple explanation: a consider-an-alternative strategy for debiasing judgments. J Pers Soc Psychol 69:1069–1086

Hogarth R (2001) Educating intuition. University of Chicago Press, Chicago

Johnson R, Blair A (2006) Logical self-defense. International Debate Association, New York

Kahneman D (2011) Thinking, fast and slow. Farrar, Straus and Giroux, New York

Kenyon T, Beaulac G (2014) Critical thinking education and debiasing. Informal Log 34(4):341–363

Kunda Z (1990) The case for motivated reasoning. Psychol Bull 108(3):480–498

Larrick R (2004) Debiasing. In: Koehler D, Harvey N (eds) The Blackwell handbook of judgment and decision making. Blackwell Publishing, Oxford

Lau J (2011) An introduction to critical thinking and creativity. Wiley, New Jersey

Lilienfeld S, Ammirati R, Landfield K (2009) Giving debiasing away. Perspect Psychol Sci 4(4):390–398

Lord G, Lepper R, Preston E (1984) Considering the opposite: a corrective strategy for social judgment. J Pers Soc Psychol 47:1231–1243

McKay R, Dennett D (2009) The evolution of misbelief. Behav Brain Sci 32:493–561

Mercier H, Sperber D (2011) Why do humans reason? Behav Brain Sci 34:57–111

Mussweiler T, Strack F, Pfeiffer T (2000) Overcoming the inevitable anchoring effect. Pers Soc Psychol Bull 26:1142–1150

Myers D (1975) Discussion-induced attitude-polarization. Hum Relat 28:699–714

Nisbett R (ed) (1993) Rules for reasoning. Erlbaum, Hillsdale

Oaksford M, Chater N (2009) Précis of Bayesian Rationality. Behav Brain Sci 32:69–120

Paluk E, Green D (2009) Prejudice reduction: what works? A review and assessment of research and practice. Annu Rev Psychol 60:339–367

Paul W (1986) Critical thinking in the strong and the role of argumentation in everyday life. In: Eemeren F, Grootendorst R, Blair A, Willard C (eds) Argumentation. Foris Publications, Dordrecht

Pelham B, Neter E (1995) The effect of motivation of judgment depends on the difficulty of the judgment. J Pers Soc Psychol 68(4):581–594

Pronin E, Lin D, Ross L (2002) The bias blind spot: perceptions of bias in self versus others. Pers Soc Psychol Bull 28:369–381

Rawls J (2000) Lectures on the history of political philosophy. Harvard University Press, Cambridge

Sanna L, Schwarz N, Stocker S (2002) When debiasing backfires. J Exp Psychol 28:497–502

Siegel H (1988) Educating reason. Routledge, New York

Soll J, Milkman K, Payne J (2015) Outsmart your own biases. Harv Bus Rev 93:65–71

Stanovich K (2005) The robot’s rebellion. The University of Chicago Press, Chicago

Stanovich K (2011) Rationality and the reflective mind. Oxford University Press, New York

Stanovich K, West R (2008) On the relative independence of thinking biases and cognitive ability. J Pers Soc Psychol 94:672–695

Stein E (1996) Without good reason. Clarendon Press, Oxford

Stich S (1990) The fragmentation of reason. MIT Press, Cambridge

Sunstein C (2003) Why societies need dissent. Harvard University Press, Harvard

Sunstein C, Schkade D, Ellman L (2004) Ideological voting on federal courts of appeal. Va Law Rev 90(1):301–354

Taber C, Lodge M (2006) Motivated skepticism in the evaluation of political beliefs. Am J Polit Sci 50(3):755–769

Taylor S (1989) Positive illusions. Basic Books, New York

Taylor S, Brown J (1988) Illusion and well-being. Psychol Bull 103(2):193–210

Tetlock P (2002) Intuitive politicians, theologians, and prosecutors. In: Gilovich T, Griffin D, Kahneman D (eds) Heuristics and biases. Cambridge University Press, Cambridge

Tetlock P (2005) Expert political judgment. Princeton University Press, Princeton

Tetlock P, Boettger R (1989) Accountability. J Pers Soc Psychol 57:388–398

Thagard P (2011) Critical thinking and informal logic. Informal Log 31(3):152–170

Thaler R, Sunstein C (2008) Nudge. Yale University Press, New Haven

Tversky A, Kahneman D (2008) Extensional versus intuitive reasoning. In: Gilovich T, Griffin D, Kahneman D (eds) Heuristics and biases. Cambridge University Press, Cambridge

Willingham D (2007) Critical thinking: why is it so hard to teach? Am Educ 31(2):8–19

Wilson T, Brekke N (1994) Mental contamination and mental correction. Psychol Bull 116(1):117–142

Wilson T, Centerbar D, Brekke N (2002) Mental contamination and the debiasing problem. In: Gilovich T, Griffin D, Kahneman D (eds) Heuristics and biases. Cambridge University Press, Cambridge

Acknowledgments

I would like to thank the editor and two anonymous reviewers for their constructive comments. Work on this article was conducted under the grant SFRH/BPD/101744/2014 by the “Portuguese Foundation for Science and Technology” (FCT), as part of the project “Values in argumentative discourse” (PTDC/MHC-FIL/0521/2014).

Author information

Authors and Affiliations

ArgLab, IFILNOVA, Nova Institute of Philosophy, Universidade Nova de Lisboa, Av. De Berna 26, 4º piso, 1069-061, Lisbon, Portugal

Vasco Correia

Corresponding author

Correspondence to Vasco Correia.

About this article

Correia, V. Contextual Debiasing and Critical Thinking: Reasons for Optimism. Topoi 37, 103–111 (2018). https://doi.org/10.1007/s11245-016-9388-x

Published: 26 April 2016

Issue Date: March 2018

DOI: https://doi.org/10.1007/s11245-016-9388-x

  • Contextualism
  • Critical thinking
  • Rationality

What Is Cognitive Bias? | Definition, Types & Examples

Published on November 11, 2022 by Kassiani Nikolopoulou. Revised on December 11, 2023.

Cognitive bias is the tendency to act in an irrational way due to our limited ability to process information objectively. It is not always negative, but it can cloud our judgment and affect how clearly we perceive situations, people, or potential risks.

Everyone is susceptible to cognitive bias, and researchers are no exception. Therefore, cognitive bias can be a source of research bias.

Table of contents

  • What is cognitive bias?
  • What causes cognitive bias?
  • Impact of cognitive bias
  • What are different types of cognitive bias?
  • Cognitive bias examples
  • Other types of research bias
  • Frequently asked questions

Cognitive bias is an umbrella term used to describe our systematic but flawed patterns of responses to judgment- and decision-related problems. These patterns are predictably nonrandom. While based on our beliefs and experiences, they often go against logic or probability.

Although we like to think of ourselves as rational beings who process all information before making a decision, this is often not the case. Everyone is prone to cognitive bias to a different degree.

Cognitive biases are hardwired into our brains and can help us navigate the information overload inherent to everyday life. If we had to think carefully before all of our actions, it would be really hard to function.

To be more efficient, our brains rely on our experiences and beliefs more than we realize. These become mental shortcuts (also called heuristics). These rules of thumb help us make judgments and predictions. Because this process is intuitive or subconscious, people often don’t realize they are acting based on biases or preconceived notions.

Our tendency towards cognitive bias can come from many different sources. A few of these include:

  • Limited information-processing capacity. Because our minds have a limited ability to store and recall information, we simply can’t consider all the relevant information when we make an inference or decision. Usually, we are forced to focus on a subset of the available information.
  • Emotions. If our decision involves our loved ones, as opposed to total strangers, we will evaluate the situation differently.
  • Motivation. Our judgments are influenced by our existing attitudes and beliefs. We tend to adopt the beliefs and strategies most likely to lead us to the conclusions we want to reach.
  • Social influence. People have a tendency to conform to the opinions expressed earlier by others or to act in socially desirable ways. This can influence collective behaviors, such as voting.
  • Heuristics, or mental shortcuts. Our minds use simple rules to arrive at a conclusion in a “fast-and-frugal” way. The aim is not to capture the problem in all its complexity, or even to arrive at the optimal solution, but to arrive at a “good enough” solution quickly while minimizing mental effort.
  • Age. There is evidence suggesting that older people show less cognitive flexibility. This implies that as we get older, we are more likely to exhibit cognitive bias.
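The “fast-and-frugal” idea in the heuristics bullet above can be made concrete with a toy model. The sketch below is purely illustrative (the cue names, their ordering, and the two options are invented for demonstration): a take-the-best-style rule inspects cues in order of assumed validity and stops at the first one that discriminates, while a fuller evaluation weighs everything and can reach a different answer.

```python
# Illustrative sketch: a "take-the-best"-style heuristic vs. a full evaluation.
# Cue names, their ordering, and the options are invented for demonstration.

# Cues ordered by assumed validity (most predictive first).
CUES = ["recognized", "has_reviews", "on_sale"]

def take_the_best(option_a: dict, option_b: dict) -> str:
    """Pick "A" or "B" using the first cue that discriminates; ignore the rest."""
    for cue in CUES:
        a, b = option_a[cue], option_b[cue]
        if a != b:
            return "A" if a > b else "B"
    return "A"  # no cue discriminates: default arbitrarily

def full_evaluation(option_a: dict, option_b: dict) -> str:
    """Weigh every cue (here, equally) -- more effort, sometimes a different answer."""
    score_a = sum(option_a[c] for c in CUES)
    score_b = sum(option_b[c] for c in CUES)
    return "A" if score_a >= score_b else "B"

a = {"recognized": 1, "has_reviews": 0, "on_sale": 0}
b = {"recognized": 0, "has_reviews": 1, "on_sale": 1}

print(take_the_best(a, b))    # "A": stops at the first cue and ignores the rest
print(full_evaluation(a, b))  # "B": summing all cues reverses the choice
```

When the first cue really is the most valid one, the shortcut delivers a “good enough” answer at a fraction of the effort, which is exactly the trade-off described above.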

Relying on mental shortcuts in our everyday life is effective and leads to faster decision-making when timing is more important than accuracy. However, cognitive bias can lead us to misunderstand events, facts, or other people. This, in turn, can affect our behavior in a wide range of situations.

Cognitive biases can negatively impact:

  • Our decision-making ability, limiting how receptive we are to new or contradictory information.
  • How accurately we can recall incidents—for example, an event where we were an eyewitness. Inaccurate or incomplete recollection of events can lead to recall bias.
  • Our anxiety levels, making us focus only on negative events or aspects of our lives.
  • Our relationships with others, when we are too quick to judge their personality on the basis of a single trait.
  • Our critical thinking, leading us to perpetuate misconceptions or misinformation that can be harmful to others.

Although there is no exhaustive list of all types of cognitive bias, below are some common ones that often distort our thinking.

  • Anchoring bias is the tendency to rely on the first piece of information offered. It applies particularly to numbers. Negotiators use anchoring bias by starting with a number that is too low or too high. They know that this number will set the bar for subsequent offers.
  • The framing effect occurs when people make a choice based on whether the options presented to them are phrased in a positive or a negative way, for example in terms of loss or gain, reward or punishment.
  • Actor–observer bias is the tendency to attribute our actions to external factors and other people’s actions to internal ones. For example, if you and a classmate both fail an exam, you may think that your failure was due to the difficulty of the questions, while your classmate’s was due to poor preparation.
  • The availability heuristic (or availability bias) applies when we place greater value on information that is available to us or comes to mind quickly. Because of this, we tend to overestimate the probability of similar things happening again.
  • Confirmation bias refers to our tendency to look for evidence confirming what we already believe, viewing facts and ideas we encounter as further confirmation. Confirmation bias also leads us to ignore any evidence that seems to support an opposing view.
  • The halo effect refers to how our perception of a single trait can influence how we perceive other aspects, particularly in regards to someone’s personality. For example, when we consider someone to be physically attractive, it often determines how we rate their other qualities.
  • The Baader–Meinhof phenomenon (or frequency illusion) is the tendency to see new information, names, or patterns “everywhere” soon after they’re first brought to our attention.
  • The belief bias describes the tendency to judge an argument based on how plausible we find the conclusion to be, rather than how much evidence is provided to support this conclusion over the course of the argument.
  • The affect heuristic occurs when our current emotional state or mood influences our decisions. Instead of evaluating the situation objectively, we rely on our “gut feelings” and respond according to how we feel.
  • The representativeness heuristic occurs when we estimate the probability of an event based on how similar it is to a known situation. In other words, we compare it to a situation, prototype, or stereotype we already have in mind.
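Confirmation bias, as described above, can be modeled as weighing confirming and contradicting evidence asymmetrically. The following toy sketch (the discount factor is a made-up number for demonstration, not an empirical value) shows how “explaining away” contrary evidence makes a perfectly balanced body of evidence look like strong support:

```python
# Illustrative sketch: confirmation bias as asymmetric weighting of evidence.
# The discount factor is a made-up number for demonstration, not an empirical value.

def perceived_support(evidence: list, contrary_discount: float = 1.0) -> float:
    """Fraction of evidence perceived as supporting a belief.

    Each supporting item (True) counts fully; each contradicting item (False)
    is discounted by `contrary_discount` (1.0 = weighed fairly,
    0.2 = largely "explained away").
    """
    pro = sum(1.0 for e in evidence if e)
    con = sum(contrary_discount for e in evidence if not e)
    return pro / (pro + con)

# Perfectly balanced evidence: 50 items support the belief, 50 contradict it.
evidence = [True] * 50 + [False] * 50

print(perceived_support(evidence))                         # 0.5 -- a fair reading
print(perceived_support(evidence, contrary_discount=0.2))  # ~0.83 -- feels like vindication
```

The same evidence, read through the biased filter, appears to favor the belief five to one — which is how mixed evidence can leave someone more confident rather than less.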

Because cognitive bias often causes us to perceive the world around us in an oversimplified way, it can have far-reaching consequences.

For example, consider a company deciding what to charge for a new phone. The traditional approach to price setting would be to ask focus groups about various price tags and, based on participants’ feedback, pick the price they thought would be most profitable ($400).

In a medical context, cognitive bias can lead even seasoned doctors to wrong diagnoses.

For example, the nurses in an emergency department ask the doctor to see and quickly discharge a patient. They explain that she is a “regular” in the department and is seeking drugs. However, labels like this can stick to a patient and lead to misdiagnosis. When the patient visits again with abdominal pain, the doctor, who is very busy, performs a quick physical exam and prescribes painkillers.

In everyday life, we are often tricked by cognitive bias and over- or underestimate how risky our choices might be.

The bandwagon effect is a type of cognitive bias. It describes the tendency of people to adopt behaviors or opinions simply because others are doing so, regardless of their own beliefs.

Attention bias is a common cognitive bias that leads us to overlook important information. Because our attention is limited, we tend to direct our awareness to specific things in our environment while filtering out others.

Although this mechanism generally makes us more efficient, it can cause us to filter out information or signals in the environment that we shouldn’t be ignoring, leading to research bias.

Although it’s harder to identify cognitive bias in our own thinking than in that of others, here are a few examples of “red flags” to bear in mind:

  • Selecting information that is in line with our existing beliefs
  • Focusing too much on initial information and failing to adjust our judgment when new information becomes available
  • Making overgeneralizations or jumping to conclusions when the evidence is scarce
  • Blaming external factors for our failures, while taking all the credit for our successes

Myside bias is a type of cognitive bias in which individuals process information in a way that favors their prior beliefs and attitudes. It occurs when people search for, interpret, and recall information that confirms their opinions, and dismiss opinions different from their own—such as selecting news sources that agree with one’s political affiliation while ignoring opposing arguments from other sources.

Myside bias is closely related to confirmation bias. Although some researchers use the terms interchangeably, others reserve myside bias for the tendency to process information in a way that supports one’s own position.




Signs of Different Types of Biases and How to Overcome Each of Them

These biases can unknowingly impact your thoughts and behaviors.

Wendy Rose Gould is a lifestyle reporter with over a decade of experience covering health and wellness topics.


Dr. Sabrina Romanoff, PsyD, is a licensed clinical psychologist and a professor at Yeshiva University’s clinical psychology doctoral program.


  • Confirmation bias
  • Attribution bias
  • Conformity bias
  • Beauty bias
  • Gender bias
  • The contrast effect

Bias refers to a tendency or preference towards a certain group, idea, or concept that influences our judgments and decisions.

Our experiences, culture, social norms, and personal beliefs often shape these beliefs. The way we act on these biases can be either conscious or unconscious and can lead to prejudiced or discriminatory behaviors.

“Bias can play a significant role in day-to-day interactions and relationships, often influencing our thoughts, attitudes, and behaviors toward others,” says David Yadush, LPC, a licensed professional counselor at BetterHelp. “This can result in misinterpreting or overlooking facts and can change how we perceive people or events in our lives.”

Along with affecting our everyday interactions, being unaware of biases—or falling prey to them even when we know they exist—can hinder personal growth.

In this article, we outline common types of biases and discuss the signs of each type and ways to overcome them.

Why It's Important to Assess Your Biases

In order to recognize and work through bias, it’s important for us to challenge our assumptions and the subconscious stereotypes we make on a daily basis. This can be done by seeking out diverse perspectives, enjoying new experiences, and advocating for equal opportunity and treatment for everyone.

Of course, it also helps to understand the types of biases we’re apt to fall prey to so we can recognize and correct them in real-time.

How to Work Through Confirmation Bias

Confirmation bias is the tendency to seek out information that reaffirms our existing beliefs. In doing so, we tend to ignore information that contradicts our beliefs, which can lead us toward untruths.

Signs of confirmation bias may include:

  • Seeking information that confirms our beliefs
  • Ignoring or dismissing information that contradicts our beliefs
  • Failing to consider alternative opinions

“This bias can be harmful as it may prevent individuals from considering alternative viewpoints and may lead to closed-mindedness,” warns Yadush.

How to Overcome Confirmation Bias

“To recognize and work through confirmation bias, individuals should actively seek out diverse perspectives and information, consider alternative viewpoints, and engage in critical thinking and self-reflection,” says Yadush.

Attribution bias is a cognitive distortion in which we view the behavior of others as driven by internal factors—such as morals and character—while considering our own behavior as shaped by external factors, such as circumstances and environment.

Signs of attribution bias may include:

  • Consistently blaming others for problems or failures
  • Being overly critical of others
  • Excusing our own mistakes without reflection

“Simply speaking, one tends to give themselves a break for their own mistakes or shortcomings as unavoidable but will blame others for similar mistakes or shortcomings as intentional,” explains Karri Francisco, LMFT, director of family programming at APN.

She says that this is intellectually dangerous because it leads to unfair judgments of others. It can also make it harder to learn from our own mistakes since this bias prevents us from taking responsibility for our actions.

How to Overcome Attribution Bias

Francisco says that practicing empathy and perspective-taking can help you move away from falling prey to attribution bias.

Conformity bias is when we simply agree—or conform—with the opinions and behaviors of others in a group setting even when it’s against our own personal beliefs or knowledge.

Signs of conformity bias may include:

  • Vocally agreeing with others even when you inwardly disagree
  • Not sharing your own thoughts and feelings out of fear of being “ousted” or judged in a group setting
  • Going along with a group that’s acting irresponsibly or cruelly when you know inwardly the behavior is wrong

Yadush says, “This is typically an unconscious process that we go through in an attempt to avoid social rejection or gain status. This bias can be harmful as it may prevent individuals from expressing their true thoughts and opinions and may lead to groupthink, where the desire for consensus overrides critical thinking.”

How to Overcome Conformity Bias

To recognize and work through conformity bias, focus on reflecting on your own beliefs and values. At the same time, you can engage in critical thinking and seek diverse perspectives and opinions from others.

If you’re in a leadership position, you can also reduce conformity bias by encouraging and rewarding diverse opinions.

Beauty bias is a propensity, whether conscious or subconscious, to treat conventionally beautiful people better or worse than those who are less attractive.

Signs of beauty bias include:

  • You judge others on their appearance
  • You make assumptions about a conventionally attractive person’s capabilities
  • You treat others better or worse based on their appearance

The Halo Effect

The halo effect describes the phenomenon in which people assume that because a person has one favorable attribute, it must mean something favorable about them as a whole.

For example, if you think someone is attractive, you may assume they are nicer, smarter, funnier, or more interesting than someone you deem less attractive, and treat them more favorably as a result. Studies show that people tend to do this without even thinking.

That said, you might also treat an unattractive person less favorably or make harsh judgments about them without getting to know them.

How to Overcome Beauty Bias

Francisco says, “The potential harm can lead to discrimination against those who do not present within conventional beauty standards. Are you making assumptions about a person's abilities or character based on their physical appearance, such as assuming that someone attractive is also intelligent or competent?”

She adds that in order to recognize and work through any bias, we must become aware of our own and challenge them as they occur.

One approach to challenging beauty bias is consciously focusing on a person's qualities and abilities when evaluating them.

Gender bias refers to the tendency we have to hold stereotypical or discriminatory attitudes towards people based solely on their gender. This not only affects our ability to socialize in meaningful ways, but it can also lead to unequal opportunities and treatment for others.

Signs of gender bias may include:

  • Making assumptions or judgments based on gender
  • Using gender-specific language
  • Treating individuals differently based on their gender

How to Overcome Gender Bias

According to Yadush, "To recognize and work through gender bias, individuals should challenge their assumptions and stereotypes and use gender-neutral language.”

Yadush adds that it’s also important to listen to and believe individuals about their experiences around gender bias and discrimination.

Similarly, ageism is the tendency we have to make judgments or assumptions about another person simply because of their age.

This tends to negatively impact people who are either young or old, as we subconsciously hold stereotypes about their capabilities or the “known characteristics” of their generation.

Signs of ageism may include:

  • Judging an individual's ability or intelligence based on age
  • Not interacting with someone because they’re a different age
  • Being rude or dismissive of others due to their age

Ageism and Its Impact on Mental Health

Yadush says that ageism has been shown to have serious effects on the mental health, physical health, and overall quality of life of the older adult population. It can hinder their ability to socialize, find employment, or make meaningful friendships.

For young people, it can also impact their ability to be taken seriously in professional settings. This is also referred to as "youngism."

How to Overcome Ageism

“To help combat ageism, seek out mentorship from individuals of all ages and be willing to learn from those with different lived experiences,” Yadush suggests. “When you do recognize ageism in the workplace or community, speak out and be an advocate as others may not have the opportunity or support to do so.”

The contrast effect tends to sneak up on us. It’s a cognitive bias where the comparison of two things influences your perception of both.

Signs of the contrast effect include:

  • Comparing one person to another
  • Failing to focus on objective criteria when making decisions
  • Not considering the context of your evaluations

How the Contrast Effect May Play Out in Everyday Life

Here are some examples of what the contrast effect may look like in the real world:

  • If you see someone casually dressed standing next to someone who looks unkempt, the casual attire may appear more professional by comparison, even though nothing about it has changed.
  • Similarly, if someone is interviewed for a job immediately after a particularly impressive candidate, they may be judged more harshly than they would have been if they had interviewed alone.

“The contrast effect highlights how our perceptions are not solely based on objective measurements but can be influenced by the context in which we experience them,” explains Francisco. “This can lead to inaccurate perceptions and judgments of individuals being evaluated in comparison to another.”

How to Overcome the Contrast Effect

When making decisions, try to be as objective as possible. If you do have to make any comparisons, it can help to take breaks between comparisons and evaluations in order to clear your mind of influences, and to focus on objective criteria rather than subjective impressions.

We're all prone to cognitive distortions. Sometimes we're on the receiving end, while other times we're the ones making quick judgments. Reflecting on where these biases may exist in your daily life is the first step in understanding and overcoming them.




Learn how to identify and address bias in decision making with our guide to recognizing bias in problem solving and critical thinking.

Recognizing Bias: A Problem Solving and Critical Thinking Skills Guide

In today's world, it is becoming increasingly important to recognize bias and how it can affect our decision-making. Bias can cloud our judgement, lead us to make decisions that are not in our best interests, and limit our ability to solve problems effectively. In this guide, we will explore the concept of recognizing bias and how it can be used as a tool for developing critical thinking and problem-solving skills. We will discuss the various types of biases, why recognizing them is important, and how to identify and counteract them.

Bias can lead to unfair judgments or decisions. Common types include cultural bias, the tendency to favor one’s own culture or group, and political bias, the tendency to favor one’s own political party or beliefs. In order to identify and address bias in oneself and others, it is important to be aware of potential sources of bias, including personal opinions, values, and preconceived notions. Being mindful of these potential sources can help us become more aware of our own biases and recognize them in others.

It is also important to be open-minded and willing to consider alternative perspectives, and helpful to challenge our own assumptions and beliefs by questioning them and seeking out evidence that supports or refutes them. The potential implications of not recognizing or addressing bias are significant: left unchecked, biases can lead to unfair decisions or judgments, as well as inaccurate conclusions. This can have serious consequences for individuals and organizations alike.

Strategies for Identifying and Addressing Bias

Recognizing bias in oneself and others is an important part of making informed decisions. There are several strategies that can be used to identify and address bias. One of the most effective strategies is to take a step back and look at the situation objectively. This involves examining the facts and assumptions that are being used to make decisions.

It can also involve assessing the potential impact of decisions on multiple stakeholders. By removing personal biases from the equation, it is possible to make more informed decisions. Another important strategy for identifying and addressing bias is to question the sources of information. It is important to consider the credibility of sources, as well as any potential biases that may be present.

Fact-checking sources and considering multiple perspectives can help identify any potential biases in the information being used. In addition, it is important to remain aware of our own biases. We all have preconceived notions about certain topics that can affect our decision-making process. By being mindful of our biases, we can avoid making decisions that are influenced by them. Finally, it is important to be open to other perspectives and willing to engage in meaningful dialogue with others.

What Is Bias?

Bias can be an unconscious preference that influences decision making and can lead to adverse outcomes. It is important to recognize bias because it can have a negative impact on our ability to make sound decisions and engage in problem solving and critical thinking. Bias can manifest itself in various ways, from subtle mental shortcuts to overt prejudices.

Types of bias include confirmation bias, where we seek out information that confirms our existing beliefs; availability bias, where we base decisions on the information that is most readily available; and representativeness bias, where we assume that two events or objects are related because they share similar characteristics. Other forms include the halo effect, where a single positive quality or trait can influence the perception of an entire person, and stereotyping, the tendency to make judgments about individuals based on their perceived membership in a certain group. Recognizing bias in ourselves and others allows us to make informed decisions and engage in problem solving and critical thinking.

Sources of Bias

Bias can have a profound effect on decisions, leading to outcomes that are not based on facts or evidence. Personal opinions and values can lead to biased decision-making. They can be shaped by past experiences, cultural background, and other personal factors. For example, someone’s opinion about a certain topic may be based on what they have previously heard or read. Similarly, preconceived notions can also lead to biased conclusions. Cultural norms can also play a role in creating bias.

For instance, people may be more likely to believe information from a source they trust or respect, even if it is not based on fact. Similarly, people may be more likely to make decisions that conform to the expectations of their culture or society. In addition, people can also be influenced by their own prejudices or stereotypes. This type of bias can lead to unfair treatment of certain individuals or groups of people. Finally, it is important to be aware of the potential for confirmation bias, where people will seek out information that confirms their existing beliefs and disregard any contradictory evidence. By recognizing and understanding these sources of bias, people can make more informed decisions and engage in more effective problem solving and critical thinking.

In conclusion, recognizing and addressing bias is an essential part of problem solving and critical thinking. Bias can come from many sources, including our own beliefs, cultural norms, and past experiences. Knowing the types of bias and strategies for identifying and addressing them can help us make informed decisions and better engage in critical thinking. Taking time to reflect on our own biases is also important for making unbiased decisions.

Ultimately, recognizing and addressing bias will improve our problem-solving and critical thinking skills.


Explore Psychology

Cognitive Bias: Common Types and How to Avoid Them


A cognitive bias is an unconscious systematic pattern of thinking that can result in errors in judgment. These biases stem from the brain’s limited resources and need to simplify the world to make faster decisions.

Such biases are often the result of limitations or problems in memory, attention, and information processing. While such biases often serve as shortcuts that help us make sense of the world around us, they also introduce errors in problem-solving and decision-making.

Cognitive biases were first described by Amos Tversky and Daniel Kahneman in 1972 and grew out of their work in heuristics. Heuristics are mental shortcuts that help speed up thinking. While these biases often help people make sense of the world, they also introduce distortions, illogical thinking, irrationality, and poor judgments.

In this article, learn more about some of the signs and most common types of cognitive biases. Also, discover how to look at and change bias in your own thinking.


Signs of Cognitive Bias


Cognitive biases affect human thought and behavior in powerful ways, but that does not mean they are always easy to spot. A few signs you might be affected by a cognitive bias include:

  • Refusing to consider other possible explanations
  • Paying more attention to information that confirms what you already believe to be true
  • Always assuming that you are right and that other people are wrong
  • Believing that everyone else thinks the same way that you do
  • Taking credit for success when things go right but blaming others when things go wrong
  • Giving more weight to the first information you hear and not considering subsequent information

Becoming more aware of cognitive biases can be helpful, but knowing they exist doesn’t always make them easier to overcome. However, you can often take steps to help compensate and make more accurate decisions.

Types of Cognitive Bias

How many cognitive biases are there? Researchers have identified more than 175 different cognitive biases. Some of the most common ones include:

Confirmation Bias

This common type of bias involves paying more attention to information that reinforces your beliefs. At the same time, you might discount or ignore things that offer contrary evidence.

Hindsight Bias

This bias, also known as the ‘I-knew-it-all-along’ effect, involves people overestimating the predictability of events. In hindsight, the outcome appears obvious and inevitable. This often makes people feel overly confident in their predictions about the future.

Anchoring Bias

The anchoring bias involves relying too heavily on the first piece of information. Once you hear something, you then rely on it as a baseline to compare further information. If you see a car at a certain price, for example, you might then use that anchor to compare all future car prices.

Self-Serving Bias

The self-serving bias involves a tendency to take personal credit for success but blame external forces for failures. While this can protect self-esteem, it can prevent people from accurately evaluating the causes behind the events in their lives.

Bandwagon Effect

This type of cognitive bias causes people to be more likely to go along with something if many other people are also doing the same thing.

Fundamental Attribution Error

The fundamental attribution error is a cognitive bias that happens when people assess other people’s behavior. When attributing the actions of others, people tend to place too much emphasis on the personal characteristics of the individual while underestimating external or situational factors.

Halo Effect

This effect involves judging all of a person’s qualities based on one trait. Also known as the ‘what is beautiful is good’ effect, an example would be believing someone is competent, kind, and generous because they are physically attractive.

Dunning-Kruger Effect

This effect occurs when people mistakenly think they know a great deal about a topic because of their limited knowledge of the subject. The less they understand about it, the easier the topic appears. It is only after becoming more informed that people begin to appreciate the complexity and depth of the subject and can put their own limited understanding in context.

Ingroup Bias

The ingroup bias involves attributing positive traits to people who share group affiliations with you. This might involve believing people in that group are more qualified or competent but also favoring members of that group while discounting members of outside groups.

Availability Bias

This type of cognitive bias causes people to base decisions on information that is immediately available. Rather than look for outside information, people base their decisions on the first examples that come to mind.

Projection Bias

The projection bias involves overestimating how much other people agree with how you think, feel, or believe. It shares similarities with the false consensus effect.

Researchers refer to the inability to recognize your own cognitive biases as the bias blind spot.

Identifying Cognitive Bias

Cognitive biases are natural and happen to everyone. While cognitive biases are often problematic, they can also be adaptive. 

They often lead to fairly accurate and fast decisions while reducing the mental effort required to make a choice. This can be particularly important when a person faces a threat and needs to devise a solution quickly.

Learning to spot them can help you better determine if they are serving a purpose or preventing you from making accurate judgments.

If you suspect that your thinking is being hindered by bias, ask yourself some questions:

  • Are you only paying attention to information that confirms what you already believe?
  • Are you blaming the misfortunes of others on their own personal failings while attributing your own misfortunes to external forces?
  • Do you assume that most people agree with your beliefs and opinions on different topics?
  • Do you assume that the first thing you learn about something is accurate and discount subsequent information?
  • Is your initial good impression of something affecting your subsequent assessments?
  • Do you ever look back on an event and feel like you knew the outcome all along?

Detecting and preventing every type of bias isn’t possible. Everyone has biases, often shaped by their perceptions and experiences.

While you might not be able to prevent it, you can get better at identifying your own biases and looking for ways to be less biased and more objective when possible.

How to Prevent Cognitive Bias

What can you do to help minimize the potentially negative effects of cognitive biases on your judgments? Steps that you can take include:

Take Your Time

When making a decision, try to give yourself some time to consider all of the aspects of the situation and the potential pros and cons of each choice.

Reduce Distractions

When other things are competing for your attention, it reduces your available mental resources for making a decision. Whenever possible, eliminate distractions so you can focus on the task at hand.

Learn More About Cognitive Bias

Research suggests that cognitive training can help minimize biased thinking. Recognizing bias in your own thinking is often the first step to overcoming it. A 2019 study published in the journal Psychological Science found that participants who had been trained about the effects of bias were 19% less likely to be influenced by confirmation bias when making a choice.

Challenge Your Thinking

Actively questioning your choices and looking for sources of bias can also be helpful. Analyzing your choices can help you think more critically in various situations.

Get Some Perspective

If you’re having trouble checking for bias, consider seeking out other people’s perspectives. Group decisions aren’t necessarily free from bias (because other people are affected by cognitive biases as well), but getting an outside perspective can sometimes help you see things from another point of view.

Cognitive biases are flaws in the way that we think. They can speed up thinking and simplify the world around us, but they can also cause us to make poor choices.

Getting rid of all biases isn’t realistic, but there are ways to get better at spotting biases in your thinking. Learning more about these biases, how they work, and their effects can help you spot flaws in your thinking and make better decisions.

Sources: 

American Psychological Association. Availability heuristic . APA Dictionary of Psychology; 2020.

Friedman HH. Cognitive biases that interfere with critical thinking and scientific reasoning: a course module . SSRN Journal. Published online 2017. doi:10.2139/ssrn.2958800

Sellier A-L, Scopelliti I, Morewedge CK. Debiasing training improves decision making in the field. Psychol Sci. 2019;30(9):1371-1379. doi:10.1177/0956797619861429

van Geene K, de Groot E, Erkelens C, Zwart D. Raising awareness of cognitive biases during diagnostic reasoning . Perspect Med Educ. 2016;5(3):182-185. doi:10.1007/s40037-016-0274-4

Yik M, Wong KFE, Zeng KJ. Anchoring-and-adjustment during affect inferences . Front Psychol. 2019;9:2567. doi:10.3389/fpsyg.2018.02567

Mind Help


Types of Cognitive Biases


Types of cognitive biases refer to the various systematic patterns of thinking or mental shortcuts that can lead to deviations from rationality and objective reasoning [1].

These cognitive biases types are inherent in human cognition and can influence how we perceive, process, and interpret information, as well as the judgments and decisions we make. They often occur automatically and unconsciously, shaping our thought processes and affecting our behaviors without our awareness. 

Cognitive biases can manifest in different ways [2] and are widespread among individuals with various mental health disorders [3], such as depression, post-traumatic stress disorder, substance use disorder, and anxiety disorders.

Research indicates [4] that people affected by these mental health conditions tend to selectively focus on, remember, and interpret events in ways that reflect the response patterns common in cognitive biases.

Being aware of these biases is crucial as they can impact our mental health functioning and overall well-being and hinder our ability to make informed decisions.

Some examples of different types of cognitive biases include:

1. Actor-Observer Bias

The actor-observer bias is a cognitive bias that refers to the tendency to attribute our own behavior to external factors while attributing the behavior of others to internal factors [5].

For example, if an individual is driving and accidentally rear-ends another car at a traffic light, then as the “actor” in the situation he or she may attribute the accident to external factors, such as slippery road conditions or the behavior of the other driver.

However, if the individual watches someone else rear-end another car at a traffic light, then as the “observer” he or she might attribute the other driver’s behavior to internal factors, concluding that the driver was careless, reckless, or simply a bad driver.


2. Self-Serving Bias

The self-serving bias is a cognitive bias that refers to individuals’ tendency to attribute their successes to internal factors and their failures to external factors [6].

For example, an individual who performs exceptionally well in a sports competition may attribute the victory solely to his skill and effort. When he performs poorly in the next competition, however, he blames external factors such as bad luck or unfavorable circumstances, disregarding any personal responsibility.


3. Confirmation Bias

This bias entails selectively seeking out or interpreting information in a way that confirms our existing beliefs or hypotheses while disregarding contradictory evidence [7].

For example, if someone strongly supports a specific political ideology, they may only consume news or engage with information sources that align with their beliefs, while disregarding alternative perspectives.


4. Hindsight Bias

People experiencing hindsight bias tend to believe that they knew or should have known the outcome all along, underestimating the uncertainty and complexity of the situation as it unfolded [8].

For example, when a sports team wins a game, fans may claim that they had confidently anticipated the team’s success from the beginning, overlooking the inherent uncertainties and unpredictable nature of sports events.

5. Anchoring Bias

This bias manifests when we rely too heavily on the first piece of information encountered (the “anchor”) when making subsequent judgments or decisions [9].

For example, when comparing two laptops at a store, the higher initial price of one model can anchor the buyer’s perception of value, making them more likely to consider it the superior option even if it does not offer significantly better features.

6. Misinformation Effect

The misinformation effect refers to the phenomenon whereby the introduction of misleading or incorrect information can alter a person’s memory of an event or experience [10].

For example, if someone witnesses a crime and then hears various conflicting accounts of what happened, their memory of the event may be influenced by the misinformation they encountered, leading them to recall details incorrectly.

7. False-consensus Effect

The false-consensus effect is a cognitive bias in which individuals overestimate the extent to which their own opinions, beliefs, preferences, or behaviors are shared by others [11]. It involves assuming that one’s attitudes and behaviors are more common or typical than they actually are.

For instance, if we strongly believe that a particular political candidate is the best choice, we may assume that most other people also hold the same view, underestimating the diversity of opinions and perspectives.

8. Halo Effect

This bias occurs when our overall impression of a person influences our judgments or evaluations of their individual traits or abilities [12].

For example, if someone is physically attractive, we may tend to assume that they are also intelligent, kind, or talented, even without concrete evidence to support those assumptions.


9. Availability Heuristic

The availability heuristic is a cognitive bias that involves making judgments or decisions based on the ease with which examples or instances come to mind [13]. People tend to rely on information that is readily available in memory, or easily retrieved, when evaluating the probability, likelihood, or frequency of events or situations.

For instance, when asked about the safety of air travel, someone relying on the availability heuristic might recall recent news stories of plane crashes, leading them to overestimate the risks associated with flying despite statistical evidence indicating otherwise.

10. Optimism Bias

This bias involves overestimating the likelihood of positive outcomes and underestimating the likelihood of negative outcomes [14].

For example, when embarking on a new business venture, individuals may be overly optimistic about their chances of success, downplaying potential risks or challenges they may encounter along the way.

Impact of Cognitive Biases on Mental Health

Here are some ways in which different types of cognitive biases can affect mental well-being [15]:

  • Actor-observer bias leads individuals to blame others and external factors for their negative experiences, causing relationship conflicts.
  • Individuals with a self-serving bias attribute successes to internal factors and failures to external factors, which hinders personal responsibility and reduces effort.
  • Confirmation bias leads people to actively seek information that supports their negative thoughts, reinforcing a cycle of self-doubt and low self-esteem.
  • Individuals with hindsight bias may blame themselves excessively for failing to predict or prevent negative outcomes, leading to feelings of guilt or regret.
  • Decisions anchored to initial information, made without considering other relevant evidence, can lead to poor choices or missed opportunities for improvement.
  • The misinformation effect can distort one’s understanding of events or experiences, leading to increased anxiety, restlessness, or feelings of confusion.
  • The availability heuristic leads to distorted perceptions of risks and probabilities based on easily accessible or memorable examples.

Frequently Asked Questions (FAQs)

1. Are cognitive biases good or bad?

Cognitive biases are neither inherently good nor bad, but they can lead to errors in thinking and decision-making, potentially hindering our ability to make rational and objective judgments.

2. How do you identify cognitive bias?

Identifying cognitive biases requires self-awareness and analysis of our thoughts, beliefs, and decision-making processes.

3. What is the most powerful cognitive bias?

Confirmation bias is often considered one of the most pervasive and influential biases.

References:

  • 1  Friedman, H. H. (2017). Cognitive Biases that Interfere with Critical Thinking and Scientific Reasoning: A Course Module. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.2958800
  • 2  Berthet V. (2021). The Measurement of Individual Differences in Cognitive Biases: A Review and Improvement. Frontiers in psychology, 12, 630177. https://doi.org/10.3389/fpsyg.2021.630177
  • 3  COMMON MENTAL HEALTH DISORDERS. (2011). Nih.gov; British Psychological Society. Available from: https://www.ncbi.nlm.nih.gov/books/NBK92254/
  • 4  Berthet V. (2022). The Impact of Cognitive Biases on Professionals’ Decision-Making: A Review of Four Occupational Areas. Frontiers in psychology, 12, 802439. https://doi.org/10.3389/fpsyg.2021.802439
  • 5  Koenig, A. (2013). Actor‐Observer Bias. 19–20. https://doi.org/10.1002/9781118339893.wbeccp008
  • 6  Kaplan, T. R., & Ruffle, B. J. (2004). The Self-serving Bias and Beliefs about Rationality. Economic Inquiry, 42(2), 237–246. https://doi.org/10.1093/ei/cbh057
  • 7  Peters, U. (2020). What Is the Function of Confirmation Bias? Erkenntnis, 87(87). https://doi.org/10.1007/s10670-020-00252-1
  • 8  Pohl, R. F., Bender, M., & Lachmann, G. (2002). Hindsight bias around the world. Experimental psychology, 49(4), 270–282. https://doi.org/10.1026//1618-3169.49.4.270
  • 9  Wilson, T. D., Houston, C. E., Etling, K. M., & Brekke, N. (1996). A new look at anchoring effects: basic anchoring and its antecedents. Journal of experimental psychology. General, 125(4), 387–402. https://doi.org/10.1037//0096-3445.125.4.387
  • 10  Challies, D. M., Hunt, M., Garry, M., & Harper, D. N. (2011). Whatever gave you that idea? False memories following equivalence training: a behavioral account of the misinformation effect. Journal of the experimental analysis of behavior, 96(3), 343–362. https://doi.org/10.1901/jeab.2011.96-343
  • 11  Alicke, M. D., & Largo, E. (1995). The Role of Self in the False Consensus Effect. Journal of Experimental Social Psychology, 31(1), 28–47. https://doi.org/10.1006/jesp.1995.1002
  • 12  Nicolau, J. L., Mellinas, J. P., & Fuentes, E. M.(2021). The halo effect. Available from: https://www.researchgate.net/publication/349443219_The_halo_effect
  • 13  Pachur, T., Hertwig, R., & Steinmann, F. (2012). How do people judge risks: availability heuristic, affect heuristic, or both?. Journal of experimental psychology. Applied, 18(3), 314–330. https://doi.org/10.1037/a0028279
  • 14  Sharot T. (2011). The optimism bias. Current biology : CB, 21(23), R941–R945. https://doi.org/10.1016/j.cub.2011.10.030
  • 15  Savioni, L., & Triberti, S. (2020). Cognitive Biases in Chronic Illness and Their Impact on Patients’ Commitment. Frontiers in Psychology, 11. https://doi.org/10.3389/fpsyg.2020.579455


Critical Thinking, Intelligence, and Unsubstantiated Beliefs: An Integrative Review

Associated Data

This research did not involve collection of original data, and hence there are no new data to make available.

Abstract

A review of the research shows that critical thinking is a more inclusive construct than intelligence, going beyond what general cognitive ability can account for. For instance, critical thinking can more completely account for many everyday outcomes, such as how thinkers reject false conspiracy theories, paranormal and pseudoscientific claims, psychological misconceptions, and other unsubstantiated claims. Deficiencies in the components of critical thinking (in specific reasoning skills, dispositions, and relevant knowledge) contribute to unsubstantiated belief endorsement in ways that go beyond what standardized intelligence tests test. Specifically, people who endorse unsubstantiated claims less tend to show better critical thinking skills, possess more relevant knowledge, and are more disposed to think critically. They tend to be more scientifically skeptical and possess a more rational–analytic cognitive style, while those who accept unsubstantiated claims more tend to be more cynical and adopt a more intuitive–experiential cognitive style. These findings suggest that for a fuller understanding of unsubstantiated beliefs, researchers and instructors should also assess specific reasoning skills, relevant knowledge, and dispositions which go beyond what intelligence tests test.

1. Introduction

Why do some people believe implausible claims, such as the QAnon conspiracy theory, that a cabal of liberals is kidnapping and trafficking many thousands of children each year, despite the lack of any credible supporting evidence? Are believers less intelligent than non-believers? Do they lack knowledge of such matters? Are they more gullible or less skeptical than non-believers? Or, more generally, are they failing to think critically?

Understanding the factors contributing to acceptance of unsubstantiated claims is important, not only to the development of theories of intelligence and critical thinking but also because many unsubstantiated beliefs are false, and some are even dangerous. Endorsing them can have a negative impact on an individual and society at large. For example, false beliefs about the COVID-19 pandemic, such as believing that 5G cell towers induced the spread of the COVID-19 virus, led some British citizens to set fire to 5G towers ( Jolley and Paterson 2020 ). Other believers in COVID-19 conspiracy theories endangered their own and their children’s lives when they refused to socially distance and be vaccinated with highly effective vaccines, despite the admonitions of scientific experts ( Bierwiaczonek et al. 2020 ). Further endangering the population at large, those who believe the false conspiracy theory that human-caused global warming is a hoax likely fail to respond adaptively to this serious global threat ( van der Linden 2015 ). Parents who uncritically accept pseudoscientific claims, such as the false belief that facilitated communication is an effective treatment for childhood autism, may forego more effective treatments ( Lilienfeld 2007 ). Moreover, people in various parts of the world still persecute other people who they believe are witches possessing supernatural powers. Likewise, many people still believe in demonic possession, which has been associated with mental disorders ( Nie and Olson 2016 ). Compounding the problems created by these various unsubstantiated beliefs, numerous studies now show that when someone accepts one of these types of unfounded claims, they tend to accept others as well; see Bensley et al. ( 2022 ) for a review.

Studying the factors that contribute to unfounded beliefs is important not only because of their real-world consequences but also because this can facilitate a better understanding of unfounded beliefs and how they are related to critical thinking and intelligence. This article focuses on important ways in which critical thinking and intelligence differ, especially in terms of how a comprehensive model of CT differs from the view of intelligence as general cognitive ability. I argue that this model of CT more fully accounts for how people can accurately decide if a claim is unsubstantiated than can views of intelligence, emphasizing general cognitive ability. In addition to general cognitive ability, thinking critically about unsubstantiated claims involves deployment of specific reasoning skills, dispositions related to CT, and specific knowledge, which go beyond the contribution of general cognitive ability.

Accordingly, this article begins with an examination of the constructs of critical thinking and intelligence. Then, it discusses theories proposing that understanding thinking in the real world requires going beyond general cognitive ability. Specifically, the focus is on factors related to critical thinking, such as specific reasoning skills, dispositions, metacognition, and relevant knowledge. I review research showing that this alternative multidimensional view of CT can better account for individual differences in the tendency to endorse multiple types of unsubstantiated claims than can general cognitive ability alone.

2. Defining Critical Thinking and Intelligence

Critical thinking is an almost universally valued educational objective in the US and in many other countries which seek to improve it. In contrast, intelligence, although much valued, has often been viewed as a more stable characteristic and less amenable to improvement through specific short-term interventions, such as traditional instruction or more recently through practice on computer-implemented training programs. According to Wechsler’s influential definition, intelligence is a person’s “aggregate or global capacity to act purposefully, to think rationally, and to deal effectively with his environment” ( Wechsler 1944, p. 3 ).

Consistent with this definition, intelligence has long been associated with general cognitive or intellectual ability and the potential to learn and reason well. Intelligence (IQ) tests measure general cognitive abilities, such as knowledge of words, memory skills, analogical reasoning, speed of processing, and the ability to solve verbal and spatial problems. General intelligence or “g” is a composite of these abilities statistically derived from various cognitive subtests on IQ tests which are positively intercorrelated. There is considerable overlap between g and the concept of fluid intelligence (Gf) in the prominent Cattell–Horn–Carroll model ( McGrew 2009 ), which refers to “the ability to solve novel problems, the solution of which does not depend on previously acquired skills and knowledge,” and crystallized intelligence (Gc), which refers to experience, existing skills, and general knowledge ( Conway and Kovacs 2018, pp. 50–51 ). Although g or general intelligence is based on a higher order factor, inclusive of fluid and crystallized intelligence, it is technically not the same as general cognitive ability, a commonly used, related term. However, in this article, I use “general cognitive ability” and “cognitive ability” because they are the imprecise terms frequently used in the research reviewed.

Although IQ scores have been found to predict performance in basic real-world domains, such as academic performance and job success ( Gottfredson 2004 ), an enduring question for intelligence researchers has been whether g and intelligence tests predict the ability to adapt well in other real-world situations, which concerns the second part of Wechsler’s definition. So, in addition to the search for the underlying structure of intelligence, researchers have been perennially concerned with how general abilities associated with intelligence can be applied to help a person adapt to real-world situations. The issue is largely a question of how cognitive ability and intelligence can help people solve real-world problems and cope adaptively and succeed in dealing with various environmental demands ( Sternberg 2019 ).

Based on broad conceptual definitions of intelligence and critical thinking, both intelligence and CT should aid adaptive functioning in the real world, presumably because they both involve rational approaches. Their common association with rationality gives each term a positive connotation. However, complicating the definition of each of these is the fact that rationality also continues to have a variety of meanings. In this article, in agreement with Stanovich et al. ( 2018 ), rationality is defined in the normative sense, used in cognitive science, as the distance between a person’s response and some normative standard of optimal behavior. As such, degree of rationality falls on a continuous scale, not a categorical one.

Despite disagreements surrounding the conceptual definitions of intelligence, critical thinking, and rationality, a commonality in these terms is they are value-laden and normative. In the case of intelligence, people are judged based on norms from standardized intelligence tests, especially in academic settings. Although scores on CT tests seldom are, nor could be, used to judge individuals in this way, the normative and value-laden basis of CT is apparent in people’s informal judgements. They often judge others who have made poor decisions to be irrational or to have failed to think critically.

This value-laden aspect of CT is also apparent in formal definitions of CT. Halpern and Dunn ( 2021 ) defined critical thinking as “the use of those cognitive skills or strategies that increase the probability of a desirable outcome. It is used to describe thinking that is purposeful, reasoned, and goal-directed.” The positive conception of CT as helping a person adapt well to one’s environment is clearly implied in “desirable outcome”.

Robert Ennis ( 1987 ) has offered a simpler, yet useful definition of critical thinking that also has normative implications. According to Ennis, “critical thinking is reasonable, reflective thinking focused on deciding what to believe or do” ( Ennis 1987, p. 102 ). This definition implies that CT helps people know what to believe (a goal of epistemic rationality) and how to act (a goal of instrumental rationality). This is conveyed by associating “critical thinking” with the positive terms, “reasonable” and “reflective”. Dictionaries commonly define “reasonable” as “rational”, “logical”, “intelligent”, and “good”, all terms with positive connotations.

For critical thinkers, being reasonable involves using logical rules, standards of evidence, and other criteria that must be met for a product of thinking to be considered good. Critical thinkers use these to evaluate how strongly reasons or evidence supports one claim versus another, drawing conclusions which are supported by the highest quality evidence ( Bensley 2018 ). If no high-quality evidence is available for consideration, it would be unreasonable to draw a strong conclusion. Unfortunately, people’s beliefs are too often based on acceptance of unsubstantiated claims. This is a failure of CT, but is it also a failure of intelligence?

3. Does Critical Thinking “Go Beyond” What Is Meant by Intelligence?

Despite the conceptual overlap in intelligence and CT at a general level, one way that CT can be distinguished from the common view of intelligence as general cognitive ability is in terms of what each can account for. Although intelligence tests, especially measures of general cognitive ability, have reliably predicted academic and job performance, they may not be sufficient to predict other everyday outcomes for which CT measures have made successful predictions and have added to the variance accounted for in performance. For instance, replicating a study by Butler ( 2012 ), Butler et al. ( 2017 ) obtained a negative correlation ( r = −0.33) between scores on the Halpern Critical Thinking Appraisal (HCTA) and a measure of 134 negative, real-world outcomes, not expected to befall critical thinkers, such as engaging in unprotected sex or posting a message on social media which the person regretted. They found that higher HCTA scores not only predicted better life decisions, but also predicted better performance beyond a measure of general cognitive ability. These results suggest that CT can account for real-world outcomes and goes beyond general cognitive ability to account for additional variance.

Some theorists maintain that standardized intelligence tests do not capture the variety of abilities that people need to adapt well in the real world. For example, Gardner ( 1999 ) has proposed that additional forms of intelligence are needed, such as spatial, musical, and interpersonal intelligences in addition to linguistic and logical–mathematical intelligences, more typically associated with general cognitive ability and academic success. In other theorizing, Sternberg ( 1988 ) has proposed three additional types of intelligence: analytical, practical, and creative intelligence, to more fully capture the variety of intelligent abilities on which people differ. Critical thinking is considered part of analytical skills which involve evaluating the quality and applicability of ideas, products, and options ( Sternberg 2022 ). Regarding adaptive intelligence, Sternberg ( 2019 ) has emphasized how adaptive aspects of intelligence are needed to solve real-world problems both at the individual and species levels. According to Sternberg, core components of intelligence have evolved in humans, but intelligence takes different forms in different cultures, with each culture valuing its own skills for adaptation. Thus, the construct of intelligence must go beyond core cognitive ability to encompass the specific abilities needed for adaptive behavior in specific cultures and settings.

Two other theories propose that other components be added to intelligent and rational thinking. Ackerman ( 2022 ) has emphasized the importance of acquiring domain-specific knowledge for engaging in intelligent functioning in the wide variety of tasks found in everyday life. Ackerman has argued that declarative, procedural, and tacit knowledge, as well as non-ability variables, are needed to better predict job performance and performance of other everyday activities. Taking another approach, Halpern and Dunn ( 2021 ) have proposed that critical thinking is essentially the adaptive application of intelligence for solving real-world problems. Elsewhere, Butler and Halpern ( 2019 ) have argued that dispositions such as open-mindedness are another aspect of CT and that domain-specific knowledge and specific CT skills are needed to solve real-world problems.

Examples are readily available for how CT goes beyond what IQ tests test to include specific rules for reasoning and relevant knowledge needed to execute real-world tasks. Take the example of scientific reasoning, which can be viewed as a specialized form of CT. Drawing a well-reasoned inductive conclusion about a theory or analyzing the quality of a research study both require that a thinker possess relevant specialized knowledge related to the question and specific reasoning skills for reasoning about scientific methodology. In contrast, IQ tests are deliberately designed to be nonspecialized in assessing Gc, broadly sampling vocabulary and general knowledge in order to be fair and unbiased ( Stanovich 2009 ). Specialized knowledge and reasoning skills are also needed in non-academic domains. Jurors must possess specialized knowledge to understand expert, forensic testimony and specific reasoning skills to interpret the law and make well-reasoned judgments about a defendant’s guilt or innocence.

Besides lacking specific reasoning skills and domain-relevant knowledge, people may fail to think critically because they are not disposed to use their reasoning skills to examine such claims and want to preserve their favored beliefs. Critical thinking dispositions are attitudes or traits that make it more likely that a person will think critically. Theorists have proposed numerous CT dispositions (e.g., Bensley 2018 ; Butler and Halpern 2019 ; Dwyer 2017 ; Ennis 1987 ). Some commonly identified CT dispositions especially relevant to this discussion are open-mindedness, skepticism, intellectual engagement, and the tendency to take a reflective, rational–analytic approach. Critical thinking dispositions are clearly value-laden and prescriptive. A good thinker should be open-minded, skeptical, reflective, intellectually engaged, and value a rational–analytic approach to inquiry. Conversely, corresponding negative dispositions, such as “close-mindedness” and “gullibility”, could obstruct CT.

Without the appropriate disposition, individuals will not use their reasoning skills to think critically about questions. For example, the brilliant mystery writer, Sir Arthur Conan Doyle, who was trained as a physician and created the hyper-reasonable detective Sherlock Holmes, was not disposed to think critically about some unsubstantiated claims. Conan Doyle was no doubt highly intelligent in cognitive ability terms, but he was not sufficiently skeptical (disposed to think critically) about spiritualism. He believed that he was talking to his dearly departed son through a medium, despite the warnings of his magician friend, Harry Houdini, who told him that mediums used trickery in their seances. Perhaps influenced by his Irish father’s belief in the “wee folk”, Conan Doyle also believed that fairies inhabited the English countryside, based on children’s photos, despite the advice of experts who said the photos could be faked. Nevertheless, he was skeptical of a new theory of tuberculosis proposed by Koch when he reported on it, despite his wife suffering from the disease. So, in professional capacities, Conan Doyle used his CT skills, but in certain other domains for which he was motivated to accept unsubstantiated claims, he failed to think critically, insufficiently disposed to skeptically challenge certain implausible claims.

This example makes two important points. Conan Doyle’s superior intelligence was not enough for him to reject implausible claims about the world. In general, motivated reasoning can lead people, even those considered highly intelligent, to accept claims with no good evidentiary support. The second important point is that we would not be able to adequately explain cases like this one, considering only the person’s intelligence or even their reasoning skills, without also considering the person’s disposition. General cognitive ability alone is not sufficient, and CT dispositions should also be considered.

Supporting this conclusion, Stanovich and West ( 1997 ) examined the influence of dispositions beyond the contribution of cognitive ability on a CT task. They gave college students an argument evaluation test in which participants first rated their agreement with several claims about real social and political issues made by a fictitious person. Then, they gave them evidence against each claim and finally asked them to rate the quality of a counterargument made by the same fictitious person. Participants’ ratings of the counterarguments were compared to the median ratings of expert judges on the quality of the rebuttals. Stanovich and West also administered a new measure of rational disposition called the Actively Open-minded Thinking (AOT) scale and the SAT as a proxy for cognitive ability. The AOT was a composite of items from several other scales that would be expected to measure CT disposition. They found that both SAT and AOT scores were significant predictors of higher argument analysis scores. Even after partialing out cognitive ability, actively open-minded thinking was significant. These results suggest that general cognitive ability alone was not sufficient to account for thinking critically about real-world issues and that CT disposition was needed to go beyond it.
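The logic of “partialing out” cognitive ability can be sketched with simulated data. In this hedged illustration (the variable names, effect sizes, and sample size are hypothetical, not values from Stanovich and West), the disposition and criterion measures are each residualized against ability, and the correlation of the residuals estimates the disposition’s unique contribution:

```python
import numpy as np

def partial_corr(x, y, z):
    """Correlation between x and y after removing the linear
    influence of z from both (the 'partialing out' step)."""
    rx = x - np.polyval(np.polyfit(z, x, 1), z)  # residuals of x on z
    ry = y - np.polyval(np.polyfit(z, y, 1), z)  # residuals of y on z
    return np.corrcoef(rx, ry)[0, 1]

rng = np.random.default_rng(0)
n = 500
ability = rng.normal(size=n)   # stand-in for SAT (cognitive ability)
aot = rng.normal(size=n)       # stand-in for AOT disposition
# Simulated argument-evaluation score driven by BOTH predictors
score = 0.5 * ability + 0.5 * aot + rng.normal(scale=0.5, size=n)

r_raw = np.corrcoef(score, aot)[0, 1]
r_partial = partial_corr(score, aot, ability)
# r_partial remains substantially positive: the disposition predicts
# argument evaluation even with ability statistically held constant
```

Here `aot` and `ability` are independent by construction; with correlated predictors the partial correlation would shrink relative to the raw one.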

Further examining the roles of CT dispositions and cognitive ability on reasoning, Stanovich and West ( 2008 ) studied myside bias, a bias in reasoning closely related to one-sided thinking and confirmation bias. A critical thinker would be expected to not show myside bias and instead fairly evaluate evidence on all sides of a question. Stanovich and West ( 2007 ) found that college students often showed myside bias when asked their opinions about real-world policy issues, such as those concerning the health risks of smoking and drinking alcohol. For example, compared to non-smokers, smokers judged the health risks of smoking to be lower. When they divided participants into higher versus lower cognitive ability groups based on SAT scores, the two groups showed little difference on myside bias. Moreover, on the hazards of drinking issue, participants who drank less had higher scores on the CT disposition measure.

Other research supports the need for both reasoning ability and CT disposition in predicting outcomes in the real world. Ren et al. ( 2020 ) found that CT disposition, as measured by a Chinese critical thinking disposition inventory, and a CT skill measure together contributed a significant amount of the variance in predicting academic performance beyond the contribution of cognitive ability alone, as measured by a test of fluid intelligence. Further supporting the claim that CT requires both cognitive ability and CT disposition, Ku and Ho ( 2010 ) found that a CT disposition measure significantly predicted scores on a CT test beyond the significant contribution of verbal intelligence in high school and college students from Hong Kong.
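The “variance beyond cognitive ability” claims in these studies rest on hierarchical regression: fit an ability-only model, add the CT measure, and inspect the gain in R² (ΔR²). A minimal sketch with simulated data follows (variable names, effect sizes, and sample size are hypothetical, not values from Ren et al.):

```python
import numpy as np

def r_squared(X, y):
    """R^2 from an ordinary least-squares fit of y on the columns of X."""
    X1 = np.column_stack([np.ones(len(y)), X])      # prepend an intercept
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

rng = np.random.default_rng(1)
n = 400
gf = rng.normal(size=n)    # stand-in for fluid intelligence
ct = rng.normal(size=n)    # stand-in for CT skill/disposition composite
gpa = 0.6 * gf + 0.4 * ct + rng.normal(scale=0.7, size=n)  # simulated outcome

r2_step1 = r_squared(gf, gpa)                          # Step 1: ability only
r2_step2 = r_squared(np.column_stack([gf, ct]), gpa)   # Step 2: ability + CT
delta_r2 = r2_step2 - r2_step1  # variance accounted for beyond ability
```

A nonzero ΔR² at Step 2 is what licenses the claim that the CT measures predict performance beyond cognitive ability alone.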

The contribution of dispositions to thinking is related to another way that CT goes beyond the application of general cognitive ability, i.e., by way of the motivation for reasoning. Assuming that all reasoning is motivated ( Kunda 1990 ), then CT is motivated, too, which is implicit within the Halpern and Dunn ( 2021 ) and Ennis ( 1987 ) definitions. Critical thinking is motivated in the sense of being purposeful and directed towards the goal of arriving at an accurate conclusion. For instance, corresponding to pursuit of the goal of accurate reasoning, the CT disposition of “truth-seeking” guides a person towards reaching the CT goal of arriving at an accurate conclusion.

Also, according to Kunda ( 1990 ), a second type of motivated reasoning can lead to faulty conclusions, often by directing a person towards the goal of maintaining favored beliefs and preconceptions, as in illusory correlation, belief perseverance, and confirmation bias. Corresponding to this second type, negative dispositions, such as close-mindedness and self-serving motives, can incline thinkers towards faulty conclusions. This is especially relevant in the present discussion because poorer reasoning, thinking errors, and the inappropriate use of heuristics are related to the endorsement of unsubstantiated claims, all of which are CT failures. The term “thinking errors” is a generic term referring to logical fallacies, informal reasoning fallacies, argumentation errors, and inappropriate uses of cognitive heuristics ( Bensley 2018 ). Heuristics are cognitive shortcuts, commonly used to simplify judgment tasks and reduce mental effort. Yet, when used inappropriately, heuristics often result in biased judgments.

Stanovich ( 2009 ) has argued that IQ tests do not test people’s use of heuristics, but the inappropriate use of heuristics has been found to be negatively correlated with CT performance ( West et al. 2008 ). In the same study, West et al. found that college students’ cognitive ability, as measured by performance on the SAT, was not correlated with thinking biases associated with the use of heuristics. Although Stanovich and West ( 2008 ) found that susceptibility to biases such as the conjunction fallacy, framing effect, base-rate neglect, affect bias, and myside bias was uncorrelated with cognitive ability (using SAT as a proxy), other types of thinking errors were correlated with SAT.

Likewise, two types of knowledge are related to the two forms of motivated reasoning. For instance, inaccurate knowledge, such as misconceptions, can derail reasoning from moving towards a correct conclusion, as when a person reasons from false premises. In contrast, reasoning from accurate knowledge is more likely to produce an accurate conclusion. Taking into account inaccurate knowledge and thinking errors is important to understanding the endorsement of unsubstantiated claims because these are also related to negative dispositions, such as close-mindedness and cynicism, none of which are measured by intelligence tests.

Critical thinking questions are often situated in real-world examples or in simulations of them which are designed to detect thinking errors and bias. As described in Halpern and Butler ( 2018 ), an item like one on the “Halpern Critical Thinking Assessment” (HCTA) provides respondents with a mock newspaper story about research showing that first-graders who attended preschool were better able to learn how to read. Then the question asks if preschool should be made mandatory. A correct response to this item requires recognizing that correlation does not imply causation, that is, avoiding a common reasoning error people make in thinking about research implications in everyday life. Another CT skills test, “Analyzing Psychological Statements” (APS), assesses the ability to recognize thinking errors and apply argumentation skills and psychology to evaluate psychology-related examples and simulations of real-life situations ( Bensley 2021 ). For instance, besides identifying thinking errors in brief samples of thinking, questions ask respondents to distinguish arguments from non-arguments, find assumptions in arguments, evaluate kinds of evidence, and draw a conclusion from a brief psychological argument. An important implication of the studies just reviewed is that efforts to understand CT can be further informed by assessing thinking errors and biases, which, as the next discussion shows, are related to individual differences in thinking dispositions and cognitive style.

4. Dual-Process Theory Measures and Unsubstantiated Beliefs

Dual-process theory (DPT) and measures associated with it have been widely used in the study of the endorsement of unsubstantiated beliefs, especially as they relate to cognitive style. According to a cognitive style version of DPT, people have two modes of processing, a fast intuitive–experiential (I-E) style of processing and a slower, reflective, rational–analytic (R-A) style of processing. The intuitive cognitive style is associated with reliance on hunches, feelings, personal experience, and cognitive heuristics which simplify processing, while the R-A cognitive style is a reflective, rational–analytic style associated with more elaborate and effortful processing ( Bensley et al. 2022 ; Epstein 2008 ). As such, the rational–analytic cognitive style is consistent with CT dispositions, such as those promoting the effortful analysis of evidence, objective truth, and logical consistency. In fact, CT is sometimes referred to as “critical-analytic” thinking ( Byrnes and Dunbar 2014 ) and has been associated with analytical intelligence ( Sternberg 1988 ) and with rational thinking, as discussed before.

People use both modes of processing, but they show individual differences in which mode they tend to rely upon, although the intuitive–experiential mode is the default ( Bensley et al. 2022 ; Morgan 2016 ; Pacini and Epstein 1999 ), and they accept unsubstantiated claims differentially based on their predominant cognitive style ( Bensley et al. 2022 ; Epstein 2008 ). Specifically, individuals who rely more on an I-E cognitive style tend to endorse unsubstantiated claims more strongly, while individuals who rely more on a R-A cognitive style tend to endorse those claims less. Note, however, that other theorists view the two processes and cognitive styles somewhat differently (e.g., Kahneman 2011 ; Stanovich et al. 2018 ).

Researchers have often assessed the contribution of these two cognitive styles to endorsement of unsubstantiated claims, using variants of three measures: the Cognitive Reflection Test (CRT) of Frederick ( 2005 ), the Rational–Experiential Inventory of Epstein and his colleagues ( Pacini and Epstein 1999 ), and the related Need for Cognition scale of Cacioppo and Petty ( 1982 ). The CRT is a performance-based test which asks participants to solve problems that appear to require simple mathematical calculations, but which actually require more reflection. People typically do poorly on the CRT, which is thought to indicate reliance on an intuitive cognitive style, while better performance is thought to indicate reliance on the slower, more deliberate, and reflective cognitive style. The positive correlation of the CRT with numeracy scores suggests it also has a cognitive skill component ( Patel et al. 2019 ). The Rational–Experiential Inventory (REI) of Pacini and Epstein ( 1999 ) contains one scale designed to measure an intuitive–experiential cognitive style and a second scale intended to measure a rational–analytic (R-A) style. The R-A scale was adapted from the Need for Cognition (NFC) scale of Cacioppo and Petty ( 1982 ), another scale associated with rational–analytic thinking and expected to be negatively correlated with unsubstantiated beliefs. The NFC was found to be related to open-mindedness and intellectual engagement, two CT dispositions ( Cacioppo et al. 1996 ).

The cognitive styles associated with DPT also relate to CT dispositions. Thinking critically requires that individuals be disposed to use their reasoning skills to reject unsubstantiated claims ( Bensley 2018 ) and that they be inclined to take a rational–analytic approach rather than relying on their intuitions and feelings. For instance, Bensley et al. ( 2014 ) found that students who endorsed more psychological misconceptions adopted a more intuitive cognitive style, were less disposed to take a rational–scientific approach to psychology, and scored lower on a psychological critical thinking skills test. Further supporting this connection, West et al. ( 2008 ) found that participants who tended to use cognitive heuristics more, thought to be related to intuitive processing and bias, scored lower on a critical thinking measure. As the Bensley et al. ( 2014 ) results suggest, in addition to assessing reasoning skills and dispositions, comprehensive CT assessment research should assess knowledge and unsubstantiated beliefs because these are related to failures of critical thinking.

5. Assessing Critical Thinking and Unsubstantiated Beliefs

Assessing endorsement of unsubstantiated claims provides another way to assess CT outcomes related to everyday thinking, which goes beyond what intelligence tests test ( Bensley and Lilienfeld 2020 ). From the perspective of the multi-dimensional model of CT, endorsement of unsubstantiated claims could result from deficiencies in a person’s CT reasoning skills, a lack of relevant knowledge, and the engagement of inappropriate dispositions. Suppose an individual endorses an unsubstantiated claim, such as believing the conspiracy theory that human-caused global warming is a hoax. The person may lack the specific reasoning skills needed to critically evaluate the conspiracy; Lantian et al. ( 2020 ) found that scores on a CT skills test were negatively correlated with conspiracy theory beliefs. The person may also lack relevant scientific knowledge, such as the facts that each year humans pump about 40 billion metric tons of carbon dioxide into the atmosphere and that carbon dioxide is a greenhouse gas which traps heat in the atmosphere. Or, the person may not be scientifically skeptical, or may be too cynical or mistrustful of scientists or governmental officials.

Although endorsing unsubstantiated beliefs is clearly a failure of CT, problems arise in deciding which ones are unsubstantiated, especially when considering conspiracy theories. Typically, the claims which critical thinkers should reject as unsubstantiated are those which are not supported by objective evidence. But of the many conspiracies proposed, few are vigorously examined. Moreover, some conspiracy theories which authorities might initially deny turn out to be real, such as the MK-Ultra theory that the CIA was secretly conducting mind-control research on American citizens.

A way out of this quagmire is to define unsubstantiated beliefs on a continuum which depends on the quality of evidence. This has led to the definition of unsubstantiated claims as assertions which have not been supported by high-quality evidence ( Bensley 2023 ). Those which are supported have the kind of evidentiary support that critical thinkers are expected to value in drawing reasonable conclusions. Instead of insisting that a claim must be demonstrably false to be rejected, we adopt a more tentative acceptance or rejection of claims, based on how much good evidence supports them. Many claims are unsubstantiated because they have not yet been carefully examined and so totally lack support, or they may be supported only by low-quality evidence such as personal experience, anecdotes, or non-scientific authority. Other claims are more clearly unsubstantiated because they contradict the findings of high-quality research. A critical thinker should be highly skeptical of these.

Psychological misconceptions are one type of claim that can be more clearly unsubstantiated. Psychological misconceptions are commonsense psychological claims (folk theories) about the mind, brain, and behavior that are contradicted by the bulk of high-quality scientific research. Bensley et al. ( 2014 ) developed the Test of Psychological Knowledge and Misconceptions (TOPKAM), a 40-item, forced-choice measure in which each item pairs a statement of a psychological misconception with a response option stating the evidence-based alternative. They found that higher scores on the APS, the argument analysis test applying psychological concepts to analyze real-world examples, were associated with more correct answers on the TOPKAM. Other studies have found positive correlations between CT skills tests and other measures of psychological misconceptions ( McCutcheon et al. 1992 ; Kowalski and Taylor 2004 ). Bensley et al. ( 2014 ) also found that higher correct TOPKAM scores were positively correlated with scores on the Inventory of Thinking Dispositions in Psychology (ITDP) of Bensley ( 2021 ), a measure of the disposition to take a rational and scientific approach to psychology, but were negatively correlated with an intuitive cognitive style.

Bensley et al. ( 2021 ) conducted a multidimensional study, assessing beginner psychology students starting a CT course on their endorsement of psychological misconceptions, recognition of thinking errors, CT dispositions, and metacognition, before and after CT instruction. Two classes received explicit instruction involving considerable practice in argument analysis and scientific reasoning skills, with one class receiving CT instruction focused more on recognizing psychological misconceptions and a second class focused more on recognizing various thinking errors. Bensley et al. assessed both classes before and after instruction on the TOPKAM and on the Test of Thinking Errors (TOTE), a test of the ability to recognize in real-world examples 17 different types of thinking errors, such as confirmation bias, inappropriate use of the availability and representativeness heuristics, reasoning from ignorance/possibility, gambler’s fallacy, and hasty generalization ( Bensley et al. 2021 ). Correct TOPKAM and TOTE scores were positively correlated, and after CT instruction both were positively correlated with the APS, the CT test of argument analysis skills.

Bensley et al. found that after explicit instruction of CT skills, students improved significantly on both the TOPKAM and the TOTE, with those focusing on recognizing misconceptions improving the most. Also, those students who improved the most on the TOPKAM scored higher on the REI rational–analytic scale and on the ITDP, while those improving the most on the TOTE scored higher on the ITDP. The students receiving explicit CT skill instruction in recognizing misconceptions also significantly improved the accuracy of their metacognitive monitoring in estimating their TOPKAM scores after instruction.

Given that the two classes did not differ in GPA or on the SAT, a proxy for general cognitive ability, before instruction, CT instruction accounted well for the improvement in recognition of thinking errors and misconceptions without recourse to intelligence. However, SAT scores were positively correlated with both TOTE scores and APS scores, suggesting that cognitive ability contributed to CT skill performance. These results replicated the earlier findings of Bensley and Spero ( 2014 ) showing that explicit CT instruction improved performance on both CT skills tests and metacognitive monitoring accuracy while controlling for SAT, which was positively correlated with CT skills test performance.

Taken together, these findings suggest that cognitive ability contributes to performance on CT tasks but that CT instruction goes beyond it to further improve performance. As the results of Bensley et al. ( 2021 ) show, and as discussed next, thinking errors and bias from heuristics are CT failures that should also be assessed because they are related to endorsement of unsubstantiated beliefs and cognitive style.

6. Dual-Processing Theory and Research on Unsubstantiated Beliefs

Consistent with DPT, numerous other studies have obtained significant positive correlations between intuitive cognitive style and paranormal belief, often using the REI intuitive–experiential scale and the Revised Paranormal Belief Scale (RPBS) of Tobacyk ( 2004 ) (e.g., Genovese 2005 ; Irwin and Young 2002 ; Lindeman and Aarnio 2006 ; Pennycook et al. 2015 ; Rogers et al. 2018 ; Saher and Lindeman 2005 ). Studies have also found positive correlations between superstitious belief and intuitive cognitive style (e.g., Lindeman and Aarnio 2006 ; Maqsood et al. 2018 ). REI intuitive–experiential thinking style was also positively correlated with belief in complementary and alternative medicine ( Lindeman 2011 ), conspiracy theory belief ( Alper et al. 2020 ), and with endorsement of psychological misconceptions ( Bensley et al. 2014 ; Bensley et al. 2022 ).

Additional evidence for DPT has been found when REI R-A and NFC scores were negatively correlated with scores on measures of unsubstantiated beliefs, but studies correlating them with measures of paranormal belief and conspiracy theory belief have shown mixed results. Supporting a relationship, REI rational–analytic and NFC scores significantly and negatively predicted paranormal belief ( Lobato et al. 2014 ; Pennycook et al. 2012 ). Other studies have also obtained a negative correlation between NFC and paranormal belief ( Lindeman and Aarnio 2006 ; Rogers et al. 2018 ; Stahl and van Prooijen 2018 ), but both Genovese ( 2005 ) and Pennycook et al. ( 2015 ) found that NFC was not significantly correlated with paranormal belief. Swami et al. ( 2014 ) found that although REI R-A scores were negatively correlated with conspiracy theory belief, NFC scores were not.

Researchers often refer to people who are doubtful of paranormal and other unfounded claims as “skeptics” and so have tested whether measures related to skepticism are associated with less endorsement of unsubstantiated claims. They typically view skepticism as a stance towards unsubstantiated claims taken by rational people who reject them (e.g., Lindeman and Aarnio 2006 ; Stahl and van Prooijen 2018 ), rather than as a disposition inclining a person to think critically about unsubstantiated beliefs ( Bensley 2018 ).

Fasce and Pico ( 2019 ) conducted one of the few studies using a measure related to skeptical disposition, the Critical Thinking Disposition Scale (CTDS) of Sosu ( 2013 ), in relation to endorsement of unsubstantiated claims. They found that scores on the CTDS were negatively correlated with scores on the RPBS but not significantly correlated with either a measure of pseudoscience or of conspiracy theory belief. However, the CRT was negatively correlated with both the RPBS and the pseudoscience measure. Because Fasce and Pico ( 2019 ) did not examine correlations with the Reflective Skepticism subscale of the CTDS, its contribution apart from the full-scale CTDS could not be determined.

To more directly test skepticism as a disposition, we recently assessed college students on how well three new measures predicted endorsement of psychological misconceptions, paranormal claims, and conspiracy theories ( Bensley et al. 2022 ). The dispositional measures included a measure of general skeptical attitude; a second measure, the Scientific Skepticism Scale (SSS), which focused more on waiting to accept claims until high-quality scientific evidence supported them; and a third measure, the Cynicism Scale (CS), which focused on doubting the sincerity of the motives of scientists and people in general. We found that although the general skepticism scale did not predict any of the unsubstantiated belief measures, SSS scores were a significant negative predictor of both paranormal belief and conspiracy theory belief. REI R-A scores were a less consistent negative predictor, while REI I-E scores were more consistent positive predictors, and, surprisingly, CS scores were the most consistent positive predictors of the unsubstantiated beliefs.

Researchers commonly assume that people who accept implausible, unsubstantiated claims are gullible or not sufficiently skeptical. For instance, van Prooijen ( 2019 ) has argued that conspiracy theory believers are more gullible (less skeptical) than non-believers and tend to accept unsubstantiated claims more than less gullible people. van Prooijen ( 2019 ) reviewed several studies supporting the claim that people who are more gullible tend to endorse conspiracy theories more. However, he did not report any studies in which a gullible disposition was directly measured.

Recently, we directly tested the gullibility hypothesis in relation to scientific skepticism ( Bensley et al. 2023 ) using the Gullibility Scale of Teunisse et al. ( 2019 ), on which people skeptical of the paranormal had been shown to have lower scores. We found that scores on the Gullibility Scale and the Cynicism Scale were positively correlated, and both were significant positive predictors of unsubstantiated beliefs, in general, consistent with an intuitive–experiential cognitive style. In contrast, we found that scores on the Cognitive Reflection Test, the Scientific Skepticism Scale, and the REI rational–analytic scale were all positively intercorrelated and significant negative predictors of unsubstantiated beliefs, in general, consistent with a rational–analytic/reflective cognitive style. Scientific skepticism scores negatively predicted general endorsement of unsubstantiated claims beyond the REI R-A scale, but neither the CTDS nor the CTDS Reflective Skepticism subscale was significant. These results replicated findings from the Bensley et al. ( 2022 ) study and supported an elaborated dual-process model of unsubstantiated belief. The SSS was not only a substantial negative predictor, it was also negatively correlated with the Gullibility Scale, as expected.

These results suggest that both CT-related dispositions and CT skills are related to endorsement of unsubstantiated beliefs. However, a measure of general cognitive ability or intelligence must be examined along with measures of CT and unsubstantiated beliefs to determine if CT goes beyond intelligence to predict unsubstantiated beliefs. In one of the few studies that also included a measure of cognitive ability, Stahl and van Prooijen ( 2018 ) found that dispositional characteristics helped account for acceptance of conspiracies and paranormal belief beyond cognitive ability. Using the Importance of Rationality Scale (IRS), a rational–analytic scale designed to measure skepticism towards unsubstantiated beliefs, Stahl and van Prooijen ( 2018 ) found that the IRS was negatively correlated with paranormal belief and belief in conspiracy theories. In separate hierarchical regressions, cognitive ability was the strongest negative predictor of both paranormal belief and of conspiracy belief, but IRS scores in combination with cognitive ability negatively predicted endorsement of paranormal belief but did not significantly predict conspiracy theory belief. These results provided partial support that a measure of rational–analytic cognitive style related to skeptical disposition added to the variance accounted for beyond cognitive ability in negatively predicting unsubstantiated belief.

In another study that included a measure of cognitive ability, Cavojova et al. ( 2019 ) examined how CT-related dispositions and the Scientific Reasoning Scale (SRS) were related to a measure of paranormal, pseudoscientific, and conspiracy theory beliefs. The SRS of Drummond and Fischhoff ( 2017 ) likely measures CT skill in that it assesses the ability to evaluate scientific research and evidence. As expected, the unsubstantiated belief measure was negatively correlated with both the SRS and a cognitive ability measure similar to Raven’s Progressive Matrices. Unsubstantiated beliefs were positively correlated with dogmatism (the opposite of open-mindedness) but not with REI rational–analytic cognitive style. The SRS was a significant negative predictor of both unsubstantiated belief and susceptibility to bias beyond the contribution of cognitive ability, but neither dogmatism nor analytic thinking was a significant predictor. This study therefore provides some support that a measure related to CT reasoning skill accounts for variance in unsubstantiated belief beyond cognitive ability.

The failure of this study to find a correlation between rational–analytic cognitive style and unsubstantiated beliefs, when other studies have found significant correlations with it and related measures, has implications for the multidimensional assessment of unsubstantiated beliefs. One implication is that the REI rational–analytic scale may not be a strong predictor of unsubstantiated beliefs. In fact, we have recently found that the Scientific Skepticism Scale was a stronger negative predictor ( Bensley et al. 2022 ; Bensley et al. 2023 ), which suggests that other measures related to rational–analytic thinking styles should also be examined. This could help triangulate the contribution of self-report cognitive style measures to endorsement of unsubstantiated claims, recognizing that the use of self-report measures has a checkered history in psychological research. A second implication is that, once again, measures of critical thinking skill and cognitive ability were negative predictors of unsubstantiated belief, and so they, too, should be included in future assessments of unsubstantiated beliefs.

7. Discussion

This review provided several lines of evidence supporting the claim that CT goes beyond cognitive ability in accounting for certain real-world outcomes. Participants who thought critically reported fewer of the everyday problems that critical thinking would be expected to help them avoid. People who endorsed unsubstantiated claims less showed better CT skills, more accurate domain-specific knowledge, and less susceptibility to thinking errors and bias, and were more disposed to think critically. More specifically, they tended to be more scientifically skeptical and to adopt a more rational–analytic cognitive style. In contrast, those who endorsed such claims more tended to be more cynical and to adopt an intuitive–experiential cognitive style. These characteristics go beyond what standardized intelligence tests measure. In some studies, the CT measures accounted for additional variance beyond that contributed by general cognitive ability.

That is not to say that measures of general cognitive ability are not useful. As noted by Gottfredson ( 2004 ), “g” is a highly successful predictor of academic and job performance, and more is known about g and Gf than about many other psychological constructs. On average, g is closely related to Gf, which is highly correlated with working memory (around r = 0.70, and as high as r = 0.77, or r 2 ≈ 0.60, based on a correlated two-factor model; Gignac 2014 ). Because modern working memory theory is itself a powerful theory ( Chai et al. 2018 ), this lends construct validity to the fluid intelligence construct. Although cognitive scientists have clearly made progress in understanding the executive processes underlying intelligence, they have not yet identified the specific cognitive components of intelligence ( Sternberg 2022 ). Moreover, theorists have acknowledged that intelligence must also include components beyond g, such as domain-specific knowledge ( Ackerman 2022 ; Conway and Kovacs 2018 ), which are not yet clearly understood.
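
The step from a correlation to shared variance in the passage above is simply squaring r; a correlation of r = 0.77 between Gf and working memory corresponds to roughly 60% shared variance:

```python
# r^2 as proportion of shared variance: 0.77^2 = 0.5929, i.e., close to 60%.
r = 0.77
shared_variance = r ** 2
print(f"r = {r}, shared variance = {shared_variance:.2f}")  # prints: r = 0.77, shared variance = 0.59
```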

This review also pointed to limitations in the research that should be addressed. So far, not only have few studies of unsubstantiated beliefs included measures of intelligence, but those that did have often used proxies for intelligence test scores, such as SAT scores. Future studies, besides using more and better measures of intelligence, could benefit from including more specifically focused measures, such as measures of Gf and Gc. More research should also be carried out to develop additional high-quality measures of CT, including ones that assess the specific reasoning skills and knowledge relevant to thinking about a subject, which could help resolve perennial questions about the domain-general versus domain-specific nature of intelligence and CT. Overall, the results of this review encourage taking a multidimensional approach to investigating the complex constructs of intelligence, CT, and unsubstantiated belief. Supporting these recommendations were studies in which the improvement accrued from explicit CT skill instruction could be more fully understood when CT skills, relevant knowledge, CT dispositions, metacognitive monitoring accuracy, and a proxy for intelligence were all measured.

8. Conclusions

Critical thinking, broadly conceived, offers ways to understand real-world outcomes of thinking beyond what general cognitive ability can provide and intelligence tests test. A multidimensional view of CT, which includes specific reasoning and metacognitive skills, CT dispositions, and relevant knowledge, can add to our understanding of why some people endorse unsubstantiated claims more than others do. Although general cognitive ability and domain-general knowledge often contribute to performance on CT tasks, thinking critically about real-world questions also involves applying rules, criteria, and knowledge specific to the question under consideration, as well as the appropriate dispositions and cognitive styles for deploying them.

Despite the advantages of this multidimensional approach to CT in helping us more fully understand everyday thinking and irrationality, it presents challenges for researchers and instructors. It implies the need to assess and instruct multidimensionally, addressing not only reasoning skills but also thinking errors and biases, dispositions, the knowledge relevant to a task, and the accuracy of metacognitive judgments. As noted by Dwyer ( 2023 ), a more complex conceptualization of CT beyond just skills is needed, but it presents challenges for those seeking to improve students’ CT. Nevertheless, the research reviewed suggests that taking this multidimensional approach can enhance our understanding of the endorsement of unsubstantiated claims beyond what standardized intelligence tests contribute. More research is needed to resolve remaining controversies and to develop evidence-based applications of the findings.

Funding Statement

This research received no external funding.

Institutional Review Board Statement

This research involved no new testing of participants and hence did not require Institutional Review Board approval.

Informed Consent Statement

This research involved no new testing of participants and hence did not require an Informed Consent Statement.

Data Availability Statement

Conflicts of Interest

The author declares no conflict of interest.

Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

  • Ackerman Phillip L. Intelligence … Moving beyond the lowest common denominator. American Psychologist. 2022; 78 :283–97. doi: 10.1037/amp0001057.
  • Alper Sinan, Bayrak Fatih, Yilmaz Onurcan. Psychological correlates of COVID-19 conspiracy beliefs and preventive measures: Evidence from Turkey. Current Psychology. 2020; 40 :5708–17. doi: 10.1007/s12144-020-00903-0.
  • Bensley D. Alan. Critical Thinking in Psychology and Everyday Life: A Guide to Effective Thinking. Worth Publishers; New York: 2018.
  • Bensley D. Alan. The Critical Thinking in Psychology Assessment Battery (CTPAB) and Test Guide. 2021. Unpublished manuscript. Frostburg State University, Frostburg, MD, USA.
  • Bensley D. Alan. “I can’t believe you believe that”: Identifying unsubstantiated claims. Skeptical Inquirer. 2023; 47 :53–56.
  • Bensley D. Alan, Spero Rachel A. Improving critical thinking skills and metacognitive monitoring through direct infusion. Thinking Skills and Creativity. 2014; 12 :55–68. doi: 10.1016/j.tsc.2014.02.001.
  • Bensley D. Alan, Lilienfeld Scott O. Assessment of Unsubstantiated Beliefs. Scholarship of Teaching and Learning in Psychology. 2020; 6 :198–211. doi: 10.1037/stl0000218.
  • Bensley D. Alan, Masciocchi Christopher M., Rowan Krystal A. A comprehensive assessment of explicit critical thinking instruction on recognition of thinking errors and psychological misconceptions. Scholarship of Teaching and Learning in Psychology. 2021; 7 :107. doi: 10.1037/stl0000188.
  • Bensley D. Alan, Watkins Cody, Lilienfeld Scott O., Masciocchi Christopher, Murtagh Michael, Rowan Krystal. Skepticism, cynicism, and cognitive style predictors of the generality of unsubstantiated belief. Applied Cognitive Psychology. 2022; 36 :83–99. doi: 10.1002/acp.3900.
  • Bensley D. Alan, Rodrigo Maria, Bravo Maria, Jocoy Kathleen. Dual-Process Theory and Cognitive Style Predictors of the General Endorsement of Unsubstantiated Claims. 2023. Unpublished manuscript. Frostburg State University, Frostburg, MD, USA.
  • Bensley D. Alan, Lilienfeld Scott O., Powell Lauren. A new measure of psychological misconceptions: Relations with academic background, critical thinking, and acceptance of paranormal and pseudoscientific claims. Learning and Individual Differences. 2014; 36 :9–18. doi: 10.1016/j.lindif.2014.07.009.
  • Bierwiaczonek Kinga, Kunst Jonas R., Pich Olivia. Belief in COVID-19 conspiracy theories reduces social distancing over time. Applied Psychology: Health and Well-Being. 2020; 12 :1270–85. doi: 10.1111/aphw.12223.
  • Butler Heather A. Halpern Critical Thinking Assessment predicts real-world outcomes of critical thinking. Applied Cognitive Psychology. 2012; 26 :721–29. doi: 10.1002/acp.2851.
  • Butler Heather A., Halpern Diane F. Is critical thinking a better model of intelligence? In: Sternberg Robert J., editor. The Nature of Human Intelligence. Cambridge University Press; Cambridge: 2019. pp. 183–96.
  • Butler Heather A., Pentoney Christopher, Bong Maebelle P. Predicting real-world outcomes: Critical thinking ability is a better predictor of life decisions than intelligence. Thinking Skills and Creativity. 2017; 25 :38–46. doi: 10.1016/j.tsc.2017.06.005.
  • Byrnes James P., Dunbar Kevin N. The nature and development of critical-analytic thinking. Educational Psychology Review. 2014; 26 :477–93. doi: 10.1007/s10648-014-9284-0.
  • Cacioppo John T., Petty Richard E. The need for cognition. Journal of Personality and Social Psychology. 1982; 42 :116–31. doi: 10.1037/0022-3514.42.1.116.
  • Cacioppo John T., Petty Richard E., Feinstein Jeffrey A., Jarvis W. Blair G. Dispositional differences in cognitive motivation: The life and times of individuals varying in need for cognition. Psychological Bulletin. 1996; 119 :197. doi: 10.1037/0033-2909.119.2.197.
  • Cavojova Vladimira, Srol Jakub, Jurkovic Marek. Why should we think like scientists? Scientific reasoning and susceptibility to epistemically suspect beliefs and cognitive biases. Applied Cognitive Psychology. 2019; 34 :85–95. doi: 10.1002/acp.3595.
  • Chai Wen Jia, Abd Hamid Aini Ismafairus, Abdullah Jafri Malin. Working memory from the psychological and neuroscience perspective. Frontiers in Psychology. 2018; 9 :401. doi: 10.3389/fpsyg.2018.00401.
  • Conway Andrew R., Kovacs Kristof. The nature of the general factor of intelligence. In: Sternberg Robert J., editor. The Nature of Human Intelligence. Cambridge University Press; Cambridge: 2018. pp. 49–63.
  • Drummond Caitlin, Fischhoff Baruch. Development and validation of the Scientific Reasoning Scale. Journal of Behavioral Decision Making. 2017; 30 :26–38. doi: 10.1002/bdm.1906.
  • Dwyer Christopher P. Critical Thinking: Conceptual Perspectives and Practical Guidelines. Cambridge University Press; Cambridge: 2017.
  • Dwyer Christopher P. An evaluative review of barriers to critical thinking in educational and real-world settings. Journal of Intelligence. 2023; 11 :105. doi: 10.3390/jintelligence11060105.
  • Ennis Robert H. A taxonomy of critical thinking dispositions and abilities. In: Baron Joan, Sternberg Robert, editors. Teaching Thinking Skills: Theory and Practice. W. H. Freeman; New York: 1987.
  • Epstein Seymour. Intuition from the perspective of cognitive-experiential self-theory. In: Plessner Henning, Betsch Tilmann, editors. Intuition in Judgment and Decision Making. Erlbaum; Washington, DC: 2008. pp. 23–37.
  • Fasce Angelo, Pico Alfonso. Science as a vaccine: The relation between scientific literacy and unwarranted beliefs. Science & Education. 2019; 28 :109–25. doi: 10.1007/s11191-018-00022-0.
  • Frederick Shane. Cognitive reflection and decision making. Journal of Economic Perspectives. 2005; 19 :25–42. doi: 10.1257/089533005775196732.
  • Gardner Howard. Intelligence Reframed: Multiple Intelligences for the 21st Century. Basic Books; New York: 1999.
  • Genovese Jeremy E. C. Paranormal beliefs, schizotypy, and thinking styles among teachers and future teachers. Personality and Individual Differences. 2005; 39 :93–102. doi: 10.1016/j.paid.2004.12.008.
  • Gignac Gilles E. Fluid intelligence shares closer to 60% of its variance with working memory capacity and is a better indicator of general intelligence. Intelligence. 2014; 47 :122–33. doi: 10.1016/j.intell.2014.09.004.
  • Gottfredson Linda S. Life, death, and intelligence. Journal of Cognitive Education and Psychology. 2004; 4 :23–46. doi: 10.1891/194589504787382839.
  • Halpern Diane F., Dunn Dana. Critical thinking: A model of intelligence for solving real-world problems. Journal of Intelligence. 2021; 9 :22. doi: 10.3390/jintelligence9020022.
  • Halpern Diane F., Butler Heather A. Is critical thinking a better model of intelligence? In: Sternberg Robert J., editor. The Nature of Human Intelligence. Cambridge University Press; Cambridge: 2018. pp. 183–96.
  • Irwin Harvey J., Young J. M. Intuitive versus reflective processes in the formation of paranormal beliefs. European Journal of Parapsychology. 2002; 17 :45–55.
  • Jolley Daniel, Paterson Jenny L. Pylons ablaze: Examining the role of 5G COVID-19 conspiracy beliefs and support for violence. British Journal of Social Psychology. 2020; 59 :628–40. doi: 10.1111/bjso.12394.
  • Kahneman Daniel. Thinking, Fast and Slow. Farrar, Straus and Giroux; New York: 2011.
  • Kowalski Patricia, Taylor Annette J. Ability and critical thinking as predictors of change in students’ psychological misconceptions. Journal of Instructional Psychology. 2004; 31 :297–303.
  • Ku Kelly Y. L., Ho Irene T. Dispositional factors predicting Chinese students’ critical thinking performance. Personality and Individual Differences. 2010; 48 :54–58. doi: 10.1016/j.paid.2009.08.015.
  • Kunda Ziva. The case for motivated reasoning. Psychological Bulletin. 1990; 108 :480–98. doi: 10.1037/0033-2909.108.3.480.
  • Lantian Anthony, Bagneux Virginie, Delouvee Sylvain, Gauvrit Nicolas. Maybe a free thinker but not a critical one: High conspiracy belief is associated with low critical thinking ability. Applied Cognitive Psychology. 2020; 35 :674–84. doi: 10.1002/acp.3790.
  • Lilienfeld Scott O. Psychological treatments that cause harm. Perspectives on Psychological Science. 2007; 2 :53–70. doi: 10.1111/j.1745-6916.2007.00029.x.
  • Lindeman Marjaana. Biases in intuitive reasoning and belief in complementary and alternative medicine. Psychology and Health. 2011; 26 :371–82. doi: 10.1080/08870440903440707.
  • Lindeman Marjaana, Aarnio Kia. Paranormal beliefs: Their dimensionality and correlates. European Journal of Personality. 2006; 20 :585–602.
  • Lobato Emilio J., Mendoza Jorge, Sims Valerie, Chin Matthew. Explaining the relationship between conspiracy theories, paranormal beliefs, and pseudoscience acceptance among a university population. Applied Cognitive Psychology. 2014; 28 :617–25. doi: 10.1002/acp.3042.
  • Maqsood Alisha, Jamil Farhat, Khalid Ruhi. Thinking styles and belief in superstitions: Moderating role of gender in young adults. Pakistan Journal of Psychological Research. 2018; 33 :335–48.
  • McCutcheon Lynn E., Apperson Jennifer M., Hanson Esther, Wynn Vincent. Relationships among critical thinking skills, academic achievement, and misconceptions about psychology. Psychological Reports. 1992; 71 :635–39. doi: 10.2466/pr0.1992.71.2.635.
  • McGrew Kevin S. CHC theory and the human cognitive abilities project: Standing on the shoulders of the giants of psychometric intelligence research. Intelligence. 2009; 37 :1–10. doi: 10.1016/j.intell.2008.08.004.
  • Morgan Jonathan. Religion and dual-process cognition: A continuum of styles or distinct types? Religion, Brain & Behavior. 2016; 6 :112–29. doi: 10.1080/2153599X.2014.966315.
  • Nie Fanhao, Olson Daniel V. A. Demonic influence: The negative mental health effects of belief in demons. Journal for the Scientific Study of Religion. 2016; 55 :498–515. doi: 10.1111/jssr.12287.
  • Pacini Rosemary, Epstein Seymour. The relation of rational and experiential information processing styles to personality, basic beliefs, and the ratio-bias phenomenon. Journal of Personality and Social Psychology. 1999; 76 :972–87. doi: 10.1037/0022-3514.76.6.972.
  • Patel Niraj, Baker S. Glenn, Scherer Laura D. Evaluating the cognitive reflection test as a measure of intuition/reflection, numeracy, and insight problem solving, and the implications for understanding real-world judgments and beliefs. Journal of Experimental Psychology: General. 2019; 148 :2129–53. doi: 10.1037/xge0000592.
  • Pennycook Gordon, Cheyne James Allen, Barr Nathaniel, Koehler Derek J., Fugelsang Jonathan A. On the reception and detection of pseudo-profound bullshit. Judgment and Decision Making. 2015; 10 :549–63. doi: 10.1017/S1930297500006999.
  • Pennycook Gordon, Cheyne James Allen, Seli Paul, Koehler Derek J., Fugelsang Jonathan A. Analytic cognitive style predicts religious and paranormal belief. Cognition. 2012; 123 :335–46. doi: 10.1016/j.cognition.2012.03.003.
  • Ren Xuezhu, Tong Yan, Peng Peng, Wang Tengfei. Critical thinking predicts academic performance beyond cognitive ability: Evidence from adults and children. Intelligence. 2020; 82 :101487. doi: 10.1016/j.intell.2020.101487.
  • Rogers Paul, Fisk John E., Lowrie Emma. Paranormal belief, thinking style preference and susceptibility to confirmatory conjunction errors. Consciousness and Cognition. 2018; 65 :182–95. doi: 10.1016/j.concog.2018.07.013.
  • Saher Marieke, Lindeman Marjaana. Alternative medicine: A psychological perspective. Personality and Individual Differences. 2005; 39 :1169–78. doi: 10.1016/j.paid.2005.04.008.
  • Sosu Edward M. The development and psychometric validation of a Critical Thinking Disposition Scale. Thinking Skills and Creativity. 2013; 9 :107–19. doi: 10.1016/j.tsc.2012.09.002.
  • Stahl Tomas, van Prooijen Jan-Willem. Epistemic rationality: Skepticism toward unfounded beliefs requires sufficient cognitive ability and motivation to be rational. Personality and Individual Differences. 2018; 122 :155–63. doi: 10.1016/j.paid.2017.10.026.
  • Stanovich Keith E. What Intelligence Tests Miss: The Psychology of Rational Thought. Yale University Press; New Haven: 2009.
  • Stanovich Keith E., West Richard F. Reasoning independently of prior belief and individual differences in actively open-minded thinking. Journal of Educational Psychology. 1997; 89 :342–57. doi: 10.1037/0022-0663.89.2.342.
  • Stanovich Keith E., West Richard F. Natural myside bias is independent of cognitive ability. Thinking & Reasoning. 2007; 13 :225–47.
  • Stanovich Keith E., West Richard F. On the failure of cognitive ability to predict myside and one-sided thinking bias. Thinking & Reasoning. 2008; 14 :129–67. doi: 10.1080/13546780701679764.
  • Stanovich Keith E., West Richard F., Toplak Maggie E. The Rationality Quotient: Toward a Test of Rational Thinking. The MIT Press; Cambridge, MA: 2018.
  • Sternberg Robert J. The Triarchic Mind: A New Theory of Intelligence. Penguin Press; London: 1988.
  • Sternberg Robert J. A theory of adaptive intelligence and its relation to general intelligence. Journal of Intelligence. 2019; 7 :23. doi: 10.3390/jintelligence7040023.
  • Sternberg Robert J. The search for the elusive basic processes underlying human intelligence: Historical and contemporary perspectives. Journal of Intelligence. 2022; 10 :28. doi: 10.3390/jintelligence10020028.
  • Swami Viren, Voracek Martin, Stieger Stefan, Tran Ulrich S., Furnham Adrian. Analytic thinking reduces belief in conspiracy theories. Cognition. 2014; 133 :572–85. doi: 10.1016/j.cognition.2014.08.006.
  • Teunisse Alessandra K., Case Trevor I., Fitness Julie, Sweller Naomi. I should have known better: Development of a self-report measure of gullibility. Personality and Social Psychology Bulletin. 2019; 46 :408–23. doi: 10.1177/0146167219858641.
  • Tobacyk Jerome J. A revised paranormal belief scale. The International Journal of Transpersonal Studies. 2004; 23 :94–98. doi: 10.24972/ijts.2004.23.1.94.
  • van der Linden Sander. The conspiracy-effect: Exposure to conspiracy theories (about global warming) decreases pro-social behavior and science acceptance. Personality and Individual Differences. 2015; 87 :173–75. doi: 10.1016/j.paid.2015.07.045.
  • van Prooijen Jan-Willem. Belief in conspiracy theories: Gullibility or rational skepticism? In: Forgas Joseph P., Baumeister Roy F., editors. The Social Psychology of Gullibility: Fake News, Conspiracy Theories, and Irrational Beliefs. Routledge; London: 2019. pp. 319–32.
  • Wechsler David. The Measurement of Adult Intelligence. 3rd ed. Williams & Wilkins; Baltimore: 1944.
  • West Richard F., Toplak Maggie E., Stanovich Keith E. Heuristics and biases as measures of critical thinking: Associations with cognitive ability and thinking dispositions. Journal of Educational Psychology. 2008; 100 :930–41. doi: 10.1037/a0012842.



Reviewed by Psychology Today Staff

A bias is a tendency, inclination, or prejudice toward or against something or someone. Some biases are positive and helpful—like choosing to only eat foods that are considered healthy or staying away from someone who has knowingly caused harm. But biases are often based on stereotypes, rather than actual knowledge of an individual or circumstance. Whether positive or negative, such cognitive shortcuts can result in prejudgments that lead to rash decisions or discriminatory practices.



Bias is often characterized as stereotypes about people based on the group to which they belong and/or based on an immutable physical characteristic they possess, such as their gender, ethnicity, or sexual orientation. This type of bias can have harmful real-world outcomes. People may or may not be aware that they hold these biases.

The phenomenon of implicit bias refers to biases, absorbed from societal input, that escape conscious detection. Paying attention to helpful biases—while keeping negative, prejudicial, or accidental biases in check—requires a delicate balance between self-protection and empathy for others.

Bias is a natural inclination for or against an idea, object, group, or individual. It is often learned and is highly dependent on variables like a person’s socioeconomic status, race, ethnicity, and educational background. At the individual level, bias can negatively impact someone’s personal and professional relationships; at a societal level, it can lead to the unfair persecution of a group, as in the Holocaust and slavery.

Starting at a young age, people discriminate between those who are like them, their “ingroup,” and those who are not like them, their “outgroup.” On the plus side, this can give them a sense of identity and safety. Taken to the extreme, however, this categorization can foster an “us-versus-them” mentality and lead to harmful prejudice.

People are naturally biased—they like certain things and dislike others, often without being fully conscious of their prejudice. Bias is acquired at a young age, often as a result of one’s upbringing. This unconscious bias becomes problematic when it causes an individual or a group to treat others poorly as a result of their gender, ethnicity, race, or other factors. 

Generally, no one is free of bias. It’s human nature to assign judgment based on first impressions, and most people have a lifetime of conditioning by schools, religious institutions, their families of origin, and the media. However, by reflecting critically on their judgments and being aware of their blind spots, individuals can avoid stereotyping and acting on harmful prejudice.

Telling people to “suppress prejudice” or racism often has the opposite effect. When people are trained to notice prejudiced or racist thoughts without trying to push them away, they are able to make a deliberate choice about how they behave towards others as a result. This can lead to less discrimination and reduced bias over time.


Cognitive biases are repeated patterns of thinking that can lead to inaccurate or unreasonable conclusions. They may help people make quicker decisions, but those decisions aren’t always accurate. Common causes include flawed memory, scarce attention, natural limits on the brain’s ability to process information, emotional input, social pressures, and even aging. When assessing research—or even one’s own thoughts and behaviors—it’s important to be aware of cognitive biases and attempt to counter their effects whenever possible.

When you are the actor, you are more likely to see your actions as the result of external, situational factors, whereas when you are observing other people, you are more likely to perceive their actions as based on internal factors (like overall disposition). This can lead to magical thinking and a lack of self-awareness.

People tend to jump at the first available piece of information and unconsciously use it to “anchor” their decision-making process, even when the information is incorrect or prejudiced. This can lead to skewed judgment and poor decision-making, especially when they don’t take the time to reason through their options.

Attribution bias occurs when someone tries to attribute reasons or motivations to the actions of others without concrete evidence to support such assumptions.

Confirmation bias refers to the brain’s tendency to search for and focus on information that supports what someone already believes, while ignoring facts that go against those beliefs, despite their relevance.

People with hindsight bias believe they should have anticipated certain outcomes, which might only be obvious now with the benefit of more knowledge and perspective. They may forget that at the time of the event, much of the information needed simply wasn’t available. They may also make unfair assumptions that other people share their experiences and expect them to come to the same conclusions.

In the Dunning-Kruger Effect, people lack the self-awareness to accurately assess their skills and often wind up overestimating their knowledge or ability. For example, it’s not uncommon to think you’re smarter, kinder, or better at managing others than the average person.

People are more likely to attribute someone else’s actions to their personality rather than taking into account the situation they are facing. However, they rarely make this Fundamental Attribution Error when analyzing their own behavior.

The Halo Effect occurs when a positive first impression of someone colors your overall perception of them. For example, if you are struck by how beautiful someone is, you might assume they have other positive traits, like being wise, smart, or brave. A negative first impression, on the other hand, can lead you to assume the worst about a person, resulting in a “Reverse Halo” or “Horns Effect.”

People like to win, but they hate losing more, so they tend to pay more attention to negative outcomes and weigh them more heavily than positive ones when considering a decision. This negativity bias explains why we focus more on upsetting events, and why the news seems so dire most of the time.

People tend to overestimate the likelihood of positive outcomes when they are in a good mood. Conversely, when they are feeling down, they are more likely to expect negative outcomes. In both instances, powerful emotions are driving irrational thinking .

Have you ever heard, “Don’t throw good money after bad”? That expression is based on the Sunk Cost Fallacy. Basically, when someone is aware of the time, effort, and emotional cost that’s already gone into an endeavor, they can find it difficult to change their mind or quit a longtime goal, even when it’s the healthiest choice for them.


Unconscious bias: what it is and how to avoid it in the workplace


An unconscious bias is a thinking error that can cloud judgment and lead to poor decisions.

As a leader, it’s important to look for and process a broad range of information from many perspectives. It’s equally important to be open to alternatives not previously considered. The more perspectives and strategies you have to choose from, the more likely it is you will make the best decisions for your team and organization as a whole.

But a powerful, yet subtle obstacle can stand in the way of open-mindedness in leadership: unconscious bias.

What is unconscious bias?

For most of human history, people experienced very little new information during their lifetimes. Decisions were based on the need for survival. In our modern world, we are constantly receiving new information and have to make numerous complicated choices each day. As many researchers have explained, our minds are ill-equipped to handle the modern world’s decision-making demands. Evaluating evidence (especially when it is complex or ambiguous) requires a great deal of mental energy. To save us from becoming overwhelmed, our brains have a natural tendency to take shortcuts.

Unconscious bias – also known as cognitive bias – refers to how our mind can take shortcuts when processing information. This saves time when making decisions, which is especially helpful when we’re under pressure and need to meet deadlines. While these shortcuts may save time, an unconscious bias is a systematic thinking error that can cloud our judgment and, as a result, impact our decisions.

See if you can answer this riddle: A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost?


Did you answer 10 cents? Most people do. Although this response intuitively comes to mind, it is incorrect. If the ball costs 10 cents and the bat costs $1.00 more than the ball, then the bat would cost $1.10 for a grand total of $1.20 for the bat and the ball. The correct answer to this problem is that the ball costs five cents and the bat costs (at $1.00 more) $1.05, for a grand total of $1.10.

If you answered 10 cents to the example above, your mind took a shortcut by unconsciously substituting the “more than” statement in the problem (the bat costs $1.00 more than the ball) with an “absolute” statement – the bat costs $1.00. It makes the equation easier to process: if a ball and bat together cost $1.10 and the bat costs $1.00, then the ball must cost 10 cents.
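The correct answer falls out of simple algebra rather than intuition: if b is the ball’s price, then b + (b + 1.00) = 1.10, so 2b = 0.10 and b = 0.05. A minimal Python sketch of that check (the variable names are illustrative, not from the original riddle):

```python
# Solve the bat-and-ball riddle algebraically instead of by intuition.
# Let ball = b. The bat costs b + 1.00, and together they cost 1.10:
#   b + (b + 1.00) = 1.10  ->  2b = 0.10  ->  b = 0.05
total = 1.10
bat_premium = 1.00

ball = (total - bat_premium) / 2
bat = ball + bat_premium

print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05

# The intuitive answer (10 cents) fails this check: 0.10 + 1.10 = 1.20.
assert abs((ball + bat) - total) < 1e-9
```

Running the check against the intuitive answer makes the substitution error visible: a 10-cent ball forces a $1.10 bat and a $1.20 total.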

Our unconscious mind uses embedded, unconscious beliefs formed from our cultural environment and personal experiences to make immediate decisions about everything around us. The problem is that these shortcuts result in wrong decisions much of the time – especially when rational, logical thinking is required. We all have unconscious bias, and it influences our decisions without us even realizing it.

Common types of unconscious bias

An article in The Atlantic states there are at least 100 distinct cognitive biases, while Wikipedia’s List of Cognitive Biases contains more than 185 entries. Many of the unconscious biases listed, such as the IKEA effect (placing disproportionately high value on products you help create yourself), don’t present themselves often in the workplace. The following unconscious biases are the most common in the workplace and have the potential to derail your decision-making ability as a leader:

Sunk cost bias

You irrationally cling to things that have already cost you something. When you’ve invested time, money, or emotion into something, it can be difficult to let it go – even when it is clear it’s no longer viable. The aversion to this pain can distort your judgment and cause you to make ill-advised investments.

To combat this bias: ask yourself if you haven’t already invested time, money, effort, or emotion into something, would you still do so now? What advice would you give to a friend in the same situation?

Halo effect

The halo effect occurs when you allow your personal perception of someone (how attractive they are, how much you like them, how much they remind you of yourself) to influence your judgments about them, especially their performance. In sociology, this is known as homophily: people like people who are like themselves.

To combat this bias: If you notice you are giving consistently high (or low) performance grades across the board to particular individuals, it’s worth considering your judgment may be compromised by the halo effect. Focus on the performance and not on the person.

The Dunning-Kruger effect

The Dunning-Kruger effect describes what happens when people mistakenly overestimate their own ability because of a lack of self-awareness. Have you ever heard the phrase “you don’t know what you don’t know”? It’s easy to be overconfident when you have only a rudimentary perspective of how things are.

It also works the other way. Because experts are keenly aware of how much they don’t know, they can drastically underestimate their own ability and lose confidence in themselves and their decision-making ability. This bias is also known as “imposter syndrome.”

To combat this bias: acknowledge the thoughts you have about yourself and put them in perspective. Learn to value constructive criticism, and understand that you’re slowing your team down when you don’t ask for help.

If you’re feeling like an imposter, it can be helpful to share what you’re feeling with trusted friends or mentors. People who have more experience can reassure you that what you’re feeling is normal. Knowing that others have been in your position can make it seem less scary.

Availability heuristic

This unconscious bias influences your judgments by favouring the ideas that come most easily to mind. Much like the recency effect, the more recent and emotionally powerful a memory is, the more relevant it can seem. This can cause you to place an inordinate amount of importance on recent memories and apply them to decisions too readily.

To combat this bias: use metrics and statistical information rather than relying on first instincts and emotional influences when making a decision.

Groupthink

The desire for conformity and harmony within a group results in an irrational or dysfunctional decision-making outcome.

To combat this bias: seek to facilitate objective means of evaluating situations and encourage critical thinking practices as a group activity.

Confirmation or Implicit Bias

Confirmation bias causes us to look for evidence confirming what we already think or believe in and to discount or ignore any information that may support an alternate view. It’s the most pervasive unconscious bias in the workplace and the most damaging.

“What the human being is best at doing is interpreting all new information so that their prior conclusions remain intact.” — Warren Buffett

Accepting information that confirms our beliefs is easy and requires little mental energy. With confirmation bias, when we encounter contradicting information we avoid processing it and find a reason to ignore it. In The Case for Motivated Reasoning, social psychologist Ziva Kunda wrote, “we give special weight to information that allows us to come to the conclusion we want to reach.” In fact, neuroscientists have demonstrated that our brain reacts differently to information that confirms our previously held beliefs than it does to evidence that contradicts our current beliefs.

If a leader’s view is limited by confirmation bias, they may not pay enough attention to information that could be crucial to their work. Leaders need to be aware of how their biases might impact the people that work for them and with them. For example, direct reports may not share all available information, or may only tell a leader what they think their leader wants to hear. This can lead to poor decision-making, missed opportunities, and negative outcomes.

To combat this bias: think of your ideas and belief system as a piece of software you’re trying to de-bug, rather than a list of things to be defended. Ask yourself the following questions and be mindful of your thought process when answering them:

  • Where do I get information about the issues I care about?
  • Do my most common sources of information confirm or challenge my perspective?
  • How much time do I spend listening to or reading opposing points of view?
  • When I make decisions, am I likely to choose the option that the people closest to me will agree with?

Being cognizant of confirmation bias is not easy, but with practice, it is possible to recognize the role it plays in the way we interpret information and make decisions.

How unconscious bias can impact inclusion and diversity in an organization

The correlation between diversity and financial performance is clear across different industries and regions: more diverse teams translate directly into stronger financial performance. Between 2011 and 2015, the most gender-diverse companies were 20 per cent more likely than the least diverse to have above-average financial performance.

For organizations to attract the most talented people and ensure a vibrant and diverse workforce, they need to select from a wide-ranging and diverse talent pool. Unfortunately, when hiring, assessing, or promoting employees, we often evaluate people against our unconscious assumptions of what top talent looks like. These assumptions can favour one group over others, even if members of each group are equally likely to be successful.

During the hiring process, hiring managers gather a wide array of information about job candidates. Through interviews, candidates will share their educational background, work and personal experiences, and how they would behave in hypothetical situations. But most of the time hiring managers are measuring this information against their own personal belief of what the successful candidate “should” look like. Did they go to the right school? Would they behave in the same manner as I would in the same situation? Is their personality a close match to mine (see halo effect above) and the rest of my team?

Most hiring managers will select candidates who best match their unconscious template of what a successful candidate looks and sounds like. This approach can give preference to the “safe” choice. For example, a hiring manager may believe that only MBA graduates from elite business schools are suitable to fill leadership roles. If those criteria were applied to all vacancies, you would soon develop a leadership team of predominantly white males, as most MBA graduates are white and male. Because diversity spurs innovation, the organization would then be at a competitive disadvantage.

Innovation is not just a nice-to-have benefit of diverse work teams. It is an integral part of any revenue-generating organization. A Boston Consulting Group study found that organizations with more diverse management teams have 19% higher revenues from innovation alone.

How unconscious bias can be avoided

Although unconscious bias can’t be cured, there are many steps that can be taken to mitigate it. Leaders who can recognize their unconscious biases and make adjustments to overcome them are more likely to make better decisions. To be ever-mindful of unconscious bias, it’s important to practice self-awareness and slow down decision making to consider what is driving you. Ask yourself whether your decisions are data-driven and evidence-based, or whether you rely on gut instinct. Have you asked for and considered different perspectives? It can be helpful to discuss your decisions and behaviour at work with an Ivey Academy executive coach. An executive coach can provide a sounding board, a neutral perspective, and applicable strategies to help you overcome your unique unconscious biases.

Promoting inclusion and diversity

To promote inclusion and diversity in your organization's hiring practices, appropriate procedures and processes need to be put in place. To eliminate bias in hiring decisions, make promotions fairer, and increase diversity, organizations are using data-driven talent assessments .

Organizations that use robust assessment tools have improved hiring success rates, lowered employee turnover, increased employee engagement and productivity, and fostered a resilient corporate culture. Assessments provide organizations with a consistent definition of what leadership potential looks like, regardless of race, gender, or ethnicity. With the help of assessment tools, leaders are able to find “ hidden gems ” — employees who have low visibility or who previously were not seen to have leadership potential. Most importantly, talent assessment tools help to educate leaders about the difference between an employee’s experience and his or her capability to take on new and more challenging responsibilities. With the help of talent assessments, you can be confident in knowing your organization is taking a needed step in removing unconscious bias from the hiring process.

The Ivey Academy’s talent assessment tools enable your organization to identify the best candidates for vacant roles and professional development. With our help, your organization can create and maintain a competitive edge in the recruitment, development, and retention of top talent. Learn more about our talent assessments here.

About The Ivey Academy at Ivey Business School

The Ivey Academy at Ivey Business School is the home for executive Learning and Development (L&D) in Canada. It is Canada’s only full-service L&D house, blending Financial Times top-ranked university-based executive education with talent assessment, instructional design and strategy, and behaviour change sustainment.

Rooted in Ivey Business School’s real-world leadership approach, The Ivey Academy is a place where professionals come to get better, to break old habits and establish new ones, to practice, to change, to obtain coaching and support, and to join a powerful peer network. Follow The Ivey Academy on LinkedIn, Twitter, Facebook, and Instagram.
