An Evaluative Review of Barriers to Critical Thinking in Educational and Real-World Settings

Associated Data

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Though a wide array of definitions and conceptualisations of critical thinking have been offered in the past, further elaboration on some concepts is required, particularly with respect to various factors that may impede an individual’s application of critical thinking, such as in the case of reflective judgment. These barriers include varying levels of epistemological engagement or understanding, issues pertaining to heuristic-based thinking and intuitive judgment, as well as emotional and biased thinking. The aim of this review is to discuss such barriers and evaluate their impact on critical thinking in light of perspectives from research in an effort to reinforce the ‘completeness’ of extant critical thinking frameworks and to enhance the potential benefits of implementation in real-world settings. Recommendations and implications for overcoming such barriers are also discussed and evaluated.

1. Introduction

Critical thinking (CT) is a metacognitive process—consisting of a number of skills and dispositions—that, through purposeful, self-regulatory reflective judgment, increases the chances of producing a logical solution to a problem or a valid conclusion to an argument ( Dwyer 2017 , 2020 ; Dwyer et al. 2012 , 2014 , 2015 , 2016 ; Dwyer and Walsh 2019 ; Quinn et al. 2020 ).

CT has long been identified as a desired outcome of education ( Bezanilla et al. 2019 ; Butler et al. 2012 ; Dwyer 2017 ; Ennis 2018 ), given that it facilitates a more complex understanding of information ( Dwyer et al. 2012 ; Halpern 2014 ), better judgment and decision-making ( Gambrill 2006 ) and less dependence on cognitive bias and heuristic thinking ( Facione and Facione 2001 ; McGuinness 2013 ). A vast body of research (e.g., Dwyer et al. 2012 ; Gadzella 1996 ; Hitchcock 2004 ; Reed and Kromrey 2001 ; Rimiene 2002 ; Solon 2007 ), including various meta-analyses (e.g., Abrami et al. 2008 , 2015 ; Niu et al. 2013 ; Ortiz 2007 ), indicates that CT can be enhanced through targeted, explicit instruction. Though CT can be taught in domain-specific areas, its domain-generality means that it can be taught across disciplines and in relation to real-world scenarios ( Dwyer 2011 , 2017 ; Dwyer and Eigenauer 2017 ; Dwyer et al. 2015 ; Gabennesch 2006 ; Halpern 2014 ). Indeed, the positive outcomes associated with CT transcend educational settings into real-world, everyday situations, which is important because CT is necessary for a variety of social and interpersonal contexts where good decision-making and problem-solving are needed on a daily basis ( Ku 2009 ). However, regardless of domain-specificity or domain-generality of instruction, the transferability of CT application has been an issue in CT research (e.g., see Dumitru 2012 ). This is an important consideration because issues with transferability—for example, in real-world settings—may imply something lacking in CT instruction.

In light of the large, aforementioned body of research focusing on enhancing CT through instruction, a growing body of research has also evaluated the manner in which CT instruction is delivered (e.g., Abrami et al. 2008 , 2015 ; Ahern et al. 2019 ; Cáceres et al. 2020 ; Byerly 2019 ; Dwyer and Eigenauer 2017 ), along with additional considerations for, and barriers to, such education faced by teachers and students alike (e.g., Aliakbari and Sadeghdaghighi 2013 ; Cáceres et al. 2020 ; Cornell et al. 2011 ; Lloyd and Bahr 2010 ; Ma and Liu 2022 ; Ma and Luo 2021 ; Rear 2019 ; Saleh 2019 ); for example, those regarding conceptualisation, beliefs about CT, having feasible time for CT application and CT’s aforementioned transferability. However, there is a significant lack of research investigating barriers to CT application by individuals in real-world settings, even by those who have enjoyed benefits from previous CT instruction. Thus, perhaps the previously conjectured ‘something lacking in CT instruction’ is this: in conjunction with teaching what CT consists of, instruction should also make clear to students what barriers to CT application they face.

Simply, CT instruction is designed in such a way as to enhance the likelihood of positive decision-making outcomes. However, there are a variety of barriers that can impede an individual’s application of CT, regardless of past instruction with respect to ‘how to conduct CT’. For example, an individual might be regarded as a ‘critical thinker’ because they apply CT in the vast majority of appropriate scenarios, but that does not ensure that they apply CT in all such appropriate scenarios. What keeps them from applying CT in those scenarios might well be one of a number of barriers to CT that often go unaddressed in CT instruction, particularly if such instruction is exclusively focused on skills and dispositions. Perhaps too much focus is placed on what educators are teaching their students to do in their CT courses, as opposed to what educators should be recommending their students look out for, or advising them what they should not be doing. That is, perhaps just as important as understanding what CT is and how it is conducted (i.e., knowing what to do) is a genuine awareness of the various factors and processes that can impede CT; and so, for an individual to think critically, they must know what to look out for and be able to monitor for such barriers to CT application.

To clarify, thought has not changed regarding what CT is or the cognitive/metacognitive processes at its foundation (e.g., see Dwyer 2017 ; Dwyer et al. 2014 ; Ennis 1987 , 1996 , 1998 ; Facione 1990 ; Halpern 2014 ; Paul 1993 ; Paul and Elder 2008 ); rather, additional consideration of issues that have the potential to negatively impact CT is required, such as those pertaining to epistemological engagement, intuitive judgment, and emotional and biased thinking. This notion has been made clear through what might be perceived as a ‘loud shout’ for CT over at least the past 10–15 years in light of growing political, economic, social, and health-related concerns (e.g., ‘fake news’, gaps between political views in the general population, various social movements and the COVID-19 pandemic). Indeed, there is a dearth of research on barriers to CT ( Haynes et al. 2016 ; Lloyd and Bahr 2010 ; Mangena and Chabeli 2005 ; Rowe et al. 2015 ). As a result, this evaluative perspective review aims to provide an impetus for updating the manner in which CT education is approached and, perhaps most importantly, applied in real-world settings—through further identifying and elaborating on specific barriers of concern in order to reinforce the ‘completeness’ of extant CT frameworks and to enhance the potential benefits of their implementation (see Note 1).

2. Barriers to Critical Thinking

2.1. Inadequate Skills and Dispositions

In order to better understand the various barriers to CT that will be discussed, the manner in which CT is conceptualised must first be revisited. Though debate over its definition and what components are necessary to think critically has existed over the 80-plus years since the term’s coining (i.e., Glaser 1941 ), it is generally accepted that CT consists of two main components: skills and dispositions ( Dwyer 2017 ; Dwyer et al. 2012 , 2014 ; Ennis 1996 , 1998 ; Facione 1990 ; Facione et al. 2002 ; Halpern 2014 ; Ku and Ho 2010a ; Perkins and Ritchhart 2004 ; Quinn et al. 2020 ). CT skills—analysis, evaluation, and inference—refer to the higher-order, cognitive, ‘task-based’ processes necessary to conduct CT (e.g., see Dwyer et al. 2014 ; Facione 1990 ). CT dispositions have been described as inclinations, tendencies, or willingness to perform a given thinking skill (e.g., see Dwyer et al. 2016 ; Siegel 1999 ; Valenzuela et al. 2011 ), which may relate to attitudinal and intellectual habits of thinking, as well as motivational processes ( Ennis 1996 ; Norris 1994 ; Paul and Elder 2008 ; Perkins et al. 1993 ; Valenzuela et al. 2011 ). The relationship between CT skills and dispositions has been argued to be mutually dependent. As a result, overemphasising or encouraging the development of one over the other is a barrier to CT as a whole. Though this may seem obvious, it remains the case that CT instruction often places added emphasis on skills simply because they can be taught (though that does not ensure that everyone has or will be taught such skills), whereas dispositions are ‘trickier’ (e.g., see Dwyer 2017 ; Ku and Ho 2010a ). That is, it is unlikely that simply ‘teaching’ students to be motivated towards CT or to value it over short-instructional periods will actually meaningfully enhance it. Moreover, debate exists over how best to train disposition or even measure it. With that, some individuals might be more ‘inherently’ disposed to CT in light of their truth-seeking, open-minded, or inquisitive natures ( Facione and Facione 1992 ; Quinn et al. 2020 ). The barrier, in this context, is how we can enhance the disposition of those who are not ‘inherently’ inclined. For example, though an individual may possess the requisite skills to conduct CT, it does not ensure the tendency or willingness to apply them; and conversely, having the disposition to apply CT does not mean that one has the ability to do so ( Valenzuela et al. 2011 ). Given the pertinence of CT skills and dispositions to the application of CT in a broader sense, inadequacies in either create a barrier to application.

2.2. Epistemological (Mis)Understanding

To reiterate, most extant conceptualisations of CT focus on the tandem working of skills and dispositions, though significantly fewer emphasise the reflective judgment aspect of CT that might govern various associated processes ( Dawson 2008 ; Dwyer 2017 ; Dwyer et al. 2014 , 2015 ; King and Kitchener 1994 , 2004 ; Stanovich and Stanovich 2010 ). Reflective judgment (RJ) refers to a self-regulatory process of decision-making, with respect to taking time to engage one’s understanding of the nature, limits, and certainty of knowing and how this can affect the defense of one’s reasoning ( Dwyer 2017 ; King and Kitchener 1994 ; Ku and Ho 2010b ). The ability to metacognitively ‘think about thinking’ ( Flavell 1976 ; Ku and Ho 2010b ) in the application of critical thinking skills implies a reflective sensibility consistent with epistemological understanding and the capacity for reflective judgment ( Dwyer et al. 2015 ; King and Kitchener 1994 ). Acknowledging levels of (un)certainty is important in CT because the information a person is presented with (along with that person’s pre-existing knowledge) often provides only a limited source of information from which to draw a conclusion. Thus, RJ is considered a component of CT ( Baril et al. 1998 ; Dwyer et al. 2015 ; Huffman et al. 1991 ) because it allows one to acknowledge that epistemological understanding is necessary for recognising and judging a situation in which CT may be required ( King and Kitchener 1994 ). For example, the interdependence between RJ and CT can be seen in the way that RJ influences the manner in which CT skills like analysis and evaluation are conducted or the balance and perspective within the subsequent inferences drawn ( Dwyer et al. 2015 ; King et al. 1990 ). Moreover, research suggests that RJ development is not a simple function of age or time but more so a function of the amount of active engagement an individual has working in problem spaces that require CT ( Brabeck 1981 ; Dawson 2008 ; Dwyer et al. 2015 ). The more developed one’s RJ, the better able one is to present “a more complex and effective form of justification, providing more inclusive and better integrated assumptions for evaluating and defending a point of view” ( King and Kitchener 1994, p. 13 ).

Despite a lesser focus on RJ, research indicates a positive relationship between it and CT ( Baril et al. 1998 ; Brabeck 1981 ; Dawson 2008 ; Dwyer et al. 2015 ; Huffman et al. 1991 ; King et al. 1990 )—the understanding of which is pertinent to better understanding the foundation of CT barriers. For example, when considering one’s proficiency in CT skills, there might come a time when the individual becomes so good at using them that their application becomes something akin to ‘second nature’ or even ‘automatic’. However, this creates a contradiction: automatic thinking is largely the antithesis of reflective judgment (even though judgment is never fully intuitive or reflective; see Cader et al. 2005 ; Dunwoody et al. 2000 ; Hamm 1988 ; Hammond 1981 , 1996 , 2000 )—those who think critically take their time and reflect on their decision-making; even if the solution/conclusion drawn from the automatic thinking is ‘correct’ or yields a positive outcome, it is not a critically thought out answer, per se. Thus, no matter how skilled one is at applying CT skills, once the application becomes primarily ‘automatic’, the thinking ceases to be critical ( Dwyer 2017 )—a perspective consistent with Dual Process Theory (e.g., Stanovich and West 2000 ). Indeed, RJ acts as System 2 thinking ( Stanovich and West 2000 ): it is slow, careful, conscious, and consistent ( Kahneman 2011 ; Hamm 1988 ); it is associated with high cognitive control, attention, awareness, concentration, and complex computation ( Cader et al. 2005 ; Kahneman 2011 ; Hamm 1988 ); and it accounts for epistemological concerns—consistent not only with King and Kitchener’s ( 1994 ) conceptualisation but also Kuhn’s ( 1999 , 2000 ) perspective on metacognition and epistemological knowing. This is where RJ comes into play as an important component of CT—interdependent with the requisite skills and dispositions ( Baril et al. 1998 ; Dwyer et al. 2015 )—as it allows one to acknowledge that epistemological understanding is vital to recognising and judging a situation in which CT is required ( King and Kitchener 1994 ). With respect to the importance of epistemological understanding, consider the following examples for elaboration.

The primary goal of CT is to enhance the likelihood of generating reasonable conclusions and/or solutions. Truth-seeking is a CT disposition fundamental to the attainment of this goal ( Dwyer et al. 2016 ; Facione 1990 ; Facione and Facione 1992 ) because if we just applied any old nonsense as justification for our arguments or solutions, they would fail in the application and yield undesirable consequences. Despite what may seem like truth-seeking’s obvious importance in this context, all thinkers succumb to unwarranted assumptions on occasion (i.e., beliefs presumed to be true without adequate justification). It may also seem obvious, in context, that it is important to be able to distinguish facts from beliefs. However, the concepts of ‘fact’ or ‘truth’, with respect to how much empirical support they have to validate them, also require consideration. For example, some might conceptualise truth as factual information or information that has been or can be ‘proven’ true. Likewise, ‘proof’ is often described as evidence establishing a fact or the truth of a statement—indicating a level of absolutism. However, the reality is that we cannot ‘prove’ things—as scientists and researchers well know—we can only disprove them, such as in experimental settings where we observe a significant difference between groups on some measure—we do not prove the hypothesis correct, rather, we disprove the null hypothesis. This is why, in large part, researchers and scientists use cautious language in reporting their results. We know the best our findings can do is reinforce a theory—another concept often misconstrued in the wider population as something like a hypothesis, as opposed to what it actually entails: a robust model for how and/or why a given phenomenon might occur (e.g., gravity). Thus, theories will hold ‘true’ until they are falsified—that is, disproven (e.g., Popper [1934] 1959 , 1999 ).
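
As a rough illustration of this cautious, falsification-oriented language, a minimal sketch in Python is given below; the groups, numbers and the 0.05 threshold are hypothetical and are not drawn from any study cited here. The point is that even a ‘significant’ result only licenses rejecting the null hypothesis of no difference; it does not ‘prove’ the favoured hypothesis.

```python
# A minimal, hypothetical sketch of why researchers say "reject the null
# hypothesis" rather than "prove": all data and the threshold are invented
# purely for illustration.
from scipy import stats

placebo   = [12.1, 11.8, 12.4, 12.0, 11.9, 12.2]  # e.g., staining scores, control group
treatment = [11.2, 11.0, 11.5, 11.3, 10.9, 11.4]  # e.g., staining scores, new toothpaste

t_stat, p_value = stats.ttest_ind(treatment, placebo)

if p_value < 0.05:
    # Evidence against the null hypothesis of 'no difference between groups';
    # this does NOT prove that the treatment whitens teeth in all cases.
    print(f"p = {p_value:.4f}: reject the null hypothesis of no difference.")
else:
    print(f"p = {p_value:.4f}: insufficient evidence to reject the null hypothesis.")
```

Note the deliberately cautious phrasing in both branches; nothing in such an output warrants the word ‘proven’.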

Unfortunately, ‘proof’, ‘prove’, and ‘proven’—words that convey certainty to large populations—actually do the public a disservice in subtle ways that can hinder CT. For example, a company that produces toothpaste might claim its product to be ‘clinically proven’ to whiten teeth. Consumers purchasing that toothpaste are likely to expect to have whiter teeth after use. However, what happens—as often may be the case—if it does not whiten their teeth? In that context, the word ‘proven’ implies a false claim. Of course, those in research understand that the word’s use is a marketing ploy, given that ‘clinically proven’ sounds more reassuring to consumers than ‘there is evidence to suggest…’; but, by incorrectly using words like ‘proven’ in our daily language, we reinforce a misunderstanding of what it means to assess, measure and evaluate—particularly from a scientific standpoint (e.g., again, see Popper [1934] 1959 , 1999 ).

Though this example may seem like a semantic issue, it has great implications for CT in the population. For example, a vast majority of us grew up being taught the ‘factual’ information that there were nine planets in our solar system; then, in 2006, Pluto was reclassified as a dwarf planet—no longer being considered a ‘major’ planet of our solar system. As a result, we now have eight planets. This change might be perceived in two distinct ways: (1) ‘science is amazing because it’s always developing—we’ve now reached a stage where we know so much about the solar system that we can differentiate celestial bodies to the extent of distinguishing planets from dwarf planets’; and (2) ‘I don’t understand why these scientists even have jobs, they can’t even count planets’. The first perspective is consistent with that of an individual with epistemological understanding and engagement, one who recognises that previous understandings of models and theories can change, not necessarily because they were wrong, but rather because they have been advanced in light of further credible evidence. The second perspective is consistent with that of someone who has failed to engage epistemological understanding, who does not necessarily see that the change might reflect progress, who might be resistant to change, and who might grow in distrust of science and research in light of these changes. The latter point is of great concern in the CT research community because the unwarranted cynicism and distrust of science and research, in context, may simply reflect a lack of epistemological understanding or engagement (to some extent consistent with the manner in which conspiracy theories are developed, rationalised and maintained; see Swami and Furnham 2014 ). Notably, this should also be of great concern to education departments around the world, as well as society, more broadly speaking.

Upon considering epistemological engagement in more practical, day-to-day scenarios (or perhaps a lack thereof), we begin to see the need for CT in everyday 21st-century life—heightened by the ‘new knowledge economy’, which has resulted in exponential increases in the amount of information made available since the late 1990s (e.g., Darling-Hammond 2008 ; Dwyer 2017 ; Jukes and McCain 2002 ; Varian and Lyman 2003 ). Though increased amounts of and enhanced access to information are largely good things, what is alarming about this is how much of it is misinformation or disinformation ( Commission on Fake News and the Teaching of Critical Literacy in Schools 2018 ). Truth be told, the new knowledge economy is anything but ‘new’ anymore. Perhaps, over the past 10–15 years, there has been an increase in the need for CT above and beyond that seen in the ‘economy’s’ wake—or maybe ever before; for example, in light of the social media boom, political unrest, ‘fake news’, and issues regarding health literacy. The ‘new’ knowledge economy has made it so that knowledge acquisition, on its own, is no longer sufficient for learning—individuals must be able to work with and adapt information through CT in order to apply it appropriately ( Dwyer 2017 ).

Though extant research has addressed the importance of epistemological understanding for CT (e.g., Dwyer et al. 2014 ), it does not address how failing to engage such understanding can substantially hinder CT—regardless of how skilled or disposed to think critically an individual may be. Notably, this is distinct from ‘inadequacies’ in, say, memory, comprehension, or other ‘lower-order’ cognitively-associated skills required for CT ( Dwyer et al. 2014 ; Halpern 2014 ; see, again, Note 1) in that reflective judgment is essentially a pole on a cognitive continuum (e.g., see Cader et al. 2005 ; Hamm 1988 ; Hammond 1981 , 1996 , 2000 ). Cognitive Continuum Theory postulates a continuum of cognitive processes anchored by reflective judgment and intuitive judgment, which represents how judgment situations or tasks relate to cognition, given that thinking is never purely reflective, nor is it completely intuitive; rather, it rests somewhere in between ( Cader et al. 2005 ; Dunwoody et al. 2000 ). It is also worth noting that, in Cognitive Continuum Theory, neither reflective nor intuitive judgment is assumed, a priori, to be superior ( Dunwoody et al. 2000 ), despite most contemporary research on judgment and decision-making focusing on the strengths of RJ and the limitations associated with intuitive judgment ( Cabantous et al. 2010 ; Dhami and Thomson 2012 ; Gilovich et al. 2002 ). Though this point regarding superiority is acknowledged and respected (particularly in non-CT cases where it is advantageous to utilise intuitive judgment), in the context of CT, it is rejected in light of the example above regarding the automaticity of thinking skills.

2.3. Intuitive Judgment

The manner in which human beings think, and the way such thinking has evolved over millions of years, is truly amazing. This evolution has made it so that we can observe a particular event and make complex computations regarding predictions, interpretations, and reactions in less than a second (e.g., Teichert et al. 2014 ). Unfortunately, we have become so good at it that we often over-rely on ‘fast’ thinking and intuitive judgments, to the point that we have become ‘cognitively lazy’, given the speed at which we can make decisions with little energy ( Kahneman 2011 ; Simon 1957 ). In the context of CT, this ‘lazy’ thinking is an impediment (as it stands in opposition to reflective judgment). For example, consider a time in which you have been presented with numeric data on a topic, and you instantly aligned your perspective with what the ‘numbers indicate’. Of course, numbers do not lie… but people do—that is not to say that the person who initially interpreted and then presented you with those numbers is trying to disinform you; rather, the numbers presented might not tell the full story (i.e., the data are incomplete or inadequate, unbeknownst to the person reporting on them); and thus, there might be alternative interpretations of the data in question. With that, there most certainly are individuals who will wish to persuade you to align with their perspective, which only strengthens the impetus for being aware of intuitive judgment as a barrier. Consider another example: have you ever accidentally insulted someone at work, school, or in a social setting? Was it because the statement you made was based on some kind of assumption or stereotype? It may have been an honest mistake, but if a statement is made based on what one thinks they know, as opposed to what they actually know about the situation—without taking the time to recognise that all situations are unique and that reflection is likely warranted in light of such uncertainty—then it is likely that the schema-based ‘intuitive judgment’ is what is at fault here.

Our ability to construct schemas (i.e., mental frameworks for how we interpret the world) is evolutionarily adaptive in that these scripts allow us to make quick decisions when necessary and without much effort (such as in moments of impending danger); answer questions in conversation; interpret social situations; or stave off cognitive load or decision fatigue ( Baumeister 2003 ; Sweller 2010 ; Vohs et al. 2014 ). To reiterate, research in the field of higher-order thinking often focuses on the failings of intuitive judgment ( Dwyer 2017 ; Hamm 1988 ) as being limited, misapplied, and, sometimes, yielding grossly incorrect responses—thus, leading to faulty reasoning and judgment as a result of systematic biases and errors ( Gilovich et al. 2002 ; Kahneman 2011 ; Kahneman et al. 1982 ; Slovic et al. 1977 ; Tversky and Kahneman 1974 ), whether in terms of schematic thinking ( Leventhal 1984 ), System 1 thinking ( Stanovich and West 2000 ; Kahneman 2011 ), miserly thinking ( Stanovich 2018 ) or even heuristics ( Kahneman and Frederick 2002 ; Tversky and Kahneman 1974 ). Nevertheless, it remains the case that such protocols are learned—not just through experience (as discussed below), but often through more ‘academic’ means. For example, consider again the anecdote above about learning to apply CT skills so well that it becomes like ‘second nature’. Such skills become a part of an individual’s ‘mindware’ ( Clark 2001 ; Stanovich 2018 ; Stanovich et al. 2016 ) and, in essence, become heuristics themselves. Though their application requires RJ for the process to count as CT, this does not mean that the responses yielded without RJ will be incorrect.

Moreover, despite the descriptions above, it would be incorrect, and a disservice to readers to imply that RJ is always right and intuitive judgment is always wrong, especially without consideration of the contextual issues—both intuitive and reflective judgments have the potential to be ‘correct’ or ‘incorrect’ with respect to validity, reasonableness or appropriateness. However, it must also be acknowledged that there is a cognitive ‘miserliness’ to depending on intuitive judgment, in which case, the ability to detect and override this dependence ( Stanovich 2018 )—consistent with RJ, is of utmost importance if we care about our decision-making. That is, if we care about our CT (see below for a more detailed discussion), we must ignore the implicit ‘noise’ associated with the intuitive judgment (regardless of whether or not it is ‘correct’) and, instead, apply the necessary RJ to ensure, as best we can, that the conclusion or solution is valid, reasonable or appropriate.

However, such a recommendation is much easier said than done. One problem with relying on mental shortcuts afforded by intuition and heuristics is that they are largely experience-based protocols. Though that may sound like a positive thing, using ‘experience’ to draw a conclusion in a task that requires CT is erroneous because it essentially acts as ‘research’ based on a sample size of one; and so, ‘findings’ (i.e., one’s conclusion) cannot be generalised to the larger population—in this case, other contexts or problem-spaces ( Dwyer 2017 ). Despite this, we often over-emphasise the importance of experience in two related ways. First, people have a tendency to confuse experience for expertise (e.g., see the Dunning–Kruger Effect, i.e., the tendency for low-skilled individuals to overestimate their ability in tasks relevant to said skill and for highly skilled individuals to underestimate theirs; see also Kruger and Dunning 1999 ; Mahmood 2016 ), wherein people may not necessarily be expert; rather, they may just have a lot of experience completing a task imperfectly or incorrectly ( Dwyer and Walsh 2019 ; Hammond 1996 ; Kahneman 2011 ). Second, depending on the nature of the topic or problem, people often evaluate experience on par with research evidence (in terms of credibility), given its personalised nature, which is reinforced by self-serving bias(es).
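
The ‘sample size of one’ point above can be illustrated with a small, hypothetical numerical sketch (Python standard library only; the values are invented for illustration): with a single observation, variability cannot even be estimated, so there is nothing from which to generalise.

```python
# Hypothetical illustration of why 'experience' acts like research with n = 1:
# a single observation gives no estimate of variability at all.
import statistics

experience = [7.5]                        # one personal 'data point'
sample     = [7.5, 4.2, 6.1, 8.0, 5.5]    # several independent observations

try:
    statistics.stdev(experience)          # requires at least two data points
except statistics.StatisticsError as err:
    print("n = 1:", err)                  # no variance estimate, nothing to generalise

print(f"n = 5: mean = {statistics.mean(sample):.2f}, sd = {statistics.stdev(sample):.2f}")
```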

When evaluating topics in domains wherein one lacks expertise, the need for intellectual integrity and humility ( Paul and Elder 2008 ) in their RJ is increased so that the individual may assess what knowledge is required to make a critically considered judgment. However, this is not necessarily a common response to a lack of relevant knowledge, given that when individuals are tasked with decision-making regarding a topic in which they do not possess relevant knowledge, these individuals will generally rely on emotional cues to inform their decision-making (e.g., Kahneman and Frederick 2002 ). Concerns here are not necessarily about the lack of domain-specific knowledge necessary to make an accurate decision, but rather the (1) belief of the individual that they have the knowledge necessary to make a critically thought-out judgment, even when this is not the case—again, akin to the Dunning–Kruger Effect ( Kruger and Dunning 1999 ); or (2) lack of willingness (i.e., disposition) to gain additional, relevant topic knowledge.

One final problem with relying on experience for important decisions, as alluded to above, is that when experience is engaged, it is not necessarily an objective recollection of the procedure. It can be accompanied by the individual’s beliefs, attitudes, and feelings—how that experience is recalled. The manner in which an individual draws on their personal experience, in light of these other factors, is inherently emotion-based and, likewise, biased (e.g., Croskerry et al. 2013 ; Loftus 2017 ; Paul 1993 ).

2.4. Bias and Emotion

Definitions of CT often reflect that it is to be applied to a topic, argument, or problem of importance that the individual cares about ( Dwyer 2017 ). The issue of ‘caring’ is important because it excludes judgment and decision-making in day-to-day scenarios that are not of great importance and do not warrant CT (e.g., ‘what colour pants best match my shirt’ and ‘what to eat for dinner’); again, for example, in an effort to conserve time and cognitive resources (e.g., Baumeister 2003 ; Sweller 2010 ). However, given that ‘importance’ is subjective, it essentially boils down to what one cares about (e.g., issues potentially impactful in one’s personal life; topics of personal importance to the individual; or even problems faced by an individual’s social group or work organisation, in which case care might be more extrinsically-oriented). This is arguably one of the most difficult issues to resolve in CT application, given its contradictory nature: while it is generally recommended that CT be conducted void of emotion and bias (as much as is possible), it is also recommended that it only be applied to things we care about. As a result, the manner in which care is conceptualised requires consideration. For example, in terms of CT, care can be conceptualised as ‘concern or interest; the attachment of importance to a person, place, object or concept; and serious attention or consideration applied to doing something correctly or to avoid damage or risk’; as opposed to some form of passion (e.g., intense, driving or over-powering feeling or conviction; emotions as distinguished from reason; a strong liking or desire for or devotion to some activity, object or concept). In this light, care could be argued to be more of a dispositional or self-regulatory factor than an emotional bias, thus making it useful to CT. Though this distinction is important, the manner in which care is labeled does not lessen the potential for biased emotion to play a role in the thinking process. For example, it has been argued that if one cares about the decision they make or the conclusion they draw, then the individual will do their best to be as objective as possible ( Dwyer 2017 ). However, it must also be acknowledged that this may not always be the case or even completely feasible (i.e., how can any decision be fully void of emotional input?)—though one may strive to be as objective as possible, such objectivity is not ensured, given that implicit bias may infiltrate their decision-making (e.g., taking assumptions for granted as facts in filling gaps (unknowns) in a given problem-space). Consequently, such implicit biases may be difficult to amend, given that we may not be fully aware of them when they are at play.

With that, explicit biases are just as concerning, despite our awareness of them. For example, the more important an opinion or belief is to an individual, the greater the resistance to changing their mind about it ( Rowe et al. 2015 ), even in light of evidence indicating the contrary ( Tavris and Aronson 2007 ). In some cases, the provision of information that corrects the flawed concept may even ‘backfire’ and reinforce the flawed or debunked stance ( Cook and Lewandowsky 2011 ). This cognitive resistance is an important barrier to CT for obvious reasons: as a process, it acts in direct opposition to RJ, the skill of evaluation, and a number of requisite dispositions towards CT, including truth-seeking and open-mindedness (e.g., Dwyer et al. 2014 , 2016 ; Facione 1990 ); and, at the same time, it yields important real-world impacts (e.g., see Nyhan et al. 2014 ).

The notion of emotion impacting rational thought is by no means a novel concept. A large body of research indicates a negative impact of emotion on decision-making (e.g., Kahneman and Frederick 2002 ; Slovic et al. 2002 ; Strack et al. 1988 ), higher-order cognition ( Anticevic et al. 2011 ; Chuah et al. 2010 ; Denkova et al. 2010 ; Dolcos and McCarthy 2006 ) and cognition, more generally ( Iordan et al. 2013 ; Johnson et al. 2005 ; Most et al. 2005 ; Shackman et al. 2006 ; see Note 2). However, less attention has specifically focused on emotion’s impact on the application of critical thought. This may be a result of assumptions that if a person is inclined to think critically, then what is yielded will typically be void of emotion—which is true to a certain extent. However, despite the domain generality of CT ( Dwyer 2011 , 2017 ; Dwyer and Eigenauer 2017 ; Dwyer et al. 2015 ; Gabennesch 2006 ; Halpern 2014 ), the likelihood of emotional control during the CT process remains heavily dependent on the topic of application. Consider again, for example, that there is no guarantee that an individual who generally applies CT to important topics or situations will do so in all contexts. Indeed, depending on the nature of the topic or the problem faced, an individual’s mindware ( Clark 2001 ; Stanovich 2018 ; Stanovich et al. 2016 ; consistent with the metacognitive nature of CT) and the extent to which a context can evoke emotion in the thinker will influence what and how thinking is applied. As addressed above, if the topic is something about which the individual feels passionate, then it will likely be a greater challenge for them to remain unbiased and develop a reasonably objective argument or solution.

Notably, self-regulation is an important aspect of both RJ and CT ( Dwyer 2017 ; Dwyer et al. 2014 ), and, in this context, it is difficult not to consider the role emotional intelligence might play in the relationship between affect and CT. For example, though there are a variety of conceptualisations of emotional intelligence (e.g., Bar-On 2006 ; Feyerherm and Rice 2002 ; Goleman 1995 ; Salovey and Mayer 1990 ; Schutte et al. 1998 ), the underlying thread among these is that, similar to the concept of self-regulation, emotional intelligence (EI) refers to the ability to monitor (e.g., perceive, understand and regulate) one’s own feelings, as well as those of others, and to use this information to guide relevant thinking and behaviour. Indeed, extant research indicates that there is a positive association between EI and CT (e.g., Afshar and Rahimi 2014 ; Akbari-Lakeh et al. 2018 ; Ghanizadeh and Moafian 2011 ; Kaya et al. 2017 ; Stedman and Andenoro 2007 ; Yao et al. 2018 ). To shed light upon this relationship, Elder ( 1997 ) addressed the potential link between CT and EI through her description of the latter as a measure of the extent to which affective responses are rationally-based , in which reasonable desires and behaviours emerge from such rationally-based emotions. Though there is extant research on the links between CT and EI, it is recommended that future research further elaborate on this relationship, as well as with other self-regulatory processes, in an effort to further establish the potentially important role that EI might play within CT.

3. Discussion

3.1. Interpretations

Given difficulties in the past regarding the conceptualisation of CT ( Dwyer et al. 2014 ), efforts have been made to be as specific and comprehensive as possible when discussing CT in the literature to ensure clarity and accuracy. However, it has been argued that such efforts have actually added to the complexity of CT’s conceptualisation and had the opposite effect on clarity and, perhaps, more importantly, the accessibility and practical usefulness for educators (and students) not working in the research area. As a result, when asked what CT is, I generally follow up the ‘long definition’, in light of past research, with a much simpler description: CT is akin to ‘playing devil’s advocate’. That is, once a claim is made, one should second-guess it in as many conceivable ways as possible, in a process similar to the Socratic Method. Through asking ‘why’ and conjecturing alternatives, we ask the individual—be it another person or even ourselves—to justify the decision-making. It keeps the thinker ‘honest’, which is particularly useful if we’re questioning ourselves. If we do not have justifiable reason(s) for why we think or intend to act in a particular way (above and beyond considered objections), then it should become obvious that we either missed something or we are biased. It is perhaps this simplified description of CT that gives such impetus for the aim of this review.

Whereas extant frameworks often discuss the importance of CT skills, dispositions, and, to a lesser extent, RJ and other self-regulatory functions of CT, they do so with respect to components of CT or processes that facilitate CT (e.g., motivation, executive functions, and dispositions), without fully encapsulating the cognitive processes and other factors that may hinder it (e.g., emotion, bias, intuitive judgment and a lack of epistemological understanding or engagement). With that, this review is neither a criticism of existing CT frameworks, nor is it intended to imply that CT has so many barriers that it cannot be taught well; nor does it claim to present a complete list of processes that can impede CT (see again Note 1). To reiterate, education in CT can yield beneficial effects ( Abrami et al. 2008 , 2015 ; Dwyer 2017 ; Dwyer and Eigenauer 2017 ); however, such efficacy may be further enhanced by presenting to students and individuals interested in CT the barriers they are likely to face in its application; explaining how these barriers manifest and operate; and offering potential strategies for overcoming them.

3.2. Further Implications and Future Research

Though the barriers addressed here are by no means new to the arena of research in higher-order cognition, there is a novelty in their collated discussion as impactful barriers in the context of CT, particularly with respect to extant CT research typically focusing on introducing strategies and skills for enhancing CT, rather than identifying ‘preventative measures’ for barriers that can negatively impact CT. Nevertheless, future research is necessary to address how such barriers can be overcome in the context of CT. As addressed above, it is recommended that CT education include discussion of these barriers and encourage self-regulation against them; and, given the vast body of CT research focusing on enhancement through training and education, it seems obvious to make such a recommendation in this context. However, it is also recognised that simply identifying these barriers and encouraging people to engage in RJ and self-regulation to combat them may not suffice. For example, educators might very well succeed in teaching students how to apply CT skills , but just as these educators may not be able to motivate students to use them as often as they might be needed or even to value such skills (such as in attempting to elicit a positive disposition towards CT), it might be the case that without knowing about the impact of the discussed barriers to CT (e.g., emotion and/or intuitive judgment), students may be just as susceptible to biases in their attempts to think critically as others without CT skills. Thus, what such individuals might be applying is not CT at all; rather, just a series of higher-order cognitive skills from a biased or emotion-driven perspective. As a result, a genuine understanding of these barriers is necessary for individuals to appropriately self-regulate their thinking.

Moreover, though the issues of epistemological beliefs, bias, emotion, and intuitive processes are distinct in the manner in which they can impact CT, these do not have set boundaries; thus, an important implication is that they can overlap. For example, epistemological understanding can influence how individuals make decisions in real-world scenarios, such as through intuiting a judgment in social situations (i.e., without considering the nature of the knowledge behind the decision, the manner in which such knowledge interacts [e.g., correlation v. causation], or the level of uncertainty regarding both the decision-maker’s personal stance and the available evidence), when a situation might actually require further consideration or even the honest response of ‘I don’t know’. The latter concept—that of simply responding ‘I don’t know’—is interesting to consider because though it seems, on the surface, to be inconsistent with CT and its outcomes, it is commensurate with many of its associated components (e.g., intellectual honesty and humility; see Paul and Elder 2008 ). In the context this example is used, ‘I don’t know’ refers to epistemological understanding. With that, it may also be impacted by bias and emotion. For example, depending on the topic, an individual may be likely to respond ‘I don’t know’ when they do not have the relevant knowledge or evidence to provide a sufficient answer. However, in the event that the topic is something the individual is emotionally invested in or feels passionate about, an opinion or belief may be shared instead of ‘I don’t know’ (e.g., Kahneman and Frederick 2002 ), despite a lack of requisite evidence-based knowledge (e.g., Kruger and Dunning 1999 ). An emotional response based on belief may be motivated in the sense that the individual knows that they do not know for sure and simply uses a belief to support their reasoning as a persuasive tool. On the other hand, the emotional response based on belief might be used simply because the individual may not know that the use of a belief is an insufficient means of supporting their perspective; instead, they might think that their intuitive, belief-based judgment is as good as a piece of empirical evidence, thus suggesting a lack of empirical understanding. With that, it is fair to say that though epistemological understanding, intuitive judgment, emotion, and bias are distinct concepts, they can influence each other in real-world CT and decision-making. Though there are many more examples of how this might occur, the one presented may further support the recommendation that education can be used to overcome some of the negative effects associated with the barriers presented.

For example, in Ireland, students are not generally taught about academic referencing until they reach third-level education. Anecdotally, I was taught about referencing at age 12 and had to use it all the way through high school when I was growing up in New York. In the context of these referencing lessons, we were taught about the credibility of sources, as well as how to analyse and evaluate arguments and subsequently infer conclusions in light of these sources (i.e., CT skills). We were motivated by our teacher to find the ‘truth’ as best we could (i.e., a fundamental aspect of CT disposition). Now, I recognise that this experience cannot be generalised to larger populations, given that I am a sample size of one, but I do look upon such education, perhaps, as a kind of transformative learning experience ( Casey 2018 ; King 2009 ; Mezirow 1978 , 1990 ) in the sense that such education might have provided a basis for both CT and epistemological understanding. For CT, we use research to support our positions, hence the importance of referencing. When a ‘reference’ is not available, one must ask if there is actual evidence available to support the proposition. If there is not, one must question the basis for why they think or believe that their stance is correct—that is, whether there is logic to the reasoning or whether the proposition is simply an emotion- or bias-based intuitive judgment. So, in addition to referencing, the teaching of some form of epistemology—perhaps early in children’s secondary school careers—might benefit students in future efforts to overcome some barriers to CT. Likewise, presenting examples of the observable impact that bias, emotions, and intuitive thought can have on their thinking might also facilitate overcoming these barriers.

As addressed above, it is acknowledged that we may not be able to ‘teach’ people not to be biased or emotionally driven in their thinking because it occurs naturally ( Kahneman 2011 )—regardless of how ‘skilled’ one might be in CT. For example, though research suggests that components of CT, such as disposition, can improve over relatively short periods of time (e.g., over the duration of a semester-long course; Rimiene 2002 ), less is known about how such components have been enhanced, given the difficulty often associated with trying to teach something like disposition ( Dwyer 2017 ); i.e., to reiterate, it is unlikely that simply ‘teaching’ (or telling) students to be motivated towards CT or to value it (or its associated concepts) will actually enhance it over short periods of time (e.g., semester-long training). Nevertheless, it is reasonable to suggest that, in light of such research, educators can encourage dispositional growth and provide opportunities to develop it. Likewise, it is recommended that educators encourage students to be aware of the cognitive barriers discussed and provide chances to engage in CT scenarios where such barriers are likely to play a role, thus giving students opportunities to acknowledge the barriers and practice overcoming them. Moreover, making students aware of such barriers at younger ages—in a simplified manner—may promote the development of personal perspectives and approaches that are better able to overcome the discussed barriers to CT. This perspective is consistent with research on RJ ( Dwyer et al. 2015 ), in which it was recommended that such enhancement not only requires time to develop (be it over the course of a semester or longer) but is also a function of having increased opportunities to engage CT. In the possibilities described, individuals may both learn to overcome barriers to CT and learn from the positive outcomes of applying CT; and, perhaps, engage in some form of transformative learning ( Casey 2018 ; King 2009 ; Mezirow 1978 , 1990 ) that facilitates an enhanced ‘valuing’ of and motivation towards CT. For example, through growing an understanding of the nature of epistemology, intuitive-based thinking, emotion, bias, and the manner in which people often succumb to faulty reasoning in light of these, individuals may come to better understand the limits of knowledge, barriers to CT and how both understandings can be applied; thus, growing further appreciation of the process as it is needed.

To reiterate, research suggests that there may be a developmental trajectory above and beyond the parameters of a semester-long training course that is required to develop the RJ necessary to think critically and, likewise, engage an adequate epistemological stance and self-regulate against impeding cognitive processes ( Dwyer et al. 2015 ). Though such research suggests that such development may not be an issue of time, but rather of the number of opportunities to engage RJ and CT, there is a dearth of recommendations offered with respect to how this could be achieved in practice. Moreover, the how and what regarding ‘opportunities for engagement’ requires further investigation as well. For example, does this require additional academic work outside the classroom in a formal manner, or does it require informal ‘exploration’ of the world of information on one’s own? If the latter, the issue of motivational and dispositional levels once again comes into question; thus, even further consideration is needed. One way or another, future research efforts are necessary to identify how best to make individuals aware of barriers to CT, encourage them to self-regulate against them, and identify means of increasing opportunities to engage RJ and CT.

4. Conclusions

Taking heed that it is unnecessary to reinvent the CT wheel ( Eigenauer 2017 ), the aim of this review was to further elaborate on the processes associated with CT and make a valuable contribution to its literature with respect to conceptualisation—not just in light of making people explicitly aware of what it is, but also of what it is not and how it can be impeded (e.g., through inadequate CT skills and dispositions; epistemological misunderstanding; intuitive judgment; as well as bias and emotion)—a perspective consistent with that of ‘constructive feedback’, wherein students need to know both what they are doing right and what they are doing wrong. This review further contributes to the CT education literature by identifying the importance of (1) engaging understanding of the nature, limits, and certainty of knowing as individuals traverse the landscape of evidence-bases in their research and ‘truth-seeking’; (2) understanding how emotions and biases can affect CT, regardless of the topic; (3) managing gut-level intuition until RJ has been appropriately engaged; and (4) attending to the manner in which language is used to convey meaning to important and/or abstract concepts (e.g., ‘caring’, ‘proof’, causation/correlation, etc.). Consistent with the perspectives on research advancement presented in this review, it is acknowledged that the issues addressed here may not be complete and may themselves be advanced upon and updated in time; thus, future research is recommended and welcomed to improve and further establish our working conceptualisation of critical thinking, particularly in real-world application.

Acknowledgments

The author would like to acknowledge, with great thanks and appreciation, John Eigenauer (Taft College) for his consult, review and advice regarding earlier versions of this manuscript.

Funding Statement

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Conflicts of Interest

The author declares no conflict of interest.

1 Notably, though inadequacies in cognitive resources (apart from those explicitly set within the conceptualisations of CT discussed; e.g., see Section 2.1 ) are acknowledged as impediments to one’s ability to apply CT (e.g., a lack of relevant background knowledge, as well as broader cognitive abilities and resources ( Dwyer 2017 ; Halpern 2014 ; Stanovich and Stanovich 2010 )), these will not be discussed as focus is largely restricted to issues of cognitive processes that ‘naturally’ act as barriers in their functioning. Moreover, such inadequacies may more so be issues of individual differences than ongoing issues that everyone , regardless of ability, would face in CT (e.g., the impact of emotion and bias). Nevertheless, it is recommended that future research further investigates the influence of such inadequacies in cognitive resources on CT.

2 There is also some research that suggests that emotion may mediate enhanced cognition ( Dolcos et al. 2011 , 2012 ). However, this discrepancy in findings may result from the types of emotion studied—such as task-relevant emotion and task-irrelevant emotion. The distinction between the two is important to consider in terms of, for example, the distinction between one’s general mood and feelings specific to the topic under consideration. Though mood may play a role in the manner in which CT is conducted (e.g., making judgments about a topic one is passionate about may elicit positive or negative emotions that affect the thinker’s mood in some way), notably, this discussion focuses on task-relevant emotion and associated biases that negatively impact the CT process. This is also an important distinction because an individual may generally think critically about ‘important’ topics, but may fail to do so when faced with a cognitive task that requires CT about which the individual has a strong emotional perspective (e.g., in terms of passion, as described above).

Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

References

  • Abrami Philip C., Bernard Robert M., Borokhovski Eugene, Waddington David I., Wade C. Anne, Persson Tonje. Strategies for teaching students to think critically: A meta-analysis. Review of Educational Research. 2015; 85: 275–314. doi: 10.3102/0034654314551063.
  • Abrami Philip C., Bernard Robert M., Borokhovski Evgueni, Wade Anne, Surkes Michael A., Tamim Rana, Zhang Dai. Instructional interventions affecting critical thinking skills and dispositions: A stage 1 meta-analysis. Review of Educational Research. 2008; 78: 1102–34.
  • Afshar Hassan Soodmand, Rahimi Masoud. The relationship among critical thinking, emotional intelligence, and speaking abilities of Iranian EFL learners. Procedia-Social and Behavioral Sciences. 2014; 136: 75–79. doi: 10.1016/j.sbspro.2014.05.291.
  • Ahern Aoife, Dominguez Caroline, McNally Ciaran, O’Sullivan John J., Pedrosa Daniela. A literature review of critical thinking in engineering education. Studies in Higher Education. 2019; 44: 816–28. doi: 10.1080/03075079.2019.1586325.
  • Akbari-Lakeh M., Naderi A., Arbabisarjou A. Critical thinking and emotional intelligence skills and relationship with students’ academic achievement. Prensa Médica Argentina. 2018; 104: 2.
  • Aliakbari Mohammad, Sadeghdaghighi Akram. Teachers’ perception of the barriers to critical thinking. Procedia-Social and Behavioral Sciences. 2013; 70: 1–5. doi: 10.1016/j.sbspro.2013.01.031.
  • Anticevic Alan, Repovs Grega, Corlett Philip R., Barch Deanna M. Negative and nonemotional interference with visual working memory in schizophrenia. Biological Psychiatry. 2011; 70: 1159–68. doi: 10.1016/j.biopsych.2011.07.010.
  • Baril Charles P., Cunningham Billie M., Fordham David R., Gardner Robert L., Wolcott Susan K. Critical thinking in the public accounting profession: Aptitudes and attitudes. Journal of Accounting Education. 1998; 16: 381–406. doi: 10.1016/S0748-5751(98)00023-2.
  • Bar-On Reuven. The Bar-On model of emotional-social intelligence (ESI). Psicothema. 2006; 18: 13–25.
  • Baumeister Roy. The psychology of irrationality: Why people make foolish, self-defeating choices. The Psychology of Economic Decisions. 2003; 1: 3–16.
  • Bezanilla María José, Fernández-Nogueira Donna, Poblete Manuel, Galindo-Domínguez Hector. Methodologies for teaching-learning critical thinking in higher education: The teacher’s view. Thinking Skills and Creativity. 2019; 33: 100584. doi: 10.1016/j.tsc.2019.100584.
  • Brabeck Mary Margaret. The relationship between critical thinking skills and development of reflective judgment among adolescent and adult women; Paper presented at the 89th annual convention of the American Psychological Association; Los Angeles, CA, USA. August 24–26; 1981.
  • Butler Heather A., Dwyer Christopher P., Hogan Michael J., Franco Amanda, Rivas Silvia F., Saiz Carlos, Almeida Leandro S. The Halpern Critical Thinking Assessment and real-world outcomes: Cross-national applications. Thinking Skills and Creativity. 2012; 7: 112–21. doi: 10.1016/j.tsc.2012.04.001.
  • Byerly T. Ryan. Teaching for intellectual virtue in logic and critical thinking classes: Why and how. Teaching Philosophy. 2019; 42: 1. doi: 10.5840/teachphil201911599.
  • Cabantous Laure, Gond Jean-Pascal, Johnson-Cramer Michael. Decision theory as practice: Crafting rationality in organizations. Organization Studies. 2010; 31: 1531–66. doi: 10.1177/0170840610380804.
  • Cáceres Martín, Nussbaum Miguel, Ortiz Jorge. Integrating critical thinking into the classroom: A teacher’s perspective. Thinking Skills and Creativity. 2020; 37: 100674.
  • Cader Raffik, Campbell Steve, Watson Don. Cognitive continuum theory in nursing decision-making. Journal of Advanced Nursing. 2005; 49: 397–405. doi: 10.1111/j.1365-2648.2004.03303.x.
  • Casey Helen. Doctoral dissertation. National University of Ireland; Galway, Ireland: 2018. Transformative Learning: An Exploration of the BA in Community and Family Studies Graduates’ Experiences.
  • Chuah Lisa YM, Dolcos Florin, Chen Annette K., Zheng Hui, Parimal Sarayu, Chee Michael WL. Sleep deprivation and interference by emotional distracters. SLEEP. 2010; 33: 1305–13. doi: 10.1093/sleep/33.10.1305.
  • Clark Andy. Mindware: An Introduction to the Philosophy of Cognitive Science. Oxford University Press; New York: 2001.
  • Commission on Fake News and the Teaching of Critical Literacy in Schools. Fake News and Critical Literacy: Final Report. National Literacy Trust; London: 2018.
  • Cook John, Lewandowsky Stephan. The Debunking Handbook. University of Queensland; St. Lucia: 2011.
  • Cornell Paul, Riordan Monica, Townsend-Gervis Mary, Mobley Robin. Barriers to critical thinking: Workflow interruptions and task switching among nurses. JONA: The Journal of Nursing Administration. 2011; 41: 407–14. doi: 10.1097/NNA.0b013e31822edd42.
  • Croskerry Pat, Singhal Geeta, Mamede Sílvia. Cognitive debiasing 2: Impediments to and strategies for change. BMJ Quality and Safety. 2013; 22: ii65–ii72. doi: 10.1136/bmjqs-2012-001713.
  • Darling-Hammond Linda. How can we teach for meaningful learning? In: Darling-Hammond L., editor. Powerful Learning. Wiley; New York: 2008. pp. 1–10.
  • Dawson Theo L. Prepared in Response to Tasking from ODNI/CHCO/IC Leadership Development Office. Developmental Testing Service, LLC; Northampton: 2008. Metacognition and learning in adulthood.
  • Denkova Ekaterina, Wong Gloria, Dolcos Sanda, Sung Keen, Wang Lihong, Coupland Nicholas, Dolcos Florin. The impact of anxiety-inducing distraction on cognitive performance: A combined brain imaging and personality investigation. PLoS ONE. 2010; 5: e14150. doi: 10.1371/journal.pone.0014150.
  • Dhami Mandeep K., Thomson Mary E. On the relevance of cognitive continuum theory and quasirationality for understanding management judgment and decision making. European Management Journal. 2012; 30: 316–26. doi: 10.1016/j.emj.2012.02.002.
  • Dolcos Florin, Iordan Alexandru D., Dolcos Sanda. Neural correlates of emotion–cognition interactions: A review of evidence from brain imaging investigations. Journal of Cognitive Psychology. 2011; 23 :669–94. doi: 10.1080/20445911.2011.594433. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Dolcos Florin, McCarthy Gregory. Brain systems mediating cognitive interference by emotional distraction. Journal of Neuroscience. 2006; 26 :2072–79. doi: 10.1523/JNEUROSCI.5042-05.2006. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Dolcos Florin, Denkova Ekaterina, Dolcos Sanda. Neural correlates of emotional memories: A review of evidence from brain imaging studies. Psychologia. 2012; 55 :80–111. doi: 10.2117/psysoc.2012.80. [ CrossRef ] [ Google Scholar ]
  • Dumitru Daniela. Critical thinking and integrated programs. The problem of transferability. Procedia-Social and Behavioral Sciences. 2012; 33 :143–47. doi: 10.1016/j.sbspro.2012.01.100. [ CrossRef ] [ Google Scholar ]
  • Dunwoody Philip T., Haarbauer Eric, Mahan Robert P., Marino Christopher, Tang Chu-Chun. Cognitive adaptation and its consequences: A test of cognitive continuum theory. Journal of Behavioral Decision Making. 2000; 13 :35–54. doi: 10.1002/(SICI)1099-0771(200001/03)13:1<35::AID-BDM339>3.0.CO;2-U. [ CrossRef ] [ Google Scholar ]
  • Dwyer Christopher P. Doctoral thesis. National University of Ireland; Galway, Ireland: 2011. The Evaluation of Argument Mapping as a Learning Tool. [ Google Scholar ]
  • Dwyer Christopher P. Critical Thinking: Conceptual Perspectives and Practical Guidelines. Cambridge University Press; Cambridge: 2017. [ Google Scholar ]
  • Dwyer Christopher P. Teaching critical thinking. The SAGE Encyclopedia of Higher Education. 2020; 4 :1510–12. [ Google Scholar ]
  • Dwyer Christopher P., Walsh Anne. A case study of the effects of critical thinking instruction through adult distance learning on critical thinking performance: Implications for critical thinking development. Educational Technology and Research. 2019; 68 :17–35. doi: 10.1007/s11423-019-09659-2. [ CrossRef ] [ Google Scholar ]
  • Dwyer Christopher P., Eigenauer John D. To Teach or not to Teach Critical Thinking: A Reply to Huber and Kuncel. Thinking Skills and Creativity. 2017; 26 :92–95. doi: 10.1016/j.tsc.2017.08.002. [ CrossRef ] [ Google Scholar ]
  • Dwyer Christopher P., Hogan Michael J., Stewart Ian. An evaluation of argument mapping as a method of enhancing critical thinking performance in e-learning environments. Metacognition and Learning. 2012; 7 :219–44. doi: 10.1007/s11409-012-9092-1. [ CrossRef ] [ Google Scholar ]
  • Dwyer Christopher P., Hogan Michael J., Stewart Ian. An integrated critical thinking framework for the 21st century. Thinking Skills and Creativity. 2014; 12 :43–52. doi: 10.1016/j.tsc.2013.12.004. [ CrossRef ] [ Google Scholar ]
  • Dwyer Christopher P., Hogan Michael J., Stewart Ian. The evaluation of argument mapping-infused critical thinking instruction as a method of enhancing reflective judgment performance. Thinking Skills & Creativity. 2015; 16 :11–26. [ Google Scholar ]
  • Dwyer Christopher. P., Hogan Michael J., Harney Owen M., Kavanagh Caroline. Facilitating a Student-Educator Conceptual Model of Dispositions towards Critical Thinking through Interactive Management. Educational Technology & Research. 2016; 65 :47–73. [ Google Scholar ]
  • Eigenauer John D. Don’t reinvent the critical thinking wheel: What scholarly literature says about critical thinking instruction. NISOD Innovation Abstracts. 2017; 39 :2. [ Google Scholar ]
  • Elder Linda. Critical thinking: The key to emotional intelligence. Journal of Developmental Education. 1997; 21 :40. doi: 10.5840/inquiryctnews199616211. [ CrossRef ] [ Google Scholar ]
  • Ennis Robert H. A taxonomy of critical thinking dispositions and abilities. In: Baron J. B., Sternberg R. J., editors. Teaching Thinking Skills: Theory and Practice. W.H. Freeman; New York: 1987. pp. 9–26. [ Google Scholar ]
  • Ennis Robert H. Critical Thinking. Prentice-Hall; Upper Saddle River: 1996. [ Google Scholar ]
  • Ennis Robert H. Is critical thinking culturally biased? Teaching Philosophy. 1998; 21 :15–33. doi: 10.5840/teachphil19982113. [ CrossRef ] [ Google Scholar ]
  • Ennis Robert. H. Critical thinking across the curriculum: A vision. Topoi. 2018; 37 :165–84. doi: 10.1007/s11245-016-9401-4. [ CrossRef ] [ Google Scholar ]
  • Facione Noreen C., Facione Peter A. Analyzing explanations for seemingly irrational choices: Linking argument analysis and cognitive science. International Journal of Applied Philosophy. 2001; 15 :267–68. [ Google Scholar ]
  • Facione Peter A. The Delphi Report: Committee on Pre-College Philosophy. California Academic Press; Millbrae: 1990. [ Google Scholar ]
  • Facione Peter A., Facione Noreen C. CCTDI: A Disposition Inventory. California Academic Press; Millbrae: 1992. [ Google Scholar ]
  • Facione Peter A., Facione Noreen C., Blohm Stephen W., Giancarlo Carol Ann F. The California Critical Thinking Skills Test: CCTST. California Academic Press; San Jose: 2002. [ Google Scholar ]
  • Feyerherm Ann E., Rice Cheryl L. Emotional intelligence and team performance: The good, the bad and the ugly. International Journal of Organizational Analysis. 2002; 10 :343–63. doi: 10.1108/eb028957. [ CrossRef ] [ Google Scholar ]
  • Flavell John H. Metacognitive aspects of problem solving. The Nature of Intelligence. 1976:231–36. [ Google Scholar ]
  • Gabennesch Howard. Critical thinking… what is it good for? (In fact, what is it?) Skeptical Inquirer. 2006; 30 :36–41. [ Google Scholar ]
  • Gadzella Bernadette M. Teaching and Learning Critical Thinking Skills. 1996.
  • Gambrill Eileen. Evidence-based practice and policy: Choices ahead. Research on Social Work Practice. 2006; 16 :338–57. [ Google Scholar ]
  • Ghanizadeh Afsaneh, Moafian Fatemeh. Critical thinking and emotional intelligence: Investigating the relationship among EFL learners and the contribution of age and gender. Iranian Journal of Applied Linguistics. 2011; 14 :23–48. [ Google Scholar ]
  • Gilovich Thomas, Griffin Dale, Kahneman Daniel., editors. Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge University Press; Cambridge: 2002. [ Google Scholar ]
  • Glaser Edward. M. An Experiment in the Development of Critical Thinking. Teachers College of Columbia University, Bureau of Publications; New York: 1941. [ Google Scholar ]
  • Goleman Daniel. Emotional Intelligence. Bantam; New York: 1995. [ Google Scholar ]
  • Halpern Diane F. Thought & Knowledge: An Introduction to Critical Thinking. 5th ed. Psychology Press; London: 2014. [ Google Scholar ]
  • Hamm Robert M. Clinical intuition and clinical analysis: Expertise and the cognitive continuum. In: Dowie J., Elstein A. S., editors. Professional Judgment: A Reader in Clinical Decision Making. Cambridge University Press; Cambridge: 1988. pp. 78–105. [ Google Scholar ]
  • Hammond Kenneth R. Principles of Organization in Intuitive and Analytical Cognition. Center for Research on Judgment and Policy, University of Colorado; Boulder: 1981. Report No. 231. [ Google Scholar ]
  • Hammond Kenneth R. Upon reflection. Thinking and Reasoning. 1996; 2 :239–48. doi: 10.1080/135467896394537. [ CrossRef ] [ Google Scholar ]
  • Hammond Kenneth R. Judgments Under Stress. Oxford University Press on Demand; New York: 2000. [ Google Scholar ]
  • Haynes Ada, Lisic Elizabeth, Goltz Michele, Stein Barry, Harris Kevin. Moving beyond assessment to improving students’ critical thinking skills: A model for implementing change. Journal of the Scholarship of Teaching and Learning. 2016; 16 :44–61. doi: 10.14434/josotl.v16i4.19407. [ CrossRef ] [ Google Scholar ]
  • Hitchcock David. The effectiveness of computer-assisted instruction in critical thinking. Informal Logic. 2004; 24 :183–218. doi: 10.22329/il.v24i3.2145. [ CrossRef ] [ Google Scholar ]
  • Huffman Karen, Vernoy Mark W., William Barbara F. Studying Psychology in Action: A Study Guide to Accompany Psychology in Action. Wiley; Hoboken: 1991. [ Google Scholar ]
  • Iordan Alexandru D., Dolcos Sanda, Dolcos Florin. Neural signatures of the response to emotional distraction: A review of evidence from brain imaging investigations. Frontiers in Human Neuroscience. 2013; 7 :200. doi: 10.3389/fnhum.2013.00200. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Johnson Marcia K., Raye Carol L., Mitchell Karen J., Greene Erich J., Cunningham William A., Sanislow Charles A. Using fMRI to investigate a component process of reflection: Prefrontal correlates of refreshing a just-activated representation. Cognitive, Affective, & Behavioral Neuroscience. 2005; 5 :339–61. doi: 10.3758/CABN.5.3.339. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Jukes I., McCain T. Minds in Play: Computer Game Design as a Context of Children’s Learning. Erlbaum; Hillsdale: 2002. [ Google Scholar ]
  • Kahneman Daniel. Thinking Fast and Slow. Penguin; Great Britain: 2011. [ Google Scholar ]
  • Kahneman Daniel, Frederick Shane. Representativeness revisited: Attribute substitution in intuitive judgment. In: Gilovich T., Griffin D., Kahneman D., editors. Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge University Press; New York: 2002. pp. 49–81. [ Google Scholar ]
  • Kahneman Daniel, Slovic Paul, Tversky Amos., editors. Judgment under Uncertainty: Heuristics and Biases. Cambridge University Press; Cambridge: 1982. [ Google Scholar ]
  • Kaya Hülya, Şenyuva Emine, Bodur Gönül. Developing critical thinking disposition and emotional intelligence of nursing students: A longitudinal research. Nurse Education Today. 2017; 48 :72–77. doi: 10.1016/j.nedt.2016.09.011. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • King Kathleen P. Adult Education Special Topics: Theory, Research, and Practice in Lifelong Learning. Information Age Publishing; Charlotte: 2009. The Handbook of the Evolving Research of Transformative Learning Based on the Learning Activities Survey. [ Google Scholar ]
  • King Patricia M., Kitchener Karen S. Reflective judgment: Theory and research on the development of epistemic assumptions through adulthood. Educational Psychologist. 2004; 39 :5–18. doi: 10.1207/s15326985ep3901_2. [ CrossRef ] [ Google Scholar ]
  • King Patricia M., Wood Phillip K., Mines Robert A. Critical thinking among college and graduate students. The Review of Higher Education. 1990; 13 :167–86. doi: 10.1353/rhe.1990.0026. [ CrossRef ] [ Google Scholar ]
  • King Patricia. M., Kitchener Karen. Developing Reflective Judgment: Understanding and Promoting Intellectual Growth and Critical Thinking in Adolescents and Adults. Jossey Bass; San Francisco: 1994. [ Google Scholar ]
  • Kruger Justin, Dunning David. Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-Assessments. Journal of Personality and Social Psychology. 1999; 77 :1121–34. doi: 10.1037/0022-3514.77.6.1121. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Ku Kelly Y. L. Assessing students’ critical thinking performance: Urging for measurements using multi-response format. Thinking Skills and Creativity. 2009; 4 :70–76. doi: 10.1016/j.tsc.2009.02.001. [ CrossRef ] [ Google Scholar ]
  • Ku Kelly Y. L., Ho Irene T. Dispositional factors predicting Chinese students’ critical thinking performance. Personality and Individual Differences. 2010a; 48 :54–58. doi: 10.1016/j.paid.2009.08.015. [ CrossRef ] [ Google Scholar ]
  • Ku Kelly Y. L., Ho Irene T. Metacognitive strategies that enhance critical thinking. Metacognition and Learning. 2010b; 5 :251–67. doi: 10.1007/s11409-010-9060-6. [ CrossRef ] [ Google Scholar ]
  • Kuhn Deanna. A developmental model of critical thinking. Educational Researcher. 1999; 28 :16–25. doi: 10.3102/0013189X028002016. [ CrossRef ] [ Google Scholar ]
  • Kuhn Deanna. Metacognitive development. Current Directions in Psychological Science. 2000; 9 :178–81. doi: 10.1111/1467-8721.00088. [ CrossRef ] [ Google Scholar ]
  • Leventhal Howard. A perceptual-motor theory of emotion. Advances in Experimental Social Psychology. 1984; 17 :117–82. [ Google Scholar ]
  • Lloyd Margaret, Bahr Nan. Thinking critically about critical thinking in higher education. International Journal for the Scholarship of Teaching and Learning. 2010; 4 :1–16. [ Google Scholar ]
  • Loftus Elizabeth. F. Eavesdropping on memory. Annual Review of Psychology. 2017; 68 :1–18. doi: 10.1146/annurev-psych-010416-044138. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Ma Lihong, Luo Haifeng. Chinese pre-service teachers’ cognitions about cultivating critical thinking in teaching English as a foreign language. Asia Pacific Journal of Education. 2021; 41 :543–57. doi: 10.1080/02188791.2020.1793733. [ CrossRef ] [ Google Scholar ]
  • Ma Lihong, Liu Ning. Teacher belief about integrating critical thinking in English teaching in China. Journal of Education for Teaching. 2022; 49 :137–52. doi: 10.1080/02607476.2022.2044267. [ CrossRef ] [ Google Scholar ]
  • Mahmood Khalid. Do people overestimate their information literacy skills? A systematic review of empirical evidence on the Dunning-Kruger effect. Communications in Information Literacy. 2016; 10 :199–213. doi: 10.15760/comminfolit.2016.10.2.24. [ CrossRef ] [ Google Scholar ]
  • Mangena Agnes, Chabeli Mary M. Strategies to overcome obstacles in the facilitation of critical thinking in nursing education. Nurse Education Today. 2005; 25 :291–98. doi: 10.1016/j.nedt.2005.01.012. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • McGuinness Carol. Teaching thinking: Learning how to think; Presented at the Psychological Society of Ireland and British Psychological Association's Public Lecture Series; Galway, Ireland. March 6; 2013. [ Google Scholar ]
  • Mezirow Jack. Perspective Transformation. Adult Education. 1978; 28 :100–10. doi: 10.1177/074171367802800202. [ CrossRef ] [ Google Scholar ]
  • Mezirow Jack. How Critical Reflection Triggers Transformative Learning. In: Mezirow J., editor. Fostering Critical Reflection in Adulthood. Jossey Bass; San Francisco: 1990. pp. 1–20. [ Google Scholar ]
  • Most Steven B., Chun Marvin M., Widders David M., Zald David H. Attentional rubbernecking: Cognitive control and personality in emotion-induced blindness. Psychonomic Bulletin and Review. 2005; 12 :654–61. doi: 10.3758/BF03196754. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Niu Lian, Behar-Horenstein Linda S., Garvan Cyndi W. Do instructional interventions influence college students’ critical thinking skills? A meta-analysis. Educational Research Review. 2013; 9 :114–28. doi: 10.1016/j.edurev.2012.12.002. [ CrossRef ] [ Google Scholar ]
  • Norris Stephen P. Critical Thinking: Current Research, Theory, and Practice. Kluwer; Dordrecht: 1994. The meaning of critical thinking test performance: The effects of abilities and dispositions on scores; pp. 315–29. [ Google Scholar ]
  • Nyhan Brendan, Reifler Jason, Richey Sean, Freed Gary L. Effective messages in vaccine promotion: A randomized trial. Pediatrics. 2014; 133 :E835–E842. doi: 10.1542/peds.2013-2365. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Ortiz Claudia Maria Alvarez. Master’s thesis. University of Melbourne; Melbourne, VIC, Australia: 2007. Does Philosophy Improve Critical Thinking Skills? [ Google Scholar ]
  • Paul Richard W. Critical Thinking: What Every Person Needs to Survive in a Rapidly Changing World. Foundation for Critical Thinking; Santa Barbara: 1993. [ Google Scholar ]
  • Paul Richard, Elder Linda. Critical Thinking. The Foundation for Critical Thinking; Dillon Beach: 2008. [ Google Scholar ]
  • Perkins David N., Jay Eileen, Tishman Shari. Beyond abilities: A dispositional theory of thinking. Merrill Palmer Quarterly. 1993; 39 :1. [ Google Scholar ]
  • Perkins David, Ritchhart Ron. Motivation, Emotion, and Cognition. Routledge; London: 2004. When is good thinking? pp. 365–98. [ Google Scholar ]
  • Popper Karl R. The Logic of Scientific Discovery. Routledge; London: 1959. First published 1934. [ Google Scholar ]
  • Popper Karl R. All Life Is Problem Solving. Psychology Press; London: 1999. [ Google Scholar ]
  • Quinn Sarah, Hogan Michael, Dwyer Christopher, Finn Patrick, Fogarty Emer. Development and Validation of the Student-Educator Negotiated Critical Thinking Dispositions Scale (SENCTDS) Thinking Skills and Creativity. 2020; 38 :100710. doi: 10.1016/j.tsc.2020.100710. [ CrossRef ] [ Google Scholar ]
  • Rear David. One size fits all? The limitations of standardised assessment in critical thinking. Assessment & Evaluation in Higher Education. 2019; 44 :664–75. [ Google Scholar ]
  • Reed Jennifer H., Kromrey Jeffrey D. Teaching critical thinking in a community college history course: Empirical evidence from infusing Paul’s model. College Student Journal. 2001; 35 :201–15. [ Google Scholar ]
  • Rimiene Vaiva. Assessing and developing students’ critical thinking. Psychology Learning & Teaching. 2002; 2 :17–22. [ Google Scholar ]
  • Rowe Matthew P., Gillespie B. Marcus, Harris Kevin R., Koether Steven D., Shannon Li-Jen Y., Rose Lori A. Redesigning a general education science course to promote critical thinking. CBE—Life Sciences Education. 2015; 14 :ar30. doi: 10.1187/cbe.15-02-0032. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Saleh Salamah Embark. Critical thinking as a 21st century skill: Conceptions, implementation and challenges in the EFL classroom. European Journal of Foreign Language Teaching. 2019; 4 :1. doi: 10.5281/zenodo.2542838. [ CrossRef ] [ Google Scholar ]
  • Salovey Peter, Mayer John D. Emotional intelligence. Imagination, Cognition and Personality. 1990; 9 :185–211. doi: 10.2190/DUGG-P24E-52WK-6CDG. [ CrossRef ] [ Google Scholar ]
  • Schutte Nicola S., Malouff John M., Hall Lena E., Haggerty Donald J., Cooper Joan T., Golden Charles J., Dornheim Liane. Development and validation of a measure of emotional intelligence. Personality and Individual Differences. 1998; 25 :167–77. doi: 10.1016/S0191-8869(98)00001-4. [ CrossRef ] [ Google Scholar ]
  • Shackman Alexander J., Sarinopoulos Issidoros, Maxwell Jeffrey S., Pizzagalli Diego A., Lavric Aureliu, Davidson Richard J. Anxiety selectively disrupts visuospatial working memory. Emotion. 2006; 6 :40–61. doi: 10.1037/1528-3542.6.1.40. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Siegel Harvey. What (good) are thinking dispositions? Educational Theory. 1999; 49 :207–21. doi: 10.1111/j.1741-5446.1999.00207.x. [ CrossRef ] [ Google Scholar ]
  • Simon Herbert A. Models of Man. Wiley; New York: 1957. [ Google Scholar ]
  • Slovic Paul, Fischhoff Baruch, Lichtenstein Sarah. Behavioral decision theory. Annual Review of Psychology. 1977; 28 :1–39. doi: 10.1146/annurev.ps.28.020177.000245. [ CrossRef ] [ Google Scholar ]
  • Slovic Paul, Finucane Melissa, Peters Ellen, MacGregor Donald G. Rational actors or rational fools: Implications of the affect heuristic for behavioral economics. The Journal of Socio-Economics. 2002; 31 :329–42. doi: 10.1016/S1053-5357(02)00174-9. [ CrossRef ] [ Google Scholar ]
  • Solon Tom. Generic critical thinking infusion and course content learning in Introductory Psychology. Journal of Instructional Psychology. 2007; 34 :95–109. [ Google Scholar ]
  • Stanovich Keith E. Miserliness in human cognition: The interaction of detection, override and mindware. Thinking & Reasoning. 2018; 24 :423–44. [ Google Scholar ]
  • Stanovich Keith E., Stanovich Paula J. A framework for critical thinking, rational thinking, and intelligence. In: Preiss D. D., Sternberg R. J., editors. Innovations in Educational Psychology: Perspectives on Learning, Teaching, and Human Development. Springer Publishing Company; Berlin/Heidelberg: 2010. pp. 195–237. [ Google Scholar ]
  • Stanovich Keith E., West Richard F. Individual differences in reasoning: Implications for the rationality debate? Behavioral and Brain Sciences. 2000; 23 :645–65. doi: 10.1017/S0140525X00003435. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Stanovich Keith E., West Richard F., Toplak Maggie E. The Rationality Quotient: Toward a Test of Rational Thinking. MIT Press; Cambridge: 2016. [ Google Scholar ]
  • Stedman Nicole LP, Andenoro Anthony C. Identification of relationships between emotional intelligence skill and critical thinking disposition in undergraduate leadership students. Journal of Leadership Education. 2007; 6 :190–208. doi: 10.12806/V6/I1/RF10. [ CrossRef ] [ Google Scholar ]
  • Strack Fritz, Martin Leonard L., Schwarz Norbert. Priming and communication: Social determinants of information use in judgments of life satisfaction. European Journal of Social Psychology. 1988; 18 :429–42. doi: 10.1002/ejsp.2420180505. [ CrossRef ] [ Google Scholar ]
  • Swami Viren, Furnham Adrian. Political paranoia and conspiracy theories. In: van Prooijen J. W., van Lange P. A. M., editors. Power, Politics, and Paranoia: Why People Are Suspicious of Their Leaders. Cambridge University Press; Cambridge: 2014. pp. 218–36. [ Google Scholar ]
  • Sweller John. Cognitive load theory: Recent theoretical advances. In: Plass J. L., Moreno R., Brünken R., editors. Cognitive Load Theory. Cambridge University Press; New York: 2010. pp. 29–47. [ Google Scholar ]
  • Tavris Carol, Aronson Elliot. Mistakes Were Made (But Not by Me) Harcourt; Orlando: 2007. [ Google Scholar ]
  • Teichert Tobias, Ferrera Vincent P., Grinband Jack. Humans optimize decision-making by delaying decision onset. PLoS ONE. 2014; 9 :e89638. doi: 10.1371/journal.pone.0089638. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Tversky Amos, Kahneman Daniel. Judgment under uncertainty: Heuristics and biases. Science. 1974; 185 :1124–31. doi: 10.1126/science.185.4157.1124. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Valenzuela Jorge, Nieto Ana, Saiz Carlos. Critical Thinking Motivational Scale: A contribution to the study of relationship between critical thinking and motivation. Journal of Research in Educational Psychology. 2011; 9 :823–48. doi: 10.25115/ejrep.v9i24.1475. [ CrossRef ] [ Google Scholar ]
  • Varian Hal, Lyman Peter. How Much Information? School of Information Management and Systems, UC Berkeley; Berkeley: 2003. [ Google Scholar ]
  • Vohs Kathleen D., Baumeister Roy F., Schmeichel Brandon J., Twenge Jean M., Nelson Noelle M., Tice Dianne M. Making choices impairs subsequent self-control: A limited-resource account of decision making, self-regulation, and active initiative. Personality Processes and Individual Differences. 2014; 94 :883–98. doi: 10.1037/2333-8113.1.S.19. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Yao Xiaonan, Yuan Shuge, Yang Wenjing, Chen Qunlin, Wei Dongtao, Hou Yuling, Zhang Lijie, Qiu Jiang, Yang Dong. Emotional intelligence moderates the relationship between regional gray matter volume in the bilateral temporal pole and critical thinking disposition. Brain Imaging and Behavior. 2018; 12 :488–98. doi: 10.1007/s11682-017-9701-3. [ PubMed ] [ CrossRef ] [ Google Scholar ]

  • Review Article
  • Open access
  • Published: 11 January 2023

The effectiveness of collaborative problem solving in promoting students’ critical thinking: A meta-analysis based on empirical literature

  • Enwei Xu (ORCID: orcid.org/0000-0001-6424-8169),
  • Wei Wang &
  • Qingxia Wang

Humanities and Social Sciences Communications, volume 10, Article number: 16 (2023)



Collaborative problem-solving has been widely embraced in the classroom instruction of critical thinking, which is regarded as the core of curriculum reform based on key competencies in the field of education as well as a key competence for learners in the 21st century. However, the effectiveness of collaborative problem-solving in promoting students' critical thinking remains uncertain. This study presents the major findings of a meta-analysis of 36 empirical studies published in international educational journals during the 21st century, conducted to identify the effectiveness of collaborative problem-solving in promoting students' critical thinking and to determine, based on evidence, whether and to what extent collaborative problem-solving results in a rise or decrease in critical thinking. The findings show that (1) collaborative problem-solving is an effective teaching approach to foster students' critical thinking, with a significant overall effect size (ES = 0.82, z = 12.78, P < 0.01, 95% CI [0.69, 0.95]); (2) with respect to the dimensions of critical thinking, collaborative problem-solving can significantly and successfully enhance students' attitudinal tendencies (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]), but it falls short in improving students' cognitive skills, having only an upper-middle impact (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]); and (3) the teaching type (chi 2 = 7.20, P < 0.05), intervention duration (chi 2 = 12.18, P < 0.01), subject area (chi 2 = 13.36, P < 0.05), group size (chi 2 = 8.77, P < 0.05), and learning scaffold (chi 2 = 9.03, P < 0.01) all have an impact on critical thinking and can be viewed as important moderating factors that affect how critical thinking develops. On the basis of these results, recommendations are made for further study and instruction to better support students' critical thinking in the context of collaborative problem-solving.


Introduction

Although critical thinking has a long history in research, the concept of critical thinking, which is regarded as an essential competence for learners in the 21st century, has recently attracted more attention from researchers and teaching practitioners (National Research Council, 2012 ). Critical thinking should be the core of curriculum reform based on key competencies in the field of education (Peng and Deng, 2017 ) because students with critical thinking can not only understand the meaning of knowledge but also effectively solve practical problems in real life even after knowledge is forgotten (Kek and Huijser, 2011 ). The definition of critical thinking is not universal (Ennis, 1989 ; Castle, 2009 ; Niu et al., 2013 ). In general, the definition of critical thinking is a self-aware and self-regulated thought process (Facione, 1990 ; Niu et al., 2013 ). It refers to the cognitive skills needed to interpret, analyze, synthesize, reason, and evaluate information as well as the attitudinal tendency to apply these abilities (Halpern, 2001 ). The view that critical thinking can be taught and learned through curriculum teaching has been widely supported by many researchers (e.g., Kuncel, 2011 ; Leng and Lu, 2020 ), leading to educators’ efforts to foster it among students. In the field of teaching practice, there are three types of courses for teaching critical thinking (Ennis, 1989 ). The first is an independent curriculum in which critical thinking is taught and cultivated without involving the knowledge of specific disciplines; the second is an integrated curriculum in which critical thinking is integrated into the teaching of other disciplines as a clear teaching goal; and the third is a mixed curriculum in which critical thinking is taught in parallel to the teaching of other disciplines for mixed teaching training. Furthermore, numerous measuring tools have been developed by researchers and educators to measure critical thinking in the context of teaching practice. These include standardized measurement tools, such as WGCTA, CCTST, CCTT, and CCTDI, which have been verified by repeated experiments and are considered effective and reliable by international scholars (Facione and Facione, 1992 ). In short, descriptions of critical thinking, including its two dimensions of attitudinal tendency and cognitive skills, different types of teaching courses, and standardized measurement tools provide a complex normative framework for understanding, teaching, and evaluating critical thinking.

Cultivating critical thinking in curriculum teaching can start with a problem, and one of the most popular critical thinking instructional approaches is problem-based learning (Liu et al., 2020). Duch et al. (2001) noted that problem-based learning in group collaboration is progressive active learning, which can improve students' critical thinking and problem-solving skills. Collaborative problem-solving is the organic integration of collaborative learning and problem-based learning; it places learners at the center of the learning process and uses ill-structured, real-world problems as the starting point for learning (Liang et al., 2017). Students learn the knowledge needed to solve problems in a collaborative group, reach a consensus on problems in the field, and form solutions through social cooperation methods, such as dialogue, interpretation, questioning, debate, negotiation, and reflection, thus promoting the development of learners' domain knowledge and critical thinking (Cindy, 2004; Liang et al., 2017).

Collaborative problem-solving has been widely used in the teaching practice of critical thinking, and several studies have attempted to conduct a systematic review and meta-analysis of the empirical literature on critical thinking from various perspectives. However, little attention has been paid to the impact of collaborative problem-solving on critical thinking. Therefore, the best approach for developing and enhancing critical thinking throughout collaborative problem-solving is to examine how to implement critical thinking instruction; however, this issue is still unexplored, which means that many teachers are incapable of better instructing critical thinking (Leng and Lu, 2020 ; Niu et al., 2013 ). For example, Huber ( 2016 ) provided the meta-analysis findings of 71 publications on gaining critical thinking over various time frames in college with the aim of determining whether critical thinking was truly teachable. These authors found that learners significantly improve their critical thinking while in college and that critical thinking differs with factors such as teaching strategies, intervention duration, subject area, and teaching type. The usefulness of collaborative problem-solving in fostering students’ critical thinking, however, was not determined by this study, nor did it reveal whether there existed significant variations among the different elements. A meta-analysis of 31 pieces of educational literature was conducted by Liu et al. ( 2020 ) to assess the impact of problem-solving on college students’ critical thinking. These authors found that problem-solving could promote the development of critical thinking among college students and proposed establishing a reasonable group structure for problem-solving in a follow-up study to improve students’ critical thinking. Additionally, previous empirical studies have reached inconclusive and even contradictory conclusions about whether and to what extent collaborative problem-solving increases or decreases critical thinking levels. As an illustration, Yang et al. ( 2008 ) carried out an experiment on the integrated curriculum teaching of college students based on a web bulletin board with the goal of fostering participants’ critical thinking in the context of collaborative problem-solving. These authors’ research revealed that through sharing, debating, examining, and reflecting on various experiences and ideas, collaborative problem-solving can considerably enhance students’ critical thinking in real-life problem situations. In contrast, collaborative problem-solving had a positive impact on learners’ interaction and could improve learning interest and motivation but could not significantly improve students’ critical thinking when compared to traditional classroom teaching, according to research by Naber and Wyatt ( 2014 ) and Sendag and Odabasi ( 2009 ) on undergraduate and high school students, respectively.

The above studies show that there is inconsistency regarding the effectiveness of collaborative problem-solving in promoting students’ critical thinking. Therefore, it is essential to conduct a thorough and trustworthy review to detect and decide whether and to what degree collaborative problem-solving can result in a rise or decrease in critical thinking. Meta-analysis is a quantitative analysis approach that is utilized to examine quantitative data from various separate studies that are all focused on the same research topic. This approach characterizes the effectiveness of its impact by averaging the effect sizes of numerous qualitative studies in an effort to reduce the uncertainty brought on by independent research and produce more conclusive findings (Lipsey and Wilson, 2001 ).

This paper used a meta-analytic approach and carried out a meta-analysis to examine the effectiveness of collaborative problem-solving in promoting students’ critical thinking in order to make a contribution to both research and practice. The following research questions were addressed by this meta-analysis:

What is the overall effect size of collaborative problem-solving in promoting students’ critical thinking and its impact on the two dimensions of critical thinking (i.e., attitudinal tendency and cognitive skills)?

How are the disparities between the study conclusions impacted by various moderating variables if the impacts of various experimental designs in the included studies are heterogeneous?

This research followed the strict procedures (e.g., database searching, identification, screening, eligibility, merging, duplicate removal, and analysis of included studies) of Cooper’s ( 2010 ) proposed meta-analysis approach for examining quantitative data from various separate studies that are all focused on the same research topic. The relevant empirical research that appeared in worldwide educational periodicals within the 21st century was subjected to this meta-analysis using Rev-Man 5.4. The consistency of the data extracted separately by two researchers was tested using Cohen’s kappa coefficient, and a publication bias test and a heterogeneity test were run on the sample data to ascertain the quality of this meta-analysis.
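For readers unfamiliar with the agreement statistic mentioned above, the following is a brief reminder of how Cohen's kappa is conventionally defined (a standard formula, not one taken from the authors' materials):

```latex
\kappa = \frac{p_o - p_e}{1 - p_e}
```

where p_o is the observed proportion of coding decisions on which the two researchers agree and p_e is the proportion of agreement expected by chance given each researcher's marginal coding frequencies; kappa = 1 indicates perfect agreement and kappa = 0 indicates chance-level agreement.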

Data sources and search strategies

There were three stages to the data collection process for this meta-analysis, as shown in Fig. 1, which presents the number of articles included and eliminated during the selection process based on the stated study eligibility criteria.

Figure 1. Flowchart showing the number of records identified, included, and excluded during the selection process.

First, the databases used to systematically search for relevant articles were the journal papers of the Web of Science Core Collection and the Chinese Core source journal, as well as the Chinese Social Science Citation Index (CSSCI) source journal papers included in CNKI. These databases were selected because they are credible platforms that are sources of scholarly and peer-reviewed information with advanced search tools and contain literature relevant to the subject of our topic from reliable researchers and experts. The search string with the Boolean operator used in the Web of Science was “TS = (((“critical thinking” or “ct” and “pretest” or “posttest”) or (“critical thinking” or “ct” and “control group” or “quasi experiment” or “experiment”)) and (“collaboration” or “collaborative learning” or “CSCL”) and (“problem solving” or “problem-based learning” or “PBL”))”. The research area was “Education Educational Research”, and the search period was “January 1, 2000, to December 30, 2021”. A total of 412 papers were obtained. The search string with the Boolean operator used in the CNKI was “SU = (‘critical thinking’*‘collaboration’ + ‘critical thinking’*‘collaborative learning’ + ‘critical thinking’*‘CSCL’ + ‘critical thinking’*‘problem solving’ + ‘critical thinking’*‘problem-based learning’ + ‘critical thinking’*‘PBL’ + ‘critical thinking’*‘problem oriented’) AND FT = (‘experiment’ + ‘quasi experiment’ + ‘pretest’ + ‘posttest’ + ‘empirical study’)” (translated into Chinese when searching). A total of 56 studies were found throughout the search period of “January 2000 to December 2021”. From the databases, all duplicates and retractions were eliminated before exporting the references into Endnote, a program for managing bibliographic references. In all, 466 studies were found.

Second, the studies that matched the inclusion and exclusion criteria for the meta-analysis were chosen by two researchers after they had reviewed the abstracts and titles of the gathered articles, yielding a total of 126 studies.

Third, two researchers thoroughly reviewed each included article’s whole text in accordance with the inclusion and exclusion criteria. Meanwhile, a snowball search was performed using the references and citations of the included articles to ensure complete coverage of the articles. Ultimately, 36 articles were kept.

Two researchers worked together to carry out this entire process, and a consensus rate of approximately 94.7% was reached after discussion and negotiation to resolve any emerging differences.

Eligibility criteria

Since not all the retrieved studies matched the criteria for this meta-analysis, eligibility criteria for both inclusion and exclusion were developed as follows:

The publication language of the included studies was limited to English and Chinese, and the full text had to be obtainable. Articles that did not meet the language criterion and articles not published between 2000 and 2021 were excluded.

The research design of the included studies must be empirical and quantitative studies that can assess the effect of collaborative problem-solving on the development of critical thinking. Articles that could not identify the causal mechanisms by which collaborative problem-solving affects critical thinking, such as review articles and theoretical articles, were excluded.

The research method of the included studies must feature a randomized control experiment or a quasi-experiment, or a natural experiment, which have a higher degree of internal validity with strong experimental designs and can all plausibly provide evidence that critical thinking and collaborative problem-solving are causally related. Articles with non-experimental research methods, such as purely correlational or observational studies, were excluded.

The participants of the included studies were only students in school, including K-12 students and college students. Articles in which the participants were non-school students, such as social workers or adult learners, were excluded.

The research results of the included studies must mention definite signs that may be utilized to gauge critical thinking’s impact (e.g., sample size, mean value, or standard deviation). Articles that lacked specific measurement indicators for critical thinking and could not calculate the effect size were excluded.

Data coding design

In order to perform a meta-analysis, it is necessary to collect the most important information from the articles, codify that information’s properties, and convert descriptive data into quantitative data. Therefore, this study designed a data coding template (see Table 1 ). Ultimately, 16 coding fields were retained.

The designed data-coding template consisted of three pieces of information. Basic information about the papers was included in the descriptive information: the publishing year, author, serial number, and title of the paper.

The variable information for the experimental design had three variables: the independent variable (instruction method), the dependent variable (critical thinking), and the moderating variable (learning stage, teaching type, intervention duration, learning scaffold, group size, measuring tool, and subject area). Depending on the topic of this study, the intervention strategy, as the independent variable, was coded into collaborative and non-collaborative problem-solving. The dependent variable, critical thinking, was coded as a cognitive skill and an attitudinal tendency. And seven moderating variables were created by grouping and combining the experimental design variables discovered within the 36 studies (see Table 1 ), where learning stages were encoded as higher education, high school, middle school, and primary school or lower; teaching types were encoded as mixed courses, integrated courses, and independent courses; intervention durations were encoded as 0–1 weeks, 1–4 weeks, 4–12 weeks, and more than 12 weeks; group sizes were encoded as 2–3 persons, 4–6 persons, 7–10 persons, and more than 10 persons; learning scaffolds were encoded as teacher-supported learning scaffold, technique-supported learning scaffold, and resource-supported learning scaffold; measuring tools were encoded as standardized measurement tools (e.g., WGCTA, CCTT, CCTST, and CCTDI) and self-adapting measurement tools (e.g., modified or made by researchers); and subject areas were encoded according to the specific subjects used in the 36 included studies.

The data information contained three metrics for measuring critical thinking: sample size, average value, and standard deviation. It is vital to remember that studies with various experimental designs frequently adopt different formulas to determine the effect size; this paper used Morris's proposed standardized mean difference (SMD) calculation formula (2008, p. 369; see Supplementary Table S3).
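For orientation, the following is a hedged sketch of the pretest-posttest-control SMD commonly attributed to Morris (2008); the exact variant actually applied by the authors is specified in their Supplementary Table S3:

```latex
d_{ppc} = c_p \,
  \frac{(M_{post,T} - M_{pre,T}) - (M_{post,C} - M_{pre,C})}{SD_{pre}},
\qquad
c_p \approx 1 - \frac{3}{4(n_T + n_C - 2) - 1}
```

where T and C denote the treatment and control groups, SD_pre is the pooled pretest standard deviation, and c_p is a small-sample bias correction.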

Procedure for extracting and coding data

According to the data coding template (see Table 1 ), the 36 papers’ information was retrieved by two researchers, who then entered them into Excel (see Supplementary Table S1 ). The results of each study were extracted separately in the data extraction procedure if an article contained numerous studies on critical thinking, or if a study assessed different critical thinking dimensions. For instance, Tiwari et al. ( 2010 ) used four time points, which were viewed as numerous different studies, to examine the outcomes of critical thinking, and Chen ( 2013 ) included the two outcome variables of attitudinal tendency and cognitive skills, which were regarded as two studies. After discussion and negotiation during data extraction, the two researchers’ consistency test coefficients were roughly 93.27%. Supplementary Table S2 details the key characteristics of the 36 included articles with 79 effect quantities, including descriptive information (e.g., the publishing year, author, serial number, and title of the paper), variable information (e.g., independent variables, dependent variables, and moderating variables), and data information (e.g., mean values, standard deviations, and sample size). Following that, testing for publication bias and heterogeneity was done on the sample data using the Rev-Man 5.4 software, and then the test results were used to conduct a meta-analysis.

Publication bias test

Publication bias arises when the sample of studies included in a meta-analysis does not accurately reflect the general state of research on the relevant subject, and it can compromise the reliability and accuracy of the meta-analysis. For this reason, the sample data must be checked for publication bias (Stewart et al., 2006). A popular check is the funnel plot: publication bias is unlikely when the data points are distributed symmetrically on either side of the average effect size and are concentrated in the upper region of the funnel. In the funnel plot associated with this analysis (see Fig. 2), the data are evenly dispersed within the upper portion of the funnel, indicating that publication bias is unlikely in this case.

Figure 2. Funnel plot showing the publication bias test for the 79 effect quantities across the 36 studies.
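As an illustration of how such a plot is constructed, the sketch below is a minimal Python version rather than the Rev-Man 5.4 output the authors actually used; the function name and inputs are hypothetical:

```python
import matplotlib.pyplot as plt

def funnel_plot(effect_sizes, standard_errors, pooled_effect):
    """Scatter study effect sizes against their standard errors.

    In the absence of publication bias, points should scatter
    symmetrically around the pooled effect, with precise (low-SE)
    studies clustering near the top of the funnel.
    """
    fig, ax = plt.subplots()
    ax.scatter(effect_sizes, standard_errors, alpha=0.6)
    ax.axvline(pooled_effect, linestyle="--", label="pooled effect")
    ax.invert_yaxis()  # convention: most precise studies at the top
    ax.set_xlabel("Standardized mean difference (SMD)")
    ax.set_ylabel("Standard error")
    ax.legend()
    return fig
```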

Heterogeneity test

To select the appropriate effect models for the meta-analysis, one might use the results of a heterogeneity test on the data effect sizes. In a meta-analysis, it is common practice to gauge the degree of data heterogeneity using the I 2 value, and I 2  ≥ 50% is typically understood to denote medium-high heterogeneity, which calls for the adoption of a random effect model; if not, a fixed effect model ought to be applied (Lipsey and Wilson, 2001 ). The findings of the heterogeneity test in this paper (see Table 2 ) revealed that I 2 was 86% and displayed significant heterogeneity ( P  < 0.01). To ensure accuracy and reliability, the overall effect size ought to be calculated utilizing the random effect model.
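For reference, the heterogeneity statistics reported here are conventionally computed from Cochran's Q (a standard definition, not the authors' own notation):

```latex
Q = \sum_{i=1}^{k} w_i (d_i - \bar{d})^2,
\qquad
I^2 = \max\!\left(0,\; \frac{Q - (k - 1)}{Q}\right) \times 100\%
```

where d_i is the effect size of study i, w_i = 1/v_i is its inverse-variance weight, d-bar is the fixed-effect weighted mean, and k is the number of effect sizes; values of I^2 at or above 50% are typically read as medium-to-high heterogeneity, consistent with the decision rule described above.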

The analysis of the overall effect size

This meta-analysis utilized a random effect model to examine 79 effect quantities from 36 studies after eliminating heterogeneity. In accordance with Cohen’s criterion (Cohen, 1992 ), it is abundantly clear from the analysis results, which are shown in the forest plot of the overall effect (see Fig. 3 ), that the cumulative impact size of cooperative problem-solving is 0.82, which is statistically significant ( z  = 12.78, P  < 0.01, 95% CI [0.69, 0.95]), and can encourage learners to practice critical thinking.
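To make the pooling step concrete, the sketch below shows a DerSimonian-Laird random-effects aggregation of study-level effect sizes. This is a minimal illustration under standard assumptions, not the Rev-Man 5.4 procedure the authors actually ran, and the function name is hypothetical:

```python
import numpy as np

def random_effects_pool(effects, variances):
    """Pool study-level effect sizes with a DerSimonian-Laird random-effects model.

    `effects` holds the per-study SMDs and `variances` their sampling
    variances. Returns the pooled effect, its 95% CI, the z statistic,
    and the I^2 heterogeneity statistic.
    """
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)

    # Fixed-effect weights, weighted mean, and Cochran's Q
    w = 1.0 / variances
    fixed_mean = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed_mean) ** 2)
    df = len(effects) - 1

    # Between-study variance (tau^2) and I^2
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)
    i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0

    # Random-effects weights and pooled estimate
    w_star = 1.0 / (variances + tau2)
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    z = pooled / se
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)
    return pooled, ci, z, i2
```

Applied to the 79 extracted effect quantities and their variances, a routine of this kind would yield the type of summary reported above (a pooled ES with its 95% CI and z statistic).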

Figure 3. Forest plot showing the analysis of the overall effect size across the 36 studies.

In addition, this study examined two distinct dimensions of critical thinking to better understand the precise contributions that collaborative problem-solving makes to the growth of critical thinking. The findings (see Table 3) indicate that collaborative problem-solving improves cognitive skills (ES = 0.70) and attitudinal tendency (ES = 1.17), with significant intergroup differences (chi 2 = 7.95, P < 0.01). Although collaborative problem-solving improves both dimensions of critical thinking, it is essential to point out that the improvements in students' attitudinal tendency are much more pronounced and have a significant comprehensive effect (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]), whereas the gains in learners' cognitive skills are more modest, sitting just above the medium level (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]).

The analysis of moderator effect size

The 79 effect quantities in the full forest plot underwent a two-tailed test, which revealed significant heterogeneity (I 2 = 86%, z = 12.78, P < 0.01), indicating differences between effect sizes that may have been influenced by moderating factors other than sampling error. Therefore, subgroup analysis was used to explore the possible moderating factors that might produce considerable heterogeneity, namely the learning stage, learning scaffold, teaching type, group size, intervention duration, measuring tool, and subject area included in the 36 experimental designs, in order to further identify the key factors that influence critical thinking. The findings (see Table 4) indicate that the various moderating factors have advantageous effects on critical thinking. Specifically, the subject area (chi 2 = 13.36, P < 0.05), group size (chi 2 = 8.77, P < 0.05), intervention duration (chi 2 = 12.18, P < 0.01), learning scaffold (chi 2 = 9.03, P < 0.01), and teaching type (chi 2 = 7.20, P < 0.05) are all significant moderators that can be applied to support the cultivation of critical thinking. However, since the learning stage and the measuring tool did not show significant intergroup differences (chi 2 = 3.15, P = 0.21 > 0.05, and chi 2 = 0.08, P = 0.78 > 0.05), we are unable to explain why these two factors would be crucial in supporting the cultivation of critical thinking in the context of collaborative problem-solving. The precise outcomes are as follows:

Various learning stages influenced critical thinking positively, without significant intergroup differences (chi 2  = 3.15, P  = 0.21 > 0.05). High school was first on the list of effect sizes (ES = 1.36, P  < 0.01), then higher education (ES = 0.78, P  < 0.01), and middle school (ES = 0.73, P  < 0.01). These results show that, despite the learning stage’s beneficial influence on cultivating learners’ critical thinking, we are unable to explain why it is essential for cultivating critical thinking in the context of collaborative problem-solving.

Different teaching types had varying degrees of positive impact on critical thinking, with significant intergroup differences (chi 2  = 7.20, P  < 0.05). The effect size was ranked as follows: mixed courses (ES = 1.34, P  < 0.01), integrated courses (ES = 0.81, P  < 0.01), and independent courses (ES = 0.27, P  < 0.01). These results indicate that the most effective approach to cultivate critical thinking utilizing collaborative problem solving is through the teaching type of mixed courses.

Various intervention durations significantly improved critical thinking, and there were significant intergroup differences (chi 2  = 12.18, P  < 0.01). The effect sizes related to this variable showed a tendency to increase with longer intervention durations. The improvement in critical thinking reached a significant level (ES = 0.85, P  < 0.01) after more than 12 weeks of training. These findings indicate that the intervention duration and critical thinking’s impact are positively correlated, with a longer intervention duration having a greater effect.

Different learning scaffolds influenced critical thinking positively, with significant intergroup differences (chi 2  = 9.03, P  < 0.01). The resource-supported learning scaffold (ES = 0.69, P  < 0.01) acquired a medium-to-higher level of impact, the technique-supported learning scaffold (ES = 0.63, P  < 0.01) also attained a medium-to-higher level of impact, and the teacher-supported learning scaffold (ES = 0.92, P  < 0.01) displayed a high level of significant impact. These results show that the learning scaffold with teacher support has the greatest impact on cultivating critical thinking.

Various group sizes influenced critical thinking positively, and the intergroup differences were statistically significant (chi 2 = 8.77, P < 0.05). Critical thinking showed a general declining trend with increasing group size. The overall effect size for groups of 2–3 people was the largest (ES = 0.99, P < 0.01), and when the group size was greater than 7 people, the improvement in critical thinking was at the lower-middle level (ES < 0.5, P < 0.01). These results show that the impact on critical thinking is negatively associated with group size: as the group grows larger, the overall impact declines.

Various measuring tools influenced critical thinking positively, without significant intergroup differences (chi 2 = 0.08, P = 0.78 > 0.05). In this situation, the self-adapting measurement tools obtained an upper-medium level of effect (ES = 0.78), whereas the overall effect size of the standardized measurement tools was the largest, achieving a significant level of effect (ES = 0.84, P < 0.01). These results show that, despite the beneficial influence of the measuring tool on cultivating critical thinking, we are unable to explain why it is crucial in fostering the growth of critical thinking by utilizing the approach of collaborative problem-solving.

Different subject areas had varying degrees of impact on critical thinking, and the intergroup differences were statistically significant (chi 2 = 13.36, P < 0.05). Mathematics had the greatest overall impact, achieving a significant level of effect (ES = 1.68, P < 0.01), followed by science (ES = 1.25, P < 0.01) and medical science (ES = 0.87, P < 0.01), both of which also achieved a significant level of effect. Programming technology was the least effective (ES = 0.39, P < 0.01), having only a medium-low degree of effect compared to education (ES = 0.72, P < 0.01) and other fields (such as language, art, and social sciences) (ES = 0.58, P < 0.01). These results suggest that scientific fields (e.g., mathematics, science) may be the most effective subject areas for cultivating critical thinking utilizing the approach of collaborative problem-solving.

The effectiveness of collaborative problem solving with regard to teaching critical thinking

According to this meta-analysis, using collaborative problem solving as an intervention strategy in critical thinking teaching has a considerable impact on cultivating learners' critical thinking as a whole and has a favorable effect on both dimensions of critical thinking. Previous studies have reported that collaborative problem solving, the most frequently used critical thinking teaching strategy in curriculum instruction, can considerably enhance students' critical thinking (e.g., Liang et al., 2017; Liu et al., 2020; Cindy, 2004). This meta-analysis provides convergent evidence for those views. Thus, the findings not only address the first research question regarding the overall effect of collaborative problem solving on critical thinking and on its two dimensions (i.e., attitudinal tendency and cognitive skills), but also strengthen confidence in cultivating critical thinking through collaborative problem-solving interventions in classroom teaching.

Furthermore, the improvement in attitudinal tendency is considerably larger, whereas the corresponding improvement in cognitive skill is more modest. Some studies suggest that cognitive skill differs from attitudinal tendency in classroom instruction: the former is a key ability that develops through gradual accumulation, while the latter, as an attitude, is affected by the context of the teaching situation (e.g., a novel and exciting teaching approach, challenging and rewarding tasks) (Halpern, 2001; Wei and Hong, 2022). Collaborative problem solving as a teaching approach is exciting and interesting as well as challenging and rewarding: it places learners at the center and asks them to examine ill-structured problems in real situations, which can inspire students to realize their potential for problem solving and thereby significantly improve their attitudinal tendency toward solving problems (Liu et al., 2020). Just as collaborative problem solving influences attitudinal tendency, attitudinal tendency in turn affects cognitive skill when attempting to solve a problem (Liu et al., 2020; Zhang et al., 2022), and stronger attitudinal tendencies are associated with improved learning achievement and cognitive ability in students (Sison, 2008; Zhang et al., 2022). It can be seen that collaborative problem solving affects critical thinking as a whole as well as its two specific dimensions, and this study illuminates the nuanced links between cognitive skills and attitudinal tendencies. To fully develop students' capacity for critical thinking, future empirical research should pay closer attention to cognitive skills.

The moderating effects of collaborative problem solving with regard to teaching critical thinking

To further explore the key factors that influence critical thinking, subgroup analysis was used to examine possible moderating effects that might produce considerable heterogeneity. The findings show that the moderating factors examined across the 36 experimental designs (teaching type, learning stage, group size, learning scaffold, intervention duration, measuring tool, and subject area) could all support the cultivation of critical thinking through collaborative problem solving. Among them, however, the effect-size differences for learning stage and measuring tool are not significant, so the results cannot explain whether these two factors are crucial in supporting the cultivation of critical thinking through collaborative problem solving.

In terms of the learning stage, the various learning stages all influenced critical thinking positively, but without significant intergroup differences, so the data cannot explain whether the learning stage is crucial in fostering the growth of critical thinking.

Although higher education accounts for 70.89% of all the empirical studies included, high school may be the most appropriate learning stage for fostering students' critical thinking through collaborative problem solving, since it has the largest overall effect size. This pattern may be related to students' cognitive development and needs to be examined further in follow-up research.

With regard to teaching type, mixed course teaching may be the best method for cultivating students' critical thinking. Relevant studies have shown that when students are trained in thinking methods alone, the methods they learn remain isolated from subject knowledge, which hinders transfer; conversely, when thinking is trained only within subject teaching, without systematic method training, it is difficult to apply to real-world circumstances (Ruggiero, 2012; Hu and Liu, 2015). Teaching critical thinking as a mixed course, in parallel with other subject teaching, can achieve the best effect on learners' critical thinking, and explicit critical thinking instruction is more effective than less explicit instruction (Bensley and Spero, 2014).

In terms of intervention duration, the overall effect size shows an upward tendency with longer intervention times; thus, intervention duration is positively related to the impact on critical thinking. As a key competency for students in the 21st century, critical thinking is difficult to improve meaningfully within a brief intervention; instead, it develops over a lengthy period of time through consistent teaching and the progressive accumulation of knowledge (Halpern, 2001; Hu and Liu, 2015). Future empirical studies should therefore take this into account and provide critical thinking instruction over longer periods.

With regard to group size, groups of 2–3 people have the highest effect size, and the comprehensive effect size decreases with increasing group size in general. This outcome is in line with previous findings; for example, a group composed of two to four members has been found most appropriate for collaborative learning (Schellens and Valcke, 2006). However, the meta-analysis results also indicate that groups of more than 7 people still produced a positive effect, so smaller groups do not invariably yield better interaction and performance than larger ones. This may be because learning scaffolds (technique support, resource support, and teacher support) improve the frequency and effectiveness of interaction among group members, and a collaborative group with more members may increase the diversity of views, which helps cultivate critical thinking through collaborative problem solving.

With regard to the learning scaffold, all three kinds of learning scaffolds can enhance critical thinking. Among them, the teacher-supported scaffold has the largest overall effect size, demonstrating the interdependence of effective learning scaffolds and collaborative problem solving. This outcome is in line with previous findings: encouraging learners to collaborate, generate solutions, and develop critical thinking skills through learning scaffolds is a successful strategy (Reiser, 2004; Xu et al., 2022); learning scaffolds can lower task complexity and reduce unpleasant feelings while drawing students into learning activities (Wood et al., 2006); and learning scaffolds are designed to help students use learning approaches more successfully to adapt to the collaborative problem-solving process. Teacher-supported scaffolds have the greatest influence on critical thinking in this process because they are more targeted, informative, and timely (Xu et al., 2022).

With respect to the measuring tool, although standardized measurement tools (such as the WGCTA, CCTT, and CCTST) have been acknowledged as reliable and valid by experts worldwide, only 54.43% of the studies included in this meta-analysis adopted them for assessment, and the results indicated no intergroup differences. These results suggest that not all teaching circumstances are appropriate for measuring critical thinking with standardized tools. According to Simpson and Courtney (2002, p. 91), "the measuring tools for measuring thinking ability have limits in assessing learners in educational situations and should be adapted appropriately to accurately assess the changes in learners' critical thinking." As a result, in order to gauge more fully and precisely how learners' critical thinking has evolved, standardized measuring tools should be properly adapted to collaborative problem-solving learning contexts.

With regard to the subject area, the comprehensive effect size for scientific subjects (e.g., mathematics, science, medical science) is larger than for language arts and the social sciences. Recent international education reforms have noted that critical thinking is a basic component of scientific literacy. Students with scientific literacy can justify their judgments with accurate evidence and reasonable standards when they face challenges or ill-structured problems (Kyndt et al., 2013), which makes critical thinking crucial for developing scientific understanding and applying it to practical problem solving in matters related to science, technology, and society (Yore et al., 2007).

Suggestions for critical thinking teaching

Beyond the points made in the discussion above, the following suggestions are offered for critical thinking instruction using the approach of collaborative problem solving.

First, teachers should place special emphasis on the two core elements, collaboration and problem solving, and design authentic problems based on collaborative situations. This meta-analysis provides evidence that collaborative problem solving has a strong synergistic effect on promoting students' critical thinking. Asking questions about real situations and allowing learners to take part in critical discussions of real problems during class instruction are key ways to teach critical thinking, rather than simply having students read speculative articles without practice (Mulnix, 2012). Furthermore, students' critical thinking improves through cognitive conflict with other learners in the problem situation (Yang et al., 2008). It is therefore essential for teachers to design real problems and encourage students to discuss, negotiate, and argue within collaborative problem-solving situations.

Second, teachers should design and implement mixed courses to cultivate learners' critical thinking through collaborative problem solving. Critical thinking can be taught through curriculum instruction (Kuncel, 2011; Leng and Lu, 2020), with the goal of cultivating critical thinking that transfers flexibly to real problem-solving situations. This meta-analysis shows that mixed course teaching has a highly substantial impact on the cultivation and promotion of learners' critical thinking. Therefore, teachers should combine real collaborative problem-solving situations with the knowledge content of specific disciplines in conventional teaching, teach methods and strategies of critical thinking based on ill-structured problems to help students master critical thinking, and provide practical activities in which students interact with one another to build knowledge and develop critical thinking.

Third, teachers, particularly preservice teachers, should receive more training in critical thinking, and they should also be conscious of the ways in which teacher-supported learning scaffolds can promote critical thinking. The teacher-supported learning scaffold had the greatest impact on learners' critical thinking, in addition to being more directive, targeted, and timely (Wood et al., 2006). Critical thinking can only be taught effectively when teachers recognize its significance for students' growth and use appropriate approaches when designing instructional activities (Forawi, 2016). Therefore, to enable teachers to create learning scaffolds that cultivate learners' critical thinking through collaborative problem solving, it is essential to concentrate on teacher-supported scaffolds and to strengthen instruction in teaching critical thinking for teachers, especially preservice teachers.

Implications and limitations

There are certain limitations in this meta-analysis that future research can address. First, the search languages were restricted to English and Chinese, so pertinent studies written in other languages may have been overlooked, limiting the number of articles available for review. Second, some data provided by the included studies are missing, such as whether teachers were trained in the theory and practice of critical thinking, the average age and gender of learners, and differences in critical thinking among learners of various ages and genders. Third, as is typical for review articles, additional studies were released while this meta-analysis was being conducted, so the review is bounded in time. As relevant research develops, future studies addressing these issues are highly relevant and needed.

Conclusions

This study addressed the question of how effectively collaborative problem solving promotes students' critical thinking, a topic that had received scant attention in earlier research. The following conclusions can be drawn:

Regarding the results obtained, collaborative problem solving is an effective teaching approach for fostering learners' critical thinking, with a significant overall effect size (ES = 0.82, z = 12.78, P < 0.01, 95% CI [0.69, 0.95]). With respect to the dimensions of critical thinking, collaborative problem solving can significantly and effectively improve students' attitudinal tendency, with a significant comprehensive effect (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]); its effect on students' cognitive skills is more limited, reaching only an upper-middle level of impact (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]).
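As a quick consistency check (a back-of-the-envelope sketch, not part of the authors' reported analysis), the pooled effect size, its z statistic, and its 95% confidence interval are tied together through the standard error: z = ES / SE and CI ≈ ES ± 1.96 × SE. Plugging in the reported values reproduces the published intervals for the overall effect and for the two dimensions:

```python
# Back-of-the-envelope check (not from the paper): recover the standard error
# implied by each reported effect size and z value, then rebuild the 95% CI.
for label, es, z in [("overall", 0.82, 12.78),
                     ("attitudinal tendency", 1.17, 7.62),
                     ("cognitive skills", 0.70, 11.55)]:
    se = es / z                                   # z = ES / SE
    lo, hi = es - 1.96 * se, es + 1.96 * se       # normal-approximation 95% CI
    print(f"{label}: SE ≈ {se:.3f}, 95% CI ≈ [{lo:.2f}, {hi:.2f}]")
# Prints intervals of about [0.69, 0.95], [0.87, 1.47], and [0.58, 0.82],
# matching the values reported above.
```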

As demonstrated by both the results and the discussion, all seven moderating factors examined across the 36 studies had beneficial effects on students' critical thinking to varying degrees. Teaching type (χ² = 7.20, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), and learning scaffold (χ² = 9.03, P < 0.01) all showed significant intergroup differences and can be viewed as important moderating factors affecting how critical thinking develops. Since the learning stage (χ² = 3.15, P = 0.21 > 0.05) and measuring tool (χ² = 0.08, P = 0.78 > 0.05) did not show significant intergroup differences, the data cannot explain whether these two factors are crucial in supporting the cultivation of critical thinking in the context of collaborative problem solving.

Data availability

All data generated or analyzed during this study are included within the article and its supplementary information files, and the supplementary information files are available in the Dataverse repository: https://doi.org/10.7910/DVN/IPFJO6 .

Bensley DA, Spero RA (2014) Improving critical thinking skills and meta-cognitive monitoring through direct infusion. Think Skills Creat 12:55–68. https://doi.org/10.1016/j.tsc.2014.02.001

Castle A (2009) Defining and assessing critical thinking skills for student radiographers. Radiography 15(1):70–76. https://doi.org/10.1016/j.radi.2007.10.007

Chen XD (2013) An empirical study on the influence of PBL teaching model on critical thinking ability of non-English majors. J PLA Foreign Lang College 36 (04):68–72

Cohen A (1992) Antecedents of organizational commitment across occupational groups: a meta-analysis. J Organ Behav. https://doi.org/10.1002/job.4030130602

Cooper H (2010) Research synthesis and meta-analysis: a step-by-step approach, 4th edn. Sage, London, England

Cindy HS (2004) Problem-based learning: what and how do students learn? Educ Psychol Rev 51(1):31–39

Duch BJ, Gron SD, Allen DE (2001) The power of problem-based learning: a practical “how to” for teaching undergraduate courses in any discipline. Stylus Educ Sci 2:190–198

Ennis RH (1989) Critical thinking and subject specificity: clarification and needed research. Educ Res 18(3):4–10. https://doi.org/10.3102/0013189x018003004

Facione PA (1990) Critical thinking: a statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations. Eric document reproduction service. https://eric.ed.gov/?id=ed315423

Facione PA, Facione NC (1992) The California Critical Thinking Dispositions Inventory (CCTDI) and the CCTDI test manual. California Academic Press, Millbrae, CA

Forawi SA (2016) Standard-based science education and critical thinking. Think Skills Creat 20:52–62. https://doi.org/10.1016/j.tsc.2016.02.005

Halpern DF (2001) Assessing the effectiveness of critical thinking instruction. J Gen Educ 50(4):270–286. https://doi.org/10.2307/27797889

Hu WP, Liu J (2015) Cultivation of pupils’ thinking ability: a five-year follow-up study. Psychol Behav Res 13(05):648–654. https://doi.org/10.3969/j.issn.1672-0628.2015.05.010

Huber K (2016) Does college teach critical thinking? A meta-analysis. Rev Educ Res 86(2):431–468. https://doi.org/10.3102/0034654315605917

Kek MYCA, Huijser H (2011) The power of problem-based learning in developing critical thinking skills: preparing students for tomorrow’s digital futures in today’s classrooms. High Educ Res Dev 30(3):329–341. https://doi.org/10.1080/07294360.2010.501074

Kuncel NR (2011) Measurement and meaning of critical thinking (Research report for the NRC 21st Century Skills Workshop). National Research Council, Washington, DC

Kyndt E, Raes E, Lismont B, Timmers F, Cascallar E, Dochy F (2013) A meta-analysis of the effects of face-to-face cooperative learning. Do recent studies falsify or verify earlier findings? Educ Res Rev 10(2):133–149. https://doi.org/10.1016/j.edurev.2013.02.002

Leng J, Lu XX (2020) Is critical thinking really teachable?—A meta-analysis based on 79 experimental or quasi experimental studies. Open Educ Res 26(06):110–118. https://doi.org/10.13966/j.cnki.kfjyyj.2020.06.011

Liang YZ, Zhu K, Zhao CL (2017) An empirical study on the depth of interaction promoted by collaborative problem solving learning activities. J E-educ Res 38(10):87–92. https://doi.org/10.13811/j.cnki.eer.2017.10.014

Lipsey M, Wilson D (2001) Practical meta-analysis. International Educational and Professional, London, pp. 92–160

Liu Z, Wu W, Jiang Q (2020) A study on the influence of problem based learning on college students’ critical thinking-based on a meta-analysis of 31 studies. Explor High Educ 03:43–49

Morris SB (2008) Estimating effect sizes from pretest-posttest-control group designs. Organ Res Methods 11(2):364–386. https://doi.org/10.1177/1094428106291059

Mulnix JW (2012) Thinking critically about critical thinking. Educ Philos Theory 44(5):464–479. https://doi.org/10.1111/j.1469-5812.2010.00673.x

Naber J, Wyatt TH (2014) The effect of reflective writing interventions on the critical thinking skills and dispositions of baccalaureate nursing students. Nurse Educ Today 34(1):67–72. https://doi.org/10.1016/j.nedt.2013.04.002

National Research Council (2012) Education for life and work: developing transferable knowledge and skills in the 21st century. The National Academies Press, Washington, DC

Niu L, Behar HLS, Garvan CW (2013) Do instructional interventions influence college students’ critical thinking skills? A meta-analysis. Educ Res Rev 9(12):114–128. https://doi.org/10.1016/j.edurev.2012.12.002

Peng ZM, Deng L (2017) Towards the core of education reform: cultivating critical thinking skills as the core of skills in the 21st century. Res Educ Dev 24:57–63. https://doi.org/10.14121/j.cnki.1008-3855.2017.24.011

Reiser BJ (2004) Scaffolding complex learning: the mechanisms of structuring and problematizing student work. J Learn Sci 13(3):273–304. https://doi.org/10.1207/s15327809jls1303_2

Ruggiero VR (2012) The art of thinking: a guide to critical and creative thought, 4th edn. Harper Collins College Publishers, New York

Schellens T, Valcke M (2006) Fostering knowledge construction in university students through asynchronous discussion groups. Comput Educ 46(4):349–370. https://doi.org/10.1016/j.compedu.2004.07.010

Sendag S, Odabasi HF (2009) Effects of an online problem based learning course on content knowledge acquisition and critical thinking skills. Comput Educ 53(1):132–141. https://doi.org/10.1016/j.compedu.2009.01.008

Sison R (2008) Investigating Pair Programming in a Software Engineering Course in an Asian Setting. 2008 15th Asia-Pacific Software Engineering Conference, pp. 325–331. https://doi.org/10.1109/APSEC.2008.61

Simpson E, Courtney M (2002) Critical thinking in nursing education: literature review. Int J Nurs Pract 8(2):89–98

Stewart L, Tierney J, Burdett S (2006) Do systematic reviews based on individual patient data offer a means of circumventing biases associated with trial publications? Publication bias in meta-analysis. John Wiley and Sons Inc, New York, pp. 261–286

Tiwari A, Lai P, So M, Yuen K (2010) A comparison of the effects of problem-based learning and lecturing on the development of students’ critical thinking. Med Educ 40(6):547–554. https://doi.org/10.1111/j.1365-2929.2006.02481.x

Wood D, Bruner JS, Ross G (2006) The role of tutoring in problem solving. J Child Psychol Psychiatry 17(2):89–100. https://doi.org/10.1111/j.1469-7610.1976.tb00381.x

Wei T, Hong S (2022) The meaning and realization of teachable critical thinking. Educ Theory Practice 10:51–57

Xu EW, Wang W, Wang QX (2022) A meta-analysis of the effectiveness of programming teaching in promoting K-12 students’ computational thinking. Educ Inf Technol. https://doi.org/10.1007/s10639-022-11445-2

Yang YC, Newby T, Bill R (2008) Facilitating interactions through structured web-based bulletin boards: a quasi-experimental study on promoting learners’ critical thinking skills. Comput Educ 50(4):1572–1585. https://doi.org/10.1016/j.compedu.2007.04.006

Yore LD, Pimm D, Tuan HL (2007) The literacy component of mathematical and scientific literacy. Int J Sci Math Educ 5(4):559–589. https://doi.org/10.1007/s10763-007-9089-4

Zhang T, Zhang S, Gao QQ, Wang JH (2022) Research on the development of learners’ critical thinking in online peer review. Audio Visual Educ Res 6:53–60. https://doi.org/10.13811/j.cnki.eer.2022.06.08

Acknowledgements

This research was supported by the graduate scientific research and innovation project of Xinjiang Uygur Autonomous Region named “Research on in-depth learning of high school information technology courses for the cultivation of computing thinking” (No. XJ2022G190) and the independent innovation fund project for doctoral students of the College of Educational Science of Xinjiang Normal University named “Research on project-based teaching of high school information technology courses from the perspective of discipline core literacy” (No. XJNUJKYA2003).

Author information

Authors and affiliations

College of Educational Science, Xinjiang Normal University, 830017, Urumqi, Xinjiang, China

Enwei Xu, Wei Wang & Qingxia Wang

Corresponding authors

Correspondence to Enwei Xu or Wei Wang.

Ethics declarations

Competing interests

The authors declare no competing interests.

Ethical approval

This article does not contain any studies with human participants performed by any of the authors.

Additional information

Publisher's note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary tables

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Cite this article

Xu, E., Wang, W. & Wang, Q. The effectiveness of collaborative problem solving in promoting students' critical thinking: A meta-analysis based on empirical literature. Humanit Soc Sci Commun 10, 16 (2023). https://doi.org/10.1057/s41599-023-01508-1

Received: 07 August 2022

Accepted: 04 January 2023

Published: 11 January 2023

DOI: https://doi.org/10.1057/s41599-023-01508-1
