
1 Introduction to Critical Thinking

I. What Is Critical Thinking? [1]

Critical thinking is the ability to think clearly and rationally about what to do or what to believe.  It includes the ability to engage in reflective and independent thinking. Someone with critical thinking skills is able to do the following:

  • Understand the logical connections between ideas.
  • Identify, construct, and evaluate arguments.
  • Detect inconsistencies and common mistakes in reasoning.
  • Solve problems systematically.
  • Identify the relevance and importance of ideas.
  • Reflect on the justification of one’s own beliefs and values.

Critical thinking is not simply a matter of accumulating information. A person with a good memory and who knows a lot of facts is not necessarily good at critical thinking. Critical thinkers are able to deduce consequences from what they know, make use of information to solve problems, and to seek relevant sources of information to inform themselves.

Critical thinking should not be confused with being argumentative or being critical of other people. Although critical thinking skills can be used in exposing fallacies and bad reasoning, critical thinking can also play an important role in cooperative reasoning and constructive tasks. Critical thinking can help us acquire knowledge, improve our theories, and strengthen arguments. We can also use critical thinking to enhance work processes and improve social institutions.

Some people believe that critical thinking hinders creativity because critical thinking requires following the rules of logic and rationality, whereas creativity might require breaking those rules. This is a misconception. Critical thinking is quite compatible with thinking “out-of-the-box,” challenging consensus views, and pursuing less popular approaches. If anything, critical thinking is an essential part of creativity because we need critical thinking to evaluate and improve our creative ideas.

II. The Importance of Critical Thinking

Critical thinking is a domain-general thinking skill. The ability to think clearly and rationally is important whatever we choose to do. If you work in education, research, finance, management or the legal profession, then critical thinking is obviously important. But critical thinking skills are not restricted to a particular subject area. Being able to think well and solve problems systematically is an asset for any career.

Critical thinking is very important in the new knowledge economy.  The global knowledge economy is driven by information and technology. One has to be able to deal with changes quickly and effectively. The new economy places increasing demands on flexible intellectual skills, and the ability to analyze information and integrate diverse sources of knowledge in solving problems. Good critical thinking promotes such thinking skills, and is very important in the fast-changing workplace.

Critical thinking enhances language and presentation skills. Thinking clearly and systematically can improve the way we express our ideas. In learning how to analyze the logical structure of texts, critical thinking also improves comprehension abilities.

Critical thinking promotes creativity. To come up with a creative solution to a problem involves not just having new ideas. It must also be the case that the new ideas being generated are useful and relevant to the task at hand. Critical thinking plays a crucial role in evaluating new ideas, selecting the best ones and modifying them if necessary.

Critical thinking is crucial for self-reflection. In order to live a meaningful life and to structure our lives accordingly, we need to justify and reflect on our values and decisions. Critical thinking provides the tools for this process of self-evaluation.

Good critical thinking is the foundation of science and democracy. Science requires the critical use of reason in experimentation and theory confirmation. The proper functioning of a liberal democracy requires citizens who can think critically about social issues to inform their judgments about proper governance and to overcome biases and prejudice.

Critical thinking is a metacognitive skill. What this means is that it is a higher-level cognitive skill that involves thinking about thinking. We have to be aware of the good principles of reasoning, and be reflective about our own reasoning. In addition, we often need to make a conscious effort to improve ourselves, avoid biases, and maintain objectivity. This is notoriously hard to do. We are all able to think but to think well often requires a long period of training. The mastery of critical thinking is similar to the mastery of many other skills. There are three important components: theory, practice, and attitude.

III. Improving Our Thinking Skills

If we want to think correctly, we need to follow the correct rules of reasoning. Knowledge of theory includes knowledge of these rules. These are the basic principles of critical thinking, such as the laws of logic, and the methods of scientific reasoning, etc.

Also, it would be useful to know something about what not to do if we want to reason correctly. This means we should have some basic knowledge of the mistakes that people make. First, this requires some knowledge of typical fallacies. Second, psychologists have discovered persistent biases and limitations in human reasoning. An awareness of these empirical findings will alert us to potential problems.

However, merely knowing the principles that distinguish good and bad reasoning is not enough. We might study in the classroom about how to swim, and learn about the basic theory, such as the fact that one should not breathe underwater. But unless we can apply such theoretical knowledge through constant practice, we might not actually be able to swim.

Similarly, to be good at critical thinking skills it is necessary to internalize the theoretical principles so that we can actually apply them in daily life. There are at least two ways to do this. One is to perform lots of quality exercises. These exercises don’t just include practicing in the classroom or receiving tutorials; they also include engaging in discussions and debates with other people in our daily lives, where the principles of critical thinking can be applied. The second method is to think more deeply about the principles that we have acquired. In the human mind, memory and understanding are acquired through making connections between ideas.

Good critical thinking skills require more than just knowledge and practice. Persistent practice can bring about improvements only if one has the right kind of motivation and attitude. The following attitudes are not uncommon, but they are obstacles to critical thinking:

  • I prefer being given the correct answers rather than figuring them out myself.
  • I don’t like to think a lot about my decisions as I rely only on gut feelings.
  • I don’t usually review the mistakes I have made.
  • I don’t like to be criticized.

To improve our thinking we have to recognize the importance of reflecting on the reasons for belief and action. We should also be willing to engage in debate, break old habits, and deal with linguistic complexities and abstract concepts.

The  California Critical Thinking Disposition Inventory  is a psychological test that is used to measure whether people are disposed to think critically. It measures the seven different thinking habits listed below, and it is useful to ask ourselves to what extent they describe the way we think:

  • Truth-Seeking—Do you try to understand how things really are? Are you interested in finding out the truth?
  • Open-Mindedness—How receptive are you to new ideas, even when you do not intuitively agree with them? Do you give new concepts a fair hearing?
  • Analyticity—Do you try to understand the reasons behind things? Do you act impulsively or do you evaluate the pros and cons of your decisions?
  • Systematicity—Are you systematic in your thinking? Do you break down a complex problem into parts?
  • Confidence in Reasoning—Do you always defer to other people? How confident are you in your own judgment? Do you have reasons for your confidence? Do you have a way to evaluate your own thinking?
  • Inquisitiveness—Are you curious about unfamiliar topics and resolving complicated problems? Will you chase down an answer until you find it?
  • Maturity of Judgment—Do you jump to conclusions? Do you try to see things from different perspectives? Do you take other people’s experiences into account?

Finally, as mentioned earlier, psychologists have discovered over the years that human reasoning can be easily affected by a variety of cognitive biases. For example, people tend to be over-confident of their abilities and focus too much on evidence that supports their pre-existing opinions. We should be alert to these biases in our attitudes towards our own thinking.

IV. Defining Critical Thinking

There are many different definitions of critical thinking. Here we will look at some of the well-known ones in chronological order. You might notice that they all emphasize the importance of clarity and rationality.

1) Many people trace the importance of critical thinking in education to the early twentieth-century American philosopher John Dewey. But Dewey did not make very extensive use of the term “critical thinking.” Instead, in his book  How We Think (1910), he argued for the importance of what he called “reflective thinking”:

…[when] the ground or basis for a belief is deliberately sought and its adequacy to support the belief examined. This process is called reflective thought; it alone is truly educative in value…

Active, persistent and careful consideration of any belief or supposed form of knowledge in light of the grounds that support it, and the further conclusions to which it tends, constitutes reflective thought.

There is however one passage from How We Think where Dewey explicitly uses the term “critical thinking”:

The essence of critical thinking is suspended judgment; and the essence of this suspense is inquiry to determine the nature of the problem before proceeding to attempts at its solution. This, more than any other thing, transforms mere inference into tested inference, suggested conclusions into proof.

2) The  Watson-Glaser Critical Thinking Appraisal  (1980) is a well-known psychological test of critical thinking ability. The authors of this test define critical thinking as:

…a composite of attitudes, knowledge and skills. This composite includes: (1) attitudes of inquiry that involve an ability to recognize the existence of problems and an acceptance of the general need for evidence in support of what is asserted to be true; (2) knowledge of the nature of valid inferences, abstractions, and generalizations in which the weight or accuracy of different kinds of evidence are logically determined; and (3) skills in employing and applying the above attitudes and knowledge.

3) A very well-known and influential definition of critical thinking comes from philosopher and professor Robert Ennis in his work “A Taxonomy of Critical Thinking Dispositions and Abilities” (1987):

Critical thinking is reasonable reflective thinking that is focused on deciding what to believe or do.

4) The following definition comes from a statement written in 1987 by the philosophers Michael Scriven and Richard Paul for the  National Council for Excellence in Critical Thinking (link), an organization promoting critical thinking in the US:

Critical thinking is the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action. In its exemplary form, it is based on universal intellectual values that transcend subject matter divisions: clarity, accuracy, precision, consistency, relevance, sound evidence, good reasons, depth, breadth, and fairness. It entails the examination of those structures or elements of thought implicit in all reasoning: purpose, problem, or question-at-issue, assumptions, concepts, empirical grounding; reasoning leading to conclusions, implications and consequences, objections from alternative viewpoints, and frame of reference.

5) The following excerpt from Peter A. Facione’s “Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction” (1990) is quoted from a report written for the American Philosophical Association:

We understand critical thinking to be purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based. CT is essential as a tool of inquiry. As such, CT is a liberating force in education and a powerful resource in one’s personal and civic life. While not synonymous with good thinking, CT is a pervasive and self-rectifying human phenomenon. The ideal critical thinker is habitually inquisitive, well-informed, trustful of reason, open-minded, flexible, fairminded in evaluation, honest in facing personal biases, prudent in making judgments, willing to reconsider, clear about issues, orderly in complex matters, diligent in seeking relevant information, reasonable in the selection of criteria, focused in inquiry, and persistent in seeking results which are as precise as the subject and the circumstances of inquiry permit. Thus, educating good critical thinkers means working toward this ideal. It combines developing CT skills with nurturing those dispositions which consistently yield useful insights and which are the basis of a rational and democratic society.

V. Two Features of Critical Thinking

A. How, Not What

Critical thinking is concerned not with what you believe, but rather how or why you believe it. Most classes, such as those on biology or chemistry, teach you what to believe about a subject matter. In contrast, critical thinking is not particularly interested in what the world is, in fact, like. Rather, critical thinking will teach you how to form beliefs and how to think. It is interested in the type of reasoning you use when you form your beliefs, and concerns itself with whether you have good reasons to believe what you believe. Therefore, this class isn’t a class on the psychology of reasoning, which brings us to the second important feature of critical thinking.

B. Ought, Not Is (or Normative, Not Descriptive)

There is a difference between normative and descriptive theories. Descriptive theories, such as those provided by physics, provide a picture of how the world factually behaves and operates. In contrast, normative theories, such as those provided by ethics or political philosophy, provide a picture of how the world should be. Rather than ask questions such as why something is the way it is, normative theories ask how something should be. In this course, we will be interested in normative theories that govern our thinking and reasoning. Therefore, we will not be interested in how we actually reason, but rather focus on how we ought to reason.

In the introduction to this course we considered a selection task with cards that must be flipped in order to check the validity of a rule. We noted that many people fail to identify all the cards required to check the rule. This is how people do in fact reason (descriptive). We then noted that you must flip over two cards. This is how people ought to reason (normative).
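
To make the normative answer concrete, here is a minimal Python sketch of the classic version of the selection task. Since the course introduction is not reproduced above, the specific rule (“if a card shows a vowel, its other side shows an even number”) and the card faces E, K, 4, and 7 are assumptions used purely for illustration.

```python
# Minimal sketch of the classic Wason selection task (assumed rule and cards, for illustration).
# Rule under test: "If a card shows a vowel on one face, then its other face shows an even number."
# Normatively, only cards that could falsify the rule need to be flipped: the vowel and the odd number.

VOWELS = set("AEIOU")

def is_vowel(face: str) -> bool:
    return face in VOWELS

def is_odd_number(face: str) -> bool:
    return face.isdigit() and int(face) % 2 == 1

def cards_to_flip(visible_faces):
    """Return the cards that must be turned over to test the rule."""
    # A visible vowel could hide an odd number; a visible odd number could hide a vowel.
    # Even numbers and consonants cannot falsify the rule, so flipping them tells us nothing.
    return [face for face in visible_faces if is_vowel(face) or is_odd_number(face)]

print(cards_to_flip(["E", "K", "4", "7"]))  # ['E', '7'] -- exactly the two cards mentioned above
```

The common descriptive mistake is to also flip the even-number card, which can only confirm, never falsify, the rule.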

  • Sections I–IV are taken from http://philosophy.hku.hk/think/ and are used under a Creative Commons license. Some modifications have been made to the original content.

Critical Thinking Copyright © 2019 by Brian Kim is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.


What Is Critical Thinking? | Definition & Examples

Published on May 30, 2022 by Eoghan Ryan. Revised on May 31, 2023.

Critical thinking is the ability to effectively analyze information and form a judgment .

To think critically, you must be aware of your own biases and assumptions when encountering information, and apply consistent standards when evaluating sources .

Critical thinking skills help you to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria


Critical thinking is important for making judgments about sources of information and forming your own arguments. It emphasizes a rational, objective, and self-aware approach that can help you to identify credible sources and strengthen your conclusions.

Critical thinking is important in all disciplines and throughout all stages of the research process . The types of evidence used in the sciences and in the humanities may differ, but critical thinking skills are relevant to both.

In academic writing , critical thinking can help you to determine whether a source:

  • Is free from research bias
  • Provides evidence to support its research findings
  • Considers alternative viewpoints

Outside of academia, critical thinking goes hand in hand with information literacy to help you form opinions rationally and engage independently and critically with popular media.


Critical thinking can help you to identify reliable sources of information that you can cite in your research paper . It can also guide your own research methods and inform your own arguments.

Outside of academia, critical thinking can help you to be aware of both your own and others’ biases and assumptions.

Academic examples

However, when you compare the findings of the study with other current research, you determine that the results seem improbable. You analyze the paper again, consulting the sources it cites.

You notice that the research was funded by the pharmaceutical company that created the treatment. Because of this, you view its results skeptically and determine that more independent research is necessary to confirm or refute them.

Example: Poor critical thinking in an academic context

You’re researching a paper on the impact wireless technology has had on developing countries that previously did not have large-scale communications infrastructure. You read an article that seems to confirm your hypothesis: the impact is mainly positive. Rather than evaluating the research methodology, you accept the findings uncritically.

Nonacademic examples

However, you decide to compare this review article with consumer reviews on a different site. You find that these reviews are not as positive. Some customers have had problems installing the alarm, and some have noted that it activates for no apparent reason.

You revisit the original review article. You notice that the words “sponsored content” appear in small print under the article title. Based on this, you conclude that the review is advertising and is therefore not an unbiased source.

Example: Poor critical thinking in a nonacademic context

You support a candidate in an upcoming election. You visit an online news site affiliated with their political party and read an article that criticizes their opponent. The article claims that the opponent is inexperienced in politics. You accept this without evidence, because it fits your preconceptions about the opponent.

There is no single way to think critically. How you engage with information will depend on the type of source you’re using and the information you need.

However, you can engage with sources in a systematic and critical way by asking certain questions when you encounter information. Like the CRAAP test , these questions focus on the currency , relevance , authority , accuracy , and purpose of a source of information.

When encountering information, ask:

  • Who is the author? Are they an expert in their field?
  • What do they say? Is their argument clear? Can you summarize it?
  • When did they say this? Is the source current?
  • Where is the information published? Is it an academic article? Is it peer-reviewed ?
  • Why did the author publish it? What is their motivation?
  • How do they make their argument? Is it backed up by evidence? Does it rely on opinion, speculation, or appeals to emotion ? Do they address alternative arguments?
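
As a rough illustration of how these questions can be applied as a repeatable checklist, here is a minimal Python sketch. The field names, the example values, and the simple all-checks-must-pass rule are illustrative assumptions rather than part of the CRAAP test itself.

```python
# Minimal, illustrative sketch of a source-evaluation checklist based on the questions above.
# The fields mirror the who/what/when/where/why/how questions; the pass rule is an assumption.

from dataclasses import dataclass

@dataclass
class SourceCheck:
    author_is_expert: bool        # Who: is the author a credible expert in the field?
    argument_is_clear: bool       # What: is the argument clear enough to summarize?
    is_current: bool              # When: is the information up to date?
    is_peer_reviewed: bool        # Where: published in an academic, peer-reviewed venue?
    purpose_is_transparent: bool  # Why: is the motivation clear (e.g., not undisclosed advertising)?
    backed_by_evidence: bool      # How: supported by evidence rather than opinion or emotion?

    def looks_credible(self) -> bool:
        # Illustrative rule of thumb: treat the source skeptically if any check fails.
        return all(vars(self).values())

# Example: the sponsored alarm review from the nonacademic example above would fail several checks.
sponsored_review = SourceCheck(author_is_expert=False, argument_is_clear=True, is_current=True,
                               is_peer_reviewed=False, purpose_is_transparent=False,
                               backed_by_evidence=False)
print(sponsored_review.looks_credible())  # False
```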

Critical thinking also involves being aware of your own biases, not only those of others. When you make an argument or draw your own conclusions, you can ask similar questions about your own writing:

  • Am I only considering evidence that supports my preconceptions?
  • Is my argument expressed clearly and backed up with credible sources?
  • Would I be convinced by this argument coming from someone else?


Critical thinking refers to the ability to evaluate information and to be aware of biases or assumptions, including your own.

Like information literacy , it involves evaluating arguments, identifying and solving problems in an objective and systematic way, and clearly communicating your ideas.

Critical thinking skills include the ability to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

You can assess information and arguments critically by asking certain questions about the source. You can use the CRAAP test , focusing on the currency , relevance , authority , accuracy , and purpose of a source of information.

Ask questions such as:

  • Who is the author? Are they an expert?
  • How do they make their argument? Is it backed up by evidence?

A credible source should pass the CRAAP test  and follow these guidelines:

  • The information should be up to date and current.
  • The author and publication should be a trusted authority on the subject you are researching.
  • The sources the author cited should be easy to find, clear, and unbiased.
  • For a web source, the URL and layout should signify that it is trustworthy.

Information literacy refers to a broad range of skills, including the ability to find, evaluate, and use sources of information effectively.

Being information literate means that you:

  • Know how to find credible sources
  • Use relevant sources to inform your research
  • Understand what constitutes plagiarism
  • Know how to cite your sources correctly

Confirmation bias is the tendency to search, interpret, and recall information in a way that aligns with our pre-existing values, opinions, or beliefs. It refers to the ability to recollect information best when it amplifies what we already believe. Relatedly, we tend to forget information that contradicts our opinions.

Although selective recall is a component of confirmation bias, it should not be confused with recall bias.

On the other hand, recall bias refers to the differences in the ability between study participants to recall past events when self-reporting is used. This difference in accuracy or completeness of recollection is not related to beliefs or opinions. Rather, recall bias relates to other factors, such as the length of the recall period, age, and the characteristics of the disease under investigation.




The Role of Students’ Beliefs When Critically Reasoning From Multiple Contradictory Sources of Information in Performance Assessments

Olga Zlatkin-Troitschanskaia

1 Department of Business and Economics Education, Johannes Gutenberg University Mainz, Mainz, Germany

Jennifer Fischer

Dominik Braunheim, Susanne Schmidt, Richard J. Shavelson

2 Graduate School of Education, Stanford University, Palo Alto, CA, United States

Associated Data

The datasets presented in this article will be made available by the authors, without undue reservation, to any qualified researcher. Requests to access the datasets should be directed to troitschanskaia@uni-mainz.de.

Critical reasoning (CR) when confronted with contradictory information from multiple sources is a crucial ability in a knowledge-based society and digital world. Using information without critically reflecting on the content and its quality may lead to the acceptance of information based on unwarranted claims. Previous personal beliefs are assumed to play a decisive role when it comes to critically differentiating between assertions and claims and warranted knowledge and facts. The role of generic epistemic beliefs on critical stance and attitude in reflectively dealing with information is well researched. Relatively few studies however, have been conducted on the influence of domain-specific beliefs , i.e., beliefs in relation to specific content encountered in a piece of information or task, on the reasoning process , and on how these beliefs may affect decision-making processes. This study focuses on students’ task- and topic-related beliefs that may influence their reasoning when dealing with multiple and partly contradictory sources of information. To validly assess CR among university students, we used a newly developed computer-based performance assessment in which the students were confronted with an authentic task which contains theoretically defined psychological stimuli for measuring CR. To investigate the particular role of task- and topic-related beliefs on CR, a purposeful sample of 30 university students took part in a performance assessment and then were interviewed immediately afterward. In the semi-structured cognitive interviews, the participants’ task-related beliefs were assessed. Based on qualitative analyses of the interview transcripts, three distinct profiles of decision-making among students have been identified. More specifically, the different types of students’ beliefs and attitudes derived from the cognitive interview data suggest their influence on information processing, reasoning approaches and decision-making. The results indicated that the students’ beliefs had an influence on their selection, critical evaluation and use of information as well as on their reasoning processes and final decisions.

Research Background and Study Objectives

Critical reasoning (CR) when confronted with contradictory information from multiple sources is a crucial ability in a knowledge-based society and digital world ( Brooks, 2016 ; Newman and Beetham, 2017 ; Wineburg and McGrew, 2017 ). The Internet presents a flood of complex, potentially conflicting, and competing information on one and the same issue. To build a dependable and coherent knowledge base and to develop sophisticated (domain-specific and generic) attitudes and an analytical, reflective stance, students must be able to select and critically evaluate, analyze, synthesize, and integrate incoherent, fragmented, and biased information.

Students’ mental CR strategies may likely be insufficient for what is demanded for understanding heterogeneous information and, what is more, for effective and productive participation in a complex information environment (for a meta-study, see Huber and Kuncel, 2016 , for university students, see McGrew et al., 2018 ; Wineburg et al., 2018 ; Hahnel et al., 2019 ; Münchow et al., 2019 ; Shavelson et al., 2019 ; Zlatkin-Troitschanskaia et al., 2019b ). As a coping strategy, they may choose to reduce complexity by various means, for instance, by using cognitive heuristics, preferring simplified forms of information presentation, or relying on sources without verification, which can be exploited for manipulation ( Walthen and Burkell, 2002 ; Metzger, 2007 ; Horstmann et al., 2009 ; Metzger et al., 2010 ).

In addition, certain information representations may be (sub)consciously preferred not for their informational but for their entertainment value, their elicitation of certain affects, or their engagement properties ( Maurer et al., 2018 , 2020 ). Based on students’ previous media experience, knowledge, and expectations, they may have learned to assume that certain types of media representations are more trustworthy than others ( McGrew et al., 2017 ). They may follow a heuristic that similar media representations offer similarly reliable evidence, without considering the communicative context, communicator’s intentions, and possibilities of becoming a victim of manipulation. This is particularly the case when it comes to online information channels ( Metzger et al., 2010 ; Ciampaglia, 2018 ).

As some current studies indicate, students who habitually avoid information that contradicts their beliefs may easily miss important content and fall prey to biased information [see Section “State of Research on Beliefs and Their Impact on (Online) Information Processing”]. Using information without critically reflecting on the content and its quality may lead to the acceptance of information based on unwarranted claims. Deficits in due critical evaluation arise most likely because of shallow processing or insufficient reasoning and may occur subconsciously ( Stanovich, 2003 , 2016 ).

Insufficient reasoning can be amplified when information on a topic is distorted or counterfactual and when students do not recognize biased or false information and use it to build knowledge. As a result, learners may neglect complex, academically warranted knowledge and rely more on lower-quality information that is consistent with their beliefs and biases and that is easier to comprehend ( Hahnel et al., 2019 ; Schoor et al., 2019 ). The internalization of this biased information may subsequently affect learning by acting to inhibit or distort more advanced information processing and knowledge acquisition ( List and Alexander, 2017 , 2018 ).

Theoretically, previous personal beliefs are assumed to play a very decisive role when it comes to critically differentiating between assertions and claims on the one hand and warranted knowledge and facts on the other hand. The role of generic epistemic beliefs on critical stance and attitude in reflectively dealing with information is well researched [see Section “State of Research on Beliefs and Their Impact on (Online) Information Processing”]. Relatively few studies have been conducted on the influence of domain-specific beliefs, i.e., beliefs in relation to specific content encountered in a piece of information or task, on the reasoning process. Beliefs of this kind are usually measured using psychological scales, but without insight into the reasoning process and how these beliefs may affect the information-processing and decision-making processes [for an overview of current research, see Section “State of Research on Beliefs and Their Impact on (Online) Information Processing”].

With our study, we aim to contribute to this research desideratum. This study focuses on students’ task- and topic-related beliefs that may influence their reasoning when dealing with multiple and partly contradictory sources of information. To validly assess CR among university students, we used a newly developed computer-based performance assessment of learning in which the students are confronted with an authentic task which contains theoretically defined psychological stimuli for measuring CR (for details, see Section “Assessment Frameworks for Measuring Critical Reasoning”) in accordance with our construct definition (see Section “Critically Reasoning from Multiple Sources of Information”; Shavelson et al., 2019 ; Zlatkin-Troitschanskaia et al.,2019b ).

To investigate the particular role of task- and topic-related beliefs on CR, a purposeful sample of 30 university students from the overall sample of the overarching German iPAL study took part in a performance assessment and then were interviewed immediately afterward (for details, see Sections “A Study on Performance Assessments of Higher Education Students’ CR” and “Materials and Methods”). In the semi-structured cognitive interviews, the task-related beliefs of the participants were elicited and then qualitatively analyzed (see Section “Cognitive Interviews and Qualitative Analyses”). The cognitive interview transcripts were examined in order to address two overarching questions: (i) how do students’ beliefs influence their selection, evaluation, and use of information, as well as their subsequent reasoning and decision-making? and (ii) how do students’ beliefs change as they progress through the task and encounter multiple new information sources along the way? Based on qualitative analyses ( Strauss and Corbin, 1998 ; for details, see “Materials and Methods”), different profiles of participants were identified, which can be distinguished by personal characteristics such as the level of prior knowledge.

In this paper, we present our theoretical and empirical analyses to address these two questions (see Section “Results”). The study results – despite the necessary limitations (see Section “Limitations and Implication for Future Research”) – lead to a valuable specification of theoretical assumptions for further empirical research in this highly relevant but under-researched field (see Section “Summary and Interpretation of Results”).

State of Research on Beliefs and Their Impact on (Online) Information Processing

For a systematic analysis of the state of research, we conducted a criteria-driven online search. Based on expert interviews, we determined a set of keywords and conducted the search on the ERIC database and Google Scholar for the period 2009–2020. The stepwise search using keywords related to online information processing and critical reasoning among university students resulted in 56 eligible studies. The review of the abstracts showed that students’ beliefs were assessed and analyzed in 15 studies. The essential results of these studies are summarized in the following overview (see Table 1 ).

Table 1. Overview of recent studies on beliefs and their impact on information processing.

About half of these 15 studies focus explicitly on the relation between beliefs and (online) information processing (see Table 1 ), while critical reasoning was only implicitly addressed. Despite this narrow research focus, all studies indicate a clear connection between epistemic beliefs and the approach to (online) information processing, especially regarding judgment of information sources and their content. Well-developed and more advanced epistemic beliefs positively influenced the quality of students’ information processing.

Ulyshen et al. (2015) provided one of the few studies specifically investigating the relation between general epistemic beliefs and Internet search behavior . Using participants’ think-aloud protocols they investigated the impact of students’ task-related epistemic beliefs on their information processing. The results indicate a positive impact of well-developed epistemic beliefs on evaluating the quality and credibility of information.

Chiu et al. (2013) used a questionnaire to investigate the relation between university students’ Internet-specific epistemic beliefs and Internet search behavior. The authors identified four dimensions of beliefs: certainty, simplicity, source , and justification of Internet-based knowledge . The results indicate a positive association between Internet searching and justification , and a negative association with simplicity and source . In a follow-up study, Chiu et al. (2015) examined gender differences in students’ Internet-specific epistemic beliefs, indicating a gender gap in certainty and simplicity , and revealing more perceived uncertainty and complexity among females compared to males.

Mason et al. (2010) specifically focused on students’ Internet search when working on academic tasks dealing with a controversial topic , and in relation to epistemic metacognition , which was defined as students’ ability to spontaneously reflect on the accessed information. In addition, they examined the relationship between personal characteristics and prior topic-related knowledge. Test participants were asked to think aloud during their Internet search . Qualitative and quantitative analyses revealed diverse epistemic metacognitions among all study participants, but to different extents and levels. No correlation between epistemic beliefs and prior knowledge was identified. Overall, two patterns of epistemic metacognition were determined that significantly affected students’ Internet search. Students who spontaneously generated more sophisticated reflections about the sources and the information provided outperformed students who were at a lower epistemic level.

In an experimental study with an intervention-control group design (the intervention aiming at improving medicine-specific epistemic beliefs ), Kienhues et al. (2011) focused on the relationship between processing conflicting versus consistent (medical) information on the Internet and topic-related and medicine-specific epistemic beliefs . The intervention groups differed in both their topic-related and medicine-specific epistemic beliefs, and were more advanced compared to the control group.

van Strien et al. (2016) examined the influence of attitude strength on the processing and evaluation of sources of information on the Web. In an eye-tracking study, university students received information from pre-selected websites from different sources on a controversial topic. Participants who felt strongly about the topic did not consider websites with attitude-inconsistent content for as long and did not rate the credibility of this information as highly as students with less strongly established prior attitudes. The participants with strong prior attitudes also included more attitude-consistent information in an essay task than participants with weaker prior attitudes. Thus, differences in prior attitudes bias the evaluation and processing of information in different ways. Even though students were not fully biased during initial information processing, they were so when evaluating the information and presenting it in an essay task.

Similar biases were found in a study by Bråten et al. (2011) , who examined how undergraduates judged the trustworthiness of information sources on a controversial topic. Students judged information differently depending on the sources (e.g., textbooks, official documents, newspapers). In addition, students with limited topic-specific knowledge were inclined to trust less trustworthy sources. Lucassen and Schraagen (2011) show similar results in terms of relation with domain-specific knowledge and source expertise .

Following the assumption that students spontaneously engage in epistemic cognition when processing conflicting scientific information, van Strien et al. (2012b) examined how this epistemic cognition is related to students’ actual beliefs. In addition, the interplay of students’ epistemic beliefs and prior attitudes when encountering conflicting and partly attitude-inconsistent information on a controversial socio-scientific topic was studied using think-aloud methods. The results indicated that students’ difficulties in adequately evaluating diverse and conflicting information do not correlate with their prior epistemic beliefs . These beliefs might be developing from naïve to sophisticated , i.e., from absolutism to multiplism to evaluativism (which were measured using a test developed by van Strien et al., 2012a ).

van Strien et al. (2014) investigated the effects of prior attitudes on how students deal with conflicting information in multiple texts, indicating that students with strong prior attitudes were significantly more likely to write essays that were biased toward their prior attitudes. Moreover, students with strong attitudes took explicit stances and used large proportions of information not presented in the sources in their essays, while students with neutral attitudes wrote syntheses and used more information from the given documents.

Examining the role of experience in the evaluation phase of the information search process and the development of beliefs influencing the evaluation of information, Johnson et al. (2016) found significant differences between first-year and third-year undergraduates regarding the factors that influence their judgment of the trustworthiness of online information. The results indicate that the more advanced students were not only more sophisticated in evaluating information sources but also more aware in making use of the evaluation criteria.

Likewise, Hsu et al. (2014) examined how students’ different levels of development of their scientific-epistemic beliefs impact their online information searching strategies and behaviors. They divided undergraduates and graduates into two groups depending on whether they employed a naïve or sophisticated strategy. They measured students’ self-perceived online searching strategies and video recorded their search behaviors. Students with higher-quality scientific epistemic beliefs showed more advanced online searching strategies and demonstrated a rather meta-cognitive search pattern.

Mason et al. (2014) studied whether topic-specific prior knowledge and epistemological beliefs influence visual behavior when reading verbally and graphically presented information on webpages. They found that readers made a presumably implicit evaluation of the sources they were confronted with. University students with more elaborated topic-specific epistemic beliefs spent more time on graphics in the context of more reliable sources and increased their knowledge of the topic.

The study of Kahne and Bowyer (2017) is of particular interest for our analysis, as they took policy positions into consideration, an aspect which plays an important role in the task scenario we administered to our test participants (see Section “Research Questions”). In their survey of young adults, which is representative of the U.S., they asked participants to judge the veracity of simulated online postings. Controlling for political knowledge and media literacy, their main finding was that the alignment of statements with prior policy beliefs is more decisive for the evaluation of information quality than their accuracy.

Summing up, from the findings reported in recent literature, we register several commonalities with respect to the relation between beliefs and the evaluation of Internet-based information. First, information as such, and especially information encountered on the Internet, was generally recognized and processed on the basis of beliefs and attitudes. Initially, students were always inclined to consider trustworthy the information that corresponds with their own (prior) knowledge, whereas they tended to neglect conflicting information. Other biasing factors were prior beliefs (attitudes), which had a comparatively greater impact on the ascription of quality of information in terms of credibility, reliability, plausibility, or trustworthiness. Students appear to be liable to believe and to use information sources in line with their previous convictions, i.e., to avoid “cognitive dissonances” ( Festinger, 1962 ). In addition, the impact of these factors is moderated by their strength (i.e., attitude strength). All in all, well-developed and more advanced (domain-specific) prior knowledge and epistemic beliefs seem to positively influence the quality of students’ Internet searches and (online) information processing.

Research Questions

In the studies we referenced above, the question of whether (prior) beliefs and attitudes are personal traits or states and to what extent they may change remains open. We do not yet know whether (prior) beliefs and attitudes will change during the information acquisition process, and if so, under what circumstances. Our study aims to shed some light on the answers to these questions.

More specifically, based on the analyses of the current state of international research (see Section “State of Research on Beliefs and Their Impact on (Online) Information Processing”), we developed an analytical framework for our study as presented in Figure 1 , and specify the following research questions ( RQs ):

Figure 1. The analytical framework of the study including the research questions (RQs).

(I) The Relationship between Beliefs and Decision-Making

RQ1: Students’ beliefs at the beginning of task processing

  • Do the students indicate that they held certain beliefs before they began the performance assessment?
  • Is it possible to identify distinct types based on these beliefs?

RQ2: The relationship between students’ beliefs and their reasoning process as well as their final decision (written task response)

  • At which point in time during task processing did the students make their decision?
  • Do the students’ beliefs affect their decision-making process?
  • Is it possible to identify distinct profiles of decision-making?
  • Which reasoning approaches become evident that may influence the decision-making of the participants?

(II) Change of Beliefs While Solving the Task

RQ3: Interaction between students’ beliefs and the processing of the given information (in the task)

  • Do the students’ beliefs change as they progress through the task and encounter multiple new information sources along the way (which could indicate that the processed information influences the students’ beliefs)? If so, to what extent is this reflected in their final decision (written task response)?

Conceptual and Methodological Background

Critically Reasoning From Multiple Sources of Information

Students’ skills in judging (online) information are of central importance to avoid the acquisition of erroneous domain-specific and generic knowledge ( Murray and Pérez, 2014 ; Brooks, 2016 ). The abilities involved in finding, accessing, selecting, critically evaluating, and applying information from the Internet and from various media are crucial to learning in a globalized digital information society ( Pellegrino, 2017 ; List and Alexander, 2019 ). Students need critical reasoning (CR) skills to judge the quality of the information sources and content they access inside as well as outside of higher education ( Harrison and Luckett, 2019 ). Students need CR to recognize easily available biased and counterfactual information, withstand manipulation attempts ( Wineburg and McGrew, 2017 ; McGrew et al., 2018 ), and avoid generating erroneous domain-specific and generic knowledge or arguments. 1

In our study, we follow the definition of CR and its facets as described in Zlatkin-Troitschanskaia et al. (2019b) . CR is defined as students’ (I.) identification, evaluation, and integration of data sources; (II.) recognition and use of evidence; (III.) reasoning based on evidence, and synthesis; (IV) (causal and moral) recognition of consequences of decision-making, which ultimately lead to (V) the use of appropriate communicative action . The performance assessments used in this study to measure CR (see next section) are based on these five theoretically driven central facets of this definition of CR. Students’ ability to critically reason from multiple sources of information as a specific representation of CR was measured within this assessment framework.

Assessment Frameworks for Measuring Critical Reasoning

Valid measurement of CR skills is an important component of a program of research on how CR can be effectively promoted in higher education. Moreover, as part of a validity argument, CR’s relation to other related constructs needs to be examined. Based on existing psychological learning models ( Mislevy, 2018 ; Pellegrino, 2020 ), analyses of this kind can provide a significant contribution to developing appropriate explanatory approaches to CR. Despite the urgency of this topic for higher education ( Harrison and Luckett, 2019 ), theoretically sophisticated CR learning and performance assessment tools have so far been developed by only a few projects internationally (for an overview, see Zlatkin-Troitschanskaia et al., 2018a ).

Multidimensional and multifaceted (meta)cognitive higher-order (procedural) skills, such as CR, can be validly measured with closed-format tests to a limited extent, as selected-response items fall at the lower end of the ‘lifelike fidelity scale.’ Multiple-choice tests predominantly assess declarative and conceptual/factual knowledge (e.g., Braun, 2019 ). As Liu et al. (2014) and Oser and Biedermann (2020) documented, there are several closed-format tests for assessing CR (or related constructs). One main shortcoming of tests of this kind is their limited face validity, ecological validity, or content validity ( Davey et al., 2015 ). They usually demonstrate (extremely) strong correlations with tests focused on general intellectual ability [e.g., intelligence tests or the Scholastic Aptitude Test (SAT)], but they tend to fail to measure more specific procedural skills regarding the use and the evaluation of information sources used for learning in higher education. Well-established CR assessments have been based on standard-setting research ( Facione, 1990 ; Facione et al., 1998 ), but have used multiple-choice formats and brief situational contexts and have assessed generic minimal inferencing abilities. 2 Despite the broad use of this test type in educational assessment, it remains unclear to what extent these tasks are ecologically valid and whether students can transfer the measured abilities to more authentic and complex requirement situations.

At the other end of the assessment spectrum are traditional essay prompts with open responses and rubric scoring. Their suitability for assessing CR based on multiple sources of information in particular, may be limited by challenges in objective scoring and the brevity of the prompt ( Zlatkin-Troitschanskaia et al., 2019b ). While ecological validity in particular is especially limited in standardized tests ( Braun, 2019 ), CR can be more adequately measured through performance assessments that simulate the complex environment students find themselves in ‘in everyday life,’ and provide an addition to standardized measures, as they are better suited to reflect current contexts and learning conditions inside and outside of higher education ( Oliveri and Mislevy, 2019 ; Shavelson et al., 2019 ).

So far, to measure university students’ performance on concrete, real-world tasks and to tap their critical thinking skills, the Council for Aid to Education (CAE) has developed the Collegiate Learning Assessment (CLA) ( Klein et al., 2007 ), which was also used in the Assessment of Higher Education Learning Outcomes study, and has launched a refined performance test on CT, the CLA+. The assessment contains an hour-long performance task and a half-hour set of multiple-choice items so as to produce reliable individual student scores ( Zahner, 2013 ). The CLA+ is available internationally for various countries ( Wolf et al., 2015 ). It has been used in the United States and was also adapted and used in Finland 3 , Italy, and the United Kingdom ( Zahner and Ciolfi, 2018 ), and has undergone preliminary validation for Germany ( Zlatkin-Troitschanskaia et al., 2018b ). This computer-delivered assessment consists of a performance task where students are confronted with a complex scenario. Additionally, they are presented with a collection of documents with further information and data that should be used to properly evaluate the case and decide on a course of action. The test has an open-ended response format and is complemented by 25 selected-response questions on separate item stems. According to Wolf et al. (2015) , the assessment measures the following student abilities: Problem-solving and analysis, writing effectiveness, writing mechanics, reasoning scientifically and quantitatively, reading critically and evaluatively, and critiquing an argument.

Other assessments that were recently developed for higher education, such as HEIghten by ETS 4 or The Cap Critical Reasoning test, can be considered knowledge-based analytical-thinking, multiple-choice tests 5 and do not encompass any performance tasks (for an overview, see Zlatkin-Troitschanskaia et al., 2018a ).

The iPAL Study on Performance Assessments of Higher Education Students’ CR

In iPAL (international Performance Assessment of Learning), an international consortium focuses on the development and testing of performance assessments as the next generation of student learning outcome measurements ( Shavelson et al., 2019 ). The researchers address the question of how performance assessments can enhance targeted student learning beyond rote memorization of facts and actively foster students’ acquisition of 21st century skills (including CR). The subproject presented here is designed to measure higher education students’ CR by simulating real-life decision-making and judgment situations ( Shavelson et al., 2019 ).

The German iPAL subproject follows a criterion-sampling measurement approach to assessing students’ CR. Criterion-sampled performance assessment tasks present real-world decision-making and judgment situations that students may face in academic and professional domains as well as in public and private life. Test takers are assigned a role in an authentic holistic scenario and are given additional documents and links to Internet sources related to the topic of the task (presented in different print and online formats) to be judged in respect to their varying degrees of trustworthiness and relevance. The skillset tapped by these tasks comprises skills necessary to critically reason from multiple sources of information, i.e., to critically select and evaluate (online) sources and information, and to use them to make and justify an evidence-based conclusive decision.

In the German iPAL study, we developed a performance assessment with a case scenario (renewable energy) comprising 22 (ir)relevant, (un)reliable, and partly conflicting pieces of information. This newly developed computer-based performance assessment was comprehensively validated according to the Standards for Educational and Psychological Testing (American Educational Research Association [AERA], 2014; see Zlatkin-Troitschanskaia et al., 2019b; Nagel et al., 2020). Validity evidence was gathered (i) by evaluating the test-takers’ responses to the performance assessment (for details, see Shavelson et al., 2019; Zlatkin-Troitschanskaia et al., 2019b), (ii) through a semi-structured cognitive interview, and (iii) through an additional questionnaire on the students’ personal characteristics such as prior knowledge, general intellectual ability, and media use behavior (for details, see Nagel et al., 2020).

In the following, we focus on the validation work conducted in (ii) cognitive interviews and present the analyses of transcripts of these cognitive interviews and corresponding results. To strengthen our validity argument (in the sense of Messick, 1994 ; Kane, 2013 ; Mislevy, 2018 ), we additionally refer to the particular findings from (i) to demonstrate how students’ beliefs and reasoning processes as identified in the cognitive interviews are related to their task performance (written final response on the case presented in the task).

Materials and Methods

In this section, we first describe the entire study, including the performance task and the other assessments applied, before presenting the sub-study of the cognitive interviews and its results.

Instruments

The Performance Task

To assess students’ CR and their ability to critically reason from multiple sources of information, the German iPAL study developed and tested the “Wind Turbine” performance task. This computer-based assessment consists of a realistic short-frame scenario that describes a particular situation and requests a recommendation for a decision based on information provided in an accompanying document library (including 22 snippets and sources of information of different types; e.g., Wikipedia articles, videos, public reports, official statistics). These information sources, on which the students are to base their decision recommendation, vary in their relevance to the task topic and in the trustworthiness of their contents (for detailed descriptions of the performance task, see Shavelson et al., 2019 ; Zlatkin-Troitschanskaia et al.,2019b ).

In this case scenario, test takers were assigned the role of a member of the municipal council of a small town confronted with the opportunity to build a wind farm on its grounds. They were asked to review the information sources provided in the task and, based on the evidence, to write a policy recommendation for a course of action, i.e., to recommend to the city council whether or not to permit the construction of the wind turbines in its agricultural countryside (for more details, see Shavelson et al., 2019 ; Zlatkin-Troitschanskaia et al., 2019b ).

Accompanying Assessments

To control for task- and topic-related prior knowledge, we used a short version of the WiWiKom test, which was comprehensively validated in the representative nation-wide WiWiKom study as an indicator of knowledge in economics and social sciences (Zlatkin-Troitschanskaia et al., 2019a). As two indicators of general cognitive ability, the scale “Choosing figures” of the Intelligence Structure Test (IST-2000 R, Liepmann et al., 2007) and the grade of the university entrance qualification were used in the present study (for details, see Nagel et al., 2020). The participants’ levels of interest in the task topic and case scenario (renewable energy) as well as their test motivation were also assessed in this study using two five-point Likert-type scales (validated in the previous studies cited).

Furthermore, socio-demographic information and personal characteristics (e.g., scales on ‘media use,’ ‘need of evaluation,’ ‘information overload’; for details, see Nagel et al., 2020 ) expected to affect test performance were collected. Indicators of relevant expertise in the context of solving the performance task, such as completed commercial or vocational training, were also surveyed, as they might also influence task performance.

Study Design and Validation

To test and validate the “Wind Turbine” task in accordance with the Standards for Educational and Psychological Testing (American Educational Research Association [AERA], 2014), assessments were conducted with a total of 95 undergraduate and graduate students from different study domains (e.g., business and economics, teacher education, psychology) at a German university (Shavelson et al., 2019; Zlatkin-Troitschanskaia et al., 2019b).

The students worked on the task in a controlled laboratory on computers configured specifically for this assessment and had no access to other resources to solve the task. The study was carried out in small groups on several dates under the supervision of experienced test administrators.

The total test time for the performance task was 60 min. The time limit put the participants under pressure: they did not have enough time to study all of the given sources intensively and instead had to decide which sources and contents to select and review more thoroughly, considering their relevance, validity, and trustworthiness. After the performance task (and a short break), the participants were asked to work on the accompanying assessments. The participants received an incentive of €20 and were offered individual feedback on their test results.

Test performance was scored using a 6-point Likert-type anchored rating scheme based on the CR definition with 5 dimensions and 23 subdimensions (Section “Sample and Data”; for details on scoring, see Zlatkin-Troitschanskaia et al., 2019b). The individual task responses, i.e., short essays, were randomly assigned to two of four trained raters, and the written responses were evaluated according to this newly developed and validated scoring scheme with behavioral anchors for each sub-category. Two raters independently evaluated each participant’s task response, and sufficient interrater agreement was established (Cohen’s κ > 0.80, p < 0.001, for the overall test score).
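For illustration only, the interrater-agreement check described above can be reproduced with standard tooling. The following is a minimal sketch, assuming two hypothetical vectors of 6-point ratings (the actual study data are not shown here); it also computes a quadratically weighted kappa, which is sometimes preferred for ordinal scales, although the text above reports Cohen’s κ only:

    # Minimal sketch of an interrater-agreement check with hypothetical ratings.
    from sklearn.metrics import cohen_kappa_score

    rater_a = [4, 3, 5, 2, 6, 4, 3, 5, 4, 2]  # hypothetical scores from rater A
    rater_b = [4, 3, 5, 3, 6, 4, 3, 5, 4, 2]  # hypothetical scores from rater B

    # Unweighted kappa treats the 6-point scale as nominal; a quadratically
    # weighted kappa would additionally credit near-agreement on the ordinal scale.
    kappa = cohen_kappa_score(rater_a, rater_b)
    kappa_weighted = cohen_kappa_score(rater_a, rater_b, weights="quadratic")

    print(f"Cohen's kappa: {kappa:.2f} (quadratically weighted: {kappa_weighted:.2f})")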

In terms of psychometric diagnostics, the students’ responses to the performance task (a well-founded written final decision) were interpreted as a manifestation of their latent (meta)cognitions. The students’ task performance, i.e., their written responses, was regarded as a valid indicator of the students’ ability to critically reason from multiple and contradictory sources of information (in the sense of the construct definition, see Section “Critically Reasoning from Multiple Sources of Information”). The theoretically hypothesized multidimensional internal structure of this construct was supported empirically using confirmatory factor analysis (CFA) (Zlatkin-Troitschanskaia et al., 2019b).
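To make the CFA step concrete, the sketch below shows how such a measurement model could in principle be specified and fitted in Python with the semopy package. It is purely illustrative: the indicator names, the randomly generated placeholder data, and the reduced two-factor structure are assumptions, not the study’s actual five-dimension, 23-subdimension model or its data.

    # Illustrative CFA sketch (two factors only), using random placeholder data.
    import numpy as np
    import pandas as pd
    import semopy

    rng = np.random.default_rng(0)
    n = 95  # sample size as in the validation study
    cols = ["info_eval1", "info_eval2", "info_eval3", "dec_just1", "dec_just2"]
    # Hypothetical sub-dimension ratings on the 6-point scale (random placeholders).
    data = pd.DataFrame(rng.integers(1, 7, size=(n, len(cols))), columns=cols)

    desc = """
    InformationEvaluation =~ info_eval1 + info_eval2 + info_eval3
    DecisionJustification =~ dec_just1 + dec_just2
    """

    model = semopy.Model(desc)
    model.fit(data)
    print(semopy.calc_stats(model))  # fit indices such as CFI and RMSEA
    print(model.inspect())           # loadings and factor covariances

With real rating data, the fit indices and loadings reported by calc_stats and inspect would be the quantities used to judge whether the hypothesized multidimensional structure holds.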

As theoretically expected, analyses of the task performance did not identify any significant domain-specific effects among students from different study domains (Nagel et al., 2020). The same holds true for prior knowledge from previous vocational training, which showed no significant effect on the test results. As the performance task was developed to measure generic CR skills, this finding indicates that, as expected, the assessment is not domain-specific. However, with regard to theories in learning and expertise research, it could be assumed that domain-specific expertise, though acquired within a certain domain, can be transferred to generic problems or tasks (Alexander, 2004). In this respect, these results indicate that students may have deficits in the (meta)cognitive abilities that would enable them to transfer their prior knowledge and skills to the new context encountered in the performance task. Overall, the results from these validation studies provide evidence of the technical quality of the developed performance task as well as of the test’s construct validity and reliability (Zlatkin-Troitschanskaia et al., 2019b; Nagel et al., 2020).

Cognitive Interviews and Qualitative Analyses

The analyses of test performance per se do not permit us to draw valid conclusions about students’ underlying response processes when performing this task. In accordance with our construct and test definition, we expect that, while working on the performance task, the test participants selected and evaluated the given information with the goal of finding relevant and reliable/trustworthy information for their evidence-based reasoning, decision-making, and final conclusion in the written response. To investigate how the test participants dealt with multiple contradictory sources, to what extent they integrated and evaluated the given information during their reasoning and decision-making process, and which individual factors influenced their response processes, a semi-structured cognitive interview with stimulated recall was conducted immediately after the performance assessment with a subsample of participants (see next section). The participants were first shown a screen displaying the 22 documents included in the document library, one after the other. The students reflected and commented on, for instance, whether they considered the source in question and the information given therein to be relevant and/or credible (and why), and whether they used or ignored this source and information (and why). Particular attention was paid to determining whether the students were aware of their task topic-related beliefs and attitudes and, if so, whether they were aware of the extent to which these influenced their critical reasoning while processing the task, for example resulting in a selective inclusion of the given information. The interviews took approximately 80 min and were recorded for later transcription.

The interview questions included, for instance, whether a test participant held task topic-related beliefs about wind power, the environment, etc. prior to the performance assessment, and to what extent previous experience, individual knowledge, attitudes and beliefs relevant to the task topic influenced the students’ information selection, evaluation and decision-making. In particular, the students reported at which point in time during the task processing they made their decision whether or not to suggest to the municipal council to allow the building of the wind farm (for instance, indicating that many students made their decision before they had even read the given information; see Section “Results”). The cognitive interviews also indicated that the performance task with its task prompt is clear and suitable for the objectives of the presented study.

The data from the cognitive interviews in the German iPAL project were evaluated using the software MaxQDA. Based on the cognitive interview protocols, a differentiated category system was developed and validated. More specifically, the qualitative analysis of the cognitive interviews was guided by grounded theory and a data-driven approach to developing the coding scheme (Strauss and Corbin, 1998).

The iterative coding process consists of (1) open, (2) axial (Strauss and Corbin, 1990), and (3) selective coding. At first, open coding was used to access the data without imposing any predefined schema. In the subsequent step, initial categories were identified, such as the students’ beliefs at the beginning of the task or the point during task processing at which the decision was made. Then, all interviews were analyzed and coded based on the defined categories. The coding scheme highlighted the points of reference regarding the different information sources the students mentioned in their interviews. It linked the use of different sources to the students’ reasoning processes while reaching their decision, making it possible to derive and generalize response patterns. Coding development was complemented by an analytical approach of constantly comparing phenomena within the dataset (minimum and maximum contrast between phenomena). Selected codes with a focus on student beliefs are presented in this paper (see Section “Results”).

The classification of the participants into the three profiles described in Section “Results” was based on a combination of criteria from the category system, primarily: (1) the point during the assessment at which they made their decision (as reported in the cognitive interviews), (2) their decision-making process (weighing pros and cons; deciding intuitively, based on their original opinion, or based on a specific source, etc.), and (3) the strength of their beliefs (strong personal beliefs, primarily about nature conservation/animal welfare, etc., and general personal identification with the task topic). As the participants were classified into profiles based on a combination of these three categories, participants classified into different profiles may share certain attributes (e.g., listing pros and cons).
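As a purely schematic illustration of how the three criteria combine, the following sketch encodes the classification logic as a simple decision rule. The field names, category labels, and thresholds are assumptions introduced here for illustration; in the study, the assignment was made qualitatively by the coders on the basis of the interview protocols.

    # Schematic, hypothetical encoding of the three classification criteria.
    from dataclasses import dataclass

    @dataclass
    class InterviewCodes:
        decision_time: str   # e.g., "start", "during", "end" (when the decision was reported)
        decision_basis: str  # e.g., "beliefs", "intuition", "pros_cons", "single_source"
        prior_beliefs: bool  # whether task topic-related beliefs were stated at the outset

    def classify_profile(c: InterviewCodes) -> int:
        """Map coded interview criteria onto the three profiles (schematic only)."""
        if c.prior_beliefs and c.decision_time == "start" and c.decision_basis in ("beliefs", "intuition"):
            return 1  # "determined": early, belief-driven decision, information largely ignored
        if not c.prior_beliefs and c.decision_basis == "pros_cons":
            return 3  # "open minded": no stated prior beliefs, evidence weighed before deciding
        return 2      # "deliberative": prior beliefs present, decision based on the evidence

    print(classify_profile(InterviewCodes("start", "beliefs", True)))    # -> 1
    print(classify_profile(InterviewCodes("end", "pros_cons", False)))   # -> 3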

Sample and Data

The semi-structured interviews were conducted with a purposefully selected subsample 6 (part of the overall sample used in the German iPAL study) of 30 university students from one German university and from different courses of study. With this subsample, we aimed to include students from all study domains represented in the main sample in the cognitive interviews as well. Accordingly, the subsample consists of about 50% students of economics education, while the other half comprises students from other study domains (economics, psychology, and geoscience). Another important criterion for purposeful sampling was to include as wide a range as possible in terms of participants’ prior study experience and other personal characteristics that may influence students’ task topic-related beliefs, attitudes, and knowledge, and their task performance. Accordingly, the subsample consists of students from both undergraduate and graduate programs and in different study semesters. To gain first indications of the possible impact of knowledge and skills acquired during academic studies, we focused on more experienced students. The average duration of studies to date among subsample participants was 5.1 semesters, indicating that the students were fairly advanced in their respective study programs. Additionally, the grade of the university entrance qualification was recorded (average grade of 2.1; range from 1.0 = best to 6.0 = worst; n = 25; see footnote 7). To control for the possible impact of pre-university education and practical experience, we included students with completed vocational education and training (11 students had completed an apprenticeship before beginning their academic studies; n = 29; see footnote 7). The average age of the interviewees was 24 years; 21 students were female; these proportions were similar to those in the overall sample.

Despite this purposeful sampling procedure, as participation in this study was voluntary, our sample cannot be considered representative. However, no significant deviations from the entire student population described in Nagel et al. (2020) were found with regard to the socio-biographical characteristics (e.g., gender and age).

Prior Findings on Test Performance and Additional Assessments

The students achieved an average intelligence test (IST) score of 17.18 points (out of a maximum score of 40 points; n = 28; see footnote 7) and an average economics knowledge test score of 10.46 points (out of a maximum score of 15 points; n = 23; see footnote 7). Only four students stated that they had previous practical experience with wind turbines. Most students reported a low to medium level of task topic-related prior knowledge, while a high level of knowledge on wind turbines was very rare (n = 1).

The mean performance on the task was 3.52 points, with 6 points being the highest possible score (for the scoring, see Zlatkin-Troitschanskaia et al., 2019b). The median number of information sources students used in their written statements was 7 (out of the 22 information sources given in the task). The written argument-based statements differed considerably in length, with an average of 426 words, a maximum of 866 words, and a minimum of 68 words, indicating a large variation (SD = 196 words) within the sample.
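The summary statistics above (mean score, median number of sources, mean length, and standard deviation) are standard descriptives; the sketch below shows how they could be computed, using hypothetical per-student values rather than the study data:

    # Descriptive statistics over hypothetical per-student values (not the study data).
    import statistics as stats

    scores  = [3.0, 4.5, 2.5, 3.5, 4.0, 3.5]      # task scores on the 6-point scale
    sources = [5, 7, 9, 6, 12, 7]                 # number of sources cited per statement
    words   = [310, 420, 680, 150, 500, 430]      # length of the written statement in words

    print("mean score:", round(stats.mean(scores), 2))
    print("median sources used:", stats.median(sources))
    print("mean words:", round(stats.mean(words)), "SD:", round(stats.stdev(words)))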

In the following, these results are used as external criteria to show how the findings from the analyses of the cognitive interviews correspond to the results of the quantitative analyses of the test scores.

The Relationship Between Beliefs and Decision-Making

RQ1: Students’ Beliefs at the Beginning of Task Processing

In the cognitive interviews, the students were asked whether they had been aware of their task topic-related beliefs prior to working on the task and, if so, whether they were aware that their personal beliefs may have influenced their decision in the performance task and how they believed this influence may have shaped their response. Most participants (n = 23) stated that they had already held certain beliefs on the task topic before beginning the task. For instance, one participant stated: “[…] I think I would have recommended this from the beginning because this is also a topic I hear about in the media from time to time, so that I already have a personal opinion about wind power and energy” (ID 15). In response to the question whether his personal beliefs had influenced his response, interviewee ID 7 stated: “Sure, because then I did not even look at the controversial sources at all and, that is… for example, if I believed that the bats from source 21 were extremely important, then of course I would have looked at the source.” Seven participants (n = 7) reported that they did not have any prior beliefs about the topic of the performance assessment.

In terms of distinct types based on the reported beliefs, both groups of students – those who indicated prior beliefs and those who did not – can each be further divided into two subgroups, (i) depending on the students’ positive or negative stance toward wind turbines, which varied considerably in strength, and (ii) depending on whether their reasoning followed a more economics-focused or a more ecologically oriented perspective (see Section “The Relationship Between Beliefs and Decision-Making”).

RQ2: The Relationship Between Students’ Beliefs, Their Decision-Making Process and Their Final Decision

Time of Decision-Making and Types of Decision (Intuition-Based vs. Evidence-Based)

In the cognitive interviews, the students reported at which point while working on the performance task and processing the information they made their decision as to whether renewable energies should be promoted or not in the given case (see Table 2). Eight of the 30 students made their decision at the beginning of the task, after having read the scenario, even though they had not yet read or considered the given information at all, or had done so only very briefly.

Table 2. Time of decision-making and type of decision.

Another group of students (n = 8) used the given sources and made their decision mostly after (more or less thoroughly) looking through the information provided. For instance, when asked when he had decided in favor of or against the construction of wind turbines, interviewee ID 1 stated: “yes […], I actually knew from the beginning when I went through this [task] what direction my statement would go in.” Interviewee ID 23 made his decision while working on the audiovisual information: “So after I watched the videos […] I changed my opinion.”

In contrast, other students made their recommendations after having reviewed the information material and after weighing up the pros and cons (n = 9), as was the case with, for example, interviewee ID 7: “Interviewer: So that means that you first read all the sources and all the arguments? Participant: First the pros and cons, and only then I had a feeling.” This finding indicates that for some students the creation of pro and con lists was an important step in their decision-making process. However, not all of those who made their decision comparably late in the task-solving process stated that they had done so based on weighing up the pros and cons: five students indicated that they made a late but still intuitive decision.

Overall, with regard to the time of decision-making , four types can be distinguished among the participants ( Table 2 ), which differ in terms of intuition-based vs. evidence-based decision-making as well as the extent to which the given information and pros/cons were considered or ignored.

Students who made a late intuition-based decision (n = 5) performed worse, with an average test score of 3.25. Students who made their decision at the end of the task based on weighing pros and cons (n = 9) performed better than the other participants, with an average test score of 3.83. Notably, there were hardly any differences in task performance between the students who decided intuitively at the beginning of the task and the students who weighed up pros and cons and decided at the end of the task.

The Relationship Between Students’ Beliefs and the Decision-Making Process: Profiles of Decision-Making

Regarding the question of the extent to which students were aware that their beliefs impacted their decision-making process, and whether distinct profiles of decision-making can be determined, we identified among the 30 study participants students who indicated that their previous beliefs played a major, minor, or no role in their decision-making process. Combined with the time at which they made their decision, we distinguished three profiles of decision-making:

Profile 1 “determined” : Participants who ignored the given information and made their decision solely based on their individual beliefs, almost immediately after having read the task ( n = 7). For example, (ID5) stated: “I wouldn’t have made a recommendation that goes against my gut instinct. For example, I think that even if the sources had been chosen in such a way that they would have given me a negative impression, I am not sure whether that would have caused me to change my initial positive stance. I simply couldn’t just ignore my background knowledge and my personal attitude when giving my recommendation at the end.”

Profile 2 “deliberative” : Participants who decided contrary to their task topic-related beliefs, and changed their decision after having read the information provided in the task ( n = 11), as well as participants who stated that they held certain beliefs at the beginning of the task but weighed up pros and cons while processing the task and made their decision based on these considerations ( n = 5). The two cases were merged into one profile as students in both cases stated that they held certain beliefs but made their decision based on the pros and cons of the evidence rather than on those beliefs.

There were some differences within this decision-making profile. For instance, some students switched between being in favor of or against the construction of wind turbines while working on the task: “ So basically I’m for it and then while I was writing this I just started to waver, you have to list the negative things and then I doubted it for a moment but then I finally decided in favor at the end. ” (ID 8); other students changed their prior opinions by reflecting on their own beliefs in the context of the given information: “ At the beginning I would have said yes [impact of belief on decision-making]. But then I tried to be as unbiased as possible, or rather to be subjective in my role as a member of the council. And then I kind of abandoned my [initial] decision and my personal belief. ” (ID 17).

Profile 3 “open minded” : Participants who did not state any prior beliefs, took note of the provided information, and made their decision after considering pros and cons ( n = 7). Interviewee (ID7) stated that he had had no prior beliefs before starting the task, and that he made his decision after considering the given information and making a pro and con list: “ No, I couldn’t decide at the beginning, it just happened toward the end of the argumentation. Well, I was not for or against it from the beginning. I just did not know how to decide. ”

Since the participants were classified into profiles based on a combination of the scoring categories (see Section “Cognitive Interviews and Qualitative Analyses”), participants classified into different profiles may share some attributes (e.g., listing pros and cons) and there may be some overlaps between the profiles.

The Relationship Between the Decision-Making Profiles and Task Performance (Test Scores) as Well as the Results of Additional Assessments

Notably, the participants in profile 3 on average achieved higher test scores than those in the other two profiles (Table 3). Students who based their decision on their beliefs (profile 1) performed worse than the other participants, in particular those in profile 3. In terms of the average test score, the difference between these two profiles (1 and 3) was more than 1 point.

Table 3. Means of task performance of different profiles.

Upon further characterizing the three profiles, we found additional differences between the groups of students in terms of the number of information sources used and the number of words in the final recommendation statements, which differed greatly (Table 4). Compared to students who made a decision based on their beliefs (profile 1), the average number of information sources used was higher by 3.25 sources for students who changed their beliefs (profile 2) and by 2.86 sources for students of profile 3. The mean number of words in the written final recommendation statements also varied considerably. Remarkably, the responses of “deliberative” students (profile 2) were the shortest, with an average of 370 words. “Determined” students (profile 1), who did not change their beliefs, wrote on average 33 more words than “open minded” students (profile 3), whose statements had a mean length of 473 words.

Table 4. Characteristics of the three decision-making profiles.

With regard to personal characteristics, there were no significant differences in the intelligence test scores for the three profiles ( Table 4 ). The same was true for performance in the economics knowledge test, with results ranging from 10.20 to 10.65 points (on a 15-point scale).

Students who made their decision based on evidence and pros/cons, despite their beliefs or without considering previous task-related beliefs, tended to be older (profile 2: 3.58 years older on average; profile 3: 3.26 years older on average) than “determined” (profile 1) students. There were no significant differences in terms of gender, pre-university education (vocational training or university entry qualification grade) or degree course, which does not indicate any substantial influence of prior education on the response processes.
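As a minimal illustration of how the profile comparisons summarized in Tables 3 and 4 could be computed, the sketch below aggregates hypothetical per-student records by profile. The column names and values are assumptions introduced for illustration, not the study data.

    # Per-profile means over hypothetical student records (not the study data).
    import pandas as pd

    df = pd.DataFrame({
        "profile":      [1, 1, 2, 2, 2, 3, 3],                 # profile from the classification above
        "test_score":   [2.8, 3.1, 3.6, 3.4, 3.9, 4.2, 4.0],   # scored task performance (6-point scale)
        "sources_used": [3, 4, 7, 8, 6, 7, 6],                 # information sources cited in the statement
        "words":        [520, 490, 350, 380, 390, 470, 480],   # length of the written statement
    })

    # Mean test score, number of sources used, and response length per profile.
    print(df.groupby("profile")[["test_score", "sources_used", "words"]].mean().round(2))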

Task-Topic Related Attitudes and Their Relationship to Reasoning Processes and Decision-Making

Another approach to identifying certain beliefs and their possible relationship to information processing and critical reasoning was to analyze students’ task-topic related attitudes and their impact on reasoning approaches when solving the performance task. In this respect, the reasoning lines identified in the cognitive interviews (as well as in the written task responses) can be categorized as follows:

(1) The first category differentiates between primarily economics-focused and primarily ecologically oriented reasoning lines. Twelve students’ recommendations had a primarily economic focus in their reasoning, while 18 students relied more on ecological aspects and sources presenting an ecological perspective.

Remarkably, students who were in favor of building wind turbines tended to choose an economics-focused reasoning line, while students against the construction chose an ecologically focused perspective (Table 5). An example of an economic reasoning line can be seen in the following statements: “The trade tax to be paid by the operator could be sensibly invested in the modernization of facilities, the infrastructure of the place and the marketing of the local recreation area. This source of income seems to be important for the community, especially in the future, against the background of an increasingly dwindling agriculture” (ID 13); “In my opinion, the offer should be accepted, as the positive aspects outweigh the negative ones and, in general, the construction of wind turbines would mean a macroeconomic, long-term benefit for the community. In addition, it is an investment in infrastructure.” (ID 25). In contrast, an example of an ecological line of reasoning and its relationship to information processing and decision-making can be seen in the following statement (ID 26): “That caused me to have fewer choices, and I had already had the notion in mind that wind turbines are good and nuclear power plants are bad, which is why I said from the very outset that yes, no matter in which form, more renewable energy should be produced and, well, that’s why I said all along that that would be the most sensible result in my opinion, without any of those arguments.”

Table 5. Economic- and ecological-focused reasoning and decision against or in favor of wind turbines at the end of the task.

(2) The students’ decision-making processes and (final) recommendations can also be categorized in terms of the extent to which the specific situation described in the task was considered. While half of the participants took the task-specific perspective of the local council and the current situation of the city into account (n = 15), the other students chose a more general approach in making a recommendation for or against wind turbines (n = 15).

One example of the consideration of local conditions can be found in the statement of participant ID 7: “I consider the construction of the wind turbines in the north of the municipality to be an incalculable risk, as the tertiary sector and especially the tourism that goes along with it represent an important source of income for the town. I think it makes much more sense to locate the wind turbines in the west. Farmers who live there, such as Mr. Anders and Mr. Bender, should welcome an additional source of income besides agriculture, so that they should agree to the construction of the wind turbines.” A more general approach is expressed in the statement of participant ID 16: “The fact that wind energy is initially a clean and environmentally friendly way of generating energy speaks for the installation of wind turbines. In addition, there are also economic reasons for this, as good money can be made from the rent that accrues when a wind turbine is installed. […].”

While the majority of students who took the task-specific current situation of the city into account tended to express a negative attitude about wind turbines, students who took a more general reasoning approach were rather in favor of building wind turbines ( Table 6 ).

Table 6. Perspective of reasoning (local council included or not) and decision against or in favor of wind turbines at the end of the task.

The Relationship Between the Reasoning Approaches and Task Performance (Test Scores)

In terms of task performance, no significant differences were found between the students with different reasoning approaches, although students who chose economics-focused reasoning achieved slightly higher scores than the other students. When taking into account the positive vs. negative stance toward wind turbines at the end of the task, however, the difference in task performance among students with ecologically focused reasoning is about 0.8 points, whereas the difference in the group with an economics-focused approach is only 0.1 points.

Change of Beliefs While Solving the Task

RQ3: Interaction Between Students’ Beliefs and Processing of the Given Information

Looking at the time of decision-making, we found that some students changed their opinion about the construction of wind turbines (once or several times) while processing and working on the task, while others did not. While 14 interviewees reported that they did not change their opinion about the wind turbines over the course of their task solving, 12 interviewees changed their opinion after processing the information given in the task (Table 7). Four participants claimed that they had not been initially disposed either way. Both groups of students – those who changed their opinion and those who did not – can each be further divided into two subgroups depending on their positive or negative stance toward wind turbines, and these subgroups vary considerably in size. Within the group with no change of opinion, participants who had been against the construction of wind power plants at the beginning of the task and remained negative (n = 3) can be distinguished from participants who had a positive stance toward wind turbines both before and after completing the task (n = 12). The students who changed their opinion while working on the task can also be differentiated. Some students initially had negative attitudes toward wind turbines but changed their opinion during task processing and in the end voted in favor of wind turbines (n = 2). Conversely, other participants were in favor of constructing wind turbines at the beginning but ultimately spoke out against them (n = 9).

Table 7. Change of opinion.

The Relationship Between a Change of Students’ Beliefs and Task Performance (Test Scores)

There was hardly any significant difference in the test scores of the two groups, although students who did not change their opinion performed slightly better than students who did: the difference in task performance was about 0.7 points.

Discussion and Conclusion

Summary and Interpretation of Results

The data from the cognitive interviews on the students’ beliefs, information processing and reasoning processes make a valuable contribution to explaining the students’ CR abilities and the complex interplay between their underlying thought processes and task topic-related beliefs. In the interviews, most participants expressed that they were aware of holding certain beliefs at the beginning of task processing (RQ1). The results of the qualitative analysis of the cognitive interview protocols indicated that the students’ task topic-related beliefs had an influence on their selection, critical evaluation and use of information as well as on their reasoning process and final decision (RQ2). As an additional decisive contribution to existing research [see Section “State of Research on Beliefs and Their Impact on (Online) Information Processing”], we provide initial evidence that some students’ task topic-related beliefs changed over the course of task processing, indicating that the processed information (recognized and reflected evidence and pros/cons) influenced the students’ beliefs to varying degrees (RQ3).

Overall, the evidence from this qualitative analysis suggests a complex, reciprocal, and changeable relationship between students’ task topic-related beliefs, their processing of new (confirming or deviating) information, and their decision-making based on both beliefs and evidence.

More specifically, the types of beliefs and attitudes derived from the cognitive interview data suggest their influence on information processing, reasoning approaches and decision-making. In particular, the students who already had strong task topic-related beliefs at the beginning regarded these as decisive while solving the task. For instance, students who had already made a decision based on their beliefs at the beginning of the task cited fewer sources in their written response (final decision).

Overall, the selection, evaluation, and use of information while working on the task were influenced, in particular, by the participants’ task topic-related beliefs (RQ2). By contrast, hardly any differences became evident in terms of students’ relevant knowledge. However, the majority of the participants had only little prior knowledge of the subject, i.e., a large amount of the information in the task was new to them. Though most students had a positive or negative stance toward renewable energy in general, their personal beliefs concerning wind energy in particular did not appear to be very firm and well-founded. The few test participants who had already dealt with the subject area in more detail appeared to have more solid personal beliefs about wind energy (RQ1). Furthermore, there were no differences in terms of students’ general interest in the topic. However, two reasoning lines – a more ecologically oriented and a more economics-focused approach – became evident, which appear to influence students’ decision-making processes and final decision.

Remarkably, the students who had more elaborated beliefs prior to processing the task were more likely to come to a decision that contradicted their personal beliefs. For instance, the information on the negative effects of wind turbines on the health of people and animals living in the vicinity of a wind farm (noise emission, bird strike, infrasound) was particularly relevant for these participants when making their decision; they were more astonished by this information than the students who had hardly any prior knowledge about the subject and no well-developed beliefs (RQ2).

Most students started selecting information right away after obtaining an initial overview of the sources presented in the task. The participants’ subsequent evaluation of the given information with regard to the reliability, validity, objectivity, and trustworthiness of the respective sources (as stated in the interviews) does not appear to have had much of an influence on their selection and use of information. In contrast, the participants evaluated the relevance of the sources differently, whereby a large number of the sources that were evaluated as relevant were used to inform their decisions and help them formulate their written recommendations. For instance, in the interviews, the majority of students rated Wikipedia as a less reliable source (of course the exact details vary, but in general, it received rather negative ratings), as Wikipedia pages can potentially be edited by any Internet user. However, the choice as to whether or not to use information from Wikipedia sources was primarily made on the basis of the content of these sources (“ do I want to address bird mortality or not? ”). In contrast, when it came to the evaluation of the public-service broadcaster videos, a large number of participants assessed these videos as trustworthy despite not having watched them, as they considered this source to be particularly reliable.

Overall, in the cognitive interviews it became evident that the students mostly selected and evaluated (or ignored) new information depending on the media or source type (i.e., whether they believed that certain types of media and presented sources are relevant and reliable) but not on the particular content/evidence. This finding is in line with previous research reported in the Section “State of Research on Beliefs and Their Impact on (Online) Information Processing” and stresses the importance of epistemic beliefs regarding information sources, which were not a focus of this study and require further investigation in the particular context of online reasoning (for limitations, see the next section). In addition, this result points to a demand for more observational studies that capture in detail which documents, which parts of these documents, and which content the participants read and comprehend while solving the task.

Although participants used different sources in their statements, most of the students did not compile the information provided to them and weigh the evidence (pros/cons), but rather selected information related to their own beliefs, indicating biased selection, evaluation and use of information (for the confirmation bias, see Mercier and Sperber, 2009 , 2011 ; Metzger et al., 2010 ; Metzger and Flanagin, 2015 ). A (repeated) critical examination of the information and evidence provided did not take place.

Linking the results from the qualitative analyses of the cognitive interviews with task performance further suggests a confirmation bias in reasoning, showing that students who made their decision based solely on their beliefs (profile 1) had the worst test scores on average. This was also reflected in the number of sources used: they wrote the longest statements but based them on the lowest number of sources, without sufficiently reflecting on the available information and evidence. This finding is also supported by the lower performance of students who tended to overemphasize a single source while neglecting all contradicting source information (for the authority bias, see Metzger et al., 2010; Metzger and Flanagin, 2015). Overall, the finding from the qualitative analyses that sufficient critical reasoning often did not take place in the decision-making process and that the decision was based on beliefs (and bias) was also reflected in the students’ written statements.

In contrast, the students with no early inclination (profile 3) approached more of the source material neutrally and decided individually whether to incorporate the information and evidence, outperforming the other students in terms of task score. Their statements were less belief-driven, since they addressed the specific task scenario and prioritized the town’s needs and restrictions over their personal stance on renewable energy.

As the students only had limited time (60 min) to respond, time pressure also played an important role and forced them to gather relevant information as quickly as possible. If the participants selected the information they intended to read more precisely, worked with it, and then used it in their decision-making at an early stage (right at the beginning), quickly (without deliberative thinking), and consistently (without changing their minds), the time pressure apparently did not have much of an effect on their task-solving efforts. The cognitive interviews indicate, however, that for some students selecting suitable information was a major challenge while working on the task (indicating a higher cognitive load; Sweller, 1988). These participants often opted to use internal rather than external sources, i.e., they mostly focused on the information that was available within the task document itself and disregarded the hyperlinks. The majority of participants did not watch the two videos (completely) due to time issues. This aspect also points to some limitations of our study (see next section).

Limitations and Implications for Future Research

Though the study provides some important insights into the complex reciprocal relationship between students’ beliefs and their reasoning and decision-making process, some limitations (besides those related to the sample, see Section “Sample and Data”) must be critically noted, which indicate some perspectives for further research.

While the results of the qualitative analyses pertaining to RQs 1 and 2 allow for some clear statements about students’ beliefs and their influence on critical reasoning, the findings pertaining to RQ 3 regarding changes in beliefs are still limited. First, in our study, we can only derive conclusions about task topic-related beliefs. These need to be distinguished from general personal (e.g., epistemic) beliefs, which were not analyzed in our study. In prior research, general beliefs were usually seen as a trait that does not change during the course of solving a task. However, measuring epistemic beliefs is considered challenging from a conceptual as well as a methodological perspective, and requires further research (van Strien et al., 2012a).

Second, based on the cognitive interview protocols, a clear distinction between a change in task topic-related beliefs and a change of overall opinion could also only be made to a limited extent. Although some students clearly stated that they had held beliefs prior to processing the task that influenced their information processing and decision-making, and that they had changed their opinion, we cannot conclude on the basis of the interviews whether this change of opinion was due to a change in their underlying beliefs. It is also questionable whether students were able to clearly distinguish between their beliefs, their attitude toward the task topic, and their opinion, and to express this difference in the interviews. This limitation results in an important follow-up question for further research: Is a change of opinion accompanied by a change of task topic-related beliefs?

Though the results of both assessed scales on students’ interest in the task topic and students’ test motivation showed (very) high levels among all participants in this study, we noticed some differences in the way students approached the cognitive interviews. While some students were very communicative and talked a lot about their beliefs and task processing, other students gave short answers. Consequently, the cognitive interview protocols vary substantially in length and detail. The results of the qualitative analyses must therefore be viewed critically in terms of this data limitation. For instance, it cannot be ruled out that students who did not express that they had topic-related beliefs prior to processing the task may not have deliberately reflected on this interview question or simply may not have wanted to share this information (e.g., due to a social desirability bias). Despite the use of a standardized guide in the semi-structured interviews, the comparability of the cognitive interview protocols may be limited in this regard.

The task topic itself may also not be free of bias in this respect, since renewable energy is generally framed in a positive light. For this reason, it also cannot be ruled out that students’ responses to the task and their answers to the interview questions were biased in terms of social desirability. However, the fact that some students in our sample were both initially and ultimately against the construction of the wind turbines (n = 3) may contradict this assumption.

In addition, though (i) the task prompt to write an evidence-based statement regarding the decision for the community should have been clear and strong enough to indicate that a discussion of the evidence (pros and cons) made available in the task was required, and (ii) (very) high levels of assessed interest in the task topic and overall test motivation among participants were determined, differences among participants in terms of (metacognitively) engaging their critical reasoning skills when solving the performance task still cannot be ruled out. Based on prior research, it can be assumed that the activation of critical reasoning abilities requires metacognitive skills (e.g., Brand-Gruvel and Stadtler, 2011). Therefore, a further understanding of students’ (metacognitive) engagement (and other influences) during the decision-making process is required to help identify certain patterns in task processing strategies for this type of performance assessment and to further improve computer-based simulations in terms of their ecological validity and reliability to ensure more authentic assessment (for a critical discussion, see also Mercier and Sperber, 2011).

In this context, it is remarkable that the group of students who were aware of the influence of their beliefs decided, despite the task prompt asking them to include the given information and evidence in their decision-making process, to use only information that supported their beliefs (profile 1). These students had already recognized at the beginning of the task processing that their beliefs would have a decisive influence on their decision. If we transfer this finding to other real-life situations, in particular the everyday use of online sources in Internet searching, further research is required as to whether students, when searching for sources and in the context of their university education, also specifically focus on sources and information that confirm their beliefs. In this respect, the identified reasoning profile 1 may lead to the acquisition of biased (domain-specific) knowledge. In contrast, the “open minded” profile 3 approached more of the information neutrally, outperforming the other students in terms of the scored quality of their written statements.

In this context, it is also important to focus on those students who claimed to have held certain beliefs on the topic before starting the task but still reviewed all the information given and even partly decided against their beliefs after having considered all the information (“deliberative” profile 2). This profile should be analyzed in more depth, especially taking into consideration both additional underlying cognitive and non-cognitive student characteristics as well as specific learning opportunities that this group might have had to develop this deliberative reasoning approach. Here, further questions arise: Why did students choose this approach and decide against their beliefs? What personal or contextual factors may have played a decisive role?

The complex relationship between prior knowledge and beliefs also requires further in-depth investigation. Ho et al. (2008) found that task topic-related beliefs interact with the amount and quality of topic-relevant knowledge, whereby the topic-related beliefs may have a stronger impact on decision-making than knowledge. Analogously, the results of our study suggest that, in general, no matter how experienced students were in a topic or how much previous knowledge they had, certain beliefs seemed to be influential and predominant. However, to what extent beliefs influence students in their approach to a task topic, and which aspects are particularly crucial for students to be influenced by their beliefs (e.g., the strength of beliefs or additional personal characteristics), must be analyzed in further research (for an overview, see Brand-Gruvel and Stadtler, 2011).

In addition, looking at the differences in the students’ reported reasoning processes, we can conclude that students’ diverse beliefs and attitudes, which were related to the task context and topic to very different extents (e.g., in the area of sustainability), had an influence on the students’ decision-making and final decision. Based on the data from the cognitive interview protocols, however, we were not able to analyze the complex relationship between beliefs, reasoning approaches, and lines of argumentation. Though critical reasoning is indeed related to aspects of argumentative skills, this latter aspect was not the focus of our study (as described in Section “Conceptual and Methodological Background”) and requires further investigation in several regards. A particular investigation of argumentative skills would require a substantial change and further development of the experimental and assessment setting. For instance, there are several performance tests available that specifically focus on measuring argumentative skills (e.g., the Argument Structure Test, Münchow et al., 2020; the Argument Judgment Test, Münchow et al., 2019) and are suitable for discriminant validation of CR assessments, which should be investigated in follow-up research. In addition, comprehensive qualitative analyses of the argumentative importance of the material on the one hand, as well as of (i) the arguments (more or less reflective or intuitive, Mercier and Sperber, 2011) used by students in their responses and (ii) the (new) arguments created by the students themselves based on the arguments given in the provided information on the other hand, need to be conducted in further studies and explicitly linked to students’ critical reasoning ability and performance.

Finally, the method of cognitive interviews also has certain limitations in terms of understanding and explaining students’ reasoning processes during task-solving, for instance due to a social desirability bias (e.g., Kahne and Bowyer, 2017), as mentioned above, or limited mental recall capacities. However, one central focus of the presented study lies on the investigation of self-awareness of one’s own beliefs, i.e., whether the students were aware of their beliefs and whether they were aware of these beliefs influencing their perception, evaluation, selection, and use of the given information. Hence, cognitive interviews were necessary to gain indications regarding the students’ (critical) reflection on the thought processes involved in solving the task, i.e., writing a statement. In particular, conclusions about self-awareness regarding one’s beliefs and their relation to decision-making can best be reached by means of stimulated recalls in cognitive interviews, which have been shown to approximate think-aloud methods in study settings where participants cannot think aloud while processing the task (as in this computer-based test environment).

Follow-up research addressing these limitations and implications would provide a better understanding of successful CR and a more solid basis for developing targeted instructional interventions to promote students’ CR skills in dealing with new, more or less trustworthy, or contradictory information.

Data Availability Statement

Ethics Statement

Ethical review and approval was not required for the study on human participants in accordance with the local legislation and institutional requirements. The participants provided their written informed consent to participate in this study.

Author Contributions

OZ-T provided the idea for the study, co-developed the assessment, supervised the analyses, and co-wrote the manuscript. KB co-developed the assessment, supervised the analyses, and was involved in preparing and reviewing the manuscript. JF and DB conducted the analyses, and were involved in preparing the manuscript. SS was involved in the data collection and in the analyses. RS was involved in the development of the performance assessment and in preparing the manuscript. All the authors contributed to the article and approved the submitted version.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

We would like to thank the two reviewers and the editor who provided constructive feedback and helpful guidance in the revision of this manuscript. We would also like to thank all students from the Johannes Gutenberg University Mainz who participated in this study as well as the raters who evaluated the written responses.

Funding. This study is part of the PLATO project, which is funded by the German federal state of Rhineland-Palatinate.

1 In contrast to other concepts related to critical thinking, critical online reasoning (COR) is explicitly limited to the online information environment and includes the specific ability of “online information acquisition.” While there is currently no unified definition of COR, there are numerous definitions of its related construct, critical thinking (CT), that include and describe different dimensions or levels. For instance, Oser and Biedermann (2020) distinguish between CT as alertness, CT as immediate reflection, and CT as analysis. Facione (2004, p. 9) describes CT as “inference, explanation, interpretation, evaluation, analysis, self-regulation” (for further definitions of CT, see Moore, 2013; Beck, 2020). As Brookfield (1987) emphasizes, “Being a critical thinker involves more than cognitive activities such as logical reasoning or scrutinizing arguments for assertions unsupported by empirical evidence. Thinking critically involves us recognizing the assumptions underlying our beliefs and behaviors.”

2 One well-known test of this kind is the Watson-Glaser Critical Thinking Appraisal (2002), which comprises tasks on inferences, recognition of assumptions, deduction, interpretation, and evaluation of arguments ( Watson and Glaser, 2002 ).

3 https://ktl.jyu.fi/fi/hankkeet/kappas/copy_of_lyhyesti

4 www.ets.org/heighten/about/critical_thinking/

5 http://practice.cappassessments.com

6 The criteria for selection from the overall sample and inclusion were the participants’ socio-biographical and educational characteristics to ensure: (1) gender balance, (2) age distribution, (3) course of study representation (all), (4) study year/progress (advanced students), and (5) prior education (e.g., completed vocational training).

7 The deviation from the total sample size (30 participants) is due to missing values.

  • Alexander P. A. (2004). “A model of domain learning: Reinterpreting expertise as a multidimensional, multistage process,” in Motivation, emotion, and cognition: Integrative perspectives on intellectual functioning and development, eds Dai D. Y., Sternberg R. J. (New Jersey: Lawrence Erlbaum Associates), 273–298.
  • American Educational Research Association [AERA], American Psychological Association, and National Council on Measurement in Education (2014). Standards for Educational and Psychological Testing. Washington, DC: American Educational Research Association.
  • Beck K. (2020). “On the relationship between “Education” and “Critical Thinking”,” in Frontiers and Advances in Positive Learning in the Age of Information (PLATO), ed. Zlatkin-Troitschanskaia O. (New York, NY: Springer), 73–87.
  • Brand-Gruwel S., Stadtler M. (2011). Solving information-based problems: evaluating sources and information. Learn. Instr. 21 175–179. 10.1016/j.learninstruc.2010.02.008
  • Bråten I., Strømsø I. H., Salmerón L. (2011). Trust and mistrust when students read multiple information sources about climate change. Learn. Instr. 21 180–192. 10.1016/j.learninstruc.2010.02.002
  • Braun H. (2019). Performance assessment and standardization in higher education: a problematic conjunction? Br. J. Educ. Psychol. 89 429–440. 10.1111/bjep.12274
  • Brookfield S. D. (1987). Developing Critical Thinkers: Challenging Adults to Explore Alternative Ways of Thinking and Acting. San Francisco, CA: Jossey-Bass.
  • Brooks C. (2016). ECAR Study of Students and Information Technology. Louisville, KY: ECAR.
  • Chiu Y.-L., Liang Y.-C., Tsai C.-C. (2013). Internet-specific epistemic beliefs and self-regulated learning in online academic information searching. Metacogn. Learn. 8 235–260. 10.1007/s11409-013-9103-x
  • Chiu Y.-L., Tsai C.-C., Liang J.-C. (2015). Testing measurement invariance and latent mean differences across gender groups in college students’ internet-specific epistemic beliefs. Austr. J. Educ. Technol. 31 486–499. 10.14742/ajet.1437
  • Ciampaglia G. L. (2018). “The digital misinformation pipeline,” in Positive Learning in the Age of Information, eds Zlatkin-Troitschanskaia O., Wittum G., Dengel A. (Wiesbaden: Springer), 413–421. 10.1007/978-3-658-19567-0_25
  • Davey T., Ferrara S., Shavelson R., Holland P., Webb N., Wise L. (2015). Psychometric Considerations for the Next Generation of Performance Assessment. Washington, DC: Center for K-12 Assessment & Performance Management, Educational Testing Service.
  • Facione P. A. (1990). Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction—The Delphi Report. Millbrae, CA: Academic Press.
  • Facione P. A. (2004). Critical Thinking: What It Is and Why It Counts. Millbrae, CA: Academic Press.
  • Facione P. A., Facione N. C., Giancarlo C. A. F. (1998). The California Critical Thinking Disposition Inventory (CA). Cambridge, MA: Academic Press.
  • Festinger L. (1962). Cognitive dissonance. Sci. Am. 207 93–107. 10.1038/scientificamerican1062-93
  • Hahnel C., Kroehne U., Goldhammer F., Schoor C., Mahlow N., Artelt C. (2019). Validating process variables of sourcing in an assessment of multiple document comprehension. Br. J. Educ. Psychol. 89 524–537. 10.1111/bjep.12278
  • Harrison N., Luckett K. (2019). Experts, knowledge and criticality in the age of ‘alternative facts’: re-examining the contribution of higher education. Teach. Higher Educ. 24 259–271. 10.1080/13562517.2019.1578577
  • Ho S., Brossard D., Scheufele D. A. (2008). Effects of value predispositions, mass media and knowledge on public attitudes toward embryonic stem cell research. Int. J. Public Opin. Res. 20 171–192. 10.1093/ijpor/edn017
  • Horstmann N., Ahlgrimm A., Glockner A. (2009). How distinct are intuition and deliberation? An eye tracking analysis of instruction-induced decision modes. Judgm. Decis. Mak. 4 335–354.
  • Hsu C.-H., Tsai M.-J., Hou H.-T., Tsai C.-C. (2014). Epistemic beliefs, online search strategies, and behavioral patterns while exploring socioscientific issues. J. Sci. Educ. Technol. 23 471–480. 10.1007/s10956-013-9477-1
  • Huber C. R., Kuncel N. R. (2016). Does college teach critical thinking? A meta-analysis. Rev. Educ. Res. 86 431–468. 10.3102/0034654315605917
  • Johnson F., Sbaffi L., Rowley J. (2016). Students’ approaches to the evaluation of digital information: insights from their trust judgments. Br. J. Educ. Technol. 47 1243–1258. 10.1111/bjet.12306
  • Kahne J., Bowyer B. (2017). Educating for democracy in a partisan age: confronting the challenges of motivated reasoning and misinformation. Am. Educ. Res. J. 54 3–34. 10.3102/0002831216679817
  • Kammerer Y., Gerjets P. (2012). Effects of search interface and Internet-specific epistemic beliefs on source evaluations during Web search for medical information: an eye-tracking study. Behav. Inform. Technol. 31 83–97. 10.1080/0144929X.2011.599040
  • Kane M. T. (2013). Validating the interpretations and uses of test scores. J. Educ. Meas. 50 1–73. 10.1111/jedm.12000
  • Kienhues D., Stadtler M., Bromme R. (2011). Dealing with conflicting or consistent medical information on the web: when expert information breeds laypersons’ doubts about experts. Learn. Instr. 21 193–204. 10.1016/j.learninstruc.2010.02.004
  • Klein S., Benjamin R., Shavelson R., Bolus R. (2007). The collegiate learning assessment: facts and fantasies. Eval. Rev. 31 415–439. 10.1177/0193841X07303318
  • Liepmann D., Beauducel A., Brocke B., Amthauer R. (2007). I-S-T 2000 R. Intelligenz-Struktur-Test 2000 R, 2nd Edn. Göttingen: Hogrefe.
  • List A., Alexander P. A. (2017). Cognitive affective engagement model of multiple source use. Educ. Psychol. 52 182–199. 10.1080/00461520.2017.1329014
  • List A., Alexander P. A. (2018). “Cold and warm perspectives on the cognitive affective engagement model of multiple source use,” in Handbook of Multiple Source Use, eds Braasch J. L. G., Bråten I., McCrudden M. T. (New York, NY: Routledge), 34–54. 10.4324/9781315627496-3
  • List A., Alexander P. A. (2019). Toward an integrated framework of multiple text use. Educ. Psychol. 54 20–39. 10.1080/00461520.2018.1505514
  • Liu O. L., Frankel L., Roohr K. C. (2014). Assessing critical thinking in higher education: current state and directions for next-generation assessments. ETS Res. Rep. 2014 1–23. 10.1002/ets2.12009
  • Lucassen T., Schraagen J. M. (2011). Factual accuracy and trust in information: the role of expertise. J. Am. Soc. Inform. Sci. Technol. 62 1232–1242. 10.1002/asi.21545
  • Mason L., Boldrin A., Ariasi N. (2010). Searching the Web to learn about a controversial topic: are students epistemically active? Instr. Sci. 38 607–633. 10.1007/s11251-008-9089-y
  • Mason L., Pluchino P., Ariasi N. (2014). Reading information about a scientific phenomenon on webpages varying for reliability: an eye-movement analysis. Educ. Technol. Res. Dev. 62 663–685. 10.1007/s11423-014-9356-3
  • Maurer M., Quiring O., Schemer C. (2018). “Media effects on positive and negative learning,” in Positive Learning in the Age of Information (PLATO) – A Blessing or a Curse?, eds Zlatkin-Troitschanskaia O., Wittum G., Dengel A. (Wiesbaden: Springer), 197–208. 10.1007/978-3-030-26578-6
  • Maurer M., Schemer C., Zlatkin-Troitschanskaia O., Jitomirski J. (2020). “Positive and negative media effects on university students’ learning: preliminary findings and a research program,” in Frontiers and Advances in Positive Learning in the Age of Information (PLATO), ed. Zlatkin-Troitschanskaia O. (New York, NY: Springer), 109–119. 10.1007/978-3-030-26578-6_8
  • McGrew S., Breakstone J., Ortega T., Smith M., Wineburg S. (2018). Can students evaluate online sources? Learning from assessments of civic online reasoning. Theory Res. Soc. Educ. 46 165–193. 10.1080/00933104.2017.1416320
  • McGrew S., Ortega T., Breakstone J., Wineburg S. (2017). The challenge that’s bigger than fake news: civic reasoning in a social media environment. Am. Educ. 41 4–9.
  • Mercier H., Sperber D. (2009). “Intuitive and reflective inferences,” in In Two Minds: Dual Processes and Beyond, eds Evans J., Frankish K. (Oxford: Oxford University Press), 149–170. 10.1093/acprof:oso/9780199230167.003.0007
  • Mercier H., Sperber D. (2011). Why do humans reason? Arguments for an argumentative theory. Behav. Brain Sci. 34 57–74. 10.1017/S0140525X100009
  • Messick S. (1994). The interplay of evidence and consequences in the validation of performance assessments. Educ. Res. 23 13–23. 10.3102/0013189X023002013
  • Metzger M. J. (2007). Making sense of credibility on the web: models for evaluating online information and recommendations for future research. J. Am. Soc. Inform. Sci. Technol. 58 2078–2091. 10.1002/asi.20672
  • Metzger M. J., Flanagin A. J. (2015). Credibility and trust of information in online environments: the use of cognitive heuristics. J. Prag. 59 210–220. 10.1016/j.pragma.2013.07.012
  • Metzger M. J., Flanagin A. J., Medders R. B. (2010). Social and heuristic approaches to credibility evaluation online. J. Commun. 60 413–439. 10.1111/j.1460-2466.2010.01488.x
  • Mislevy R. J. (2018). Socio-Cognitive Foundations of Educational Measurement. New York, NY: Routledge.
  • Moore T. (2013). Critical thinking: seven definitions in search of a concept. Stud. Higher Educ. 38 506–522. 10.1080/03075079.2011.586995
  • Münchow H., Richter T., Schmid S. (2020). “What does it take to deal with academic literature?,” in Student Learning in German Higher Education: Innovative Measurement Approaches and Research Results, Vol. 2, eds Zlatkin-Troitschanskaia O., Pant H. A., Toepper M., Lautenbach C. (Wiesbaden: Springer), 241–260. 10.1007/978-3-658-27886-1_12
  • Münchow H., Richter T., von der Mühlen S., Schmid S. (2019). The ability to evaluate arguments in scientific texts: measurement, cognitive processes, nomological network, and relevance for academic success at the university. Br. J. Educ. Psychol. 89 501–523. 10.1111/bjep.12298
  • Murray M. C., Pérez J. (2014). Unraveling the digital literacy paradox: how higher education fails at the fourth literacy. Issues in Inform. Sci. Inform. Technol. 11 85–100. 10.28945/1982
  • Nagel M.-T., Zlatkin-Troitschanskaia O., Schmidt S., Beck K. (2020). “Performance assessment of generic and domain-specific skills in higher education economics,” in Student Learning in German Higher Education: Innovative Measurement Approaches and Research Results, eds Zlatkin-Troitschanskaia O., Pant H. A., Toepper M., Lautenbach C. (Wiesbaden: Springer), 281–299. 10.1007/978-3-658-27886-1_14
  • Newman T., Beetham H. (2017). Student Digital Experience Tracker 2017: the Voice of 22,000 UK Learners. Bristol: JISC.
  • Oliveri M. E., Mislevy R. J. (2019). Introduction to “Challenges and Opportunities in the Design of ‘Next-Generation Assessments of 21st Century Skills’” Special Issue. Int. J. Test. 19 97–102. 10.1080/15305058.2019.1608551
  • Oser F., Biedermann H. (2020). “A three-level model for critical thinking: critical alertness, critical reflection, and critical analysis,” in Frontiers and Advances in Positive Learning in the Age of Information (PLATO), ed. Zlatkin-Troitschanskaia O. (New York, NY: Springer), 89–106. 10.1007/978-3-030-26578-6_7
  • Pellegrino J. (2017). “Teaching, learning and assessing 21st century skills,” in Pedagogical Knowledge and the Changing Nature of the Teaching Profession, ed. Guerriero S. (Paris: OECD Publishing). 10.1787/9789264270695-12-en
  • Pellegrino J. W. (2020). Sciences of learning and development: some thoughts from the learning sciences. Appl. Dev. Sci. 24 48–56. 10.1080/10888691.2017.1421427
  • Schoor C., Melzner N., Artelt C. (2019). The effect of the wording of multiple documents on learning. Zeitschrift für Pädagogische Psychologie 33 223–240. 10.1024/1010-0652/a000246
  • Shavelson R. J., Zlatkin-Troitschanskaia O., Beck K., Schmidt S., Marino J. (2019). Assessment of university students’ critical thinking: next generation performance assessment. Int. J. Test. 19 337–362. 10.1080/15305058.2018.1543309
  • Stanovich K. E. (2003). “The fundamental computational biases of human cognition: heuristics that (sometimes) impair decision making and problem solving,” in The Psychology of Problem Solving, eds Davidson J. E., Sternberg R. J. (New York, NY: Cambridge University Press), 291–342. 10.1017/cbo9780511615771.011
  • Stanovich K. E. (2016). The Rationality Quotient: Toward a Test of Rational Thinking, 1st Edn. Cambridge, MA: MIT Press.
  • Strauss A., Corbin J. (1990). Basics of Grounded Theory Methods. Beverly Hills, CA: Sage.
  • Strauss A., Corbin J. (1998). Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory, 2nd Edn. Thousand Oaks, CA: Sage Publications.
  • Sweller J. (1988). Cognitive load during problem solving: effects on learning. Cogn. Sci. 12 257–285. 10.1207/s15516709cog1202_4
  • Ulyshen T. Z., Koehler M. J., Gao F. (2015). Understanding the connection between epistemic beliefs and internet searching. J. Educ. Comput. Res. 53 345–383. 10.1177/0735633115599604
  • van Strien J. L. H., Brand-Gruwel S., Boshuizen H. P. A. (2014). Dealing with conflicting information from multiple nonlinear texts: effects of prior attitudes. Comput. Hum. Behav. 32 101–111. 10.1016/j.chb.2013.11.021
  • van Strien J. L. H., Bijker M., Brand-Gruwel S., Boshuizen H. P. A. (2012a). “Measuring sophistication of epistemic beliefs using Rasch analysis,” in The Future of Learning: Proceedings of the 10th International Conference of the Learning Sciences (ICLS 2012), Volume 2: Short Papers, Symposia, and Abstracts, eds van Aalst J., Thompson K., Jacobson M. J., Reimann P. (Sydney: International Society of the Learning Sciences), 197–201.
  • van Strien J. L. H., Brand-Gruwel S., Boshuizen H. P. A. (2012b). Do Prior Attitudes Influence Epistemic Cognition While Reading Conflicting Information? Poster presented at the biannual meeting of the EARLI Special Interest Group Comprehension of Text and Graphics in August 2016, Grenoble, France. Available at: https://www.researchgate.net/publication/254848942_Do_prior_attitudes_influence_epistemic_cognition_while_reading_conflicting_information (accessed May 16, 2020).
  • van Strien J. L. H., Kammerer Y., Brand-Gruwel S., Boshuizen H. P. A. (2016). How attitude strength biases information processing and evaluation on the web. Comput. Hum. Behav. 60 245–252. 10.1016/j.chb.2016.02.057
  • Wathen C. N., Burkell J. (2002). Believe it or not: factors influencing credibility on the web. J. Am. Soc. Inform. Sci. Technol. 53 134–144. 10.1002/asi.10016
  • Watson G., Glaser E. (2002). Watson-Glaser Critical Thinking Appraisal – UK Edition. London: Pearson Assessment.
  • Wineburg S., McGrew S. (2017). “Lateral reading: Reading less and learning more when evaluating digital information,” in Stanford History Education Group Working Paper No. 2017-A1. Available at: https://ssrn.com/abstract=3048994 (accessed May 16, 2020).
  • Wineburg S., Smith M., Breakstone J. (2018). What is learned in college history classes? J. Am. History 104 983–993. 10.1093/jahist/jax434
  • Wolf R., Zahner D., Benjamin R. (2015). Methodological challenges in international comparative post-secondary assessment programs: lessons learned and the road ahead. Stud. Higher Educ. 40 471–481. 10.1080/03075079.2015.1004239
  • Zahner D. (2013). Reliability and Validity–CLA+. New York, NY: CAE.
  • Zahner D., Ciolfi A. (2018). “International comparison of a performance-based assessment in higher education,” in Assessment of Learning Outcomes in Higher Education: Methodology of Educational Measurement and Assessment, eds Zlatkin-Troitschanskaia O., Toepper M., Pant H., Lautenbach C., Kuhn C. (Wiesbaden: Springer). 10.1007/978-3-319-74338-7
  • Zlatkin-Troitschanskaia O., Jitomirski J., Happ R., Molerov D., Schlax J., Kühling-Thees C., et al. (2019a). Validating a test for measuring knowledge and understanding of economics among university students. Zeitschrift für Pädagogische Psychologie 33 119–133.
  • Zlatkin-Troitschanskaia O., Shavelson R. J., Schmidt S., Beck K. (2019b). On the complementarity of holistic and analytic approaches to performance assessment scoring. Br. J. Educ. Psychol. 89 468–484. 10.1111/bjep.12286
  • Zlatkin-Troitschanskaia O., Shavelson R. J., Pant H. A. (2018a). “Assessment of learning outcomes in higher education: international comparisons and perspectives,” in Handbook on Measurement, Assessment and Evaluation in Higher Education, eds Secolsky C., Denison B. (New York, NY: Routledge).
  • Zlatkin-Troitschanskaia O., Toepper M., Molerov D., Buske R., Brückner S., Pant H. A. (2018b). “Adapting and validating the collegiate learning assessment to measure generic academic skills of students in Germany – implications for international assessment studies in higher education,” in Assessment of Learning Outcomes in Higher Education, eds Zlatkin-Troitschanskaia O., Pant H. A., Toepper M., Lautenbach C., Kuhn C. (Wiesbaden: Springer), 245–266.

How workplaces can encourage diverse personalities, values and attitudes

Steven Smith, Professor of Psychology, Saint Mary’s University

Katelynn Carter-Rogers, Assistant Professor of Management, Indigenous Business, St. Francis Xavier University

Vurain Tabvuma, Associate Professor, Sobey School of Business Management, Saint Mary’s University

Disclosure statement

Steven Smith, Katelynn Carter-Rogers, and Vurain Tabvuma receive funding from the Social Sciences and Humanities Research Council of Canada.

If you work for an organization that believes diversity can increase organizational performance and employee well-being, we have a secret to share with you: despite what is commonly espoused about diversity, very few organizations have actually achieved benefits through current diversity approaches.

There is no question that diversity and accessibility in the workplace have value: diverse workplaces are more welcoming, more productive and have better retention of employees.

However, diversity is usually thought of only in terms of visible diversity (e.g., race, ethnicity, age, national origin, sexual orientation and cultural identity). In reality, diversity goes far beyond this.

The importance of valuing diversity

There are two limitations to only approaching diversity from a visible perspective. First, people may not be diverse in ways that are meaningful to organizations when only visible diversity is considered. Second, people may be diverse in ways that are not clearly visible and are difficult to observe and identify.

A visible diversity-only approach stops organizations from achieving the full benefits of true diversity and can lead to organizations actually becoming less diverse in their attitudes and beliefs. This is because of group polarization and groupthink, which can occur when like-minded people get together and make decisions.

Many professions tend to skew either liberal (e.g., academia) or conservative (e.g., the military), and the work environment further accentuates those tendencies, potentially leading to poor decision-making.

In such groups there are different, more deeply held attitudes, beliefs and values that cannot be easily dismissed without sincere critical thinking and engagement.

Groupthink and group polarization can be overcome when workplaces are composed of people with diverse personalities, values, and attitudes.


This makes it more difficult for the group to coalesce around particular beliefs and attitudes because these are continuously challenged from within the group.

Further, this process of deep critical thinking and engagement leads to increased creativity, innovation and productivity as underlying assumptions about work and organizing are challenged and critiqued.

Managing diverse organizations

The challenge that managers and human resource professionals face within organizations and groups that have diverse personalities, values and attitudes is finding ways for the organization to work together effectively and reduce conflict. Here are three ways to ensure diversity works in your organization:

1. Create an inclusive climate

Organizations must create an environment where all voices are heard and everyone is encouraged to express themselves and contribute. This should begin from the very moment newcomers join the organization.

Employee on-boarding should introduce newcomers to an organization’s inclusive practices and openness to engaging their unique perspectives and abilities. These inclusive practices should include having robust conflict resolution procedures, as these have been shown to positively impact team outcomes.

This is especially important for organizations with diverse personalities, values and attitudes. A wide range of deeply held values and attitudes have the potential to lead to discord and disputes.


In addition, inclusive leaders are needed to create workplaces that encourage dialogue concerning differences and support authenticity in employees.

Recent research has found that inclusive leadership is more likely to result in workplace environments where employees are open to making changes in their work procedures, policies and tasks. We live in a fast-changing, dynamic world where organizations need a workforce that is able and willing to adapt to continuously changing conditions.

2. Leave your ego at the door

It’s important for organizations to hire people who don’t bring feelings of self-importance, vanity and arrogance to the workplace.

First, organizations should encourage members to leave their ego at the door and focus on team goals, not individual accomplishments or pride. Research has shown that teams perform better when they set group goals.

Second, organizations should ensure there are ways for everyone to communicate their perspectives in ways suitable to them. Introverted members, for example, should have their preferred communication methods available.

Third, organizations should encourage all members to learn something new. Mastering a new skill elicits feelings of doubt and frustration, which causes people to seek help or guidance from others. It also results in humility.

3. Be comfortable with being uncomfortable

To work effectively, organizations should strive to create a culture where members are comfortable working with people with different personalities and perspectives. Such an environment is one where members are encouraged to be honest about their strengths and weaknesses.

Acknowledging our capabilities and the areas where we struggle — and seeing the same in others — helps us see others more completely. Group members can use a deeper understanding of each other’s strengths and weaknesses to assign tasks and support where needed.


Research has shown that perceptions of individual group task competence and group belonging are higher in groups that receive positive feedback. Organizations should focus on positive aspects of individual differences as groups learn to work effectively together.

The road to prosperity

We are able to make the most impactful, lasting changes when we embrace those whose values and attitudes differ from our own. Leading innovation consultancies have understood this for quite some time. For example, the success of the innovation consultancy IDEO is built on developing innovations by having multi-perspective working teams.

This approach has helped IDEO create breakthrough innovations such as Apple’s first mouse, Steelcase’s Leap Chair, and the Palm V.

The process of intentionally including diverse personalities, values and attitudes in the workplace is not easy. Working with people with very different value systems can be very challenging.

However, once we begin to have a deeper understanding of what drives these different perspectives, we can start to leverage the vast wealth of knowledge that has come from the many different individual experiences around us. With this wealth, we can begin to create new thoughts, ideas, products and experiences that will enrich us all.




Global Applications of Culturally Competent Health Care: Guidelines for Practice, pp 97–112

Critical Reflection

Larry Purnell, Ph.D., R.N., FAAN

First Online: 03 July 2018

Critical reflection, sometimes referred to as cultural self-awareness, is a purposeful, vital, careful evaluation of one’s own values, beliefs, and cultural heritage in order to have an awareness of how these qualities can influence patient care. However, critical reflection goes beyond awareness alone by examining and critiquing the assumptions underlying one’s values and beliefs. It includes an examination of one’s own cultural values that have the potential to be in conflict with the values of others and, as a result, hinder therapeutic relationships and effective patient care outcomes. A number of models related to critical thinking are reviewed, including Dewey’s model of reflective learning, Habermas’s model of critical reflection, Kolb’s model of experiential learning, and feminist theory. In addition, recommendations for clinical practice, administration, education and training, and research are addressed. Tools to help practitioners assess their views and values related to bias are included as appendices.

Guideline: Nurses shall engage in critical reflection of their own values, beliefs, and cultural heritage in order to have an awareness of how these qualities and issues can impact culturally congruent care. Douglas et al. (2014: 110)


Akers M, Grover P (2017) What is emotional intelligence? Psych Central. https://psychcentral.com/lib/what-is-emotional-intelligence-eq/ . Accessed 2 June 2017

American Association of Colleges of Nursing (2006) Essentials of doctoral education in advanced practice nursing. http://www.aacn.nche.edu/publications/position/DNPEssentials.pdf . Accessed 5 June 2017

American Nurses Association (2015) Code of ethics for nurses with interpretive statements. Author, Silver Spring


Andrews MM, Boyle JS (eds) (2016) Transcultural concepts and nursing care, 7th edn. Wolters Kluwer, Philadelphia

Calvillo E, Clark L, Purnell L, Pacquiao D, Ballantyne J, Villaruel S (2009) Cultural competencies in health care: emerging changes in baccalaureate nursing education. J Transcult Nurs 20(2):137–145


Canadian Nurses Association (2010) Position statement: promoting cultural competence in nursing. https://www.cna-aiic.ca/~/media/cna/page-content/pdf-en/ps114_cultural_competence_2010_e.pdf?la=en . Accessed 2 June 2017

Carper BA (1978) Fundamental patterns of knowing in nursing. Adv Nurs Sci 1(1):13–23


Chinn P (2018) Critical theory and emancipatory knowing. In: Butts JB, Rich KL (eds) Philosophies and theories for advanced nursing practice. Jones & Bartlett Learning

Chinn PL, Kramer MK (2015) Knowledge development in nursing theory and process, 9th edn. Elsevier, St. Louis

Cholle FP (2011) What is intuition, and how do we use it? https://www.psychologytoday.com/blog/the-intuitive-compass/201108/what-is-intuition-and-how-do-we-use-it . Accessed 2 June 2017

Cuellar N (2017) Unconscious bias. What is yours? J Transcult Nurs 28(4):333

Dewey J (1933) How we think: a restatement on the relation of reflective thinking to the educative process. D.C. Health, New York

Douglas M, Rosenketter M, Pacquiao D, Clark Callister L et al (2014) Guidelines for implementing culturally competent nursing care. J Transcult Nurs 25(2):109–221

Faith as a Way of Knowing. http://www.agakhanacademies.org/sites/default/files/AKA%20Faith%20Booklet%20for%20IB%20TOK.pdf . Accessed 3 June 2017

Ghoshal RA, Lippard C, Ribas V, Muir K (2013) Beyond bigotry: teaching about unconscious prejudice. Teach Sociol 41(2):130–143. http://vlib.excelsior.edu/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=sih&AN=86187613&site=eds-live&scope=site . Accessed 11 July 2017


Goleman D (2005) Emotional intelligence: why it can matter more than IQ. Bloomsbury Publishing, London

Hickson H (2011) Critical reflection: reflecting on learning to be reflective. Reflect Pract Interprof Multidiscip Perspect 12(6):829–839

How We Think: John Dewey on the Art of Reflection and Fruitful Curiosity in an Age of Instant Opinions and Information Overload (n.d.) https://www.brainpickings.org/2014/08/18/how-we-think-john-dewey/. Accessed 2 June 2017

ICN Code of Ethics for Nurses (2012) http://www.icn.ch/who-we-are/code-of-ethics-for-nurses/ . Accessed 2 June 2017

Johns C (1995) Framing learning through reflection within Carper’s fundamental ways of knowing in nursing. J Adv Nurs 22:226–234


Kolb AY, Kolb DA (2005) Kolb learning style inventory. Experience Based Learning Systems, Inc. Accessed 2 June 2017

Mezirow J (1990) Fostering critical reflection in adulthood. Jossey-Bass, San Francisco

Miettinen R (2000) The concept of experiential learning and John Dewey’s theory of reflective thought and action. Int J Lifelong Educ 19(1):54–72. https://doi.org/10.1080/026013700293458 . Accessed 7 June 2017

Purnell L, Salmond S (2013) Individual cultural competence and evidence-based practice. In: Purnell L (ed) Transcultural health care: a culturally competent approach, 4th edn. F.A. Davis, Philadelphia

Rogers C (2002) Dewey’s model of reflective learning. Teach Coll Rec 104(4):842–866. Accessed 2 June 2017

Schon DA (2014) The theory of inquiry: Dewey’s legacy to education. Curric Inq 22(2):119–139. http://www.tandfonline.com/doi/abs/10.1080/03626784.1992.11076093?src=recsys . Accessed 2 June 2017

Sullivan-Marx EM (2013) The bear and the canyon: toward an understanding of personal leadership. Nurs Sci Q 26(4):373–375

The Critical Theory of Jurgen Habermas (n.d.) http://physicsed.buffalostate.edu/danowner/habcritthy.html . Accessed 4 June 2017

Theory of Nursing Knowledge. http://www.theoryofknowledge.net/. Accessed March 10, 2018

Timmins F (2006) Critical practice in nursing care: analysis, action and reflexivity. Nurs Stand 20:49–54

Tong R (2018) Feminist ethics: some applicable thought for advance practice nurses. In: Butts JB, Rich KL (eds) Philosophies and theories for advanced nursing practice. Jones & Bartlett Learning

Ways of Knowing: Faith (2017) http://sohowdoweknow.weebly.com/faith.html . Accessed 3 June 2017

Ways of Knowing: Theory of Knowledge.net (2017) http://www.theoryofknowledge.net/ways-of-knowing/ . Accessed 2 June 2017


Author information

Authors and Affiliations

School of Nursing, University of Delaware, Newark, DE, USA

Larry Purnell Ph.D., R.N., FAAN

Florida International University, Miami, FL, USA

Excelsior College, Albany, NY, USA


Corresponding author

Correspondence to Larry Purnell, Ph.D., R.N., FAAN.

Editor information

Editors and Affiliations

School of Nursing, University of California San Francisco, Palo Alto, California, USA

Marilyn "Marty" Douglas

School of Nursing, Rutgers University, Newark, New Jersey, USA

Dula Pacquiao

College of Health Sciences, University of Delaware, Sudlersville, Maryland, USA

Larry Purnell

Appendix 1: Promoting Cultural and Linguistic Competency

1.1 Self-Assessment Checklist for Personnel Providing Primary Healthcare Services

Please select A, B, or C for each item listed below.

A = Things I do frequently, or statement applies to me to a great degree

B = Things I do occasionally, or statement applies to me to a moderate degree

C = Things I do rarely or never, or statement applies to me to minimal degree or not at all

1.1.1 Physical Environment, Materials, and Resources

_____ 1. I display pictures, posters, artworks, and other decors that reflect the cultures and ethnic backgrounds of clients served by my program or agency.

_____ 2. I ensure that magazines, brochures, and other printed materials in reception areas are of interest to and reflect the different cultures and languages of individuals and families served by my program or agency.

_____ 3. When using videos, films, or other media resources for health education, treatment, or other interventions, I ensure that they reflect the culture and ethnic backgrounds of individuals and families served by my program or agency.

_____ 4. I ensure that printed information disseminated by my agency or program is accurate and free of bias.

1.1.2 Communication Styles

_____ 5. When interacting with individuals and families who have limited English proficiency, I always keep in mind that:

* Limitations in English proficiency are in no way a reflection of their level of intellectual functioning.

* Their limited ability to speak the language of the dominant culture has no bearing on their ability to communicate effectively in their language of origin.

* They may neither be literate in their language of origin nor in English.

______ 6. I use bilingual/bicultural or multilingual/multicultural staff and/or personnel and volunteers who are skilled or certified in the provision of medical interpretation services during treatment, interventions, meetings, or other events for individuals and families who need or prefer this level of assistance.

______ 7. For individuals and families who speak languages or dialects other than English, I attempt to learn and use key words so that I am better able to communicate with them during assessment, treatment, or other interventions.

______ 8. I attempt to determine any familial colloquialisms used by individuals or families that may impact on assessment, treatment, health promotion and education, or other interventions.

______ 9. For those who request or need this service, I ensure that all notices and communiqués to individuals and families are written in their language of origin.

_____ 10. I understand that it may be necessary to use alternatives to written communications for some individuals and families, as word of mouth may be a preferred method of receiving information.

_____ 11. I understand the principles and practices of linguistic competency and:

* Apply them within my program or agency

* Advocate for them within my program or agency

_____ 12. I understand the implications of health literacy within the context of my roles and responsibilities.

_____ 13. I use alternative formats and varied approaches to communicate and share information with individuals and/or their family members who experience disability.

1.1.3 Values and Attitudes

_____ 14. I avoid imposing values that may conflict or be inconsistent with those of cultures or ethnic groups other than my own.

_____ 15. I screen books, movies, and other media resources for negative cultural, ethnic, or racial stereotypes before sharing them with individuals and families served by my program or agency.

_____ 16. I intervene in an appropriate manner when I observe other staff or clients within my program or agency engaging in behaviors that show cultural insensitivity, racial biases, and prejudice.

_____ 17. I recognize and accept that individuals from culturally diverse backgrounds may desire varying degrees of acculturation into the dominant culture.

_____ 18. I understand and accept that family is defined differently by different cultures (e.g., extended family members, fictive kin, godparents).

_____ 19. I accept and respect that male-female roles may vary significantly among different cultures (e.g., who makes major decisions for the family).

_____ 20. I understand that age and life cycle factors must be considered in interactions with individuals and families (e.g., high value placed on the decision of elders, the role of eldest male or female in families, or roles and expectation of children within the family).

_____ 21. Even though my professional or moral viewpoints may differ, I accept individuals and families as the ultimate decision-makers for services and supports impacting their lives.

_____ 22. I recognize that the meaning or value of medical treatment and health education may vary greatly among cultures.

_____ 23. I accept that religion and other beliefs may influence how individuals and families respond to illnesses, disease, and death.

_____ 24. I understand that the perception of health, wellness, and preventive health services has different meanings to different cultural groups.

_____ 25. I recognize and understand that beliefs and concepts of emotional well-being vary significantly from culture to culture.

_____ 26. I understand that beliefs about mental illness and emotional disability are culturally based. I accept that responses to these conditions and related treatments/interventions are heavily influenced by culture.

_____ 27. I recognize and accept that folk and religious beliefs may influence an individual’s or family’s reaction and approach to a child born with a disability or later diagnosed with a disability, genetic disorder, or special healthcare needs.

_____ 28. I understand that grief and bereavement are influenced by culture.

_____ 29. I accept and respect that customs and beliefs about food, its value, preparation, and use are different from culture to culture.

_____ 30. I seek information from individuals, families, or other key community informants that will assist in service adaptation to respond to the needs and preferences of culturally and ethnically diverse groups served by my program or agency.

_____ 31. Before visiting or providing services in the home setting, I seek information on acceptable behaviors, courtesies, customs, and expectations that are unique to the culturally diverse groups served by my program or agency.

_____ 32. I keep abreast of the major health and mental health concerns and issues for ethnically and racially diverse client populations residing in the geographic locale served by my program or agency.

_____ 33. I am aware of specific health and mental health disparities and their prevalence within the communities served by my program or agency.

_____ 34. I am aware of the socioeconomic and environmental risk factors that contribute to health and mental health disparities or other major health problems of culturally and linguistically diverse populations served by my program or agency.

_____ 35. I am well versed in the most current and proven practices, treatments, and interventions for the delivery of health and mental healthcare to specific racial, ethnic, cultural, and linguistic groups within the geographic locale served by my agency or program.

_____ 36. I avail myself to professional development and training to enhance my knowledge and skills in the provision of services and supports to culturally and linguistically diverse groups.

_____ 37. I advocate for the review of my program’s or agency’s mission statement, goals, policies, and procedures to ensure that they incorporate principles and practices that promote cultural and linguistic competence.

Reprinted with Permission: Tawara D. Goode • National Center for Cultural Competence • Georgetown University Center for Child & Human Development • University Center for Excellence in Developmental Disabilities, Education, Research & Service • Adapted Promoting Cultural Competence and Cultural Diversity for Personnel Providing Services and Supports to Children with Special Health Care Needs and their Families • June 1989 (Revised 2009).

SCORING: This checklist is intended to heighten the awareness and sensitivity of personnel to the importance of cultural and linguistic competence in health, mental health, and human service settings. It provides concrete examples of the kinds of beliefs, attitudes, values, and practices which foster cultural and linguistic competence at the individual or practitioner level. There is no answer key with correct responses. However, if you frequently responded “C,” you may not necessarily demonstrate beliefs, attitudes, values, and practices that promote cultural and linguistic competence within health and mental healthcare delivery programs.
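Because the checklist has no answer key, interpretation comes down to how often each response letter was chosen. As a purely illustrative sketch (not part of the original instrument), the short Python snippet below tallies a hypothetical set of A/B/C responses and flags a high share of “C” answers; the one-third threshold and the example responses are assumptions made only for demonstration.

    from collections import Counter

    def summarize_checklist(responses, flag_fraction=1/3):
        """Tally A/B/C self-assessment responses and flag frequent 'C' answers.

        `responses` is a list of single-letter strings ("A", "B", or "C"),
        one per checklist item; the flag threshold is illustrative only.
        """
        counts = Counter(r.upper() for r in responses)
        total = sum(counts.values())
        share_c = counts.get("C", 0) / total if total else 0.0
        return {
            "counts": dict(counts),
            "share_C": round(share_c, 2),
            "frequent_C": share_c >= flag_fraction,
        }

    # Hypothetical responses for the 37 items in Appendix 1.
    example = ["A"] * 20 + ["B"] * 10 + ["C"] * 7
    print(summarize_checklist(example))
    # {'counts': {'A': 20, 'B': 10, 'C': 7}, 'share_C': 0.19, 'frequent_C': False}

Either appendix checklist could be summarized in this way; the result is only a rough self-reflection aid, not a score.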

Appendix 2: Promoting Cultural and Linguistic Competency

1.1 Self-Assessment Checklist for Personnel Providing Services and Supports in Early Intervention and Early Childhood Settings

Directions : Please select A, B, or C for each item listed below.

______ 1. I display pictures, posters, and other materials that reflect the cultures and ethnic backgrounds of children and families served in my early childhood program or setting.

______ 2. I select props for the dramatic play/housekeeping area that are culturally diverse (e.g., dolls, clothing, cooking utensils, household articles, furniture).

______ 3. I ensure that the book/literacy area has pictures and storybooks that reflect the different cultures of children and families served in my early childhood program or setting.

______ 4. I ensure that tabletop toys and other play accessories (that depict people) are representative of the various cultural and ethnic groups both within my community and the society in general.

______ 5. I read a variety of books exposing children in my early childhood program or setting to various life experiences of cultures and ethnic groups other than their own.

______ 6. When such books are not available, I provide opportunities for children and their families to create their own books and include them among the resources and materials in my early childhood program or setting.

______ 7. I adapt the above referenced approaches when providing services, supports, and other interventions in the home setting.

______ 8. I encourage and provide opportunities for children and their families to share experiences through storytelling, puppets, marionettes, or other props to support the “oral tradition” common among many cultures.

______ 9. I plan trips and community outings to places where children and their families can learn about their own cultural or ethnic history as well as the history of others.

_____ 10. I select videos, films, or other media resources reflective of diverse cultures to share with children and families served in my early childhood program or setting.

_____ 11. I play a variety of music and introduce musical instruments from many cultures.

_____ 12. I ensure that meals provided include foods that are unique to the cultural and ethnic backgrounds of children and families served in my early childhood program or setting.

_____ 13. I provide opportunities for children to cook or sample a variety of foods typically served by different cultural and ethnic groups other than their own.

_____ 14. If my early childhood program or setting consists entirely of children and families from the same cultural or ethnic group, I feel it is important to plan an environment and implement activities that reflect the cultural diversity within the society at large.

_____ 15. I am cognizant of and ensure that curricula I use include traditional holidays celebrated by the majority culture, as well as those holidays that are unique to the culturally diverse children and families served in my early childhood program or setting.

_____ 16. For children who speak languages or dialects other than English, I attempt to learn and use key words in their language so that I am better able to communicate with them.

_____ 17. I attempt to determine any familial colloquialisms used by children and families that will assist and/or enhance the delivery of services and supports.

_____ 18. I use visual aids, gestures, and physical prompts in my interactions with children who have limited English proficiency.

_____ 19. When interacting with parents and other family members who have limited English proficiency, I always keep in mind that:

____ (a) Limitation in English proficiency is in no way a reflection of their level of intellectual functioning.

____ (b) Their limited ability to speak the language of the dominant culture has no bearing on their ability to communicate effectively in their language of origin.

____ (c) They may neither be literate in their language of origin nor in English.

_____ 20. I ensure that all notices and communiqués to parents are written in their language of origin.

_____ 21. I understand that it may be necessary to use alternatives to written communications for some families, as word of mouth may be a preferred method of receiving information.

_____ 22. I understand the principles and practices of linguistic competency and:

(a) Apply them within my early childhood program or setting

(b) Advocate for them within my program or agency

_____ 23. I use bilingual or multilingual staff and/or trained/certified foreign language interpreters for meetings, conferences, or other events for parents and family members who may require this level of assistance.

_____ 24. I encourage and invite parents and family members to volunteer and assist with activities regardless of their ability to speak English.

_____ 25. I use alternative formats and varied approaches to communicate with children and/or their family members who experience disability.

_____ 26. I arrange accommodations for parents and family members who may require communication assistance to ensure their full participation in all aspects of the early childhood program (e.g., hearing impaired, physical disability, visually impaired, not literate or low literacy, etc.).

_____ 27. I accept and recognize that there are often differences between language used in early childhood/early intervention settings, or at “school,” and in the home setting.

_____ 28. I avoid imposing values that may conflict or be inconsistent with those of cultures or ethnic groups other than my own.

_____ 29. I discourage children from using racial and ethnic slurs by helping them understand that certain words can hurt others.

_____ 30. I screen books, movies, and other media resources for negative cultural, ethnic, racial, or religious stereotypes before sharing them with children and their families served in my early childhood program or setting.

_____ 31. I provide activities to help children learn about and accept the differences and similarities in all people as an ongoing component of program curricula.

_____ 32. I intervene in an appropriate manner when I observe other staff or parents within my program or agency engaging in behaviors that show cultural insensitivity, bias, or prejudice.

_____ 33. I recognize and accept that individuals from culturally diverse backgrounds may desire varying degrees of acculturation into the dominant culture.

_____ 34. I understand and accept that family is defined differently by different cultures (e.g., extended family members, fictive kin, godparents).

_____ 35. I accept and respect that male-female roles in families may vary significantly among different cultures (e.g., who makes major decisions for the family, play and social interactions expected of male and female children).

_____ 36. I understand that age and life cycle factors must be considered in interactions with families (e.g., high value placed on the decisions or child-rearing practices of elders or the role of the eldest female in the family).

_____ 37. Even though my professional or moral viewpoints may differ, I accept the family/parents as the ultimate decision-makers for services and supports for their children.

_____ 38. I accept that religion, spirituality, and other beliefs may influence how families respond to illness, disease, and death.

_____ 39. I recognize and understand that beliefs and concepts of mental health or emotional well-being, particularly for infants and young children, vary significantly from culture to culture.

_____ 40. I recognize and accept that familial folklore, religious, or spiritual beliefs may influence a family’s reaction and approach to a child born with a disability or later diagnosed with a disability or special healthcare needs.

_____ 41. I understand that beliefs about mental illness and emotional disability are culturally based. I accept that responses to these conditions and related treatments/interventions are heavily influenced by culture.

_____ 42. I understand that the healthcare practices of families served in my early childhood program or setting may be rooted in cultural traditions.

_____ 43. I recognize that the meaning or value of early childhood education or early intervention may vary greatly among cultures.

_____ 44. I understand that traditional approaches to disciplining children are influenced by culture.

_____ 45. I understand that families from different cultures will have different expectations of their children for acquiring toileting, dressing, feeding, and other self-help skills.

_____ 46. I accept and respect that customs and beliefs about food, its value, preparation, and use are different from culture to culture.

_____ 47. Before visiting or providing services in the home setting, I seek information on acceptable behaviors, courtesies, customs, and expectations that are unique to families of specific cultural groups served in my early childhood program or setting.

_____ 48. I advocate for the review of my program’s or agency’s mission statement, goals, policies, and procedures to ensure that they incorporate principles and practices that promote cultural diversity, cultural competence, and linguistic competence.

_____ 49. I seek information from family members or other key community informants that will assist me to respond effectively to the needs and preferences of culturally and linguistically diverse children and families served in my early childhood program or setting.

Reprinted with Permission: Tawara D. Goode • National Center for Cultural Competence • Georgetown University Center for Child & Human Development • University Center for Excellence in Developmental Disabilities, Education, Research & Service • Adapted Promoting Cultural Competence and Cultural Diversity for Personnel Providing Services and Supports to Children with Special Health Care Needs and their Families • June 1989 (Revised 2009).

SCORING: This checklist is intended to heighten the awareness and sensitivity of personnel to the importance of cultural diversity, cultural competence, and linguistic competence in early childhood settings. It provides concrete examples of the kinds of practices that foster such an environment. There is no answer key with correct responses. However, if you frequently responded “C,” you may not necessarily demonstrate practices that promote a culturally diverse and culturally competent learning environment for children and families within your classroom, program, or agency.

Appendix 3: Personal Self-Assessment of Antibias Behavior

Directions : Using the rating scale of NEVER to ALWAYS, assess yourself for each item by placing an “X” on the appropriate place along each continuum. When you have completed the checklist, review your responses to identify areas in need of improvement. Create specific goals to address the areas in which you would like to improve.

I educate myself about the culture and experiences of other racial, religious, ethnic and socioeconomic groups by reading and attending classes, workshops, cultural events, etc.

Never _________________________ Always

I spend time reflecting on my own upbringing and childhood to better understand my own biases and the ways I may have internalized the prejudicial messages I received.

Never _________________________ Always

I look at my own attitudes and behaviors as an adult to determine the ways they may be contributing to or combating prejudice in society.

Never _________________________ Always

I evaluate my use of language to avoid terms or phrases that may be degrading or hurtful to other groups.

Never _________________________ Always

I avoid stereotyping and generalizing other people based on their group identity.

Never _________________________ Always

I value cultural differences and avoid statements such as “I never think of you as______________,” which discredits differences.

Never _________________________ Always

I am comfortable discussing issues of racism, anti-Semitism and other forms of prejudice with others.

Never _________________________ Always

I am open to other people’s feedback about ways in which my behavior may be culturally insensitive or offensive to others.

Never _________________________ Always

I give equal attention to other people regardless of race, religion, gender, socioeconomic class or other difference.

Never _________________________ Always

I am comfortable giving constructive feedback to someone of another race, gender, age or physical ability.

Never _________________________ Always

The value of diversity is reflected in my work, which includes a wide range of racial, religious, ethnic and socioeconomic groups, even when these groups are not personally represented in my community.

Never _________________________ Always

I work intentionally to develop inclusive practices, such as considering how the time, location and cost of scheduled meetings and programs might inadvertently exclude certain groups.

Never _________________________ Always

I work to increase my awareness of biased content in television programs, newspapers and advertising.

Never _________________________ Always

I take time to notice the environment of my home, office, house of worship and children’s school, to ensure that visual media represent diverse groups, and I advocate for the addition of such materials if they are lacking.

Never _________________________ Always

When other people use biased language and behavior, I feel comfortable speaking up, asking them to refrain and stating my reasons.

Never _________________________ Always

I contribute to my organization’s achievement of its diversity goals through programming and by advocating for hiring practices that contribute to a diverse workforce.

Never _________________________ Always

I demonstrate my commitment to social justice in my personal life by engaging in activities to achieve equity.

Never _________________________ Always
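If you record where each “X” falls as a number, a rough way to turn the marks into goals is sketched below. This is only an illustration: the 0-10 scale, the threshold, and the abbreviated item names are assumptions, not part of the original activity.

```python
def improvement_areas(ratings, threshold=5):
    """Return items marked below `threshold`, lowest-rated first.

    `ratings` maps a shortened item description to where the "X" was placed
    on the Never-to-Always continuum, recorded here (as an assumption) on a
    0-10 scale where 0 = Never and 10 = Always.
    """
    low = [(score, item) for item, score in ratings.items() if score < threshold]
    return [item for score, item in sorted(low)]

# Hypothetical marks for three of the items above
goals = improvement_areas({
    "evaluate my use of language": 8,
    "speak up when others use biased language": 2,
    "notice biased content in media": 4,
})
print(goals)  # ['speak up when others use biased language', 'notice biased content in media']
```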

This activity was adapted from “Commitment to Combat Racism” by Dr. Beverly Tatum & Andrea Ayvazian in White Awareness: Handbook for Anti-Racism Training by Judy H. Katz. ©1978 by the University of Oklahoma Press, Norman. Reprinted by permission of the publisher. All rights reserved.

Permission was also granted from the Anti-Defamation League, Education Division, A WORLD OF DIFFERENCE® Institute © 2007 Anti-Defamation League: www.adl.org/education.

Chapter citation: Purnell, L. (2018). Critical Reflection. In Douglas, M., Pacquiao, D., & Purnell, L. (Eds.), Global Applications of Culturally Competent Health Care: Guidelines for Practice. Springer, Cham. https://doi.org/10.1007/978-3-319-69332-3_10

© 2018 Springer International Publishing AG, part of Springer Nature.


9.2: Beliefs

Jim Marteney, Los Angeles Valley College, via the ASCCC Open Educational Resources Initiative (OERI), Social Sci LibreTexts

Beliefs represent all the bits of information we collect about the people, events, and things in our lives. They are the cognitions we have discriminated and selected, out of everything we have been exposed to, about any subject in our environment.

Beliefs are measured along a true-false continuum, a kind of probability scale: some beliefs you feel are absolutely true or false, others probably true or false, and still others you are simply unsure about. All of us possess beliefs about a college education: that it takes time, that it is a lot of work, that it makes our parents happy, that it will allow us to make more money in the future, and so on.

Some beliefs are stronger than others, or, as we say, have more salience. That is, some information about our environment is more important to us than other information: how you are doing in a class, for example, matters more to you than how another class member is doing.
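To make the idea concrete, one possible (purely illustrative) way to model a belief in code is to pair each statement with a subjective probability of truth and a salience weight; the 0-to-1 scales and the example numbers are assumptions, not values given in the text.

```python
from dataclasses import dataclass

@dataclass
class Belief:
    statement: str
    probability_true: float  # 0.0 = certainly false ... 1.0 = certainly true
    salience: float          # 0.0 = unimportant to me ... 1.0 = highly important

beliefs = [
    Belief("A college education takes time", 0.95, 0.40),
    Belief("A college education will let me earn more money later", 0.70, 0.90),
    Belief("A college education makes my parents happy", 0.80, 0.30),
]

# Rank beliefs by salience: the more salient a belief, the more weight it carries for us.
for b in sorted(beliefs, key=lambda b: b.salience, reverse=True):
    print(f"{b.statement}: p(true)={b.probability_true}, salience={b.salience}")
```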

