

  • Open access
  • Published: 09 March 2020

Rubrics to assess critical thinking and information processing in undergraduate STEM courses

  • Gil Reynders 1,2,
  • Juliette Lantz 3,
  • Suzanne M. Ruder 2,
  • Courtney L. Stanford 4 &
  • Renée S. Cole (ORCID: orcid.org/0000-0002-2807-1500) 1

International Journal of STEM Education, volume 7, Article number: 9 (2020)


Process skills such as critical thinking and information processing are commonly stated outcomes for STEM undergraduate degree programs, but instructors often do not explicitly assess these skills in their courses. Students are more likely to develop these crucial skills if there is constructive alignment between an instructor’s intended learning outcomes, the tasks that the instructor and students perform, and the assessment tools that the instructor uses. Rubrics for each process skill can enhance this alignment by creating a shared understanding of process skills between instructors and students. Rubrics can also enable instructors to reflect on their teaching practices with regard to developing their students’ process skills and facilitating feedback to students to identify areas for improvement.

Here, we provide rubrics that can be used to assess critical thinking and information processing in STEM undergraduate classrooms and to provide students with formative feedback. As part of the Enhancing Learning by Improving Process Skills in STEM (ELIPSS) Project, rubrics were developed to assess these two skills in STEM undergraduate students’ written work. The rubrics were implemented in multiple STEM disciplines, class sizes, course levels, and institution types to ensure they were practical for everyday classroom use. Instructors reported via surveys that the rubrics supported assessment of students’ written work in multiple STEM learning environments. Graduate teaching assistants also indicated that they could effectively use the rubrics to assess student work and that the rubrics clarified the instructor’s expectations for how they should assess students. Students reported that they understood the content of the rubrics and could use the feedback provided by the rubric to change their future performance.

The ELIPSS rubrics allowed instructors to explicitly assess the critical thinking and information processing skills that they wanted their students to develop in their courses. The instructors were able to clarify their expectations for both their teaching assistants and students and provide consistent feedback to students about their performance. Supporting the adoption of active-learning pedagogies should also include changes to assessment strategies to measure the skills that are developed as students engage in more meaningful learning experiences. Tools such as the ELIPSS rubrics provide a resource for instructors to better align assessments with intended learning outcomes.

Introduction

Why assess process skills?

Process skills, also known as professional skills (ABET Engineering Accreditation Commission, 2012), transferable skills (Danczak et al., 2017), or cognitive competencies (National Research Council, 2012), are commonly cited as critical for students to develop during their undergraduate education (ABET Engineering Accreditation Commission, 2012; American Chemical Society Committee on Professional Training, 2015; National Research Council, 2012; Singer et al., 2012; The Royal Society, 2014). Process skills such as problem-solving, critical thinking, information processing, and communication are widely applicable to many academic disciplines and careers, and they are receiving increased attention in undergraduate curricula (ABET Engineering Accreditation Commission, 2012; American Chemical Society Committee on Professional Training, 2015) and workplace hiring decisions (Gray & Koncz, 2018; Pearl et al., 2019). Recent reports from multiple countries (Brewer & Smith, 2011; National Research Council, 2012; Singer et al., 2012; The Royal Society, 2014) indicate that these skills are emphasized in multiple undergraduate academic disciplines, and annual polls of about 200 hiring managers indicate that employers may place more importance on these skills than on applicants' content knowledge when making hiring decisions (Deloitte Access Economics, 2014; Gray & Koncz, 2018). The assessment of process skills can provide a benchmark for achievement at the end of an undergraduate program and act as an indicator of student readiness to enter the workforce. Assessing these skills may also enable instructors and researchers to more fully understand the impact of active learning pedagogies on students.

A recent meta-analysis of 225 studies by Freeman et al. (2014) showed that students in active learning environments may achieve higher content learning gains than students in traditional lectures in multiple STEM fields when comparing scores on equivalent examinations. Active learning environments can have many different attributes, but they are commonly characterized by students “physically manipulating objects, producing new ideas, and discussing ideas with others” (Rau et al., 2017) in contrast to students sitting and listening to a lecture. Examples of active learning pedagogies include POGIL (Process Oriented Guided Inquiry Learning) (Moog & Spencer, 2008; Simonson, 2019) and PLTL (Peer-led Team Learning) (Gafney & Varma-Nelson, 2008; Gosser et al., 2001) in which students work in groups to complete activities with varying levels of guidance from an instructor. Despite the clear content learning gains that students can achieve from active learning environments (Freeman et al., 2014), the non-content gains (including improvements in process skills) in these learning environments have not been explored to a significant degree. Active learning pedagogies such as POGIL and PLTL place an emphasis on students developing non-content skills in addition to content learning gains, but typically only the content learning is assessed on quizzes and exams, and process skills are not often explicitly assessed (National Research Council, 2012). In order to fully understand the effects of active learning pedagogies on all aspects of an undergraduate course, evidence-based tools must be used to assess students’ process skill development. The goal of this work was to develop resources that could enable instructors to explicitly assess process skills in STEM undergraduate classrooms in order to provide feedback to themselves and their students about the students’ process skill development.

Theoretical frameworks

The incorporation of these rubrics and other currently available tools for use in STEM undergraduate classrooms can be viewed through the lenses of constructive alignment (Biggs, 1996 ) and self-regulated learning (Zimmerman, 2002 ). The theory of constructivism posits that students learn by constructing their own understanding of knowledge rather than acquiring the meaning from their instructor (Bodner, 1986 ), and constructive alignment extends the constructivist model to consider how the alignment between a course’s intended learning outcomes, tasks, and assessments affects the knowledge and skills that students develop (Biggs, 2003 ). Students are more likely to develop the intended knowledge and skills if there is alignment between the instructor’s intended learning outcomes that are stated at the beginning of a course, the tasks that the instructor and students perform, and the assessment strategies that the instructor uses (Biggs, 1996 , 2003 , 2014 ). The nature of the tasks and assessments indicates what the instructor values and where students should focus their effort when studying. According to Biggs ( 2003 ) and Ramsden ( 1997 ), students see assessments as defining what they should learn, and a misalignment between the outcomes, tasks, and assessments may hinder students from achieving the intended learning outcomes. In the case of this work, the intended outcomes are improved process skills. In addition to aligning the components of a course, it is also critical that students receive feedback on their performance in order to improve their skills. Zimmerman’s theory of self-regulated learning (Zimmerman, 2002 ) provides a rationale for tailoring assessments to provide feedback to both students and instructors.

Zimmerman’s theory of self-regulated learning defines three phases of learning: forethought/planning, performance, and self-reflection. According to Zimmerman, individuals ideally should progress through these three phases in a cycle: they plan a task, perform the task, and reflect on their performance, then they restart the cycle on a new task. If a student is unable to adequately progress through the phases of self-regulated learning on their own, then feedback provided by an instructor may enable the students to do so (Butler & Winne, 1995 ). Thus, one of our criteria when creating rubrics to assess process skills was to make the rubrics suitable for faculty members to use to provide feedback to their students. Additionally, instructors can use the results from assessments to give themselves feedback regarding their students’ learning in order to regulate their teaching. This theory is called self-regulated learning because the goal is for learners to ultimately reflect on their actions to find ways to improve. We assert that, ideally, both students and instructors should be “learners” and use assessment data to reflect on their actions, although with different aims. Students need consistent feedback from an instructor and/or self-assessment throughout a course to provide a benchmark for their current performance and identify what they can do to improve their process skills (Black & Wiliam, 1998 ; Butler & Winne, 1995 ; Hattie & Gan, 2011 ; Nicol & Macfarlane-Dick, 2006 ). Instructors need feedback on the extent to which their efforts are achieving their intended goals in order to improve their instruction and better facilitate the development of process skills through course experiences.

In accordance with the aforementioned theoretical frameworks, tools used to assess undergraduate STEM student process skills should be tailored to fit the outcomes that are expected for undergraduate students and be able to provide formative assessment and feedback to both students and faculty about the students’ skills. These tools should also be designed for everyday classroom use to enable students to regularly self-assess and faculty to provide consistent feedback throughout a semester. Additionally, it is desirable for assessment tools to be broadly generalizable to measure process skills in multiple STEM disciplines and institutions in order to increase the rubrics’ impact on student learning. Current tools exist to assess these process skills, but they each lack at least one of the desired characteristics for providing regular feedback to STEM students.

Current tools to assess process skills

Current tests available to assess critical thinking include the Critical Thinking Assessment Test (CAT) (Stein & Haynes, 2011), California Critical Thinking Skills Test (Facione, 1990a, 1990b), and Watson Glaser Critical Thinking Appraisal (Watson & Glaser, 1964). These commercially available, multiple-choice tests are not designed to provide regular, formative feedback throughout a course and have not been implemented for this purpose. Instead, they are designed to provide summative feedback with a focus on assessing this skill at a programmatic or university level rather than for use in the classroom to provide formative feedback to students. Rather than using tests to assess process skills, rubrics could be used instead. Rubrics are effective assessment tools because they can be quick and easy to use, they provide feedback to both students and instructors, and they can evaluate individual aspects of a skill to give more specific feedback (Brookhart & Chen, 2014; Smit & Birri, 2014). Rubrics for assessing critical thinking are available, but they have not been used to provide feedback to undergraduate STEM students nor were they designed to do so (Association of American Colleges and Universities, 2019; Saxton et al., 2012). The Critical Thinking Analytic Rubric is designed specifically to assess K-12 students to enhance college readiness and has not been broadly tested in collegiate STEM courses (Saxton et al., 2012). The critical thinking rubric developed by the Association of American Colleges and Universities (AAC&U) as part of its Valid Assessment of Learning in Undergraduate Education (VALUE) Institute and Liberal Education and America’s Promise (LEAP) initiative (Association of American Colleges and Universities, 2019) is intended for programmatic assessment rather than specifically giving feedback to students throughout a course. As with tests for assessing critical thinking, current rubrics to assess critical thinking are not designed to act as formative assessments and give feedback to STEM faculty and undergraduates at the course or task level. Another issue with the assessment of critical thinking is the degree to which the construct is measurable. A National Research Council report (National Research Council, 2011) has suggested that there is little evidence of a consistent, measurable definition for critical thinking and that it may not be different from one’s general cognitive ability. Despite this issue, we have found that critical thinking is consistently listed as a programmatic outcome in STEM disciplines (American Chemical Society Committee on Professional Training, 2015; The Royal Society, 2014), so we argue that it is necessary to support instructors as they attempt to assess this skill.

Current methods for evaluating students’ information processing include discipline-specific tools such as a rubric to assess physics students’ use of graphs and equations to solve work-energy problems (Nguyen et al., 2010 ) and assessments of organic chemistry students’ ability to “[manipulate] and [translate] between various representational forms” including 2D and 3D representations of chemical structures (Kumi et al., 2013 ). Although these assessment tools can be effectively used for their intended context, they were not designed for use in a wide range of STEM disciplines or for a variety of tasks.

Despite the many tools that exist to measure process skills, none has been designed and tested to facilitate frequent, formative feedback to STEM undergraduate students and faculty throughout a semester. The rubrics described here have been designed by the Enhancing Learning by Improving Process Skills in STEM (ELIPSS) Project (Cole et al., 2016 ) to assess undergraduate STEM students’ process skills and to facilitate feedback at the classroom level with the potential to track growth throughout a semester or degree program. The rubrics described here are designed to assess critical thinking and information processing in student written work. Rubrics were chosen as the format for our process skill assessment tools because the highest level of each category in rubrics can serve as an explicit learning outcome that the student is expected to achieve (Panadero & Jonsson, 2013 ). Rubrics that are generalizable to multiple disciplines and institutions can enable the assessment of student learning outcomes and active learning pedagogies throughout a program of study and provide useful tools for a greater number of potential users.
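To make the rubric format described above more concrete, the following sketch is a minimal, hypothetical Python representation of a single rubric category with six performance levels (scores of zero through five), in which the top-level description doubles as the explicit learning outcome and lower levels supply formative feedback. The class name and the level wording are illustrative assumptions, not the ELIPSS rubric text or the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class RubricCategory:
    """One category of a process-skill rubric with six performance levels (0-5).

    The description at level 5 doubles as the explicit learning outcome
    that students are expected to achieve.
    """
    name: str
    level_descriptions: dict[int, str]  # keys 0 through 5

    def learning_outcome(self) -> str:
        # The highest level serves as the intended learning outcome.
        return self.level_descriptions[5]

    def feedback(self, score: int) -> str:
        # Return the level description as formative feedback for a given score.
        if score not in self.level_descriptions:
            raise ValueError("Score must be between 0 and 5.")
        return f"{self.name} ({score}/5): {self.level_descriptions[score]}"

# Hypothetical, abbreviated category loosely inspired by the Evaluating category
# discussed later in the paper; the wording below is illustrative only.
evaluating = RubricCategory(
    name="Evaluating",
    level_descriptions={
        0: "No evidence of evaluating information for relevance.",
        1: "Uses mostly irrelevant information; omits most relevant information.",
        2: "Identifies some relevant information but includes unnecessary information.",
        3: "Identifies most relevant information with minor omissions.",
        4: "Identifies all relevant information; rarely includes unnecessary information.",
        5: "Identifies all relevant information and indicates why it is relevant.",
    },
)

print(evaluating.learning_outcome())
print(evaluating.feedback(3))
```

A structure like this also suggests why analytic rubrics support formative assessment: each category can be scored and reported separately, so feedback can target specific aspects of a skill rather than a single holistic grade.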

Research questions

This work sought to answer the following research questions for each rubric:

Does the rubric adequately measure relevant aspects of the skill?

How well can the rubrics provide feedback to instructors and students?

Can multiple raters use the rubrics to give consistent scores?

This work received Institutional Review Board approval prior to any data collection involving human subjects. The sources of data used to construct the process skill rubrics and answer these research questions were (1) peer-reviewed literature on how each skill is defined, (2) feedback from content experts in multiple STEM disciplines via surveys and in-person, group discussions regarding the appropriateness of the rubrics for each discipline, (3) interviews with students whose work was scored with the rubrics and teaching assistants who scored the student work, and (4) results of applying the rubrics to samples of student work.

Defining the scope of the rubrics

The rubrics described here and the other rubrics in development by the ELIPSS Project are intended to measure process skills, which are desired learning outcomes identified by the STEM community in recent reports (National Research Council, 2012 ; Singer et al., 2012 ). In order to measure these skills in multiple STEM disciplines, operationalized definitions of each skill were needed. These definitions specify which aspects of student work (operations) would be considered evidence for the student using that skill and establish a shared understanding of each skill by members of each STEM discipline. The starting point for this work was the process skill definitions developed as part of the POGIL project (Cole et al., 2019a ). The POGIL community includes instructors from a variety of disciplines and institutions and represented the intended audience for the rubrics: faculty who value process skills and want to more explicitly assess them. The process skills discussed in this work were defined as follows:

Critical thinking is analyzing, evaluating, or synthesizing relevant information to form an argument or reach a conclusion supported with evidence.

Information processing is evaluating, interpreting, and manipulating or transforming information.

Examples of critical thinking include the tasks that students are asked to perform in a laboratory course. When students are asked to analyze the data they collected, combine data from different sources, and generate arguments or conclusions about their data, we see this as critical thinking. However, when students simply follow the so-called “cookbook” laboratory instructions that require them to confirm pre-determined conclusions, we do not think students are engaging in critical thinking. One example of information processing is when organic chemistry students are required to re-draw molecules in different formats. The students must evaluate and interpret various pieces of one representation, and then they recreate the molecule in another representation. However, if students are asked to simply memorize facts or algorithms to solve problems, we do not see this as information processing.

Iterative rubric development

The development process was the same for the information processing rubric and the critical thinking rubric. After defining the scope of the rubric, an initial version was drafted based upon the definition of the target process skill and how each aspect of the skill is defined in the literature. A more detailed discussion of the literature that informed each rubric category is included in the “Results and Discussion” section. This initial version then underwent iterative testing in which the rubric was reviewed by researchers, practitioners, and students. The rubric was first evaluated by the authors and a group of eight faculty from multiple STEM disciplines who made up the ELIPSS Project’s primary collaborative team (PCT). The PCT was a group of faculty members with experience in discipline-based education research who employ active-learning pedagogies in their classrooms. This initial round of evaluation was intended to ensure that the rubric measured relevant aspects of the skill and was appropriate for each PCT member’s discipline. This evaluation, which included both in-person and email discussions that continued until the group reached consensus that each rubric category could be applied to student work in courses within their disciplines, determined how well the rubrics aligned with each instructor’s understanding of the process skill. There has been an ongoing debate regarding the role of disciplinary knowledge in critical thinking and the extent to which critical thinking is subject-specific (Davies, 2013; Ennis, 1990). This work focuses on the creation of rubrics to measure process skills in different domains, but we have not performed cross-discipline comparisons. This initial round of review was also intended to ensure that the rubrics were ready for classroom testing by instructors in each discipline. Next, each rubric was tested over three semesters in multiple classroom environments, illustrated in Table 1. The rubrics were applied to student work chosen by each PCT member. The PCT members chose the student work based on their views of how the assignments required students to engage in process skills and show evidence of those skills. The information processing and critical thinking rubrics shown in this work were each tested in at least three disciplines, course levels, and institutions.

After each semester, the feedback was collected from the faculty testing the rubric, and further changes to the rubric were made. Feedback was collected in the form of survey responses along with in-person group discussions at annual project meetings. After the first iteration of completing the survey, the PCT members met with the authors to discuss how they were interpreting each survey question. This meeting helped ensure that the surveys were gathering valid data regarding how well the rubrics were measuring the desired process skill. Questions in the survey such as “What aspects of the student work provided evidence for the indicated process skill?” and “Are there edits to the rubric/descriptors that would improve your ability to assess the process skill?” allowed the authors to determine how well the rubric scores were matching the student work and identify necessary changes to the rubric. Further questions asked about the nature and timing of the feedback given to students in order to address the question of how well the rubrics provide feedback to instructors and students. The survey questions are included in the Supporting Information . The survey responses were analyzed qualitatively to determine themes related to each research question.

In addition to the surveys given to faculty rubric testers, twelve students were interviewed in fall 2016 and fall 2017. In the United States of America, the fall semester typically runs from August to December and is the first semester of the academic year. Each student participated in one interview which lasted about 30 min. These interviews were intended to gather further data to answer questions about how well the rubrics were measuring the identified process skills that students were using when they completed their assignments and to ensure that the information provided by the rubrics made sense to students. The protocol for these interviews is included in the Supporting Information . In fall 2016, the students interviewed were enrolled in an organic chemistry laboratory course for non-majors at a large, research-intensive university in the United States. Thirty students agreed to have their work analyzed by the research team, and nine students were interviewed. However, the rubrics were not a component of the laboratory course grading. Instead, the first author assessed the students’ reports for critical thinking and information processing, and then the students were provided electronic copies of their laboratory reports and scored rubrics in advance of the interview. The first author had recently been a graduate teaching assistant for the course and was familiar with the instructor’s expectations for the laboratory reports. During the interview, the students were given time to review their reports and the completed rubrics, and then they were asked about how well they understood the content of the rubrics and how accurately each category score represented their work.

In fall 2017, students enrolled in a physical chemistry thermodynamics course for majors were interviewed. The physical chemistry course took place at the same university as the organic laboratory course, but there was no overlap between participants. Three students and two graduate teaching assistants (GTAs) were interviewed. The course included daily group work, and process skill assessment was an explicit part of the instructor’s curriculum. At the end of each class period, students assessed their groups using portions of ELIPSS rubrics, including the two process skill rubrics included in this paper. About every 2 weeks, the GTAs assessed the student groups with a complete ELIPSS rubric for a particular skill, then gave the groups their scored rubrics with written comments. The students’ individual homework problem sets were assessed once with rubrics for three skills: critical thinking, information processing, and problem-solving. The students received the scored rubric with written comments when the graded problem set was returned to them. In the last third of the semester, the students and GTAs were interviewed about how rubrics were implemented in the course, how well the rubric scores reflected the students’ written work, and how the use of rubrics affected the teaching assistants’ ability to assess the student skills. The protocols for these interviews are included in the Supporting Information .

Gathering evidence for utility, validity, and reliability

The utility, validity, and reliability of the rubrics were measured throughout the development process. Utility is the degree to which the rubrics are perceived as practical by experts and practitioners in the field. Through multiple meetings, the PCT faculty determined that early drafts of the rubric seemed appropriate for use in their classrooms, which represented multiple STEM disciplines. Rubric utility was reexamined multiple times throughout the development process to ensure that the rubrics would remain practical for classroom use. Validity can be defined in multiple ways. For example, the Standards for Educational and Psychological Testing (Joint Committee on Standards for Educational Psychological Testing, 2014) defines validity as “the degree to which all the accumulated evidence supports the intended interpretation of test scores for the proposed use.” For the purposes of this work, we drew on the ways in which two distinct types of validity were examined in the rubric literature: content validity and construct validity. Content validity is the degree to which the rubrics cover relevant aspects of each process skill (Moskal & Leydens, 2000). In this case, the process skill definition and a review of the literature determined which categories were included in each rubric. The literature review was finished once the data were saturated, that is, when no new aspects were found. Construct validity is the degree to which the levels of each rubric category accurately reflect the process that students performed (Moskal & Leydens, 2000). Evidence of construct validity was gathered via the faculty surveys, teaching assistant interviews, and student interviews. In the student interviews, students were given one of their completed assignments and asked to explain how they completed the task. Students were then asked to explain how well each category applied to their work and if any changes were needed to the rubric to more accurately reflect their process. Due to logistical challenges, we were not able to obtain evidence for convergent validity, and this is further discussed in the “Limitations” section.

Adjacent agreement, also known as “interrater agreement within one,” was chosen as the measure of interrater reliability due to its common use in rubric development projects (Jonsson & Svingby, 2007 ). The adjacent agreement is the percentage of cases in which two raters agree on a rating or are different by one level (i.e., they give adjacent ratings to the same work). Jonsson and Svingby ( 2007 ) found that most of the rubrics they reviewed had adjacent agreement scores of 90% or greater. However, they noted that the agreement threshold varied based on the number of possible levels of performance for each category in the rubric, with three and four being the most common numbers of levels. Since the rubrics discussed in this report have six levels (scores of zero through five) and are intended for low-stakes assessment and feedback, the goal of 80% adjacent agreement was selected. To calculate agreement for the critical thinking and information processing rubrics, two researchers discussed the scoring criteria for each rubric and then independently assessed the organic chemistry laboratory reports.
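As a concrete illustration of how the two agreement measures described above can be computed, the sketch below calculates exact and adjacent ("within one") agreement between two raters' scores on a 0-5 rubric category. The scores and function names are hypothetical examples for illustration; this is not the study's analysis code.

```python
from typing import Sequence

def exact_agreement(rater_a: Sequence[int], rater_b: Sequence[int]) -> float:
    """Fraction of cases in which the two raters give identical scores."""
    assert len(rater_a) == len(rater_b)
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

def adjacent_agreement(rater_a: Sequence[int], rater_b: Sequence[int],
                       tolerance: int = 1) -> float:
    """Fraction of cases in which the raters' scores differ by at most `tolerance` levels."""
    assert len(rater_a) == len(rater_b)
    matches = sum(abs(a - b) <= tolerance for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

# Hypothetical scores (0-5 scale) from two raters for ten laboratory reports
# on a single rubric category.
scores_rater_1 = [5, 4, 3, 5, 2, 4, 4, 3, 5, 1]
scores_rater_2 = [5, 3, 3, 4, 2, 4, 5, 3, 5, 2]

print(f"Exact agreement:    {exact_agreement(scores_rater_1, scores_rater_2):.0%}")    # 60%
print(f"Adjacent agreement: {adjacent_agreement(scores_rater_1, scores_rater_2):.0%}") # 100%
```

As the example output shows, adjacent agreement is necessarily at least as high as exact agreement, which is why thresholds for it (such as the 80% goal used here) depend on the number of levels in the rubric.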

Results and discussion

The process skill rubrics to assess critical thinking and information processing in student written work were completed after multiple rounds of revision based on feedback from various sources. These sources include feedback from instructors who tested the rubrics in their classrooms, TAs who scored student work with the rubrics, and students who were assessed with the rubrics. The categories for each rubric will be discussed in terms of the evidence that the rubrics measure the relevant aspects of the skill and how they can be used to assess STEM undergraduate student work. Each category discussion will begin with a general explanation of the category followed by more specific examples from the organic chemistry laboratory course and physical chemistry lecture course to demonstrate how the rubrics can be used to assess student work.

Information processing rubric

The definition of information processing and the focus of the rubric presented here (Fig. 1 ) are distinct from cognitive information processing as defined by the educational psychology literature (Driscoll, 2005 ). The rubric shown here is more aligned with the STEM education construct of representational competency (Daniel et al., 2018 ).

Figure 1. Rubric for assessing information processing

Evaluating

When solving a problem or completing a task, students must evaluate the provided information for relevance or importance to the task (Hanson, 2008 ; Swanson et al., 1990 ). All the information provided in a prompt (e.g., homework or exam questions) may not be relevant for addressing all parts of the prompt. Students should ideally show evidence of their evaluation process by identifying what information is present in the prompt/model, indicating what information is relevant or not relevant, and indicating why information is relevant. Responses with these characteristics would earn high rubric scores for this category. Although students may not explicitly state what information is necessary to address a task, the information they do use can act as indirect evidence of the degree to which they have evaluated all of the available information in the prompt. Evidence for students inaccurately evaluating information for relevance includes the inclusion of irrelevant information or the omission of relevant information in an analysis or in completing a task. When evaluating the organic chemistry laboratory reports, the focus for the evaluating category was the information students presented when identifying the chemical structure of their products. For students who received a high score, this information included their measured value for the product’s melting point, the literature (expected) value for the melting point, and the peaks in a nuclear magnetic resonance (NMR) spectrum. NMR spectroscopy is a commonly used technique in chemistry to obtain structural information about a compound. Lower scores were given if students omitted any of the necessary information or if they included unnecessary information. For example, if a student discussed their reaction yield when discussing the identity of their product, they would receive a low Evaluating score because the yield does not help them determine the identity of their product; the yield, in this case, would be unnecessary information. In the physical chemistry course, students often did not show evidence that they determined which information was relevant to answer the homework questions and thus earned low evaluating scores. These omissions will be further addressed in the “Interpreting” section.

Interpreting

In addition to evaluating, students must often interpret information using their prior knowledge to explain the meaning of something, make inferences, match data to predictions, and extract patterns from data (Hanson, 2008; Nakhleh, 1992; Schmidt et al., 1989; Swanson et al., 1990). Students earn high scores for this category if they assign correct meaning to labeled information (e.g., text, tables, graphs, diagrams), extract specific details from information, explain information in their own words, and determine patterns in information. For the organic chemistry laboratory reports, students received high scores if they accurately interpreted their measured values and NMR peaks. Almost every student obtained melting point values that were different from what was expected due to measurement error or impurities in their products, so they needed to describe what types of impurities could cause such discrepancies. Also, each NMR spectrum contained one peak that corresponded to the solvent used to dissolve the students’ product, so the students needed to use their prior knowledge of NMR spectroscopy to recognize that this peak did not correspond to part of their product.

In physical chemistry, the graduate teaching assistant often gave students low scores for inaccurately explaining changes to chemical systems such as changes in pressure or entropy. The graduate teaching assistant who assessed the student work used the rubric to identify both the evaluating and interpreting categories as weaknesses in many of the students’ homework submissions. However, the students often earned high scores for the manipulating and transforming categories, so the GTA was able to give students specific feedback on their areas for improvement while also highlighting their strengths.

Manipulating and transforming (extent and accuracy)

In addition to evaluating and interpreting information, students may be asked to manipulate and transform information from one form to another. These transformations should be complete and accurate (Kumi et al., 2013 ; Nguyen et al., 2010 ). Students may be required to construct a figure based on written information, or conversely, they may transform information in a figure into words or mathematical expressions. Two categories for manipulating and transforming (i.e., extent and accuracy) were included to allow instructors to give more specific feedback. It was often found that students would either transform little information but do so accurately, or transform much information and do so inaccurately; the two categories allowed for differentiated feedback to be provided. As stated above, the organic chemistry students were expected to transform their NMR spectral data into a table and provide a labeled structure of their final product. Students were given high scores if they converted all of the relevant peaks from their spectrum into the table format and were able to correctly match the peaks to the hydrogen atoms in their products. Students received lower scores if they were only able to convert the information for a few peaks or if they incorrectly matched the peaks to the hydrogen atoms.

Critical thinking rubric

Critical thinking can be broadly defined in different contexts, but we found that the categories included in the rubric (Fig. 2 ) represented commonly accepted aspects of critical thinking (Danczak et al., 2017 ) and suited the needs of the faculty collaborators who tested the rubric in their classrooms.

Figure 2. Rubric for assessing critical thinking

Evaluating

When completing a task, students must evaluate the relevance of information that they will ultimately use to support a claim or conclusions (Miri et al., 2007; Zohar et al., 1994). An evaluating category is included in both the critical thinking and information processing rubrics because evaluation is a key aspect of both skills. From our previous work developing a problem-solving rubric (manuscript in preparation) and our review of the literature for this work (Danczak et al., 2017; Lewis & Smith, 1993), we observed overlap between information processing, critical thinking, and problem-solving. Additionally, while the Evaluating category in the information processing rubric assesses a student’s ability to determine the importance of information to complete a task, the evaluating category in the critical thinking rubric places a heavier emphasis on using the information to support a conclusion or argument.

When scoring student work with the evaluating category, students receive high scores if they indicate what information is likely to be most relevant to the argument they need to make, determine the reliability of the source of their information, and determine the quality and accuracy of the information itself. The information used to assess this category can be indirect as with the Evaluating category in the information processing rubric. In the organic chemistry laboratory reports, students needed to make an argument about whether they successfully produced the desired product, so they needed to discuss which information was relevant to their claims about the product’s identity and purity. Students received high scores for the evaluating category when they accurately determined that the melting point and nearly all peaks except the solvent peak in the NMR spectrum indicated the identity of their product. Students received lower scores for evaluating when they left out relevant information because this was seen as evidence that the student inaccurately evaluated the information’s relevance in supporting their conclusion. They also received lower scores when they incorrectly stated that a high yield indicated a pure product. Students were given the opportunity to demonstrate their ability to evaluate the quality of information when discussing their melting point. Students sometimes struggled to obtain reliable melting point data due to their inexperience in the laboratory, so the rubric provided a way to assess the student’s ability to critique their own data.

Analyzing

In tandem with evaluating information, students also need to analyze that same information to extract meaningful evidence to support their conclusions (Bailin, 2002; Lai, 2011; Miri et al., 2007). The analyzing category provides an assessment of a student’s ability to discuss information and explore the possible meaning of that information, extract patterns from data/information that could be used as evidence for their claims, and summarize information that could be used as evidence. For example, in the organic chemistry laboratory reports, students needed to compare the information they obtained to the expected values for a product. Students received high scores for the analyzing category if they could extract meaningful structural information from the NMR spectrum and their two melting points (observed and expected) for each reaction step.

Synthesizing

Often, students are asked to synthesize or connect multiple pieces of information in order to draw a conclusion or make a claim (Huitt, 1998 ; Lai, 2011 ). Synthesizing involves identifying the relationships between different pieces of information or concepts, identifying ways that different pieces of information or concepts can be combined, and explaining how the newly synthesized information can be used to reach a conclusion and/or support an argument. While performing the organic chemistry laboratory experiments, students obtained multiple types of information such as the melting point and NMR spectrum in addition to other spectroscopic data such as an infrared (IR) spectrum. Students received high scores for this category when they accurately synthesized these multiple data types by showing how the NMR and IR spectra could each reveal different parts of a molecule in order to determine the molecule’s entire structure.

Forming arguments (structure and validity)

The final key aspect of critical thinking is forming a well-structured and valid argument (Facione, 1984 ; Glassner & Schwarz, 2007 ; Lai, 2011 ; Lewis & Smith, 1993 ). It was observed that students can earn high scores for evaluating, analyzing, and synthesizing, but still struggle to form arguments. This was particularly common in assessing problem sets in the physical chemistry course.

As with the manipulating and transforming categories in the information processing rubric, two forming arguments categories were included to allow instructors to give more specific feedback. Some students may be able to include all of the expected structural elements of their arguments but use faulty information or reasoning. Conversely, some students may be able to make scientifically valid claims but not necessarily support them with evidence. The two forming arguments categories are intended to accurately assess both of these scenarios. For the forming arguments (structure) category, students earn high scores if they explicitly state their claim or conclusion, list the evidence used to support the argument, and provide reasoning to link the evidence to their claim/conclusion. Students who do not make a claim or who provide little evidence or reasoning receive lower scores.

For the forming arguments (validity) category, students earn high scores if their claim is accurate and their reasoning is logical and clearly supports the claim with provided evidence. Organic chemistry students earned high scores for the forms and supports arguments categories if they made explicit claims about the identity and purity of their product and provided complete and accurate evidence for their claim(s) such as the melting point values and positions of NMR peaks that correspond to their product. Additionally, the students provided evidence for the purity of their products by pointing to the presence or absence of peaks in their NMR spectrum that would match other potential side products. They also needed to provide logical reasoning for why the peaks indicated the presence or absence of a compound. As previously mentioned, the physical chemistry students received lower scores for the forming arguments categories than for the other aspects of critical thinking. These students were asked to make claims about the relationships between entropy and heat and then provide relevant evidence to justify these claims. Often, the students would make clearly articulated claims but would provide little evidence to support them. As with the information processing rubric, the critical thinking rubric allowed the GTAs to assess aspects of these skills independently and identify specific areas for student improvement.

Validity and reliability

The goal of this work was to create rubrics that can accurately assess student work (validity) and be consistently implemented by instructors or researchers within multiple STEM fields (reliability). The evidence for validity includes the alignment of the rubrics with literature-based descriptions of each skill, review of the rubrics by content experts from multiple STEM disciplines, interviews with undergraduate students whose work was scored using the rubrics, and interviews of the GTAs who scored the student work.

The definitions for each skill, along with multiple iterations of the rubrics, underwent review by STEM content experts. As noted earlier, the instructors who were testing the rubrics were given a survey at the end of each semester and were invited to offer suggested changes to the rubric to better help them assess their students. After multiple rubric revisions, survey responses from the instructors indicated that the rubrics accurately represented the breadth of each process skill as seen in each expert’s content area and that each category could be used to measure multiple levels of student work. By the end of the rubrics’ development, instructors were writing responses such as “N/A” or “no suggestions” to indicate that the rubrics did not need further changes.

Faculty responses to the survey item “What aspects of the student work provided evidence for the indicated process skill?” also indicated that the rubrics were measuring the intended constructs. For example, one instructor noted that for information processing, she saw evidence of the manipulating and transforming categories when “students had to transform their written/mathematical relationships into an energy diagram.” Another instructor elicited evidence of information processing during an in-class group quiz: “A question on the group quiz was written to illicit [sic] IP [information processing]. Students had to transform a structure into three new structures and then interpret/manipulate the structures to compare the pKa values [acidity] of the new structures.” For this instructor, the structures written by the students revealed evidence of their information processing by showing what information they omitted in the new structures or inaccurately transformed. For critical thinking, an instructor assessed short research reports with the critical thinking rubric and “looked for [the students’] ability to use evidence to support their conclusions, to evaluate the literature studies, and to develop their own judgements by synthesizing the information.” Another instructor used the critical thinking rubric to assess their students’ abilities to choose an instrument to perform a chemical analysis. According to the instructor, the students provided evidence of their critical thinking because “in their papers, they needed to justify their choice of instrument. This justification required them to evaluate information and synthesize a new understanding for this specific chemical analysis.”

Analysis of student work indicates multiple levels of achievement for each rubric category (illustrated in Fig. 3 ), although there may have been a ceiling effect for the evaluating and the manipulating and transforming (extent) categories in information processing for organic chemistry laboratory reports because many students earned the highest possible score (five) for those categories. However, other implementations of the ELIPSS rubrics (Reynders et al., 2019 ) have shown more variation in student scores for the two process skills.

Figure 3. Student rubric scores from an organic chemistry laboratory course. The two rubrics were used to evaluate different laboratory reports. Thirty students were assessed for information processing and 28 were assessed for critical thinking.
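To illustrate how rubric scores might be tabulated to reveal a ceiling effect of the kind noted above, the sketch below counts how many students earned each level of one category and prints a simple text histogram. The scores and function name are hypothetical and do not reproduce the data in Fig. 3.

```python
from collections import Counter

def score_distribution(scores: list[int], levels: range = range(6)) -> dict[int, int]:
    """Count how many students earned each rubric level (0-5) for one category."""
    counts = Counter(scores)
    return {level: counts.get(level, 0) for level in levels}

# Hypothetical Evaluating scores for 30 laboratory reports (0-5 scale).
evaluating_scores = [5] * 21 + [4] * 5 + [3] * 3 + [2]

for level, count in score_distribution(evaluating_scores).items():
    print(f"Level {level}: {'#' * count} ({count})")

# A pile-up at level 5, as in the made-up distribution above, would suggest a
# ceiling effect: the category may not discriminate among the strongest students.
```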

To provide further evidence that the rubrics were measuring the intended skills, students in the physical chemistry course were interviewed about their thought processes and how well the rubric scores reflected the work they performed. During these interviews, students described how they used various aspects of information processing and critical thinking skills. The students first described how they used information processing during a problem set where they had to answer questions about a diagram of systolic and diastolic blood pressure. Students described how they evaluated and interpreted the graph to make statements such as “diastolic [pressure] is our y-intercept” and “volume is the independent variable.” The students then demonstrated their ability to transform information from one form to another, from a graph to a mathematical equation, by recognizing “it’s a linear relationship so I used Y equals M X plus B” and “integrated it cause it’s the change, the change in V [volume].” For critical thinking, students described their process on a different problem set. In this problem set, the students had to explain why the change of Helmholtz energy and the change in Gibbs free energy were equivalent under a certain given condition. Students first demonstrated how they evaluated the relevant information and analyzed what would and would not change in their system. One student said, “So to calculate the final pressure, I think I just immediately went to the ideal gas law because we know the final volume and the number of moles won’t change and neither will the temperature in this case. Well, I assume that it wouldn’t.” Another student showed evidence of their evaluation by writing out all the necessary information in one place and stating, “Whenever I do these types of problems, I always write what I start with which is why I always have this line of information I’m given.” After evaluating and analyzing, students had to form an argument by claiming that the two energy values were equal and then defending that claim. Students explained that they were not always as clear as they could be when justifying their claim. For instance, one student said, “Usually I just write out equations and then hope people understand what I’m doing mathematically” but they “probably could have explained it a little more.”

Student feedback throughout the organic chemistry course and near the end of the physical chemistry course indicated that the rubric scores were accurate representations of the students’ work with a few exceptions. For example, some students felt like they should have received either a lower or higher score for certain categories, but they did say that the categories themselves applied well to their work. Most notably, one student reported that the forms and supports arguments categories in the critical thinking rubric did not apply to her work because she “wasn’t making an argument” when she was demonstrating that the Helmholtz and Gibbs energy values were equal in her thermodynamics assignment. We see this as an instance where some students and instructors may define argument in different ways. The process skill definitions and the rubric categories are meant to articulate intended learning outcomes from faculty members to their students, so if a student defines the skills or categories differently than the faculty member, then the rubrics can serve to promote a shared understanding of the skill.

As previously mentioned, reliability was measured by two researchers assessing ten laboratory reports independently to ensure that multiple raters could use the rubrics consistently. The average adjacent agreement scores were 92% for critical thinking and 93% for information processing. The exact agreement scores were 86% for critical thinking and 88% for information processing. Additionally, two different raters assessed a statistics assignment that was given to sixteen first-year undergraduates. The average pairwise adjacent agreement scores were 89% for critical thinking and 92% for information processing for this assignment. However, the exact agreement scores were much lower: 34% for critical thinking and 36% for information processing. In this case, neither rater was an expert in the content area. While the exact agreement scores for the statistics assignment are much lower than desirable, the adjacent agreement scores do meet the threshold for reliability as seen in other rubrics (Jonsson & Svingby, 2007 ) despite the disparity in expertise. Based on these results, it may be difficult for multiple raters to give exactly the same scores to the same work if they have varying levels of content knowledge, but it is important to note that the rubrics are primarily intended for formative assessment that can facilitate discussions between instructors and students about the ways for students to improve. The high level of adjacent agreement scores indicates that multiple raters can identify the same areas to improve in examples of student work.

Instructor and teaching assistant reflections

The survey responses from faculty members determined the utility of the rubrics. Faculty members reported that when they used the rubrics to define their expectations and be more specific about their assessment criteria, the students seemed to be better able to articulate the areas in which they needed improvement. As one instructor put it, “having the rubrics helped open conversations and discussions” that were not happening before the rubrics were implemented. We see this as evidence of the clear intended learning outcomes that are an integral aspect of achieving constructive alignment within a course. The instructors’ specific feedback to the students, and the students’ increased awareness of their areas for improvement, may enable the students to better regulate their learning throughout a course. Additionally, the survey responses indicated that the faculty members were changing their teaching practices and becoming more cognizant of how assignments did or did not elicit the process skill evidence that they desired. After using the rubrics, one instructor said, “I realize I need to revise many of my activities to more thoughtfully induce process skill development.” We see this as evidence that the faculty members were using the rubrics to regulate their teaching by reflecting on the outcomes of their practices and then planning for future teaching. These activities represent the reflection and forethought/planning aspects of self-regulated learning on the part of the instructors. Graduate teaching assistants in the physical chemistry course indicated that the rubrics gave them a way to clarify the instructor’s expectations when they were interacting with the students. As one GTA said, “It’s giving [the students] feedback on direct work that they have instead of just right or wrong. It helps them to understand like ‘Okay how can I improve? What areas am I lacking in?’” A more detailed account of how the instructors and teaching assistants implemented the rubrics has been reported elsewhere (Cole et al., 2019a ).

Student reflections

Students in both the organic and physical chemistry courses reported that they could use the rubrics to engage in the three phases of self-regulated learning: forethought/planning, performing, and reflecting. In an organic chemistry interview, one student was discussing how they could improve their low score for the synthesizing category of critical thinking by saying “I could use the data together instead of trying to use them separately,” thus demonstrating forethought/planning for their later work. Another student described how they could use the rubric while performing a task: “I could go through [the rubric] as I’m writing a report…and self-grade.” Finally, one student demonstrated how they could use the rubrics to reflect on their areas for improvement by saying that “When you have the five column [earn a score of five], I can understand that I’m doing something right” but “I really need to work on revising my reports.” We see this as evidence that students can use the rubrics to regulate their own learning, although classroom facilitation can have an effect on the ways in which students use the rubric feedback (Cole et al., 2019b ).

Limitations

The process skill definitions presented here represent a consensus understanding among members of the POGIL community and the instructors who participated in this study, but these skills are often defined in multiple ways by various STEM instructors, employers, and students (Danczak et al., 2017 ). One issue with critical thinking, in particular, is the broadness of how the skill is defined in the literature. Through this work, we have evidence via expert review to indicate that our definitions represent common understandings among a set of STEM faculty. Nonetheless, we cannot claim that all STEM instructors or researchers will share the skill definitions presented here.

There is currently a debate in the STEM literature (National Research Council, 2011) about whether the critical thinking construct is domain-general or domain-specific, that is, whether or not one’s critical thinking ability in one discipline can be applied to another discipline. We cannot make claims about the generality of the construct based on the data presented here because the same students were not tested across multiple disciplines or courses. Additionally, we did not gather evidence for convergent validity, which is “the degree to which an operationalized construct is similar to other operationalized constructs that it theoretically should be similar to” (National Research Council, 2011). In other words, evidence for convergent validity would be the comparison of multiple measures of information processing or critical thinking. However, none of the instructors who used the ELIPSS rubrics also used a secondary measure of the constructs. Although the rubrics were examined by a multidisciplinary group of collaborators, this group was primarily chemists and included eight faculty members from other disciplines, so the content validity of the rubrics may be somewhat limited.

Finally, the generalizability of the rubrics is limited by the relatively small number of students who were interviewed about their work. During their interviews, the students in the organic and physical chemistry courses each said that they could use the rubric scores as feedback to improve their skills. Additionally, as discussed in the “Validity and Reliability” section, the processes described by the students aligned with the content of the rubric and provided evidence of the rubric scores’ validity. However, the data gathered from the student interviews only represents the views of a subset of students in the courses, and further study is needed to determine the most appropriate contexts in which the rubrics can be implemented.

Conclusions and implications

Two rubrics were developed to assess and provide feedback on undergraduate STEM students’ critical thinking and information processing. Faculty survey responses indicated that the rubrics measured the relevant aspects of each process skill in the disciplines that were examined. Faculty survey responses, TA interviews, and student interviews over multiple semesters indicated that the rubric scores accurately reflected the evidence of process skills that the instructors wanted to see and the processes that the students performed when they were completing their assignments. The rubrics showed high inter-rater agreement scores, indicating that multiple raters could identify the same areas for improvement in student work.

In terms of constructive alignment, courses should ideally have alignment between their intended learning outcomes, student and instructor activities, and assessments. By using the ELIPSS rubrics, instructors were able to explicitly articulate the intended learning outcomes of their courses to their students. The instructors were then able to assess and provide feedback to students on different aspects of their process skills. Future efforts will be focused on modifying student assignments to enable instructors to better elicit evidence of these skills. In terms of self-regulated learning, students indicated in the interviews that the rubric scores were accurate representations of their work (performances), could help them reflect on their previous work (self-reflection), and the feedback they received could be used to inform their future work (forethought). Not only did the students indicate that the rubrics could help them regulate their learning, but the faculty members indicated that the rubrics had helped them regulate their teaching. With the individual categories on each rubric, the faculty members were better able to observe their students’ strengths and areas for improvement and then tailor their instruction to meet those needs. Our results indicated that the rubrics helped instructors in multiple STEM disciplines and at multiple institutions reflect on their teaching and then make changes to better align their teaching with their desired outcomes.

Overall, the rubrics can be used in a number of different ways to modify courses or for programmatic assessment. As previously stated, instructors can use the rubrics to define expectations for their students and provide them with feedback on desired skills throughout a course. The rubric categories can be used to give feedback on individual aspects of student process skills to provide specific feedback to each student. If an instructor or department wants to change from didactic lecture-based courses to active learning ones, the rubrics can be used to measure non-content learning gains that stem from the adoption of such pedagogies. Although the examples provided here for each rubric were situated in chemistry contexts, the rubrics were tested in multiple disciplines and institution types. The rubrics have the potential for wide applicability to assess not only laboratory reports but also homework assignments, quizzes, and exams. Assessing these tasks provides a way for instructors to achieve constructive alignment between their intended outcomes and their assessments, and the rubrics are intended to enhance this alignment to improve student process skills that are valued in the classroom and beyond.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Abbreviations

AAC&U: American Association of Colleges and Universities

CAT: Critical Thinking Assessment Test

CU: Comprehensive University

ELIPSS: Enhancing Learning by Improving Process Skills in STEM

LEAP: Liberal Education and America’s Promise

NMR: Nuclear Magnetic Resonance

PCT: Primary Collaborative Team

PLTL: Peer-led Team Learning

POGIL: Process Oriented Guided Inquiry Learning

PUI: Primarily Undergraduate Institution

RU: Research University

STEM: Science, Technology, Engineering, and Mathematics

VALUE: Valid Assessment of Learning in Undergraduate Education

ABET Engineering Accreditation Commission. (2012). Criteria for Accrediting Engineering Programs . Retrieved from http://www.abet.org/accreditation/accreditation-criteria/criteria-for-accrediting-engineering-programs-2016-2017/ .

American Chemical Society Committee on Professional Training. (2015). Undergraduate Professional Education in Chemistry: ACS Guidelines and Evaluation Procedures for Bachelor's Degree Programs. Retrieved from https://www.acs.org/content/dam/acsorg/about/governance/committees/training/2015-acs-guidelines-for-bachelors-degree-programs.pdf

Association of American Colleges and Universities. (2019). VALUE Rubric Development Project. Retrieved from https://www.aacu.org/value/rubrics .

Bailin, S. (2002). Critical Thinking and Science Education. Science and Education, 11 , 361–375.


Biggs, J. (1996). Enhancing teaching through constructive alignment. Higher Education, 32 (3), 347–364.

Biggs, J. (2003). Aligning teaching and assessing to course objectives. Teaching and learning in higher education: New trends and innovations, 2 , 13–17.


Biggs, J. (2014). Constructive alignment in university teaching. HERDSA Review of higher education, 1 (1), 5–22.

Black, P., & Wiliam, D. (1998). Assessment and Classroom Learning. Assessment in Education: Principles, Policy & Practice, 5 (1), 7–74.

Bodner, G. M. (1986). Constructivism: A theory of knowledge. Journal of Chemical Education, 63 (10), 873–878.

Brewer, C. A., & Smith, D. (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC: American Association for the Advancement of Science.

Brookhart, S. M., & Chen, F. (2014). The quality and effectiveness of descriptive rubrics. Educational Review , 1–26.

Butler, D. L., & Winne, P. H. (1995). Feedback and Self-Regulated Learning: A Theoretical Synthesis. Review of Educational Research, 65 (3), 245–281.

Cole, R., Lantz, J., & Ruder, S. (2016). Enhancing Learning by Improving Process Skills in STEM. Retrieved from http://www.elipss.com .

Cole, R., Lantz, J., & Ruder, S. (2019a). PO: The Process. In S. R. Simonson (Ed.), POGIL: An Introduction to Process Oriented Guided Inquiry Learning for Those Who Wish to Empower Learners (pp. 42–68). Sterling, VA: Stylus Publishing.

Cole, R., Reynders, G., Ruder, S., Stanford, C., & Lantz, J. (2019b). Constructive Alignment Beyond Content: Assessing Professional Skills in Student Group Interactions and Written Work. In M. Schultz, S. Schmid, & G. A. Lawrie (Eds.), Research and Practice in Chemistry Education: Advances from the 25 th IUPAC International Conference on Chemistry Education 2018 (pp. 203–222). Singapore: Springer.


Danczak, S., Thompson, C., & Overton, T. (2017). ‘What does the term Critical Thinking mean to you?’A qualitative analysis of chemistry undergraduate, teaching staff and employers' views of critical thinking. Chemistry Education Research and Practice, 18 , 420–434.

Daniel, K. L., Bucklin, C. J., Leone, E. A., & Idema, J. (2018). Towards a Definition of Representational Competence. In Towards a Framework for Representational Competence in Science Education (pp. 3–11). Switzerland: Springer.

Davies, M. (2013). Critical thinking and the disciplines reconsidered. Higher Education Research & Development, 32 (4), 529–544.

Deloitte Access Economics. (2014). Australia's STEM Workforce: a survey of employers. Retrieved from https://www2.deloitte.com/au/en/pages/economics/articles/australias-stem-workforce-survey.html .

Driscoll, M. P. (2005). Psychology of learning for instruction . Boston, MA: Pearson Education.

Ennis, R. H. (1990). The extent to which critical thinking is subject-specific: Further clarification. Educational researcher, 19 (4), 13–16.

Facione, P. A. (1984). Toward a theory of critical thinking. Liberal Education, 70 (3), 253–261.

Facione, P. A. (1990a). The California Critical Thinking Skills Test--College Level. Technical Report #1: Experimental validation and content validity.

Facione, P. A. (1990b). The California Critical Thinking Skills Test--College Level. Technical Report #2: Factors predictive of CT skills.

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111 (23), 8410–8415.

Gafney, L., & Varma-Nelson, P. (2008). Peer-led team learning: evaluation, dissemination, and institutionalization of a college level initiative (Vol. 16): Springer Science & Business Media, Netherlands.

Glassner, A., & Schwarz, B. B. (2007). What stands and develops between creative and critical thinking? Argumentation? Thinking Skills and Creativity, 2 (1), 10–18.

Gosser, D. K., Cracolice, M. S., Kampmeier, J. A., Roth, V., Strozak, V. S., & Varma-Nelson, P. (2001). Peer-led team learning: A guidebook. Upper Saddle River, NJ: Prentice Hall.

Gray, K., & Koncz, A. (2018). The key attributes employers seek on students' resumes. Retrieved from http://www.naceweb.org/about-us/press/2017/the-key-attributes-employers-seek-on-students-resumes/ .

Hanson, D. M. (2008). A cognitive model for learning chemistry and solving problems: implications for curriculum design and classroom instruction. In R. S. Moog & J. N. Spencer (Eds.), Process-Oriented Guided Inquiry Learning (pp. 15–19). Washington, DC: American Chemical Society.

Hattie, J., & Gan, M. (2011). Instruction based on feedback. Handbook of research on learning and instruction , 249-271.

Huitt, W. (1998). Critical thinking: an overview. In Educational psychology interactive Retrieved from http://www.edpsycinteractive.org/topics/cogsys/critthnk.html .

Joint Committee on Standards for Educational and Psychological Testing. (2014). Standards for Educational and Psychological Testing. American Educational Research Association.

Jonsson, A., & Svingby, G. (2007). The use of scoring rubrics: Reliability, validity and educational consequences. Educational Research Review, 2 (2), 130–144.

Kumi, B. C., Olimpo, J. T., Bartlett, F., & Dixon, B. L. (2013). Evaluating the effectiveness of organic chemistry textbooks in promoting representational fluency and understanding of 2D-3D diagrammatic relationships. Chemistry Education Research and Practice, 14 , 177–187.

Lai, E. R. (2011). Critical thinking: a literature review. Pearson's Research Reports, 6 , 40–41.

Lewis, A., & Smith, D. (1993). Defining higher order thinking. Theory into Practice, 32 , 131–137.

Miri, B., David, B., & Uri, Z. (2007). Purposely teaching for the promotion of higher-order thinking skills: a case of critical thinking. Research in Science Education, 37 , 353–369.

Moog, R. S., & Spencer, J. N. (Eds.). (2008). Process oriented guided inquiry learning (POGIL) . Washington, DC: American Chemical Society.

Moskal, B. M., & Leydens, J. A. (2000). Scoring rubric development: validity and reliability. Practical Assessment, Research and Evaluation, 7 , 1–11.

Nakhleh, M. B. (1992). Why some students don't learn chemistry: Chemical misconceptions. Journal of Chemical Education, 69 (3), 191.

National Research Council. (2011). Assessing 21st Century Skills: Summary of a Workshop . Washington, DC: The National Academies Press.

National Research Council. (2012). Education for Life and Work: Developing Transferable Knowledge and Skills in the 21st Century . Washington, DC: The National Academies Press.

Nguyen, D. H., Gire, E., & Rebello, N. S. (2010). Facilitating Strategies for Solving Work-Energy Problems in Graphical and Equational Representations. 2010 Physics Education Research Conference, 1289 , 241–244.

Nicol, D. J., & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31 (2), 199–218.

Panadero, E., & Jonsson, A. (2013). The use of scoring rubrics for formative assessment purposes revisited: a review. Educational Research Review, 9 , 129–144.

Pearl, A. O., Rayner, G., Larson, I., & Orlando, L. (2019). Thinking about critical thinking: An industry perspective. Industry & Higher Education, 33 (2), 116–126.

Ramsden, P. (1997). The context of learning in academic departments. The experience of learning, 2 , 198–216.

Rau, M. A., Kennedy, K., Oxtoby, L., Bollom, M., & Moore, J. W. (2017). Unpacking “Active Learning”: A Combination of Flipped Classroom and Collaboration Support Is More Effective but Collaboration Support Alone Is Not. Journal of Chemical Education, 94 (10), 1406–1414.

Reynders, G., Suh, E., Cole, R. S., & Sansom, R. L. (2019). Developing student process skills in a general chemistry laboratory. Journal of Chemical Education , 96 (10), 2109–2119.

Saxton, E., Belanger, S., & Becker, W. (2012). The Critical Thinking Analytic Rubric (CTAR): Investigating intra-rater and inter-rater reliability of a scoring mechanism for critical thinking performance assessments. Assessing Writing, 17 , 251–270.

Schmidt, H. G., De Volder, M. L., De Grave, W. S., Moust, J. H. C., & Patel, V. L. (1989). Explanatory Models in the Processing of Science Text: The Role of Prior Knowledge Activation Through Small-Group Discussion. J. Educ. Psychol., 81 , 610–619.

Simonson, S. R. (Ed.). (2019). POGIL: An Introduction to Process Oriented Guided Inquiry Learning for Those Who Wish to Empower Learners . Sterling, VA: Stylus Publishing, LLC.

Singer, S. R., Nielsen, N. R., & Schweingruber, H. A. (Eds.). (2012). Discipline-Based education research: understanding and improving learning in undergraduate science and engineering . Washington D.C.: The National Academies Press.

Smit, R., & Birri, T. (2014). Assuring the quality of standards-oriented classroom assessment with rubrics for complex competencies. Studies in Educational Evaluation, 43 , 5–13.

Stein, B., & Haynes, A. (2011). Engaging Faculty in the Assessment and Improvement of Students' Critical Thinking Using the Critical Thinking Assessment Test. Change: The Magazine of Higher Learning, 43 , 44–49.

Swanson, H. L., Oconnor, J. E., & Cooney, J. B. (1990). An Information-Processing Analysis of Expert and Novice Teachers Problem-Solving. American Educational Research Journal, 27 (3), 533–556.

The Royal Society. (2014). Vision for science and mathematics education. London, England: The Royal Society Science Policy Centre.

Watson, G., & Glaser, E. M. (1964). Watson-Glaser Critical Thinking Appraisal Manual . New York, NY: Harcourt, Brace, and World.

Zimmerman, B. J. (2002). Becoming a self-regulated learner: An overview. Theory into Practice, 41 (2), 64–70.

Zohar, A., Weinberger, Y., & Tamir, P. (1994). The Effect of the Biology Critical Thinking Project on the Development of Critical Thinking. Journal of Research in Science Teaching, 31 , 183–196.


Acknowledgements

We thank members of our Primary Collaboration Team and Implementation Cohorts for collecting and sharing data. We also thank all the students who have allowed us to examine their work and provided feedback.

Supporting information

• Product rubric survey

• Initial implementation survey

• Continuing implementation survey

This work was supported in part by the National Science Foundation under collaborative grants #1524399, #1524936, and #1524965. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

Author information

Authors and affiliations.

Department of Chemistry, University of Iowa, W331 Chemistry Building, Iowa City, Iowa, 52242, USA

Gil Reynders & Renée S. Cole

Department of Chemistry, Virginia Commonwealth University, Richmond, Virginia, 23284, USA

Gil Reynders & Suzanne M. Ruder

Department of Chemistry, Drew University, Madison, New Jersey, 07940, USA

Juliette Lantz

Department of Chemistry, Ball State University, Muncie, Indiana, 47306, USA

Courtney L. Stanford


Contributions

RC, JL, and SR performed an initial literature review that was expanded by GR. All authors designed the survey instruments. GR collected and analyzed the survey and interview data with guidance from RC. GR revised the rubrics with extensive input from all other authors. All authors contributed to reliability measurements. GR drafted all manuscript sections. RC provided extensive comments during manuscript revisions; JL, SR, and CS also offered comments. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Renée S. Cole .

Ethics declarations

Competing interests.

The authors declare that they have no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Additional file 1.

Supporting Information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article.

Reynders, G., Lantz, J., Ruder, S.M. et al. Rubrics to assess critical thinking and information processing in undergraduate STEM courses. IJ STEM Ed 7 , 9 (2020). https://doi.org/10.1186/s40594-020-00208-5


Received : 01 October 2019

Accepted : 20 February 2020

Published : 09 March 2020

DOI : https://doi.org/10.1186/s40594-020-00208-5


Keywords

  • Constructive alignment
  • Self-regulated learning
  • Process skills
  • Professional skills
  • Critical thinking
  • Information processing


T2I Critical Thinking Rubric

Critical Thinking is disciplined thinking that is clear, rational, open-minded, systematic, and informed by evidence (EO 1.2).

Framing Language

The culture of higher education includes the pursuit of truth, for the sake of conveying truth to the world. In the pursuit of truth students encounter various understandings and points of view. Therefore, CT is required for students’ development through understanding, evaluating, deciding, and graciously communicating ideas and conclusions.

The goal of Montreat College’s T2I is to develop the critical thinking skills of our students so that they can graciously impact the world around them. To assess the effectiveness of T2I, the following rubric is used. The T2I rubric addresses the skills of problem solving and learning, creative thinking, and communication of multifaceted ideas, as each is a skill-based component of critical thinking.

Problem solving and learning include the ability to separate relevant and irrelevant information, to integrate multiple sources of information to solve problems and to learn and apply new information to solve real-world problems. Creative thinking involves identifying or deriving alternative interpretations for data or observations, recognizing new information that might support or contradict a hypothesis, and explaining how new information can change one’s understanding and ability to address a problem. Communication of multifaceted ideas includes the ability to communicate ideas graciously so engagement with the world can be done effectively and precisely.

Work samples to be assessed include, but are not limited to, student reflections, discussion board posts, and course presentations.

  • Clear: Expressing ideas in a straight-forward and simple manner.
  • Rational: Thought that clearly demonstrates cognitive reasoning to come to a logical conclusion.
  • Open-minded: A genuine critical openness to the ideas and beliefs of others (EO 2.4).
  • Systematic: Organized thought that follows a logical plan to investigate or explain an idea.
  • Evidence: Quantitative and qualitative information that is supported by direct observation and empirical sources.

Evaluators are encouraged to assign a zero to any work sample or collection that does not meet benchmark (cell one) level performance.

This rubric was created using the Association of American Colleges and Universities Critical Thinking VALUE Rubric. Retrieved from https://www.aacu.org/value-rubrics


Critical Thinking: Learning, Teaching and Assessment


Designing Assessments


Introduction

The Critical Thinking Assessment Rubric was developed as a key deliverable of the ‘Building Capacity to Measure Essential Employability Skills’ project funded by the Higher Education Quality Council of Ontario (HEQCO) . This handbook serves as a resource to teachers in using the Critical Thinking Assessment Rubric.

Critical thinking is one of the six skill categories within the ‘essential employability skills’ (EES) curriculum requirements for Ontario college programs – specifically, EES numbers 4 and 5. Each of these essential employability skills must be addressed (learned, practiced, evaluated) within a program. How and when these are implemented should be based on decisions regarding the program as a whole and by individual teachers.

The learning outcomes associated with the critical thinking essential employability skill – ‘Apply a systematic approach to solve problems’ and ‘Use a variety of thinking skills to anticipate and solve problems’ – are too vague for direct measurement in an assignment; more concrete and measurable learning outcomes are needed.

In this project we used the skill of critical thinking as an example to demonstrate possible ways of incorporating a broadly described essential employability skill into the curriculum – what needs to be taught and practiced, how it can be demonstrated by the learner, and how it can be measured by the teacher.

We aimed to develop a common language with which teachers could talk about critical thinking in the classroom. Our objective was to create a practical critical thinking measurement or marking tool, grounded in the literature, and developed by George Brown College teachers, which would have sufficient flexibility to allow it to be adapted by teachers for use in any college classroom in which critical thinking is being taught and measured.


Six Critical Thinking Constructs

The project began with a review of the literature about critical thinking. While there are many valid definitions of critical thinking, we chose the following three definitions in the earliest discussions with faculty about the development of the first version of the critical thinking rubric.

‘Purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference as well as explanation of the evidential, conceptual and methodological considerations on which a judgment is based’ (American Philosophical Association)

‘Critical thinking is a habit of mind characterized by the comprehensive exploration of issues, ideas, artifacts, and events before accepting or formulating an opinion or conclusion’

(American Association of Colleges and Universities, Critical Thinking Value Rubric)

‘[Critical thinking] entails (1) identifying assumptions that frame our thinking and determine our actions, (2) checking out the degree to which these assumptions are accurate and valid, (3) looking at our ideas and decisions (intellectual, organizational, and personal) from several different perspectives, and (4) on the basis of all this, taking informed actions’ (Brookfield, 2012)

The literature confirmed that there is no single standard definition of critical thinking, which results in a wide range of critical thinking constructs being taught and evaluated. Faculty in Phase 1 of the project agreed on six relevant, concrete and measurable critical thinking constructs which could be taught, and then evaluated within an assignment. These were the critical thinking constructs that were already most commonly taught and evaluated, according to project faculty.

  • Clarifies the issue to be discussed and/or the position to be argued in this paper
  • Identifies the sources of ideas or evidence used in developing the argument or conclusions
  • Analyses the ideas or evidence to develop the argument or conclusions
  • Critiques contradictory evidence, information, experts’ opinions and/or methodologies
  • Acknowledges personal biases or assumptions
  • Describes conclusions

During the project, we recognized that a student’s mastery of the English language, as used in their assignment, could strongly influence the teacher’s rating of critical thinking skills and shift the marker’s attention away from the critical thinking skills themselves. For this reason, English language-specific criteria were excluded from this rubric.


About the critical thinking rubric

The Critical Thinking Assessment Rubric:

  • Is brief and easy for a marker to use
  • Has demonstrated high inter-rater reliability through a validation process with George Brown faculty
  • Focuses on only a select and manageable number of the many possible critical thinking constructs identified in the literature—uses six distinct critical thinking criteria (i.e. constructs) judged by GBC faculty to be most relevant to the types of assignments used at George Brown
  • Provides meaningful descriptors for a range of performance levels, clearly distinguishing from inadequate to exemplary performance in regard to expectations (which, in turn, would be identified in the assignment instructions).
  • Below expectations
  • Meets expectations
  • Exemplary/exceeds expectations
  • Includes a description or example of performance in each of the 24 cells to assist the marker in differentiating the 4 levels of performance for each criterion
  • Uses sufficiently generic language in the descriptors allowing applicability to different types of assignments
  • Provides sufficient flexibility to be used either as a stand-alone rubric (with the addition of a grading scheme and criteria weights) or to be integrated into an existing rubric whenever relevant
  • Can be used for formative feedback
  • May be adapted to use some but not all of the criteria, depending upon what is taught in the course.


How To Use The Critical Thinking Assessment Rubric

The Critical Thinking Assessment Rubric can be used to develop or modify curriculum, both teaching and learning activities, and assignments. The six constructs can inform what is to be discussed, taught and practiced about critical thinking in a course or across courses in a program.

This rubric should be used to evaluate only those assignments that have specifically incorporated the relevant critical thinking criteria from the rubric. It is assumed that the constructs to be evaluated have been discussed/taught/practiced by students, that students already know the specific performance expectations for each critical thinking construct (criterion) to be evaluated, and that this information is clearly identified in the assignment instructions.

Options for using the Critical Thinking Assessment Rubric:

  • The rubric can be used as a stand-alone marking rubric either for formative/teaching purposes or for summative evaluation with the addition of a grading scheme and criteria weights.
  • No specific weight is currently attached to the six criteria in this rubric. Teachers may choose to attach relative weights or a grading scheme to the rubric (an illustrative calculation follows this list).
  • Any or all of the six critical thinking constructs (criteria) as relevant to the assignment can be incorporated into an existing grading rubric.
  • Teachers may choose to use fewer than the six criteria provided in this rubric, as is relevant to the specifics of their assignment.
  • Teachers may add additional critical thinking criteria, reflecting other relevant critical thinking constructs, as needed.
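As a purely illustrative sketch of the weighting option noted above (the rubric itself prescribes no weights, point values, or grading scheme, so every number here is an assumption), suppose a teacher scores three of the six criteria, maps the rubric’s four performance levels to 0–3 points, and weights the criteria 40%, 30%, and 30%. A student rated 2, 3, and 1 on those criteria would earn:

\[
\text{mark} = 0.40\left(\tfrac{2}{3}\right) + 0.30\left(\tfrac{3}{3}\right) + 0.30\left(\tfrac{1}{3}\right) \approx 0.67 = 67\%
\]

Any such mapping is a local decision; when the rubric is used purely for formative feedback, the ratings can simply be reported without converting them to a grade.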

Critical Thinking Assessment Rubric

Examples of assignments using the critical thinking assessment rubric.

The following are real examples of five course assignments which were developed or modified by the project faculty to incorporate the critical thinking constructs. To illustrate and assist you in identifying this, we have purposely colour-coded the relevant sections of these assignments to demonstrate where the critical thinking constructs are incorporated.


4.3: The College Essay Assignment- Analysis, Rubrics, and Critical Thinking


  • Athena Kashyap & Erika Dyquisto
  • City College of San Francisco via ASCCC Open Educational Resources Initiative


The Prompt: What Does “Analyze” Mean Anyway?

Often, the handout or other written text explaining the assignment—what professors call the assignment prompt—will explain the purpose of the assignment, the required parameters (length, number and type of sources, referencing style, etc.), and the criteria for evaluation. Sometimes, though—especially when you are new to a field—you will encounter the baffling situation in which you comprehend every single sentence in the prompt but still have absolutely no idea how to approach the assignment. No one is doing anything wrong in a situation like that. It just means that further discussion of the assignment is in order. Here are some tips:

  • Focus on the verbs. Look for verbs like “compare,” “explain,” “justify,” “reflect” or the all-purpose “analyze.” You’re not just producing a paper as an artifact; you’re conveying, in written communication, some intellectual work you have done. So the question is, what kind of thinking are you supposed to do to deepen your learning?
  • Put the assignment in context . Many professors think in terms of assignment sequences . For example, a social science professor may ask you to write about a controversial issue three times: first, arguing for one side of the debate; second, arguing for another; and finally, from a more comprehensive and nuanced perspective, incorporating text produced in the first two assignments. A sequence like that is designed to help you think through a complex issue. Another common one is a scaffolded research paper sequence: you first propose a topic, then prepare an annotated bibliography, then a first draft, then a final draft, and, perhaps, a reflective paper. The preparatory assignments help ensure that you’re on the right track, beginning the research process long before the final due date, and taking the time to consider recasting your thesis, finding additional sources, or reorganizing your discussion. 5 If the assignment isn’t part of a sequence, think about where it falls in the semester, and how it relates to readings and other assignments. Are there headings on the syllabus that indicate larger units of material? For example, if you see that a paper comes at the end of a three-week unit on the role of the Internet in organizational behavior, then your professor likely wants you to synthesize that material in your own way. You should also check your notes and online course resources for any other guidelines about the workflow. Maybe you got a rubric a couple weeks ago and forgot about it. Maybe your instructor posted a link about “how to make an annotated bibliography” but then forgot to mention it in class.
  • Try a free-write. When handing out an assignment, teachers often ask students to do a five-minute or ten-minute free-write. A free-write is when you just write, without stopping, for a set period of time. That doesn’t sound very “free;” it actually sounds kind of coerced. The “free” part is what you write—it can be whatever comes to mind. Professional writers use free-writing to get started on a challenging (or distasteful) writing task or to overcome writer's block or a powerful urge to procrastinate. The idea is that if you just make yourself write, you can’t help but produce some kind of useful nugget. Thus, even if the first eight sentences of your free-write are all variations on “I don’t understand this” or “I’d really rather be doing something else,” eventually you’ll write something like “I guess the main point of this is …” and then you’re off and running. As instructors, we’ve found that asking students to do a brief free-write right after handing out an assignment generates useful clarification questions. If your instructor doesn’t make time for that in class, a quick free-write on your own will quickly reveal whether you need clarification about the assignment and, often, what questions to ask.
  • Ask for clarification the right way. Even the most skillfully crafted assignments may need some verbal clarification, especially because students’ familiarity with the field can vary enormously. Asking for clarification is a good thing. Be aware, though, that instructors get frustrated when they perceive that students want to skip doing their own thinking and instead receive an exact recipe for an 'A' paper. Go ahead and ask for clarification, but try to convey that you want to learn and you’re ready to work. In general, avoid starting a question with “Do we have to …” because I can guarantee that your instructor is thinking, “You don’t have to do crap. You’re an adult. You chose college. You chose this class. You’re free to exercise your right to fail.” Similarly, avoid asking the professor about what he or she “wants.” You’re not performing some service for the professor when you write a paper. What they “want” is for you to really think about the material.
  • So what does analyze mean? Analyze means breaking down a topic into component parts and exploring and examining those various parts to come to some sort of new understanding.

Table 4.3.1 -- Clarifying Questions for Essay Assignments

Three Common Types of College Writing Assignments

From our experience, you are likely to get three kinds of writing assignments based upon the instructor’s degree of direction for the assignment. We’ll take a brief look at each kind of academic writing task.

The Closed Writing Assignment

  • Does your advertisement employ techniques of propaganda, and if so what kind?
  • Was the South justified in seceding from the Union?
  • Was Hitler evil or just mad?

These kinds of writing assignments present you with two counter claims and ask you to determine from your own analysis the more valid claim. They resemble yes-no questions. These topics define the claim for you, so the major task of the writing assignment then is working out the support for the claim. They resemble a math problem in which the teacher has given you the answer and now wants you to “show your work” in arriving at that answer. Be careful with these writing assignments, however, because often these topics don’t have a simple yes/no, either/or answer (despite the nature of the essay question). A close analysis of the subject matter often reveals nuances and ambiguities within the question that your eventual claim should reflect. Perhaps a claim such as, “In my opinion, Hitler was mad” might work, but I urge you to avoid such a simplistic thesis. This thesis would be better: “I believe Hitler's unhinged mind borders on insanity but doesn’t quite reach it.”

The Semi-Open Writing Assignment

  • Discuss the role of law in Antigone.
  • Show how the Fugitive Slave Act influenced the Abolitionist Movement.

Although these topics chart out a subject matter for you to write about, they don’t offer up claims you can easily use in your paper. It would be a misstep to offer up claims such as, “Law plays a role in Antigone." Such statements express the obvious and what the topic takes for granted. The question, for example, is not whether law plays a role in Antigone, but rather what sort of role law plays. What is the nature of this role? What influences does it have on the characters or actions or theme? This kind of writing assignment resembles a kind of archeological dig. The teacher cordons off an area, hands you a shovel, and says dig here and see what you find. Be sure to avoid summary and mere explanation in this kind of assignment. Despite using key words in the assignment such as “explain,” “illustrate,” analyze,” “discuss,” or “show how,” these topics still ask you to make an argument. Implicit in the topic is the expectation that you will analyze the reading and arrive at some insights into patterns and relationships about the subject. Your eventual paper, then, needs to present what you found from this analysis—the treasure you found from your digging. Determining your own claim represents the biggest challenge for this type of writing assignment.

The Open Writing Assignment

  • What does it mean to be an “American” in the 21st Century?
  • Analyze the influence of slavery upon one cause of the Civil War.

These kinds of writing assignments require you to decide both your writing topic and your claim (or thesis). Which character in the Inferno will I pick to analyze? What two themes in Pride and Prejudice will I choose to write about? Many students struggle with these types of assignments because they have to understand their subject matter well before they can intelligently choose a topic. For instance, you need a good familiarity with the characters in The Inferno before you can pick one. You have to have a solid understanding of the defining elements of American identity as well as 21st century culture before you can begin to connect them. This kind of writing assignment resembles riding a bike without the training wheels on. It says, “You decide what to write about.” The biggest decision, then, becomes selecting your topic and limiting it to a manageable size.


Rubrics as Road Maps

If a professor provides a grading rubric with an assignment prompt, thank your lucky stars (and your professor). If the professor took the trouble to prepare and distribute it, you can be sure that he or she will use it to grade your paper. He or she may not go over it in class, but it’s the clearest possible statement of what the professor is looking for in the paper. If it’s wordy, it may seem like those online “terms and conditions” that we routinely accept without reading. But you really should read it over carefully before you begin and again as your work progresses. A lot of rubrics do have some useful specifics. Ours, for example, often contain phrases like “makes at least six error-free connections to concepts or ideas from the course,” or “gives thorough consideration to at least one plausible counter-argument.” Even less specific criteria (such as “incorporates course concepts” and “considers counter-arguments”) will tell you how you should be spending your writing time.

Even the best rubrics aren’t completely transparent. They simply can’t be. Take, for example, this AAC&U rubric. It has been drafted and repeatedly revised by a multidisciplinary expert panel and tested multiple times on sample student work to ensure reliability. But it still seems kind of vague. What is the real difference between “demonstrating a thorough understanding of context, audience, and purpose” and “demonstrating adequate consideration” of the same? It depends on the specific context. So how can you know whether you’ve done that? A big part of what you’re learning, through feedback from your professors, is to judge the quality of your writing for yourself. Your future managers are counting on that. At this point, it is better to think of rubrics as roadmaps, displaying your destination, rather than as a GPS system directing every move you make.

Behind any rubric is the essential goal of higher education: helping you take charge of your own learning, which means writing like an independently motivated scholar. Are you tasked with proposing a research paper topic? Don’t just tell the professor what you want to do, convince him or her of the salience of your topic, as if you were a scholar seeking grant money. Is it a reflection paper? Then outline both the insights you’ve gained and the intriguing questions that remain, as a scholar would. Are you writing a thesis-driven analytical paper? Then apply the concepts you’ve learned to a new problem or situation. Write as if your scholarly peers around the country are eagerly awaiting your unique insights. Descriptors like “thoroughness” or “mastery” or “detailed attention” convey the vision of student writers making the time and rigorous mental effort to offer something new to the ongoing, multi-stranded academic conversation. What your professor wants, in short, is critical thinking.

Critical Thinking in the College Essay Assignments

What’s critical about critical thinking? Why do college professors gravitate towards essay assignments that encourage critical thinking?

Critical thinking is one of those terms that has been used so often and in so many different ways that it often seems meaningless. It also makes one wonder, is there such a thing as uncritical thinking? If you aren’t thinking critically, then are you even thinking?

Despite the prevalent ambiguities, critical thinking actually does mean something. The Association of American Colleges and Universities usefully defines it as “a habit of mind characterized by the comprehensive exploration of issues, ideas, artifacts, and events before accepting or formulating an opinion or conclusion.” 6

That definition aligns with the best description of critical thinking I ever heard; it came from junior high art teacher, Joe Bolger. 7 He once asked, “What color is the ceiling?” In that withering 'tween tone, we reluctantly replied, “Whiiiite.” He then asked, “What color is it really?” We deigned to aim our pre-adolescent eyes upwards, and eventually began to offer more accurate answers: “Ivory?” “Yellow-ish tan.” “It’s grey in that corner.” After finally getting a few thoughtful responses, Mr. Bolger said something like, “Making good art is about drawing what you see, not what you think you’re supposed to see.” The AAC&U definition, above, essentially amounts to the same thing: taking a good look and deciding what you really think rather than relying on the first idea or assumption that comes to mind.

The critical thinking rubric produced by the AAC&U describes the relevant activities of critical thinking in more detail. To think critically, one must …

  • “clearly state and comprehensively describe the issue or problem,”
  • “independently interpret and evaluate sources,”
  • “thoroughly analyze assumptions behind and context of your own or others’ ideas,”
  • “argue a complex position and one that takes counter-arguments into account,” and
  • “arrive at logical and well informed conclusions”. 8

While you are probably used to providing some evidence for your claims, you can see that college-level expectations go quite a bit further. When professors assign an analytical paper, they don’t just want you to formulate a plausible-sounding argument. They want you to dig into the evidence, think hard about unspoken assumptions and the influence of context, and then explain what you really think and why.

Interestingly, the AAC&U defines critical thinking as a “habit of mind” rather than a discrete achievement. And there are at least two reasons to see critical thinking as a craft or art to pursue rather than a task to check off. First, the more you think critically, the better you get at it . As you get more and more practice in closely examining claims, their underlying logic, and alternative perspectives on the issue, it’ll begin to feel automatic. You’ll no longer make or accept claims that begin with “Everyone knows that …” or end with “That’s just human nature.” Second, just as artists and crafts persons hone their skills over a lifetime, learners continually expand their critical thinking capacities, both through the feedback they receive from others and from their own reflections . Artists of all kinds find satisfaction in continually seeking greater challenges. Continual reflection and improvement is part of the craft.

This comment about critical thinking was provided by student Aly Button: "As soon as I see the phrase 'critical thinking,' the first thing I think is more work. It always sounds as if you’re going to have to think harder and longer. But I think the AAC&U’s definition is on point; critical thinking is a habit. Seeing that phrase shouldn’t be a scary thing because by this point in many people’s college career this is an automatic response. I never expect an answer to a question to be in the text; by now I realize that my professors want to know what I have to say about something or what I have learned. In a paper or essay, the three-step thesis process is a tool that will help you get this information across. While you’re doing the hard work (the thinking part), this formula offers you a way to clearly state your position on a subject. It’s as simple as: make a general statement, make an arguable statement, and finally, say why it is important. This is my rule of thumb, and I would not want to start a thesis-driven paper any other way!"

Critical thinking is hard work. Even those who actively choose to do it experience it as tedious, difficult, and sometimes surprisingly emotional. Nobel-prize winning psychologist Daniel Kahneman explains that our brains aren’t designed to think; rather, they’re designed to save us from having to think. 9 Our brains are great at developing routines and repertoires that enable us to accomplish fairly complex tasks like driving cars, choosing groceries, and having a conversation without thinking consciously and thoroughly about every move we make. Kahneman calls this “fast thinking.” “Slow thinking,” which is deliberate and painstaking, is something our brains seek to avoid. That built-in tendency can lead us astray. Kahneman and his colleagues often used problems like this one in experiments to gauge how people used fast and slow thinking in different contexts:

A bat and ball cost $1.10. The bat costs one dollar more than the ball. How much does the ball cost?

Most people automatically say the ball costs $0.10. However, if the ball cost $0.10 and the bat cost $1.00 more, then the bat would cost $1.10, giving an incorrect total of $1.20; the ball actually costs $0.05. Kahneman notes, “Many thousands of university students have answered the bat-and-ball puzzle, and the results are shocking. More than 50% of students at Harvard, MIT, and Princeton gave the intuitive—incorrect—answer.” These and other results confirm that “many people are overconfident, prone to place too much faith in their intuitions.” Thinking critically—thoroughly questioning your immediate intuitive responses—is difficult work, but every organization and business in the world needs people who can do that effectively. Some students assume that an unpleasant critical thinking experience means that they’re either doing something wrong or that it’s an inherently uninteresting (and oppressive) activity. While we all relish those times when we’re pleasantly absorbed in a complex activity (what psychologist Mihaly Czikszentmihalyi calls “flow” 12 ), the more tedious experiences can also bring satisfaction, sort of like a good work-out.
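To make the arithmetic explicit (a worked solution added for clarity, not part of Kahneman’s text), let \(b\) be the bat’s price and \(c\) be the ball’s price:

\[
b + c = 1.10, \qquad b = c + 1.00 \;\Rightarrow\; (c + 1.00) + c = 1.10 \;\Rightarrow\; 2c = 0.10 \;\Rightarrow\; c = 0.05
\]

So the ball costs $0.05 and the bat costs $1.05, which total $1.10 with the bat exactly $1.00 more than the ball.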

Critical thinking can also be emotionally challenging, researchers have found. Facing a new realm of uncertainty and contradiction without relying on familiar assumptions is inherently anxiety-provoking because when you’re doing it, you are, by definition, incompetent. Recent research has highlighted that both children and adults need to be able to regulate their own emotions in order to cope with the challenges of building competence in a new area. 13 The kind of critical thinking your professors are looking for—that is, pursuing a comprehensive, multi-faceted exploration in order to arrive at an arguable, nuanced argument—is inevitably a struggle and it may be an emotional one. Your best bet is to find ways to make those processes as efficient, pleasant, and effective as you can.

Kaethe Leonard, another student, says: "The thing no one tells you when you get to college is that critical thinking papers are professors’ favorites. College is all about learning how to think individual thoughts so you’ll have to do quite a few of them. Have no fear though; they do get easier with time. The first step? Think about what you want to focus on in the paper (aka your thesis) and go with it."

The demands students face are not at all unique to their academic pursuits. Professional working roles demand critical thinking, as 81% of major employers reported in an AAC&U-commissioned survey 14 , and it’s pretty easy to imagine how critical thinking helps one make much better decisions in all aspects of life. Embrace it. And just as athletes, artists, and writers sustain their energy and inspiration for hard work by interacting with others who share these passions, look to others in the scholarly community—your professors and fellow students—to keep yourself engaged in these ongoing intellectual challenges. While writing time is often solitary, it’s meant to plug you into a vibrant academic community. What your professors want, overall, is for you to join them in asking and pursuing important questions about the natural, social, and creative worlds.

Other Resources

  • This website from the Capital Community College Foundation has some good advice about overcoming writer’s block. And student contributor Aly Button recommends this funny clip from SpongeBob SquarePants.
  • The Foundation for Critical Thinking maintains a website with many useful articles and tools.
  • The Online Writing Laboratory (OWL) at Purdue University is a wonderful set of resources for every aspect of college writing. Especially germane to this chapter is this summary of the most common types of writing assignments.
  • This website, BrainBashers.com, offers logic puzzles and other brain-teasers for your entertainment.
  • Free-write on an assignment prompt. If you have one, do that one. If not, here’s one to practice with: “Please write a five-page paper analyzing the controversy surrounding genetically modified organisms (GMOs) in the food supply.” What clarification questions would you like to ask your professor? What additional background knowledge do you need to deeply understand the topic? What are some starter ideas that could lead to a good thesis and intriguing argument?
  • Find a couple of sample student papers from online paper mills such as this one (Google “free college papers”) and journals featuring excellent undergraduate writing (such as this one from Cornell University), and use the AAC&U rubric on critical thinking to evaluate them. Which descriptor in each row most closely fits the paper?

Outcomes and Rubrics

RIT faculty developed sixteen General Education Student Learning Outcomes aligned to the General Education Framework. Each outcome has a corresponding rubric, and all of the rubrics were developed by RIT faculty teams.

Communication

Express oneself effectively in common college-level written forms

  • View PDF Rubric 1 RUBRIC Express oneself effectively in written forms REV 2023.pdf

Revise and improve written products

  • View PDF Rubric 2 RUBRIC Revise and improve written products REV 2019.pdf

Express oneself effectively in presentations, either in American English or American Sign Language

  • View PDF Rubric 3 RUBRIC Express oneself effectively in presentations REV 2023 fillable.pdf

Demonstrate comprehension of information and ideas accessed through reading

  • View PDF Rubric 4 RUBRIC Demonstrate comprehension accessed through reading REV 2019.pdf

Critical Thinking

Use relevant evidence gathered through accepted scholarly methods and properly acknowledge sources of information

  • View PDF Rubric 5 RUBRIC Use Relevant Evidence REV 2019.pdf

Analyze or construct arguments considering their premises, assumptions, contexts, and conclusions, and anticipating counterarguments

  • View PDF Rubric 6 RUBRIC Analyze or Construct Arguments REV 2019_0.pdf

Reach sound conclusions based on logical analysis of evidence

  • View PDF Rubric 7 RUBRIC Reach Sound Conclusions REV 2019.pdf

Demonstrate creative or innovative approaches to assignments or projects

  • View PDF Rubric 8 RUBRIC Demonstrate Creative Innovative REV 2019.pdf

Perspectives

Ethical: Identify contemporary ethical questions and relevant positions

  • View PDF Rubric 9 RUBRIC Identify Contemporary Ethical Questions REV 2019 01.27.2020.pdf

Artistic: Interpret and evaluate artistic expression considering the cultural context in which it was created

  • View PDF Rubric 10 RUBRIC Interpret and Evaluate Artistic Expression REV 2019.pdf

Global: Examine connections among the world’s populations

  • View PDF Rubric 11 RUBRIC Examine Connections among World Populations REV 10.2019.pdf

Social: Analyze similarities and differences in human social experiences and evaluate the consequences

  • View PDF Rubric 12 RUBRIC Analyze Human Similarities and Differences REV 2019.pdf

Natural Science Inquiry: Demonstrate knowledge of basic principles and concepts of one of the natural sciences

  • View PDF Rubric 13 RUBRIC Demonstrate Knowledge of Science REV 2019.pdf

Scientific Principles: Apply methods of scientific inquiry and problem solving to contemporary issues and scientific questions

  • View PDF Rubric 14 RUBRIC Apply Methods of Scientific Inquiry REV 2019.pdf

Mathematical: Comprehend and evaluate mathematical or statistical information

  • View PDF Rubric 15 RUBRIC Comprehend and Evaluate Math REV 2019.pdf

Mathematical: Perform college-level mathematical operations or apply statistical techniques

  • View PDF Rubric 16 RUBRIC Perform College Level Math REV 2019.pdf

Lee Honors College

Gary L. Belleville Critical Thinking Student Employee of the Year Award for 2023-24

Dean Lopez, Jackie Chavarria, and Jennifer Townsend stand in the LHC after Jackie received her award.

We’re excited to share that Jaqueline (Jackie) Chavarria has been awarded the Gary L. Belleville Critical Thinking Student Employee of the Year Award for 2023-24! 

Jackie works at the Lee Honors College as a student receptionist and is a junior double majoring in Criminal Justice and Psychology. Jackie was nominated for this award by her supervisor, LHC executive assistant Jennifer Townsend. 

When presenting the award, Amanda Jeppeson, Student Employment Specialist at WMU, shared that Jackie really impressed the judges with how well she’s managed communication within her staff team and disseminated information in a meaningful way to make the Lee Honors College a great place for Honors Broncos. 

As a winning student for this academic year, Jackie’s name will be forwarded to a competition through the Midwest Association of Student Employment Administrators (MASEA) later this spring. Congratulations to Jackie on this well-deserved honor! 
