Robert E. Levasseur, PhD

Critical Thinking at the Doctoral Level

What does it mean to exercise critical thinking? Does it mean to be negative and adversarial? Does it mean to provide constructive criticism? Or does it mean something totally different? To explore the nature of critical thinking, we begin by examining the concept of left and right brain thinking.

Left and Right Brain Thinking

Brain research suggests that the left and right sides of the brain have distinct and complementary functions. Simply put, the left brain is the seat of logic and, hence, analytical thinking, and the right brain is the seat of intuition and, hence, system thinking.

So, is critical thinking left-brained, analytical thinking, or is it right-brained, system thinking?

Lower vs. Higher-order Thinking

To differentiate the work of students from scholars, academics use a framework called Bloom’s Taxonomy. According to Benjamin Bloom, there are multiple levels of thinking.

They follow a hierarchy from the lowest to the highest order or level:

  • Knowledge
  • Comprehension
  • Application
  • Analysis
  • Synthesis
  • Evaluation

New doctoral students tend to focus on the lower-level skills, since the educational system at the levels below the doctorate tends to emphasize their use.

As a doctoral student, however, your work must reflect all levels of thinking, particularly the higher-order thinking skills of analysis, synthesis, and evaluation. In addition, your work should incorporate a whole-brain approach that uses right-brained, systemic thinking to support left-brained, analytical thinking, and vice-versa.

Int J Exerc Sci


Factors Affecting PhD Student Success

SONIA N. YOUNG

1 Department of Physical Therapy, Western Kentucky University, Bowling Green, KY, USA

WILLIAM R. VANWYE

2 Doctor of Physical Therapy Program, Gannon University, Ruskin, FL, USA

MARK A. SCHAFER

3 School of Kinesiology, Recreation & Sport, Western Kentucky University, Bowling Green, KY, USA

TROY A. ROBERTSON

4 School of Engineering and Applied Sciences, Western Kentucky University, Bowling Green, KY, USA

5 Educational Leadership Doctoral Program, Western Kentucky University, Bowling Green, KY, USA

ASHLEY VINCENT POORE

Attrition rates for Doctor of Philosophy (PhD) programs in the United States across the fields of engineering, life sciences, social sciences, mathematics and physical sciences, and humanities range from 36–51%. A qualitative literature review indicates certain factors may impact the PhD student’s success in completing the program and degree. The factors focused on in this review include the student-advisor relationship, mentorship, and the dissertation process. Although kinesiology doctoral programs are evaluated and ranked by the National Academy of Kinesiology, little information is available exploring kinesiology PhD student success. General information on PhD student success may, therefore, be valuable to kinesiology PhD students and programs.

INTRODUCTION

The National Academy of Kinesiology (formerly the American Academy of Kinesiology and Physical Education) evaluates and ranks kinesiology doctoral programs in the United States (U.S.) every five years, with results published in 2006 ( 31 ), 2007 ( 30 ), 2011 ( 28 ), and 2015 ( 33 ). However, these rankings do not include data regarding the attrition rate of kinesiology PhD students or the factors that may impact student attainment of the degree. Although not specific to kinesiology, Doctor of Philosophy (PhD) attrition data is available through The Council of Graduate Schools, which performed a quantitative analysis of 30 institutions and nearly 50,000 students across five fields (i.e., engineering, life sciences, social sciences, mathematics and physical sciences, and humanities; 6). The 10-year PhD completion rate was 64%, 63%, 56%, 55%, and 49% for engineering, life sciences, social sciences, mathematics and physical sciences, and humanities, respectively ( 6 ). This relates to the field of kinesiology as it is classified as a life science by the Council of Graduate Schools ( 7 ).

Across the country, kinesiology programs typically do not have a standardized core curriculum, and the outcomes of each program are established by the teaching and research expertise of the faculty. However, common courses such as research design and statistics are included in kinesiology PhD curricula ( 26 ). Each program varies in the courses offered and in the amount and type of mentoring and advising of PhD students, dependent on program faculty. Confusion also exists in the definition of the terms advising and mentoring in regard to PhD students. In a study by Titus and Ballou in 2013, 3,534 researchers, who had received a National Institutes of Health (NIH) grant and had at least one PhD student, completed a survey to determine their views of the roles of advising and mentoring of PhD students ( 32 ). The participants were asked to rate and classify 19 activities as advisor only, mentor only, both, or neither ( 32 ). The activities of chairing a student’s dissertation committee and providing financial support were identified as the highest “advisor only” activities, while teaching life or social skills and preparing contracts or grant proposals rated highest in “mentor only” activities ( 32 ). However, results indicated that most faculty members view their roles as mentor and advisor almost synonymously ( 32 ). As these terms can be used interchangeably, the authors will use the term mentoring or mentorship to encompass all advising and mentoring activities with PhD students. Titus and Ballou ( 32 ) also found that, while the majority of faculty supervising PhD students had training in human or animal subjects’ protection (89.6%) and responsible conduct of research (72.3%), relatively few had formal training on how to mentor (27.7%) or advise (25.4%) PhD students. Much of the mentorship, therefore, is dictated by the faculty member’s personal doctoral experience rather than by formal training. Based on the amount of experience and training a mentor possesses, as well as the level of involvement in the PhD student experience, students may have vastly different experiences and outcomes, such as completion of the program ( 8 ) and opportunities for professional development ( 20 ).

While little data is available on PhD student attrition in the area of kinesiology, research has indicated that multiple reasons contribute to PhD students in general not completing their programs ( 9 , 14 , 16 ). One of those reasons is navigating the dissertation process and following through to completion. In regard to the dissertation, it is typically up to the student to be intrinsically motivated and resourceful enough to manage the process and see it through to completion ( 9 ). Involvement of the faculty mentor in the dissertation process varies and may be dependent on the motivation and capabilities of the student. Faculty mentoring can play a monumental role in ensuring that doctoral students are successful throughout the coursework, dissertation process, and professional development. Russell advocates for kinesiology PhD programs to focus on developing professional stewardship in students ( 25 ). Stewardship includes teaching students how to: interact as a professional, become involved in and promote the profession, maintain ethical standards, and become autonomous researchers ( 13 , 24 , 25 ). Understanding reasons for attrition in PhD students can lead to a plan to mitigate barriers.

Overall, there has been limited research examining kinesiology PhD student success ( 20 , 25 ). Therefore, examining the existing evidence regarding PhD student success would benefit kinesiology students and faculty alike by identifying how to ensure successful completion of the program and any potential barriers. A qualitative literature review was performed by four of the authors, with one additional author providing first-hand insight into the field of PhD level kinesiology programs. One author performed a search on PubMed using terms such as “kinesiology doctoral student success” and “kinesiology doctoral student,” which garnered only one pertinent article. Three authors also performed an expanded electronic search on general PhD or doctoral student success, persistence, advising/mentoring, and attrition. Articles were selected if they related to kinesiology or general PhD success. The articles were then read and analyzed, resulting in three recurring themes. For this review, PhD success was interpreted to mean successful completion of the degree and dissertation process. Attrition, in this review, is interpreted to mean students who did not complete the degree and/or dissertation process, including those who dropped out during the program/coursework and those who finished the program/coursework but not the dissertation. In this review, the authors will discuss the following commonly cited issues affecting PhD student success: the student-advisor relationship, mentorship, and the dissertation process ( Table 1 ). In addition, the authors will provide practical recommendations to address these issues to aid in success. By addressing the potential factors that may impact student success in completing the dissertation and program, administration, faculty, and students can have conversations that may lead to a better understanding of the process and address potential issues.

Table 1. Summary of potential factors influencing PhD student success.

POTENTIAL CONTRIBUTING FACTORS TO PHD STUDENT SUCCESS

Student-Advisor Relationship

A critical factor in PhD student success (i.e., attaining the degree) is the student-advisor relationship ( 11 ). In a qualitative study by Knox et al., 19 psychology faculty members were interviewed about their student-advisor relationships with PhD students ( 18 ). Results indicated that it is not uncommon for doctoral advisors to adopt a mentoring style based on their own experience as a PhD student ( 18 ). Furthermore, they found that a lack of training or preparation by the institution leaves the task of acquiring mentoring skills to the practicing faculty member ( 18 ). This is in agreement with Golde and Dore, who found that there appears to be a lack of programmatic emphasis on doctoral advising and mentoring ( 10 ).

In an investigation by Mansson and Myers, the authors found that advisors and advisees have similar ideas of what makes a successful relationship ( 19 ). In this study, 636 doctoral students and 141 faculty advisors from around the United States were surveyed about the mentoring relationship using the Mentoring and Communication Support Scale, the Academic Mentoring Behavior Scale, and the Advisee Relational Maintenance Scale ( 19 ). This study found that advisees can positively influence the advisor-advisee relationship through six behaviors: showing appreciation, completing assignments in a timely manner, being courteous, protecting the reputation of the advisor, using humor in interactions with the advisor, and consulting the advisor about their individual goals ( 19 ). This was also supported by a qualitative study by Mazerolle et al. in 2015, in which 28 students completing a PhD in varied programs, including kinesiology and exercise science, were interviewed to determine their perceptions of mentoring from their advisors ( 20 ). The study found that most PhD students had positive relationships with their advisors, founded on trust and communication ( 20 ). The students in this study further identified themes that must exist in a healthy mentoring relationship: encouraging independence and collaboration in a supportive environment, a reciprocal relationship, and providing opportunities for professional development ( 20 ).

In the interest of improving PhD student success, some studies suggested that university and program-specific officials should evaluate how they can best provide structured and consistent mentorship, including training/mentoring for advisors ( 14 , 18 ). These mentorship strategies must be structured to consider that each student begins a program with different skill sets, levels of intrinsic motivation, and resilience. Harding-DeKam et al. postulated that initial steps for advisors, when the student begins the program, include asking students what they intend to accomplish during the doctoral program and in what area(s) they foresee needing the most support ( 15 ). The authors further suggest that advisors schedule purposeful meetings to foster a relationship of open communication and trust, as well as using this time to provide explicit expectations ( 15 ). In a study by Grady et al. of graduate students from library and information science, public affairs, higher education, and a variety of doctoral programs in the humanities and social sciences, the authors identified additional goals for regular meetings, including: 1) planning a timeline for degree completion and 2) identifying possible funding available throughout their coursework ( 14 ). Some evidence offered advice on how to foster an improved relationship between the advisor and the PhD student but did not offer data indicating whether or not the positive relationship impacted success.

While the evidence demonstrates that a healthy mentoring relationship is beneficial for the PhD student, there is conflicting evidence that this relationship has a direct impact on attrition. Golde et al. performed a qualitative analysis of 58 individuals from the humanities (English and history) and the sciences (biology and geology) who did not complete a doctoral program at a major American research institution to determine reasons for attrition ( 11 ). Major themes that emerged included students feeling they were a mismatch and feeling isolated ( 11 ). One of the areas of mismatch was the student-advisor relationship, which was cited as a reason for attrition ( 11 ). This is in contrast to a qualitative study by Devos et al., who interviewed 21 former PhD students in Belgium (8 who completed and 13 who did not) from science and technology, social sciences, and health sciences disciplines to explore the students’ experiences that led to completion or attrition of the degree ( 8 ). The results indicated that while supervisor support had a large impact, the quality of the relationship did not necessarily predict whether the student completed the degree ( 8 ).

In summary, the student-advisor relationship can have both positive and negative influences ( Table 1 ). Recommendations to foster a positive student-advisor relationship include establishing mutual trust and clear communication early in the program including setting expectations, goals, and deadlines. The advisor should be supportive but also provide opportunities for development and encourage independence. The student should be proactive in the process of developing and maintaining a collaborative relationship rather than relying solely on the advisor to perform these tasks. Finally, administrators can also assist by providing an emphasis on advising tasks.

Mentorship

Mentorship plays a significant role in developing PhD students into professionals ( 9 ). Therefore, the advisor can also serve as a mentor to help the transition from student to professional ( 14 ). A study by Golde and Dore contends that mentors are pivotal, not only for the PhD student’s education, but also for the development of the student’s desired career path ( 12 ). This includes exposing students to teaching, research, and service, but also helping students navigate professional subtleties, such as office politics ( 25 ).

An investigation of graduate student stress and strain found great value in mentors advising students as they transition into their new positions ( 14 ). This is essential for PhD students, who often have many added responsibilities and subsequent stressors beyond the pedagogical aspects of degree attainment. For example, graduate students are often required to take on novel tasks beyond their studies (e.g., research, teaching, and/or overseeing undergraduates) without the status, resources, or experience of a professional ( 14 ). Added responsibilities without support can lead to role conflict and overload, possibly affecting mental health and student success ( 14 ). A study of the mental health of 146 graduate students in Brazil, who had been seen at a university mental health clinic, found that depression and anxiety were the main diagnoses reported (44%) and caused 4.5% of the students to be suspended from their programs ( 22 ). As mental health disorders are present in the graduate student population, advisors should be aware of this and may direct students to mental health resources.

It is also important to consider the advisor’s professional background and experience. A study by Carpenter et al. surveyed 21 doctoral faculty members of varying academic ranks in the field of communication, from 14 representative universities, and revealed four main areas of support mentors provide: career, psychosocial, research, and intellectual ( 4 ). Of particular interest were the factors contributing to how this advisement was delivered ( 4 ). For example, lower-ranking faculty provided mentorship that was more psychosocially based ( 4 ). The authors speculated that newer faculty tend to relate more easily to students because they are not as far removed from their own graduate school experience ( 4 ). On the other hand, the authors found that higher-ranking professors tend to provide more career and intellectual mentorship than their lower-ranking colleagues ( 4 ). However, tenured professors were less likely to collaborate on research compared to assistant professors ( 4 ). The authors of the study speculate that assistant professors are more inclined to collaborate with graduate students on research projects because they are working towards tenure and promotion ( 4 ). Effective mentorship of the PhD student provides an avenue for developing professional behaviors and understanding professional roles. This supportive environment may contribute to successful completion of a degree ( 4 ). Quality advising indicators of “number of doctoral advisees, faculty with at least one doctoral advisee, doctoral advisees who graduate, faculty with at least one doctoral advisee graduated, graduates who found employment within the field” were once used by the National Academy of Kinesiology in the five-year reports to rank and evaluate doctoral programs in kinesiology ( 33 ). Specific data related to these indicators for each school was not published, however. Additionally, in the latest report in 2015, the faculty indicators of total number of advisees and number of advisees who graduated were removed, and employment was moved to a student indicator ( 33 ). The removal of these indicators, as well as the lack of specific data other than program rank, makes it difficult to gauge the quality of mentorship as it relates to successful completion of a degree in kinesiology PhD programs.

Mentorship can also have both positive and negative influences on PhD student success ( Table 1 ). Recommendations for effective mentorship include providing students with exposure to and guidance in research, teaching, service, and office politics. Additionally, the mentor should model professional behaviors and provide advice on mental health resources if needed.

Dissertation Process

The dissertation process may impact a PhD student’s success in completing the degree. Ali and Kohun divide the PhD program into four stages: Stage I – Preadmission to Enrollment, Stage II – First Year through Candidacy, Stage III – Second Year to Candidacy, and Stage IV – Dissertation Stage ( 1 ). Throughout these stages, the student must build a committee and find a chair, formulate a research proposal, manage scheduling and time deadlines, and complete the dissertation. This process is often performed in relative isolation, which can impact completion ( 1 ). A researcher interviewed 58 individuals from 4 departments in one university in the fields of history, biology, geology, and English who did not complete a PhD program and found isolation to be a major theme in the reasons for attrition ( 11 ). Alternative dissertation models such as the cohort model with a lock-step process ( 11 ), the companion dissertation ( 21 , 23 ), and the supervision across disciplines model ( 5 ) have been proposed to mitigate the feelings of isolation.

Building a committee and finding a chair can complicate the dissertation process ( 15 , 27 ). Difficulties can arise from not knowing the pertinent questions to ask or not understanding one’s options when selecting a chair and committee members. Spaulding and Rockinson-Szapkiw advise students to carefully select a chair and committee who work well together and with the student ( 27 ). Beatty found that lack of effective communication with the committee and chair can also be a concern ( 3 ). This ineffective communication can leave the supervisor unaware of the amount and type of feedback that the student needs or lead to ambiguity about authorship and writing responsibilities ( 3 ). Another challenge noted by Beatty and Harding-DeKam et al. is selecting a topic that is unique, interesting, and relevant ( 3 , 15 ). Beatty further reports that PhD students should consider the focus of the topic area, whether the research is feasible and congruent with the committee chair’s expertise, and whether the methodology is appropriate ( 3 ). It has also been suggested that students begin considering dissertation topics at the start of the program to narrow the focus of their research early ( 3 ). This may benefit students if assignments throughout the program can serve as preliminary work for the final dissertation ( 3 , 15 ). Lastly, time management skills may impact dissertation completion. The PhD student must be responsible and willing to take on tasks and to complete them in a timely manner ( 17 ). It has been proposed that PhD students set deadlines and work continuously, avoiding extended breaks ( 2 , 13 ). As time is a critical factor, scheduling time for research and writing may keep the student focused ( 2 , 12 , 13 ). Harding-DeKam et al. suggest that PhD advisors utilize structured meetings in which what the student knows is analyzed against what the student needs to learn ( 15 ). The student is then given individualized and explicit expectations and deadlines to complete assignments depending on the stage of the process that he or she is in ( 15 ).

The dissertation process offers the PhD student an opportunity to develop critical thinking skills as well as positive attributes and behaviors needed as a professional. This challenging period of growth from student to professional may present barriers that need to be overcome to be successful. Unfortunately, however, some students are unable to overcome these barriers. Completing the dissertation can be a major hurdle in PhD student success and influence attrition ( 23 ). These barriers were also noted in studies of doctoral degrees in the field of education: once the student is no longer in the classroom, there is a loss of support from peers and instructors, though this stage also gives an opportunity to develop independence ( 15 , 27 ). This loss of structure can lead to apprehension and feelings of isolation, with the dissertation often cited as the most isolating portion of doctoral training ( 2 , 12 , 13 , 21 ). In addition, the lack of structure as an all but dissertation (ABD) PhD student may lead to feelings of isolation and a loss of focus, resulting in the student never completing his or her dissertation. This is congruent with a study by Gardner, who interviewed 60 PhD students and 34 faculty members to determine the reasons these stakeholders perceived for attrition ( 10 ). The results of this study indicated that faculty found “student lacking” (including a lack of focus and motivation) to be the most identified reason for attrition, at 53% ( 10 ). Both groups identified “personal problems” as reasons for PhD student attrition (15% of faculty and 34% of students) ( 10 ). Ali and Kohun found social isolation to be a major factor in attrition from doctoral programs and developed a four-stage framework to combat this ( 1 , 2 ). Highlights of the proposed framework included a structured orientation, formal social events, a structured advisor selection process, collaboration, and face-to-face communication ( 2 ). Kinesiology students also need structure and support. A qualitative study examining the socialization experiences of kinesiology PhD students found that they needed both social and resource support to be successful, with difficulty noted most during times of transition, such as from the coursework phase to the dissertation phase ( 24 ).

Multiple alternative models for the dissertation process have been suggested. One alternative model is the cohort approach with a lock-step program. A study by Ali and Kohun described a PhD program in Information Systems and Communications at Robert Morris University (RMU) that has a higher graduation rate (90%) and a shorter time to completion (3 years) than the national average ( 1 ). The RMU program utilizes a three-year lock-step program in which a strict schedule of community dinners, debriefings, presentations of proposals to students and faculty, and individual meetings with each member of the student’s committee is required to keep the PhD student on track ( 1 ). Additionally, the PhD students presented their progress to others in their cohort and elicited feedback throughout the process from development to completion, allowing them to find issues and make modifications more quickly ( 1 ). This method was also noted to decrease these PhD students’ feelings of isolation ( 1 ). The companion dissertation is another alternative model, described in the education ( 21 ) and nursing ( 23 ) fields, which may decrease feelings of isolation. In a companion dissertation, two PhD students work together on the same project ( 23 ). Essential components are sharing a dissertation chair, a common research agenda, and collaborative completion of the research and writing ( 21 , 23 ). While Robinson and Tagher found that this approach improved interactions between PhD students and, subsequently, degree completion ( 23 ), limited evidence on the number of schools utilizing this method was found. Limitations were also noted with the companion dissertation, including co-writing taking longer, the dissertation being seen as less rigorous, and tension between students in meeting all deadlines ( 23 ). Thus, this dissertation approach may not be feasible in the field of kinesiology without further evidence of success. Additionally, Carter-Veale et al. proposed another alternative dissertation model that utilizes faculty mentors from multiple departments to give additional support and collaboration ( 5 ). However, limited information is available on the effectiveness of this proposed model or the number of schools utilizing this multi-department collaboration. Overall, the goal of these alternative models is to decrease feelings of isolation by improving connectivity, collaboration, and communication between students, their peers, and their advisors and mentors ( 5 , 23 ). While the dissertation process can impact a PhD student’s completion of a degree, effective communication with the dissertation committee, early and relevant topic selection, effective time management, and adoption of alternative models may positively impact this process, but more evidence is needed.

As with the other areas identified, the dissertation process has positive and negative consequences on completion ( Table 1 ). Recommendations to improve the dissertation process include choosing a topic at the start of the program and scheduling times for research and writing with set deadlines. As isolation and ambiguity in the process can impact completion, mentors should ensure the students understand the dissertation process early in the program, be available to consult, and encourage the student to ask questions. Likewise, the student should take a proactive approach to understanding the process and seek help when needed.

CONCLUSION

A review of the literature suggests repeated themes of potential factors that impact PhD student success in completing the program and degree: the student-advisor relationship, mentorship, and the dissertation process. As limited evidence is available regarding factors of success specific to kinesiology PhD students, this general information gives insight into potential factors that may impact kinesiology PhD student success as well.

The student-advisor relationship can positively influence PhD student success. Incorporating structured meetings, clear communication, and training for advisors may improve the student-advisor relationship and therefore impact student success. This information may be useful to advisors so that they can help students better understand and navigate the program, as well as assist students in setting goals for meeting dissertation timeline deadlines.

Mentorship may also have a potential impact on PhD student success. Having a mentor who provides critical and timely information offers support to PhD students as they face the challenges listed in this review. Additionally, a mentor provides an opportunity for modeling and instruction on the professional behaviors needed by the PhD student. A student could also find a mentor outside the department, as in the Dissertation House Model, in which PhD students utilize multiple mentors across many disciplines to help supervise and assist in a cohort model ( 5 ).

The dissertation process should not be overlooked as an impactful experience on PhD student success. Evidence suggests that selecting a chair and committee, developing a topic, and managing the process and deadlines can impact success. Choosing a dissertation chair and committee was found to be a critical aspect of student success. To navigate this process, students are encouraged to proactively ask questions to understand the dissertation process, seek help from a mentor inside or outside of their department, research the chair’s and committee members’ areas of research to see if they are congruent with their interests, foster a positive relationship by being proactive, and schedule time for writing and research. It has been suggested that the selection of a dissertation topic should begin early in the doctoral process. However, students should spend time reflecting prior to selecting a topic to ensure that it is interesting to them and that it will be relevant to their profession. As PhD students may feel isolated in the dissertation process, alternative models such as collaboration or the companion dissertation were reviewed; however, little evidence is available on the widespread use or success of these models.

PhD student success in completing a degree and program is multifactorial. More evidence is needed regarding PhD student success for those enrolled in kinesiology programs. This could include a comprehensive survey of PhD students enrolled in kinesiology programs, and of those who completed the degree, to determine the factors that these stakeholders attribute to successful completion. Additionally, a rise in undergraduate majors in kinesiology programs increases the need for qualified PhD-trained faculty, as these majors are often selected by students entering physical therapy and other professional graduate programs ( 29 ). Therefore, future studies may also look at the type and quality of mentorship of PhD students for careers in higher education. Because there is limited information regarding degree completion among kinesiology PhD students, more research is needed with the aim of improving retention and completion.

Preparing for your dissertation - the literature review

 Critical thinking - what it is and why it matters

What does it mean to be a critical student? This part of the guide will introduce you to the key aspects of critical thinking:

  • the main components of an argument
  • what makes an argument succeed or fail
  • identifying supporting evidence
  • recognising the most reliable research


What is Critical Thinking?

View the following introduction to critical thinking, which comes from the University of Leicester.

Support materials

  • Thinking critically about your results
  • Critically evaluating a journal article worksheet
  • Critical thinking checklist
  • Critical thinking exercise - Teesside University (2008)
  • Critical thinking study guide - University of Plymouth (2010)


  • Last Updated: Mar 27, 2024 1:08 PM
  • URL: https://libguides.tees.ac.uk/dissertation
  • Open access
  • Published: 16 August 2023

Training doctoral students in critical thinking and experimental design using problem-based learning

  • Michael D. Schaller,
  • Marieta Gencheva,
  • Michael R. Gunther &
  • Scott A. Weed

BMC Medical Education, volume 23, Article number: 579 (2023)


Traditionally, doctoral student education in the biomedical sciences relies on didactic coursework to build a foundation of scientific knowledge and an apprenticeship model of training in the laboratory of an established investigator. Recent recommendations for revision of graduate training include the utilization of graduate student competencies to assess progress and the introduction of novel curricula focused on development of skills, rather than accumulation of facts. Evidence demonstrates that active learning approaches are effective. Several facets of active learning are components of problem-based learning (PBL), which is a teaching modality where student learning is self-directed toward solving problems in a relevant context. These concepts were combined and incorporated in creating a new introductory graduate course designed to develop scientific skills (student competencies) in matriculating doctoral students using a PBL format.

Course effectiveness was evaluated using the principles of the Kirkpatrick Four-Level Model of Evaluation. At the end of each course offering, students completed evaluation surveys on the course and instructors to assess their perceptions of training effectiveness. Pre- and post-tests assessing students’ proficiency in experimental design were used to measure student learning.

The analysis of the outcomes of the course suggests the training is effective in improving experimental design. The course was well received by the students as measured by student evaluations (Kirkpatrick Model Level 1). Improved scores on post-tests indicate that the students learned from the experience (Kirkpatrick Model Level 2). A template is provided for the implementation of similar courses at other institutions.

Conclusions

This problem-based learning course appears effective in training newly matriculated graduate students in the required skills for designing experiments to test specific hypotheses, enhancing student preparation prior to initiation of their dissertation research.

Peer Review reports

Introduction

For over a decade there have been calls to reform biomedical graduate education. Two main problems led to these recommendations, and therefore two different prescriptions to solve them. The first major issue is the pursuit of non-traditional (non-academic) careers by doctorates and concerns about the adequacy of their training [ 1 , 2 ]. The underlying factors affecting career outcomes are the number of PhDs produced relative to the number of available academic positions [ 1 , 3 , 4 , 5 ] and the changing career interests of doctoral students [ 6 , 7 , 8 , 9 ]. One aspect of the proposed reform to address this problem is incorporating broader professional skills training into the graduate curriculum and creating awareness of a greater diversity of careers [ 1 , 4 , 5 ]. The second issue relates to curricular content and whether content knowledge or critical scientific skills should be the core of the curriculum [ 10 , 11 ]. The proposed reform to address this issue is the creation of curricula focusing on scientific skills, e.g. reasoning, experimental design and communication, while simultaneously reducing components of the curricula that build a foundational knowledge base [ 12 , 13 ]. Components of these two approaches are not mutually exclusive, and incorporating select elements of each has the potential to address both issues concurrently. Here we describe the development, implementation and evaluation of a new problem-based learning (PBL) graduate course that introduces the career-relevant core competencies of critical thinking and experimental design to incoming biomedical doctoral students. The purpose of this course is to address these issues by creating a vehicle to develop professional skills (communication) and critical scientific skills (critical thinking and experimental design) for first-year graduate students.

One approach that prioritizes the aggregate scientific skill set required for adept biomedical doctorates is the development of core competencies for doctoral students [ 5 , 14 , 15 ], akin to the set milestones that must be met by medical residents and fellows [ 16 ]. Key features of these competencies include general and field-specific scientific knowledge, critical thinking, experimental design, evaluation of outcomes, scientific rigor, ability to work in teams, responsible conduct of research, and effective communication [ 5 , 14 , 15 ]. Such competencies provide clear benchmarks to evaluate doctoral students’ progress toward becoming independent scientific professionals and their preparedness for the next career stage. Historically, graduate programs relied on traditional content-based courses and supervised apprenticeship in the mentor’s laboratory to develop such competencies. An alternative is to modify the graduate curriculum to provide a foundation for these competencies early and in a more structured way. This would provide a base upon which additional coursework and supervised dissertation research could build to develop competencies in doctoral students.

Analyses of how doctoral students learn scientific skills suggest a threshold model, where different skillsets are mastered (a threshold reached), before subsequent skillsets can be mastered [ 17 , 18 ]. Skills like using the primary literature, experimental design and placing studies in context are earlier thresholds than identifying alternatives, limitations and data analysis [ 18 ]. Timmerman et al. recommend revision of graduate curricula to sequentially build toward these thresholds using evidence-based approaches [ 18 ]. Several recent curricular modifications are aligned with these recommendations. One program, as cited above, offers courses to develop critical scientific skills early in the curriculum with content knowledge provided in later courses [ 12 , 13 ]. A second program has built training in experimental design into the coursework in the first semester of the curriculum. Improvements in students’ experimental design skills and an increase in self-efficacy in experimental design occurred over the course of the semester [ 19 ]. Other programs have introduced exercises into courses and workshops to develop experimental design skills using active learning. One program developed interactive sessions on experimental design, where students give chalk talks about an experimental plan to address a problem related to course content and respond to challenges from their peers [ 20 ]. Another program has developed a workshop drawing upon principles from design thinking to build problem solving skills and creativity, and primarily uses active learning and experiential learning approaches [ 21 ]. While these programs are well received by students, the outcomes of training have not been reported. Similar undergraduate curricula that utilize literature review with an emphasis on scientific thought and methods report increased performance in critical thinking, scientific reasoning and experimental design [ 22 , 23 ].

It is notable that the changes these examples incorporate into the curriculum are accompanied by a shift from didactic teaching to active learning. Many studies have demonstrated that active learning is more effective than a conventional didactic curriculum in STEM education [ 24 ]. Problem-based learning (PBL) is one active learning platform that the relatively new graduate program at the Van Andel Institute Graduate School utilizes for delivery of the formal curriculum [ 25 ]. First developed for medical students [ 26 ], the PBL approach has been adopted in other educational settings, including K-12 and undergraduate education [ 27 , 28 ]. A basic tenet of PBL is that student learning is self-directed [ 26 ]. Students are tasked to solve an assigned problem and are required to find the information necessary for the solution (self-directed). In practice, learning occurs in small groups where a faculty facilitator helps guide the students in identifying gaps in knowledge that require additional study [ 29 ]. As such, an ideal PBL course is “well organized” but “poorly structured”. The lack of a traditional restrictive structure allows students to pursue and evaluate different solutions to the problem.

The premise for PBL is that actively engaging in problem solving enhances learning in several ways [ 29 , 30 ]. First, activation of prior knowledge, as occurs in group discussions, aids in learning by providing a framework to incorporate new knowledge. Second, deep processing of material while learning, e.g. by answering questions or using the knowledge, enhances the ability to later recall key concepts. Third, learning in context, e.g. learning the scientific basis for clinical problems in the context of clinical cases, enables and improves recall. These are all effective strategies to enhance learning [ 31 ]. PBL opponents argue that acquisition of knowledge is more effective in a traditional didactic curriculum. Further, development of critical thinking skills requires the requisite foundational knowledge to develop realistic solutions to problems [ 32 ].

A comprehensive review of PBL outcomes from K-12 through medical school indicated that PBL students perform better in the application of knowledge and reasoning, but not in other areas like basic knowledge [ 33 ]. Other recent meta-analyses support the conclusion that PBL, project-based learning and other small group teaching modalities are effective in education from primary school to university, including undergraduate courses in engineering and technology, and pharmacology courses for professional students in health sciences [ 34 , 35 , 36 , 37 , 38 , 39 ]. While the majority of the studies reported in these meta-analyses demonstrate that PBL results in better academic performance, there are contrasting studies that demonstrate that PBL is ineffective. This prompts additional investigation to determine the salient factors that distinguish the two outcomes to establish best practices for better results using the PBL platform. Although few studies report the outcomes of PBL based approaches in graduate education, this platform may be beneficial in training biomedical science doctoral students for developing and enhancing critical thinking and practical problem-solving skills.

At our institution, biomedical doctoral students enter an umbrella program and take a core curriculum in the first semester prior to matriculating into one of seven biomedical sciences doctoral programs across a wide range of scientific disciplines in the second semester. Such program diversity created difficulty in achieving consensus on the necessary scientific foundational knowledge for a core curriculum. Common ground was achieved during a recent curriculum revision through the development of required core competencies for all students, regardless of field of study. These competencies and milestones for biomedical science students at other institutions [ 5 , 14 , 15 ], along with nontraditional approaches to graduate education [ 12 , 25 ], were used as guidelines for curriculum modification.

Course design

A course was created to develop competencies required by all biomedical sciences doctoral students regardless of their program of interest [ 14 ]. As an introductory graduate level course, this met the needs of all our seven diverse biomedical sciences doctoral programs where our first-year doctoral students matriculate. A PBL platform was chosen for the course to engage the students in an active learning environment [ 25 ]. The process of problem solving in small teams provided the students with experience in establishing working relationships and how to operate in teams. The students gained experience in researching material from the literature to establish scientific background, find current and appropriate experimental approaches and examples of how results are analyzed. This small group approach allowed each team to develop different hypotheses, experimental plans and analyses based upon the overall interests of the group. The course was designed following discussions with faculty experienced in medical and pharmacy school PBL, and considering course design principles from the literature [ 27 , 40 ]. The broad learning goals are similar to the overall objectives in another doctoral program using PBL as the primary course format [ 25 ], and are aligned with recommended core competencies for PhD scientists [ 14 ]. These goals are to:

Develop broad, general scientific knowledge (core competency 1 [ 14 ]).

Develop familiarity with technical approaches specific to each problem.

Practice critical thinking/experimental design incorporating rigor and reproducibility, including: formulation of hypotheses, detailed experimental design, interpretation of data, statistical analysis (core competencies 3 and 4 [ 14 ]).

Practice communication skills: written and verbal communication skills (core competency 8 [ 14 ]).

Develop collaboration and team skills (core competency 6 [ 14 ]).

Practice using the literature.

Students were organized into groups of four or five based on their scientific background. Student expertise in each group was deliberately mixed to provide different viewpoints during discussion. A single faculty facilitator was assigned to each student group, which met formally in 13 separate sessions (Appendix II). In preparation for each session, the students independently researched topics using the literature (related to goal 6) and met informally without facilitator oversight to coordinate their findings and organize the discussion for each class session. During the formal one-hour session, one student served as the group leader to manage the discussion. The faculty facilitator guided the discussion to ensure coverage of necessary topics and helped the students identify learning issues, i.e. areas that required additional development, for the students to research and address for the subsequent session. At the end of each session, teams previewed the leading questions for the following class and organized their approach to address these questions prior to the next session. The whole process provided experiences related to goal 5.

As the course was developed during the COVID-19 pandemic, topics related to SARS-CoV2 and COVID-19 were selected as currently relevant problems in society. Session 1 prepared the students to work in teams by discussing how to work in teams and manage conflict (related to goal 5). In session 2, the students met in their assigned groups to get to know each other, discuss problem-based learning and establish ground rules for the group. Sessions 3 and 4 established the course background by focusing on the SARS-CoV2 virus and COVID-19-associated pathologies (related to goal 1). The subsequent nine sessions were organized into three separate but interrelated three-session blocks: one on COVID-19 and blood clotting, one on COVID-19 and loss of taste, and one on SARS-CoV2 and therapeutics. The first session in each block was devoted to covering background information (blood clotting, neurosensation and drug application) (related to goal 1). The second session of each block discussed hypothesis development (mechanisms that SARS-CoV2 infection might utilize to alter blood clotting, the sense of taste, and identification of therapeutic targets to attenuate SARS-CoV2 infection) (related to goal 3). In the second sessions the students also began to design experiments to test the hypothesis. The final session of each block fleshed out the details of the experimental design (related to goals 2 and 3).

The process was iterative, where the students had three opportunities to discuss hypothesis development, experimental design and analysis during sessions with their facilitators. Written and oral presentation assignments (Appendix V) provided additional opportunities to articulate a hypothesis, describe experimental approaches to test the hypotheses, propose analysis of experimental results and develop communication skills (related to goal 4).

Rigor and reproducibility were incorporated into the course. This was an important component given the emphasis recently placed on rigor and reproducibility by federal agencies. As the students built the experimental design to address the hypothesis, recurring questions were posed to encourage them to consider rigor. Examples include: “ Are the methods and experimental approaches rigorous? How could they be made more rigorous? ” “ Discuss how your controls validate the outcome of the experiment. What additional controls could increase confidence in your result? ” The facilitators were instructed to direct discussion to topics related to the rigor of the experimental design. The students were asked about numbers of replicates, number of animals, additional methods that could be applied to support the experiment, and other measurements to address the hypothesis in a complementary fashion. In the second iteration of the course, we introduced an exercise on rigor and reproducibility for the students using the NIH Rigor and Reproducibility Training Modules (see Appendix III). In this exercise, the students read a short introduction to rigor and reproducibility and viewed a number of short video modules to introduce lessons on rigor. The students were also provided the link to the National Institute of General Medical Sciences clearinghouse of training modules on rigor and reproducibility as a reference for experimental design in their future work (see Appendix III).

The first delivery of the course was during the COVID-19 pandemic and sessions were conducted on the Zoom platform. The thirteen PBL sessions, and two additional sessions dedicated to oral presentations, were spaced over the course of the first semester of the biomedical sciences doctoral curriculum. The second iteration of the course followed the restructuring of the graduate first year curriculum and the thirteen PBL sessions, plus one additional session devoted to oral presentations, were held during the first three and a half weeks of the first-year curriculum. During this period in the semester, this was the only course commitment for the graduate students. Due to this compressed format, only one written assignment and a single oral presentation were assigned. As the small group format worked well via Zoom in the first iteration of the course, the small groups continued to meet using this virtual platform.

IRB Approval. The West Virginia University Institutional Review Board approved the study (WVU IRB Protocol#: 2008081739). Informed consent was provided by the participants in writing and all information was collected anonymously.

Surveys. Training effectiveness was evaluated in two ways, corresponding to the first two levels of the Kirkpatrick Model of Evaluation [ 41 ]. First, students completed a questionnaire upon completion of the course to capture their perceptions of training (Appendix VII). Students were asked their level of agreement/disagreement on a Likert scale with 10 statements about the course and 7 statements about their facilitator. Second, students took a pre- and post-test to measure differences in their ability to design experiments before and after training (Appendix VIII). The pre- and post-tests were identical, asking the students to design an experiment to test a specific hypothesis, include controls, plan analyses, and state possible results and interpretation. Five questions were provided for the pre- and post-test, where each question posed a hypothesis from a different biomedical discipline, e.g. cancer biology or neuroscience. Students were asked to choose one of the five questions to answer.

Peer-to-peer evaluations were collected to provide feedback on professionalism and teamwork. This survey utilized a Goldilocks scale ranging from 1 to 7, with 4 being the desired score. An example peer question asked about accountability, where responses included not accountable, e.g. always late (score = 1), accountable, e.g. punctual, well prepared, follows up (score = 4) and controlling, e.g. finds fault in others (score = 7). Each student provided a peer-to-peer evaluation for each student in their group. (see Appendix VII). In the second course iteration, Goldilocks surveys were collected three times over the three-week course period due to the compressed time frame. This was necessary to provide rapid feedback to the students about their performance during the course in order to provide opportunities to address and rectify any deficits before making final performance assessments.
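The Goldilocks scoring described above can be sketched in a few lines. The student names and score values below are hypothetical; the only details taken from the text are the 1-7 scale and the desired score of 4:

```python
# Hypothetical peer-to-peer evaluations on the 1-7 Goldilocks scale,
# where 4 is the desired score. Each list holds the ratings one student
# received from the other members of their group.
peer_scores = {
    "student_a": [4, 4, 5, 4],
    "student_b": [2, 3, 3, 2],
    "student_c": [6, 5, 6, 7],
}

# Average each student's received ratings and measure how far the
# average lies from the ideal score of 4 (larger = less ideal behavior).
for student, scores in peer_scores.items():
    avg = sum(scores) / len(scores)
    deviation = abs(avg - 4)
    print(f"{student}: average = {avg:.2f}, distance from ideal = {deviation:.2f}")
```

Plotting these per-student averages, as in Fig. 3, makes the spread around the desired score of 4 immediately visible.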

Evaluating Pre- and Post-Tests. All pre- and post-test answers were evaluated by three graders (three of the authors) in a blind fashion, where the graders were unaware whether an answer came from a pre- or post-test. Prior to grading, each grader drafted an individual answer key based upon the questions on the tests. The graders then met to compare and deliberate these preliminary keys, incorporating changes and edits to produce a single combined key for rating answers. While the students were asked to answer one question, some students chose to answer several questions. These superfluous answers were used as a training dataset for the graders, who independently scored each answer, then met to review the results and discuss modification of the grading key. The final grading key, with a maximum score of 16, was then used by the graders to independently evaluate the complete experimental dataset consisting of all pre- and post-test answers (Appendix IX). The scores of the three raters were averaged for each answer.
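A minimal sketch of this blinding-and-averaging workflow, with entirely hypothetical answer IDs and grades; the only details taken from the text are the three independent graders and the 16-point maximum score:

```python
import random
import statistics

# Hypothetical pool of answers; the "test" labels are hidden from graders.
answers = [
    {"id": "a1", "test": "pre"}, {"id": "a2", "test": "post"},
    {"id": "a3", "test": "pre"}, {"id": "a4", "test": "post"},
]

# Blind the graders: shuffle answer IDs so pre/post origin is unknown.
blinded_order = [a["id"] for a in answers]
random.shuffle(blinded_order)

# Three graders score each answer independently against the combined key
# (maximum 16 points); an answer's final score is the mean of the three.
grades = {
    "a1": [8, 9, 10], "a2": [12, 13, 12],
    "a3": [7, 8, 9],  "a4": [14, 15, 13],
}
final_scores = {aid: statistics.mean(g) for aid, g in grades.items()}
print(final_scores)
```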

Statistical analysis. To measure the interrater reliability of the graders, the intraclass correlation coefficient (ICC) was calculated using a two-way mixed effects model to evaluate consistency between multiple raters/measurements. The ICC for grading the training dataset was 0.82, indicating good inter-rater agreement; the ICC for grading the experimental dataset was also 0.82. For comparison of pre-test vs. post-test performance, the scores of the three raters were averaged for each answer. Since answers were anonymous, the analyses compared responses between individuals. Most, but not all, scores exhibited a Gaussian distribution; therefore a nonparametric statistic, a one-tailed Mann-Whitney U test, was used for comparison. The pre-test and post-test scores for 2020 and 2021 could not be compared due to the different format used for the course in each year.
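Both statistics are straightforward to compute. The sketch below, using made-up score vectors, implements ICC(3,1) (two-way mixed effects, consistency, single rater) from its ANOVA mean squares and runs the one-tailed Mann-Whitney U test via SciPy:

```python
import numpy as np
from scipy.stats import mannwhitneyu

def icc_consistency(ratings):
    """ICC(3,1): two-way mixed effects, consistency, single rater.
    `ratings` is an (n_answers, k_raters) array of scores."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    # Two-way ANOVA sums of squares (no interaction term).
    ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()  # between answers
    ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()  # between raters
    ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Made-up example: 10 answers, 3 raters who agree up to a constant offset
# (consistency ICC ignores such offsets, so these raters are perfectly
# consistent and the ICC is exactly 1.0).
base = np.arange(1.0, 11.0)
ratings = np.stack([base, base + 0.5, base + 1.0], axis=1)
print(round(icc_consistency(ratings), 3))  # -> 1.0

# One-tailed Mann-Whitney U: is the post-test sample stochastically higher?
pre = [5, 6, 7, 5, 6, 4, 5, 6]
post = [9, 10, 11, 9, 10, 8, 9, 10]
stat, p = mannwhitneyu(post, pre, alternative="greater")
```

The consistency form is the appropriate choice here because each rater used their own internal calibration of the shared key; absolute-agreement ICC would additionally penalize systematic rater offsets.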

Thirty students participated in the course in the first offering, while 27 students were enrolled in the second year. The students took pre- and post-tests to measure their ability to design an experiment before and after training (Appendix VIII). As the course progressed, students were surveyed on their views of the professionalism of other students in their group (Appendix VII). At the end of the course, students were asked to respond to surveys evaluating the course and their facilitator (see Appendix VII).

Student reception of the course (Kirkpatrick Level 1) . In the first year, 23 students responded to the course evaluation (77% response rate) and 26 students submitted facilitator evaluations (87% response rate), whereas in the second year there were 25 responses to the course evaluation (93% response rate) and 26 for facilitators (96% response rate). Likert scores for the 2020 and 2021 course evaluations are presented in Fig. 1. The median score for each question was 4 on a scale of 5 in 2020. In 2021, the median scores for the questions about active learning and hypothesis testing were 5 and the median score for the other questions was 4. The students appreciated the efforts of the facilitators in the course, based upon their evaluations of the facilitators. The median score for every facilitator across all survey questions is shown in Fig. 2. The median score for a single question in 2020 and 2021 was 4.5 and the median score for all other questions was 5. The results of the peer-to-peer evaluations are illustrated in Fig. 3. The average score for each student was plotted, with scores further from the desired score of 4 indicating perceived behaviors that were not ideal. The wide range of scores in the 2020 survey was notable. The students completed three peer-to-peer surveys during the 2021 course. The range of scores in the 2021 peer-to-peer evaluation was narrower than the range in the 2020 survey. The range of scores was expected to narrow from the first (initial) to third (final) survey as students learned and implemented improvements in their professional conduct based upon peer feedback; however, the narrow range of scores in the initial survey left little room for improvement.

figure 1

Results of Course Evaluations by Students. Student evaluations of the course were collected at the end of each offering. The evaluation surveys are in Appendix VII. Violin plots showing the distribution and median score for each question in the 2020 survey (A) and the 2021 survey (B) are shown. The survey used a Likert scale (1 – low to 5 – high)

figure 2

Results of Facilitator Evaluations by Students. Student evaluations of the facilitators were collected at the end of each offering of the course. The evaluation surveys are in Appendix VII. Violin plots showing the distribution and median score for each question in the 2020 survey (A) and the 2021 survey (B) are shown. The survey used a Likert scale (1 – low to 5 – high)

figure 3

Results of Student Peer-to-Peer Evaluations. Student peer-to-peer evaluations were collected at the end of the course in year 1 (A) , and at the beginning (B) , the middle (C) and the end (D) of the course in year 2. Each student evaluated the professionalism of each other student in their group using the evaluation survey shown in Appendix VII. The average score for each student is plotted as a data point. The survey used a Goldilocks scale (range of 1 to 7) where the desired professional behavior is reflected by a score of 4

Student learning (Kirkpatrick Level 2). Twenty-six students completed the pre-test in each year and consented to participate in this study (87% response in the first year and 96% response in the second year). Eighteen students completed the post-test at the end of the first year (60%) and 26 students completed the test at the end of the second year (96%). Question selection (excluding students who misunderstood the assignment and answered all questions) is shown in Table  1 . The most frequently selected questions were Question 1 (45 times) and Question 2 (23 times). Interestingly, the results in Table  1 indicate that students did not necessarily choose the same question to answer on the pre-test and post-test.

Average scores on pre-tests and post-tests were compared using a one-tailed Mann Whitney U test. Since the format of the course was different in the two iterations, comparison of test results between the two years could not be made. The average scores of the pre- and post-test in 2020 were not statistically different (p = 0.0673), although the post-test scores trended higher. In contrast, the difference between the pre- and post-test in 2021 did reach statistical significance (p = 0.0329). The results collectively indicate an overall improvement in student ability in experimental design (Fig.  4 ).

figure 4

Pre- and Post-Test Scores. At the beginning and end of each offering, the students completed a test to measure their ability to design an experiment (see Appendix VIII for the details of the exam). Three faculty graded every answer to the pre- and post-test using a common grading rubric (see Appendix IX). The maximum possible score was 16. The average score for each individual answer on the pre-test and post-test is represented as a single data point. The bar indicates the mean score across all answers +/- SD. The average scores of the pre- and post-test scores were compared using a one-tailed Mann Whitney U test. For the 2020 data (A) , p = 0.0673, and for the 2021 data (B) , p = 0.0329

This course was created in response to biomedical workforce training reports recommending increased training in general professional skills and scientific skills, e.g. critical thinking and experimental design. The course utilizes a PBL format, which is not extensively utilized in graduate education, to incorporate active learning throughout the experience. It was well received by students and analysis suggests that major goals of the course were met. This provides a template for other administrators and educators seeking to modify curricula in response to calls to modify training programs for doctoral students.

Student evaluations indicated the course was effective at motivating active learning and that students became more active learners. The evaluation survey questions were directly related to three specific course goals: (1) students reported developing skills in problem solving, hypothesis testing and experimental design; (2) the course helped develop oral presentation skills and written communication skills (in one iteration of the course); and (3) students developed collaboration and team skills. Thus, from the students’ perspective, these three course goals were met. Student perceptions of peer professionalism were measured using peer-to-peer surveys. The wide range of Goldilocks scores in the first student cohort was unexpected. In the second student cohort, changes in professional behavior were measured over time and the score ranges were narrower. The reasons for the difference between cohorts are unclear. One possibility for this discrepancy is that the first iteration of the course extended over one semester and took place during the first full semester of the pandemic, impacting professional behavior and perceptions of professionalism. The second cohort completed a professionalism survey three times during the course. The narrow range of scores from this cohort in the initial survey made detection of improved professionalism over the course difficult. Results do indicate that professionalism improved in terms of respect and compassion between the first and last surveys. Finally, the results of the pre-test and post-test analysis demonstrated a trend of improved performance on the post-test relative to the pre-test for students in each year of the course and a statistically significant difference between the pre- and post-test scores in the second year.

Areas for improvement. The course was initially offered as a one-credit course. Student comments on course evaluations and facilitator comments in end-of-course debriefing sessions concurred that the workload exceeded that of a one-credit course. As a result, the year-two version was offered as a two-credit course to better align course credits with workload.

There were student misperceptions about the goals of the course in the first year. Some students equated experimental design with research methods and expressed disappointment that this was not a methods course. While learning appropriate methods is a goal of the course, the main emphasis is on developing hypotheses and designing experiments to test them; the choice of methods is thus driven by the hypotheses and experimental design. This misperception was addressed in the second year by clearly elaborating on the course goals in an introductory class session.

The original course offering contained limited statistical exercises to simulate experimental planning and data analysis; e.g., students were required to conduct a power analysis. Between the first and second years of the course, the entire first-semester biomedical sciences curriculum was overhauled with several new course offerings. The new curriculum contained an independent biostatistics workshop that students completed before the beginning of this course. Additional statistics exercises were incorporated into the PBL course to give students more experience in the analysis of experimental results. Student evaluations indicated that the introduction of these additional exercises was not effective. Improved coordination between the biostatistics workshop and the PBL course is required to align expectations and better equip students for the statistical analysis of experimental results encountered later in this course.
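The power-analysis exercise mentioned above can be illustrated with a minimal, standard-library-only sketch. The effect size, alpha, and power values below are illustrative assumptions, not the parameters used in the course, and the calculation uses the common normal approximation for a two-sample comparison:

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_group(effect_size: float, alpha: float = 0.05,
                          power: float = 0.80) -> int:
    """Approximate n per group for a two-sided two-sample t-test via the
    normal approximation: n = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value, two-sided
    z_beta = NormalDist().inv_cdf(power)           # quantile for desired power
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# A medium effect (Cohen's d = 0.5) at alpha = 0.05 and 80% power:
print(sample_size_per_group(0.5))  # → 63
```

In practice students would typically use a dedicated tool (e.g. G*Power or a statistics package), but the arithmetic behind the exercise is no more than this.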

Student surveys, facilitator discussions, and debrief sessions all indicated that improved coordination between the facilitators of the different groups is required to reduce intergroup variability. Due to class size, the students were divided into six groups, with each facilitator assigned to the same group for the duration of the course to maintain continuity. The facilitators met without the students throughout the course to discuss upcoming sessions and to share their experiences with their respective groups. This allowed the facilitators to compare approaches and discuss emerging or perceived concerns. In the second year, one facilitator rotated between groups during each session to observe how the different student groups functioned. Such a real-time faculty peer-evaluation process has the potential to reduce variability between groups but was challenging to implement within the short three-week time period. Comprehensive training, in which all facilitators become well versed in PBL strategies and adhere to an established set of guidelines or script for each session, is one mechanism that may reduce variability across facilitator-group pairings.

Limitations. The current study has a number of limitations. The sample size for each class was small, with 30 students enrolled in the first year of the course and 27 in the second. The response rates for the pre-tests were high (> 87%), but the response rate for the post-test varied between the first year (60%) and second year (96%) of the course. The higher response rate in the second year might be due to fewer end-of-semester surveys, since this was the only course the students took in that time period. Additionally, the post-test in the second year was conducted at a scheduled time, rather than on the students’ own time as in year one. Due to restructuring of the graduate curriculum and the pandemic, the two iterations of the course were formatted differently. This precluded pooling the data from the two offerings and makes comparison between the outcomes difficult.

Presentation of the course was similar, but not identical, for all of the students. Six PBL groups were required to accommodate the number of matriculating students in each year. Despite efforts to provide a consistent experience, there was variability in how the different facilitators ran their respective groups. Further, each session developed differently in each group, since discussion was driven by the students and their collective interests. These variables could be responsible for increasing the spread of scores on the post-tests and decreasing the value of the course for a subset of students.

The pre- and post-tests were conducted anonymously to encourage student participation. This prevented correlating pre- and post-test scores for individual students and comparing learning between different groups. The pre-test and post-test were identical, providing the students with five options to design experiments (with identical instructions) in response to a different biomedical science problem. An alternative approach could have used isomorphic questions for the pre- and post-tests. Some students clearly answered the same question on the pre- and post-test and may have benefited from answering the same question twice (albeit after taking the course). Other students clearly answered different questions on the pre- and post-test, and the outcomes might be skewed if the two questions challenged the student differently.
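Because anonymity precludes pairing individual scores, any pre/post comparison must treat the two score sets as independent samples. A minimal sketch of one such group-level comparison, a Cohen's d effect size with pooled standard deviation, is below; the score lists are invented illustrations, not study data:

```python
from math import sqrt
from statistics import mean, variance

def cohens_d(group_a: list[float], group_b: list[float]) -> float:
    """Standardized mean difference between two independent samples,
    using the pooled sample standard deviation in the denominator."""
    na, nb = len(group_a), len(group_b)
    pooled_var = ((na - 1) * variance(group_a) +
                  (nb - 1) * variance(group_b)) / (na + nb - 2)
    return (mean(group_b) - mean(group_a)) / sqrt(pooled_var)

# Hypothetical anonymous pre- and post-test scores (illustration only):
pre = [52, 60, 48, 55, 63, 50]
post = [61, 70, 58, 66, 72, 60]
print(round(cohens_d(pre, post), 2))  # → 1.7
```

An unpaired effect size like this is less sensitive than a paired analysis, which is precisely the trade-off the anonymous design imposes.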

While the course analysis captured the first two levels of the Kirkpatrick model of evaluation (reaction and learning), it did not attempt to measure the third level (behavior) or fourth level (results) [ 41 ]. Future studies are required to measure the third level. This could be achieved by asking students, after completing the course, to elaborate on the experimental design used in recent experiments in their dissertation laboratory, or by evaluating the experimental design students incorporate into their dissertation proposals. The fourth Kirkpatrick level could potentially be assessed by longitudinally surveying preceptors about their students’ abilities in experimental design at semiannual or annual committee meetings and in the accompanying written progress reports. The advantage of focusing on the first two Kirkpatrick levels of evaluation is that the measured outcomes can be confidently attributed to the course. Third- and fourth-level evaluations are more complicated, since they necessarily take place some time after completion of the course. Thus, third- and fourth-level outcomes can result from factors outside the course (e.g., other coursework, working in the lab, attendance at student research forums, meetings with mentors). Another limiting factor is the use of a single test to measure student learning. Additional approaches to measuring learning might better capture differences between the pre- and post-test scores.

Implementation. This curriculum is readily scalable and can be modified for graduate programs of any size, with the caveat that larger programs will require more facilitators. At Van Andel, the doctoral cohorts are three to five new students per year, all accommodated in one PBL group [ 25 ]. At our institution, we have scaled up to a moderately sized doctoral program with 25 to 30 matriculating students per year, dividing the students into six PBL groups of four to five students each. Medical school classes frequently exceed 100 students (our program has 115–120 new students each fall) and typically have between five and eight students per group. Our graduate course has groups at the lower end of this range. The course could be scaled up by increasing the number of students per group or by increasing the number of groups.

Consistency between groups is important so that each group of students has a similar experience and reaps the full benefit of the course. Regular meetings between the course coordinator and facilitators to discuss the content of upcoming sessions, and shared rubrics to guide student feedback and evaluation, were the mechanisms used to standardize across groups in this course (Appendix VI). In hindsight, the course would benefit from more rigorous facilitator training before the course begins. While a number of our facilitators were veterans of a medical school PBL course, the skill set required to effectively manage a graduate-level PBL course centered on developing critical thinking and experimental design is different. Such training requires an extensive time commitment by the course coordinators and participating facilitators.

The most difficult tasks in developing this course were the course conception and the development of the problem-based assignments. Designing a COVID-19-based PBL course in 2020 required de novo development of all course material. This entailed collecting and compiling information about the virus and the disease to provide quick reference for facilitators to guide discussion in their groups, all in the face of constantly shifting scientific and medical knowledge and the complete lack of traditional peer-based academic social engagement due to the pandemic. In developing this course, three different COVID-based problems were identified, and the general background material for each required extensive research and development. Background material on cell and animal models, general strategies for experimental manipulation, and methods to measure specific outcomes was collected in each case. Student copies for each session were designed to contain a series of questions as a guide to identifying important background concepts. Facilitator copies for each session were prepared with the goal of efficiently and effectively guiding each class meeting; these guides contained ideas for discussion points, areas for elaboration, and a truncated key of necessary information (Appendix IV). Several PBL repositories exist (e.g. https://itue.udel.edu/pbl/problems/ , https://www.nsta.org/case-studies ) and MedEdPORTAL ( https://www.mededportal.org/ ) publishes medical-specific cases. These provide valuable resources for case-based ideas, but few are specifically geared toward research-focused biomedical graduate students. As such, cases must be modified to be germane to first-year biomedical graduate students with a research-centered focus before implementation. Finally, appropriate support materials for surveys and evaluation rubrics require additional development and refinement of existing templates to permit improved evaluation of learning outcomes (Appendix VI).

Development of an effective PBL course takes considerable time and effort to conceive and construct. Successful implementation requires higher administrative support to identify and devote the faculty needed for course creation, the assignment of skilled faculty to serve as facilitators, and staff support to coordinate course logistics. It is critical that facilitators have a strong commitment to devote the time and energy necessary to prepare for and successfully facilitate a group of students. Strong institutional support is linked to facilitator satisfaction and commitment to PBL-based programs [ 42 ]. Institutional support can be demonstrated in multiple ways. The time commitment for course developers, coordinators, and facilitators should be accurately reflected in teaching assignments. Performance in these PBL roles should factor into decisions about support for professional development, e.g. travel awards, and merit-based pay increases. Further, efforts in developing, implementing, and executing a successful PBL course should be recognized as important activities in annual faculty evaluations by departmental chairs and by promotion and tenure committees.

Key Takeaways. The creation and implementation of this course was intellectually stimulating, and facilitators found their interactions with students gratifying. From student survey responses and test results, the course was at least modestly successful at achieving its goals. Based on our experience, important issues to consider when implementing such a curriculum include: (1) administrative support for developing the curriculum, (2) facilitator buy-in to the approach, (3) continuity (not uniformity) between PBL groups, (4) other components of the curriculum and how they might be leveraged to enhance the effectiveness of PBL, and (5) the effort required to develop and deliver the course, which must be recognized by the administration.

Future Directions. Novel curriculum development is an often overlooked but important component of contemporary graduate education in the biomedical sciences. It is critical that modifications to graduate education be evidence-based. We report the implementation of a novel PBL course for training in the scientific skills required to develop and test hypotheses, and demonstrate its effectiveness. Additional measures to assess the course goals of improving critical thinking, experimental design, and self-efficacy in experimental design will be implemented using validated tests [ 22 , 43 , 44 , 45 ]. Further studies are also required to determine the long-term impact of this training on student performance in the laboratory and progression toward the degree. It will be interesting to determine whether similar curriculum changes emphasizing skill development shorten the time to degree, a frequent recommendation for training the modern biomedical workforce [ 1 , 46 , 47 , 48 ].

Courses emphasizing skill development can be incorporated alongside traditional didactic instruction to build the knowledge base necessary for modern biomedical research. Our PBL course was stand-alone, requiring the students to research background material prior to hypothesis development and experimental design. Coordination between the two modalities would obviate the need for background research in the PBL component, reinforce through application the basic knowledge presented didactically, and prepare students for higher-order thinking about the concepts learned in the traditional classroom. Maintaining a balance between problem-based and traditional instruction may also be key to improving faculty engagement in such initiatives. Continued investment in the creation and improvement of innovative components of graduate curricula centered on developing the scientific skills of doctoral students can be intellectually stimulating for faculty and provide a better training environment for students. The effort may be rewarded by streamlining training and strengthening the biomedical workforce of the future.

Data Availability

All data generated in this study are included in this published article and its supplementary information files.

Abbreviations

  • PBL: Problem-based learning
  • STEM: Science, technology, engineering, and math
  • K-12: Kindergarten through grade 12
  • ICC: Intraclass correlation coefficient
  • SARS-CoV-2: Severe acute respiratory syndrome coronavirus 2
  • COVID-19: Coronavirus disease 19

National Institutes of Health. Biomedical research workforce working group report. Bethesda, MD: National Institutes of Health; 2012.


Sinche M, Layton RL, Brandt PD, O’Connell AB, Hall JD, Freeman AM, Harrell JR, Cook JG, Brennwald PJ. An evidence-based evaluation of transferrable skills and job satisfaction for science PhDs. PLoS ONE. 2017;12:e0185023.

Ghaffarzadegan N, Hawley J, Larson R, Xue Y. A note on PhD Population Growth in Biomedical Sciences. Syst Res Behav Sci. 2015;23:402–5.

National Academies of Sciences Engineering and Medicine. The next generation of biomedical and behavioral sciences researchers: breaking through. Washington, DC: National Academies Press (US); 2018.

National Academies of Sciences Engineering and Medicine. Graduate STEM education for the 21st century. Washington, DC: National Academies Press; 2018.

Roach M, Sauermann H. The declining interest in an academic career. PLoS ONE. 2017;12:e0184130.

Sauermann H, Roach M. Science PhD career preferences: levels, changes, and advisor encouragement. PLoS ONE. 2012;7:e36307.

St Clair R, Hutto T, MacBeth C, Newstetter W, McCarty NA, Melkers J. The “new normal”: adapting doctoral trainee career preparation for broad career paths in science. PLoS ONE. 2017;12:e0177035.

Fuhrmann CN, Halme DG, O’Sullivan PS, Lindstaedt B. Improving graduate education to support a branching career pipeline: recommendations based on a survey of doctoral students in the basic biomedical sciences. CBE—Life Sci Educ. 2011;10:239–49.

Casadevall A, Ellis LM, Davies EW, McFall-Ngai M, Fang FC. A framework for improving the quality of research in the biological sciences. mBio. 2016;7:e01256–16.

Casadevall A, Fang FC. Rigorous science: a how-to guide. mBio. 2016;7:e01902–16.

Bosch G, Casadevall A. Graduate Biomedical Science Education needs a New Philosophy. mBio. 2017;8:e01539–01517.

Bosch G. Train PhD students to be thinkers not just specialists. Nature. 2018;554:277–8.

Verderame MF, Freedman VH, Kozlowski LM, McCormack WT. Competency-based assessment for the training of PhD students and early-career scientists. Elife. 2018;7:e34801.

Graziane J, Graziane N. Neuroscience Milestones: developing standardized core-competencies for Research-Based neuroscience trainees. J Neurosci. 2022;42:7332–8.

Edgar L, Roberts S, Holmboe E. Milestones 2.0: a step forward. J graduate Med Educ. 2018;10:367–9.

Kiley M, Wisker G. Threshold concepts in research education and evidence of threshold crossing. High Educ Res Dev. 2009;28:431–41.

Timmerman BC, Feldon D, Maher M, Strickland D, Gilmore J. Performance-based assessment of graduate student research skills: timing, trajectory, and potential thresholds. Stud High Educ. 2013;38:693–710.

Lachance K, Heustis RJ, Loparo JJ, Venkatesh MJ. Self-efficacy and performance of research skills among first-semester bioscience doctoral students. CBE—Life Sci Educ. 2020;19:ar28.

Heustis RJ, Venkatesh MJ, Gutlerner JL, Loparo JJ. Embedding academic and professional skills training with experimental-design chalk talks. Nat Biotechnol. 2019;37:1523–7.

Ulibarri N, Cravens AE, Cornelius M, Royalty A, Nabergoj AS. Research as design: developing creative confidence in doctoral students through design thinking. Int J Doctoral Stud. 2014;9:249–70.

Gottesman AJ, Hoskins SG. CREATE cornerstone: introduction to scientific thinking, a new course for STEM-interested freshmen, demystifies scientific thinking through analysis of scientific literature. CBE—Life Sci Educ. 2013;12:59–72.

Koenig K, Schen M, Edwards M, Bao L. Addressing STEM retention through a scientific thought and methods course. J Coll Sci Teach. 2012;41.

Freeman S, Eddy SL, McDonough M, Smith MK, Okoroafor N, Jordt H, Wenderoth MP. Active learning increases student performance in science, engineering, and mathematics. Proc Natl Acad Sci. 2014;111:8410–5.

Turner JD, Triezenberg SJ. PBL for Ph.D.: a problem-based learning approach to doctoral education in biomedical research. ASQ High Educ Brief. 2010;3:1–5.

Neufeld VR, Barrows HS. The “McMaster Philosophy”: an approach to medical education. Acad Med. 1974;49:1040–50.

Duch BJ, Groh SE, Allen DE. The power of problem-based learning: a practical “how to” for teaching undergraduate courses in any discipline. Sterling, VA: Stylus Publishing, LLC; 2001.

Wirkala C, Kuhn D. Problem-based learning in K–12 education: is it effective and how does it achieve its effects? Am Educ Res J. 2011;48:1157–86.

Norman G, Schmidt HG. The psychological basis of problem-based learning: a review of the evidence. Acad Med. 1992;67:557–65.

Handelsman J, Ebert-May D, Beichner R, Bruns P, Chang A, DeHaan R, Gentile J, Lauffer S, Stewart J, Tilghman SM. Scientific teaching. Science. 2004;304:521–2.

Brown PC, Roediger HL III, McDaniel MA. Make it stick: the science of successful learning. Cambridge, Massachusetts: The Belknap Press of Harvard University Press; 2014.

Willingham DT. Critical thinking: why is it so hard to teach? Arts Educ Policy Rev. 2008;109:21–32.

Hung W, Jonassen DH, Liu R. Problem-based learning. In: Handbook of research on educational communications and technology. Abingdon, UK: Routledge; 2008. pp. 485–506.

Uluçınar U. The Effect of Problem-Based learning in Science Education on Academic Achievement: a Meta-Analytical Study. Sci Educ Int. 2023;34:72–85.

Chen C-H, Yang Y-C. Revisiting the effects of project-based learning on students’ academic achievement: a meta-analysis investigating moderators. Educational Res Rev. 2019;26:71–81.

Liu Y, Pásztor A. Effects of problem-based learning instructional intervention on critical thinking in higher education: a meta-analysis. Think Skills Creativity. 2022;45:101069.

Kalaian SA, Kasim RM, Nims JK. Effectiveness of small-group learning pedagogies in Engineering and Technology Education: a Meta-analysis. J Technol Educ. 2018;29:20–35.

Liu L, Du X, Zhang Z, Zhou J. Effect of problem-based learning in pharmacology education: a meta-analysis. Stud Educational Evaluation. 2019;60:43–58.

Dochy F, Segers M, Van den Bossche P, Gijbels D. Effects of problem-based learning: a meta-analysis. Learn instruction. 2003;13:533–68.

Azer SA. Challenges facing PBL tutors: 12 tips for successful group facilitation. Med Teach. 2005;27:676–81.

Kirkpatrick DL. Seven keys to unlock the four levels of evaluation. Perform Improv. 2006;45:5–8.

Trullàs JC, Blay C, Sarri E, Pujol R. Effectiveness of problem-based learning methodology in undergraduate medical education: a scoping review. BMC Med Educ. 2022;22:104.

Deane T, Nomme K, Jeffery E, Pollock C, Birol G. Development of the biological experimental design concept inventory (BEDCI). CBE—Life Sci Educ. 2014;13:540–51.

Sirum K, Humburg J. The experimental design ability test (EDAT). Bioscene: J Coll Biology Teach. 2011;37:8–16.

Hoskins SG, Lopatto D, Stevens LM. The CREATE approach to primary literature shifts undergraduates’ self-assessed ability to read and analyze journal articles, attitudes about science, and epistemological beliefs. CBE—Life Sci Educ. 2011;10:368–78.

Pickett CL, Corb BW, Matthews CR, Sundquist WI, Berg JM. Toward a sustainable biomedical research enterprise: finding consensus and implementing recommendations. Proc Natl Acad Sci U S A. 2015;112:10832–6.

National Research Council. Research universities and the future of America: ten breakthrough actions vital to our nation’s prosperity and security. Washington, DC: National Academies Press; 2012.

American Academy of Arts and Sciences. Restoring the Foundation: the vital role of Research in preserving the American Dream: report brief. American Academy of Arts & Sciences; 2014.


Acknowledgements

Thanks to Mary Wimmer and Drew Shiemke for many discussions over the years about PBL in the medical curriculum and examples of case studies. We thank Steve Treisenberg for initial suggestions and discussions regarding PBL effectiveness in the Van Andel Institute. Thanks to Paul and Julie Lockman for discussions about PBL in School of Pharmacy curricula and examples of case studies. Special thanks to the facilitators of the groups, Stan Hileman, Hunter Zhang, Paul Chantler, Yehenew Agazie, Saravan Kolandaivelu, Hangang Yu, Tim Eubank, William Walker, and Amanda Gatesman-Ammer. Without their considerable efforts the course could never have been successfully implemented. Thanks to the Department of Biochemistry and Molecular Medicine for supporting the development of this project. MS is the director of the Cell & Molecular Biology and Biomedical Engineering Training Program (T32 GM133369).

There was no funding available for this work.

Author information

Authors and affiliations.

Department of Biochemistry and Molecular Medicine, West Virginia University School of Medicine, Robert C. Byrd Health Sciences Center 64 Medical Center Drive, P.O. Box 9142, Morgantown, WV, 26506, USA

Michael D. Schaller, Marieta Gencheva, Michael R. Gunther & Scott A. Weed


Contributions

SW and MS developed the concept for the course. MS was responsible for creation and development of all of the content, for the implementation of the course, the design of the study and creating the first draft of the manuscript. MG, MRG and SW graded the pre- and post-test answers in a blind fashion. MS, MG, MRG and SW analyzed the data and edited the manuscript.

Corresponding author

Correspondence to Michael D. Schaller .

Ethics declarations

Competing interests.

The authors declare no competing interests.

Ethics approval and consent to participate

The West Virginia University Institutional Review Board approved the study (WVU IRB Protocol#: 2008081739). Informed consent was provided in writing and all information was collected anonymously. All methods were carried out in accordance with relevant guidelines and regulations.

Consent for publication

Not applicable.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Supplementary Material 2

Supplementary Material 3

Supplementary Material 4

Supplementary Material 5

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article.

Schaller, M.D., Gencheva, M., Gunther, M.R. et al. Training doctoral students in critical thinking and experimental design using problem-based learning. BMC Med Educ 23 , 579 (2023). https://doi.org/10.1186/s12909-023-04569-7


Received : 04 March 2023

Accepted : 05 August 2023

Published : 16 August 2023

DOI : https://doi.org/10.1186/s12909-023-04569-7


  • Critical thinking
  • Experimental design
  • Doctoral Student

BMC Medical Education

ISSN: 1472-6920


Enhancing college students' critical thinking: A review of studies

  • Published: March 1987
  • Volume 26, pages 3–29 (1987)

Cite this article

  • James H. McMillan 1  


Twenty-seven studies are reviewed that investigate the effect of instructional methods, courses, programs, and general college experiences on changes in college students' critical thinking. Only two studies used true experimental designs; most were nonequivalent pretest-posttest control group designs. The results failed to support the use of specific instructional or course conditions to enhance critical thinking, but did support the conclusion that college attendance improves critical thinking. What is lacking in the research is a common definition of critical thinking, good instrumentation to provide specific measurement, and a clear theoretical description of the nature of an experience that should enhance critical thinking.



Association of American Colleges. (1985). Integrity in the College Curriculum: A Report to the Academic Community . Washington, D.C.

Abo El-Nasser, M. E. M. (1978). Conflicting concepts of critical thinking. (Doctoral dissertation, George Peabody College of Teachers). Dissertation Abstracts International, 39 , 480A.

Bailey, J. F. (1979). The effects of an instructional paradigm on the development of critical thinking of college students in an introductory botany course. (Doctoral dissertation, Purdue University). Dissertation Abstracts International, 40 , 3138A.

Beckman, V. E. (1956). An investigation of the contributions to critical thinking made by courses in argumentation and discussion in selected colleges. (Doctoral dissertation, University of Minnesota). Dissertation Abstracts International, 16 , 2551A.

Beyer, B. (1985). Critical thinking: What is it? Social Education 49(4): 270–276.

Google Scholar  

Brabeck, M. M. (1983). Critical thinking skills and reflective judgment development: redefining the aims of higher education. Journal of Applied Developmental Psychology 4(1): 23–24.

Broughton, J. (1977). Beyond formal operations: theoretical thought in adolescence. Teachers College Record 79(1): 87–97.

Carmichael, J. W., Hassell, J., Hunter, J., Jones, L., Ryan, M. A., and Vincent, H. (1980). Project SOAR (stress on analytical reasoning). The American Biology Teacher 42(3): 169–173.

Chipman, S. F., Segal, J. W., and Glaser, R., eds. (1985). Thinking and Learning Skills (Volume 2: Research and Open Questions) . Hillsdale, NJ: Lawrence Erlbaum Associates.

Coscarelli, W. C., and Schwen, T. M. (1979). Effects of three algorithmic representations on critical thinking, laboratory efficiency, and final grade. Educational Communication & Technology 27(1): 58–64.

Dressel, P. L., and Mayhew, L. B. (1954). General Education: Explorations in Evaluation . Westport, T: Greenwood Press.

Elstein, A. S., Shulman, L. S., and Sprafka, S. A. (1978). Medical Problem Solving . Cambridge, MA: Harvard University Press.

Ennis, R. H. (1985). A logical basis for measuring critical thinking skills. Educational Leadership 43(2): 44–48.

Feldman, K. A., and Newcomb, T. M. (1969). The Impact of College on Students (Vol. 1). San Francisco: Jossey-Bass, Inc.

Fishbein, E. L. (1975). The effect of three patterns of small group assignment in promoting critical thinking, open-mindedness, and creativity in community college students. (Doctoral dissertation, University of Miami). Dissertation Abstracts International, 36 , 7946A.

Glaser, R. (1984). Education and thinking: the role of knowledge. American Psychologist 39(2): 93–104.

Glaser, E. M. (1985). Critical thinking: educating for responsible citizenship in a democracy. National Forum 65(1): 24–27.

Gressler, L. A. (1976). The effect of research courses upon the attitudes and critical thinking abilities of graduate students. (Doctoral dissertation, Mississippi State University). Dissertation Abstracts International, 37, 3994A.

Hancock, B. W. (1981). The effect of guided design on the critical thinking ability of college level administrative science students. (Doctoral dissertation, Southern Illinois University at Carbondale). Dissertation Abstracts International, 42, 4275A.

Hardin, L. D. (1977). A study of the influence of a physics personalized system of instruction versus lecture on cognitive reasoning, attitudes, and critical thinking. (Doctoral dissertation, University of Northern Colorado). Dissertation Abstracts International, 38, 4711A.

Hayden, V. M. B. (1978). A study of the effects of traditional biology and selected biological science curriculum study (BSCS) minicourses on the attitudes, achievement levels, and critical thinking abilities of students at Alcorn State University. (Doctoral dissertation, University of Southern Mississippi). Dissertation Abstracts International, 39, 2167A.

Jackson, T. R. (1961). The effects of intercollegiate debating on critical thinking ability. (Doctoral dissertation, University of Wisconsin). Dissertation Abstracts International, 21, 3556A.

Jones, J. T. (1974). An experimental study of four interdisciplinary approaches to promoting critical thinking skills and personal development in the college classroom. (Doctoral dissertation, The University of Florida). Dissertation Abstracts International, 35, 5216A.

Keeley, S., Browne, M. N., and Kreutzer, J. (1982). A comparison of freshmen and seniors on general and specific essay tests of critical thinking. Research in Higher Education 17(2): 139–154.

Kitchener, K., and King, P. (1981). Reflective judgment: concepts of justification and their relationship to age and education. Journal of Applied Developmental Psychology 2(1): 89–116.

Kulik, J. A., and McKeachie, W. J. (1975). Effective college teaching. In F. Kerlinger (ed.), Review of Research in Education, Vol. 3, Itasca, IL: F. E. Peacock.

Lehmann, I. J. (1963). Changes in critical thinking, attitudes, and values from freshman to senior years. Journal of Educational Psychology 54(6): 305–315.

Loevinger, J. (1976). Ego Development. San Francisco: Jossey-Bass, Inc.

Logan, G. H. (1976). Do sociologists teach students to think more critically? Teaching Sociology 4(1): 29–48.

Lyle, E. (1958). An exploration in the teaching of critical thinking in general psychology. Journal of Educational Research 52(4): 129–133.

McKeachie, W. J. (1970). Research on college teaching: a review. Washington, D.C.: ERIC Clearinghouse on Higher Education. (ERIC Document Reproduction Service No. ED 043 789)

McPeck, J. E. (1981). Critical Thinking and Education. New York: St. Martin's Press.

Mentkowski, M., and Strait, M. J. (1983). A longitudinal study of student change in cognitive development, learning styles, and generic abilities in an outcome-centered liberal arts curriculum. (Final Report to the National Institute of Education: Research Report Number Six). Milwaukee: Alverno College, Office of Research and Evaluation. (NIE-G-77-0058)

Mitchell, J. V. Jr., ed. (1985). The Ninth Mental Measurements Yearbook. [Reviews of the Watson-Glaser Critical Thinking Appraisal]. Lincoln, NE: The Buros Institute of Mental Measurements of the University of Nebraska-Lincoln, pp. 1692–1694.

Modjeski, R. B., and Michael, W. B. (1983). An evaluation by a panel of psychologists of the reliability and validity of two tests of critical thinking. Educational and Psychological Measurement 43: 1187–1197.

Morante, E. A., and Ulesky, A. (1984). Assessment of reasoning abilities. Educational Leadership 42(1): 71–74.

National Institute of Education. (1984). Involvement in Learning: Realizing the Potential of American Higher Education. Washington, D.C.: U.S. Government Printing Office.

Pascarella, E. (1985). College environmental influences on learning and cognitive development: a critical review and synthesis. In J. Smart (ed.), Higher Education: Handbook of Theory and Research , Vol. I, pp. 1–62. New York: Agathon Press.

Paul, R. W. (1984). Critical thinking: fundamental to education for a free society. Educational Leadership 42(1): 4–14.

Paul, R. W. (1985). The critical thinking movement: a historical perspective, National Forum 65(1): 2–3, 32.

Perkins, D. N. (1982). Difficulties in everyday reasoning and their change with education. Report to the Spencer Foundation, Harvard University, Cambridge, MA.

Perkins, D. N. (1985). General cognitive skills: why not? In Chipman, S. F., Segal, J. W., and Glaser, R. (eds.), Thinking and Learning Skills (Volume 2: Research and Open Questions) pp. 339–364. Hillsdale, NJ: Lawrence Erlbaum.

Perry, W. (1970). Forms of Intellectual and Ethical Development in the College Years: A Scheme. New York: Holt, Rinehart & Winston.

Piaget, J. (1972). Intellectual evolution from adolescence to adulthood. Human Development 15: 1–12.

Presseisen, B. Z. (1986). Critical thinking and thinking skills: State of the art definitions and practice in public schools. Paper presented at the 1986 annual meeting of the American Educational Research Association, San Francisco.

Resnick, L. B. (1985). Education and learning to think. Unpublished paper. Learning Research and Development Center, University of Pittsburgh.

Segal, J. W., Chipman, S. F., and Glaser, R., eds. (1985). Thinking and Learning Skills (Volume 1: Relating Instruction to Research). Hillsdale, NJ: Lawrence Erlbaum.

Shuch, M. L. (1975). The use of calculators versus hand computations in teaching business arithmetic and the effects on the critical thinking ability of community college students. (Doctoral dissertation, New York University). Dissertation Abstracts International, 36, 4299A.

Smith, D. G. (1977). College classroom interactions and critical thinking. Journal of Educational Psychology 69(2): 180–190.

Sternberg, R. J. (1985). Teaching critical thinking, part 2: possible solutions. Phi Delta Kappan 67(4): 277–280.

Stonewater, J. K. (1980). Strategies for problem-solving. In Young, Fostering Critical Thinking , pp. 33–58.

Suksringarm, P. (1976). An experimental study comparing the effects of BSCS and traditional biology on achievement, understanding of science, critical thinking ability, and attitude toward science of first year students at the Sakon Nakorn Teachers College, Thailand. (Doctoral dissertation, Pennsylvania State University). Dissertation Abstracts International, 37, 2764A.

Terenzini, P. T., Theophilides, C., and Lorang, W. G. (1984). Influences on students' perceptions of their academic skill development during college. Journal of Higher Education 55(5): 621–636.

Tomlinson-Keasey, C. A., and Eisert, D. (1977). Second year evaluation of the ADAPT program. In Multidisciplinary Piagetian-Based Programs for College Freshmen: ADAPT. Lincoln: University of Nebraska.

Tomlinson-Keasey, C. A., Williams, V., and Eisert, D. (1977). Evaluation report of the first year of the ADAPT program. In Multidisciplinary Piagetian-Based Programs for College Freshmen: ADAPT. Lincoln: University of Nebraska.

Whitla, D. K. (1977). Value Added: Measuring the Impact of Undergraduate Education. Cambridge, MA: Office of Instructional Research and Evaluation, Harvard University.

Williams, D. E. (1951). The effects of training in debating on critical thinking ability. Unpublished master's thesis, Purdue University.

Yinger, R. J. (1980). Can we really teach them to think? In Young, Fostering Critical Thinking , pp. 11–32.

Young, R. E., ed. (1980). Fostering Critical Thinking. San Francisco: Jossey-Bass.

Author information

Authors and Affiliations

Educational Studies, Virginia Commonwealth University, Box 2020, Richmond, VA 23284

James H. McMillan

About this article

McMillan, J. H. Enhancing college students' critical thinking: A review of studies. Res High Educ 26, 3–29 (1987). https://doi.org/10.1007/BF00991931

Received: 31 October 1986

Issue Date: March 1987

DOI: https://doi.org/10.1007/BF00991931

Keywords: College Student · Critical Thinking · Education Research · Specific Measurement · Theoretical Description