
45 Survey Questions to Understand Student Engagement in Online Learning

Nick Woolf

In our work with K-12 school districts during the COVID-19 pandemic, countless district leaders and school administrators have told us how challenging it's been to build student engagement outside of the traditional classroom.

Not only that, but the challenges associated with online learning may have the largest impact on students from marginalized communities. Research suggests that some groups of students experience more difficulty with academic performance and engagement when course content is delivered online vs. face-to-face.

As you look to improve the online learning experience for students, take a moment to understand how students, caregivers, and staff are currently experiencing virtual learning. Where are the areas for improvement? How supported do students feel in their online coursework? Do teachers feel equipped to support students through synchronous and asynchronous facilitation? How confident do families feel in supporting their children at home?

Below, we've compiled a bank of 45 questions to understand student engagement in online learning. Interested in running a student, family, or staff engagement survey? Click here to learn about Panorama's survey analytics platform for K-12 school districts.

Download Toolkit: 9 Virtual Learning Resources to Engage Students, Families, and Staff

45 Questions to Understand Student Engagement in Online Learning

For Students (Grades 3-5 and 6-12):

1. How excited are you about going to your classes?

2. How often do you get so focused on activities in your classes that you lose track of time?

3. In your classes, how eager are you to participate?

4. When you are not in school, how often do you talk about ideas from your classes?

5. Overall, how interested are you in your classes?

6. What are the most engaging activities that happen in this class?

7. Which aspects of class have you found least engaging?

8. If you were teaching class, what is the one thing you would do to make it more engaging for all students?

9. How do you know when you are feeling engaged in class?

10. What projects/assignments/activities do you find most engaging in this class?

11. What does this teacher do to make this class engaging?

12. How much effort are you putting into your classes right now?

13. How difficult or easy is it for you to try hard on your schoolwork right now?

14. How difficult or easy is it for you to stay focused on your schoolwork right now?

15. If you have missed in-person school recently, why did you miss school?

16. If you have missed online classes recently, why did you miss class?

17. How would you like to be learning right now?

18. How happy are you with the amount of time you spend speaking with your teacher?

19. How difficult or easy is it to use the distance learning technology (computer, tablet, video calls, learning applications, etc.)?

20. What do you like about school right now?

21. What do you not like about school right now?

22. When you have online schoolwork, how often do you have the technology (laptop, tablet, computer, etc.) you need?

23. How difficult or easy is it for you to connect to the internet to access your schoolwork?

24. What has been the hardest part about completing your schoolwork?

25. How happy are you with how much time you spend in specials or enrichment (art, music, PE, etc.)?

26. Are you getting all the help you need with your schoolwork right now?

27. How sure are you that you can do well in school right now?

28. Are there adults at your school you can go to for help if you need it right now?

29. If you are participating in distance learning, how often do you hear from your teachers individually?

For Families, Parents, and Caregivers:

30. How satisfied are you with the way learning is structured at your child’s school right now?

31. Do you think your child should spend less or more time learning in person at school right now?

32. How difficult or easy is it for your child to use the distance learning tools (video calls, learning applications, etc.)?

33. How confident are you in your ability to support your child's education during distance learning?

34. How confident are you that teachers can motivate students to learn in the current model?

35. What is working well with your child’s education that you would like to see continued?

36. What is challenging with your child’s education that you would like to see improved?

37. Does your child have their own tablet, laptop, or computer available for schoolwork when they need it?

38. What best describes your child's typical internet access?

39. Is there anything else you would like us to know about your family’s needs at this time?

For Teachers and Staff:

40. In the past week, how many of your students regularly participated in your virtual classes?

41. In the past week, how engaged have students been in your virtual classes?

42. In the past week, how engaged have students been in your in-person classes?

43. Is there anything else you would like to share about student engagement at this time?

44. What is working well with the current learning model that you would like to see continued?

45. What is challenging about the current learning model that you would like to see improved?

Elevate Student, Family, and Staff Voices This Year With Panorama

Schools and districts can use Panorama’s leading survey administration and analytics platform to quickly gather and take action on information from students, families, teachers, and staff. The questions are applicable to all types of K-12 school settings and grade levels, as well as to communities serving students from a range of socioeconomic backgrounds.


In the Panorama platform, educators can view and disaggregate results by topic, question, demographic group, grade level, school, and more to inform priority areas and action plans. Districts may use the data to improve teaching and learning models, build stronger academic and social-emotional support systems, improve stakeholder communication, and inform staff professional development.

To learn more about Panorama's survey platform, get in touch with our team.

Related Articles

Engaging Your School Community in Survey Results (Q&A Ep. 4)

Learn how to engage principals, staff, families, and students in the survey results when running a stakeholder feedback program around school climate.

La Cañada Shares Survey Results

La Cañada Unified School District, Panorama's first client, shares results from its surveys, used to collect feedback from students, families, and staff.

44 Questions to Ask Students, Families, and Staff During the Pandemic

Identify ways to support students, families, and staff in your school district during the pandemic with these 44 questions.

Top 6 Questions People Ask About Online Learning


Since the invention of the internet, we have witnessed a huge change in the accessibility and flexibility of higher education. Not only can students earn their degrees at a distance and on their own schedule, but they can also complete certifications and trade programs with more ease than ever before.

If you’re considering online classes as a means to achieving your goals, you likely have questions. Here are some of the most common ones, with answers!

What Is Online Learning?

So, just what is online learning? This term refers to education that takes place in a completely virtual environment using an internet connection and a computer or device to connect to the school. In the online "classroom," you can do all the same things that in-person students do, such as:

  • Listening to lectures
  • Answering questions from a professor
  • Completing readings
  • Turning in assignments
  • Taking quizzes and tests
  • Meeting as a group

Some schools, programs, or courses combine online learning with in-person learning experiences. This model is known as "hybrid education," wherein students participate online most of the time. However, when learning objectives call for hands-on experience (say, practicing skills for a health profession or laboratory experiments), they can head to campus.

That said, many programs allow their students to complete the entire curriculum virtually. Degrees such as a Bachelor of Science in Software Engineering, for example, may not call for in-person learning at all. You can always contact admissions or the specific department if you want to learn more about delivery format.

Why Online Learning Is Good for Students

Despite the widespread accessibility of remote education, some students remain skeptical about online classes. Are you really learning if there’s not a professor present at the front of a lecture hall? Can you really learn the skills you need without the in-person interaction between students and faculty?

Ease and Accessibility

While some people feel online education lacks the intimacy and immediacy of a "real" classroom, it offers an educational channel to students who might otherwise not have the time or resources to attend. Online access has made it possible for students to enroll and participate in online classes with greater ease, from nearly anywhere, in a way that fits their schedules.

Affordability

Online courses are usually more affordable as well. According to the Education Data Initiative, an online degree is, on average, $36,595 cheaper than an in-person degree when the costs of tuition and attendance are compared. At private universities the gap is even wider: the average cost of attendance is $129,800 for an in-person degree and only $60,593 for an online degree.

It’s also estimated that students who commute to college for in-person classes pay $1,360 per year in transportation costs that an online student wouldn’t have to pay. Add in factors such as cheaper meals at home and more time to work, and it’s not hard to see why many students opt for online learning.
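Taking the figures quoted above at face value, the savings can be tallied with simple arithmetic. The four-year timeline and the choice to count transportation as the only extra recurring commuter cost are illustrative assumptions, not figures from the Education Data Initiative:

```python
# Back-of-envelope comparison using the private-university figures quoted above.
# Assumptions (illustrative only): a four-year program, and transportation as
# the only additional recurring cost for commuting students.
IN_PERSON_DEGREE = 129_800   # avg. in-person private-university degree cost
ONLINE_DEGREE = 60_593       # avg. online private-university degree cost
TRANSPORT_PER_YEAR = 1_360   # avg. annual commuter transportation cost
YEARS = 4

in_person_total = IN_PERSON_DEGREE + TRANSPORT_PER_YEAR * YEARS
online_total = ONLINE_DEGREE
savings = in_person_total - online_total

print(f"Estimated savings: ${savings:,}")  # prints: Estimated savings: $74,647
```

Under these assumptions the online student comes out roughly $75,000 ahead before even counting cheaper meals at home or extra working hours.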

Top Questions About Online Learning

Despite the benefits, you likely still have some questions about online learning. Let’s take a look at six of the most common.

1. Are You Able to Earn Your Degree Completely Online? Yes, many (but not all) schools do offer this as an option. We’re not just talking about certificates or minors, either.

For instance, you can earn a Master of Science in Electrical and Computer Engineering from U of M Online. If you complete the entire program virtually, you will pay in-state tuition costs from anywhere in the United States – a major bonus. A good school should offer you a searchable course catalog to compare options and view which have a required on-campus component.

2. How Long Does It Take to Earn a Degree Online? Most online programs mirror their in-person counterparts in terms of how long it takes to earn the degree. From certificates and minors to bachelor’s or master’s degrees, you’re looking at roughly the same timeline for equivalent programs. Some programs also offer part-time options for students who need to accommodate work and family responsibilities.

Some schools or programs may limit how quickly you can move through the material. However, given the freedom and flexibility of online learning, it’s possible you can complete more coursework in less time than you could on campus. Talk to your admissions officer or program coordinator about specifics.

When first researching your options, you can again turn to the searchable course catalog. On each degree page, you should find the recommended timeline clearly listed.

3. Is an Online Degree Viewed Differently Than a Traditional Degree? Among the most common and pressing questions about online learning is whether future employers view online degrees with skepticism. The answer is an emphatic "no." Most online programs appear on your transcript the same way on-campus programs would.

You may also wonder if an online program will impact your plans for a higher degree later. As long as your degree is from an accredited institution, it won’t harm your chances of acceptance.

4. What Are Some Benefits of Online Learning? When you choose to learn online, you can:

  • Study more, due to the lack of commuting to, from, and around campus
  • Potentially take more classes, again because of the time savings
  • Get more immediate feedback from professors on assignments
  • Leverage the online resources that come with your course portal
  • Spend less money on your degree overall
  • Continue working or caring for family while going to school

5. Do Instructors Offer Help and Support to Students? Instructors are required to give the same amount of time and energy to their online classes as they do to in-person groups. In fact, many professors are enthusiastic about virtual learning because it means they have more flexibility and don’t have to commute either.

6. Can Students Have Success and Excel in Online Learning? Lastly, can you learn new skills, attain knowledge, and become successful in online learning? Unequivocally, the answer is yes! Online degree programs still afford you tutoring and career resources, as well as full access to academic resources such as the library.

Plus, you will have the ability to transfer credits either to or from the degree program, just as you would with an on-campus one. In other words, you will find yourself and your goals in no way hampered by taking the online approach.

Online Learning

In summary, online learning offers you a ton of freedom and savings. It allows you to complete your work anywhere, from the office to the living room to the road. And you can rest assured that you’ll get the same level of professorial support as you would from an on-campus program, as well as a degree that’s worth just as much.

Learn More Today

Ready to learn more? Reach out to U of M Online to ask questions or get information about specific programs today!

  • Research article
  • Open access
  • Published: 02 December 2020

Integrating students’ perspectives about online learning: a hierarchy of factors

  • Montgomery Van Wart,
  • Anna Ni,
  • Pamela Medina,
  • Jesus Canelon,
  • Melika Kordrostami,
  • Jing Zhang &

International Journal of Educational Technology in Higher Education, volume 17, Article number: 53 (2020)


Abstract

This article reports on a large-scale (n = 987) exploratory factor analysis study incorporating various concepts identified in the literature as critical success factors for online learning from the students’ perspective, and then determines their hierarchical significance. Seven factors (Basic Online Modality, Instructional Support, Teaching Presence, Cognitive Presence, Online Social Comfort, Online Interactive Modality, and Social Presence) were identified as significant and reliable. Regression analysis indicates that the minimal factors for enrollment in future classes, when students consider convenience and scheduling, were Basic Online Modality, Cognitive Presence, and Online Social Comfort. Students who accepted or embraced online courses on their own merits wanted, at a minimum, Basic Online Modality, Teaching Presence, Cognitive Presence, Online Social Comfort, and Social Presence. Students who preferred face-to-face classes and demanded a comparable experience valued Online Interactive Modality and Instructional Support more highly. Recommendations for online course design, policy, and future research are provided.
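The two-step pipeline the abstract describes (extract factors from survey items, then regress an enrollment-related outcome on factor scores) can be sketched as follows. The data here are simulated, and the eigenvalue-greater-than-one retention rule and unrotated factor scores are common conventions used for illustration, not the authors' exact procedure:

```python
import numpy as np

# Illustrative sketch: exploratory factor analysis on survey items,
# then regression of an enrollment-intention outcome on factor scores.
# Data are simulated; this is not the study's dataset or exact method.
rng = np.random.default_rng(0)

n, n_items, n_factors = 987, 12, 3
latent = rng.normal(size=(n, n_factors))              # hypothetical latent factors
loadings = rng.uniform(0.4, 0.9, size=(n_factors, n_items))
items = latent @ loadings + rng.normal(scale=0.8, size=(n, n_items))

# Factor extraction via eigendecomposition of the item correlation matrix
corr = np.corrcoef(items, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]                     # sort descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = int(np.sum(eigvals > 1.0))                        # Kaiser retention criterion
scores = items @ eigvecs[:, :k]                       # crude (unrotated) factor scores

# Regress a simulated enrollment-intention outcome on the retained factor scores
outcome = latent @ rng.normal(size=n_factors) + rng.normal(scale=0.5, size=n)
X = np.column_stack([np.ones(n), scores])             # intercept + factor scores
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
print(k, beta.shape)
```

In practice a study like this would use a dedicated EFA implementation with rotation and reliability checks (e.g., Cronbach's alpha) before interpreting the regression coefficients.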

Introduction

While there are different perspectives of the learning process, such as learning achievement and faculty perspectives, students’ perspectives are especially critical since they are ultimately the raison d’être of the educational endeavor (Chickering & Gamson, 1987). More pragmatically, students’ perspectives provide invaluable, first-hand insights into their experiences and expectations (Dawson et al., 2019). The student perspective is especially important when new teaching approaches are used and when new technologies are being introduced (Arthur, 2009; Crews & Butterfield, 2014; Van Wart, Ni, Ready, Shayo, & Court, 2020). With the renewed interest in “active” education in general (Arruabarrena, Sánchez, Blanco, et al., 2019; Kay, MacDonald, & DiGiuseppe, 2019; Nouri, 2016; Vlachopoulos & Makri, 2017) and the flipped classroom approach in particular (Flores, del-Arco, & Silva, 2016; Gong, Yang, & Cai, 2020; Lundin et al., 2018; Maycock, 2019; McGivney-Burelle, 2013; O’Flaherty & Phillips, 2015; Tucker, 2012), along with extraordinary shifts in technology, the student perspective on online education is profoundly important. What shapes students’ perceptions of quality are their own sense of learning achievement, satisfaction with the support they receive, technical proficiency of the process, intellectual and emotional stimulation, comfort with the process, and sense of learning community. The factors that students perceive as constituting quality online teaching, however, have not been as clear as they might be, for at least two reasons.

First, it is important to note that the overall online learning experience for students is also composed of non-teaching factors, which we briefly mention. Three such factors are (1) convenience, (2) learner characteristics and readiness, and (3) antecedent conditions that may foster teaching quality but are not directly responsible for it. (1) Convenience is an enormous non-quality factor for students (Artino, 2010), one which has driven up online demand around the world (Fidalgo, Thormann, Kulyk, et al., 2020; Inside Higher Education and Gallup, 2019; Legon & Garrett, 2019; Ortagus, 2017). This is important since satisfaction with online classes is frequently somewhat lower than with face-to-face classes (Macon, 2011). However, the literature generally supports the relative equivalence of face-to-face and online modes on learning achievement criteria (Bernard et al., 2004; Nguyen, 2015; Ni, 2013; Sitzmann, Kraiger, Stewart, & Wisher, 2006; see Xu & Jaggars, 2014 for an alternate perspective). These contrasts are exemplified in a recent study of business students, in which online students using a flipped classroom approach outperformed their face-to-face peers, but ironically rated instructor performance lower (Harjoto, 2017). (2) Learner characteristics, such as self-regulation in an active learning model, comfort with technology, and age, among others, affect both receptiveness to and readiness for online instruction (Alqurashi, 2016; Cohen & Baruth, 2017; Kintu, Zhu, & Kagambe, 2017; Kuo, Walker, Schroder, & Belland, 2013; Ventura & Moscoloni, 2015). (3) Finally, numerous antecedent factors may lead to improved instruction but are not themselves directly perceived by students, such as instructor training (Brinkley-Etzkorn, 2018) and the sources of faculty motivation (e.g., incentives, recognition, social influence, and voluntariness) (Wingo, Ivankova, & Moss, 2017). Important as these factors are, mixing them with perceptions of quality tends to obfuscate the quality factors directly perceived by students.

Second, while student perceptions of quality are used in innumerable studies, our overall understanding still needs to integrate them more holistically. Many studies use student perceptions of the quality and overall effectiveness of individual tools and strategies in online contexts, such as mobile devices (Drew & Mann, 2018), small groups (Choi, Land, & Turgeon, 2005), journals (Nair, Tay, & Koh, 2013), simulations (Vlachopoulos & Makri, 2017), and video (Lange & Costley, 2020). Such studies, however, cannot provide the overall context and comparative importance. Some studies have examined the overall learning experience of students with exploratory lists, but have mixed non-quality factors with quality-of-teaching factors, making it difficult to discern the instructor’s versus contextual roles in quality (e.g., Asoodar, Vaezi, & Izanloo, 2016; Bollinger & Martindale, 2004; Farrell & Brunton, 2020; Hong, 2002; Song, Singleton, Hill, & Koh, 2004; Sun, Tsai, Finger, Chen, & Yeh, 2008). The application of technology adoption studies also falls into this category by essentially aggregating all teaching quality into the single category of performance (Al-Gahtani, 2016; Artino, 2010). Some studies have used high-level teaching-oriented models, primarily the Community of Inquiry model (le Roux & Nagel, 2018), but empirical support has been mixed (Arbaugh et al., 2008), and its elegance (i.e., relying on only three factors) has not provided much insight to practitioners (Anderson, 2016; Cleveland-Innes & Campbell, 2012).

Research questions

Although the number of empirical studies related to student perceptions of quality factors has increased, the integration of the studies and concepts explored remains fragmented and confusing. It is important to have an empirical view of what students value in a single comprehensive study and, also, to know whether there is a hierarchy of factors, ranging from students who are least to most critical of the online learning experience. This research study has two research questions.

The first research question is: What are the significant factors in creating a high-quality online learning experience from students’ perspectives? This is important to know because it should have a significant effect on the instructor’s design of online classes. The goal of this research question is to identify a more articulated and empirically supported set of factors capturing the full range of student expectations.

The second research question is: Is there a priority or hierarchy of factors related to students’ perceptions of online teaching quality that relates to their decisions to enroll in online classes? For example, is it possible to distinguish which factors are critical for enrollment decisions when students are primarily motivated by convenience and scheduling flexibility (a minimum threshold)? Do these factors differ for students with a genuine acceptance of the general quality of online courses (a moderate threshold)? And which factors are important to the students who are the most critical of online course delivery (the highest threshold)?

This article next reviews the literature on online education quality, focusing on the student perspective, and describes the eight factors derived from it. The research methods section discusses the study structure and methods. Demographic data related to the sample are presented next, followed by the results, discussion, and conclusion.

Literature review

Online education is much discussed (Prinsloo, 2016; Van Wart et al., 2019; Zawacki-Richter & Naidu, 2016), but its perception is substantially influenced by where you stand and what you value (Otter et al., 2013; Tanner, Noser, & Totaro, 2009). Accrediting bodies care about meeting technical standards, proof of effectiveness, and consistency (Grandzol & Grandzol, 2006). Institutions care about reputation, rigor, student satisfaction, and institutional efficiency (Jung, 2011). Faculty care about subject coverage, student participation, faculty satisfaction, and faculty workload (Horvitz, Beach, Anderson, & Xia, 2015; Mansbach & Austin, 2018). For their part, students care about learning achievement (Marks, Sibley, & Arbaugh, 2005; O’Neill & Sai, 2014; Shen, Cho, Tsai, & Marra, 2013), but also view online education as a function of their enjoyment of classes, instructor capability and responsiveness, and comfort in the learning environment (e.g., Asoodar et al., 2016; Sebastianelli, Swift, & Tamimi, 2015). It is this last perspective, that of students, upon which we focus.

It is important to note that students do not sign up for online classes solely based on perceived quality. Perceptions of quality derive from notions of the capacity of online learning when ideal (relative to both learning achievement and satisfaction/enjoyment) and from perceptions about the likelihood and experience of classes living up to expectations. Students also sign up because of convenience and flexibility, and because of personal notions of suitability about learning. Convenience and flexibility are enormous drivers of online registration (Lee, Stringer, & Du, 2017; Mann & Henneberry, 2012). Even when students say they prefer face-to-face classes to online ones, many enroll in online classes, and re-enroll in the future if the experience meets minimum expectations. This study examines the threshold expectations of students when they are considering taking online classes.

When discussing students’ perceptions of quality, there is little clarity about the actual range of concepts because no integrated empirical studies exist comparing the major factors found throughout the literature. Rather, there are practitioner-generated lists of micro-competencies, such as the Quality Matters consortium for higher education (Quality Matters, 2018), or broad frameworks encompassing many aspects of quality beyond teaching (Open and Distant Learning Quality Council, 2012). While checklists are useful for practitioners and accreditation processes, they do not provide robust theoretical bases for scholarly development. Overarching frameworks are heuristically useful, but not for pragmatic purposes or theory building. The most prominent theoretical framework used in the online literature is the Community of Inquiry (CoI) model (Arbaugh et al., 2008; Garrison, Anderson, & Archer, 2003), which divides instruction into teaching, cognitive, and social presence. As with many deductive theories, however, the supportive evidence is mixed (Rourke & Kanuka, 2009), especially regarding the importance of social presence (Annand, 2011; Armellini & De Stefani, 2016). Conceptually, the problem is not so much with the narrow articulation of cognitive or social presence; cognitive presence is how the instructor provides opportunities for students to interact with material in robust, thought-provoking ways, and social presence refers to building a community of learning that incorporates student-to-student interactions. However, teaching presence includes everything else the instructor does: structuring the course, providing lectures, explaining assignments, creating rehearsal opportunities, supplying tests, grading, answering questions, and so on. These challenges become even more prominent in the online context. While the lecture as a single medium is paramount in face-to-face classes, it fades as the primary vehicle in online classes with the increased use of detailed syllabi, electronic announcements, recorded and synchronous lectures, 24/7 communications related to student questions, etc. Amassing the pedagogical and technological elements related to teaching under a single concept provides little insight.

In addition to the CoI model, numerous concepts are suggested in single-factor empirical studies focusing on quality from a student’s perspective, with overlapping conceptualizations and nonstandardized naming conventions. Seven distinct factors are derived here from the literature on student perceptions of online quality: Instructional Support, Teaching Presence, Basic Online Modality, Social Presence, Online Social Comfort, Cognitive Presence, and Online Interactive Modality.

Instructional support

Instructional Support refers to students’ perceptions of the techniques used by the instructor for input, rehearsal, feedback, and evaluation. Specifically, this entails providing detailed instructions, designed use of multimedia, and a balance between repetitive class features for ease of use and techniques to prevent boredom. Instructional Support is often included as an element of Teaching Presence, but it is also labeled “structure” (Lee & Rha, 2009; So & Brush, 2008) and instructor facilitation (Eom, Wen, & Ashill, 2006). A prime example of the difference between face-to-face and online education is the extensive use of the “flipped classroom” (Maycock, 2019; Wang, Huang, & Schunn, 2019), in which students move to rehearsal activities faster and more frequently than in traditional classrooms, with less instructor lecture (Jung, 2011; Martin, Wang, & Sadaf, 2018). It has been consistently supported as an element of student perceptions of quality (Espasa & Meneses, 2010).

Teaching presence

Teaching Presence refers to students’ perceptions about the quality of communication in lectures, directions, and individual feedback, including encouragement (Jaggars & Xu, 2016; Marks et al., 2005). Specifically, instructor communication is clear, focused, and encouraging, and instructor feedback is customized and timely. If Instructional Support is what an instructor does before the course begins and in carrying out those plans, then Teaching Presence is what the instructor does while the class is conducted and in response to specific circumstances. For example, a course could be well designed but poorly delivered because the instructor is distracted; or a course could be poorly designed but an instructor might make up for the deficit by spending time and energy on elaborate communications and ad hoc teaching techniques. It is especially important in student satisfaction (Sebastianelli et al., 2015; Young, 2006) and is also referred to as instructor presence (Asoodar et al., 2016), learner-instructor interaction (Marks et al., 2005), and staff support (Jung, 2011). As with Instructional Support, it has been consistently supported as an element of student perceptions of quality.

Basic online modality

Basic Online Modality refers to the competent use of basic online class tools: online grading, navigation methods, the online grade book, and the announcements function. It is frequently clumped with instructional quality (Artino, 2010), service quality (Mohammadi, 2015), instructor expertise in e-teaching (Paechter, Maier, & Macher, 2010), and similar terms. As a narrowly defined concept, it is sometimes called technology (Asoodar et al., 2016; Bollinger & Martindale, 2004; Sun et al., 2008). The only empirical study that did not find Basic Online Modality significant, as technology, was Sun et al. (2008). Because Basic Online Modality is addressed with basic instructor training, some studies assert the importance of training (e.g., Asoodar et al., 2016).

Social presence

Social Presence refers to students’ perceptions of the quality of student-to-student interaction. It focuses on the quality of shared learning and collaboration among students, such as in threaded discussion responses (Garrison et al., 2003; Kehrwald, 2008). Much emphasized but also challenged in the CoI literature (Rourke & Kanuka, 2009), it has mixed support in the online literature. While some studies found Social Presence or related concepts significant (e.g., Asoodar et al., 2016; Bollinger & Martindale, 2004; Eom et al., 2006; Richardson, Maeda, Lv, & Caskurlu, 2017), others found it insignificant (Joo, Lim, & Kim, 2011; So & Brush, 2008; Sun et al., 2008).

Online social comfort

Online Social Comfort refers to the instructor’s ability to provide an environment in which anxiety is low and students feel comfortable interacting, even when expressing opposing viewpoints. While numerous studies have examined anxiety (e.g., Liaw & Huang, 2013; Otter et al., 2013; Sun et al., 2008), only one found anxiety insignificant (Asoodar et al., 2016); many others have not examined the concept.

Cognitive presence

Cognitive Presence refers to the engagement of students such that they perceive they are stimulated by the material and instructor to reflect deeply and critically, and seek to understand different perspectives (Garrison et al., 2003). The instructor provides instructional materials and facilitates an environment that piques interest, encourages reflection, and enhances inclusiveness of perspectives (Durabi, Arrastia, Nelson, Cornille, & Liang, 2011). Cognitive Presence includes enhancing the applicability of material for students’ potential or current careers. It is supported as significant in many online studies (e.g., Artino, 2010; Asoodar et al., 2016; Joo et al., 2011; Marks et al., 2005; Sebastianelli et al., 2015; Sun et al., 2008). Further, while many instructors perceive that cognitive presence is diminished in online settings, neuroscientific studies indicate this need not be the case (Takamine, 2017). While numerous studies have not examined Cognitive Presence, this review found none that diminished its significance for students.

Interactive online modality

Interactive Online Modality refers to the “high-end” use of online functionality; that is, the instructor makes good use of interactive online class tools such as video lectures, videoconferencing, and small group discussions. It is often included in concepts such as instructional quality (Artino, 2010; Asoodar et al., 2016; Mohammadi, 2015; Otter et al., 2013; Paechter et al., 2010) or engagement (Clayton, Blumberg, & Anthony, 2018). While individual methods have been investigated (e.g., Durabi et al., 2011), high-end engagement methods as a group have not.

Other independent variables affecting perceptions of quality include age, undergraduate versus graduate status, gender, ethnicity/race, discipline, educational motivation of students, and previous online experience. While age effects have been found to be small or insignificant, more notable effects have been reported at the level of study, with graduate students reporting higher “success” (Macon, 2011) and community college students having greater difficulty with online classes (Legon & Garrett, 2019; Xu & Jaggars, 2014). Effects of ethnicity and race have also been small or insignificant. Some situational variations and student preferences can be captured by paying attention to disciplinary differences (Arbaugh, 2005; Macon, 2011). Motivation levels of students have been reported to be significant in completion and achievement, with better students doing as well across face-to-face and online modes, and weaker students having greater completion and achievement challenges (Clayton et al., 2018; Lu & Lemonde, 2013).

Research methods

To examine the various quality factors, we apply a critical success factor methodology, initially introduced to schools of business research in the 1970s. In 1981, Rockhart and Bullen codified an approach embodying principles of critical success factors (CSFs) as a way to identify the information needs of executives, detailing steps for the collection and analysis of data to create a set of organizational CSFs (Rockhart & Bullen, 1981). CSFs describe the underlying or guiding principles that must be incorporated to ensure success.

Utilizing this methodology, CSFs in the context of this paper define key areas of instruction and design essential for an online class to be successful from a student’s perspective. Instructors implicitly know and consider these areas when setting up an online class and designing and directing activities and tasks important to achieving learning goals. CSFs make explicit those things good instructors may intuitively know and (should) do to enhance student learning. When made explicit, CSFs not only confirm the knowledge of successful instructors, but tap their intuition to guide and direct the accomplishment of quality instruction for entire programs. In addition, CSFs are linked with goals and objectives, helping generate a small number of truly important matters an instructor should focus attention on to achieve different thresholds of online success.

After a comprehensive literature review, an instrument was created to measure students’ perceptions of the importance of techniques and indicators leading to quality online classes. Items were designed to capture the major factors in the literature. The instrument was piloted during the 2017–18 academic year with a 397-student sample, facilitating an exploratory factor analysis that led to important preliminary findings (reference withheld for review). Based on the pilot, survey items were added and refined to include seven groups of quality teaching factors, two groups of items related to students’ overall acceptance of online classes, and a variable on their future online class enrollment. Demographic information (age, year in program, major, distance from university, number of online classes taken, high school experience with online classes, and communication preferences) was gathered to determine its effects on students’ levels of acceptance of online classes.

This paper draws evidence from a sample of students enrolled in educational programs at the Jack H. Brown College of Business and Public Administration (JHBC), California State University San Bernardino (CSUSB). The JHBC offers a wide range of online courses for undergraduate and graduate programs. To ensure comparable learning outcomes, online and face-to-face classes of a given subject are similar in size (undergraduate classes are generally capped at 60 and graduate classes at 30) and are often taught by the same instructors. Students sometimes have the option to choose between face-to-face and online modes of learning.

A Qualtrics survey link was sent out by 11 instructors to students who were unlikely to be cross-enrolled in classes during the 2018–19 academic year. Approximately 2500 students were contacted, with some instructors providing class time to complete the anonymous survey. All students, whether they had taken an online class or not, were encouraged to respond. Nine hundred eighty-seven students responded, representing a 40% response rate. Although drawn from a single business school, the sample is broad, representing students from several disciplines (management, accounting and finance, marketing, information decision sciences, and public administration) as well as both graduate and undergraduate programs of study.

The sample is young, with 78% of students under 30. It includes almost no lower-division students (i.e., freshmen and sophomores), 73% upper-division students (i.e., juniors and seniors), and 24% graduate students (master’s level). Only 17% reported having taken a hybrid or online class in high school. There was a wide range of exposure to university-level online courses, with 47% reporting having taken 1 to 4 classes and 21% reporting no online class experience. Reflecting a Hispanic-serving institution, 54% self-identified as Latino, 18% as White, and 13% as Asian and Pacific Islander. The five largest majors were accounting & finance (25%), management (21%), master of public administration (16%), marketing (12%), and information decision sciences (10%). Seventy-four percent work full- or part-time. See Table 1 for demographic data.

Measures and procedure

To increase the reliability of evaluation scores, composite evaluation variables were formed after an exploratory factor analysis of individual evaluation items. A principal component method with Quartimin (oblique) rotation was applied to explore the factor construct of student perceptions of online teaching CSFs. Items with coefficients greater than .30 were included, a commonly accepted threshold in factor analysis. A simple least-squares regression analysis was applied to test the significance levels of factors on students’ impressions of online classes.
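The .30 retention rule above can be illustrated with a minimal pure-Python sketch (hypothetical numbers, not the study’s data or instrument): an item is kept when its Pearson correlation with the factor’s composite score exceeds the threshold.

```python
# Illustrative sketch, not the authors' code: retain an item only if its
# correlation with a factor's composite score exceeds the .30 threshold.

def pearson(xs, ys):
    """Plain-Python Pearson correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical item responses and a composite (mean of the factor's items)
item = [1, 2, 2, 4, 5]
composite = [1.5, 2.0, 2.5, 4.0, 4.5]
r = pearson(item, composite)
keep = abs(r) > 0.30  # threshold used in the paper
print(round(r, 2), keep)  # r ≈ 0.99, so this item clears the threshold
```

In practice the study used principal component loadings rather than raw item-composite correlations, but the filtering logic is the same: items below the cutoff are dropped before forming composites.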

Exploratory factor constructs

Using a threshold loading of 0.3 for items, 37 items loaded on seven factors. All factors were logically consistent. The first factor, with eight items, was labeled Teaching Presence. Items included providing clear instructions, staying on task, clear deadlines, and customized feedback on strengths and weaknesses. Teaching Presence items all related to instructor involvement during the course as a director, monitor, and learning facilitator. The second factor, with seven items, aligned with Cognitive Presence. Items included stimulating curiosity, opportunities for reflection, helping students construct explanations posed in online courses, and the applicability of material. The third factor, with six items, aligned with Social Presence, defined as providing student-to-student learning opportunities. Items included getting to know course participants for a sense of belonging, forming impressions of other students, and interacting with others. The fourth factor, with six new items as well as two (“interaction with other students” and “a sense of community in the class”) shared with the third factor, was Instructional Support, which related to the instructor’s role in providing students a cohesive learning experience. Items included providing sufficient rehearsal, structured feedback, techniques for communication, a navigation guide, a detailed syllabus, and coordinating student interaction and creating a sense of online community. This factor also included enthusiasm, which students generally interpreted as a robustly designed course rather than animation in a traditional lecture.

The fifth factor, labeled Basic Online Modality, focused on the basic technological requirements for a functional online course. Three items included allowing students to make online submissions, use of online gradebooks, and online grading. A fourth item was the use of online quizzes, viewed by students as mechanical practice opportunities rather than small tests, and a fifth was navigation, a key component of Basic Online Modality. The sixth factor, loaded on four items, was labeled Online Social Comfort. Items included comfort discussing ideas online, comfort disagreeing, developing a sense of collaboration via discussion, and considering online communication an excellent medium for social interaction. The final factor was called Interactive Online Modality because it included items for “richer” communications or interactions, whether one- or two-way. Items included videoconferencing, instructor-generated videos, and small group discussions. Taken together, these seven factors explained 67% of the variance, which is considered in the acceptable range in social science research for a robust model (Hair, Black, Babin, & Anderson, 2014). See Table 2 for the full list.

To test for factor reliability, Cronbach’s alpha was calculated for each variable. All produced values greater than 0.7, the standard threshold for reliability, except for system trust, which was therefore dropped. To gauge students’ sense of factor importance, all items were mean averaged. Factor means (lower means indicating higher importance to students) ranged from 1.5 to 2.6 on a 5-point scale. Basic Online Modality was most important, followed by Instructional Support and Teaching Presence. Students deemed Cognitive Presence, Online Social Comfort, and Interactive Online Modality less important. The least important for this sample was Social Presence. Table 3 arrays the critical success factor means, standard deviations, and Cronbach alphas.
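As a rough illustration of the reliability check described above, the following pure-Python sketch computes Cronbach’s alpha for one factor from hypothetical 5-point responses (not the study’s data); a value above 0.7 would pass the threshold used here.

```python
# Illustrative sketch, not the authors' code: Cronbach's alpha for a factor,
# computed from hypothetical item responses on a 5-point scale.

def cronbach_alpha(items):
    """items: list of columns, one list of respondent scores per survey item."""
    k = len(items)
    n = len(items[0])

    def variance(xs):
        # Sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_vars = sum(variance(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    total_var = variance(totals)
    return (k / (k - 1)) * (1 - sum_item_vars / total_var)

# Hypothetical responses from five students to three related items
items = [
    [1, 2, 2, 4, 5],
    [1, 3, 2, 4, 4],
    [2, 2, 1, 5, 5],
]
alpha = cronbach_alpha(items)
print(round(alpha, 2))  # prints 0.94: this hypothetical factor would be retained
```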

To determine whether particular subgroups of respondents viewed factors differently, a series of ANOVAs was conducted using factor means as dependent variables. Six demographic variables were used as independent variables: graduate vs. undergraduate status, age, work status, ethnicity, discipline, and past online experience. To determine the strength of association of the independent variables with each of the seven CSFs, eta squared was calculated for each ANOVA. Eta squared indicates the proportion of variance in the dependent variable explained by the independent variable. Eta squared values greater than .01, .06, and .14 are conventionally interpreted as small, medium, and large effect sizes, respectively (Green & Salkind, 2003). Table 4 summarizes the eta squared values for the ANOVA tests, with values less than .01 omitted.
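The eta squared statistic used above (between-group sum of squares over total sum of squares) can be sketched in a few lines of pure Python; the group scores below are hypothetical, not the study’s data.

```python
# Illustrative sketch, hypothetical data: eta squared for a one-way ANOVA is
# the between-group sum of squares divided by the total sum of squares.

def eta_squared(groups):
    """groups: list of lists, one list of factor scores per subgroup."""
    all_scores = [x for g in groups for x in g]
    grand_mean = sum(all_scores) / len(all_scores)
    ss_total = sum((x - grand_mean) ** 2 for x in all_scores)
    ss_between = sum(
        len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups
    )
    return ss_between / ss_total

# Hypothetical factor means for two subgroups (e.g., graduates vs. undergraduates)
grads = [1.2, 1.5, 1.4, 1.3]
undergrads = [1.8, 2.0, 1.9, 2.1]
e2 = eta_squared([grads, undergrads])
print(round(e2, 2))  # prints 0.88: a large effect by the conventions above
```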

While no significant differences in factor means occur among students in different disciplines in the College, all five other independent variables have some small effect on some or all CSFs. Graduate students tend to rate Interactive Online Modality, Instructional Support, Teaching Presence, and Cognitive Presence higher than undergraduates. Older students place more value on Interactive Online Modality. Full-time working students rate all factors, except Online Social Comfort, slightly higher than part-timers and non-working students. Latino and White students rate Basic Online Modality and Instructional Support higher; Asian and Pacific Islander students rate Social Presence higher. Students who have taken more online classes rate all factors higher.

In addition to factor scores, two variables were constructed to identify the resultant impressions, labeled online experience. Both were logically consistent, with a Cronbach’s α greater than 0.75. The first variable, with six items, labeled “online acceptance,” included items such as “I enjoy online learning,” “My overall impression of hybrid/online learning is very good,” and “the instructors of online/hybrid classes are generally responsive.” The second variable was labeled “face-to-face preference” and combines four items, including enjoying, learning, and communicating more in face-to-face classes, as well as perceiving greater fairness and equity. In addition to these two constructed variables, a one-item variable was also used subsequently in the regression analysis: “online enrollment.” That question asked: if hybrid/online classes are well taught and available, how much of your entire course selection going forward would online education make up?

Regression results

As noted above, two constructed variables and one item were used as dependent variables for purposes of regression analysis. They were online acceptance, F2F preference, and the selection of online classes. In addition to the seven quality-of-teaching factors identified by factor analysis, control variables included level of education (graduate versus undergraduate), age, ethnicity, work status, distance to university, and number of online/hybrid classes taken in the past. See Table 5.

When eta squared values for ANOVA significance were measured for control factors, only one approached a medium effect: graduate versus undergraduate status had a .05 effect (near the .06 medium threshold) related to Interactive Online Modality, meaning graduate students were more sensitive to interactive modality than undergraduates. Multiple regression analyses of critical success factors and online impressions were conducted to compare under what conditions factors were significant. The only consistently significant control factor was the number of online classes taken: the more classes students had taken online, the more inclined they were to take future classes. Level of program, age, ethnicity, and working status do not significantly affect students’ choice or overall acceptance of online classes.
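As a sketch of the least-squares logic behind these regressions, the following pure-Python snippet fits a simple bivariate regression on hypothetical data (the variables and numbers are illustrative, not the study’s; the actual analysis used multiple predictors).

```python
# Illustrative sketch, hypothetical data: simple least-squares slope and
# intercept, of the kind used to relate a predictor (e.g., number of online
# classes taken) to an outcome (e.g., online acceptance score).

def ols(xs, ys):
    """Return (slope, intercept) minimizing squared error for y ~ x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return slope, my - slope * mx

# Hypothetical: online classes taken vs. acceptance score on a 5-point scale
classes_taken = [0, 1, 2, 4, 8]
acceptance = [2.0, 2.5, 3.0, 3.5, 4.5]
slope, intercept = ols(classes_taken, acceptance)
print(round(slope, 2), round(intercept, 2))
```

A positive slope in this hypothetical mirrors the finding above: more online experience goes with greater inclination to take future online classes.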

The least restrictive condition was online enrollment (Table 6). That is, students might not feel online courses were ideal, but because of convenience and scheduling might enroll in them if minimum threshold expectations were met. When considering online enrollment, three factors were significant and positive (at the 0.1 level): Basic Online Modality, Cognitive Presence, and Online Social Comfort. These least-demanding students expected classes to have basic technological functionality, provide good opportunities for knowledge acquisition, and provide comfortable interaction in small groups. Students who demand good Instructional Support (e.g., rehearsal opportunities, structured feedback, a clear syllabus) are less likely to enroll.

Online acceptance was more restrictive (see Table 7). This variable captured the idea that students not only enrolled in online classes out of necessity, but with an appreciation of the positive attributes of online instruction, which balanced the negative aspects. When this standard was applied, students expected not only Basic Online Modality, Cognitive Presence, and Online Social Comfort, but also expected their instructors to be highly engaged virtually as the course progressed (Teaching Presence) and to create strong student-to-student dynamics (Social Presence). Students who rated Instructional Support higher were less accepting of online classes.

Another restrictive condition was catering to the needs of students who preferred face-to-face classes (see Table 8); that is, they preferred face-to-face classes even when online classes were well taught. Unlike students more accepting of, or more likely to enroll in, online classes, this group rates Instructional Support as critical, rather than as a factor whose demand reduces the likelihood of enrolling. Again different from the other two groups, these students demand appropriate interactive mechanisms (Interactive Online Modality) to enable richer communication (e.g., videoconferencing). Student-to-student collaboration (Social Presence) was also significant. This group also rated Cognitive Presence and Online Social Comfort as significant, but only in their absence. That is, these students were most attached to direct interaction with the instructor and other students rather than to specific teaching methods. Interestingly, Basic Online Modality and Teaching Presence were not significant. Our interpretation is that this student group, most critical of online classes for their loss of physical interaction, is beyond being concerned with mechanical technical interaction and demands higher levels of interactivity and instructional sophistication.

Discussion and study limitations

Some past studies have used robust empirical methods to identify a single factor or a small number of factors related to quality from a student’s perspective, but have not sought to be relatively comprehensive. Others have used a longer series of itemized factors, but have used less robust methods and have not tied those factors back to the literature. This study has used the literature to develop a relatively comprehensive list of items focused on quality teaching in a single rigorous protocol. While a beta test had identified five coherent factors, the current survey made substantial changes that sharpened the focus on quality factors rather than antecedent factors, and better articulated the array of factors often lumped under the mantle of “teaching presence.” In addition, the study has examined these factors based on threshold expectations: from minimal, such as when flexibility is the driving consideration; to modest, such as when students want a “good” online class; to high, when students demand an interactive virtual experience equivalent to face-to-face.

Exploratory factor analysis identified seven factors that were reliable, coherent, and significant under different conditions. When considering students’ overall sense of importance, they are, in order: Basic Online Modality, Instructional Support, Teaching Presence, Cognitive Presence, Online Social Comfort, Interactive Online Modality, and Social Presence. Students are most concerned with the basics of a course first, that is, technological functionality and instructor competence. Next they want engagement and virtual comfort. Social Presence, while valued, is the least critical from this overall perspective.

The factor analysis is quite consistent with the range of factors identified in the literature, pointing to the fact that students can differentiate among aspects of what have been clumped together as larger concepts, such as teaching presence. Essentially, the instructor’s role in quality can be divided into her/his command of basic online functionality, good design, and good presence during the class. The instructor’s command of basic functionality is paramount. Because so much of an online class must be built in advance, the quality of the class design is rated more highly than the instructor’s role in facilitating the class. Taken as a whole, the instructor’s role in traditional teaching elements is primary, as we would expect it to be. Cognitive Presence, especially the pertinence of the instructional material and its applicability to student interests, has always been found significant when studied, and was highly rated here as a single factor. Finally, the degree to which students feel comfortable with the online environment and enjoy the learner-learner aspect has been less supported in empirical studies; it was found significant here, but rated lowest among the quality factors.

Regression analysis paints a more nuanced picture, depending on student focus. It also helps explain some of the heterogeneity of previous studies, depending on what the dependent variables were. If convenience and scheduling are critical and students are less demanding, the minimum requirements are Basic Online Modality, Cognitive Presence, and Online Social Comfort. That is, students expect an instructor who knows how to use an online platform, delivers useful information, and provides a comfortable learning environment. They do not, however, demand polished instructional design, and they do not expect much in terms of teaching presence, learner-to-learner interaction, or interactive teaching.

When students are signing up for critical classes, or when they have both F2F and online options, they have a higher standard. That is, they not only expect the factors governing decisions about enrolling in noncritical classes, but also good Teaching Presence and Social Presence. Students who simply need a class may be willing to teach themselves a bit more, but students who want a good class expect a highly present instructor in terms of responsiveness and immediacy. “Good” classes must not only create a comfortable atmosphere but, in social science classes at least, must provide strong learner-to-learner interactions as well. At the time of the research, most students believed that you can have a good class without high interactivity via pre-recorded video and videoconference. That may, or may not, change over time as technology thresholds of various video media become easier to use, more reliable, and more commonplace.

The most demanding students are those who prefer F2F classes because of learning style preferences, poor past experiences, or both. Such students (seem to) assume that a worthwhile online class has basic functionality and that the instructor provides a strong presence. They are also critical of the absence of Cognitive Presence and Online Social Comfort. They want strong Instructional Support and Social Presence. But in addition, and uniquely, they expect Interactive Online Modality, which provides as much verisimilitude to the traditional classroom as possible. More than the other two groups, these students crave human interaction in the learning process, both with the instructor and with other students.

These findings shed light on the possible ramifications of the COVID-19 aftermath. Many universities around the world jumped from relatively low levels of online instruction at the beginning of spring 2020 to nearly 100% by mandate by the end of the spring term. The question becomes, what will happen after the mandate is removed? Will demand resume pre-crisis levels, will it increase modestly, or will it skyrocket? Time will be the best judge, but the findings here suggest that the ability and interest of instructors and institutions to “rise to the occasion” with quality teaching will have as much effect on demand as students becoming more acclimated to online learning. If in the rush to get classes online many students experience shoddy basic functional competence, poor instructional design, sporadic teaching presence, and poorly implemented cognitive and social aspects, they may be quite willing to return to the traditional classroom. If faculty and the institutions supporting them are able to increase the quality of classes despite time pressures, then most students may be interested in more hybrid and fully online classes. If instructors are able to introduce high-quality interactive teaching, nearly the entire student population will be interested in more online classes. Of course students will have a variety of experiences, but this analysis suggests that those instructors, departments, and institutions that put greater effort into the temporary adjustment (and resist it less) will be substantially more likely to see increases in demand beyond the modest national trajectory of the last decade or so.

There are several study limitations. First, the study does not include a sample of non-respondents, who may have a somewhat different profile. Second, the study draws from a single college and university; the profile derived here may vary significantly by type of student. Third, some survey statements may have led respondents to rate quality based on experience rather than to assess the general importance of online course elements. For example, “I felt comfortable participating in the course discussions” could be revised to “comfort in participating in course discussions.” The authors weighed differences among subgroups (e.g., among majors) as small and statistically insignificant; however, it is possible that differences between biology and marketing students would be significant, leading factors to be ordered differently. Emphasis and ordering might also vary at a community college versus a research-oriented university (Gonzalez, 2009).

Availability of data and materials

We will make the data available.

Al-Gahtani, S. S. (2016). Empirical investigation of e-learning acceptance and assimilation: A structural equation model. Applied Computing and Informatics, 12, 27–50.

Alqurashi, E. (2016). Self-efficacy in online learning environments: A literature review. Contemporary Issues in Education Research, 9(1), 45–52.

Anderson, T. (2016). A fourth presence for the Community of Inquiry model? Retrieved from https://virtualcanuck.ca/2016/01/04/a-fourth-presence-for-the-community-of-inquiry-model/ .

Annand, D. (2011). Social presence within the community of inquiry framework. The International Review of Research in Open and Distributed Learning, 12(5), 40.

Arbaugh, J. B. (2005). How much does “subject matter” matter? A study of disciplinary effects in on-line MBA courses. Academy of Management Learning & Education, 4(1), 57–73.

Arbaugh, J. B., Cleveland-Innes, M., Diaz, S. R., Garrison, D. R., Ice, P., Richardson, J. C., & Swan, K. P. (2008). Developing a community of inquiry instrument: Testing a measure of the Community of Inquiry framework using a multi-institutional sample. Internet and Higher Education, 11, 133–136.

Armellini, A., & De Stefani, M. (2016). Social presence in the 21st century: An adjustment to the Community of Inquiry framework. British Journal of Educational Technology, 47(6), 1202–1216.

Arruabarrena, R., Sánchez, A., Blanco, J. M., et al. (2019). Integration of good practices of active methodologies with the reuse of student-generated content. International Journal of Educational Technology in Higher Education, 16, #10.

Arthur, L. (2009). From performativity to professionalism: Lecturers’ responses to student feedback. Teaching in Higher Education, 14(4), 441–454.

Artino, A. R. (2010). Online or face-to-face learning? Exploring the personal factors that predict students’ choice of instructional format. Internet and Higher Education, 13, 272–276.

Asoodar, M., Vaezi, S., & Izanloo, B. (2016). Framework to improve e-learner satisfaction and further strengthen e-learning implementation. Computers in Human Behavior, 63, 704–716.

Bernard, R. M., et al. (2004). How does distance education compare with classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research, 74(3), 379–439.

Bollinger, D., & Martindale, T. (2004). Key factors for determining student satisfaction in online courses. International Journal on E-Learning, 3(1), 61–67.

Brinkley-Etzkorn, K. E. (2018). Learning to teach online: Measuring the influence of faculty development training on teaching effectiveness through a TPACK lens. The Internet and Higher Education, 38, 28–35.

Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin, 3, 7.

Choi, I., Land, S. M., & Turgeon, A. J. (2005). Scaffolding peer-questioning strategies to facilitate metacognition during online small group discussion. Instructional Science, 33, 483–511.

Clayton, K. E., Blumberg, F. C., & Anthony, J. A. (2018). Linkages between course status, perceived course value, and students’ preferences for traditional versus non-traditional learning environments. Computers & Education, 125, 175–181.

Cleveland-Innes, M., & Campbell, P. (2012). Emotional presence, learning, and the online learning environment. The International Review of Research in Open and Distributed Learning, 13(4), 269–292.

Cohen, A., & Baruth, O. (2017). Personality, learning, and satisfaction in fully online academic courses. Computers in Human Behavior, 72, 1–12.

Crews, T., & Butterfield, J. (2014). Data for flipped classroom design: Using student feedback to identify the best components from online and face-to-face classes. Higher Education Studies, 4(3), 38–47.

Dawson, P., Henderson, M., Mahoney, P., Phillips, M., Ryan, T., Boud, D., & Molloy, E. (2019). What makes for effective feedback: Staff and student perspectives. Assessment & Evaluation in Higher Education, 44(1), 25–36.

Drew, C., & Mann, A. (2018). Unfitting, uncomfortable, unacademic: A sociological reading of an interactive mobile phone app in university lectures. International Journal of Educational Technology in Higher Education, 15, #43.

Durabi, A., Arrastia, M., Nelson, D., Cornille, T., & Liang, X. (2011). Cognitive presence in asynchronous online learning: A comparison of four discussion strategies. Journal of Computer Assisted Learning, 27(3), 216–227.

Eom, S. B., Wen, H. J., & Ashill, N. (2006). The determinants of students’ perceived learning outcomes and satisfaction in university online education: An empirical investigation. Decision Sciences Journal of Innovative Education, 4(2), 215–235.

Espasa, A., & Meneses, J. (2010). Analysing feedback processes in an online teaching and learning environment: An exploratory study. Higher Education, 59(3), 277–292.

Farrell, O., & Brunton, J. (2020). A balancing act: A window into online student engagement experiences. International Journal of Educational Technology in Higher Education, 17, #25.

Fidalgo, P., Thormann, J., Kulyk, O., et al. (2020). Students’ perceptions on distance education: A multinational study. International Journal of Educational Technology in Higher Education, 17, #18.

Flores, Ò., del-Arco, I., & Silva, P. (2016). The flipped classroom model at the university: Analysis based on professors’ and students’ assessment in the educational field. International Journal of Educational Technology in Higher Education, 13, #21.

Garrison, D. R., Anderson, T., & Archer, W. (2003). A theory of critical inquiry in online distance education. Handbook of Distance Education, 1, 113–127.

Gong, D., Yang, H. H., & Cai, J. (2020). Exploring the key influencing factors on college students’ computational thinking skills through flipped-classroom instruction. International Journal of Educational Technology in Higher Education, 17, #19.

Gonzalez, C. (2009). Conceptions of, and approaches to, teaching online: A study of lecturers teaching postgraduate distance courses. Higher Education, 57(3), 299–314.

Grandzol, J. R., & Grandzol, C. J. (2006). Best practices for online business Education. International Review of Research in Open and Distance Learning , 7 (1), 1–18.

Green, S. B., & Salkind, N. J. (2003). Using SPSS: Analyzing and understanding data , (3rd ed., ). Upper Saddle River: Prentice Hall.

Hair, J. F., Black, W. C., Babin, B. J., & Anderson, R. E. (2014). Multivariate data analysis: Pearson new international edition . Essex: Pearson Education Limited.

Harjoto, M. A. (2017). Blended versus face-to-face: Evidence from a graduate corporate finance class. Journal of Education for Business , 92 (3), 129–137.

Hong, K.-S. (2002). Relationships between students’ instructional variables with satisfaction and learning from a web-based course. The Internet and Higher Education , 5 , 267–281.

Horvitz, B. S., Beach, A. L., Anderson, M. L., & Xia, J. (2015). Examination of faculty self-efficacy related to online teaching. Innovation Higher Education , 40 , 305–316.

Inside Higher Education and Gallup. (2019). The 2019 survey of faculty attitudes on technology. Author .

Jaggars, S. S., & Xu, D. (2016). How do online course design features influence student performance? Computers and Education , 95 , 270–284.

Joo, Y. J., Lim, K. Y., & Kim, E. K. (2011). Online university students’ satisfaction and persistence: Examining perceived level of presence, usefulness and ease of use as predictor in a structural model. Computers & Education , 57 (2), 1654–1664.

Jung, I. (2011). The dimensions of e-learning quality: From the learner’s perspective. Educational Technology Research and Development , 59 (4), 445–464.

Kay, R., MacDonald, T., & DiGiuseppe, M. (2019). A comparison of lecture-based, active, and flipped classroom teaching approaches in higher education. Journal of Computing in Higher Education , 31 , 449–471.

Kehrwald, B. (2008). Understanding social presence in text-based online learning environments. Distance Education , 29 (1), 89–106.

Kintu, M. J., Zhu, C., & Kagambe, E. (2017). Blended learning effectiveness: The relationship between student characteristics, design features and outcomes. International Journal of Educational Technology in Higher Education , 14 , #7.

Kuo, Y.-C., Walker, A. E., Schroder, K. E., & Belland, B. R. (2013). Interaction, internet self-efficacy, and self-regulated learning as predictors of student satisfaction in online education courses. Internet and Education , 20 , 35–50.

Lange, C., & Costley, J. (2020). Improving online video lectures: Learning challenges created by media. International Journal of Educational Technology in Higher Education , 17 , #16.

le Roux, I., & Nagel, L. (2018). Seeking the best blend for deep learning in a flipped classroom – Viewing student perceptions through the Community of Inquiry lens. International Journal of Educational Technology in High Education , 15 , #16.

Lee, H.-J., & Rha, I. (2009). Influence of structure and interaction on student achievement and satisfaction in web-based distance learning. Educational Technology & Society , 12 (4), 372–382.

Lee, Y., Stringer, D., & Du, J. (2017). What determines students’ preference of online to F2F class? Business Education Innovation Journal , 9 (2), 97–102.

Legon, R., & Garrett, R. (2019). CHLOE 3: Behind the numbers . Published online by Quality Matters and Eduventures. https://www.qualitymatters.org/sites/default/files/research-docs-pdfs/CHLOE-3-Report-2019-Behind-the-Numbers.pdf

Liaw, S.-S., & Huang, H.-M. (2013). Perceived satisfaction, perceived usefulness and interactive learning environments as predictors of self-regulation in e-learning environments. Computers & Education , 60 (1), 14–24.

Lu, F., & Lemonde, M. (2013). A comparison of online versus face-to-face students teaching delivery in statistics instruction for undergraduate health science students. Advances in Health Science Education , 18 , 963–973.

Lundin, M., Bergviken Rensfeldt, A., Hillman, T., Lantz-Andersson, A., & Peterson, L. (2018). Higher education dominance and siloed knowledge: a systematic review of flipped classroom research. International Journal of Educational Technology in Higher Education , 15 (1).

Macon, D. K. (2011). Student satisfaction with online courses versus traditional courses: A meta-analysis . Disssertation: Northcentral University, CA.

Mann, J., & Henneberry, S. (2012). What characteristics of college students influence their decisions to select online courses? Online Journal of Distance Learning Administration , 15 (5), 1–14.

Mansbach, J., & Austin, A. E. (2018). Nuanced perspectives about online teaching: Mid-career senior faculty voices reflecting on academic work in the digital age. Innovative Higher Education , 43 (4), 257–272.

Marks, R. B., Sibley, S. D., & Arbaugh, J. B. (2005). A structural equation model of predictors for effective online learning. Journal of Management Education , 29 (4), 531–563.

Martin, F., Wang, C., & Sadaf, A. (2018). Student perception of facilitation strategies that enhance instructor presence, connectedness, engagement and learning in online courses. Internet and Higher Education , 37 , 52–65.

Maycock, K. W. (2019). Chalk and talk versus flipped learning: A case study. Journal of Computer Assisted Learning , 35 , 121–126.

McGivney-Burelle, J. (2013). Flipping Calculus. PRIMUS Problems, Resources, and Issues in Mathematics Undergraduate . Studies , 23 (5), 477–486.

Mohammadi, H. (2015). Investigating users’ perspectives on e-learning: An integration of TAM and IS success model. Computers in Human Behavior , 45 , 359–374.

Nair, S. S., Tay, L. Y., & Koh, J. H. L. (2013). Students’ motivation and teachers’ teaching practices towards the use of blogs for writing of online journals. Educational Media International , 50 (2), 108–119.

Nguyen, T. (2015). The effectiveness of online learning: Beyond no significant difference and future horizons. MERLOT Journal of Online Learning and Teaching , 11 (2), 309–319.

Ni, A. Y. (2013). Comparing the effectiveness of classroom and online learning: Teaching research methods. Journal of Public Affairs Education , 19 (2), 199–215.

Nouri, J. (2016). The flipped classroom: For active, effective and increased learning – Especially for low achievers. International Journal of Educational Technology in Higher Education , 13 , #33.

O’Neill, D. K., & Sai, T. H. (2014). Why not? Examining college students’ reasons for avoiding an online course. Higher Education , 68 (1), 1–14.

O'Flaherty, J., & Phillips, C. (2015). The use of flipped classrooms in higher education: A scoping review. The Internet and Higher Education , 25 , 85–95.

Open & Distant Learning Quality Council (2012). ODLQC standards . England: Author https://www.odlqc.org.uk/odlqc-standards .

Ortagus, J. C. (2017). From the periphery to prominence: An examination of the changing profile of online students in American higher education. Internet and Higher Education , 32 , 47–57.

Otter, R. R., Seipel, S., Graef, T., Alexander, B., Boraiko, C., Gray, J., … Sadler, K. (2013). Comparing student and faculty perceptions of online and traditional courses. Internet and Higher Education , 19 , 27–35.

Paechter, M., Maier, B., & Macher, D. (2010). Online or face-to-face? Students’ experiences and preferences in e-learning. Internet and Higher Education , 13 , 292–329.

Prinsloo, P. (2016). (re)considering distance education: Exploring its relevance, sustainability and value contribution. Distance Education , 37 (2), 139–145.

Quality Matters (2018). Specific review standards from the QM higher Education rubric , (6th ed., ). MD: MarylandOnline.

Richardson, J. C., Maeda, Y., Lv, J., & Caskurlu, S. (2017). Social presence in relation to students’ satisfaction and learning in the online environment: A meta-analysis. Computers in Human Behavior , 71 , 402–417.

Rockhart, J. F., & Bullen, C. V. (1981). A primer on critical success factors . Cambridge: Center for Information Systems Research, Massachusetts Institute of Technology.

Rourke, L., & Kanuka, H. (2009). Learning in Communities of Inquiry: A Review of the Literature. The Journal of Distance Education / Revue de l'ducation Distance , 23 (1), 19–48 Athabasca University Press. Retrieved August 2, 2020 from https://www.learntechlib.org/p/105542/ .

Sebastianelli, R., Swift, C., & Tamimi, N. (2015). Factors affecting perceived learning, satisfaction, and quality in the online MBA: A structural equation modeling approach. Journal of Education for Business , 90 (6), 296–305.

Shen, D., Cho, M.-H., Tsai, C.-L., & Marra, R. (2013). Unpacking online learning experiences: Online learning self-efficacy and learning satisfaction. Internet and Higher Education , 19 , 10–17.

Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006). The comparative effectiveness of web-based and classroom instruction: A meta-analysis. Personnel Psychology , 59 (3), 623–664.

So, H. J., & Brush, T. A. (2008). Student perceptions of collaborative learning, social presence and satisfaction in a blended learning environment: Relationships and critical factors. Computers & Education , 51 (1), 318–336.

Song, L., Singleton, E. S., Hill, J. R., & Koh, M. H. (2004). Improving online learning: Student perceptions of useful and challenging characteristics. The Internet and Higher Education , 7 (1), 59–70.

Sun, P. C., Tsai, R. J., Finger, G., Chen, Y. Y., & Yeh, D. (2008). What drives a successful e-learning? An empirical investigation of the critical factors influencing learner satisfaction. Computers & Education , 50 (4), 1183–1202.

Takamine, K. (2017). Michelle D. miller: Minds online: Teaching effectively with technology. Higher Education , 73 , 789–791.

Tanner, J. R., Noser, T. C., & Totaro, M. W. (2009). Business faculty and undergraduate students’ perceptions of online learning: A comparative study. Journal of Information Systems Education , 20 (1), 29.

Tucker, B. (2012). The flipped classroom. Education Next , 12 (1), 82–83.

Van Wart, M., Ni, A., Ready, D., Shayo, C., & Court, J. (2020). Factors leading to online learner satisfaction. Business Educational Innovation Journal , 12 (1), 15–24.

Van Wart, M., Ni, A., Rose, L., McWeeney, T., & Worrell, R. A. (2019). Literature review and model of online teaching effectiveness integrating concerns for learning achievement, student satisfaction, faculty satisfaction, and institutional results. Pan-Pacific . Journal of Business Research , 10 (1), 1–22.

Ventura, A. C., & Moscoloni, N. (2015). Learning styles and disciplinary differences: A cross-sectional study of undergraduate students. International Journal of Learning and Teaching , 1 (2), 88–93.

Vlachopoulos, D., & Makri, A. (2017). The effect of games and simulations on higher education: A systematic literature review. International Journal of Educational Technology in Higher Education , 14 , #22.

Wang, Y., Huang, X., & Schunn, C. D. (2019). Redesigning flipped classrooms: A learning model and its effects on student perceptions. Higher Education , 78 , 711–728.

Wingo, N. P., Ivankova, N. V., & Moss, J. A. (2017). Faculty perceptions about teaching online: Exploring the literature using the technology acceptance model as an organizing framework. Online Learning , 21 (1), 15–35.

Xu, D., & Jaggars, S. S. (2014). Performance gaps between online and face-to-face courses: Differences across types of students and academic subject areas. Journal of Higher Education , 85 (5), 633–659.

Young, S. (2006). Student views of effective online teaching in higher education. American Journal of Distance Education , 20 (2), 65–77.

Zawacki-Richter, O., & Naidu, S. (2016). Mapping research trends from 35 years of publications in distance Education. Distance Education , 37 (3), 245–269.

Download references

Acknowledgements

No external funding.

Author information

Authors and affiliations

Development for the JHB College of Business and Public Administration, 5500 University Parkway, San Bernardino, California, 92407, USA

Montgomery Van Wart, Anna Ni, Pamela Medina, Jesus Canelon, Melika Kordrostami, Jing Zhang & Yu Liu


Contributions

All authors contributed equally. The author(s) read and approved the final manuscript.

Corresponding author

Correspondence to Montgomery Van Wart.

Ethics declarations

Competing interests

We have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article

Van Wart, M., Ni, A., Medina, P., et al. Integrating students’ perspectives about online learning: a hierarchy of factors. Int J Educ Technol High Educ 17, 53 (2020). https://doi.org/10.1186/s41239-020-00229-8


Received: 29 April 2020

Accepted: 30 July 2020

Published: 02 December 2020

DOI: https://doi.org/10.1186/s41239-020-00229-8


Keywords

  • Online education
  • Online teaching
  • Student perceptions
  • Online quality
  • Student presence


Original Research Article

Insights Into Students’ Experiences and Perceptions of Remote Learning Methods: From the COVID-19 Pandemic to Best Practice for the Future


  • 1 Minerva Schools at Keck Graduate Institute, San Francisco, CA, United States
  • 2 Ronin Institute for Independent Scholarship, Montclair, NJ, United States
  • 3 Department of Physics, University of Toronto, Toronto, ON, Canada

This spring, students across the globe transitioned from in-person classes to remote learning as a result of the COVID-19 pandemic. This unprecedented change to undergraduate education saw institutions adopting multiple online teaching modalities and instructional platforms. We sought to understand students’ experiences with and perspectives on those methods of remote instruction in order to inform pedagogical decisions during the current pandemic and in future development of online courses and virtual learning experiences. Our survey gathered quantitative and qualitative data regarding students’ experiences with synchronous and asynchronous methods of remote learning and specific pedagogical techniques associated with each. A total of 4,789 undergraduate participants representing institutions across 95 countries were recruited via Instagram. We find that most students prefer synchronous online classes, and students whose primary mode of remote instruction has been synchronous report being more engaged and motivated. Our qualitative data show that students miss the social aspects of learning on campus, and it is possible that synchronous learning helps to mitigate some feelings of isolation. Students whose synchronous classes include active-learning techniques (which are inherently more social) report significantly higher levels of engagement, motivation, enjoyment, and satisfaction with instruction. Respondents’ recommendations for changes emphasize increased engagement, interaction, and student participation. We conclude that active-learning methods, which are known to increase motivation, engagement, and learning in traditional classrooms, also have a positive impact in the remote-learning environment. Integrating these elements into online courses will improve the student experience.

Introduction

The COVID-19 pandemic has dramatically changed the demographics of online students. Previously, almost all students engaged in online learning elected the online format, starting with individual online courses in the mid-1990s through today’s robust online degree and certificate programs. These students prioritize convenience, flexibility, and the ability to work while studying, and are older than traditional college-age students (Harris and Martin, 2012; Levitz, 2016). They also find asynchronous elements of a course more useful than synchronous elements (Gillingham and Molinari, 2012). In contrast, students who choose to take courses in person prioritize face-to-face instruction and connection with others and skew considerably younger (Harris and Martin, 2012). This leaves open the question of whether students who prefer to learn in person but are forced to learn remotely will prefer synchronous or asynchronous methods. One study of student preferences following a switch to remote learning during the COVID-19 pandemic indicates that students prefer synchronous over asynchronous course elements and find them more effective (Gillis and Krull, 2020). Now that millions of traditionally in-person courses have transitioned online, our survey expands the data on student preferences and explores whether those preferences align with pedagogical best practices.

An extensive body of research has explored which instructional methods improve student learning outcomes (Fink, 2013). Considerable evidence indicates that active-learning or student-centered approaches result in better learning outcomes than passive-learning or instructor-centered approaches, both in person and online (Freeman et al., 2014; Chen et al., 2018; Davis et al., 2018). Active-learning approaches include student activities or discussion in class, whereas passive-learning approaches emphasize extensive exposition by the instructor (Freeman et al., 2014). Constructivist learning theories argue that students must be active participants in creating their own learning, and that listening to expert explanations is seldom sufficient to trigger the neurological changes necessary for learning (Bostock, 1998; Zull, 2002). Some studies conclude that, while students learn more via active learning, they may report greater perceptions of their learning and greater enjoyment when passive approaches are used (Deslauriers et al., 2019). We examine student perceptions of remote learning experiences in light of these previous findings.

In this study, we administered a survey focused on student perceptions of remote learning in late May 2020 through the social media account of @unjadedjade to a global population of English-speaking undergraduate students representing institutions across 95 countries. We aimed to explore how students were being taught, the relationship between pedagogical methods and student perceptions of their experience, and the reasons behind those perceptions. Here we present an initial analysis of the results and share our data set for further inquiry. We find that positive student perceptions correlate with synchronous courses that employ a variety of interactive pedagogical techniques, and that students overwhelmingly suggest behavioral and pedagogical changes that increase social engagement and interaction. We argue that these results support the importance of active learning in an online environment.

Materials and Methods

Participant Pool

Students were recruited through the Instagram account @unjadedjade. This social media platform, run by influencer Jade Bowler, focuses on education, effective study tips, and an ethical lifestyle, and promotes a positive mindset. For this reason, the audience is presumably academically inclined and interested in self-improvement. The survey was posted to her account and received 10,563 responses within the first 36 h. Here we analyze the 4,789 of those responses that came from undergraduates. While we did not collect demographic or identifying information, we suspect that women are overrepresented in these data, as followers of @unjadedjade are 80% women. A large minority of respondents were from the United Kingdom, as Jade Bowler is a British influencer. Specifically, 43.3% of participants attend United Kingdom institutions, followed by 6.7% attending university in the Netherlands, 6.1% in Germany, 5.8% in the United States, and 4.2% in Australia. Ninety additional countries are represented in these data (see Supplementary Figure 1).

Survey Design

The purpose of this survey is to learn about students’ instructional experiences following the transition to remote learning in the spring of 2020.

This survey was initially created for a student assignment in the undergraduate course Empirical Analysis at Minerva Schools at KGI. That version served as a robust pre-test and allowed for identification of the primary online platforms used and the four primary modes of learning: synchronous (live) classes, recorded lectures and videos, uploaded or emailed materials, and chat-based communication. We did not adapt any open-ended questions based on the pre-test survey, to avoid biasing the results, and only corrected language in questions for clarity. We used these data along with an analysis of common practices in online learning to revise the survey. Our revised survey asked students to identify the synchronous and asynchronous pedagogical methods and platforms that they were using for remote learning. Pedagogical methods were drawn from literature assessing active and passive teaching strategies in North American institutions (Fink, 2013; Chen et al., 2018; Davis et al., 2018). Open-ended questions asked students to describe why they preferred certain modes of learning and how they could improve their learning experience. Students also reported on their affective response to learning and participation using a Likert scale.

The revised survey also asked whether students had responded to the earlier survey. No significant differences were found between the responses of those answering for the first and second times (data not shown). See Supplementary Appendix 1 for survey questions. Survey data were collected from May 21 to May 23, 2020.

Qualitative Coding

We applied a qualitative coding framework adapted from Gale et al. (2013) to analyze student responses to open-ended questions. Four researchers read several hundred responses and noted themes that surfaced. We then developed a list of themes inductively from the survey data and deductively from the literature on pedagogical practice (Garrison et al., 1999; Zull, 2002; Fink, 2013; Freeman et al., 2014). The initial codebook was revised collaboratively based on feedback from researchers after each coded 20–80 qualitative comments. Before coding their assigned questions, alignment was examined through coding of 20 additional responses. Researchers aligned in identifying the same major themes; discrepancies in the terms identified were resolved through discussion. Researchers continued to meet weekly to discuss progress and alignment. The majority of responses were coded by a single researcher using the final codebook (Supplementary Table 1). All responses to questions 3 (4,318 responses) and 8 (4,704 responses), and 2,512 of the 4,776 responses to question 12, were analyzed. Valence was also indicated where necessary (i.e., positive or negative discussion of terms). This paper focuses on the most prevalent themes from our initial analysis of the qualitative responses. The corresponding author reviewed codes to ensure consistency and accuracy of reported data.

Statistical Analysis

The survey included two sets of Likert-scale questions, the first consisting of six statements about students’ perceptions of their experiences following the transition to remote learning (Table 1). For each statement, students indicated their level of agreement on a five-point scale ranging from 1 (“Strongly Disagree”) to 5 (“Strongly Agree”). The second set asked the students to respond to the same statements, but about their retroactive perceptions of their experiences with in-person instruction before the transition to remote learning. This set was not the subject of our analysis but is present in the published survey results. To explore correlations among student responses, we used CrossCat analysis to calculate the probability of dependence between Likert-scale responses (Mansinghka et al., 2016).


Table 1. Likert-scale questions.

Mean values are calculated based on the numerical scores associated with each response. Measures of statistical significance for comparisons between different subgroups of respondents were calculated using a two-sided Mann-Whitney U-test, and p-values reported here are based on this test statistic. We report effect sizes in pairwise comparisons using the common-language effect size, f, which is the probability that the response from a random sample from subgroup 1 is greater than the response from a random sample from subgroup 2. We also examined the effects of different modes of remote learning and technological platforms using ordinal logistic regression. With the exception of the mean values, all of these analyses treat Likert-scale responses as ordinal-scale, rather than interval-scale, data.
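The pairwise comparison described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code, and the sample Likert responses are invented:

```python
# Illustrative sketch (not the authors' code) of the analysis described above:
# a two-sided Mann-Whitney U test between two subgroups of Likert responses,
# with the common-language effect size f = U / (n1 * n2), i.e. the probability
# that a random response from subgroup 1 exceeds a random response from
# subgroup 2 (ties counted as 1/2).
from scipy.stats import mannwhitneyu

def compare_subgroups(group1, group2):
    u_stat, p_value = mannwhitneyu(group1, group2, alternative="two-sided")
    f = u_stat / (len(group1) * len(group2))  # common-language effect size
    return f, p_value

# Invented Likert responses (1 = Strongly Disagree ... 5 = Strongly Agree).
synchronous = [4, 5, 3, 4, 5, 4, 2, 5]
asynchronous = [2, 3, 3, 1, 4, 2, 3, 2]
f, p = compare_subgroups(synchronous, asynchronous)
print(f"f = {f:.3f}, p = {p:.4f}")
```

Because `mannwhitneyu` handles ties with midranks, `U / (n1 * n2)` matches the tie-corrected common-language effect size f reported throughout the results.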

Students Prefer Synchronous Class Sessions

Students were asked to identify their primary mode of learning given four categories of remote course design that emerged from the pilot survey and the literature on online teaching: live (synchronous) classes, recorded lectures and videos, emailed or uploaded materials, and chats and discussion forums. While 42.7% (n = 2,045) of students identified live classes as their primary mode of learning, 54.6% (n = 2,613) preferred this mode (Figure 1). Both recorded lectures and live classes were preferred over uploaded materials (6.22%, n = 298) and chat (3.36%, n = 161).


Figure 1. Actual (A) and preferred (B) primary modes of learning.

In addition to a preference for live classes, students whose primary mode was synchronous were more likely to enjoy the class, feel motivated and engaged, be satisfied with instruction, and report higher levels of participation (Table 2 and Supplementary Figure 2). Regardless of primary mode, over two-thirds of students reported they are often distracted during remote courses.


Table 2. The effect of synchronous vs. asynchronous primary modes of learning on student perceptions.

Variation in Pedagogical Techniques for Synchronous Classes Results in More Positive Perceptions of the Student Learning Experience

To survey the use of passive vs. active instructional methods, students reported the pedagogical techniques used in their live classes. Among the synchronous methods, we identify three categories (National Research Council, 2000; Freeman et al., 2014). Passive methods (P) include lectures, presentations, and explanation using diagrams, whiteboards, and/or other media; these all rely on instructor delivery rather than student participation. The next category represents active learning through primarily one-on-one interactions (A): in-class assessment, question-and-answer (Q&A), and classroom chat. Group interactions (F) include classroom discussions and small-group activities. Given these categories, Mann-Whitney U pairwise comparisons between the 7 possible combinations and Likert-scale responses about student experience showed that the use of a variety of methods resulted in higher ratings of experience than the use of a single method, whether that single method was active or passive (Table 3). Indeed, students whose classes used methods from each category (PAF) reported greater enjoyment, motivation, and satisfaction with instruction than those whose classes used any single method (p < 0.0001), and also reported higher participation and engagement than students whose only method was passive (P) or active through one-on-one interactions (A) (p < 0.00001). Student ratings of distraction were not significantly different for any comparison. Given that sets of Likert responses often appeared significant together in these comparisons, we ran a CrossCat analysis to look at the probability of dependence across Likert responses. Responses have a high probability of dependence on each other, limiting what we can claim about any discrete response (Supplementary Figure 3).


Table 3. Comparison of combinations of synchronous methods on student perceptions. Effect size (f).

Mann-Whitney U pairwise comparisons were also used to check whether improvement in student experience was associated with the number of methods used rather than the variety of types of methods. For every comparison, we found that more methods resulted in higher scores on all Likert measures except distraction (Table 4). Even comparing four or fewer methods with more than four methods, there was a 59% chance that students in the latter group enjoyed the courses more (p < 0.00001) and a 60% chance that they felt more motivated to learn (p < 0.00001). Students who selected more than four methods (n = 417) were also more satisfied with instruction, more engaged, and more actively participating, with f values of 65.1% (p < 0.00001), 62.9% (p < 0.00001), and 64.3% (p < 0.00001), respectively. Therefore, there was overlap between how the number and the variety of methods influenced students’ experiences. Since the number of techniques per category is 2–3, we cannot fully disentangle the effect of number vs. variety. Pairwise comparisons of subsets of data with 2–3 methods from a single group vs. 2–3 methods across groups controlled for this but had low sample numbers in most groups and resulted in no significant findings (data not shown). Therefore, from our survey data there seems to be an interdependence between the number and variety of methods in students’ learning experiences.

www.frontiersin.org

Table 4. Comparison of the number of synchronous methods on student perceptions. Effect size (f).

Variation in Asynchronous Pedagogical Techniques Results in More Positive Perceptions of the Student Learning Experience

Along with synchronous pedagogical methods, students reported the asynchronous methods used in their classes. We divided these methods into three main categories and conducted pairwise comparisons. Learning methods include video lectures, video content, and posted study materials. Interacting methods include discussion/chat forums, live office hours, and email Q&A with professors. Testing methods include assignments and exams. Our results again show the importance of variety in students' perceptions (Table 5). For example, compared with providing learning materials only, providing learning materials, interaction, and testing improved enjoyment (f = 0.546, p < 0.001), motivation (f = 0.553, p < 0.0001), satisfaction with instruction (f = 0.596, p < 0.00001), engagement (f = 0.572, p < 0.00001), and active participation (f = 0.563, p < 0.00001) (row 6). Similarly, compared with interaction methods alone, the combination of all three categories improved five of the six indicators, all except distraction in class (row 11).

www.frontiersin.org

Table 5. Comparison of combinations of asynchronous methods on student perceptions. Effect size (f).

Ordinal logistic regression was used to assess the likelihood that the platforms students used predicted student perceptions ( Supplementary Table 2 ). Platform choices were based on the answers to open-ended questions in the pre-test survey. The synchronous and asynchronous methods used were consistently more predictive of Likert responses than the specific platforms. Likewise, distraction continued to be our outlier with no differences across methods or platforms.
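An ordinal (proportional-odds) logistic regression of the kind used here can be sketched from first principles. This is a minimal hand-rolled fit on synthetic data, not the authors' model, data, or software; the binary predictor `x` and the cut points are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(2)

# Synthetic data: x = 1 if a class combined active and passive methods,
# 0 if it used a single method; y codes a 1-5 Likert rating as 0..4.
n = 500
x = rng.integers(0, 2, size=n)
latent = 0.8 * x + rng.logistic(size=n)     # true slope beta = 0.8
cuts = [-1.0, 0.0, 1.0, 2.0]
y = np.digitize(latent, cuts)

K = 5  # number of ordered categories

def neg_log_likelihood(params):
    beta = params[0]
    # A first cut plus positive increments keeps the thresholds ordered.
    theta = np.cumsum(np.concatenate(([params[1]], np.exp(params[2:]))))
    eta = beta * x
    # P(y <= k) at the K-1 thresholds, padded with 0 below and 1 above.
    cum = expit(theta[:, None] - eta[None, :])
    cum = np.vstack([np.zeros(n), cum, np.ones(n)])
    probs = cum[y + 1, np.arange(n)] - cum[y, np.arange(n)]
    return -np.sum(np.log(np.clip(probs, 1e-12, None)))

init = np.concatenate(([0.0, -1.0], np.zeros(K - 2)))
fit = minimize(neg_log_likelihood, init, method="BFGS")
print(f"estimated slope = {fit.x[0]:.2f} (true 0.8)")
```

A positive fitted slope means the predictor shifts responses toward higher Likert categories, which is the sense in which methods "predict" perceptions in the paper.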

Students Prefer In-Person and Synchronous Online Learning Largely Due to Social-Emotional Reasoning

As expected, 86.1% (4,123) of survey participants report a preference for in-person courses, while 13.9% (666) prefer online courses. When asked to explain the reasons for their preference, students who prefer in-person courses most often mention the importance of social interaction (693 mentions), engagement (639 mentions), and motivation (440 mentions). These students are also more likely to mention a preference for a fixed schedule (185 mentions) vs. a flexible schedule (2 mentions).

In addition to identifying social reasons for their preference for in-person learning, students' suggestions for improving online learning focus primarily on increasing interaction and engagement, with 845 mentions of live classes, 685 mentions of interaction, and 126 calls for increased participation. Typical suggestions include: “Smaller teaching groups for live sessions so that everyone is encouraged to talk as some people don’t say anything and don’t participate in group work,” and “Make it less of the professor reading the pdf that was given to us and more interaction.”

Students who prefer online learning primarily identify independence and flexibility (214 mentions) and reasons related to anxiety and discomfort in in-person settings (41 mentions). Anxiety was only mentioned 12 times in the much larger group that prefers in-person learning.

The preference for synchronous vs. asynchronous modes of learning follows similar trends ( Table 6 ). Students who prefer live classes mention engagement and interaction most often while those who prefer recorded lectures mention flexibility.

www.frontiersin.org

Table 6. Most prevalent themes for students based on their preferred mode of remote learning.

Student Perceptions Align With Research on Active Learning

The first, and most robust, conclusion is that incorporation of active-learning methods correlates with more positive student perceptions of affect and engagement. We can see this clearly in the substantial differences on a number of measures, where students whose classes used only passive-learning techniques reported lower levels of engagement, satisfaction, participation, and motivation when compared with students whose classes incorporated at least some active-learning elements. This result is consistent with prior research on the value of active learning ( Freeman et al., 2014 ).

Though research shows that student learning improves in active-learning classes on campus, student perceptions of their learning, enjoyment, and satisfaction with instruction are often lower in active-learning courses (Deslauriers et al., 2019). Our finding that students rate enjoyment and satisfaction with instruction higher for active learning online suggests that the preference for passive lectures on campus relies on elements outside the lecture itself: the lecture-hall environment, the physical presence of peers, or the normalization of passive lectures as the expected mode for on-campus classes. This implies that there may be more buy-in for active learning online than in person.

A second result from our survey is that student perceptions of affect and engagement are associated with students experiencing a greater diversity of learning modalities. We see this in two different results. First, in addition to the fact that classes that include active learning outperform classes that rely solely on passive methods, we find that on all measures besides distraction, the highest student ratings are associated with a combination of active and passive methods. Second, we find that these higher scores are associated with classes that make use of a larger number of different methods.

This second result suggests that students benefit from classes that make use of multiple different techniques, possibly invoking a combination of passive and active methods. However, it is unclear from our data whether this effect is associated specifically with combining active and passive methods, or if it is associated simply with the use of multiple different methods, irrespective of whether those methods are active, passive, or some combination. The problem is that the number of methods used is confounded with the diversity of methods (e.g., it is impossible for a classroom using only one method to use both active and passive methods). In an attempt to address this question, we looked separately at the effect of number and diversity of methods while holding the other constant. Across a large number of such comparisons, we found few statistically significant differences, which may be a consequence of the fact that each comparison focused on a small subset of the data.

Thus, our data suggest that using a greater diversity of learning methods in the classroom may lead to better student outcomes. This is supported by research on attention span, which suggests varying delivery every 10–15 min to retain students' attention (Bradbury, 2016). This is likely even more relevant for online learning, where students report high levels of distraction across methods, modalities, and platforms. Given that number and variety are key, and that there are few passive learning methods, we can assume that some combination of methods that includes active learning improves the student experience. However, it is not clear whether this benefit would come simply from increasing the number of different methods used, or whether there are benefits specific to combining particular methods. Disentangling these effects would be an interesting avenue for future research.

Students Value Social Presence in Remote Learning

Student responses across our open-ended survey questions show a striking difference in reasons for their preferences compared with traditional online learners, who prefer flexibility (Harris and Martin, 2012; Levitz, 2016). Students' reasons for preferring in-person classes and synchronous remote classes emphasize the desire for social interaction and echo the research on the importance of social presence for learning in online courses.

Short et al. (1976) outlined Social Presence Theory, depicting students’ perceptions of each other as real across different means of telecommunication. These ideas translate directly to questions surrounding online education and pedagogy, particularly educational design in networked learning, where connection among learners and instructors improves learning outcomes, especially with “Human-Human interaction” (Goodyear, 2002, 2005; Tu, 2002). They bear directly on asynchronous vs. synchronous learning: Tu reports students responding positively to synchronous “real-time discussion in pleasantness, responsiveness and comfort with familiar topics,” with real-time discussions edging out asynchronous computer-mediated communications in immediate replies and responsiveness. Tu’s research indicates that students perceive more interaction with synchronous media such as discussions because of their immediacy, which enhances social presence and supports the use of active learning techniques (Gunawardena, 1995; Tu, 2002). Thus, verbal immediacy and communities with face-to-face interactions, such as those in synchronous learning classrooms, lessen the psychological distance of communicators online and can simultaneously improve instructional satisfaction and reported learning (Gunawardena and Zittle, 1997; Richardson and Swan, 2019; Shea et al., 2019). While synchronous learning may not be ideal for traditional online students and a subset of our participants, this research suggests that non-traditional online learners are more likely to appreciate the value of social presence.

Social presence also connects to the importance of social connections in learning. Too often, current systems of education emphasize course content in narrow ways that fail to embrace the full humanity of students and instructors ( Gay, 2000 ). With the COVID-19 pandemic leading to further social isolation for many students, the importance of social presence in courses, including live interactions that build social connections with classmates and with instructors, may be increased.

Limitations of These Data

Our undergraduate data consisted of 4,789 responses from 95 different countries, an unprecedented global scale for research on online learning. However, since respondents were followers of @unjadedjade, who focuses on learning and wellness, they may not represent the average student. Survey responses are often biased by recruitment techniques; ours likely produced more robust and thoughtful answers to free-response questions and may have inflated the preference for synchronous classes. It is unlikely to have changed students' reporting of remote learning pedagogical methods, since those are outside student control.

Though we surveyed a global population, our design was rooted in literature assessing pedagogy in North American institutions. Therefore, our survey may not represent a global array of teaching practices.

This survey was sent out during the initial phase of emergency remote learning for most countries. This has two important implications. First, perceptions of remote learning may be clouded by complications of the pandemic which has increased social, mental, and financial stresses globally. Future research could disaggregate the impact of the pandemic from students’ learning experiences with a more detailed and holistic analysis of the impact of the pandemic on students.

Second, instructors, students and institutions were not able to fully prepare for effective remote education in terms of infrastructure, mentality, curriculum building, and pedagogy. Therefore, student experiences reflect this emergency transition. Single-modality courses may correlate with instructors who lacked the resources or time to learn or integrate more than one modality. Regardless, the main insights of this research align well with the science of teaching and learning and can be used to inform both education during future emergencies and course development for online programs that wish to attract traditional college students.

Global Student Voices Improve Our Understanding of the Experience of Emergency Remote Learning

Our survey shows that global student perspectives on remote learning agree with pedagogical best practices, breaking with the often-found negative reactions of students to these practices in traditional classrooms ( Shekhar et al., 2020 ). Our analysis of open-ended questions and preferences show that a majority of students prefer pedagogical approaches that promote both active learning and social interaction. These results can serve as a guide to instructors as they design online classes, especially for students whose first choice may be in-person learning. Indeed, with the near ubiquitous adoption of remote learning during the COVID-19 pandemic, remote learning may be the default for colleges during temporary emergencies. This has already been used at the K-12 level as snow days become virtual learning days ( Aspergren, 2020 ).

In addition to informing pedagogical decisions, the results of this survey can inform future research. Although we surveyed a global population, our recruitment method selected for students who are English speakers, likely majority female, and interested in self-improvement. Repeating this study with a more diverse and representative sample of university students could improve the generalizability of our findings. While the use of a variety of pedagogical methods is better than a single method, more research is needed to determine the optimal combinations and implementations for courses in different disciplines. Though we identified social presence as the major trend in student responses, the over 12,000 open-ended responses from students could be analyzed in greater detail to gain a more nuanced understanding of student preferences and suggestions for improvement. Likewise, outliers could shed light on the diversity of student perspectives that we may encounter in our own classrooms. Beyond this, our findings can inform research that collects demographic data and/or measures learning outcomes to understand the impact of remote learning on different populations.

Importantly, this paper focuses on a subset of responses from the full data set, which includes 10,563 students from secondary school, undergraduate, graduate, or professional school, along with additional questions about in-person learning. Our full data set is available for anyone to download for continued exploration: https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/2TGOPH

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Ethics Statement

Ethical review and approval was not required for the study on human participants in accordance with the local legislation and institutional requirements. The patients/participants provided their written informed consent to participate in this study.

Author Contributions

GS: project lead, survey design, qualitative coding, writing, review, and editing. TN: data analysis, writing, review, and editing. CN and PB: qualitative coding. JW: data analysis, writing, and editing. CS: writing, review, and editing. EV and KL: original survey design and qualitative coding. PP: data analysis. JB: original survey design and survey distribution. HH: data analysis. MP: writing. All authors contributed to the article and approved the submitted version.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

We want to thank Minerva Schools at KGI for providing funding for summer undergraduate research internships. We also want to thank Josh Fost and Christopher V. H.-H. Chen for discussion that helped shape this project.

Supplementary Material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/feduc.2021.647986/full#supplementary-material

Aspergren, E. (2020). Snow Days Canceled Because of COVID-19 Online School? Not in These School Districts.sec. Education. USA Today. Available online at: https://www.usatoday.com/story/news/education/2020/12/15/covid-school-canceled-snow-day-online-learning/3905780001/ (accessed December 15, 2020).


Bostock, S. J. (1998). Constructivism in mass higher education: a case study. Br. J. Educ. Technol. 29, 225–240. doi: 10.1111/1467-8535.00066


Bradbury, N. A. (2016). Attention span during lectures: 8 seconds, 10 minutes, or more? Adv. Physiol. Educ. 40, 509–513. doi: 10.1152/advan.00109.2016


Chen, B., Bastedo, K., and Howard, W. (2018). Exploring best practices for online STEM courses: active learning, interaction & assessment design. Online Learn. 22, 59–75. doi: 10.24059/olj.v22i2.1369

Davis, D., Chen, G., Hauff, C., and Houben, G.-J. (2018). Activating learning at scale: a review of innovations in online learning strategies. Comput. Educ. 125, 327–344. doi: 10.1016/j.compedu.2018.05.019

Deslauriers, L., McCarty, L. S., Miller, K., Callaghan, K., and Kestin, G. (2019). Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proc. Natl. Acad. Sci. 116, 19251–19257. doi: 10.1073/pnas.1821936116

Fink, L. D. (2013). Creating Significant Learning Experiences: An Integrated Approach to Designing College Courses. Somerset, NJ: John Wiley & Sons, Incorporated.

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., et al. (2014). Active learning increases student performance in science, engineering, and mathematics. Proc. Natl. Acad. Sci. 111, 8410–8415. doi: 10.1073/pnas.1319030111

Gale, N. K., Heath, G., Cameron, E., Rashid, S., and Redwood, S. (2013). Using the framework method for the analysis of qualitative data in multi-disciplinary health research. BMC Med. Res. Methodol. 13:117. doi: 10.1186/1471-2288-13-117

Garrison, D. R., Anderson, T., and Archer, W. (1999). Critical inquiry in a text-based environment: computer conferencing in higher education. Internet High. Educ. 2, 87–105. doi: 10.1016/S1096-7516(00)00016-6

Gay, G. (2000). Culturally Responsive Teaching: Theory, Research, and Practice. Multicultural Education Series. New York, NY: Teachers College Press.

Gillingham, and Molinari, C. (2012). Online courses: student preferences survey. Internet Learn. 1, 36–45. doi: 10.18278/il.1.1.4

Gillis, A., and Krull, L. M. (2020). COVID-19 remote learning transition in spring 2020: class structures, student perceptions, and inequality in college courses. Teach. Sociol. 48, 283–299. doi: 10.1177/0092055X20954263

Goodyear, P. (2002). “Psychological foundations for networked learning,” in Networked Learning: Perspectives and Issues. Computer Supported Cooperative Work , eds C. Steeples and C. Jones (London: Springer), 49–75. doi: 10.1007/978-1-4471-0181-9_4

Goodyear, P. (2005). Educational design and networked learning: patterns, pattern languages and design practice. Australas. J. Educ. Technol. 21, 82–101. doi: 10.14742/ajet.1344

Gunawardena, C. N. (1995). Social presence theory and implications for interaction and collaborative learning in computer conferences. Int. J. Educ. Telecommun. 1, 147–166.

Gunawardena, C. N., and Zittle, F. J. (1997). Social presence as a predictor of satisfaction within a computer mediated conferencing environment. Am. J. Distance Educ. 11, 8–26. doi: 10.1080/08923649709526970

Harris, H. S., and Martin, E. (2012). Student motivations for choosing online classes. Int. J. Scholarsh. Teach. Learn. 6, 1–8. doi: 10.20429/ijsotl.2012.060211

Levitz, R. N. (2016). 2015-16 National Online Learners Satisfaction and Priorities Report. Cedar Rapids: Ruffalo Noel Levitz, 12.

Mansinghka, V., Shafto, P., Jonas, E., Petschulat, C., Gasner, M., and Tenenbaum, J. B. (2016). CrossCat: a fully Bayesian nonparametric method for analyzing heterogeneous, high dimensional data. J. Mach. Learn. Res. 17, 1–49. doi: 10.1007/978-0-387-69765-9_7

National Research Council (2000). How People Learn: Brain, Mind, Experience, and School: Expanded Edition. Washington, DC: National Academies Press, doi: 10.17226/9853

Richardson, J. C., and Swan, K. (2019). Examining social presence in online courses in relation to students’ perceived learning and satisfaction. Online Learn. 7, 68–88. doi: 10.24059/olj.v7i1.1864

Shea, P., Pickett, A. M., and Pelz, W. E. (2019). A follow-up investigation of “teaching presence” in the SUNY Learning Network. Online Learn. 7, 73–75. doi: 10.24059/olj.v7i2.1856

Shekhar, P., Borrego, M., DeMonbrun, M., Finelli, C., Crockett, C., and Nguyen, K. (2020). Negative student response to active learning in STEM classrooms: a systematic review of underlying reasons. J. Coll. Sci. Teach. 49, 45–54.

Short, J., Williams, E., and Christie, B. (1976). The Social Psychology of Telecommunications. London: John Wiley & Sons.

Tu, C.-H. (2002). The measurement of social presence in an online learning environment. Int. J. E Learn. 1, 34–45. doi: 10.17471/2499-4324/421

Zull, J. E. (2002). The Art of Changing the Brain: Enriching Teaching by Exploring the Biology of Learning , 1st Edn. Sterling, VA: Stylus Publishing.

Keywords : online learning, COVID-19, active learning, higher education, pedagogy, survey, international

Citation: Nguyen T, Netto CLM, Wilkins JF, Bröker P, Vargas EE, Sealfon CD, Puthipiroj P, Li KS, Bowler JE, Hinson HR, Pujar M and Stein GM (2021) Insights Into Students’ Experiences and Perceptions of Remote Learning Methods: From the COVID-19 Pandemic to Best Practice for the Future. Front. Educ. 6:647986. doi: 10.3389/feduc.2021.647986

Received: 30 December 2020; Accepted: 09 March 2021; Published: 09 April 2021.


Copyright © 2021 Nguyen, Netto, Wilkins, Bröker, Vargas, Sealfon, Puthipiroj, Li, Bowler, Hinson, Pujar and Stein. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Geneva M. Stein, [email protected]

This article is part of the Research Topic

Covid-19 and Beyond: From (Forced) Remote Teaching and Learning to ‘The New Normal’ in Higher Education


The effects of online education on academic success: A meta-analysis study

  • Published: 06 September 2021
  • Volume 27 , pages 429–450, ( 2022 )


  • Hakan Ulum, ORCID: orcid.org/0000-0002-1398-6935


The purpose of this study is to analyze the effect of online education, which has been used extensively since the beginning of the pandemic, on student achievement. To this end, a meta-analysis was carried out of studies examining the effect of online education on students’ academic achievement in several countries between 2010 and 2021. This study will also serve as a source for future work comparing the effect of online education on academic achievement before and after the pandemic. The meta-analysis comprises 27 studies in total, conducted in the USA, Taiwan, Turkey, China, the Philippines, Ireland, and Georgia. The included studies are experimental, with a total sample size of 1,772. The funnel plot, Duval and Tweedie’s trim-and-fill analysis, Orwin’s fail-safe N, and Egger’s regression test were used to assess publication bias, which was found to be quite low. Hedges’ g was employed to measure the effect size for the difference between means, computed under the random-effects model. The results show that the effect size of online education on academic achievement is at a medium level. The heterogeneity test results indicate that the effect size does not differ in terms of the moderators of class level, country, online education approach, and lecture.
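The publication-bias check named above, Egger's regression test, can be sketched as follows: regress each study's standardized effect on its precision and test whether the intercept differs from zero (a symmetric funnel plot corresponds to an intercept near zero). The study data here are simulated; the number of studies, the effect size, and the standard-error range are invented for illustration and are not the paper's data.

```python
import numpy as np
from scipy.stats import linregress, t as tdist

rng = np.random.default_rng(3)

# Hypothetical per-study effect sizes (Hedges' g) and standard errors for
# k = 27 studies; the true effect is 0.5 with no publication bias built in.
k = 27
se = rng.uniform(0.10, 0.40, size=k)
g = 0.5 + rng.normal(0.0, se)

# Egger's test: regress the standardized effect (g / se) on precision
# (1 / se) and test the intercept against zero.
res = linregress(1 / se, g / se)
t_stat = res.intercept / res.intercept_stderr
p_intercept = 2 * tdist.sf(abs(t_stat), df=k - 2)
print(f"Egger intercept = {res.intercept:.3f}, p = {p_intercept:.3f}")
```

Small studies that only get published when their effects are large would pull the intercept away from zero, which is the asymmetry the test detects.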


1 Introduction

Information and communication technologies have become a powerful force in transforming educational settings around the world. The pandemic has been an important factor in moving traditional physical classroom settings toward information and communication technologies, and it has accelerated this transformation. The literature supports that learning environments connected to information and communication technologies highly satisfy students; therefore, we need to sustain interest in technology-based learning environments. Clearly, technology has had a huge impact on young people's online lives, and this digital revolution can synergize with the educational ambitions and interests of digitally immersed students. In essence, COVID-19 has provided us with an opportunity to embrace online learning, as education systems have to keep up with the rapid emergence of new technologies.

Information and communication technologies, which affect all spheres of life, are also actively used in the field of education. With recent developments, using technology in education has become inevitable for personal and social reasons (Usta, 2011a). Online education is one example of using information and communication technologies that has emerged from these technological developments. It is also clear that online learning is a popular way of obtaining instruction (Demiralay et al., 2016; Pillay et al., 2007), defined by Horton (2000) as education delivered through a web browser or an online application without requiring extra software or additional learning resources. Furthermore, online learning is described as a way of using the internet to access learning resources, to interact with the content, the teacher, and other learners, and to obtain support throughout the learning process (Ally, 2004). Online learning has such benefits as learning independently at any time and place (Vrasidas & McIsaac, 2000), convenience (Poole, 2000), flexibility (Chizmar & Walbert, 1999), self-regulation skills (Usta, 2011b), collaborative learning, and the opportunity to plan one's own learning process.

Even though online education practices were previously not as comprehensive as they are now, the internet and computers have long been used in education as alternative learning tools in line with advances in technology. The first distance education attempt in the world was initiated by the 'Steno Courses' announcement published in a Boston newspaper in 1728. In the nineteenth century, a Swedish university started 'Correspondence Composition Courses' for women, and the University Correspondence College was later founded for correspondence courses in 1843 (Arat & Bakan, 2011). More recently, distance education has been delivered through computers, assisted by internet technologies, and it has since evolved into mobile education practice, driven by increases in internet connection speed and the development of mobile devices.

With the emergence of the pandemic (Covid-19), face-to-face education came almost to a halt, and online education gained significant importance. Microsoft's management team reported 750 users involved in online education activities on March 10, just before the pandemic; by March 24, the number of users had increased dramatically to 138,698 (OECD, 2020). This supports the view that online education should be used widely, rather than merely as an alternative to traditional education, when students do not have the opportunity for face-to-face education (Geostat, 2019). The Covid-19 pandemic emerged suddenly as a state of limited opportunities, and face-to-face education stopped for a long time. The global spread of Covid-19 affected more than 850 million students around the world and caused the suspension of face-to-face education. Different countries proposed several solutions to maintain the education process during the pandemic. Schools had to change their curricula, and many countries supported online education practices soon after the pandemic began; in other words, traditional education gave way to online education practices. At least 96 countries have been motivated to provide access to online libraries, TV broadcasts, instructions, resources, video lectures, and online channels (UNESCO, 2020). In such a painful period, educational institutions moved to online education practices with the help of large companies such as Microsoft, Google, Zoom, Skype, FaceTime, and Slack. Thus, online education has been discussed on the education agenda more intensively than ever before.

Although online education approaches were not previously used as comprehensively as they are now, they served as an alternative learning approach in education for a long time, in parallel with the development of technology, the internet, and computers. Online education approaches are often employed with the aim of promoting students' academic achievement, and academics in various countries have conducted many studies evaluating them and published the results. However, the accumulation of scientific data on online education approaches creates difficulties in keeping up with, organizing, and synthesizing the findings. Studies in this area are being conducted at an increasing rate, making it difficult for scientists to be aware of all the research outside of their expertise. Another problem is that online education studies are repetitive: studies often use slightly different methods, measures, and/or samples to avoid duplication, which makes it difficult to distinguish significant differences in the results. In other words, when study results differ significantly, it may be hard to say which variable explains those differences. One obvious solution to these problems is to systematically review the results of various studies and uncover the sources of variation. One method of performing such systematic syntheses is meta-analysis, a methodological and statistical approach to drawing conclusions from the literature. At this point, how effective online education applications are in increasing academic success is an important question. Has online education, which we are likely to encounter frequently as the pandemic continues, been successful over the last ten years? If so, how large was the impact? Did different variables influence this effect?
Academics across the globe have carried out studies evaluating online education platforms and have published the related results (Chiao et al., 2018). It is important to evaluate the results of the studies published so far and those to be published in the future. Has online education been successful? If it has, how big is the impact? Do different variables affect this impact? What should we consider in upcoming online education practices? These questions motivated the present study. We have conducted a comprehensive meta-analysis that aims to provide a discussion platform for developing efficient online programs for educators and policy makers by reviewing the related studies on online education, presenting the effect size, and revealing the effect of diverse variables on the overall impact.

There have been many critical discussions and comprehensive studies on the differences between online and face-to-face learning; however, the focus of this paper is different in that it clarifies the magnitude of the effect of online education and teaching, and identifies which factors should be controlled to help increase the effect size. Indeed, the purpose here is to support conscious decisions in the implementation of the online education process.

This study will reveal the general impact of online education on academic achievement, providing an opportunity to get an overview of the online education that has been practiced and discussed intensively during the pandemic period. Moreover, the general impact of online education on academic achievement will be analyzed with respect to different variables. In other words, the current study allows a comprehensive evaluation of the results from the related literature, analyzed across several cultures, courses, and class levels. Considering all these points, this study seeks to answer the following research questions:

What is the effect size of online education on academic achievement?

How do the effect sizes of online education on academic achievement change according to the moderator variable of the country?

How do the effect sizes of online education on academic achievement change according to the moderator variable of the class level?

How do the effect sizes of online education on academic achievement change according to the moderator variable of the course (school subject)?

How do the effect sizes of online education on academic achievement change according to the moderator variable of the online education approaches?

This study aims to determine, by means of a meta-analysis, the effect size of online education, which has been used heavily since the beginning of the pandemic, on students' academic achievement in different courses. Meta-analysis is a synthesis method that enables gathering several study results accurately and efficiently and obtaining an overall result (Tsagris & Fragkos, 2018).
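Because the included experimental studies compare an online group with a face-to-face group, the effect size being pooled is a standardized mean difference (reported later as Hedges' g). As a minimal sketch, assuming each primary study reports group means, standard deviations, and sample sizes (the numbers below are hypothetical):

```python
import math

def hedges_g(m_exp, m_ctrl, sd_exp, sd_ctrl, n_exp, n_ctrl):
    """Standardized mean difference with Hedges' small-sample correction."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n_exp - 1) * sd_exp ** 2 + (n_ctrl - 1) * sd_ctrl ** 2)
                   / (n_exp + n_ctrl - 2))
    d = (m_exp - m_ctrl) / sp                # Cohen's d
    j = 1 - 3 / (4 * (n_exp + n_ctrl) - 9)   # small-sample correction factor
    return j * d

# Hypothetical study: online group vs. face-to-face control
g = hedges_g(78.0, 72.0, 12.0, 13.0, 30, 30)
```

The correction factor shrinks Cohen's d slightly, which matters for the small samples typical of primary-school experiments.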

2.1 Selecting and coding the data (studies)

The literature required for the meta-analysis was reviewed in July 2020, and a follow-up review was conducted in September 2020. The purpose of the follow-up review was to include any studies that were published while this study was being conducted and that met the inclusion criteria; however, no such studies were found.

In order to access the studies for the meta-analysis, the Web of Science, ERIC, and SCOPUS databases were searched using the keywords 'online learning' and 'online education'. Not every database has a search engine that grants access to the studies by simply entering keywords, and this obstacle had to be overcome. Therefore, a specially designed platform was utilized: through the open access system of the Cukurova University Library, detailed searches were run with EBSCO Information Services (EBSCO), which allows the whole collection of research to be searched through a single search box. Since the fundamental variables of this study are online education and online learning, the literature in the related databases (Web of Science, ERIC, and SCOPUS) was systematically searched with these keywords. Within this scope, 225 articles were retrieved, and the studies were entered into a coding key prepared by the researcher. The names of the researchers, the year, the database (Web of Science, ERIC, or SCOPUS), the sample group and size, the courses in which academic achievement was tested, the country where the study was conducted, and the class levels were all recorded in this coding key.

The following criteria were identified for including the 225 coded studies, based on the theoretical basis of the meta-analysis: (1) the studies should be published in refereed journals between 2010 and 2020; (2) the studies should be experimental studies examining the effect of online education and online learning on academic achievement; (3) the values of the stated variables, or the statistics required to calculate them, should be reported in the results; and (4) the sample group should be at the primary education level. These criteria also served as exclusion criteria, in the sense that studies not meeting them were excluded from the present study.

After the inclusion criteria were determined, a systematic review was conducted through EBSCO, starting with the year criterion. Within this scope, 290,365 studies analyzing the effect of online education and online learning on academic achievement were retrieved. The database criterion (Web of Science, ERIC, and SCOPUS) was then applied as a filter, reducing the number of studies to 58,616. Afterwards, the keyword 'primary education' was applied, and the number of studies decreased to 3152. Lastly, the literature was filtered with the keyword 'academic achievement', yielding 225 studies. All the information for these 225 articles was entered into the coding key.

It is necessary for coders to review the related studies accurately and to check the validity, reliability, and accuracy of the studies (Stewart & Kamins, 2001). Within this scope, the studies identified on the basis of the study variables were first reviewed by three researchers from the field of primary education; the retrieved studies were then combined and processed in the coding key by the researcher. All of these studies were checked against the inclusion criteria by all the researchers in joint meetings, and it was decided that 27 studies met the inclusion criteria (Atici & Polat, 2010; Carreon, 2018; Ceylan & Elitok Kesici, 2017; Chae & Shin, 2016; Chiang et al., 2014; Ercan, 2014; Ercan et al., 2016; Gwo-Jen et al., 2018; Hayes & Stewart, 2016; Hwang et al., 2012; Kert et al., 2017; Lai & Chen, 2010; Lai et al., 2015; Meyers et al., 2015; Ravenel et al., 2014; Sung et al., 2016; Wang & Chen, 2013; Yu, 2019; Yu & Chen, 2014; Yu & Pan, 2014; Yu et al., 2010; Zhong et al., 2017). The data from the studies meeting the inclusion criteria were independently processed in a second coding key by the three researchers, and consensus meetings were arranged for further discussion. After the meetings, the researchers agreed that the data had been coded accurately and precisely. Having identified the effect sizes and the heterogeneity of the study, moderator variables that might explain the differences between the effect sizes were determined. The data related to these moderator variables were added to the coding key by the three researchers, and a new consensus meeting was arranged, after which the researchers agreed that the moderator variables had been coded accurately and precisely.

2.2 Study group

Twenty-seven studies are included in the meta-analysis, with a total sample size of 1772. The characteristics of the included studies are given in Table 1.

2.3 Publication bias

Publication bias is the limited capacity of the published studies on a research subject to represent all the completed studies on that subject (Card, 2011; Littell et al., 2008). Similarly, publication bias is the existence of a relationship between the probability that a study on a subject is published and the effect size and significance it produces. Within this scope, publication bias may occur when researchers choose not to publish a study after failing to obtain the expected results, or when a study is rejected by scientific journals and is consequently not included in research syntheses (Makowski et al., 2019). A high possibility of publication bias in a meta-analysis negatively affects the accuracy of the combined effect size (Pecoraro, 2018), causing the average effect size to be reported differently than it should be (Borenstein et al., 2009). For this reason, the possibility of publication bias in the included studies was tested before determining the effect sizes of the stated relationships. The possibility of publication bias in this meta-analysis was analyzed using the funnel plot, Orwin's Fail-Safe N analysis, Duval and Tweedie's Trim and Fill analysis, and Egger's regression test.
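Of these checks, Egger's regression test is the easiest to reproduce: the standardized effects are regressed on the study precisions, and an intercept that does not differ significantly from zero suggests funnel-plot symmetry. A hedged sketch with hypothetical effect sizes and standard errors (the `intercept_stderr` attribute requires SciPy ≥ 1.7):

```python
import numpy as np
from scipy import stats

def eggers_test(effects, std_errors):
    """Egger's regression asymmetry test.

    Regresses the standardized effect (g / SE) on precision (1 / SE) and
    returns the intercept with its two-sided p-value. A non-significant
    intercept is taken as evidence against publication bias."""
    g = np.asarray(effects, dtype=float)
    se = np.asarray(std_errors, dtype=float)
    res = stats.linregress(1.0 / se, g / se)
    t = res.intercept / res.intercept_stderr
    p = 2 * stats.t.sf(abs(t), len(g) - 2)
    return res.intercept, p

# Hypothetical data: 8 studies with similar effects -> roughly symmetric funnel
intercept, p = eggers_test(
    [0.35, 0.42, 0.39, 0.45, 0.38, 0.41, 0.44, 0.36],
    [0.10, 0.12, 0.15, 0.18, 0.20, 0.22, 0.25, 0.28],
)
```

This is a generic textbook formulation, not the exact software routine used in the study.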

2.4 Selecting the model

After determining the probability of publication bias, the statistical model used to calculate the effect sizes was selected. The main approaches used in effect size calculations, according to the level of inter-study variance, are the fixed effects and random effects models (Pigott, 2012). The fixed effects model assumes that the combined studies are homogeneous in their characteristics apart from sample size, while the random effects model allows for parameter diversity between the studies (Cumming, 2012). When calculating the average effect size under the random effects model (Deeks et al., 2008), which assumes that the effect estimates of the different studies are drawn from a common distribution, it is necessary to consider several factors beyond the sampling error of the combined studies, such as the characteristics of the participants and the duration, scope, and design of the studies (Littell et al., 2008). When choosing the model for a meta-analysis, the assumptions about the sample characteristics of the included studies and the inferences that the researcher aims to make should be taken into consideration. The fact that the sample characteristics of studies in the social sciences are affected by various parameters suggests that the random effects model is more appropriate here. Besides, inferences made with the random effects model extend beyond the studies included in the meta-analysis (Field, 2003; Field & Gillett, 2010); using the random effects model therefore also contributes to the generalizability of the findings. The criteria for statistical model selection show that, depending on the nature of the meta-analysis, the model should be selected before the analysis (Borenstein et al., 2007; Littell et al., 2008). Within this framework, the random effects model was adopted, considering that the students sampled in the included studies come from different countries and cultures, and that the sample characteristics, designs, and scopes of the studies vary.
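The difference between the two models can be made concrete with a small sketch: the fixed effects estimate weights each study by its inverse variance only, while the random effects estimate (here the common DerSimonian-Laird estimator, a generic textbook choice rather than the exact routine used in the study) adds the between-study variance τ² to each study's variance before weighting. The numbers are illustrative:

```python
import numpy as np

def pool_effects(g, var):
    """Fixed-effect and DerSimonian-Laird random-effects pooled estimates."""
    g = np.asarray(g, dtype=float)
    var = np.asarray(var, dtype=float)
    w = 1.0 / var                              # inverse-variance weights
    fixed = np.sum(w * g) / np.sum(w)          # fixed-effect estimate
    q = np.sum(w * (g - fixed) ** 2)           # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(g) - 1)) / c)    # between-study variance
    w_star = 1.0 / (var + tau2)                # random-effects weights
    random_ = np.sum(w_star * g) / np.sum(w_star)
    return fixed, random_, tau2

# Heterogeneous hypothetical studies: tau2 > 0, estimates diverge
fixed, random_, tau2 = pool_effects([0.2, 0.5, 0.8], [0.02, 0.03, 0.05])
```

When the studies are homogeneous, τ² collapses to zero and the two estimates coincide; with heterogeneity, the random effects estimate gives relatively more weight to the smaller studies.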

2.5 Heterogeneity

Meta-analysis facilitates analyzing a research subject across different parameters by showing the level of diversity between the included studies. Within this frame, whether there is a heterogeneous distribution among the included studies has been evaluated in the present study. The heterogeneity of the combined studies was determined through the Q and I² tests. The Q test evaluates the probability that the differences between the observed results stem from random variation (Deeks et al., 2008). A Q value exceeding the critical chi-square value for the given degrees of freedom and significance level indicates heterogeneity of the combined effect sizes (Card, 2011). The I² test, which complements the Q test, shows the amount of heterogeneity in the effect sizes (Cleophas & Zwinderman, 2017); an I² value higher than 75% is interpreted as a high level of heterogeneity.

If heterogeneity is encountered among the studies included in the meta-analysis, its sources can be analyzed by referring to the study characteristics, which may be interpreted through subgroup analysis or meta-regression analysis (Deeks et al., 2008). While determining the moderator variables, the sufficiency of the number of variables, the relationships between the moderators, and their capacity to explain the differences between the study results have all been considered in the present study. Within this scope, it was predicted that heterogeneity could be explained by the country, class level, and course moderator variables, with respect to the effect of online education, which has been used heavily since the beginning of the pandemic, on students' academic achievement in different courses. Some subgroups were combined and categorized together, considering that the number of effect sizes in some sub-dimensions of the specified variables (e.g. the countries where the studies were conducted) was not sufficient to perform moderator analysis.

2.6 Interpreting the effect sizes

Effect size is a measure that shows how much, and in which direction, the independent variable affects the dependent variable in each study included in the meta-analysis (Dinçer, 2014). While interpreting the effect sizes obtained from the meta-analysis, the classifications of Cohen et al. (2007) were utilized. Whether the specified relationships differ by the country, class level, and school subject variables was identified through the Q test, the degrees of freedom, and the p significance value (Figs. 1 and 2).

3 Findings and results

The purpose of this study is to determine the effect size of online education on academic achievement. Before determining the effect sizes, the probability of publication bias in this meta-analysis was analyzed using the funnel plot, Orwin's Fail-Safe N analysis, Duval and Tweedie's Trim and Fill analysis, and Egger's regression test.

When the funnel plots are examined, the studies included in the analysis are distributed symmetrically on both sides of the combined effect size axis and are generally concentrated in the middle and lower sections; according to the plots, the probability of publication bias is low. However, since funnel plots are open to subjective interpretation (Littell et al., 2008), they have been supported by additional analyses. Therefore, as further evidence on the probability of publication bias, it was analyzed through Orwin's Fail-Safe N analysis, Duval and Tweedie's Trim and Fill analysis, and Egger's regression test (Table 2).
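The funnel plot described above can be reproduced in a few lines: effect sizes on the horizontal axis, standard errors on an inverted vertical axis so that the most precise studies sit at the top, and a vertical line at the pooled effect. A simplified sketch with hypothetical data (it omits the pseudo-confidence region usually drawn around the pooled effect):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend
import matplotlib.pyplot as plt

def funnel_plot(effects, std_errors, pooled):
    """Scatter of study effect sizes against their standard errors."""
    fig, ax = plt.subplots()
    ax.scatter(effects, std_errors, edgecolor="k", facecolor="none")
    ax.axvline(pooled, linestyle="--")   # combined effect size axis
    ax.invert_yaxis()                    # precise studies at the top
    ax.set_xlabel("Hedges' g")
    ax.set_ylabel("Standard error")
    return fig

# Hypothetical: symmetric spread around a pooled effect of 0.41
fig = funnel_plot([0.10, 0.30, 0.40, 0.50, 0.70, 0.41],
                  [0.30, 0.15, 0.05, 0.15, 0.30, 0.02],
                  pooled=0.41)
```

With symmetric data like this, the points fan out evenly around the dashed line, which is the visual pattern the text describes.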

Table 2 presents the publication bias statistics computed before calculating the effect size of online education on academic achievement. According to the table, the Orwin's Fail-Safe N results show that no new studies would need to be added to the meta-analysis for Hedges' g to reach a value outside the range of ±0.01. The Duval and Tweedie test shows that excluding the studies that impair the symmetry of the funnel plots, or adding their exact symmetrical counterparts, does not significantly change the calculated effect size. The non-significant Egger's test results reveal no publication bias in the meta-analysis. These results indicate the high internal validity of the effect sizes and the adequacy of the included studies in representing the research on the subject.

In this study, the aim was to determine the effect size of online education on academic achievement after testing for publication bias. In line with the first purpose of the study, the forest plot of the effect size of online education on academic achievement is shown in Fig. 3, and the corresponding statistics are given in Table 3.

Fig. 1 The flow chart of the scanning and selection process of the studies

Fig. 2 Funnel plot representing the effect size of the effects of online education on academic success

Fig. 3 Forest plot of the effect size of online education on academic success

The square symbols in the forest plot in Fig. 3 represent the effect sizes, the horizontal lines show the 95% confidence intervals of the effect sizes, and the diamond shows the overall effect size. When the forest plot is analyzed, the lower and upper limits of the combined effect sizes are generally close to each other, and the study weights are similar. This similarity in weights indicates that the combined studies contribute similarly to the overall effect size.

Figure 3 shows that the study of Liu et al. (2018) has the lowest effect size and the study of Ercan and Bilen (2014) the highest. The forest plot shows that all the combined studies and the overall effect are positive. Furthermore, the forest plot in Fig. 3 and the effect size statistics in Table 3 show that this meta-analysis of 27 studies on the effect of online education on academic achievement yields a moderate overall effect (g = 0.409).

After the effect size analysis, whether the studies included in the analysis are distributed heterogeneously was also examined. The heterogeneity of the combined studies was determined through the Q and I² tests. As a result of the heterogeneity test, the Q statistic was calculated as 29.579. With 26 degrees of freedom at the 95% significance level, the critical chi-square value is 38.885; the calculated Q statistic (29.579) is lower than this critical value. The I² value, which complements the Q statistic, is 12.100%, indicating that the true heterogeneity, i.e. the proportion of total variability attributable to between-study variability, is about 12%. Besides, the p value (0.285) is higher than 0.05. All these values [Q(26) = 29.579, p = 0.285; I² = 12.100%] indicate a homogeneous distribution of the effect sizes, so the fixed effects model could be used to interpret them. However, some researchers argue that even when heterogeneity is low, the results should be evaluated under the random effects model (Borenstein et al., 2007); therefore, this study reports both models. The heterogeneity of the combined studies was then examined in relation to the characteristics of the included studies. In this context, the final purpose of the study is to determine the effect of the country, class level, and course variables on the findings. Accordingly, the statistics comparing the stated relationships across the countries where the studies were conducted are given in Table 4.
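The homogeneity decision above follows directly from the reported Q statistic and degrees of freedom; the critical value, p value, and I² can be checked from those two numbers alone:

```python
from scipy.stats import chi2

q, df = 29.579, 26                         # reported Q statistic and degrees of freedom
critical = chi2.ppf(0.95, df)              # critical chi-square value at alpha = 0.05
p_value = chi2.sf(q, df)                   # p value of the Q test
i_squared = max(0.0, (q - df) / q) * 100   # I² as a percentage

homogeneous = q < critical                 # True: effect sizes are homogeneous
```

The recomputed values (critical value ≈ 38.885, p ≈ 0.285, I² ≈ 12.1%) match the figures reported in this section.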

As seen in Table 4, the effect of online education on academic achievement does not differ significantly according to the countries where the studies were conducted; the Q test results indicate that the relationships between the variables are homogeneous across countries. According to the table, the effect of online education on academic achievement is highest in the 'other countries' group and lowest in the US. The statistics comparing the stated relationships across class levels are given in Table 5.

As seen in Table 5, the effect of online education on academic achievement does not differ according to class level. However, the effect of online education on academic achievement is highest in the 4th grade. The statistics comparing the stated relationships across school subjects are given in Table 6.

As seen in Table 6, the effect of online education on academic achievement does not differ according to the school subjects included in the studies. However, the effect of online education on academic achievement is highest in the ICT subject.

The obtained effect size is based on findings from primary studies conducted in 7 different countries. In addition, these studies cover different approaches to online education (online learning environments, social networks, blended learning, etc.). In this respect, questions may be raised about the validity and generalizability of the results of the study. However, the moderator analyses, whether for the country variable or for the approaches covered by online education, did not reveal significant differences in effect sizes. Had significant differences emerged, the comparisons made between countries under the umbrella of online education would have been open to doubt in terms of generalizability. Moreover, no study has been found in the literature conducted under the name of online education alone that is not based on a special approach or specific technique. For instance, one commonly used concept is blended education, defined as an educational model in which online education is combined with the traditional education method (Colis & Moonen, 2001). Similarly, Rasmussen (2003) defines blended learning as "a distance education method that combines technology (high technology such as television and the internet, or low technology such as voice e-mail and conferences) with traditional education and training." Further, Kerres and Witt (2003) define blended learning as "combining face-to-face learning with technology-assisted learning." As is clearly observed, online education has a wider scope and includes many approaches.

As seen in Table 7, the effect of online education on academic achievement does not differ according to the online education approaches included in the studies. However, the effect of online education on academic achievement is highest for the web-based problem solving approach.

4 Conclusions and discussion

Considering the developments during the pandemic, it is expected that the diversity of online education applications, as an interdisciplinary pragmatist field, will increase, and that learning content and processes will be enriched by the integration of new technologies into online education. Another prediction is that more flexible and accessible learning opportunities will be created in online education, thereby strengthening lifelong learning. As a result, it is predicted that in the near future online education, or digital learning under a newer name, will become the main ground of education rather than an alternative to, or a support for, face-to-face learning. The lessons learned from the early period of online learning, adopted rapidly due to the COVID-19 pandemic, will serve to develop this method all over the world, and in the near future online learning will become the main learning structure as its functionality increases with the contribution of new technologies and systems. From this point of view, there is a need to strengthen online education.

In this study, the effect of online learning on academic achievement is at a moderate level. To increase this effect, the implementation of online learning requires support from teachers to prepare learning materials, to design learning appropriately, and to utilize various digital media such as websites, software technology, and other tools that support the effectiveness of online learning (Rolisca & Achadiyah, 2014). According to research conducted by Rahayu et al. (2017), the use of various types of software increases the effectiveness and quality of online learning. Implementing online learning can also affect students' ability to adapt to technological developments, in that it leads students to use various learning resources on the internet to access different types of information and accustoms them to inquiry learning and active learning (Hart et al., 2019; Prestiadi et al., 2019). In addition, there may be many reasons for the moderate level of effect found in this study. The moderator variables examined here could guide attempts to increase the practical effect; however, the effect size did not differ significantly for any of the moderator variables. Different moderator analyses could be evaluated in order to increase the impact of online education on academic success; if confounding variables that significantly change the effect level are detected, more precise recommendations can be made. In addition to technical and financial problems, the level of impact will increase if other difficulties are addressed, such as students' lack of interaction with the instructor, slow response times, and the absence of traditional classroom socialization.

In addition, the social distancing associated with the COVID-19 pandemic has posed extreme difficulties for all stakeholders moving online, as they have had to work under time and resource constraints. Adopting the online learning environment is not just a technical issue; it is a pedagogical and instructional challenge as well. Therefore, extensive preparation of teaching materials, curriculum, and assessment is vital in online education. Technology is the delivery tool, and it requires close cross-collaboration between teaching, content, and technology teams (CoSN, 2020).

Online education applications have been used for many years, but they have come to the fore during the pandemic. This necessity has brought with it the discussion of using online education instead of traditional education methods in the future. However, this research has revealed that online education applications are only moderately effective. Using online education instead of face-to-face education can only be justified by an increase in the level of success, which may become possible with the experience and knowledge gained during the pandemic. Therefore, meta-analyses of the experimental studies conducted in the coming years will guide us. In this context, experimental studies using online education applications should be analyzed carefully, and it would be useful to identify variables that can change the level of impact through different moderators. Moderator analyses are valuable in meta-analysis studies (consider, for example, the role of moderators in Karl Pearson's typhoid vaccine studies). In this context, each analysis sheds light on future studies. In meta-analyses on online education, it would be beneficial to go beyond the moderators examined in this study; thus, the contribution of similar studies to the field will increase.

The purpose of this study is to determine the effect of online education on academic achievement. In line with this purpose, studies analyzing the effect of online education approaches on academic achievement were included in the meta-analysis; their total sample size is 1772. While the included studies were conducted in the US, Taiwan, Turkey, China, the Philippines, Ireland, and Georgia, studies carried out elsewhere in Europe could not be reached. This may be because quantitative research methods rooted in a positivist perspective are used more in countries with an American academic tradition. As a result of the study, the effect size of online education on academic achievement (g = 0.409) was found to be moderate. In the studies included in the present research, online education approaches were more effective than traditional ones. However, contrary to the present study, comparisons between online and traditional education in some studies show that face-to-face learning is still considered more effective than online learning (Ahmad et al., 2016; Hamdani & Priatna, 2020; Wei & Chou, 2020). Online education has advantages and disadvantages. One advantage of online learning over face-to-face classroom learning is the flexibility of learning time: learning is not confined to a single schedule and can be shaped according to circumstances (Lai et al., 2019). Another advantage is the ease of submitting assignments, as this can be done without having to talk to the teacher. Despite this, online education has several weaknesses, such as students' difficulty in understanding the material, teachers' inability to monitor students, and students' difficulty in interacting with teachers when the internet connection fails (Swan, 2007).
According to Astuti et al. (2019), the face-to-face education method is still considered better by students than e-learning because it is easier to understand the material and to interact with teachers. The results of this study illustrated that the effect size of online education on academic achievement (g = 0.409) is of medium level. Moreover, the moderator analyses showed that the effect of online education on academic achievement does not differ by country, course, class level, or online education approach. A review of the literature shows that several meta-analyses on online education have been published (Bernard et al., 2004; Machtmes & Asher, 2000; Zhao et al., 2005). Typically, these meta-analyses also include studies of older-generation technologies such as audio, video, or satellite transmission. One of the most comprehensive studies on online education was conducted by Bernard et al. (2004), in which 699 independent effect sizes from 232 studies published from 1985 to 2001 were analyzed, and face-to-face education was compared to online education with respect to the achievement and attitudes of various learners, from young children to adults. In this meta-analysis, an overall effect size close to zero was found for students' achievement (g+ = 0.01).

In another meta-analysis, Zhao et al. (2005) examined 98 effect sizes from 51 studies on online education conducted between 1996 and 2002. Compared with the study of Bernard et al. (2004), this meta-analysis focuses on the activities carried out in online education courses. As a result, an overall effect size close to zero was found for online education using more than one generation of technology for students at different levels. A salient point of Zhao et al.'s meta-analysis, however, is that it averages different types of outcomes used within a study to calculate an overall effect size. This practice is problematic because the factors that improve one type of learner outcome, particularly course characteristics and practices, may be quite different from those that improve another type of outcome (e.g. learner achievement), and may even harm the latter. By mixing studies with different types of outcomes, this practice may obscure the relationship between practices and learning.

Some meta-analytic studies have focused on the effectiveness of new-generation, internet-based distance learning courses for specific student populations. For instance, Sitzmann et al. (2006) reviewed 96 studies published from 1996 to 2005 comparing web-based education in job-related knowledge or skills with face-to-face education. The researchers found that web-based education was in general slightly more effective than face-to-face education, but insufficient in terms of applicability ("knowing how to apply"). In addition, Sitzmann et al. (2006) revealed that internet-based education has a positive effect on theoretical knowledge in quasi-experimental studies, whereas face-to-face education is favored in experimental studies performed with random assignment. This moderator analysis emphasizes the need to pay attention to the designs of the studies included in a meta-analysis. The designs of the studies included in the present meta-analysis were not examined; this can be offered as a suggestion for future studies.

Another meta-analysis, conducted by Cavanaugh et al. (2004), focused on internet-based distance education programs for K-12 students. The researchers combined 116 results from 14 studies published between 1999 and 2004 and calculated an overall effect that was not statistically different from zero. Their moderator analysis found no significant factor affecting student achievement. This meta-analysis, however, used multiple results from the same study, ignoring the fact that different outcomes from the same students are not independent of one another.

In conclusion, several meta-analyses have examined the consequences of online education for a wide range of students (Bernard et al., 2004; Zhao et al., 2005), and their effect sizes were generally low. Furthermore, none of the large-scale meta-analyses considered moderators, database quality standards, or class levels when selecting studies, while some referred only to country and lecture moderators. Advances in internet-based learning tools, the pandemic, and the growing popularity of online learning in different contexts call for a precise meta-analysis of students' learning outcomes in online learning. Previous meta-analyses were typically based on studies involving a narrow range of confounding variables. The present study therefore examined common but significant moderators such as class level and the lectures taught during the pandemic. For instance, problems arose during the pandemic particularly with the suitability of online education platforms for different class levels. There is thus a need to study, and make recommendations on, whether online education can meet the needs of teachers and students.

Moreover, the main forms of online education in the past were watching open lectures from well-known universities and educational videos produced by institutions. During the pandemic, by contrast, online education has mainly been classroom-based teaching delivered by teachers in their own schools, an extension of the original school education. This meta-analysis can therefore serve as a reference for comparing the effect sizes of the online education of the past decade with what is done today and what will be done in the future.

Lastly, the heterogeneity test results of this meta-analysis show that the effect size does not differ by class level, country, online education approach, or lecture.
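A heterogeneity test of this kind is usually based on Cochran's Q and the derived I² statistic; when Q does not exceed its degrees of freedom, I² is set to zero and the effect sizes are treated as homogeneous. A minimal sketch (Python, with invented effect sizes and variances, not this study's data):

```python
# Cochran's Q and I^2 for a set of effect sizes under a fixed-effect model.
# Values are illustrative, not taken from the meta-analysis discussed above.

def heterogeneity(effects, variances):
    """Return (Q, I^2 in percent)."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
    df = len(effects) - 1
    i_squared = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
    return q, i_squared

q, i2 = heterogeneity([0.20, 0.35, 0.10, 0.25], [0.02, 0.03, 0.02, 0.04])
# Here Q (about 1.3) is below df = 3, so I^2 is 0: no detectable
# heterogeneity, mirroring the homogeneous pattern reported above.
```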

*Studies included in meta-analysis

Ahmad, S., Sumardi, K., & Purnawan, P. (2016). Komparasi Peningkatan Hasil Belajar Antara Pembelajaran Menggunakan Sistem Pembelajaran Online Terpadu Dengan Pembelajaran Klasikal Pada Mata Kuliah Pneumatik Dan Hidrolik. Journal of Mechanical Engineering Education, 2 (2), 286–292.

Ally, M. (2004). Foundations of educational theory for online learning. Theory and Practice of Online Learning, 2 , 15–44. Retrieved on the 11th of September, 2020 from https://eddl.tru.ca/wp-content/uploads/2018/12/01_Anderson_2008-Theory_and_Practice_of_Online_Learning.pdf

Arat, T., & Bakan, Ö. (2011). Uzaktan eğitim ve uygulamaları. Selçuk Üniversitesi Sosyal Bilimler Meslek Yüksek Okulu Dergisi , 14 (1–2), 363–374. https://doi.org/10.29249/selcuksbmyd.540741

Astuti, C. C., Sari, H. M. K., & Azizah, N. L. (2019). Perbandingan Efektifitas Proses Pembelajaran Menggunakan Metode E-Learning dan Konvensional. Proceedings of the ICECRS, 2 (1), 35–40.

*Atici, B., & Polat, O. C. (2010). Influence of the online learning environments and tools on the student achievement and opinions. Educational Research and Reviews, 5 (8), 455–464. Retrieved on the 11th of October, 2020 from https://academicjournals.org/journal/ERR/article-full-text-pdf/4C8DD044180.pdf

Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., et al. (2004). How does distance education compare with classroom instruction? A meta- analysis of the empirical literature. Review of Educational Research, 3 (74), 379–439. https://doi.org/10.3102/00346543074003379

Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. R. (2009). Introduction to meta-analysis . Wiley.

Borenstein, M., Hedges, L., & Rothstein, H. (2007). Meta-analysis: Fixed effect vs. random effects . UK: Wiley.

Card, N. A. (2011). Applied meta-analysis for social science research: Methodology in the social sciences . Guilford.

*Carreon, J. R. (2018). Facebook as integrated blended learning tool in technology and livelihood education exploratory. Retrieved on the 1st of October, 2020 from https://files.eric.ed.gov/fulltext/EJ1197714.pdf

Cavanaugh, C., Gillan, K. J., Kromrey, J., Hess, M., & Blomeyer, R. (2004). The effects of distance education on K-12 student outcomes: A meta-analysis. Learning Point Associates/North Central Regional Educational Laboratory (NCREL) . Retrieved on the 11th of September, 2020 from https://files.eric.ed.gov/fulltext/ED489533.pdf

*Ceylan, V. K., & Elitok Kesici, A. (2017). Effect of blended learning to academic achievement. Journal of Human Sciences, 14 (1), 308. https://doi.org/10.14687/jhs.v14i1.4141

*Chae, S. E., & Shin, J. H. (2016). Tutoring styles that encourage learner satisfaction, academic engagement, and achievement in an online environment. Interactive Learning Environments, 24(6), 1371–1385. https://doi.org/10.1080/10494820.2015.1009472

*Chiang, T. H. C., Yang, S. J. H., & Hwang, G. J. (2014). An augmented reality-based mobile learning system to improve students’ learning achievements and motivations in natural science inquiry activities. Educational Technology and Society, 17 (4), 352–365. Retrieved on the 11th of September, 2020 from https://www.researchgate.net/profile/Gwo_Jen_Hwang/publication/287529242_An_Augmented_Reality-based_Mobile_Learning_System_to_Improve_Students'_Learning_Achievements_and_Motivations_in_Natural_Science_Inquiry_Activities/links/57198c4808ae30c3f9f2c4ac.pdf

Chiao, H. M., Chen, Y. L., & Huang, W. H. (2018). Examining the usability of an online virtual tour-guiding platform for cultural tourism education. Journal of Hospitality, Leisure, Sport & Tourism Education, 23 (29–38), 1. https://doi.org/10.1016/j.jhlste.2018.05.002

Chizmar, J. F., & Walbert, M. S. (1999). Web-based learning environments guided by principles of good teaching practice. Journal of Economic Education, 30 (3), 248–264. https://doi.org/10.2307/1183061

Cleophas, T. J., & Zwinderman, A. H. (2017). Modern meta-analysis: Review and update of methodologies . Switzerland: Springer. https://doi.org/10.1007/978-3-319-55895-0

Cohen, L., Manion, L., & Morrison, K. (2007). Observation.  Research Methods in Education, 6 , 396–412. Retrieved on the 11th of September, 2020 from https://www.researchgate.net/profile/Nabil_Ashraf2/post/How_to_get_surface_potential_Vs_Voltage_curve_from_CV_and_GV_measurements_of_MOS_capacitor/attachment/5ac6033cb53d2f63c3c405b4/AS%3A612011817844736%401522926396219/download/Very+important_C-V+characterization+Lehigh+University+thesis.pdf

Colis, B., & Moonen, J. (2001). Flexible Learning in a Digital World: Experiences and Expectations. Open & Distance Learning Series . Stylus Publishing.

CoSN. (2020). COVID-19 Response: Preparing to Take School Online. Retrieved on the 3rd of September, 2021 from https://www.cosn.org/sites/default/files/COVID-19%20Member%20Exclusive_0.pdf

Cumming, G. (2012). Understanding new statistics: Effect sizes, confidence intervals, and meta-analysis. New York, USA: Routledge. https://doi.org/10.4324/9780203807002

Deeks, J. J., Higgins, J. P. T., & Altman, D. G. (2008). Analysing data and undertaking meta-analyses . In J. P. T. Higgins & S. Green (Eds.), Cochrane handbook for systematic reviews of interventions (pp. 243–296). Sussex: John Wiley & Sons. https://doi.org/10.1002/9780470712184.ch9

Demiralay, R., Bayır, E. A., & Gelibolu, M. F. (2016). Öğrencilerin bireysel yenilikçilik özellikleri ile çevrimiçi öğrenmeye hazır bulunuşlukları ilişkisinin incelenmesi. Eğitim ve Öğretim Araştırmaları Dergisi, 5 (1), 161–168. https://doi.org/10.23891/efdyyu.2017.10

Dinçer, S. (2014). Eğitim bilimlerinde uygulamalı meta-analiz. Pegem Atıf İndeksi, 2014(1), 1–133. https://doi.org/10.14527/pegem.001

*Durak, G., Cankaya, S., Yunkul, E., & Ozturk, G. (2017). The effects of a social learning network on students’ performances and attitudes. European Journal of Education Studies, 3 (3), 312–333. 10.5281/zenodo.292951

*Ercan, O. (2014). Effect of web assisted education supported by six thinking hats on students’ academic achievement in science and technology classes . European Journal of Educational Research, 3 (1), 9–23. https://doi.org/10.12973/eu-jer.3.1.9

Ercan, O., & Bilen, K. (2014). Effect of web assisted education supported by six thinking hats on students’ academic achievement in science and technology classes. European Journal of Educational Research, 3 (1), 9–23.

*Ercan, O., Bilen, K., & Ural, E. (2016). “Earth, sun and moon”: Computer assisted instruction in secondary school science - Achievement and attitudes. Issues in Educational Research, 26 (2), 206–224. https://doi.org/10.12973/eu-jer.3.1.9

Field, A. P. (2003). The problems in using fixed-effects models of meta-analysis on real-world data. Understanding Statistics, 2 (2), 105–124. https://doi.org/10.1207/s15328031us0202_02

Field, A. P., & Gillett, R. (2010). How to do a meta-analysis. British Journal of Mathematical and Statistical Psychology, 63 (3), 665–694. https://doi.org/10.1348/00071010x502733

Geostat. (2019). ‘Share of households with internet access’, National statistics office of Georgia . Retrieved on the 2nd September 2020 from https://www.geostat.ge/en/modules/categories/106/information-and-communication-technologies-usage-in-households

*Gwo-Jen, H., Nien-Ting, T., & Xiao-Ming, W. (2018). Creating interactive e-books through learning by design: The impacts of guided peer-feedback on students’ learning achievements and project outcomes in science courses. Journal of Educational Technology & Society., 21 (1), 25–36. Retrieved on the 2nd of October, 2020 https://ae-uploads.uoregon.edu/ISTE/ISTE2019/PROGRAM_SESSION_MODEL/HANDOUTS/112172923/CreatingInteractiveeBooksthroughLearningbyDesignArticle2018.pdf

Hamdani, A. R., & Priatna, A. (2020). Efektifitas implementasi pembelajaran daring (full online) dimasa pandemi Covid-19 pada jenjang Sekolah Dasar di Kabupaten Subang. Didaktik: Jurnal Ilmiah PGSD STKIP Subang, 6 (1), 1–9.

Hart, C. M., Berger, D., Jacob, B., Loeb, S., & Hill, M. (2019). Online learning, offline outcomes: Online course taking and high school student performance. Aera Open, 5(1).

*Hayes, J., & Stewart, I. (2016). Comparing the effects of derived relational training and computer coding on intellectual potential in school-age children. The British Journal of Educational Psychology, 86 (3), 397–411. https://doi.org/10.1111/bjep.12114

Horton, W. K. (2000). Designing web-based training: How to teach anyone anything anywhere anytime (Vol. 1). Wiley Publishing.

*Hwang, G. J., Wu, P. H., & Chen, C. C. (2012). An online game approach for improving students’ learning performance in web-based problem-solving activities. Computers and Education, 59 (4), 1246–1256. https://doi.org/10.1016/j.compedu.2012.05.009

*Kert, S. B., Köşkeroğlu Büyükimdat, M., Uzun, A., & Çayiroğlu, B. (2017). Comparing active game-playing scores and academic performances of elementary school students. Education 3–13, 45 (5), 532–542. https://doi.org/10.1080/03004279.2016.1140800

*Lai, A. F., & Chen, D. J. (2010). Web-based two-tier diagnostic test and remedial learning experiment. International Journal of Distance Education Technologies, 8 (1), 31–53. https://doi.org/10.4018/jdet.2010010103

*Lai, A. F., Lai, H. Y., Chuang W. H., & Wu, Z.H. (2015). Developing a mobile learning management system for outdoors nature science activities based on 5e learning cycle. Proceedings of the International Conference on e-Learning, ICEL. Proceedings of the International Association for Development of the Information Society (IADIS) International Conference on e-Learning (Las Palmas de Gran Canaria, Spain, July 21–24, 2015). Retrieved on the 14th November 2020 from https://files.eric.ed.gov/fulltext/ED562095.pdf

Lai, C. H., Lin, H. W., Lin, R. M., & Tho, P. D. (2019). Effect of peer interaction among online learning community on learning engagement and achievement. International Journal of Distance Education Technologies (IJDET), 17 (1), 66–77.

Littell, J. H., Corcoran, J., & Pillai, V. (2008). Systematic reviews and meta-analysis . Oxford University.

*Liu, K. P., Tai, S. J. D., & Liu, C. C. (2018). Enhancing language learning through creation: the effect of digital storytelling on student learning motivation and performance in a school English course. Educational Technology Research and Development, 66 (4), 913–935. https://doi.org/10.1007/s11423-018-9592-z

Machtmes, K., & Asher, J. W. (2000). A meta-analysis of the effectiveness of telecourses in distance education. American Journal of Distance Education, 14 (1), 27–46. https://doi.org/10.1080/08923640009527043

Makowski, D., Piraux, F., & Brun, F. (2019). From experimental network to meta-analysis: Methods and applications with R for agronomic and environmental sciences. Dordrecht: Springer. https://doi.org/10.1007/978-94-024_1696-1

*Meyers, C., Molefe, A., & Brandt, C. (2015). The Impact of the "Enhancing Missouri's Instructional Networked Teaching Strategies" (eMINTS) Program on Student Achievement, 21st-Century Skills, and Academic Engagement: Second-Year Results. Society for Research on Educational Effectiveness. Retrieved on the 14th November, 2020 from https://files.eric.ed.gov/fulltext/ED562508.pdf

OECD. (2020). ‘A framework to guide an education response to the COVID-19 Pandemic of 2020 ’. https://doi.org/10.26524/royal.37.6

Pecoraro, V. (2018). Appraising evidence . In G. Biondi-Zoccai (Ed.), Diagnostic meta-analysis: A useful tool for clinical decision-making (pp. 99–114). Cham, Switzerland: Springer. https://doi.org/10.1007/978-3-319-78966-8_9

Pigott, T. (2012). Advances in meta-analysis . Springer.

Pillay, H. , Irving, K., & Tones, M. (2007). Validation of the diagnostic tool for assessing Tertiary students’ readiness for online learning. Higher Education Research & Development, 26 (2), 217–234. https://doi.org/10.1080/07294360701310821

Prestiadi, D., Zulkarnain, W., & Sumarsono, R. B. (2019). Visionary leadership in total quality management: efforts to improve the quality of education in the industrial revolution 4.0. In the 4th International Conference on Education and Management (COEMA 2019). Atlantis Press

Poole, D. M. (2000). Student participation in a discussion-oriented online course: a case study. Journal of Research on Computing in Education, 33 (2), 162–177. https://doi.org/10.1080/08886504.2000.10782307

Rahayu, F. S., Budiyanto, D., & Palyama, D. (2017). Analisis penerimaan e-learning menggunakan technology acceptance model (Tam)(Studi Kasus: Universitas Atma Jaya Yogyakarta). Jurnal Terapan Teknologi Informasi, 1 (2), 87–98.

Rasmussen, R. C. (2003). The quantity and quality of human interaction in a synchronous blended learning environment . Brigham Young University Press.

*Ravenel, J., T. Lambeth, D., & Spires, B. (2014). Effects of computer-based programs on mathematical achievement scores for fourth-grade students. i-manager’s Journal on School Educational Technology, 10 (1), 8–21. https://doi.org/10.26634/jsch.10.1.2830

Rolisca, R. U. C., & Achadiyah, B. N. (2014). Pengembangan media evaluasi pembelajaran dalam bentuk online berbasis e-learning menggunakan software wondershare quiz creator dalam mata pelajaran akuntansi SMA Brawijaya Smart School (BSS). Jurnal Pendidikan Akuntansi Indonesia, 12(2).

Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006). The comparative effective- ness of Web-based and classroom instruction: A meta-analysis . Personnel Psychology, 59 (3), 623–664. https://doi.org/10.1111/j.1744-6570.2006.00049.x

Stewart, D. W., & Kamins, M. A. (2001). Developing a coding scheme and coding study reports. In M. W. Lipsey & D. B. Wilson (Eds.), Practical meta­analysis: Applied social research methods series (Vol. 49, pp. 73–90). Sage.

Swan, K. (2007). Research on online learning. Journal of Asynchronous Learning Networks, 11 (1), 55–59.

*Sung, H. Y., Hwang, G. J., & Chang, Y. C. (2016). Development of a mobile learning system based on a collaborative problem-posing strategy. Interactive Learning Environments, 24 (3), 456–471. https://doi.org/10.1080/10494820.2013.867889

Tsagris, M., & Fragkos, K. C. (2018). Meta-analyses of clinical trials versus diagnostic test accuracy studies. In G. Biondi-Zoccai (Ed.), Diagnostic meta-analysis: A useful tool for clinical decision-making (pp. 31–42). Cham, Switzerland: Springer. https://doi.org/10.1007/978-3-319-78966-8_4

UNESCO. (2020, March 13). COVID-19 educational disruption and response. Retrieved on the 14th November 2020 from https://en.unesco.org/themes/education-emergencies/coronavirus-school-closures

Usta, E. (2011a). The effect of web-based learning environments on attitudes of students regarding computer and internet. Procedia-Social and Behavioral Sciences, 28 (262–269), 1. https://doi.org/10.1016/j.sbspro.2011.11.051

Usta, E. (2011b). The examination of online self-regulated learning skills in web-based learning environments in terms of different variables. Turkish Online Journal of Educational Technology-TOJET, 10 (3), 278–286. Retrieved on the 14th November 2020 from https://files.eric.ed.gov/fulltext/EJ944994.pdf

Vrasidas, C., & McIsaac, M. S. (2000). Principles of pedagogy and evaluation for web-based learning. Educational Media International, 37 (2), 105–111. https://doi.org/10.1080/095239800410405

*Wang, C. H., & Chen, C. P. (2013). Effects of facebook tutoring on learning english as a second language. Proceedings of the International Conference e-Learning 2013, (2009), 135–142. Retrieved on the 15th November 2020 from https://files.eric.ed.gov/fulltext/ED562299.pdf

Wei, H. C., & Chou, C. (2020). Online learning performance and satisfaction: Do perceptions and readiness matter? Distance Education, 41 (1), 48–69.

*Yu, F. Y. (2019). The learning potential of online student-constructed tests with citing peer-generated questions. Interactive Learning Environments, 27 (2), 226–241. https://doi.org/10.1080/10494820.2018.1458040

*Yu, F. Y., & Chen, Y. J. (2014). Effects of student-generated questions as the source of online drill-and-practice activities on learning . British Journal of Educational Technology, 45 (2), 316–329. https://doi.org/10.1111/bjet.12036

*Yu, F. Y., & Pan, K. J. (2014). The effects of student question-generation with online prompts on learning. Educational Technology and Society, 17 (3), 267–279. Retrieved on the 15th November 2020 from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.565.643&rep=rep1&type=pdf

*Yu, W. F., She, H. C., & Lee, Y. M. (2010). The effects of web-based/non-web-based problem-solving instruction and high/low achievement on students’ problem-solving ability and biology achievement. Innovations in Education and Teaching International, 47 (2), 187–199. https://doi.org/10.1080/14703291003718927

Zhao, Y., Lei, J., Yan, B, Lai, C., & Tan, S. (2005). A practical analysis of research on the effectiveness of distance education. Teachers College Record, 107 (8). https://doi.org/10.1111/j.1467-9620.2005.00544.x

*Zhong, B., Wang, Q., Chen, J., & Li, Y. (2017). Investigating the period of switching roles in pair programming in a primary school. Educational Technology and Society, 20 (3), 220–233. Retrieved on the 15th November 2020 from https://repository.nie.edu.sg/bitstream/10497/18946/1/ETS-20-3-220.pdf

Author information

Authors and Affiliations

Primary Education, Ministry of Turkish National Education, Mersin, Turkey

Corresponding author

Correspondence to Hakan Ulum .

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Ulum, H. The effects of online education on academic success: A meta-analysis study. Educ Inf Technol 27, 429–450 (2022). https://doi.org/10.1007/s10639-021-10740-8

Received : 06 December 2020

Accepted : 30 August 2021

Published : 06 September 2021

Issue Date : January 2022

DOI : https://doi.org/10.1007/s10639-021-10740-8

  • Online education
  • Student achievement
  • Academic success
  • Meta-analysis
  • Open access
  • Published: 09 May 2024

Looking back to move forward: comparison of instructors’ and undergraduates’ retrospection on the effectiveness of online learning using the nine-outcome influencing factors

  • Yujie Su   ORCID: orcid.org/0000-0003-1444-1598 1 ,
  • Xiaoshu Xu   ORCID: orcid.org/0000-0002-0667-4511 1 ,
  • Yunfeng Zhang 2 ,
  • Xinyu Xu 1 &
  • Shanshan Hao 3  

Humanities and Social Sciences Communications, volume 11, Article number: 594 (2024)

  • Language and linguistics

This study examines undergraduate students' retrospections on their online learning experiences after the COVID-19 pandemic through nine key influencing factors: behavioral intention, instruction, engagement, interaction, motivation, self-efficacy, performance, satisfaction, and self-regulation. Forty-six Year 1 students from a comprehensive university in China maintained reflective diaries throughout an academic semester, providing first-person perspectives on the strengths and weaknesses of online learning, and 18 college teachers were interviewed with the same questions. Thematic analysis revealed that instruction ranked highest among the nine factors, followed by engagement, self-regulation, interaction, and motivation. Moreover, teachers and students held different attitudes toward instruction. Third, teacher participants differed from student participants in self-efficacy and self-regulation, reflecting their different roles in online instruction. Lastly, the study found that students were not independent learners, which explains why instruction ranked highest from their point of view. The findings offer valuable insights for educators, administrators, and policy-makers in higher education. Recommendations for future research include incorporating a more diverse sample, exploring relationships among the nine factors, and equipping students with skills for optimal online learning experiences.

Introduction

The outbreak of the COVID-19 pandemic had a profound impact on education worldwide, leading to the wide-scale adoption of online learning. According to the United Nations Educational, Scientific and Cultural Organization (UNESCO), at the peak of the pandemic, 192 countries had implemented nationwide closures, affecting approximately 99% of the world's student population (UNESCO 2020a). In response, educational institutions, teachers, and students quickly adapted to online learning platforms, leveraging digital technologies to continue education amidst the crisis (Marinoni et al. 2020).

The rapid and unexpected shift to online learning brought about a surge in research aiming to understand its impact, effectiveness, and challenges. Researchers across the globe have investigated various dimensions of online learning: students' experiences and perspectives (Aristovnik et al. 2021), technological aspects (Bao 2020), pedagogical strategies (Hodges et al. 2020), and the socio-emotional side of learning (Ali 2020). Tan et al. (2021) found that motivation and satisfaction were mostly positively perceived by students, while lack of interaction was perceived as an unfavorable aspect of online instruction. Other studies center on teachers' perceptions of the benefits and challenges (Lucas and Vicente 2023; Mulla et al. 2023), post-pandemic pedagogisation (Rapanta et al. 2021), and post-pandemic further education (Kohnke et al. 2023; Torsani et al. 2023). It is worth noting that elements like interaction and engagement were central to the development and maintenance of the learning community (Lucas and Vicente 2023).

The rise of online learning has also posed unprecedented challenges. Studies have pointed out the digital divide and accessibility issues (Crawford et al. 2020), concerns about students' motivation and engagement (Martin and Bolliger 2018), and the need for effective online instructional practices (Trust and Whalen 2020). The rapid transition to online learning has highlighted the need for robust research to address these challenges and to understand the effectiveness of online learning in this new educational paradigm.

Despite the extensive research on online learning during and after the COVID-19 pandemic, there remains a notable gap in understanding the retrospective perspectives of both undergraduates and teachers. Much of the current literature has focused on immediate response strategies to the transition to online learning, often overlooking the detailed insights that reflective retrospection can provide (Marinoni et al. 2020; Bao 2020). In addition, while many studies have examined isolated aspects of online learning, they have seldom employed a comprehensive framework, leaving undergraduates' voices, in particular, underrepresented in the discourse (Aristovnik et al. 2021; Crawford et al. 2020). This study, situated in the context of the COVID-19 pandemic's impetus toward online learning, seeks to fill this gap. By exploring online learning from the perspectives of both instructors and undergraduates, and by analyzing nine key factors that include engagement, motivation, and self-efficacy, the research contributes vital insights into the dynamics of online education (Wang and Wang 2021). This exploration is especially pertinent as digital learning environments become increasingly prevalent worldwide (UNESCO 2020b). The findings are pivotal for shaping future educational policies and enhancing online education strategies in a continuously evolving educational landscape (Greenhow et al. 2021). Thus, three research questions were raised:

Q1: How do undergraduates and teachers in China retrospectively perceive the effectiveness of online learning after the COVID-19 pandemic?
Q2: Which of the nine outcome influencing factors had the most significant impact on online learning experiences after the pandemic, and why?
Q3: What recommendations can be proposed to enhance the effectiveness of online learning in the future?

The research takes place at a comprehensive university in China, with a sample of 46 Year 1 students and 18 experienced teachers. Their reflections on the effectiveness of online learning were captured through reflective diaries guided by four questions. These questions investigated the students’ online learning states and attitudes, identified issues and insufficiencies in online learning, analyzed the reasons behind these problems, and proposed improvements. By assessing their experiences and perceptions, we seek to explore the significant factors that shaped online learning outcomes after the pandemic and the means to enhance its effectiveness.

This paper first presents a review of the existing literature, focusing on the impact of the pandemic on online learning and discussing the nine significant factors influencing online learning outcomes. Following this, the methodology utilized for this study is detailed, setting the stage for a deeper understanding of the research process. Subsequently, we delve into the results of the thematic analysis conducted based on undergraduate students and teachers’ retrospections. Finally, the paper concludes by offering meaningful implications of the findings for various stakeholders and suggesting directions for future research in this critical area.

Literature review

Online learning application and evaluation in higher education

Online learning, also known as e-learning or distance learning, refers to education that takes place over the Internet rather than in a traditional classroom setting. It has seen substantial growth over the past decade, accelerated by the COVID-19 pandemic (Trust and Whalen 2020). Online learning allows for a flexible learning environment, breaking the temporal and spatial boundaries of traditional classroom settings (Bozkurt and Sharma 2020). In response to the COVID-19 pandemic, educational institutions globally embraced online learning at an unprecedented scale, leading to an immense surge in research on the effects of the pandemic on online learning (Crawford et al. 2020; Marinoni et al. 2020).

Researchers were divided in their attitudes toward online learning, expressing positive, neutral, and negative views. Research by Bahasoan et al. (2020), Bernard et al. (2004), Hernández-Lara and Serradell-López (2018), and Paechter and Maier (2010) indicated that online learning can be effective, for instance by improving outcomes and engagement, providing flexibility, and enhancing digital skills. Other research, including studies by Dolan, Hancock, and Wareing (2015) and Means et al. (2010), indicates that under equivalent conditions and with similar levels of support, there is frequently no substantial difference in learning outcomes between traditional face-to-face courses and fully online courses.

However, online learning was not without its challenges. Studies such as Dennen (2014) report less favorable results for specific student groups. Common problems faced by students included underdeveloped independent learning ability, lack of motivation, difficulties in self-regulation, low engagement, and technical issues (Aristovnik et al. 2021; Martin and Bolliger 2018; Song et al. 2004; Zheng et al. 2022).

Moreover, factors such as instructional strategies and course design were also linked to learning outcomes and successful online learning (Ali 2020; Hongsuchon et al. 2022). Careaga-Butter et al. (2020) critically analyze online education in pandemic and post-pandemic contexts, focusing on digital tools and resources for teaching in synchronous and asynchronous learning modalities. They discuss the swift adaptation to online learning during the pandemic, highlighting the importance of technological infrastructure, pedagogical strategies, and the challenges of digital divides, and they emphasize the need for effective online learning environments while exploring trends in post-pandemic education and future educational strategies and practices.

Determinants of online learning outcomes

Online learning outcomes in this paper refer to the measurable educational results achieved through online learning methods, including knowledge acquisition, skill development, changes in attitudes or behaviors, and performance improvements (Chang 2016; Panigrahi et al. 2018). The literature review identified key factors influencing online learning outcomes, emphasizing their significant role in academic discourse. These factors, highlighted in scholarly literature, include student engagement, instructional design, technology infrastructure, student-teacher interaction, and student self-regulation.

Student Engagement: The level of a student’s engagement significantly impacts their learning outcomes. The more actively a student is engaged with the course content and activities, the better their performance tends to be. This underscores the importance of designing engaging course content and providing opportunities for active learning in an online environment (Martin and Bolliger 2018).

Instructional Design: How an online course is designed can greatly affect student outcomes. Key elements such as clarity of learning objectives, organization of course materials, and the use of diverse instructional strategies significantly impact student learning (Bozkurt and Sharma 2020).

Technology Infrastructure: The reliability and ease of use of the learning management system (LMS) also play a significant role in online learning outcomes. When students experience technical difficulties, it can lead to frustration, reduced engagement, and lower performance (Johnson et al. 2020).

Student-Teacher Interaction: Interaction between students and teachers in an online learning environment is a key determinant of successful outcomes. Regular, substantive feedback from instructors can promote student learning and motivation (Boling et al. 2012).

Student Self-Regulation: The autonomous nature of online learning requires students to be proficient in self-regulated learning, which involves setting learning goals, self-monitoring, and self-evaluation. Students who exhibit strong self-regulation skills are more likely to succeed in online learning (Broadbent 2017).

While many studies have investigated individual factors affecting online learning, there is a paucity of research offering a holistic view of these factors and their interrelationships, leading to a fragmented understanding of the influences on online learning outcomes. Given the multitude of experiences and variables encompassed by online learning, a comprehensive framework such as Yu's is instrumental in ensuring a thorough investigation and interpretation of the breadth of students’ experiences.

Students’ perceptions of online learning

Understanding students’ perceptions of online learning is essential for enhancing its effectiveness and student satisfaction. Studies show students appreciate online learning for its flexibility and convenience, offering personalized learning paths and resource access (Händel et al. 2020; Johnson et al. 2020). Yet, challenges persist, notably in maintaining motivation and handling technical issues (Aristovnik et al. 2021; Händel et al. 2020). Aguilera-Hermida (2020) reported mixed feelings among students during the COVID-19 pandemic, including feelings of isolation and difficulty adjusting to online environments. Boling et al. (2012) emphasized students’ preferences for interactive and communicative online learning environments. Additionally, research indicates that students seek more engaging content and innovative teaching approaches, suggesting a gap between current online offerings and student expectations (Chakraborty and Muyia Nafukho 2014). Students also emphasize the importance of community and peer support in online settings, underlining the need for collaborative and social learning opportunities (Lai et al. 2019). These findings imply that while online learning offers significant benefits, addressing its shortcomings is critical for maximizing its potential.

The pandemic prompted a reconsideration of instructional modalities, with many students favoring face-to-face instruction for its immediacy and because of the attention problems they experienced online (Aristovnik et al. 2021; Trust and Whalen 2020). Despite valuable insights, research gaps remain, particularly in long-term undergraduate reflections and in the application of comprehensive nine-factor frameworks, indicating a need for more holistic research into online learning effectiveness.

Teachers’ perceptions of online learning

The pandemic has brought attention to how teachers manage instruction in virtual learning environments. Teachers and students were divided in their attitudes toward online learning. Some valued its convenience and flexibility (Chuenyindee et al. 2022; Al-Emran and Shaalan 2021) and believed that online learning also provided opportunities to improve educational equality (Tenório et al. 2016). Even after COVID-19 was over, the reliance on online learning was likely to stay, as some approaches to online learning were well received by students and teachers (Al-Rahmi et al. 2019; Hongsuchon et al. 2022).

Some teachers showed great confidence in delivering instruction satisfactorily in an online environment. They also agreed that the difficulty of teaching was closely associated with course structure (Gavranović and Prodanović 2021).

Not all researchers were optimistic about the effects of online learning; others examined the challenges facing teachers and students.

A mixed-methods study of K-12 teachers’ feelings, experiences, and perspectives found that the major challenges teachers faced during the COVID-19 pandemic were lack of student participation and engagement, inadequate technological support for online learning, lack of face-to-face interaction with students, the absence of work-life balance, and learning new technology.

The challenges to teachers’ online instruction included instructional technology (Maatuk et al. 2022; Rasheed et al. 2020), course design (Khojasteh et al. 2023), and teachers’ confidence (Gavranović and Prodanović 2021).

Self-regulation and the use of technology were the key challenges for students, while using technology for teaching was the main challenge facing teachers (Rasheed et al. 2020).

The quality of course design was another important factor in online learning. One study revealed that the instructors’ competency and their expertise in content development contributed substantially to students’ satisfaction with the quality of e-contents.

Theoretical framework

The theoretical foundation of this research is deeply rooted in Yu's multifaceted framework for online learning, which provides a comprehensive, interwoven model encompassing nine critical factors that collectively shape the educational experience in online settings. This framework guides our analysis and enhances the comparability and interpretability of our results within the context of the existing literature.

Central to Yu’s framework is the concept of behavioral intention, which acts as a precursor to student engagement in online learning environments. This engagement, inherently linked to the students’ intentions and motivations, is significantly influenced by the quality of instruction they receive. Instruction, therefore, emerges as a pivotal element in this model, directly impacting not only student engagement but also fostering a sense of self-efficacy among learners. Such self-efficacy is crucial as it influences both the performance of students and their overall satisfaction with the learning process.

The framework posits that engagement, a derivative of both strong behavioral intention and effective instruction, plays a vital role in enhancing student performance. This engagement is tightly interlaced with self-regulation, an indispensable skill in the autonomous and often self-directed context of online learning. Interaction, encompassing various forms such as student-teacher and peer-to-peer communications, further enriches the learning experience. It significantly contributes to the development of motivation and self-efficacy, both of which are essential for sustaining engagement and fostering self-regulated learning.

Motivation, especially when intrinsically driven, acts as a catalyst, perpetuating engagement and self-regulation, which ultimately leads to increased satisfaction with the learning experience. In this framework, self-efficacy, nurtured through effective instruction and meaningful interactions, has a positive impact on students’ performance and satisfaction, thereby creating a reinforcing cycle of learning and achievement.

Performance in this model is viewed as a tangible measure of the synergistic interplay of engagement, instructional quality, and self-efficacy, while satisfaction reflects the culmination of the learning experience, shaped by the quality of instruction, the extent and nature of interactions, and the flexibility of the learning environment. This satisfaction, in turn, influences students’ future motivation and their continued engagement with online learning.

Yu’s model thus presents a dynamic ecosystem where changes in one factor can have ripple effects across the entire spectrum of online learning. It emphasizes the need for a holistic approach in the realm of online education, considering the complex interplay of these diverse yet interconnected elements to enhance both the effectiveness and the overall experience of online learning.

The current study employed a qualitative design to explore teachers’ and undergraduates’ retrospections on the effectiveness of online learning during the first semester of the 2022–2023 school year, which falls in the post-pandemic period. Data were collected through reflective diaries and interviews, and thematic analysis was applied to understand the experiences in terms of the nine factors.

Sample and sampling

The study involved 18 teachers and 46 first-year students from a comprehensive university in China, selected through convenience sampling with attention to diverse representation across academic disciplines. To ensure a diverse range of experiences in online learning, the selection process began with an email inquiry about participants' prior engagement with online education. The first author of this study received ethics approval from the department research committee, and participants were informed of the study’s objectives via email two weeks in advance. Only participants who provided written informed consent were included, and they were free to withdraw at any time. Pseudonyms were used to protect participants’ identities during data coding. For direct citations, acronyms of students’ names were used, while “T+number” was used for citations from teacher participants.

The 46 students were all first-year undergraduates, 9 female and 37 male, majoring in English and non-English disciplines (see Table 1).

The 18 teachers were all experienced instructors with at least 5 years of teaching experience, 13 female and 5 male, teaching English and non-English disciplines (see Table 2).

Data collection

Students’ data were collected through reflective diaries in class during the first semester of the 2022–2023 school year. Each participant was asked to maintain a diary over the course of one academic semester, in which they responded to four questions.

The four questions include:

What was your state and attitude toward online learning?

What were the problems and shortcomings of online learning?

What do you think are the reasons for these problems?

What measures do you think should be taken to improve online learning?

This approach provided a first-person perspective on the participants’ online teaching or learning experiences, capturing the depth and complexity of their retrospections.

Teachers were interviewed separately, responding to the same four questions as the students. Each interview was conducted in an office or the school canteen during the semester and lasted about 20 to 30 minutes.

Data analysis

We utilized thematic analysis to interpret the reflective diaries, guided initially by the nine factors. This method involved extensive engagement with the data, from initial coding to the final report. While Yu’s factors provided a foundational structure, we remained attentive to new themes, ensuring a comprehensive analysis. Our approach was methodical: familiarizing ourselves with the data, identifying initial codes, systematically searching for and reviewing themes, and then defining and naming them. To validate our findings, we incorporated peer debriefing and member checking, and maintained an audit trail. This method was chosen for its effectiveness in extracting in-depth insights from undergraduates’ retrospections on their online learning experiences post-pandemic, aligning with our research objectives.
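The coding-and-tallying step of this kind of thematic analysis can be sketched in a few lines; the excerpts, participant acronyms, and factor labels below are purely illustrative placeholders, not the study's actual coded data:

```python
from collections import Counter

# Hypothetical coded segments: each reflective-diary or interview excerpt
# has been assigned one of the nine framework factors during coding.
coded_segments = [
    ("DSY", "engagement"),
    ("XYN", "self-regulation"),
    ("WRX", "interaction"),
    ("YX", "self-regulation"),
    ("T4", "behavioral intention"),
    ("T1", "self-regulation"),
]

# Tally how often each factor was coded, mirroring the kind of
# factor-by-factor summary a table such as Table 3 would report.
factor_counts = Counter(factor for _, factor in coded_segments)

for factor, n in factor_counts.most_common():
    print(f"{factor}: {n}")
```

In practice this tallying is done in qualitative-analysis software rather than by hand, but the underlying operation (grouping coded excerpts by theme and counting them) is the same.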

Following the nine factors, the responses of the 18 teachers and 46 Year 1 undergraduates were catalogued and listed in Table 3.

Behavioral intention towards online learning post-pandemic

Since the outbreak of the COVID-19 pandemic, both teachers and students have experienced online learning. However, their online teaching or learning was forced rather than planned (Baber 2021; Bao 2020). Students accepted online learning more readily when they perceived the severity of COVID-19.

As the post-pandemic era began, traditional teaching resumed. Students often compared online learning with traditional learning, mentioning learning interest, eye contact, face-to-face learning, and the learning atmosphere.

“I don’t think online learning is a good form of learning because it is hard to focus on learning.” (DSY) “In unimportant courses, I would let the computer log to the platform and at the same time do other entertains such as watching movies, listening to the music, having snacks or do the cleaning.” (XYN) “Online learning makes it impossible to have eye contact between teachers and students and unable to create a face-to-face instructional environment, which greatly influences students’ initiative and engagement in classes.” (WRX)

Students with positive attitudes toward online learning usually showed higher behavioral intention to use it than those with negative attitudes, as found in the research of Zhu et al. (2023). Accordingly, students put more of the blame on distractions in the learning environment.

“Online learning relies on computers or cell phones which easily brings many distractions. … I can’t focus on studying, shifting constantly from study and games.” (YX) “When we talk about learning online, we are hit by an idea that we can take a rest in class. It’s because everyone believes that during online classes, the teacher is unable to see or know what we are doing.” (YM) “…I am easily disturbed by external factors, and I am not very active in class.” (WZB)

Teachers reported that a majority of students were reluctant to turn on their cameras during online instruction and suggested possible reasons for this behavior.

“One of the reasons why some students are unwilling to turn on the camera is that they are worried about their looks and clothing at home, or that they don’t want to become the focus.” (T4)

They also noticed students’ absent-mindedness and lax attitudes during online instruction.

“As for some students who are not self-regulated, they would not take online learning as seriously as offline learning. Whenever they are logged onto the online platform, they would be unable to stay focused and keep their attention.” (T1)

Challenges and opportunities in online instruction post-pandemic

Online teaching brought new challenges and opportunities for students during and after the pandemic. The distractions at home seemed to be significantly underestimated by teachers in an online learning environment (Radmer and Goodchild 2021). This might be why students so strongly expected and heavily relied on teachers’ supervision and management.

“The biggest problem of online learning is that online courses are as imperative as traditional classes, but not managed face to face the same as the traditional ones.” (PC) “It is unable to provide some necessary supervision.” (GJX) “It is incapable of giving timely attention to every student.” (GYC) “Teachers can’t understand students’ conditions in time in most cases so teachers can’t adjust their teaching plan.” (MZY) “Some courses are unable to reach the teaching objectives due to lack of experimental conduction and practical operation.” (YZH) “Insufficient teacher-student interaction and the use of cell phones make both groups unable to engage in classes. What’s more, though online learning doesn’t put a high requirement for places, its instructional environment may be crucial due to the possible distractions.” (YCY)

Teachers also viewed online instruction as a supplement to face-to-face instruction.

“Online learning cannot run as smoothly as face-to-face instruction, but it can provide an in-time supplement to the practical teaching and students’ self-learning.” (T13, T17) “Online instruction is an essential way to ensure the normal function of school work during the special periods like the pandemic” (T1, T15)

Factors influencing student engagement in online learning

Learning engagement was found to contribute to learning gains (Paul and Diana 2006). It has also been described as a state closely intertwined with three dimensions, i.e., vigor, dedication, and absorption (Schaufeli et al. 2002). Previous studies have found that key factors such as learning interaction, self-regulation, and social presence can influence learning engagement and learning outcomes (Lowenthal and Dunlap 2020; Ng 2018). Due to the absence of face-to-face cues such as eye contact, facial expressions, and body language, both groups of interviewees agreed that students found it hard to keep their attention and thus remain active in online classes.

“Students are unable to engage in study due to a lack of practical learning environment of online learning.” (ZMH, T12) “Online platforms may not provide the same level of engagement and interaction as in-person classrooms, making it harder for students to ask questions or engage in discussions.” (HCK) “The Internet is cold, lack of emotional clues and practical connections, which makes it unable to reproduce face-to-face offline learning so that teachers and students are unlikely to know each other’s true feelings or thoughts. In addition, different from the real-time learning supervision in offline learning, online learning leaves students more learning autonomy.” (XGH) “Lack of teachers’ supervision and practical learning environment, students are easily distracted.” (LMA, T9)

Just as Zhu et al. (2023) pointed out, we had been too optimistic about students’ engagement in online learning, because online learning relies more heavily on students’ autonomy and effort.

Challenges in teacher-student interaction in online learning

Online learning has a notable feature: the spatial and temporal separation between teachers and students. Online teacher-student interactions, which are fundamental to relationship formation, therefore pose greater challenges for both parties. Prior studies found that online interaction affected social presence and indirectly affected learning engagement through social presence (Miao and Ma 2022). In the present investigation, both teachers and students noted the striking disadvantages of online interaction.

“Online learning has many problems such as indirect teacher-student communication, inactive informative communication, late response of students and their inability to reflect their problems. For example, teachers cannot evaluate correctly whether the students have mastered or not.” (YYN) “Teachers and students are separated by screens. The students cannot make prompt responses to the teachers’ questions via loudspeakers or headphones. It is not convenient for students to participate in questioning and answering. …for most of the time, the students interact with teachers via typing.” (ZJY) “While learning online, students prefer texting the questions to answering them via the loudspeaker.”(T7)

Online learning interaction was also found to be closely related to online learning engagement, performance, and self-efficacy.

“Teachers and students are unable to have timely and effective communication, which reduces the learning atmosphere. Students are often distracted. While doing homework, the students are unable to give feedback to teachers.” (YR) “Students are liable to be distracted by many other side matters so that they can keep their attention to online learning.” (T15)

In the online learning environment, teachers need to make efforts to build rapport and personalize interactions with students to help them perform better and achieve greater academic success (Harper 2018; Ong and Quek 2023). Meanwhile, teachers should also motivate students’ learning by designing lessons, giving lectures, and managing the processes of student interaction (Garrison 2003; Ong and Quek 2023).

Determinants of self-efficacy in online learning

Online learning self-efficacy refers to students’ perception of their abilities to fulfill specific tasks required in online learning (Calaguas and Consunji 2022; Zimmerman and Kulikowich 2016). Online learning self-efficacy was found to be influenced by various factors including task, learner, course, and technology level, among which task level was found to be most closely related (Xu et al. 2022). The responses from the 46 student participants reveal a shared concern, albeit without mentioning specific tasks; they highlight critical aspects influencing online learning: learner attributes, course structure, and technological infrastructure.

One unifying theme from the student feedback is the challenge of self-regulation and environmental distractions impacting learning efficacy. For instance, participant WSX notes the necessity for students to enhance time management skills due to deficiencies in self-regulation, which is crucial for successful online learning. Participant WY expands on this by pointing out the distractions outside traditional classroom settings, coupled with limited teacher-student interaction, which hampers idea exchange and independent thought, thereby undermining educational outcomes. These insights suggest a need for strategies that bolster students’ self-discipline and interactive opportunities in virtual learning environments.

On the technological front, participants WT and YCY address different but related issues. Participant WT emphasizes the importance of up-to-date course content and learning facilities, indicating that outdated materials and tools can significantly diminish the effectiveness of online education. Participant YCY adds to this by highlighting problems with online learning applications, such as subpar functionalities that can introduce additional barriers to learning.

Teacher participants, on the other hand, shed light on objective factors predominantly related to course content and technology. Participant T5’s response underscores the heavy dependency on technological advancement in online education and points out the current inability of platforms or apps to adequately monitor student engagement and progress. Participant T9 voices concerns about course content not being updated or aligned with contemporary trends and student interests, suggesting a disconnect between educational offerings and learner needs. Meanwhile, participant T8 identifies unstable network services as a significant hindrance to online teaching, highlighting infrastructure as a critical component of online education’s success.

Teachers also believed that insufficient mastery of the facilities and unfamiliarity with online instruction posed difficulties.

“Most teachers and students are not familiar with online instruction. For example, some teachers are unable to manage online courses so they cannot design the courses well. Some students lack self-regulation, which leads to their distraction or avoidance in class.” (T9)

Influences on student performance in online learning

Students’ performance during online lessons is closely associated with their satisfaction and self-efficacy. Most of the student participants reflected on their distractions, confusion, and needs, which indicates their dissatisfaction with online learning.

“During online instruction, it is convenient for the students to make use of cell phones, but instead, cell phones bring lots of distraction.” (YSC) “Due to the limits of online learning, teachers are facing the computer screen and unable to know timely students’ needs and confusion. Meanwhile, it’s inconvenient for teachers to make clear explanations of the sample questions or problems.” (HZW)

They attributed their low learning efficiency to external factors such as the learning environment.

“The most obvious disadvantage of online learning goes to low efficiency. Students find it hard to keep attention to study outside the practical classroom or in a relaxing environment.” (WY) “Teachers are not strict enough with students, which leads to ineffective learning.” (WRX)

Teacher participants viewed students’ performance as closely related to effective online supervision and students’ self-regulation.

“Online instruction is unable to create a learning environment, which helps teachers know students’ instant reaction. Only when students well regulate themselves and stay focused during online learning can they achieve successful interactions and make good accomplishments in the class.” (T11) “Some students need teachers’ supervision and high self-regulation, or they were easily distracted.” (T16)

Student satisfaction and teaching effectiveness in online learning

Online learning satisfaction was found to be significantly and positively associated with online learning self-efficacy (Al-Nasa’h et al. 2021; Lashley et al. 2022). Around 46% of student participants were unsatisfied with teachers’ ways of teaching.

“Comparatively, bloggers are more interesting than teachers’ boring and dull voices in online learning.” (DSY) “Teachers’ voice sounds dull and boring through the internet, which may cause listeners to feel sleepy, and the teaching content is not interesting enough to the students.” (MFE)

This partly reflected that some teachers had not adapted to online teaching, possibly owing to a lack of experience with online teaching or learning (Zhu et al. 2022).

“Some teachers are not well-prepared for online learning. They are particularly unready for emergent technological problems when delivering the teaching.” (T1) “One of the critical reasons lies in the fact that teachers and students are not well trained before online learning. In addition, the online platform is not unified by the college administration, which has led to chaos and difficulty of online instruction.” (T17)

Teachers recognized their inadequate preparation for and mastery of online teaching as one reason for dissatisfaction, but student participants exaggerated the role of teachers in online learning and ignored their own responsibility for planning and managing their learning behavior, as in the research of Xu et al. (2022).

The role of self-regulation in online learning success

In the context of online learning, self-regulation stands out as a crucial factor, necessitating heightened levels of student self-discipline and autonomy. This aspect, as Zhu et al. (2023) suggest, grants students significant control over their learning processes, making it a vital component for successful online education.

“Online learning requires learners to be of high discipline and self-regulation. Without good self-regulation, they are less likely to be effective in online learning.” (YZJ) “Most students lack self-control, unable to control the time of using electronic products. Some even use other electronic products during online learning, which greatly reduces their efficiency in learning.” (GPY) “Students are not well developed in self-control and easily distracted. Thus they are unable to engage fully in their study, which makes them unable to keep up with others” (XYN)

Both groups of participants had a clear idea of the positive role of self-regulation in successful learning, but they also admitted that students need to strengthen their self-regulation skills, which they associated with the learning environment, learning efficiency, and teachers’ supervision.

“If they are self-motivated, online learning can be conducted more easily and more efficiently. However, a majority are not strong in regulating themselves. Teachers’ direct supervision in offline learning can do better in motivating them to study hard…lack of interaction makes students less active and motivated.” (LY) “Students have a low level of self-discipline. Without teachers’ supervision, they find it hard to listen attentively or even quit listening. Moreover, in class, the students seldom think actively and independently.” (T13)

The analysis of participant responses, categorized into three distinct attitude groups (positive, neutral, and negative), reveals a multifaceted view of the disadvantages of online learning, as shown in Tables 4 and 5. This classification provides a clearer understanding of how attitudes toward online learning influence perceptions of self-regulation and other related factors.

In Table 4, the division among students is most pronounced in terms of interaction and self-efficacy. Those with neutral attitudes highlighted interaction as a primary concern, suggesting that it is less effective in an online setting. Participants with positive attitudes noted a lack of student motivation, while those with negative views emphasized the need for better self-efficacy. Across all attitudes, instruction, engagement, self-regulation, and behavioral intention were consistently identified as areas needing improvement.

Table 5 sheds light on teachers’ perspectives, revealing a consensus on the significance and challenges of instruction, motivation, and self-efficacy in online learning. Teachers’ opinions vary most significantly on self-efficacy and engagement. Those with negative attitudes point to self-efficacy and instructional quality as critical areas needing attention, while neutral attitudes focus on the role of motivation.

Discussion

A combined qualitative and quantitative analysis of the diary and interview data showed that, among the 18 college teachers and 46 Year 1 undergraduates of various majors taking part in the study, about 38.9% of teachers and about 30.4% of students supported online learning. Only two teachers were neutral about online learning, and 50% of teachers did not support virtual learning. The percentages of students who expressed positive and neutral views on online learning were the same, i.e., 34.8%. This indicates that online learning could serve as a complementary approach to traditional education, yet it is not without challenges, particularly in terms of student engagement, self-regulation, and behavioral intention, which were often attributed to distractions inherent in online environments.
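As a minimal arithmetic check, the reported teacher percentages can be reconstructed from the sample sizes given above; the raw counts used here are inferred from those percentages (7 of 18 teachers ≈ 38.9%, 2 neutral, leaving 9 = 50%) and should be read as illustrative rather than as the study's raw data:

```python
# Sample sizes reported in the study.
teachers_total, students_total = 18, 46

# Counts inferred from the reported percentages (illustrative):
# "only two teachers were neutral"; 38.9% of 18 ≈ 7 supportive.
teachers_supportive = 7
teachers_neutral = 2
teachers_negative = teachers_total - teachers_supportive - teachers_neutral

def pct(n: int, total: int) -> float:
    """Share of a group, as a percentage rounded to one decimal place."""
    return round(100 * n / total, 1)

print(pct(teachers_supportive, teachers_total))  # 38.9 (supported online learning)
print(pct(teachers_neutral, teachers_total))     # 11.1 (neutral)
print(pct(teachers_negative, teachers_total))    # 50.0 (did not support)
print(pct(16, students_total))                   # 34.8 (positive or neutral students)
```

The same computation reproduces the student figures: 16 of 46 students in each of the positive and neutral groups gives 34.8%.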

In analyzing the nine factors, it was evident that teachers and students did not perceive them uniformly. Instruction was a significant element for both groups, as validated by the findings in Tables 3 and 5. The absence of face-to-face interaction in online learning shifted the focus to the quality of online instruction. Teachers cited technological challenges as a central concern, while students criticized the lack of engaging content and teaching methods. This aligns with Miao and Ma (2022), who argued that direct online interaction does not necessarily influence learner engagement, underscoring the need for integrated approaches encompassing interaction, self-regulation, and social presence.

Furthermore, the role of technology acceptance in shaping self-efficacy was highlighted by Xu et al. (2022), who suggested that students with higher self-efficacy tend to challenge themselves more. Chen and Hsu (2022) noted the positive influence of using emojis in online lessons, emphasizing the importance of innovative pedagogical approaches in online settings.

The study revealed distinct priorities between teachers and students in online learning: teachers emphasized effective instruction delivery, while students valued learning outcomes, self-regulation, and engagement. This divergence highlights the unique challenges each group faces. Findings by Dennen et al. (2007) corroborate this, showing instructors focusing on content and guidance while students prioritize interpersonal communication and individualized attention. Additionally, Lee et al. (2011) found that reduced transactional distance and increased student engagement led to enhanced perceptions of learning outcomes, aligning with students’ priorities in online courses. Understanding these differing perspectives is crucial for developing comprehensive online learning strategies that address the needs of both educators and learners.

Integrating these findings with broader contextual elements such as technological infrastructure, pedagogical strategies, socio-economic backgrounds, and environmental factors (Balanskat and Bingimlas 2006) further enriches our understanding. The interplay between these external factors and Yu’s nine key aspects forms a complex educational ecosystem. For example, government interventions and training programs have been shown to increase teachers’ enthusiasm for ICT and its routine use in education (Balanskat and Bingimlas 2006). Additionally, socioeconomic factors significantly shape students’ experiences with online learning, as the digital divide in connectivity and access to computers at home influences the ICT experience, an important factor in school achievement (OECD 2015; Punie et al. 2006).

In sum, the study advocates for a holistic approach to understanding and enhancing online education, recognizing the complex interplay between internal factors and external elements that shape the educational ecosystem in the digital age.

Conclusion and future research

This study offered a comprehensive exploration into the retrospective perceptions of college teachers and undergraduate students regarding their experiences with online learning following the COVID-19 pandemic. It was guided by a framework encompassing nine key factors that influence online learning outcomes. To delve into these perspectives, the research focused on three pivotal questions. These questions aimed to uncover how both undergraduates and teachers in China view the effectiveness of online learning post-pandemic, identify which of the nine influencing factors had the most significant impact, and propose recommendations for enhancing the future effectiveness of online learning.

In addressing the first research question concerning the retrospective perceptions of online learning’s effectiveness among undergraduates and teachers in China post-COVID-19 pandemic, the thematic analysis has delineated clear divergences in attitude between the two demographics. Participants were primarily divided into three categories based on their stance toward online learning: positive, neutral, and negative. The results highlighted a pronounced variance in attitude distribution between teachers and students, with a higher percentage of teachers expressing clear-cut opinions, either favorably or unfavorably, towards the effectiveness of online learning.

Conversely, students displayed a pronounced inclination towards neutrality, revealing a more cautious or mixed stance on the effectiveness of online learning. This prevalent neutrality within the student body could be attributed to a range of underlying reasons. It might signify students’ uncertainties or varied experiences with online platforms, differences in engagement levels, gaps in digital literacy, or fluctuating quality of online materials and teaching methods. Moreover, this neutral attitude may arise from the psychological and social repercussions of the pandemic, which have potentially altered students’ approaches to and perceptions of learning in an online context.

In the exploration of the nine influential factors in online learning, it was discovered that both teachers and students overwhelmingly identified instruction as the most critical aspect. This was closely followed by engagement, interaction, motivation, and other factors, while performance and satisfaction were perceived as less influential by both groups. However, the attitudes of teachers and students towards these factors revealed notable differences, particularly about instruction. Teachers often attributed challenges in online instruction to technological issues, whereas students perceived the quality of instruction as a major influence on their learning effectiveness. This dichotomy highlights the distinct perspectives arising from their different roles within the educational process.

A further divergence was observed in views on self-efficacy and self-regulation. Teachers, with a focus on delivering content, emphasized the importance of self-efficacy, while students, grappling with the demands of online learning, prioritized self-regulation. This reflects their respective positions in the online learning environment, with teachers concerned about the efficacy of their instructional strategies and students about managing their learning process. Interestingly, the study also illuminated that students did not always perceive themselves as independent learners, which contributed to the high priority they placed on instruction quality. This insight underlines a significant area for development in online learning strategies, emphasizing the need for fostering greater learner autonomy.

Notably, both teachers and students concurred that stimulating interest was a key factor in enhancing online learning. They proposed innovative approaches such as emulating popular online personalities, enhancing interactive elements, and contextualizing content to make it more relatable to students’ lives. Additionally, practical suggestions like issuing preview tasks and conducting in-class quizzes were highlighted as methods to boost student engagement and learning efficiency. The consensus on the importance of supervisory roles underscores the necessity for a balanced approach that integrates guidance and independence in the online learning environment.

The outcomes of our study highlight the multifaceted nature of online learning, accentuated by the varied perspectives and distinct needs of teachers and students. This complexity underscores the necessity of recognizing and addressing these nuances when designing and implementing online learning strategies. Furthermore, our findings offer a comprehensive overview of both the strengths and weaknesses of online learning during an unprecedented time, providing valuable insights for educators, administrators, and policy-makers in higher education. Moreover, the study emphasized the intricate interplay of multiple factors—behavioral intention, instruction, engagement, interaction, motivation, self-efficacy, performance, satisfaction, and self-regulation—in shaping online learning outcomes. It nonetheless presents some limitations, notably its reliance on a single research method and a limited sample size.

In particular, the exclusive use of reflective diaries and interviews restricts the range of data collected, which might have been enriched by incorporating additional quantitative or mixed-method approaches. Furthermore, the sample, consisting only of students and teachers from one university, may not adequately represent the diverse experiences and perceptions of online learning across different educational contexts. These limitations call for cautious interpretation of the findings and point to areas for future research. Future studies could extend this work by incorporating a larger, more diverse sample to gain a broader understanding of undergraduate students’ retrospections across different contexts and cultures. Research could also explore how to better equip students with the skills and strategies necessary to optimize their online learning experiences, especially in terms of self-regulated learning and motivation.

Data availability

The data supporting this study are available from https://doi.org/10.6084/m9.figshare.25583664.v1. The data consist of reflective diaries from 46 Year 1 students at a comprehensive university in China and 18 college teachers. We used thematic analysis to interpret the reflective diaries, guided initially by nine factors. The results highlight the critical need for tailored online learning strategies and provide insights into the advantages and challenges of online learning for stakeholders in higher education.

Aguilera-Hermida AP (2020) College students’ use and acceptance of emergency online learning due to COVID-19. Int. J. Educ. Res. Open 1:100011. https://doi.org/10.1016/j.ijedro.2020.100011

Al-Emran M, Shaalan K (2014) E-podium technology: A medium of managing knowledge at Al Buraimi University College via M-learning. In: Proceedings of the 2nd BCS International IT Conference, Abu Dhabi, United Arab Emirates. Retrieved October 17, 2023, from https://dblp.uni-trier.de/rec/conf/bcsit/EmranS14.html

Ali W (2020) Online and remote learning in higher education institutes: A necessity in light of COVID-19 pandemic. High. Educ. Stud. 10(3):16–25. https://doi.org/10.5539/hes.v10n3p16

Al-Nasa’h M, Al-Tarawneh L, Awwad FMA, Ahmad I (2021) Estimating Students’ Online Learning Satisfaction during COVID-19: A Discriminant Analysis. Heliyon 7(12):1–7. https://doi.org/10.1016/j.helyon.2021.e08544

Al-Rahmi WM, Yahaya N, Aldraiweesh AA, Alamri MM, Aljarboa NA, Alturki U (2019) Integrating technology acceptance model with innovation diffusion theory: An empirical investigation on students’ intention to use E-learning systems. IEEE Access 7:26797–26809. https://doi.org/10.1109/ACCESS.2019.2899368

Aristovnik A, Keržič D, Ravšelj D, Tomaževič N, Umek L (2021) Impacts of the COVID-19 pandemic on life of higher education students: A global perspective. Sustainability 12(20):8438. https://doi.org/10.3390/su12208438

Baber H (2021) Modelling the Acceptance of E-learning during the Pandemic Of COVID-19-A Study of South Korea. Int. J. Manag. Educ. 19(2):1–15. https://doi.org/10.1016/j.ijme.2021.100503

Bahasoan AN, Ayuandiani W, Mukhram M, Rahmat A (2020) Effectiveness of online learning in pandemic COVID-19. Int. J. Sci., Technol. Manag. 1(2):100–106

Balanskat A, Bingimlas KA (2006) Barriers to the Successful Integration of ICT in Teaching and Learning Environments: A Review of the Literature. Eurasia J. Math. Sci. Technol. Educ. 5(3):235–245. https://doi.org/10.12973/ejmste/75275

Bao W (2020) COVID-19 and Online Teaching in Higher Education: A Case Study of Peking University. Hum. Behav. Emerg. Technol. 2(2):113–115. https://doi.org/10.1002/hbe2.191

Bernard RM et al. (2004) How Does Distance Education Compare with Classroom Instruction? A Meta-Analysis of the Empirical Literature. Rev. Educ. Res. 74(3):379–439

Boling EC, Hough M, Krinsky H, Saleem H, Stevens M (2012) Cutting the distance in distance education: Perspectives on what promotes positive, online learning experiences. Internet High. Educ. 15:118–126. https://doi.org/10.1016/j.iheduc.2011.11.006

Bozkurt A, Sharma RC (2020) Emergency remote teaching in a time of global crisis due to Coronavirus pandemic. Asian J. Distance Educ. 15(1):i–vi

Broadbent J (2017) Comparing online and blended learner’s self-regulated learning strategies and academic performance. Internet High. Educ. 33:24–32. https://doi.org/10.1016/j.iheduc.2017.01.004

Calaguas NP, Consunji PMP (2022) A Structural Equation Model Predicting Adults’ Online Learning Self-efficacy. Educ. Inf. Technol. 27:6233–6249. https://doi.org/10.1007/s10639-021-10871-y

Careaga-Butter M, Quintana MGB, Fuentes-Henríquez C (2020) Critical and Prospective Analysis of Online Education in Pandemic and Post-pandemic Contexts: Digital Tools and Resources to Support Teaching in Synchronous and Asynchronous Learning Modalities. Aloma: Revista de Psicologia, Ciències de l’Educació i de l’Esport Blanquerna 38(2):23–32

Chakraborty M, Muyia Nafukho F (2014) Strengthening Student Engagement: What do Students Want in Online Courses? Eur. J. Train. Dev. 38(9):782–802

Chang V (2016) Review and Discussion: E-learning for Academia and Industry. Int. J. Inf. Manag. 36(3):476–485. https://doi.org/10.1016/j.ijinfomgt.2015.12.007

Chen YJ, Hsu LW (2022) Enhancing EFL Learners’ Self-efficacy Beliefs of Learning English with emoji Feedbacks in Call: Why and How. Behav. Sci. 12(7):227. https://doi.org/10.3390/bs12070227

Chuenyindee T, Montenegro LD, Ong AKS, Prasetyo YT, Nadlifatin R, Ayuwati ID, Sittiwatethanasiri T, Robas KPE (2022) The Perceived Usability of the Learning Management System during the COVID-19 Pandemic: Integrating System Usability Scale, Technology Acceptance Model, and Task-technology Fit. Work 73(1):41–58. https://doi.org/10.3233/WOR-220015

Crawford J, Butler-Henderson K, Rudolph J, Malkawi B, Glowatz M, Burton R, Lam S (2020) COVID-19: 20 countries’ higher education intra-period digital pedagogy responses. J. Appl. Teach. Learn. 3(1):120. https://doi.org/10.37074/jalt.2020.3.1.7

Dennen VP (2014) Becoming a blogger: Trajectories, norms, and activities in a community of practice. Computers Hum. Behav. 36:350–358. https://doi.org/10.1016/j.chb.2014.03.028

Dennen VP, Darabi AA, Smith LJ (2007) Instructor–learner Interaction in Online Courses: The Relative Perceived Importance of Particular Instructor Actions on Performance and Satisfaction. Distance Educ. 28(1):65–79

Dolan E, Hancock E, Wareing A (2015) An evaluation of online learning to teach practical competencies in undergraduate health science students. Internet High. Educ. 24:21–25

Garrison DR (2003) Cognitive Presence for Effective Asynchronous Online Learning: The Role of Reflective Inquiry, Self-Direction and Metacognition. Elements of Quality Online Education: Practice and Direction 4(10):47–58

Gavranović V, Prodanović M (2021) ESP Teachers’ Perspectives on the Online Teaching Environment Imposed in the Covid-19 Era-A Case Study. N. Educ. Rev. 2:188–197. https://doi.org/10.15804/TNER.2021.64.2.15

Greenhow C, Lewin C, Staudt Willet KB (2021) The Educational Response to Covid-19 across Two Countries: a Critical Examination of Initial Digital Pedagogy Adoption. Technol., Pedagog. Educ. 30(1):7–25

Händel M, Stephan M, Gläser-Zikuda M, Kopp B, Bedenlier S, Ziegler A (2020) Digital readiness and its effects on higher education students’ socio-emotional perceptions in the context of the COVID-19 pandemic. J. Res. Technol. Educ. 53(2):1–13

Harper B (2018) Technology and Teacher-student Interactions: A Review of Empirical Research. J. Res. Technol. Educ. 50(3):214–225

Hernández-Lara AB, Serradell-López E (2018) Student interactions in online discussion forums: their perception on learning with business simulation games. Behav. Inf. Technol. 37(4):419–429

Hodges C, Moore S, Lockee B, Trust T, Bond A (2020) The difference between emergency remote teaching and online learning. Educause Rev. 27:1–12

Hongsuchon T, Emary IMME, Hariguna T, Qhal EMA (2022) Assessing the Impact of Online-learning Effectiveness and Benefits in Knowledge Management, the Antecedent of Online-learning Strategies and Motivations: An Empirical Study. Sustainability 14(5):1–16. https://doi.org/10.3390/su14052570

Johnson N, Veletsianos G, Seaman J (2020) US faculty and administrators’ experiences and approaches in the early weeks of the COVID-19 pandemic. Online Learn. 24(2):6–21

Khojasteh L, Karimian Z, Farahmandi AY, Nasiri E, Salehi N (2023) E-content Development of English Language Courses during COVID-19: A Comprehensive Analysis of Students’ Satisfaction. J. Computer Educ. 10(1):107–133. https://doi.org/10.1007/s40692-022-00224-0

Kohnke, L, & Foung, D (2023). Exploring Microlearning for Teacher Professional Development: Voices from Hong Kong. In Tafazoli, D, M Picard (Eds.). Handbook of CALL Teacher Education Professional Development (pp. 279-292). Singapore: Springer Nature Singapore Pte Ltd

Lai CH, Lin HW, Lin RM, Tho PD (2019) Effect of Peer Interaction among Online Learning Community on Learning Engagement and Achievement. Int. J. Distance Educ. Technol. (IJDET) 17(1):66–77

Lashley PM, Sobers NP, Campbell MH, Emmanuel MK, Greaves N, Gittens-St Hilaire M, Murphy MM, Majumder MAA (2022) Student Satisfaction and Self-Efficacy in a Novel Online Clinical Clerkship Curriculum Delivered during the COVID-19 Pandemic. Adv. Med. Educ. Pract. 13:1029–1038. https://doi.org/10.2147/AMEP.S374133

Lee SJ, Srinivasan S, Trail T, Lewis D, Lopez S (2011) Examining the Relationship among Student Perception of Support, Course Satisfaction, and Learning Outcomes in Online Learning. Internet High. Educ. 14(3):158–163

Lowenthal PR, Dunlap JC (2020) Social Presence and Online Discussions: A Mixed Method Investigation. Distance Educ. 41:490–514. https://doi.org/10.1080/01587919.2020.1821603

Lucas M, Vicente PN (2023) A Double-edged Sword: Teachers’ Perceptions of the Benefits and Challenges of Online Learning and Learning in Higher Education. Educ. Inf. Technol. 23:5083–5103. https://doi.org/10.1007/s10639-022-11363-3

Maatuk AM, Elberkawi EK, Aljawarneh S, Rashaideh H, Alharbi H (2022) The COVID-19 Pandemic and E-learning: Challenges and Opportunities from the Perspective of Students and Instructors. J. Comput. High. Educ. 34:21–38. https://doi.org/10.1007/s12528-021-09274-2

Marinoni G, Van’t Land H, Jensen T (2020) The Impact of Covid-19 on Higher Education around the World. IAU Glob. Surv. Rep. 23:1–17

Martin F, Bolliger DU (2018) Engagement matters: Student perceptions on the importance of engagement strategies in the online learning environment. Online Learn. 22(1):205–222

Means B, et al. (2010). Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies. Washington, D. C.: U.S. Department of Education

Miao J, Ma L (2022) Students’ Online Interaction, Self-regulation, and Learning Engagement in Higher Education: The Importance of Social Presence to Online Learning. Front. Psychol. 13:1–9. https://doi.org/10.3389/fpsyg.2022.815220

Mulla T, Munir S, Mohan V (2023) An Exploratory Study to Understand Faculty Members’ Perceptions and Challenges in Online Teaching. Int. Rev. Educ. 69:73–99. https://doi.org/10.1007/s11159-023-100

Ng EW (2018) Integrating Self-regulation Principles with Flipped Classroom Pedagogy for First Year University Students. Computer Educ. 126:65–74. https://doi.org/10.1007/s11409-011-9082-8

Ong SGT, Quek GCL (2023) Enhancing teacher–student interactions and student online engagement in an online learning environment. Learn. Environ. Res. 26:681–707. https://doi.org/10.1007/s10984-022-09447-5

OECD (2015) The G20 skills strategy for developing and using skills for the 21st century. Retrieved from https://www.oecd.org/g20/topics/employment-andsocial-policy/The-G20-Skills-Strategy-for-Developing-and-Using-Skills-for-the-21st-Century.pdf

Paechter M, Maier B (2010) Online or face-to-face? Students’ experiences and preferences in e-learning. Internet High. Educ. 13(4):292–297

Panigrahi R, Srivastava PR, Sharma D (2018) Online Learning: Adoption, Continuance, and Learning Outcome—A Review of Literature. Int. J. Inf. Manag. 43:1–14

Paul C, Diana M (2006) Teaching Methods and Time on Task in Junior Classrooms. Educ. Res. 30:90–97

Punie Y, Zinnbauer D, Cabrera M (2006) A review of the impact of ICT on learning. European Commission, Brussels 6(5):635–650

Radmer F, Goodchild S (2021) Online Mathematics Teaching and Learning during the COVID-19 Pandemic: The Perspective of Lecturers and Students. Nord. J. STEM Educ. 5(1):1–5. https://doi.org/10.5324/njsteme.v5i1.3914

Rapanta C, Botturi L, Goodyear P, Guadia L, Koole M (2021) Balancing Technology, Pedagogy and the New Normal: Post-pandemic Challenges for Higher Education. Postdigital Sci. Educ. 3:715–742. https://doi.org/10.1007/s42438-021-00249-1

Rasheed RA, Kamsin A, Abdullah NA (2020) Challenges in the Online Component of Blended Learning: A Systematic Review. Computers &amp; Education 144:103701. https://doi.org/10.1016/j.compedu.2019.103701

Schaufeli W, Salanova M, Gonzalez-Roma V (2002) The Measurement of Engagement and Burnout: A Two Sample Confirmatory Factor Analytic Approach. J. Happiness Stud. 3:71–92

Song L, Singleton ES, Hill JR, Koh MH (2004) Improving online learning: Student perceptions of useful and challenging characteristics. Internet High. Educ. 7(1):59–70. https://doi.org/10.1016/j.iheduc.2003.11.003

Tan HT, Chan PP, Said NM (2021) Higher Education Students’ Online Instruction Perceptions: A Quality Virtual Learning Environment. Sustainability 13:10840. https://doi.org/10.3390/su131910840

Tenório T, Bittencourt II, Isotani S, Silva AP (2016) Does peer assessment in online learning environments work? A systematic review of the literature. Computers Hum. Behav. 64:94–107. https://doi.org/10.1016/j.chb.2016.06.020

Torsani, S (2023) Teacher Education in Mobile Assisted Language Learning for Adult Migrants: A Study of Provincial Centers for Adult Education in Italy. In Tafazoli, D, & M Picard (eds.). Handbook of CALL Teacher Education Professional Development (pp. 179-192). Singapore: Springer Nature Singapore Pte Ltd

Trust T, Whalen J (2020) Should teachers be trained in emergency remote teaching? Lessons learned from the COVID-19 pandemic. J. Technol. Teach. Educ. 28(2):189–199

UNESCO (2020a) COVID-19 Impact on education. UNESCO. Retrieved from https://en.unesco.org/covid19/educationresponse

UNESCO (2020b) Education: From Disruption to Recovery. UNESCO. Retrieved from https://en.unesco.org/covid19/educationresponse

Wang M, & Wang F (2021, August) Comparative Analysis of University Education Effect under the Traditional Teaching and Online Teaching Mode. In The Sixth International Conference on Information Management and Technology (pp. 1-6)

Xu Q, Wu J, Peng HY (2022) Chinese EFL University Students’ Self-efficacy for Online Self-regulated Learning: Dynamic Features and Influencing Factors. Front. Psychol. 13:1–12. https://doi.org/10.3389/fpsyg.2022.912970

Zheng RK, Li F, Jiang L, Li SM (2022) Analysis of the Current Situation and Driving Factors of College Students’ Autonomous Learning in the Network Environment. Front. Humanit. Soc. Sci. 2(7):44–50

Zhu XM, Gong Q, Wang Q, He YJ, Sun ZQ, Liu FF (2023) Analysis of Students’ Online Learning Engagement during the COVID-19 Pandemic: A Case Study of a SPOC-Based Geography Education Undergraduate Course. Sustainability 15(5):4544. https://doi.org/10.3390/su15054544

Zhu Y, Geng G, Disney L, Pan Z (2023) Changes in University Students’ Behavioral Intention to Learn Online throughout the COVID-19: Insights for Online Teaching in the Post-pandemic Era. Educ. Inf. Technol. 28:3859–3892. https://doi.org/10.1007/s10639-022-11320-0

Zhu YH, Xu YY, Wang XY, Yan SY, Zhao L (2022) The Selectivity and Suitability of Online Learning Resources as Predictor of the Effects of Self-efficacy on Teacher Satisfaction During the COVID-19 Lockdown. Front. Psychol. 13:1–11. https://doi.org/10.3389/fpsyg.2022.765832

Zimmerman WA, Kulikowich JM (2016) Online Learning Self-efficacy in Students with and Without Online Learning Experience. Am. J. Distance Educ. 30(3):180–190. https://doi.org/10.1080/08923647.2016.1193801

Acknowledgements

The corresponding author received funding from the National Social Science Foundation of China Education General Program (BGA210054) for this work.

Author information

Authors and Affiliations

School of Foreign Studies, Wenzhou University, Wenzhou, China

Yujie Su, Xiaoshu Xu & Xinyu Xu

Faculty of Languages and Translation, Macao Polytechnic University, Macao, China

Yunfeng Zhang

Faculty of Applied Sciences, Macao Polytechnic University, Macao, China

Shanshan Hao

Contributions

XSX was responsible for conceptualization and, alongside YFZ, for data curation. YJS and XYX conducted the formal analysis. Funding acquisition was managed by YFZ. The investigation was carried out by YJS and YFZ. Methodology development was a collaboration between YJS and XSX. XSX and YJS also managed project administration, with additional resource management by SSH and XYX. YJS handled the software aspect, and supervision was overseen by XSX. SSH and XYX were responsible for validation, and visualization was managed by YJS. The original draft was written by XSX and YJS, while the review and editing were conducted by YFZ and SSH.

Corresponding author

Correspondence to Xiaoshu Xu .

Ethics declarations

Competing interests

The authors declare no competing interests.

Ethical approval

The questionnaire and methodology for this study were approved by the Human Research Ethics Committee of Wenzhou University (Ethics approval number WGY202302).

Informed consent

Informed consent was diligently obtained from all participants involved in the study, ensuring adherence to ethical standards. Detailed consent forms, outlining the study’s scope and participants’ rights, were signed by participants. Documentation of this process is well-maintained and can be provided upon request during peer review or post-publication.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

Reprints and permissions

About this article

Cite this article

Su, Y., Xu, X., Zhang, Y. et al. Looking back to move forward: comparison of instructors’ and undergraduates’ retrospection on the effectiveness of online learning using the nine-outcome influencing factors. Humanit Soc Sci Commun 11, 594 (2024). https://doi.org/10.1057/s41599-024-03097-z

Received: 24 October 2023

Accepted: 17 April 2024

Published: 09 May 2024

DOI: https://doi.org/10.1057/s41599-024-03097-z



FAQs: How Online Courses Work

  • The Benefits of Online Education
  • How Online Education Works
  • The Effectiveness of Online Education
  • Choosing Online Degree Programs
  • Technical Skills and Considerations
  • Paying for Online Degree Programs

Recent reports detail just how quickly colleges adopted online learning. According to the Babson Survey Research Group, university and student participation in online education is at an all-time high. Even some of the largest and most prestigious universities now offer online degrees. Despite its growing popularity, online education is still relatively new, and many students and academics are completely unacquainted with it. Questions and concerns are normal. This page addresses some of the most frequently asked questions about online degree programs. All answers are thoroughly researched; we include links to relevant studies whenever possible.

Question: What are some of the advantages of attending college online?

[Answer] Online education is known for its flexibility, but studies have identified several additional benefits of attending class online. Among them:

  • Communication : Many students are more comfortable engaging in meaningful discussions online than in a classroom. These students might have hearing or speech impairments; speak different languages; have severe social anxiety; or simply need more time to organize their thoughts.
  • Personalized learning : Not all students learn the same way. Web-based learning allows instructors to deliver the same content using different media, like videos or simulations, personalizing learning. Online classes providing round-the-clock access to materials and lectures also let students study when they feel most focused and engaged.
  • Accessibility : Online programs transcend time, geographic, and other barriers to higher education. This can be helpful for those who work full-time, live in remote regions, or serve in the military.
  • Adaptability : Learning management systems that integrate text-to-speech and other adaptive technologies support learners with physical, behavioral, and learning challenges.
  • Efficiency : Studies show online students tend to achieve the same learning results in half the time as classroom-based students.
  • Engagement : Online instructors can use games, social media, virtual badges, and other engaging technologies to motivate students and enhance learning.

Question: How does online education work on a day-to-day basis?

[Answer] Instructional methods, course requirements, and learning technologies can vary significantly from one online program to the next, but the vast majority use a learning management system (LMS) to deliver lectures and materials, monitor student progress, assess comprehension, and accept student work. LMS providers design these platforms to accommodate a multitude of instructor needs and preferences. While some courses deliver live lectures using video conferencing tools, others allow students to download pre-recorded lectures and use message boards to discuss topics. Instructors may also incorporate simulations, games, and other engagement-boosters to enhance learning. Students should research individual programs to find out how and when they would report to class; how lectures and materials are delivered; how and to what extent they would collaborate with faculty and peers; and other important details. We address many of these instructional methods and LMS capabilities elsewhere in this guide.

Question: Can you really earn online degrees in hands-on fields like nursing and engineering?

[Answer] Yes and no. While schools do offer online and hybrid programs in these disciplines, students must usually meet additional face-to-face training requirements. Schools usually establish these requirements with convenience in mind. Students in fields like nursing, teaching, and social work, for example, may be required to complete supervised fieldwork or clinical placements, but can do so through local schools, hospitals, clinics, and other organizations. Likewise, students enrolled in the University of Virginia’s Engineers PRODUCED in Virginia program can complete all their engineering classes online in a live format while gaining practical experience through strategic internships with employers across the state. Some online programs do require students to complete on-campus training, seminars, and assessments, but visits are often designed to minimize cost and travel. Students should consider these requirements when researching programs.

The Effectiveness and Credibility of Online Education

Question: Is online education as effective as face-to-face instruction?

[Answer] Online education may seem relatively new, but years of research suggest it can be just as effective as traditional coursework, and often more so. According to a U.S. Department of Education analysis of more than 1,000 learning studies, online students tend to outperform classroom-based students across most disciplines and demographics. Another major review published the same year found that online students had the advantage 70 percent of the time, a gap the authors projected would only widen as programs and technologies evolve.

While these reports list several plausible reasons students might learn more effectively online—that they have more control over their studies, or more opportunities for reflection—medium is only one of many factors that influence outcomes. Successful online students tend to be organized self-starters who can complete their work without reporting to a traditional classroom. Learning styles and preferences matter, too. Prospective students should research programs carefully to identify which ones offer the best chance of success.

Question: Do employers accept online degrees?

[Answer] All new learning innovations are met with some degree of scrutiny, but skepticism subsides as methods become more mainstream. Such is the case for online learning. Studies indicate employers who are familiar with online degrees tend to view them more favorably, and more employers are acquainted with them than ever before. The majority of colleges now offer online degrees, including most public, not-for-profit, and Ivy League universities. Online learning is also increasingly prevalent in the workplace as more companies invest in web-based employee training and development programs.

Question: Is online education more conducive to cheating?

[Answer] The concern that online students cheat more than traditional students is perhaps misplaced. When researchers at Marshall University conducted a study to measure the prevalence of cheating in online and classroom-based courses, they concluded, “somewhat surprisingly, the results showed higher rates of academic dishonesty in live courses.” The authors suggest the social familiarity of students in a classroom setting may lessen their sense of moral obligation.

Another reason cheating is less common in online programs is that colleges have adopted strict anti-cheating protocols and technologies. According to a report published by the Online Learning Consortium, some online courses require students to report to proctored testing facilities to complete exams, though virtual proctoring using shared screens and webcams is increasingly popular. Sophisticated identity verification tools like biometric analysis and facial recognition software are another way these schools combat cheating. Instructors often implement their own anti-cheating measures, too, like running research papers through plagiarism-detection programs or incorporating challenge-based questions in quizzes and exams. When combined, these measures can reduce academic dishonesty significantly.

In an interview with OnlineEducation.com, Dr. Susan Aldridge, president of Drexel University Online, discussed the overall approach many universities take to curbing cheating, an approach that includes both technical and policy-based prevention strategies.

“Like most online higher education providers, Drexel University employs a three-pronged approach to maintaining academic integrity among its virtual students,” said Dr. Aldridge. “We create solid barriers to cheating, while also making every effort to identify and sanction it as it occurs or directly after the fact. At the same time, we foster a principled community of inquiry that, in turn, motivates students to act in ethical ways. So with this triad in mind, we have implemented more than a few strategies and systems to ensure academic integrity.”

Question: How do I know if online education is right for me?

[Answer] Choosing the right degree program takes time and careful research no matter how one intends to study. Learning styles, goals, and programs always vary, but students considering online colleges must also consider technical skills, the ability to self-motivate, and other factors specific to the medium. A number of colleges and universities have developed assessments to help prospective students determine whether they are prepared for online learning. You can access a compilation of assessments from many different colleges online. Online course demos and trials can also be helpful, particularly if they are offered by schools of interest. Students can call online colleges and ask to speak with an admissions representative who can clarify additional requirements and expectations.

Question: How do I know if an online degree program is credible?

[Answer] As with traditional colleges, some online schools are considered more credible than others. Reputation, post-graduation employment statistics, and enrollment numbers are not always reliable indicators of quality, which is why many experts advise students to look for accredited schools. In order for an online college to be accredited, a third-party organization must review its practices, finances, instructors, and other important criteria and certify that they meet certain quality standards. The certifying organization matters, too, since accreditation is only as reliable as the agency that grants it. Students should confirm online programs’ accrediting agencies are recognized by the U.S. Department of Education and/or the Council on Higher Education Accreditation before submitting their applications.

Online Student Support Services

Question: Do online schools offer the same student support services as traditional colleges?

[Answer] Colleges and universities tend to offer online students many of the same support services as campus-based students, though they may be administered differently. Instead of going to a campus library, online students may log in to virtual libraries stocked with digital materials, or work with research librarians by phone or email. Tutoring, academic advising, and career services might rely on video conferencing software, virtual meeting rooms, and other collaborative technologies. Some online colleges offer non-academic student support services as well. For example, Western Governors University’s Student Assistance Program provides online students with 24/7 access to personal counseling, legal advice, and financial consulting services. A list of student support services is usually readily available on online colleges’ websites.

Question: What technical skills do online students need?

[Answer] Online learning platforms are typically designed to be as user-friendly as possible: intuitive controls, clear instructions, and tutorials guide students through new tasks. However, students still need basic computer skills to access and navigate these programs. These skills include: using a keyboard and a mouse; running computer programs; using the Internet; sending and receiving email; using word processing programs; and using forums and other collaborative tools. Most online programs publish such requirements on their websites. If not, an admissions adviser can help.

Students who do not meet a program’s basic technical skills requirements are not without recourse. Online colleges frequently offer classes and simulations that help students establish computer literacy before beginning their studies. Microsoft’s online digital literacy curriculum is one free resource.

Question: What technology requirements must online students meet? What if they do not meet them?

[Answer] Technical requirements vary from one online degree program to the next, but most students need, at minimum, high-speed Internet access, a keyboard, and a computer capable of running the specified online learning software. Courses using identity verification tools and voice- or web-conferencing software require webcams and microphones. Scanners and printers help, too. While online schools increasingly offer mobile apps for learning on the go, smartphones and tablets alone may not be sufficient.

Most online colleges list minimum technology requirements on their websites. Students who do not meet these requirements should contact schools directly to inquire about programs that can help. Some online schools lend or provide laptops, netbooks, or tablets for little to no cost, though students must generally return them right away if they withdraw from courses. Other colleges may offer grants and scholarships to help cover technical costs for students who qualify.

Question: Are online students eligible for financial aid?

[Answer] Students enrolled in online degree programs are eligible for many of the same loans, scholarships, and grants as traditional campus-based students. They are also free to apply for federal and state financial aid so long as they:

  • Attend online programs accredited by an organization recognized by either the U.S. Department of Education or the Council on Higher Education Accreditation.
  • Attend online schools that are authorized to operate in their state of residence.
  • Meet all additional application requirements, including those related to legal status, citizenship, age, and educational attainment.
  • Submit applications and all supporting materials by their deadlines.

Students can visit the U.S. Department of Education’s Federal Student Aid website to review all eligibility requirements and deadlines, and to submit their Free Application for Federal Student Aid (FAFSA). Note that many states, colleges, and organizations use the FAFSA to determine students’ eligibility for other types of aid, including grants, scholarships, and loans. Students can contact prospective schools directly to speak with financial aid advisors.

Disclaimer: Financial aid is never guaranteed, even among eligible online students. Contact colleges and universities directly to clarify their policies.

Question: Can students use military education benefits to pay for online education?

[Answer] Active-duty and veteran military service members can typically apply their military education benefits toward an online degree, though they must still meet many of the same eligibility requirements detailed in the previous answer. Many state-level benefits have additional residency requirements. Most colleges have whole offices dedicated to helping these students understand and use their benefits effectively. They may also clarify applicable aid programs and requirements on their official websites. When in doubt, students should contact schools directly or visit the nearest Department of Veterans Affairs office to learn more about their options.

" Educational Benefits of Online Learning ," Blackboard Learning, Presented by California Polytechnic State University, San Louis Obispo

" Four Proven Advantages of Online Learning (That are NOT the Cost, Accessibility or Flexibility) , Coursera Blog, Coursera

" Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies ," U.S. Department of Education

" Twenty years of research on the academic performance differences between traditional and distance learning ," M. Sachar, Y. Neumann, Journal of Online Learning and Teaching, Merlot

" The Market Value of Online Degrees as a Credible Credential ," Calvin D. Foggle, Devonda Elliott, accessed via New York University

" Cheating in the Digital Age: Do Students Cheat More in Online Courses ?" George Watson, James Sottile, accessed via the University of Georga

" Student Identity Verification Tools and Live Proctoring in Accordance With Regulations to Combat Academic Dishonesty in Distance Education ," Vincent Termini, Franklin Hayes, Online Learning Consortium

" Student Readiness for Online Learning ," G. Hanley, Merlot

" Recognized Accrediting Organizations ," Council for Higher Education Accreditation  

" Digital Literacy ," Microsoft, Inc.  

" Free Application for Federal Student Aid ," Office of Federal Student Aid, U.S. Department of Education

Online Education Guide

  • Expert Advice for Online Students
  • Instructional Design in Online Programs
  • Learning Management Systems
  • Online Student Trends and Success Factors
  • Online Teaching Methods
  • Student Guide to Understanding and Avoiding Plagiarism
  • Student Services for Online Learners

Distance learning survey for students: Tips & examples

The COVID-19 pandemic changed learning in unprecedented ways. Students not only had to move to online learning but also keep a social distance from their friends and family. It was quite challenging for some to adjust to the ‘new normal,’ and they missed the in-person interaction with their teachers. For others, it simply meant spending more time with their parents. A student interest survey helps customize teaching methods and curriculum to make learning more engaging and relevant to students’ lives.

Schools need to know how students feel about distance education and learn more about their experiences. To collect data, they can send out a survey on remote learning for students. Once they have the results, the management team can know what students like in the existing setup and what they would like to change.

A classroom response system lets students answer multiple-choice questions and engage in real-time discussions instantly.

Here are examples of distance learning survey questions you should ask students to collect their feedback.

Examples of distance learning survey questions for students

1. How do you feel overall about distance education?

  • Below Average

This question collects responses about students’ overall experience of online education. Schools can use this data to decide whether they should continue teaching online or move to in-person learning.

2. Do you have access to a device for learning online?

  • Yes, but it doesn’t work well
  • No, I share with others

Students should have uninterrupted access to a device for online learning. Find out if they face any challenges with the device’s hardware quality, or if they share the device with others in the house and can’t access it when they need it.

3. What device do you use for distance learning?

Know whether students use a laptop, desktop, smartphone, or tablet for distance learning. A laptop or desktop would be an ideal choice for its screen size and quality. You can use a multiple-choice question type in your questionnaire for distance education students.

4. How much time do you spend on distance education each day, on average?

Know how much time students spend taking an online course. Analyze whether they are overspending time and find out the reasons behind it. Students should allocate some time to play and exercise while staying at home to take care of their health. Answers to this question also reveal whether they spend time on other activities.

5. How effective has remote learning been for you?

  • Not at all effective
  • Slightly effective
  • Moderately effective
  • Very effective
  • Extremely effective

Depending on their personality, students may prefer learning in the classroom with fellow students or alone at home. The classroom offers a more lively and interactive environment, whereas it is relatively calm at home. You can use this question to find out whether remote learning is working for students.

6. How helpful has your [School or University] been in offering you the resources to learn from home?

  • Not at all helpful
  • Slightly helpful
  • Moderately helpful
  • Very helpful
  • Extremely helpful

School management teams need to offer full support to both teachers and students to make distance education comfortable and effective. They should provide support in terms of technological infrastructure and process framework. Given the pandemic, schools should allow more flexibility and adopt less strict policies.

7. How stressful is distance learning for you during the COVID-19 pandemic?

Studying during a pandemic can be quite stressful, especially if you or someone in your family is not doing well. Measure students’ stress levels and identify ways to reduce them. For instance, you can organize an online dance party or a Lego game. The responses to this question can be crucial in deciding the future course of distance learning.

8. How well could you manage time while learning remotely? (Consider 5 being extremely well and 1 being not at all)

  • Academic schedule

Staying at home all the time and balancing multiple things can be stressful for many people. It requires students to have good time-management skills and self-discipline. Students can rate their experience on a scale of 1-5 and share it with the school authorities. Use a multiple-choice matrix question type for such questions in your distance learning questionnaire for students.

9. Do you enjoy learning remotely?

  • Yes, absolutely
  • Yes, but I would like to change a few things
  • No, there are quite a few challenges
  • No, not at all

Get a high-level view of whether students enjoy learning from home or are doing it only because they are forced to. Gain insights on how you can improve distance education and make it interesting for them.

10. How helpful are your teachers while studying online?

Distance education lacks proximity with teachers and has its own set of unique challenges. Some students may find it difficult to learn a subject and take more time to understand. This question measures the extent to which students find their teachers helpful.

You can also use a ready-made survey template to save time. The sample questionnaire for students can be easily customized as per your requirements.

Other important questions of distance learning survey for students

  • How peaceful is the environment at home while learning?
  • Are you satisfied with the technology and software you are using for online learning?
  • How important is face-to-face communication for you while learning remotely?
  • How often do you talk to your [School/University] classmates?
  • How often do you have a 1-1 discussion with your teachers?

How to create a survey?

The intent behind creating a remote learning questionnaire for students should be to learn how schools and teachers can better support them. Use online survey software like ours to create a survey, or use a template to get started. Distribute the survey through email, a mobile app, a website, or a QR code.

Once you get the survey results, generate reports and share them with your team. You can also download them in formats like .pdf, .doc, and .xls. To analyze data from multiple sources, you can integrate the survey software with third-party apps.
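As an illustration of the kind of summary report described above, here is a minimal Python sketch (hypothetical data, not QuestionPro’s API) that tallies responses to the 5-point “How effective has remote learning been for you?” question and computes the share of positive answers:

```python
from collections import Counter

# The 5-point scale from the effectiveness question, in order.
SCALE = [
    "Not at all effective",
    "Slightly effective",
    "Moderately effective",
    "Very effective",
    "Extremely effective",
]

# Hypothetical student responses for illustration only.
responses = [
    "Moderately effective", "Very effective", "Not at all effective",
    "Very effective", "Extremely effective", "Slightly effective",
    "Moderately effective", "Very effective",
]

counts = Counter(responses)

# Print the distribution in scale order, then the share rating 4 or 5.
for label in SCALE:
    print(f"{label}: {counts.get(label, 0)}")

positive = sum(counts[label] for label in SCALE[3:])  # "Very" + "Extremely"
print(f"Positive share: {positive / len(responses):.0%}")
```

The same tally could of course be produced by a spreadsheet or the survey tool’s built-in reports; the point is simply that a 5-point scale reduces to a count per label plus a top-two-box percentage.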

If you need any help with designing a survey, customizing the look and feel, or deriving insights from it, get in touch with us. We’d be happy to help.


Purdue Online Writing Lab Purdue OWL® College of Liberal Arts

Welcome to the Purdue Online Writing Lab

This page is brought to you by the OWL at Purdue University. When printing this page, you must include the entire legal notice.

Copyright ©1995-2018 by The Writing Lab & The OWL at Purdue and Purdue University. All rights reserved. This material may not be published, reproduced, broadcast, rewritten, or redistributed without permission. Use of this site constitutes acceptance of our terms and conditions of fair use.

The Online Writing Lab at Purdue University houses writing resources and instructional material, and we provide these as a free service of the Writing Lab at Purdue. Students, members of the community, and users worldwide will find information to assist with many writing projects. Teachers and trainers may use this material for in-class and out-of-class instruction.

The Purdue On-Campus Writing Lab and Purdue Online Writing Lab assist clients in their development as writers—no matter what their skill level—with on-campus consultations, online participation, and community engagement. The Purdue Writing Lab serves the Purdue, West Lafayette, campus and coordinates with local literacy initiatives. The Purdue OWL offers global support through online reference materials and services.

A Message From the Assistant Director of Content Development 

The Purdue OWL® is committed to supporting students, instructors, and writers by offering a wide range of resources that are developed and revised with them in mind. To do this, the OWL team is always exploring possibilities for a better design, allowing accessibility and user experience to guide our process. As the OWL undergoes some changes, we welcome your feedback and suggestions by email at any time.

Please don't hesitate to contact us via our contact page if you have any questions or comments.

All the best,

Kratom: unsafe and ineffective.

Users swear by kratom for mood enhancement and fatigue reduction, but safety issues and questions about its effectiveness abound.

If you read health news or visit vitamin stores, you may have heard about kratom, a supplement that is sold as an energy booster, mood enhancer, pain reliever and antidote for opioid withdrawal. However, the truth about kratom is more complicated, and the safety problems related to its use are concerning.

Kratom is an herbal extract that comes from the leaves of an evergreen tree (Mitragyna speciosa) grown in Southeast Asia. Kratom leaves can be chewed, and dry kratom can be swallowed or brewed. Kratom extract can be used to make a liquid product. The liquid form is often marketed as a treatment for muscle pain, or to suppress appetite and stop cramps and diarrhea. Kratom is also sold as a treatment for panic attacks.

Kratom is believed to act on opioid receptors. At low doses, kratom acts as a stimulant, making users feel more energetic. At higher doses, it reduces pain and may bring on euphoria. At very high doses, it acts as a sedative, making users quiet and perhaps sleepy. Some people who practice Asian traditional medicine consider kratom to be a substitute for opium.

Some people take kratom to avoid the symptoms of opioid withdrawal and because kratom may be bought more easily than prescription drugs.

Kratom is also used at music festivals and in other recreational settings. People who use kratom for relaxation report that because it is plant-based, it is natural and safe. However, the amount of active ingredient in kratom plants can vary greatly, making it difficult to gauge the effect of a given dose. Depending on what is in the plant and the health of the user, taking kratom may be very dangerous. Claims about the benefits of kratom can't be rated because reliable evidence is lacking.

Side effects and safety concerns

Although people who take kratom believe in its value, researchers who have studied kratom think its side effects and safety problems more than offset any potential benefits. Poison control centers in the United States received about 1,800 reports involving use of kratom from 2011 through 2017, including reports of death. About half of these exposures resulted in serious negative outcomes such as seizures and high blood pressure. Five of the seven infants who were reported to have been exposed to kratom went through withdrawal. Kratom has been classified as possibly unsafe when taken orally.

Kratom has a number of known side effects, including:

  • Weight loss
  • Chills, nausea and vomiting
  • Changes in urine and constipation
  • Liver damage
  • Muscle pain

Kratom also affects the mind and nervous system:

  • Hallucinations and delusion
  • Depression and delusion
  • Breathing suppression
  • Seizure, coma and death

Kratom takes effect after five to 10 minutes, and its effects last two to five hours. The effects of kratom become stronger as the quantity taken increases. In animals, kratom appears to be more potent than morphine. Exposure to kratom has been reported in an infant who was breastfed by a mother taking kratom.

Many of the problems that occur with pain medications happen when these drugs are used at high doses or over a long period of time. It's not known exactly what level of kratom is toxic in people, but as with pain medications and recreational drugs, it is possible to overdose on kratom.

Research shows little promise

At one time, some researchers believed that kratom might be a safe alternative to opioids and other prescription pain medications. However, studies on the effects of kratom have identified many safety concerns and no clear benefits.

Kratom has been reported to cause abnormal brain function when taken with prescription medicines. When this happens, you may experience a severe headache, lose your ability to communicate or become confused.

In a study testing kratom as a treatment for symptoms of opioid withdrawal, people who took kratom for more than six months reported withdrawal symptoms similar to those that occur after opioid use. Also, people who use kratom may begin craving it and require treatments given for opioid addiction, such as naloxone (Narcan) and buprenorphine (Buprenex).

Kratom also adversely affects infant development. When kratom is used during pregnancy, the baby may be born with symptoms of withdrawal that require treatment.

In addition, substances that are made from kratom may be contaminated with salmonella bacteria. As of April 2018, more than 130 people in 38 states became ill with Salmonella after taking kratom. Salmonella poisoning may be fatal, and the U.S. Food and Drug Administration has linked more than 35 deaths to Salmonella-tainted kratom. Salmonella contamination has no obvious signs, so the best way to avoid becoming ill is to avoid products that may contain it.

Kratom is not currently regulated in the United States, and federal agencies are taking action to combat false claims about kratom. In the meantime, your safest option is to work with your doctor to find other treatment options.

  • Chien GCC, et al. Is kratom the new "legal high" on the block?: The case of an emerging opioid receptor agonist with substance abuse potential. Pain Physician. 2017;20:E195.
  • Feng L, et al. New psychoactive substances of natural origin: A brief review. Journal of Food and Drug Analysis. 2017;25:461.
  • Griffin III OH, et al. Do you get what you paid for? An examination of products advertised as kratom. Journal of Psychoactive Drugs. 2016;48:330.
  • Drug Enforcement Administration. Kratom (Mitragyna speciosa korth). https://www.deadiversion.usdoj.gov/drug_chem_info/kratom.pdf. Accessed April 17, 2018.
  • Yusoff NHM, et al. Opioid receptors mediate the acquisition, but not the expression of mitragynine-induced conditioned place preference in rats. Behavioural Brain Research. 2017;332:1.
  • Diep J, et al. Kratom, an emerging drug of abuse: A case report of overdose and management of withdrawal. Anesthesia & Analgesia Case Reports. In press. Accessed May 2, 2018.
  • Swogger MT, et al. Experiences of kratom users: A qualitative analysis. Journal of Psychoactive Drugs. 2015;47:360.
  • Fox J, et al. Drugs of abuse and novel psychoactive substances at outdoor music festivals in Colorado. Substance Use & Misuse. In press. Accessed May 2, 2018.
  • Kowalczuk AP, et al. Comprehensive methodology for identification of kratom in police laboratories. Forensic Science International. 2013;233:238.
  • Fluyau D, et al. Biochemical benefits, diagnosis, and clinical risks of kratom. Frontiers in Psychiatry. 2017;8:62.
  • Castillo A, et al. Posterior reversible leukoencephalopathy syndrome after kratom ingestion. Baylor University Medical Center Proceedings. 2017;30:355.
  • Grundmann O. Patterns of kratom use and health impact in the US — Results from an online survey. Drug and Alcohol Dependence. 2017;176:63.
  • Drago JD, et al. The harm in kratom. The Oncologist. 2017;22:1010.
  • Pizarro-Osilla C. Introducing…kratom. In press. Accessed May 2, 2018.
  • Kruegel AC, et al. The medicinal chemistry and neuropharmacology of kratom: A preliminary discussion of a promising medicinal plant and analysis of its potential for abuse. Neuropharmacology. In press. Accessed May 2, 2018.
  • Ismail I, et al. Kratom and future treatment for the opioid addiction and chronic pain: Periculo beneficium? Current Drug Targets. In press. Accessed May 2, 2018.
  • Singh D, et al. Kratom (Mitragyna speciosa) dependence, withdrawal symptoms and cravings in regular users. Drug and Alcohol Dependence. 2014;139:132.
  • Swogger MT, et al. Kratom use and mental health: A systematic review. Drug and Alcohol Dependence. 2018;183:134.
  • Food and Drug Administration. FDA investigates multistate outbreak of salmonella infections linked to products reported to contain kratom. https://www.fda.gov/Food/RecallsOutbreaksEmergencies/Outbreaks/ucm597265.htm. Accessed April 17, 2018.
  • Food and Drug Administration. Statement from FDA Commissioner Scott Gottlieb, M.D., on FDA advisory about deadly risks associated with kratom. https://www.fda.gov/NewsEvents/Newsroom/PressAnnouncements/ucm584970.htm. Accessed April 17, 2018.
  • Voelker R. Crackdown on false claims to ease opioid withdrawal symptoms. JAMA. 2018;319:857.
  • Post S. Kratom exposures reported to United States poison control centers: 2011-2017. Clinical Toxicology. Published online February 20, 2019.
  • Drug Enforcement Administration. Kratom—drug fact sheet. https://www.dea.gov/sites/default/files/2020-06/Kratom-2020.pdf. Accessed January 26, 2022.
  • Therapeutic Research Center. Kratom. https://naturalmedicines.therapeuticresearch.com/databases/food,-herbs-supplements/professional.aspx?productid=1513. Accessed January 26, 2022.
  • Umbehr G, et al. Acute liver injury following short-term use of the herbal supplement kratom. JAAPA. 2022;35:39.


  1. 45 Survey Questions to Understand Student Engagement in Online Learning

    4. When you are not in school, how often do you talk about ideas from your classes? 5. Overall, how interested are you in your classes? 6. What are the most engaging activities that happen in this class? 7. Which aspects of class have you found least engaging? 8.

  2. PDF A Systematic Review of the Research Topics in Online Learning During

    Table 1 summarizes the 12 topics in online learning research in the current research and compares it to Martin et al.'s (2020) study, as shown in Figure 1. The top research theme in our study was engagement (22.5%), followed by course design and development (12.6%) and course technology (11.0%).

  3. Top 6 Questions People Ask About Online Learning

    The answer is an emphatic "no." Most online programs appear on your transcript the same as on-campus programs would. You may also wonder if an online program will impact your plans for a higher degree later. As long as your degree is from an accredited institution, it won't harm your chances of acceptance. 4.

  4. Frontiers

    BackgroundThe effectiveness of online learning in higher education during the COVID-19 pandemic period is a debated topic but a systematic review on this topic is absent.MethodsThe present study implemented a systematic review of 25 selected articles to comprehensively evaluate online learning effectiveness during the pandemic period and identify factors that influence such effectiveness ...

  5. A systematic review of research on online teaching and learning from

    This review enabled us to identify the online learning research themes examined from 2009 to 2018. In the section below, we review the most studied research themes, engagement and learner characteristics along with implications, limitations, and directions for future research. 5.1. Most studied research themes.

  6. Integrating students' perspectives about online learning: a hierarchy

    This research study has two research questions. The first research question is: What are the significant factors in creating a high-quality online learning experience from students' perspectives? That is important to know because it should have a significant effect on the instructor's design of online classes.

  7. Online education in the post-COVID era

    The COVID-19 pandemic has forced the world to engage in the ubiquitous use of virtual learning. And while online and distance learning has been used before to maintain continuity in education ...

  8. (PDF) A Systematic Review of the Research Topics in Online Learning

    New research topics included parents, technology acceptance or adoption of online learning, and learners' and instructors' perceptions of online learning. The Percentage of Research Topics

  9. Online Learning: A Panacea in the Time of COVID-19 Crisis

    Research Methodology. The study is descriptive and tries to understand the importance of online learning in the period of a crisis and pandemics such as the Covid-19. The problems associated with online learning and possible solutions were also identified based on previous studies.

  10. Online Teaching in K-12 Education in the United States: A Systematic

    A wide variety of terminology is used in varied and nuanced ways in educational literature to describe student learning mediated by technology, including terms such as virtual learning, distance learning, remote learning, e-learning, web-based learning, and online learning (e.g., Moore, Dickson-Deane, & Galyen, 2011; Singh & Thurman, 2019).For example, in a systematic review of the literature ...

  11. Insights Into Students' Experiences and Perceptions of Remote Learning

    This spring, students across the globe transitioned from in-person classes to remote learning as a result of the COVID-19 pandemic. This unprecedented change to undergraduate education saw institutions adopting multiple online teaching modalities and instructional platforms. We sought to understand students' experiences with and perspectives on those methods of remote instruction in order to ...

  12. Assessing the Impact of Online-Learning Effectiveness and Benefits in

    Online learning is one of the educational solutions for students during the COVID-19 pandemic. Worldwide, most universities have shifted much of their learning frameworks to an online learning model to limit physical interaction between people and slow the spread of COVID-19. The effectiveness of online learning depends on many factors, including student and instructor self-efficacy, attitudes ...

  13. (Pdf) Research on Online Learning

    The CoI model has formed the basis for a good deal of research on online learning. Most of this research. has focused on one of the three pr esences, social presence being the most frequently ...

  14. Online and face‐to‐face learning: Evidence from students' performance

    1.1. Related literature. Online learning is a form of distance education which mainly involves internet‐based education where courses are offered synchronously (i.e. live sessions online) and/or asynchronously (i.e. students access course materials online in their own time, which is associated with the more traditional distance education).

  15. The effects of online education on academic success: A meta ...

    The purpose of this study is to analyze the effect of online education, which has been extensively used on student achievement since the beginning of the pandemic. In line with this purpose, a meta-analysis of the related studies focusing on the effect of online education on students' academic achievement in several countries between the years 2010 and 2021 was carried out. Furthermore, this ...

  16. The qualitative evidence behind the factors impacting online learning

    We included peer-reviewed journal articles, dissertations and theses, and book chapters that (a) were published within the last 12 years (January 2007-August 2019) to cover the most recent and up-to-date insights; (b) used the CoI framework; (c) covered students' fully online learning experiences in higher education; (d) provided qualitative evidence including verbatim evidence from the ...

  17. Students' online learning challenges during the pandemic and how they

    To address the research questions, we used both quantitative and qualitative analyses. ... challenges during the pandemic somehow vary from the typical challenges that students experience in a pre-pandemic online learning environment. One possible explanation for this result is that restriction in mobility may have aggravated this challenge ...

  18. Looking back to move forward: comparison of instructors' and ...

    The research takes place at a comprehensive university in China, with a sample of 46 Year 1 students and 18 experienced teachers. Their reflections on the effectiveness of online learning were ...

  19. 206 questions with answers in ONLINE LEARNING

    Online Learning - Science topic. Explore the latest questions and answers in Online Learning, and find Online Learning experts. Questions (206) Publications (338,281) Questions related to Online ...

  20. Online Learning: Challenges and Solutions for Learners and Teachers

    The article presents some challenges faced by teachers and learners, supplemented with the recommendations to remove them. JEL Code: A20. The COVID-19 pandemic has led to an expansion in the demand for online teaching and learning across the globe. Online teaching and learning is attracting many students for enhanced learning experiences.

  21. Frequently Asked Questions About Online Education

    Recent reports detail just how quickly colleges adopted online learning. According to the Babson Survey Research Group, university and student participation in online education is at an all-time high. Even some of the largest and most prestigious universities now offer online degrees. Despite its growing popularity, online education is still ...

  22. Distance learning survey for students

    Distance education lacks proximity with teachers and has its own set of unique challenges. Some students may find it difficult to learn a subject and take more time to understand. This question measures the extent to which students find their teachers helpful. You can also use a ready-made survey template to save time.

  23. What is Online Learning Today?

    And 66% called it "extremely important.". Flexibility and convenience were way ahead of the other big reasons, like: work/life balance - 66%. cost savings - 37%. preference for virtual learning - 24.4%. Even taken as separate topics, flexibility and convenience were huge: Flexibility - 58%. Convenience - 31.6%.

  24. 10 Questions To Ask Students About Online Learning

    Get To Know Your Online Learners: 10 Questions For eLearning Professionals. Without knowing what your online learners need, what they expect from the eLearning course, and, most importantly, who they are as individuals, you simply cannot create a personalized eLearning experience.You must learn as much as possible about their background, goals, and preferences to make the eLearning course ...

  25. Students' experience of online learning during the COVID‐19 pandemic: A

    Even though online learning research has been advancing in uncovering student experiences in various settings (i.e., tertiary, adult, and professional education), very little progress has been achieved in understanding the experience of the K‐12 student population, especially when narrowed down to different school‐year segments (i.e ...

  26. Parental perspectives on the management of online learning and school

    Method. Design. The researcher's aims and question aligned with qualitative methodology and a phenomenological approach. Phenomenological research focuses on participant experiences through analysis of written or spoken words ) enabling a rich engagement with the participants' lived experiences and how they view and understand their 'lifeworld' (Banister et al. Citation 2012; Spinelli ...

  27. <em>British Journal of Educational Psychology</em>

    Learning is not possible without errors. 6: 12.0: Other: 18: 36.0: B. Categories (experimental group 2) f % Errors make you to memorize the wrong information instead of the right one. 29: ... In doing so, new research questions emerged about how learners' adaptive responses to errors can best be supported, or how learners can best be prepared ...

  28. Welcome to the Purdue Online Writing Lab

    Mission. The Purdue On-Campus Writing Lab and Purdue Online Writing Lab assist clients in their development as writers—no matter what their skill level—with on-campus consultations, online participation, and community engagement. The Purdue Writing Lab serves the Purdue, West Lafayette, campus and coordinates with local literacy initiatives.

  29. Kratom: Unsafe and ineffective

    Five of the seven infants who were reported to have been exposed to kratom went through withdrawal. Kratom has been classified as possibly unsafe when taken orally. Kratom has a number of known side effects, including: Weight loss. Dry mouth. Chills, nausea and vomiting. Changes in urine and constipation. Liver damage.