Efficacy of Virtual Asynchronous Didactic Delivery and Case-Based Discussions for Predoctoral Orthodontic Education
James Chen, DDS, PhD, MPH, is an assistant professor in the department of orthodontics at the University of the Pacific, Arthur A. Dugoni School of Dentistry. Conflict of Interest Disclosure: None reported.
Brandon Zegarowski, DDS, is a recent graduate of the University of the Pacific, Arthur A. Dugoni School of Dentistry. He is pursuing a Master of Science in dentistry in the orthodontics and dentofacial orthopedics program at Oregon Health &amp; Science University in Portland. Conflict of Interest Disclosure: None reported.
Mandy Lam, DDS, is a recent graduate of the University of the Pacific, Arthur A. Dugoni School of Dentistry. She is now serving in the U.S. Navy Dental Corps at Naval Station Great Lakes near Chicago. Conflict of Interest Disclosure: None reported.
ABSTRACT
Background: Case-based learning (CBL) has been widely accepted as an integral part of active learning in dental curricula. An all-asynchronous didactic and CBL model for orthodontic education was developed out of necessity during the COVID-19 pandemic. The purpose of this research is to evaluate the efficacy of virtual didactic delivery and CBL discussions for a predoctoral orthodontics course using a pilot study of “coded discussions” and by contrasting student course evaluations with a previous traditional in-person lecture iteration of the same course.
Materials and methods: Using the concept of “discussion coding,” a numerical rubric was applied to individually evaluate each student’s experience in a virtual forum. The data gathered included the quantity and timing of student posts as well as metrics evaluating student engagement via post quality, post type and the discussion-generating characteristics of posts, analyzed by topic across each of the four themed module discussions. Student course evaluations from 2018 were then compared with the evaluations from the 2020 virtual format.
Results: The discussion data showed high levels of student engagement in clinically relevant topics, with the quantity and timeliness of posts exceeding the course minimum requirements. When compared with previous course evaluations of the solely passive learning format, the educational outcomes and student satisfaction of the virtual course appear to be similar.
Conclusion: Based on student feedback, pairing short asynchronous passive learning lectures with in-depth active learning online case discussions offers a method to create immersive learning experiences that are not just a product of modern necessity.
Keywords: Virtual, case-based, discussions, predoctoral, orthodontic education, asynchronous, outcome
__________
Dental education in the U.S. has long been based on passive learning, with didactic material being presented to students in lecture format. Over the past decades, more dental programs have implemented active learning modules to improve the overall educational experience for dental students through problem-based learning (PBL) or case-based learning (CBL).[1] Research has shown that proper implementation of PBL and CBL modules significantly improves the overall educational experience of dental students.[2] Subtle differences exist between PBL and CBL, with the latter requiring some level of knowledge prior to the case discussions.
Case-based learning has been instrumental in medical education for decades. The use of formative didactic learning as a base from which students engage in clinical case problem-solving has been shown to be a good adjunct to traditional lectures.[3] CBL is collaborative; hence, the faculty member is directly involved in the experience and learning. This style of education lends itself well to virtual learning, as both the didactic and case information can be delivered directly to the students. Richards et al. concluded that case-based teaching may encourage dental students to focus on their patients’ cultural background and full complexity when assessing treatment decisions.[4] This would encourage collaboration with other disciplines within the predoctoral curriculum, though it could also strain the limited time a course instructor has to devote to small groups. Historically, CBL has been a strain on faculty time and has required additional resources; however, with the use of educational technology, asynchronous learning can offer flexibility to both faculty and students, leading to positive educational outcomes.[1]
Predoctoral orthodontic education in U.S. dental programs has primarily been based on some combination of passive learning, faculty lectures and clinic rotations. Unlike other dental specialties, orthodontic treatment typically lasts one to three years, which in the context of three- to four-year dental programs offers very few opportunities to apply didactic information to first-hand clinical experiences. While clinic rotations are helpful, they often capture only a snapshot of the whole treatment. Yang et al. described the challenges faced by orthodontic educators, noting that “limited curriculum time and clinical probation is difficult in creating a meaningful learning experience and makes students have rare exposure to a broad range of existing clinical situations.”[5] Yang et al. further described the challenge orthodontic faculty face in attempting to present a large amount of information to students who have little clinical exposure to facilitate closure of the information loop.[5] In their article, Yang et al. showed that the implementation of case-based learning modules was well received and helped improve student understanding of problem lists, diagnosis and treatment planning.[5]
Due to the COVID-19 pandemic, many dental programs had to move most, if not all, of their courses from in-person delivery to some combination of remote learning. At the University of the Pacific, Arthur A. Dugoni School of Dentistry, a rapid shift occurred within the biomedical and clinical didactic curriculum, transitioning to a combination of Zoom and Canvas for certain classes that formerly consisted solely of in-person lectures. With this transition, faculty increased their use of Canvas and in some cases used it to actively interact with students in lieu of in-person meetings.[6] Using a virtual model developed by the University of California, Berkeley, Online Master of Public Health program, the predoctoral orthodontic courses implemented a unique combination of asynchronous passive learning modules and asynchronous active learning through CBL online discussion forums.[7]
Yang et al. have shown the value of CBL modules in predoctoral orthodontic education. The COVID-19 pandemic forced educators to combine previously successful educational formats such as CBL with online educational platforms. The null hypothesis of this pilot study is that, for dental students at the University of the Pacific, Arthur A. Dugoni School of Dentistry, the passive and active learning modules newly developed in the predoctoral orthodontic curriculum do not differ fundamentally from previous in-person passive learning. To test this hypothesis, the following specific aims were constructed: 1) compare student course evaluations between the new asynchronous passive and active learning format and the previous in-person, passive-learning-only format and 2) collect and analyze discussion data to determine the efficacy and quality of virtual asynchronous delivery of the course. To achieve the second aim, a standardized “Discussion Forum Coding Guide” was developed through which every post in the course was manually quantified and qualified.
Methods: Overall Course Evaluation
This project was completed with IRB approval (IRB2021-101). Aggregate student course evaluations for the introductory orthodontic course (OR244) from 2018 at the University of the Pacific, Arthur A. Dugoni School of Dentistry were compiled and compared to the compilation of student course evaluations after conclusion of the OR244 class in 2020 (TABLE 1). A major difference between these years is that in 2018, OR244 was delivered through in-person lectures, while in 2020 the course transitioned to an online format with the new inclusion of asynchronous didactic content delivery and discussion forums on Canvas. In both years, a digital questionnaire was distributed to students who successfully completed the course.
Course Layout
The basic structure of the online class separated the course material into easily digestible weekly modules, with asynchronously recorded lectures of 20 to 40 minutes each for passive learning; case-based discussion modules provided the active learning component and were a critical piece of student engagement. For each discussion section, students were divided into 10 groups (two groups with 16 students and eight groups with 17 students). Of the seven modules of content in the course, only modules 2, 3, 4 and 7 had discussion forums in the format evaluated in this study. Each of these modules lasted one week; within that time, students were given the first three days to post their answers to the case of their choice and four days to post a reply to a classmate’s original post. Students were presented different clinical cases with specific questions that related back to the weekly course content. Each student was asked to choose a case they found interesting, answer the associated questions and then respond to another classmate’s post. The course director was actively engaged in responding to each student’s posts and questions in an environment where right or wrong answers were not as important as the dialogue generated by the discussion posts.
Discussion Assessment
Data were collected after completion of the orthodontics course and are therefore retrospective and cross-sectional in nature. All student identifiers and demographics were removed so that the data were fully deidentified. The quality and quantity of the discussion posts for this new orthodontic didactic delivery system were evaluated by designing a coding system for each discussion post (APPENDIX 1). Each discussion response was coded into a numerical value to allow for statistical analysis of each module (discrete numerical values ranging from 0–4 depending on the criteria). Seven specific categories to measure the quantity, timing and quality of each response were developed. These categories were assessed by two research graders and entered into Microsoft Excel documents. The researchers spent three hours prior to collecting data evaluating test posts together as a way of standardizing the “Discussion Forum Coding Guide” metrics. The first two categories involved the quantity and timing of student posts in the discussion forum. Both were based on the instructions given to students in the course syllabus: a minimum of one original response to a case and one reply to a peer or instructor, with a first post due on Monday at 11:59 p.m. and a second post due on Thursday at 11:59 p.m. of the module week. Posts were then assessed for the case topic they responded to as well as whether each was an original response to the assigned case or a reply to a peer or instructor. To determine the level of student interaction via discussion, individual posts were evaluated for the type of response elicited (peer, instructor or none) as well as the inclusion of a follow-up question or novel material to elicit further discussion. Individual posts were also evaluated for quality based on a standard rubric in the “Discussion Forum Coding Guide” in APPENDIX 1. There were no course requirements for the quality or discussion-generating nature of a post; the only two requirements were the quantity and timing of the posts.
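To make the coding scheme concrete, the following is a minimal illustrative sketch, in Python, of how a single coded post could be represented for analysis. The field names and category values are hypothetical stand-ins for the categories described above; the actual rubric is the “Discussion Forum Coding Guide” in APPENDIX 1.

```python
# Illustrative only: a hypothetical record for one coded discussion post.
# Field names and value sets are stand-ins for the categories described above,
# not the exact "Discussion Forum Coding Guide" items (see APPENDIX 1).
from dataclasses import dataclass

@dataclass
class CodedPost:
    module: int                 # module number (2, 3, 4 or 7)
    group: int                  # discussion group (1-10)
    case_topic: str             # case the post responds to
    post_type: str              # "original" or "reply"
    on_time: bool               # met the Monday/Thursday 11:59 p.m. deadline
    response_elicited: str      # "peer", "instructor" or "none"
    generates_discussion: bool  # includes a follow-up question or novel material
    quality: int                # rubric score, discrete 0-4

# Example: a timely original post on a hypothetical Class II case,
# rated quality 3, that drew a peer reply.
post = CodedPost(module=2, group=4, case_topic="Class II malocclusion",
                 post_type="original", on_time=True,
                 response_elicited="peer", generates_discussion=True,
                 quality=3)
```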
Discussion Coding Calibration
The discussion coding primarily focused on objective assessments of each discussion post, such as the times and frequency of posts. One subjective assessment focused on the quality of the discussion post. To assess the calibration between the two graders, the kappa statistic for agreement on this subjective measure was calculated for Module 2, with both graders independently completing the quality data collection for the entire module. The Module 2 quality measure had an observed agreement of 98% compared to an expected agreement of 44%, yielding a kappa value of 96.5%. These values show both graders were well calibrated on the most subjective assessment tool in the discussion coding.
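For readers unfamiliar with the statistic, Cohen’s kappa relates observed agreement to the agreement expected by chance. The short sketch below recomputes it from the aggregate values reported above purely as an illustration; the raw per-post grades from the two graders are not reproduced here.

```python
# Cohen's kappa from aggregate agreement values (illustrative only; the raw
# per-post grades from the two graders are not reproduced in this article).
def cohens_kappa(p_observed: float, p_expected: float) -> float:
    """kappa = (p_o - p_e) / (1 - p_e)"""
    return (p_observed - p_expected) / (1.0 - p_expected)

# Values reported for the Module 2 quality measure.
kappa = cohens_kappa(p_observed=0.98, p_expected=0.44)
print(f"kappa = {kappa:.3f}")  # about 0.96, consistent with the reported 96.5%
```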
Statistical Analysis
Microsoft Excel (Microsoft Corp., Redmond, Wash.) was used as the primary form of data entry along with basic data analysis, and the data were transferred to Stata (StataCorp LLC, College Station, Texas) for further statistical analysis. Basic descriptive analysis was used to evaluate quantity and time (mean and SD) for each post. Qualitative measures focused primarily on within-module differences as opposed to between-module differences due to the statistical complexity of looking for differences among different groupings of topics. Within each module, the focus remained on the relationship between the type of case or topic and the quality of the posts. The primary comparison measure was the type of case within each module. The type of case is nominal data, and as such, nonparametric tests were used for statistical analysis. The Kruskal-Wallis test was used to assess whether there were any statistically significant differences in quality based on topic within each module. The types of cases varied among the modules, which limited individual comparisons relative to a broader analysis. All student identifiers were removed prior to evaluating the discussion responses; therefore, differences in demographics such as age, sex and dental program (DDS compared to international dental student) could not be assessed. Furthermore, each module had a varying number of responses, both in total and per student, making it difficult to directly compare between modules.
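The analysis itself was run in Stata, but as a minimal sketch of the within-module comparison described above, the following shows a Kruskal-Wallis test on hypothetical quality scores (0–4) grouped by case topic. The topic names and scores are invented for illustration only.

```python
# Illustrative Kruskal-Wallis test of post quality (0-4) by case topic within
# one module. Topic names and score lists are hypothetical; the study's actual
# analysis was performed in Stata on the coded discussion data.
from scipy.stats import kruskal

quality_by_topic = {
    "Class II case": [3, 2, 3, 4, 2, 3],
    "Crowding case": [2, 3, 3, 2, 4, 3],
    "Open bite case": [3, 3, 2, 2, 3, 4],
}

h_stat, p_value = kruskal(*quality_by_topic.values())
print(f"H = {h_stat:.2f}, P = {p_value:.3f}")
# A P-value below 0.05 would suggest post quality differs by topic within the module.
```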
Results: Overall Course Evaluation
First, the overall student evaluations from 2018 (pre-COVID-19) were compared to those from the 2020 (during COVID-19) course (TABLE 1). The comparison showed that for all course criteria, the new format was similar in quality to the 2018 course. The average scores for course quality were almost identical between formats, except for a slight difference in “faculty appeared interested in my learning the material.”
While numerically similar in overall course evaluation, several noticeable differences arose in the open-ended questions given at the end of the course evaluation. Due to the large number of open-ended responses, only a few are highlighted here to illustrate the general trends. The positive responses for the traditional format centered on lecture content, specifically focusing on Invisalign. The negative open-ended responses primarily focused on the delivery of the lecture material and some requests for more in-depth case discussion. In contrast, the positive open-ended responses for the new format centered on class organization, the quality of learning from the discussion forums, interaction with faculty and peers and case presentation. Examples of positive responses are: “I thought this was the most effective online class we had spring quarter. I felt like my learning was very active and the courses/cases presented were both informative and enjoyable to watch from home. I also believed all the faculty were online and interested in helping us at any hour. Great job!” and “The content of this course is pretty complex and there was a steep learning curve but once I started to learn more about orthodontics I could logic my way through different treatment plans. The most useful parts of this course were the discussion boards because we got to see the thought processes of our peers and learn from them.” The negative responses for the new format focused on the logistics of the course and the amount of material provided at one time. Examples of these negative responses are: “There was way too much thrown at us at once. I had a hard time following along and had to do outside research to be able to answer the discussion questions” and “The amount of material this course presented us with was extensive. It often took me more than eight hours each Friday to get through the material and I often needed two days to work through the materials.”
Module Discussion Evaluation
For each discussion group, there was a minimum number of posts ranging from 32 to 34 depending on whether the group had more or fewer students. When all 10 groups were combined, the minimum expected number of posts/reposts amounted to 321 per module. TABLE 3 shows that the average number of total posts (excluding all instructor posts) was 409 (range of 365–430). This indicates that more than the minimum number of posts occurred in each module. Though this result is not an ideal proxy for student engagement, it does give a picture of the different levels of participation within each module.
Both the quantity and timing of each post were assessed for all modules. On average, each module generated between the minimum number of posts and at least one additional post (APPENDIX TABLE 2), with Module 4 being the only module closer to the minimum. The initial post by students in the first two modules was generally completed one day before the due date, but in the last two modules that average moved closer to the due date (APPENDIX TABLE 2). Student reposts to their colleagues were consistently posted at least one day before the due date (APPENDIX TABLE 2).
The criteria pertaining to the level of student engagement in the discussion forum were also evaluated. Across all groups and modules, 540 of 1,635 evaluated posts elicited further discussion, an average of 33%. Across all modules, responses elicited from peers were also the most common form of response, with 589 of 1,635 posts eliciting a response solely from a peer in the same group, an average of 36%.
Lastly, the quality of each student post and repost, sorted by topic, was evaluated. The comparison was made within modules rather than across modules due to the significant differences in the range of content between modules along with the varied number of individual posts. There were no major differences within each module across different types of clinical cases (TABLE 3). Two quality measures showed statistical significance (P &lt; 0.05): Module 2 response quality and Module 3 response type. The P-value for Module 2 response quality was 0.03, which is below the classical P &lt; 0.05 threshold, but considering the number of comparisons within each module, this value is most likely due to random chance (TABLE 3). However, for the response type in Module 3, the P-value was 0.0001, which appears more robust in the face of multiple comparisons (TABLE 3). This module is unique in that it is the only module with a case that is not clinically relevant (orthodontic staging), and as such it had the lowest number of posts. The quality of the posts was further evaluated by assessing the percentage of posts of high, medium and low quality. When graphically evaluating the percentages, a few specific topics had 50% or more of their responses rated medium to high quality (FIGURE 1). Modules 2 and 3 had the greatest number of topics whose quality posts carried a medium quality designation.
Discussion
The COVID-19 pandemic has significantly impacted dental education, particularly with respect to educational content delivered in person. While there is no clear substitute for clinical training, traditionally passive learning courses such as orthodontics had an opportunity during the mandated shutdown to attempt and assess different teaching methods. This study focused on the implementation of a fully asynchronous orthodontic curriculum with both a passive and an active learning component. A point of interest was to see how this new curriculum fared both in student satisfaction and in students’ grasp of important fundamental orthodontic knowledge. The analysis methods were developed after the course had concluded; thus, this study was conducted as a pilot with limited data available.
The results of this pilot study showed similarities between the in-person, passive-learning-only course (prior to COVID-19) and the new asynchronous passive and active learning course (during COVID-19) based on student feedback. Course ratings were not expected to be significantly higher in the new model compared to the old learning model because the general material of the course was the same. However, similar course evaluations would indicate that students fundamentally enjoyed the asynchronous environment as much as the passive learning environment. While quantitatively similar, this online teaching modality produced more student feedback on the open-ended questions. Students reported benefiting from the increased level of engagement, particularly from the unique engagement with faculty and colleagues in a virtual open learning space. This difference is most likely attributable to the direct interactions the course director had with students in small group online discussion forums under the new format.
Beyond the course evaluations, an additional point of focus was the quality of each asynchronous CBL module. The purpose of the discussion modules was twofold: to increase the opportunity for students to view more than one case and to encourage the kind of professional interaction they would experience in private practice. Quality was assessed through metrics evaluating the engagement level of each discussion module. With approximately a third of posts eliciting further discussion and peer responses, the level of student engagement was more than satisfactory given that no course requirements defined the quality of posts or discussion-generation metrics; thus, all the peer-to-peer interaction occurred naturally without a course grade incentive. The data did not show that any module or topic stood out as “more” engaging; however, topics that were not clinically relevant (orthodontic staging) did not garner as much interest as clinically relevant ones. Specifically, modules 2 and 3 tended to induce a higher percentage of quality posts. It is possible that the quality of posts was much higher earlier in the quarter than later in the course. These results, while not surprising, highlight the importance of understanding students’ interests when designing CBL modules. Similar to the Buelow et al. study, identifying course content that attracts a higher level of interest is a way programs can encourage and support online students’ learning engagement.[8]
A common concern with live discussion forums is the unequal distribution of participation. With in-person discussions, one would expect a bell-shaped curve, with a few students extremely eager and a few shy in expressing their thoughts. Although this study was conducted after the course was completed, online discussion forums offer a unique space for students to be curious and comfortable that their posts will be discussed without judgment.[9] Discussions moderated by faculty further instill this curiosity and development through personal engagement. In this particular discussion-forum design, the goal was to elicit both quality posts relating to each case as well as to facilitate professional interactions between peers. Anecdotally, in reviewing each response, we were pleased to see students being both encouraging and supportive of each other. The findings in this study help to encourage this modality of teaching in predoctoral orthodontics for future courses.
The active learning component of the new orthodontic curriculum sets up an open environment for improved learning, but the passive learning aspect of the course faced several pitfalls. The biggest concern with a purely asynchronous online passive learning model is procrastination or a potential lack of self-motivation to keep up with the material. A study by Landrum at the University of Dallas examined how online learning may increase a student’s academic confidence.[10] Surprisingly, some students do not learn as effectively when given the freedom to set their own pace of learning. This is relevant to Cheng’s concern that virtual learning may lead to procrastination and lower educational outcomes.[11] Compared to mandatory in-person attendance, some students reported feeling overwhelmed by the extent of the lecture material provided when asked to respond to challenging discussion prompts. The course directors addressed this issue with personal responses to each student in the discussions to help guide them to the proper train of thought. Active participation by the course directors in the discussion forum could be the catalyst for students who are not as self-motivated as others, and this needs to be studied further. Another important aspect is reducing the intimidation of discussing topics with which students are not familiar. To accomplish this, the expectations for discussion can be communicated more clearly, such that the quality of discussion is more important than the accuracy of students’ responses. A focus on developing an open space where students feel comfortable making mistakes is important in promoting their understanding of the material.
Case-based learning has long been recommended as a promising tool for medical and dental professionals and has been proposed for further integration into dental curricula.[9] A study conducted by Chutinan et al. regarding predoctoral operative dental education demonstrated that case-based activities increase students’ comfort levels by bridging the theory-practice gap between didactics and clinic.[12] CBL has been shown to promote small-group learning as well as to engage students and facilitate deeper learning[13] through structured activities linked to realistic clinical practice scenarios.[14] The use of virtual platforms allows greater accessibility of the instructor to the students, allowing for high-quality discussion and problem-solving in an asynchronous manner that is beneficial to both students and faculty.[15] By evaluating student discussion habits in relation to certain case topics via the application of “discussion coding,” conclusions can be made regarding which topics are more interesting to students, and adjustments can be made in course material to improve student engagement. With regard to orthodontic course material, studies have shown tele-education to be as effective as traditional teaching methodologies.[16] As the shift back to the classroom occurs across the country, it is vital to acknowledge the benefit that virtual case-based discussions and asynchronous didactic delivery have on predoctoral dental education outcomes.
This pilot cross-sectional study has several potential limitations. The first is that this project did not take into consideration how students may have been personally and mentally impacted by the COVID-19 pandemic and how that affected their learning abilities. A study conducted at the University of Jordan during the pandemic by Hattar et al. concluded that though there is appreciation for new learning methods, students still acknowledge missing many educational experiences and do not believe virtual learning is an adequate substitute for clinical practice.[17] Dental schools comprise a heterogeneous population of students of different ages and backgrounds; additionally, some students have no prior dental education while others have international dental education. Moving forward, a prospective cohort study would be an ideal method to assess how demographics and education level affect both interest in and the quality of a new orthodontic curriculum. Several students noted in the open-ended, anonymous course evaluations that they preferred the traditional form of passive learning. Another potential confounder could be students’ level of experience with technology, because those with a strong understanding of new technology might perform better than those without. All of this key demographic information will need to be addressed in a long-term prospective study. In comparing two types of education delivery methods, another potential weakness of this project is the lack of long-term data on this new format of learning, and further studies looking at students’ ability to translate didactic information into clinical practice are needed. Further research is also needed regarding different topics and disciplines within dentistry to evaluate the efficacy of virtual learning for differently themed content.
Conclusion
These findings can enhance the quality of education by taking into consideration how this generation learns best: participating in a virtual learning space may increase a student’s academic confidence, and choosing topics of interest can encourage online students’ learning engagement.[1]
Developing a curriculum where students have smaller but more frequent lectures coupled with an open environment for case-based learning strongly stimulated students’ interest.
Virtual case-based discussions allow for ample student/faculty interaction with high-quality levels of discourse.
The findings of this orthodontics course study offer insight on how to leverage online educational tools and programs to meet the needs of our generation and adapt to the ever-changing climate of health care education, both currently and moving forward in post-pandemic settings.
Further research is needed regarding different topics and disciplines within dentistry to evaluate the efficacy of virtual learning for differently themed content.
REFERENCES
1. Nadershahi NA, Bender DJ, Beck C, Lyon C, Blaseio A. An overview of case-based and problem-based learning methodologies for dental education. J Dent Educ 2013 Oct;77(10):1300–5.
2. Nerali J, Telang L, Telang A, Chakravarth P. Problem-based learning in dentistry, implementation and student perceptions. Saudi J Oral Sci 2020;7(3):194–198. doi: 10.4103/sjos.SJOralSci_15_20.
3. Williams B. Case based learning — a review of the literature: Is there scope for this educational paradigm in prehospital education? Emerg Med J 2005 Aug;22(8):577–81. doi: 10.1136/emj.2004.022707. PMCID: PMC1726887.
4. Richards PS, Inglehart MR. An interdisciplinary approach to case-based teaching: Does it create patient-centered and culturally sensitive providers? J Dent Educ 2006 Mar;70(3):284–91.
5. Yang Z, Ding Y, Jin F. Student Perceptions of Effectiveness of Case-Based Learning in Orthodontic Education. Open Med 2015; 2:48–52. doi: 10.2174/1874220301401010048.
6. Zheng M, Louie K. Promoting student engagement and online interaction with cloud-based technology. J Dent Educ 2021 Apr 24. doi: 10.1002/jdd.12628. Online ahead of print.
7. UC Berkeley School of Public Health. General Information of Online MPH Programs.
8. Buelow JR, Barry T, Rich LE. Supporting Learning Engagement With Online Students. Online Learning 2018;22(4):313–340. doi: 10.24059/olj.v22i4.1384.
9. Regier DS, Smith WE, Byers HM. Medical genetics education in the midst of the COVID-19 pandemic: Shared resources. Am J Med Genet Part A 2020 Jun;182(6):1302–1308. doi: 10.1002/ajmg.a.61595. Epub 2020 Apr 23. PMCID: PMC7264783.
10. Landrum B. Examining Students’ Confidence to Learn Online, Self-Regulation Skills and Perceptions of Satisfaction and Usefulness of Online Classes. Online Learning 2020;24(3):128–146. doi: 10.24059/olj.v24i3.2066.
11. Cheng SL, Xie K. Why college students procrastinate in online courses: A self-regulated learning perspective. Internet High Educ 2021 Jun;50:100807. doi: 10.1016/j.iheduc.2021.100807.
12. Chutinan S, Kim J, Chien T, Meyer HY, Ohyama H. Can an interactive case‐based activity help bridge the theory‐practice gap in operative dentistry? Eur J Dent Educ 2021 Feb;25(1):199–206. doi: 10.1111/eje.12591. Epub 2020 Oct 13.
13. Hofsten A, Gustafsson C, Häggström E. Case seminars open doors to deeper understanding — Nursing students’ experiences of learning. Nurse Educ Today 2010 Aug;30(6):533–8. doi: 10.1016/j.nedt.2009.11.001. Epub 2009 Dec 16.
14. Thistlethwaite JE, Davies D, Ekeocha S, Kidd JM, MacDougall C, Matthews P, Clay D. The effectiveness of case-based learning in health professional education. A BEME systematic review: BEME Guide No. 23. Med Teach 2012;34(6):e421–44. doi: 10.3109/0142159X.2012.680939.
15. Zheng M, Spires H. Teachers’ interactions in an online graduate course on Moodle: A social network analysis perspective. Meridian 2011, 13(12).
16. Lima MS, Tonial FG, Basei E, Brew MC, Grossmann E, Haddad AE, Bavaresco CS. Effectiveness of the distance learning strategy applied to orthodontics education: A systematic literature review. Telemed J E Health 2019 Dec;25(12):1134–1143. doi: 10.1089/tmj.2018.0286. Epub 2019 Sep 30.
17. Hattar S, AlHadidi A, Sawair FA, Abd Alraheam I, ElMa’aita A, Wahab FK. Impact of COVID-19 pandemic on dental education: Online experience and practice expectations among dental students at the University of Jordan. BMC Med Educ 2021 Mar 8;21(1):151. doi: 10.1186/s12909-021-02584-0.
THE CORRESPONDING AUTHOR, Brandon Zegarowski, DDS, can be reached at bzegarowski@gmail.com.