The Dental Licensure OSCE: A Modern Licensure Examination for Dentistry
Anthony J. Ziebert, DDS, MS, and David M. Waldschmidt, PhD
ABSTRACT Objective structured clinical examinations (OSCEs) have been used to assess clinical skills in the health sciences for over 40 years and have a proven track record when used by boards to determine whether a licensure candidate possesses the necessary level of skills to practice. A dental licensure OSCE (DLOSCE) is being developed by the Joint Commission on National Dental Examinations. Extensive field testing will be conducted to gather validity evidence, which will be provided to dental boards to support use of the DLOSCE in identifying candidates who do and do not possess the level of dental clinical skills necessary to practice safely.
An objective structured clinical examination (OSCE) is a type of high-stakes examination widely used in the health sciences, including midwifery, occupational therapy, optometry, medicine, physician assistants/associates, physical therapy, radiography, rehabilitation medicine, nursing, pharmacy, podiatry and veterinary medicine. [1] Introduced in the 1970s, OSCEs are now part of the U.S. Medical Licensing Examination for all U.S. medical graduates. [2] Developed to assess the complex notion of clinical competence in the medical field, the OSCE is a performance-based examination that typically utilizes multiple stations, with examinees performing various clinical tasks at each station. [3] Test-takers rotate through a set number of stations, with a time limit to complete the task at each station. For more complex tasks, time can be extended by yoking adjacent stations together, effectively doubling or tripling the available time. Tasks may include test results interpretation, history taking, physical examination, patient education, order writing and/or other activities. Over time, most medical OSCEs have come to rely on standardized patients (SPs) at many of the stations. SPs are often highly trained actors who portray a patient with a particular disease or condition, affording the evaluator an opportunity to witness the candidate provide a hands-on demonstration of their skills. Candidates can also interview the SP while performing a physical examination, enabling a highly interactive diagnostic experience.
The standardized nature of the OSCE is key to its validity and consequently its success as a measurement tool. The use of trained actors serving as standardized patients provides a fair and level playing field for assessment, with little to no variation in the patient serving as the target of candidates’ evaluation. Candidates face the same tasks and constraints under identical conditions and are evaluated against the same performance criteria. This is in contrast to evaluative work with live patients who — even if they have the same diagnosis — may differ substantially in both subtle and not-so-subtle ways that impair accurate measurement of candidate skill. Building on this validity and success, the OSCE method as used today has evolved into a flexible testing approach that can incorporate SPs as well as observer ratings, short written tests and other tools to provide a comprehensive clinical evaluation. [4] In an OSCE, it is the examinee’s clinical skills, and the actions they take when faced with a patient or clinical situation, that are assessed. In other words, OSCEs focus on the application of knowledge and skills to practice so that the examinee’s level of competence can be understood. [5]
OSCEs were introduced in medicine as a replacement for the traditional clinical examination, which had proven unsatisfactory for several reasons in the evaluation of medical student skills. [6] Over the years and throughout the health-related professions, the OSCE has not only become the gold standard for performance assessment but has also had the ancillary benefit of positively shaping student learning and curricula. [7–9] Compared to traditional clinical examinations, OSCEs provide consistent, stable results and represent a valid method of skill evaluation that is rigorous in its measurement of clinical competence, providing strong evidence of content validity. [10] OSCEs are also flexible and can be used to assess individuals in a range of situations, including undergraduate, graduate, continuing education, licensure and certification. [11,12] Finally, OSCEs possess high levels of face validity and are perceived by both students and examiners as a fair and acceptable tool to assess clinical competence. [13]
While OSCEs have been used by dental educators for many years as an assessment mechanism, [14–16] their potential for informing U.S. dental licensure decisions was not recognized until recently, despite the OSCE's use in this capacity in Canada since the 1980s. The National Dental Examining Board (NDEB) of Canada’s OSCE is currently implemented as a multiple-choice examination that utilizes separate stations containing images and physical objects, such as radiographs, photographs and typodonts. [17] Over the years, the NDEB of Canada OSCE has accumulated a substantial amount of validity evidence that supports its usage, including published articles and annual technical reports that document the examination’s validity and reliability as it informs licensure decisions. [18] In one published article, Gerrow et al. obtained a positive correlation between student performance in the final year of dental school and performance on the OSCE (r = 0.46, p < .001) for examinations completed between 1995 and 2000. Year-to-year and school-to-school variations in performance were found to be minimal. [19] On the basis of such findings, U.S. dental boards have begun to take notice. Currently, three U.S. dental boards accept a passing grade on the NDEB of Canada OSCE as evidence that a candidate for licensure possesses the level of clinical skills necessary to safely practice: Minnesota, Colorado and Washington. Minnesota limits the acceptance of Canadian OSCE results to graduates of the University of Minnesota’s dental program.
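For readers unfamiliar with the statistic, the r = 0.46 reported by Gerrow et al. is a Pearson product-moment correlation between paired scores. The short Python sketch below shows how such a coefficient is computed; the data are invented solely for illustration and are not the study's data.

```python
import math

def pearson_r(x: list[float], y: list[float]) -> float:
    """Pearson product-moment correlation between paired scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical pairs: (final-year dental school score, OSCE score).
school = [78, 85, 92, 70, 88, 81]
osce = [74, 80, 90, 75, 79, 85]
print(f"r = {pearson_r(school, osce):.2f}")
```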
In 2017, the American Dental Association (ADA) Board of Trustees formed a Dental Licensure OSCE (DLOSCE) steering committee and allocated funding to begin development of a U.S. DLOSCE. DLOSCE development would be overseen by the steering committee, working in close partnership with the ADA’s department of testing services (DTS), which is staffed by a team of examination professionals, many of whom possess advanced degrees in psychometrics, industrial/organizational psychology and related fields. The DTS has a long history of developing valid and reliable high-stakes examinations both for licensure (the National Board Dental Examination (NBDE) Part I and Part II; the Integrated National Board Dental Examination) and for dental program admissions (the Dental Admission Test and the Advanced Dental Admission Test). [20–22]

The dental licensure examinations — currently the NBDE Part I and NBDE Part II — have been administered since 1980 under the governance of the Joint Commission on National Dental Examinations (JCNDE). Passing grades on the NBDEs are required by every licensing jurisdiction in the U.S. for candidates to obtain initial licensure. Though requirements vary by state and jurisdiction, all dental licensure applicants must meet three basic requirements: education (graduation from a Commission on Dental Accreditation (CODA)-accredited or CODA-recognized dental education program), a written examination (NBDE Parts I and II; the Integrated National Board Dental Examination starting in August 2020) and a demonstration of clinical competence.

In the past, dental boards understandably relied on the single-encounter, procedure-based clinical examination of a patient as the examination modality for demonstrating the clinical competence of a licensure candidate. There were few proven alternatives. In addition, there were varying points of view regarding the perceived rigor of the CODA accreditation process and corresponding questions concerning the scope and rigor of school-based assessment procedures. However, thanks to over 25 years of hard work, the adoption and evolution of competency-based education in accredited dental schools and the identification of new, effective pathways for dental clinical assessment, dental boards no longer need to rely on the legacy single-encounter approach for the clinical assessment of licensure candidates. The legacy approach is subject to random error; does not have strong validity evidence; inadequately samples tasks from the domain of clinical dentistry; is not reflective of the broad set of skills and knowledge expected of a dentist; and poses ethical challenges for test-takers, dental schools and the dental profession. [23–28]

The ADA has recognized many of these shortcomings since the late 1990s, when it adopted policies calling for increased portability of dental licenses through development of a national clinical examination in the same vein as the national board written examinations, and policies calling for the elimination of the procedure-based treatment of patients in the clinical examination process. [29] In funding the development of an OSCE for dental licensure, the ADA believes that a reliable and valid dental OSCE provides a proven testing modality that dental boards can use with confidence in determining whether a candidate for licensure is competent to practice dentistry.
Use of an OSCE enhances dental boards’ ability to screen out incompetent beginning practitioners, enabling boards to better fulfill their duty to protect the public.
The purpose of the DLOSCE program is to identify whether a candidate for dental licensure possesses the level of clinical skills necessary for the competent practice of dentistry. The program accomplishes this through the use of a valid and reliable examination that has been professionally developed and scored. Successful candidates will have demonstrated the ability to recall important information from the clinical dental sciences and to apply their skills in a problem-solving context. The DLOSCE will be a national examination with no variation in content: content will not depend on the region of the country where the examination is administered, nor on curricular variation across dental schools.

DLOSCE development is informed by the Standards for Educational and Psychological Testing, [30] developed by the American Educational Research Association (AERA), the American Psychological Association (APA) and the National Council on Measurement in Education (NCME), which provides considerations for developing, implementing and evaluating tests. The standards and industry best practices guide DLOSCE development activities as the examination is designed, constructed and implemented.

In developing the DLOSCE, the focus is on validity. Validity concerns the evidence that supports the interpretation and use of examination results relative to the examination's intended purpose; it is the most important and most fundamental consideration in developing and evaluating tests. [30] A validity argument lays out the evidence in support of a specific interpretation of a test score (Standard 1.0). The standards indicate that each intended test score interpretation for a specified use should be clearly articulated and that appropriate validity evidence should be provided in support of each intended interpretation. [30]

In testing, reliability refers to the consistency and precision of scores across replications of a testing procedure. [30] Standard 2.0 states that, “Appropriate evidence of reliability/precision should be provided for the interpretation for each intended score use.” [30] Reliability is reduced by random error, which can be caused by such things as fluctuations in candidate attention or memory, momentary distractions in the testing environment and inconsistencies in examiner ratings. Evidence of high score reliability strengthens a validity argument. Reliability is necessary for validity, but caution is warranted in assessment situations where one is forced to choose between maximizing validity and maximizing reliability: validity remains the focal, fundamental consideration.
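To make the notion of score reliability concrete, the sketch below computes Cronbach's alpha, one common internal-consistency index, for a small matrix of hypothetical item scores. The data and the choice of index are illustrative assumptions only; they are not drawn from the DLOSCE's actual psychometric procedures.

```python
import numpy as np

def cronbachs_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a candidates-by-items matrix of item scores."""
    n_items = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of candidates' total scores
    return (n_items / (n_items - 1)) * (1 - item_var.sum() / total_var)

# Hypothetical data: five candidates answering four scored items (1 = correct).
scores = np.array([
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
])
print(f"alpha = {cronbachs_alpha(scores):.2f}")  # 0.80 for these data
```

Higher values indicate that the items measure candidate skill consistently; random error of the kinds described above pushes such indices down.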
Development of the DLOSCE is being overseen by the DLOSCE steering committee, which is composed of general practitioners in private practice; educators with experience teaching comprehensive, general dentistry in dental school clinics; current state dental board members; and consultants with experience in developing an OSCE for licensure purposes. The steering committee was charged with the following:
■ Identify and establish the content domain and test specifications for the examination through utilization of a practice analysis.
■ Establish the general structure for the examination (number of stations) and permissible item formats.
■ Identify and contract with key vendors (e.g., technology, administration) in support of the examination.
■ Identify and establish test construction team (TCT) structure based on the content domain and test specifications.
■ Call for TCT member applications and appoint members to the TCTs. TCT members develop the test questions for the examination.
■ Administer and interpret the results of the DLOSCE field tests.
■ Develop the candidate guide.
■ Identify the appropriate governance structure for DLOSCE administration.
The steering committee has held regular, face-to-face meetings since 2017 and has made significant progress in all areas of its charge, fully completing it in several areas. The establishment of the DLOSCE content domain and test specifications represented a key early accomplishment. This occurred through the formation of a subject-matter expert panel that carefully considered the appropriate role of a modern dental clinical licensure examination and the areas of content that must be assessed to protect the public. The panel considered data from multiple relevant sources and, most importantly, the results of a 2016 dental practice analysis involving over 2,500 entry-level general dentists practicing throughout the U.S. and its jurisdictions. As highlighted by the Standards for Educational and Psychological Testing, practice analysis plays a vital role in properly establishing the content domain of a high-stakes licensure examination. Practice analyses for licensure purposes in the health sciences — including dentistry — involve soliciting ratings from practicing entry-level professionals concerning the tasks performed and the frequency and criticality of those tasks to patient care. This information is in turn used to identify the knowledge, skills, abilities and other characteristics (KSAOs) required to perform those tasks. For the DLOSCE, the information obtained from this exercise was used by subject-matter experts to determine the number of items devoted to each area of competency, which in turn informed the structure and expertise-related composition of the TCTs charged with developing and reviewing examination content. The judgment of subject-matter experts was critical to each step in this process to help ensure that the examination properly reflected the breadth and complexity of clinical dental skills required for safe entry-level practice.
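As a rough illustration of how practice analysis ratings can feed test specifications, the sketch below weights a few hypothetical tasks by their mean frequency and criticality ratings and allocates items proportionally. The task list, rating scales, weighting scheme and examination length are all invented for illustration; the actual DLOSCE specifications were established through the subject-matter expert process described above.

```python
# Hypothetical mean ratings from a practice analysis, each on a 1-5 scale:
# task: (frequency, criticality)
tasks = {
    "restorative dentistry": (4.8, 4.2),
    "oral surgery": (3.1, 4.9),
    "prescription writing": (3.5, 4.6),
}

TOTAL_ITEMS = 100  # hypothetical examination length

# One simple scheme: weight each task by frequency x criticality,
# then allocate items in proportion to its share of the total weight.
weights = {task: freq * crit for task, (freq, crit) in tasks.items()}
weight_sum = sum(weights.values())

for task, w in weights.items():
    n_items = round(TOTAL_ITEMS * w / weight_sum)
    print(f"{task}: ~{n_items} items")
```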
DLOSCE questions are focused exclusively on the clinical tasks a dentist performs while providing direct, chairside treatment to patients. Based on the aforementioned test specifications, topic areas included in the DLOSCE are as follows: restorative dentistry, prosthodontics, periodontics, oral surgery, endodontics, orthodontics, prescription writing, medical emergencies, oral pathology, pain control and temporomandibular dysfunction. Diagnosis, treatment planning and occlusion are assessed across the topic areas listed above. The examination includes questions involving each of the following patient types: pediatric, geriatric, special needs and medically complex. DLOSCE questions can include one or more correct answers, one or more critical errors and response options that are unscored (neither contributing to nor deducting from the points awarded for the question). Answering questions correctly without losing points to a critical error requires depth and breadth of clinical judgment, with emphasis on clinical application and synthesis of clinical knowledge as opposed to simple recall.
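One way to picture this question structure is the hypothetical scoring rule sketched below, in which selected correct options earn points, a selected critical error forfeits credit for the question and unscored options neither add nor deduct. The class, the forfeit rule and the point values are assumptions made for illustration; the DLOSCE's actual scoring rules are not reproduced here.

```python
from dataclasses import dataclass, field

@dataclass
class Question:
    """A hypothetical multi-option question with critical errors."""
    correct: set[str]
    critical_errors: set[str]
    unscored: set[str] = field(default_factory=set)

    def score(self, selections: set[str]) -> int:
        if selections & self.critical_errors:
            return 0  # assume any selected critical error forfeits credit
        return len(selections & self.correct)  # one point per correct option

q = Question(correct={"A", "C"}, critical_errors={"D"}, unscored={"B"})
print(q.score({"A", "B", "C"}))  # 2 -- both correct options; unscored "B" is ignored
print(q.score({"A", "D"}))       # 0 -- a critical error was selected
```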
The steering committee established a separate DLOSCE working committee composed of subject-matter experts to guide and facilitate the test question development process. Calls for test constructor applications for the DLOSCE were quite successful. Taken as a whole, the DLOSCE test constructor pool includes more than 170 subject-matter experts, including general dentists and specialists, whom the DLOSCE steering and working committees can draw upon to develop and review DLOSCE questions and stimulus materials. The first series of DLOSCE TCT meetings took place between February and November 2019, resulting in the development of several hundred potential examination questions. In addition to the development and review work occurring within each DLOSCE TCT, each DLOSCE question is separately reviewed by a panel of general dentists to help confirm its clinical relevance and the appropriateness of its content in service to the examination purpose.
In designing the DLOSCE, the steering committee considered various question types, measurement methodologies and presentation formats, as well as administration vendors capable of presenting this information accurately and efficiently to candidates. The committee determined that the DLOSCE would most accurately evaluate candidate skills when implemented as a computer-administered examination, and it selected a test administration vendor accordingly. The committee also considered various technologies and determined that 3D representations have developed to the point where 3D models, accompanied by software enabling the candidate to rotate, move and enlarge them, can provide an interactive experience of such high fidelity that the candidate experience closely mirrors that of the practicing entry-level dentist. Utilization of interactive 3D models in this way advances the OSCE methodology beyond its traditional focus on stations and physical materials and represents an important modernization of the evaluation of candidate clinical skills within the dental licensure process.
In considering the DLOSCE, a common question concerns the difference between the DLOSCE and the Integrated National Board Dental Examination (INBDE). Both of these examinations are computer administered, both assess clinical skills (e.g., diagnosis and treatment planning, oral health management) and there is certainly overlap in the tested topics. However, despite surface similarities, closer inspection reveals key differences. As indicated previously, the DLOSCE focuses exclusively on the clinical tasks a dentist performs while providing direct, chairside treatment to patients. This narrow focus includes microjudgments, the recognition of errors in treatment and knowledge of success criteria. In contrast, the INBDE focuses on cognitive skills and the biomedical underpinnings of clinical decisions. The INBDE has a broader focus that includes practice and professional considerations, evidence-based dentistry, understanding research and patient oral health care education. The INBDE also evaluates candidate understanding of the reasoning behind decisions (e.g., mechanisms of action and the reasons why certain clinical outcomes are anticipated). The difference between the examinations can be illustrated with questions on the topic of pharmacology. For the INBDE, pharmacology questions require in-depth knowledge of microbiology, pharmacokinetics and pathology as foundational knowledge areas. For the DLOSCE, a candidate would simply be asked to write a prescription for the appropriate drug based on the clinical situation presented in the question.
A related question concerns the difference between the Advanced Dental Admission Test (ADAT) and both the DLOSCE and INBDE. Once again, while each of these examinations is computer administered and there is some overlap in clinical science topics, these examinations are fundamentally different. The ADAT is a norm-referenced examination designed to help advanced dental education program directors understand applicants’ likelihood of success in advanced dental education programs and to give these directors the ability to rank order applicants based on their potential for success. The DLOSCE and INBDE, on the other hand, are criterion-referenced examinations designed to help dental boards determine whether a candidate for licensure possesses the level of clinical skills necessary to practice safely. The norm-referenced methods for interpreting ADAT results — where applicant performance is considered relative to the performance of others — stand in sharp contrast to the criterion-referenced methods for interpreting DLOSCE and INBDE results, where candidate performance is interpreted relative to specific skill requirements established through standard-setting activities involving subject-matter expert review of a content domain.

In accordance with these principles, for the ADAT, education programs and applicants receive quantitative scale scores that facilitate applicant comparison. Scale scores range from 200 to 800 with a target mean of 500 and a target standard deviation of 100. In contrast, DLOSCE and INBDE performance is simply reported as “pass” or “fail,” with the corresponding standard determined by a panel of subject-matter experts charged with reviewing test content relative to a domain to identify the level of clinical skills required to safely practice. This panel is composed of a representative group of dental board members, general practitioners (both entry level and more experienced) and clinical dental educators. In accordance with the purposes of these two examinations, test-takers whose performance meets or exceeds the established standard are simply notified that they have passed the examination (i.e., no additional performance information is reported to these individuals). Individuals whose performance falls short of the standard are informed that they have failed the examination and are in turn provided with additional performance information to help them understand the areas where remediation would be beneficial.

Lastly, there are test scales that are exclusive to the ADAT, including biomedical sciences (anatomic sciences, biochemistry, physiology, microbiology and pathology) and data (research interpretation and evidence-based dentistry). In comparison, DLOSCE topics relate exclusively to clinical decision-making. With regard to the INBDE, biomedical science questions are presented in an applied clinical context, whereas for the ADAT, biomedical science questions are presented in a pure format (not integrated). In other words, the emphasis on the ADAT rests with the test-taker’s discrete knowledge of biomedical and clinical science concepts. In contrast, the INBDE emphasizes clinical relevance and whether candidates can appropriately integrate foundational knowledge areas and clinical content areas to make appropriate treatment decisions.
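To illustrate what a norm-referenced scale score means in practice, the sketch below applies a generic linear standardization that maps raw scores onto a 200–800 scale with mean 500 and standard deviation 100. The function, data and simple clipping are illustrative assumptions; the ADAT's actual equating and scaling procedures are more involved and are not reproduced here.

```python
def to_scale_score(raw: float, raw_mean: float, raw_sd: float) -> int:
    """Linearly map a raw score onto a 200-800 scale (mean 500, SD 100)."""
    z = (raw - raw_mean) / raw_sd  # standardize against the norm group
    scaled = 500 + 100 * z         # re-center and re-scale
    return int(min(800, max(200, round(scaled))))

# Hypothetical: a raw score of 62 where the norm group mean is 50, SD is 8.
print(to_scale_score(62, raw_mean=50, raw_sd=8))  # 650, i.e., 1.5 SDs above the mean
```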
As indicated previously, an important charge of the DLOSCE steering committee involved the identification of a suitable governing body for the examination program. For the first few years of DLOSCE development, the DLOSCE steering committee operated as a committee of the ADA Board of Trustees. During this time, in accordance with the aforementioned charge, the steering committee carefully and thoughtfully considered issues involving DLOSCE governance. Licensure examination programs involve a public trust that requires the examinations be administered and decisions made in a consistent manner that is as free from bias and conflict of interest as possible. Similarly, confidentiality protections and due process must be afforded to candidates seeking licensure.

Given the preceding, the steering committee determined that governance by a commission would be most appropriate. Through the promulgation of policies and procedures in accordance with the ADA bylaws’ mandated duties, a commission endeavors to protect the public and conduct its activities with integrity in a manner befitting its charge. Commissions have mechanisms in place to guard against any particular community of interest having an undue influence on decisions and program administration. In considering these issues further, the steering committee concluded that the Joint Commission on National Dental Examinations would represent an ideal choice to administer the DLOSCE, as the Joint Commission is a well-respected and trusted agency that has been developing and administering written licensure examinations accepted by all U.S. licensure jurisdictions since 1990. This led to a conversation between the steering committee and the JCNDE during which both parties recognized the merits of this approach.

In December 2019, the ADA Board of Trustees received a corresponding report from the steering committee and formally made the decision to assign governance of the DLOSCE to the JCNDE, maintaining the makeup of the DLOSCE steering committee for continuity purposes and placing it as a committee of the JCNDE. The steering committee will be sunset once field test results provide strong evidence that the examination is valid and reliable and once the steering committee has successfully fulfilled its charge. At that time, the oversight, administration and continued development of the DLOSCE will be fully integrated into the standard organizational structure of the JCNDE. The JCNDE released the DLOSCE for use by dental boards to determine a licensure candidate’s competence as a safe, beginning practitioner in June 2020.
In conclusion, OSCEs have been used to assess clinical skills in the health sciences for over 40 years and have a proven track record when used by boards to determine whether a licensure candidate possesses the necessary level of skills to safely practice at an entry level. DLOSCE development activities are guided by the Standards for Educational and Psychological Testing and industry best practices (e.g., releasing annual technical reports with reliability and validity evidence for the examinations) and implemented by a professional staff operating under the oversight of a commission of the ADA (the JCNDE) with strong protections in place to help reduce conflicts of interest. The JCNDE pursues DLOSCE activities through a DLOSCE steering committee with a clearly defined charge, and development efforts involve a working committee and a substantial number of highly educated and trained dental subject-matter experts. Extensive field testing of the DLOSCE will provide dental boards with the validity evidence necessary to support use of the DLOSCE in identifying candidates who do and do not possess the level of dental clinical skills necessary to safely practice. The standardized nature of the DLOSCE is key to its validity. The DLOSCE format addresses many of the shortcomings of legacy single-encounter, procedure-based clinical examinations involving varied patients and advances the OSCE methodology through utilization of sophisticated, high-fidelity 3D models. Administration of the DLOSCE by the JCNDE, an agency with a long track record of providing valid and reliable licensure examinations to all U.S. licensing jurisdictions, gives dental boards further confidence that the DLOSCE will serve as a strong tool for understanding candidate clinical skills and protecting the public health.
REFERENCES
1. OSCE. Objective structured clinical examination. www.oscehome.com/What_is_Objective-Structured-Clinical-Examination_OSCE.html.
2. Federation of State Medical Boards (FSMB) and National Board of Medical Examiners (NBME). USMLE Bulletin of Information. 2019. www.usmle.org/pdfs/bulletin/2020bulletin.pdf.
3. Harden RM, Lilley P, Patrício M. The Definitive Guide to the OSCE. Edinburgh: Elsevier; 2016:A1.
4. Turner JL, Dankoski ME. Objective structured clinical exams: A critical review. Fam Med 2008 Sep;40(8):574–8. PMID: 18988044.
5. Harden RM, Lilley P, Patrício M. The Definitive Guide to the OSCE. Edinburgh: Elsevier; 2016:A7.
6. Harden RM, Lilley P, Patrício M. The Definitive Guide to the OSCE. Edinburgh: Elsevier; 2016:1.
7. Harden RM, Lilley P, Patrício M. The Definitive Guide to the OSCE. Edinburgh: Elsevier; 2016:A33.
8. Couto LB, Durand MT, Wolff ACD, et al. Formative assessment scores in tutorial sessions correlates with OSCE and progress testing scores in a PBL medical curriculum. Med Educ Online 2019 Dec;24(1):1560862. doi: 10.1080/10872981.2018.1560862.
9. Kumar V, Gadbury-Amyot CC. Predoctoral curricular revision for dental interpretation competence based on OSCE results. J Dent Educ 2019 Oct;83(10):1233–1239. doi: 10.21815/JDE.019.112. Epub 2019 Jun 10.
10. Goh HS, Zhang H, Lee CN, Wu XV, Wang W. Value of nursing objective structured clinical examinations: A scoping review. Nurse Educ 2019 Oct;44(5):E1–E6. doi: 10.1097/NNE.0000000000000620.
11. Dong T, Swygert KA, Durning SJ, et al. Validity evidence for medical school OSCEs: Associations with USMLE step assessments. Teach Learn Med 2014;26(4):379–86. doi: 10.1080/10401334.2014.960294.
12. Warner DO, Isaak RS, Peterson-Layne C, et al. Development of an objective structured clinical examination as a component of assessment for initial board certification in anesthesiology. Anesth Analg 2020 Jan;130(1):258–264. doi: 10.1213/ANE.0000000000004496.
13. Boursicot KAM, Roberts TE, Burdick WP. Structured assessments of clinical competence. In: Swanwick T, ed. Understanding Medical Education: Evidence, Theory and Practice. 2nd ed. Chichester, U.K.: John Wiley and Sons; 2014:246–258.
14. Graham R, Zubiaurre Bitzer LA, Anderson OR. Reliability and predictive validity of a comprehensive preclinical OSCE in dental education. J Dent Educ 2013 Feb;77(2):161–7.
15. Pack SE, Anderson NK, Karimbux NY. OSCE and case presentations as active assessments of dental student performance. J Dent Educ 2016 Mar;80(3):334–338.
16. Zartman RR, McWhorter AG, Seal NS, Boone WJ. Using OSCE-based evaluation: Curricular impact over time. J Dent Educ 2002 Dec;66(12):1323–1330.
17. The National Dental Examining Board of Canada. OSCE. 2019. ndeb-bned.ca/en/accredited/osce-examination.
18. The National Dental Examining Board of Canada. Technical Report: Objective Structured Clinical Examination. 2019. ndeb-bned.ca/sites/ndeb/files/pdf/TechnicalManuals/2018/acj_2018_technical_report_approved091419.pdf.
19. Gerrow JD, Murphy HJ, Boyd MA, Scott DA. Concurrent validity of written and OSCE components of the Canadian dental certification examinations. J Dent Educ 2003 Aug;67(8):896–901.
20. Joint Commission on National Dental Examinations. 2018 Technical Report: National Board Dental Examinations. www.ada.org/~/media/JCNDE/pdfs/NBDE_Technical_Rpt.pdf?la=en.
21. American Dental Association. Dental Admission Test (DAT) Validity Study 2014–2016 Data. www.ada.org/~/media/ADA/Education%20and%20Careers/Files/dat_validity_study.pdf?la=en.
22. American Dental Association. The Advanced Dental Admission Test (ADAT) Program Update for 2019. www.ada.org/~/media/ADA/Education%20and%20Careers/Files/2019_ADEA_ADAT_Program_Update.pdf?la=en.
23. Ranney RR, Wood M, Gunsolley JC. Works in progress: A comparison of dental school experiences between passing and failing NERB candidates, 2001. J Dent Educ 2003 Mar;67(3):311–316.
24. Ranney RR, Gunsolley JC, Miller LS, Wood M. The relationship between performance in a dental school and performance on a clinical examination for licensure: A nine-year study. J Am Dent Assoc 2004 Aug;135(8):1146–1153. doi: 10.14219/jada.archive.2004.0374.
25. Ranney RR. What the available evidence on clinical licensure exams shows. J Evid Based Dent Pract 2006 Mar;6(1):148–154. doi: 10.1016/j.jebdp.2005.12.012.
26. Chambers DW. Board-to-board consistency in initial dental licensure examinations. J Dent Educ 2011 Oct;75(10):1310–1315.
27. Friedrichsen SW. Moving toward 21st-century clinical licensure examinations in dentistry. J Dent Educ 2016 Jun;80(6):639–640.
28. American Dental Association. Ethical considerations when using patients in the examination process. www.ada.org/~/media/ADA/Education%20and%20Careers/Files/ethical-considerations-when-using-patients-in-the-examination-process.pdf.
29. American Dental Association. Comprehensive Policy on Dental Licensure. www.ada.org/~/media/ADA/Member%20Center/Members/current_policies.pdf?la=en.
30. American Educational Research Association, American Psychological Association, National Council on Measurement in Education and Joint Committee on Standards for Educational and Psychological Testing. Standards for Educational and Psychological Testing. Washington, D.C.: American Educational Research Association; 2014:11,23,33,42.
THE CORRESPONDING AUTHOR, Anthony J. Ziebert, DDS, MS, can be reached at zieberta@ada.org.
AUTHORS
Anthony J. Ziebert, DDS, MS, received his dental degree from the Georgetown University School of Dentistry, a certificate in general practice from the Veterans Administration Hospital in Milwaukee and an MS in prosthodontics from Marquette University. He currently is the senior vice president for education and professional affairs at the American Dental Association and maintains a part-time private practice in Milwaukee limited to prosthodontics. Conflict of Interest Disclosure: None reported.
David M. Waldschmidt, PhD, is the director of testing services for the American Dental Association and the director of the Joint Commission on National Dental Examinations. He holds a PhD in industrial-organizational psychology and has worked in the professional testing industry for over 20 years. Dr. Waldschmidt has authored papers published in the Journal of Dental Education and the journal Industrial and Organizational Psychology: Perspectives on Science and Practice and has presented at many meetings. Conflict of Interest Disclosure: None reported.