

Setting and Validating the Pass/Fail Score for the NBDHE

Tsung-Hsun Tsai, PhD; Barbara Leatherman Dixon, RDH, BS, MEd

May 2013

American Dental Hygienists' Association

Abstract

Purpose: This report describes the overall process used for setting the pass/fail score for the National Board Dental Hygiene Examination (NBDHE). Methods: The Objective Standard Setting (OSS) method was used for setting the pass/fail score for the NBDHE. The OSS method requires a panel of experts to determine the criterion items and the proportion of these items that minimally competent candidates would answer correctly, the percentage of mastery and the confidence level of the error band. A panel of 11 experts was selected by the Joint Commission on National Dental Examinations (Joint Commission). Panel members represented geographic distribution across the U.S. and had the following characteristics: full-time dental hygiene practitioners with experience in areas of preventive, periodontal, geriatric and special needs care, and full-time dental hygiene educators with experience in areas of scientific basis for dental hygiene practice, provision of clinical dental hygiene services and community health/research principles. Utilizing the expert panel's judgments, the pass/fail score was set and the score scale was then established using the Rasch measurement model. Results: Statistical and psychometric analysis shows that the actual failure rate and the OSS failure rate are reasonably consistent (2.4% vs. 2.8%). The analysis also showed that the error of measurement, an index of precision, is lowest and the reliability is highest (0.97) at the pass/fail score point. Conclusion: The pass/fail score is a valid guide for making decisions about candidates for dental hygiene licensure. This new standard was reviewed and approved by the Joint Commission and was implemented beginning in 2011.

Introduction

In examinations used for making licensure decisions, candidates' levels of achievement are classified as "pass" if their scores are at or above the established pass/fail score and "fail" if their scores are below it. Deriving psychometrically sound and legally defensible pass/fail scores is important for identifying minimal competency, thereby assisting state boards in making valid licensure decisions and protecting the public from unqualified candidates.1-5 The Standards for Educational and Psychological Testing also recommend that the pass/fail score for a licensing examination be set appropriately to ensure the results of the assessment are valid.6 In response to these recommendations, the Joint Commission on National Dental Examinations (Joint Commission), the agency responsible for developing, administering, scoring and reporting the results of the National Board Dental Hygiene Examination (NBDHE), conducted a standard-setting study to set the pass/fail score for the NBDHE so that passing and failing candidates are classified accurately. As an essential part of providing validity evidence to the communities of interest that use NBDHE results for decision making, it is important that the Joint Commission report the process for setting and validating the pass/fail score for the NBDHE in a professional journal. The purpose of this report is to fulfill this responsibility by describing the overall process used for setting the pass/fail score for the NBDHE.

The National Board Dental Hygiene Examination7

The NBDHE is designed to assist state boards in assessing the qualifications of individuals who seek licensure to practice dental hygiene. The examination is typically taken by student candidates during the last year of the dental hygiene program. The NBDHE assesses the candidate's ability to understand important information from the basic biomedical, dental and dental hygiene sciences and to apply that information in a problem-solving context. This comprehensive, computer-based examination consists of 350 multiple-choice items covering 3 major areas comprising 13 disciplines (Table I), and items are sampled from and balanced across these disciplines. Each item consists of a stem (a question or statement) paired with a list of 4 or 5 possible responses. The examination includes 200 discipline-based items and 150 items based on 12 to 15 dental hygiene patient cases. Each case presented in the examination consists of patient histories, dental charts, diagnostic radiographs and clinical photographs.

Selection of the Panelists

The guidelines from the Standards for Educational and Psychological Testing were used in the selection of a panel of the NBDHE experts.6 The Joint Commission reviewed and approved the following selection criteria:
• Full-time practicing dental hygienists with experience in areas of preventive, periodontal, geriatric and special needs care
• Full-time dental hygiene educators with experience in areas of scientific basis for dental hygiene practice, provision of clinical dental hygiene services and community health/research principles
• Geographic distribution with both urban and rural representation from major regions of the U.S., while ensuring demographic diversity (gender, age, race, ethnicity, etc.)
The Joint Commission sent a call for nominations to all communities of interest and then reviewed all nominees’ credentials. Of the nominees, 11 individuals (10 dental hygienists and 1 dentist) were selected. The Joint Commission determined that these experienced clinicians and educators represented expertise in all areas of content in the NBDHE and that their judgments would characterize the dental hygiene profession’s estimation of what the new or entry level dental hygienist should know and do.

Methods and Materials

The standard setting was conducted using the Objective Standard Setting (OSS) method.8 Onsite training was provided to panelists at the meeting. First, the panelists' role and responsibilities were clarified. Second, the background and purpose of the NBDHE were presented. Third, the meeting materials were reviewed, including the standard-setting protocol developed by the Joint Commission (which provided detailed information on the concept and use of the OSS method), the NBDHE content specifications and sample questions from the NBDHE. Fourth, the overall process involved in validating the pass/fail score for the NBDHE was presented. In addition, to help the panelists conceptualize and correctly use the OSS method, sample items from the NBDHE were used. During the practice training process, each panelist rated each sample item individually by judging its importance to patient care on a scale ranging from 1 to 5, with 1 indicating very unimportant to patient care and 5 indicating critically important to patient care. After the ratings were complete, the group was asked whether there were any specific problems or issues fundamental to rating the item. Concerns or issues were then addressed and discussed. Based on the group discussion, panelists were given the opportunity to change their ratings if they wished. Once the discussion and revisions were done, the group moved on to the next sample item. This process was repeated until the panelists understood the concept and felt comfortable applying the principles to the actual activities.
The OSS method requires panelists to make 3 recommendations:
• Selection of criterion items and proportion of these items that minimally competent candidates would answer correctly
• Determination of the percentage of mastery
• Determination of the confidence level
Each panelist selected items that they considered to be very important using the following criteria:
• The content of criterion items must be central, or directly related, to practice
• Criterion items must assess the knowledge and problem-solving skills that are utilized frequently in practice
• Criterion items must assess the knowledge and problem-solving skills that are dynamic and subject to change with current research and development in the field
• The content of the criterion items must be of fundamental and critical importance to successful practice
• The content of the criterion items must assess the minimum knowledge and problem-solving skills that the candidate is expected to have acquired
• Criterion items must be selected from throughout the examination
• Criterion items must be selected from a full range of the content included on the examination

The next task related to the level of mastery. The panelists were instructed to record their estimates of the acceptable level of mastery (0 to 100%) necessary to pass the NBDHE. This estimate was based on the panelists' knowledge of the reference group and of the content sampled by the examination. The reference group consists of all students who are currently enrolled in accredited dental hygiene programs and who are taking the examination for the first time. Finally, judgments regarding the extent of error were necessary to complete the standard-setting activities. The panelists recorded their estimates of how large the error band around a score should be. Measurement error is inherent in estimating candidate performance: a candidate's true score lies somewhere within an error band around the observed score. When a candidate's score falls within the error band around the standard, the score could be treated as either a passing or a failing score. There are several options to consider. If the emphasis is on protection of the public, one would pass only candidates whose scores exceed the upper limit of the error band. At the other end of the spectrum, if the focus is on protecting the innocent candidate, all candidates whose scores exceed the lower limit of the error band would pass. A 95% confidence level is considered appropriate.9 The independent judgments of the panelists yielded an estimate that fell within this suggested error band.
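To make the role of the error band concrete, the sketch below shows how a symmetric band at a 95% confidence level could be placed around a cut score and how the two policy options described above would use its limits. The cut score and standard error values are hypothetical illustrations, not NBDHE figures.

```python
# Illustrative sketch only: the cut score and standard error below are
# hypothetical values, not the NBDHE's actual figures.

Z_95 = 1.96  # z-value corresponding to a 95% confidence level


def error_band(cut_score: float, sem: float, z: float = Z_95) -> tuple[float, float]:
    """Return the (lower, upper) limits of the error band around a cut score."""
    return cut_score - z * sem, cut_score + z * sem


lower, upper = error_band(cut_score=0.85, sem=0.10)  # hypothetical logit values


def passes(score: float, protect_public: bool) -> bool:
    # Two policy options described in the text:
    # - emphasis on protecting the public: pass only scores above the upper limit
    # - emphasis on protecting the candidate: pass all scores above the lower limit
    return score > (upper if protect_public else lower)
```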

Results

Based on the panelists' judgments, the NBDHE pass/fail score was set using the OSS method. The score scale was then established using the Rasch model.10 In the Rasch model, candidate ability and item difficulty are described by a single measurement scale. This means that candidate ability can be related directly to the specific abilities, knowledge and problem-solving skills that underlie items on the NBDHE. The candidate's ability is estimated based on the probability of a right or wrong response on each item. The underlying ability scale is centered at 0 and typically ranges from -5.00 to 5.00, with more negative values indicating relatively easier items and lower-scoring candidates. In like manner, more positive values indicate relatively more difficult items and higher-scoring candidates. Because candidate ability and item difficulty are on the same scale, it is possible to relate the two statistics directly to the criterion items. According to the judgments of the panelists, the knowledge underlying the criterion items is critically important to patient care. The pass/fail score was derived from the average difficulty of the criterion items in concert with the error band and the percentage of mastery suggested by the panelists. Candidates whose scores were at or above this pass/fail point would pass. This point along the measurement scale is assigned a standard score of 75. After the pass/fail score was determined, candidate abilities were estimated for every possible raw score (number of correct responses), ranging from 0 to 350. Score conversions were developed to translate raw scores into standard scores for all exam forms using the common-item equating design.11
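For readers unfamiliar with the model, the dichotomous Rasch model relates the probability of a correct response to the difference between candidate ability and item difficulty, and one way to express how a mastery judgment and the average criterion-item difficulty combine into a cut point on the logit scale is sketched below. This is an illustration under the standard Rasch formulation, not necessarily the Joint Commission's exact computation.

```latex
% Dichotomous Rasch model: probability that candidate i answers item j correctly,
% where \theta_i is candidate ability and b_j is item difficulty (both in logits)
P(X_{ij}=1 \mid \theta_i, b_j) = \frac{e^{\theta_i - b_j}}{1 + e^{\theta_i - b_j}}

% Illustrative cut point: requiring a mastery level m on criterion items of
% average difficulty \bar{b} and solving the model for ability gives
\theta_{\mathrm{cut}} = \bar{b} + \ln\!\left(\frac{m}{1-m}\right)

% The panel's error band at the chosen confidence level is then applied around
% \theta_{\mathrm{cut}} before this point is mapped to the standard score of 75.
```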

Discussion

Among the various criteria available for evaluating the appropriateness of a pass/fail score produced by panelists' judgments using the OSS method, one major criterion used by the Joint Commission was the consistency between the actual failure rate and the failure rate produced by the OSS method. To meet this objective, a statistical analysis was conducted to compute the following statistics, based on the 4,528 candidates taking the March 2009 edition of the NBDHE:
• The actual percentage of failing candidates
• The percentage of failing candidates using the results from the OSS method
Table II presents the comparison of the failure rate that was actually observed with the failure rate implied by the panelists' judgments under the OSS method. As shown, of the 4,528 candidates taking the March 2009 edition of the NBDHE, 108 (2.4%) failed. If the panelists' judgments had been employed as the minimum passing score, 129 (2.8%) would have failed. Comparison of the actual and OSS failure rates shows little difference.
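As a quick arithmetic check, the percentages follow directly from the counts reported above:

```latex
\frac{108}{4{,}528} \approx 0.024 \;(\text{about } 2.4\%)
\qquad\text{versus}\qquad
\frac{129}{4{,}528} \approx 0.028 \;(\text{about } 2.8\%)
```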
In addition, a psychometric analysis was conducted to examine the precision of the pass/fail score derived by the OSS method. Results show that the error of measurement is lowest at the pass/fail score point on the measurement scale. In other words, maximum reliability (0.97) is achieved at the pass/fail score point.
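This result can be read through the conventional link between the standard error of measurement (SEM) and reliability from classical test theory; the restatement below is illustrative and is not the Joint Commission's computation.

```latex
% Conventional relationship, with \sigma_X the standard deviation of scores
% and r_{XX'} the reliability coefficient:
\mathrm{SEM} = \sigma_X \sqrt{1 - r_{XX'}}
\qquad\Longleftrightarrow\qquad
r_{XX'} = 1 - \frac{\mathrm{SEM}^2}{\sigma_X^2}

% A minimal SEM at the pass/fail point therefore corresponds to maximal
% precision there; with r_{XX'} = 0.97, the SEM is only about
% \sqrt{1 - 0.97} \approx 0.17 of the score standard deviation.
```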

Conclusion

A statistical analysis and a psychometric analysis were conducted to verify the appropriateness of the pass/fail score derived by the OSS method. The results of the analyses show that the actual failure rate and the failure rate derived by the OSS method are reasonably consistent. The error of measurement is lowest and the reliability is highest at the pass/fail score point on the measurement scale. Results of the standard-setting activities support the conclusion that the pass/fail score on the NBDHE is a valid guide for making decisions about candidates who seek licensure to practice dental hygiene.
When scores on an examination are used as a basis for making high-stakes pass/fail decisions, it is necessary to validate the cut score that separates passing and failing candidates.6 This report provides a psychometrically sound process, analyses and guidelines for setting and validating the pass/fail score used in making decisions about candidates for dental hygiene licensure.

Tsung-Hsun Tsai, PhD, is a research consultant in educational measurement and testing. Barbara Dixon, RDH, BS, MEd, is a dental hygienist with over 30 years of experience in clinical practice and education.

Disclaimer

The information and opinions contained in this article are solely the work of the authors and do not reflect those of the American Dental Association or its employees or members.

References

1. Boulet JR, De Champlain AF, McKinley DW. Setting defensible performance standards on OSCEs and standardized patient examinations. Med Teach. 2003;25(3):245-249.

2. Boulet JR, Smee SM, Dillon GF, Gimpel JR. The use of standardized patient assessments for certification and licensure decisions. Simul Healthc. 2009;4(1):35-42.

3. Kane MT, Crooks TJ, Cohen AS. Designing and evaluating standard-setting procedures for licensure and certification tests. Adv Health Sci Educ Theory Pract. 1999;4(3):195-207.

4. Norcini JJ. Research on standards for professional licensure and certification examinations. Eval Health Prof. 1994;17(2):160-177.

5. Cizek GJ, Bunch MB. Setting performance standards: contemporary methods. Educ Measur. 2004;23(4):31-50.

6. American Educational Research Association, American Psychological Association, National Council on Measurement in Education. Standards for educational and psychological testing. APA [Internet]. 1999. Available from: http://www.apa.org/science/programs/testing/standards.aspx

7. Joint Commission on National Dental Examinations. National board dental hygiene examination program guide 2010. ADA [Internet]. 2010. Available from: http://www.ada.org/sections/educationAndCareers/pdfs/nbdhe_examinee_guide.pdf

8. Stone GE. Objective standard setting (or truth in advertising). In: Smith EV, eds. Introduction to Rasch measurement: theory, models and applications. Maple Grove, MN: JAM Press; 2004:445-459.

9. Kramer GA, DeMarais DR. Setting a standard on the pilot National Board Dental Examination. J Dent Educ. 1992;56(10):684-688.

10. Rasch G. Probabilistic models for some intelligence and attainment tests. Chicago, IL: The University of Chicago Press; 1981.

11. Kolen MJ, Brennan RL. Test equating: methods and practices. New York, NY: Springer-Verlag; 1995.

Learning Objectives:

  • Describe the overall process used for setting the pass/fail score for the National Board Dental Hygiene Examination.
  • Explain why psychometric and legally defensible pass/fail scores are important.
  • Discuss the Objective Standard Setting (OSS) method used for setting the pass/fail score.

Disclosures:

The authors report no conflicts of interest associated with this work.

Queries for the author may be directed to justin.romano@broadcastmed.com.