Professional Testing, Inc.
Providing High Quality Examination Programs

From the Item Bank

The Professional Testing Blog

 

Ensuring Equivalence for Exam Form Difficulty Variations

October 12, 2017

One of the predominant goals a psychometrician strives to achieve is the development of a fair assessment program. Fairness, as we use the term, means there is evidence that every test taker is provided an equal opportunity for an equal outcome regardless of where or when they take an exam. This post discusses creating equivalence across exam forms that vary in difficulty. Before doing so, it may be helpful to briefly review some of the other opportunities to build fairness into a program.

  1. Content consistency – In certification testing, an exam blueprint is created from a role delineation study/job analysis. This blueprint is used to assure that the same content is measured regardless of the exam form or the specific items it contains. A set number of questions is used for each content or topic area within each exam form so that the content sampled is consistent (fair) across multiple exam forms.
  2. Item development – Items are written in a manner that reduces both content and general bias. Additionally, items are developed to be important and relevant to the intended content being measured.
  3. Test administration – Policies and procedures are put in place to ensure an equivalent testing environment, including the instructions provided to candidates, regardless of where the exam is administered.
  4. Passing score study – A method that uses Subject Matter Experts to provide input on where the cut score should fall within the continuum of scores helps assure the cut score is fair.

 

Once a passing score is established, it is very important to carry the passing score criterion forward to subsequent forms. This helps assure fairness, with respect to the passing score and examination difficulty, across multiple forms. We call this form equivalency. As mentioned in item 1 above, we first maintain content equivalence across forms. The second component of form equivalence is maintaining a fair (consistent) passing score across forms with respect to form difficulty.

As one can imagine, not all forms will have exactly the same difficulty, because difficulty depends on the items selected. To make slight adjustments for these variations in exam form difficulty, we use a process called equating. While there are different measurement models and techniques for equating, we will discuss traditional Classical Test Theory equating. Equating is the method used to ensure a fair passing standard for all exam forms, which in turn assures that all candidates are held to the same passing standard.

Example #1

Here are the results from two forms. Common items are an identical set of items that appear on both Form A and Form B. Form A has a common-item mean score of 25, while Form B's common-item mean score is 27. What does that tell us? It tells us that the cohort taking Form B is more able than the cohort taking Form A. We also see that the total test score (all items) has a mean of 75 on both forms. If the two forms were equally difficult, the more able Form B cohort should have scored higher on the total test as well; because it did not, Form B must be the more difficult form. For this reason, the cut score for Form B should be lowered a few points. Lowering the cut score helps assure fairness.

          Common Items Mean   All Items Mean
Form A           25                75
Form B           27                75
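The arithmetic behind Example #1 can be sketched in a few lines. This is a minimal illustration of the mean-equating idea only, not a full Tucker or Levine procedure; the function name and the Form A cut score of 70 are assumptions made up for the example, not values from the post.

```python
def mean_equated_cut_score(cut_a, common_mean_a, common_mean_b,
                           total_mean_a, total_mean_b):
    """Carry Form A's cut score forward to Form B (hypothetical helper).

    The common items estimate the ability difference between the two
    cohorts; any total-score difference not explained by ability is
    attributed to a difference in form difficulty.
    """
    ability_shift = common_mean_b - common_mean_a      # cohort B's ability edge
    observed_shift = total_mean_b - total__mean_a if False else total_mean_b - total_mean_a
    difficulty_shift = observed_shift - ability_shift  # negative => Form B is harder
    return cut_a + difficulty_shift

# Example #1: cohort B scores 2 points higher on the common items, yet the
# total means are equal, so Form B is about 2 points harder. With an assumed
# Form A cut score of 70, Form B's cut score comes out 2 points lower.
cut_b = mean_equated_cut_score(cut_a=70,
                               common_mean_a=25, common_mean_b=27,
                               total_mean_a=75, total_mean_b=75)
print(cut_b)  # 68
```

In practice the shift would be scaled from the common-item set to the full test (for example, by the ratio of test lengths or via a Tucker linear equating), but the direction of the adjustment follows exactly this logic.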

 

Throughout the exam development process, there are many opportunities to create evidence for validity. Many of these opportunities exist to help assure fairness for candidates. Equating is one tool we use to help create exam forms that are fair across time and people. Without equating, candidates would not be held to the same passing standard, thus making the examination process less valid.

For more information on equating and exam form equivalence go to Equating Test Forms for Fairness or find out more about the process of Standard Setting.
