Professional Testing, Inc.
Providing High Quality Examination Programs

From the Item Bank

The Professional Testing Blog


The Short Answer – Why, When, and How to Use this Item Type

July 19, 2016

Overview

The Short Answer (SA) is a constructed response item type in which the examinee types a short response. Like the fill-in-the-blank (FIB) and essay item types, SA items prompt examinees to produce their responses by typing, rather than selecting from a list as in a multiple-choice (MC) item.

Some people use the terms SA and FIB interchangeably, while others use FIB to refer to the most discrete items, where the correct response is typically a single word or very short phrase, and reserve SA for items in which the correct response may be slightly longer. In both cases, the item is typically designed to be scored by the computer, using a keylist of correct responses prepared by the item writer. At the other end of the continuum, SA items are generally distinguished from essay items: the response to an SA item is typically much shorter than an essay response, and while SA items can be scored against keylists, essays cannot.
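The keylist scoring described above can be sketched in a few lines. This is a hypothetical illustration, not the implementation of any particular scoring engine; the item, keylist, and normalization rules are invented for the example.

```python
# Hypothetical sketch of keylist-based scoring for a Short Answer item.
# Real scoring engines apply their own normalization and matching policies.

def normalize(response: str) -> str:
    """Lowercase and collapse whitespace so trivial formatting differences don't matter."""
    return " ".join(response.lower().split())

def score_sa(response: str, keylist: set[str]) -> int:
    """Return 1 if the normalized response matches any keyed answer, else 0."""
    keyed = {normalize(key) for key in keylist}
    return 1 if normalize(response) in keyed else 0

# Invented example item: "What is the capital of France?"
keylist = {"Paris"}
print(score_sa("  PARIS ", keylist))  # 1
print(score_sa("London", keylist))    # 0
```

Even this toy version shows why the keylist matters: the scorer can only credit responses the item writer anticipated.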

When to Use

A major advantage of SA items is that they measure the test-taker’s recall of information, rather than mere recognition. A related advantage is that SA items measure content without cuing the correct answer or rewarding guessing. Additionally, in some content areas, such as mathematical computation and language translation, SA items can measure at the application cognitive level.

[Image: Short Answer item type summary]

Note: Adapted from Parshall, C.G., & Cadle, A. (2015, March). How to identify, develop, and implement innovative items. Presented at the annual meeting of ATP, Palm Springs, CA.


Issues to Consider

A frequent item writing challenge for the SA item is that it can be difficult to write stems that are clear yet avoid cuing. If the stem is not sufficiently focused, test-takers may find the item confusing; if it is overly focused, the item may be too easy or written at too low a cognitive level. For this reason, a typical weakness of SA items is that they are often written at the knowledge level.

Another item writing challenge is identifying a full keylist of correct responses. If test-takers are unclear about the question being posed by the SA item, they may respond very differently than anticipated. Even an examinee who understands the intended question and knows the correct response may phrase that response differently from any keyed answer. SA items must also address the likelihood that some examinee responses will include typos or misspellings.
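One common way to tolerate the typos and misspellings mentioned above is approximate string matching. The sketch below uses Python's standard-library difflib as a stand-in similarity measure; the threshold and keylist are illustrative assumptions, and operational programs would set their own matching policy.

```python
# Illustrative sketch: credit responses that nearly match a keyed answer.
# The 0.85 similarity threshold is an invented value for demonstration.
from difflib import SequenceMatcher

def fuzzy_match(response: str, key: str, threshold: float = 0.85) -> bool:
    """True if the response is close enough to a keyed answer to score as correct."""
    r = response.lower().strip()
    k = key.lower().strip()
    return SequenceMatcher(None, r, k).ratio() >= threshold

keylist = ["photosynthesis"]
print(any(fuzzy_match("fotosynthesis", key) for key in keylist))  # True
print(any(fuzzy_match("respiration", key) for key in keylist))    # False
```

A looser threshold catches more misspellings but risks crediting wrong answers, so the tolerance itself is a policy decision that pretest data can inform.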

Given these risks, a thorough analysis of any SA item at the pretest stage is recommended, in order to capture the full range of correct-response variants that examinees may provide. Successful automated scoring of an SA item depends on this complete keylist, as well as on the length of the correct response and the clarity of the stem.

For many exam programs there is one further issue to consider with the SA item type. If the SA is suitable for only a small number of items in the test blueprint, the testing organization should weigh whether its advantages outweigh the cost of informing stakeholders and instructing examinees in its use on the test.

Summary

The SA item type is ideal for a limited set of content areas, such as computation. It can also be used successfully in somewhat broader content categories, when careful development of the stem and thorough compilation of the keylist have been undertaken. In these instances, the SA item is worth including on an exam.

This post is part of the series “Alternative Item Types.”
