Friday, June 26, 2009

On the Assessment of Writing

One of the topics being considered in many states is how best to assess students' writing skills. Using multiple-choice items to assess students' writing ability has become more popular in recent years. Among states where Galileo K-12 Online is currently used, California and Massachusetts both use multiple-choice items to assess some aspects of writing. Arizona is reportedly adding multiple-choice writing items to the AIMS in the next round of pilot testing, and we expect those items to support the revised Arizona English Language Arts standards, which will be adopted in 2010-2011.

It is not surprising that multiple-choice holds a certain appeal for those wishing to assess writing. Multiple-choice items take less time away from instruction, can be scored using automated procedures such as those available to users of Galileo K-12 Online, and are scored consistently because each item has a single correct answer rather than relying on evaluators to score to a rubric. These advantages make multiple-choice a compelling option, but other considerations limit its usefulness and effectiveness in the assessment of writing. Thomas M. Haladyna explains these limits in Developing and Validating Multiple-Choice Test Items:

The most direct measure would be a performance-based writing prompt. MC items might measure knowledge of writing or knowledge of writing skills, but they would not provide a direct measure (p. 11).

Therefore a crucial concern is the logical connection between item formats and desired interpretations. For instance, an MC test of writing skills would have low fidelity to actual writing. A writing sample would have much higher fidelity (p. 12).

To assess writing directly, it is necessary to pair a standardized rubric with a writing prompt that allows students to respond in a way that accurately represents their ability to compose, convey, and communicate for the designated purpose of a text, drawing on the relevant information they possess about the topic.

While multiple-choice reading items addressing an analysis standard may not require the student to compose a full analytical response, they do require the student to apply the same analytical processes to identify the correct analysis among the distractors provided. Writing is different: the ability to identify the best compositional example does not accurately reflect the skills and abilities inherent in good writing, because recognizing persuasive, informative, or expressive quality does not demonstrate that a student can produce written content of the same quality.

Galileo provides content for writing assessments built on prompts, the most authentic measure of student writing, while also covering writing knowledge and skills with multiple-choice items that supply data for measuring basic skills and support reliability in predicting standardized test performance.

Text Referenced

Haladyna, T.M. (2004). Developing and Validating Multiple-Choice Test Items (3rd ed.). Mahwah, N.J.: Lawrence Erlbaum Associates.
