Monday, September 28, 2009

NEW Galileo Teacher Dashboard

As districts near the end of the first quarter or semester, many will also administer benchmark assessments. Once the results have been scored in Galileo, teachers should become familiar with an exciting new enhancement called the Teacher Dashboard. The Teacher Dashboard page gives teachers one place to track recent and upcoming events, such as assessments and dialogs for their classroom, as well as test results from both benchmark and formative assessments. One of the new reports included on this screen is the Intervention Alert. This report displays the percentage of students who have met the standard on each standard/performance objective from a particular test, and it gives teachers access to Quiz Builder and intervention assignments directly from the results page.
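The core calculation behind an alert like this, the percentage of students meeting the standard on each performance objective, can be sketched in a few lines. The data layout and the 70% mastery cutoff below are illustrative assumptions for the example, not Galileo's actual schema or scoring rules:

```python
# Illustrative sketch: percent of students meeting each standard.
# The data layout and the 70% cutoff are assumptions for this example,
# not Galileo's actual implementation.

def percent_meeting_standard(scores, cutoff=0.70):
    """scores: {standard: [fraction correct per student]} -> {standard: % met}."""
    results = {}
    for standard, student_scores in scores.items():
        met = sum(1 for s in student_scores if s >= cutoff)
        results[standard] = round(100.0 * met / len(student_scores), 1)
    return results

benchmark = {
    "M.2.1 Fractions": [0.85, 0.60, 0.75, 0.90],
    "M.2.2 Decimals": [0.55, 0.65, 0.80, 0.50],
}
print(percent_meeting_standard(benchmark))
# A low percentage (here, 25.0% for Decimals) flags a candidate for intervention.
```

A report built on this kind of summary is what lets a teacher move straight from "which objectives are weak" to assigning intervention work.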

Wednesday, September 23, 2009

Reaching for Precision in the Imperfect World of Multiple Choice Items – It Begins with Item Specification Right from the Start

Multiple-choice questions are the most widely used approach to assessing student mastery of standards. In fact, they represent the largest proportion of item types on state-wide tests measuring standards mastery, on national tests measuring educational progress, on college entrance exams, and on benchmark and formative assessments utilized as part of local school district assessment programs. The information gleaned on student learning from this broad array of assessments is used by a diversity of educational decision-makers to accomplish a wide range of educational goals.

These goals may include: 1) re-teaching and enrichment intervention for a student, group, or class; 2) school- or district-wide proactive educational planning to help promote student mastery of standards throughout the school year; and 3) comprehensive strategic planning initiatives that may substantially alter the kinds of programs, policies, and practices implemented within a school district or throughout an entire state.

Given the extent to which assessment data can and does drive educational decision-making, it is imperative that the construction, review, certification, and empirical validation of the multiple-choice items included on these assessments be highly precise.

A typical multiple-choice item has three parts: a stem that presents a problem, the correct answer, and several distractors. These items can be constructed to assess a variety of learning outcomes, from simple recall of facts to highly complex cognitive skills.
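The three-part structure described above can be modeled directly. This minimal sketch uses field names invented for illustration (not ATI's item-bank schema) and enforces two basic well-formedness checks:

```python
from dataclasses import dataclass

@dataclass
class MultipleChoiceItem:
    """Minimal model of a multiple-choice item: stem, key, distractors.
    Field names here are illustrative, not ATI's actual item-bank schema."""
    stem: str
    correct_answer: str
    distractors: list

    def __post_init__(self):
        # Basic well-formedness: at least one distractor, and the key
        # must not accidentally appear among the distractors.
        if not self.distractors:
            raise ValueError("item needs at least one distractor")
        if self.correct_answer in self.distractors:
            raise ValueError("key must not duplicate a distractor")

    def options(self):
        """All answer choices; a real test assembler would shuffle these."""
        return [self.correct_answer] + self.distractors

item = MultipleChoiceItem(
    stem="Which fraction is equivalent to 0.5?",
    correct_answer="1/2",
    distractors=["1/3", "2/3", "5/2"],
)
print(item.options())  # ['1/2', '1/3', '2/3', '5/2']
```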

Regardless of the learning outcome being assessed, precision in the item writing process begins with designing the item so that its focus is on the specification (i.e., the construct) being measured. In the ideal world of assessment, students who correctly answer an item built in this fashion are assumed to do so because they have mastered the principle or construct being assessed. Of course, as we all know, the real world of assessment is an imperfect one in which measurement is not always precise. Measurement error and guessing, for example, are a permanent part of that real world.

That being said, we can take considerable steps right from the start to foster a high level of precision in item development, even in this imperfect world of assessment. For example, when new items are to be added to the ATI item banks for use in Galileo K-12, the first step is to review the standard that is to be assessed. The standard is broken down into the skills that make it up, and these skills become the starting point for developing an online list of item specifications defining the characteristics of the particular class of items to be written.

Item specifications indicate the defining characteristics of the item class, the rationale for the class, and the required characteristics for each item component. In the case of multiple-choice items, the required characteristics of the stem and the alternatives are specified. Specifications address such factors as the cognitive complexity intended for items in the class, the appropriateness of vocabulary, requirements related to readability levels, and the alignment of the item with standards. Creating specifications to guide item development is recognized as a critical part of documenting that assessments are reliable and valid indicators of the ability they are intended to measure (Haladyna, 2004). Their structure and specificity also afford many advantages for ensuring that assessments can be readily adapted as district needs and/or state/federal requirements change.
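Several of the specification factors listed above lend themselves to automated screening during item review. As a purely illustrative sketch, the rules and thresholds below are invented for the example; real specifications also cover cognitive complexity, alignment, and much more:

```python
# Illustrative screening of a draft item against a written specification.
# The spec fields and limits are invented for this example; actual item
# specifications are far richer (cognitive complexity, alignment, etc.).

def screen_item(stem, alternatives, spec):
    """Return a list of specification violations for a draft item."""
    violations = []
    words = stem.split()
    if len(words) > spec["max_stem_words"]:
        violations.append("stem exceeds word limit")
    for word in words:
        if word.lower().strip(".,?") in spec["banned_vocabulary"]:
            violations.append(f"vocabulary above grade level: {word}")
    if len(alternatives) != spec["alternative_count"]:
        violations.append("wrong number of alternatives")
    return violations

spec = {
    "max_stem_words": 25,
    "banned_vocabulary": {"ameliorate"},
    "alternative_count": 4,
}
print(screen_item("Which fraction is equivalent to 0.5?",
                  ["1/2", "1/3", "2/3", "5/2"], spec))  # [] -- passes screening
```

Automated checks like these complement, rather than replace, the human review and certification stages described below.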

Extensive information about the ATI item specification process, as well as the multi-stage item construction, review, and certification procedures used by ATI, can be accessed by contacting us directly.

We would, of course, like to hear from you as well. For example, what kinds of challenges have you faced in developing items within your district or for your classroom? And what kinds of solutions or procedures have you implemented to help enhance the precision with which locally developed test items are developed, reviewed, and empirically validated?

Reference: Haladyna, T.M. (2004). Developing and Validating Multiple-Choice Test Items (3rd ed.). Mahwah, N.J.: Lawrence Erlbaum Associates.

Saturday, September 12, 2009

The Galileo Data Import Process

Many schools are welcoming students back from summer break. With returning students comes a new Galileo program year, accompanied by new class lists and rosters.

For those new to the process, the best way to create your class lists and rosters is through the Galileo data import process. Districts provide an export from their Student Information System (SIS) listing all classes, teachers, and students in the district; once the file has passed quality assurance, ATI staff import the data directly into Galileo K-12 Online. Instructions for the import process can be found in the Tech Support section of Galileo K-12 (and Preschool) Online, as well as at the following links:



As you prepare your 2009-2010 program year data for import, please remember the following important points:

1) Be sure to include all required information in your import.

2) Optional information is not required for the import to succeed, but omitting it may adversely affect filtering in reports.

3) If TeacherID or StudentID fields change within your SIS, please notify ATI prior to providing any import files to ensure proper transition within the Galileo database.
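The points above amount to a pre-flight check on the export file before it is sent. As a minimal sketch, assuming a CSV export and hypothetical column names (the actual required fields are listed in the Tech Support instructions):

```python
import csv
import io

# Hypothetical required columns for illustration only; consult the
# Tech Support instructions for the actual fields the import requires.
REQUIRED = {"TeacherID", "StudentID", "ClassName"}

def check_export(csv_text):
    """Return the required column names missing from an SIS export."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return sorted(REQUIRED - set(reader.fieldnames or []))

sample = "TeacherID,ClassName,StudentLastName\nT01,Math 7,Smith\n"
print(check_export(sample))  # ['StudentID'] -- fix the export before sending
```

Catching a missing column locally is much faster than waiting for a file to bounce during quality assurance.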

Please refer to the links above for more details about the import process.