Friday, June 26, 2009

On the Assessment of Writing

One of the topics under consideration in many states is how best to assess students' writing skills. In recent years, the use of multiple-choice items to assess writing has become more popular. Among states where Galileo K-12 Online is currently used, California and Massachusetts both use multiple-choice items to assess some aspects of writing. Arizona is reportedly adding multiple-choice writing items to the AIMS in the next round of pilot testing, and we expect those items to support the revised Arizona English Language Arts standards, which will be adopted in 2010-2011.

It is not surprising that multiple-choice holds a certain appeal for those wishing to assess writing. Multiple-choice items take less time away from instruction, can be scored using automated procedures such as those available to users of Galileo K-12 Online, and are scored consistently because each item has a single correct answer rather than relying on evaluators to score against a rubric. These advantages make multiple-choice a compelling option, but other considerations limit its usefulness and effectiveness in the assessment of writing. Thomas M. Haladyna explains these limits in Developing and Validating Multiple-Choice Test Items:

The most direct measure would be a performance-based writing prompt. MC items might measure knowledge of writing or knowledge of writing skills, but they would not provide a direct measure (p.11).

Therefore a crucial concern is the logical connection between item formats and desired interpretations. For instance, an MC test of writing skills would have low fidelity to actual writing. A writing sample would have much higher fidelity (p.12).

To assess writing directly, it is necessary to apply a standardized rubric to responses to a writing prompt, one that allows students to demonstrate their ability to compose, convey, and communicate in a way that fulfills the designated purpose of the text and draws on the relevant information they possess about the topic.

Multiple-choice reading items addressing an analysis standard may not require the student to compose a full analytical response, but they do require the student to apply the same analytical processes to identify the correct analysis among the distractors provided. The ability to identify the best compositional example, however, does not accurately reflect the skills and abilities inherent in good writing: recognizing persuasive, informative, or expressive quality does not demonstrate that the student can create written content of the same quality.

Galileo provides content for writing assessments built around prompts, the most authentic measure of student writing, while also covering writing knowledge and skills with multiple-choice items that supply data for measuring basic skills and for reliably forecasting performance on standardized tests.

Text Referenced

Haladyna, T.M. (2004). Developing and Validating Multiple-Choice Test Items (3rd ed.). Mahwah, N.J.: Lawrence Erlbaum Associates.

Thursday, June 18, 2009

Care must be taken when administering benchmark assessments to subsets of students or to students from multiple grade levels

Galileo K-12 Online benchmark assessments serve two functions simultaneously. One is to provide teachers with timely feedback regarding which standards their students have and have not mastered. The other is to forecast the students' likely performance on the high-stakes statewide assessment, such as AIMS in Arizona or MCAS in Massachusetts. Both of these functions are equally important, and in most cases both goals are achieved in harmony by a single benchmark assessment. However, there are some cases where the two goals are in conflict. In today's post, I want to alert district administrators to a potential problem and to give them a way to avoid it when planning benchmark assessments.

In the typical scenario, a benchmark assessment is given to all students in the district in a given grade level. For example, all fifth-graders in the district might take a fifth-grade math benchmark assessment. It is expected that all of these students will also take the fifth-grade math high-stakes statewide assessment. This is important because the benchmark assessment must be aligned to the statewide assessment in order to generate cut scores for performance levels and to forecast student performance on the statewide assessment. If the same set of students is expected to take both the benchmark assessment and the statewide assessment, then the comparison between the two assessments is essentially a comparison of apples to apples, and all is well. The cut scores that are calculated for the benchmark assessment should provide accurate forecasts of student performance on the statewide assessment and, in fact, the accuracy rate for Galileo K-12 Online benchmark assessments is quite high (see the Galileo K-12 Online Technical Manual).

There are cases, however, where the set of students taking a benchmark assessment is not the same as the set that will be taking the statewide assessment. In these cases, the calculation of accurate cut scores for benchmark assessments becomes more complicated. A common scenario is one in which advanced 8th-graders are taking a high school algebra course and, quite reasonably, they take the high school math benchmark assessments instead of the 8th grade math benchmark assessments. This makes perfect sense for the first goal of benchmark assessments: providing feedback to teachers regarding student mastery of state standards. It does, however, create problems for the goal of forecasting student performance on the statewide assessment. In most cases these students will be taking the 8th grade statewide assessment, and not the high school statewide assessment, and so the comparison when calculating cut scores becomes one of apples to oranges.

In order to calculate accurate cut scores for the high school math benchmark assessment in the above scenario, the scores from the 8th grade students must be removed from the data set, so that the set of students on the benchmark assessment will be the same as the set of students who will be taking the high school statewide assessment. Additionally, care must be taken when calculating the cut scores for the 8th grade math benchmark assessment. This is because a specific region of the student distribution, the advanced students, will not be present in the distribution of scores for the 8th grade benchmark. If no adjustment is made to account for the absence of the advanced students, then the cut scores that are calculated will be too low, and too many students will be classified as being likely to pass the statewide assessment. This, of course, will result in rude surprises when the statewide assessment results come in.
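To see why the cut scores drift downward, consider a simplified, percentile-based illustration. This is only a sketch for intuition, not ATI's actual cut-score methodology, and the score distribution, pass rate, and share of advanced students below are invented for the example.

```python
# Simplified illustration (not ATI's actual cut-score procedure): when the
# highest-scoring students are missing from the benchmark sample, a cut
# score set at a fixed percentile of that sample drifts downward.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical benchmark scores for all 8th graders in a district.
all_scores = rng.normal(loc=500, scale=100, size=2000)

# Suppose the top 10% take the high school benchmark instead, so they are
# absent from the 8th grade benchmark data.
advanced_cutoff = np.percentile(all_scores, 90)
remaining_scores = all_scores[all_scores < advanced_cutoff]

# Suppose 40% of all students fail the statewide test, so the "likely to
# pass" cut is placed at the 40th percentile of the benchmark distribution.
target_pct = 40
cut_full = np.percentile(all_scores, target_pct)             # apples to apples
cut_truncated = np.percentile(remaining_scores, target_pct)  # too low

print(f"Cut score from the full population:      {cut_full:.1f}")
print(f"Cut score with advanced students absent: {cut_truncated:.1f}")

# Applying the too-low cut score classifies too many students as likely
# to pass the statewide assessment.
print(f"Share above full-population cut:  {np.mean(all_scores >= cut_full):.1%}")
print(f"Share above truncated-sample cut: {np.mean(all_scores >= cut_truncated):.1%}")
```

Running a sketch like this shows the truncated-sample cut falling below the full-population cut, which produces exactly the kind of over-optimistic forecast described above.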

The take-home message, then, is to be clear about who will be taking benchmark assessments when you are planning them. Steps can be taken in cases such as the one described here to make sure that the cut scores on benchmark assessments are accurate, but only if ATI knows about the unusual circumstances in advance. If you are designing benchmark assessments in Galileo K-12 Online and there will be any out-of-grade testing, or if the set of students taking a benchmark assessment will not be the same as the set taking a particular statewide assessment, please let your Field Services or Educational Management Services representative know right away. Forearmed with as much information as possible, ATI can work with your district to make sure that the benchmark assessments provide accurate forecasts of student performance on statewide assessments as well as timely feedback to classroom teachers regarding the mastery of standards.

Wednesday, June 3, 2009

Help for Math Teachers

The purpose of this thread is to provide information and a place for math teachers to converse with each other about specific state standards, including both interpretations of the state-provided language and ideas about how to teach these standards to students.

Please comment on posts or add new posts with questions, ideas, and answers about how to teach math standards.

High School: Post #1

AZ-MCW-S3C4-PO10. Determine an effective retirement savings plan to meet personal financial goals including IRAs, ROTH accounts, and annuities.

AZ provided connection: MCWR-S5C2-09. Use mathematical models to represent and analyze personal and professional situations.

AZ provided explanation: An IRA is an “Individual Retirement Account,” and a ROTH is a specific type of IRA, with a more complex tax-advantaged structure.

I have searched for formulas and for information about how to figure returns and advantages, and how to determine how much to invest in order to reach a retirement goal, but I have found only calculators, not the formulas needed to work out the answers mathematically.

What materials or formulas do you plan to use to teach students how to figure this information?
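As one starting point (my own suggestion rather than an official interpretation of the standard), the standard future value of an ordinary annuity, FV = P((1 + r)^n - 1)/r, can be rearranged to give the regular contribution P needed to reach a goal. The goal amount, rate of return, and time horizon below are made-up examples:

```python
# A small sketch using the standard ordinary-annuity formula:
#     FV = P * ((1 + r)**n - 1) / r
# where P is the regular contribution, r the periodic rate, and n the
# number of periods. Solving for P gives the contribution needed to
# reach a retirement goal.

def contribution_for_goal(goal, annual_rate, years, periods_per_year=12):
    """Regular contribution needed to reach `goal`, assuming a constant
    rate of return and contributions at the end of each period."""
    r = annual_rate / periods_per_year   # periodic interest rate
    n = years * periods_per_year         # total number of contributions
    return goal * r / ((1 + r) ** n - 1)

# Example: reach $500,000 in 30 years at an assumed 6% annual return.
monthly = contribution_for_goal(500_000, 0.06, 30)
print(f"Required monthly contribution: ${monthly:,.2f}")   # roughly $498
```

Differences between a traditional IRA, a ROTH account, and an annuity would then show up in how taxes and fees apply to these amounts, which could be layered onto the same basic formula.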

Middle School: Post #1

AZ-M06-S2C4-01. Investigate properties of vertex-edge graphs
· Hamilton paths,
· Hamilton circuits, and
· shortest route.

How do you teach students to check their answers on the vertex-edge graph items?

How do you know if you found all possible paths on a vertex-edge graph?


AZ provided explanation: A Hamilton path in a vertex-edge graph is a path that starts at some vertex in the graph and visits every other vertex of the graph exactly once. Edges along this path may be repeated. A Hamilton circuit is a Hamilton path that ends at the starting vertex. The shortest route may or may not be a Hamilton path. Depending upon the constraints of a problem, each vertex may not need to be visited.
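One way to help students check their answers on small graphs is to enumerate every Hamilton path systematically so that none are missed. The sketch below is my own illustration (not part of the Arizona materials) and uses simple backtracking on a made-up four-vertex graph:

```python
# Enumerate every Hamilton path in a small vertex-edge graph by
# backtracking, so students can verify that they found them all.

def hamilton_paths(graph):
    """Return all Hamilton paths in `graph`, given as an adjacency dict."""
    paths = []

    def extend(path):
        if len(path) == len(graph):       # every vertex visited exactly once
            paths.append(path)
            return
        for neighbor in graph[path[-1]]:
            if neighbor not in path:      # vertices may not be revisited
                extend(path + [neighbor])

    for start in graph:                   # try every possible starting vertex
        extend([start])
    return paths

# Example: a square A-B-C-D with one diagonal A-C.
graph = {
    "A": ["B", "C", "D"],
    "B": ["A", "C"],
    "C": ["A", "B", "D"],
    "D": ["A", "C"],
}

for path in hamilton_paths(graph):
    # A Hamilton circuit is a Hamilton path whose final vertex
    # connects back to the starting vertex.
    is_circuit = path[0] in graph[path[-1]]
    print(" -> ".join(path), "(can close into a circuit)" if is_circuit else "")
```

Listing the paths this way also speaks to the second question above: a student knows all possible paths have been found when the systematic search has tried every branch.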
Elementary School: Post #1

AZ-M02-S5C2-03. Select from a variety of problem-solving strategies and use one or more strategies to arrive at a solution.

What problem-solving strategies do you think are appropriate to teach primary students?

Which problem-solving strategies are your students' favorites?



Monday, June 1, 2009

Share Your Lessons With Others

Have you created a lesson that you are incredibly proud of? Do you wish there was an easier way to let your colleagues access the lesson to use with their students? With Galileo, sharing is easy. In order to share your content, you will want to attach it to a Dialog. Don't worry; you needn't recreate your lesson in a Dialog. We recommend that you do the following:

  1. Link your Dialog to state standards. Most of your colleagues will search for lessons based on standards.
  2. Give your Dialog a title and add any notes that will be relevant to other users.
  3. Add a description. The words you place in the description box will be searchable by other users once you share your lesson. Examples of keywords could include: emerging language learners, hand-held responders, or teacher-facilitated.
  4. Attach the lesson as a resource.
  5. Automatically generate a follow-up quiz. This is optional and only necessary if you'd like to use Galileo's Formative Test Reports to evaluate students' learning of the lesson.
  6. Publish your lesson.

Once your lesson is published, you can share it in two ways. When your Dialog is published, you will see a Share Dialog button. Click this button to add your lesson to the community bank. Sharing your Dialog to the community bank allows Galileo users in your district and in other districts to see your Dialog when searching, and they can schedule and use it with students. If you would prefer to share your content only with colleagues in your district, that is possible as well. You will just need to give your colleagues access to your Dialog Library or copy your Dialog into their libraries. ATI will be more than happy to show you how this is done. For more information on sharing your lessons, e-mail ATI's Professional Development staff at professionaldevelopment@ati-online.com for assistance, or call us at 1-800-367-4762 ext. 132.