Thursday, April 30, 2009
Rest assured, your participation in the Galileo K-12 Online and Instructional Dialogs trial offer will not put you in this awkward position. As part of the trial, ATI provides the same level of support we provide to all Galileo clients. ATI support offerings include extensive Galileo online help files, website form submission for general and tech-related questions, email support via email@example.com, and telephone help available at (800) 367-4762 x130.
Support is available for computer- and browser-related issues encountered within Galileo K-12 Online, setup and use of supported eInstruction and Promethean response pads, and general Galileo K-12 Online and Instructional Dialog use.
Monday, April 20, 2009
The Instructional Dialogs created for the Educational Interventions Forum were assembled into customized sets for each of the states involved in the forum. The intent was to offer participants an easy-to-use set of interventions, suitable either for a quick review or for introducing, with a fresh approach, standards likely to appear on the end-of-year state test. Now, as state testing wraps up in some states and is about to begin in others, there is still an opportunity to use the Instructional Dialogs. The combination of instructional content and the immediate feedback provided by the optional test attached to each dialog will allow you to evaluate areas that may still be essential to completing the courses students are finishing this semester.
Accessing the available Instructional Dialogs from the trial offer provides a quick, simple learning experience focused on one of these important standards. The trial dialogs also let teachers familiarize themselves with the types of activities available through the Galileo Instructional Dialog banks. Districts that already subscribe to Galileo for their educational management and assessment needs can sample these Instructional Dialogs and then move on to the more than 800 dialogs written for math and English. For districts considering Galileo, the dialogs provide an indication of the types of instructional content that Assessment Technology is committed to providing through our continuous expansion of the Instructional Dialogs.
To access the trial dialogs, please contact the Field Services department at Assessment Technology, Inc. (520) 323-9033 or 1-800-367-4762.
Thursday, April 2, 2009
The Aggregate Multitest report can be run in two modes and the choice is made by selecting either the Display risk levels or the Display benchmark performance levels radio button. When the Aggregate Multitest report is run with the benchmark performance levels option, it generates an independent analysis of student performance for each of the selected assessments. Based on their performance on each benchmark assessment, students are categorized into the same classification system that the statewide assessment uses, such as the FAME scale in Arizona or the Advanced, Proficient, Needs Improvement, or Warning categories in Massachusetts. What you’ll see below the bar graph is a display that shows the percent of students in each category, such as this:
That’s good news, right? Well, maybe. Probably. Our research has indicated that Galileo benchmark assessments are very good at forecasting likely student performance on the statewide assessment. But our research also indicates that considering student performance across multiple benchmark assessments yields even more accurate forecasts of student performance on statewide assessments than considering performance on any single benchmark assessment in isolation. This is true even when the one, isolated benchmark assessment is the one administered most recently before statewide testing begins, as would be the case with benchmark #3 in this example. Details of these investigations can be found in the Galileo K-12 Online Technical Manual.
To capture the increased accuracy of data from multiple benchmark assessments, the Galileo K-12 Online Risk Assessment report was developed. The Risk Assessment report is accessed via the second mode for generating the Aggregate Multitest report, by selecting the Display risk levels radio button. The Risk Assessment report provides the same information Kerridan referred to in her recent post (How Can Galileo Assist in Interventions?), except that the data can be aggregated at the school or district level as well as the classroom level, and it yields a display that looks like this:
Students are classified into the different levels of risk according to their performance on a series of benchmark assessments. This example refers to the same three 4th grade math assessments that were considered earlier. Students are classified as being “On Course” if they scored above the cut score for “meets” (or “proficient” in many other states) on all three benchmark assessments. If they fell below that cut score for one assessment, they are classified as being at “Low Risk”. If they fell below on two of the three assessments, they are at “Moderate Risk”, and students who never scored above the critical cut score are classified as being at “High Risk”. Kerridan’s blog illustrates how to use this information to plan interventions.
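The classification rules above amount to counting how many benchmark assessments each student fell below the cut score on. Here is a minimal sketch of that logic; the cut score and the student score histories are hypothetical illustrations, not actual Galileo values:

```python
# Risk categories as described in the post, ordered by the number of
# benchmark assessments on which the student fell below the cut score.
RISK_LEVELS = ["On Course", "Low Risk", "Moderate Risk", "High Risk"]

def classify_risk(scores, cut_score):
    """Classify a student based on how many of their benchmark scores
    fell below the 'meets'/'proficient' cut score (three assessments)."""
    misses = sum(1 for s in scores if s < cut_score)
    return RISK_LEVELS[misses]

# Hypothetical cut score and three-benchmark score histories.
cut = 500
print(classify_risk([520, 540, 510], cut))  # On Course
print(classify_risk([480, 540, 510], cut))  # Low Risk
print(classify_risk([480, 470, 510], cut))  # Moderate Risk
print(classify_risk([480, 470, 460], cut))  # High Risk
```

With three assessments, the miss count runs from 0 to 3 and maps directly onto the four categories, which is why the list lookup works.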
This method of projecting student risk of not meeting the standard on the statewide assessment has proven to be very accurate. On average, 96% of students who are classified as being “On Course” after three benchmark assessments go on to demonstrate mastery on the statewide assessment (see the Technical Manual). This is the most accurate approach to projecting student risk for the purposes of identifying students for intervention efforts. It is also the most accurate information a district could have when assessing risk with regard to AMOs. However, because its primary function is to identify groups of students for intervention efforts, its format may not be the most convenient for looking toward AMOs. In this case, only 52% of the students are on course, which is well below the AMO of 66%. But a district can count on a number of students from each of the other risk categories to pass the statewide assessment as well. We have conducted a preliminary investigation, based on 7 large school districts, to see how many students in each risk level category tend to go on to demonstrate mastery on the statewide assessment. The results are presented in the following table.
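The projection described above can be sketched as a weighted sum: multiply each risk category's student count by that category's expected pass rate, then divide by the total. The per-category pass rates below (other than the 96% "On Course" figure from the post) and the student counts are hypothetical placeholders, not the figures from ATI's investigation:

```python
def projected_pass_rate(counts, pass_rates):
    """Weight each risk category's student count by its expected pass
    rate and return the overall projected passing percentage."""
    total = sum(counts.values())
    expected = sum(counts[c] * pass_rates[c] for c in counts)
    return 100.0 * expected / total

# Hypothetical cohort of 100 students (52% on course, as in the example).
counts = {"On Course": 52, "Low Risk": 20, "Moderate Risk": 16, "High Risk": 12}

# Pass rates per category: 0.96 is from the post; the rest are
# ILLUSTRATIVE GUESSES standing in for ATI's empirical table.
rates = {"On Course": 0.96, "Low Risk": 0.70, "Moderate Risk": 0.40, "High Risk": 0.15}

print(round(projected_pass_rate(counts, rates), 1))  # 72.1
```

Under these placeholder rates, a cohort with only 52% of students on course would still project above a 66% AMO, which illustrates why counting expected passers from every category matters.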