Many of us who are involved in education today have heard about the value-added approach to analyzing assessment data coming from a classroom. In a nutshell, this is a method of analyzing test results that attempts to answer whether something in the classroom, typically the teacher, contributes to student growth above and beyond what would otherwise be expected. The approach made its way onto the educational main stage in Tennessee as the Tennessee Value-Added Assessment System (TVAAS).
This rather dry topic would likely not have become well known outside the ivory tower were it not for the growing question of merit pay for educators. One of my statistics professors used to love to say that “statistics isn’t a topic for polite conversation.” The introduction of pay into the conversation definitely casts it in a different light. In NYC, consideration is being given to using value-added analyses in tenure decisions for principals. Value-added models have been used to determine teacher bonus pay in Tennessee and Texas. Michelle Rhee has argued for using a value-added approach to evaluate teacher performance in the DC school system. One might say it is all the rage, both for the size of the spotlight shining its way and for the emotion that its use for this purpose has brought forth.
I will not be using this post to venture into the turbulent waters of who should and should not be paid based on results. I’ll leave it to others to opine on that very difficult and complicated question. My purpose here is to introduce the idea that the types of questions one asks from a value-added perspective, the mindset if you will, can greatly inform instructional decision making through creative application. The thoughts I write about here are not intended to say that current applications of the value-added approach are wrong or misguided. I intend only to offer a different twist for everyone’s consideration.
The fundamental question in the value-added mindset is whether something that has been added to the classroom positively impacts student learning above and beyond the status quo. One could easily ask this question of new instructional strategies introduced to the classroom that are intended to teach a certain skill. For instance, one might evaluate a new instructional activity designed to teach finding the lowest common denominator of two fractions. Given the limited scope of the activity, this evaluation could be conducted efficiently in a very short time by administering a few test questions. This sort of evaluation provides the sort of data that can be used immediately to guide instruction. If the activity is successful, then teachers can move on to the next topic. If it is unsuccessful, then a new approach may be tried. The immediacy of the results puts one in a position to make decisions informed by data without having to wait for the semester or the year to end.
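To make the short-cycle idea concrete, here is a minimal sketch of what such a quick evaluation might look like as a calculation. The scores and class size are entirely hypothetical, and the simple average-gain metric is just one of many ways a teacher might summarize a few pre- and post-activity quiz questions:

```python
# A hypothetical short-cycle evaluation: compare the same students'
# scores on a few quiz items before and after an instructional activity.

def average_gain(pre, post):
    """Mean per-student gain from pre-test to post-test.

    Assumes `pre` and `post` list the same students in the same order.
    """
    gains = [after - before for before, after in zip(pre, post)]
    return sum(gains) / len(gains)

# Made-up scores (out of 5) for one class of eight students
pre_scores = [2, 3, 1, 2, 4, 2, 3, 1]
post_scores = [4, 4, 3, 3, 5, 3, 4, 2]

gain = average_gain(pre_scores, post_scores)
print(f"Average gain: {gain:.2f} points")  # prints "Average gain: 1.25 points"
```

A teacher could run this kind of tally the same day the activity is taught and decide right away whether to move on or try a different approach; a more careful analysis might also compare against a similar class that did not receive the activity.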
Conducting short-term, small-scale evaluations differs from the typical approach in value-added analysis, which is concerned with impact over a long period. The question of long-term impact could easily be asked of a collection of instructional activities or lessons. In an earlier post, Christine Burnham discusses some of the ways that impact over time could be tested.
As always, we look forward to hearing your thoughts about these issues.