Monday, July 27, 2009

Attending Conferences

Attending conferences can be overwhelming. When you visit a booth, be sure to take home any pertinent literature, samples, or other materials the vendor offers. By the time you get home, all the amazing products you’ve seen over the last day or so can blur together, so you may want to jot down a note or two at each booth about what the product offers. Those notes can be a great way to jog your memory later on.

Monday, July 20, 2009

Ah, “Those Lazy, Crazy, Hazy Days of Summer” ... No data needed for now, but do you want it and will you have it in the fall?

The summer of 2009 is now in full swing with students, parents, teachers, and administrators enjoying a well-earned vacation from a very exciting, busy, and oftentimes challenging school year.

During this past year, as a result of the American Recovery and Reinvestment Act (ARRA) of 2009, we have seen a remarkable array of new policies and reforms in K-12 education, backed by an unprecedented reinvestment in education by the federal government. While this might not be among the “hot topics” of discussion around the summer lemonade stand or by the poolside, it has certainly been on most people’s radar and in the news almost daily these past few months.

Suffice it to say, the immediate impact of the Act has been to create a new storyline and a new debate among educators, researchers, policy-makers, and the public about the future of our nation’s educational system. Certainly not the stuff of summertime fun, but undoubtedly a topic that will pick up momentum again as the 2009-2010 school year approaches. If, however, you find yourself yearning for a summer thirst-quencher on this topic, then consider the following.

One of the major goals of the ARRA is to improve our nation’s education system and enhance student learning through the increased use of technology innovations and actionable data to help inform educational decision-making.

To accomplish this goal, two key types of data are needed. The first is data on student mastery of state standards, obtained not only through end-of-year statewide tests but, more importantly, through continuous data on student learning and mastery of standards that can be used in “real time” to inform instruction and intervention decision-making. Consequently, the technology innovations represented in the new generation of online educational management systems must have the capacity to provide local school districts with an integrated array of locally customized assessment tools aligned with the district’s overall educational plan (e.g., pacing guide) for the year. To the extent that student learning and progress can be captured in this fashion, the second type of data – data documenting the impact of interventions on student learning and standards mastery – becomes a reality.

The paramount and practical importance of empowering local school districts to implement an online educational management system that provides data in this way should not be underestimated. Rapid access to reliable data on student learning - where the student is, and what needs to be planned next for progress to continue - is a key element in planning effective learning opportunities and helping students meet the educational challenges of the 21st century.

It is perhaps stating the obvious to say that the importance of the data lies not in gathering and reporting it, or in simply answering a question, but in enabling positive action in the best interests of students to occur in a timely and purposeful fashion.

As stated by Pennsylvania Gov. Ed Rendell, chairman of the National Governors Association, at the March 2009 forum, Leveraging the Power of Data to Improve Education, “…even the best data collection system is worthless if it does not change what goes on in the classroom."

A few of my friends have wondered about this issue. Why collect data, or for that matter use all this sophisticated technology, if it does not really change what is going on? Then there are my other friends who point out that access to the technology and to the data is not supposed to change things by itself, but rather to make change possible. It’s an interesting debate, and I can see valid arguments on both sides.

What do you think? Let us know and, in the meantime, enjoy those “Lazy, Crazy, Hazy Days of Summer.”

Saturday, July 11, 2009

Counting the Mountains and the Lakes: Quantile Regression and NCLB

I am sure that the title of this post sounds a bit odd. Let me explain...

A statistics book that I was recently reading opens its preface with a quote from Francis Galton, in which he teased some of his colleagues for always falling back on averages to the exclusion of other analytic approaches, thereby missing much of what could be discovered. Galton chided that they were much the same as a resident of “flat English counties, whose retrospect of Switzerland was that, if its mountains could be thrown into its lakes, two nuisances would be got rid of at once” (Natural Inheritance). The author of this statistics book (which can be seen here) then proceeds to describe a statistical technique called quantile regression, which provides a means to examine some of the “mountains and lakes” that might be found in data by those willing to look beyond averages. I’ll get back to this procedure in a bit. Don’t worry… I won’t bore you with its inner workings. One of my colleagues here is rather fond of pointing out that statistics isn’t a topic for polite conversation. Rather, I will try to talk a bit about the sorts of real-life questions quantile regression is being used to answer. Some of these questions concern new ways to look at student growth within the context of NCLB.
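To make the contrast concrete, here is a minimal sketch in Python of the difference between fitting the conditional mean and fitting several conditional quantiles. The data are simulated and the statsmodels library is assumed; none of this comes from the book or from any ATI system.

```python
# A minimal sketch: ordinary least squares versus quantile regression
# on simulated data whose spread grows with x. The mean recovers the
# central trend; the quantile fits also reveal the widening spread.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0, 10, n)
y = 2.0 * x + rng.normal(0.0, 0.5 + 0.4 * x)  # noise grows with x

X = sm.add_constant(x)

# OLS: one line, the conditional mean.
ols_slope = sm.OLS(y, X).fit().params[1]

# Quantile regression: one line per conditional quantile of interest.
for q in (0.10, 0.50, 0.90):
    q_slope = sm.QuantReg(y, X).fit(q=q).params[1]
    print(f"quantile {q:.2f} slope: {q_slope:.2f}  (OLS mean slope: {ols_slope:.2f})")
```

With data like these, the three quantile slopes fan out while the mean slope sits in the middle - precisely the sort of “mountains and lakes” an average-only analysis would level.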


We are all very aware of the data that are gathered as part of NCLB and the types of questions those data are used to address. The fundamental question has been: Are children meeting the standard? If they aren’t, then schools and districts are subject to penalties. Over the years since NCLB was implemented, a growing number of educators and members of the research community have argued that this approach isn’t adequately attentive to issues that are essential to the ultimate success of efforts to raise student achievement. The fundamental issue that has not yet been adequately addressed is student growth. Looking only at whether students have met the standard doesn’t distinguish a school in which students started at a low level and are making rapid progress toward mastering state standards from one in which students started at a similar level but aren’t progressing. To paraphrase Galton, failing to recognize this particular mountain range could mean that opportunities are missed to support educational intervention efforts that are proving successful. Lack of sensitivity to student growth also has potential implications for high-achieving students. Without attending to growth, there is no way to highlight the differences between high-achieving students who are growing and those who are not.


In order to get a more complete view, several states have implemented growth models for determining accountability under NCLB; thus far, 15 states use such a model for determining AYP (adequate yearly progress). The growth model used in Colorado is particularly intriguing because of the fashion in which it applies quantile regression to the question of growth. The Colorado approach allows a student’s growth to be compared against that of his or her academic peers: students can be evaluated to determine whether they are making more or less progress than students who are essentially starting from the same place. High-achieving students aren’t lumped together with students who are behind. This approach focuses attention both on each student’s current level of skill and on the progress that he or she is making. Because of this more complete view, the Colorado Department of Education (CDE) is able to give schools credit for moving students forward, even if those students haven’t yet reached the point of passing the test at the end of the year. The approach also more clearly identifies student progress at the upper end.
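As a rough illustration of the underlying idea (a simplified sketch of my own, not the CDE’s actual model, which is considerably more sophisticated), one can fit a family of conditional-quantile lines relating this year’s score to last year’s, and then report the highest fitted percentile that a student’s current score meets or exceeds:

```python
# Simplified growth-percentile sketch on simulated data; not the CDE's
# actual methodology. Each student's current score is located within the
# fitted distribution of scores for peers with the same prior-year score.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000
prior = rng.normal(500, 50, n)                       # last year's scale scores (simulated)
current = 0.8 * prior + 110 + rng.normal(0, 25, n)   # this year's scores (simulated)

X = sm.add_constant(prior)
quantiles = np.arange(0.05, 1.00, 0.05)              # 5th through 95th percentiles

# One conditional-quantile fit per percentile of interest.
fits = [sm.QuantReg(current, X).fit(q=q) for q in quantiles]

def growth_percentile(prior_score, current_score):
    """Highest fitted percentile that the student's current score meets or exceeds."""
    x_new = np.array([1.0, prior_score])
    preds = np.array([fit.params @ x_new for fit in fits])
    below = quantiles[preds <= current_score]
    return int(round(100 * below[-1])) if below.size else 0

# Two students starting from the same place but ending in different places:
print(growth_percentile(450, 490))   # modest growth relative to academic peers
print(growth_percentile(450, 540))   # strong growth relative to academic peers
```

Because each student is located within the distribution of outcomes for peers with the same starting score, a struggling student and a high achiever can each be credited (or not) for growth on their own terms.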


Looking at accountability in this fashion obviously provides more nuanced and complete information than the more basic approaches that have been employed. The question that must be addressed is whether the approach is shining a light on all the mountains and lakes that should ultimately be considered. We believe that tracking growth using quantile regression can provide information that is useful for guiding instruction and that cannot be easily obtained in other ways. For example, quantile regression analysis can be of assistance in determining growth rates for students starting at different ability levels, as in the sketch below. Information of this kind can be very useful for guiding instruction in ways that elevate student achievement and that are maximally beneficial for all students. This fall, ATI will be developing new reporting tools providing growth information derived from quantile regression. We would be interested in hearing from you regarding this initiative, particularly from those of you working in states where such an approach has been put in place. How has it worked in practice? What sorts of issues have arisen?
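As a hypothetical sketch of that last point (simulated data again, and only the conditional median for brevity), one could regress year-over-year gains on starting score and read off the typical gain at different starting levels:

```python
# Hypothetical illustration: the conditional median of year-over-year
# gains as a function of starting score, on simulated data in which
# lower-starting students happen to gain more.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 1000
prior = rng.normal(500, 50, n)
gain = 40.0 - 0.05 * (prior - 500.0) + rng.normal(0, 15, n)

X = sm.add_constant(prior)
median_fit = sm.QuantReg(gain, X).fit(q=0.5)  # conditional median of the gain

for start in (420, 500, 580):  # low, middle, and high starting scores
    typical = median_fit.params @ np.array([1.0, float(start)])
    print(f"starting score {start}: typical gain of about {typical:.1f} points")
```

Repeating the fit at other quantiles (say, the 10th and 90th) would show whether the spread of gains, and not just the typical gain, also differs across starting levels.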