What the NAEP results mean for New York
On Board Online • November 9, 2009
By Merryl Tisch
Chancellor, Board of Regents
Many New Yorkers were left scratching their heads when the National Assessment of Educational Progress results were released in October. Though our state math scores have skyrocketed over the last two years, our NAEP scores showed no improvement during the same period. The NAEP also told a very different story about our state’s progress in reducing racial achievement gaps, which have narrowed on state tests but are as large as ever on the NAEP.
What should you as school leaders, and more broadly, citizens and parents, make of these discrepancies? It’s rare for educational leaders to question rising state test scores. But we have a responsibility to our children to move beyond simplistic headlines and take a hard, honest look at the tests on which we rely so heavily to judge educational progress.
The two most common explanations for these substantial disparities are unconvincing. The first explanation is that these are different tests of different skills, and thus we should not expect similar results. In fact, New York adopted the architecture of the NAEP standards in 2005 when we wove its five areas of mathematics – algebra, geometry, number sense and operations, statistics and probability, and measurement – as well as mathematics process standards, into each grade’s core curriculum. If we drill down further and examine the specific skills demanded by each set of standards, there really are more similarities than differences.
The second common explanation is that NAEP tests a sample of students, while the state assesses all students. This storyline is simply incorrect. Just as a doctor takes a small sample of blood to measure the body’s overall health, the NAEP tests a random sample of students to measure the state’s educational health. The necessary size of that sample is carefully determined by statisticians, who estimate how large the sample needs to be to produce a reliable measure of the state’s educational progress.
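The NAEP's actual design is more sophisticated than a simple random draw (it uses stratified, clustered samples), but the basic intuition behind the blood-test analogy can be sketched with the textbook formula for estimating a proportion. The function name and defaults below are illustrative, not part of the NAEP's methodology:

```python
import math

def required_sample_size(margin_of_error: float, confidence_z: float = 1.96,
                         proportion: float = 0.5) -> int:
    """Classic sample-size formula for estimating a proportion.

    Uses the most conservative assumption (p = 0.5) by default, so the
    result is an upper bound on the sample needed for the given margin
    of error at roughly 95% confidence (z = 1.96).
    """
    n = (confidence_z ** 2) * proportion * (1 - proportion) / margin_of_error ** 2
    return math.ceil(n)

# A margin of error of +/-3 percentage points needs only about 1,068
# respondents, no matter how large the state's student population is.
print(required_sample_size(0.03))  # prints 1068
```

This is why a carefully drawn sample of a few thousand students can reliably characterize the performance of millions: the required sample size depends on the desired precision, not on the size of the population being measured.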
These state/NAEP differences are not unique to New York. But how we react to them can be. Rather than address the glaring gaps between their NAEP and state test performance, many state education leaders around the country have taken the easy road and looked the other way. Why should we be surprised? In the short term, state leaders have few incentives to question rising scores. The headlines are positive, schools clear the No Child Left Behind Act’s adequate yearly progress targets, and everyone gets to pat themselves on the back.
Over the long term, ignoring the disparity between NAEP and state test results will leave our kids less prepared for college and the global economy. The failure to drill down and develop accurate assessments at the state level creates a burden that falls disproportionately on our state’s disadvantaged children – who turn out to be much further behind than anyone recognized. Without that knowledge providing real accountability, we will lose our greatest leverage in the battle to turn around failing schools and close the achievement gap.
The Board of Regents and Commissioner David Steiner have a simple answer to those who would rather ignore the vast state/NAEP discrepancy: not on our watch.
In the coming months, we will review every aspect of our testing program to better understand why the state and NAEP results are so different, and to determine which aspects of the program should be revised. The commissioner and the board are committed to revising the design of our state tests so they cover more of the curriculum and, at the same time, become less predictable about which learning objectives are emphasized in any given administration of a test.
At the same time, we will be advancing a long-term reform agenda that includes improving our standards and assessments as well as our curriculum materials and teacher and principal preparation programs. Together, we will restore confidence in the state testing program and keep our promise to New York’s children and families to provide an accurate measure of how our students are performing.