
First Published in The New York Sun, October 4, 2002
By Andrew Wolf

The city’s Department of Education has fired the company that prepares and scores its reading tests, a move that should come as no surprise. The relationship between CTB/McGraw Hill and the city has been troubled in recent years. In 1999, a miscalculation in scoring a group of tests resulted in thousands of students being ordered to summer school and a number of superintendents being removed — one of them, Robert Riccobono of Brooklyn’s District 19, was dumped from his post and then had to be rehired when the scoring error was discovered. In a similar vein, Department of Education officials now believe that scores on the 2001 sixth grade reading tests were too high and that scores on this year’s seventh grade reading tests were too low. Considering that last year’s sixth graders and this year’s seventh graders are for the most part the same children, the officials may well have a point.
As the city begins a relationship with a new vendor, Harcourt Educational Measurement, it must insist that the tests be seamlessly consistent from grade to grade so that the growth of individual children can be accurately measured. Beyond the rhetoric of accountability that surrounds educational testing, and the often toxic need to assess blame, the most important use of tests is as a diagnostic tool to identify the strengths and weaknesses of each child.

Unfortunately, the greatest impediment to such assessment comes not from the city, but from the state. New York State has instituted its own reading and math tests for the fourth and eighth grades, using testing devices wholly different from those used by the city. While attempts have been made to compare the results of the two tests, one simply cannot compare apples to oranges. The tests administered by the city for grades three, five, six, and seven are pure multiple-choice tests, administered in one sitting. The state’s fourth and eighth grade tests include extensive writing and are given over several days.

Essay tests are notoriously subjective. The state-administered essays are now scored by the individual school districts, using teachers who have been conscripted into this mind-numbing task and who have no special expertise. These teachers are given a rubric — guidelines for what to look for in scoring — and set loose. By the fourth or fifth day of this, or the 500th essay, whichever comes first, you can be sure that the rubric is being applied much differently from how it was applied to essay one. Moreover, one must ask what subtle pressures, real or perceived, are applied by the superintendents, to whom the scorers owe their jobs, to ensure favorable results.

Apart from the scattershot scoring, the manner in which the tests’ results are interpreted is scandalously deficient. Rather than measure the growth of individual children, New York State evaluates success by measuring, for example, this year’s fourth grade against last year’s fourth grade, and then ranks the schools by score.

This misses the point. The best schools are not necessarily those that score highest, but rather those that achieve the greatest improvement of their individual students. Only if we look at the schools by this measure can we evaluate the efficacy of the curriculum and teaching methods they employ. This is known as “value added” testing, the adoption of which could cure many systemic ills.
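The distinction can be made concrete with a small sketch. The numbers and school records below are invented purely for illustration; the point is only the difference between ranking schools by raw averages, as the state does, and by the average gain of the same students year over year, as "value added" testing would.

```python
# Invented, illustrative data. Each record is
# (student_id, last_year_score, this_year_score) for the SAME child.
school_a = [(1, 90, 91), (2, 88, 89), (3, 92, 92)]  # high scores, flat growth
school_b = [(4, 55, 65), (5, 60, 72), (6, 50, 63)]  # low scores, large gains

def cohort_average(records):
    """The ranking-by-score approach: average this year's raw scores."""
    return sum(this_year for _, _, this_year in records) / len(records)

def value_added(records):
    """The value-added approach: average each child's individual gain."""
    return sum(this_year - last_year for _, last_year, this_year in records) / len(records)

# School A tops a raw-score ranking, but School B improves its students far more.
print(round(cohort_average(school_a), 2), round(cohort_average(school_b), 2))
print(round(value_added(school_a), 2), round(value_added(school_b), 2))
```

Ranked by raw averages, School A looks better; measured by how much each individual student improved, School B does, which is exactly the reversal the column argues a value-added measure would reveal.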

In New York City, we have our own problems with how we interpret test scores, even though they are based on more objective multiple-choice tests. For instance, we have a feel-good fiction I call the “District 2 Deception.” By mistaking the high scores of District 2 students, who come on the whole from higher-income homes, for proof that the district’s programs work, we run the risk of getting suckered into adopting questionable programs, such as constructivist, or “fuzzy,” mathematics.

Do District 2 students do better in math than kids in other districts because of constructivist math? Or do they do better because, as a number of parents and mathematicians suggest, their families can afford expensive individual tutoring? The latter seems far more likely, but we’ll never know for sure until we can measure the progress of individual students across the city and correlate it with the programs used to teach them. This, though, is the last thing the educrats would like to see happen.

Maybe we would find that some of the lowest-scoring schools are achieving greater gains than some of the highest performers. After all, there’s no trick in rigging a District 2 school like East Side Middle to score at the top of the ranking list when you only admit children scoring above the 90th percentile on standardized tests, even if you have to cherry-pick them from other districts.

The failure of the current methodology of assessing schools has reached a ridiculous new height with the recent release of the state’s list of so-called failing schools. These are the schools that the state is required to identify under the provisions of the newly enacted No Child Left Behind Act. A child attending one of these schools has the right to transfer to another school or receive private tutoring services paid for with federal funds. The feds, however, left it to the states to come up with the criteria to determine which schools are failing. In the case of New York, that was a big mistake.

It should come as no surprise to anyone that the same fools up in Albany who botched the physics Regents exam and found it necessary to censor literature for political correctness when choosing passages for the reading tests would screw this up as well. As it turned out, some of the best and most dynamic schools found themselves on the list, and some of the very lowest performing schools were omitted.

Take the case of two middle schools in the northwest Bronx’s District 10. M.S. 45 and M.S. 118 are both rated as “Far Above Average” in the Board of Education’s Similar School Comparison report, considered the most accurate measure of school performance. Both principals were awarded cash bonuses for their schools’ accomplishments. Yet both made the state’s failing schools list. Meanwhile other demonstrably worse performing schools, such as M.S. 254, rated “Far Below Average,” dodged the bullet.

In short, children at perfectly good schools, who are doing just fine, will have the option to go elsewhere or get special help they may not need. Meanwhile, some children who really need help, badly, will be ignored. In addition, dozens of schools that have no fourth or eighth grade, mainly early childhood schools, simply are not evaluated and their kids are rendered ineligible for help.
I guess some children will be left behind after all.

© 2002 The New York Sun, One SL, LLC. All rights reserved.
