** Reprinted here in the Washington Post
We’ve entered the time of year during which states and districts release their testing results. It’s fair to say that the two districts that get the most attention for their results are New York City and the District of Columbia Public Schools (DCPS), due in no small part to the fact that both enacted significant, high-profile policy changes over the past 5-10 years.
The manner in which both districts present annual test results is often misleading. Many of the issues, such as misinterpreting changes in proficiency rates as “test score growth” and chalking up all “gains” to recent policy changes, are quite common across the nation. These two districts are just among the more aggressive in doing so. That said, there’s one big difference between the test results they put out every year, and although I’ve noted it a few times before, I’d like to point it out once more: Unlike New York City/State, DCPS does not actually release test scores.
That’s right – despite the massive national attention to their “test scores,” DCPS – or, specifically, the Office of the State Superintendent of Education (OSSE) – hasn’t released a single test score in many years. Not one.
Instead, they only report the percent of students who are advanced, proficient, basic and below basic (with a particular focus on the proficiency rates). Although these rates have advantages, particularly the fact that they are much easier to interpret than scale scores (and that matters), they are a terribly distorted way to measure performance, particularly over time.
Consider, as just one example, the NYC test results in 2011 (discussed here). There was a slight 2-3 point increase in the citywide proficiency rate in both math and reading between 2010 and 2011. The city made a pretty big deal out of this increase, issuing a press release hailing the “continued progress.” The problem was that the average scores, which cannot be compared across grades, told a somewhat different story. Actually, out of the 12 separate citywide scores reported that year (six tested grades, in two subjects), seven were either totally flat or declined. This illustrates the basic fact that the rates and the actual scores upon which they’re based often move in different directions, even when the sample is as large as NYC’s.
In the case of DCPS, however, there is no way to even check things like this. The public is not given the information necessary to do so.
And, unfortunately, DCPS is not alone. A quick scan of state/district websites suggests that many do release actual scores (though they are sometimes difficult to find), but it seems that a fair number fail to do so.
This has to change. The scores are public information, releasing them costs nothing, and they are required for anything resembling proper interpretation. If we’re going to put so much faith in test results, we should at least have the data.
- Matt Di Carlo