Policy And Research: A Shotgun Wedding In New Jersey

Earlier today, New Jersey Governor Chris Christie announced his plan to open 23 new charter schools in his state.  Just hours before this announcement, the NJ education department issued an analysis of new data on the performance of charter schools in the state (during the 2009-10 school year).   In an accompanying press release, the department claims that “the data affirms [sic] the need for Governor Christie’s reform proposals to grow the number of high-quality charter schools…” 

The release also contains several other extremely bold assertions that the results support expanding the state’s charter sector.  The title of the actual report, which contains only tables, is: "Living Up to Expectations: Charter Schools in New Jersey Outperforming District Schools."

Unfortunately, however, the analysis could barely pass muster if submitted by a student in one of the state’s high school math classes (charter or regular public).

It seems that the department compared the overall proficiency rate of each of the state’s 70 or so charter schools with the rate of the district in which it is located, as well as with the overall state average (for the record, the tables and press release incorrectly call the rates “scores,” which is an important distinction).  They then tallied up the results to see how many charters did “better” and how many did “worse.”  The overall conclusion: 79 percent of charters outperformed their “host” districts on language arts exams, and 69 percent did so in math.
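To make the method concrete, here is a minimal sketch (in Python) of the kind of tally the release appears to describe. The school names, districts, and rates below are invented for illustration only; nothing here is actual New Jersey data.

```python
# Hypothetical illustration of the comparison described above: each charter
# school's overall proficiency rate is matched against its host district's
# rate, and the "wins" are tallied. All values are made up.

charters = [
    # (charter school, host district, charter rate, district rate) - language arts
    ("Charter A", "District X", 72.0, 65.0),
    ("Charter B", "District Y", 58.0, 61.0),
    ("Charter C", "District X", 80.0, 65.0),
]

wins = sum(
    1
    for _, _, charter_rate, district_rate in charters
    if charter_rate > district_rate
)

print(f"{wins / len(charters):.0%} of charters 'outperformed' their host districts")

# Note what this tally does NOT do: it makes no adjustment for student
# characteristics (poverty, special education, language status), grade spans,
# or school size - which is precisely the limitation discussed below.
```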

There are so many monumental limitations to this analysis - not the least of which is that it makes absolutely no effort, even crudely, to control for student characteristics - that they’re really not worth enumerating.  Suffice it to say that the results, by even the most generous policy research standards, demonstrate very little about charter schools’ relative performance (versus comparable regular public schools in their areas).  New Jersey charters may have done better this year, or they may have done worse, but these results, as presented, cannot tell us either way. The bold, sweeping conclusions in the press release are, at best, misleading and, at worst, absurd.

Look – I’m all about people and organizations, including government departments, producing their own analyses, and states’ education departments cannot be held to the same standard as universities or large research organizations.  So, needless to say, I have absolutely no problem with the fact that this analysis is purely descriptive. Quite the contrary – by themselves, the tables represent exactly the kind of information that these departments should be producing. Even the simplest comparisons can be very useful if interpreted correctly.

But drawing grand, completely unsupportable conclusions from the results, and then using those conclusions to justify policy decisions scheduled to be announced later that day, is alarmingly poor research practice. Moreover, regardless of your opinion about the governor’s decision to expand charter schools, this type of misleading, coordinated roll-out is unbecoming of an education department, and it does not serve the people of New Jersey well.

