A List Of Education And Related Data Resources

We frequently present quick analyses of data on this blog (and look at those done by others). As a close follower of the education debate, I often get the sense that people are hungry for high-quality information on a variety of different topics, but searching for these data can be daunting, which probably deters many people from trying.

So, while I’m sure that many others have compiled lists of data resources relevant to education, I figured I would do the same, with a focus on more user-friendly sources.

But first, I would be remiss if I didn’t caution you to use these data carefully. Almost all of the resources below have instructions or FAQs, most of them non-technical. Read them. Remember that improper or misleading presentation of data is one of the most counterproductive features of today’s education debates, and it occurs to the detriment of all.

That said, here are a few key resources for education and other related quantitative data. The list is far from exhaustive, so feel free to leave comments and suggestions if you think I missed anything important.

The Year In Research On Market-Based Education Reform

** Also posted on “Valerie Strauss’ Answer Sheet” in the Washington Post.

Race to the Top and Waiting for Superman made 2010 a banner year for the market-based education reforms that dominate our national discourse. By contrast, a look at the “year in research” presents a rather different picture for the three pillars of this paradigm: merit pay, charter schools, and using value-added estimates in high-stakes decisions.

There will always be exceptions (especially given the sheer volume of reports generated by think tanks, academics, and other players), and one year does not a body of research make.  But a quick review of high-quality studies from independent, reputable researchers shows that 2010 was not a particularly good year for these policies.

"No Comment" Would Have Been Better

Bruce Baker is a professor at Rutgers University who writes an informative blog called School Finance 101. He presented some descriptive analysis of New Jersey charter schools in a post, and a reporter subsequently asked him to comment on the data. The same reporter dutifully asked the New Jersey Charter Schools Association (NJCSA) to respond to the analysis.

The NJCSA describes itself as “the leading statewide advocate for charter public schools in New Jersey and a principal source of public information about charter schools in the state.”  The organization issued the following response to Baker’s analysis:

The New Jersey Charter Schools Association seriously questions the credibility of this biased data. Rutgers University Professor Bruce Baker is closely aligned with teachers unions, which have been vocal opponents of charter schools and have a vested financial interest in their ultimate failure.

Baker is a member of the Think Tank Review Panel, which is bankrolled by the Great Lakes Center for Education Research and Practice. Great Lakes Center members include the National Education Association and the State Education Affiliate Associations in Illinois, Indiana, Michigan, Minnesota, Ohio and Wisconsin. Its chairman is Lu Battaglieri, the executive director of the Michigan Education Association.

There are now thousands of children on waiting lists for charter schools in New Jersey. This demand shows parents want the option of sending their children to these innovative schools and are satisfied with the results.

Note the stretch that they have to make to allege that Baker is “closely aligned” with teachers unions—he occasionally reviews papers for an organization that is partly funded by unions. There is no formal connection beyond that. Note also that the NJCSA statement “questions the credibility of [sic] this biased data”—meaning they doubt the credibility of data from the State of New Jersey, which Baker merely recasts as graphs and maps. There is not a shred of substance in this statement that addresses the data or Baker’s description of them. It’s pure guilt by association (and there’s not really even an association).

Research Wars

Though it is still weeks away, a Sept. 29 forum sponsored by the Economic Policy Institute and the National Education Policy Center has already sparked some interesting debate over at the National Journal. The event, centered on the recent book, Think Tank Research Quality: Lessons for Policy Makers, the Media and the Public, is an effort to separate “the junk research from the science.”

The crux of the debate is whether the recent explosion of self-published reports by various educational think tanks has helped or hindered the effort to improve the quality of educational research. (Full disclosure: The Albert Shanker Institute is often called a "think tank" and we frequently self-publish.) The push and pull of dueling experts and conflicting reports, say some, has turned education research into a political football—moved down the field by one faction, only to be punted all the way to the other end by a rival faction—each citing "research" as their guide.

"My research says this works and that doesn’t," can always be countered by, "Oh yeah, well my research says that works and this doesn’t." There are even arguments about what "what works" means because, except for performance on standardized tests, our goals remain diverse, decentralized and subject to local control. As a result, public education is plagued by trial and error policies that rise and fall, district by district and state by state, like some sort of crazed popularity contest.