  • A Controversial Consensus On KIPP Charter Schools

    Written on March 12, 2013

    A recent Mathematica report on the performance of KIPP charter schools expands and elaborates on their prior analyses of these schools' (estimated) effects on average test scores and other outcomes (also here). These findings are important and interesting, and were covered extensively elsewhere.

    As is usually the case with KIPP, the results stirred the full spectrum of reactions. To over-generalize a bit, critics sometimes seem unwilling to acknowledge that KIPP's results are real no matter how well-documented they might be, whereas some proponents are quick to use KIPP to proclaim a triumph for the charter movement, one that can justify the expansion of charter sectors nationwide.

    Despite all this controversy, there may be more opportunity for agreement here than meets the eye. So, let’s try to lay out a few reasonable conclusions and see if we might find some of that common ground.

  • Student Attrition Is A Core Feature Of School Choice, Not A Bug

    Written on August 27, 2012

    The issue of student attrition at KIPP and charter schools is never far beneath the surface of our education debates. KIPP’s critics claim that these schools exclude or “counsel out” students who aren’t doing well, thus inflating student test results. Supporters contend that KIPP schools are open admission with enrollment typically determined by lottery, and they usually cite a 2010 Mathematica report finding strong results among students in most (but not all) of 22 KIPP middle schools, as well as attrition rates that were no higher, on average, than at the regular public schools to which they are compared.*

    As I have written elsewhere, I am persuaded that student attrition cannot explain away the gains that Mathematica found in the schools they examined (though I do think peer effects of attrition without replacement may play some role, which is a very common issue in research of this type).

    But, beyond this back-and-forth over the churn in these schools and whether it affected the results of this analysis, there’s also a confusion of sorts when it comes to discussions of student attrition in charters, whether at KIPP or in general. Supporters of school choice often respond to these “attrition accusations” by denying or downplaying attrition’s frequency or importance. This, it seems to me, ignores an obvious point: Within-district attrition - students changing schools, often based on “fit” or performance - is a defining feature of school choice, not an aberration.

  • Explaining The Consistently Inconsistent Results of Charter Schools

    Written on November 16, 2011

    This is the second in a series of three posts about charter schools. Here is the first part, and here is the third.

    As discussed in a previous post, there is a fairly well-developed body of evidence showing that charter and regular public schools vary widely in their impacts on achievement growth. On the whole, this research finds little difference between the two sectors, and where differences do appear, they tend to be very modest. In other words, there is nothing about "charterness" that, by itself, leads to strong results.

    It is, however, the exceptions that are often most instructive to policy. By taking a look at the handful of schools that are successful, we might finally start moving past the “horse race” incarnation of the charter debate, and start figuring out which specific policies and conditions are associated with success, at least in terms of test score improvement (which is the focus of this post).

    Unfortunately, this question is also extremely difficult to answer – policies and conditions are not randomly assigned to schools, and it’s very tough to disentangle all the factors (many unmeasurable) that might affect achievement. But the available evidence at this point is sufficient to start drawing a few highly tentative conclusions about “what works.”

  • A Matter Of Time

    Written on December 3, 2010

    Extended school time is an education reform option that seems to be gaining in popularity. President Obama gave his endorsement earlier this year, while districts such as DCPS have extended-time legislation under consideration.

    The idea is fairly simple: Make the school day and/or year longer, so kids will have more time to learn. Unlike many of the policy proposals flying around these days, it’s an idea that actually has some basis in research. While, by itself, more time yields negligible improvements in achievement, there is some evidence (albeit mixed) that additional time devoted to “academic learning” can have a positive effect, especially for students with low initial test scores. So, more time may bring benefits (at least in terms of test scores), but only if that time is used wisely.

    Still, extending school days/years, like all policy options, must of course be evaluated in terms of cost effectiveness. Small increases, such as adding a few days to the school calendar, are inconsistently and minimally effective, while larger increases in school time are an expensive intervention that must be weighed against alternatives – and against the fact that states and districts, facing a few more years of fiscal crisis, are cutting other potentially effective programs.

  • The Time Factor: It's Not Just KIPP

    Written on July 20, 2010

    In this post, I argue that it is important to understand why a few charters (like KIPP) perform better than others. An editorial in today's Washington Post points out that KIPP’s results suggest the achievement-improving potential of more school time for lower-income students – i.e., longer days and years.

    Through longer days, mandatory Saturdays, and summer school, KIPP students spend about 60 percent more time in school than typical regular public school students. That's the equivalent of over 100 regular public school days of additional time. This is an astounding difference.

    But it's not just KIPP.

  • What Is "Charterness," Exactly?

    Written on July 14, 2010

    ** Also posted here on Valerie Strauss' Answer Sheet in the Washington Post.

    Two weeks ago, researchers from Mathematica dropped a bomb on the education policy community. It didn’t go off.

    The report (prepared for the Institute of Education Sciences, a division of the USDOE) includes students in 36 charter schools throughout 15 states. The central conclusion: the vast majority of charter students do no better or worse than their regular public school counterparts in math and reading scores (or on most of the other 35 outcomes examined). On the other hand, charter parents and students are more satisfied with their schools, and charters are more effective at boosting the scores of lower-income students.
