The Equity Projection

A new Mathematica report examines the test-based impact of The Equity Project (TEP), a New York City charter school serving grades 5-8. TEP opened for the 2009-10 school year and received national attention mostly due to one unusual policy: It pays teachers $125,000 per year, regardless of experience and education, plus annual bonuses (up to $25,000) for returning teachers. TEP largely offsets these unusually high salary costs by minimizing the number of administrators and maintaining larger class sizes.

As is typical of Mathematica, the TEP analysis is thorough and well done. The performance of TEP students is compared with that of similar peers who had a comparable probability of enrolling in TEP, as identified with propensity scores. In general, the study’s results were quite positive. Although there were statistically discernible negative impacts of attendance for TEP’s first cohort of students during their first two years, the cumulative estimated test-based impact was positive, statistically significant, and educationally meaningful after three and four years of attendance. As is usually the case, the estimated effect was stronger in math than in reading (the estimated effect sizes in math were very large). The Mathematica researchers also present analyses of student attrition, which did not appear to bias the estimates substantially, and they show that their primary results are robust to alternative specifications (e.g., different matching techniques, score transformations, etc.).
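For readers unfamiliar with the approach, here is a minimal, purely illustrative sketch of how propensity-score matching works in general: estimate each student's probability of enrolling from baseline characteristics, then compare enrollees with non-enrollees who had similar probabilities. The toy data, covariates, and one-to-one matching rule below are hypothetical assumptions for illustration, not the Mathematica study's actual model.

```python
# Illustrative propensity-score matching sketch (hypothetical data and model;
# not the covariates, sample, or matching algorithm used in the TEP study).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Hypothetical baseline characteristics (e.g., prior test score, low-income status)
prior_score = rng.normal(0, 1, n)
low_income = rng.integers(0, 2, n)

# Hypothetical enrollment indicator (1 = enrolled in the charter school)
true_logit = -1.0 + 0.5 * prior_score + 0.3 * low_income
enrolled = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

# Step 1: estimate each student's probability of enrolling (the propensity score)
X = np.column_stack([prior_score, low_income])
pscore = LogisticRegression().fit(X, enrolled).predict_proba(X)[:, 1]

# Step 2: match each enrollee to the non-enrollee with the closest propensity
# score (one-to-one nearest-neighbor matching, with replacement)
treated = np.flatnonzero(enrolled == 1)
controls = np.flatnonzero(enrolled == 0)
distances = np.abs(pscore[treated][:, None] - pscore[controls][None, :])
matched_controls = controls[distances.argmin(axis=1)]

# Step 3: compare later outcomes for enrollees vs. their matched comparisons
# (a hypothetical follow-up score with a built-in 0.2 SD enrollment effect)
followup = prior_score + 0.2 * enrolled + rng.normal(0, 1, n)
estimated_impact = followup[treated].mean() - followup[matched_controls].mean()
print(f"Estimated impact (in SD units): {estimated_impact:.2f}")
```

The real analysis is, of course, far more careful (multiple cohorts and outcome years, attrition checks, alternative specifications); the sketch only conveys the basic logic of building a comparison group from students with similar enrollment probabilities.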

Now we get to the tricky questions about these results: What caused them, and what can be learned from them? That’s the big issue with charter analyses in general (and with research on many other interventions): One can almost never separate the “why” from the “what” with any degree of confidence. And TEP, with its "flagship policy" of high teacher salaries, which might appeal to all "sides" in the education policy debate, provides an interesting example in this respect.

A Few More Points About Charter Schools And Extended Time

A few weeks ago, I wrote a post that made a fairly simple point about the practice of expressing estimated charter effects on test scores as “days of additional learning”: Among the handful of states, districts, and multi-site operators that have consistently been shown to have a positive effect on testing outcomes, might not those “days of learning” be explained, at least in part, by the fact that they actually do offer additional days of learning, in the form of much longer school days and years?

That is, there is a small group of charter models/chains that seem to get good results. There are many intangible factors that make a school effective, but to the degree we can chalk this up to concrete practices or policies, additional time may be the most compelling possibility. Although it’s true that school time must be used wisely, it’s difficult to believe that the sheer amount of extra time that the flagship chains offer would not improve testing performance substantially.

To their credit, many charter advocates do acknowledge the potentially crucial role of extended time in explaining their success stories. And the research, tentative though it still is, is rather promising. Nevertheless, there are a few important points that bear repeating when it comes to the idea of massive amounts of additional time, particularly given the fact that there is a push to get regular public schools to adopt the practice.

Extended School Time Proposals And Charter Schools

One of the (many) education reform proposals that has received national attention over the past few years is “extended learning time” – that is, expanding the day and/or year to give students more time in school.

Although how schools use the time they have with students is, of course, every bit as important as how much time they have with those students, the proposal to expand the school day/year may have merit, particularly for schools and districts serving larger proportions of students who need to catch up. I have noticed that one of the motivations for the extended time push is the (correct) observation that the charter school models that have proven effective (at least by the standard of test score gains) utilize extended time.

On the one hand, this is a good example of what many (myself included) have long advocated – that the handful of successful charter school models can potentially provide a great deal of guidance for all schools, regardless of their governance structure. On the other hand, it is also important to bear in mind that many of the high-profile charter chains that receive national attention don’t just expand the school year by a few days or even a few weeks, as has been proposed in several states. In many cases, they extend it by months.

Explaining The Consistently Inconsistent Results of Charter Schools

This is the second in a series of three posts about charter schools. Here is the first part, and here is the third.

As discussed in a previous post, there is a fairly well-developed body of evidence showing that charter and regular public schools vary widely in their impacts on achievement growth. This research finds that, on the whole, there is not much of a difference between the two sectors, and when there are differences, they tend to be very modest. In other words, there is nothing about "charterness" that leads to strong results.

It is, however, the exceptions that are often most instructive to policy. By taking a look at the handful of schools that are successful, we might finally start moving past the “horse race” incarnation of the charter debate, and start figuring out which specific policies and conditions are associated with success, at least in terms of test score improvement (which is the focus of this post).

Unfortunately, this question is also extremely difficult to answer – policies and conditions are not randomly assigned to schools, and it’s very tough to disentangle all the factors (many unmeasurable) that might affect achievement. But the available evidence at this point is sufficient to start drawing a few highly tentative conclusions about “what works."

What The "No Excuses" Model Really Teaches Us About Education Reform

** Also posted here on “Valerie Strauss’ Answer Sheet” in the Washington Post

In a previous post, I discussed “Apollo 20," a Houston pilot program in which a group of low-performing regular public schools is implementing the so-called “no excuses” education model common among high-profile charter schools such as KIPP. In the Houston implementation, “no excuses” consists of five basic policies: a longer day and year, resulting in 21 percent more school time; different human capital policies, including performance bonuses and the firing and selective rehiring of all principals and half of the teachers (the latter is one of the "turnaround models" being pushed by the Obama Administration); extensive two-on-one tutoring; regular assessments and data analysis; and “high expectations” for behavior and achievement, including parental contracts.

A couple of weeks ago, Harvard professor Roland Fryer, the lead project researcher, released the results of the pilot’s first year. I haven’t seen much national coverage of the report, but I’ve seen a few people characterize the results as evidence that “’no excuses’ works in regular public schools." Now, it’s true that there were positive effects – strong in math – and that the results appear to be robust across different model specifications.

But, when it comes to the question of whether “no excuses works," the reality is a bit more complicated. There are four main things to keep in mind when interpreting the results of this paper, a couple of which bear on the larger debate about "no excuses" charter schools and education reform in general.

The Real Charter School Experiment

The New York Times reports that there is a pilot program in Houston, called the "Apollo 20 Program," in which some of the district’s regular public schools are "mimicking" the practices of high-performing charter schools. According to the Times article, the pilot schools seek to replicate five of the practices commonly used by high-flying charters: extended school time; extensive tutoring; more selective hiring of principals and teachers; “data-driven” instruction, including frequent diagnostic quizzing; and a “no excuses” culture of high expectations.

In theory, this pilot program is a good idea, since a primary mission of charter schools should be to serve as a testing ground for new policies and practices that could help to improve all schools. More than a decade of evidence has made it very clear that there’s nothing about "charterness" that makes a school successful – and indeed, only a handful get excellent results. So instead of arguing along the tired old pro-/anti-charter lines, we should, like Houston, be asking why these schools excel and working to see if we can use this information productively.

I’ll be watching to see how the pilot schools end up doing. I’m also hoping that the analysis (the program is being overseen by Harvard’s EdLabs) includes some effort to separate out the effects of each of the five replicated practices. If so, I’m guessing that we will find that the difference between high- and low-performing urban schools depends more than anything else on two factors: time and money.

Among Charter Schools, Inconsistency Begets Opportunity

Andrew Rotherham – who writes the blog "Eduwonk" – has also recently started writing a weekly column for Time Magazine. Most of his articles have been interesting and relatively fair, even on the controversial issues. He has a point of view, just like the rest of us, but usually makes a good-faith effort to present alternate viewpoints and the relevant research.

His most recent piece was a partial disappointment. In it, Rotherham takes up the issue of charter schools. His overarching argument is that too many people focus on whether or not charter schools are “better” or “worse” than regular public schools, rather than why – which policies and practices are associated with success or failure.

As I stated in my very first post on this blog (and others), I completely agree. Given the overt politicization of the charter school discussion, the public desperately needs a move away from the pro/anti-charter framework, towards a more useful conversation about how and why particular schools do or don’t work. Charters' inconsistent performance has caused controversy, but it is also an opportunity.

But, when Rotherham lays out the characteristics (“ethos and operations”) that these successful charters supposedly share, the factors he specifies are vague and unsubstantiated – it’s hard to figure out what they mean, to say nothing of whether they actually have the stated effect.

A Matter Of Time

Extended school time is an education reform option that seems to be gaining in popularity. President Obama gave his endorsement earlier this year, while districts such as DCPS have extended time legislation under consideration.

The idea is fairly simple: Make the school day and/or year longer, so kids will have more time to learn. Unlike many of the policy proposals flying around these days, it’s an idea that actually has some basis in research. While more time by itself yields negligible improvements in achievement, there is some evidence (albeit mixed) that additional time devoted to “academic learning” can have a positive effect, especially for students with low initial test scores. So, more time has potential benefits (at least in terms of test scores), but the time must be used wisely.

Still, extending school days/years, like all policy options, must of course be evaluated in terms of cost effectiveness. Small increases, such as adding a few days to the school calendar, are inconsistently and minimally effective, while larger increases in school time are an expensive intervention that must be weighed against alternatives, as well as against the fact that states and districts, facing a few more years of fiscal crisis, are cutting other potentially effective programs.

The Time Factor: It's Not Just KIPP

In this post, I argue that it is important to understand why a few charters (like KIPP) perform better than others. An editorial in today's Washington Post points out that KIPP’s results suggest the achievement-improving potential of more school time for lower-income students – i.e., longer days and years.

Through longer days, mandatory Saturdays, and summer school, KIPP students spend about 60 percent more time in school than typical regular public school students. That's the equivalent of over 100 regular public school days of additional time. This is an astounding difference.
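To make the arithmetic behind that figure explicit (assuming a typical regular public school year of roughly 180 days – an illustrative assumption, not a number from the editorial): 60 percent more time works out to about 0.60 × 180 ≈ 108 additional day-equivalents, i.e., "over 100" extra days.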

But it's not just KIPP.