In this post, I argue that it is important to understand why a few charters (like KIPP) perform better than others. An editorial in today's Washington Post points out that KIPP's results suggest that more school time, in the form of longer days and years, has real potential to improve achievement among lower-income students.
Through longer days, mandatory Saturdays, and summer school, KIPP students spend about 60 percent more time in school than typical regular public school students. That's the equivalent of over 100 regular public school days of additional time. This is an astounding difference.
But it's not just KIPP.
Many (perhaps most) of the urban charter schools that get relentless national attention for their results have their students for much more time than regular public schools (which typically provide 6-6.5 hours a day).
For instance, the much-touted Harlem Success Academies, like KIPP, run almost 9 hours per day (about 40-50 percent more time). Achievement First schools (in New York and Connecticut) provide an 8.5 hour day and summer school. The BASIS charter schools in Arizona run almost 8 hours. The Aspire schools in California have a 7.5 hour day and 10 more days than regular publics in the state. The SEED charter schools use a boarding model that keeps the students 24 hours a day, 5 days a week.
So, how much of the higher performance of these widely-publicized charter schools is attributable to the simple fact that they have the kids for the equivalent of 40-100 additional days?
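The "40-100 additional days" figure follows from simple arithmetic. As a rough illustration (the 6.5-hour day and 180-day year baseline are assumptions for the sake of the calculation, not figures from any of the studies cited here), extra annual hours can be converted into equivalent regular school days like this:

```python
# Back-of-envelope conversion of extra school hours into "equivalent
# regular public school days". The baseline (6.5-hour day, 180-day
# year) is an assumed typical schedule, used only for illustration.

BASE_HOURS_PER_DAY = 6.5
BASE_DAYS_PER_YEAR = 180
BASE_ANNUAL_HOURS = BASE_HOURS_PER_DAY * BASE_DAYS_PER_YEAR  # 1,170 hours

def extra_days(charter_annual_hours):
    """Express a school's extra annual time in equivalent baseline days."""
    return (charter_annual_hours - BASE_ANNUAL_HOURS) / BASE_HOURS_PER_DAY

# KIPP: roughly 60 percent more total time than the baseline.
kipp_hours = BASE_ANNUAL_HOURS * 1.6
print(round(extra_days(kipp_hours)))        # 108 -> "over 100" extra days

# A 9-hour day on a standard 180-day calendar (no Saturdays or summer).
print(round(extra_days(9 * BASE_DAYS_PER_YEAR)))  # 69 extra days
```

Under these assumptions, a 60 percent increase in time works out to over 100 extra days, and even a 9-hour day alone adds the equivalent of roughly 70 days, which is where a range like 40-100 days comes from once schedules vary.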
This is a difficult question to answer empirically. The KIPP report could not measure the effect of school time because KIPP schools vary little in their days and hours, leaving researchers unable to separate time effects from overall school effects. More school time is also often correlated with other variables, such as revenue (which is needed to keep schools open longer, and which these charters often receive from private donations), and it is tough to isolate these variables' effects.
But both of the most recent charter lottery studies (here and here) find associations between school time and higher performance. These findings square with other evidence outside the charter realm, such as this clever Education Next article's use of snow days to estimate the effect of longer school years, and there are several analyses showing that student attendance (i.e., more time in school) improves performance (see this recently-published paper). There is also common sense: What school couldn’t do better with a few extra months?
Some charter advocates would no doubt argue that more school time is an inherent part of the “charter advantage,” since so few regular public districts have established a longer day or year (and many are cutting them back for budgetary reasons). But this is more of a political than an empirical argument about “what works.” It also ignores the fact that additional time has been successfully tried in urban districts, such as the longer days and years in the Chancellor’s District in New York City (also here) and the School Improvement Zone in Miami-Dade.
I’m not saying that charter school effects should be dismissed if they are largely explained by longer school days/years. Actually, I’m saying the opposite: If this is the case, then we should consider it among the more important findings that charter school research can produce.
Charters like KIPP are held up as models of what schools can achieve with students from low-income families. As examples, these schools provide much of the rationale for the continued expansion of charter schools, despite growing evidence that most charters don’t work any better than regular public schools and many have results that are worse. If this success largely boils down to an extra few months of school time, this should, in my view, change the substance of the charter debate immediately and permanently. We will have discovered “charterness.”