New Policy Brief: The Evidence On The Florida Education Reform Formula

The State of Florida is well known in the U.S. as a hotbed of education reform. The package of policies spearheaded by then Governor Jeb Bush during the late 1990s and early 2000s focused, in general, on test-based accountability, competition, and choice. As a whole, they have come to be known as the “Florida Formula for education success,” or simply the “Florida Formula.”

The Formula has received a great deal of attention, including a coordinated campaign to advocate (in some cases, successfully) for its export to other states. As evidence, the campaign and its supporters tend to cite changes in aggregate testing results, most notably unadjusted increases in proficiency rates on Florida’s state assessment and/or cohort changes on the National Assessment of Educational Progress (NAEP). This approach, for reasons discussed in the policy brief, violates basic principles of causal inference and policy evaluation. Using this method, one could provide evidence that virtually any policy or set of policies “worked” or “didn’t work,” often in the same place and time period.

Fortunately, we needn’t rely on these crude methods, as there is quite a bit of high-quality evidence pertaining to several key components of the Formula, and it provides a basis for tentative conclusions regarding their short- and medium-term (mostly test-based) impact. Today we published a policy brief, the purpose of which is to summarize this research in a manner that is fair and accessible to policymakers and the public.

Here is the executive summary:

The education reforms implemented in Florida throughout the late 1990s and 2000s, commonly known as the “Florida Formula,” have received a great deal of attention in recent years. The policies included in this package are focused on test-based accountability, competition, and choice. Its supporters often rely on crude, speculative forms of policy evidence, such as aggregate, unadjusted test score changes coinciding with the period of the reforms. In reality, while not all of the policies constituting the “Formula” have been subject to empirical scrutiny, there is a relatively large body of high quality research available on a number of its key elements. Several of these policies have had a positive estimated impact, gauged mostly in terms of testing outcomes, whereas others have not. And there is virtually no evidence of any negative impacts. Overall, however, most of the evidence on the “Florida Formula” is likely still to come, and the research that does exist supports nuanced, cautious policy conclusions.

The full publication is available here.

If I understand Matt's report correctly, it suggests that, while it is unwarranted to attribute the increases in student performance on NAEP and Florida exams to the Florida Formula because other initiatives may have been responsible, a few of the components have demonstrated small effect-size gains when examined individually.

One of those alternative explanations should be explored more thoroughly: Florida has some of the best large districts in the country, such as Hillsborough. If you ask educators in these districts why their districts improved, many will claim it was not because of the state's high-stakes accountability initiatives or increased competition, which can cause considerable collateral damage. Instead, they will claim that they followed a build-and-support strategy that succeeded despite state policies. They will also argue that recent substantial cuts to education funding have stalled Florida's progress and far outweigh any meager gains from a few of the Florida Formula's components found in the research cited in this report.
As to the report's statement that no negative consequences of the Florida Formula's components have been documented, that claim has not been adequately researched by evaluating the individual measures. As you mention, and as some of the research you cite finds, high-stakes accountability can lead to gaming of tests, narrowing of the curriculum, and deterring the cooperative building of social capital, which is a key to improved performance, as this blog has consistently argued. Thus, the Florida Formula might be detracting from initiatives that do produce high effect sizes, such as team building, and reducing the overall effects of these research-based initiatives, which would otherwise produce larger gains in performance. None of the research cited examined that issue.