The Data-Driven Education Movement

Comments

Obviously that conclusion about the helicopters can't be drawn from the "study" you describe - you'd need a control group that also attended the private schools but used another form of transportation. If outcomes for the helicopter group were statistically superior, then one could conclude that the helicopters (rather than the private schools) had an impact. You, of course, already know this. Shame on you for presenting such a flawed example and pretending it's a strong parallel to the voucher study. I have come to expect a more rigorous standard of discussion on this blog. This type of manipulative dialogue will get us nowhere - we would be better served by helping the public understand how statistics actually works.
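The control-group point above can be made concrete with a toy model. This is a hedged sketch with entirely invented numbers: assume test scores depend only on the school attended and transportation contributes nothing at all, then compare the naive across-school gap with the within-school (controlled) gap.

```python
# Hypothetical illustration (invented numbers): why the helicopter "study"
# needs a within-school control group. In this toy model, scores depend
# only on the school; transportation deliberately has zero effect.
school_effect = {"private": 85.0, "public": 75.0}

def score(school, transport):
    # Transport is accepted but ignored: it contributes nothing here.
    return school_effect[school]

# Naive comparison: helicopter kids (all at private schools) vs.
# everyone else (all at public schools).
helicopter_scores = [score("private", "helicopter") for _ in range(5)]
other_scores = [score("public", "bus") for _ in range(5)]
naive_gap = sum(helicopter_scores) / 5 - sum(other_scores) / 5
print(naive_gap)  # 10.0 -- helicopters appear to "work"

# Controlled comparison: same private school, different transportation.
controlled_gap = score("private", "helicopter") - score("private", "bus")
print(controlled_gap)  # 0.0 -- the apparent helicopter effect vanishes
```

The naive comparison attributes the entire school effect to the helicopters; holding the school fixed exposes the confound, which is exactly the design flaw the commenter is describing.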

Laura said, "you’d need a control group that also attended the private schools but used another form of transportation." However, proponents of vouchers ARE comparing the education potential of kids who attend one school versus another. The study design described supports no conclusion about helicopters, as you correctly state. However, telling the public that kids who have a voucher-based entrance into an elite private school will do better than students going to the local public school also says nothing about the voucher itself. In fact, such studies are inherently flawed because the means of attendance is not the only variable. How many studies out there actually DO look at the college entrance rates of poor minority kids who are placed in an elite private school as compared to other kids at the same school who can afford it? Good luck finding the studies you describe that could have been referenced in this post. It is very difficult to perform such studies, as the sample sizes are quite small and the variables are many.

Because I was told in graduate school that raw data need to be interpreted, I took the 17 daily data sheets required for one student with autism home with me to attempt to create a scatterplot. For this, I was "written up" for not having the data available the next day for the ABA coach to inspect. This convinced me of something I already suspected: the massive amounts of data teachers are forced to churn out are not meant to drive instruction but to prove that we are not sitting around eating bonbons all day. I would have preferred to have a camera mounted on my shoulder to prove I was doing my job. At least that would have freed me up to actually get some teaching done!

Data-driven education produces lots of data but little education.

This is what SMART goals are all about - specific, measurable, attainable, realistic, and timely - if you can't measure it, can't compartmentalize it, it's not useful. And as you say, it not only changes what we measure, it changes what we ask. It affects what we focus on and what is considered important. It sacrifices a process, a journey, for a snapshot. It purports to measure growth, but doesn't really acknowledge that growth takes time and is not necessarily a linear process. And it reinforces the idea that if something is not recognizably practical before you even begin, it's not worth consideration - and as a result it stifles growth, experimentation, and innovation.

Some time ago my school started an independent reading program. Middle school students chose their own book and were required to read for 30 minutes a day, make weekly entries into a journal to which their teacher responded, and give book talks. The purpose of the program was not simply to improve vocabulary and reading comprehension but to create a culture of reading in our school. (Previously we had a lot of students who were self-proclaimed non-readers, as in "I don't read.") One day I saw a student who got kicked out of class sitting in the office. He took his book out of his book bag and sat there reading until the assistant principal was available to deal with him. That's when I knew our reading program was effective, even though that evidence was totally non-measurable.

I can provide several examples from my own experience of how data are misused by administrators. I was told that my ESL students "suffered" under my teaching because, out of the FIVE students I had who were ESL, ONE of mine did not do as well as each of my PLC partners' FIVE ESL students. I was moved out of core LA, and this was one of the reasons cited. Five students is a statistically insignificant sample, but that was never considered. Nor were the qualitative factors that made this student a unique situation. I also know that my principal gathered data on me from students who were in ISS (in-school suspension). He had a theory all right (I wouldn't tamper with grades, so he wanted me out of core LA), and was trying to gather evidence to support removing me from core LA from students who were habitually in ISS - not usually sent by me, by the way. If data-driven education is going to be used, administrators need to take a few math courses on how to use it appropriately. I wrote about some other problems with data-driven education here: http://robinwilsonjohnston.edublogs.org See "Fixing Only What is Broken"
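The small-sample point above can be illustrated with a quick calculation. This is a hedged sketch with invented numbers: suppose every teacher's ESL students truly pass at the same 70% rate, and ask how often a teacher would still see only 2 or fewer of 5 students pass, purely by chance.

```python
# Hedged sketch (invented numbers): how noisy a five-student sample is.
# Assume all teachers' ESL students truly pass at a 70% rate; compute the
# exact binomial probability that a teacher sees <= 2 of 5 students pass.
import math

def prob_at_most(k_max, n, p):
    """Exact binomial P(X <= k_max) for n trials with success rate p."""
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(k_max + 1))

p_unlucky = prob_at_most(2, n=5, p=0.7)
print(round(p_unlucky, 3))  # 0.163 -- roughly 1 teacher in 6 looks
                            # "worse" by chance alone, with identical teaching
```

With samples this small, an apparent gap of one or two students between teachers is well within ordinary random variation, which is why the comparison the commenter describes cannot support a personnel decision.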

This is a very insightful piece. I agree with the overall gist but would suggest two important qualifications.

1. Yes, it is true that those who believe in the power of "data" fail to interpret it correctly - and that this is because they fail to begin with a theory or question. But the problem goes deeper than Quintero suggests. She writes, "Excessive faith in data crunching as a tool for making decisions has interfered with the important task of asking the fundamental questions in education, such as whether we are looking for answers in the right places, and not just where it is easy (e.g., standardized test data)." But before asking such a question, one should ask, "what are we hoping to find out here, and what does each of our key terms mean?" If our main question pertains to "achievement," one should ask, "how am I defining 'achievement' in this context, and why?" Give up or ignore such a question, and everything else gets muddled.

2. Instead of expanding the definition of "data" to include ethnographies, interviews, and so forth, I would *narrow* it but acknowledge that data do not provide all the important information. This is a technicality but a key one. Data, in their traditional sense, are comparable bits of measurable information that, when collected in large numbers and interpreted in relation to a research question, can yield insight. Student essays do not count as data, because you cannot treat them as comparable bits or break them down into comparable bits (rubrics aside). Some scholars would disagree with me and say that the term "data" has loosened over time to include things that can't be directly compared with one another. I see problems with such loosening: in particular, methodological confusion and corruption. Calling something "data" often sets the stage for breaking it down into comparable and supposedly measurable bits, when it should not be broken down in such a manner.
On the other hand, I see many reasons to consider things that do not fall under the stricter definition of "data." Just call them what they are: essays, interviews, or what have you. I wrote about this here: http://dianasenechal.wordpress.com/2012/06/30/the-misuse-of-data-the-word/ (non-satirical piece) and here: http://dianasenechal.wordpress.com/2012/11/10/student-shows-23-percent-growth-in-finding-central-idea/ (satirical piece). Also, for an alarming example of data-worship in a second-grade classroom, see this video (thanks to Robert Pondiscio for originally pointing it out and to James O'Keeffe for reminding me of it): http://blog.coreknowledge.org/2010/12/09/data-is-fabulous/ Note: I posted this comment on Diane Ravitch's blog as well.
