Curmudgucation: Pre-K and the Long Haul
You may remember a study from Tennessee that suggested that pre-K actually led to worse results for students further down the road. It was a little alarming, and lots of folks tried hard to come up with an explanation, because it looked like the worst results happened to the poorest kids. Threw a major monkey wrench in the whole Universal Pre-K Will Bring Equity To Education thing.
Well, just hold on there for a second.
Now there's a study from Oklahoma that suggests you just have to take a longer view to find the benefits. A brand new study finds that preschool grads were way more likely to go to college, either right after high school or within a year or two. Here it is, summed up nicely:
“Don’t give up on the protagonist until the story is told,” said William Gormley, a professor of government and public policy at Georgetown University and co-director of its Center for Research on Children in the United States, which has overseen much of the Tulsa research. “This is a classic story of a big bounce from pre-K in the short run, followed by disappointing fade out in standardized test scores in the median run, followed by all sorts of intriguing, positive effects in the longer run, and culminating in truly stunning positive effects on college enrollment.”
There are more studies roughly in line with this new one. What's still missing is an explanation. The explanations on offer include:
The parents who are likely to send their kids to preschool are the same ones likely to send their kids to college. It's just a Family That Values Education thing. Researchers think they can kind of sort of adjust for that, and still find that preschool improves college chances, probably, kind of. But they can't come up with any clear data for Head Start grads.
The preschool program could actually help with both attitudes toward and attainment of education. But if that's the case, why the big dips in the intervening years?
I have a theory.
You are growing a tree, and you decide to start measuring it with a measuring stick that you came up with yourself. You measure and measure with your made-up measuring stick, and find no signs of growth--in fact, at one point you find the tree has shrunk. Then finally, you measure the tree with a regulation yardstick, and find that it has grown far more than your previous DIY measurements showed.
You are trying to boil a pot of water. You measure the water temperature with an instrument that you came up with yourself, and it consistently tells you that the water is 45 degrees. Then, thirty seconds later, you see the water is boiling.
You can reach two conclusions here:
1) The stuff that I'm measuring is behaving in strange, mysterious, and counter-intuitive ways. We will have to figure out what is causing this strange behavior.
2) My DIY measuring instruments are crap. I should throw them away.
In other words, despite decades of insistence to the contrary, Big Standardized Test data is not predictive of college attainment.
The data is largely junk. First, the tests are not good. Second, before the tests can collect useful data, students have to care.
It's the same thing with the infamous middle school dip, the drop in scores that schools see from their 8th grade test takers. It has baffled districts and led more than a few to reorganize so that 8th graders are folded in with either higher or lower grades, thereby blunting the dip. It's a mystery. Why do 8th graders lose so much learning?
The mystery can be solved by the two-step process of A) meeting middle school students and B) watching them actually take these tests. You will not find any group of people more tired of taking the damned test and more likely to be unmoved by what the olds want from them. If you want to measure middle school educational attainment, you could not devise a worse system than giving them a Big Standardized Test after giving them Big Standardized Tests for the previous seven years of their lives.
We get sign after sign that the Big Standardized Test does not measure what we want it to measure, but we keep ignoring them.
I'm a fan of pre-K done right, so hooray for some research supporting that, I guess. But I wish we could learn some of the other lessons hinted at here.
This blog post has been shared by permission from the author.
Readers wishing to comment on the content are encouraged to do so via the link to the original post.
Find the original post here:
The views expressed by the blogger are not necessarily those of NEPC.