Curmudgucation: Do 3rd Grade Tests Predict Anything?
The National Center for Analysis of Longitudinal Data in Education Research (CALDER) is a program of the American Institutes for Research (AIR) along with an assortment of universities. They've just released a working paper version of some research that some people think is pretty significant. I've read it so you don't have to. Let me explain what they found, and why nobody needs to get excited about it.
AIR is better known for their work in test manufacturing and marketing, which gives them a dog in this particular hunt, but I'm not sure that really needs to concern us here.
"Assessing the Accuracy of Elementary School Test Scores as Predictors of Students' High School Outcomes" was written by Dan Goldhaber (AIR/CALDER/University of Washington), Malcolm Wolff (University of Washington), and Timothy Daily (EdNavigator). EdNavigator is a business positioned to help parents navigate a complicated education-flavored-products market--again, I'm not sure we need to be concerned here, other than to note that many of the involved parties are not exactly disinterested observers of the Big Standardized Test biz. Oh yeah--the Carnegie Foundation did some of the funding for this.
There's a lot of fancy math in this paper, but the finding is pretty straightforward--studying student test results in North Carolina, Massachusetts, and Washington, the researchers found that third grade test scores correlate with high school outcomes, namely high school test scores, advanced course-taking, and graduation.
The upshot is that third grade tests correlate with those outcomes about as well as eighth grade tests do. They also found that poverty and race/ethnicity correlate with lower levels of those high school outcomes.
There is nothing surprising here. They have been painstaking in their figuring and crunching and mathing the crap out of their data. But ultimately, they have asked the wrong question.
Because--say it with me--correlation is not causation.
We know, via many many many pieces of research, that Big Standardized Test scores are directly tied to socioeconomic status. Both third grade and high school tests.
We know that socioeconomic status correlates with other life outcomes, like graduation, jobs, and so on. We can even talk about how it correlates with baloney like the Success Sequence (which just puts things ass-backwards, declaring that if you wear large clothes, that's what caused you to get bigger).
We know all about these correlations, and they point pretty clearly to SES as a cause. Research like this isn't a total waste of time--if third grade scores were a terrible predictor of high school outcomes, we'd know something was definitely cattywumpus somewhere in the system--but it isn't helpful, because it's asking the wrong question.
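If you want to see how a lurking variable like SES can manufacture a "predictive" correlation all by itself, here's a minimal simulation sketch. This is my own toy illustration, not anything from the paper: it assumes standardized scores, a single confounder, and made-up variable names (ses, grade3, hs_outcome) chosen just for the example.

```python
# Toy illustration (not the paper's model): one confounder (SES) drives both
# a grade 3 score and a high school outcome, with no causal arrow at all
# between the two scores themselves.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

ses = rng.normal(size=n)                     # socioeconomic status (standardized)
grade3 = 0.6 * ses + rng.normal(size=n)      # grade 3 score: SES plus noise
hs_outcome = 0.6 * ses + rng.normal(size=n)  # HS outcome: SES plus noise,
                                             # NOT influenced by grade3 at all

r = np.corrcoef(grade3, hs_outcome)[0, 1]
print(f"correlation between grade 3 score and HS outcome: {r:.2f}")
# Prints roughly 0.26 -- a tidy "predictive" relationship, even though
# changing any individual student's grade3 score here would change nothing.
```

The two scores come out correlated even though neither has any causal effect on the other--which is exactly why "third grade scores predict high school outcomes" doesn't tell us that raising third grade scores would change anything.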
What remains unproven is this--if you take a student who would have scored 60 on the third grade test and somehow get them to score 80 or 90, would that improve the student's later outcomes?
We have (and have had for years) evidence that the answer is no, that raising student test scores does not improve student outcomes.
We have been subjected to a multi-decade parade of reformsters asserting and assuming that raising student test scores would unlock a host of benefits for our students, our economy, our entire nation. But instead of proof, we've just had tautological research proving that students who do well in school do well in school, or evidence that your socioeconomic background is well-measured by the Big Standardized Test no matter what grade you're in.
What we really need is research that asks the right question. This paper is not that.