
Bunkum Award 2011

Not all think tank reports are created equal…

Among the 20 think tank reviews NEPC published in 2011, several got a “thumbs up” from NEPC’s expert reviewers. In the carefully considered judgment of those reviewers, the authors did a good job of applying sound social science research principles. Such reports made a contribution to our overall knowledge about the topics they examined. Those reviews are fulfilling for us.

Other reviews described think tank reports that are worthless and mundane, wasting the reader's time, obscuring what is already known, and contributing little if anything of value to the overall education policy discussion.

And then there were those special cases that shone through as prime exemplars of incompetent science. It is these marvels of multi-colored packaging garnished with impressive-looking footnotes, charts and appendices – these advocacy pieces cloaked in research panoply – that we illuminate with a particular type of recognition of merit: our annual Bunkum Award.

 

Watch the 2011 Bunkum Award Ceremony:

 

2011 Bunkum Award Honorees:

The 'Cancer is Under-Rated' Award - 2011 Grand Prize Winner

Progressive Policy Institute for NEPC Review: Going Exponential: Growing the Charter School Sector's Best (February 2011)

The Progressive Policy Institute deserves our top award for combining a weak analysis, agenda-driven recommendations, and the most bizarre analogy we have seen in a long time. This report spoke to us in ways matched by no other publication.

The report’s authors seemed to think it was a good idea – in a report advocating the exponential growth of their favorite charter schools – to compare those charters to delightful things such as viruses and cancer:

We also conducted research about when and how exponential growth occurs in the natural world, specifically examining mold, algae, cancer, crystals and viruses. We used these findings in addition to cross-sector lessons to fuel our thinking about fresh directions for the charter sector. The similarities between the natural world and organizational worlds are rather striking and useful for understanding the critical elements of exponential growth.

As our reviewer pointed out, the report never quite explains how the growth of cancer and viruses applies to the growth of charter schools. But even if the authors had somehow managed to do so, we can’t help but feel that the comparison is a bit tone deaf. Setting aside the example of crystals, which we think are rather interesting and attractive (NEPC is housed in Boulder, after all), what we take away from this comparison is that exponential growth in the natural world is not always desirable. It can, in fact, be quite deadly.

Beyond the analogy, the report suffers from an almost complete lack of acceptable scientific evidence or original research supporting the policy suggestions. It presents nine “lessons” or suggestions that are essentially common and vague aphorisms from the business world. Yet it fails to make the case that the suggestions or references are relevant to school improvement.

The 'Mirror Image (What You Read Is Reversed)' Award - 2011 First Runner-Up

Gates Foundation for NEPC Review: Learning About Teaching (December 2010)

Money can’t buy you everything. Yes, the $45 million allocated by the Gates Foundation to its “Measures of Effective Teaching” (MET) project is going to fund much more than this first-year evaluation report. And the MET project as a whole seems quite worthwhile. But one would hope that the evaluation of such a major project would be a serious effort. Unfortunately, that is not the case here. Using data from six school districts, the report examined correlations between student survey responses and value-added scores computed both from state tests and from higher-order tests of conceptual understanding. The study found that the measures are related, but only modestly, and the report interprets this as support for the use of value-added as the basis for teacher evaluations. The Gates Foundation touted the report as “some of the strongest evidence to date of the validity of ‘value-added’ analysis,” showing that “teachers’ effectiveness can be reliably estimated by gauging their students’ progress on standardized tests.” This would indeed be impressive were it not for the results suggesting exactly the opposite.

As our review points out, the data indicate that a teacher's value-added for the state test is not strongly related to her effectiveness in a broader sense. Most notably, value-added for state assessments is correlated 0.5 or less with that for the alternative assessments included in the study, meaning that many teachers with low value-added using one test are in fact quite effective when judged by the other. As there is every reason to think that the problems with value-added measures apparent in the MET data would be worse in a high-stakes environment, these initial MET results raise serious questions about the value of student achievement data as a significant component of teacher evaluations – even if that's not what the authors of the Gates report would have readers believe.
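
To see why a correlation of 0.5 leaves so much room for disagreement between measures, consider the following illustrative sketch (ours, not the MET report's): it simulates two noisy value-added measures correlated at roughly 0.5 and counts how many teachers who fall in the bottom quartile on one measure nonetheless land above the median on the other. The sample size, the 0.5 target, and the normality assumption are ours, chosen only to mirror the figure cited in our review.

    # Illustrative sketch (assumptions ours, not the MET report's): two measures
    # of teacher effectiveness correlated at about 0.5, and the share of teachers
    # rated in the bottom quartile by one measure who are above the median on the other.
    import numpy as np

    rng = np.random.default_rng(0)
    n_teachers = 10_000
    r = 0.5  # assumed correlation between the two value-added measures

    true_effect = rng.standard_normal(n_teachers)
    # Two measures sharing enough common variance to correlate at roughly r.
    measure_a = np.sqrt(r) * true_effect + np.sqrt(1 - r) * rng.standard_normal(n_teachers)
    measure_b = np.sqrt(r) * true_effect + np.sqrt(1 - r) * rng.standard_normal(n_teachers)

    print("observed correlation:", np.corrcoef(measure_a, measure_b)[0, 1].round(2))

    bottom_quartile_a = measure_a < np.quantile(measure_a, 0.25)
    above_median_b = measure_b > np.median(measure_b)
    share = (bottom_quartile_a & above_median_b).mean() / bottom_quartile_a.mean()
    print(f"bottom-quartile teachers on A who are above the median on B: {share:.0%}")

In this toy setup, on the order of a quarter of the teachers flagged as low performers by the first measure look better than average on the second, which is precisely the kind of disagreement our reviewer highlights.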

The 'If Political Propaganda Counted as Research' Award

Center for American Progress for NEPC Review: Charting New Territory: Tapping Charter Schools to Turn Around the Nation's Dropout Factories (June 2011)

Competent social scientists base their findings and draw their conclusions from the evidence they have gathered using rigorous and appropriate research methods. This report reverses the process. Drawing on mysterious backwards-engineering techniques, the authors of this report were able to build a foundation for their findings and conclusions that (when looked at with just the right mood lighting) bears a glancing resemblance to real evidence. For this achievement, we felt they deserved our recognition.

The report’s citations to “research” literature about school turnarounds, for instance, consisted of four references: a blog, a consultant’s template, a non-peer reviewed case study, and an article from the Hoover Institution journal Education Next. The report also focused on the ostensibly inspiring improvements of one school that, after concentrated, intensive and skillful charter management, catapulted English Language Arts proficiency rates to 14.9% and math proficiency rates all the way to 7%.

Sadly, the inspired imagination of the authors was lost on our expert reviewer, who wrote, “The report is designed and presented in a manner more consistent with political propaganda than as a research document.”

The 'If Bernie Madoff Worked in School Finance' Award

ConnCAN for NEPC Review: Spend Smart: Fix Our Broken School Funding (March 2011)

This report from a Connecticut-based education advocacy group aims to Fix Our Broken School Funding System with a reverse Robin Hood take-from-the-poor approach nicely wrapped in a pious package of verbiage designed to hide its true effects. In truth, few areas of school policy are as ripe for legitimate critique as the way states allocate funds to schools. A disconcerting level of arbitrariness and inadequacy often marks these funding formulas. Yet this report doesn’t make use of well-established research conventions (adequacy or equity studies), or for that matter any sensible approach for determining if a formula is in fact broken. Instead, it promotes a “money follows the child” funding system that our reviewer points out would have the effect of making the system even more inequitable by shifting funding away from students learning English and those in poverty.

The report’s empty claims are perfectly combined with its evidentiary barrenness. Consider, for instance, the genealogy of the report’s claim that the current formula provides low-income children with only an additional 11.5% of funding. That claim is based on a previous ConnCAN report, which refers readers to information in a footnote, which then refers readers to that report’s appendix. But pity the intrepid reader who makes it that far; the appendix provides no justification or further reference to the phantom 11.5% figure. Our reviewer pointed to similar evidentiary black holes regarding the report’s claims about charter school funding and performance. And yet another instance of fantasy numbers comes from the report’s recommended removal of funding for English language learners based on the contention that these children have already been counted as low-income children. There is no compelling evidence to this effect in the literature – nor is there any in the report.

The 'Discovering the Obvious While Obscuring the Important' Award

Third Way for NEPC Review: Incomplete: How Middle Class Schools Aren't Making the Grade (September 2011)

This Third Way report is not shy about its intent: to “ignite the conversation about middle-class schools.” In particular, the report attempts to convince its readers that middle-class schools are doing a lot worse than we think. So what do we think? Years of research tell us that family wealth is an excellent predictor of test scores. One would therefore expect middle-class students to do substantially better than poor students but substantially worse than wealthy students. As our reviewer shows, that’s exactly how well these schools do. Middle-class schools score (…brace yourselves for this shocking finding…) in the middle.

What, then, is the basis of the conversation Third Way is attempting to ignite? We’re not sure. That’s because in a normal conversation, one can understand what the other person is saying. Yet this report mixes and matches data sources and units of analysis to such an extent that it’s almost impossible for readers to figure out which analyses go with which data. Even more troubling, since the report defines “middle class” as having between 25% and 75% of students qualifying for free and reduced lunch, its analyses of district-level data include the urban school districts in Detroit, Philadelphia, Houston and Memphis. Third Way appears to have found a new way to address urban poverty: define it out of existence.