William J. Mathis, (802) 383-0058, email@example.com
Casey D. Cobb, (860) 486-6278, firstname.lastname@example.org
URL for this press release: http://tinyurl.com/7agpr5w
BOULDER, CO (April 19, 2012) – Three recent reports on the Milwaukee Parental Choice Program (MPCP), produced by the School Choice Demonstration Project at the University of Arkansas, use largely sound methods, but the data they assemble provide little support for the 22-year-old school voucher program.
Those are the conclusions of three separate reviews of the reports, released today.
The reviews are all written by Casey D. Cobb of the University of Connecticut. They are published by the National Education Policy Center, housed at the University of Colorado Boulder School of Education.
The three reports were produced by the School Choice Demonstration Project, which has conducted a five-year longitudinal growth study of the Milwaukee voucher program. Milwaukee’s program, which was created by state legislation in 1990, enables low-income residents of the Milwaukee Public School district (MPS) to enroll at taxpayer expense in private schools that have been certified by the state Department of Public Instruction.
The reports under review are Nos. 29, 30, and 32, two of which offered what superficially appear to be positive findings:
- A sample of elementary and middle school MPCP students outperformed a matched sample of Milwaukee Public Schools (MPS) students in reading in the fifth year of the program. The MPCP sample also showed trends of outscoring the MPS sample in math, but these were not statistically significant (No. 29).
- Voucher students who attended a private school in 8th or 9th grade in 2006 “were more likely to graduate high school,” “enroll in a four-year post-secondary institution,” and “persist in that four-year institution beyond the first year of enrollment” (No. 30).
- Comparisons of the test performance of MPCP students with that of a sample of students from the Milwaukee Public Schools revealed mixed findings with no clear pattern (No. 32). For example, a sample of low-income MPS students scored higher than MPCP students on average in 4th grade reading, math, and science, and in 8th and 10th grade math. The MPCP students scored higher than the MPS sample in 8th and 10th grade reading and science.
In his reviews, Cobb found that the study comparing voucher and MPS elementary and middle school test scores (No. 29) used sound and appropriately qualified methods. But he also cautioned that the study overgeneralized its findings.
The report’s authors acknowledged that their findings were surprising in light of earlier data showing no differences between the MPCP and MPS samples, and the report suggested that the jump in the final year may have been due to the introduction of a high-stakes accountability policy for the private schools just before the fifth year. But in his review, Cobb observes that the failure of the voucher students’ math scores to show a similar jump raises questions about that explanation – as well as about what, if anything, can be learned from the study as a result.
Regarding the study comparing graduation and post-secondary enrollment and persistence rates of MPS and MPCP students (No. 30), Cobb raised methodological concerns.
By 12th grade, he notes, roughly three out of four of the original 801 MPCP 9th graders were no longer enrolled in a participating private school. The sample attrition “severely clouded” the inferences that could be legitimately drawn about MPCP’s real impact on graduation rates.
Additionally, only one of the findings in the study’s most carefully controlled analytic models was statistically significant by conventional measures. Both limitations, Cobb writes, prevent broad conclusions about whether MPCP really improves graduation and higher education continuation rates over MPS.
Of the third study, on comparative test performance of MPCP and MPS students in grades 4, 8, and 10 in reading, math, and science, Cobb agrees that it reveals mixed findings with no clear pattern. MPS students scored higher than MPCP students in 4th grade in all three areas, and in 8th and 10th grade in math. MPCP students, however, scored higher than MPS students in reading and science in those two upper grades.
“The results are not particularly useful beyond providing a snapshot of how MPCP students and a comparison group of low-income MPS students performed on a battery of state exams,” Cobb writes. Moreover, a supplementary analysis identifying more “very low” performing MPS schools employed “arbitrary cut scores” and was potentially biased by unequal group sample sizes, he adds.
None of the three reports, Cobb concludes, provides substantial support for the voucher program. To some extent, this is because of specific methodological or analytical shortcomings. But it is also because the data and the reports simply fail to demonstrate that voucher schools are associated with improved outcomes.
Find Casey D. Cobb’s reviews on the NEPC website at:
Find the three reports from the Milwaukee School Choice Demonstration Project on the web at:
The Think Twice think tank review project (http://thinktankreview.org) of the National Education Policy Center (NEPC) provides the public, policy makers, and the press with timely, academically sound reviews of selected publications. NEPC is housed at the University of Colorado Boulder School of Education. The Think Twice think tank review project is made possible in part by support provided by the Great Lakes Center for Education Research and Practice.
The mission of the National Education Policy Center is to produce and disseminate high-quality, peer-reviewed research to inform education policy discussions. We are guided by the belief that the democratic governance of public education is strengthened when policies are based on sound evidence. For more information on the NEPC, please visit http://nepc.colorado.edu/.
This review is also found on the GLC website at http://www.greatlakescenter.org/