Can We Reverse the Wrong Course on Data and Accountability?

New NEPC report and model legislation offer a positive alternative to today’s
poor uses of student data and punitive approaches to accountability

 

Contact: 
William J. Mathis, (802) 383-0058, wmathis@sover.net
Andy Hargreaves, (617) 552-0680, andrew.hargreaves@bc.edu
Henry Braun, (617) 552-4638, henry.braun@bc.edu
Kathleen Gebhardt, (303) 588-8804, kgebhardt@childrens-voices.org

URL for this press release: http://tinyurl.com/ksydtx9
 

BOULDER, CO (October 22, 2013) – A new report by two professors at Boston College urges American schools to use data and accountability policies in the more successful ways now seen in high-performing countries and in other sectors of U.S. society.

In their report, Data-Driven Improvement and Accountability, authors Andy Hargreaves, the Thomas More Brennan Professor of Education in the Lynch School of Education, and Henry Braun, the Boisi Professor of Education and Public Policy in the Lynch School of Education, find that the use of data in the U.S. is too often limited to simply measuring short-term gains or placing blame, rather than focusing on achieving the primary goals of education. The report is published by the National Education Policy Center (NEPC), which is housed at the University of Colorado Boulder.

The report’s findings have national significance because data-driven improvement and accountability (DDIA) strategies in schools and school systems are now widespread. When used thoughtfully, DDIA provides educators with valuable feedback on their students’ progress by pinpointing where the most useful interventions can be made. DDIA also can give parents and the public accurate and meaningful information about student learning and school performance.

However, in the United States, measures of learning are usually limited in number and scope, and data suggesting poor performance by schools and teachers are often used punitively. The new report explains how the current system of student data collection more often than not creates “perverse incentives” for educators to narrow the curriculum, teach to the test and allocate their efforts disproportionately to students who yield the quickest test-score gains, rather than to those with the greatest needs.

The end result, the authors find, is double jeopardy:

  • First, accountability impedes improvement. Under pressure to avoid poor scores and unpleasant consequences, many educators concentrate their efforts on narrow tasks such as test preparation and coaching targeted at students whose improved results will contribute most to their schools’ test-based indicators.
  • Second, accountability is undermined because there is a clear incentive to “game the system” to get scores up quickly.

To ensure that student improvement becomes the main driver of DDIA – and not simply an afterthought to accountability concerns – Hargreaves and Braun offer two key recommendations:

  • Base professional judgments and interventions on a wide range of evidence and indicators that properly reflect what students should be learning.
  • In line with best practice in high-performing countries and systems, design systemic reforms to promote collective responsibility for improvement, with top-down accountability serving as a strategy of last resort when that collective responsibility falls short.

A report containing model legislation, authored by attorney Kathleen Gebhardt, accompanies Data-Driven Improvement and Accountability. The legislation, based on the Hargreaves and Braun brief, details a legal structure that would use data effectively to create a multi-level system of accountability designed for school improvement.

“We as a nation are faced with three choices,” explains Gebhardt. “We can continue to incorporate data and accountability in dysfunctional systems; we can abandon these policy tools as inherently counter-productive; or we can design and implement systems that use these tools in sensible, positive ways. This model code is intended to provide a foundation for the last of these choices.”

The flawed use of DDIA in much of U.S. education has significant ramifications. “When accountability is prioritized over improvement, DDIA neither helps educators make better pedagogical judgments nor enhances educators’ knowledge of, and relationships with, their students. Instead of being informed by the evidence, educators become driven to distraction by narrowly defined data that compel them to analyze dashboards, grids and spreadsheets in order to bring about short-term improvements in results,” Hargreaves and Braun caution.

“Schools are driven to place too much emphasis on test scores that capture what can be easily measured. In doing so, they neglect other important skills and qualities that are difficult to quantify. Numerical data should also be combined with the collective professional judgment of teachers if effective decisions are going to be made that truly promote students’ learning and development,” Hargreaves says.

The report’s authors draw comparisons with uses of data in business and in professional sports. Thoughtful application of DDIA in those sectors, employing a wide range of measures, has given rise to sustainable, high-level performance. For example, the authors highlight the Oakland Athletics, the first baseball team to rely on players’ performance statistics for recruitment; as a result, the club has routinely made the play-offs, often beating teams with much higher payrolls. “The point here is not just that effective teams use data but that the data are valued by everyone, and analyzed together with shared responsibility for improvement,” says Hargreaves.

The authors note the implications for public education. “Policymakers must take responsibility for assuring the public not only that the range and quality of the indicators used for improvement and/or accountability are adequate to these tasks, but also that educators have sufficient resources and support to effectively implement improvement strategies,” Braun says.

Hargreaves and Braun conclude in their report, “Data are neither a substitute nor a surrogate for professional judgment. The purpose of data is to support, stimulate and inform the judgment that is necessary for educational improvement and accountability. Expertise has no algorithm. Wisdom does not manifest itself on a spreadsheet. Numbers must be the servant of professional knowledge, not its master. Educators can and should be guided and informed by data systems; but never driven by them.”

Find the report, Data-Driven Improvement and Accountability, by Andy Hargreaves and Henry Braun, and the companion model legislation by Kathleen Gebhardt, on the web at
http://nepc.colorado.edu/publication/data-driven-improvement-accountability/

Funding for the policy brief and the legislative brief was provided in part by the Great Lakes Center for Education Research and Practice (greatlakescenter.org). In addition, the Ford Foundation provided funding for the policy brief.

The mission of the National Education Policy Center is to produce and disseminate high-quality, peer-reviewed research to inform education policy discussions. We are guided by the belief that the democratic governance of public education is strengthened when policies are based on sound evidence. For more information on NEPC, please visit http://nepc.colorado.edu/. For more information on the Ford Foundation-funded project, called the Initiative on Diversity, Equity, and Learning (IDEAL), please visit http://nepc.colorado.edu/ideal.

The policy brief and the model legislation are also found on the Great Lakes Center website,
http://www.greatlakescenter.org