
VAMboozled!: It’s a VAM Shame...

It’s a VAM shame to see how VAMs have been used, erroneously treated as perfect-to-near-perfect indicators of educational quality, to influence educational policy. A friend and colleague of mine just sent me a PowerPoint that William L. Sanders – the developer of the Tennessee Value-Added Assessment System (TVAAS), now more popularly known as the Education Value-Added Assessment System (EVAAS®), and “arguably the most ardent supporter and marketer of [for-profit] value-added” (Harris, 2011; see also prior links about Sanders and his T/EVAAS model here, here, here, and here) – presented to the Tennessee Board of Education back in 2013.

The simple and straightforward (and hence policymaker-friendly) PowerPoint, titled “Teacher Characteristics and Effectiveness,” consists of seven slides with figures illustrating three key points: teacher value-added as calculated using the TVAAS model does not differ by (1) years of teaching experience, (2) teachers’ education level, and (3) teacher salary. In other words, translated into the simpler terms that have greatly influenced (and continue to influence) educational policy: (1) years of teacher experience do not matter, (2) advanced degrees do not matter, and (3) teacher salaries do not matter.

While it’s difficult to determine how this particular presentation influenced educational policy in Tennessee (see, for example, here), at a larger scale these are the three key policy trends that have since directed (and continue to direct) state policy initiatives in particular. What is trending in educational policy is to evaluate teachers only by their teacher-level value-added. At the same time, this “research” supports simultaneous calls to dismantle teachers’ traditional salary schedules, which reward teachers for their years of experience (which matters, as per other research) and advanced degrees (on which other research is mixed).

This “research” evidence is certainly convenient when calls for budget cuts are politically in order. But this “research” is also more than unfortunate, in that the assumption underlying all of this is that VAMs are perfect-to-near-perfect indicators of educational quality; hence, their output data can and should be trusted. Likewise, all of the figures illustrated in this and many other similar PowerPoints can supposedly be wholly trusted because they are based on VAMs.

Despite the plethora of methodological and pragmatic issues with VAMs, highlighted here within the first post I ever published on this blog and also duly noted by the American Statistical Association as well as other associations (e.g., the National Association of Secondary School Principals (NASSP) and the National Academy of Education), these VAMs are being used to literally change and set bunkum educational policy, because so many would rather not be bothered with the truth, however inconvenient.

Like I wrote, it’s a VAM shame…

This blog post has been shared by permission from the author.
Readers wishing to comment on the content are encouraged to do so via the link to the original post.
Find the original post here:

The views expressed by the blogger are not necessarily those of NEPC.

Audrey Amrein-Beardsley

Audrey Amrein-Beardsley, a former middle- and high-school mathematics teacher, received her Ph.D. in 2002 from Arizona State University (ASU) from the Division of...