
The Conversation: The Business of Rankings: Did US News & World Report Make Substantial Mistakes?

What school rankings tell us and what they don’t. Report card image via www.shutterstock.com


Here’s an easy task: choose between the following two schools for your child.

School #1 provides high-quality instruction and strong supports in order to academically accelerate students and challenge them to take difficult courses.

School #2 provides high-quality instruction but doesn’t push students to do more than what is normally expected.

If you’re an actual parent, you probably prefer school #1.

But if you’re US News & World Report, selecting the “Best High Schools” in America, you apparently prefer school #2.

This outcome offers a cautionary tale about why it’s never a good idea to design a formula and start plugging in numbers before first understanding the system you’re analyzing. A seemingly small mistake can lead to big problems.

Those problems arose in New York State, where US News treated three very different math tests as if they were equivalent, and its rankings ended up penalizing schools that encouraged students to take on challenges.

Let me explain why I started researching this issue.

As a policy researcher, I focus on practices that can close opportunity gaps. In this context, I have examined long-term effects on the achievement of students at South Side High School, a diverse suburban school on Long Island. It is one of the excellent schools that lost its high (“Gold”) ranking from US News because of the publication’s altered approach.

And why did that happen?

Here is how the rating process works

US News begins its rating process by calculating proficiency rates for reading and math on what it describes as “state exit exams” or “high school proficiency tests” (both terms are used).

In New York, these exit exams or proficiency exams are (or at least should be) what US News calls “the required Integrated Algebra and Comprehensive English exams.”

But the publication then violated its own guidelines about which tests should be counted and decided to also include scores on optional, more difficult tests.

New York students can, in addition to taking the requisite Algebra exam, take one or both of two elective Math Regents exams, in Geometry and Algebra 2/Trigonometry. All three of these standardized math exams are a part of the state’s assessment system. The optional tests are taken by fewer students because they serve a different purpose.

In 2013, the year that supplied the data for the current US News rankings, approximately 290,000 students took the Algebra exam, 162,000 took Geometry and 116,000 took Algebra 2/Trigonometry.

As the decreasing numbers suggest, these elective end-of-course tests are not exit exams or proficiency exams. They are substantially more difficult and are taken mainly by a self-selected subgroup of students attempting to achieve an advanced diploma. Yet these three math exams are all treated equally in the US News analyses.

Rankings give misleading information

To illustrate this approach and its inherent problem, imagine an evaluation of schools based on students’ high-jumping ability. Schools are ranked higher if more students clear a high-jump bar set at a minimum of four feet, a standard generally met by 80% of students in this hypothetical state. But if the bar is set at five feet, only 50% succeed; if it’s set at six feet, only 25% succeed.

Now imagine two schools:

School #1 pushes as many students as possible to not just clear the four-foot bar, but also to attempt jumps over bars set at five feet and six feet.

School #2 pushes only its best athletes to attempt these more challenging jumps.

Finally, imagine a ranking system that counts all jump attempts, regardless of where the bar is set. School #2 looks good in the rankings, with 80% success, but School #1 sees its ranking drop, due to all those unsuccessful attempts to clear higher bars, even though 80% of its students also clear the four-foot bar.

When US News counted results on all three math tests taken by a high school’s students, it penalized schools striving to achieve.

Two schools may both have a 95% pass rate on the Integrated Algebra test, but if School #1 convinces many of its students to take the more difficult courses and exams as well, then its overall rates will surely fall — and US News will consequently and misleadingly advise its readers that School #1 is worse.
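To make the arithmetic concrete, here is a minimal sketch with invented numbers (these are hypothetical figures, not US News data, and the calculation is only a simplified stand-in for its methodology). It shows how pooling results across exams of different difficulty drags down the rate for the school that encourages more students to attempt the optional exams:

```python
# Hypothetical illustration: pooling pass rates across required and optional
# exams penalizes the school that pushes more students to attempt the harder
# tests. All numbers below are invented for illustration only.

def pooled_pass_rate(exam_results):
    """exam_results: list of (tests_taken, tests_passed) pairs, one per exam."""
    taken = sum(t for t, _ in exam_results)
    passed = sum(p for _, p in exam_results)
    return passed / taken

# School 1: 95% pass the required Algebra exam, and most students are also
# encouraged to attempt the harder Geometry and Algebra 2/Trig exams.
school_1 = [(200, 190),   # Integrated Algebra: 95% pass
            (160, 96),    # Geometry: many attempt, fewer pass
            (120, 54)]    # Algebra 2/Trig: many attempt, fewer pass

# School 2: the same 95% Algebra pass rate, but only its strongest students
# sit the optional exams.
school_2 = [(200, 190),   # Integrated Algebra: 95% pass
            (40, 36),     # Geometry: only top students attempt
            (25, 23)]     # Algebra 2/Trig: only top students attempt

print(f"School 1 pooled rate: {pooled_pass_rate(school_1):.0%}")  # about 71%
print(f"School 2 pooled rate: {pooled_pass_rate(school_2):.0%}")  # about 94%
```

Both schools get 95% of their students over the required bar, yet the school that broadens access to the harder exams looks far weaker once every attempt is counted the same way.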

School rankings tell only part of the story. Rankings image via www.shutterstock.com

A scan of the New York results does, in fact, turn up schools designated “Gold” by US News that, on key measurable criteria, perform worse than schools with similar proportions of economically disadvantaged students that received no such designation.

These left-out schools have higher college readiness rates and higher rates of students awarded a Regents Diploma “with advanced designation” (meaning they passed the most difficult math test, Algebra 2/Trigonometry).

This lapse is significant given the importance of the very well-promoted US News rankings. Powerful constituencies in school communities pay attention, and district leaders hear from them.

The rankings are not low stakes; they therefore put in place a series of incentives and disincentives.

With strong enough incentives, people do nonsensical things

In sports, NBA teams will sometimes tank in order to better their lottery chances, and college teams will respond to incentives to schedule so-called cupcake opponents to improve their win-loss records.

Politicians will cater to the interests of donors and the wealthy even if those interests differ greatly from the preferences of the average voter.

Schools will set aside creative, meaningful lessons in favor of test prep, in order to avoid “No Child Left Behind” sanctions.

Colleges and universities will create early decision systems in order to improve their rating on the US News Best Colleges list. One measure used in the rankings is how many admitted students choose another college. Early decision simply doesn’t give admitted students that choice.

According to Paul Glastris and Jane Sweetland, authors of The Other College Guide: A Road Map to the Right School for You:

“By elevating selectivity, US News creates incentives for schools to game the system by raising admissions standards and accepting fewer students who are less prepared or from lower-income backgrounds — that is, the ones most likely to need extra help graduating.”

How I got involved

Coming back to the issue with the rankings in the case of New York’s South Side High School: in May of this year, the school’s principal, Carol Burris, coauthored a piece on the Washington Post’s website that raised some of the problems with the rankings.

Burris had, back in 2013, convinced me to join her in a project to create an alternative way to recognize great high schools. She explained that while existing lists do identify many high-quality schools, the approaches underlying these lists generally fail to recognize excellent schools that do not enroll large numbers of already high-scoring students.

South Side High School had long been ranked highly by US News, including during the time when I conducted research there, and Burris was certainly happy that her school was highly ranked. Moreover, I can attest, based on my research there, that the school is deserving of the honors it has received.

A long and consistent body of educational research stands for the basic tenet that great schools are those that engage and support student learning. No reputable research contends that schools become high-quality by enrolling the strongest students or by declining to encourage students to take the most challenging classes and exams.

Accordingly, we designed a project that we called Schools of Opportunity, which recognizes schools for using research-based best practices and for challenging and supporting their entire student body — including those without rich opportunities to learn outside of school.

It is built on criteria set forth in the 2013 book, Closing the Opportunity Gap, which I co-edited with Stanford University Professor Prudence Carter.

We piloted it this past school year in Colorado and New York, recognizing 17 high schools for their excellence. We plan to scale the project up next year, to recognize schools across the US.

Meanwhile, the ranking approach used by US News has led parents to wonder what happened to their communities' formerly excellent schools. Seeking answers, the principals who coauthored the Washington Post piece contacted US News. The email chain shows how they were rebuffed.

Also, when Valerie Strauss of the Washington Post asked US News to respond to the concerns raised about the high school rankings, they responded with the following statement:

“This test was used for all schools in New York – it was applied equally across the state. Because [our calculations] are relative, schools are being measured against each other. … We applied the data in the way that we outlined in our methodology and technical appendix and these rankings reflect that methodology. We are very clear in our methodology and technical appendix about what assessment data we used and how it was applied.”

Compare this explanation with the argument offered in favor of standardized testing, published just a week earlier in the satirical newspaper The Onion: “Every student [is] measured against [the] same narrow, irrelevant set of standards.”

An unreasonable approach remains unreasonable even if it is applied equally to all schools and is openly explained in a methods section.

This blog post has been shared by permission from the author.
Readers wishing to comment on the content are encouraged to do so via the link to the original post.

The views expressed by the blogger are not necessarily those of NEPC.

Kevin G. Welner

Professor Kevin Welner teaches educational policy and law at the CU Boulder School of Education. He’s also the director of the National Education Policy Center.