The Answer Sheet: A Critical Look at the Annual High School Rankings by U.S. News

U.S. News & World Report is famous for its college rankings — but it does others too, including an annual ranking of high schools. Are the schools declared the best really the best? Kevin Welner, the director of the National Education Policy Center at the University of Colorado Boulder, examines this issue in this post and the following post. Welner is an attorney and a professor of education policy at CU Boulder. He is also the co-founder of the Schools of Opportunity project, an effort to identify and highlight high schools that use research-based practices to ensure that all students have rich opportunities to succeed.

A version of this appeared on The Conversation, and I asked U.S. News to comment. The magazine took issue with Welner’s argument in a statement, which you can find at the end of this post.

By Kevin Welner

PART ONE

Rankings hold an odd sway over Americans. Assigning a numerical order somehow transforms a subjective opinion into an objective fact. It seems to matter little how the rankings are calculated or the care with which those calculations are undertaken.

I have recently been taking a closer look at the U.S. News and World Report ranking of “best high schools,” the nation’s most influential high school recognition program. In addition to surprising instances of carelessness, I discovered that the publication had deleted evidence of those mistakes, and I found a troubling pattern whereby the rankings, which are supposed to recognize quality schooling, actually reward elite enrollment instead.

Given that this write-up is a bit lengthy, I will divide it into two separate parts. The first points out specific technical problems with the rankings. I briefly summarize the key problem that Carol Burris, Sean Feeney, and John Murphy identified and described here on The Answer Sheet, and I then provide additional evidence of how this problem plays out. I also describe how U.S. News apparently responded by removing evidence.

The second part explores more fundamental problems with the rankings that would remain in place even if the technical problems were fixed. I will look at the results of the U.S. News ratings process and question whether it yields rankings that meaningfully identify “best” high schools.

A Big Problem, Now Concealed

The U.S. News rating process this year altered its past approach for determining whether a school’s students attained proficiency. To illustrate, imagine students A, B, and C entering high school in Grade 9. In the past, U.S. News used what is known as a “cohort” approach to evaluate school quality (a cohort here being the group of entering ninth graders at a school). The question it would ask is whether these three students passed their state-level proficiency exams.

This year, the publication instead took a snapshot of test results in a school in the measured year. So the test scores of A, B, and C may or may not appear, depending on whether they took a proficiency exam that year. Further, the test scores of students outside their cohort might appear, if those non-cohort students took a test that U.S. News decided to include. Moreover, in high-performing districts, one or more of these students might be accelerated and thus may have already passed the math proficiency exam in grade 8, before entering high school. When that occurs, the score does not count at all.
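
To make the distinction concrete, below is a minimal sketch, written in Python with entirely invented students and outcomes, of how the two approaches can diverge for the same school. This illustrates the general idea only; it is not U.S. News’ actual formula or data.

```python
# Hypothetical illustration of cohort vs. annual-snapshot proficiency.
# All students, years, and outcomes are invented.

# Each record: (student, in_cohort, year_tested, passed)
records = [
    ("A", True,  2012, True),   # accelerated: passed the math exam in Grade 8
    ("B", True,  2013, True),   # passed in the measured year
    ("C", True,  2013, True),   # passed in the measured year
    ("D", False, 2013, False),  # non-cohort student who tested, and failed, that year
]

MEASURED_YEAR = 2013

# Cohort approach: of the entering ninth graders, what share passed?
cohort = [r for r in records if r[1]]
cohort_rate = sum(r[3] for r in cohort) / len(cohort)

# Annual snapshot: only tests taken in the measured year count, whoever took them.
snapshot = [r for r in records if r[2] == MEASURED_YEAR]
snapshot_rate = sum(r[3] for r in snapshot) / len(snapshot)

print(f"Cohort proficiency:   {cohort_rate:.0%}")    # 100% (A, B, and C all passed)
print(f"Snapshot proficiency: {snapshot_rate:.0%}")  # 67% (A's Grade 8 pass vanishes; D's failure counts)
```

Under the snapshot, the school’s rate drops even though every member of its cohort passed, which is precisely the acceleration penalty at issue.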

So far, reasonable people might differ about whether this change makes sense. The biggest concern, in my view, is that the change creates an incentive for a district to stop its most advanced students from accelerating and taking the math proficiency exam before Grade 9, so that their scores would “count” toward the high school’s ranking.

But Burris, Feeney and Murphy discovered a bigger problem. Instead of just using the New York math proficiency exam (a Regents Exam called Integrated Algebra, or Algebra 1), U.S. News also included in its calculations two additional Regents Exams: for Geometry and Algebra 2/Trigonometry. As explained here and here, these optional math exams (Geometry and Algebra 2/Trigonometry) are much more difficult, so schools that encourage more students to take these challenging Regents classes and exams are penalized by the altered U.S. News approach. Some of the state’s best high schools were accordingly disqualified from receiving recognition.

Several interesting responses followed the original Answer Sheet piece on the U.S. News rankings. U.S. News replied to Valerie Strauss with a statement that was included at the end of the piece.

This test was used for all schools in New York – it was applied equally across the state. Because Step 1 and Step 2 are relative, schools are being measured against each other. … We applied the data in the way that we outlined in our methodology and technical appendix and these rankings reflect that methodology. We are very clear in our methodology and technical appendix about what assessment data we used and how it was applied.

This provides no real explanation or defense. In fact, as I pointed out elsewhere, this justification looks a lot like the one offered as the first “Pro” that The Onion satirically listed for standardized testing: “Every student [is] measured against [the] same narrow, irrelevant set of standards.”

U.S. News also published a second, lengthier response, saying in part:

For the 2015 rankings, U.S. News used annual exam results, which only count results from a specific school year, and included optional Regents exams in Geometry and Algebra 2/Trigonometry (in addition to the required Integrated Algebra and Comprehensive English exams). U.S. News and RTI [the contractor who carried out the research] decided to use these results because annual results are more consistent with how most states report their assessment data. Including the optional math exams is also consistent with U.S. News’ approach to include all applicable high school tests and reflects the achievement of schools who encourage their students to take these exams.

As with any change in data sources or methodology, this change can affect some schools more than others. In particular, the change in New York state assessment results means that schools who push their students to take Regents exams multiple times in order to pass would not get the benefit of the cumulative measure …

The statement acknowledges that the new approach, basing proficiency in part on the results of optional tests that are more challenging, hurts schools that accelerate students. Further, the wording used is unfathomable: “Including the optional math exams is also consistent with U.S. News’ approach to include all applicable high school tests and reflects the achievement of schools [that] encourage their students to take these exams.” What does “all applicable” mean and entail? It seems to beg the question of what should be applicable. Why are scores on optional, difficult tests now considered “applicable” for determining whether a school’s students are proficient?

Moreover, when the publication asserts that its approach “reflects the achievement of schools [that] encourage their students to take these exams” (emphasis added), “reflects” really means “penalizes.” To understand why U.S. News would want to penalize such schools, we have to go down to the second paragraph: “schools [that] push their students to take Regents exams multiple times in order to pass would not get the benefit of the cumulative measure…” Are they saying that schools should give up on a student who doesn’t pass the first time? Or are they saying schools that work with a student to help her pass a test that’s difficult should be given less credit than schools whose students have little or no difficulty with the test? Either way, the clear incentive is for schools to discourage students from taking difficult optional exams, which is an odd behavior to promote when seeking to recognize our “best” schools.

Yet there was a third, and still more troubling, response from U.S. News. In their Answer Sheet piece, Burris, Feeney, and Murphy were able to document the above-described problem by using the test-taker counts that U.S. News included on its website. The following table, included in their piece, shows the reported number of math test takers as well as the incorrect number reported for English test takers.

 

| School        | U.S. News reported English test takers | U.S. News reported math test takers | Actual English test takers | Actual math test takers (all 3 exams) |
| ------------- | -------------------------------------- | ----------------------------------- | -------------------------- | ------------------------------------- |
| Wheatley      | 557                                    | 557                                  | 171                        | 557                                   |
| South Side HS | 712                                    | 712                                  | 284                        | 712                                   |
| Syosset HS    | 1131                                   | 1131                                 | 590                        | 1131                                  |
| Uniondale HS  | 1513                                   | 1513                                 | 577                        | 1513                                  |
But soon after The Washington Post published those numbers, U.S. News removed them—not just for these four schools or even just for New York schools, but for every school in the country. The explanation given was as follows:

Two of the data points displayed for schools in the Best High Schools section of usnews.com were the total numbers of students who took state assessments in reading and in math. U.S. News has decided for consistency purposes and to avoid confusion to remove these data points – which were not used to calculate the Best High Schools rankings – for all high schools in all states.

Social science research firm RTI International, our partner in the data analysis for Best High Schools, recently informed U.S. News that these two data points it provided and that were posted on usnews.com were incorrect because of a computer coding error.

The incorrect numbers that we have removed from our website were not used in the ranking analysis to produce the Best High Schools rankings of gold, silver and bronze medal winners. The proficiency percentages reported are correct, and the rankings were calculated using the correct numbers.

The final paragraph of this statement is correct; the actual English test-taker numbers were apparently used in the calculations. But the “consistency” and “to avoid confusion” statements seem disingenuous, particularly since these numbers were used in past years, with no consistency or confusion problems. Those problems only arose because of the above-described use of optional math Regents exams, and because Burris and her colleagues published here on The Answer Sheet an account of the muddle resulting from the change and from the “computer coding error.”

Why is this important? For one thing, it is no longer possible for principals, researchers, or others to calculate even the U.S. News Performance Index (the first step of its rating process) to determine whether it is correct. One open question was, and remains, whether errors were made in states other than New York. We now cannot tell. What we do know is that prior to this year, and initially during this year, the number of test takers was posted, so the results could be checked.
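
To see what was lost, here is the kind of one-line arithmetic check that the published counts used to make possible. The counts and percentage below are invented for illustration.

```python
# Hypothetical audit: with test-taker counts published, anyone can cross-check
# a reported proficiency percentage. Without the counts, no such check is possible.
reported_proficiency = 87.0   # invented reported value, in percent
passers = 261                 # invented count of passing test takers
test_takers = 300             # invented published count of test takers

recomputed = passers / test_takers * 100
assert abs(recomputed - reported_proficiency) < 0.05, "reported figure does not match the counts"
```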

Further, this deletion from the website is telling because it relates to the key problem: the inclusion of two optional and more difficult math tests in New York. As Burris, Feeney and Murphy pointed out on The Answer Sheet, the school summaries had originally and falsely made it appear as if there were the same number of test-takers of math and English tests (i.e., that they were only using cohort data—looking at whether a given child in a cohort had achieved proficiency—which is how they’d done it in the past and how the State of New York still does it). When Burris and her colleagues called U.S. News out, showing that the website’s number of English exam test-takers was flatly wrong, the publication responded by removing the evidence.

Please know that my criticism here is not that U.S. News made a change to address a mistake—which would have been reasonable and responsible. What the publication did here, however, was to: (a) delete information, not correct it; (b) leave in place the underlying problematic calculations that penalize schools that promote more challenging learning; and (c) fail to seriously address the problems and the deletion.

Some evidence of the problem, however, remains; we can still compare the percentages of students who earn a Regents diploma with Advanced Designation, which requires students to pass the more difficult Algebra 2/Trigonometry test. Consider the following two upstate New York high schools: Brighton High School and Williamsville South. Following the links, one can see that they have similar demographics—13 percent of students at each school were “economically disadvantaged” during the 2012-13 school year, according to New York State.

Brighton High School was on the 2014 U.S. News list; it dropped off the 2015 list with the altered methodology used for math. Williamsville South made this year’s list. The chart below shows relevant math data obtained from the state website and the U.S. News rankings.

 

|                                                                          | Brighton High School | Williamsville South |
| ------------------------------------------------------------------------ | -------------------- | ------------------- |
| 2012-13 math proficiency, according to New York State (cohort approach)  | 96%                  | 95%                 |
| U.S. News reported math proficiency                                      | 87%                  | 95%                 |
| Regents diploma with Advanced Designation                                | 71%                  | 60%                 |

Before continuing, I want to make clear that nothing here should be understood as an argument that Williamsville South is not an excellent school or should not be on the list. Rather, it is an argument that the altered approach used by U.S. News unreasonably and misleadingly excluded Brighton.

Note that Brighton has a high percentage of students who are awarded a Regents Diploma “with Advanced Designation” (71 percent). To receive this diploma, students have to pass the most difficult math test, Algebra 2/Trigonometry. Brighton also had, according to U.S. News’ own calculation, a high college readiness rate of 57.3 percent; Williamsville South’s rate was 52.9 percent. Brighton, however, was disadvantaged by (1) accelerating more students by having them take Integrated Algebra in Grade 8, and/or (2) pushing more students to take advanced Regents math courses, even though some may not pass the exam. The same case could be made for South Side High School, The Wheatley School, Locust Valley High School, and other excellent New York high schools that were previously on the list and dropped off.

The carelessness in this year’s U.S. News ratings almost surely extends to states beyond New York. One example that jumped out at me was the listing for California’s #1 high school: the American Indian Public High School. According to the school’s May/June listing on U.S. News, the “Percentage of Disadvantaged Students Who Are Proficient” at this charter school, which ironically enrolls “0%” Native American students, is 911.4 percent.

This rate, while certainly impressive, is a mathematical impossibility. Sometime after June 7th, when I captured the linked image, U.S. News changed the school’s listing to eliminate the line for “Percentage of Disadvantaged Students Who Are Proficient.” The 911.4 percent figure wasn’t corrected; it simply disappeared, and I could find no explanation for the deletion or for the earlier mistake. When contacted by Valerie Strauss, after I sent her this post, U.S. News representatives explained that the California percent-proficient information was removed because of “an internal data programming issue that was causing the data to be displayed incorrectly on our site.” (Interestingly, part of the U.S. News defense of its earlier deletion of information was, as quoted above, “The proficiency percentages reported are correct…”.)
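
How does a coding error produce a rate above 100 percent? U.S. News never said, but one common culprit is a numerator and a denominator drawn from mismatched tables. The sketch below is purely hypothetical, with invented numbers chosen only to reproduce the impossible figure; it is not the actual bug.

```python
# Purely hypothetical: invented numbers showing how a mismatched denominator
# can yield an impossible figure like 911.4%. U.S. News disclosed nothing
# beyond "a computer coding error," so this is only one plausible mechanism.

proficient_test_records = 319   # e.g., passing test records, counting retakes and subjects
disadvantaged_students = 35     # e.g., the subgroup headcount meant to be the base

rate = proficient_test_records / disadvantaged_students * 100
print(f"{rate:.1f}%")  # 911.4% -- any rate above 100% signals a broken denominator
```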

I will return to the story of this high-scoring California charter school in the second part of this piece.

----

I asked U.S. News to comment on the version of this that appeared in The Conversation. Here’s the response:

Thank you for reaching out. Mr. Welner’s overall argument is incorrect. Schools that do better and are awarded medals in the U.S. News rankings are the ones that are doing more (better than the average in the state or way better than the average in the state) than what is expected given their level of poverty or proportion of economic disadvantaged. In other words, the U.S. News rankings do reward schools that push students to do more than is normally expected of them, in fact it’s a key premise of the rankings.

This blog post has been shared by permission from the author.
Readers wishing to comment on the content are encouraged to do so via the link to the original post.

The views expressed by the blogger are not necessarily those of NEPC.

Valerie Strauss

Valerie Strauss is the Washington Post education writer.

Kevin G. Welner

Professor Kevin Welner teaches educational policy and law at the CU Boulder School of Education. He’s also the director of the National Education Policy Center, w...