Yesterday the big announcement in New York was the results of the latest ‘common core aligned’ state tests. As expected, the scores plummeted. The New York Times reported that from last year to this one, the percent of students scoring ‘proficient’ on English dropped from 47% to 26% while the percent scoring ‘proficient’ on Math dropped from 60% to 30%.
Various ‘reformers’ weighed in and tried to put a positive spin on it. Joel Klein, in an Op-Ed in The New York Post, wrote, “While some may confuse lower scores as a negative development, the fact that we’re finally being honest about academic achievement is a very positive sign.” Arne Duncan said, “Too many school systems lied to children, families and communities.”
The ‘lies’ they are talking about are the ones that say that our education system is doing an OK job. It is very important to these ‘reformers’ that our schools are ‘failing’ so they can justify their radical approach to reform, which is centered upon shutting down public schools, opening charters, and firing teachers based significantly on standardized test scores.
So I looked carefully at the data and found that these test scores do, in fact, prove there was some lying going on. But it is not the lie that Klein and Duncan were talking about. The lie that these test scores reveal is the one about charter schools being better than public schools.
I was tipped off to this story by an excellent article by AP writer Stephanie Simon, who seems to be one of the only reporters in the country on the trail of the fraudulent reformers. In an article called “New York fails Common Core tests”, Simon points out that, in general, charters did not do very well on these new tests.
Just 23 percent of charter students scored proficient in language arts, compared with 31 percent in public schools overall. That’s a greater gap than had shown up in last year’s exams.
In math, charter schools beat the public school average in each of the past two years — but not this year. On the new tests, just 31 percent of charter students scored proficient, the same as in public schools overall.
The Democracy Prep chain posted uneven results, with particularly poor scores in sixth grade. In its Harlem charter, fewer than 4 percent of sixth graders passed the language arts exam, and fewer than 12 percent passed math. Its best results came at the eighth grade level, but even then the pass rate on both tests was under 33 percent — better than the citywide average of about 26 percent, but not a quantum leap above other public schools.
Democracy Prep officials didn’t respond to a request for comment.
The highly touted KIPP network also stumbled, with proficiency rates well below the city average for several grades and subjects. At KIPP Star College Prep, just 11 percent of fifth graders were proficient in math and just 16 percent passed the reading test. Seventh grade was another weak point, with 11 percent proficient in language arts and 14 percent in math. KIPP also did not respond to a request for comment.
She also wrote that another charter chain, Success Academies, did quite well on the new tests. So we have mixed results with at least one charter doing well and at least two doing poorly.
So I thought I’d download all the data and take a look for myself — and yes, there WILL be scatter plots!
There are two ways to make test scores go down: make the test harder or make the passing score higher. As a teacher, when I give a test, my hope is that all my students will pass it and, as I’m at a specialized high school, I hope that everyone breaks 90, which actually rarely happens. If only 30% of my students passed a test, well, something went wrong. Maybe the test was too hard. Maybe it was too long. Maybe there was a ‘bad’ question which caused kids to waste a lot of time going down the wrong path. I’d likely curve the test and, if I determined that the issue was how I taught the topic, I’d do some reteaching.
The test scores in New York went down because the test was much more difficult, testing, in theory, ‘higher order thinking skills’ though, in practice, the questions are really quite confusing. I actually got one of the third grade questions wrong. The older ‘skills based’ tests were problematic too since students could do very well on them without really understanding math. But these new tests have the opposite problem: Students can do very poorly on them even if they do understand math. This is why I don’t like to base 20% of my teacher rating on a single test that I didn’t write.
When the scores came back, John King could have put some kind of ‘curve’ on the test, as they did in Florida recently, but decided not to. So the scores are very low with about 2/3 of students failing.
Now this does not mean, necessarily, that teachers are going to be fired and schools shut down over this. The New York City ‘progress reports’ compare schools to each other so a harder test, if it still has a mix of some easy and some medium level questions, could still be at least as accurate as the old tests (if you think the old tests were accurate).
So the first thing I did was make a scatter plot comparing all the schools that tested 7th grade math in 2012 and 2013. On the horizontal axis I put the 2012 score and on the vertical axis I put the 2013 score. If the test was so hard as to become almost random, there would be little correlation, but as you can see there is somewhat of a correlation between the two years.
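That correlation check can be sketched in a few lines of Python. This is a minimal sketch with made-up example scores standing in for the real ~500-school dataset:

```python
# Pearson correlation between 2012 and 2013 proficiency rates.
# If the new test were essentially random, this would be near 0;
# if schools kept their relative order, it would be near 1.
def pearson(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-school scores (2012 vs 2013), not the real data.
scores_2012 = [60.0, 45.0, 80.0, 30.0]
scores_2013 = [30.0, 20.0, 42.0, 12.0]
print(round(pearson(scores_2012, scores_2013), 3))
```

With the real data, each list would hold one entry per school, and the same two lists feed directly into the scatter plot.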
But here’s where it gets interesting. To see if most charter schools were like KIPP Star and Democracy Prep, scoring well below the 22% city average, or if most were still doing relatively well, like the Success Academies, I made another scatter plot, but on this one I marked all the charter schools (or at least the ones that had the word ‘charter’ in them) with a red circle.
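The flagging step — spotting charters by the word ‘charter’ in the school name — can be sketched like this, again with hypothetical rows standing in for the downloaded state data:

```python
# Hypothetical rows: (school name, 2012 % proficient, 2013 % proficient).
# The real state files have one row per school/grade/subject.
schools = [
    ("PS 123", 60.0, 30.0),
    ("Harlem Example Charter School", 90.0, 20.0),
    ("MS 456", 45.0, 22.0),
    ("Another Example Charter School", 75.0, 18.0),
]

# Same rough heuristic as in the post: a school is treated as a
# charter if the word 'charter' appears in its name.
def is_charter(name):
    return "charter" in name.lower()

# Points for the scatter plot: x = 2012 score, y = 2013 score,
# with charters flagged so they can be drawn as red circles.
points = [(x, y, is_charter(name)) for name, x, y in schools]
charter_points = [p for p in points if p[2]]
print(len(charter_points))  # 2 of the 4 example schools are charters
```

This name-based flag will miss any charter whose official name omits the word, which is why the post hedges with “or at least the ones that had the word ‘charter’ in them.”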
As can be clearly seen, the charters are, in general, the ‘outliers’ meaning the schools that had the biggest drops relative to other schools with similar 2012 scores.
In the Stephanie Simon report she mentions that KIPP Star and Democracy Prep hadn’t done so well with their proficiency rates, but she doesn’t mention how far they had dropped. Out of over 500 schools, about 35 of which were charters, 22 of the one hundred largest drops belonged to charter schools.
The most stunning example is the famed Harlem Village Academy which had 100% passing in 2012, but only 21% passing in 2013, a drop of 79 percentage points (you can see that sad dot all the way at the right of the scatter plot). Democracy Prep Harlem Charter, run and staffed by many TFAers, dropped from 84% in 2012 to 13% in 2013. KIPP Amp dropped from 79% in 2012 to just 9% in 2013. The Equity Project (TEP), which pays $125,000 salaries for the best teachers, had finally gotten some test scores it could brag about with 76% in 2012, but that has now sunk to just 20% in 2013. The Bronx Charter School Of Excellence, which recently received money from a $4.5 million grant to help public schools emulate what they do, dropped from 96% in 2012 to 33% in 2013. So these are the schools that are the red ‘outliers’ hovering near the bottom right of the scatter plot. In general, the average charter school went down by 51 percentage points compared to 34 percentage points for the average public school. The most plausible explanation for charters dropping so much more than public schools is that their test prep methods were not sufficient for the more difficult tests. In other words, “you’re busted.”
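The percentage-point arithmetic behind those comparisons is straightforward; the rows below are hypothetical stand-ins, not the actual state figures:

```python
# Hypothetical rows: (school, is_charter, 2012 %, 2013 %).
rows = [
    ("Charter A", True, 100.0, 21.0),
    ("Charter B", True, 84.0, 13.0),
    ("Public A", False, 60.0, 30.0),
    ("Public B", False, 45.0, 15.0),
]

# Drop in percentage points from 2012 to 2013 for each school.
drops = {name: y2012 - y2013 for name, _, y2012, y2013 in rows}

# Average drop for one group (charters or publics).
def average_drop(charter_flag):
    vals = [y2012 - y2013 for _, c, y2012, y2013 in rows if c == charter_flag]
    return sum(vals) / len(vals)

print(average_drop(True))   # 75.0 points for the example charters
print(average_drop(False))  # 30.0 points for the example publics
```

Sorting `drops` in descending order over the full dataset is also how one would count how many of the hundred largest drops belonged to charters.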
I just don’t see how the ‘reformers’ can reconcile these statistics with their statement that these lower scores are a good thing since we are now being honest about where we stand. The low scores in general do not decisively prove anything. The cutoff scores for passing were an arbitrary choice by some politicians in Albany. But the evidence is very clear from this data that charters are not working the miracles they claim.
Now, remember that I don’t think that test scores capture all the good in a school. Perhaps these charter schools have a lot of those intangibles that help kids eat grits or whatever. I don’t know. But I do know that if the ‘reformers’ really value their ‘data’ so much, they should really think about how to interpret the charter grade crash. To me, this suggests that maybe the hundreds of millions of dollars given to charters, both from the government and from private benefactors, could be spent elsewhere in education more effectively.
This blog post has been shared by permission from the author.
Readers wishing to comment on the content are encouraged to do so via the link to the original post.
Find the original post here:
The views expressed by the blogger are not necessarily those of NEPC.