
Curmudgucation: NYC: Testing Is Still Not Teaching

People say dumb things when they're trying to defend bad policy decisions.



Alex Zimmerman at Chalkbeat reports that New York City schools are going to hit third and sixth graders at 76 schools labeled "low performing" by the state with three more tests per year. This is part of a plan to test the crap out of students by giving formative assessments four times a year. Chancellor Richard Carranza's justification is that the tests are "to help the education department understand whether students are mastering material across the system in 'real time,' allowing officials to direct extra help to schools where students are struggling."

 

This is a dumb idea. Really dumb. You know what makes students struggle, in "real time"? Having their school year interrupted every other month by days of testing. Having their school year shortened by several weeks that are spent testing instead.



When this dumb plan was initially floated out there, the department didn't offer any details about where the tests would come from, who would create them, or what they would focus on. Mike Mulgrew, president of the city's teacher union, does not always say smart things, but he did this time:



How do you use a standardized formative assessment when you don’t have any sort of standardized curriculum?



Well, you can't, but as it turns out, the district didn't even try.



Instead, they're going to use the NWEA MAP test, an assessment in a box. It's normed (sort of), which means it's "graded" on a curve, and it's adaptive, which means it's different for every student who takes it. My old school used the MAP test, and one of my students explained the adaptive feature this way: "Hey, if you blow the first few questions, it just gives you easy stuff to do." The personalized adaptivity also means that you can't just pull up specific questions and check to see how individual students did-- you have to take the test manufacturer's word for it that they have accurately measured how students did on Standard 4.3-q. But it does come with a cool feature that claims to be able to read students' thoughts, attitudes, and character based on how long they take to click the next multiple-choice answer.



And if you like data, I can tell you that I personally checked MAP results over two years against our state Big Standardized Test results, and the MAP was only slightly more predictive than rolling dice.



I have talked to people who found the MAP useful. I'm not one of them. We also gave the test three times a year, hoping to make use of its promise to show growth in students. But here's the thing-- for that to have any hope of working, the students have to care about this parade of multiple-choice baloney testing that has zero stakes for them. Otherwise, you just get a measure of how much they feel like humoring the system on that particular day. The tests are supposed to take 45 minutes. For some students they take much longer, and for some students they take about a minute and a half. Even when taken seriously, they yielded no new information. Not once did I look at MAP results and think, "I had no idea this student was struggling with this standard!" Nothing that came back was ever a surprise.



You would think that this late in the corporate reform game we'd be smarter about tests, but the MAP gives you fun graphics and oodles of data that make pretty charts. And then we get this winner for the Dumb Quote contest from David Hay, the education department's deputy chief of staff:



This isn’t a test — this is actually instruction.



No, no it's not. Testing is not teaching, and multiple-choice tests delivered by computer are very especially not teaching. How can anybody working in education who hasn't been asleep or on the Pearson payroll actually say this sentence out loud? Yes, Hay went to the Harvard Graduate School of Education, but before that he was a principal, and before that he taught for three whole years after getting a BS in Marketing Education-- oh, hell, he should still know better. No, testing is not teaching.



Okay, in one respect he's correct. In authentic assessment, we want the assessment to match the actual desired outcome as closely as possible (if you want to assess a student's ability to shoot foul shots, you have her shoot foul shots rather than take a quiz on basketball dimensions). These 76 schools are in trouble mostly because of their standardized test scores, and standardized multiple-choice tests are a good authentic measure of-- taking standardized multiple-choice tests. So if we pretend that we're doing authentic assessment, the best preparation for taking a standardized multiple-choice test is to take standardized multiple-choice tests. So it is learning-- learning how to take the test.



But I'll bet dollars to donuts that NYC will adopt one of the common uses of these practice tests-- sorting students into three groups: the no-worries, the hopeless, and the on-the-cusp, the ones who maybe, if we test prep the hell out of them, we can squeeze onto the upper side of the cut line.



That's not learning, either.



No, more testing is a lousy idea. Trying to argue that it is actually instruction is just dumb. And this is the worst part of a move like this-- by identifying the struggling schools, you target exactly the students who most need help and then cut into their teaching year. No sane district leader would say, "These students are having trouble getting the material, so let's give them less instruction time." This is the education version of the corporate "We're going to have daily meetings to figure out why we're not getting more work done."

This blog post has been shared by permission from the author.
Readers wishing to comment on the content are encouraged to do so via the link to the original post.

The views expressed by the blogger are not necessarily those of NEPC.

Peter Greene

Peter Greene has been a high school English teacher in Northwest Pennsylvania for over 30 years. He blogs at Curmudgucation. ...