
Curmudgucation: What is Test Prep?

Yesterday I fell into a discussion of test prep on Twitter where a participant tossed forward the notion that test prep actually decreases test results. Others asserted that test prep doesn't really help. I'm pretty sure that both of those assertions are dead wrong, but I also suspect part of the problem is that "test prep" is an Extremely Fuzzy Term that means a variety of things.



The research itself is not exactly stunning. One study that turns up from time to time is a 2008 Chicago study that looks at test prep and ACT results (in Illinois, everyone takes the ACT, so congrats to whatever salesperson/lobbyist from ACT's parent company who landed that contract-- ka-ching! We'll skip over all the reasons that's a bad idea for now). A quick look at the summary shows that this study didn't exactly prove that test prep is a bust:



CPS students are highly motivated to do well on the ACT, and they are spending extraordinary amounts of time preparing for it. However, the predominant ways in which students are preparing for the ACT are unlikely to help them do well on the test or to be ready for college-level work. Students are training for the ACT in a last-minute sprint focused on test practice, when the ACT requires years of hard work developing college-level skills.



That's a nice piece of sleight of hand there. Test prep wasn't failing-- just one particular type was. There are, of course, many other test prep alternatives, but the study ignores those, shrugs, and says, "I guess our only alternative is to believe the ACT PR about how the test measures 'years of hard work' on college level skills."



Meanwhile, the College Board has been touting how their special brand of test prep totally works on the SAT. I'm going to summarize the test prep research by saying that there isn't much, what exists is kind of sketchy, and clear patterns fail to emerge. So let me get back to my main question.



First of all-- which test? For our purposes, when we talk about test prep, we're talking about the Big Standardized Test that the Common Core reform wave inflicted on every state. On the subject of test prep, those are the tests that matter most because that brand of test-centered high-stakes data-generation is the thing that has twisted our schools into test prep factories.



Lots of folks have tried to define test prep very narrowly as simply drilling or rote-working the specific information that is going to be on the test. That definition serves members of the testing cult because by that definition, not much test prep goes on. But I suspect virtually no actual classroom teachers would define test prep that way.



How would I define it?



Test prep is anything that is given time in my classroom for the sole purpose of having a positive impact on test scores.



Right up front, I'll note there is some grey area. There are some test prep things that I can turn into useful learning experiences, and there are some actual education things that may have a positive impact on test scores.



But if I'm only doing it because it will help with test scores, I say it's test prep, and I say to hell with it.



This covers a broad range of activities. It is necessary, particularly in the younger grades, to teach students how to deal with a multiple choice test, and doubly necessary if the test is going to be taken on a computer.



But once we've introduced that, we never let it go. Fifteen years ago, the amount of time I would have spent in my English class on activities in which students read a short passage and then answered a few multiple choice questions-- that time would have been pretty close to zero. Short excerpts and context-free passages are a crappy way to build reading skills or interest in reading, and multiple choice questions are just about the worst way to assess anything, ever. But now, like English departments across the country, we have bought stacks of workbooks chock full of short passages coupled with sets of multiple choice questions. We don't buy them because we think they represent a great pedagogical approach; we buy them because they are good practice for the sort of thing the students will deal with on the BS Test. They are test prep, pure and simple, and if I were deciding strictly on educational merit, I wouldn't include them in my class at all. Not only are they a lousy way to teach reading, but they reinforce the mistaken notion that for every piece of reading, there's only one correct way to read it and that the whole purpose of reading is to be able to answer questions that somebody else asks you with the answers that somebody else wants.



Writing is even easier to do test prep for, and my department is the proof. We teach students some quick and simple writing strategies:

   1) Rewrite the prompt as your first sentence.

   2) Write neatly.

   3) Fill up as much paper as you can. Do not worry about redundancy or wandering.

   4) Use big words. It doesn't matter if you use them correctly (I always teach my students "plethora").

   5) Indent paragraphs clearly. If that's a challenge, skip a line between paragraphs.



With those simple techniques, we were able to consistently get the mid-ninety percent range of our students scoring proficient in writing.



In addition, because the state wants the BS Test to drive curriculum, it makes sure to let us know about the anchor standards (the standards that will actually be on the test) so we can be sure to include them-- which, to be effective, has to be done using the state's understanding of those standards. Our professional judgment is not only irrelevant, but potentially gets in the way. This can cover everything from broader standards to the specific terms likely to appear on the test.



And, of course, we need to familiarize the students with the state's style of questioning. For instance, PA likes to test context clue use by giving students a familiar-ish word used with an uncommon meaning, to make sure that students decipher the word using only the sentence context and not knowledge already in their brains.



None of this is rote memorization of details for the test. All of it is test prep, and all of it is effective up to a point. Particularly for students who are neither good nor enthusiastic test-takers, it can make the difference between terrible and mediocre results. And every year it leaves a bad taste in my mouth, and every year all of us struggle to maintain a balance between that educational malpractice and doing the teaching jobs we signed up for when we started our careers.



Test prep does, in a sense, carry beyond the classroom. The article that kicked off yesterday's conversation was a piece by Matt Barnum about the shuffling of weaker teachers to younger grades. That is absolutely a thing-- I suspect every single teacher in the country can tell a story about administration moving teachers to where they won't "hurt us on the test results." Teachers who can do good test prep are moved into the tested grades; those who can't are moved out of the BS Test Blast Zone. There are far better ways to assign staff, but many administrations, eyes on their test scores, are afraid not to make test scores Job One.



In fact, there are districts where the structure of the schools themselves is changed in response to testing. Eighth graders do notoriously badly on BS Tests, so it's smart to put sixth graders in your middle school alongside the eighth graders; their higher scores help soften the building's testing hit.



And there is the test prep that goes beyond instruction, because teachers understand the biggest obstacle to student performance on the BS Test-- the students have to care enough to bother to try. In a state like PA, where my students take a test that will affect my rating and my school's rating, but which has absolutely no stakes for them, that's an issue. The BS Tests are long and boring and, in some cases, hard. YouTube is filled with peppy videos and songs and cheers from pep rallies and other endless attempts to make students actually care enough to try. This kind of test prep is not so much toxic to actual instruction as it is corrosive to the foundation of trust in the school itself. Elementary teachers may feel it's helping, but by high school the students have figured out that it was all, as one student told me, "a big line of bullshit. You just want us to make you look good."



The most authentic assessment is the assessment that asks students to do what they've practiced doing. The reverse is also true-- the most effective preparation for an assessment task is to repeatedly do versions of that exact task. And so all across the country, students slog through various versions of practice tests. If you want students to get good at writing essays, you have them write essays. If you want them to get good at reading short stories, you have them read short stories. And if you want them to get good at taking bad multiple choice standardized tests, you have them take a steady diet of bad multiple choice standardized tests.



That's test prep, and it's effective. It won't make every student score above average for a variety of reasons, not the least of which is that the tests are meant to create a bell curve and not everybody can be above average.



But if you think the solution to getting ready for the BS Tests is to just teach students really, really well and the scores will just appear, like magic, then I would like to sell you a bridge in Florida that crosses the candy cane swamp to end in a land of unicorns that poop rainbows. 

This blog post has been shared by permission from the author.
Readers wishing to comment on the content are encouraged to do so via the link to the original post.

The views expressed by the blogger are not necessarily those of NEPC.

Peter Greene

Peter Greene has been a high school English teacher in Northwest Pennsylvania for over 30 years. He blogs at Curmudgucation. ...